WorldWideScience

Sample records for processes remains largely

  1. Aerosol pH buffering in the southeastern US: Fine particles remain highly acidic despite large reductions in sulfate

    Science.gov (United States)

    Weber, R. J.; Guo, H.; Russell, A. G.; Nenes, A.

    2015-12-01

    pH is a critical aerosol property that impacts many atmospheric processes, including biogenic secondary organic aerosol formation, gas-particle phase partitioning, and mineral dust or redox metal mobilization. Particle pH has also been linked to adverse health effects. Using a comprehensive data set from the Southern Oxidant and Aerosol Study (SOAS) as the basis for thermodynamic modeling, we have shown that particles are currently highly acidic in the southeastern US, with pH between 0 and 2. Sulfate and ammonium are the main acid-base components that determine particle pH in this region; however, they have different sources and their concentrations are changing. Over 15 years of network data show that sulfur dioxide emission reductions have resulted in a roughly 70 percent decrease in sulfate, whereas ammonia emissions, mainly linked to agricultural activities, have been largely steady, as have gas-phase ammonia concentrations. This has led to the view that particles are becoming more neutralized. However, a thermodynamic-modeling-based sensitivity analysis of particle pH to changing sulfate concentrations indicates that particles have remained highly acidic over the past decade, despite the large reductions in sulfate. Furthermore, anticipated continued reductions of sulfate and relatively constant ammonia emissions will not significantly change particle pH until sulfate drops to clean continental background levels. This result reshapes our expectation of future particle pH and implies that atmospheric processes and adverse health effects linked to particle acidity will remain unchanged for some time into the future.
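As a back-of-envelope illustration of why pH barely moves (this is simple logarithm arithmetic, not the study's thermodynamic model, which requires a full aerosol equilibrium code): pH varies with the logarithm of hydrogen-ion activity, so even if H+ fell in direct proportion to the 70 percent sulfate reduction, pH would rise by only about half a unit.

```python
import math

def ph(h_plus_molar):
    """pH from hydrogen-ion activity (idealized: activity ~ molarity)."""
    return -math.log10(h_plus_molar)

# Toy assumption: H+ falls in proportion to sulfate. (A simplification;
# the ammonia/ammonium buffering is precisely what the study models.)
h_before = 1.0e-1          # mol/L, i.e. pH 1, within the reported 0-2 range
h_after = 0.3 * h_before   # a 70 percent reduction

delta = ph(h_after) - ph(h_before)
print(round(ph(h_before), 2), round(ph(h_after), 2), round(delta, 2))
```

Even under this generous linear assumption, the pH change is -log10(0.3) ≈ 0.52 units, consistent with the abstract's conclusion that particles stay highly acidic.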

  2. Large transverse momentum hadronic processes

    International Nuclear Information System (INIS)

    Darriulat, P.

    1977-01-01

    The possible relations between deep inelastic leptoproduction and large transverse momentum (psub(t)) processes in hadronic collisions are usually considered in the framework of the quark-parton picture. Experiments observing the structure of the final state in proton-proton collisions producing at least one large transverse momentum particle have led to the following conclusions: a large fraction of the produced particles are unaffected by the large psub(t) process; the other products are correlated with the large psub(t) particle and, depending upon the sign of the scalar product, can be separated into two groups of ''towards-movers'' and ''away-movers''. The experimental evidence favouring such a picture is reviewed, and the properties of each of the three groups (underlying normal event, towards-movers and away-movers) are discussed. Some phenomenological interpretations are presented. The exact nature of away- and towards-movers must be further investigated, and their apparent jet structure has to be confirmed. Angular correlations between leading away- and towards-movers are very informative. Quantum number flow, both within the set of away- and towards-movers, and between it and the underlying normal event, is predicted to behave very differently in different models.

  3. Efficient querying of large process model repositories

    NARCIS (Netherlands)

    Jin, Tao; Wang, Jianmin; La Rosa, M.; Hofstede, ter A.H.M.; Wen, Lijie

    2013-01-01

    Recent years have seen an increased uptake of business process management technology in industries. This has resulted in organizations trying to manage large collections of business process models. One of the challenges facing these organizations concerns the retrieval of models from large business

  4. Premortal data in the process of skeletal remains identification

    Directory of Open Access Journals (Sweden)

    Marinković Nadica

    2012-01-01

    Background/Aim. The basic task of a forensic examiner during the exhumation of mass graves or in mass accidents is to establish the identity of a person. The results obtained through these procedures depend on the degree of post-mortem change and are compared with premortal data obtained from family members of those missing or killed. Experience with exhumations has shown significant differences between the results obtained through exhumation and the premortal data. The aim of the study was to demonstrate the differences between premortal data and the results obtained by exhumation for certain parameters, as well as to direct premortal data collection towards specific skeletal features. Methods. We performed a comparative analysis of the results of exhumation of skeletal remains in a mass grave and the premortal data concerning the identified persons. The minimum number of individuals in the mass grave, calculated from the upper parts of the right femur, was 48. A total of 27 persons were identified. Sex was determined by the metrics and morphology of the pelvis. Age at the moment of death was determined from the morphological features of the pubic symphysis, the morphology of the sternal ends of the ribs, and observations of other parts of the skeleton. Height was calculated as the average of estimates based on long-bone lengths and Rollet's coefficients. Results. There was a complete match in terms of sex, and age matched within the interval that could be established from the skeletal remains. All the other parameters differed, however, which made identification significantly more difficult. Conclusion. Premortal data are an important element of the identification process; they should be obtained by the forensic doctor and directed towards a more detailed examination of the skeletal system.
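The stature method the abstract mentions (averaging per-bone estimates from long-bone lengths) can be sketched as follows. The coefficient values below are illustrative placeholders, not Rollet's published tables; only the averaging structure reflects the abstract.

```python
# Hypothetical multipliers: stature (cm) ~ coefficient * bone length (cm).
# These numbers are placeholders for illustration, NOT Rollet's coefficients.
COEFF = {
    "femur": 3.7,
    "tibia": 4.5,
    "humerus": 5.0,
}

def estimate_stature(bone_lengths_cm):
    """Average the per-bone stature estimates, skipping bones not measured."""
    estimates = [COEFF[bone] * length
                 for bone, length in bone_lengths_cm.items() if bone in COEFF]
    return sum(estimates) / len(estimates)

print(round(estimate_stature({"femur": 45.0, "tibia": 37.0}), 1))
```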

  5. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  6. Anammox-based technologies for nitrogen removal: Advances in process start-up and remaining issues.

    Science.gov (United States)

    Ali, Muhammad; Okabe, Satoshi

    2015-12-01

    Nitrogen removal from wastewater via the anaerobic ammonium oxidation (anammox)-based process has been recognized as an efficient, cost-effective and low-energy alternative to the conventional nitrification and denitrification processes. To date, more than one hundred full-scale anammox plants have been installed and operated for treatment of NH4(+)-rich wastewater streams around the world, and the number is increasing rapidly. Since the discovery of the anammox process, extensive research has been done to develop various anammox-based technologies. However, there are still some challenges in the practical application of anammox-based treatment processes at full scale, e.g., long start-up periods, limited applicability to mainstream municipal wastewater, and poor effluent water quality. This paper summarizes the recent status of application of the anammox process and research on technological developments for solving these remaining problems. In addition, an integrated system of an anammox-based process and a microbial fuel cell is proposed for sustainable and energy-positive wastewater treatment. Copyright © 2015 Elsevier Ltd. All rights reserved.
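For reference, the simplified core anammox reaction (the full catabolic stoichiometry also involves some nitrate production and biomass synthesis, which the simplified form omits) is:

```latex
\mathrm{NH_4^+} + \mathrm{NO_2^-} \longrightarrow \mathrm{N_2} + 2\,\mathrm{H_2O}
```

Because ammonium is oxidized anaerobically with nitrite as the electron acceptor, the process needs no external carbon source and far less aeration than nitrification-denitrification, which is the efficiency advantage the abstract refers to.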

  7. Medical students perceive better group learning processes when large classes are made to seem small.

    Science.gov (United States)

    Hommes, Juliette; Arah, Onyebuchi A; de Grave, Willem; Schuwirth, Lambert W T; Scherpbier, Albert J J A; Bos, Gerard M J

    2014-01-01

    Medical schools struggle with large classes, which might interfere with the effectiveness of learning within small groups due to students being unfamiliar with fellow students. The aim of this study was to assess the effects of making a large class seem small on the students' collaborative learning processes. A randomised controlled intervention study was undertaken to make a large class seem small, without the need to reduce the number of students enrolling in the medical programme. The class was divided into subsets: two small subsets (n=50) as the intervention groups; a control group (n=102) was mixed with the remaining students (the non-randomised group, n∼100) to create one large subset. The setting was the undergraduate curriculum of the Maastricht Medical School, which applies Problem-Based Learning principles. In this learning context, students learn mainly in tutorial groups, composed randomly from a large class every 6-10 weeks. The formal group learning activities were organised within the subsets. Students from the intervention groups met frequently within the formal groups, in contrast to the students from the large subset, who hardly ever enrolled in formal activities with the same students. Three outcome measures assessed students' group learning processes over time: learning within formally organised small groups, learning with other students in the informal context, and perceptions of the intervention. Formal group learning processes were perceived more positively in the intervention groups from the second study year on, with a mean increase of β=0.48. Informal group learning activities occurred almost exclusively within the subsets as defined by the intervention from the first week of the medical curriculum (E-I indexes>-0.69). Interviews tapped mainly positive effects and negligible negative side effects of the intervention. Better group learning processes can be achieved in large medical schools by making large classes seem small.
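The E-I index reported above is, assuming the standard Krackhardt-and-Stern definition, the ratio of external-minus-internal ties to all ties; values near -1 mean contacts stay almost entirely inside the group, values near +1 mean almost entirely outside it. A minimal sketch:

```python
def e_i_index(external_ties, internal_ties):
    """Krackhardt & Stern's E-I index: -1 = all internal, +1 = all external."""
    return (external_ties - internal_ties) / (external_ties + internal_ties)

# A subset whose members make 90% of their learning contacts inside the subset:
print(round(e_i_index(external_ties=10, internal_ties=90), 2))
```

A strongly negative index like this is what "informal group learning occurred almost exclusively within the subsets" corresponds to numerically.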

  8. Remaining Useful Life Prediction for Lithium-Ion Batteries Based on Gaussian Processes Mixture

    Science.gov (United States)

    Li, Lingling; Wang, Pengchong; Chao, Kuei-Hsiang; Zhou, Yatong; Xie, Yang

    2016-01-01

    The remaining useful life (RUL) prediction of lithium-ion batteries is closely related to their capacity degeneration trajectories. Due to self-charging and capacity regeneration, the trajectories have the property of multimodality. Traditional prediction models such as support vector machines (SVM) or Gaussian process regression (GPR) cannot accurately characterize this multimodality. This paper proposes a novel RUL prediction method based on the Gaussian process mixture (GPM). It can handle multimodality by fitting different segments of a trajectory with different GPR models separately, such that the subtle differences among these segments can be revealed. The method is demonstrated to be effective by the predictive results of experiments on two commercial, rechargeable 18650 lithium-ion batteries provided by NASA. The performance comparison among the models illustrates that the GPM is more accurate than the SVM and the GPR. In addition, the GPM can yield a predictive confidence interval, which makes its predictions more reliable than those of traditional models. PMID:27632176

  9. Large scale processing of dielectric electroactive polymers

    DEFF Research Database (Denmark)

    Vudayagiri, Sindhu

    Efficient processing techniques are vital to the success of any manufacturing industry. The processing techniques determine the quality of the products and thus to a large extent the performance and reliability of the products that are manufactured. The dielectric electroactive polymer (DEAP...

  10. Large quantity production of carbon and boron nitride nanotubes by mechano-thermal process

    International Nuclear Information System (INIS)

    Chen, Y.; Fitzgerald, J.D.; Chadderton, L.; Williams, J.S.; Campbell, S.J.

    2002-01-01

    Full text: Nanotube materials, including carbon and boron nitride, have excellent properties compared with bulk materials. The seamless graphene cylinders with a high length-to-diameter ratio make them superstrong fibers, and a high amount of hydrogen can be stored in nanotubes as a future clean fuel source. These applications require large quantities of nanotube material. However, nanotube production in large quantity, with fully controlled quality and at low cost, remains a challenge for the most popular synthesis methods such as arc discharge, laser heating and catalytic chemical decomposition. Discovery of new synthesis methods is still crucial for future industrial application. The new low-temperature mechano-thermal process discovered by the current author provides an opportunity to develop a commercial method for bulk production. This mechano-thermal process consists of mechanical ball milling followed by thermal annealing. Using this method, both carbon and boron nitride nanotubes were produced. I will present the mechano-thermal method as a new bulk production technique at the conference, and the lecture will summarise the main results obtained. In the case of carbon nanotubes, different nanosized structures including multi-walled nanotubes, nanocells and nanoparticles have been produced in a graphite sample using a mechano-thermal process consisting of mechanical milling at room temperature for up to 150 hours and subsequent thermal annealing at 1400 deg C. Metal particles have played an important catalytic role in the formation of different tubular structures, while the defect structure of the milled graphite appears to be responsible for the formation of small tubes. It is found that the mechanical treatment of graphite powder produces a disordered and microporous structure, which provides nucleation sites for nanotubes as well as free carbon atoms. Multiwalled carbon nanotubes appear to grow via growth of the (002) layers during thermal annealing.
In the case of BN

  12. Retrospective comparative ten-year study of cumulative survival rates of remaining teeth in large edentulism treated with implant-supported fixed partial dentures or removable partial dentures.

    Science.gov (United States)

    Yamazaki, Seiya; Arakawa, Hikaru; Maekawa, Kenji; Hara, Emilio Satoshi; Noda, Kinji; Minakuchi, Hajime; Sonoyama, Wataru; Matsuka, Yoshizo; Kuboki, Takuo

    2013-07-01

    This study aimed to compare the survival rates of remaining teeth between implant-supported fixed dentures (IFDs) and removable partial dentures (RPDs) in patients with large edentulous spaces. The second goal was to assess the risk factors for remaining tooth loss. The study subjects were selected among those who received prosthodontic treatment at Okayama University Dental Hospital for an edentulous space spanning at least four continuous missing teeth. Twenty-one patients were included in the IFD group and 82 patients in the RPD group. Survival rates of remaining teeth were calculated in three subcategories: (1) all remaining teeth, (2) teeth adjacent to the intended edentulous space, and (3) teeth opposing the intended edentulous space. The ten-year cumulative survival rate of all remaining teeth was significantly higher in the IFD group (40.0%) than in the RPD group (24.4%). On the other hand, there was no significant difference between the two groups in the survival rate of teeth adjacent or opposing to the intended edentulous space. A Cox proportional hazards analysis revealed that RPD restoration and gender (male) were significant risk factors for remaining tooth loss (all remaining teeth). These results suggest that IFD treatment can reduce the incidence of remaining tooth loss in large edentulous cases. Copyright © 2013 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
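Cumulative survival rates of this kind are conventionally Kaplan-Meier estimates, which handle patients censored before ten years. A minimal sketch of the estimator on invented toy data (not the study's data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    events[i] = 1 for an observed tooth loss at times[i], 0 for censoring.
    Returns [(event_time, S(event_time))] at each distinct event time.
    """
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    surv, out = 1.0, []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        d = sum(e for tt, e in pairs if tt == t and e == 1)   # losses at t
        n_t = sum(1 for tt, _ in pairs if tt == t)            # leaving risk set
        if d:
            surv *= 1.0 - d / n_at_risk
            out.append((t, surv))
        n_at_risk -= n_t
        i += n_t
    return out

# 5 teeth: losses at years 2 and 4; censored follow-up at years 3, 5, 5.
print(kaplan_meier([2, 3, 4, 5, 5], [1, 0, 1, 0, 0]))
```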

  13. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced-dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convective activity. We investigate for individual sites the exceedance probability with which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs. large-scale).
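The dimension-reduction step can be sketched with plain (unsupervised) RBF kernel PCA in NumPy; the study uses a supervised variant, which additionally incorporates the target variable, so this is only the core mechanics, and the random data stands in for gridded moisture-flux fields:

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=0.1):
    """Unsupervised RBF kernel PCA (minimal sketch; the study's method is a
    supervised variant of this)."""
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                          # double-center the kernel matrix
    vals, vecs = np.linalg.eigh(Kc)
    order = np.argsort(vals)[::-1][:n_components]
    return vecs[:, order] * np.sqrt(np.abs(vals[order]))

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 6))                # stand-in for moisture-flux fields
Z = kernel_pca(X)                           # low-dimensional coordinates
print(Z.shape)
```

Clustering is then done on the low-dimensional coordinates `Z` rather than on the raw high-dimensional fields.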

  14. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.
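Giraph's "think like a vertex" abstraction can be illustrated in a few lines: in each superstep every vertex consumes its incoming messages, updates its value, and messages its neighbours, until nothing changes. This toy is plain Python, not Giraph's actual Java API:

```python
def max_value_propagation(graph, values):
    """Propagate the maximum vertex value across an undirected graph,
    one superstep per loop iteration (a classic Pregel/Giraph example)."""
    changed = True
    while changed:
        changed = False
        # Messages are gathered before any update, as in a real superstep.
        inbox = {v: [values[u] for u in graph[v]] for v in graph}
        for v, msgs in inbox.items():
            best = max([values[v]] + msgs)
            if best > values[v]:
                values[v] = best
                changed = True
    return values

graph = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
print(max_value_propagation(graph, {"a": 3, "b": 6, "c": 1}))
```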

  16. Large forging manufacturing process

    Science.gov (United States)

    Thamboo, Samuel V.; Yang, Ling

    2002-01-01

    A process for forging large components of Alloy 718 material so that the components do not exhibit abnormal grain growth includes the steps of: a) providing a billet with an average grain size between ASTM 0 and ASTM 3; b) heating the billet to a temperature between 1750 °F and 1800 °F; c) upsetting the billet to obtain a component part with a minimum strain of 0.125 in at least selected areas of the part; d) reheating the component part to a temperature between 1750 °F and 1800 °F; e) upsetting the component part to a final configuration such that said selected areas receive no strains between 0.01 and 0.125; f) solution treating the component part at a temperature between 1725 °F and 1750 °F; and g) aging the component part over predetermined times at different temperatures. A modified process achieves abnormal grain growth in selected areas of a component where desirable.
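The patent's strain criterion can be encoded as two simple checks: the first upset must impart at least 0.125 strain in the selected areas, and the second upset must avoid the 0.01-0.125 window there. A sketch (boundary handling is an assumption; the patent text does not say whether the endpoints are inclusive):

```python
def first_upset_ok(strain):
    """First upset: selected areas need a minimum strain of 0.125."""
    return strain >= 0.125

def second_upset_ok(strain):
    """Second upset: selected areas must receive no strain in the
    abnormal-grain-growth window between 0.01 and 0.125."""
    return not (0.01 < strain < 0.125)

print(first_upset_ok(0.2), second_upset_ok(0.05), second_upset_ok(0.0))
```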

  17. Large deviations for Gaussian processes in Hölder norm

    International Nuclear Information System (INIS)

    Fatalov, V R

    2003-01-01

    Some results are proved on the exact asymptotic representation of large deviation probabilities for Gaussian processes in the Hölder norm. The following classes of processes are considered: the Wiener process, the Brownian bridge, fractional Brownian motion, and stationary Gaussian processes with power-law covariance function. The investigation uses the method of double sums for Gaussian fields.

  18. Remaining useful life prediction based on the Wiener process for an aviation axial piston pump

    Directory of Open Access Journals (Sweden)

    Xingjian Wang

    2016-06-01

    An aviation hydraulic axial piston pump's degradation from comprehensive wear is a typical gradual failure model. Accurate wear prediction is difficult, as random and uncertain characteristics must be factored into the estimation. The internal wear status of the axial piston pump is characterized by the return oil flow, based on fault mechanism analysis of the main frictional pairs in the pump. The performance degradation model is described by the Wiener process to predict the remaining useful life (RUL) of the pump. Maximum likelihood estimation (MLE) is performed by utilizing the expectation maximization (EM) algorithm to estimate the initial parameters of the Wiener process, while recursive estimation is conducted utilizing the Kalman filter method to estimate the drift coefficient of the Wiener process. The RUL of the pump is then calculated according to the performance degradation model based on the Wiener process. Experimental results indicate that the return oil flow is a suitable characteristic for reflecting the internal wear status of the axial piston pump, and thus the Wiener process-based method may effectively predict the RUL of the pump.
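The Wiener degradation model underlying this kind of RUL prediction is X(t) = x0 + λt + σB(t), with failure declared when X(t) first crosses a threshold. A minimal sketch (plain-MLE drift estimate and the mean first-passage approximation; not the paper's EM/Kalman pipeline):

```python
import math
import random

# Wiener degradation model X(t) = x0 + lam*t + sig*B(t); failure when X
# first crosses `threshold`. All numbers below are invented for illustration.
random.seed(1)
lam, sig, dt, threshold = 0.5, 0.2, 0.1, 10.0

x, path = 0.0, [0.0]
for _ in range(100):                      # simulate a degradation record
    x += lam * dt + sig * math.sqrt(dt) * random.gauss(0, 1)
    path.append(x)

incs = [b - a for a, b in zip(path, path[1:])]
lam_hat = sum(incs) / (len(incs) * dt)    # MLE of the drift coefficient
mean_rul = (threshold - path[-1]) / lam_hat   # mean first-passage approximation

print(lam_hat > 0 and mean_rul > 0)
```

The EM and Kalman-filter machinery in the paper refines exactly these quantities online as new degradation measurements arrive.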

  19. The Very Large Array Data Processing Pipeline

    Science.gov (United States)

    Kent, Brian R.; Masters, Joseph S.; Chandler, Claire J.; Davis, Lindsey E.; Kern, Jeffrey S.; Ott, Juergen; Schinzel, Frank K.; Medlin, Drew; Muders, Dirk; Williams, Stewart; Geers, Vincent C.; Momjian, Emmanuel; Butler, Bryan J.; Nakazato, Takeshi; Sugimoto, Kanako

    2018-01-01

    We present the VLA Pipeline, software that is part of the larger pipeline processing framework used by the Karl G. Jansky Very Large Array (VLA) and the Atacama Large Millimeter/submillimeter Array (ALMA) for both interferometric and single-dish observations. Through a collection of base code jointly used by the VLA and ALMA, the pipeline builds a hierarchy of classes to execute individual atomic pipeline tasks within the Common Astronomy Software Applications (CASA) package. Each pipeline task contains heuristics designed by the team to actively decide the best processing path and execution parameters for calibration and imaging. The pipeline code is developed and written in Python and uses a "context" structure for tracking the heuristic decisions and processing results. The pipeline "weblog" acts as the user interface for verifying the quality assurance of each calibration and imaging stage. The majority of VLA scheduling blocks above 1 GHz are now processed with the standard continuum recipe of the pipeline and offer a calibrated measurement set as a basic data product to observatory users. In addition, the pipeline is used for processing data from the VLA Sky Survey (VLASS), a seven-year community-driven endeavor started in September 2017 to survey the entire sky down to a declination of -40 degrees at S band (2-4 GHz). This 5500-hour next-generation large radio survey will explore the time and spectral domains, relying on pipeline processing to generate calibrated measurement sets, polarimetry, and imaging data products that are available to the astronomical community with no proprietary period. Here we present an overview of the pipeline design philosophy, heuristics, and calibration and imaging results produced by the pipeline. Future development will include the testing of spectral-line recipes, low signal-to-noise heuristics, and serving as a testing platform for science-ready data products. The pipeline is developed as part of the CASA software package by an
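The "context" pattern described above, where each task records its heuristic decisions and results in a shared structure that a weblog-style report can later inspect, can be sketched as follows. All class and task names here are invented for illustration; this is not the actual CASA/VLA pipeline API:

```python
# Hedged sketch of a task hierarchy with a shared "context". Names are
# hypothetical; only the pattern (tasks -> heuristic decision -> context
# record) reflects the abstract's description.

class Context:
    def __init__(self):
        self.decisions, self.results = [], []

class Task:
    name = "base"
    def decide(self, context):
        """Heuristic: choose execution parameters for this stage."""
        return {}
    def run(self, context):
        params = self.decide(context)
        context.decisions.append((self.name, params))   # for the weblog
        context.results.append((self.name, "ok"))

class Calibrate(Task):
    name = "calibrate"
    def decide(self, context):
        return {"refant": "auto"}        # hypothetical heuristic choice

class Image(Task):
    name = "image"
    def decide(self, context):
        return {"cell_arcsec": 1.0}      # hypothetical heuristic choice

ctx = Context()
for task in (Calibrate(), Image()):
    task.run(ctx)
print([name for name, _ in ctx.decisions])
```

The value of the pattern is that quality-assurance reporting needs no extra plumbing: every decision is already in the context when the report is rendered.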

  20. Large-Deviation Results for Discriminant Statistics of Gaussian Locally Stationary Processes

    Directory of Open Access Journals (Sweden)

    Junichi Hirukawa

    2012-01-01

    This paper discusses the large-deviation principle of discriminant statistics for Gaussian locally stationary processes. First, large-deviation theorems for quadratic forms and the log-likelihood ratio for a Gaussian locally stationary process with a mean function are proved; their asymptotics are described by large-deviation rate functions. Second, we consider situations where the processes are misspecified as stationary. In these misspecified cases, we formally construct the log-likelihood ratio discriminant statistics and derive large-deviation theorems for them. Since these are complicated, they are evaluated and illustrated by numerical examples. We find that misspecifying the process as stationary seriously affects our discrimination.
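As background, a sequence of statistics T_n satisfies a large-deviation principle with rate function I when, informally,

```latex
\Pr(T_n \in A) \asymp \exp\!\Bigl(-n \inf_{x \in A} I(x)\Bigr), \qquad n \to \infty,
```

with the precise statement giving a limsup upper bound over closed sets and a liminf lower bound over open sets. The rate functions in the paper quantify how fast the misclassification probabilities of the discriminant statistics decay.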

  1. A KPI-based process monitoring and fault detection framework for large-scale processes.

    Science.gov (United States)

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-05-01

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches have not been developed within a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least-squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrumental variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
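The static-case idea can be sketched with ordinary least squares (assuming NumPy; this illustrates only the residual-based detection logic, not the paper's full PM-FD framework): regress the KPI on the process variables using fault-free data, then flag samples whose residual exceeds a 3-sigma threshold.

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 3))                     # process variables
beta_true = np.array([1.0, -2.0, 0.5])            # invented ground truth
y = X @ beta_true + 0.05 * rng.normal(size=200)   # KPI, fault-free training data

beta, *_ = np.linalg.lstsq(X, y, rcond=None)      # least-squares KPI model
resid = y - X @ beta
threshold = 3.0 * resid.std()                     # 3-sigma detection threshold

x_new = rng.normal(size=3)
fault_free = x_new @ beta_true + 0.05 * rng.normal()
faulty = fault_free + 1.0                         # inject a KPI-relevant fault

print(abs(fault_free - x_new @ beta) > threshold,
      abs(faulty - x_new @ beta) > threshold)
```

The instrumental-variable step in the paper serves to bring the dynamic case into exactly this static regression form.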

  2. Drell–Yan process at Large Hadron Collider

    Indian Academy of Sciences (India)

    the Drell–Yan process [1] first studied with muon final states. In Standard … Two large-statistics sets of signal events, based on the value of the dimuon invariant mass, … quality control criteria are applied to this globally reconstructed muon.

  3. Identification of low order models for large scale processes

    NARCIS (Netherlands)

    Wattamwar, S.K.

    2010-01-01

    Many industrial chemical processes are complex, multi-phase and large scale in nature. These processes are characterized by various nonlinear physiochemical effects and fluid flows. Such processes often show coexistence of fast and slow dynamics during their time evolutions. The increasing demand

  4. Remaining useful life prediction based on variation coefficient consistency test of a Wiener process

    Directory of Open Access Journals (Sweden)

    Juan LI

    2018-01-01

    High-cost equipment is often reused after maintenance, and whether the information from before the maintenance can be used for Remaining Useful Life (RUL) prediction after the maintenance is directly determined by the consistency of the degradation pattern before and after the maintenance. To address this problem, an RUL prediction method based on a consistency test of a Wiener process is proposed. Firstly, the parameters of the Wiener process estimated by Maximum Likelihood Estimation (MLE) are proved to be biased, and a modified unbiased estimation method is proposed and verified by derivation and simulations. Then, the h statistic is constructed from the reciprocal of the variation coefficient of the Wiener process, and its sampling distribution is derived. Meanwhile, a universal method for the consistency test is proposed based on the sampling distribution theorem, which is verified by simulation data and classical crack degradation data. Finally, based on the consistency test of the degradation model, a weighted-fusion RUL prediction method is presented for the fuel pump of an airplane, and the validity of the presented method is verified by accurate computation results on real data, which provides theoretical and practical guidance for engineers to predict the RUL of equipment after maintenance.
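The kind of bias the abstract refers to can be seen in the plainest case: Wiener-process increments are i.i.d. normal, and the MLE of their variance divides the sum of squared deviations by n, which underestimates the true variance; dividing by n-1 removes the bias. A toy simulation (this illustrates the generic normal-variance bias, not the paper's specific corrected estimator):

```python
import random

random.seed(7)
lam, sig, dt, n = 0.3, 0.5, 1.0, 8   # small sample size makes the bias visible
true_var = sig ** 2 * dt             # variance of each increment: 0.25

mle_sum = unb_sum = 0.0
trials = 4000
for _ in range(trials):
    incs = [lam * dt + sig * dt ** 0.5 * random.gauss(0, 1) for _ in range(n)]
    m = sum(incs) / n
    ss = sum((d - m) ** 2 for d in incs)
    mle_sum += ss / n                # MLE: divides by n, biased low
    unb_sum += ss / (n - 1)          # corrected: divides by n - 1, unbiased

print(round(mle_sum / trials, 3), round(unb_sum / trials, 3), true_var)
```

The averaged MLE lands near true_var*(n-1)/n while the corrected estimator averages near true_var, which is the pattern the paper's derivation and simulations establish for the full Wiener-process parameter set.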

  5. Fish Remains from Excavations near the Riverfront at Newcastle upon Tyne, England

    Directory of Open Access Journals (Sweden)

    Rebecca A. Nicholson

    1997-12-01

    The City of Newcastle, situated some 10 miles inland on the River Tyne in north-east England, is not now an important fishing port. Most of the fresh fish marketed in the city has been landed at the nearby coastal ports of North and South Shields. Excavations at two sites behind the present Quayside in Newcastle, however, have yielded quantities of fish bones representing a wide variety of species. This is in contrast to excavations in other parts of the city, where few fish remains have been recovered, and suggests that the quayside in Newcastle was an important centre for the fishing industry during the medieval period. It seems likely that most of the fish remains represent waste from landing and processing fish on or near the quayside. Yet, when taphonomic factors are taken into account, the limitations of using even large bone assemblages to interpret processing activities are demonstrated. As always, the need for a programme of on-site sieving to obtain representative samples of fish bone is evident.

  6. Informational support of the investment process in a large city economy

    Directory of Open Access Journals (Sweden)

    Tamara Zurabovna Chargazia

    2016-12-01

    Full Text Available Large cities possess sufficient potential to participate in investment processes at both the national and international levels. A potential investor’s awareness of the possibilities and prospects of a city’s development is of great importance in making a decision. By providing a potential investor with relevant, concise and reliable information, the local authorities increase the intensity of the investment process in the city economy, and vice versa. The hypothesis is that a large city administration can substantially activate investment processes in the economy of the corresponding territorial entity using information-provision tools. The purpose of this article is to develop measures for improving the investment portal of a large city as an important instrument of information provision, making it possible to stimulate investment processes at the level under analysis. The reasons for unsatisfactory information provision on the investment process in a large city economy are analyzed in depth; national and international experience in this sphere is studied; the advantages and disadvantages of information provision for the investment process in the economy of the city of Makeyevka are considered; and the investment portals of different cities are compared. Technical approaches for improving the investment portal of a large city are suggested. The research results can be used to improve the investment policy of large cities.

  7. [Dual process in large number estimation under uncertainty].

    Science.gov (United States)

    Matsumuro, Miki; Miwa, Kazuhisa; Terai, Hitoshi; Yamada, Kento

    2016-08-01

    According to dual process theory, there are two systems in the mind: an intuitive, automatic System 1 and a logical, effortful System 2. While many previous studies of number estimation have focused on simple heuristics and automatic processes, the deliberative System 2 process has not been sufficiently studied. This study focused on the System 2 process in large number estimation. First, we described an estimation process based on participants’ verbal reports. The task, corresponding to a problem-solving process, consisted of creating subgoals, retrieving values, and applying operations. Second, we investigated the influence of this deliberative System 2 process on intuitive System 1 estimation, using anchoring effects. The results of the experiment showed that the System 2 process could mitigate anchoring effects.

  8. Data-aware remaining time prediction of business process instances

    NARCIS (Netherlands)

    Polato, M.; Sperduti, A.; Burattin, A.; Leoni, de M.

    2014-01-01

    Accurate prediction of the completion time of a business process instance would constitute a valuable tool when managing processes under service level agreement constraints. Such prediction, however, is a very challenging task. A wide variety of factors could influence the trend of a process

  9. Large wood mobility processes in low-order Chilean river channels

    Science.gov (United States)

    Iroumé, Andrés; Mao, Luca; Andreoli, Andrea; Ulloa, Héctor; Ardiles, María Paz

    2015-01-01

    Large wood (LW) mobility was studied over several time periods in channel segments of four low-order mountain streams in southern Chile. All wood pieces found within the bankfull channels, and on the streambanks extending into the channel, with dimensions of more than 10 cm in diameter and 1 m in length were measured and their positions referenced. Thirty-six percent of the measured wood pieces were tagged to investigate log mobility. All segments were first surveyed in summer and then after consecutive rainy winter periods. Annual LW mobility ranged between 0 and 28%. Eighty-four percent of the moved LW had diameters ≤ 40 cm and 92% had lengths ≤ 7 m. Large wood mobility was higher in periods when the maximum water level (Hmax) exceeded the channel bankfull depth (HBk) than in periods with flows below HBk, but the difference was not statistically significant. Dimensions of moved LW showed no significant differences between periods with flows exceeding and below bankfull stage. Statistically significant relationships were found between annual LW mobility (%) and unit stream power (for Hmax) and Hmax/HBk. The mean diameter of transported wood pieces per period was significantly correlated with unit stream power for H15% and H50% (the levels above which the flow remains for 15% and 50% of the time, respectively). These results contribute to an understanding of the complexity of LW mobilization processes in mountain streams and can be used to assess and prevent potential damage caused by LW mobilization during floods.
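The significant relationships reported above are simple bivariate ones (mobility versus unit stream power), which can be illustrated with a correlation and least-squares fit. All numbers below are invented for the sketch and are not the paper's data.

```python
import numpy as np

# hypothetical per-period observations for one channel segment
unit_stream_power = np.array([12.0, 30.0, 55.0, 80.0, 140.0])  # W m^-2 at Hmax
lw_mobility_pct = np.array([0.0, 4.0, 11.0, 16.0, 28.0])       # % of tagged logs moved

# Pearson correlation and a least-squares line, the kind of relationship
# reported between annual LW mobility and unit stream power
r = np.corrcoef(unit_stream_power, lw_mobility_pct)[0, 1]
slope, intercept = np.polyfit(unit_stream_power, lw_mobility_pct, 1)
```

With real survey data one would also test the significance of r, since only five periods give very few degrees of freedom.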

  10. Storage process of large solid radioactive wastes

    International Nuclear Information System (INIS)

    Morin, Bruno; Thiery, Daniel.

    1976-01-01

    Process for the storage of large size solid radioactive waste, consisting of contaminated objects such as cartridge filters, metal swarf, tools, etc, whereby such waste is incorporated in a thermohardening resin at room temperature, after prior addition of at least one inert charge to the resin. Cross-linking of the resin is then brought about [fr

  11. Evaluating the impact of water processing on wood charcoal remains: Tell Qarassa North, a case study

    DEFF Research Database (Denmark)

    Otaegui, Amaia Arranz; Zapata, Lydia; Colledge, Sue

    .5 l) were recovered. The aim of the work is to evaluate whether water processing affects all taxa similarly or whether, instead, differences exist in the preservation of certain types of remains. To evaluate this, taxonomic and taphonomic analyses were carried out, including the recording of alterations... the taxa present at the site. The results presented here warn against straightforward interpretations of wood charcoal frequencies in terms of the original composition of past vegetation, and suggest that it would be advisable to use more than one recovery technique, along with recording of different types...

  12. Formation of Large-scale Coronal Loops Interconnecting Two Active Regions through Gradual Magnetic Reconnection and an Associated Heating Process

    Science.gov (United States)

    Du, Guohui; Chen, Yao; Zhu, Chunming; Liu, Chang; Ge, Lili; Wang, Bing; Li, Chuanyang; Wang, Haimin

    2018-06-01

    Coronal loops interconnecting two active regions (ARs), called interconnecting loops (ILs), are prominent large-scale structures in the solar atmosphere. They carry a significant amount of magnetic flux and therefore are considered to be an important element of the solar dynamo process. Earlier observations showed that eruptions of ILs are an important source of CMEs. It is generally believed that ILs are formed through magnetic reconnection in the high corona (>150″–200″), and several scenarios have been proposed to explain their brightening in soft X-rays (SXRs). However, the detailed IL formation process has not been fully explored, and the associated energy release in the corona still remains unresolved. Here, we report the complete formation process of a set of ILs connecting two nearby ARs, with successive observations by STEREO-A on the far side of the Sun and by SDO and Hinode on the Earth side. We conclude that ILs are formed by gradual reconnection high in the corona, in line with earlier postulations. In addition, we show evidence that ILs brighten in SXRs and EUVs through heating at or close to the reconnection site in the corona (i.e., through the direct heating process of reconnection), a process that has been largely overlooked in earlier studies of ILs.

  13. Molecular genetic identification of skeletal remains of apartheid ...

    African Journals Online (AJOL)

    The Truth and Reconciliation Commission made significant progress in examining abuses committed during the apartheid era in South Africa. Despite information revealed by the commission, a large number of individuals remained missing when the commission closed its proceedings. This provided the impetus for the ...

  14. Large 3D resistivity and induced polarization acquisition using the Fullwaver system: towards an adapted processing methodology

    Science.gov (United States)

    Truffert, Catherine; Leite, Orlando; Gance, Julien; Texier, Benoît; Bernard, Jean

    2017-04-01

    Driven by the mineral exploration market's need for ever faster and easier set-up of large 3D resistivity and induced polarization surveys, autonomous, cableless recording systems have come to the forefront. In contrast to traditional centralized acquisition, this new system permits a completely random distribution of receivers over the survey area, allowing true 3D imaging. This work presents the results of a 3 km2 experiment, to a depth of 600 m, performed with a new type of autonomous distributed receiver: the I&V-Fullwaver. With such a system, the usual drawbacks of long cable set-ups over large 3D areas (time consumption, lack of accessibility, heavy weight, electromagnetic induction, etc.) disappear. The V-Fullwavers record the entire voltage time series on two perpendicular axes, allowing good assessment of data quality, while the I-Fullwaver simultaneously records the injected current. For this survey, despite good quality assessment of each individual signal on each channel of the Fullwaver systems, a significant number of negative apparent resistivities and chargeabilities remain in the dataset (around 15%). These values are commonly excluded from inversion software, although they may be due to complex geological structures of interest (e.g. linked to the presence of sulfides in the earth). Given that such a distributed recording system aims to produce the best 3D resistivity and IP tomography, how can 3D inversion be improved? In this work, we present the dataset, the processing chain and the quality control of a large 3D survey. We show that the quality of the selected data is good enough to include it in the inversion processing. We propose a second processing approach, based on the modulus of the apparent resistivity, that stabilizes the inversion, and we discuss the results of both approaches. We conclude that an effort could be made on the inclusion of negative apparent resistivity in the inversion.
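The two processing choices discussed (discarding negative apparent resistivities versus inverting their modulus) amount to a simple pre-inversion filtering step. The sketch below uses invented readings to illustrate both options and a negative fraction of the order of the ~15% mentioned above; it is not the survey's processing chain.

```python
import numpy as np

# hypothetical apparent resistivity readings (ohm.m); negatives do occur in
# distributed IP/resistivity surveys and are often rejected before inversion
rho_app = np.array([120.0, -35.0, 80.0, 210.0, -5.0, 60.0, 145.0])

neg_fraction = float(np.mean(rho_app < 0))

# option 1: conventional processing discards the negative values
rho_kept = rho_app[rho_app > 0]

# option 2: invert the modulus, keeping every datum in the inversion
rho_modulus = np.abs(rho_app)
```

Option 2 keeps the full data coverage at the cost of discarding the sign information that may itself be diagnostic of complex structure.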

  15. Towards Portable Large-Scale Image Processing with High-Performance Computing.

    Science.gov (United States)

    Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A

    2018-05-03

    High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT (DAX) pipeline automation tool, and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly half a million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), the HPC facility at Vanderbilt University. The initial deployment was native (i.e., direct installations on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with different hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein, we describe recent innovations using containerization techniques with XNAT/DAX which isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from system level to the application level, (2) flexible and dynamic software

  16. Recent Successes and Remaining Challenges in Predicting Phosphorus Loading to Surface Waters at Large Scales

    Science.gov (United States)

    Harrison, J.; Metson, G.; Beusen, A.

    2017-12-01

    Over the past century humans have greatly accelerated phosphorus (P) flows from land to aquatic ecosystems, causing eutrophication and associated effects such as harmful algal blooms and hypoxia. Effectively addressing this challenge requires understanding the geographic and temporal distribution of aquatic P loading, knowledge of the major controls on P loading, and the relative importance of various potential P sources. The Global NEWS (Nutrient Export from WaterSheds) model, and recent improvements and extensions of this modeling system, can be used to generate this understanding. This presentation will focus on the insights global NEWS models grant into past, present, and potential future P sources and sinks, with a focus on the world's large rivers. Early results suggest: 1) that while aquatic P loading is globally dominated by particulate forms, dissolved P can be locally dominant; 2) that P loading has increased substantially at the global scale, but unevenly between world regions, with hotspots in South and East Asia; 3) that P loading is likely to continue to increase globally, but decrease in certain regions that are actively pursuing proactive P management; and 4) that point sources, especially in urban centers, play an important (even dominant) role in determining loads of dissolved inorganic P. Despite these insights, substantial unexplained variance remains when model predictions and measurements are compared at global and regional scales, for example within the U.S. Disagreements between model predictions and measurements suggest opportunities for model improvement. In particular, explicit inclusion of soil characteristics and the concept of temporal P legacies in future iterations of NEWS (and other) models may help improve correspondence between models and measurements.

  17. RESOURCE SAVING TECHNOLOGICAL PROCESS OF LARGE-SIZE DIE THERMAL TREATMENT

    Directory of Open Access Journals (Sweden)

    L. A. Glazkov

    2009-01-01

    Full Text Available This paper presents the development of a technological process for hardening large-size die steel parts. The proposed process uses a water-air mixture instead of the conventional hardening medium, industrial oil. Developing this new technological process required solving the following problems: reducing the duration of thermal treatment, reducing the consumption of power resources (natural gas and mineral oil), eliminating fire hazards and increasing the ecological efficiency of the process.

  18. Valid knowledge for the professional design of large and complex design processes

    NARCIS (Netherlands)

    Aken, van J.E.

    2004-01-01

    The organization and planning of design processes, which we may regard as design process design, is an important issue. Especially for large and complex design processes, traditional approaches to process design may no longer suffice. The design literature offers a number of design process models. As

  19. Visual analysis of inter-process communication for large-scale parallel computing.

    Science.gov (United States)

    Muelder, Chris; Gygi, Francois; Ma, Kwan-Liu

    2009-01-01

    In serial computation, program profiling is often helpful for optimization of key sections of code. When moving to parallel computation, not only does the code execution need to be considered but also communication between the different processes which can induce delays that are detrimental to performance. As the number of processes increases, so does the impact of the communication delays on performance. For large-scale parallel applications, it is critical to understand how the communication impacts performance in order to make the code more efficient. There are several tools available for visualizing program execution and communications on parallel systems. These tools generally provide either views which statistically summarize the entire program execution or process-centric views. However, process-centric visualizations do not scale well as the number of processes gets very large. In particular, the most common representation of parallel processes is a Gantt chart with a row for each process. As the number of processes increases, these charts can become difficult to work with and can even exceed screen resolution. We propose a new visualization approach that affords more scalability and then demonstrate it on systems running with up to 16,384 processes.

  20. Nonterrestrial material processing and manufacturing of large space systems

    Science.gov (United States)

    Von Tiesenhausen, G.

    1979-01-01

    Nonterrestrial processing of materials and manufacturing of large space system components from preprocessed lunar materials at a manufacturing site in space is described. Lunar materials mined and preprocessed at the lunar resource complex will be flown to the space manufacturing facility (SMF), where together with supplementary terrestrial materials, they will be final processed and fabricated into space communication systems, solar cell blankets, radio frequency generators, and electrical equipment. Satellite Power System (SPS) material requirements and lunar material availability and utilization are detailed, and the SMF processing, refining, fabricating facilities, material flow and manpower requirements are described.

  1. Data-driven process decomposition and robust online distributed modelling for large-scale processes

    Science.gov (United States)

    Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou

    2018-02-01

    With increasing attention to networked control, system decomposition and distributed models are of significant importance in the implementation of model-based control strategies. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm is proposed for large-scale chemical processes. The key controlled variables are first partitioned by an affinity propagation clustering algorithm into several clusters, each of which can be regarded as a subsystem. The inputs of each subsystem are then selected by offline canonical correlation analysis between all process variables and the subsystem's controlled variables. Process decomposition is realised after this screening of input and output variables. Once the system decomposition is finished, online subsystem modelling can be carried out by recursively renewing the samples block-wise. The proposed algorithm was applied to the Tennessee Eastman process and its validity was verified.
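The two-stage decomposition described above can be sketched with a numpy-only stand-in: a correlation-threshold grouping takes the place of affinity propagation, and the strongest input-output correlation takes the place of the canonical correlation screening. All data, variable roles and the 0.5 threshold are synthetic assumptions for the illustration, not the paper's algorithm or plant data.

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic plant data: 300 samples, 6 controlled variables, 4 candidate inputs;
# variables 0-2 share one latent factor, variables 3-5 share another
common_a = rng.normal(size=(300, 1))
common_b = rng.normal(size=(300, 1))
Y = np.hstack([common_a + 0.3 * rng.normal(size=(300, 3)),
               common_b + 0.3 * rng.normal(size=(300, 3))])
X = rng.normal(size=(300, 4))
X[:, 0] = Y[:, 0] + 0.2 * rng.normal(size=300)   # input 0 drives subsystem A

# stage 1 (stand-in for affinity propagation): group controlled variables
# whose pairwise correlation exceeds a threshold
C = np.corrcoef(Y.T)
groups, unassigned = [], set(range(Y.shape[1]))
while unassigned:
    seed = min(unassigned)
    grp = {j for j in unassigned if abs(C[seed, j]) > 0.5}
    groups.append(sorted(grp))
    unassigned -= grp

# stage 2 (stand-in for CCA screening): rank candidate inputs for subsystem 0
# by their strongest correlation with any of its controlled variables
sub = groups[0]
scores = [max(abs(np.corrcoef(X[:, j], Y[:, k])[0, 1]) for k in sub)
          for j in range(X.shape[1])]
best_input = int(np.argmax(scores))
```

In this toy setting the grouping recovers the two latent subsystems and ranks the driving input first; real CCA additionally handles multivariate input-output directions that a single pairwise correlation cannot.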

  2. QCD phenomenology of the large P/sub T/ processes

    International Nuclear Information System (INIS)

    Stroynowski, R.

    1979-11-01

    Quantum Chromodynamics (QCD) provides a framework for the possible high-accuracy calculations of the large-p/sub T/ processes. The description of the large-transverse-momentum phenomena is introduced in terms of the parton model, and the modifications expected from QCD are described by using as an example single-particle distributions. The present status of available data (π, K, p, p-bar, eta, particle ratios, beam ratios, direct photons, nuclear target dependence), the evidence for jets, and the future prospects are reviewed. 80 references, 33 figures, 3 tables

  3. Combined process automation for large-scale EEG analysis.

    Science.gov (United States)

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
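The five linked steps of the abstract's pipeline can be mirrored in a minimal numpy sketch. The function names, the synthetic trace, and the FFT-masking "filter" are stand-ins chosen for brevity, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

def isolate_intervals(eeg, stim_idx):
    """Step 1: split the trace into pre- and post-stimulation segments."""
    return eeg[:stim_idx], eeg[stim_idx:]

def band_waveform(seg, lo, hi, fs):
    """Step 2: crude band extraction via FFT masking (stand-in for a real filter)."""
    freqs = np.fft.rfftfreq(len(seg), 1.0 / fs)
    spec = np.fft.rfft(seg)
    spec[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spec, n=len(seg))

def detect_spikes(seg, k=4.0):
    """Steps 3-4: indices where the signal crosses a robust amplitude threshold."""
    thr = k * np.median(np.abs(seg)) / 0.6745   # robust noise-scale estimate
    return np.flatnonzero(np.abs(seg) > thr)

def psd(seg, fs):
    """Step 5: periodogram power spectral density."""
    power = np.abs(np.fft.rfft(seg)) ** 2 / (fs * len(seg))
    return np.fft.rfftfreq(len(seg), 1.0 / fs), power

fs = 500.0
t = np.arange(0, 4, 1 / fs)                       # 4 s synthetic trace
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.normal(size=t.size)
eeg[1500] += 8.0                                  # one injected "spike"

pre, post = isolate_intervals(eeg, 1000)          # stimulation at sample 1000
alpha = band_waveform(post, 8.0, 12.0, fs)        # user-defined band (8-12 Hz)
spikes = detect_spikes(post)                      # finds the injected event
freqs, power = psd(post, fs)
```

Chaining the steps as plain functions is what makes the batch automation possible: each recording flows through the same sequence without manual intervention between stages.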

  4. Process variations in surface nano geometries manufacture on large area substrates

    DEFF Research Database (Denmark)

    Calaon, Matteo; Hansen, Hans Nørgaard; Tosello, Guido

    2014-01-01

    The need to transport, treat and measure increasingly smaller biomedical samples has pushed the integration of a far-reaching number of nanofeatures over large substrate sizes, relative to the working-area windows of conventional processes. Dimensional stability of nano fabrication processes

  5. Comparison of decomposition rates between autopsied and non-autopsied human remains.

    Science.gov (United States)

    Bates, Lennon N; Wescott, Daniel J

    2016-04-01

    Penetrating trauma has been cited as a significant factor in the rate of decomposition. Therefore, penetrating trauma may affect estimations of time-since-death in medicolegal investigations and research examining decomposition rates and processes when autopsied human bodies are used. The goal of this study was to determine whether there are differences in the rate of decomposition between autopsied and non-autopsied human remains in the same environment. The purpose is to shed light on how large incisions, such as those from a thoracoabdominal autopsy, affect time-since-death estimations and research on decomposition rates that uses both autopsied and non-autopsied human remains. In this study, 59 non-autopsied and 24 autopsied bodies were studied. The number of accumulated degree days (ADD) required to reach each decomposition stage was then compared between autopsied and non-autopsied remains. Additionally, both types of bodies were examined for seasonal differences in decomposition rates. As temperature affects the rate of decomposition, this study also compared the internal body temperatures of autopsied and non-autopsied remains to see if differences between the two may be leading to differential decomposition. For this portion of the study, eight non-autopsied and five autopsied bodies were investigated; internal temperature was collected once a day for two weeks. The results showed that differences in the decomposition rate between autopsied and non-autopsied remains were not statistically significant, though the average ADD needed to reach each stage of decomposition was slightly lower for autopsied bodies than for non-autopsied bodies. There was also no significant difference between autopsied and non-autopsied bodies in the rate of decomposition by season or in internal temperature. Therefore, this study suggests that it is unnecessary to separate autopsied and non-autopsied remains when studying gross stages of human decomposition in Central Texas.
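The accumulated-degree-days comparison above reduces to a simple running sum plus a two-sample test. The sketch below uses invented temperatures and group values (not the study's data) and a hand-rolled Welch t statistic so that no external statistics library is assumed.

```python
import numpy as np

# hypothetical daily mean temperatures (deg C) for one week
daily_temp = np.array([22.0, 25.5, 27.0, 24.0, 26.5, 28.0, 23.5])

def accumulated_degree_days(temps, base=0.0):
    """ADD: sum of daily mean temperatures above a base threshold."""
    return np.clip(temps - base, 0.0, None).sum()

add_total = accumulated_degree_days(daily_temp)

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

# hypothetical ADD required to reach "advanced decay" in each group
autopsied = [385, 402, 371, 398, 390]
non_autopsied = [401, 395, 410, 388, 405]
t_stat = welch_t(autopsied, non_autopsied)   # slightly negative: autopsied faster
```

A negative t statistic here mirrors the study's direction of effect (slightly lower ADD for autopsied bodies); whether it is significant would depend on the degrees of freedom and the chosen alpha.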

  6. Broadband Reflective Coating Process for Large FUVOIR Mirrors, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ZeCoat Corporation will develop and demonstrate a set of revolutionary coating processes for making broadband reflective coatings suitable for very large mirrors (4+...

  7. A practical process for light-water detritiation at large scales

    Energy Technology Data Exchange (ETDEWEB)

    Boniface, H.A. [Atomic Energy of Canada Limited, Chalk River, ON (Canada); Robinson, J., E-mail: jr@tyne-engineering.com [Tyne Engineering, Burlington, ON (Canada); Gnanapragasam, N.V.; Castillo, I.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    AECL and Tyne Engineering have recently completed a preliminary engineering design for a modest-scale tritium removal plant for light water, intended for installation at AECL's Chalk River Laboratories (CRL). This plant design was based on the Combined Electrolysis and Catalytic Exchange (CECE) technology developed at CRL over many years and demonstrated there and elsewhere. The general features and capabilities of this design have been reported as well as the versatility of the design for separating any pair of the three hydrogen isotopes. The same CECE technology could be applied directly to very large-scale wastewater detritiation, such as the case at Fukushima Daiichi Nuclear Power Station. However, since the CECE process scales linearly with throughput, the required capital and operating costs are substantial for such large-scale applications. This paper discusses some options for reducing the costs of very large-scale detritiation. Options include: Reducing tritium removal effectiveness; Energy recovery; Improving the tolerance of impurities; Use of less expensive or more efficient equipment. A brief comparison with alternative processes is also presented. (author)

  8. Processing and properties of large grain (RE)BCO

    International Nuclear Information System (INIS)

    Cardwell, D.A.

    1998-01-01

    The potential of high temperature superconductors to generate large magnetic fields and to carry current with low power dissipation at 77 K is particularly attractive for a variety of permanent magnet applications. As a result, large grain bulk (RE)-Ba-Cu-O ((RE)BCO) materials have been developed by melt process techniques in an attempt to fabricate practical materials for use in high field devices. This review outlines the current state of the art in this field of processing, including seeding requirements for the controlled fabrication of these materials and the origin of striking growth features such as the formation of a facet plane around the seed, platelet boundaries and (RE) 2 BaCuO 5 (RE-211) inclusions in the seeded melt grown microstructure. An observed variation in critical current density in large grain (RE)BCO samples is accounted for by Sm contamination of the material in the vicinity of the seed and by the development of a non-uniform growth morphology at ∼4 mm from the seed position. (RE)Ba 2 Cu 3 O 7-δ (RE-123) dendrites are observed to form and broaden preferentially within the a/b plane of the lattice in this growth regime. Finally, trapped fields in excess of 3 T have been reported in irradiated U-doped YBCO, and (RE) 1+x Ba 2-x Cu 3 O y (RE=Sm, Nd) materials have been observed to carry transport current in fields of up to 10 T at 77 K. This underlines the potential of bulk (RE)BCO materials for practical permanent magnet type applications. (orig.)

  9. Evaluating cloud processes in large-scale models: Of idealized case studies, parameterization testbeds and single-column modelling on climate time-scales

    Science.gov (United States)

    Neggers, Roel

    2016-04-01

    Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process-level". This means that parameterized physics are studied in isolated mode from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach is demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A red line in the results of these studies is that although most schemes somehow manage to capture first-order aspects of boundary layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting forecast skills of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models. 
This includes i) idealized case studies, ii) longer-term evaluation at permanent meteorological sites (the testbed approach

  10. Constructing large scale SCI-based processing systems by switch elements

    International Nuclear Information System (INIS)

    Wu, B.; Kristiansen, E.; Skaali, B.; Bogaerts, A.; Divia, R.; Mueller, H.

    1993-05-01

    The goal of this paper is to study some of the design criteria for the switch elements that form the interconnection of large scale SCI-based processing systems. The approved IEEE standard 1596 makes it possible to couple up to 64K nodes together. In order to connect thousands of nodes to construct large scale SCI-based processing systems, these nodes must be interconnected by switch elements to form different topologies. A summary of the requirements and key points of interconnection networks and switches is presented. Two models of the SCI switch elements are proposed. The authors investigate, through simulations, several examples of systems constructed from 4-switches, and the results are analyzed. Some issues and enhancements are discussed to provide the ideas behind the switch design that can improve performance and reduce latency. 29 refs., 11 figs., 3 tabs

  11. Forensic considerations when dealing with incinerated human dental remains.

    Science.gov (United States)

    Reesu, Gowri Vijay; Augustine, Jeyaseelan; Urs, Aadithya B

    2015-01-01

    Establishing the human dental identification process relies upon sufficient post-mortem data being recovered to allow for a meaningful comparison with ante-mortem records of the deceased person. Teeth are the most indestructible components of the human body and are structurally unique in their composition. They possess the highest resistance to most environmental effects like fire, desiccation, decomposition and prolonged immersion. In most natural as well as man-made disasters, teeth may provide the only means of positive identification of an otherwise unrecognizable body. It is imperative that dental evidence should not be destroyed through erroneous handling until appropriate radiographs, photographs, or impressions can be fabricated. Proper methods of physical stabilization of incinerated human dental remains should be followed. The maintenance of integrity of extremely fragile structures is crucial to the successful confirmation of identity. In such situations, the forensic dentist must stabilise these teeth before the fragile remains are transported to the mortuary to ensure preservation of possibly vital identification evidence. Thus, while dealing with any incinerated dental remains, a systematic approach must be followed through each stage of evaluation of incinerated dental remains to prevent the loss of potential dental evidence. This paper presents a composite review of various studies on incinerated human dental remains and discusses their impact on the process of human identification and suggests a step by step approach. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  12. Feasibility of large volume casting cementation process for intermediate level radioactive waste

    International Nuclear Information System (INIS)

    Chen Zhuying; Chen Baisong; Zeng Jishu; Yu Chengze

    1988-01-01

    Recent trends in radioactive waste treatment and disposal, both in China and abroad, are reviewed. The feasibility of the large volume casting cementation process for treating intermediate level radioactive waste from a spent fuel reprocessing plant and disposing of it in shallow land is assessed on the basis of experimental results (formulation studies, measurements of the properties of the solidified radioactive waste, etc.). It can be concluded that the large volume casting cementation process is a promising, safe and economic process. It is feasible to dispose of the intermediate level radioactive waste from the reprocessing plant if the disposal site chosen has reasonable geological and geographical conditions and some additional effective protection measures are taken

  13. Large-scale membrane transfer process: its application to single-crystal-silicon continuous membrane deformable mirror

    International Nuclear Information System (INIS)

    Wu, Tong; Sasaki, Takashi; Hane, Kazuhiro; Akiyama, Masayuki

    2013-01-01

    This paper describes a large-scale membrane transfer process developed for the construction of large-scale membrane devices via the transfer of continuous single-crystal-silicon membranes from one substrate to another. This technique is applied to fabricating a large-stroke deformable mirror. A bimorph spring array is used to generate a large air gap between the mirror membrane and the electrode. A 1.9 mm × 1.9 mm × 2 µm single-crystal-silicon membrane is successfully transferred to the electrode substrate by Au–Si eutectic bonding and the subsequent all-dry release process. This process provides an effective approach for transferring a free-standing large continuous single-crystal-silicon membrane to a flexible suspension spring array with a large air gap. (paper)

  14. Modelling hydrologic and hydrodynamic processes in basins with large semi-arid wetlands

    Science.gov (United States)

    Fleischmann, Ayan; Siqueira, Vinícius; Paris, Adrien; Collischonn, Walter; Paiva, Rodrigo; Pontes, Paulo; Crétaux, Jean-François; Bergé-Nguyen, Muriel; Biancamaria, Sylvain; Gosset, Marielle; Calmant, Stephane; Tanimoun, Bachir

    2018-06-01

    Hydrological and hydrodynamic models are core tools for simulation of large basins and complex river systems associated with wetlands. Recent studies have pointed towards the importance of online coupling strategies, representing feedbacks between floodplain inundation and vertical hydrology. Especially across semi-arid regions, soil-floodplain interactions can be strong. In this study, we included a two-way coupling scheme in a large-scale hydrological-hydrodynamic model (MGB) and tested different model structures, in order to assess which processes are important to simulate in large semi-arid wetlands and how these processes interact with water budget components. To demonstrate the benefits of this coupling in a validation case, the model was applied to the Upper Niger River basin encompassing the Niger Inner Delta, a vast semi-arid wetland in the Sahel. Simulation was carried out from 1999 to 2014 with daily TMPA 3B42 precipitation as forcing, using both in-situ and remotely sensed data for calibration and validation. Model outputs were in good agreement with discharge and water levels at stations both upstream and downstream of the Inner Delta (Nash-Sutcliffe Efficiency (NSE) >0.6 for most gauges), as well as for flooded areas within the Delta region (NSE = 0.6; r = 0.85). Model estimates of annual water losses across the Delta varied between 20.1 and 30.6 km3/yr, while annual evapotranspiration ranged between 760 mm/yr and 1130 mm/yr. Evaluation of model structure indicated that representation of both floodplain channel hydrodynamics (storage, bifurcations, lateral connections) and vertical hydrological processes (floodplain water infiltration into the soil column; evapotranspiration from soil and vegetation and evaporation of open water) is necessary to correctly simulate flood wave attenuation and evapotranspiration along the basin. Two-way coupled models are necessary to better understand processes in large semi-arid wetlands.
Finally, such coupled
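
    The Nash-Sutcliffe Efficiency used above to score discharge, water levels and flooded areas is simple to compute; the observed/simulated series below are made-up numbers for illustration only:

```python
def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((o - s)^2) / sum((o - mean(o))^2); 1.0 is a perfect fit,
    and values below 0 mean the model is worse than predicting the mean."""
    o_mean = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - o_mean) ** 2 for o in observed)
    return 1.0 - num / den

obs = [10.0, 12.0, 15.0, 13.0, 11.0]   # illustrative discharge observations
sim = [10.5, 11.5, 14.0, 13.5, 11.0]   # illustrative model output
print(round(nash_sutcliffe(obs, sim), 3))
```

    An NSE above 0.6, the level quoted for most gauges in this record, indicates that the simulated series explains well over half of the observed variance.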

  15. Process γ*γ → σ at large virtuality of γ*

    International Nuclear Information System (INIS)

    Volkov, M.K.; Radzhabov, A.E.; Yudichev, V.L.

    2004-01-01

    The process γ*γ → σ is investigated in the framework of the SU(2) x SU(2) chiral NJL model, where γ* and γ are photons with large and small virtuality, respectively, and σ is a scalar meson. The form factor of the process is derived for arbitrary virtuality of γ* in the Euclidean kinematic domain. The asymptotic behavior of this form factor resembles that of the γ*γ → π form factor.

  16. The testing of thermal-mechanical-hydrological-chemical processes using a large block

    International Nuclear Information System (INIS)

    Lin, W.; Wilder, D.G.; Blink, J.A.; Blair, S.C.; Buscheck, T.A.; Chesnut, D.A.; Glassley, W.E.; Lee, K.; Roberts, J.J.

    1994-01-01

    The radioactive decay heat from nuclear waste packages may, depending on the thermal load, create coupled thermal-mechanical-hydrological-chemical (TMHC) processes in the near-field environment of a repository. A group of tests on a large block (LBT) are planned to provide a timely opportunity to test and calibrate some of the TMHC model concepts. The LBT is advantageous for testing and verifying model concepts because the boundary conditions are controlled, and the block can be characterized before and after the experiment. A block of Topopah Spring tuff of about 3 x 3 x 4.5 m will be sawed and isolated at Fran Ridge, Nevada Test Site. Small blocks of the rock adjacent to the large block will be collected for laboratory testing of some individual thermal-mechanical, hydrological, and chemical processes. A constant load of about 4 MPa will be applied to the top and sides of the large block. The sides will be sealed with moisture and thermal barriers. The large block will be heated with one heater in each borehole and guard heaters on the sides so that a dry-out zone and a condensate zone will exist simultaneously. Temperature, moisture content, pore pressure, chemical composition, stress and displacement will be measured throughout the block during the heating and cool-down phases. The results from the experiments on small blocks and the tests on the large block will provide a better understanding of some concepts of the coupled TMHC processes

  17. Processing and properties of large-sized ceramic slabs

    Energy Technology Data Exchange (ETDEWEB)

    Raimondo, M.; Dondi, M.; Zanelli, C.; Guarini, G.; Gozzi, A.; Marani, F.; Fossa, L.

    2010-07-01

    Large-sized ceramic slabs with dimensions up to 360x120 cm{sup 2} and thickness down to 2 mm are manufactured through an innovative ceramic process, starting from porcelain stoneware formulations and involving wet ball milling, spray drying, die-less slow-rate pressing, a single stage of fast drying-firing, and finishing (trimming, assembling of ceramic-fiberglass composites). Fired and unfired industrial slabs were selected and characterized from the technological, compositional (XRF, XRD) and microstructural (SEM) viewpoints. Semi-finished products exhibit a remarkable microstructural uniformity and stability in a rather wide window of firing schedules. The phase composition and compact microstructure of fired slabs are very similar to those of porcelain stoneware tiles. The values of water absorption, bulk density, closed porosity, functional performances as well as mechanical and tribological properties conform to the top quality range of porcelain stoneware tiles. However, the large size coupled with low thickness bestows on the slab a certain degree of flexibility, which is emphasized in ceramic-fiberglass composites. These outstanding performances make the large-sized slabs suitable for novel applications: building and construction (new floorings without dismantling the previous paving, ventilated facades, tunnel coverings, insulating panelling), indoor furniture (table tops, doors), and supports for photovoltaic ceramic panels. (Author) 24 refs.

  18. Processing and properties of large-sized ceramic slabs

    International Nuclear Information System (INIS)

    Raimondo, M.; Dondi, M.; Zanelli, C.; Guarini, G.; Gozzi, A.; Marani, F.; Fossa, L.

    2010-01-01

    Large-sized ceramic slabs with dimensions up to 360x120 cm² and thickness down to 2 mm are manufactured through an innovative ceramic process, starting from porcelain stoneware formulations and involving wet ball milling, spray drying, die-less slow-rate pressing, a single stage of fast drying-firing, and finishing (trimming, assembling of ceramic-fiberglass composites). Fired and unfired industrial slabs were selected and characterized from the technological, compositional (XRF, XRD) and microstructural (SEM) viewpoints. Semi-finished products exhibit a remarkable microstructural uniformity and stability in a rather wide window of firing schedules. The phase composition and compact microstructure of fired slabs are very similar to those of porcelain stoneware tiles. The values of water absorption, bulk density, closed porosity, functional performances as well as mechanical and tribological properties conform to the top quality range of porcelain stoneware tiles. However, the large size coupled with low thickness bestows on the slab a certain degree of flexibility, which is emphasized in ceramic-fiberglass composites. These outstanding performances make the large-sized slabs suitable for novel applications: building and construction (new floorings without dismantling the previous paving, ventilated facades, tunnel coverings, insulating panelling), indoor furniture (table tops, doors), and supports for photovoltaic ceramic panels. (Author) 24 refs.

  19. Study on remain actinides recovery in pyro reprocessing

    International Nuclear Information System (INIS)

    Suharto, Bambang

    1996-01-01

    The reprocessing of spent fuel by a dry process called pyro-reprocessing has been studied. Most of the U, Pu and MA (minor actinides) from the spent fuel will be recovered and fed back to the reactor as new fuel. The accumulated remaining actinides will be separated by an extraction process with a liquid cadmium solvent. The research was conducted by computer simulation to calculate the number of stages required. The calculation results showed that in an extractor of 20 stages, more than 99% of the actinides can be separated. (author)
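
    The 20-stage figure can be put in context with a textbook Kremser-type estimate for a counter-current extractor. This is a back-of-the-envelope sketch, not the record's simulation; the extraction factor of 1.5 is an assumed illustrative value:

```python
def kremser_recovery(extraction_factor, stages):
    """Fraction of solute recovered by an N-stage counter-current extractor
    (Kremser equation); E = distribution coefficient * solvent/feed flow ratio."""
    E = extraction_factor
    if E == 1.0:
        return stages / (stages + 1.0)
    return (E ** (stages + 1) - E) / (E ** (stages + 1) - 1.0)

# Recovery climbs steeply with stage count for any extraction factor above 1
for n in (5, 10, 20):
    print(n, round(kremser_recovery(1.5, n), 4))
```

    With any extraction factor comfortably above 1, recovery in excess of 99% by 20 stages is consistent with this simple estimate.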

  20. Processing graded feedback: electrophysiological correlates of learning from small and large errors.

    Science.gov (United States)

    Luft, Caroline Di Bernardi; Takase, Emilio; Bhattacharya, Joydeep

    2014-05-01

    Feedback processing is important for learning and therefore may affect the consolidation of skills. Considerable research demonstrates electrophysiological differences between correct and incorrect feedback, but how we learn from small versus large errors is usually overlooked. This study investigated electrophysiological differences when processing small or large error feedback during a time estimation task. Data from high-learners and low-learners were analyzed separately. In both high- and low-learners, large error feedback was associated with higher feedback-related negativity (FRN) and small error feedback was associated with a larger P300 and increased amplitude over the motor related areas of the left hemisphere. In addition, small error feedback induced larger desynchronization in the alpha and beta bands with distinctly different topographies between the two learning groups: The high-learners showed a more localized decrease in beta power over the left frontocentral areas, and the low-learners showed a widespread reduction in the alpha power following small error feedback. Furthermore, only the high-learners showed an increase in phase synchronization between the midfrontal and left central areas. Importantly, this synchronization was correlated to how well the participants consolidated the estimation of the time interval. Thus, although large errors were associated with higher FRN, small errors were associated with larger oscillatory responses, which was more evident in the high-learners. Altogether, our results suggest an important role of the motor areas in the processing of error feedback for skill consolidation.

  1. Process automation system for integration and operation of Large Volume Plasma Device

    International Nuclear Information System (INIS)

    Sugandhi, R.; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-01-01

    Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing towards the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in operation, such as access to the diagnostics, manual control of subsystems, and monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing major upgradation for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine’s operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) a Filament Power Supply (FPS) for the heating of the W-filament based plasma source and (2) a Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using

  2. Process automation system for integration and operation of Large Volume Plasma Device

    Energy Technology Data Exchange (ETDEWEB)

    Sugandhi, R., E-mail: ritesh@ipr.res.in; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-11-15

    Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing towards the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in operation, such as access to the diagnostics, manual control of subsystems, and monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing major upgradation for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine’s operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) a Filament Power Supply (FPS) for the heating of the W-filament based plasma source and (2) a Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using
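
    The automation backbone described in this record runs Modbus over 4-wire RS485. As a minimal sketch of the framing involved (the slave address and register numbers are illustrative, not taken from the LVPD configuration), a Modbus RTU read request with its standard CRC-16 can be built as follows:

```python
def crc16_modbus(frame: bytes) -> int:
    """CRC-16/MODBUS: reflected polynomial 0xA001, initial value 0xFFFF."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc

def build_read_request(slave: int, start_reg: int, count: int) -> bytes:
    """Modbus RTU 'read holding registers' (function 0x03) request frame."""
    pdu = bytes([slave, 0x03]) + start_reg.to_bytes(2, "big") + count.to_bytes(2, "big")
    return pdu + crc16_modbus(pdu).to_bytes(2, "little")   # CRC low byte first

# Poll one holding register from slave 1
print(build_read_request(1, 0, 1).hex())
```

    The CRC is appended low byte first, as the Modbus serial-line specification requires; on a multi-drop RS485 bus each subsystem answers only to frames bearing its own slave address.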

  3. On Building and Processing of Large Digitalized Map Archive

    Directory of Open Access Journals (Sweden)

    Milan Simunek

    2011-07-01

    A long list of problems needs to be solved during long-term work on a virtual model of Prague, whose aim is to show the historical development of the city in virtual reality. This paper presents an integrated solution for digitalizing, cataloguing and processing a large number of maps from different periods and from a variety of sources. A specialized GIS software application was developed to allow for fast georeferencing (using an evolutionary algorithm), for cataloguing in an internal database, and subsequently for easy lookup of relevant maps, so that the maps can be processed further to serve as the main input for properly modeling the changing face of the city through time.
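
    Evolutionary georeferencing of the kind mentioned above can be sketched as a simple (1+1) evolution strategy fitting a similarity transform to control-point pairs. The transform model, step-size rule and synthetic control points below are illustrative assumptions, not the paper's algorithm:

```python
import math, random

def similarity(p, s, th, tx, ty):
    """Apply scale s, rotation th, translation (tx, ty) to point p."""
    x, y = p
    return (s * (x * math.cos(th) - y * math.sin(th)) + tx,
            s * (x * math.sin(th) + y * math.cos(th)) + ty)

def fit_georef(pairs, generations=5000, seed=1):
    """(1+1) evolution strategy: mutate the transform, keep improvements."""
    rng = random.Random(seed)
    def mean_error(g):
        return sum(math.dist(similarity(p, *g), q) for p, q in pairs) / len(pairs)
    best = (1.0, 0.0, 0.0, 0.0)            # start from the identity transform
    best_err, step = mean_error(best), 0.5
    for _ in range(generations):
        cand = tuple(v + rng.gauss(0.0, step) for v in best)
        err = mean_error(cand)
        if err < best_err:
            best, best_err, step = cand, err, step * 1.5
        else:
            step = max(step * 0.99, 1e-6)
    return best, best_err

# Synthetic control points: map coordinates against their known true positions
true = (1.2, 0.3, 5.0, -2.0)
pts = [(0, 0), (10, 0), (0, 10), (10, 10), (5, 5)]
pairs = [(p, similarity(p, *true)) for p in pts]
fit, err = fit_georef(pairs)
```

    Because the search is elitist, the fitted error can only decrease from the identity-transform starting point; real georeferencing would use many more control points and a richer (e.g. affine or polynomial) transform.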

  4. A mesh density study for application to large deformation rolling process evaluation

    International Nuclear Information System (INIS)

    Martin, J.A.

    1997-12-01

    When addressing large deformation through an elastic-plastic analysis, the mesh density is paramount in determining the accuracy of the solution. However, given the nonlinear nature of the problem, a highly refined mesh will generally require a prohibitive amount of computer resources. This paper addresses finite element mesh optimization studies, considering accuracy of results and computer resource needs, as applied to large deformation rolling processes. In particular, the simulation of the thread rolling manufacturing process is considered using the MARC software package and a Cray C90 supercomputer. The effects of both mesh density and adaptive meshing on final results are evaluated for both indentation of a rigid body to a specified depth and contact rolling along a predetermined length
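
    The density-versus-accuracy trade-off such studies quantify is often expressed as a grid-convergence check: solve on systematically refined meshes, estimate the observed order of accuracy, and Richardson-extrapolate. A minimal 1-D stand-in (numerical quadrature in place of an elastic-plastic finite element solve) shows the mechanics:

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoid rule on n sub-intervals (the 'mesh')."""
    h = (b - a) / n
    return h * (f(a) / 2 + sum(f(a + i * h) for i in range(1, n)) + f(b) / 2)

# Solve the same problem on three systematically refined meshes
f1, f2, f3 = (trapezoid(math.sin, 0.0, math.pi, n) for n in (8, 16, 32))

p = math.log(abs(f1 - f2) / abs(f2 - f3)) / math.log(2.0)  # observed order
f_extrap = f3 + (f3 - f2) / (2.0 ** p - 1.0)               # Richardson extrapolation
```

    When the observed order matches the method's theoretical order (here, second order), the extrapolated value is far more accurate than the finest-mesh solution alone, which is how one judges whether further refinement is worth the computer resources.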

  5. Large Scale Gaussian Processes for Atmospheric Parameter Retrieval and Cloud Screening

    Science.gov (United States)

    Camps-Valls, G.; Gomez-Chova, L.; Mateo, G.; Laparra, V.; Perez-Suay, A.; Munoz-Mari, J.

    2017-12-01

    Current Earth-observation (EO) applications for image classification have to deal with an unprecedentedly large amount of heterogeneous and complex data sources. Spatio-temporally explicit classification methods are a requirement in a variety of Earth system data processing applications. Upcoming missions such as the super-spectral Copernicus Sentinels, EnMAP and FLEX will soon provide unprecedented data streams. Very high resolution (VHR) sensors like Worldview-3 also pose major challenges to data processing. The challenge is not only attached to optical sensors but also to infrared sounders and radar images, which have increased in spectral, spatial and temporal resolution. Besides, we should not forget the availability of the extremely large remote sensing data archives already collected by several past missions, such as ENVISAT, Cosmo-SkyMED, Landsat, SPOT, or Seviri/MSG. These large-scale data problems require enhanced processing techniques that should be accurate, robust and fast. Standard parameter retrieval and classification algorithms cannot cope with this new scenario efficiently. In this work, we review the field of large-scale kernel methods for both atmospheric parameter retrieval and cloud detection using infrared sounding IASI data and optical Seviri/MSG imagery. We propose novel Gaussian Processes (GPs) to train problems with millions of instances and a large number of input features. The algorithms can cope with non-linearities efficiently, accommodate multi-output problems, and provide confidence intervals for the predictions. Several strategies to speed up the algorithms are devised: random Fourier features and variational approaches for cloud classification using IASI data and Seviri/MSG, and engineered randomized kernel functions and emulation for temperature, moisture and ozone atmospheric profile retrieval from IASI as a proxy to the upcoming MTG-IRS sensor. An excellent compromise between accuracy and scalability is obtained in all applications.
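
    The random Fourier feature strategy mentioned above can be sketched in a few lines: draw random frequencies for an assumed RBF kernel, map the inputs through cosines, and solve a ridge regression on the features, which approximates the GP posterior mean and yields a simple variance estimate. All data and hyperparameters below are synthetic illustrations, not IASI or Seviri/MSG values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D retrieval problem standing in for a sounding-channel regression
X = rng.uniform(-3.0, 3.0, size=(500, 1))
y = np.sin(2.0 * X[:, 0]) + 0.1 * rng.standard_normal(500)

# Random Fourier features approximating an RBF kernel of length-scale ell
ell, D = 0.5, 200
W = rng.standard_normal((1, D)) / ell
b = rng.uniform(0.0, 2.0 * np.pi, D)
Phi = np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Ridge regression on the features approximates the GP posterior mean
lam = 1e-2
A = Phi.T @ Phi + lam * np.eye(D)
w = np.linalg.solve(A, Phi.T @ y)

Xt = np.linspace(-3.0, 3.0, 100)[:, None]
Phit = np.sqrt(2.0 / D) * np.cos(Xt @ W + b)
pred_mean = Phit @ w
# Predictive variance (up to the noise scale) from the weight posterior
pred_var = lam * np.einsum('ij,ji->i', Phit, np.linalg.solve(A, Phit.T))
```

    The cost is linear in the number of training instances for fixed feature dimension D, which is what makes millions of instances tractable compared with the cubic scaling of an exact GP.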

  6. Large-scale methanol plants. [Based on Japanese-developed process]

    Energy Technology Data Exchange (ETDEWEB)

    Tado, Y

    1978-02-01

    A study was made on how to produce methanol economically, methanol being expected to grow as a feedstock for pollution-free energy and for chemical use, centering on the following subjects: (1) improvement of thermal economy, (2) improvement of the process, and (3) hardware problems attending the expansion of scale. The results of this study have already been adopted in actual plants with good results, and large-scale methanol plants are about to be realized.

  7. Decontamination and management of human remains following incidents of hazardous chemical release.

    Science.gov (United States)

    Hauschild, Veronique D; Watson, Annetta; Bock, Robert

    2012-01-01

    To provide specific guidance and resources for systematic and orderly decontamination of human remains resulting from a chemical terrorist attack or accidental chemical release. A detailed review and health-based decision criteria protocol is summarized. Protocol basis and logic are derived from analyses of compound-specific toxicological data and chemical/physical characteristics. Guidance is suitable for civilian or military settings where human remains potentially contaminated with hazardous chemicals may be present, such as sites of transportation accidents, terrorist operations, or medical examiner processing points. Guidance is developed from data-characterizing controlled experiments with laboratory animals, fabrics, and materiel. Logic and specific procedures for decontamination and management of remains, protection of mortuary affairs personnel, and decision criteria to determine when remains are sufficiently decontaminated are presented. Established procedures as well as existing materiel and available equipment for decontamination and verification provide reasonable means to mitigate chemical hazards from chemically exposed remains. Unique scenarios such as those involving supralethal concentrations of certain liquid chemical warfare agents may prove difficult to decontaminate but can be resolved in a timely manner by application of the characterized systematic approaches. Decision criteria and protocols to "clear" decontaminated remains for transport and processing are also provided. Once appropriate decontamination and verification have been accomplished, normal procedures for management of remains and release can be followed.

  8. Large Data at Small Universities: Astronomical processing using a computer classroom

    Science.gov (United States)

    Fuller, Nathaniel James; Clarkson, William I.; Fluharty, Bill; Belanger, Zach; Dage, Kristen

    2016-06-01

    The use of large computing clusters for astronomy research is becoming more commonplace as datasets expand, but access to these required resources is sometimes difficult for research groups working at smaller universities. As an alternative to purchasing processing time on an off-site computing cluster, or purchasing dedicated hardware, we show how one can easily build a crude on-site cluster by utilizing idle cycles on instructional computers in computer-lab classrooms. Since these computers are maintained as part of the educational mission of the university, the resource impact on the investigator is generally low. By using open-source Python routines, it is possible to have a large number of desktop computers working together via a local network to sort through large data sets. By running traditional analysis routines in an “embarrassingly parallel” manner, gains in speed are accomplished without requiring the investigator to learn how to write routines using highly specialized methodology. We demonstrate this concept here applied to (1) photometry of large-format images and (2) statistical significance tests for X-ray light-curve analysis. In these scenarios, we see a speed-up factor which scales almost linearly with the number of cores in the cluster. Additionally, we show that the usage of the cluster does not severely limit performance for a local user, and indeed the processing can be performed while the computers are in use for classroom purposes.
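
    The embarrassingly parallel pattern described above reduces, in its simplest form, to a process pool mapping a per-image routine over a list of jobs. The aperture-photometry stand-in below is a toy illustration, not the authors' pipeline:

```python
import multiprocessing as mp

def aperture_sum(args):
    """Stand-in per-image task: sum pixel values inside a circular aperture."""
    image, (cx, cy), r = args
    total = 0.0
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if (x - cx) ** 2 + (y - cy) ** 2 <= r * r:
                total += value
    return total

if __name__ == "__main__":
    # Eight synthetic flat "images"; in practice these would be image frames
    images = [[[1.0] * 64 for _ in range(64)] for _ in range(8)]
    jobs = [(image, (32, 32), 5) for image in images]
    with mp.Pool() as pool:            # one worker per available core
        fluxes = pool.map(aperture_sum, jobs)
    print(fluxes)
```

    Because each image is processed independently, the same `pool.map` call distributes work across every machine-local core; spreading jobs across several lab machines additionally requires a small network layer, but the per-task routine is unchanged.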

  9. Applicability of vector processing to large-scale nuclear codes

    International Nuclear Information System (INIS)

    Ishiguro, Misako; Harada, Hiroo; Matsuura, Toshihiko; Okuda, Motoi; Ohta, Fumio; Umeya, Makoto.

    1982-03-01

    To meet the growing trend of computational requirements in JAERI, introduction of a high-speed computer with a vector processing facility (a vector processor) is desirable in the near future. To make effective use of a vector processor, appropriate optimization of nuclear codes for the pipelined-vector architecture is vital, which will pose new problems concerning code development and maintenance. In this report, vector processing efficiency is assessed with respect to large-scale nuclear codes by examining the following items: 1) The present feature of the computational load in JAERI is analyzed by compiling computer utilization statistics. 2) Vector processing efficiency is estimated for the ten most heavily used nuclear codes by analyzing their dynamic behavior when run on a scalar machine. 3) Vector processing efficiency is measured for five other nuclear codes by using the current vector processors, FACOM 230-75 APU and CRAY-1. 4) The effectiveness of applying a high-speed vector processor to nuclear codes is evaluated by taking account of the characteristics of JAERI jobs. Problems of vector processors are also discussed from the viewpoints of code performance and ease of use. (author)

  10. Dehydrogenation in large ingot casting process

    International Nuclear Information System (INIS)

    Ubukata, Takashi; Suzuki, Tadashi; Ueda, Sou; Shibata, Takashi

    2009-01-01

    Forged components for nuclear power plants have become larger and larger because, from a safety point of view, the number of weld lines has been decreased. Consequently they are manufactured from ingots of 200 tons or more. Dehydrogenation is one of the key issues in the large-ingot manufacturing process. For ingots of 200 tons or heavier, mold stream degassing (MSD) has been applied for dehydrogenation. Although JSW had developed mold stream degassing by argon (MSD-Ar) as a more effective dehydrogenation practice, MSD-Ar was not applied to these ingots, because conventional refractory materials of the stopper rod for the Ar blowing hole had low durability. In this study, we have developed a new type of stopper rod through modification of both the refractory materials and the stopper rod construction, and have successfully expanded the application range of MSD-Ar up to ingots weighing 330 tons. Compared with conventional MSD, the hydrogen content in ingots after MSD-Ar has decreased by 24 percent because the dehydrogenation rate of MSD-Ar increased by 34 percent. (author)

  11. Controlled elaboration of large-area plasmonic substrates by plasma process

    International Nuclear Information System (INIS)

    Pugliara, A; Despax, B; Makasheva, K; Bonafos, C; Carles, R

    2015-01-01

    Elaboration in a controlled way of large-area and efficient plasmonic substrates is achieved by combining sputtering of silver nanoparticles (AgNPs) and plasma polymerization of the embedding dielectric matrix in an axially asymmetric, capacitively coupled RF discharge maintained at low gas pressure. The plasma parameters and deposition conditions were optimized according to the optical response of these substrates. Structural and optical characterizations of the samples confirm the efficiency of the process. The obtained results indicate that, to deposit a single layer of large and closely spaced AgNPs, a high injected power and short sputtering times should be favored. The plasma-elaborated plasmonic substrates appear to be very sensitive to any stimuli that affect their plasmonic response. (paper)

  12. Extraterrestrial processing and manufacturing of large space systems. Volume 3: Executive summary

    Science.gov (United States)

    Miller, R. H.; Smith, D. B. S.

    1979-01-01

    Facilities and equipment are defined for refining to commercial grade the lunar material that is delivered to a 'space manufacturing facility' in beneficiated, primary-processed quality. The manufacturing facilities and equipment for producing elements of large space systems from these materials are also defined, and programmatic assessments of the concepts are provided. In-space production processes for solar cells (by vapor deposition) and arrays, structures and joints, conduits, waveguides, RF equipment, radiators, wire cables, converters, and others are described.

  13. A methodology for fault diagnosis in large chemical processes and an application to a multistage flash desalination process: Part II

    International Nuclear Information System (INIS)

    Tarifa, Enrique E.; Scenna, Nicolas J.

    1998-01-01

    In Part I, an efficient method for identifying faults in large processes was presented. The whole plant is divided into sectors by using structural, functional, or causal decomposition. A signed directed graph (SDG) is the model used for each sector. The SDG represents interactions among process variables. This qualitative model is used to carry out qualitative simulation for all possible faults. The output of this step is information about the process behaviour. This information is used to build rules. When a symptom is detected in one sector, its rules are evaluated using on-line data and fuzzy logic to yield the diagnosis. In this paper the proposed methodology is applied to a multiple stage flash (MSF) desalination process. This process is composed of sequential flash chambers. It was designed for a pilot plant that produces drinkable water for a community in Argentina; that is, it is a real case. Due to the large number of variables, recycles, phase changes, etc., this process is a good challenge for the proposed diagnosis method
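
    Rule evaluation with fuzzy logic, as used in the diagnosis step above, can be sketched as follows. The membership functions, variable names and fault patterns are invented for illustration and are not the MSF plant's actual rules:

```python
def membership_high(z):
    """Fuzzy 'high' membership for a normalized deviation z = (x - x0)/sigma."""
    return max(0.0, min(1.0, (z - 1.0) / 2.0))   # ramps up between z = 1 and z = 3

def membership_low(z):
    return membership_high(-z)

# Illustrative SDG-derived rules: fault -> expected qualitative symptom pattern
RULES = {
    "heater_fouling": [("brine_temp", "low"), ("steam_flow", "high")],
    "vacuum_leak":    [("chamber_press", "high"), ("distillate_flow", "low")],
}

def diagnose(deviations):
    """Fuzzy AND (min) over each rule's antecedents; faults ranked by belief."""
    mu = {"high": membership_high, "low": membership_low}
    scores = {
        fault: min(mu[sign](deviations.get(var, 0.0)) for var, sign in pattern)
        for fault, pattern in RULES.items()
    }
    return sorted(scores.items(), key=lambda kv: -kv[1])
```

    Grading membership instead of thresholding makes the ranking robust to noisy on-line data: a deviation that is only borderline "high" weakens a rule's score rather than switching it off entirely.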

  14. Microarray Data Processing Techniques for Genome-Scale Network Inference from Large Public Repositories.

    Science.gov (United States)

    Chockalingam, Sriram; Aluru, Maneesha; Aluru, Srinivas

    2016-09-19

    Pre-processing of microarray data is a well-studied problem. Furthermore, all popular platforms come with their own recommended best practices for differential analysis of genes. However, for genome-scale network inference using microarray data collected from large public repositories, these methods filter out a considerable number of genes. This is primarily due to the effects of aggregating a diverse array of experiments with different technical and biological scenarios. Here we introduce a pre-processing pipeline suitable for inferring genome-scale gene networks from large microarray datasets. We show that partitioning of the available microarray datasets according to biological relevance into tissue- and process-specific categories significantly extends the limits of downstream network construction. We demonstrate the effectiveness of our pre-processing pipeline by inferring genome-scale networks for the model plant Arabidopsis thaliana using two different construction methods and a collection of 11,760 Affymetrix ATH1 microarray chips. Our pre-processing pipeline and the datasets used in this paper are made available at http://alurulab.cc.gatech.edu/microarray-pp.
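
    The partitioning-plus-filtering idea above can be sketched with a per-category variance filter; the threshold and toy data are assumptions for illustration, not the paper's actual pipeline:

```python
import numpy as np

def partition_and_filter(expr, sample_labels, min_std=0.25):
    """Split a genes-x-samples matrix into label-specific sub-matrices and keep,
    per partition, only the genes whose expression actually varies there."""
    out = {}
    for label in sorted(set(sample_labels)):
        cols = [j for j, lab in enumerate(sample_labels) if lab == label]
        sub = expr[:, cols]
        keep = sub.std(axis=1) >= min_std        # informative in this partition
        out[label] = (np.flatnonzero(keep), sub[keep])
    return out

# Tiny example: gene 0 varies only in 'root' samples, gene 1 is flat everywhere
expr = np.array([[0.0, 1.0, 2.0, 1.0, 1.0, 1.0],
                 [1.0, 1.0, 1.0, 1.0, 1.0, 1.0],
                 [0.0, 2.0, 0.0, 2.0, 0.0, 2.0]])
labels = ["root", "root", "root", "leaf", "leaf", "leaf"]
partitions = partition_and_filter(expr, labels)
```

    Filtering within each partition rather than globally is the point: a gene that is flat when all experiments are pooled can still be strongly variable, and hence useful for network inference, inside one tissue- or process-specific subset.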

  15. Decomposition Technique for Remaining Useful Life Prediction

    Science.gov (United States)

    Saha, Bhaskar (Inventor); Goebel, Kai F. (Inventor); Saxena, Abhinav (Inventor); Celaya, Jose R. (Inventor)

    2014-01-01

    The prognostic tool disclosed here decomposes the problem of estimating the remaining useful life (RUL) of a component or sub-system into two separate regression problems: the feature-to-damage mapping and the operational conditions-to-damage-rate mapping. These maps are initially generated in off-line mode. One or more regression algorithms are used to generate each of these maps from measurements (and features derived from these), operational conditions, and ground truth information. This decomposition technique allows for the explicit quantification and management of different sources of uncertainty present in the process. Next, the maps are used in an on-line mode where run-time data (sensor measurements and operational conditions) are used in conjunction with the maps generated in off-line mode to estimate both current damage state as well as future damage accumulation. Remaining life is computed by subtracting the instance when the extrapolated damage reaches the failure threshold from the instance when the prediction is made.
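    A minimal numerical sketch of the two-map decomposition described above; the linear regression maps, the synthetic training data, and the failure threshold are assumptions made for illustration, not the models of the disclosed tool.

```python
import numpy as np

# Off-line mode: fit the two regression maps from (synthetic) ground truth.
features = np.array([[0.1], [0.2], [0.4], [0.8]])
damage = np.array([0.05, 0.1, 0.2, 0.4])           # feature-to-damage data
conditions = np.array([[1.0], [2.0], [3.0]])
damage_rate = np.array([0.01, 0.02, 0.03])         # conditions-to-damage-rate data

w_feat = np.linalg.lstsq(features, damage, rcond=None)[0]
w_cond = np.linalg.lstsq(conditions, damage_rate, rcond=None)[0]

# On-line mode: map run-time data through both maps and extrapolate.
def remaining_useful_life(feature, condition, threshold=1.0):
    d = float(feature @ w_feat)        # current damage state
    rate = float(condition @ w_cond)   # predicted damage accumulation rate
    # RUL = time when extrapolated damage reaches the failure threshold,
    # minus the time the prediction is made (taken as 0 here).
    return max(0.0, (threshold - d) / rate)

rul = remaining_useful_life(np.array([0.4]), np.array([2.0]))  # ~ 40.0
```

    Because the two maps are fitted separately, the uncertainty of each (sensor noise vs. operating-profile uncertainty) can be quantified and managed on its own, which is the point of the decomposition.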

  16. Resin infusion of large composite structures modeling and manufacturing process

    Energy Technology Data Exchange (ETDEWEB)

    Loos, A.C. [Michigan State Univ., Dept. of Mechanical Engineering, East Lansing, MI (United States)

    2006-07-01

    The resin infusion processes resin transfer molding (RTM), resin film infusion (RFI) and vacuum assisted resin transfer molding (VARTM) are cost effective techniques for the fabrication of complex shaped composite structures. The dry fibrous preform is placed in the mold, consolidated, resin impregnated and cured in a single step process. The fibrous preforms are often constructed near net shape using highly automated textile processes such as knitting, weaving and braiding. In this paper, the infusion processes RTM, RFI and VARTM are discussed along with the advantages of each technique compared with traditional composite fabrication methods such as prepreg tape lay-up and autoclave cure. The large number of processing variables and the complex material behavior during infiltration and cure make experimental optimization of the infusion processes costly and inefficient. Numerical models have been developed which can be used to simulate the resin infusion processes. The model formulation and solution procedures for the VARTM process are presented. A VARTM process simulation of a carbon fiber preform is presented to demonstrate the type of information that can be generated by the model and to compare the model predictions with experimental measurements. Overall, the predicted flow front positions, resin pressures and preform thicknesses agree well with the measured values. The results of the simulation show the potential cost and performance benefits that can be realized by using a simulation model as part of the development process. (au)

  17. PLANNING QUALITY ASSURANCE PROCESSES IN A LARGE SCALE GEOGRAPHICALLY SPREAD HYBRID SOFTWARE DEVELOPMENT PROJECT

    Directory of Open Access Journals (Sweden)

    Святослав Аркадійович МУРАВЕЦЬКИЙ

    2016-02-01

    Full Text Available Key aspects of operational activities in large-scale, geographically spread software development projects are discussed, and the required structure of QA processes in such a project is examined. Up-to-date methods for integrating quality assurance processes into software development processes are given. Existing groups of software development methodologies are reviewed, namely sequential, agile, and PRINCE2-based, with a condensed overview of the quality assurance processes in each group. Common challenges faced by sequential and agile models in a large, geographically spread hybrid software development project are reviewed, and recommendations are given for tackling those challenges. Conclusions are drawn about the choice of the most suitable methodology and its application to a particular project.

  18. Large break frequency for the SRS (Savannah River Site) production reactor process water system

    International Nuclear Information System (INIS)

    Daugherty, W.L.; Awadalla, N.G.; Sindelar, R.L.; Bush, S.H.

    1989-01-01

    The objective of this paper is to present the results and conclusions of an evaluation of the large break frequency for the process water system (primary coolant system), including the piping, reactor tank, heat exchangers, expansion joints and other process water system components. This evaluation was performed to support the ongoing PRA effort and to complement deterministic analyses addressing the credibility of a double-ended guillotine break. This evaluation encompasses three specific areas: the failure probability of large process water piping directly from imposed loads, the indirect failure probability of piping caused by the seismic-induced failure of surrounding structures, and the failure of all other process water components. The first two of these areas are discussed in detail in other papers. This paper primarily addresses the failure frequency of components other than piping, and includes the other two areas as contributions to the overall process water system break frequency

  19. Fossil human remains from Bolomor Cave (Valencia, Spain).

    Science.gov (United States)

    Arsuaga, Juan Luis; Fernández Peris, Josep; Gracia-Téllez, Ana; Quam, Rolf; Carretero, José Miguel; Barciela González, Virginia; Blasco, Ruth; Cuartero, Felipe; Sañudo, Pablo

    2012-05-01

    Systematic excavations carried out since 1989 at Bolomor Cave have led to the recovery of four Pleistocene human fossil remains, consisting of a fibular fragment, two isolated teeth, and a nearly complete adult parietal bone. All of these specimens date to the late Middle and early Late Pleistocene (MIS 7-5e). The fibular fragment shows thick cortical bone, an archaic feature found in non-modern (i.e. non-Homo sapiens) members of the genus Homo. Among the dental remains, the lack of a midtrigonid crest in the M(1) represents a departure from the morphology reported for the majority of Neandertal specimens, while the large dimensions and pronounced shoveling of the marginal ridges in the C(1) are similar to other European Middle and late Pleistocene fossils. The parietal bone is very thick, with dimensions that generally fall above Neandertal fossils and resemble more closely the Middle Pleistocene Atapuerca (SH) adult specimens. Based on the presence of archaic features, all the fossils from Bolomor are attributed to the Neandertal evolutionary lineage. Copyright © 2012 Elsevier Ltd. All rights reserved.

  20. Visualization of the Flux Rope Generation Process Using Large Quantities of MHD Simulation Data

    Directory of Open Access Journals (Sweden)

    Y Kubota

    2013-03-01

    Full Text Available We present a new concept of analysis using visualization of large quantities of simulation data. The time development of 3D objects with high temporal resolution provides the opportunity for scientific discovery. We visualize large quantities of simulation data using the visualization application 'Virtual Aurora', based on AVS (Advanced Visual Systems), and the parallel distributed processing at the "Space Weather Cloud" in NICT, based on Gfarm technology. We introduce two results of high-temporal-resolution visualization: the magnetic flux rope generation process and dayside reconnection using a system of magnetic field line tracing.

  1. Large sample hydrology in NZ: Spatial organisation in process diagnostics

    Science.gov (United States)

    McMillan, H. K.; Woods, R. A.; Clark, M. P.

    2013-12-01

    A key question in hydrology is how to predict the dominant runoff generation processes in any given catchment. This knowledge is vital for a range of applications in forecasting hydrological response and related processes such as nutrient and sediment transport. A step towards this goal is to map dominant processes in locations where data is available. In this presentation, we use data from 900 flow gauging stations and 680 rain gauges in New Zealand, to assess hydrological processes. These catchments range in character from rolling pasture, to alluvial plains, to temperate rainforest, to volcanic areas. By taking advantage of so many flow regimes, we harness the benefits of large-sample and comparative hydrology to study patterns and spatial organisation in runoff processes, and their relationship to physical catchment characteristics. The approach we use to assess hydrological processes is based on the concept of diagnostic signatures. Diagnostic signatures in hydrology are targeted analyses of measured data which allow us to investigate specific aspects of catchment response. We apply signatures which target the water balance, the flood response and the recession behaviour. We explore the organisation, similarity and diversity in hydrological processes across the New Zealand landscape, and how these patterns change with scale. We discuss our findings in the context of the strong hydro-climatic gradients in New Zealand, and consider the implications for hydrological model building on a national scale.
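    Two of the signature families named above, a water-balance signature and a recession signature, can be computed directly from gauged data. The definitions and the synthetic series below are simplified assumptions for illustration, not the authors' exact formulations.

```python
def runoff_ratio(flow, precip):
    """Water-balance signature: fraction of rainfall leaving as streamflow."""
    return sum(flow) / sum(precip)

def recession_constant(flow):
    """Recession signature: mean ratio Q[t+1]/Q[t] over receding steps."""
    ratios = [b / a for a, b in zip(flow, flow[1:]) if b < a]
    return sum(ratios) / len(ratios)

# Synthetic daily flow and rainfall (arbitrary units).
flow = [10.0, 8.0, 6.4, 5.12, 6.0, 4.8]
precip = [20.0, 10.0, 5.0, 5.0, 15.0, 5.0]

rr = runoff_ratio(flow, precip)   # -> 0.672
k = recession_constant(flow)      # -> 0.8 (flow decays ~20% per receding day)
```

    Comparing such signatures across hundreds of gauges is what makes the large-sample, comparative approach tractable: each catchment reduces to a small vector of process-diagnostic numbers.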

  2. Simulation research on the process of large scale ship plane segmentation intelligent workshop

    Science.gov (United States)

    Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei

    2017-04-01

    The large-scale ship plane segmentation intelligent workshop is a new development, with no prior research work in related fields at home or abroad. The mode of production should be transformed from the existing Industry 2.0, or partly Industry 3.0, from "human brain analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". In this transformation process, a great many tasks need to be determined in terms of management and technology, such as the evolution of the workshop structure, the development of intelligent equipment, and changes in the business model; along with them comes the reformation of the whole workshop. Process simulation in this project verifies the general layout and process flow of the large-scale ship plane segmentation intelligent workshop and analyzes the workshop's working efficiency, which is significant for the next step of the transformation of the plane segmentation intelligent workshop.

  3. Large-scale functional networks connect differently for processing words and symbol strings.

    Science.gov (United States)

    Liljeström, Mia; Vartiainen, Johanna; Kujala, Jan; Salmelin, Riitta

    2018-01-01

    Reconfigurations of synchronized large-scale networks are thought to be central neural mechanisms that support cognition and behavior in the human brain. Magnetoencephalography (MEG) recordings together with recent advances in network analysis now allow for sub-second snapshots of such networks. In the present study, we compared frequency-resolved functional connectivity patterns underlying reading of single words and visual recognition of symbol strings. Word reading emphasized coherence in a left-lateralized network with nodes in classical perisylvian language regions, whereas symbol processing recruited a bilateral network, including connections between frontal and parietal regions previously associated with spatial attention and visual working memory. Our results illustrate the flexible nature of functional networks, whereby processing of different form categories, written words vs. symbol strings, leads to the formation of large-scale functional networks that operate at distinct oscillatory frequencies and incorporate task-relevant regions. These results suggest that category-specific processing should be viewed not so much as a local process but as a distributed neural process implemented in signature networks. For words, increased coherence was detected particularly in the alpha (8-13 Hz) and high gamma (60-90 Hz) frequency bands, whereas increased coherence for symbol strings was observed in the high beta (21-29 Hz) and low gamma (30-45 Hz) frequency range. These findings attest to the role of coherence in specific frequency bands as a general mechanism for integrating stimulus-dependent information across brain regions.

  4. A Proactive Complex Event Processing Method for Large-Scale Transportation Internet of Things

    OpenAIRE

    Wang, Yongheng; Cao, Kening

    2014-01-01

    The Internet of Things (IoT) provides a new way to improve the transportation system. The key issue is how to process the numerous events generated by IoT. In this paper, a proactive complex event processing method is proposed for large-scale transportation IoT. Based on a multilayered adaptive dynamic Bayesian model, a Bayesian network structure learning algorithm using search-and-score is proposed to support accurate predictive analytics. A parallel Markov decision processes model is designed ...

  5. Remotely operated replaceable process equipment

    International Nuclear Information System (INIS)

    Westendorf, H.

    1987-01-01

    The coupling of the pneumatic and electrical auxiliary lines to a pneumatic control pressure line in a large cell of the reprocessing plant is carried out together with the coupling of the connecting flange of the process equipment. The coupling points of the auxiliary lines, such as control or supply lines, are located in the flange parts of the flanges to be connected. The pipe flange on the frame side remains flush with the connecting flange of the process equipment. (DG) [de]

  6. Processes with large Psub(T) in the quantum chromodynamics

    International Nuclear Information System (INIS)

    Slepchenko, L.A.

    1981-01-01

    Necessary data on deep inelastic processes and hard hadron collision processes, and their interpretation in QCD, are presented. The power-law fall-off of exclusive and inclusive cross sections at large transverse momenta, and the electromagnetic and inelastic form factors (structure functions) of hadrons, are discussed. Scaling violation is considered in the search for a method of taking QCD effects into account. It is shown that at large transverse momenta deep inelastic lepton-hadron scattering can be represented as scattering off a compound system (the hadron) in the impulse approximation. Under the assumptions of a parton model, the hadronic cross section is calculated through a renormalized parton structure function. A proof of factorization in the leading logarithmic approximation of QCD is obtained by means of a quark-gluon diagram technique, and the cross section of the hadronic reaction is calculated in factorized form, analogous to lepton-hadron scattering. It is shown that (a) summing the diagrams with gluon emission generates scaling violation in the renormalized structure functions (SF) of quarks and gluons, and a running coupling constant arises simultaneously; (b) the character of the violation of Bjorken scaling of the SF is the same as in deep inelastic lepton scattering. QCD problems that cannot be solved within the framework of perturbation theory are discussed. The evolution of the SF describing the bound state of a hadron, and the hadron light cone, are studied. Radiative corrections arising in two-loop and higher approximations are evaluated. QCD corrections to the point-like power asymptotics of processes at high energies and momentum transfers are studied using the example of inclusive production of quark and gluon jets. Rules for quark counting of the anomalous dimensions of QCD are obtained. It is concluded that the considered limit of the inclusive cross sections is close to

  7. Calibration of C-14 dates: some remaining uncertainties and limitations

    International Nuclear Information System (INIS)

    Burleigh, R.

    1975-01-01

    A brief review is presented of the interpretation of radiocarbon dates in terms of calendar years. An outline is given of the factors that make such correlations necessary and of the work that has so far been done to make them possible. The calibration of the C-14 timescale very largely depends at present on the bristlecone pine chronology, but it is clear that many detailed uncertainties still remain. These are discussed. (U.K.)

  8. Hierarchical optimal control of large-scale nonlinear chemical processes.

    Science.gov (United States)

    Ramezani, Mohammad Hossein; Sadati, Nasser

    2009-01-01

    In this paper, a new approach is presented for optimal control of large-scale chemical processes. In this approach, the chemical process is decomposed into smaller sub-systems at the first level, and a coordinator at the second level, for which a two-level hierarchical control strategy is designed. For this purpose, each sub-system in the first level can be solved separately, by using any conventional optimization algorithm. In the second level, the solutions obtained from the first level are coordinated using a new gradient-type strategy, which is updated by the error of the coordination vector. The proposed algorithm is used to solve the optimal control problem of a complex nonlinear chemical stirred tank reactor (CSTR), where its solution is also compared with the ones obtained using the centralized approach. The simulation results show the efficiency and the capability of the proposed hierarchical approach, in finding the optimal solution, over the centralized method.
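    The two-level idea can be illustrated with a toy problem (this is not the paper's algorithm or its CSTR model): two quadratic sub-systems are solved independently at the first level, and the coordinator adjusts a price on their shared coupling constraint using a gradient-type update driven by the coordination error.

```python
def solve_subsystem(a, price):
    """First level: min over x of (x - a)**2 + price * x  =>  x = a - price / 2."""
    return a - price / 2.0

# Second level: coordinate the sub-systems toward x1 + x2 = 2.
price, step = 0.0, 0.5
for _ in range(100):
    x1 = solve_subsystem(1.0, price)
    x2 = solve_subsystem(3.0, price)
    error = (x1 + x2) - 2.0     # error of the coordination vector
    price += step * error       # gradient-type price update

# The iteration contracts toward x1 = 0, x2 = 2, price = 2.
```

    Each first-level solve here is closed-form, but any conventional optimization algorithm could stand in for it, which is the appeal of the hierarchical decomposition.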

  9. Possible implications of large scale radiation processing of food

    International Nuclear Information System (INIS)

    Zagorski, Z.P.

    1990-01-01

    Large scale irradiation has been discussed in terms of the participation of processing cost in the final value of the improved product. Another factor has been taken into account and that is the saturation of the market with the new product. In the case of successful projects the participation of irradiation cost is low, and the demand for the better product is covered. A limited availability of sources makes the modest saturation of the market difficult with all food subjected to correct radiation treatment. The implementation of the preservation of food needs a decided selection of these kinds of food which comply to all conditions i.e. of acceptance by regulatory bodies, real improvement of quality and economy. The last condition prefers the possibility of use of electron beams of low energy. The best fulfilment of conditions for successful processing is observed in the group of dry food, in expensive spices in particular. (author)

  10. Possible implications of large scale radiation processing of food

    Science.gov (United States)

    Zagórski, Z. P.

    Large scale irradiation has been discussed in terms of the participation of processing cost in the final value of the improved product. Another factor has been taken into account and that is the saturation of the market with the new product. In the case of successful projects the participation of irradiation cost is low, and the demand for the better product is covered. A limited availability of sources makes the modest saturation of the market difficult with all food subjected to correct radiation treatment. The implementation of the preservation of food needs a decided selection of these kinds of food which comply to all conditions i.e. of acceptance by regulatory bodies, real improvement of quality and economy. The last condition prefers the possibility of use of electron beams of low energy. The best fulfilment of conditions for successful processing is observed in the group of dry food, in expensive spices in particular.

  11. Large Eddy Simulation of Cryogenic Injection Processes at Supercritical Pressure

    Science.gov (United States)

    Oefelein, Joseph C.

    2002-01-01

    This paper highlights results from the first of a series of hierarchical simulations aimed at assessing the modeling requirements for application of the large eddy simulation technique to cryogenic injection and combustion processes in liquid rocket engines. The focus is on liquid-oxygen-hydrogen coaxial injectors at a condition where the liquid-oxygen is injected at a subcritical temperature into a supercritical environment. For this situation a diffusion dominated mode of combustion occurs in the presence of exceedingly large thermophysical property gradients. Though continuous, these gradients approach the behavior of a contact discontinuity. Significant real gas effects and transport anomalies coexist locally in colder regions of the flow, with ideal gas and transport characteristics occurring within the flame zone. The current focal point is on the interfacial region between the liquid-oxygen core and the coaxial hydrogen jet where the flame anchors itself.

  12. The key network communication technology in large radiation image cooperative process system

    International Nuclear Information System (INIS)

    Li Zheng; Kang Kejun; Gao Wenhuan; Wang Jingjin

    1998-01-01

    Large container inspection system (LCIS) based on radiation imaging technology is a powerful tool for customs to check the contents inside a large container without opening it. An image-distributed network system is composed of an operation manager station, image acquisition station, environment control station, inspection processing station, check-in station, check-out station, and database station, built with advanced network technology. Mass data, such as container image data, container general information, manifest scanning data, commands, and status, must be transferred on-line between the different stations. The advanced network communication technology involved is presented.

  13. Response of deep and shallow tropical maritime cumuli to large-scale processes

    Science.gov (United States)

    Yanai, M.; Chu, J.-H.; Stark, T. E.; Nitta, T.

    1976-01-01

    The bulk diagnostic method of Yanai et al. (1973) and a simplified version of the spectral diagnostic method of Nitta (1975) are used for a more quantitative evaluation of the response of various types of cumuliform clouds to large-scale processes, using the same data set in the Marshall Islands area for a 100-day period in 1956. The dependence of the cloud mass flux distribution on radiative cooling, large-scale vertical motion, and evaporation from the sea is examined. It is shown that typical radiative cooling rates in the tropics tend to produce a bimodal distribution of mass spectrum exhibiting deep and shallow clouds. The bimodal distribution is further enhanced when the large-scale vertical motion is upward, and a nearly unimodal distribution of shallow clouds prevails when the relative cooling is compensated by the heating due to the large-scale subsidence. Both deep and shallow clouds are modulated by large-scale disturbances. The primary role of surface evaporation is to maintain the moisture flux at the cloud base.

  14. Large scale production and downstream processing of a recombinant porcine parvovirus vaccine

    NARCIS (Netherlands)

    Maranga, L.; Rueda, P.; Antonis, A.F.G.; Vela, C.; Langeveld, J.P.M.; Casal, J.I.; Carrondo, M.J.T.

    2002-01-01

    Porcine parvovirus (PPV) virus-like particles (VLPs) constitute a potential vaccine for prevention of parvovirus-induced reproductive failure in gilts. Here we report the development of a large scale (25 l) production process for PPV-VLPs with baculovirus-infected insect cells. A low multiplicity of

  15. Increasing a large petrochemical company efficiency by improvement of decision making process

    OpenAIRE

    Kirin Snežana D.; Nešić Lela G.

    2010-01-01

    The paper shows the results of a research study conducted in a large petrochemical company, in a state undergoing transition, with the aim to "shed light" on the decision-making process from the aspect of the personal characteristics of the employees, in order to use the results to improve the decision-making process and increase company efficiency. The research was conducted by a survey, i.e. by filling out a questionnaire specially made for this purpose, under real conditions, during working hours. The sample of...

  16. Design Methodology of Process Layout considering Various Equipment Types for Large-scale Pyroprocessing Facility

    International Nuclear Information System (INIS)

    Yu, Seung Nam; Lee, Jong Kwang; Lee, Hyo Jik

    2016-01-01

    At present, each item of process equipment required for integrated processing is being examined, based on experience acquired during the Pyroprocess Integrated Inactive Demonstration Facility (PRIDE) project, and considering the requirements and desired performance enhancement of KAPF as a new facility beyond PRIDE. Essentially, KAPF will be required to handle hazardous materials such as spent nuclear fuel, which must be processed in an isolated and shielded area separate from the operator location. Moreover, an inert-gas atmosphere must be maintained because of the radiation and deliquescence of the materials. KAPF must also achieve the goal of significantly increased yearly production beyond that of the previous facility; therefore, several parts of the production line must be automated. This article presents the method considered for the conceptual design of both the production line and the overall layout of the KAPF process equipment. This study proposes a design methodology that can be utilized as a preliminary step in the design of a hot-cell-type, large-scale facility in which the various types of processing equipment operated by the remote handling system are integrated. The proposed methodology applies to part of the overall design procedure and has certain limitations. However, if the designer is required to maximize the efficiency of the installed material-handling system while considering operational restrictions and maintenance conditions, this kind of design process can accommodate the essential components that must be employed simultaneously in a general hot-cell system

  17. Quality Improvement Process in a Large Intensive Care Unit: Structure and Outcomes.

    Science.gov (United States)

    Reddy, Anita J; Guzman, Jorge A

    2016-11-01

    Quality improvement in the health care setting is a complex process, and even more so in the critical care environment. The development of intensive care unit process measures and quality improvement strategies are associated with improved outcomes, but should be individualized to each medical center as structure and culture can differ from institution to institution. The purpose of this report is to describe the structure of quality improvement processes within a large medical intensive care unit while using examples of the study institution's successes and challenges in the areas of stat antibiotic administration, reduction in blood product waste, central line-associated bloodstream infections, and medication errors. © The Author(s) 2015.

  18. APD arrays and large-area APDs via a new planar process

    CERN Document Server

    Farrell, R; Vanderpuye, K; Grazioso, R; Myers, R; Entine, G

    2000-01-01

    A fabrication process has been developed which allows the beveled-edge-type of avalanche photodiode (APD) to be made without the need for the artful bevel formation steps. This new process, applicable to both APD arrays and to discrete detectors, greatly simplifies manufacture and should lead to significant cost reduction for such photodetectors. This is achieved through a simple innovation that allows isolation around the device or array pixel to be brought into the plane of the surface of the silicon wafer, hence a planar process. A description of the new process is presented along with performance data for a variety of APD device and array configurations. APD array pixel gains in excess of 10 000 have been measured. Array pixel coincidence timing resolution of less than 5 ns has been demonstrated. An energy resolution of 6% for 662 keV gamma-rays using a CsI(Tl) scintillator on a planar processed large-area APD has been recorded. Discrete APDs with active areas up to 13 cm sup 2 have been operated.

  19. Analysis of reforming process of large distorted ring in final enlarging forging

    International Nuclear Information System (INIS)

    Miyazawa, Takeshi; Murai, Etsuo

    2002-01-01

    In the construction of reactors and pressure vessels for oil chemical plants and nuclear power stations, monobloc open-die forged rings are often utilized. Generally, a large forged ring is manufactured by means of enlarging forging with reductions of the wall thickness. During the enlarging process the circular ring is often distorted and becomes elliptical in shape, and shape control of the ring is a complicated task; this phenomenon becomes still worse in the forging of larger rings. In order to achieve precision forging of large rings, we have developed a forging method using a v-shape anvil. The v-shape anvil is geometrically adjusted to fit the distorted ring into the final circle and automatically reforms the shape of the ring during enlarging forging. This paper analyzes the reforming process of the distorted ring with a computer program based on the FEM and examines the effect on the precision of ring forging. (author)

  20. Work-related factors influencing home care nurse intent to remain employed.

    Science.gov (United States)

    Tourangeau, Ann E; Patterson, Erin; Saari, Margaret; Thomson, Heather; Cranley, Lisa

    Health care is shifting out of hospitals into community settings. In Ontario, Canada, home care organizations continue to experience challenges recruiting and retaining nurses. However, factors influencing home care nurse retention that can be modified remain largely unexplored. Several groups of factors have been identified as influencing home care nurse intent to remain employed including job characteristics, work structures, relationships and communication, work environment, responses to work, and conditions of employment. The aim of this study was to test and refine a model that identifies which factors are related to home care nurse intentions to remain employed for the next 5 years with their current home care employer organization. A cross-sectional survey design was implemented to test and refine a hypothesized model of home care nurse intent to remain employed. Logistic regression was used to determine which factors influence home care nurse intent to remain employed. Home care nurse intent to remain employed for the next 5 years was associated with increasing age, higher nurse-evaluated quality of care, having greater variety of patients, experiencing greater meaningfulness of work, having greater income stability, having greater continuity of client care, experiencing more positive relationships with supervisors, experiencing higher work-life balance, and being more satisfied with salary and benefits. Home care organizations can promote home care nurse intent to remain employed by (a) ensuring nurses have adequate training and resources to provide quality client care, (b) improving employment conditions to increase income stability and satisfaction with pay and benefits, (c) ensuring manageable workloads to facilitate improved work-life balance, and (d) ensuring leaders are accessible and competent.

  1. Hydrothermal processes above the Yellowstone magma chamber: Large hydrothermal systems and large hydrothermal explosions

    Science.gov (United States)

    Morgan, L.A.; Shanks, W.C. Pat; Pierce, K.L.

    2009-01-01

    Hydrothermal explosions are violent and dramatic events resulting in the rapid ejection of boiling water, steam, mud, and rock fragments from source craters that range from a few meters up to more than 2 km in diameter; associated breccia can be emplaced as much as 3 to 4 km from the largest craters. Hydrothermal explosions occur where shallow interconnected reservoirs of steam- and liquid-saturated fluids with temperatures at or near the boiling curve underlie thermal fields. Sudden reduction in confining pressure causes fluids to flash to steam, resulting in significant expansion, rock fragmentation, and debris ejection. In Yellowstone, hydrothermal explosions are a potentially significant hazard for visitors and facilities and can damage or even destroy thermal features. The breccia deposits and associated craters formed from hydrothermal explosions are mapped as mostly Holocene (the Mary Bay deposit is older) units throughout Yellowstone National Park (YNP) and are spatially related to the 0.64-Ma Yellowstone caldera and the active Norris-Mammoth tectonic corridor. In Yellowstone, at least 20 large (>100 m in diameter) hydrothermal explosion craters have been identified; the scale of the individual associated events dwarfs similar features in geothermal areas elsewhere in the world. Large hydrothermal explosions in Yellowstone have occurred over the past 16 ka averaging ~1 every 700 yr; similar events are likely in the future.
Our studies of large hydrothermal explosion events indicate: (1) none are directly associated with eruptive volcanic or shallow intrusive events; (2) several historical explosions have been triggered by seismic events; (3) lithic clasts and comingled matrix material that form hydrothermal explosion deposits are extensively altered, indicating that explosions occur in areas subjected to intense hydrothermal processes; (4) many lithic clasts contained in explosion breccia deposits preserve evidence of repeated fracturing

  2. A Study on Generic Representation of Skeletal Remains Replication of Prehistoric Burial

    Directory of Open Access Journals (Sweden)

    C.-W. Shao

    2015-08-01

    Full Text Available Generic representation of skeletal remains from burials involves three parties: physical anthropologists, replication technicians, and promotional educators. Because archaeological excavation is irreversible and disruptive, detailed documentation and replication technologies are needed for many purposes. Unearthed bones go through a reverse procedure of 3D digital scanning, digital model superimposition, rapid prototyping, and mould making, and the integrated errors generated in the presentation of colours and textures are important issues for the presentation of replicated skeletal remains, balancing the professional decisions of physical anthropologists, the subjective judgment of the makers, and the expectations of viewers. This study presents several cases and examines current issues in display and replication technologies for human skeletal remains from prehistoric burials. It documents detailed colour changes of a human skeleton over time as a reference for reproduction. The tolerable errors and required technical qualifications are derived from the precision of the 3D scanning, the specifications of the rapid prototyping machine, and a mould making process that follows the requirements of physical anthropological study. Additionally, a colorimeter was adopted to record and analyse the "colour change" of the human skeletal remains from wet to dry condition. This colour change was then used to evaluate the faithfulness of the surface texture and colour presentation, and to constrain artistic licence in the reproduction of human skeletal remains. The "Lingdao man No. 1", a well-preserved burial of the early Neolithic period (8300 B.P.) excavated from the Liangdao-Daowei site, Matsu, Taiwan, served as the replicated object for this study. In this study, we examined the reproduction procedures step by

  3. Mummified remains from the Archaeological Museum in Zagreb, Croatia - Reviewing peculiarities and limitations of human and non-human radiological identification and analysis in mummified remains.

    Science.gov (United States)

    Petaros, Anja; Janković, Ivor; Cavalli, Fabio; Ivanac, Gordana; Brkljačić, Boris; Čavka, Mislav

    2015-10-01

    Forensic protocols and medico-legal techniques are increasingly being employed in investigations of museological material. The final findings of such investigations may reveal interesting facts about historical figures, customs, and habits, as well as provide meaningful data for forensic use. Herein we present a case review in which forensic experts were requested to identify taxonomic affinities, assess the stage of preservation, and provide skeletal analysis of mummified non-human archaeological remains, and to verify whether two mummified hands were human. The manuscript offers a short review of the process and particularities of radiological species identification, the impact of post-mortem changes on the analysis and imaging of mummified remains, and the macroscopic interpretation of trauma, pathology, and authenticity in mummified remains, all of which can prove useful when dealing with forensic cases. Copyright © 2015 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  4. Optimization of DNA recovery and amplification from non-carbonized archaeobotanical remains

    DEFF Research Database (Denmark)

    Wales, Nathan; Andersen, Kenneth; Cappellini, Enrico

    2014-01-01

    Ancient DNA (aDNA) recovered from archaeobotanical remains can provide key insights into many prominent archaeological research questions, including processes of domestication, past subsistence strategies, and human interactions with the environment. However, it is often difficult to isolate a...... extracted from non-charred ancient plant remains. Based upon the criteria of resistance to enzymatic inhibition, behavior in quantitative real-time PCR, replication fidelity, and compatibility with aDNA damage, we conclude these polymerases have nuanced properties, requiring researchers to make educated...... on the interactions between humans and past plant communities....

  5. Modeling of large-scale oxy-fuel combustion processes

    DEFF Research Database (Denmark)

    Yin, Chungen

    2012-01-01

    Quite some studies have been conducted in order to implement oxy-fuel combustion with flue gas recycle in conventional utility boilers as an effective effort of carbon capture and storage. However, combustion under oxy-fuel conditions is significantly different from conventional air-fuel firing......, among which radiative heat transfer under oxy-fuel conditions is one of the fundamental issues. This paper demonstrates the nongray-gas effects in modeling of large-scale oxy-fuel combustion processes. Oxy-fuel combustion of natural gas in a 609MW utility boiler is numerically studied, in which...... calculation of the oxy-fuel WSGGM remarkably over-predicts the radiative heat transfer to the furnace walls and under-predicts the gas temperature at the furnace exit plane, which also result in a higher incomplete combustion in the gray calculation. Moreover, the gray and non-gray calculations of the same...

  6. The role of large-scale, extratropical dynamics in climate change

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, T.G. [ed.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  7. The role of large-scale, extratropical dynamics in climate change

    International Nuclear Information System (INIS)

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database

  8. Remaining useful life estimation in heterogeneous fleets working under variable operating conditions

    International Nuclear Information System (INIS)

    Al-Dahidi, Sameer; Di Maio, Francesco; Baraldi, Piero; Zio, Enrico

    2016-01-01

    The availability of condition monitoring data for large fleets of similar equipment motivates the development of data-driven prognostic approaches that capitalize on the information contained in such data to estimate equipment Remaining Useful Life (RUL). A main difficulty is that the fleet of equipment typically experiences different operating conditions, which influence both the condition monitoring data and the degradation processes that physically determine the RUL. We propose an approach for RUL estimation from heterogeneous fleet data based on three phases: firstly, the degradation levels (states) of a homogeneous discrete-time finite-state semi-Markov model are identified by resorting to an unsupervised ensemble clustering approach. Then, the parameters of the discrete Weibull distributions describing the transitions among the states, and their uncertainties, are inferred by resorting to the Maximum Likelihood Estimation (MLE) method and to the Fisher Information Matrix (FIM), respectively. Finally, the inferred degradation model is used to estimate the RUL of fleet equipment by direct Monte Carlo (MC) simulation. The proposed approach is applied to two case studies regarding heterogeneous fleets of aluminium electrolytic capacitors and turbofan engines. Results show the effectiveness of the proposed approach in predicting the RUL and its superiority compared to a fuzzy similarity-based approach of literature. - Highlights: • The prediction of the remaining useful life for heterogeneous fleets is addressed. • A data-driven prognostics approach based on a Markov model is proposed. • The proposed approach is applied to two different heterogeneous fleets. • The results are compared with those obtained by a fuzzy similarity-based approach.
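
The final phase described above, direct Monte Carlo estimation of RUL from a fitted semi-Markov degradation model, can be sketched minimally. The sketch below assumes a simple serial degradation chain with type-I discrete-Weibull sojourn times; the state definitions and all parameter values are illustrative, not taken from the paper.

```python
import math
import random

def sample_discrete_weibull(q, beta, rng):
    """Inverse-transform sample of a sojourn time t = 1, 2, ...
    with survival function P(T > t) = q**(t**beta)."""
    u = 1.0 - rng.random()          # uniform in (0, 1], avoids log(0)
    t = (math.log(u) / math.log(q)) ** (1.0 / beta)
    return max(1, math.ceil(t))

def estimate_rul(current_state, sojourn_params, n_runs=10000, seed=0):
    """Mean RUL: Monte Carlo average of the remaining sojourn times
    from the current degradation state down to failure."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_runs):
        life = 0
        for (q, beta) in sojourn_params[current_state:]:
            life += sample_discrete_weibull(q, beta, rng)
        total += life
    return total / n_runs

# Hypothetical 3-state serial degradation model (parameters are illustrative).
params = [(0.95, 1.2), (0.90, 1.0), (0.80, 0.9)]
rul_new = estimate_rul(0, params)
rul_worn = estimate_rul(2, params)
assert rul_new > rul_worn   # equipment in a worse state has less remaining life
```

A real implementation would additionally propagate the parameter uncertainties obtained from the Fisher Information Matrix, e.g. by resampling (q, beta) per MC run.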

  9. The power of event-driven analytics in Large Scale Data Processing

    CERN Multimedia

    CERN. Geneva; Marques, Paulo

    2011-01-01

    FeedZai is a software company specialized in creating high-throughput, low-latency data processing solutions. FeedZai develops a product called "FeedZai Pulse" for continuous event-driven analytics that makes application development easier for end users. It automatically calculates key performance indicators and baselines, showing how current performance differs from previous history, creating timely business intelligence updated to the second. The tool does predictive analytics and trend analysis, displaying data on real-time web-based graphics. In 2010 FeedZai won the European EBN Smart Entrepreneurship Competition, in the Digital Models category, being considered one of the "top-20 smart companies in Europe". The main objective of this seminar/workshop is to explore the topic of large-scale data processing using Complex Event Processing and, in particular, the possible uses of Pulse in...

  10. Progress in Root Cause and Fault Propagation Analysis of Large-Scale Industrial Processes

    Directory of Open Access Journals (Sweden)

    Fan Yang

    2012-01-01

    Full Text Available In large-scale industrial processes, a fault can easily propagate between process units due to the interconnections of material and information flows. Thus, fault detection and isolation for these processes is concerned first with finding the root cause and the fault propagation path, before quantitative methods are applied in local models. Process topology and causality, as the key features of the process description, need to be captured from process knowledge and process data. The modelling methods from these two aspects are overviewed in this paper. From process knowledge, structural equation modelling, various causal graphs, rule-based models, and ontological models are summarized. From process data, cross-correlation analysis, Granger causality and its extensions, frequency domain methods, information-theoretical methods, and Bayesian nets are introduced. Based on these models, inference methods are discussed to find root causes and fault propagation paths under abnormal situations. Some future work is proposed in the end.
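
Of the data-driven methods listed, Granger causality in its simplest form compares a restricted autoregression of y on its own lags against an unrestricted one that adds lags of x: if the added lags shrink the residuals substantially, x "Granger-causes" y. A minimal NumPy sketch on synthetic fault-propagation data (the toy model and all names are assumptions, not from the survey):

```python
import numpy as np

def lagmat(series, lags):
    """Matrix whose columns are the series lagged 1..lags."""
    n = len(series)
    return np.column_stack([series[lags - k:n - k] for k in range(1, lags + 1)])

def granger_rss(x, y, lags=2):
    """Residual sum of squares of y on its own lags (restricted)
    vs. its own lags plus lags of x (unrestricted)."""
    target = y[lags:]
    own = lagmat(y, lags)
    full = np.column_stack([own, lagmat(x, lags)])
    def rss(X):
        X1 = np.column_stack([np.ones(len(X)), X])   # add intercept
        beta, *_ = np.linalg.lstsq(X1, target, rcond=None)
        r = target - X1 @ beta
        return float(r @ r)
    return rss(own), rss(full)

# Synthetic fault propagation: x drives y with a one-step delay.
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
r_restricted, r_full = granger_rss(x, y)
assert r_full < r_restricted   # x's lags explain y far better
```

A full test (e.g. statsmodels' `grangercausalitytests`) would turn this RSS reduction into an F statistic with a p-value; the sketch keeps only the core comparison.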

  11. Large catchment area recharges Titan's Ontario Lacus

    Science.gov (United States)

    Dhingra, Rajani D.; Barnes, Jason W.; Yanites, Brian J.; Kirk, Randolph L.

    2018-01-01

    We seek to address the question of what processes are at work to fill Ontario Lacus while other, deeper south polar basins remain empty. Our hydrological analysis indicates that Ontario Lacus has a catchment area spanning 5.5% of Titan's surface and a large catchment area to lake surface area ratio. This large catchment area translates into large volumes of liquid making their way to Ontario Lacus after rainfall. The areal extent of the catchment extends to at least southern mid-latitudes (40°S). Mass conservation calculations indicate that runoff alone might completely fill Ontario Lacus within less than half a Titan year (1 Titan year = 29.5 Earth years) assuming no infiltration. Cassini Visual and Infrared Mapping Spectrometer (VIMS) observations of clouds over the southern mid and high-latitudes are consistent with precipitation feeding Ontario's large catchment area. This far-flung rain may be keeping Ontario Lacus filled, making it a liquid hydrocarbon oasis in the relatively dry south polar region.
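
The mass-conservation argument above amounts to dividing the lake volume by the runoff delivered per rainfall event over the catchment. A back-of-the-envelope sketch in which every number is illustrative and invented for the example (the paper's actual volumes, rainfall depths, and runoff fractions are not reproduced here):

```python
# Back-of-the-envelope mass-conservation sketch; all values are hypothetical.
catchment_area_km2 = 4.6e6    # ~5.5% of Titan's ~8.3e7 km^2 surface
lake_volume_km3 = 560.0       # hypothetical lake volume
rainfall_m = 0.05             # hypothetical rain depth per event over the catchment
runoff_fraction = 0.2         # hypothetical share of rain reaching the lake

inflow_km3 = catchment_area_km2 * (rainfall_m / 1000.0) * runoff_fraction
events_to_fill = lake_volume_km3 / inflow_km3
print(f"{inflow_km3:.0f} km^3 per event -> ~{events_to_fill:.1f} events to fill")
```

The point of the sketch is structural: because the catchment-to-lake area ratio is large, even modest rain depths integrate to inflow volumes comparable to the lake volume within a few events.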

  12. COPASutils: an R package for reading, processing, and visualizing data from COPAS large-particle flow cytometers.

    Directory of Open Access Journals (Sweden)

    Tyler C Shimko

    Full Text Available The R package COPASutils provides a logical workflow for the reading, processing, and visualization of data obtained from the Union Biometrica Complex Object Parametric Analyzer and Sorter (COPAS or the BioSorter large-particle flow cytometers. Data obtained from these powerful experimental platforms can be unwieldy, leading to difficulties in the ability to process and visualize the data using existing tools. Researchers studying small organisms, such as Caenorhabditis elegans, Anopheles gambiae, and Danio rerio, and using these devices will benefit from this streamlined and extensible R package. COPASutils offers a powerful suite of functions for the rapid processing and analysis of large high-throughput screening data sets.

  13. Mechanisms of action of brief alcohol interventions remain largely unknown - a narrative review.

    Science.gov (United States)

    Gaume, Jacques; McCambridge, Jim; Bertholet, Nicolas; Daeppen, Jean-Bernard

    2014-01-01

    A growing body of evidence has shown the efficacy of brief intervention (BI) for hazardous and harmful alcohol use in primary health care settings. Evidence for efficacy in other settings and effectiveness when implemented at larger scale are disappointing. Indeed, BI comprises varying content; exploring BI content and mechanisms of action may be a promising way to enhance efficacy and effectiveness. Medline and PsychInfo, as well as references of retrieved publications were searched for original research or review on active ingredients (components or mechanisms) of face-to-face BIs [and its subtypes, including brief advice and brief motivational interviewing (BMI)] for alcohol. Overall, BI active ingredients have been scarcely investigated, almost only within BMI, and mostly among patients in the emergency room, young adults, and US college students. This body of research has shown that personalized feedback may be an effective component; specific MI techniques showed mixed findings; decisional balance findings tended to suggest a potential detrimental effect; while change plan exercises, advice to reduce or stop drinking, presenting alternative change options, and moderation strategies are promising but need further study. Client change talk is a potential mediator of BMI effects; change in norm perceptions and enhanced discrepancy between current behavior and broader life goals and values have received preliminary support; readiness to change was only partially supported as a mediator; while enhanced awareness of drinking, perceived risks/benefits of alcohol use, alcohol treatment seeking, and self-efficacy were seldom studied and have as yet found no significant support as such. Research is obviously limited and has provided no clear and consistent evidence on the mechanisms of alcohol BI. How BI achieves the effects seen in randomized trials remains mostly unknown and should be investigated to inform the development of more effective interventions.

  14. Large deviations for the Fleming-Viot process with neutral mutation and selection

    OpenAIRE

    Dawson, Donald; Feng, Shui

    1998-01-01

    Large deviation principles are established for the Fleming-Viot processes with neutral mutation and selection, and the corresponding equilibrium measures as the sampling rate goes to 0. All results are first proved for the finite allele model, and then generalized, through the projective limit technique, to the infinite allele model. Explicit expressions are obtained for the rate functions.
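
For reference, the standard form of a large deviation principle (textbook material; the specific Fleming-Viot rate functions are given in the paper, not here): a family of probability measures $\{P_\varepsilon\}$ satisfies an LDP with rate function $I$ if, for every closed set $F$ and open set $G$,

```latex
\limsup_{\varepsilon \to 0} \varepsilon \log P_\varepsilon(F) \le -\inf_{x \in F} I(x),
\qquad
\liminf_{\varepsilon \to 0} \varepsilon \log P_\varepsilon(G) \ge -\inf_{x \in G} I(x).
```

In the abstract above, the small parameter $\varepsilon$ corresponds to the sampling rate going to 0.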

  15. New fossil remains of Homo naledi from the Lesedi Chamber, South Africa

    Science.gov (United States)

    Hawks, John; Elliott, Marina; Schmid, Peter; Churchill, Steven E; de Ruiter, Darryl J; Roberts, Eric M; Hilbert-Wolf, Hannah; Garvin, Heather M; Williams, Scott A; Delezene, Lucas K; Feuerriegel, Elen M; Randolph-Quinney, Patrick; Kivell, Tracy L; Laird, Myra F; Tawane, Gaokgatlhe; DeSilva, Jeremy M; Bailey, Shara E; Brophy, Juliet K; Meyer, Marc R; Skinner, Matthew M; Tocheri, Matthew W; VanSickle, Caroline; Walker, Christopher S; Campbell, Timothy L; Kuhn, Brian; Kruger, Ashley; Tucker, Steven; Gurtov, Alia; Hlophe, Nompumelelo; Hunter, Rick; Morris, Hannah; Peixotto, Becca; Ramalepa, Maropeng; van Rooyen, Dirk; Tsikoane, Mathabela; Boshoff, Pedro; Dirks, Paul HGM; Berger, Lee R

    2017-01-01

    The Rising Star cave system has produced abundant fossil hominin remains within the Dinaledi Chamber, representing a minimum of 15 individuals attributed to Homo naledi. Further exploration led to the discovery of hominin material, now comprising 131 hominin specimens, within a second chamber, the Lesedi Chamber. The Lesedi Chamber is far separated from the Dinaledi Chamber within the Rising Star cave system, and represents a second depositional context for hominin remains. In each of three collection areas within the Lesedi Chamber, diagnostic skeletal material allows a clear attribution to H. naledi. Both adult and immature material is present. The hominin remains represent at least three individuals based upon duplication of elements, but more individuals are likely present based upon the spatial context. The most significant specimen is the near-complete cranium of a large individual, designated LES1, with an endocranial volume of approximately 610 ml and associated postcranial remains. The Lesedi Chamber skeletal sample extends our knowledge of the morphology and variation of H. naledi, and evidence of H. naledi from both recovery localities shows a consistent pattern of differentiation from other hominin species. DOI: http://dx.doi.org/10.7554/eLife.24232.001 PMID:28483039

  16. Developing Large Web Applications

    CERN Document Server

    Loudon, Kyle

    2010-01-01

    How do you create a mission-critical site that provides exceptional performance while remaining flexible, adaptable, and reliable 24/7? Written by the manager of a UI group at Yahoo!, Developing Large Web Applications offers practical steps for building rock-solid applications that remain effective even as you add features, functions, and users. You'll learn how to develop large web applications with the extreme precision required for other types of software. Avoid common coding and maintenance headaches as small websites add more pages, more code, and more programmers. Get comprehensive soluti

  17. Parallel Index and Query for Large Scale Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chou, Jerry; Wu, Kesheng; Ruebel, Oliver; Howison, Mark; Qiang, Ji; Prabhat,; Austin, Brian; Bethel, E. Wes; Ryne, Rob D.; Shoshani, Arie

    2011-07-18

    Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies are critical for facilitating interactive exploration of large datasets, but numerous challenges remain in terms of designing a system for processing general scientific datasets. The system needs to be able to run on distributed multi-core platforms, efficiently utilize underlying I/O infrastructure, and scale to massive datasets. We present FastQuery, a novel software framework that addresses these challenges. FastQuery utilizes a state-of-the-art index and query technology (FastBit) and is designed to process massive datasets on modern supercomputing platforms. We apply FastQuery to processing of a massive 50TB dataset generated by a large scale accelerator modeling code. We demonstrate the scalability of the tool to 11,520 cores. Motivated by the scientific need to search for interesting particles in this dataset, we use our framework to reduce search time from hours to tens of seconds.
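
FastBit-style query acceleration rests on bitmap indexes: one bitmap per distinct value (or value bin), with range predicates answered by bitwise OR/AND instead of row scans. A toy sketch of the idea (not FastBit's actual API; the real system adds binning, encoding, and WAH compression):

```python
# Minimal bitmap-index sketch, illustrative of the FastBit idea only.
from collections import defaultdict

class BitmapIndex:
    def __init__(self, values):
        self.n = len(values)
        self.bitmaps = defaultdict(int)      # value -> bitmap stored as a Python int
        for row, v in enumerate(values):
            self.bitmaps[v] |= 1 << row

    def query(self, predicate):
        """Rows whose value satisfies the predicate, via OR of matching bitmaps."""
        acc = 0
        for v, bm in self.bitmaps.items():
            if predicate(v):
                acc |= bm
        return [row for row in range(self.n) if acc >> row & 1]

energies = [3, 7, 1, 9, 7, 2]
idx = BitmapIndex(energies)
print(idx.query(lambda v: v >= 7))   # rows with energy >= 7 -> [1, 3, 4]
```

The payoff at scale is that the per-value bitmaps are built once and queries touch only compressed bitmaps, which is what makes interactive search over billions of particles feasible.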

  18. High-Temperature-Short-Time Annealing Process for High-Performance Large-Area Perovskite Solar Cells.

    Science.gov (United States)

    Kim, Minjin; Kim, Gi-Hwan; Oh, Kyoung Suk; Jo, Yimhyun; Yoon, Hyun; Kim, Ka-Hyun; Lee, Heon; Kim, Jin Young; Kim, Dong Suk

    2017-06-27

    Organic-inorganic hybrid metal halide perovskite solar cells (PSCs) are attracting tremendous research interest due to their high solar-to-electric power conversion efficiency with a high possibility of cost-effective fabrication and certified power conversion efficiency now exceeding 22%. Although many effective methods for their application have been developed over the past decade, their practical transition to large-size devices has been restricted by difficulties in achieving high performance. Here we report on the development of a simple and cost-effective production method with high-temperature and short-time annealing processing to obtain uniform, smooth, and large-size grain domains of perovskite films over large areas. With high-temperature, short-time annealing at 400 °C for 4 s, fast solvent evaporation yielded a perovskite film with an average domain size of 1 μm. Solar cells fabricated using this processing technique had a maximum power conversion efficiency exceeding 20% over a 0.1 cm² active area and 18% over a 1 cm² active area. We believe our approach will enable the realization of highly efficient large-area PSCs for practical development with a very simple and short-time procedure. This simple method should lead the field toward the fabrication of uniform large-scale perovskite films, which are necessary for the production of high-efficiency solar cells that may also be applicable to several other material systems for more widespread practical deployment.

  19. Enhanced Contaminated Human Remains Pouch: initial development and preliminary performance assessments

    Energy Technology Data Exchange (ETDEWEB)

    Iseli, A.M.; Kwen, H.D.; Ul-Alam, M.; Balasubramanian, M.; Rajagopalan, S.

    2011-11-07

    The objective is to produce a proof of concept prototype Enhanced Contaminated Human Remains Pouch (ECHRP) with self-decontamination capability to provide increased protection to emergency response personnel. The key objective was to decrease the concentration of toxic chemicals through the use of an absorbent and reactive nanocellulose liner. Additionally, nanomaterials with biocidal properties were developed and tested as a 'stand-alone' treatment. The setting was a private company research laboratory. The main outcome measures were production of a functional prototype. A functional prototype capable of mitigating the threats due to sulfur mustard, Soman, and a large variety of liquid and vapor toxic industrial chemicals was produced. Stand-alone biocidal treatment efficacy was validated. The ECHRP provides superior protection from both chemical and biological hazards to various emergency response personnel and human remains handlers.

  20. Large-scale continuous process to vitrify nuclear defense waste: operating experience with nonradioactive waste

    International Nuclear Information System (INIS)

    Cosper, M.B.; Randall, C.T.; Traverso, G.M.

    1982-01-01

    The developmental program underway at SRL has demonstrated the vitrification process proposed for the sludge processing facility of the DWPF on a large scale. DWPF design criteria for production rate, equipment lifetime, and operability have all been met. The expected authorization and construction of the DWPF will result in the safe and permanent immobilization of a major quantity of existing high level waste. 11 figures, 4 tables

  1. Large deviations

    CERN Document Server

    Varadhan, S R S

    2016-01-01

    The theory of large deviations deals with rates at which probabilities of certain events decay as a natural parameter in the problem varies. This book, which is based on a graduate course on large deviations at the Courant Institute, focuses on three concrete sets of examples: (i) diffusions with small noise and the exit problem, (ii) large time behavior of Markov processes and their connection to the Feynman-Kac formula and the related large deviation behavior of the number of distinct sites visited by a random walk, and (iii) interacting particle systems, their scaling limits, and large deviations from their expected limits. For the most part the examples are worked out in detail, and in the process the subject of large deviations is developed. The book will give the reader a flavor of how large deviation theory can help in problems that are not posed directly in terms of large deviations. The reader is assumed to have some familiarity with probability, Markov processes, and interacting particle systems.

  2. Accelerated decomposition techniques for large discounted Markov decision processes

    Science.gov (United States)

    Larach, Abdelhadi; Chafik, S.; Daoui, C.

    2017-12-01

    Many hierarchical techniques to solve large Markov decision processes (MDPs) are based on the partition of the state space into strongly connected components (SCCs) that can be classified into some levels. In each level, smaller problems named restricted MDPs are solved, and then these partial solutions are combined to obtain the global solution. In this paper, we first propose a novel algorithm, which is a variant of Tarjan's algorithm that simultaneously finds the SCCs and their belonging levels. Second, a new definition of the restricted MDPs is presented to ameliorate some hierarchical solutions in discounted MDPs using value iteration (VI) algorithm based on a list of state-action successors. Finally, a robotic motion-planning example and the experiment results are presented to illustrate the benefit of the proposed decomposition algorithms.
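
The hierarchical decomposition described above begins with the SCCs of the state-transition graph. A minimal sketch of classic Tarjan SCC detection on a hypothetical toy graph (the paper's variant also assigns levels during the same pass; this sketch shows only the classic algorithm, whose emission order is already reverse topological, i.e. the bottom-up order in which restricted MDPs can be solved):

```python
def tarjan_scc(graph):
    """Tarjan's algorithm: strongly connected components of a directed
    graph given as {node: [successors]}; returns a list of components."""
    index, low = {}, {}
    stack, on_stack = [], set()
    sccs = []
    counter = [0]

    def visit(v):
        index[v] = low[v] = counter[0]; counter[0] += 1
        stack.append(v); on_stack.add(v)
        for w in graph.get(v, []):
            if w not in index:
                visit(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:          # v is the root of an SCC
            comp = []
            while True:
                w = stack.pop(); on_stack.discard(w)
                comp.append(w)
                if w == v:
                    break
            sccs.append(comp)

    for v in list(graph):
        if v not in index:
            visit(v)
    return sccs

# Hypothetical MDP state-transition graph (actions collapsed to reachability).
graph = {0: [1], 1: [0, 2], 2: [3], 3: [2, 4], 4: []}
comps = tarjan_scc(graph)
print(comps)   # [[4], [3, 2], [1, 0]]
```

Solving the restricted MDP of each component in this emission order guarantees that every successor component has already been solved, which is the property the level-based hierarchical methods exploit.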

  3. Application of large radiation sources in chemical processing industry

    International Nuclear Information System (INIS)

    Krishnamurthy, K.

    1977-01-01

    Large radiation sources and their application in chemical processing industry are described. A reference has also been made to the present developments in this field in India. Radioactive sources, notably 60Co, are employed in production of wood-plastic and concrete-polymer composites, vulcanised rubbers, polymers, sulfochlorinated paraffin hydrocarbons and in a number of other applications which require deep penetration and high reliability of source. Machine sources of electrons are used in production of heat shrinkable plastics, insulation materials for cables, curing of paints etc. Radiation sources have also been used for sewage hygienisation. As for the scene in India, 60Co sources, gamma chambers and batch irradiators are manufactured. A list of the on-going R and D projects and organisations engaged in research in this field is given. (M.G.B.)

  4. Curbing variations in packaging process through Six Sigma way in a large-scale food-processing industry

    Science.gov (United States)

    Desai, Darshak A.; Kotadiya, Parth; Makwana, Nikheel; Patel, Sonalinkumar

    2015-03-01

    Indian industries need overall operational excellence for sustainable profitability and growth in the present age of global competitiveness. Among different quality and productivity improvement techniques, Six Sigma has emerged as one of the most effective breakthrough improvement strategies. Though Indian industries are exploring this improvement methodology to their advantage and reaping the benefits, not much has been presented and published regarding the experience of Six Sigma in the food-processing industries. This paper is an effort to exemplify the application of the Six Sigma quality improvement drive to one of the large-scale food-processing sectors in India. The paper discusses the phase-wise implementation of define, measure, analyze, improve, and control (DMAIC) on one of the chronic problems, variations in the weight of milk powder pouches. The paper wraps up with the improvements achieved and the projected bottom-line gain to the unit from the application of Six Sigma methodology.
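
In the measure/analyze phases of DMAIC, weight variation of this kind is typically quantified as process capability against specification limits. A hedged sketch using the standard Cp/Cpk formulas; the specification limits and pouch weights below are invented for illustration, not taken from the case study:

```python
import statistics

def process_capability(samples, lsl, usl):
    """Cp and Cpk from the sample mean/std against spec limits (LSL, USL)."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)                 # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)    # capability, centering included
    return cp, cpk

# Hypothetical pouch weights in grams; hypothetical spec of 500 g +/- 6 g.
weights = [499.2, 501.1, 500.4, 498.8, 500.9, 499.6, 500.2, 501.5, 499.9, 500.3]
cp, cpk = process_capability(weights, lsl=494.0, usl=506.0)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```

Cpk is always at most Cp; a gap between the two signals an off-center process, which is exactly the kind of diagnosis the analyze phase feeds into the improve phase.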

  5. Moditored unsaturated soil transport processes as a support for large scale soil and water management

    Science.gov (United States)

    Vanclooster, Marnik

    2010-05-01

    The current societal demand for sustainable soil and water management is very large. The drivers of global and climate change exert many pressures on the soil and water ecosystems, endangering appropriate ecosystem functioning. The unsaturated soil transport processes play a key role in soil-water system functioning as it controls the fluxes of water and nutrients from the soil to plants (the pedo-biosphere link), the infiltration flux of precipitated water to groundwater and the evaporative flux, and hence the feed back from the soil to the climate system. Yet, unsaturated soil transport processes are difficult to quantify since they are affected by huge variability of the governing properties at different space-time scales and the intrinsic non-linearity of the transport processes. The incompatibility of the scales between the scale at which processes reasonably can be characterized, the scale at which the theoretical process correctly can be described and the scale at which the soil and water system need to be managed, calls for further development of scaling procedures in unsaturated zone science. It also calls for a better integration of theoretical and modelling approaches to elucidate transport processes at the appropriate scales, compatible with the sustainable soil and water management objective. Moditoring science, i.e the interdisciplinary research domain where modelling and monitoring science are linked, is currently evolving significantly in the unsaturated zone hydrology area. In this presentation, a review of current moditoring strategies/techniques will be given and illustrated for solving large scale soil and water management problems. This will also allow identifying research needs in the interdisciplinary domain of modelling and monitoring and to improve the integration of unsaturated zone science in solving soil and water management issues. A focus will be given on examples of large scale soil and water management problems in Europe.

  6. Grafting on nuclear tracks using the active sites that remain after the etching process

    International Nuclear Information System (INIS)

    Mazzei, R.; Bermudez, G. Garcia; Chappa, V.C.; Grosso, M.F. del; Fernandez, A.

    2006-01-01

    Poly(propylene) foils were irradiated with Ag ions and then chemically etched to produce samples with structured surfaces. After the etching procedure the active sites that remain on the latent track were used to graft acrylic acid. Nuclear tracks before grafting were visualised using a transmission electron microscope. The grafting yields were determined by weight measurements as a function of ion fluence, etching and grafting time, and were also analysed using Fourier transform infrared spectroscopy. Both measurements suggest that the acrylic acid was grafted on etched tracks using the active sites produced by the swift heavy ion beam.

  7. Grafting on nuclear tracks using the active sites that remain after the etching process

    Energy Technology Data Exchange (ETDEWEB)

    Mazzei, R. [Unidad de Aplicaciones Tecnologicas y Agropecuarias, CNEA, 1429 Buenos Aires (Argentina) and Universidad Tecnologica Nacional, Buenos Aires (Argentina)]. E-mail: mazzei@cae.cnea.gov.ar; Bermudez, G. Garcia [U. A. de Fisica, Tandar, CNEA, 1429 Buenos Aires (Argentina); Escuela de Ciencia y Tecnologia, UNSAM, 1653 Buenos Aires (Argentina); Consejo Nacional de Investigaciones Cientificas y Tecnicas (Argentina); Chappa, V.C. [U. A. de Fisica, Tandar, CNEA, 1429 Buenos Aires (Argentina); Grosso, M.F. del [U. A. de Fisica, Tandar, CNEA, 1429 Buenos Aires (Argentina); U. A. de Materiales, CNEA, 1429 Buenos Aires (Argentina); Fernandez, A. [Universidad Tecnologica Nacional, Buenos Aires (Argentina)

    2006-09-15

    Poly(propylene) foils were irradiated with Ag ions and then chemically etched to produce samples with structured surfaces. After the etching procedure the active sites that remain on the latent track were used to graft acrylic acid. Nuclear tracks before grafting were visualised using a transmission electron microscope. The grafting yields were determined by weight measurements as a function of ion fluence, etching and grafting time, and were also analysed using Fourier transform infrared spectroscopy. Both measurements suggest that the acrylic acid was grafted on etched tracks using the active sites produced by the swift heavy ion beam.

  8. Operational experience with large-scale biogas production at the Promest manure processing plant in Helmond, The Netherlands

    International Nuclear Information System (INIS)

    Schomaker, A.H.H.M.

    1992-01-01

    In The Netherlands a surplus of 15 million tons of liquid pig manure is produced yearly on intensive pig breeding farms. The Dutch government has set a three-way policy to reduce this excess of manure: 1. conversion of animal fodder into a product with fewer and more digestible nutrients; 2. distribution of the surplus to regions with a shortage of animal manure; 3. processing of the remainder of the surplus in large scale processing plants. The first large scale plant for the processing of liquid pig manure was put into operation in 1988 as a demonstration plant at Promest in Helmond. The design capacity of this plant is 100,000 tons of pig manure per year. The plant was initiated by the Manure Steering Committee of the province of Noord-Brabant in order to prove at short notice whether large scale manure processing might contribute to solving the problem of the manure surplus in The Netherlands. This steering committee is a partnership of the national and provincial governments and the agricultural industry. (au)

  9. On conservation of the baryon chirality in the processes with large momentum transfer

    International Nuclear Information System (INIS)

    Ioffe, B.L.

    1976-01-01

    The hypothesis that baryon chirality is conserved in processes with large momentum transfer is suggested and some arguments in its favour are presented. Experimental implications of this assumption for the weak and electromagnetic form factors of transitions within the baryon octet and of the transitions N → Δ, N → Σ* are considered.

  10. Remaining useful life prediction of degrading systems subjected to imperfect maintenance: Application to draught fans

    Science.gov (United States)

    Wang, Zhao-Qiang; Hu, Chang-Hua; Si, Xiao-Sheng; Zio, Enrico

    2018-02-01

    Current degradation modeling and remaining useful life prediction studies share a common assumption that degrading systems are either not maintained or maintained perfectly (i.e., restored to an as-good-as-new state). This paper concerns the issues of how to model the degradation process and predict the remaining useful life of degrading systems subjected to imperfect maintenance activities, which can restore the health condition of a degrading system to any degradation level between as-good-as new and as-bad-as old. Toward this end, a nonlinear model driven by a Wiener process is first proposed to characterize the degradation trajectory of a system subjected to imperfect maintenance, where negative jumps are incorporated to quantify the influence of imperfect maintenance activities on the system's degradation. Then, the probability density function of the remaining useful life is derived analytically by a space-scale transformation, i.e., by transforming the constructed degradation model with negative jumps crossing a constant threshold level into a Wiener process model crossing a random threshold level. To implement the proposed method, unknown parameters in the degradation model are estimated by the maximum likelihood estimation method. Finally, the proposed degradation modeling and remaining useful life prediction method are applied to a practical case of draught fans, a class of mechanical systems used in steel mills. The results reveal that, for a degrading system subjected to imperfect maintenance, the proposed method obtains more accurate remaining useful life predictions than the benchmark model in the literature.
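    The jump-augmented Wiener idea above can be illustrated with a small Monte Carlo sketch. This is not the authors' model: the drift, noise, threshold, and maintenance parameters below are purely illustrative, and the lifetime is estimated by simulation rather than by the paper's analytic space-scale transformation.

```python
import math
import random

def simulate_mean_life(x0, mu, sigma, threshold, jump=0.0, maint_every=None,
                       dt=0.1, n_paths=400, max_steps=20000, seed=1):
    """Monte Carlo mean lifetime (first passage of a failure threshold) of a
    Wiener degradation process; imperfect maintenance is a negative jump that
    removes part, but not all, of the accumulated degradation."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        x, step = x0, 0
        while x < threshold and step < max_steps:
            step += 1
            # Euler-Maruyama increment of the drifted Wiener process
            x += mu * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            if maint_every and step % maint_every == 0:
                x = max(0.0, x - jump)  # imperfect maintenance: partial restoration
        total += step * dt
    return total / n_paths

# Without maintenance the mean lifetime is close to threshold/drift (Wald);
# imperfect maintenance postpones the threshold crossing.
no_maint = simulate_mean_life(0.0, mu=1.0, sigma=0.5, threshold=10.0)
with_maint = simulate_mean_life(0.0, mu=1.0, sigma=0.5, threshold=10.0,
                                jump=2.0, maint_every=50)
```

    With these invented parameters the unmaintained mean lifetime sits near threshold/mu = 10, while the negative jumps visibly extend it, which is the qualitative effect the paper quantifies analytically.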

  11. [PALEOPATHOLOGY OF HUMAN REMAINS].

    Science.gov (United States)

    Minozzi, Simona; Fornaciari, Gino

    2015-01-01

    Many diseases induce alterations in the human skeleton, leaving traces of their presence in ancient remains. Paleopathological examination of human remains not only allows the study of the history and evolution of disease, but also the reconstruction of health conditions in past populations. This paper describes the most interesting diseases observed in skeletal samples from the Roman Imperial Age necropolises found in urban and suburban areas of Rome during archaeological excavations of the last decades. The diseases observed were grouped into the following categories: articular diseases, traumas, infections, metabolic or nutritional diseases, congenital diseases and tumours, and some examples are reported for each group. Although extensive epidemiological investigation of ancient skeletal records is impossible, the palaeopathological study made it possible to highlight the spread of numerous illnesses, many of which can be related to the living and health conditions of the Roman population.

  12. Hadronic processes with large transfer momenta and quark counting rules in multiparticle dual amplitude

    International Nuclear Information System (INIS)

    Akkelin, S.V.; Kobylinskij, N.A.; Martynov, E.S.

    1989-01-01

    A dual N-particle amplitude satisfying the quark counting rules for processes with large transfer momenta is constructed. The multiparticle channels are shown to give an essential contribution to the power-law decrease of the amplitude in the hard kinematic limit. 19 refs.; 9 figs

  13. Remotely operated replaceable process equipment. Fernbedient austauschbare Prozessapparatur

    Energy Technology Data Exchange (ETDEWEB)

    Westendorf, H.

    1987-07-23

    The pneumatic and electrical auxiliary lines of a pneumatic control pressure line in a large cell of the reprocessing plant are coupled together with the connecting flange of the process equipment. The coupling points of the auxiliary lines, such as control or supply lines, are located in the flange parts of the flanges to be connected. The pipe flange on the frame side remains flush with the connecting flange of the process equipment.

  14. Graph Processing on GPUs: A Survey

    DEFF Research Database (Denmark)

    Shi, Xuanhua; Zheng, Zhigao; Zhou, Yongluan

    2018-01-01

    …hundreds of billions, has attracted much attention in both industry and academia. It still remains a great challenge to process such large-scale graphs, and researchers have been seeking new solutions. Because of the massive degree of parallelism and the high memory access bandwidth of GPUs, utilizing GPUs to accelerate graph processing proves to be a promising approach. This article surveys the key issues of graph processing on GPUs, including data layout, memory access pattern, workload mapping, and specific GPU programming. In this article, we summarize the state-of-the-art research on GPU…

  15. Research on Francis Turbine Modeling for Large Disturbance Hydropower Station Transient Process Simulation

    Directory of Open Access Journals (Sweden)

    Guangtao Zhang

    2015-01-01

    In the field of hydropower station transient process simulation (HSTPS), the characteristic graph-based iterative hydroturbine model (CGIHM) has been widely used for large disturbance hydroturbine modeling. However, with this model, iteration must be used to calculate speed and pressure, and slow convergence or non-convergence may be encountered for reasons such as a peculiar characteristic graph profile or an inappropriate iterative or interpolation algorithm. Other conventional large disturbance hydroturbine models also have various disadvantages and are difficult to use widely in HSTPS. Therefore, to obtain an accurate simulation result, a simple method for hydroturbine modeling is proposed. In this method, both the initial operating point and the transfer coefficients of the linear hydroturbine model keep changing during simulation. Hence, it can reflect the nonlinearity of the hydroturbine and be used for Francis turbine simulation under large disturbance conditions. To validate the proposed method, both large disturbance and small disturbance simulations of a single hydrounit supplying a resistive, isolated load were conducted. The simulation results were shown to be consistent with those of field tests. Consequently, the proposed method is an attractive option for HSTPS involving Francis turbine modeling under large disturbance conditions.
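    The idea of refreshing both the operating point and the transfer coefficients during simulation can be sketched on a hypothetical scalar characteristic. Here dx/dt = -x^3 stands in for the turbine dynamics (it is not the paper's Francis turbine formulation); the point is only that a linear model re-linearized at every step tracks a large disturbance, while one frozen at the initial operating point does not.

```python
import math

# Hypothetical nonlinear characteristic standing in for the turbine: dx/dt = -x**3
def f(x):
    return -x**3

def dfdx(x):
    return -3.0 * x**2  # local "transfer coefficient"

def simulate_refreshed(x0, dt, n):
    """Linear model whose operating point and transfer coefficient are
    refreshed every step (exponential-Euler step of the local linearization)."""
    x = x0
    for _ in range(n):
        a = dfdx(x)
        if abs(a) < 1e-12:
            x += dt * f(x)  # degenerate case: plain Euler step
        else:
            # exact solution over dt of the linearization dx/dt ≈ f(x) + a*(x_new - x)
            x += (f(x) / a) * (math.exp(a * dt) - 1.0)
    return x

def simulate_frozen(x0, dt, n):
    """Linear model frozen at the initial operating point (no refresh)."""
    a, b = dfdx(x0), f(x0)
    x = x0
    for _ in range(n):
        x += dt * (b + a * (x - x0))  # Euler on the frozen linear model
    return x

# Large disturbance: start far from equilibrium and integrate to t = 10.
exact = 2.0 / math.sqrt(1.0 + 2.0 * 2.0**2 * 10.0)  # analytic solution of dx/dt = -x^3
refreshed = simulate_refreshed(2.0, dt=0.05, n=200)
frozen = simulate_frozen(2.0, dt=0.05, n=200)
```

    The refreshed model lands near the analytic value, whereas the frozen linearization settles at a wrong equilibrium, mirroring why the paper updates the transfer coefficients during large disturbance simulation.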

  16. Large-scale calculations of the beta-decay rates and r-process nucleosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Borzov, I N; Goriely, S [Inst. d'Astronomie et d'Astrophysique, Univ. Libre de Bruxelles, Campus Plaine, Bruxelles (Belgium); Pearson, J M [Inst. d'Astronomie et d'Astrophysique, Univ. Libre de Bruxelles, Campus Plaine, Bruxelles (Belgium); Lab. de Physique Nucleaire, Univ. de Montreal, Montreal (Canada)]

    1998-06-01

    An approximation to a self-consistent model of the ground state and {beta}-decay properties of neutron-rich nuclei is outlined. The structure of the {beta}-strength functions in stable and short-lived nuclei is discussed. The results of large-scale calculations of the {beta}-decay rates for spherical and slightly deformed nuclides of relevance to the r-process are analysed and compared with the results of existing global calculations and recent experimental data. (orig.)

  17. Integration and segregation of large-scale brain networks during short-term task automatization.

    Science.gov (United States)

    Mohr, Holger; Wolfensteller, Uta; Betzel, Richard F; Mišić, Bratislav; Sporns, Olaf; Richiardi, Jonas; Ruge, Hannes

    2016-11-03

    The human brain is organized into large-scale functional networks that can flexibly reconfigure their connectivity patterns, supporting both rapid adaptive control and long-term learning processes. However, it has remained unclear how short-term network dynamics support the rapid transformation of instructions into fluent behaviour. Comparing fMRI data of a learning sample (N=70) with a control sample (N=67), we find that increasingly efficient task processing during short-term practice is associated with a reorganization of large-scale network interactions. Practice-related efficiency gains are facilitated by enhanced coupling between the cingulo-opercular network and the dorsal attention network. Simultaneously, short-term task automatization is accompanied by decreasing activation of the fronto-parietal network, indicating a release of high-level cognitive control, and a segregation of the default mode network from task-related networks. These findings suggest that short-term task automatization is enabled by the brain's ability to rapidly reconfigure its large-scale network organization involving complementary integration and segregation processes.

  18. Process optimization of large-scale production of recombinant adeno-associated vectors using dielectric spectroscopy.

    Science.gov (United States)

    Negrete, Alejandro; Esteban, Geoffrey; Kotin, Robert M

    2007-09-01

    A well-characterized manufacturing process for the large-scale production of recombinant adeno-associated vectors (rAAV) for gene therapy applications is required to meet current and future demands for pre-clinical and clinical studies and potential commercialization. Economic considerations argue in favor of suspension culture-based production. Currently, the only feasible method for large-scale rAAV production utilizes baculovirus expression vectors and insect cells in suspension cultures. To maximize yields and achieve reproducibility between batches, online monitoring of various metabolic and physical parameters is useful for characterizing early stages of baculovirus-infected insect cells. In this study, rAAVs were produced at 40-l scale, yielding ~1 x 10^15 particles. During the process, dielectric spectroscopy was performed by real-time scanning at radio frequencies between 300 kHz and 10 MHz, and the corresponding permittivity values were correlated with rAAV production. Both infected and uninfected cultures reached a maximum permittivity value; however, only the permittivity profile of infected cell cultures reached a second maximum. This second maximum was correlated with the optimal harvest time for rAAV production: analysis indicated that harvesting around 48 h post-infection (hpi) and at 72 hpi produced similar quantities of biologically active rAAV. Thus, if operated continuously, the 24-h reduction in the rAAV production process allows time for an additional 18 runs per year, corresponding to an extra production of ~2 x 10^16 particles. As part of large-scale optimization studies, this new finding will facilitate the bioprocessing scale-up of rAAV and other bioproducts.
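    The harvest-time criterion — the second maximum of the permittivity profile — amounts to simple peak detection on the monitored signal. A minimal sketch on a synthetic trace (the Gaussian shape, amplitudes, and peak times below are invented, not measured data):

```python
import math

def local_maxima(ts, ys):
    """Times of strict local maxima in a sampled signal."""
    return [ts[i] for i in range(1, len(ys) - 1) if ys[i - 1] < ys[i] > ys[i + 1]]

# Synthetic permittivity trace (hours post-infection): a growth-related peak
# near 20 hpi plus a second, infection-related maximum near 48 hpi.
ts = list(range(80))
ys = [math.exp(-((t - 20) ** 2) / 50.0) + 0.8 * math.exp(-((t - 48) ** 2) / 80.0)
      for t in ts]

peaks = local_maxima(ts, ys)
# Harvest cue: the second maximum of the permittivity profile.
harvest_time = peaks[1] if len(peaks) > 1 else None
```

    On noisy real traces one would smooth the signal first (or use a dedicated peak finder such as SciPy's find_peaks), but the decision rule stays the same: harvest at the second permittivity maximum.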

  19. Solid-state supercapacitors with rationally designed heterogeneous electrodes fabricated by large area spray processing for wearable energy storage applications

    Science.gov (United States)

    Huang, Chun; Zhang, Jin; Young, Neil P.; Snaith, Henry J.; Grant, Patrick S.

    2016-01-01

    Supercapacitors are in demand for short-term electrical charge and discharge applications. Unlike conventional supercapacitors, solid-state versions have no liquid electrolyte and do not require robust, rigid packaging for containment. Consequently they can be thinner, lighter and more flexible. However, solid-state supercapacitors suffer from lower power density and where new materials have been developed to improve performance, there remains a gap between promising laboratory results that usually require nano-structured materials and fine-scale processing approaches, and current manufacturing technology that operates at large scale. We demonstrate a new, scalable capability to produce discrete, multi-layered electrodes with a different material and/or morphology in each layer, and where each layer plays a different, critical role in enhancing the dynamics of charge/discharge. This layered structure allows efficient utilisation of each material and enables conservative use of hard-to-obtain materials. The layered electrode shows amongst the highest combinations of energy and power densities for solid-state supercapacitors. Our functional design and spray manufacturing approach to heterogeneous electrodes provide a new way forward for improved energy storage devices. PMID:27161379

  20. Solid-state supercapacitors with rationally designed heterogeneous electrodes fabricated by large area spray processing for wearable energy storage applications.

    Science.gov (United States)

    Huang, Chun; Zhang, Jin; Young, Neil P; Snaith, Henry J; Grant, Patrick S

    2016-05-10

    Supercapacitors are in demand for short-term electrical charge and discharge applications. Unlike conventional supercapacitors, solid-state versions have no liquid electrolyte and do not require robust, rigid packaging for containment. Consequently they can be thinner, lighter and more flexible. However, solid-state supercapacitors suffer from lower power density and where new materials have been developed to improve performance, there remains a gap between promising laboratory results that usually require nano-structured materials and fine-scale processing approaches, and current manufacturing technology that operates at large scale. We demonstrate a new, scalable capability to produce discrete, multi-layered electrodes with a different material and/or morphology in each layer, and where each layer plays a different, critical role in enhancing the dynamics of charge/discharge. This layered structure allows efficient utilisation of each material and enables conservative use of hard-to-obtain materials. The layered electrode shows amongst the highest combinations of energy and power densities for solid-state supercapacitors. Our functional design and spray manufacturing approach to heterogeneous electrodes provide a new way forward for improved energy storage devices.

  1. Ethanol production from residual wood chips of cellulose industry: acid pretreatment investigation, hemicellulosic hydrolysate fermentation, and remaining solid fraction fermentation by SSF process.

    Science.gov (United States)

    Silva, Neumara Luci Conceição; Betancur, Gabriel Jaime Vargas; Vasquez, Mariana Peñuela; Gomes, Edelvio de Barros; Pereira, Nei

    2011-04-01

    Current research points to ethanol fuel production from lignocellulosic materials, such as residual wood chips from the cellulose industry, as an emerging technology. This work aimed at evaluating ethanol production from the hemicellulose of eucalyptus chips by dilute acid pretreatment and the subsequent fermentation of the generated hydrolysate by a flocculating strain of Pichia stipitis. The remaining solid fraction generated after pretreatment was subjected to enzymatic hydrolysis, which was carried out simultaneously with glucose fermentation [simultaneous saccharification and fermentation (SSF) process] using a strain of Saccharomyces cerevisiae. The acid pretreatment was evaluated using a central composite design with sulfuric acid concentration (1.0-4.0% v/v) and solid-to-liquid ratio (1:2-1:4, g/mL) as independent variables. A maximum xylose concentration of 50 g/L was obtained in the hemicellulosic hydrolysate. The fermentation of the hemicellulosic hydrolysate and the SSF process were performed in bioreactors, and final ethanol concentrations of 15.3 g/L and 28.7 g/L were obtained, respectively.

  2. Large-scale production of diesel-like biofuels - process design as an inherent part of microorganism development.

    Science.gov (United States)

    Cuellar, Maria C; Heijnen, Joseph J; van der Wielen, Luuk A M

    2013-06-01

    Industrial biotechnology is playing an important role in the transition to a bio-based economy. Currently, however, industrial implementation is still modest, despite the advances made in microorganism development. Given that the fuels and commodity chemicals sectors are characterized by tight economic margins, we propose to address overall process design and efficiency at the start of bioprocess development. While current microorganism development is targeted at product formation and product yield, addressing process design at the start of bioprocess development means that microorganism selection can also be extended to other critical targets for process technology and process scale implementation, such as enhancing cell separation or increasing cell robustness at operating conditions that favor the overall process. In this paper we follow this approach for the microbial production of diesel-like biofuels. We review current microbial routes with both oleaginous and engineered microorganisms. For the routes leading to extracellular production, we identify the process conditions for large scale operation. The process conditions identified are finally translated into microorganism development targets. We show that microorganism development should be directed at anaerobic production, increasing robustness at extreme process conditions and tailoring cell surface properties. At the same time, novel process configurations integrating fermentation and product recovery, cell reuse and low-cost technologies for product separation are mandatory. This review provides a state-of-the-art summary of the latest challenges in large-scale production of diesel-like biofuels. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Accelerating large-scale protein structure alignments with graphics processing units

    Directory of Open Access Journals (Sweden)

    Pang Bin

    2012-02-01

    Background: Large-scale protein structure alignment, an indispensable tool in structural bioinformatics, poses a tremendous challenge to computational resources. To ensure structure alignment accuracy and efficiency, efforts have been made to parallelize traditional alignment algorithms in grid environments. However, these solutions are costly and of limited accessibility. Others trade alignment quality for speedup by using high-level characteristics of structure fragments for structure comparisons. Findings: We present ppsAlign, a parallel protein structure alignment framework designed and optimized to exploit the parallelism of Graphics Processing Units (GPUs). As a general-purpose GPU platform, ppsAlign can incorporate many concurrent methods, such as TM-align and Fr-TM-align, into the parallelized algorithm design. We evaluated ppsAlign on an NVIDIA Tesla C2050 GPU card and compared it with existing software solutions running on an AMD dual-core CPU. We observed a 36-fold speedup over TM-align, a 65-fold speedup over Fr-TM-align, and a 40-fold speedup over MAMMOTH. Conclusions: ppsAlign is a high-performance protein structure alignment tool designed to tackle the computational complexity issues of protein structural data. The solution presented in this paper allows large-scale structure comparisons to be performed using the massively parallel computing power of GPUs.

  4. Mechanisms of action of brief alcohol interventions remain largely unknown – A narrative review

    Directory of Open Access Journals (Sweden)

    Jacques eGaume

    2014-08-01

    A growing body of evidence has shown the efficacy of brief intervention (BI) for hazardous and harmful alcohol use in primary health care settings. Evidence for efficacy in other settings, and for effectiveness when implemented at larger scale, is disappointing. Indeed, BI comprises varying content, and exploring BI content and mechanisms of action may be a promising way to enhance efficacy and effectiveness. We searched Medline and PsychInfo, as well as the references of retrieved publications, for original research or reviews on the active ingredients (or components), or mechanisms, of face-to-face BIs (and its subtypes, including brief advice and brief motivational interviewing [BMI]) for alcohol. Overall, BI active ingredients have been scarcely investigated, almost only within BMI, and mostly among emergency room patients, young adults, and US college students. This body of research has shown that personalized feedback may be an effective component; specific MI techniques showed mixed findings; decisional balance findings tended to suggest a potential detrimental effect; while change plan exercises, advice to reduce or stop drinking, presenting alternative change options, and moderation strategies are promising but need further study. Client change talk is a potential mediator of BMI effects; change in norm perceptions and enhanced discrepancy between current behavior and broader life goals and values have received preliminary support; readiness to change was only partially supported as a mediator; while enhanced awareness of drinking, perceived risks/benefits of alcohol use, alcohol treatment seeking, and self-efficacy were seldom studied and have as yet found no significant support as such. Research is obviously limited and has provided no clear and consistent evidence on the mechanisms of alcohol BI. How BI achieves the effects seen in randomized trials remains mostly unknown and should be investigated to inform the development of more effective interventions.

  5. A methodology for fault diagnosis in large chemical processes and an application to a multistage flash desalination process: Part I

    International Nuclear Information System (INIS)

    Tarifa, Enrique E.; Scenna, Nicolas J.

    1998-01-01

    This work presents a new strategy for fault diagnosis in large chemical processes (E.E. Tarifa, Fault diagnosis in complex chemical plants: large-scale plants and batch processes. Ph.D. thesis, Universidad Nacional del Litoral, Santa Fe, 1995). First, a special decomposition of the plant into sectors is made; afterwards, each sector is studied independently. These steps are carried out off-line and produce vital information for the diagnosis system. The diagnosis system works on-line and is based on a two-tier strategy: when a fault occurs, the upper level identifies the faulty sector, and the lower level then carries out an in-depth study, focusing only on the critical sectors, to identify the fault. The loss of information produced by the process partition may cause spurious diagnoses; this problem is overcome at the second level using qualitative simulation and fuzzy logic. In the second part of this work, the new methodology is tested to evaluate its performance in practical cases. A multistage flash desalination system (MSF) is chosen because it is a complex system with many recycles and variables to be supervised. The steps of knowledge base generation and all the blocks included in the diagnosis system are analyzed. Evaluation of the diagnosis performance is carried out using a rigorous dynamic simulator.
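    The two-tier strategy can be caricatured in a few lines: an upper tier that flags suspect sectors from deviating variables, and a lower tier that scores candidate faults only within those sectors. The sector names, variables, and fault signatures below are invented, and the paper's qualitative simulation and fuzzy logic are replaced by a plain symptom-overlap score, so this is a structural sketch only.

```python
def two_tier_diagnose(observed, sector_vars, sector_faults):
    """Upper tier: flag sectors whose monitored variables show deviations.
    Lower tier: score candidate faults only inside the flagged sectors,
    by the fraction of each fault's expected symptoms that are observed."""
    suspect = {s for s, monitored in sector_vars.items() if monitored & observed}
    scores = {}
    for sector in suspect:
        for fault, symptoms in sector_faults[sector].items():
            scores[fault] = len(symptoms & observed) / len(symptoms)
    return max(scores, key=scores.get) if scores else None

# Toy plant: two sectors with hypothetical variables and fault signatures.
sector_vars = {"brine_heater": {"T1", "P1"}, "flash_stage": {"T2", "L2"}}
sector_faults = {
    "brine_heater": {"tube_leak": {"P1"}, "fouling": {"T1", "P1"}},
    "flash_stage": {"level_valve": {"L2"}},
}

fault = two_tier_diagnose({"P1"}, sector_vars, sector_faults)
```

    The payoff of the decomposition is that the lower tier never examines faults in sectors whose variables are all nominal, which is what keeps the on-line search tractable for a large plant.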

  6. E-health, phase two: the imperative to integrate process automation with communication automation for large clinical reference laboratories.

    Science.gov (United States)

    White, L; Terner, C

    2001-01-01

    The initial efforts of e-health have fallen far short of expectations. They were buoyed by the hype and excitement of the Internet craze but limited by their lack of understanding of important market and environmental factors. E-health now recognizes that legacy systems and processes are important, that there is a technology adoption process that needs to be followed, and that demonstrable value drives adoption. Initial e-health transaction solutions have targeted mostly low-cost problems. These solutions invariably are difficult to integrate into existing systems, typically requiring manual interfacing to supported processes. This limitation in particular makes them unworkable for large volume providers. To meet the needs of these providers, e-health companies must rethink their approaches, appropriately applying technology to seamlessly integrate all steps into existing business functions. E-automation is a transaction technology that automates steps, integration of steps, and information communication demands, resulting in comprehensive automation of entire business functions. We applied e-automation to create a billing management solution for clinical reference laboratories. Large volume, onerous regulations, small margins, and only indirect access to patients challenge large laboratories' billing departments. Couple these problems with outmoded, largely manual systems and it becomes apparent why most laboratory billing departments are in crisis. Our approach has been to focus on the most significant and costly problems in billing: errors, compliance, and system maintenance and management. The core of the design relies on conditional processing, a "universal" communications interface, and ASP technologies. The result is comprehensive automation of all routine processes, driving out errors and costs. Additionally, compliance management and billing system support and management costs are dramatically reduced. The implications of e-automated processes can extend …

  7. Fish and other faunal remains from a Late Iron Age site on the Letaba River, Kruger National Park

    Directory of Open Access Journals (Sweden)

    Ina Plug

    1991-09-01

    Fish remains from Late Iron Age sites in the Transvaal are relatively scarce; it seems the people did not utilize riverine resources extensively. The unique assemblage of large numbers of fish bones at a Late Iron Age site therefore provides some insight into the fish population of a section of the Letaba River a few hundred years ago. The presence of other faunal remains provides information on prehistoric utilization of the environment in general, and hunting strategies and aspects of herding can also be deduced from the faunal remains.

  8. Leveraging human oversight and intervention in large-scale parallel processing of open-source data

    Science.gov (United States)

    Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.

    2015-05-01

    The popularity of cloud computing along with the increased availability of cheap storage has led to the necessity of elaborating and transforming large volumes of open-source data, all in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like Map-Reduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security and threat monitoring. Although this mutual participation seems easily exploitable, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is done in chain. Second, it is often necessary to minimize the amount of reprocessing in order to optimize the usage of limited resources. To address these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the parallel processing of large amounts of open-source data.
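    The first requirement — re-running every step downstream of a human correction "in chain" while touching nothing else — reduces to reachability in the task dependency graph. A minimal sketch with a hypothetical pipeline (the task names and DAG are invented, not the paper's system):

```python
def downstream(dag, corrected):
    """Tasks that transitively consume `corrected`'s output and therefore
    must be re-run after a human correction; all other tasks are untouched.
    `dag` maps each task to the list of tasks that consume its output."""
    dirty, stack = set(), [corrected]
    while stack:
        node = stack.pop()
        for child in dag.get(node, ()):
            if child not in dirty:
                dirty.add(child)
                stack.append(child)
    return dirty

# Hypothetical open-source-data pipeline: task -> downstream consumers.
dag = {
    "ingest": ["parse"],
    "parse": ["extract", "index"],
    "extract": ["report"],
    "index": [],
    "report": [],
}

# A human corrects the parse output: only its transitive consumers re-run.
to_rerun = downstream(dag, "parse")
```

    Scheduling the dirty set in topological order then gives both guarantees at once: the correction propagates through the whole chain, and upstream work (here, ingest) is never recomputed.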

  9. CRISPR transcript processing: a mechanism for generating a large number of small interfering RNAs

    Directory of Open Access Journals (Sweden)

    Djordjevic Marko

    2012-07-01

    Full Text Available Abstract Background CRISPR/Cas (Clustered Regularly Interspaced Short Palindromic Repeats/CRISPR-associated sequences) is a recently discovered prokaryotic defense system against foreign DNA, including viruses and plasmids. The CRISPR cassette is transcribed as a continuous transcript (pre-crRNA), which is processed by Cas proteins into small RNA molecules (crRNAs) that are responsible for defense against invading viruses. Experiments in E. coli report that overexpression of cas genes generates a large number of crRNAs from only a few pre-crRNAs. Results We here develop a minimal model of CRISPR processing, which we parameterize based on available experimental data. From the model, we show that the system can generate a large amount of crRNAs based on only a small decrease in the amount of pre-crRNAs. The relationship between the decrease of pre-crRNAs and the increase of crRNAs corresponds to strong linear amplification. Interestingly, this strong amplification crucially depends on fast non-specific degradation of pre-crRNA by an unidentified nuclease. We show that overexpression of cas genes above a certain level does not result in further increase of crRNA, but that this saturation can be relieved if the rate of CRISPR transcription is increased. We furthermore show that a small increase of the CRISPR transcription rate can substantially decrease the extent of cas gene activation necessary to achieve a desired amount of crRNA. Conclusions The simple mathematical model developed here is able to explain existing experimental observations on CRISPR transcript processing in Escherichia coli. The model shows that a competition between specific pre-crRNA processing and non-specific degradation determines the steady-state levels of crRNA and is responsible for strong linear amplification of crRNAs when cas genes are overexpressed.
The model further shows how disappearance of only a few pre-crRNA molecules normally present in the cell can lead to a large (two
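
    The competition described in the Results (specific Cas processing versus non-specific degradation of pre-crRNA) can be illustrated with a steady-state toy model. The rate names and numerical values below are illustrative assumptions, not the paper's parameterization.

```python
# Toy steady-state version of the model's competition: pre-crRNA is produced
# by transcription (rate T) and removed either by specific Cas processing
# (rate k_p, each processing event yielding n crRNAs) or by non-specific
# degradation (rate k_d). All names and numbers are illustrative assumptions.
def steady_state(T, k_p, k_d, n, d_cr):
    pre = T / (k_p + k_d)        # from d[pre]/dt = T - (k_p + k_d) * pre = 0
    cr = n * k_p * pre / d_cr    # from d[cr]/dt = n * k_p * pre - d_cr * cr = 0
    return pre, cr

T, k_d, n, d_cr = 1.0, 1.0, 10.0, 0.1
for k_p in (0.1, 1.0, 10.0, 100.0):    # cas overexpression raises k_p
    pre, cr = steady_state(T, k_p, k_d, n, d_cr)
    print(f"k_p={k_p:6.1f}  pre-crRNA={pre:6.3f}  crRNA={cr:7.2f}")
# crRNA saturates near n*T/d_cr = 100 once k_p >> k_d: a small decrease in
# pre-crRNA yields a large increase in crRNA (linear amplification), and
# further cas overexpression gains little unless transcription T is raised.
```

    The saturation and its relief by a higher transcription rate T mirror the qualitative behavior reported in the abstract.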

  10. The Faculty Promotion Process. An Empirical Analysis of the Administration of Large State Universities.

    Science.gov (United States)

    Luthans, Fred

    One phase of academic management, the faculty promotion process, is systematically described and analyzed. The study encompasses three parts: (1) the justification of the use of management concepts in the analysis of academic administration; (2) a descriptive presentation of promotion policies and practices in 46 large state universities; and (3)…

  11. USING CONDITION MONITORING TO PREDICT REMAINING LIFE OF ELECTRIC CABLES

    International Nuclear Information System (INIS)

    LOFARO, R.; SOO, P.; VILLARAN, M.; GROVE, E.

    2001-01-01

    Electric cables are passive components used extensively throughout nuclear power stations to perform numerous safety and non-safety functions. It is known that the polymers commonly used to insulate the conductors on these cables can degrade with time; the rate of degradation being dependent on the severity of the conditions in which the cables operate. Cables do not receive routine maintenance and, since it can be very costly, they are not replaced on a regular basis. Therefore, to ensure their continued functional performance, it would be beneficial if condition monitoring techniques could be used to estimate the remaining useful life of these components. A great deal of research has been performed on various condition monitoring techniques for use on electric cables. In a research program sponsored by the U.S. Nuclear Regulatory Commission, several promising techniques were evaluated and found to provide trendable information on the condition of low-voltage electric cables. These techniques may be useful for predicting remaining life if well defined limiting values for the aging properties being measured can be determined. However, each technique has advantages and limitations that must be addressed in order to use it effectively, and the necessary limiting values are not always easy to obtain. This paper discusses how condition monitoring measurements can be used to predict the remaining useful life of electric cables. The attributes of an appropriate condition monitoring technique are presented, and the process to be used in estimating the remaining useful life of a cable is discussed along with the difficulties that must be addressed
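
    As a toy illustration of the idea above (trending a measured aging property toward a well-defined limiting value), a linear least-squares extrapolation might look like the following. The indicator, data and the 50% limit are hypothetical, and this is not one of the specific techniques evaluated in the NRC-sponsored program.

```python
# Illustrative sketch: fit a linear trend to a condition-monitoring indicator
# and extrapolate to a defined limiting value. Indicator name, data and the
# 50% limit are hypothetical assumptions.
def remaining_life(years, indicator, limit):
    """Least-squares line through (years, indicator); returns years from the
    last measurement until the fitted trend crosses `limit`, or None if the
    indicator is not degrading."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(indicator) / n
    sxx = sum((x - mean_x) ** 2 for x in years)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, indicator))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    if slope >= 0:
        return None                    # e.g. elongation-at-break should decline
    crossing = (limit - intercept) / slope
    return max(0.0, crossing - years[-1])

# Hypothetical elongation-at-break (%) trend over service years, 50% limit.
years = [0, 5, 10, 15, 20]
eab = [400, 360, 330, 290, 255]
print(remaining_life(years, eab, limit=50))   # ≈ 28.5 years remain
```

    The difficulty flagged in the abstract shows up here as the choice of `limit`: without a well-justified limiting value, the extrapolation has no defensible endpoint.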

  12. Hydrodynamic processes in sharp meander bends and their morphological implications

    NARCIS (Netherlands)

    Blanckaert, K.

    2011-01-01

    The migration rate of sharp meander bends exhibits large variance and indicates that some sharply curved bends tend to stabilize. These observations remain unexplained. This paper examines three hydrodynamic processes in sharp bends with fixed banks and discusses their morphological implications:

  13. US GAAP vs. IFRS – A COMPARISON OF REMAINING DIFFERENCES

    OpenAIRE

    Mihelčić, Eva

    2008-01-01

    In spite of the on-going harmonization process, there are still some differences between US GAAP and IFRS. Currently, companies listed on the New York Stock Exchange, which are reporting according to IFRS, must still prepare the reconciliation to US GAAP, to show the financial statements compliant with US GAAP as well. This article presents an overview of the remaining major differences between US GAAP and IFRS, both descriptively and in tabular form. First, the standards compared are shortly intr...

  14. Ghost Remains After Black Hole Eruption

    Science.gov (United States)

    2009-05-01

    NASA's Chandra X-ray Observatory has found a cosmic "ghost" lurking around a distant supermassive black hole. This is the first detection of such a high-energy apparition, and scientists think it is evidence of a huge eruption produced by the black hole. This discovery presents astronomers with a valuable opportunity to observe phenomena that occurred when the Universe was very young. The X-ray ghost, so-called because a diffuse X-ray source has remained after other radiation from the outburst has died away, is in the Chandra Deep Field-North, one of the deepest X-ray images ever taken. The source, a.k.a. HDF 130, is over 10 billion light years away and existed at a time 3 billion years after the Big Bang, when galaxies and black holes were forming at a high rate. "We'd seen this fuzzy object a few years ago, but didn't realize until now that we were seeing a ghost", said Andy Fabian of Cambridge University in the United Kingdom. "It's not out there to haunt us; rather it's telling us something - in this case what was happening in this galaxy billions of years ago." Fabian and colleagues think the X-ray glow from HDF 130 is evidence for a powerful outburst from its central black hole in the form of jets of energetic particles traveling at almost the speed of light. When the eruption was ongoing, it produced prodigious amounts of radio and X-radiation, but after several million years, the radio signal faded from view as the electrons radiated away their energy. However, less energetic electrons can still produce X-rays by interacting with the pervasive sea of photons remaining from the Big Bang - the cosmic background radiation. Collisions between these electrons and the background photons can impart enough energy to the photons to boost them into the X-ray energy band. This process produces an extended X-ray source that lasts for another 30 million years or so.
"This ghost tells us about the black hole's eruption long after

  15. Development of polymers for large scale roll-to-roll processing of polymer solar cells

    DEFF Research Database (Denmark)

    Carlé, Jon Eggert

    Conjugated polymers' potential to both absorb light and transport current, as well as the prospect of low-cost and large-scale production, has made these kinds of materials attractive in solar cell research... The research field of polymer solar cells (PSCs) is rapidly progressing along three lines: improvement of efficiency and stability together with the introduction of large-scale production methods. All three lines are explored in this work. The thesis describes low band gap polymers and why these are needed... Polymers of this type display broader absorption, resulting in better overlap with the solar spectrum and potentially higher current density. Synthesis, characterization and device performance of three series of polymers illustrate how the absorption spectrum of polymers can be manipulated synthetically...

  16. SIproc: an open-source biomedical data processing platform for large hyperspectral images.

    Science.gov (United States)

    Berisha, Sebastian; Chang, Shengyuan; Saki, Sam; Daeinejad, Davar; He, Ziqi; Mankar, Rupali; Mayerich, David

    2017-04-10

    There has recently been significant interest within the vibrational spectroscopy community to apply quantitative spectroscopic imaging techniques to histology and clinical diagnosis. However, many of the proposed methods require collecting spectroscopic images that have a similar region size and resolution to the corresponding histological images. Since spectroscopic images contain significantly more spectral samples than traditional histology, the resulting data sets can approach hundreds of gigabytes to terabytes in size. This makes them difficult to store and process, and the tools available to researchers for handling large spectroscopic data sets are limited. Fundamental mathematical tools, such as MATLAB, Octave, and SciPy, are extremely powerful but require that the data be stored in fast memory. This memory limitation becomes impractical for even modestly sized histological images, which can be hundreds of gigabytes in size. In this paper, we propose an open-source toolkit designed to perform out-of-core processing of hyperspectral images. By taking advantage of graphical processing unit (GPU) computing combined with adaptive data streaming, our software alleviates common workstation memory limitations while achieving better performance than existing applications.
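
    The out-of-core strategy described above (keeping only a small window of a huge on-disk image in fast memory at a time) can be sketched with a memory-mapped array. This is an illustrative pattern only, not SIproc's actual GPU-streaming implementation; the function and file names are assumptions.

```python
# Illustrative out-of-core pattern (not SIproc's GPU implementation): use a
# memory-mapped file so only one chunk of a large image is in RAM at a time.
import os
import tempfile
import numpy as np

def chunked_mean_spectrum(path, n_pixels, n_bands, chunk=1024, dtype=np.float32):
    """Mean spectrum of an (n_pixels x n_bands) image stored row-major on disk."""
    mm = np.memmap(path, dtype=dtype, mode="r", shape=(n_pixels, n_bands))
    acc = np.zeros(n_bands, dtype=np.float64)
    for start in range(0, n_pixels, chunk):
        acc += mm[start:start + chunk].sum(axis=0)   # one chunk resident at a time
    return acc / n_pixels

# Tiny stand-in file for what would be a hundreds-of-GB hyperspectral image.
fd, path = tempfile.mkstemp()
os.close(fd)
np.arange(8 * 4, dtype=np.float32).reshape(8, 4).tofile(path)
print(chunked_mean_spectrum(path, 8, 4, chunk=3))   # → [14. 15. 16. 17.]
os.unlink(path)
```

    The same chunked loop generalizes to any per-pixel reduction, which is why streaming makes terabyte-scale images tractable on a workstation.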

  17. Study of Drell-Yan process in CMS experiment at Large Hadron Collider

    CERN Document Server

    Jindal, Monika

    The proton-proton collisions at the Large Hadron Collider (LHC) mark the beginning of a new era in high energy physics. They enable the possibility of discoveries at the high-energy frontier and also allow the study of Standard Model physics with high precision. The new physics discoveries and the precision measurements can be achieved with highly efficient and accurate detectors like the Compact Muon Solenoid. In this thesis, we report the measurement of the differential production cross-section of the Drell-Yan process, $q\bar{q} \rightarrow Z/\gamma^{*} \rightarrow \mu^{+}\mu^{-}$, in proton-proton collisions at the center-of-mass energy $\sqrt{s} = 7$ TeV using the CMS experiment at the LHC. This measurement is based on the analysis of data which corresponds to an integrated luminosity of $\int\mathcal{L}\,dt = 36.0 \pm 1.4$ pb$^{-1}$. The measurement of the production cross-section of the Drell-Yan process provides a first test of the Standard Model in a new energy domain and may reveal exotic physics processes. The Drell...

  18. Investigation of deep inelastic scattering processes involving large p$_{t}$ direct photons in the final state

    CERN Multimedia

    2002-01-01

    This experiment will investigate various aspects of photon-parton scattering and will be performed in the H2 beam of the SPS North Area with high intensity hadron beams up to 350 GeV/c. (a) The directly produced photon yield in deep inelastic hadron-hadron collisions. Large p$_{t}$ direct photons from hadronic interactions are presumably a result of a simple annihilation process of quarks and antiquarks or of a QCD-Compton process. The relative contribution of the two processes can be studied by using various incident beam projectiles $\pi^{+}, \pi^{-}, p$ and in the future $\bar{p}$. (b) The correlations between directly produced photons and their accompanying hadronic jets. We will examine events with a large p$_{t}$ direct photon for away-side jets. If jets are recognised their properties will be investigated. Differences between a gluon and a quark jet may become observable by comparing reactions where valence quark annihilations (away-side jet originates from a gluon) dominate over the QCD-Compton...

  19. Near-Space TOPSAR Large-Scene Full-Aperture Imaging Scheme Based on Two-Step Processing

    Directory of Open Access Journals (Sweden)

    Qianghui Zhang

    2016-07-01

    Full Text Available Free of the constraints of orbit mechanisms, weather conditions and minimum antenna area, synthetic aperture radar (SAR) equipped on a near-space platform is more suitable for sustained large-scene imaging compared with the spaceborne and airborne counterparts. Terrain observation by progressive scans (TOPS), which is a novel wide-swath imaging mode and allows the beam of SAR to scan along the azimuth, can reduce the time of echo acquisition for large scene. Thus, near-space TOPS-mode SAR (NS-TOPSAR) provides a new opportunity for sustained large-scene imaging. An efficient full-aperture imaging scheme for NS-TOPSAR is proposed in this paper. In this scheme, firstly, two-step processing (TSP) is adopted to eliminate the Doppler aliasing of the echo. Then, the data is focused in two-dimensional frequency domain (FD) based on Stolt interpolation. Finally, a modified TSP (MTSP) is performed to remove the azimuth aliasing. Simulations are presented to demonstrate the validity of the proposed imaging scheme for near-space large-scene imaging application.

  20. Some Examples of Residence-Time Distribution Studies in Large-Scale Chemical Processes by Using Radiotracer Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bullock, R. M.; Johnson, P.; Whiston, J. [Imperial Chemical Industries Ltd., Billingham, Co., Durham (United Kingdom)

    1967-06-15

    The application of radiotracers to determine flow patterns in chemical processes is discussed with particular reference to the derivation of design data from model reactors for translation to large-scale units, the study of operating efficiency and design attainment in established plant and the rapid identification of various types of process malfunction. The requirements governing the selection of tracers for various types of media are considered and an example is given of the testing of the behaviour of a typical tracer before use in a particular large-scale process operating at 250 atm and 200°C. Information which may be derived from flow patterns is discussed, including the determination of mixing parameters, gas hold-up in gas/liquid reactions and the detection of channelling and stagnant regions. Practical results and their interpretation are given in relation to an olefine hydroformylation reaction system, a process for the conversion of propylene to isopropanol, a moving bed catalyst system for the isomerization of xylenes and a three-stage gas-liquid reaction system. The use of mean residence-time data for the detection of leakage between reaction vessels and a heat interchanger system is given as an example of the identification of process malfunction. (author)

  1. A large-scale circuit mechanism for hierarchical dynamical processing in the primate cortex

    OpenAIRE

    Chaudhuri, Rishidev; Knoblauch, Kenneth; Gariel, Marie-Alice; Kennedy, Henry; Wang, Xiao-Jing

    2015-01-01

    We developed a large-scale dynamical model of the macaque neocortex, which is based on recently acquired directed- and weighted-connectivity data from tract-tracing experiments, and which incorporates heterogeneity across areas. A hierarchy of timescales naturally emerges from this system: sensory areas show brief, transient responses to input (appropriate for sensory processing), whereas association areas integrate inputs over time and exhibit persistent activity (suitable for decision-makin...

  2. Extraterrestrial processing and manufacturing of large space systems, volume 1, chapters 1-6

    Science.gov (United States)

    Miller, R. H.; Smith, D. B. S.

    1979-01-01

    Space program scenarios for production of large space structures from lunar materials are defined. The concept of the space manufacturing facility (SMF) is presented. The manufacturing processes and equipment for the SMF are defined and the conceptual layouts are described for the production of solar cells and arrays, structures and joints, conduits, waveguides, RF equipment radiators, wire cables, and converters. A 'reference' SMF was designed and its operation requirements are described.

  3. Regulatory perspective on remaining challenges for utilization of pharmacogenomics-guided drug developments.

    Science.gov (United States)

    Otsubo, Yasuto; Ishiguro, Akihiro; Uyama, Yoshiaki

    2013-01-01

    Pharmacogenomics-guided drug development has been implemented in practice in the last decade, resulting in increased labeling of drugs with pharmacogenomic information. However, there are still many challenges remaining in utilizing this process. Here, we describe such remaining challenges from the regulatory perspective, specifically focusing on sample collection, biomarker qualification, ethnic factors, codevelopment of companion diagnostics and means to provide drugs for off-target patients. To improve the situation, it is important to strengthen international harmonization and collaboration among academia, industries and regulatory agencies, followed by the establishment of an international guideline on this topic. Communication with a regulatory agency from an early stage of drug development is also a key to success.

  4. Large earthquake rupture process variations on the Middle America megathrust

    Science.gov (United States)

    Ye, Lingling; Lay, Thorne; Kanamori, Hiroo

    2013-11-01

    The megathrust fault between the underthrusting Cocos plate and overriding Caribbean plate recently experienced three large ruptures: the August 27, 2012 (Mw 7.3) El Salvador; September 5, 2012 (Mw 7.6) Costa Rica; and November 7, 2012 (Mw 7.4) Guatemala earthquakes. All three events involve shallow-dipping thrust faulting on the plate boundary, but they had variable rupture processes. The El Salvador earthquake ruptured from about 4 to 20 km depth, with a relatively large centroid time of ~19 s, low seismic moment-scaled energy release, and a depleted teleseismic short-period source spectrum similar to that of the September 2, 1992 (Mw 7.6) Nicaragua tsunami earthquake that ruptured the adjacent shallow portion of the plate boundary. The Costa Rica and Guatemala earthquakes had large slip in the depth range 15 to 30 km, and more typical teleseismic source spectra. Regional seismic recordings have higher short-period energy levels for the Costa Rica event relative to the El Salvador event, consistent with the teleseismic observations. A broadband regional waveform template correlation analysis is applied to categorize the focal mechanisms for larger aftershocks of the three events. Modeling of regional wave spectral ratios for clustered events with similar mechanisms indicates that interplate thrust events have corner frequencies, normalized by a reference model, that increase down-dip from anomalously low values near the Middle America trench. Relatively high corner frequencies are found for thrust events near Costa Rica; thus, variations along strike of the trench may also be important. Geodetic observations indicate trench-parallel motion of a forearc sliver extending from Costa Rica to Guatemala, and low seismic coupling on the megathrust has been inferred from a lack of boundary-perpendicular strain accumulation. The slip distributions and seismic radiation from the large regional thrust events indicate relatively strong seismic coupling near Nicoya, Costa

  5. Large-Scale Sentinel-1 Processing for Solid Earth Science and Urgent Response using Cloud Computing and Machine Learning

    Science.gov (United States)

    Hua, H.; Owen, S. E.; Yun, S. H.; Agram, P. S.; Manipon, G.; Starch, M.; Sacco, G. F.; Bue, B. D.; Dang, L. B.; Linick, J. P.; Malarout, N.; Rosen, P. A.; Fielding, E. J.; Lundgren, P.; Moore, A. W.; Liu, Z.; Farr, T.; Webb, F.; Simons, M.; Gurrola, E. M.

    2017-12-01

    With the increased availability of open SAR data (e.g. Sentinel-1 A/B), new challenges are being faced with processing and analyzing the voluminous SAR datasets to make geodetic measurements. Upcoming SAR missions such as NISAR are expected to generate close to 100TB per day. The Advanced Rapid Imaging and Analysis (ARIA) project can now generate geocoded unwrapped phase and coherence products from Sentinel-1 TOPS mode data in an automated fashion, using the ISCE software. This capability is currently being exercised on various study sites across the United States and around the globe, including Hawaii, Central California, Iceland and South America. The automated and large-scale SAR data processing and analysis capabilities use cloud computing techniques to speed the computations and provide scalable processing power and storage. Aspects such as how to process these voluminous SLCs and interferograms at global scales, how to keep up with the large daily SAR data volumes, and how to handle the high data rates are being explored. Scene-partitioning approaches in the processing pipeline help in handling global-scale processing up to unwrapped interferograms, with stitching done at a late stage. We have built an advanced science data system with rapid search functions to enable access to the derived data products. Rapid image processing of Sentinel-1 data to interferograms and time series is already being applied to natural hazards including earthquakes, floods, volcanic eruptions, and land subsidence due to fluid withdrawal. We will present the status of the ARIA science data system for generating science-ready data products and challenges that arise from being able to process SAR datasets to derived time series data products at large scales. For example, how do we perform large-scale data quality screening on interferograms? What approaches can be used to minimize compute, storage, and data movement costs for time series analysis in the cloud? We will also

  6. Analogue scale modelling of extensional tectonic processes using a large state-of-the-art centrifuge

    Science.gov (United States)

    Park, Heon-Joon; Lee, Changyeol

    2017-04-01

    Analogue scale modelling of extensional tectonic processes such as rifting and basin opening has been conducted in numerous studies. Among the controlling factors, gravitational acceleration (g) on the scale models was regarded as a constant (Earth's gravity) in most of the analogue model studies, and only a few model studies considered larger gravitational acceleration by using a centrifuge (an apparatus generating large centrifugal force by rotating the model at a high speed). Although analogue models using a centrifuge allow large scale-down and accelerated deformation that is driven by density differences, such as salt diapirs, the possible model size is mostly limited up to 10 cm. A state-of-the-art centrifuge installed at the KOCED Geotechnical Centrifuge Testing Center, Korea Advanced Institute of Science and Technology (KAIST) allows a large surface area of the scale-models up to 70 by 70 cm under the maximum capacity of 240 g-tons. Using the centrifuge, we will conduct analogue scale modelling of extensional tectonic processes such as the opening of the back-arc basin. Acknowledgement This research was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (grant number 2014R1A6A3A04056405).

  7. The Impact of Nursing Leader's Behavioral Integrity and Intragroup Relationship Conflict on Staff Nurses' Intention to Remain.

    Science.gov (United States)

    Kang, Seung-Wan; Lee, Soojin; Choi, Suk Bong

    2017-05-01

    This study tested a multilevel model examining the effect of nursing leader's behavioral integrity and intragroup relationship conflict on staff nurses' intent to remain. In the challenging situation of nursing shortage, nurse executives are required to focus on the retention of nurses. No previous studies have examined the impact of nursing leader's behavioral integrity and intragroup relationship conflict on nurses' intention to remain. A cross-sectional survey of 480 RNs in 34 nursing units of a large public hospital in South Korea was conducted to test the hypothesized multilevel model. Nursing leader's behavioral integrity was positively related to nurses' intention to remain (b = 0.34), and this relationship was enhanced when the level of intragroup relationship conflict was high (b = 0.21). Nursing leaders facing intragroup relationship conflict should endeavor to maintain their behavioral integrity to promote nurses' intention to remain.

  8. Plasma processing of large curved surfaces for superconducting rf cavity modification

    Directory of Open Access Journals (Sweden)

    J. Upadhyay

    2014-12-01

    Full Text Available Plasma-based surface modification of niobium is a promising alternative to wet etching of superconducting radio frequency (SRF) cavities. We have demonstrated surface layer removal in an asymmetric nonplanar geometry, using a simple cylindrical cavity. The etching rate is highly correlated with the shape of the inner electrode, radio-frequency (rf) circuit elements, gas pressure, rf power, chlorine concentration in the Cl_{2}/Ar gas mixtures, residence time of reactive species, and temperature of the cavity. Using variable radius cylindrical electrodes, large-surface ring-shaped samples, and dc bias in the external circuit, we have measured substantial average etching rates and outlined the possibility of optimizing plasma properties with respect to maximum surface processing effect.

  9. Subpixelic measurement of large 1D displacements: principle, processing algorithms, performances and software.

    Science.gov (United States)

    Guelpa, Valérian; Laurent, Guillaume J; Sandoz, Patrick; Zea, July Galeano; Clévy, Cédric

    2014-03-12

    This paper presents a visual measurement method able to sense 1D rigid body displacements with very high resolutions, large ranges and high processing rates. Sub-pixelic resolution is obtained thanks to a structured pattern placed on the target. The pattern is made of twin periodic grids with slightly different periods. The periodic frames are suited for Fourier-like phase calculations, leading to high resolution, while the period difference allows the removal of phase ambiguity and thus a high range-to-resolution ratio. The paper presents the measurement principle as well as the processing algorithms (source files are provided as supplementary materials). The theoretical and experimental performances are also discussed. The processing time is around 3 µs for a line of 780 pixels, which means that the measurement rate is mostly limited by the image acquisition frame rate. A 3-σ repeatability of 5 nm is experimentally demonstrated, which has to be compared with the 168 µm measurement range.
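
    A minimal numerical sketch of the twin-grid principle (a fine but ambiguous phase from each periodic grid, with the period difference removing the ambiguity over a much longer synthetic period) is given below. The periods, sample count and function names are illustrative assumptions, not the authors' published algorithm.

```python
# Minimal numerical sketch of the twin-grid principle (synthetic data and
# parameters are illustrative; this is not the authors' published code).
import numpy as np

def measure_displacement(signal, p1, p2):
    """Fine-but-ambiguous phase from each grid; the phase difference resolves
    the ambiguity over the synthetic period p1*p2/(p2 - p1)."""
    x = np.arange(len(signal))
    phi1 = np.angle(np.sum(signal * np.exp(-2j * np.pi * x / p1)))
    phi2 = np.angle(np.sum(signal * np.exp(-2j * np.pi * x / p2)))
    synth = p1 * p2 / (p2 - p1)
    coarse = ((phi2 - phi1) / (2 * np.pi) * synth) % synth   # removes ambiguity
    fine = (-phi1 / (2 * np.pi) * p1) % p1                   # high resolution
    k = round((coarse - fine) / p1)          # fine-phase branch nearest coarse
    return fine + k * p1

# Twin grids with periods 10 and 11 px over 220 samples, shifted by 37.25 px.
x = np.arange(220)
shift = 37.25
pattern = (np.cos(2 * np.pi * (x - shift) / 10)
           + np.cos(2 * np.pi * (x - shift) / 11))
print(measure_displacement(pattern, 10, 11))   # recovers ~37.25
```

    The fine phase sets the resolution while the coarse (synthetic-period) estimate only needs to be accurate to within half a grid period, which is what gives the high range-to-resolution ratio.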

  10. Event processing time prediction at the CMS experiment of the Large Hadron Collider

    International Nuclear Information System (INIS)

    Cury, Samir; Gutsche, Oliver; Kcira, Dorian

    2014-01-01

    The physics event reconstruction is one of the biggest challenges for the computing of the LHC experiments. Among the different tasks that the computing systems of the CMS experiment perform, reconstruction takes most of the available CPU resources. The reconstruction time of single collisions varies according to event complexity. Measurements were done in order to determine this correlation quantitatively, creating means to predict it based on the data-taking conditions of the input samples. Currently the data processing system splits tasks into groups with the same number of collisions and does not account for variations in the processing time. These variations can be large and can lead to a considerable increase in the time it takes for CMS workflows to finish. The goal of this study was to use estimates of processing time to split the workflow into jobs more efficiently. By considering the CPU time needed for each job, the spread of the job-length distribution in a workflow is reduced.
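
    The splitting idea described above (grouping events by predicted CPU time instead of by a fixed number of collisions) can be sketched with a simple greedy accumulator. The function name, per-event time predictions and target are hypothetical, not the CMS production code.

```python
# Hedged sketch of the splitting idea (hypothetical names, not the CMS code):
# accumulate events into a job until its predicted CPU time reaches a target,
# so job lengths stay comparable despite event-by-event complexity variations.
def split_by_time(event_times, target_seconds):
    jobs, current, acc = [], [], 0.0
    for i, t in enumerate(event_times):
        current.append(i)
        acc += t
        if acc >= target_seconds:
            jobs.append(current)
            current, acc = [], 0.0
    if current:
        jobs.append(current)
    return jobs

# Hypothetical per-event reconstruction-time predictions (seconds): complex
# events close a job early, simple ones are batched together.
pred = [1.0, 1.0, 6.0, 1.0, 1.0, 1.0, 1.0, 5.0, 1.0]
print(split_by_time(pred, 6.0))   # → [[0, 1, 2], [3, 4, 5, 6, 7], [8]]
```

    Compared with fixed-size groups, jobs containing complex events simply hold fewer of them, narrowing the job-length distribution.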

  11. High-Rate Fabrication of a-Si-Based Thin-Film Solar Cells Using Large-Area VHF PECVD Processes

    Energy Technology Data Exchange (ETDEWEB)

    Deng, Xunming [University of Toledo; Fan, Qi Hua

    2011-12-31

    The University of Toledo (UT), working in concert with its a-Si-based PV industry partner Xunlight Corporation (Xunlight), has conducted a comprehensive study to develop a large-area (3 ft × 3 ft) VHF PECVD system for high-rate uniform fabrication of silicon absorber layers, and the large-area VHF PECVD processes to achieve high-performance a-Si/a-SiGe or a-Si/nc-Si tandem-junction solar cells, during the period of July 1, 2008 to Dec. 31, 2011, under DOE Award No. DE-FG36-08GO18073. The project had two primary goals: (i) to develop and improve a large-area (3 ft × 3 ft) VHF PECVD system for high-rate fabrication of >= 8 Å/s a-Si and >= 20 Å/s nc-Si or 4 Å/s a-SiGe absorber layers with high uniformity in film thicknesses and in material structures; (ii) to develop and optimize the large-area VHF PECVD processes to achieve high-performance a-Si/nc-Si or a-Si/a-SiGe tandem-junction solar cells with >= 10% stable efficiency. Our work has met these goals and is summarized in “Accomplishments versus goals and objectives”.

  12. High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing

    Science.gov (United States)

    Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.

    2015-12-01

    Next generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, where data volumes and data throughput rates are orders of magnitude larger than in present day missions. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Existing missions, such as OCO-2, may also require short turn-around times for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. We present our experiences deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of their Level-2 full physics data products. We will explore optimization approaches to getting the best performance out of hybrid-cloud computing as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment based on market forces. We will present how we enabled high-tolerance computing in order to achieve large-scale computing as well as operational cost savings.

  13. Geomagnetic and geoelectrical prospection for buried archaeological remains on the Upper City of Amorium, a Byzantine city in midwestern Turkey

    International Nuclear Information System (INIS)

    Ekinci, Yunus Levent; Balkaya, Çağlayan; Şeren, Aysel; Kaya, Mehmet Ali; Lightfoot, Christopher Sherwin

    2014-01-01

    On the basis of geophysical imaging surveys, including geomagnetic and geoelectrical resistivity, possible archaeological remains and their spatial parameters (i.e., location, extension, depth and thickness) were explored to provide useful data for future excavations on the Upper City of the ancient Amorium site, which comprises a large prehistoric man-made mound. The surveys were performed very close to the main axis of the Basilica, and the derived geophysical traces indicated some subsurface structures that appear to confirm that more-substantial brick and masonry buildings lie near the present-day surface of the mound. Analyzing the local gradients by total horizontal derivatives of pseudogravity data enhanced the edges of the magnetic sources. Additionally, a profile curvature technique, which has rarely been applied to potential field data sets, dramatically improved the magnetic-source body edges and the lineaments that may be associated with buried archaeological remains. The depths of these possible anthropogenic remains were estimated by applying the Euler deconvolution technique to the geomagnetic data set. The Euler solutions on tentative indices indicated that the depths of the source bodies are not more than about 3 m. Moreover, geoelectrical resistivity depth slices produced from the results of two- and three-dimensional linearized least-squares inversion techniques revealed high-resistivity anomalies within a depth of about 3 m from the ground surface, which is in close agreement with those obtained by applying the Euler deconvolution technique to the magnetic data. Based on the existence of some archaeological remains in the vicinity of the surveyed area, these geophysical anomalies were thought to be the possible traces of the buried remains and were suggested as targets for excavations. This study also emphasized that the data-processing techniques applied in this investigation should be suitable for providing an insight into the layout of the
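
    The edge-enhancement step mentioned above has a compact numerical form: the total horizontal derivative (THD) is the magnitude of the horizontal gradient of a gridded potential-field quantity, and it peaks over the edges of buried sources. The grid below is synthetic, not the Amorium survey data.

```python
# Illustrative numpy sketch of the edge-enhancement step: the total horizontal
# derivative (THD) of a gridded field peaks over the edges of buried sources.
# The grid here is synthetic, not the Amorium survey data.
import numpy as np

def total_horizontal_derivative(field, dx=1.0, dy=1.0):
    gy, gx = np.gradient(field, dy, dx)   # derivatives along rows, then columns
    return np.hypot(gx, gy)

# A flat-topped block anomaly: THD vanishes on the flat interior and is
# largest on the rim, outlining the source edges.
grid = np.zeros((9, 9))
grid[3:6, 3:6] = 1.0
thd = total_horizontal_derivative(grid)
print(thd[4, 4], thd[4, 2])   # interior of the block vs. its rim
```

    Applying the same operator to pseudogravity data, as the abstract describes, outlines source-body edges in the same way.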

  14. The genetic basis for altered blood vessel function in disease: large artery stiffening

    Directory of Open Access Journals (Sweden)

    Alex Agrotis

    2005-12-01

    Full Text Available Alex Agrotis, The Cell Biology Laboratory, Baker Heart Research Institute, Melbourne, Victoria, Australia. Abstract: The progressive stiffening of the large arteries in humans that occurs during aging constitutes a potential risk factor for increased cardiovascular morbidity and mortality, and is accompanied by an elevation in systolic blood pressure and pulse pressure. While the underlying basis for these changes remains to be fully elucidated, factors that are able to influence the structure and composition of the extracellular matrix and the way it interacts with arterial smooth muscle cells could profoundly affect the properties of the large arteries. Thus, while age and sex represent important factors contributing to large artery stiffening, the variation in growth-stimulating factors and those that modulate extracellular production and homeostasis are also being increasingly recognized to play a key role in the process. Therefore, elucidating the contribution that genetic variation makes to large artery stiffening could ultimately provide the basis for clinical strategies designed to regulate the process for therapeutic benefit. Keywords: arterial stiffness, genes, polymorphism, extracellular matrix proteins

  15. Process mining in the large : a tutorial

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Zimányi, E.

    2014-01-01

    Recently, process mining emerged as a new scientific discipline on the interface between process models and event data. On the one hand, conventional Business Process Management (BPM) and Workflow Management (WfM) approaches and tools are mostly model-driven with little consideration for event data.

  16. A framework for the direct evaluation of large deviations in non-Markovian processes

    International Nuclear Information System (INIS)

    Cavallaro, Massimo; Harris, Rosemary J

    2016-01-01

    We propose a general framework to simulate stochastic trajectories with arbitrarily long memory dependence and efficiently evaluate large deviation functions associated with time-extensive observables. This extends the ‘cloning’ procedure of Giardinà et al (2006 Phys. Rev. Lett. 96 120603) to non-Markovian systems. We demonstrate the validity of this method by testing non-Markovian variants of an ion-channel model and the totally asymmetric exclusion process, recovering results obtainable by other means. (letter)

  17. Communication, Psychosocial, and Educational Outcomes of Children with Cochlear Implants and Challenges Remaining for Professionals and Parents

    Directory of Open Access Journals (Sweden)

    Renée Punch

    2011-01-01

    Full Text Available This paper provides an overview and a synthesis of the findings of a large, multifaceted study investigating outcomes from paediatric cochlear implantation. The study included children implanted at several Australian implant clinics and attending a variety of early intervention and educational settings across a range of locations in eastern Australia. It investigated three major aspects of childhood cochlear implantation: (1) parental expectations of their children's implantation, (2) families' decision-making processes, and (3) the communication, social, and educational outcomes of cochlear implantation for deaf children. It employed a mixed-methods approach in which quantitative survey data were gathered from 247 parents and 151 teachers, and qualitative data from semistructured interviews with 27 parents, 15 teachers, and 11 children and adolescents with cochlear implants. The summarised findings highlight several areas where challenges remain for implant clinics, parents, and educators if children with cochlear implants are to reach their full potential personally, educationally, and socially.

  18. Computation of large covariance matrices by SAMMY on graphical processing units and multicore CPUs

    International Nuclear Information System (INIS)

    Arbanas, G.; Dunn, M.E.; Wiarda, D.

    2011-01-01

    Computational power of Graphical Processing Units and multicore CPUs was harnessed by the nuclear data evaluation code SAMMY to speed up computations of large Resonance Parameter Covariance Matrices (RPCMs). This was accomplished by linking SAMMY to vendor-optimized implementations of the matrix-matrix multiplication subroutine of the Basic Linear Algebra Library to compute the most time-consuming step. The ²³⁵U RPCM computed previously using a triple-nested loop was re-computed using the NVIDIA implementation of the subroutine on a single Tesla Fermi Graphical Processing Unit, and also using Intel's Math Kernel Library implementation on two different multicore CPU systems. A multiplication of two matrices of dimensions 16,000×20,000 that had previously taken days took approximately one minute on the GPU. Comparable performance was achieved on a dual six-core CPU system. The magnitude of the speed-up suggests that these, or similar, combinations of hardware and libraries may be useful for large matrix operations in SAMMY. Uniform interfaces of standard linear algebra libraries make them a promising candidate for a programming framework of a new generation of SAMMY for the emerging heterogeneous computing platforms. (author)
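
The speed-up described above, a vendor-tuned BLAS routine versus a triple-nested loop, can be reproduced in miniature. A sketch in Python rather than SAMMY's own code: numpy's `@` operator dispatches to whatever BLAS library the installation links against (MKL, OpenBLAS, etc.), and the dimensions here are scaled far down from the 16,000×20,000 case.

```python
import time
import numpy as np

def naive_matmul(a, b):
    """Triple-nested loop, analogous in spirit to the original implementation."""
    n, k = a.shape
    k2, m = b.shape
    assert k == k2
    out = np.zeros((n, m))
    for i in range(n):
        for j in range(m):
            s = 0.0
            for p in range(k):
                s += a[i, p] * b[p, j]
            out[i, j] = s
    return out

rng = np.random.default_rng(0)
a = rng.standard_normal((120, 80))
b = rng.standard_normal((80, 100))

t0 = time.perf_counter()
slow = naive_matmul(a, b)
t_loop = time.perf_counter() - t0

t0 = time.perf_counter()
fast = a @ b  # dispatched to the linked BLAS gemm routine
t_blas = time.perf_counter() - t0

assert np.allclose(slow, fast)
print(f"loop: {t_loop:.4f}s  BLAS: {t_blas:.6f}s")
```

Even at this toy size the BLAS call is typically orders of magnitude faster; the gap widens dramatically at the matrix sizes quoted in the record.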

  19. Computation of large covariance matrices by SAMMY on graphical processing units and multicore CPUs

    Energy Technology Data Exchange (ETDEWEB)

    Arbanas, G.; Dunn, M.E.; Wiarda, D., E-mail: arbanasg@ornl.gov, E-mail: dunnme@ornl.gov, E-mail: wiardada@ornl.gov [Oak Ridge National Laboratory, Oak Ridge, TN (United States)

    2011-07-01

    Computational power of Graphical Processing Units and multicore CPUs was harnessed by the nuclear data evaluation code SAMMY to speed up computations of large Resonance Parameter Covariance Matrices (RPCMs). This was accomplished by linking SAMMY to vendor-optimized implementations of the matrix-matrix multiplication subroutine of the Basic Linear Algebra Library to compute the most time-consuming step. The ²³⁵U RPCM computed previously using a triple-nested loop was re-computed using the NVIDIA implementation of the subroutine on a single Tesla Fermi Graphical Processing Unit, and also using Intel's Math Kernel Library implementation on two different multicore CPU systems. A multiplication of two matrices of dimensions 16,000×20,000 that had previously taken days took approximately one minute on the GPU. Comparable performance was achieved on a dual six-core CPU system. The magnitude of the speed-up suggests that these, or similar, combinations of hardware and libraries may be useful for large matrix operations in SAMMY. Uniform interfaces of standard linear algebra libraries make them a promising candidate for a programming framework of a new generation of SAMMY for the emerging heterogeneous computing platforms. (author)

  20. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.; et al.

    2013-01-01

    Soil moisture is an essential climate variable influencing land atmosphere interactions, an essential hydrologic variable impacting rainfall runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  1. Prognostic modelling options for remaining useful life estimation by industry

    Science.gov (United States)

    Sikorska, J. Z.; Hodkiewicz, M.; Ma, L.

    2011-07-01

    Over recent years a significant amount of research has been undertaken to develop prognostic models that can be used to predict the remaining useful life of engineering assets. Implementations by industry have only had limited success. By design, models are subject to specific assumptions and approximations, some of which are mathematical, while others relate to practical implementation issues such as the amount of data required to validate and verify a proposed model. Therefore, appropriate model selection for successful practical implementation requires not only a mathematical understanding of each model type, but also an appreciation of how a particular business intends to utilise a model and its outputs. This paper discusses business issues that need to be considered when selecting an appropriate modelling approach for trial. It also presents classification tables and process flow diagrams to assist industry and research personnel select appropriate prognostic models for predicting the remaining useful life of engineering assets within their specific business environment. The paper then explores the strengths and weaknesses of the main prognostics model classes to establish what makes them better suited to certain applications than to others and summarises how each have been applied to engineering prognostics. Consequently, this paper should provide a starting point for young researchers first considering options for remaining useful life prediction. The models described in this paper are Knowledge-based (expert and fuzzy), Life expectancy (stochastic and statistical), Artificial Neural Networks, and Physical models.

  2. Determination of Remaining Useful Life of Gas Turbine Blade

    Directory of Open Access Journals (Sweden)

    Meor Said Mior Azman

    2016-01-01

    Full Text Available The aim of this research is to determine the remaining useful life of gas turbine blades, using service-exposed turbine blades. This task is performed using a Stress Rupture Test (SRT) under accelerated test conditions, where the stress applied to the specimen is between 400 MPa and 600 MPa and the test temperature is 850°C. The study focuses on the creep behaviour of the 52000-hour service-exposed blades, complemented with creep-rupture modelling using JMatPro software and microstructure examination using an optical microscope. The test specimens, made of the Ni-based superalloy of the first-stage turbine blades, are machined based on International Standard (ISO 24. The results from the SRT are analyzed using two main equations: the Larson-Miller Parameter and the Life Fraction Rule. Based on the results of the remaining useful life analysis, the 52000 h service-exposed blade can operate for another 4751 h to 18362 h. The microstructure examinations show traces of carbide precipitation that deteriorates the grain boundaries during the creep process. Creep-rupture life modelling using JMatPro software has shown good agreement with the accelerated creep rupture test, with minimal error.
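
The Larson-Miller Parameter mentioned above folds temperature and rupture time into a single master-curve variable, LMP = T(C + log₁₀ t), with T in kelvin, t in hours, and C conventionally taken near 20. A minimal sketch of how accelerated-test data extrapolate to service conditions; the constant, temperatures, and test life below are illustrative, not values from the study:

```python
import math

def larson_miller(temp_k, time_h, c=20.0):
    """LMP = T * (C + log10(t_r)); T in kelvin, rupture time in hours.
    C = 20 is the conventional constant, assumed here for illustration."""
    return temp_k * (c + math.log10(time_h))

def rupture_life_hours(lmp, temp_k, c=20.0):
    """Invert the parameter: life at temp_k for a fixed stress level,
    assuming LMP stays constant along the master curve."""
    return 10.0 ** (lmp / temp_k - c)

# Illustrative numbers only: an accelerated test at 850 degC failing
# after 500 h maps to a much longer life at a cooler service temperature.
lmp = larson_miller(850.0 + 273.15, 500.0)
print(rupture_life_hours(lmp, 800.0 + 273.15))
```

The same parameter underpins the stress-rupture analysis in the record: each accelerated SRT point gives an (LMP, stress) pair, and service life is read off the fitted curve at the service temperature.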

  3. A comparison between decomposition rates of buried and surface remains in a temperate region of South Africa.

    Science.gov (United States)

    Marais-Werner, Anátulie; Myburgh, J; Becker, P J; Steyn, M

    2018-01-01

    Several studies have been conducted on decomposition patterns and rates of surface remains; however, much less is known about this process for buried remains. Understanding the process of decomposition in buried remains is extremely important and aids in criminal investigations, especially when attempting to estimate the post mortem interval (PMI). The aim of this study was to compare the rates of decomposition between buried and surface remains. For this purpose, 25 pigs (Sus scrofa; 45-80 kg) were buried and excavated at different post mortem intervals (7, 14, 33, 92, and 183 days). The observed total body scores were then compared to those of surface remains decomposing at the same location. Stages of decomposition were scored according to separate categories for different anatomical regions based on standardised methods. Variation in the degree of decomposition was considerable, especially with the buried 7-day interval pigs, which displayed different degrees of discolouration in the lower abdomen and trunk. At 14 and 33 days, buried pigs displayed features commonly associated with the early stages of decomposition, but with less variation. A state of advanced decomposition was reached, after which little change was observed over the subsequent ~90-183 days after interment. Although the patterns of decomposition for buried and surface remains were very similar, the rates differed considerably. Based on the observations made in this study, guidelines for the estimation of PMI are proposed. These pertain to buried remains found at a depth of approximately 0.75 m in the Central Highveld of South Africa.

  4. What remains of the Arrow oil?

    International Nuclear Information System (INIS)

    Sergy, G.; Owens, E.

    1993-01-01

    In February 1970, the tanker Arrow became grounded 6.5 km off the north shore of Chedabucto Bay, Nova Scotia, and nearly 72,000 bbl of Bunker C fuel oil were released from the vessel during its subsequent breakup and sinking. The oil was washed ashore in various degrees over an estimated 305 km of the bay's 604-km shoreline, of which only 48 km were cleaned. In addition, the tanker Kurdistan broke in two in pack ice in March 1979 in the Cabot Strait area, spilling ca 54,000 bbl of Bunker C, some of which was later found at 16 locations along the northeast and east shorelines of Chedabucto Bay. In summer 1992, a systematic ground survey of the bay's shorelines was conducted using Environment Canada Shoreline Cleanup Assessment Team (SCAT) procedures. Standard observations were made of oil distribution and width, thickness, and character of the oil residues in 419 coastal segments. Results from the survey are summarized. Oil was found to be present on 13.3 km of the shoreline, with heavy oiling restricted to 1.3 km primarily in the areas of Black Duck Cove and Lennox Passage. Some of this residual oil was identified as coming from the Arrow. Natural weathering processes account for removal of most of the spilled oil from the bay. Oil remaining on the shore was found in areas outside of the zone of physical wave action, in areas of nearshore mixing where fine sediments are not present to weather the oil through biophysical processes, or in crusts formed by oil weathered on the surface. The systematic description of oiled shorelines using the SCAT methodology proved very successful, even for such an old spill. 6 refs

  5. Dyadic Processes in Early Marriage: Attributions, Behavior, and Marital Quality

    Science.gov (United States)

    Durtschi, Jared A.; Fincham, Frank D.; Cui, Ming; Lorenz, Frederick O.; Conger, Rand D.

    2011-01-01

    Marital processes in early marriage are important for understanding couples' future marital quality. Spouses' attributions about a partner's behavior have been linked to marital quality, yet the mechanisms underlying this association remain largely unknown. When we used couple data from the Family Transitions Project (N = 280 couples) across the…

  6. Aftershocks and triggering processes in rock fracture

    Science.gov (United States)

    Davidsen, J.; Kwiatek, G.; Goebel, T.; Stanchits, S. A.; Dresen, G.

    2017-12-01

    One of the hallmarks of our understanding of seismicity in nature is the importance of triggering processes, which makes the forecasting of seismic activity feasible. These triggering processes, by which one earthquake induces (dynamic or static) stress changes leading to potentially multiple other earthquakes, are at their core relaxation processes. A specific example of triggering is aftershocks following a large earthquake, which have been observed to follow certain empirical relationships such as the Omori-Utsu relation. Such an empirical relation should arise from the underlying microscopic dynamics of the involved physical processes, but the exact connection remains to be established. Simple explanations have been proposed but their general applicability is unclear. Many explanations involve the picture of an earthquake as a purely frictional sliding event. Here, we present experimental evidence that these empirical relationships are not limited to frictional processes but also arise in fracture zone formation and are mostly related to compaction-type events. Our analysis is based on tri-axial compression experiments under constant displacement rate on sandstone and granite samples using spatially located acoustic emission events and their focal mechanisms. More importantly, we show that event-event triggering plays an important role in the presence of large-scale or macroscopic imperfections, while such triggering is basically absent if no significant imperfections are present. We also show that spatial localization and an increase in activity rates close to failure do not necessarily imply triggering behavior associated with aftershocks. Only if a macroscopic crack is formed and its propagation remains subcritical do we observe significant triggering.
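
The Omori-Utsu relation cited above describes the aftershock rate as a power-law decay, n(t) = K / (c + t)^p. A minimal sketch; K, c, and p are empirical constants fitted per sequence, and the defaults below are purely illustrative:

```python
def omori_utsu_rate(t, k=100.0, c=0.1, p=1.1):
    """Aftershock rate n(t) = K / (c + t)**p, with t in days after the
    mainshock. Parameter values here are illustrative, not fitted."""
    return k / (c + t) ** p

# The rate decays roughly as a power law with elapsed time:
for day in (0.1, 1.0, 10.0, 100.0):
    print(day, omori_utsu_rate(day))
```

Fitting these three constants to located acoustic emission events is one standard way to test whether laboratory catalogs, like natural seismicity, exhibit Omori-type triggering.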

  7. Geoinformation web-system for processing and visualization of large archives of geo-referenced data

    Science.gov (United States)

    Gordov, E. P.; Okladnikov, I. G.; Titov, A. G.; Shulgina, T. M.

    2010-12-01

    A working model of an information-computational system aimed at scientific research in the area of climate change is presented. The system allows processing and analysis of large archives of geophysical data obtained from both observations and modeling. Accumulated experience in developing information-computational web-systems that provide computational processing and visualization of large archives of geo-referenced data was used during the implementation (Gordov et al, 2007; Okladnikov et al, 2008; Titov et al, 2009). The functional capabilities of the system comprise a set of procedures for mathematical and statistical analysis, processing, and visualization of data. At present, five archives of data are available for processing: the 1st and 2nd editions of the NCEP/NCAR Reanalysis, the ECMWF ERA-40 Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, and the NOAA-CIRES XX Century Global Reanalysis Version I. To provide data-processing functionality, a computational modular kernel and a class library providing data access for computational modules were developed. A set of computational modules for WMO-approved climate change indices is currently available, along with a special module for visualizing results and writing them to Encapsulated PostScript, GeoTIFF, and ESRI shape files. The GeoServer software, conforming to OpenGIS standards, is used as the technological basis for representing cartographical information on the Internet. GIS functionality has been integrated with web-portal software to provide a basis for developing the web portal as part of the geoinformation web-system. Such a geoinformation web-system is the next step in the development of applied information-telecommunication systems, offering specialists from various scientific fields unique opportunities to perform reliable analysis of heterogeneous geophysical data using approved computational algorithms. It will allow a wide range of researchers to work with geophysical data without specific programming

  8. Asymptotic description of two metastable processes of solidification for the case of large relaxation time

    International Nuclear Information System (INIS)

    Omel'yanov, G.A.

    1995-07-01

    The non-isothermal Cahn-Hilliard equations in the n-dimensional case (n = 2,3) are considered. The interaction length is proportional to a small parameter, and the relaxation time is proportional to a constant. The asymptotic solutions describing two metastable processes are constructed and justified. The soliton-type solution describes the first stage of separation in an alloy, when a set of "superheated liquid" appears inside the "solid" part. The Van der Waals-type solution describes the free interface dynamics for large time. The smoothness of temperature is established for large time and the Mullins-Sekerka problem describing the free interface is derived. (author). 46 refs

  9. Forest landscape models, a tool for understanding the effect of the large-scale and long-term landscape processes

    Science.gov (United States)

    Hong S. He; Robert E. Keane; Louis R. Iverson

    2008-01-01

    Forest landscape models have become important tools for understanding large-scale and long-term landscape (spatial) processes such as climate change, fire, windthrow, seed dispersal, insect outbreak, disease propagation, forest harvest, and fuel treatment, because controlled field experiments designed to study the effects of these processes are often not possible (...

  10. Process Ambidexterity for Entrepreneurial Firms

    Directory of Open Access Journals (Sweden)

    Sonia D. Bot

    2012-04-01

    Full Text Available Technology-based entrepreneurial firms must effectively support both mainstream exploitation and new-stream exploration in order to remain competitive for the long term. The processes that support exploitation and exploration initiatives are different in terms of logistics, payoff horizons, and capabilities. Few firms are able to strike a balance between the two, where mainstream exploitation usually trumps new-stream exploration. The ultimate goal is for the firm to operate effectively in a repeatable, scalable, and systematic manner, rather than relying on good luck and hoping either to come up with the next innovation or for the product to function according to its requirements. This article builds on the author’s years of experience in building businesses and transforming medium and large-sized, entrepreneurial technology firms, leading large-scale breakthrough and sustained performance improvements by using and evolving Lean Six Sigma methodologies, and reviews of technology innovation management and entrepreneurship literature. This article provides a process-based perspective to understanding and addressing the issues on balancing mainstream exploitation and new-stream exploration in medium and large-sized entrepreneurial firms and extending it to startups. The resulting capability is known as process ambidexterity and requires disciplined, agile, and lean business management.

  11. In-database processing of a large collection of remote sensing data: applications and implementation

    Science.gov (United States)

    Kikhtenko, Vladimir; Mamash, Elena; Chubarov, Dmitri; Voronina, Polina

    2016-04-01

    Large archives of remote sensing data are now available to scientists, yet the need to work with individual satellite scenes or product files constrains studies that span a wide temporal range or spatial extent. The resources (storage capacity, computing power and network bandwidth) required for such studies are often beyond the capabilities of individual geoscientists. This problem has been tackled before in remote sensing research and inspired several information systems. Some of them, such as NASA Giovanni [1] and Google Earth Engine, have already proved their utility for science. Analysis tasks involving large volumes of numerical data are not unique to Earth Sciences. Recent advances in data science are enabled by the development of in-database processing engines that bring processing closer to storage, use declarative query languages to facilitate parallel scalability, and provide high-level abstraction of the whole dataset. We build on the idea of bridging the gap between file archives containing remote sensing data and databases by integrating files into a relational database as foreign data sources and performing analytical processing inside the database engine. Thereby a higher-level query language can efficiently address problems of arbitrary size: from accessing the data associated with a specific pixel or a grid cell to complex aggregation over spatial or temporal extents across a large number of individual data files. This approach was implemented using PostgreSQL for a Siberian regional archive of satellite data products holding hundreds of terabytes of measurements from multiple sensors and missions taken over a decade-long span. While preserving the original storage layout, and therefore compatibility with existing applications, the in-database processing engine provides a toolkit for provisioning remote sensing data in scientific workflows and applications. The use of SQL, a widely used higher-level declarative query language, simplifies interoperability.
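
The in-database idea, pushing aggregation into the engine and addressing the archive declaratively instead of looping over files in application code, can be illustrated with Python's built-in sqlite3 as a lightweight stand-in for the PostgreSQL setup described above. The table layout and values are invented for the example:

```python
import sqlite3

# Stand-in for a table of per-pixel measurements exposed to the engine
# (in the real system, files are mapped in as foreign data sources).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE measurements (
    sensor TEXT, year INTEGER, lat REAL, lon REAL, value REAL)""")
rows = [
    ("MODIS", 2010, 55.0, 83.0, 271.5),
    ("MODIS", 2010, 55.0, 83.1, 272.0),
    ("MODIS", 2011, 55.0, 83.0, 273.2),
]
conn.executemany("INSERT INTO measurements VALUES (?,?,?,?,?)", rows)

# One declarative query addresses the whole archive: mean value per
# year over a bounding box, aggregated inside the database engine.
query = """SELECT year, AVG(value) FROM measurements
           WHERE lat BETWEEN 54.5 AND 55.5 AND lon BETWEEN 82.5 AND 83.5
           GROUP BY year ORDER BY year"""
for year, mean in conn.execute(query):
    print(year, round(mean, 2))
```

The same query shape scales from a single pixel (tighten the bounding box) to continent-wide aggregates, which is exactly the "problems of arbitrary size" property the record emphasizes.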

  12. Fish remains and humankind: part two

    Directory of Open Access Journals (Sweden)

    Andrew K G Jones

    1998-07-01

    Full Text Available The significance of aquatic resources to past human groups is not adequately reflected in the published literature - a deficiency which is gradually being acknowledged by the archaeological community world-wide. The publication of the following three papers goes some way to redress this problem. Originally presented at an International Council of Archaeozoology (ICAZ) Fish Remains Working Group meeting in York, U.K. in 1987, these papers offer clear evidence of the range of interest in ancient fish remains across the world. Further papers from the York meeting were published in Internet Archaeology 3 in 1997.

  13. Large-scale simulation of ductile fracture process of microstructured materials

    International Nuclear Information System (INIS)

    Tian Rong; Wang Chaowei

    2011-01-01

    The promise of computational science in the extreme-scale computing era is to reduce and decompose macroscopic complexities into microscopic simplicities at the expense of high spatial and temporal computing resolution. In materials science and engineering, the direct combination of 3D microstructure data sets and 3D large-scale simulations provides a unique opportunity for developing a comprehensive understanding of nano/microstructure-property relationships in order to systematically design materials with specific desired properties. In this paper, we present a framework simulating the ductile fracture process zone in microstructural detail. The experimentally reconstructed microstructural data set is directly embedded into a FE mesh model to improve the simulation fidelity of microstructure effects on fracture toughness. To the best of our knowledge, this is the first time that fracture toughness has been directly linked to multiscale microstructures in a realistic 3D numerical model. (author)

  14. A large-scale forest landscape model incorporating multi-scale processes and utilizing forest inventory data

    Science.gov (United States)

    Wen J. Wang; Hong S. He; Martin A. Spetich; Stephen R. Shifley; Frank R. Thompson III; David R. Larsen; Jacob S. Fraser; Jian. Yang

    2013-01-01

    Two challenges confronting forest landscape models (FLMs) are how to simulate fine, stand-scale processes while making large-scale (i.e., >10⁷ ha) simulation possible, and how to take advantage of extensive forest inventory data such as U.S. Forest Inventory and Analysis (FIA) data to initialize and constrain model parameters. We present the LANDIS PRO model that...

  15. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  16. The Liang Bua faunal remains: a 95k.yr. sequence from Flores, East Indonesia.

    Science.gov (United States)

    van den Bergh, G D; Meijer, H J M; Due Awe, Rokhus; Morwood, M J; Szabó, K; van den Hoek Ostende, L W; Sutikna, T; Saptomo, E W; Piper, P J; Dobney, K M

    2009-11-01

    Excavations at Liang Bua, a limestone cave on the island of Flores, East Indonesia, have yielded a well-dated archaeological and faunal sequence spanning the last 95k.yr., major climatic fluctuations, and two human species - H. floresiensis from 95 to 17k.yr.(1), and modern humans from 11k.yr. to the present. The faunal assemblage comprises well-preserved mammal, bird, reptile and mollusc remains, including examples of island gigantism in small mammals and the dwarfing of large taxa. Together with evidence from Early-Middle Pleistocene sites in the Soa Basin, it confirms the long-term isolation, impoverishment, and phylogenetic continuity of the Flores faunal community. The accumulation of Stegodon and Komodo dragon remains at the site in the Pleistocene is attributed to Homo floresiensis, while predatory birds, including an extinct species of owl, were largely responsible for the accumulation of the small vertebrates. The disappearance from the sequence of the two large-bodied, endemic mammals, Stegodon florensis insularis and Homo floresiensis, was associated with a volcanic eruption at 17 ka and precedes the earliest evidence for modern humans, who initiated use of mollusc and shell working, and began to introduce a range of exotic animals to the island. Faunal introductions during the Holocene included the Sulawesi warty pig (Sus celebensis) at about 7ka, followed by the Eurasian pig (Sus scrofa), Long-tailed macaque, Javanese porcupine, and Masked palm civet at about 4ka, and cattle, deer, and horse - possibly by the Portuguese within historic times. The Holocene sequence at the site also documents local faunal extinctions - a result of accelerating human population growth, habitat loss, and over-exploitation.

  17. Processing large sensor data sets for safeguards : the knowledge generation system.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Maikel A.; Smartt, Heidi Anne; Matthews, Robert F.

    2012-04-01

    Modern nuclear facilities, such as reprocessing plants, present inspectors with significant challenges due in part to the sheer amount of equipment that must be safeguarded. The Sandia-developed and patented Knowledge Generation system was designed to automatically analyze large amounts of safeguards data to identify anomalous events of interest by comparing sensor readings with those expected from a process of interest and with operator declarations. This paper describes a demonstration of the Knowledge Generation system using simulated accountability tank sensor data to represent part of a reprocessing plant. The demonstration indicated that Knowledge Generation has the potential to address several problems critical to the future of safeguards. It could be extended to facilitate remote inspections and trigger random inspections. Knowledge Generation could also analyze data to establish trust hierarchies and to facilitate safeguards use of operator-owned sensors.
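
    The core comparison the abstract describes — checking accountability-tank sensor readings against the values expected from operator declarations — can be sketched as follows. This is an illustrative toy, not the patented Knowledge Generation system; the declaration format, units, and tolerance below are assumptions.

```python
# Illustrative sketch: flag tank-level readings that deviate from the level
# expected under the operator's declared transfer schedule.

def expected_level(t, declared_transfers, initial=100.0):
    """Tank level implied by the operator's declarations at time t.
    declared_transfers: list of (start_time, rate, duration) tuples."""
    level = initial
    for start, rate, duration in declared_transfers:
        elapsed = min(max(t - start, 0.0), duration)
        level += rate * elapsed
    return level

def flag_anomalies(readings, declared_transfers, tolerance=2.0):
    """Return (time, measured, expected) for readings outside tolerance."""
    anomalies = []
    for t, measured in readings:
        exp = expected_level(t, declared_transfers)
        if abs(measured - exp) > tolerance:
            anomalies.append((t, measured, exp))
    return anomalies

declared = [(0.0, -1.0, 10.0)]                       # drain 1 unit/h for 10 h
readings = [(0.0, 100.0), (5.0, 95.2), (8.0, 85.0)]  # last reading deviates
print(flag_anomalies(readings, declared))            # [(8.0, 85.0, 92.0)]
```

    A real system would, as the abstract notes, combine many such sensor streams and a process model rather than a single declared schedule.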

  18. V and V-based remaining fault estimation model for safety-critical software of a nuclear power plant

    International Nuclear Information System (INIS)

    Eom, Heung-seop; Park, Gee-yong; Jang, Seung-cheol; Son, Han Seong; Kang, Hyun Gook

    2013-01-01

    Highlights: ► A software fault estimation model based on Bayesian Nets and V and V. ► Use of quantified data derived from qualitative V and V results. ► The fault insertion and elimination process is modeled probabilistically. ► Systematically estimates the expected number of remaining faults. -- Abstract: Quantitative software reliability measurement approaches have some limitations in demonstrating the proper level of reliability for safety-critical software. One of the more promising alternatives is the use of software development quality information. Particularly in the nuclear industry, regulatory bodies in most countries use both probabilistic and deterministic measures for ensuring the reliability of safety-grade digital computers in NPPs. The point of the deterministic criteria is to assess the whole development process and its related activities during the software development life cycle for the acceptance of safety-critical software, and software Verification and Validation (V and V) plays an important role in this process. In this light, we propose a V and V-based fault estimation method using Bayesian Nets to estimate the faults remaining in safety-critical software after the software development life cycle is completed. By modeling the fault insertion and elimination processes across all development phases, the proposed method systematically estimates the expected number of remaining faults.
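
    The insertion/elimination idea can be illustrated with a far simpler closed-form model than the paper's Bayesian Nets: if faults inserted in phase i survive each subsequent V and V activity independently with probability (1 - d_j), the expected remainder is a sum of products. All numbers and the independence assumption below are hypothetical.

```python
# Minimal probabilistic sketch (not the paper's Bayesian-net model):
# expected remaining faults = sum_i n_i * prod_{j >= i} (1 - d_j).

def expected_remaining_faults(inserted, detection):
    """inserted[i]: expected faults introduced in phase i.
    detection[j]: probability that V and V in phase j removes a fault
    present during that phase. Returns expected faults left at the end."""
    remaining = 0.0
    for i, n in enumerate(inserted):
        survive = 1.0
        for d in detection[i:]:
            survive *= (1.0 - d)
        remaining += n * survive
    return remaining

# Hypothetical requirements / design / implementation phases:
print(expected_remaining_faults([10, 20, 30], [0.5, 0.6, 0.7]))  # 12.0
```

    The Bayesian-net formulation generalizes this by letting the detection probabilities depend on qualitative V and V evidence rather than being fixed constants.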

  19. Remaining Sites Verification Package for the 126-B-2, 183-B Clearwells

    International Nuclear Information System (INIS)

    Dittmer, L.M.

    2007-01-01

    The 126-B-2, 183-B Clearwells were built as part of the 183-B Water Treatment Facility and are composed of two covered concrete reservoirs. The bulk of the water stored in the clearwells was used as process water to cool the 105-B Reactor and as a source of potable water. Residual conditions were determined to meet the remedial action objectives specified in the Remaining Sites ROD through an evaluation of the available process knowledge. The results of the evaluation do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also indicate that residual concentrations are protective of groundwater and the Columbia River.

  20. A Pipeline for Large Data Processing Using Regular Sampling for Unstructured Grids

    Energy Technology Data Exchange (ETDEWEB)

    Berres, Anne Sabine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Adhinarayanan, Vignesh [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Turton, Terece [Univ. of Texas, Austin, TX (United States); Feng, Wu [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Rogers, David Honegger [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-05-12

    Large simulation data requires a lot of time and computational resources to compute, store, analyze, visualize, and run user studies on. Today, the largest cost of a supercomputer is not hardware but maintenance, in particular energy consumption. Our goal is to balance energy consumption against the cognitive value of visualizations of the resulting data. This requires us to go through the entire processing pipeline, from simulation to user studies. To reduce the amount of resources, data can be sampled or compressed. While this adds more computation time, the computational overhead is negligible compared to the simulation time. We built a processing pipeline using regular sampling as an example. The reasons for this choice are twofold: a simple example avoids unnecessary complexity, since we know what to expect from the results, and it provides a good baseline for future, more elaborate sampling methods. We measured time and energy for each test, and we conducted user studies on Amazon Mechanical Turk (AMT) for a range of different results produced through sampling.
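
    One simple reading of the regular-sampling reduction step is to map scattered (unstructured) points onto a coarse regular grid, keeping one summary value per cell. The sketch below averages field values per cell; it is an assumption-laden illustration, not the paper's actual pipeline.

```python
# Hedged sketch: resample scattered points onto an nx-by-ny regular grid
# by averaging the field values that fall into each cell.

def resample_to_grid(points, nx, ny):
    """points: iterable of (x, y, value) with x, y in [0, 1).
    Returns an nx*ny grid of cell means (None where a cell is empty)."""
    sums = [[0.0] * ny for _ in range(nx)]
    counts = [[0] * ny for _ in range(nx)]
    for x, y, v in points:
        i, j = int(x * nx), int(y * ny)
        sums[i][j] += v
        counts[i][j] += 1
    return [[sums[i][j] / counts[i][j] if counts[i][j] else None
             for j in range(ny)] for i in range(nx)]

pts = [(0.1, 0.1, 1.0), (0.15, 0.12, 3.0), (0.9, 0.9, 5.0)]
grid = resample_to_grid(pts, 4, 4)
print(grid[0][0], grid[3][3])  # 2.0 5.0
```

    The grid resolution (nx, ny) is the knob that trades storage and transfer cost against visual fidelity, which is what the energy/cognitive-value balance in the abstract is about.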

  1. Quantifying hyporheic exchange dynamics in a highly regulated large river reach.

    Energy Technology Data Exchange (ETDEWEB)

    Hammond, Glenn Edward; Zhou, T; Huang, M; Hou, Z; Bao, J; Arntzen, E; Mackley, R; Harding, S; Titzler, S; Murray, C; Perkins, W; Chen, X; Stegen, J; Thorne, P; Zachara, J

    2017-03-01

    Hyporheic exchange is an important mechanism taking place in riverbanks and riverbed sediments, where river water and shallow groundwater mix and interact with each other. The direction, magnitude, and residence time of the hyporheic flux that penetrates the river bed are critical for biogeochemical processes such as carbon and nitrogen cycling, and biodegradation of organic contaminants. Many approaches including field measurements and numerical methods have been developed to quantify the hyporheic exchanges in relatively small rivers. However, the spatial and temporal distributions of hyporheic exchanges in a large, regulated river reach remain less explored due to the large spatial domains, complexity of geomorphologic features and subsurface properties, and the great pressure gradient variations at the riverbed created by dam operations.

  2. Processing and properties of large-sized ceramic slabs

    Directory of Open Access Journals (Sweden)

    Fossa, L.

    2010-10-01

    Large-sized ceramic slabs – with dimensions up to 360x120 cm and thickness down to 2 mm – are manufactured through an innovative ceramic process, starting from porcelain stoneware formulations and involving wet ball milling, spray drying, die-less slow-rate pressing, a single stage of fast drying-firing, and finishing (trimming, assembly of ceramic-fiberglass composites). Fired and unfired industrial slabs were selected and characterized from the technological, compositional (XRF, XRD) and microstructural (SEM) viewpoints. Semi-finished products exhibit a remarkable microstructural uniformity and stability over a rather wide window of firing schedules. The phase composition and compact microstructure of the fired slabs are very similar to those of porcelain stoneware tiles. The values of water absorption, bulk density, closed porosity, and functional performance, as well as the mechanical and tribological properties, conform to the top quality range of porcelain stoneware tiles. However, the large size coupled with low thickness gives the slab a certain degree of flexibility, which is enhanced in ceramic-fiberglass composites. These outstanding performances make the large-sized slabs suitable for novel applications: building and construction (new floorings laid without dismantling the previous paving, ventilated façades, tunnel coverings, insulating panelling), indoor furniture (table tops, doors), and supports for photovoltaic ceramic panels.

    [Translated from the Spanish abstract:] Large-format slabs, with dimensions up to 360x120 cm and thickness under 2 mm, have been manufactured by innovative production methods, starting from porcelain stoneware compositions and employing wet ball milling, spray drying, slow-rate die-less pressing, single-stage fast drying and firing, and a finishing step that includes bonding fiberglass to the ceramic body and grinding the final piece.

  3. Large deviations in stochastic heat-conduction processes provide a gradient-flow structure for heat conduction

    International Nuclear Information System (INIS)

    Peletier, Mark A.; Redig, Frank; Vafayi, Kiamars

    2014-01-01

    We consider three one-dimensional continuous-time Markov processes on a lattice, each of which models the conduction of heat: the family of Brownian Energy Processes with parameter m (BEP(m)), a Generalized Brownian Energy Process, and the Kipnis-Marchioro-Presutti (KMP) process. The hydrodynamic limit of each of these three processes is a parabolic equation, the linear heat equation in the case of the BEP(m) and the KMP, and a nonlinear heat equation for the Generalized Brownian Energy Process with parameter a (GBEP(a)). We prove the hydrodynamic limit rigorously for the BEP(m), and give a formal derivation for the GBEP(a). We then formally derive the pathwise large-deviation rate functional for the empirical measure of the three processes. These rate functionals imply gradient-flow structures for the limiting linear and nonlinear heat equations. We contrast these gradient-flow structures with those for processes describing the diffusion of mass, most importantly the class of Wasserstein gradient-flow systems. The linear and nonlinear heat-equation gradient-flow structures are each driven by entropy terms of the form −log ρ; they involve dissipation or mobility terms of order ρ² for the linear heat equation, and a nonlinear function of ρ for the nonlinear heat equation
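
    As a quick consistency check (schematic, with constants normalized to one), an entropy of the form −log ρ together with a mobility of order ρ² does indeed drive a gradient flow whose equation is the linear heat equation:

```latex
\mathcal{S}(\rho) = -\int \log \rho(x)\,\mathrm{d}x,
\qquad
\frac{\delta \mathcal{S}}{\delta \rho} = -\frac{1}{\rho},
\qquad
\partial_t \rho
  = \partial_x\!\Bigl(\rho^{2}\,\partial_x \frac{\delta \mathcal{S}}{\delta \rho}\Bigr)
  = \partial_x\!\Bigl(\rho^{2}\cdot\frac{\partial_x \rho}{\rho^{2}}\Bigr)
  = \partial_x^{2}\rho .
```

    The nonlinear case replaces the mobility ρ² by a nonlinear function of ρ, as stated in the abstract.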

  4. Level of processing modulates the neural correlates of emotional memory formation

    OpenAIRE

    Ritchey, Maureen; LaBar, Kevin S.; Cabeza, Roberto

    2010-01-01

    Emotion is known to influence multiple aspects of memory formation, including the initial encoding of the memory trace and its consolidation over time. However, the neural mechanisms whereby emotion impacts memory encoding remain largely unexplored. The present study employed a levels-of-processing manipulation to characterize the impact of emotion on encoding with and without the influence of elaborative processes. Participants viewed emotionally negative, neutral, and positive scenes under ...

  5. Looking at large data sets using binned data plots

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.

    1990-04-01

    This report addresses the monumental challenge of developing exploratory analysis methods for large data sets. The goals of the report are to increase awareness of large-data-set problems and to contribute simple graphical methods that address some of them. The graphical methods focus on two- and three-dimensional data and common tasks such as finding outliers and tail structure, assessing central structure, and comparing central structures. The methods handle large sample sizes through binning, incorporate information from statistical models, and adapt image processing algorithms. Examples demonstrate the application of the methods to a variety of publicly available large data sets. The most novel application addresses the "too many plots to examine" problem by using cognostics, computer-guided diagnostics, to prioritize plots. The particular application prioritizes views of computational fluid dynamics solution sets on the fly. That is, as each time step of a solution set is generated on a parallel processor, the cognostics algorithms assess virtual plots based on the previous time step. Work in such areas is in its infancy and the examples suggest numerous challenges that remain. 35 refs., 15 figs.
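
    The cognostics idea — score each candidate view with a cheap diagnostic and examine only the highest-ranked plots — can be sketched as follows. The score used here (fraction of points far from the median, in MAD units) is a stand-in, not the report's actual algorithm, and the view data are hypothetical.

```python
# Illustrative cognostic: rank candidate views by a cheap outlier score so
# an analyst looks only at the most "interesting" plots.

def outlier_score(values, k=3.0):
    """Fraction of points farther than k median-absolute-deviations
    from the median; 0.0 means nothing notable in this view."""
    s = sorted(values)
    median = s[len(s) // 2]
    mad = sorted(abs(v - median) for v in values)[len(values) // 2]
    if mad == 0:
        return 0.0
    return sum(abs(v - median) > k * mad for v in values) / len(values)

views = {
    "view_a": [1.0, 1.1, 0.9, 1.05, 0.95],
    "view_b": [1.0, 1.1, 0.9, 1.05, 25.0],   # contains an outlier
}
ranked = sorted(views, key=lambda name: outlier_score(views[name]),
                reverse=True)
print(ranked[0])  # view_b
```

    In the report's CFD application, scores like this would be computed per time step on virtual plots, so only the top-ranked views are ever rendered.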

  6. On the possibility of the multiple inductively coupled plasma and helicon plasma sources for large-area processes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jin-Won; Lee, Yun-Seong, E-mail: leeeeys@kaist.ac.kr; Chang, Hong-Young [Low-temperature Plasma Laboratory, Department of Physics, Korea Advanced Institute of Science and Technology, Daejeon 305-701 (Korea, Republic of); An, Sang-Hyuk [Agency of Defense Development, Yuseong-gu, Daejeon 305-151 (Korea, Republic of)

    2014-08-15

    In this study, we attempted to determine the feasibility of multiple inductively coupled plasma (ICP) and helicon plasma sources for large-area processes. Experiments were performed with one and two coils to measure plasma and electrical parameters, and a circuit simulation was performed to estimate the current in each coil in the two-coil experiment. Based on the results, multiple ICP sources appear feasible, owing to the direct change of impedance with current and the saturation of impedance caused by the skin-depth effect. However, a helicon plasma source is difficult to adapt to a multiple-source configuration because its real impedance changes continually during mode transitions and the B-field confinement has low uniformity. As a result, it is expected that ICP can be adapted to multiple sources for large-area processes.

  7. Manufacturing Process Simulation of Large-Scale Cryotanks

    Science.gov (United States)

    Babai, Majid; Phillips, Steven; Griffin, Brian

    2003-01-01

    NASA's Space Launch Initiative (SLI) is an effort to research and develop the technologies needed to build a second-generation reusable launch vehicle. It is required that this new launch vehicle be 100 times safer and 10 times cheaper to operate than current launch vehicles. Part of the SLI includes the development of reusable composite and metallic cryotanks. The size of these reusable tanks is far greater than anything ever developed and exceeds the design limits of current manufacturing tools. Several design and manufacturing approaches have been formulated, but many factors must be weighed during the selection process. Among these factors are tooling reachability, cycle times, feasibility, and facility impacts. The manufacturing process simulation capabilities available at NASA's Marshall Space Flight Center have played a key role in down-selecting between the various manufacturing approaches. By creating 3-D manufacturing process simulations, the various approaches can be analyzed in a virtual world before any hardware or infrastructure is built. This analysis can detect and eliminate costly flaws in the various manufacturing approaches. The simulations check for collisions between devices, verify that design limits on joints are not exceeded, and provide cycle times which aid in the development of an optimized process flow. In addition, new ideas and concerns are often raised after seeing the visual representation of a manufacturing process flow. The output of the manufacturing process simulations allows cost and safety comparisons to be performed between the various manufacturing approaches. This output helps determine which manufacturing process options reach the safety and cost goals of the SLI. As part of the SLI, The Boeing Company was awarded a basic period contract to research and propose options for both a metallic and a composite cryotank. Boeing then entered into a task agreement with the Marshall Space Flight Center to provide manufacturing

  8. Decision process in MCDM with large number of criteria and heterogeneous risk preferences

    Directory of Open Access Journals (Sweden)

    Jian Liu

    A new decision process is proposed to address the challenges posed by a large number of criteria in multi-criteria decision making (MCDM) problems and by decision makers with heterogeneous risk preferences. First, from the perspective of objective data, the effective criteria are extracted based on the similarity relations between criterion values, and the criteria are weighted accordingly. Second, corresponding theoretical models of risk-preference expectations are built, based on the possibility and similarity between criterion values, to resolve the problem of different interval numbers sharing the same expectation. The risk preferences (risk-seeking, risk-neutral, and risk-averse) are then embedded in the decision process, and the optimal alternative is selected according to the decision makers' risk preferences based on the corresponding theoretical model. Finally, a new information-aggregation algorithm is proposed, based on maximizing the fairness of the decision results for the group decision, considering the coexistence of decision makers with heterogeneous risk preferences. The scientific rationality of this new method is verified through the analysis of a real case. Keywords: Heterogeneous, Risk preferences, Fairness, Decision process, Group decision

  9. “Why We Stay”: Immigrants’ motivations for remaining in communities impacted by anti-immigration policy

    Science.gov (United States)

    Valdez, Carmen R.; Valentine, Jessa L.; Padilla, Brian

    2013-01-01

    Although restrictive immigration policy reduces incentives for unauthorized immigrants to remain in the United States, many immigrants remain in their U.S. community in spite of the anti-immigration climate surrounding them. This study explores motivations shaping immigrants’ intentions to stay in Arizona after passage of Senate Bill 1070 in 2010, one of the most restrictive immigration policies in recent decades. We conducted three focus groups in a large metropolitan city in Arizona with Mexican immigrant parents (N = 25). Themes emerging from the focus groups described multiple and interlocking personal, family and community, and contemporary sociopolitical motivations to stay in their community, and suggest that some important motivating factors have evolved as a result of immigrants’ changing environment. Implications for research and social policy reform are discussed. PMID:23875853

  10. The Human Remains from HMS Pandora

    Directory of Open Access Journals (Sweden)

    D.P. Steptoe

    2002-04-01

    In 1977 the wreck of HMS Pandora (the ship that was sent to recapture the Bounty mutineers) was discovered off the north coast of Queensland. Since 1983, the Queensland Museum Maritime Archaeology section has carried out systematic excavation of the wreck. During the years 1986 and 1995-1998, more than 200 human bones and bone fragments were recovered. Osteological investigation revealed that this material represented three males. Their ages were estimated at approximately 17 ± 2 years, 22 ± 3 years and 28 ± 4 years, with statures of 168 ± 4 cm, 167 ± 4 cm, and 166 ± 3 cm respectively. All three individuals were probably Caucasian, although precise determination of ethnicity was not possible. In addition to poor dental hygiene, signs of chronic diseases suggestive of rickets and syphilis were observed. Evidence of spina bifida was seen on one of the skeletons, as were other skeletal anomalies. Various taphonomic processes affecting the remains were also observed and described. Compact bone was observed under the scanning electron microscope and found to be structurally coherent. Profiles of the three skeletons were compared with historical information about the 35 men lost with the ship, but no precise identification could be made. The investigation did not reveal the cause of death. Further research, such as DNA analysis, is being carried out at the time of publication.

  11. Report of the large solenoid detector group

    International Nuclear Information System (INIS)

    Hanson, G.G.; Mori, S.; Pondrom, L.G.

    1987-09-01

    This report presents a conceptual design of a large solenoid for studying physics at the SSC. The parameters and nature of the detector have been chosen based on present estimates of what is required to allow the study of heavy quarks, supersymmetry, heavy Higgs particles, WW scattering at large invariant masses, new W and Z bosons, and very large momentum transfer parton-parton scattering. Simply stated, the goal is to obtain optimum detection and identification of electrons, muons, neutrinos, jets, W's and Z's over a large rapidity region. The primary region of interest extends over ±3 units of rapidity, although the calorimetry must extend to ±5.5 units if optimal missing energy resolution is to be obtained. A magnetic field was incorporated because of the importance of identifying the signs of the charges for both electrons and muons and because of the added possibility of identifying tau leptons and secondary vertices. In addition, the existence of a magnetic field may prove useful for studying new physics processes about which we currently have no knowledge. Since hermeticity of the calorimetry is extremely important, the entire central and endcap calorimeters were located inside the solenoid. This does not at the moment seem to produce significant problems (although many issues remain to be resolved) and in fact leads to a very effective muon detector in the central region

  12. Neutral processes forming large clones during colonization of new areas.

    Science.gov (United States)

    Rafajlović, M; Kleinhans, D; Gulliksson, C; Fries, J; Johansson, D; Ardehed, A; Sundqvist, L; Pereyra, R T; Mehlig, B; Jonsson, P R; Johannesson, K

    2017-08-01

    In species reproducing both sexually and asexually, clones are often more common in recently established populations. Earlier studies have suggested that this pattern arises due to natural selection favouring generally or locally successful genotypes in new environments. Alternatively, as we show here, this pattern may result from neutral processes during species' range expansions. We model a dioecious species expanding into a new area in which all individuals are capable of both sexual and asexual reproduction, and all individuals have equal survival rates and dispersal distances. Even under conditions that favour sexual recruitment in the long run, colonization starts with an asexual wave. After colonization is completed, a sexual wave erodes clonal dominance. If individuals reproduce more than one season, and with only local dispersal, a few large clones typically dominate for thousands of reproductive seasons. Adding occasional long-distance dispersal, more dominant clones emerge, but they persist for a shorter period of time. The general mechanism involved is simple: edge effects at the expansion front favour asexual (uniparental) recruitment where potential mates are rare. Specifically, our model shows that neutral processes (with respect to genotype fitness) during the population expansion, such as random dispersal and demographic stochasticity, produce genotype patterns that differ from the patterns arising in a selection model. The comparison with empirical data from a post-glacially established seaweed species (Fucus radicans) shows that in this case, a neutral mechanism is strongly supported. © 2017 The Authors. Journal of Evolutionary Biology Published by John Wiley & Sons ltd on Behalf of European Society for Evolutionary Biology.
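
    The edge-effect mechanism can be caricatured on a 1D lattice: an empty site is recruited sexually (a novel genotype) only when two occupied neighbours can supply both parents, while a lone neighbour founds a clonal copy, so expansion fronts advance asexually. This toy is much simpler than the paper's dioecious, stochastic model; the lattice size and founder placement are arbitrary choices.

```python
# Toy deterministic sketch of front-driven clonal expansion.

def expand(lattice, next_id):
    """One synchronous recruitment step; returns (new_lattice, next_id)."""
    new = lattice[:]
    for i, site in enumerate(lattice):
        if site is not None:
            continue
        nbrs = [lattice[j] for j in (i - 1, i + 1)
                if 0 <= j < len(lattice) and lattice[j] is not None]
        if len(nbrs) >= 2:
            new[i] = next_id      # sexual recruit: novel genotype
            next_id += 1
        elif len(nbrs) == 1:
            new[i] = nbrs[0]      # asexual recruit: parent's clone
    return new, next_id

lattice = [0] + [None] * 19 + [1]  # two founder clones at opposite edges
next_id = 2
for _ in range(20):
    lattice, next_id = expand(lattice, next_id)

# Two large clones dominate; one sexual genotype marks where fronts met.
print(lattice.count(0), lattice.count(1), len(set(lattice)))  # 10 10 3
```

    Even without any fitness differences, the occupied area ends up dominated by a few large clones, with sexual genotypes confined to where mates were locally available — the qualitative pattern the abstract describes.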

  13. Level of Processing Modulates the Neural Correlates of Emotional Memory Formation

    Science.gov (United States)

    Ritchey, Maureen; LaBar, Kevin S.; Cabeza, Roberto

    2011-01-01

    Emotion is known to influence multiple aspects of memory formation, including the initial encoding of the memory trace and its consolidation over time. However, the neural mechanisms whereby emotion impacts memory encoding remain largely unexplored. The present study used a levels-of-processing manipulation to characterize the impact of emotion on…

  14. Arguments Against a Configural Processing Account of Familiar Face Recognition.

    Science.gov (United States)

    Burton, A Mike; Schweinberger, Stefan R; Jenkins, Rob; Kaufmann, Jürgen M

    2015-07-01

    Face recognition is a remarkable human ability, which underlies a great deal of people's social behavior. Individuals can recognize family members, friends, and acquaintances over a very large range of conditions, and yet the processes by which they do this remain poorly understood, despite decades of research. Although a detailed understanding remains elusive, face recognition is widely thought to rely on configural processing, specifically an analysis of spatial relations between facial features (so-called second-order configurations). In this article, we challenge this traditional view, raising four problems: (1) configural theories are underspecified; (2) large configural changes leave recognition unharmed; (3) recognition is harmed by nonconfigural changes; and (4) in separate analyses of face shape and face texture, identification tends to be dominated by texture. We review evidence from a variety of sources and suggest that failure to acknowledge the impact of familiarity on facial representations may have led to an overgeneralization of the configural account. We argue instead that second-order configural information is remarkably unimportant for familiar face recognition. © The Author(s) 2015.

  15. Cell therapy-processing economics: small-scale microfactories as a stepping stone toward large-scale macrofactories.

    Science.gov (United States)

    Harrison, Richard P; Medcalf, Nicholas; Rafiq, Qasim A

    2018-03-01

    Manufacturing methods for cell-based therapies differ markedly from those established for noncellular pharmaceuticals and biologics. Attempts to 'shoehorn' these into existing frameworks have yielded poor outcomes. Some excellent clinical results have been realized, yet the emergence of a 'blockbuster' cell-based therapy has so far proved elusive. The pressure to provide these innovative therapies, even at a smaller scale, remains. In this process-economics research paper, we utilize cell expansion research data combined with operational cost modeling in a case study to demonstrate the alternative ways in which a novel mesenchymal stem cell-based therapy could be provided at small scale. This research outlines the feasibility of cell microfactories but highlights that there is strong pressure to automate processes and to split the quality-control cost burden over larger production batches. The study explores one potential paradigm of cell-based therapy provisioning as a potential exemplar on which to base manufacturing strategy.
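
    The quality-control cost-burden point can be shown with a back-of-envelope amortization: a fixed per-batch QC cost is spread over however many doses the batch yields. The numbers below are hypothetical, not taken from the paper's cost model.

```python
# Back-of-envelope sketch: per-dose cost falls as the fixed per-batch
# quality-control cost is amortized over a larger batch.

def cost_per_dose(fixed_qc, variable_per_dose, doses_per_batch):
    """Fixed QC cost shared across the batch, plus per-dose variable cost."""
    return fixed_qc / doses_per_batch + variable_per_dose

small = cost_per_dose(fixed_qc=20_000, variable_per_dose=500, doses_per_batch=10)
large = cost_per_dose(fixed_qc=20_000, variable_per_dose=500, doses_per_batch=200)
print(small, large)  # 2500.0 600.0
```

    This is the economic pressure the abstract names: microfactories are feasible, but larger (macrofactory) batches dilute the fixed QC burden per dose.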

  16. "Recent" macrofossil remains from the Lomonosov Ridge, central Arctic Ocean

    Science.gov (United States)

    Le Duc, Cynthia; de Vernal, Anne; Archambault, Philippe; Brice, Camille; Roberge, Philippe

    2016-04-01

    The examination of surface sediment samples collected from 17 sites along the Lomonosov Ridge at water depths ranging from 737 to 3339 meters during Polarstern Expedition PS87 in 2014 (Stein, 2015), indicates a rich biogenic content almost exclusively dominated by calcareous remains. Amongst biogenic remains, microfossils (planktic and benthic foraminifers, pteropods, ostracods, etc.) dominate, but millimetric to centimetric macrofossils occur frequently at the surface of the sediment. The macrofossil remains consist of a large variety of taxa, including gastropods, bivalvia, polychaete tubes, scaphopods, echinoderm plates and spines, and fish otoliths. Among the Bivalvia, the most abundant taxa are Portlandia arctica, Hyalopecten frigidus, Cuspidaria glacilis, Policordia densicostata, Bathyarca spp., and Yoldiella spp. Whereas a few specimens are well preserved and apparently pristine, most mollusk shells displayed extensive alteration features. Moreover, most shells were covered by millimeter-scale tubes of the serpulid polychaete Spirorbis sp., suggesting transport from the low intertidal or subtidal zone. Both the ecological affinity and known geographic distribution of the bivalvia named above support the hypothesis of transportation rather than local development. In addition to mollusk shells, more than a hundred fish otoliths were recovered in surface sediments. The otoliths mostly belong to the Gadidae family. Most of them are well preserved and without serpulid tubes attached to their surface, suggesting a local/regional origin, unlike the shell remains. Although recovered at the surface, the macrofaunal assemblages of the Lomonosov Ridge do not necessarily represent the "modern" environments, as they may result from reworking and because their occurrence at the surface of the sediment may also be due to winnowing of finer particles. 
Although the shells were not dated, we suspect that their actual ages may range from modern to several thousands of

  17. Development of Integrated Die Casting Process for Large Thin-Wall Magnesium Applications

    Energy Technology Data Exchange (ETDEWEB)

    Carter, Jon T. [General Motors LLC, Warren, MI (United States); Wang, Gerry [Meridian Lightweight Technologies, Plymouth MI (United States); Luo, Alan [General Motors LLC, Warren, MI (United States)

    2017-11-29

    The purpose of this project was to develop a process and product which would utilize magnesium die casting and result in energy savings when compared to the baseline steel product. The specific product chosen was a side door inner panel for a mid-size car. The scope of the project included: re-design of major structural parts of the door, design and build of the tooling required to make the parts, making of parts, assembly of doors, and testing (both physical and simulation) of doors. Additional work was done on alloy development, vacuum die casting, and overcasting, all in order to improve the performance of the doors and reduce cost. The project achieved the following objectives: 1. Demonstrated the ability to design a large thin-wall magnesium die casting. 2. Demonstrated the ability to manufacture a large thin-wall magnesium die casting in AM60 alloy. 3. Tested, via simulations and/or physical tests, the mechanical behavior and corrosion behavior of magnesium die castings and/or lightweight experimental automotive side doors which incorporate a large, thin-wall, powder-coated, magnesium die casting. Under some load cases, the results revealed cracking of the casting, which can be addressed with re-design and better material models for CAE analysis. No corrosion of the magnesium panel was observed. 4. Using life cycle analysis models, compared the energy consumption and global warming potential of the lightweight door with those of a conventional steel door, both during manufacture and in service. Compared to a steel door, the lightweight door requires more energy to manufacture but less energy during operation (i.e., fuel consumption when driving the vehicle). Similarly, compared to a steel door, the lightweight door has higher global warming potential (GWP) during manufacture, but lower GWP during operation. 5. Compared the conventional magnesium die casting process with the “super-vacuum” die casting process. Results achieved with cast tensile bars suggest some

  18. The use of fish remains in sediments for the reconstruction of paleoproductivity

    Energy Technology Data Exchange (ETDEWEB)

    Drago, T; Santos, A M P; Pinheiro, J [Instituto Nacional de Recursos Biológicos (INRB), L-IPIMAR, Av. 5 de Outubro s/n, 8700-305 Olhão (Portugal); Ferreira-Bartrina, V [Centro de Investigación Científica y de Educación Superior de Ensenada (CICESE), Km. 107 Carretera Tijuana, C.P. 22860, Ensenada, B.C. (Mexico)], E-mail: tdrago@ipimar.pt

    2009-01-01

    The majority of the works concerning fish productivity are based on fish-landing records. However, in order to understand the causes of variability in fish productivity (natural and/or anthropogenic), it is essential to have information from periods when human impacts (e.g., fisheries) are considered unimportant. This can be achieved through the use of fish remains, i.e. scales, vertebrae and otoliths, from sediment records. The obtained data can be used to develop time series of fish stocks revealing the history of fish population dynamics over the last centuries or millennia. The majority of these works are located in Eastern Boundary Current Systems (e.g., Benguela, Peru-Humboldt, California), because these are associated with coastal upwelling and high productivity, which in some cases gives rise to low bottom-oxygen levels, leading to scale preservation. A search for fish remains in the Portuguese margin sediments is in progress in the context of the ongoing research project POPEI (High-resolution oceanic paleoproductivity and environmental changes; correlation with fish populations), which intends to fill the gap in studies of this type for the Canary Current System. In this paper we review some general ideas of the use of fish remains, related studies, methodologies and data processing, as well as presenting the first results of POPEI.

  19. High-energy, large-momentum-transfer processes: Ladder diagrams in φ3 theory. Pt. 1

    International Nuclear Information System (INIS)

    Osland, P.; Wu, T.T.; Harvard Univ., Cambridge, MA

    1987-01-01

    Relativistic quantum field theories may give us useful guidance to understanding high-energy, large-momentum-transfer processes, where the center-of-mass energy is much larger than the transverse momentum transfers, which are in turn much larger than the masses of the participating particles. With this possibility in mind, we study the ladder diagrams in φ³ theory. In this paper, some of the necessary techniques are developed and applied to the simplest cases of the fourth- and sixth-order ladder diagrams. (orig.)

  20. Multi-format all-optical processing based on a large-scale, hybridly integrated photonic circuit.

    Science.gov (United States)

    Bougioukos, M; Kouloumentas, Ch; Spyropoulou, M; Giannoulis, G; Kalavrouziotis, D; Maziotis, A; Bakopoulos, P; Harmon, R; Rogers, D; Harrison, J; Poustie, A; Maxwell, G; Avramopoulos, H

    2011-06-06

    We investigate through numerical studies and experiments the performance of a large scale, silica-on-silicon photonic integrated circuit for multi-format regeneration and wavelength-conversion. The circuit encompasses a monolithically integrated array of four SOAs inside two parallel Mach-Zehnder structures, four delay interferometers and a large number of silica waveguides and couplers. Exploiting phase-incoherent techniques, the circuit is capable of processing OOK signals at variable bit rates, DPSK signals at 22 or 44 Gb/s and DQPSK signals at 44 Gbaud. Simulation studies reveal the wavelength-conversion potential of the circuit with enhanced regenerative capabilities for OOK and DPSK modulation formats and acceptable quality degradation for DQPSK format. Regeneration of 22 Gb/s OOK signals with amplified spontaneous emission (ASE) noise and DPSK data signals degraded with amplitude, phase and ASE noise is experimentally validated demonstrating a power penalty improvement up to 1.5 dB.

  1. Summer Decay Processes in a Large Tabular Iceberg

    Science.gov (United States)

    Wadhams, P.; Wagner, T. M.; Bates, R.

    2012-12-01

    Summer Decay Processes in a Large Tabular Iceberg Peter Wadhams (1), Till J W Wagner(1) and Richard Bates(2) (1) Department of Applied Mathematics and Theoretical Physics, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA, UK (2) Scottish Oceans Institute, School of Geography and Geosciences, University of St Andrews, St. Andrews, Scotland KY16 9AL We present observational results from an experiment carried out during July-August 2012 on a giant grounded tabular iceberg off Baffin Island. The iceberg studied was part of the Petermann Ice Island B1 (PIIB1) which calved off the Petermann Glacier in NW Greenland in 2010. Since 2011 it has been aground in 100 m of water on the Baffin Island shelf at 69 deg 06'N, 66 deg 06'W. As part of the project a set of high resolution GPS sensors and tiltmeters was placed on the ice island to record rigid body motion as well as flexural responses to wind, waves, current and tidal forces, while a Waverider buoy monitored incident waves and swell. On July 31, 2012 a major breakup event was recorded, with a piece of 25,000 sq m surface area calving off the iceberg. At the time of breakup, GPS sensors were collecting data both on the main berg as well as on the newly calved piece, while two of us (PW and TJWW) were standing on the broken-out portion which rose by 0.6 m to achieve a new isostatic equilibrium. Crucially, there was no significant swell at the time of breakup, which suggests a melt-driven decay process rather than wave-driven flexural break-up. The GPS sensors recorded two disturbances during the hour preceding the breakup, indicative of crack growth and propagation. Qualitative observation during the two weeks in which our research ship was moored to, or was close to, the ice island edge indicates that an important mechanism for summer ablation is successive collapses of the overburden from above an unsupported wave cut, which creates a submerged ram fringing the berg. A model of buoyancy stresses induced by

  2. Benchmarking processes for managing large international space programs

    Science.gov (United States)

    Mandell, Humboldt C., Jr.; Duke, Michael B.

    1993-01-01

    The relationship between management style and program costs is analyzed to determine the feasibility of financing large international space missions. The incorporation of management systems is considered to be essential to realizing low cost spacecraft and planetary surface systems. Several companies ranging from large Lockheed 'Skunk Works' to small companies including Space Industries, Inc., Rocket Research Corp., and Orbital Sciences Corp. were studied. It is concluded that to lower the prices, the ways in which spacecraft and hardware are developed must be changed. Benchmarking of successful low cost space programs has revealed a number of prescriptive rules for low cost managements, including major changes in the relationships between the public and private sectors.

  3. On the use of Cloud Computing and Machine Learning for Large-Scale SAR Science Data Processing and Quality Assessment Analysis

    Science.gov (United States)

    Hua, H.

    2016-12-01

    Geodetic imaging is revolutionizing geophysics, but the scope of discovery has been limited by labor-intensive technological implementation of the analyses. The Advanced Rapid Imaging and Analysis (ARIA) project has proven capability to automate SAR data processing and analysis. Existing and upcoming SAR missions such as Sentinel-1A/B and NISAR are also expected to generate massive amounts of SAR data. This has brought to the forefront the need for analytical tools for SAR quality assessment (QA) on large volumes of SAR data, a critical step before higher-level time series and velocity products can be reliably generated. Initially leveraging an advanced hybrid-cloud computing science data system for performing large-scale processing, machine learning approaches were augmented for automated analysis of various quality metrics. Machine-learning-based feature training, cross-validation, and prediction models were integrated into our cloud-based science data processing flow to enable large-scale, high-throughput QA analytics and to improve the production quality of geodetic data products.

  4. Auroral phenomenology and magnetospheric processes earth and other planets

    CERN Document Server

    Keiling, Andreas; Bagenal, Fran; Karlsson, Tomas

    2013-01-01

    Published by the American Geophysical Union as part of the Geophysical Monograph Series. Many of the most basic aspects of the aurora remain unexplained. While in the past terrestrial and planetary auroras have been largely treated in separate books, Auroral Phenomenology and Magnetospheric Processes: Earth and Other Planets takes a holistic approach, treating the aurora as a fundamental process and discussing the phenomenology, physics, and relationship with the respective planetary magnetospheres in one volume. While there are some behaviors common in auroras of the diffe

  5. Drell–Yan process at Large Hadron Collider

    Indian Academy of Sciences (India)

    The Drell–Yan process at the LHC, qq̄ → γ*/Z → ℓ⁺ℓ⁻, is one of the benchmarks for confirmation of the Standard Model at the TeV energy scale. Since the theoretical prediction for the rate is precise and the final state is clean as well as relatively easy to measure, the process can be studied at the LHC even at relatively low luminosity.

  6. Inconsistency in large pharmacogenomic studies

    DEFF Research Database (Denmark)

    Haibe-Kains, Benjamin; El-Hachem, Nehme; Birkbak, Nicolai Juul

    2013-01-01

    Two large-scale pharmacogenomic studies were published recently in this journal. Genomic data are well correlated between studies; however, the measured drug response data are highly discordant. Although the source of inconsistencies remains uncertain, it has potential implications for using...

  7. Large polarons in lead halide perovskites

    OpenAIRE

    Miyata, Kiyoshi; Meggiolaro, Daniele; Trinh, M. Tuan; Joshi, Prakriti P.; Mosconi, Edoardo; Jones, Skyler C.; De Angelis, Filippo; Zhu, X.-Y.

    2017-01-01

    Lead halide perovskites show marked defect tolerance responsible for their excellent optoelectronic properties. These properties might be explained by the formation of large polarons, but how they are formed and whether organic cations are essential remain open questions. We provide a direct time domain view of large polaron formation in single-crystal lead bromide perovskites CH3NH3PbBr3 and CsPbBr3. We found that large polarons form predominantly from the deformation of the PbBr3− framework…

  8. An innovative large scale integration of silicon nanowire-based field effect transistors

    Science.gov (United States)

    Legallais, M.; Nguyen, T. T. T.; Mouis, M.; Salem, B.; Robin, E.; Chenevier, P.; Ternon, C.

    2018-05-01

    Since the early 2000s, silicon nanowire field effect transistors have been emerging as ultrasensitive biosensors offering label-free, portable and rapid detection. Nevertheless, their large-scale production remains an ongoing challenge due to time-consuming, complex and costly fabrication technology. In order to bypass these issues, we report here on the first integration of silicon nanowire networks, called nanonets, into long-channel field effect transistors using a standard microelectronic process. Special attention is paid to the silicidation of the contacts, which involves a large number of SiNWs. The electrical characteristics of these FETs, constituted by randomly oriented silicon nanowires, are also studied. Compatibility with back-end integration on CMOS readout and promising electrical performance open new opportunities for sensing applications.

  9. Innovation Processes in Large-Scale Public Foodservice-Case Findings from the Implementation of Organic Foods in a Danish County

    DEFF Research Database (Denmark)

    Mikkelsen, Bent Egberg; Nielsen, Thorkild; Kristensen, Niels Heine

    2005-01-01

    is the idea that the large-scale foodservice such as hospital food service should adopt a buy organic policy due to their large buying volume. But whereas implementation of organic foods has developed quite unproblematically in smaller institutions such as kindergartens and nurseries, introduction of organic...... foods into large-scale foodservice such as that taking place in hospitals and larger homes for the elderly, has proven to be quite difficult. The very complex planning, procurement and processing procedures used in such facilities are among reasons for this. Against this background an evaluation...

  10. Psychotherapy for Borderline Personality Disorder: Progress and Remaining Challenges.

    Science.gov (United States)

    Links, Paul S; Shah, Ravi; Eynan, Rahel

    2017-03-01

    The main purpose of this review was to critically evaluate the literature on psychotherapies for borderline personality disorder (BPD) published over the past 5 years to identify the progress with remaining challenges and to determine priority areas for future research. A systematic review of the literature over the last 5 years was undertaken. The review yielded 184 relevant abstracts, and after applying inclusion criteria, 16 articles were fully reviewed based on the articles' implications for future research and/or clinical practice. Our review indicated that patients with various severities benefited from psychotherapy; more intensive therapies were not significantly superior to less intensive therapies; enhancing emotion regulation processes and fostering more coherent self-identity were important mechanisms of change; therapies had been extended to patients with BPD and posttraumatic stress disorder; and more research was needed to be directed at functional outcomes.

  11. Utilization of the MPI Process for in-tank solidification of heel material in large-diameter cylindrical tanks

    Energy Technology Data Exchange (ETDEWEB)

    Kauschinger, J.L.; Lewis, B.E.

    2000-01-01

    A major problem faced by the US Department of Energy is remediation of sludge and supernatant waste in underground storage tanks. Exhumation of the waste is currently the preferred remediation method. However, exhumation cannot completely remove all of the contaminated materials from the tanks. For large-diameter tanks, amounts of highly contaminated "heel" material approaching 20,000 gal can remain. Often sludge containing zeolite particles leaves "sand bars" of locally contaminated material across the floor of the tank. The best management practices for in-tank treatment (stabilization and immobilization) of wastes require an integrated approach to develop appropriate treatment agents that can be safely delivered and mixed uniformly with sludge. Ground Environmental Services has developed and demonstrated a remotely controlled, high-velocity jet delivery system termed Multi-Point-Injection (MPI). This robust jet delivery system has been field-deployed to create homogeneous monoliths containing shallow buried miscellaneous waste in trenches [fiscal year (FY) 1995] and surrogate sludge in cylindrical (FY 1998) and long, horizontal tanks (FY 1999). During the FY 1998 demonstration, the MPI process successfully formed a 32-ton uniform monolith of grout and waste surrogates in about 8 min. Analytical data indicated that 10 tons of zeolite-type physical surrogate were uniformly mixed within a 40-in.-thick monolith without lifting the MPI jetting tools off the tank floor. Over 1,000 lb of cohesive surrogates, with consistencies similar to Gunite and Associated Tank (GAAT) TH-4 and Hanford tank sludges, were easily intermixed into the monolith without exceeding a core temperature of 100 °F during curing.

  12. Design of an RF Antenna for a Large-Bore, High Power, Steady State Plasma Processing Chamber for Material Separation

    International Nuclear Information System (INIS)

    Rasmussen, D.A.; Freeman, R.L.

    2001-01-01

    The purpose of this Cooperative Research and Development Agreement (CRADA) between UT-Battelle, LLC, (Contractor), and Archimedes Technology Group, (Participant) is to evaluate the design of an RF antenna for a large-bore, high power, steady state plasma processing chamber for material separation. Criteria for optimization will be to maximize the power deposition in the plasma while operating at acceptable voltages and currents in the antenna structure.

  13. Research on the drawing process with a large total deformation wires of AZ31 alloy

    International Nuclear Information System (INIS)

    Bajor, T; Muskalski, Z; Suliga, M

    2010-01-01

    Magnesium and their alloys have been extensively studied in recent years, not only because of their potential applications as light-weight engineering materials, but also owing to their biodegradability. Due to their hexagonal close-packed crystallographic structure, cold plastic processing of magnesium alloys is difficult. The preliminary researches carried out by the authors have indicated that the application of the KOBO method, based on the effect of cyclic strain path change, for the deformation of magnesium alloys, provides the possibility of obtaining a fine-grained structure material to be used for further cold plastic processing with large total deformation. The main purpose of this work is to present research findings concerning a detailed analysis of mechanical properties and changes occurring in the structure of AZ31 alloy wire during the multistage cold drawing process. The appropriate selection of drawing parameters and the application of multistep heat treatment operations enable the deformation of the AZ31 alloy in the cold drawing process with a total draft of about 90%.

  14. Remaining Useful Lifetime Prognosis of Controlled Systems: A Case of Stochastically Deteriorating Actuator

    Directory of Open Access Journals (Sweden)

    Danh Ngoc Nguyen

    2015-01-01

    This paper addresses the case of an automatically controlled system that deteriorates during operation because of component wear. Depending on its specific closed-loop structure, the controlled system can compensate for disturbances affecting the actuators, so that deterioration can remain partially hidden. Deterioration modeling and Remaining Useful Lifetime (RUL) estimation for such closed-loop dynamic systems have not been addressed extensively. In this paper, we consider a controlled system with a Proportional-Integral-Derivative controller. It is assumed that the actuator is subject to shocks that occur randomly in time. An integrated model is proposed to jointly describe the state of the controlled process and the actuator deterioration. Only the output of the controlled system is available to assess its health condition. By considering a Piecewise Deterministic Markov Process, the RUL of the system can be estimated by a two-step approach. In the first step, referred to as the “Diagnosis” step, the system state is estimated online from the available monitoring observations using a particle filtering method. In the second step, referred to as the “Prognosis” step, the RUL is estimated as a conditional reliability by Monte Carlo simulation. To illustrate the approach, a simulated tank level control system is used.
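    The two-step Diagnosis/Prognosis scheme described above can be sketched as follows. This is a minimal illustration, not the paper's model: the shock process (Poisson arrivals with exponential sizes), the failure threshold, and all numerical values are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical deterioration model (illustrative, not the paper's PDMP):
# degradation grows by random shocks -- Poisson arrivals, exponential sizes.
SHOCK_RATE, SHOCK_MEAN, FAIL_LEVEL, OBS_NOISE = 0.5, 0.4, 10.0, 0.3

def step(d, dt=1.0):
    """Advance the hidden degradation level by one time unit."""
    n = rng.poisson(SHOCK_RATE * dt)
    return d + rng.exponential(SHOCK_MEAN, n).sum()

def observe(d):
    """Noisy health indicator derived from the system output."""
    return d + rng.normal(0.0, OBS_NOISE)

def particle_filter(obs, n_particles=2000):
    """'Diagnosis' step: bootstrap particle filter over the hidden state."""
    parts = np.zeros(n_particles)
    for y in obs:
        parts = np.array([step(p) for p in parts])         # propagate
        w = np.exp(-0.5 * ((y - parts) / OBS_NOISE) ** 2)  # likelihood weights
        w = (w + 1e-300) / (w + 1e-300).sum()              # normalise safely
        parts = rng.choice(parts, n_particles, p=w)        # resample
    return parts

def rul_samples(parts, horizon=500):
    """'Prognosis' step: Monte Carlo simulation of each particle to failure."""
    ruls = []
    for d in parts:
        t = 0
        while d < FAIL_LEVEL and t < horizon:
            d, t = step(d), t + 1
        ruls.append(t)
    return np.array(ruls)

# Simulate a degradation history, filter it, then estimate the RUL.
true_d, obs = 0.0, []
for _ in range(20):
    true_d = step(true_d)
    obs.append(observe(true_d))

posterior = particle_filter(obs)
ruls = rul_samples(posterior)
print(f"mean RUL = {ruls.mean():.1f}, "
      f"10th percentile = {np.quantile(ruls, 0.1):.1f}")
```

    With the posterior particles in hand, any RUL statistic (mean, quantiles, conditional reliability) follows directly from the simulated first-passage times.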

  15. Detecting change in dynamic process systems with immunocomputing

    Energy Technology Data Exchange (ETDEWEB)

    Yang, X.; Aldrich, C.; Maree, C. [University of Stellenbosch, Stellenbosch (South Africa). Dept. of Process Engineering

    2007-02-15

    The natural immune system is an adaptive, distributed pattern recognition system with several functional components designed for recognition, memory acquisition, diversity and self-regulation. In artificial immune systems, some of these characteristics are exploited in order to design computational systems capable of detecting novel patterns or anomalous system behaviour in some sense. Despite their obvious promise for fault diagnosis applications in process engineering, their potential remains largely unexplored in this regard. In this paper, the application of real-valued negative selection algorithms to simulated and real-world systems is considered. These algorithms deal with the self-nonself discrimination problem in immunocomputing, where normal process behaviour is coded as the self and any deviation from normal behaviour is encoded as nonself. The case studies have indicated that immunocomputing based on negative selection can provide competitive options for fault diagnosis in nonlinear process systems, but further work is required on large systems characterized by many variables.
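    A real-valued negative selection algorithm of the kind the abstract refers to can be sketched as follows. The unit-square state space, the Euclidean matching rule, the detector radii, and all numeric parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def train_detectors(self_data, n_detectors=300, self_radius=0.1):
    """Real-valued negative selection: generate random candidate detectors
    and keep only those that do NOT match any 'self' (normal) sample."""
    detectors = []
    while len(detectors) < n_detectors:
        cand = rng.random(self_data.shape[1])                 # random point
        gap = np.linalg.norm(self_data - cand, axis=1).min()  # dist to self
        if gap > self_radius:                                 # nonself space
            detectors.append((cand, gap - self_radius))       # radius < gap
    return detectors

def is_anomalous(x, detectors):
    """Nonself (potential fault) if any detector covers the point."""
    return any(np.linalg.norm(x - c) < r for c, r in detectors)

# 'Self' set: normal process states clustered in the middle of [0, 1]^2.
normal = np.clip(0.5 + 0.08 * rng.standard_normal((400, 2)), 0.0, 1.0)
dets = train_detectors(normal)

print(is_anomalous(np.array([0.50, 0.50]), dets))  # inside the self region
print(is_anomalous(np.array([0.95, 0.05]), dets))  # far from the self region
```

    In a fault-diagnosis setting, the self samples would be scaled measurements recorded under normal operation, and a detector firing on new data would flag a possible fault.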

  16. A case study of remaining storage life prediction using stochastic filtering with the influence of condition monitoring

    International Nuclear Information System (INIS)

    Wang, Zhaoqiang; Hu, Changhua; Wang, Wenbin; Zhou, Zhijie; Si, Xiaosheng

    2014-01-01

    Some systems may spend most of their time in storage, but once needed, must be fully functional. Slow degradation occurs while the system is in storage, so to ensure the functionality of these systems, condition monitoring is usually conducted periodically to check the condition of the system. However, taking the condition monitoring data may require putting the system under a real testing situation, which may accelerate the degradation and, therefore, shorten the storage life of the system. This paper presents a case study of condition-based remaining storage life prediction for gyros in an inertial navigation system on the basis of the condition monitoring data and the influence of the data-taking process. A stochastic-filtering-based degradation model is developed to incorporate both into the prediction of the remaining storage life distribution. This makes the predicted remaining storage life depend not only on the condition monitoring data but also on the testing process used to take those data, which existing prognostic techniques and algorithms did not consider. The presented model is fitted to real condition monitoring data from gyro testing using the maximum likelihood estimation method for parameter estimation. Comparisons are made with a model that does not consider the process of taking the condition monitoring data, and the results clearly demonstrate the superiority of the newly proposed model.
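    The effect the authors model, that taking condition monitoring data itself accelerates degradation and shortens storage life, can be illustrated with a toy first-passage simulation. The Wiener drift-plus-jump degradation model and every parameter below are assumptions for illustration only, not the fitted gyro model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical sketch: storage degradation as a Wiener process with drift,
# plus a jump added at each periodic condition-monitoring (CM) test,
# reflecting the extra wear caused by the test itself.
DRIFT, VOL, CM_JUMP, THRESHOLD = 0.02, 0.05, 0.15, 5.0

def storage_life(cm_interval, dt=1.0, horizon=2000):
    """First passage time of degradation to the failure threshold."""
    x, t = 0.0, 0.0
    while x < THRESHOLD and t < horizon:
        x += DRIFT * dt + VOL * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if cm_interval and t % cm_interval == 0:
            x += CM_JUMP                   # extra wear from the CM test
    return t

lives_no_cm = [storage_life(cm_interval=0) for _ in range(500)]
lives_cm = [storage_life(cm_interval=20) for _ in range(500)]
print(f"mean life without CM = {np.mean(lives_no_cm):.0f} units")
print(f"mean life with CM every 20 units = {np.mean(lives_cm):.0f} units")
```

    Comparing the two means illustrates the trade-off the paper formalizes: more frequent testing gives better knowledge of the system state but a shorter remaining storage life.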

  17. Quenches in large superconducting magnets

    International Nuclear Information System (INIS)

    Eberhard, P.H.; Alston-Garnjost, M.; Green, M.A.; Lecomte, P.; Smits, R.G.; Taylor, J.D.; Vuillemin, V.

    1977-08-01

    The development of large high current density superconducting magnets requires an understanding of the quench process by which the magnet goes normal. A theory which describes the quench process in large superconducting magnets is presented and compared with experimental measurements. The use of a quench theory to improve the design of large high current density superconducting magnets is discussed

  18. Kadav Moun PSA (:60) (Human Remains)

    Centers for Disease Control (CDC) Podcasts

    2010-02-18

    This is an important public health announcement about safety precautions for those handling human remains. Language: Haitian Creole.  Created: 2/18/2010 by Centers for Disease Control and Prevention (CDC).   Date Released: 2/18/2010.

  19. Statistical process control charts for attribute data involving very large sample sizes: a review of problems and solutions.

    Science.gov (United States)

    Mohammed, Mohammed A; Panesar, Jagdeep S; Laney, David B; Wilson, Richard

    2013-04-01

    The use of statistical process control (SPC) charts in healthcare is increasing. The primary purpose of SPC is to distinguish between common-cause variation which is attributable to the underlying process, and special-cause variation which is extrinsic to the underlying process. This is important because improvement under common-cause variation requires action on the process, whereas special-cause variation merits an investigation to first find the cause. Nonetheless, when dealing with attribute or count data (eg, number of emergency admissions) involving very large sample sizes, traditional SPC charts often produce tight control limits with most of the data points appearing outside the control limits. This can give a false impression of common and special-cause variation, and potentially misguide the user into taking the wrong actions. Given the growing availability of large datasets from routinely collected databases in healthcare, there is a need to present a review of this problem (which arises because traditional attribute charts only consider within-subgroup variation) and its solutions (which consider within and between-subgroup variation), which involve the use of the well-established measurements chart and the more recently developed attribute charts based on Laney's innovative approach. We close by making some suggestions for practice.
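    Laney's approach inflates the conventional p-chart limits by a factor sigma_z, the between-subgroup variation estimated from moving ranges of the standardized points. A sketch, with counts invented for illustration:

```python
import numpy as np

def laney_p_prime(counts, sizes):
    """Laney p'-chart: classic p-chart limits inflated by sigma_z,
    the between-subgroup variation estimated from moving ranges of
    the z-scores."""
    counts = np.asarray(counts, dtype=float)
    sizes = np.asarray(sizes, dtype=float)
    p = counts / sizes
    p_bar = counts.sum() / sizes.sum()              # overall proportion
    sigma_i = np.sqrt(p_bar * (1 - p_bar) / sizes)  # within-subgroup SD
    z = (p - p_bar) / sigma_i                       # standardized points
    sigma_z = np.abs(np.diff(z)).mean() / 1.128     # MR-bar / d2(n=2)
    ucl = p_bar + 3 * sigma_i * sigma_z
    lcl = np.maximum(p_bar - 3 * sigma_i * sigma_z, 0.0)
    return p, lcl, ucl, sigma_z

# Hypothetical monthly counts with very large denominators: an ordinary
# p-chart would give unrealistically tight limits here.
admissions = [5210, 5345, 5050, 5480, 5120, 5600, 4990, 5300]
attendances = [52000, 53800, 50100, 55200, 50900, 56400, 49800, 53100]

p, lcl, ucl, sigma_z = laney_p_prime(admissions, attendances)
print(f"sigma_z = {sigma_z:.2f}; points outside limits: "
      f"{int(((p < lcl) | (p > ucl)).sum())}")
```

    When sigma_z is close to 1 the chart reduces to an ordinary p-chart; with very large denominators and real between-subgroup variation, sigma_z is typically well above 1, widening the limits and avoiding the false special-cause signals described above.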

  20. Consultancy on Large-Scale Submerged Aerobic Cultivation Process Design - Final Technical Report: February 1, 2016 -- June 30, 2016

    Energy Technology Data Exchange (ETDEWEB)

    Crater, Jason [Gemomatica, Inc., San Diego, CA (United States); Galleher, Connor [Gemomatica, Inc., San Diego, CA (United States); Lievense, Jeff [Gemomatica, Inc., San Diego, CA (United States)

    2017-05-12

    NREL is developing an advanced aerobic bubble column model using Aspen Custom Modeler (ACM). The objective of this work is to integrate the new fermentor model with existing techno-economic models in Aspen Plus and Excel to establish a new methodology for guiding process design. To assist this effort, NREL has contracted Genomatica to critique and make recommendations for improving NREL's bioreactor model and large-scale aerobic bioreactor design for biologically producing lipids at commercial scale. Genomatica has highlighted a few areas for improving the functionality and effectiveness of the model. Genomatica recommends using a compartment model approach with an integrated black-box kinetic model of the production microbe. We also suggest including calculations for stirred tank reactors to extend the model's functionality and adaptability for future process designs. Genomatica also suggests making several modifications to NREL's large-scale lipid production process design. The recommended process modifications are based on Genomatica's internal techno-economic assessment experience and are focused primarily on minimizing capital and operating costs. These recommendations include selecting/engineering a thermotolerant yeast strain with lipid excretion; using bubble column fermentors; increasing the size of production fermentors; reducing the number of vessels; employing semi-continuous operation; and recycling cell mass.

  1. Current Understanding and Remaining Challenges in Modeling Long-Term Degradation of Borosilicate Nuclear Waste Glasses

    International Nuclear Information System (INIS)

    Vienna, John D.; Ryan, Joseph V.; Gin, Stephane; Inagaki, Yaohiro

    2013-01-01

    Chemical durability is not a single material property that can be uniquely measured. Instead, it is the response to a host of coupled material and environmental processes whose rates are estimated by a combination of theory, experiment, and modeling. High-level nuclear waste (HLW) glass is perhaps the most studied of any material, yet significant technical gaps remain regarding its chemical durability. The phenomena affecting the long-term performance of HLW glasses in their disposal environment include surface reactions, transport properties to and from the reacting glass surface, and ion exchange between the solid glass and the surrounding solution and alteration products. The rates of these processes are strongly influenced by, and coupled through, the solution chemistry, which is in turn influenced by the reacting glass and also by reaction with the near-field materials and precipitation of alteration products. Therefore, those processes must be understood sufficiently well to estimate or bound the performance of HLW glass in its disposal environment over geologic time-scales. This article summarizes the current state of understanding of surface reactions, transport properties, and ion exchange, along with the influences of near-field materials and alteration products on solution chemistry and glass reaction rates. Also summarized are the remaining technical gaps along with recommended approaches to fill those gaps.

  2. Lawn Care Pesticides. Risks Remain Uncertain While Prohibited Safety Claims Continue

    Science.gov (United States)

    1990-03-23

    [Fragmented report text:] ... health effects, such as cancer and birth defects, and adverse ecological effects. Currently these pesticides are being applied in large amounts without... reassessing the health risks of using these pesticides as part of the reregistration process may... [Table fragment, column meaning unclear:] Endothall, Herbicide, No; Glyphosate, Herbicide, Yes; Isoxaben, Herbicide; MCPA (2-methyl-4-chlorophenoxyacetic acid), Herbicide, Yes; MCPP (potassium salt), Herbicide.

  3. Detecting Buried Archaeological Remains by the Use of Geophysical Data Processing with 'Diffusion Maps' Methodology

    Science.gov (United States)

    Eppelbaum, Lev

    2015-04-01

    Geophysical methods are prompt, non-invasive and low-cost tools for quantitative delineation of buried archaeological targets. However, taking into account the complexity of geological-archaeological media, some unfavourable environments and the known ambiguity of geophysical data analysis, examination by a single geophysical method might be insufficient (Khesin and Eppelbaum, 1997). Besides this, it is well known that the majority of inverse-problem solutions in geophysics are ill-posed (e.g., Zhdanov, 2002), which means, according to Hadamard (1902), that the solution does not exist, or is not unique, or is not a continuous function of the observed geophysical data (small perturbations in the observations will cause arbitrary mistakes in the solution). This fact has wide application in informational, probabilistic and wavelet methodologies in archaeological geophysics (Eppelbaum, 2014a). The goal of modern geophysical data examination is to detect the geophysical signatures of buried targets in noisy areas via the analysis of some physical parameters with a minimal number of false alarms and missed detections (Eppelbaum et al., 2011; Eppelbaum, 2014b). The proposed wavelet approach to recognition of archaeological targets (AT) by examination of geophysical method integration consists of advanced processing of each geophysical method and nonconventional integration of the different methods with one another. The recently developed technique of diffusion clustering, combined with the abovementioned wavelet methods, was utilized to integrate the geophysical data and detect existing irregularities. The approach is based on wavelet packet techniques applied to the geophysical images (or graphs) versus coordinates. Practically all geophysical methods (magnetic, gravity, seismic, GPR, ERT, self-potential, etc.) may be utilized for such an analysis.
In the first stage of the proposed investigation, a few tens of typical physical-archaeological models (PAM

  4. Individual differences influence two-digit number processing, but not their analog magnitude processing: a large-scale online study.

    Science.gov (United States)

    Huber, Stefan; Nuerk, Hans-Christoph; Reips, Ulf-Dietrich; Soltanlou, Mojtaba

    2017-12-23

    Symbolic magnitude comparison is one of the most well-studied cognitive processes in research on numerical cognition. However, while the cognitive mechanisms of symbolic magnitude processing have been intensively studied, previous studies have paid less attention to individual differences influencing symbolic magnitude comparison. Employing a two-digit number comparison task in an online setting, we replicated previous effects, including the distance effect, the unit-decade compatibility effect, and the effect of cognitive control on the adaptation to filler items, in a large-scale study in 452 adults. Additionally, we observed that the most influential individual differences were participants' first language, time spent playing computer games and gender, followed by reported alcohol consumption, age and mathematical ability. Participants who used a first language with a left-to-right reading/writing direction were faster than those who read and wrote in the right-to-left direction. Reported playing time for computer games was correlated with faster reaction times. Female participants showed slower reaction times and a larger unit-decade compatibility effect than male participants. Participants who reported never consuming alcohol showed overall slower response times than others. Older participants were slower, but more accurate. Finally, higher grades in mathematics were associated with faster reaction times. We conclude that typical experiments on numerical cognition that employ a keyboard as an input device can also be run in an online setting. Moreover, while individual differences have no influence on domain-specific magnitude processing (apart from age, which increases the decade distance effect), they generally influence performance on a two-digit number comparison task.

  5. Small enterprises' importance to the U.S. secondary wood processing industry

    Science.gov (United States)

    Urs Buehlmann; Omar Espinoza; Matthew Bumgardner; Michael. Sperber

    2013-01-01

    The past decades have seen numerous U.S. secondary wood processing companies shift their production to overseas locations, mainly in Southeast Asia. The remaining companies have been hit hard by the downturn in housing markets and the following recession. Thus, many large customers of the U.S. hardwood lumber industry have reduced or stopped the purchase of products,...

  6. A National Perspective: An Analysis of Factors That Influence Special Educators to Remain in the Field of Education

    Science.gov (United States)

    Nickson, Lautrice M.; Kritsonis, William Allan

    2006-01-01

    The purpose of this article is to analyze factors that influence special educators to remain in the field of education. School administrators are perplexed by the large number of teachers who decide to leave the field of education after three years. The retention rates of special educators require school administrators to focus on developing a…

  7. Waterfront redevelopment processes in Aalborg, Denmark

    DEFF Research Database (Denmark)

    Galland, Daniel

    Waterfront redevelopment processes have been an important topic of research in the planning domain during the past decades. This has been particularly evident in the case of projects associated with large metropolitan areas such as London, Toronto, Boston or Barcelona. However, planning research in relation to medium-sized or smaller cities undergoing waterfront redevelopment remains somewhat unexplored. In contributing to fill this gap, this paper explores different processes of urban regeneration comprised within the practice of waterfront redevelopment in Aalborg, Denmark. In doing so, the paper takes on such planning processes through an in-depth analysis of different waterfront redevelopment sites. The paper attempts to elaborate an understanding of planning not only as a means to control development but also as a market-driven practice. In doing so, it provides descriptive...

  8. Reducing process delays for real-time earthquake parameter estimation - An application of KD tree to large databases for Earthquake Early Warning

    Science.gov (United States)

    Yin, Lucy; Andrews, Jennifer; Heaton, Thomas

    2018-05-01

    Earthquake parameter estimations using nearest neighbor searching among a large database of observations can lead to reliable prediction results. However, in the real-time application of Earthquake Early Warning (EEW) systems, accurate prediction using a large database is penalized by a significant delay in the processing time. We propose to use a multidimensional binary search tree (KD tree) data structure to organize large seismic databases to reduce the processing time in nearest neighbor search for predictions. We evaluated the performance of the KD tree on the Gutenberg Algorithm, a database-searching algorithm for EEW. We constructed an offline test to predict peak ground motions using a database with feature sets of waveform filter-bank characteristics, and compared the results with the observed seismic parameters. We concluded that a large database provides more accurate predictions of ground motion information, such as peak ground acceleration, velocity, and displacement (PGA, PGV, PGD), than source parameters, such as hypocenter distance. Application of the KD tree search to organize the database reduced the average search time by 85% relative to the exhaustive method, allowing the method to be feasible for real-time implementation. The algorithm is straightforward and the results will reduce the overall time of warning delivery for EEW.

  9. Process evaluation of treatment times in a large radiotherapy department

    International Nuclear Information System (INIS)

    Beech, R.; Burgess, K.; Stratford, J.

    2016-01-01

    Purpose/objective: The Department of Health (DH) recognises access to appropriate and timely radiotherapy (RT) services as crucial in improving cancer patient outcomes, especially when facing a predicted increase in cancer diagnosis. There is a lack of ‘real-time’ data regarding daily demand of a linear accelerator, the impact of increasingly complex techniques on treatment times, and whether current scheduling reflects time needed for RT delivery, which would be valuable in highlighting current RT provision. Material/methods: A systematic quantitative process evaluation was undertaken in a large regional cancer centre, including a satellite centre, between January and April 2014. Data collected included treatment room-occupancy time, RT site, RT and verification technique and patient mobility status. Data was analysed descriptively; average room-occupancy times were calculated for RT techniques and compared to historical standardised treatment times within the department. Results: Room-occupancy was recorded for over 1300 fractions, over 50% of which overran their allotted treatment time. In a focused sample of 16 common techniques, 10 overran their allocated timeslots. Verification increased room-occupancy by six minutes (50%) over non-imaging. Treatments for patients requiring mobility assistance took four minutes (29%) longer. Conclusion: The majority of treatments overran their standardised timeslots. Although technique advancement has reduced RT delivery time, room-occupancy has not necessarily decreased. Verification increases room-occupancy and needs to be considered when moving towards adaptive techniques. Mobility affects room-occupancy and will become increasingly significant in an ageing population. This evaluation assesses validity of current treatment times in this department, and can be modified and repeated as necessary. - Highlights: • A process evaluation examined room-occupancy for various radiotherapy techniques. • Appointment lengths

  10. Motivation for entry, occupational commitment and intent to remain: a survey regarding Registered Nurse retention.

    Science.gov (United States)

    Gambino, Kathleen M

    2010-11-01

    This paper is a report of a study of the relationships between Registered Nurses' motivation for entering the profession, occupational commitment and intent to remain with an employer until retirement. Identifying and supporting nurses who are strongly committed to their profession may be the single most influential intervention in combating the nursing shortage. An understanding of the characteristics these individuals possess could lead to a decline in the high attrition rates plaguing the profession. Using a survey design, Registered Nurses enrolled at the school of nursing and/or employed at the associated university medical centre of a large, not-for-profit state university were polled in 2008. Logistic regression analysis was used to determine how the variables of motivation for entry and occupational commitment could indicate intent to remain. The strongest indicators of intent to remain were normative commitment and age, with a 70% average rate of correctly estimating retention. Exp(B) values for normative commitment (1·09) and age (1·07) indicated that for each one-point increase on the normative commitment scale or one-point increase in age, the odds of remaining with an employer until retirement increased by 1·1%. Transformational changes in healthcare environments and nursing schools must be made to encourage loyalty and obligation, the hallmarks of normative commitment. Retention strategies should accommodate mature nurses as well as promote normative commitment in younger nurses. © 2010 Blackwell Publishing Ltd.

  11. The effects of large scale processing on caesium leaching from cemented simulant sodium nitrate waste

    International Nuclear Information System (INIS)

    Lee, D.J.; Brown, D.J.

    1982-01-01

    The effects of large scale processing on the properties of cemented simulant sodium nitrate waste have been investigated. Leach tests have been performed on full-size drums, cores and laboratory samples of cement formulations containing Ordinary Portland Cement (OPC), Sulphate Resisting Portland Cement (SRPC) and a blended cement (90% ground granulated blast furnace slag/10% OPC). In addition, development of the cement hydration exotherms with time and the temperature distribution in 220 dm³ samples have been followed. (author)

  12. Process Simulation and Characterization of Substrate Engineered Silicon Thin Film Transistor for Display Sensors and Large Area Electronics

    International Nuclear Information System (INIS)

    Hashmi, S M; Ahmed, S

    2013-01-01

    Design, simulation, fabrication and post-process qualification of substrate-engineered Thin Film Transistors (TFTs) are carried out to suggest an alternate manufacturing process step focused on display sensors and large area electronics applications. Damage created by ion implantation of Helium and Silicon ions into single-crystalline n-type silicon substrate provides an alternate route to create an amorphized region responsible for the fabrication of TFT structures with controllable and application-specific output parameters. The post-process qualification of starting material and full-cycle devices using Rutherford Backscattering Spectrometry (RBS) and Proton or Particle induced X-ray Emission (PIXE) techniques also provide an insight to optimize the process protocols as well as their applicability in the manufacturing cycle

  13. The Annuity Puzzle Remains a Puzzle

    NARCIS (Netherlands)

    Peijnenburg, J.M.J.; Werker, Bas; Nijman, Theo

    We examine incomplete annuity menus and background risk as possible drivers of divergence from full annuitization. Contrary to what is often suggested in the literature, we find that full annuitization remains optimal if saving is possible after retirement. This holds irrespective of whether real or

  14. Review of the Remaining Useful Life Prognostics of Vehicle Lithium-Ion Batteries Using Data-Driven Methodologies

    Directory of Open Access Journals (Sweden)

    Lifeng Wu

    2016-05-01

    Lithium-ion batteries are the primary power source in electric vehicles, and the prognosis of their remaining useful life is vital for ensuring the safety, stability, and long lifetime of electric vehicles. Accurately establishing a mechanism model of a vehicle lithium-ion battery involves a complex electrochemical process. Remaining useful life (RUL) prognostics based on data-driven methods has become a focus of research. Current research on data-driven methodologies is summarized in this paper. By analyzing the problems of vehicle lithium-ion batteries in practical applications, the problems that need to be solved in the future are identified.

  15. Nuclear Material Processing at the Savannah River Site

    International Nuclear Information System (INIS)

    Severynse, T.F.

    1998-07-01

    Plutonium production for national defense began at Savannah River in the mid-1950s, following construction of production reactors and separations facilities. Following the successful completion of its production mission, the site's nuclear material processing facilities continue to operate to perform stabilization of excess materials and potentially support the disposition of these materials. A number of restoration and productivity improvement projects implemented in the 1980s, totaling nearly a billion dollars, have resulted in these facilities representing the most modern and only remaining operating large-scale processing facilities in the DOE Complex. Together with the Site's extensive nuclear infrastructure, and integrated waste management system, SRS is the only DOE site with the capability and mission of ongoing processing operations

  16. Large eddy simulation of turbulent and stably-stratified flows

    International Nuclear Information System (INIS)

    Fallon, Benoit

    1994-01-01

    The unsteady turbulent flow over a backward-facing step is studied by means of Large Eddy Simulation with a structure-function subgrid model, in both isothermal and stably-stratified configurations. Without stratification, the flow develops highly-distorted Kelvin-Helmholtz billows that undergo helical pairing, with A-shaped vortices shed downstream. We show that forcing injected by recirculation fluctuations governs the development of these oblique-mode instabilities. The statistical results show good agreement with the experimental measurements. For stably-stratified configurations, the flow remains more two-dimensional. We show how, with increasing stratification, shear-layer growth is frozen by inhibition of the pairing process and then of the Kelvin-Helmholtz instabilities, and by the development of gravity waves or stable density interfaces. Eddy structures of the flow present striking analogies with the stratified mixing layer. Additional computations show the development of secondary Kelvin-Helmholtz instabilities on the vorticity layers between two primary structures. This important mechanism, based on baroclinic effects (horizontal density gradients), constitutes an additional part of the turbulent mixing process. Finally, the feasibility of Large Eddy Simulation is demonstrated for industrial flows by studying a complex stratified cavity. Temperature fluctuations are compared to experimental measurements. We also develop three-dimensional unsteady animations in order to understand and visualize turbulent interactions. (author) [fr

  17. Survey of high-voltage pulse technology suitable for large-scale plasma source ion implantation processes

    International Nuclear Information System (INIS)

    Reass, W.A.

    1994-01-01

    Many new plasma process ideas are finding their way from the research lab to the manufacturing plant floor. These require high voltage (HV) pulse power equipment, which must be optimized for application, system efficiency, and reliability. Although no single HV pulse technology is suitable for all plasma processes, various classes of high voltage pulsers may offer greater versatility and economy to the manufacturer. Technology developed for existing radar and particle accelerator modulator power systems can be utilized to develop a modern large scale plasma source ion implantation (PSII) system. The HV pulse networks can be broadly defined by two classes of systems: those that generate the voltage directly, and those that use some type of pulse forming network and step-up transformer. This article will examine these HV pulse technologies and discuss their applicability to the specific PSII process. Typical systems that will be reviewed will include high power solid state, hard tube systems such as crossed-field ''hollow beam'' switch tubes and planar tetrodes, and ''soft'' tube systems with crossatrons and thyratrons. Results will be tabulated and suggestions provided for a particular PSII process

  18. A progress report for the large block test of the coupled thermal-mechanical-hydrological-chemical processes

    International Nuclear Information System (INIS)

    Lin, W.; Wilder, D.G.; Blink, J.

    1994-10-01

    This is a progress report on the Large Block Test (LBT) project. The purpose of the LBT is to study some of the coupled thermal-mechanical-hydrological-chemical (TMHC) processes in the near field of a nuclear waste repository under controlled boundary conditions. To do so, a large block of Topopah Spring tuff will be heated from within for about 4 to 6 months, then cooled down for about the same duration. Instruments to measure temperature, moisture content, stress, displacement, and chemical changes will be installed in three directions in the block. Meanwhile, laboratory tests will be conducted on small blocks to investigate individual thermal-mechanical, thermal-hydrological, and thermal-chemical processes. The fractures in the large block will be characterized from five exposed surfaces. The minerals on fracture surfaces will be studied before and after the test. The results from the LBT will be useful for testing and building confidence in models that will be used to predict TMHC processes in a repository. The boundary conditions to be controlled on the block include zero moisture flux and zero heat flux on the sides, constant temperature on the top, and constant stress on the outside surfaces of the block. To control these boundary conditions, a load-retaining frame is required. A 3 × 3 × 4.5 m block of Topopah Spring tuff has been isolated on the outcrop at Fran Ridge, Nevada Test Site. Pre-test model calculations indicate that a permeability of at least 10⁻¹⁵ m² is required so that a dryout zone can be created within a practical time frame when the block is heated from within. Neutron logging was conducted in some of the vertical holes to estimate the initial moisture content of the block. It was found that about 60 to 80% of the pore volume of the block is saturated with water. Cores from the vertical holes have been used to map the fractures and to determine the properties of the rock. A current schedule is included in the report

  19. Massive Cloud Computing Processing of P-SBAS Time Series for Displacement Analyses at Large Spatial Scale

    Science.gov (United States)

    Casu, F.; de Luca, C.; Lanari, R.; Manunta, M.; Zinno, I.

    2016-12-01

    A methodology for computing surface deformation time series and mean velocity maps of large areas is presented. Our approach relies on the availability of a multi-temporal set of Synthetic Aperture Radar (SAR) data collected from ascending and descending orbits over an area of interest, and also permits estimation of the vertical and horizontal (East-West) displacement components of the Earth's surface. The adopted methodology is based on an advanced Cloud Computing implementation of the Differential SAR Interferometry (DInSAR) Parallel Small Baseline Subset (P-SBAS) processing chain, which allows the unsupervised processing of large SAR data volumes, from the raw data (level-0) imagery up to the generation of DInSAR time series and maps. The presented solution, which is highly scalable, has been tested on the ascending and descending ENVISAT SAR archives, which have been acquired over a large area of Southern California (US) that extends for about 90,000 km². This input dataset has been processed in parallel by exploiting 280 computing nodes of the Amazon Web Services Cloud environment. Moreover, to produce the final mean deformation velocity maps of the vertical and East-West displacement components of the whole investigated area, we also took advantage of the information available from external GPS measurements, which permit accounting for possible regional trends not easily detectable by DInSAR and referring the P-SBAS measurements to an external geodetic datum. The presented results clearly demonstrate the effectiveness of the proposed approach, which paves the way to the extensive use of the available ERS and ENVISAT SAR data archives. Furthermore, the proposed methodology can be particularly suitable for dealing with the very large data flow provided by the Sentinel-1 constellation, thus permitting extension of the DInSAR analyses to a nearly global scale. This work is partially supported by: the DPC-CNR agreement, the EPOS-IP project and the ESA GEP project.

  20. Evaluation of remaining behavior of halogen on the fabrication of MOX pellet containing Am

    International Nuclear Information System (INIS)

    Ozaki, Yoko; Osaka, Masahiko; Obayashi, Hiroshi; Tanaka, Kenya

    2004-11-01

    It is important to limit the content of halogen elements, namely fluorine and chlorine, which cause corrosion of cladding material, in nuclear fuel from the viewpoint of quality assurance. The halogen content should be more carefully limited in MOX fuel containing americium (Am-MOX), which is fabricated in the Alpha-Gamma Facility (AGF) for irradiation testing to be conducted in the experimental fast reactor JOYO, because fluorine may remain in the sintered pellets owing to the formation of AmF₃, known to have a low vapor pressure, and may exceed the limit of 25 ppm. In this study, a series of experimental determinations of halogen elements in Am-MOX was performed by a combined method of pyrolysis and ion chromatography, for the purpose of evaluating the behavior of halogen remaining through the sintering process. Oxygen potential, temperature and time were changed as experimental parameters and their effects on the remaining behavior of halogen were examined. It was confirmed that good pellets, containing only a small amount of halogen, could be obtained by sintering for 3 hours at 1700 °C in the oxygen potential range from -520 to -390 kJ/mol. To analyze the chemical form of fluorine in the green pellet, thermal analysis was performed. AmF₃ and PuF₃ were confirmed to remain in the green pellet. (author)

  1. Uranium processing developments

    International Nuclear Information System (INIS)

    Jones, J.Q.

    1977-01-01

    The basic methods for processing ore to recover the contained uranium have not changed significantly since the 1954-62 period. Improvements in mill operations have been the result of better or less expensive reagents, changes in equipment, and the successful resolution of many environmental matters. There is also an apparent trend toward large mills that can profitably process lower grade ores. The major thrust in the near future will not be on process technology but on the remaining environmental constraints associated with milling. At this time the main ''spotlight'' is on tailings dam and impoundment area construction and reclamation. Plans must provide for an adequate safety factor for stability, no surface or groundwater contamination, and minimal discharge of radionuclides to unrestricted areas, as may be required by law. Solution mining methods must also provide for plans to restore the groundwater back to its original condition as defined by local groundwater regulations. Basic flowsheets (each to finished product) plus modified versions of the basic types are shown

  2. Statistical processing of large image sequences.

    Science.gov (United States)

    Khellah, F; Fieguth, P; Murray, M J; Allen, M

    2005-01-01

    The dynamic estimation of large-scale stochastic image sequences, as frequently encountered in remote sensing, is important in a variety of scientific applications. However, the size of such images makes conventional dynamic estimation methods, for example, the Kalman and related filters, impractical. In this paper, we present an approach that emulates the Kalman filter, but with considerably reduced computational and storage requirements. Our approach is illustrated in the context of a 512 x 512 image sequence of ocean surface temperature. The static estimation step, the primary contribution here, uses a mixture of stationary models to accurately mimic the effect of a nonstationary prior, simplifying both computational complexity and modeling. Our approach provides an efficient, stable, positive-definite model which is consistent with the given correlation structure. Thus, the methods of this paper may find application in modeling and single-frame estimation.

  3. Automation of Survey Data Processing, Documentation and Dissemination: An Application to Large-Scale Self-Reported Educational Survey.

    Science.gov (United States)

    Shim, Eunjae; Shim, Minsuk K.; Felner, Robert D.

    Automation of the survey process has proved successful in many industries, yet it is still underused in educational research. This is largely due to the facts that (1) number crunching is usually carried out using software that was developed before information technology existed, and (2) educational research is to a great extent trapped…

  4. Juveniles' Motivations for Remaining in Prostitution

    Science.gov (United States)

    Hwang, Shu-Ling; Bedford, Olwen

    2004-01-01

    Qualitative data from in-depth interviews were collected in 1990-1991, 1992, and 2000 with 49 prostituted juveniles remanded to two rehabilitation centers in Taiwan. These data are analyzed to explore Taiwanese prostituted juveniles' feelings about themselves and their work, their motivations for remaining in prostitution, and their difficulties…

  5. Ñuagapua (Chaco, Bolivia): Evidence for the latest occurrence of megafauna in association with human remains in South America

    Science.gov (United States)

    Coltorti, Mauro; Della Fazia, Jacopo; Paredes Rios, Freddy; Tito, Giuseppe

    2012-02-01

    Quebrada (stream) Ñuagapua, located in the Bolivian Chaco in the Andean foothills, generates an alluvial fan many kilometres in length. Three major lithostratigraphic units characterise the sedimentary sequence in this region. The lower and upper parts are formed from predominantly sandy sediments that demonstrate rapid growth of the alluvial fan, associated with intense erosion of barren slopes. The intermediate unit consists of forest soil that seals deep channels containing bones together with a forest association. The remains of a wooden plank, dated 140 yr BP, were found at the top of this soil, which laterally contains charcoals, ash layers and large charred trunks, sometimes in growth positions. Roots localised in this layer also sustain a number of very large, still-living trees. These findings are evidence of a recent phase of alluvial fan sedimentation resulting from slope erosion activated by forest clearing. The Chaco has been intensively settled for agricultural and pastoral purposes since the 18th century. The lower unit contains a hearth, scattered burnt bones, flint flakes and ceramic artefacts. Radiometric dating indicates a middle Holocene human occupation, between ca. 7.79 and 6.65 ka cal yr BP. We suggest that the sedimentary unit is associated with intense soil erosion processes triggered by early Neolithic deforestation. A sandy layer of the lower unit, slightly above the archaeological remains, contains transported bones of megafaunal elements that apparently represent the South American latest occurrence of some extinct taxa. The mammal association is highly heterogeneous, containing species living in aquatic, forest, prairie and savannah environments from a very specific layer that represents the almost simultaneous burial of animals killed slightly up-valley. This anomalous association is probably the result of human impact, as opening the forest favoured the introduction of open environment fauna that had previously survived on

  6. [Large vessel vasculitides].

    Science.gov (United States)

    Morović-Vergles, Jadranka; Puksić, Silva; Gracanin, Ana Gudelj

    2013-01-01

    Large vessel vasculitis includes Giant cell arteritis and Takayasu arteritis. Giant cell arteritis is the most common form of vasculitis affecting patients aged 50 years or over. The diagnosis should be considered in older patients who present with new onset of headache, visual disturbance, polymyalgia rheumatica and/or fever of unknown cause. Glucocorticoids remain the cornerstone of therapy. Takayasu arteritis is a chronic panarteritis of the aorta and its major branches, presenting commonly at young ages. Although all large arteries can be affected, the aorta, subclavian and carotid arteries are most commonly involved. The most common symptoms include upper extremity claudication, hypertension, pain over the carotid arteries (carotidynia), dizziness and visual disturbances. Early diagnosis and treatment have improved the outcome in patients with TA.

  7. Geomorphic and habitat response to a large-dam removal in a Mediterranean river

    Science.gov (United States)

    Harrison, L.; East, A. E.; Smith, D. P.; Bond, R.; Logan, J. B.; Nicol, C.; Williams, T.; Boughton, D. A.; Chow, K.

    2017-12-01

    The presence of large dams has fundamentally altered physical and biological processes in riverine ecosystems, and dam removal is becoming more common as a river restoration strategy. We used a before-after-control-impact study design to investigate the geomorphic and habitat response to removal of 32-m-high San Clemente Dam on the Carmel River, CA. The project represents the first major dam removal in a Mediterranean river and is also unique among large dam removals in that most reservoir sediment was sequestered in place. We found that in the first year post-removal, a sediment pulse migrated 3.5 km downstream, filling pools and the interstitial pore spaces of gravels with sand. These sedimentary and topographic changes initially reduced the overall quality of steelhead (O. mykiss) spawning and rearing habitat in impacted reaches. Over the second winter after dam removal, a sequence of high flows flushed large volumes of sand from pools and mobilized the river bed throughout much of the active channel. The floods substantially altered fluvial evolution in the upper part of the reservoir, promoting new avulsion and the subsequent delivery of gravel and large wood to below dam reaches. These geomorphic processes increased the availability of spawning-sized gravel and enhanced channel complexity in reaches within several km of the former dam, which should improve habitat for multiple life stages of steelhead. Results indicate that when most reservoir sediment remains impounded, high flows become more important drivers of geomorphic and habitat change than dam removal alone. In such cases, the rates at which biophysical processes are reestablished will depend largely on post-dam removal flow sequencing and the upstream supply of sediment and large wood.

  8. Precise large deviations of aggregate claims in a size-dependent renewal risk model with stopping time claim-number process

    Directory of Open Access Journals (Sweden)

    Shuo Zhang

    2017-04-01

    In this paper, we consider a size-dependent renewal risk model with stopping time claim-number process. In this model, we do not make any assumption on the dependence structure of claim sizes and inter-arrival times. We study large deviations of the aggregate amount of claims. For the subexponential heavy-tailed case, we obtain a precise large-deviation formula; our method substantially relies on a martingale for the structure of our models.
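
    For orientation, the classical precise large-deviation asymptotic that results of this type refine (stated here in its textbook form for heavy-tailed claims with consistently varying tails, not necessarily the exact formula obtained in this paper) reads:

```latex
% S(t): aggregate claims up to time t; mu = E[claim size];
% lambda(t): expected number of claims by time t; \overline{F}: claim-size tail.
% Uniformly for x >= gamma * lambda(t), for any fixed gamma > 0:
\[
  \Pr\bigl( S(t) - \mu \lambda(t) > x \bigr) \;\sim\; \lambda(t)\, \overline{F}(x),
  \qquad t \to \infty .
\]
```

    Intuitively, a single extreme claim dominates the large deviation, so the tail of the aggregate behaves like the expected number of claims times the tail of one claim.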

  9. 320-nm Flexible Solution-Processed 2,7-dioctyl[1] benzothieno[3,2-b]benzothiophene Transistors

    OpenAIRE

    Ren, Hang; Tang, Qingxin; Tong, Yanhong; Liu, Yichun

    2017-01-01

    Flexible organic thin-film transistors (OTFTs) have received extensive attention due to their outstanding advantages such as light weight, low cost, flexibility, large-area fabrication, and compatibility with solution-processed techniques. However, compared with a rigid substrate, it still remains a challenge to obtain good device performance by directly depositing solution-processed organic semiconductors onto an ultrathin plastic substrate. In this work, ultrathin flexible OTFTs are success...

  10. A hotspot of large branchiopod diversity in south-eastern Zimbabwe ...

    African Journals Online (AJOL)

    Large branchiopods are considered threatened across much of their global range. However, because several regions, including Zimbabwe in general and its south-eastern lowveld in particular, remain largely unstudied, interpretations of species distribution patterns are often based on limited data. A detailed study of large ...

  11. Reducing Plug and Process Loads for a Large Scale, Low Energy Office Building: NREL's Research Support Facility; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Lobato, C.; Pless, S.; Sheppy, M.; Torcellini, P.

    2011-02-01

    This paper documents the design and operational plug and process load energy efficiency measures needed to allow a large scale office building to reach ultra high efficiency building goals. The appendices of this document contain a wealth of documentation pertaining to plug and process load design in the RSF, including a list of the equipment selected for use.

  12. Large Spatial Scale Ground Displacement Mapping through the P-SBAS Processing of Sentinel-1 Data on a Cloud Computing Environment

    Science.gov (United States)

    Casu, F.; Bonano, M.; de Luca, C.; Lanari, R.; Manunta, M.; Manzo, M.; Zinno, I.

    2017-12-01

    Since its launch in 2014, the Sentinel-1 (S1) constellation has played a key role in SAR data availability and dissemination all over the world. Indeed, the free and open access data policy adopted by the European Copernicus program, together with the global coverage acquisition strategy, makes the Sentinel constellation a game changer in the Earth Observation scenario. As SAR data become ubiquitous, the technological and scientific challenge is to maximize the exploitation of such a huge data flow. In this direction, the use of innovative processing algorithms and distributed computing infrastructures, such as Cloud Computing platforms, can play a crucial role. In this work we present a Cloud Computing solution for the advanced interferometric (DInSAR) processing chain based on the Parallel SBAS (P-SBAS) approach, aimed at processing S1 Interferometric Wide Swath (IWS) data for the generation of large spatial scale deformation time series in an efficient, automatic and systematic way. Such a DInSAR chain ingests Sentinel-1 SLC images and carries out several processing steps to finally compute deformation time series and mean deformation velocity maps. Different parallel strategies have been designed ad hoc for each processing step of the P-SBAS S1 chain, encompassing both multi-core and multi-node programming techniques, in order to maximize the computational efficiency achieved within a Cloud Computing environment and cut down the relevant processing times. The presented P-SBAS S1 processing chain has been implemented on the Amazon Web Services platform and a thorough analysis of the attained parallel performances has been performed to identify and overcome the major bottlenecks to scalability. The presented approach is used to perform national-scale DInSAR analyses over Italy, involving the processing of more than 3000 S1 IWS images acquired from both ascending and descending orbits. Such an experiment confirms the big advantage of
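
The per-step parallelism described above can be sketched generically. This is not the actual P-SBAS implementation; it is a minimal worker-pool pattern in which each acquisition-date pair (a hypothetical stand-in for one interferogram computation) is handed to a worker, with invented mock epochs.

```python
from concurrent.futures import ThreadPoolExecutor

def process_pair(pair):
    """Hypothetical stand-in for one interferogram computation:
    records the two epochs and their temporal baseline."""
    a, b = pair
    return (a, b, b - a)

epochs = list(range(0, 60, 12))  # mock acquisition dates (days)
pairs = [(a, b) for a in epochs for b in epochs if b > a]
# In a real chain, this pool would be replaced by multi-core / multi-node
# distribution across the Cloud Computing infrastructure.
with ThreadPoolExecutor(max_workers=4) as pool:
    interferograms = list(pool.map(process_pair, pairs))
print(len(interferograms))  # one result per date pair
```

The independence of the per-pair computations is what makes this step embarrassingly parallel; later time-series inversion steps need different partitioning strategies.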

  13. Inverse problem to constrain the controlling parameters of large-scale heat transport processes: The Tiberias Basin example

    Science.gov (United States)

    Goretzki, Nora; Inbar, Nimrod; Siebert, Christian; Möller, Peter; Rosenthal, Eliyahu; Schneider, Michael; Magri, Fabien

    2015-04-01

    Salty and thermal springs exist along the lakeshore of the Sea of Galilee, which covers most of the Tiberias Basin (TB) in the northern Jordan-Dead Sea Transform, Israel/Jordan. As it is the only freshwater reservoir of the entire area, it is important to study the salinisation processes that pollute the lake. Simulations of thermohaline flow along a 35 km NW-SE profile show that meteoric and relic brines are flushed by the regional flow from the surrounding heights and thermally induced groundwater flow within the faults (Magri et al., 2015). Several model runs with trial and error were necessary to calibrate the hydraulic conductivity of both faults and major aquifers in order to fit temperature logs and spring salinity. It turned out that the hydraulic conductivity of the faults ranges between 30 and 140 m/yr whereas the hydraulic conductivity of the Upper Cenomanian aquifer is as high as 200 m/yr. However, large-scale transport processes are also dependent on other physical parameters such as thermal conductivity, porosity and fluid thermal expansion coefficient, which are hardly known. Here, inverse problems (IP) are solved along the NW-SE profile to better constrain the physical parameters (a) hydraulic conductivity, (b) thermal conductivity and (c) thermal expansion coefficient. The PEST code (Doherty, 2010) is applied via the graphical interface FePEST in FEFLOW (Diersch, 2014). The results show that both thermal and hydraulic conductivity are consistent with the values determined with the trial-and-error calibrations. Besides being an automatic approach that speeds up the calibration process, the IP allows us to cover a wide range of parameter values, providing additional solutions not found with the trial-and-error method. Our study shows that geothermal systems like TB are more comprehensively understood when inverse models are applied to constrain coupled fluid flow processes over large spatial scales. References Diersch, H.-J.G., 2014. FEFLOW Finite

  14. Loss aversion, large deviation preferences and optimal portfolio weights for some classes of return processes

    Science.gov (United States)

    Duffy, Ken; Lobunets, Olena; Suhov, Yuri

    2007-05-01

    We propose a model of a loss-averse investor who aims to maximize his expected wealth under certain constraints. The constraints are that he avoids, with high probability, incurring a (suitably defined) unacceptable loss. The methodology employed comes from the theory of large deviations. We explore a number of fundamental properties of the model and illustrate its desirable features. We demonstrate its utility by analyzing assets that follow some commonly used financial return processes: Fractional Brownian Motion, Jump Diffusion, Variance Gamma and Truncated Lévy.
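
The loss-avoidance constraint can be illustrated with a toy Monte Carlo check. The paper uses large-deviation theory and the richer return processes listed above, whereas this sketch assumes a single Gaussian-return asset with made-up parameters; it only shows that shrinking the risky exposure shrinks the probability of an unacceptable loss.

```python
import random
random.seed(42)

def prob_unacceptable_loss(mu, sigma, weight, threshold, n=100_000):
    """Estimate P(portfolio return < -threshold) for one risky asset held
    at 'weight', by Monte Carlo (an illustrative stand-in for the
    large-deviation bound used in the paper)."""
    hits = 0
    for _ in range(n):
        r = weight * random.gauss(mu, sigma)
        if r < -threshold:
            hits += 1
    return hits / n

p_full = prob_unacceptable_loss(0.05, 0.2, 1.0, 0.3)
p_half = prob_unacceptable_loss(0.05, 0.2, 0.5, 0.3)
print(p_half < p_full)  # smaller exposure, smaller tail probability
```

The investor's optimization then amounts to choosing the largest weight whose tail probability stays below the tolerated level.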

  15. DB-XES : enabling process discovery in the large

    NARCIS (Netherlands)

    Syamsiyah, A.; van Dongen, B.F.; van der Aalst, W.M.P.; Ceravolo, P.; Guetl, C.; Rinderle-Ma, S.

    2018-01-01

    Dealing with the abundance of event data is one of the main process discovery challenges. Current process discovery techniques are able to efficiently handle imported event log files that fit in the computer’s memory. Once data files get bigger, scalability quickly drops since the speed required to

  16. Environment and host as large-scale controls of ectomycorrhizal fungi.

    Science.gov (United States)

    van der Linde, Sietse; Suz, Laura M; Orme, C David L; Cox, Filipa; Andreae, Henning; Asi, Endla; Atkinson, Bonnie; Benham, Sue; Carroll, Christopher; Cools, Nathalie; De Vos, Bruno; Dietrich, Hans-Peter; Eichhorn, Johannes; Gehrmann, Joachim; Grebenc, Tine; Gweon, Hyun S; Hansen, Karin; Jacob, Frank; Kristöfel, Ferdinand; Lech, Paweł; Manninger, Miklós; Martin, Jan; Meesenburg, Henning; Merilä, Päivi; Nicolas, Manuel; Pavlenda, Pavel; Rautio, Pasi; Schaub, Marcus; Schröck, Hans-Werner; Seidling, Walter; Šrámek, Vít; Thimonier, Anne; Thomsen, Iben Margrete; Titeux, Hugues; Vanguelova, Elena; Verstraeten, Arne; Vesterdal, Lars; Waldner, Peter; Wijk, Sture; Zhang, Yuxin; Žlindra, Daniel; Bidartondo, Martin I

    2018-06-06

    Explaining the large-scale diversity of soil organisms that drive biogeochemical processes, and their responses to environmental change, is critical. However, identifying consistent drivers of belowground diversity and abundance for some soil organisms at large spatial scales remains problematic. Here we investigate a major guild, the ectomycorrhizal fungi, across European forests at a spatial scale and resolution that is, to our knowledge, unprecedented, to explore key biotic and abiotic predictors of ectomycorrhizal diversity and to identify dominant responses and thresholds for change across complex environmental gradients. We show the effect of 38 host, environment, climate and geographical variables on ectomycorrhizal diversity, and define thresholds of community change for key variables. We quantify host specificity and reveal plasticity in functional traits involved in soil foraging across gradients. We conclude that environmental and host factors explain most of the variation in ectomycorrhizal diversity, that the environmental thresholds used as major ecosystem assessment tools need adjustment and that the importance of belowground specificity and plasticity has previously been underappreciated.

  17. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

    NARCIS (Netherlands)

    Cheong, S.A.; Tan, T.L.; Chen, C.-C.; Chang, W.-L.; Liu, Z.; Chew, L.Y.; Sloot, P.M.A.; Johnson, N.F.

    2014-01-01

    Predicting how large an earthquake can be, where and when it will strike remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting

  18. VisualRank: applying PageRank to large-scale image search.

    Science.gov (United States)

    Jing, Yushi; Baluja, Shumeet

    2008-11-01

    Because of the relative ease in understanding and processing text, commercial image-search systems often rely on techniques that are largely indistinguishable from text-search. Recently, academic studies have demonstrated the effectiveness of employing image-based features to provide alternative or additional signals. However, it remains uncertain whether such techniques will generalize to a large number of popular web queries, and whether the potential improvement to search quality warrants the additional computational cost. In this work, we cast the image-ranking problem into the task of identifying "authority" nodes on an inferred visual similarity graph and propose VisualRank to analyze the visual link structures among images. The images found to be "authorities" are chosen as those that answer the image-queries well. To understand the performance of such an approach in a real system, we conducted a series of large-scale experiments based on the task of retrieving images for 2000 of the most popular products queries. Our experimental results show significant improvement, in terms of user satisfaction and relevancy, in comparison to the most recent Google Image Search results. Maintaining modest computational cost is vital to ensuring that this procedure can be used in practice; we describe the techniques required to make this system practical for large scale deployment in commercial search engines.
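
The authority-identification step of VisualRank is, at its core, PageRank run on a visual similarity graph. A minimal power-iteration sketch, with an invented 3-image similarity matrix standing in for the inferred visual link structure:

```python
import numpy as np

def visual_rank(S, d=0.85, iters=100):
    """Power-iteration PageRank over a similarity matrix S: each image
    distributes its 'vote' to neighbors in proportion to similarity."""
    n = S.shape[0]
    col = S.sum(axis=0)
    M = S / np.where(col == 0, 1, col)   # column-normalize
    r = np.full(n, 1.0 / n)              # uniform start
    for _ in range(iters):
        r = (1 - d) / n + d * M @ r      # damped PageRank update
    return r

# Toy graph: image 0 is strongly similar to both of the others.
S = np.array([[0.0, 0.9, 0.8],
              [0.9, 0.0, 0.1],
              [0.8, 0.1, 0.0]])
scores = visual_rank(S)
print(scores.argmax())  # image 0 emerges as the "authority"
```

The image with the highest score is the one most consistently linked by strong similarities, which is the intuition behind choosing "authority" images as query answers.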

  19. The large-scale process of microbial carbonate precipitation for nickel remediation from an industrial soil.

    Science.gov (United States)

    Zhu, Xuejiao; Li, Weila; Zhan, Lu; Huang, Minsheng; Zhang, Qiuzhuo; Achal, Varenyam

    2016-12-01

    Microbial carbonate precipitation is known as an efficient process for the remediation of heavy metals from contaminated soils. In the present study, a urease positive bacterial isolate, identified as Bacillus cereus NS4 through 16S rDNA sequencing, was utilized on a large scale to remove nickel from industrial soil contaminated by the battery industry. The soil was highly contaminated, with an initial total nickel concentration of approximately 900 mg kg⁻¹. The soluble-exchangeable fraction was reduced to 38 mg kg⁻¹ after treatment. The primary objective of metal stabilization was achieved by reducing the bioavailability through immobilizing the nickel in the urease-driven carbonate precipitation. The nickel removal in the soils contributed to the transformation of nickel from mobile species into stable biominerals identified as calcite, vaterite, aragonite and nickelous carbonate when analyzed under XRD. It was proven that during precipitation of calcite, Ni²⁺, with an ion radius close to that of Ca²⁺, was incorporated into the CaCO₃ crystal. The biominerals were also characterized by using SEM-EDS to observe the crystal shape and Raman-FTIR spectroscopy to predict the responsible bonding during bioremediation with respect to Ni immobilization. The electronic structure and chemical-state information of the detected elements during the MICP bioremediation process was studied by XPS. This is the first study in which microbial carbonate precipitation was used for the large-scale remediation of metal-contaminated industrial soil. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Leveraging High Performance Computing for Managing Large and Evolving Data Collections

    Directory of Open Access Journals (Sweden)

    Ritu Arora

    2014-10-01

    Full Text Available The process of developing a digital collection in the context of a research project often involves a pipeline pattern during which data growth, data types, and data authenticity need to be assessed iteratively in relation to the different research steps and in the interest of archiving. Throughout a project’s lifecycle curators organize newly generated data while cleaning and integrating legacy data when it exists, and deciding what data will be preserved for the long term. Although these actions should be part of a well-oiled data management workflow, there are practical challenges in doing so if the collection is very large and heterogeneous, or is accessed by several researchers contemporaneously. There is a need for data management solutions that can help curators with efficient and on-demand analyses of their collection so that they remain well-informed about its evolving characteristics. In this paper, we describe our efforts towards developing a workflow to leverage open science High Performance Computing (HPC resources for routinely and efficiently conducting data management tasks on large collections. We demonstrate that HPC resources and techniques can significantly reduce the time for accomplishing critical data management tasks, and enable a dynamic archiving throughout the research process. We use a large archaeological data collection with a long and complex formation history as our test case. We share our experiences in adopting open science HPC resources for large-scale data management, which entails understanding usage of the open source HPC environment and training users. These experiences can be generalized to meet the needs of other data curators working with large collections.

  1. Breeding and Genetics Symposium: really big data: processing and analysis of very large data sets.

    Science.gov (United States)

    Cole, J B; Newman, S; Foertter, F; Aguilar, I; Coffey, M

    2012-03-01

    15 m. Large data sets also create challenges for the delivery of genetic evaluations that must be overcome in a way that does not disrupt the transition from conventional to genomic evaluations. Processing time is important, especially as real-time systems for on-farm decisions are developed. The ultimate value of these systems is to decrease time-to-results in research, increase accuracy in genomic evaluations, and accelerate rates of genetic improvement.

  2. Recycling process of Mn-Al doped large grain UO2 pellets

    International Nuclear Information System (INIS)

    Nam, Ik Hui; Yang, Jae Ho; Rhee, Young Woo; Kim, Dong Joo; Kim, Jong Hun; Kim, Keon Sik; Song, Kun Woo

    2010-01-01

    To reduce fuel cycle costs and the total mass of spent light water reactor (LWR) fuels, it is necessary to extend the fuel discharge burn-up. Research on fuel pellets focuses on increasing the pellet density and grain size to increase the uranium content and the high-burnup safety margins for LWRs. KAERI is developing a large grain UO₂ pellet for the same purpose. Doping with small amounts of additives is used to increase the grain size and the high temperature deformation of UO₂ pellets. Various promising additive candidates have been developed during the last 3 years, and the MnO-Al₂O₃ doped UO₂ fuel pellet is one of the most promising. In a commercial UO₂ fuel pellet manufacturing process, defective UO₂ pellets or scraps are produced and should be reused. A common recycling method for defective UO₂ pellets or scraps is to oxidize them in air at about 450 °C to form U₃O₈ powder, which is then added to UO₂ powder. In the oxidation of a UO₂ pellet, the oxygen propagates along the grain boundaries. The U₃O₈ formation on the grain boundary causes a spallation of the grains, so the size and shape of the U₃O₈ powder depend strongly on the initial grain size of the UO₂ pellets. In the case of Mn-Al doped large grain pellets, the average grain size is about 45 μm, roughly 5 times larger than that of a typical un-doped UO₂ pellet (about 8-10 μm). This large difference in grain size is expected to cause a large difference in recycled U₃O₈ powder morphology. Addition of U₃O₈ to UO₂ leads to a drop in pellet density, impeding grain growth and promoting the formation of graph-like pore segregates. The degree of such degradation of the UO₂ pellet properties by adding the recycled U₃O₈ powder depends on the U₃O₈ powder properties, so it is necessary to understand the properties of the recycled U₃O₈ and their effect on the pellet. This paper shows a preliminary result about the recycled U₃O₈ powder which was obtained by

  3. Self-Calibrated In-Process Photogrammetry for Large Raw Part Measurement and Alignment before Machining.

    Science.gov (United States)

    Mendikute, Alberto; Yagüe-Fabra, José A; Zatarain, Mikel; Bertelsen, Álvaro; Leizea, Ibai

    2017-09-09

    Photogrammetry methods are being used more and more as a 3D technique for large scale metrology applications in industry. Optical targets are placed on an object and images are taken around it, where measuring traceability is provided by precise off-process pre-calibrated digital cameras and scale bars. According to the 2D target image coordinates, target 3D coordinates and camera views are jointly computed. One of the applications of photogrammetry is the measurement of raw part surfaces prior to its machining. For this application, post-process bundle adjustment has usually been adopted for computing the 3D scene. With that approach, a high computation time is observed, leading in practice to time consuming and user dependent iterative review and re-processing procedures until an adequate set of images is taken, limiting its potential for fast, easy-to-use, and precise measurements. In this paper, a new efficient procedure is presented for solving the bundle adjustment problem in portable photogrammetry. In-process bundle computing capability is demonstrated on a consumer grade desktop PC, enabling quasi-real-time 2D image and 3D scene computing. Additionally, a method for the self-calibration of camera and lens distortion has been integrated into the in-process approach due to its potential for highest precision when using low cost non-specialized digital cameras. Measurement traceability is set only by scale bars available in the measuring scene, avoiding the uncertainty contribution of off-process camera calibration procedures or the use of special purpose calibration artifacts. The developed self-calibrated in-process photogrammetry has been evaluated both in a pilot case scenario and in industrial scenarios for raw part measurement, showing a total in-process computing time typically below 1 s per image up to a maximum of 2 s during the last stages of the computed industrial scenes, along with a relative precision of 1/10,000 (e.g. 0.1 mm error in 1 m) with
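
The residuals minimized in bundle adjustment are reprojection errors: the projection of each 3D target into an image minus its observed 2D coordinates. A deliberately tiny sketch, assuming a single pinhole camera at the origin and hypothetical target data (the paper's solver handles many cameras, distortion, and self-calibration):

```python
import numpy as np

def reprojection_residuals(points_3d, observed_2d, f):
    """Pinhole projection of 3D target coordinates minus their observed
    2D image coordinates, for one camera at the origin with focal
    length f; bundle adjustment minimizes these residuals."""
    proj = f * points_3d[:, :2] / points_3d[:, 2:3]
    return (proj - observed_2d).ravel()

pts = np.array([[0.1, 0.2, 2.0], [-0.3, 0.1, 4.0]])   # made-up targets
obs = np.array([[50.0, 100.0], [-75.0, 25.0]])        # consistent observations
res = reprojection_residuals(pts, obs, f=1000.0)
print(res)  # zero residuals: observations match the projections exactly
```

In the full problem, the 3D points, camera poses, and calibration parameters are all unknowns, and a nonlinear least-squares solver drives these residuals toward zero.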

  4. Future-oriented maintenance strategy based on automated processes is finding its way into large astronomical facilities at remote observing sites

    Science.gov (United States)

    Silber, Armin; Gonzalez, Christian; Pino, Francisco; Escarate, Patricio; Gairing, Stefan

    2014-08-01

    With the expanding sizes and increasing complexity of large astronomical observatories at remote observing sites, the call for an efficient and resource-saving maintenance concept becomes louder. The increasing number of subsystems on telescopes and instruments forces large observatories, as in industry, to rethink conventional maintenance strategies to reach this demanding goal. The implementation of fully or semi-automatic processes for standard service activities can help to keep the number of operating staff at an efficient level and to significantly reduce the consumption of valuable consumables and equipment. In this contribution we demonstrate, using the example of the 80 cryogenic subsystems of the ALMA Front End instrument, how an implemented automatic service process increases the availability of spare parts and Line Replaceable Units, and how valuable staff resources can be freed from continuous repetitive maintenance activities to allow more focus on system diagnostic tasks, troubleshooting, and the interchanging of line replaceable units. The required service activities are decoupled from the day-to-day work, eliminating dependencies on workload peaks or logistic constraints. The automatic refurbishing processes run in parallel to the operational tasks with constant quality and without compromising the performance of the serviced system components. Consequently, this results in an efficiency increase and less downtime, and keeps the observing schedule on track. Automatic service processes in combination with proactive maintenance concepts provide the necessary flexibility for the complex operational work structures of large observatories. The gained planning flexibility allows an optimization of operational procedures and sequences by considering the required cost efficiency.

  5. Process Improvement to Enhance Quality in a Large Volume Labor and Birth Unit.

    Science.gov (United States)

    Bell, Ashley M; Bohannon, Jessica; Porthouse, Lisa; Thompson, Heather; Vago, Tony

    using the Lean process, frontline clinicians identified areas that needed improvement, developed and implemented successful strategies that addressed each gap, and enhanced the quality and safety of care for a large volume perinatal service.

  6. Measuring the In-Process Figure, Final Prescription, and System Alignment of Large Optics and Segmented Mirrors Using Lidar Metrology

    Science.gov (United States)

    Ohl, Raymond; Slotwinski, Anthony; Eegholm, Bente; Saif, Babak

    2011-01-01

    The fabrication of large optics is traditionally a slow process, and fabrication capability is often limited by measurement capability. While techniques exist to measure mirror figure with nanometer precision, measurements of large-mirror prescription are typically limited to submillimeter accuracy. Using a lidar instrument enables one to measure the optical surface rough figure and prescription in virtually all phases of fabrication without moving the mirror from its polishing setup. This technology improves the uncertainty of mirror prescription measurement to the micron regime.

  7. Experiments to Distribute Map Generalization Processes

    Science.gov (United States)

    Berli, Justin; Touya, Guillaume; Lokhat, Imran; Regnauld, Nicolas

    2018-05-01

    Automatic map generalization requires the use of computationally intensive processes often unable to deal with large datasets. Distributing the generalization process is the only way to make it scalable and usable in practice. But map generalization is a highly contextual process: the surroundings of a map feature need to be known to generalize it, which is a problem when distribution partitions the dataset and parallelizes the processing of each part. This paper proposes experiments to evaluate past propositions to distribute map generalization and to identify the main remaining issues. The past propositions to distribute map generalization are first discussed, and then the experiment hypotheses and apparatus are described. The experiments confirmed that regular partitioning was the quickest strategy, but also the least effective in taking context into account. The geographical partitioning, though less effective for now, is quite promising regarding the quality of the results, as it better integrates the geographical context.
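
The "regular partitioning" strategy compared in the experiments can be sketched as a simple grid assignment; the cell size and point coordinates below are invented for illustration. Its weakness is visible in the sketch itself: features near a cell boundary lose their neighbors, i.e. their geographical context.

```python
def grid_partition(features, cell):
    """Assign each (x, y) feature to a square grid cell; each cell can
    then be generalized by a separate worker (regular partitioning)."""
    parts = {}
    for x, y in features:
        key = (int(x // cell), int(y // cell))
        parts.setdefault(key, []).append((x, y))
    return parts

pts = [(0.5, 0.5), (1.5, 0.2), (0.1, 1.9), (1.1, 1.1)]
parts = grid_partition(pts, cell=1.0)
print(len(parts))  # 4 cells, one feature each
```

Geographical partitioning replaces the fixed grid with boundaries that follow natural feature groupings, trading speed for better context preservation.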

  8. Remaining lifetime modeling using State-of-Health estimation

    Science.gov (United States)

    Beganovic, Nejra; Söffker, Dirk

    2017-08-01

    Technical systems and system components undergo gradual degradation over time. Continuous degradation in a system is reflected in decreased reliability and unavoidably leads to system failure. Therefore, continuous evaluation of State-of-Health (SoH) is inevitable to provide at least the predefined lifetime of the system defined by the manufacturer or, even better, to extend it. However, a precondition for lifetime extension is accurate estimation of SoH as well as estimation and prediction of Remaining Useful Lifetime (RUL). For this purpose, lifetime models describing the relation between system/component degradation and consumed lifetime have to be established. In this contribution, the modeling and selection of suitable lifetime models from a database based on current SoH conditions are discussed. The main contribution of this paper is the development of new modeling strategies capable of describing complex relations between measurable system variables, related system degradation, and RUL. Two approaches with accompanying advantages and disadvantages are introduced and compared. Both approaches are capable of modeling stochastic aging processes of a system by simultaneous adaptation of RUL models to the current SoH. The first approach requires a priori knowledge about aging processes in the system and accurate estimation of SoH. An estimation of SoH here is conditioned on tracking the actual accumulated damage in the system, so that particular model parameters are defined according to a priori known assumptions about the system's aging. Prediction accuracy in this case is highly dependent on accurate estimation of SoH but includes a high number of degrees of freedom. The second approach does not require a priori knowledge about the system's aging, as particular model parameters are defined according to a multi-objective optimization procedure. The prediction accuracy of this model does not depend strongly on the estimated SoH. This model
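
The SoH-to-RUL mapping that such lifetime models provide can be illustrated with the simplest possible case: a linear degradation model with an assumed constant rate. This is a minimal sketch with invented numbers, not the paper's stochastic models.

```python
def remaining_useful_life(soh, rate):
    """RUL under a linear degradation model: SoH falls from 1.0 (new)
    to 0.0 (failure) at 'rate' per operating hour, so the remaining
    lifetime is simply the remaining SoH divided by the rate."""
    if rate <= 0:
        raise ValueError("degradation rate must be positive")
    return soh / rate

print(remaining_useful_life(0.6, 0.001))  # about 600 hours of RUL remain
```

The paper's contribution lies precisely in replacing the constant `rate` with models adapted online to the estimated SoH.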

  9. Development of large-volume rhyolitic ignimbrites (LRIs): The Chalupas Caldera, an example from Ecuador

    International Nuclear Information System (INIS)

    Hammersley, L.; DePaolo, D.J; Beate, B

    2001-01-01

    The mechanisms responsible for the generation of large volumes of silicic magma and the eruption of large-volume rhyolitic ignimbrites (LRIs) remain poorly understood. Of particular interest are the relative roles of crustal assimilation, fractional crystallization and magma supply, and the processes by which large volumes of magma accumulate in crustal chambers rather than erupting in smaller batches. Isotope geochemistry, combined with study of major and trace element variations of lavas, can be used to infer the relative contribution of crustal material and continued magmatic supply. Timescales for the accumulation of magma can be estimated using detailed geochronology. Magma supply rates can be estimated from eruption rates of nearby volcanoes. In this study we investigate the evolution of the Chalupas LRI, a caldera system in the Ecuadorian Andes, where LRIs are rare in comparison to the Southern Volcanic Zone (SVZ) of South America (au)

  10. Industrial irradiation processing of polymers. Status and prospects. Report

    International Nuclear Information System (INIS)

    2005-08-01

    At the close of the 20th century and now in the beginning of the 21st, several changes have taken place in the businesses marketing radiation source technologies used in industrial radiation processing. Such changes involved more than just transitions in ownership and product line extensions for proven equipment, but also the market successes of new accelerator technologies, the evolution of high intensity X ray processing, and the ability of providers and users of isotope sources to cope with heightened security issues involving radioactive materials. Concurrent with this evolution in source technologies, there has been a modest increase in the acceptance of radiation processing for polymeric materials. At the same time, there has been a broadening of the polymer options available to formulators and producers of irradiated products. Unfortunately, however, there have been no major market breakthroughs and no adoption of radiation processing on a large scale in a new industrial application. For example, the much-proven and long-hoped-for use of radiation processing by the food industry remains at a very small scale, despite the fact that this technology has cleared most regulatory hurdles that call for efficacy and the maintenance of food quality. This brief paper describes some of these changes and outlines some current issues that remain to be addressed

  11. Preparation by the nano-casting process of novel porous carbons from large pore zeolite templates

    International Nuclear Information System (INIS)

    F Gaslain; J Parmentier; V Valtchev; J Patarin; C Vix Guterl

    2005-01-01

    The development of new and growing industrial applications such as gas storage (e.g. methane or hydrogen) or electric double-layer capacitors has focused the attention of many research groups. For this kind of application, porous carbons with finely tailored microporosity (i.e. pore diameters ≤ 1 nm) appear to be very promising materials due to their high surface area and their specific pore size distribution. In order to meet these requirements, attention has been paid to the feasibility of preparing microporous carbons by the nano-casting process. Since the sizes and shapes of the pores and walls respectively become the walls and pores of the resultant carbons, using templates with different framework topologies leads to various carbon replicas. Work performed with commercially available zeolites employed as templates [1-4] showed that the most promising candidate is the FAU-type zeolite, a large-pore zeolite with a three-dimensional channel system. The promising results obtained on FAU-type matrices encouraged us to study microporous carbon formation on large pore zeolites synthesized in our laboratory, such as EMC-1 (International Zeolite Association framework type FAU), zeolite β (BEA) or EMC-2 (EMT). The carbon replicas were prepared largely following the nano-casting method proposed for zeolite Y by the Kyotani research group [4]: either by liquid impregnation of furfuryl alcohol (FA) followed by carbonization, or by chemical vapour deposition (CVD) of propylene, or by a combination of these two processes. Heat treatment of the mixed (zeolite/carbon) materials could also follow in order to improve the structural ordering of the carbon. After removal of the inorganic template by an acidic treatment, the carbon materials obtained were characterised by several analytical techniques (XRD, N₂ and CO₂ adsorption, electron microscopy, etc.). The unique characteristics of these carbons are discussed in detail in this paper and compared to those

  12. Process parameter impact on properties of sputtered large-area Mo bilayers for CIGS thin film solar cell applications

    Energy Technology Data Exchange (ETDEWEB)

    Badgujar, Amol C.; Dhage, Sanjay R., E-mail: dhage@arci.res.in; Joshi, Shrikant V.

    2015-08-31

    Copper indium gallium selenide (CIGS) has emerged as a promising candidate for thin film solar cells, with efficiencies approaching those of silicon-based solar cells. To achieve optimum performance in CIGS solar cells, uniform, conductive, stress-free, well-adherent, reflective, crystalline molybdenum (Mo) thin films with preferred (110) orientation are desirable as a back contact on large area glass substrates. The present study focuses on cylindrical rotating DC magnetron sputtered bilayer Mo thin films on 300 mm × 300 mm soda lime glass (SLG) substrates. Key sputtering variables, namely power and Ar gas flow rates, were optimized to achieve the best structural, electrical and optical properties. The Mo films were comprehensively characterized and found to possess a high degree of thickness uniformity over a large area. The best crystallinity, reflectance and sheet resistance were obtained at high sputtering powers and low argon gas flow rates, while mechanical properties like adhesion and residual stress were found to be best at low sputtering power and high argon gas flow rate, thereby indicating a need to arrive at a suitable trade-off during processing. - Highlights: • Sputtering of bilayer molybdenum thin films on soda lime glass • Large area deposition using a rotating cylindrical direct current magnetron • Trade-off between the sputter process parameters power and pressure • High uniformity of thickness and best electrical properties obtained • Suitable mechanical and optical properties of molybdenum are achieved for CIGS application.

  13. Methods for Prediction of Steel Temperature Curve in the Whole Process of a Localized Fire in Large Spaces

    Directory of Open Access Journals (Sweden)

    Zhang Guowei

    2014-01-01

    Full Text Available Based on a full-scale bookcase fire experiment, a fire development model is proposed for the whole process of localized fires in large-space buildings. We found that for localized fires in large-space buildings full of wooden combustible materials, the fire growth phase can be simplified into a t² fire with a 0.0346 kW/s² fire growth coefficient. FDS technology is applied to study the smoke temperature curve for a 2 MW to 25 MW fire occurring within a large space with a height of 6 m to 12 m and a building area of 1 500 m² to 10 000 m², based on the proposed fire development model. Through the analysis of smoke temperature in various fire scenarios, a new approach is proposed to predict the smoke temperature curve. Meanwhile, a modified model of steel temperature development in a localized fire is built. In the modified model, the localized fire source is treated as a point fire source to evaluate the net heat flux from the flame to the steel. The steel temperature curve in the whole process of a localized fire can thus be accurately predicted. These conclusions could provide a valuable reference for fire simulation, hazard assessment, and fire protection design.
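
    The t-squared growth model above can be sketched in a few lines. The 0.0346 kW/s² coefficient is the value reported in the abstract, while the sampling times and the 2 MW target below are illustrative choices, not values from the paper:

```python
ALPHA = 0.0346  # kW/s^2, fire growth coefficient reported in the abstract

def heat_release_rate(t_seconds):
    """Heat release rate Q(t) = alpha * t^2, in kW, for a t-squared design fire."""
    return ALPHA * t_seconds ** 2

def time_to_reach(q_kw):
    """Time (s) at which the growing fire reaches a target heat release rate."""
    return (q_kw / ALPHA) ** 0.5

if __name__ == "__main__":
    for t in (60, 300, 600):
        print(f"t = {t:4d} s  ->  Q = {heat_release_rate(t):8.1f} kW")
    # Time to reach a 2 MW fire, the lower end of the range simulated in the paper:
    print(f"2 MW reached after {time_to_reach(2000.0):.0f} s")
```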

  14. Emotion perception and executive control interact in the salience network during emotionally charged working memory processing

    NARCIS (Netherlands)

    Luo, Y.; Qin, S.; Fernandez, G.S.E.; Zhang, Y.; Klumpers, F.; Li, H.

    2014-01-01

    Processing of emotional stimuli can either hinder or facilitate ongoing working memory (WM); however, the neural basis of these effects remains largely unknown. Here we examined the neural mechanisms of these paradoxical effects by implementing a novel emotional WM task in an fMRI study. Twenty-five

  15. Semantic processing in deaf and hard-of-hearing children: Large N400 mismatch effects in brain responses, despite poor semantic ability

    Directory of Open Access Journals (Sweden)

    Petter Kallioinen

    2016-08-01

    Full Text Available Difficulties in auditory and phonological processing affect semantic processing in speech comprehension of deaf and hard-of-hearing (DHH) children. However, little is known about brain responses of semantic processing in this group. We investigated event-related potentials (ERPs) in DHH children with cochlear implants (CI) and/or hearing aids (HA), and in normally hearing controls (NH). We used a semantic priming task with spoken word primes followed by picture targets. In both DHH children and controls, response differences between matching and mismatching targets revealed a typical N400 effect associated with semantic processing. Children with CI had the largest mismatch response despite poor semantic abilities overall; children with CI also had the largest ERP differentiation between mismatch types, with small effects for within-category mismatches (target from the same category as the prime) and large effects for between-category mismatches (where the target is from a different category than the prime). NH and HA children had similar responses to both mismatch types. While the large and differentiated ERP responses in the CI group were unexpected and should be interpreted with caution, the results could reflect less precision in semantic processing among children with CI, or a stronger reliance on predictive processing.

  16. Validation of the process control system of an automated large scale manufacturing plant.

    Science.gov (United States)

    Neuhaus, H; Kremers, H; Karrer, T; Traut, R H

    1998-02-01

    The validation procedure for the process control system of a plant for the large scale production of human albumin from plasma fractions is described. A validation master plan is developed, defining the system and elements to be validated, the interfaces with other systems together with the validation limits, a general validation concept and supporting documentation. Based on this master plan, the validation protocols are developed. For the validation, the system is subdivided into a field level, which is the equipment part, and an automation level. The automation level is further subdivided into sections according to the different software modules. Based on a risk categorization of the modules, the qualification activities are defined. The test scripts for the different qualification levels (installation, operational and performance qualification) are developed according to a previously performed risk analysis.

  17. Optical methods to study the gas exchange processes in large diesel engines

    Energy Technology Data Exchange (ETDEWEB)

    Gros, S.; Hattar, C. [Wartsila Diesel International Oy, Vaasa (Finland); Hernberg, R.; Vattulainen, J. [Tampere Univ. of Technology, Tampere (Finland). Plasma Technology Lab.

    1996-12-01

    To study the gas exchange processes under realistic conditions in a single cylinder of a large production-line diesel engine, a fast optical absorption spectroscopic method was developed. With this method, line-of-sight UV absorption of SO₂ contained in the exhaust gas was measured as a function of time in the exhaust port area of a continuously fired medium speed diesel engine of type Waertsilae 6L20. SO₂, formed during combustion from the sulphur contained in the fuel, was used as a tracer to study the gas exchange as a function of time in the exhaust channel. For this 4-stroke diesel engine, by assuming a known concentration of SO₂ in the exhaust gas after exhaust valve opening and before the inlet and exhaust valve overlap period, the measured optical absorption was used to determine the gas density and, further, the instantaneous exhaust gas temperature during the exhaust cycle. (author)
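
    The inference chain the abstract describes (absorption → gas density → temperature) can be sketched as follows, assuming Beer-Lambert attenuation and ideal-gas behaviour. The cross-section, path length, mole fraction and pressure values are illustrative placeholders, not values from the study:

```python
import math

# All numeric values below are illustrative assumptions, NOT values from the study.
SIGMA = 1.0e-22     # m^2, assumed SO2 UV absorption cross-section
PATH = 0.05         # m, assumed line-of-sight path length across the exhaust port
X_SO2 = 5.0e-4      # assumed SO2 mole fraction in the exhaust gas
PRESSURE = 1.5e5    # Pa, assumed exhaust back-pressure
K_B = 1.380649e-23  # J/K, Boltzmann constant

def so2_number_density(transmission):
    """Beer-Lambert law: I/I0 = exp(-sigma*n*L), so n = -ln(I/I0) / (sigma*L)."""
    return -math.log(transmission) / (SIGMA * PATH)

def exhaust_temperature(transmission):
    """Total density n_tot = n_SO2 / x_SO2; ideal gas law gives T = p / (n_tot*k_B)."""
    n_tot = so2_number_density(transmission) / X_SO2
    return PRESSURE / (n_tot * K_B)
```

Higher transmission (weaker absorption) implies a lower gas density along the line of sight and hence, at fixed pressure, a higher instantaneous gas temperature.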

  18. Signal formation processes in Micromegas detectors and quality control for large size detector construction for the ATLAS new small wheel

    Energy Technology Data Exchange (ETDEWEB)

    Kuger, Fabian

    2017-07-31

    The Micromegas technology is one of the most successful modern gaseous detector concepts and is widely utilized in nuclear and particle physics experiments. Twenty years of R and D rendered the technology sufficiently mature for it to be selected as the precision tracking detector for the New Small Wheel (NSW) upgrade of the ATLAS Muon spectrometer. This will be the first large scale application of Micromegas in one of the major LHC experiments. However, many of the fundamental microscopic processes in these gaseous detectors are still not fully understood, and studies on several detector aspects, like the micromesh geometry, have never been addressed systematically. The studies on signal formation in Micromegas, presented in the first part of this thesis, focus on the microscopic signal electron loss mechanisms and the amplification processes in electron-gas interaction. Based on a detailed model of detector parameter dependencies, these processes are scrutinized in an iterative comparison between experimental results, theoretical predictions of the macroscopic observables and process simulation on the microscopic level. Utilizing the specialized detectors developed in the scope of this thesis as well as refined simulation algorithms, an unprecedented level of accuracy in the description of the microscopic processes is reached, deepening the understanding of the fundamental processes in gaseous detectors. The second part is dedicated to the challenges arising with the large scale Micromegas production for the ATLAS NSW. A selection of technological choices, partially influenced or determined by the studies presented herein, is discussed alongside a final report on two production related tasks addressing the detectors' core components: for the industrial production of resistive anode PCBs, a detailed quality control (QC) and quality assurance (QA) scheme, as well as the required testing tools, has been developed. In parallel, the study on micromesh parameter optimization

  19. Signal formation processes in Micromegas detectors and quality control for large size detector construction for the ATLAS new small wheel

    International Nuclear Information System (INIS)

    Kuger, Fabian

    2017-01-01

    The Micromegas technology is one of the most successful modern gaseous detector concepts and is widely utilized in nuclear and particle physics experiments. Twenty years of R and D rendered the technology sufficiently mature for it to be selected as the precision tracking detector for the New Small Wheel (NSW) upgrade of the ATLAS Muon spectrometer. This will be the first large scale application of Micromegas in one of the major LHC experiments. However, many of the fundamental microscopic processes in these gaseous detectors are still not fully understood, and studies on several detector aspects, like the micromesh geometry, have never been addressed systematically. The studies on signal formation in Micromegas, presented in the first part of this thesis, focus on the microscopic signal electron loss mechanisms and the amplification processes in electron-gas interaction. Based on a detailed model of detector parameter dependencies, these processes are scrutinized in an iterative comparison between experimental results, theoretical predictions of the macroscopic observables and process simulation on the microscopic level. Utilizing the specialized detectors developed in the scope of this thesis as well as refined simulation algorithms, an unprecedented level of accuracy in the description of the microscopic processes is reached, deepening the understanding of the fundamental processes in gaseous detectors. The second part is dedicated to the challenges arising with the large scale Micromegas production for the ATLAS NSW. A selection of technological choices, partially influenced or determined by the studies presented herein, is discussed alongside a final report on two production related tasks addressing the detectors' core components: for the industrial production of resistive anode PCBs, a detailed quality control (QC) and quality assurance (QA) scheme, as well as the required testing tools, has been developed. In parallel, the study on micromesh parameter optimization

  20. Towards large-scale production of solution-processed organic tandem modules based on ternary composites: Design of the intermediate layer, device optimization and laser based module processing

    DEFF Research Database (Denmark)

    Li, Ning; Kubis, Peter; Forberich, Karen

    2014-01-01

    on commercially available materials, which enhances the absorption of poly(3-hexylthiophene) (P3HT) and as a result increase the PCE of the P3HT-based large-scale OPV devices; 3. laser-based module processing, which provides an excellent processing resolution and as a result can bring the power conversion...... efficiency (PCE) of mass-produced organic photovoltaic (OPV) devices close to the highest PCE values achieved for lab-scale solar cells through a significant increase in the geometrical fill factor. We believe that the combination of the above mentioned concepts provides a clear roadmap to push OPV towards...

  1. Hydrogen combustion modelling in large-scale geometries

    International Nuclear Information System (INIS)

    Studer, E.; Beccantini, A.; Kudriakov, S.; Velikorodny, A.

    2014-01-01

    Hydrogen risk mitigation based on catalytic recombiners cannot exclude the formation of flammable clouds during the course of a severe accident in a Nuclear Power Plant. The consequences of combustion processes have to be assessed based on existing knowledge and the state of the art in CFD combustion modelling. The Fukushima accidents have also revealed the need to take hydrogen explosion phenomena into account in risk management. Thus combustion modelling in large-scale geometries is one of the remaining severe accident safety issues. At present there exists no combustion model which can accurately describe a combustion process inside a geometrical configuration typical of the Nuclear Power Plant (NPP) environment. Therefore the major attention in model development has to be paid to the adoption of existing approaches, or the creation of new ones, capable of reliably predicting the possibility of flame acceleration in geometries of that type. A set of experiments performed previously in the RUT facility and the Heiss Dampf Reactor (HDR) facility is used as a validation database for the development of a three-dimensional gas dynamic model for the simulation of hydrogen-air-steam combustion in large-scale geometries. The combustion regimes include slow deflagration, fast deflagration, and detonation. Modelling is based on the Reactive Discrete Equation Method (RDEM), where the flame is represented as an interface separating reactants and combustion products. The transport of the progress variable is governed by different flame surface wrinkling factors. The results of the numerical simulations are presented together with comparisons, critical discussions and conclusions. (authors)

  2. From Process Understanding to Process Control

    NARCIS (Netherlands)

    Streefland, M.

    2010-01-01

    A licensed pharmaceutical process is required to be executed within the validated ranges throughout the lifetime of product manufacturing. Changes to the process usually require the manufacturer to demonstrate that the safety and efficacy of the product remains unchanged. Recent changes in the

  3. Earthquake cycles and physical modeling of the process leading up to a large earthquake

    Science.gov (United States)

    Ohnaka, Mitiyasu

    2004-08-01

    A thorough discussion is made on what the rational constitutive law for earthquake ruptures ought to be from the standpoint of the physics of rock friction and fracture, on the basis of solid facts observed in the laboratory. From this standpoint, it is concluded that the constitutive law should be a slip-dependent law with parameters that may depend on slip rate or time. With the long-term goal of establishing a rational methodology of forecasting large earthquakes, the entire process of one cycle for a typical large earthquake is modeled, and a comprehensive scenario that unifies individual models for intermediate- and short-term (immediate) forecasts is presented within the framework based on the slip-dependent constitutive law and the earthquake cycle model. The earthquake cycle includes the phase of accumulation of elastic strain energy with tectonic loading (phase II), and the phase of rupture nucleation at the critical stage where an adequate amount of the elastic strain energy has been stored (phase III). Phase II plays a critical role in physical modeling of intermediate-term forecasting, and phase III in physical modeling of short-term (immediate) forecasting. The seismogenic layer and individual faults therein are inhomogeneous, and some of the physical quantities inherent in earthquake ruptures exhibit scale-dependence. It is therefore critically important to incorporate the properties of inhomogeneity and physical scaling, in order to construct realistic, unified scenarios with predictive capability. The scenario presented may be significant and useful as a necessary first step for establishing the methodology for forecasting large earthquakes.

  4. Different healing process of esophageal large mucosal defects by endoscopic mucosal dissection between with and without steroid injection in an animal model.

    Science.gov (United States)

    Nonaka, Kouichi; Miyazawa, Mitsuo; Ban, Shinichi; Aikawa, Masayasu; Akimoto, Naoe; Koyama, Isamu; Kita, Hiroto

    2013-04-25

    Stricture formation is one of the major complications after endoscopic removal of large superficial squamous cell neoplasms of the esophagus, and local steroid injections have been adopted to prevent it. However, the fundamental pathological alterations related to them have not been well analyzed so far. The aim of this study was to analyze the time course of the healing process of esophageal large mucosal defects resulting in stricture formation, and its modification by local steroid injection, using an animal model. Esophageal circumferential mucosal defects were created by endoscopic mucosal dissection (ESD) in four pigs. One pig was sacrificed five minutes after the ESD, and two other pigs were followed up on endoscopy and sacrificed one week and three weeks after the ESD, respectively. The remaining pig was followed up on endoscopy, received five local steroid injections, and was sacrificed eight weeks after the ESD. The esophageal tissues of all pigs were subjected to pathological analyses. For the pigs without steroid injection, the esophageal stricture was complete around three weeks after the ESD on both endoscopy and esophagography. Histopathological examination of the esophageal tissues revealed that spindle-shaped α-smooth muscle actin (SMA)-positive myofibroblasts, arranged in a parallel fashion and extending horizontally, were identified at the ulcer bed one week after the ESD, and increased thereafter, contributing to formation of the stenotic luminal ridge covered with the regenerated epithelium three weeks after the ESD. The proper muscle layer of the stricture site was thinned, with some myocytes seemingly showing transition to the myofibroblast layer. By contrast, for the pig with steroid injection, esophageal stricture formation was not evident, with limited appearance of the spindle-shaped myofibroblasts; instead, stellate or polygonal SMA-positive stromal cells arranged haphazardly appeared in the persistent granulation

  5. Research Update: Large-area deposition, coating, printing, and processing techniques for the upscaling of perovskite solar cell technology

    Directory of Open Access Journals (Sweden)

    Stefano Razza

    2016-09-01

    Full Text Available To bring perovskite solar cells to the industrial world, performance must be maintained at the photovoltaic module scale. Here we present large-area manufacturing and processing options applicable to large-area cells and modules. Printing and coating techniques, such as blade coating, slot-die coating, spray coating, screen printing, inkjet printing, and gravure printing (as alternatives to spin coating), as well as vacuum- or vapor-based deposition and laser patterning techniques, are being developed for an effective scale-up of the technology. The latter also enables the manufacture of solar modules on flexible substrates, an option beneficial for many applications and for roll-to-roll production.

  6. Food processing and allergenicity.

    Science.gov (United States)

    Verhoeckx, Kitty C M; Vissers, Yvonne M; Baumert, Joseph L; Faludi, Roland; Feys, Marcel; Flanagan, Simon; Herouet-Guicheney, Corinne; Holzhauser, Thomas; Shimojo, Ryo; van der Bolt, Nieke; Wichers, Harry; Kimber, Ian

    2015-06-01

    Food processing can have many beneficial effects. However, processing may also alter the allergenic properties of food proteins. A wide variety of processing methods is available, and their use depends largely on the food to be processed. In this review the impact of processing (heat and non-heat treatment) on the allergenic potential of proteins, and on the antigenic (IgG-binding) and allergenic (IgE-binding) properties of proteins, has been considered. A variety of allergenic foods (peanuts, tree nuts, cows' milk, hens' eggs, soy, wheat and mustard) have been reviewed. The overall conclusion drawn is that processing does not completely abolish the allergenic potential of allergens. Currently, only fermentation and hydrolysis may have the potential to reduce allergenicity to such an extent that symptoms will not be elicited, while other methods might be promising but need more data. Literature on the effect of processing on allergenic potential and the ability to induce sensitisation is scarce. This is an important issue, since processing may impact on the ability of proteins to cause the acquisition of allergic sensitisation, and the subject should be a focus of future research. Also, there remains a need to develop robust and integrated methods for the risk assessment of food allergenicity. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Software features and applications in process design, integration and operation

    Energy Technology Data Exchange (ETDEWEB)

    Dhole, V. [Aspen Tech Limited, Warrington (United Kingdom)

    1999-02-01

    Process engineering technologies and tools have evolved rapidly over the last twenty years. Process simulation/modeling, advanced process control, on-line optimisation, production planning and supply chain management are some examples of technologies that have rapidly matured from early commercial prototypes and concepts to established tools with a significant impact on the profitability of the process industry today. Process Synthesis or Process Integration (PI), in comparison, is yet to create its impact and still remains largely in the domain of a few expert users. One of the key reasons why PI has not taken off is that the PI tools have not become integral components of the standard process engineering environments. Over the last 15 years AspenTech has grown from a small process simulation tool provider to a large multinational company providing a complete suite of process engineering technologies and services covering process design, operation, planning and supply chain management. Throughout this period, AspenTech has acquired experience in rapidly evolving technologies from their early prototype stage to mature products and services. The paper outlines AspenTech's strategy of integrating PI with other more established process design and operational improvement technologies. The paper illustrates the key elements of AspenTech's strategy via examples of software development initiatives and services projects. The paper also outlines AspenTech's future vision of the role of PI in process engineering. (au)

  8. Hard processes in photon-photon interactions

    International Nuclear Information System (INIS)

    Duchovni, E.

    1985-03-01

    In this thesis, the existence of a hard component in two-photon collisions is investigated. Due to the relative simplicity of the photon, such processes can be exactly calculated in QCD. Untagged (low Q²) two-photon events are used. This leads to relatively high statistics, but to a severe background problem, due mainly to e⁺e⁻ annihilation. The background contamination is reduced to a tolerable level using a special set of cuts. Moreover, the remaining contamination is shown to be calculable with a small systematic error. A large number of events of the hard ''γγ'' type is found. An attempt is made to explain these events using the simplest QCD diagram (the Born term). This process is found to be capable of explaining only a quarter of the data. Other options, like the constituent interchange model, integer-charged quarks, and higher order diagrams, are therefore also discussed. The large cross-section for the production of ρ⁰ρ⁰ pairs in ''γγ'' collisions has not yet been understood. In order to look at closely related processes, a search for φρ⁰ and φφ was initiated. The cross-section for φπ⁺π⁻ was found to be sizeable. Only upper limits for the production of φρ⁰ and φφ are obtained

  9. Sentinel-1 data massive processing for large scale DInSAR analyses within Cloud Computing environments through the P-SBAS approach

    Science.gov (United States)

    Lanari, Riccardo; Bonano, Manuela; Buonanno, Sabatino; Casu, Francesco; De Luca, Claudio; Fusco, Adele; Manunta, Michele; Manzo, Mariarosaria; Pepe, Antonio; Zinno, Ivana

    2017-04-01

    -core programming techniques. Currently, Cloud Computing environments make available large collections of computing resources and storage that can be effectively exploited through the presented S1 P-SBAS processing chain to carry out interferometric analyses at a very large scale in reduced time. This also allows us to deal with the problems connected to the use of the S1 P-SBAS chain in operational contexts related to hazard monitoring and risk prevention and mitigation, where handling large amounts of data represents a challenging task. As a significant experimental result, we performed a large spatial scale SBAS analysis of Central and Southern Italy by exploiting the Amazon Web Services Cloud Computing platform. In particular, we processed in parallel 300 S1 acquisitions covering the Italian peninsula from Lazio to Sicily through the presented S1 P-SBAS processing chain, generating 710 interferograms and thus finally obtaining the displacement time series of the whole processed area. This work has been partially supported by the CNR-DPC agreement, the H2020 EPOS-IP project (GA 676564) and the ESA GEP project.
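
    The interferogram-generation stage described above is naturally parallel across acquisition pairs. The sketch below is a hypothetical illustration of that idea using Python's standard library; the pairing rule and the worker body are placeholders, not the actual P-SBAS implementation:

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import combinations

def sbas_pairs(dates, max_gap=3):
    """Small-baseline-style pairing: link each acquisition to its near neighbours.

    `dates` is an ordered list of acquisition identifiers; `max_gap` bounds the
    separation (in list positions) of a pair. Illustrative rule only.
    """
    return [(dates[i], dates[j])
            for i, j in combinations(range(len(dates)), 2)
            if j - i <= max_gap]

def make_interferogram(pair):
    """Placeholder worker: coregistration and phase differencing would go here."""
    a, b = pair
    return f"ifg_{a}_{b}"

def run(dates, workers=4):
    """Fan the independent pair computations out over a process pool."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(make_interferogram, sbas_pairs(dates)))
```

Because each pair is independent, the same fan-out pattern scales from a local process pool to the cloud-scale task distribution the abstract describes.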

  10. Upscaling of bio-nano-processes selective bioseparation by magnetic particles

    CERN Document Server

    Keller, Karsten

    2014-01-01

    Despite ongoing progress in nano- and biomaterial sciences, large scale bioprocessing of nanoparticles remains a great challenge, especially because of the difficulties in removing unwanted elements during processing at production level in the food, pharmaceutical and feed industries. This book presents magnetic nanoparticles and a novel technology for the upscaling of protein separation. The results come from the EU project "MagPro2Life", which was conducted in cooperation with several European institutions and companies.

  11. Endogenous magnetic reconnection and associated high energy plasma processes

    Science.gov (United States)

    Coppi, B.; Basu, B.

    2018-02-01

    An endogenous reconnection process involves a driving factor that lies inside the layer where a drastic change of magnetic field topology occurs. A process of this kind is shown to take place when an electron temperature gradient is present in a magnetically confined plasma and the evolving electron temperature fluctuations are anisotropic. The width of the reconnecting layer remains significant even when large macroscopic distances are considered. In view of the fact that there are plasmas in the Universe with considerable electron thermal energy content, this feature can be relied upon to produce generation or conversion of magnetic energy, high energy particle populations, and momentum and angular momentum transport.

  12. Development of a remaining lifetime management system for NPPS

    International Nuclear Information System (INIS)

    Galvan, J.C.; Regano, M.; Hevia Ruperez, F.

    1994-01-01

    The interest evinced by Spanish nuclear power plants in having a tool to support remaining lifetime management led to UNESA's application to OCIDE in 1992, and the latter's approval, for financing the project to develop a Remaining Lifetime Evaluation System for LWR nuclear power plants. This project is currently being developed under UNESA leadership, with the collaboration of three Spanish engineering companies and a research centre. The paper describes its objectives, activities, current status and prospects. The project is defined in two phases: the first consists of the identification and analysis of the main ageing phenomena and their significant parameters, and the specification of the Remaining Lifetime Evaluation System (RLES); the second is the implementation of a pilot application of the RLES to verify its effectiveness. (Author)

  13. Combined radiographic and anthropological approaches to victim identification of partially decomposed or skeletal remains

    International Nuclear Information System (INIS)

    Leo, C.; O'Connor, J.E.; McNulty, J.P.

    2013-01-01

    Victim identification is the priority in any scenario involving the discovery of single or multiple human remains for both humanitarian and legal reasons. Such remains may be incomplete and in various stages of decomposition. In such scenarios radiography contributes to both primary and secondary methods of identification; the comparison of ante-mortem dental radiographs to post-mortem findings is a primary identification method whereas the analysis of post-mortem skeletal radiographs to help create a biological profile and identify other individuating features is a secondary method of identification. This review will introduce and explore aspects of victim identification with a focus on the anthropological and radiography-based virtual anthropology approaches to establishing a biological profile, identifying other individuating factors and ultimately restoring an individual's identity. It will highlight the potential contribution that radiography, and radiographers, can make to the identification process and contribute to increasing awareness amongst radiographers of the value of their professional role in such investigations

  14. Photonic Architecture for Scalable Quantum Information Processing in Diamond

    Directory of Open Access Journals (Sweden)

    Kae Nemoto

    2014-08-01

    Full Text Available Physics and information are intimately connected, and the ultimate information processing devices will be those that harness the principles of quantum mechanics. Many physical systems have been identified as candidates for quantum information processing, but none of them are immune from errors. The challenge remains to find a path from the experiments of today to a reliable and scalable quantum computer. Here, we develop an architecture based on a simple module comprising an optical cavity containing a single negatively charged nitrogen vacancy center in diamond. Modules are connected by photons propagating in a fiber-optical network and collectively used to generate a topological cluster state, a robust substrate for quantum information processing. In principle, all processes in the architecture can be deterministic, but current limitations lead to processes that are probabilistic but heralded. We find that the architecture enables large-scale quantum information processing with existing technology.

  15. Large-group psychodynamics and massive violence

    Directory of Open Access Journals (Sweden)

    Vamik D. Volkan

    2006-06-01

    Full Text Available Beginning with Freud, psychoanalytic theories concerning large groups have mainly focused on individuals' perceptions of what their large groups psychologically mean to them. This chapter examines some aspects of large-group psychology in its own right and studies the psychodynamics of ethnic, national, religious or ideological groups, membership of which originates in childhood. I will compare the mourning process in individuals with the mourning process in large groups to illustrate why we need to study large-group psychology as a subject in itself. As part of this discussion I will also describe signs and symptoms of large-group regression. When there is a threat against a large group's identity, massive violence may be initiated, and this violence, in turn, has an obvious impact on public health.

  16. Innovation Processes in Large-Scale Public Foodservice-Case Findings from the Implementation of Organic Foods in a Danish County

    DEFF Research Database (Denmark)

    Mikkelsen, Bent Egberg; Nielsen, Thorkild; Kristensen, Niels Heine

    2005-01-01

    was carried out of the change process related to the implementation of organic foods in large-scale foodservice facilities in Greater Copenhagen County in order to study the effects of such a change. Based on the findings, a set of guidelines has been developed for the successful implementation of organic foods...

  17. The broad spectrum revisited: evidence from plant remains.

    Science.gov (United States)

    Weiss, Ehud; Wetterstrom, Wilma; Nadel, Dani; Bar-Yosef, Ofer

    2004-06-29

    The beginning of agriculture is one of the most important developments in human history, with enormous consequences that paved the way for settled life and complex society. Much of the research on the origins of agriculture over the last 40 years has been guided by Flannery's [Flannery, K. V. (1969) in The Domestication and Exploitation of Plants and Animals, eds. Ucko, P. J. & Dimbleby, G. W. (Duckworth, London), pp. 73-100] "broad spectrum revolution" (BSR) hypothesis, which posits that the transition to farming in southwest Asia entailed a period during which foragers broadened their resource base to encompass a wide array of foods that were previously ignored in an attempt to overcome food shortages. Although these resources undoubtedly included plants, nearly all BSR hypothesis-inspired research has focused on animals because of a dearth of Upper Paleolithic archaeobotanical assemblages. Now, however, a collection of >90,000 plant remains, recently recovered from the Stone Age site Ohalo II (23,000 B.P.), Israel, offers insights into the plant foods of the late Upper Paleolithic. The staple foods of this assemblage were wild grasses, pushing back the dietary shift to grains some 10,000 years earlier than previously recognized. Besides the cereals (wild wheat and barley), small-grained grasses made up a large component of the assemblage, indicating that the BSR in the Levant was even broader than originally conceived, encompassing what would have been low-ranked plant foods. Over the next 15,000 years small-grained grasses were gradually replaced by the cereals and ultimately disappeared from the Levantine diet.

  18. Efficient collective influence maximization in cascading processes with first-order transitions

    Science.gov (United States)

    Pei, Sen; Teng, Xian; Shaman, Jeffrey; Morone, Flaviano; Makse, Hernán A.

    2017-01-01

In many social and biological networks, the collective dynamics of the entire system can be shaped by a small set of influential units through a global cascading process, manifested by an abrupt first-order transition in dynamical behaviors. Despite its importance in applications, efficient identification of multiple influential spreaders in cascading processes remains a challenging task for large-scale networks. Here we address this issue by exploring the collective influence in general threshold models of cascading processes. Our analysis reveals that the importance of spreaders is fixed by the subcritical paths along which cascades propagate: the number of subcritical paths attached to each spreader determines its contribution to global cascades. The concept of subcritical path allows us to introduce a scalable algorithm for massively large-scale networks. Results in both synthetic random graphs and real networks show that the proposed method can achieve larger collective influence given the same number of seeds compared with other scalable heuristic approaches. PMID:28349988
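The threshold cascade and the notion of a subcritical (one-activation-away) node can be made concrete with a small sketch. Everything below is illustrative: `cascade` is a standard linear-threshold simulation, and `subcritical_score` is only a crude one-hop proxy for the paper's subcritical-path counting, not the proposed algorithm.

```python
from collections import deque

def cascade(adj, thresholds, seeds):
    """Deterministic threshold cascade: a node activates once the number
    of its active neighbors reaches its integer threshold."""
    active = set(seeds)
    queue = deque(seeds)
    hits = {v: 0 for v in adj}  # active-neighbor counts
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v in active:
                continue
            hits[v] += 1
            if hits[v] >= thresholds[v]:
                active.add(v)
                queue.append(v)
    return active

def subcritical_score(adj, thresholds):
    """Crude one-hop proxy for collective influence: count, for each node,
    the neighbors that are a single activation away from toppling."""
    return {u: sum(1 for v in adj[u] if thresholds[v] == 1) for u in adj}
```

On a small toy graph, seeding a single node whose neighbors are mostly subcritical is enough to trigger a global cascade; the paper's contribution is to extend this counting from one-hop neighbors to whole subcritical paths in a scalable way.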

  19. Carnivoran remains from the Malapa hominin site, South Africa.

    Directory of Open Access Journals (Sweden)

    Brian F Kuhn

Full Text Available Recent discoveries at the new hominin-bearing deposits of Malapa, South Africa, have yielded a rich faunal assemblage associated with the newly described hominin taxon Australopithecus sediba. Dating of this deposit using U-Pb and palaeomagnetic methods has provided an age of 1.977 Ma, making it one of the most accurately dated, tightly time-constrained deposits in the Plio-Pleistocene of southern Africa. To date, 81 carnivoran specimens have been identified at this site, including members of the families Canidae, Viverridae, Herpestidae, Hyaenidae and Felidae. Of note is the presence of the extinct taxon Dinofelis cf. D. barlowi that may represent the last appearance date for this species. Extant large carnivores are represented by specimens of leopard (Panthera pardus) and brown hyaena (Parahyaena brunnea). Smaller carnivores are also represented, and include the genera Atilax and Genetta, as well as Vulpes cf. V. chama. Malapa may also represent the first appearance date for Felis nigripes (black-footed cat). The geochronological age of Malapa and the associated hominin taxa and carnivoran remains provide a window of research into mammalian evolution during a relatively unknown period in South Africa and elsewhere. In particular, the fauna represented at Malapa has the potential to elucidate aspects of the evolution of Dinofelis and may help resolve competing hypotheses about faunal exchange between East and Southern Africa during the late Pliocene or early Pleistocene.

  20. Remaining life assessment of a high pressure turbine rotor

    International Nuclear Information System (INIS)

    Nguyen, Ninh; Little, Alfie

    2012-01-01

This paper describes finite element and fracture mechanics based modelling work that provides a useful tool for evaluation of the remaining life of a high pressure (HP) steam turbine rotor that had experienced thermal fatigue cracking. An axisymmetric model of a HP rotor was constructed. Steam temperature, pressure and rotor speed data from start-ups and shutdowns were used for the thermal and stress analysis. Operating history and inspection records were used to benchmark the damage experienced by the rotor. Fracture mechanics crack growth analysis was carried out to evaluate the remaining life of the rotor under thermal cyclic loading conditions. The work confirmed that the fracture mechanics approach in conjunction with finite element modelling provides a useful tool for assessing the remaining life of high temperature components in power plants.
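The crack-growth step of such a remaining-life assessment amounts to integrating a growth law over the thermal start/stop cycles. The sketch below is a generic illustration, not the authors' finite-element model: `paris_cycles_to_failure` integrates Paris' law, and the constants `C`, `m` and the geometry factor `Y` are hypothetical placeholder values.

```python
import math

def paris_cycles_to_failure(a0, ac, delta_sigma,
                            C=1e-12, m=3.0, Y=1.12, steps=100_000):
    """Numerically integrate Paris' law da/dN = C * (dK)**m, with the
    stress-intensity range dK = Y * delta_sigma * sqrt(pi * a), from an
    initial crack depth a0 to a critical depth ac (lengths in m,
    stresses in MPa). Returns the allowable number of load cycles."""
    da = (ac - a0) / steps
    a, cycles = a0, 0.0
    for _ in range(steps):
        dK = Y * delta_sigma * math.sqrt(math.pi * a)
        cycles += da / (C * dK ** m)  # dN = da / (da/dN)
        a += da
    return cycles
```

The remaining life in start/stop cycles is this number minus the cycles already consumed in service. For m = 3 the integral also has a closed form, 2(a0^(-1/2) - ac^(-1/2)) / (C (Y Δσ √π)^3), which can serve as a sanity check on the numerical integration.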

  1. INVESTIGATION OF LAUNCHING PROCESS FOR STEEL REINFORCED CONCRETE FRAMEWORK OF LARGE BRIDGES

    Directory of Open Access Journals (Sweden)

    V. A. Grechukhin

    2017-01-01

Full Text Available Bridges are among the most complicated, labour-intensive and expensive components of the roadway network of the Republic of Belarus, so their construction and operation must be carried out at a high technological level. One modern industrial method is the cyclic longitudinal launching of large frameworks, which makes it possible to dispense with expensive auxiliary facilities and to shorten the construction period. Several variants of longitudinal launching exist, depending on shipping conditions and span length: without a launching girder, with a launching girder, with a top strut-framed beam in the form of a cable-stayed system, and with a strut-framed beam located under the span. With the cyclic longitudinal launching method, the manufacturing of the span is concentrated on the shore. The main task of the investigation is to select an economical, fast and technologically simple type of cyclic longitudinal launching with minimum resource and labour inputs. Span launching over temporary supports specially constructed within the span was compared with launching over the permanent supports with the help of a launching girder. Calculations for the structural elements of the span, based on the bearing capacity of element sections during launching, during grouting of the reinforced concrete slab, and at the operation stage, show that span assembly with temporary supports does not reduce steel consumption in comparison with the variant excluding them. The results of the investigations were approbated in cooperation with the state enterprise “Belgiprodor” during the design of a bridge across the river Sozh.

  2. Postmortem Scavenging of Human Remains by Domestic Cats

    Directory of Open Access Journals (Sweden)

    Ananya Suntirukpong, M.D.

    2017-11-01

Full Text Available Objective: Crime scene investigators, forensic medicine doctors and pathologists, and forensic anthropologists frequently encounter postmortem scavenging of human remains by household pets. Case presentation: The authors present a case report of a partially skeletonized adult male found dead after more than three months in his apartment in Thailand. The body was in an advanced stage of decomposition with nearly complete skeletonization of the head, neck, hands, and feet. The presence of maggots and necrophagous (flesh-eating) beetles on the body confirmed that insects had consumed much of the soft tissues. Examination of the hand and foot bones revealed canine tooth puncture marks. Evidence of chewing indicated that one or more of the decedent’s three house cats had fed on the body after death. Recognizing and identifying carnivore and rodent activity on the soft flesh and bones of human remains is important in interpreting and reconstructing postmortem damage. Thorough analysis may help explain why skeletal elements are missing, damaged, or out of anatomical position. Conclusion: This report presents a multi-disciplinary approach combining forensic anthropology and forensic medicine in examining and interpreting human remains.

  3. Importance of regional species pools and functional traits in colonization processes: predicting re-colonization after large-scale destruction of ecosystems

    NARCIS (Netherlands)

    Kirmer, A.; Tischew, S.; Ozinga, W.A.; Lampe, von M.; Baasch, A.; Groenendael, van J.M.

    2008-01-01

    Large-scale destruction of ecosystems caused by surface mining provides an opportunity for the study of colonization processes starting with primary succession. Surprisingly, over several decades and without any restoration measures, most of these sites spontaneously developed into valuable biotope

  4. Cogeneration in large processing power stations; Cogeneracion en grandes centrales de proceso

    Energy Technology Data Exchange (ETDEWEB)

    Munoz, Jose Manuel [Observatorio Ciudadano de la Energia A. C., (Mexico)

    2004-06-15

This communication discusses cogeneration in large process plants, with or without electricity surpluses, the characteristics of combined-cycle power plants, and a comparative analysis presented in a graph entitled "Sale price of electricity in combined-cycle and cogeneration power plants". Industrial plants such as refineries, petrochemical plants, breweries, and paper and cellulose mills, among others, which need steam for their processes, have the technical and economic conditions to cogenerate, that is, to produce steam and electricity simultaneously. In fact, many such facilities in any country already have cogeneration equipment that allows them to obtain electricity at very low cost, taking advantage of the steam generators that are in any case indispensable to satisfy their demand. In Mexico, under the existing legal framework, both the public electricity service and the oil industry are activities reserved to the State. For these reasons, the subject should be part of the planning agenda of the energy sector. The opportunities referred to here are also valid for small industries, but from the point of view of the national interest they are more important for large facilities, and in that range the most numerous are indeed in PEMEX; cogeneration in refineries and petrochemical facilities would yield large energy surpluses and capacity of high value precisely for the public electricity service, that is, for the Comisión Federal de Electricidad (CFE).

  5. Solution processed large area fabrication of Ag patterns as electrodes for flexible heaters, electrochromics and organic solar cells

    DEFF Research Database (Denmark)

    Gupta, Ritu; Walia, Sunil; Hösel, Markus

    2014-01-01

    , the process takes only a few minutes without any expensive instrumentation. The electrodes exhibited excellent adhesion and mechanical properties, important for flexible device application. Using Ag patterned electrodes, heaters operating at low voltages, pixelated electrochromic displays as well as organic...... solar cells have been demonstrated. The method is extendable to produce defect-free patterns over large areas as demonstrated by roll coating....

  6. Explosives remain preferred methods for platform abandonment

    International Nuclear Information System (INIS)

    Pulsipher, A.; Daniel, W. IV; Kiesler, J.E.; Mackey, V. III

    1996-01-01

    Economics and safety concerns indicate that methods involving explosives remain the most practical and cost-effective means for abandoning oil and gas structures in the Gulf of Mexico. A decade has passed since 51 dead sea turtles, many endangered Kemp's Ridleys, washed ashore on the Texas coast shortly after explosives helped remove several offshore platforms. Although no relationship between the explosions and the dead turtles was ever established, in response to widespread public concern, the US Minerals Management Service (MMS) and National Marine Fisheries Service (NMFS) implemented regulations limiting the size and timing of explosive charges. Also, more importantly, they required that operators pay for observers to survey waters surrounding platforms scheduled for removal for 48 hr before any detonations. If observers spot sea turtles or marine mammals within the danger zone, the platform abandonment is delayed until the turtles leave or are removed. However, concern about the effects of explosives on marine life remains

  7. Large conditional single-photon cross-phase modulation

    Science.gov (United States)

    Hosseini, Mahdi; Duan, Yiheng; Vuletić, Vladan

    2016-01-01

    Deterministic optical quantum logic requires a nonlinear quantum process that alters the phase of a quantum optical state by π through interaction with only one photon. Here, we demonstrate a large conditional cross-phase modulation between a signal field, stored inside an atomic quantum memory, and a control photon that traverses a high-finesse optical cavity containing the atomic memory. This approach avoids fundamental limitations associated with multimode effects for traveling optical photons. We measure a conditional cross-phase shift of π/6 (and up to π/3 by postselection on photons that remain in the system longer than average) between the retrieved signal and control photons, and confirm deterministic entanglement between the signal and control modes by extracting a positive concurrence. By upgrading to a state-of-the-art cavity, our system can reach a coherent phase shift of π at low loss, enabling deterministic and universal photonic quantum logic. PMID:27519798

  8. An adaptive-order particle filter for remaining useful life prediction of aviation piston pumps

    Directory of Open Access Journals (Sweden)

    Tongyang LI

    2018-05-01

Full Text Available An accurate estimation of the remaining useful life (RUL) not only contributes to an effective application of an aviation piston pump, but also meets the necessity of condition based maintenance (CBM). Among current RUL evaluation methods, a model-based method is inappropriate for the degradation process of an aviation piston pump due to difficulties of modeling, while a data-based method rarely presents high-accuracy prediction over a long period of time. In this work, an adaptive-order particle filter (AOPF) prognostic process is proposed, aiming at improving the long-term prediction accuracy of RUL by combining both kinds of methods. A dynamic model is initialized by a data-driven or empirical method. When a new observation arrives, the prior state distribution is approximated by the current model, and the order of the current model is updated adaptively by fusing the information of the observation. Monte Carlo simulation is employed for estimating the posterior probability density function of future states of the pump's degradation. By updating the model order adaptively, the method achieves higher precision than traditional methods. In a case study, the proposed AOPF method is adopted to forecast the degradation status of an aviation piston pump with experimental return oil flow data, and the analytical results show the effectiveness of the proposed AOPF method. Keywords: Adaptive prognosis, Condition based maintenance (CBM), Particle filter (PF), Piston pump, Remaining useful life (RUL)
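The particle-filter machinery underlying such prognostics can be sketched in a few lines. The bootstrap filter below is a deliberately simplified stand-in for the paper's AOPF (fixed order, linear degradation model); the function name `bootstrap_pf_rul`, the priors, and the noise levels are all assumptions made for illustration. Particles are propagated through the degradation model, weighted by the observation likelihood, resampled, and finally extrapolated to a failure threshold to obtain an RUL estimate.

```python
import math
import random

def bootstrap_pf_rul(observations, failure_level, n=1000,
                     rate_prior=(0.1, 1.0), proc_std=0.05, obs_std=0.2,
                     seed=0):
    """Bootstrap particle filter for a linear degradation model
    x_t = x_{t-1} + rate + process noise, observed with Gaussian noise.
    Returns a mean remaining-useful-life estimate in time steps."""
    rng = random.Random(seed)
    # Each particle carries a (degradation level, degradation rate) pair;
    # the rate is drawn from an assumed uniform prior.
    particles = [(0.0, rng.uniform(*rate_prior)) for _ in range(n)]
    for y in observations:
        # Propagate particles through the assumed degradation model.
        particles = [(x + r + rng.gauss(0.0, proc_std), r) for x, r in particles]
        # Weight by the Gaussian observation likelihood.
        weights = [math.exp(-0.5 * ((y - x) / obs_std) ** 2) for x, _ in particles]
        total = sum(weights) or 1.0
        # Multinomial resampling keeps particles that explain the data.
        particles = rng.choices(particles, weights=[w / total for w in weights], k=n)
    # Extrapolate each surviving particle to the failure threshold.
    ruls = [max(failure_level - x, 0.0) / max(r, 1e-9) for x, r in particles]
    return sum(ruls) / n
```

For a pump degrading at a true rate of 0.5 units per step with a failure threshold of 10, ten noisy observations drive the rate posterior toward 0.5 and the RUL estimate toward roughly (10 - 5)/0.5 = 10 steps; the AOPF of the paper additionally adapts the model order at each update.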

  9. The evaluation of the introduction of a quality management system. A process-oriented case study in a large rehabilitation hospital

    NARCIS (Netherlands)

    van Harten, Willem H.; Casparie, Ton F.; Fisscher, O.A.M.

    2002-01-01

    Objectives: So far, there is limited proof concerning the effects of the introduction of quality management systems (QMS) on organisational level. This study concerns the introduction of a QMS in a large rehabilitation hospital. Methods: Using an observational framework, a process-analysis is

  10. Solution of large nonlinear time-dependent problems using reduced coordinates

    International Nuclear Information System (INIS)

    Mish, K.D.

    1987-01-01

This research is concerned with the idea of reducing a large time-dependent problem, such as one obtained from a finite-element discretization, down to a more manageable size while preserving the most important physical behavior of the solution. This reduction process is motivated by the concept of a projection operator on a Hilbert space, and leads to the Lanczos algorithm for generation of approximate eigenvectors of a large symmetric matrix. The Lanczos algorithm is then used to develop a reduced form of the spatial component of a time-dependent problem. The solution of the remaining temporal part of the problem is considered from the standpoint of numerical integration schemes in the time domain. All of these theoretical results are combined to motivate the proposed reduced-coordinate algorithm. This algorithm is then developed, discussed, and compared to related methods from the mechanics literature. The proposed reduced-coordinate method is then applied to the solution of some representative problems in mechanics. The results of these problems are discussed, conclusions are drawn, and suggestions are made for related future research.
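The reduction step described above, projecting a large symmetric operator onto a small Krylov basis, can be sketched as follows. This is a minimal pure-Python illustration with full reorthogonalization; the matrix and starting vector are arbitrary, and a production reduced-coordinate code would then integrate the small tridiagonal system T = Qᵀ A Q in time instead of the full problem.

```python
import math

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def lanczos(A, v0, k):
    """Generate an orthonormal Krylov basis Q and the diagonal (alphas)
    and off-diagonal (betas) entries of the tridiagonal projection
    T = Q^T A Q for a symmetric matrix A."""
    n = len(v0)
    norm = math.sqrt(dot(v0, v0))
    q = [x / norm for x in v0]
    Q, alphas, betas = [q], [], []
    q_prev, beta = [0.0] * n, 0.0
    for _ in range(k):
        w = matvec(A, q)
        alpha = dot(w, q)
        # three-term recurrence: remove components along q and q_prev
        w = [wi - alpha * qi - beta * pi for wi, qi, pi in zip(w, q, q_prev)]
        # full reorthogonalization for numerical robustness
        for qj in Q:
            c = dot(w, qj)
            w = [wi - c * qji for wi, qji in zip(w, qj)]
        alphas.append(alpha)
        beta = math.sqrt(dot(w, w))
        if beta < 1e-12:  # Krylov space exhausted
            break
        q_prev, q = q, [wi / beta for wi in w]
        Q.append(q)
        betas.append(beta)
    return Q, alphas, betas
```

Because T is tridiagonal and of modest size, its eigenpairs (Ritz approximations to those of A) and the reduced time integration are cheap, which is the payoff the abstract describes.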

  11. High-energy, large-momentum-transfer processes: Ladder diagrams in φ³ theory

    International Nuclear Information System (INIS)

    Newton, C.L.J.

    1990-01-01

Relativistic quantum field theories may help one to understand high-energy, large-momentum-transfer processes, where the center-of-mass energy is much larger than the transverse momentum transfers, which are in turn much larger than the masses of the participating particles. With this possibility in mind, the author studies ladder diagrams in φ³ theory. He shows that in the limit s ≫ |t| ≫ m², the scattering amplitude for the N-rung ladder diagram takes the form s^(-1)|t|^(-N+1) times a homogeneous polynomial of degree 2N − 2 in ln s and ln |t|. This polynomial takes different forms depending on the relation of ln |t| to ln s. More precisely, the asymptotic formula for the N-rung ladder diagram has points of non-analyticity when ln |t| = γ ln s for γ = 1/2, 1/3, …, 1/(N − 2).
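Written out, the quoted asymptotic result for the N-rung ladder amplitude reads (with P_{2N-2} denoting the homogeneous polynomial of degree 2N − 2; this is only a transcription of the statement above, not a derivation):

```latex
\mathcal{A}_N(s,t) \;\sim\; \frac{1}{s\,|t|^{N-1}}\,
P_{2N-2}\!\left(\ln s,\ \ln|t|\right),
\qquad s \gg |t| \gg m^2,
```

with P_{2N-2} non-analytic along the rays \ln|t| = \gamma \ln s, \gamma = \tfrac{1}{2}, \tfrac{1}{3}, \ldots, \tfrac{1}{N-2}.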

  12. The relationship marketing in the process of customer loyalty. Case large construction of Manizales

    Directory of Open Access Journals (Sweden)

    María Cristina Torres Camacho

    2015-06-01

Full Text Available This paper is based on the model of Lindgreen (2001), which holds that relationship marketing should be approached in three dimensions: objectives, definition of constructs, and tools, which together enable better customer management within organizations. The objective was to determine the characteristics of relationship marketing as a key factor in the customer loyalty process in the large construction firms of Manizales, Colombia. From a mixed perspective, the methodology relies on qualitative and quantitative cross-sectional instruments and analysis. The results tend to confirm that the builders recognize the importance of relationship marketing but have not raised it as a policy or defined it in their strategic plans; additionally, they report a lack of customer-retention strategies. Nevertheless, customers remain loyal because the construction firms work on meeting their needs, based on trust, commitment and communication. To conclude, loyal customers perceive that the construction firms do not periodically evaluate their satisfaction with the purchased product, and they also state that the firms show little interest in understanding their perceptions, holding personal meetings, maintaining constant communication through phone calls, and catering to their tastes and preferences.

  13. Really big data: Processing and analysis of large datasets

    Science.gov (United States)

    Modern animal breeding datasets are large and getting larger, due in part to the recent availability of DNA data for many animals. Computational methods for efficiently storing and analyzing those data are under development. The amount of storage space required for such datasets is increasing rapidl...

  14. All-optical signal processing data communication and storage applications

    CERN Document Server

    Eggleton, Benjamin

    2015-01-01

This book provides a comprehensive review of the state of the art in optical signal processing technologies and devices. It presents breakthrough solutions for enabling a pervasive use of optics in data communication and signal storage applications, and positions optical signal processing as a solution to overcome the capacity crunch in communication networks. The book's content ranges from the development of innovative materials and devices, such as graphene and slow-light structures, to the use of nonlinear optics for secure quantum information processing, for overcoming the classical Shannon limit on channel capacity, and for microwave signal processing. Although it holds the promise of a substantial speed improvement, optics in today’s communication infrastructure remains largely confined to the signal transport layer, as it lags behind electronics as far as signal processing is concerned. This situation will change in the near future as the tremendous growth of data traffic requires energy efficient and ful...

  15. Large-area homogeneous periodic surface structures generated on the surface of sputtered boron carbide thin films by femtosecond laser processing

    Energy Technology Data Exchange (ETDEWEB)

    Serra, R., E-mail: ricardo.serra@dem.uc.pt [SEG-CEMUC, Mechanical Engineering Department, University of Coimbra, Rua Luís Reis Santos, 3030-788 Coimbra (Portugal); Oliveira, V. [ICEMS-Instituto de Ciência e Engenharia de Materiais e Superfícies, Avenida Rovisco Pais no 1, 1049-001 Lisbon (Portugal); Instituto Superior de Engenharia de Lisboa, Avenida Conselheiro Emídio Navarro no 1, 1959-007 Lisbon (Portugal); Oliveira, J.C. [SEG-CEMUC, Mechanical Engineering Department, University of Coimbra, Rua Luís Reis Santos, 3030-788 Coimbra (Portugal); Kubart, T. [The Ångström Laboratory, Solid State Electronics, P.O. Box 534, SE-751 21 Uppsala (Sweden); Vilar, R. [Instituto Superior de Engenharia de Lisboa, Avenida Conselheiro Emídio Navarro no 1, 1959-007 Lisbon (Portugal); Instituto Superior Técnico, Avenida Rovisco Pais no 1, 1049-001 Lisbon (Portugal); Cavaleiro, A. [SEG-CEMUC, Mechanical Engineering Department, University of Coimbra, Rua Luís Reis Santos, 3030-788 Coimbra (Portugal)

    2015-03-15

Highlights: • Large-area LIPSS were formed by femtosecond laser processing B-C films surface. • The LIPSS spatial period increases with laser fluence (140–200 nm). • Stress-related sinusoidal-like undulations were formed on the B-C films surface. • The undulations amplitude (down to a few nanometres) increases with laser fluence. • Laser radiation absorption increases with surface roughness. - Abstract: Amorphous and crystalline sputtered boron carbide thin films have a very high hardness even surpassing that of bulk crystalline boron carbide (≈41 GPa). However, magnetron sputtered B-C films have high friction coefficients (C.o.F) which limit their industrial application. Nanopatterning of materials surfaces has been proposed as a solution to decrease the C.o.F. The contact area of the nanopatterned surfaces is decreased due to the nanometre size of the asperities which results in a significant reduction of adhesion and friction. In the present work, the surface of amorphous and polycrystalline B-C thin films deposited by magnetron sputtering was nanopatterned using infrared femtosecond laser radiation. Successive parallel laser tracks 10 μm apart were overlapped in order to obtain a processed area of about 3 mm². Sinusoidal-like undulations with the same spatial period as the laser tracks were formed on the surface of the amorphous boron carbide films after laser processing. The undulations amplitude increases with increasing laser fluence. The formation of undulations with a 10 μm period was also observed on the surface of the crystalline boron carbide film processed with a pulse energy of 72 μJ. The amplitude of the undulations is about 10 times higher than in the amorphous films processed at the same pulse energy due to the higher roughness of the films and consequent increase in laser radiation absorption. LIPSS formation on the surface of the films was achieved for the three B-C films under study. However, LIPSS are formed under

  16. Large-area homogeneous periodic surface structures generated on the surface of sputtered boron carbide thin films by femtosecond laser processing

    International Nuclear Information System (INIS)

    Serra, R.; Oliveira, V.; Oliveira, J.C.; Kubart, T.; Vilar, R.; Cavaleiro, A.

    2015-01-01

Highlights: • Large-area LIPSS were formed by femtosecond laser processing B-C films surface. • The LIPSS spatial period increases with laser fluence (140–200 nm). • Stress-related sinusoidal-like undulations were formed on the B-C films surface. • The undulations amplitude (down to a few nanometres) increases with laser fluence. • Laser radiation absorption increases with surface roughness. - Abstract: Amorphous and crystalline sputtered boron carbide thin films have a very high hardness even surpassing that of bulk crystalline boron carbide (≈41 GPa). However, magnetron sputtered B-C films have high friction coefficients (C.o.F) which limit their industrial application. Nanopatterning of materials surfaces has been proposed as a solution to decrease the C.o.F. The contact area of the nanopatterned surfaces is decreased due to the nanometre size of the asperities which results in a significant reduction of adhesion and friction. In the present work, the surface of amorphous and polycrystalline B-C thin films deposited by magnetron sputtering was nanopatterned using infrared femtosecond laser radiation. Successive parallel laser tracks 10 μm apart were overlapped in order to obtain a processed area of about 3 mm². Sinusoidal-like undulations with the same spatial period as the laser tracks were formed on the surface of the amorphous boron carbide films after laser processing. The undulations amplitude increases with increasing laser fluence. The formation of undulations with a 10 μm period was also observed on the surface of the crystalline boron carbide film processed with a pulse energy of 72 μJ. The amplitude of the undulations is about 10 times higher than in the amorphous films processed at the same pulse energy due to the higher roughness of the films and consequent increase in laser radiation absorption. LIPSS formation on the surface of the films was achieved for the three B-C films under study. However, LIPSS are formed under different

  17. Gist-based conceptual processing of pictures remains intact in patients with amnestic mild cognitive impairment.

    Science.gov (United States)

    Deason, Rebecca G; Hussey, Erin P; Budson, Andrew E; Ally, Brandon A

    2012-03-01

    The picture superiority effect, better memory for pictures compared to words, has been found in young adults, healthy older adults, and, most recently, in patients with Alzheimer's disease and mild cognitive impairment. Although the picture superiority effect is widely found, there is still debate over what drives this effect. One main question is whether it is enhanced perceptual or conceptual information that leads to the advantage for pictures over words. In this experiment, we examined the picture superiority effect in healthy older adults and patients with amnestic mild cognitive impairment (MCI) to better understand the role of gist-based conceptual processing. We had participants study three exemplars of categories as either words or pictures. In the test phase, participants were again shown pictures or words and were asked to determine whether the item was in the same category as something they had studied earlier or whether it was from a new category. We found that all participants demonstrated a robust picture superiority effect, better performance for pictures than for words. These results suggest that the gist-based conceptual processing of pictures is preserved in patients with MCI. While in healthy older adults preserved recollection for pictures could lead to the picture superiority effect, in patients with MCI it is most likely that the picture superiority effect is a result of spared conceptually based familiarity for pictures, perhaps combined with their intact ability to extract and use gist information.

  18. Recycling of cellulases in a continuous process for production of bioethanol

    DEFF Research Database (Denmark)

    Haven, Mai Østergaard

The focus of the work presented in this thesis is the recycling of commercial enzymes in a continuous process for production of bioethanol from biomass. To get a deeper understanding of the factors affecting the potential for enzyme recycling, the interactions between enzymes and biomass, the adsorption and desorption, as well as the stability and recovery of activity, were investigated. More knowledge of these factors has enabled a process adapted for enzyme recycling, the driver being that enzyme consumption remains a major cost when producing bioethanol from lignocellulosic biomass. Unlike previous studies, this PhD project investigates enzyme recycling at industrially relevant conditions in the Inbicon process, e.g. high dry-matter conditions and process configurations that could be implemented at large scale. The results point towards potential processes for industrial recycling of enzymes.

  19. Repair of human DNA in molecules that replicate or remain unreplicated following ultraviolet irradiation

    International Nuclear Information System (INIS)

    Waters, R.

    1980-01-01

The extent of DNA replication, the incidence of UV-induced pyrimidine dimers, and the repair replication observed after their excision were monitored in human fibroblasts irradiated with single or split UV doses. The excision repair processes were measured in molecules that remained unreplicated or in those that replicated after the latter UV irradiation. Less DNA replication was observed after a split, as opposed to a single, UV irradiation. Furthermore, a split dose did not modify the excision parameters measured after a single irradiation, regardless of whether the DNA had replicated or not.

  20. Thermally Dried Ink-Jet Process for 6,13-Bis(triisopropylsilylethynyl)-Pentacene for High Mobility and High Uniformity on a Large Area Substrate

    Science.gov (United States)

    Ryu, Gi Seong; Lee, Myung Won; Jeong, Seung Hyeon; Song, Chung Kun

    2012-05-01

    In this study we developed a simple ink-jet process for 6,13-bis(triisopropylsilylethynyl)-pentacene (TIPS-pentacene), which is known as a high-mobility soluble organic semiconductor, to achieve relatively high-mobility and high-uniformity performance for large-area applications. We analyzed the behavior of fluorescent particles in droplets and applied the results to determining a method of controlling the behavior of TIPS-pentacene molecules. The grain morphology of TIPS-pentacene varied depending on the temperature applied to the droplets during drying. We were able to obtain large and uniform grains at 46 °C without any “coffee stain”. The process was applied to a large-size organic thin-film transistor (OTFT) backplane for an electrophoretic display panel containing 192×150 pixels on a 6-in.-sized substrate. The average of mobilities of 36 OTFTs, which were taken from different locations of the backplane, was 0.44±0.08 cm2·V-1·s-1, with a small deviation of 20%, over a 6-in.-size area comprising 28,800 OTFTs. This process providing high mobility and high uniformity can be achieved by simply maintaining the whole area of the substrate at a specific temperature (46 °C in this case) during drying of the droplets.

  1. Thermally dried ink-jet process for 6,13-bis(triisopropylsilylethynyl)-pentacene for high mobility and high uniformity on a large area substrate

    Science.gov (United States)

    Ryu, Gi Seong; Lee, Myung Won; Jeong, Seung Hyeon; Song, Chung Kun

    2012-01-01

    In this study we developed a simple ink-jet process for 6,13-bis(triisopropylsilylethynyl)-pentacene (TIPS-pentacene), which is known as a high-mobility soluble organic semiconductor, to achieve relatively high-mobility and high-uniformity performance for large-area applications. We analyzed the behavior of fluorescent particles in droplets and applied the results to determining a method of controlling the behavior of TIPS-pentacene molecules. The grain morphology of TIPS-pentacene varied depending on the temperature applied to the droplets during drying. We were able to obtain large and uniform grains at 46 °C without any "coffee stain". The process was applied to a large-size organic thin-film transistor (OTFT) backplane for an electrophoretic display panel containing 192×150 pixels on a 6-in.-sized substrate. The average of mobilities of 36 OTFTs, which were taken from different locations of the backplane, was 0.44±0.08 cm2·V-1·s-1, with a small deviation of 20%, over a 6-in.-size area comprising 28,800 OTFTs. This process providing high mobility and high uniformity can be achieved by simply maintaining the whole area of the substrate at a specific temperature (46 °C in this case) during drying of the droplets.

  2. Prehistoric Agricultural Communities in West Central Alabama. Volume 2. Studies of Material Remains from the Lubbub Creek Archaeological Locality.

    Science.gov (United States)

    1983-01-01

    and excessive calculus deposits which promoted periodontal disease, was not observed in the sample. In a survey of caries experience in populations of...class. General categories such as large mammal (e.g., deer or bear), medium mammal (e.g., raccoon or dog sized), and small mammal (e.g., mouse or rabbit...sample from the Lubbub Creek Archaeological Locality. We know from ethnohistoric accounts and from archaeological remains that dogs were commensals

  3. Quantum information processing with atoms and photons

    International Nuclear Information System (INIS)

    Monroe, C.

    2003-01-01

    Quantum information processors exploit the quantum features of superposition and entanglement for applications not possible in classical devices, offering the potential for significant improvements in the communication and processing of information. Experimental realization of large-scale quantum information processors remains a long-term vision, as the required nearly pure quantum behaviour is observed only in exotic hardware such as individual laser-cooled atoms and isolated photons. But recent theoretical and experimental advances suggest that cold atoms and individual photons may lead the way towards bigger and better quantum information processors, effectively building mesoscopic versions of 'Schroedinger's cat' from the bottom up. (author)

  4. The large deviation principle and steady-state fluctuation theorem for the entropy production rate of a stochastic process in magnetic fields

    International Nuclear Information System (INIS)

    Chen, Yong; Ge, Hao; Xiong, Jie; Xu, Lihu

    2016-01-01

    Fluctuation theorem is one of the major achievements in the field of nonequilibrium statistical mechanics during the past two decades. There exist very few results for steady-state fluctuation theorem of sample entropy production rate in terms of large deviation principle for diffusion processes due to the technical difficulties. Here we give a proof for the steady-state fluctuation theorem of a diffusion process in magnetic fields, with explicit expressions of the free energy function and rate function. The proof is based on the Karhunen-Loève expansion of complex-valued Ornstein-Uhlenbeck process.
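
    As a sketch of the result described above, the steady-state fluctuation theorem typically takes the standard Gallavotti-Cohen form; the notation below is generic rather than the paper's exact expressions. With $W_t$ the entropy production accumulated along a trajectory of length $t$:

    ```latex
    % Large deviation principle for the time-averaged entropy production rate:
    P\!\left(\tfrac{W_t}{t} \approx w\right) \asymp e^{-t\, I(w)}, \qquad t \to \infty,

    % steady-state fluctuation theorem: Gallavotti-Cohen symmetry of the rate function,
    I(-w) = I(w) + w,

    % equivalently, for the free energy (scaled cumulant generating) function:
    c(\lambda) = \lim_{t\to\infty} \tfrac{1}{t}\,\ln \mathbb{E}\,\big[e^{-\lambda W_t}\big], \qquad c(\lambda) = c(1-\lambda).
    ```

    The technical content of such proofs lies in establishing the large deviation principle itself, here via the Karhunen-Loève expansion of the underlying complex-valued Ornstein-Uhlenbeck process.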

  5. Automated Processing Workflow for Ambient Seismic Recordings

    Science.gov (United States)

    Girard, A. J.; Shragge, J.

    2017-12-01

    Structural imaging using body-wave energy present in ambient seismic data remains a challenging task, largely because these wave modes are commonly much weaker than surface wave energy. In a number of situations body-wave energy has been extracted successfully; however, (nearly) all successful body-wave extraction and imaging approaches have focused on cross-correlation processing. While this is useful for interferometric purposes, it can also lead to the inclusion of unwanted noise events that dominate the resulting stack, leaving body-wave energy overpowered by the coherent noise. Conversely, wave-equation imaging can be applied directly on non-correlated ambient data that has been preprocessed to mitigate unwanted energy (i.e., surface waves, burst-like and electromechanical noise) to enhance body-wave arrivals. Following this approach, though, requires a significant preprocessing effort on often Terabytes of ambient seismic data, which is expensive and requires automation to be a feasible approach. In this work we outline an automated processing workflow designed to optimize body wave energy from an ambient seismic data set acquired on a large-N array at a mine site near Lalor Lake, Manitoba, Canada. We show that processing ambient seismic data in the recording domain, rather than the cross-correlation domain, allows us to mitigate energy that is inappropriate for body-wave imaging. We first develop a method for window selection that automatically identifies and removes data contaminated by coherent high-energy bursts. We then apply time- and frequency-domain debursting techniques to mitigate the effects of remaining strong amplitude and/or monochromatic energy without severely degrading the overall waveforms. After each processing step we implement a QC check to investigate improvements in the convergence rates - and the emergence of reflection events - in the cross-correlation plus stack waveforms over hour-long windows. 
Overall, the QC analyses suggest that
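
    The automated window-selection step described above can be sketched as an amplitude-based rejection pass over fixed-length windows; the RMS-versus-median rule and the constants below are illustrative choices, not the authors' actual criteria:

    ```python
    import numpy as np

    def reject_burst_windows(trace, win_len, k=5.0):
        """Flag fixed-length windows whose RMS amplitude exceeds k times the
        median window RMS; returns a boolean keep-mask, one entry per window."""
        n_win = len(trace) // win_len
        windows = trace[:n_win * win_len].reshape(n_win, win_len)
        rms = np.sqrt((windows ** 2).mean(axis=1))
        return rms <= k * np.median(rms)

    # Synthetic example: quiet ambient noise with one strong burst.
    rng = np.random.default_rng(0)
    trace = rng.normal(0.0, 1.0, 10_000)
    trace[4_000:4_500] += 50.0            # burst contaminates window 8
    keep = reject_burst_windows(trace, win_len=500)
    ```

    In practice a pass like this would run per channel over each recording segment before any time- or frequency-domain debursting is applied.
    
    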

  6. Effect of Heat Treatment Process on Mechanical Properties and Microstructure of a 9% Ni Steel for Large LNG Storage Tanks

    Science.gov (United States)

    Zhang, J. M.; Li, H.; Yang, F.; Chi, Q.; Ji, L. K.; Feng, Y. R.

    2013-12-01

    In this paper, two different heat treatment processes of a 9% Ni steel for large liquefied natural gas storage tanks were performed in an industrial heating furnace. The former was a special heat treatment process consisting of quenching, intercritical quenching, and tempering (Q-IQ-T). The latter was a heat treatment process consisting only of quenching and tempering (Q-T). Mechanical properties were measured by tensile testing and Charpy impact testing, and the microstructure was analyzed by optical microscopy, transmission electron microscopy, and x-ray diffraction. The results showed that outstanding mechanical properties were obtained from the Q-IQ-T process in comparison with the Q-T process, and a cryogenic toughness with a Charpy impact energy value of 201 J was achieved at 77 K. Microstructure analysis revealed that samples of the Q-IQ-T process had about 9.8% of austenite in needle-like martensite, while samples of the Q-T process only had about 0.9% of austenite retained in tempered martensite.

  7. ParaText : scalable solutions for processing and searching very large document collections : final LDRD report.

    Energy Technology Data Exchange (ETDEWEB)

    Crossno, Patricia Joyce; Dunlavy, Daniel M.; Stanton, Eric T.; Shead, Timothy M.

    2010-09-01

    This report is a summary of the accomplishments of the 'Scalable Solutions for Processing and Searching Very Large Document Collections' LDRD, which ran from FY08 through FY10. Our goal was to investigate scalable text analysis; specifically, methods for information retrieval and visualization that could scale to extremely large document collections. Towards that end, we designed, implemented, and demonstrated a scalable framework for text analysis - ParaText - as a major project deliverable. Further, we demonstrated the benefits of using visual analysis in text analysis algorithm development, improved performance of heterogeneous ensemble models in data classification problems, and the advantages of information theoretic methods in user analysis and interpretation in cross language information retrieval. The project involved 5 members of the technical staff and 3 summer interns (including one who worked two summers). It resulted in a total of 14 publications, 3 new software libraries (2 open source and 1 internal to Sandia), several new end-user software applications, and over 20 presentations. Several follow-on projects have already begun or will start in FY11, with additional projects currently in proposal.

  8. Why Agricultural Educators Remain in the Classroom

    Science.gov (United States)

    Crutchfield, Nina; Ritz, Rudy; Burris, Scott

    2013-01-01

    The purpose of this study was to identify and describe factors that are related to agricultural educator career retention and to explore the relationships between work engagement, work-life balance, occupational commitment, and personal and career factors as related to the decision to remain in the teaching profession. The target population for…

  9. Detecting and quantifying ongoing decay of organic archaeological remains - a discussion of different approaches

    DEFF Research Database (Denmark)

    Matthiesen, Henning

    2015-01-01

    …are well protected and are not undergoing rapid decay, and it requires a detailed knowledge of decay processes and rates. For instance, it is well established that the presence of water is of paramount importance for the preservation of organic material, and there are several examples where archaeological… Thus, for the management of archaeological sites it is necessary to develop tools and methods that allow us to discover ongoing decay as fast as possible. Furthermore, in order to prioritize between excavation, in situ preservation and mitigation, the decay rate should be evaluated on a quantitative scale to determine if the archaeological remains can be preserved for centuries, decades or only a few years under different conditions. This is a challenging task, as archaeological sites and materials are often heterogeneous and have been subjected to different site formation processes. This paper…
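
    As a minimal illustration of placing decay on such a quantitative scale (centuries versus decades versus years), assuming simple first-order loss kinetics; the rate constants below are invented for illustration, not measured values from the paper:

    ```python
    import math

    def years_to_fraction(k_per_year, fraction_remaining):
        """Time for an organic deposit to decay to a given remaining mass
        fraction under first-order kinetics m(t) = m0 * exp(-k * t)."""
        return -math.log(fraction_remaining) / k_per_year

    # Two hypothetical burial environments (illustrative rate constants):
    waterlogged = years_to_fraction(k_per_year=0.001, fraction_remaining=0.5)  # ~693 years to half-mass
    drained     = years_to_fraction(k_per_year=0.05,  fraction_remaining=0.5)  # ~14 years to half-mass
    ```

    Even this toy model makes the management point above: a change in site conditions that raises the decay constant shifts the preservation horizon from centuries to years.
    
    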

  10. Large transverse momentum processes in a non-scaling parton model

    International Nuclear Information System (INIS)

    Stirling, W.J.

    1977-01-01

    The production of large transverse momentum mesons in hadronic collisions by the quark fusion mechanism is discussed in a parton model which gives logarithmic corrections to Bjorken scaling. It is found that the moments of the large transverse momentum structure function exhibit a simple scale breaking behaviour similar to the behaviour of the Drell-Yan and deep inelastic structure functions of the model. An estimate of corresponding experimental consequences is made and the extent to which analogous results can be expected in an asymptotically free gauge theory is discussed. A simple set of rules is presented for incorporating the logarithmic corrections to scaling into all covariant parton model calculations. (Auth.)

  11. Analysis of cyclic variations of liquid fuel-air mixing processes in a realistic DISI IC-engine using Large Eddy Simulation

    International Nuclear Information System (INIS)

    Goryntsev, D.; Sadiki, A.; Klein, M.; Janicka, J.

    2010-01-01

    Direct injection spark ignition (DISI) engines have a large potential to reduce emissions and specific fuel consumption. One of the most important problems in the design of DISI engines is the cycle-to-cycle variation of the flow, mixing, and combustion processes. A Large Eddy Simulation (LES) based analysis is used to characterize the cycle-to-cycle fluctuations of the flow field as well as the mixture preparation in a realistic four-stroke internal combustion engine with a variable charge motion system. Based on the analysis of cycle-to-cycle velocity fluctuations of the in-cylinder flow, the impact of various fuel spray boundary conditions on injection processes and mixture preparation is pointed out. The joint effect of both cycle-to-cycle velocity fluctuations and variable spray boundary conditions is discussed in terms of mean and standard deviation of relative air-fuel ratio, velocity, and mass fraction. Finally, a qualitative analysis of the intensity of cyclic fluctuations below the spark plug is provided.

  12. PART 2: LARGE PARTICLE MODELLING Simulation of particle filtration processes in deformable media

    Directory of Open Access Journals (Sweden)

    Gernot Boiger

    2008-06-01

    Full Text Available In filtration processes it is necessary to consider both the interaction of the fluid with the solid parts and the effect of particles carried in the fluid and accumulated on the solid. While part 1 of this paper deals with the modelling of fluid-structure interaction effects, the accumulation of dirt particles is addressed in this paper. A closer look is taken at the implementation of a spherical, Lagrangian particle model suitable for small and large particles. As dirt accumulates in the fluid stream, it interacts with the surrounding filter fibre structure and over time causes modifications of the filter characteristics. The calculation of particle force interaction effects is necessary for an adequate simulation of this situation. A detailed discrete-phase Lagrange model was developed to take into account the two-way coupling of the fluid and accumulated particles. The simulation of large particles and the fluid-structure interaction is realised in a single finite-volume flow solver on the basis of the open-source software OpenFOAM.
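
    The core update of a spherical Lagrangian particle model can be sketched as one-way-coupled Stokes drag, with the particle velocity relaxing toward the local fluid velocity. The paper's solver is OpenFOAM-based and two-way coupled; this single-particle explicit-Euler integration is only a simplified illustration:

    ```python
    import numpy as np

    def track_particle(u_fluid, tau_p, dt, n_steps, v0=0.0):
        """Explicit-Euler integration of Stokes drag:
               dv/dt = (u_fluid - v) / tau_p
        where tau_p is the particle relaxation time."""
        v = v0
        history = [v]
        for _ in range(n_steps):
            v += dt * (u_fluid - v) / tau_p
            history.append(v)
        return np.array(history)

    # A particle released at rest in a fluid stream of unit velocity:
    v = track_particle(u_fluid=1.0, tau_p=0.01, dt=0.001, n_steps=100)
    # after ~10 relaxation times the particle essentially moves with the fluid
    ```

    Two-way coupling would add the equal-and-opposite drag force back onto the fluid momentum equation in each cell, which is what lets accumulated particles modify the filter's flow characteristics over time.
    
    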

  13. Performance evaluation of the DCMD desalination process under bench scale and large scale module operating conditions

    KAUST Repository

    Francis, Lijo

    2014-04-01

    The flux performance of different hydrophobic microporous flat sheet commercial membranes made of polytetrafluoroethylene (PTFE) and polypropylene (PP) was tested for Red Sea water desalination using the direct contact membrane distillation (DCMD) process, under bench scale (high δT) and large scale module (low δT) operating conditions. Membranes were characterized for their surface morphology, water contact angle, thickness, porosity, pore size and pore size distribution. The DCMD process performance was optimized using a locally designed and fabricated module aiming to maximize the flux at different levels of operating parameters, mainly feed water and coolant inlet temperatures at different temperature differences across the membrane (δT). Water vapor flux of 88.8 kg/m2h was obtained using a PTFE membrane at high δT (60°C). In addition, the flux performance was compared to the first generation of a new locally synthesized and fabricated membrane made of a different class of polymer under the same conditions. A total salt rejection of 99.99% and boron rejection of 99.41% were achieved under extreme operating conditions. On the other hand, a detailed water characterization revealed that low molecular weight non-ionic molecules (ppb level) were transported with the water vapor molecules through the membrane structure. The membrane which provided the highest flux was then tested under large scale module operating conditions. The average flux of the latter study (low δT) was found to be eight times lower than that of the bench scale (high δT) operating conditions.
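
    The strong dependence of flux on δT reported above follows from the standard membrane distillation flux relation J = B_m (P(T_feed) - P(T_perm)), where P(T) is the water vapor pressure, here estimated with the Antoine equation. The membrane coefficient b_m below is a hypothetical value for illustration, not one measured in the study:

    ```python
    import math

    def water_vapor_pressure_pa(t_celsius):
        """Antoine equation for water (valid roughly 1-100 degC); returns Pa."""
        a, b, c = 8.07131, 1730.63, 233.426   # constants for P in mmHg
        p_mmhg = 10 ** (a - b / (c + t_celsius))
        return p_mmhg * 133.322

    def dcmd_flux(t_feed, t_perm, b_m=1.5e-7):
        """Vapor flux J = B_m * (P(T_feed) - P(T_perm)) in kg/m^2/s,
        with b_m an illustrative membrane distillation coefficient."""
        return b_m * (water_vapor_pressure_pa(t_feed) - water_vapor_pressure_pa(t_perm))

    # Large delta-T (bench scale) versus small delta-T (large module):
    j_high = dcmd_flux(t_feed=80.0, t_perm=20.0)
    j_low  = dcmd_flux(t_feed=50.0, t_perm=42.5)
    ```

    Because vapor pressure grows roughly exponentially with temperature, shrinking δT in a large module collapses the driving force, consistent with the order-of-magnitude flux drop observed between bench and module scale.
    
    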

  14. Performance evaluation of the DCMD desalination process under bench scale and large scale module operating conditions

    KAUST Repository

    Francis, Lijo; Ghaffour, NorEddine; Alsaadi, Ahmad Salem; Nunes, Suzana Pereira; Amy, Gary L.

    2014-01-01

    The flux performance of different hydrophobic microporous flat sheet commercial membranes made of polytetrafluoroethylene (PTFE) and polypropylene (PP) was tested for Red Sea water desalination using the direct contact membrane distillation (DCMD) process, under bench scale (high δT) and large scale module (low δT) operating conditions. Membranes were characterized for their surface morphology, water contact angle, thickness, porosity, pore size and pore size distribution. The DCMD process performance was optimized using a locally designed and fabricated module aiming to maximize the flux at different levels of operating parameters, mainly feed water and coolant inlet temperatures at different temperature differences across the membrane (δT). Water vapor flux of 88.8 kg/m2h was obtained using a PTFE membrane at high δT (60°C). In addition, the flux performance was compared to the first generation of a new locally synthesized and fabricated membrane made of a different class of polymer under the same conditions. A total salt rejection of 99.99% and boron rejection of 99.41% were achieved under extreme operating conditions. On the other hand, a detailed water characterization revealed that low molecular weight non-ionic molecules (ppb level) were transported with the water vapor molecules through the membrane structure. The membrane which provided the highest flux was then tested under large scale module operating conditions. The average flux of the latter study (low δT) was found to be eight times lower than that of the bench scale (high δT) operating conditions.

  15. Macroecological factors explain large-scale spatial population patterns of ancient agriculturalists

    NARCIS (Netherlands)

    Xu, C.; Chen, B.; Abades, S.; Reino, L.; Teng, S.; Ljungqvist, F.C.; Huang, Z.Y.X.; Liu, X.

    2015-01-01

    Aim: It has been well demonstrated that the large-scale distribution patterns of numerous species are driven by similar macroecological factors. However, understanding of this topic remains limited when applied to our own species. Here we take a large-scale look at ancient agriculturalist

  16. New processes for uranium isotope separation

    International Nuclear Information System (INIS)

    Vanstrum, P.R.; Levin, S.A.

    1977-01-01

    An overview of the status and prospects for processes other than gaseous diffusion, gas centrifuge, and separation nozzle for uranium isotope separation is presented. The incentive for the development of these processes is the increasing requirements for enriched uranium as fuel for nuclear power plants and the potential for reducing the high costs of enrichment. The latest nuclear power projections are converted to uranium enrichment requirements. The size and timing of the market for new enrichment processes are then determined by subtracting the existing and planned uranium enrichment capacities. It is estimated that to supply this market would require the construction of a large new enrichment plant of 9,000,000 SWU per year capacity, costing about $3 billion each (in 1976 dollars) about every year till the year 2000. A very comprehensive review of uranium isotope separation processes was made in 1971 by the Uranium Isotope Separation Review Ad Hoc Committee of the USAEC. Many of the processes discussed in that review are of little current interest. However, because of new approaches or remaining uncertainties about potential, there is considerable effort or continuing interest in a number of alternative processes. The status and prospects for attaining the requirements for competitive economics are presented for these processes, which include laser, chemical exchange, aerodynamic other than separation nozzle, and plasma processes. A qualitative summary comparison of these processes is made with the gaseous diffusion, gas centrifuge, and separation nozzle processes. In order to complete the overview of new processes for uranium isotope separation, a generic program schedule of typical steps beyond the basic process determination which are required, such as subsystem, module, pilot plant, and finally plant construction, before large-scale production can be attained is presented. Also the present value savings through the year 2000 is shown for various

  17. Model reduction for the dynamics and control of large structural systems via neural network processing direct numerical optimization

    Science.gov (United States)

    Becus, Georges A.; Chan, Alistair K.

    1993-01-01

    Three neural network processing approaches in a direct numerical optimization model reduction scheme are proposed and investigated. Large structural systems, such as large space structures, offer new challenges to both structural dynamicists and control engineers. One such challenge is that of dimensionality. Indeed these distributed parameter systems can be modeled either by infinite dimensional mathematical models (typically partial differential equations) or by high dimensional discrete models (typically finite element models) often exhibiting thousands of vibrational modes usually closely spaced and with little, if any, damping. Clearly, some form of model reduction is in order, especially for the control engineer who can actively control but a few of the modes using system identification based on a limited number of sensors. Inasmuch as the amount of 'control spillover' (in which the control inputs excite the neglected dynamics) and/or 'observation spillover' (where neglected dynamics affect system identification) is to a large extent determined by the choice of particular reduced model (RM), the way in which this model reduction is carried out is often critical.

  18. Identification of human remains from the Second World War mass graves uncovered in Bosnia and Herzegovina.

    Science.gov (United States)

    Marjanović, Damir; Hadžić Metjahić, Negra; Čakar, Jasmina; Džehverović, Mirela; Dogan, Serkan; Ferić, Elma; Džijan, Snježana; Škaro, Vedrana; Projić, Petar; Madžar, Tomislav; Rod, Eduard; Primorac, Dragan

    2015-06-01

    To present the results obtained in the identification of human remains from World War II found in two mass graves in Ljubuški, Bosnia and Herzegovina. Samples from 10 skeletal remains were collected. Teeth and femoral fragments were collected from 9 skeletons and only a femoral fragment from 1 skeleton. DNA was isolated from bone and teeth samples using an optimized phenol/chloroform DNA extraction procedure. All samples required a pre-extraction decalcification with EDTA and additional post-extraction DNA purification using filter columns. Additionally, DNA from 12 reference samples (buccal swabs from potential living relatives) was extracted using the Qiagen DNA extraction method. QuantifilerTM Human DNA Quantification Kit was used for DNA quantification. PowerPlex ESI kit was used to simultaneously amplify 15 autosomal short tandem repeat (STR) loci, and PowerPlex Y23 was used to amplify 23 Y chromosomal STR loci. Matching probabilities were estimated using a standard statistical approach. A total of 10 samples were processed, 9 teeth and 1 femoral fragment. Nine of 10 samples were profiled using autosomal STR loci, which resulted in useful DNA profiles for 9 skeletal remains. A comparison of established victims' profiles against a reference sample database yielded 6 positive identifications. DNA analysis may efficiently contribute to the identification of remains even seven decades after the end of the World War II. The significant percentage of positively identified remains (60%), even when the number of the examined possible living relatives was relatively small (only 12), proved the importance of cooperation with the members of the local community, who helped to identify the closest missing persons' relatives and collect referent samples from them.
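
    The "standard statistical approach" to matching probabilities in such identifications is typically the product rule over independent STR loci under Hardy-Weinberg equilibrium. A minimal sketch follows; the allele frequencies are made up for illustration, not taken from the study's reference population:

    ```python
    def genotype_frequency(p, q=None):
        """Hardy-Weinberg genotype frequency: p^2 for a homozygote,
        2*p*q for a heterozygote."""
        return p * p if q is None else 2 * p * q

    def random_match_probability(loci):
        """Product rule across independent STR loci. Each locus is given as
        (p,) for a homozygote or (p, q) for a heterozygote."""
        rmp = 1.0
        for alleles in loci:
            rmp *= genotype_frequency(*alleles)
        return rmp

    # Illustrative 4-locus profile (hypothetical allele frequencies):
    profile = [(0.1, 0.2), (0.05,), (0.15, 0.3), (0.2,)]
    rmp = random_match_probability(profile)   # ~3.6e-7
    ```

    With the 15 autosomal loci of a kit like PowerPlex ESI, the product of per-locus frequencies becomes vanishingly small, which is what makes kinship-based identification statistically decisive even with few living relatives.
    
    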

  19. Large shift in source of fine sediment in the upper Mississippi River

    Science.gov (United States)

    Belmont, P.; Gran, K.B.; Schottler, S.P.; Wilcock, P.R.; Day, S.S.; Jennings, C.; Lauer, J.W.; Viparelli, E.; Willenbring, J.K.; Engstrom, D.R.; Parker, G.

    2011-01-01

    Although sediment is a natural constituent of rivers, excess loading to rivers and streams is a leading cause of impairment and biodiversity loss. Remedial actions require identification of the sources and mechanisms of sediment supply. This task is complicated by the scale and complexity of large watersheds as well as changes in climate and land use that alter the drivers of sediment supply. Previous studies in Lake Pepin, a natural lake on the Mississippi River, indicate that sediment supply to the lake has increased 10-fold over the past 150 years. Herein we combine geochemical fingerprinting and a suite of geomorphic change detection techniques with a sediment mass balance for a tributary watershed to demonstrate that, although the sediment loading remains very large, the dominant source of sediment has shifted from agricultural soil erosion to accelerated erosion of stream banks and bluffs, driven by increased river discharge. Such hydrologic amplification of natural erosion processes calls for a new approach to watershed sediment modeling that explicitly accounts for channel and floodplain dynamics that amplify or dampen landscape processes. Further, this finding illustrates a new challenge in remediating nonpoint sediment pollution and indicates that management efforts must expand from soil erosion to factors contributing to increased water runoff. © 2011 American Chemical Society.

  20. The Influence of Negative Emotion on Cognitive and Emotional Control Remains Intact in Aging

    Directory of Open Access Journals (Sweden)

    Artyom Zinchenko

    2017-11-01

    Full Text Available Healthy aging is characterized by a gradual decline in cognitive control and inhibition of interferences, while emotional control is either preserved or facilitated. Emotional control regulates the processing of emotional conflicts such as irony in speech, and cognitive control resolves conflict between non-affective tendencies. While negative emotion can trigger control processes and speed up resolution of both cognitive and emotional conflicts, we know little about how aging affects the interaction of emotion and control. In two EEG experiments, we compared the influence of negative emotion on cognitive and emotional conflict processing in groups of younger adults (mean age = 25.2 years) and older adults (69.4 years). Participants viewed short video clips and either categorized spoken vowels (cognitive conflict) or their emotional valence (emotional conflict), while the visual facial information was congruent or incongruent. Results show that negative emotion modulates both cognitive and emotional conflict processing in younger and older adults, as indicated in reduced response times and/or enhanced event-related potentials (ERPs). In emotional conflict processing, we observed a valence-specific N100 ERP component in both age groups. In cognitive conflict processing, we observed an interaction of emotion by congruence in the N100 responses in both age groups, and a main effect of congruence in the P200 and N200. Thus, the influence of emotion on conflict processing remains intact in aging, despite a marked decline in cognitive control. Older adults may prioritize emotional wellbeing and preserve the role of emotion in cognitive and emotional control.

  1. Adding large EM stack support

    KAUST Repository

    Holst, Glendon

    2016-12-01

    Serial section electron microscopy (SSEM) image stacks generated using high throughput microscopy techniques are an integral tool for investigating brain connectivity and cell morphology. FIB or 3View scanning electron microscopes easily generate gigabytes of data. In order to produce analyzable 3D dataset from the imaged volumes, efficient and reliable image segmentation is crucial. Classical manual approaches to segmentation are time consuming and labour intensive. Semiautomatic seeded watershed segmentation algorithms, such as those implemented by ilastik image processing software, are a very powerful alternative, substantially speeding up segmentation times. We have used ilastik effectively for small EM stacks – on a laptop, no less; however, ilastik was unable to carve the large EM stacks we needed to segment because its memory requirements grew too large – even for the biggest workstations we had available. For this reason, we refactored the carving module of ilastik to scale it up to large EM stacks on large workstations, and tested its efficiency. We modified the carving module, building on existing blockwise processing functionality to process data in manageable chunks that can fit within RAM (main memory). We review this refactoring work, highlighting the software architecture, design choices, modifications, and issues encountered.
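
    The blockwise strategy described above, covering a volume with chunks that fit in RAM and processing one at a time, can be sketched generically as follows. This is not ilastik's actual API; the thresholding stands in for the real per-block segmentation work:

    ```python
    import numpy as np

    def iter_blocks(shape, block_shape):
        """Yield slice tuples covering a 3D volume in non-overlapping blocks."""
        for z in range(0, shape[0], block_shape[0]):
            for y in range(0, shape[1], block_shape[1]):
                for x in range(0, shape[2], block_shape[2]):
                    yield (slice(z, min(z + block_shape[0], shape[0])),
                           slice(y, min(y + block_shape[1], shape[1])),
                           slice(x, min(x + block_shape[2], shape[2])))

    # Stand-in "segmentation": threshold each block independently, so only one
    # block is ever resident in RAM (volume could equally be an np.memmap on disk).
    volume = np.random.default_rng(1).random((100, 120, 120))
    out = np.empty(volume.shape, dtype=np.uint8)
    for sl in iter_blocks(volume.shape, (64, 64, 64)):
        out[sl] = (volume[sl] > 0.5).astype(np.uint8)
    ```

    For operations like seeded watershed carving, real blockwise processing must also handle block boundaries (e.g. by overlapping halos and stitching), which is where most of the refactoring effort tends to go.
    
    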

  2. EBSD-based techniques for characterization of microstructural restoration processes during annealing of metals deformed to large plastic strains

    DEFF Research Database (Denmark)

    Godfrey, A.; Mishin, Oleg; Yu, Tianbo

    2012-01-01

    Some methods for quantitative characterization of microstructures deformed to large plastic strains, both before and after annealing, are discussed and illustrated using examples of samples after equal channel angular extrusion and cold-rolling. It is emphasized that the microstructures in such deformed samples exhibit a heterogeneity in the microstructural refinement by high angle boundaries. Based on this, a new parameter describing the fraction of regions containing predominantly low angle boundaries is introduced. This parameter has some advantages over the simpler high angle boundary… on the mode of the distribution of dislocation cell sizes is outlined, and it is demonstrated how this parameter can be used to investigate the uniformity, or otherwise, of the restoration processes occurring during annealing of metals deformed to large plastic strains. © (2012) Trans Tech Publications

  3. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  4. A Bayesian Framework for Remaining Useful Life Estimation

    Data.gov (United States)

    National Aeronautics and Space Administration — The estimation of remaining useful life (RUL) of a faulty component is at the center of system prognostics and health management. It gives operators a potent tool in...

  5. The 2007 Analysis of Information Remaining on Disks Offered for Sale on the Second Hand Market

    Directory of Open Access Journals (Sweden)

    Andy Jones

    2008-03-01

Full Text Available All organisations, whether in the public or private sector, increasingly use computers and other devices that contain computer hard disks for the storage and processing of information relating to their business, their employees or their customers. Individual home users also increasingly use computers and other devices containing computer hard disks for the storage and processing of information relating to their private, personal affairs. It continues to be clear that the majority of organisations and individual home users still remain ignorant or misinformed of the volume and type of information that is stored on the hard disks that these devices contain and have not considered, or are unaware of, the potential impact of this information becoming available to their competitors or to people with criminal intent. This is the third study in an ongoing research effort that is being conducted into the volume and type of information that remains on computer hard disks offered for sale on the second hand market. The purpose of the research has been to gain an understanding of the information that remains on the disk and to determine the level of damage that could potentially be caused if the information fell into the wrong hands. The study examines disks that have been obtained in a number of countries to determine whether there is any detectable national or regional variance in the way that the disposal of computer disks is addressed, and to compare the results for any other detectable regional or temporal trends. The first study was carried out in 2005 and was repeated in 2006 with the scope extended to include additional countries. The studies were carried out by British Telecommunications, the University of Glamorgan in the UK and Edith Cowan University in Australia. The basis of the research was to acquire a number of second hand computer disks from various sources and then determine whether they still contained information relating to a

  6. Review of the Dinosaur Remains from the Middle Jurassic of Scotland, UK

    Directory of Open Access Journals (Sweden)

    Neil D. L. Clark

    2018-02-01

Full Text Available Dinosaurs are rare from the Middle Jurassic worldwide. The Isle of Skye is the only place in Scotland thus far to have produced dinosaur remains. These remains consist mainly of footprints, but also include several bones and teeth. These Bajocian and Bathonian remains represent an important collection of a basal eusauropod, early examples of non-neosauropod and possible basal titanosauriform eusauropods, and theropod remains that may belong to an early coelurosaur and a possible megalosaurid, basal tyrannosauroid, or dromaeosaurid. The footprints from here also suggest a rich and diverse dinosaur fauna, for which further, better diagnosable remains are likely to be found.

  7. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    Science.gov (United States)

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research: GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  8. Precision Optical Coatings for Large Space Telescope Mirrors

    Science.gov (United States)

    Sheikh, David

This proposal “Precision Optical Coatings for Large Space Telescope Mirrors” addresses the need to develop and advance the state-of-the-art in optical coating technology. NASA is considering large monolithic mirrors 1 to 8-meters in diameter for future telescopes such as HabEx and LUVOIR. Improved large area coating processes are needed to meet the future requirements of large astronomical mirrors. In this project, we will demonstrate a broadband reflective coating process for achieving high reflectivity from 90-nm to 2500-nm over a 2.3-meter diameter coating area. The coating process is scalable to larger mirrors, 6+ meters in diameter. We will use a battery-driven coating process to make an aluminum reflector, and a motion-controlled coating technology for depositing protective layers. We will advance the state-of-the-art for coating technology and manufacturing infrastructure, to meet the reflectance and wavefront requirements of both HabEx and LUVOIR. Specifically, we will combine the broadband reflective coating designs and processes developed at GSFC and JPL with large area manufacturing technologies developed at ZeCoat Corporation. Our primary objectives are to: (1) demonstrate an aluminum coating process that creates uniform coatings over large areas with near-theoretical aluminum reflectance; (2) demonstrate a motion-controlled coating process that applies very precise 2-nm to 5-nm thick protective/interference layers to large areas; and (3) demonstrate a broadband coating system (90-nm to 2500-nm) over a 2.3-meter coating area and test it against the current coating specifications for LUVOIR/HabEx. We will perform simulated space-environment testing, and we expect to advance the TRL from 3 to >5 in 3 years.

  9. Remaining life diagnosis method and device for nuclear reactor

    International Nuclear Information System (INIS)

    Yamamoto, Michiyoshi.

    1996-01-01

A neutron flux measuring means is inserted from the outside of a reactor pressure vessel during reactor operation to forecast neutron-induced degradation of the materials of incore structural components in the vicinity of the portions to be measured, based on the measured values, and the remaining life of the reactor is diagnosed from the forecast degraded state. In this case, the neutron fluxes to be measured are desirably fast and/or medium neutron fluxes. As the positions where the measuring means is to be inserted, for example, the vicinity of the structural components at the periphery of the fuel assembly is selected. Aging degradation characteristics of the structural components are determined by using the aging degradation data for the structural materials. The remaining life is analyzed based on the obtained aging degradation characteristics and stress evaluation data of the incore structural components at the portions to be measured. The neutron irradiation amount of structural components at predetermined positions can be recognized accurately, and appropriate countermeasures can be taken depending on the forecast remaining life, thereby improving the reliability of the reactor. (N.H.)

  10. Industry remains stuck in a transitional mode

    International Nuclear Information System (INIS)

    Garb, F.A.

    1991-01-01

The near future for industry remains foggy for several obvious reasons. The shake-up of the Soviet Union and how the pieces will reform remains unclear. How successful efforts are to privatize government oil company operations around the world has yet to be determined. A long sought peace in the Middle East seems to be inching closer, but will this continue? If it does continue, what impact will it have on world energy policy? Will American companies, which are now transferring their attention to foreign E and P, also maintain an interest in domestic activities? Is the U.S. economy really on the upswing? We are told that the worst of the recession is over, but try telling this to thousands of workers in the oil patch who are being released monthly by the big players in domestic operations. This paper reports that 1992 should be a better year than 1991, if measured in opportunity. There are more exploration and acquisition options available, both domestically and internationally, than there have been in years. Probably more opportunities exist than there are players, certainly more than can be funded with current financial resources.

  11. Data-driven remaining useful life prognosis techniques stochastic models, methods and applications

    CERN Document Server

    Si, Xiao-Sheng; Hu, Chang-Hua

    2017-01-01

    This book introduces data-driven remaining useful life prognosis techniques, and shows how to utilize the condition monitoring data to predict the remaining useful life of stochastic degrading systems and to schedule maintenance and logistics plans. It is also the first book that describes the basic data-driven remaining useful life prognosis theory systematically and in detail. The emphasis of the book is on the stochastic models, methods and applications employed in remaining useful life prognosis. It includes a wealth of degradation monitoring experiment data, practical prognosis methods for remaining useful life in various cases, and a series of applications incorporated into prognostic information in decision-making, such as maintenance-related decisions and ordering spare parts. It also highlights the latest advances in data-driven remaining useful life prognosis techniques, especially in the contexts of adaptive prognosis for linear stochastic degrading systems, nonlinear degradation modeling based pro...
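The linear stochastic degradation setting emphasized in the book can be illustrated with a minimal sketch. Assuming a Wiener-process degradation model X(t) = x_t + λt + σB(t) with a fixed failure threshold w (the drift, diffusion and threshold values below are hypothetical, not taken from the book), the remaining useful life is the first time the path reaches w, and for this linear case its mean is (w - x_t)/λ:

```python
import math
import random

def simulate_hitting_time(x0, drift, sigma, dt, threshold, rng):
    """Simulate one Euler-discretized Wiener degradation path and
    return the first time it reaches the failure threshold."""
    x, t = x0, 0.0
    while x < threshold:
        x += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        t += dt
    return t

rng = random.Random(42)
drift, sigma, threshold = 0.5, 0.2, 10.0   # hypothetical model parameters
x_t = 4.0                                  # current observed degradation level

# Monte Carlo estimate of the mean remaining useful life from state x_t
samples = [simulate_hitting_time(x_t, drift, sigma, 0.01, threshold, rng)
           for _ in range(1000)]
mc_mean = sum(samples) / len(samples)

# For a linear Wiener process the hitting time is inverse-Gaussian
# distributed with mean (threshold - x_t) / drift
analytic_mean = (threshold - x_t) / drift
print(round(mc_mean, 1), analytic_mean)   # the two means should roughly agree
```

In condition-based maintenance, such an RUL distribution (not just its mean) is what feeds the maintenance and spare-parts ordering decisions the book discusses.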

  12. Large-area formation of self-aligned crystalline domains of organic semiconductors on transistor channels using CONNECT

    Science.gov (United States)

    Park, Steve; Giri, Gaurav; Shaw, Leo; Pitner, Gregory; Ha, Jewook; Koo, Ja Hoon; Gu, Xiaodan; Park, Joonsuk; Lee, Tae Hoon; Nam, Ji Hyun; Hong, Yongtaek; Bao, Zhenan

    2015-01-01

    The electronic properties of solution-processable small-molecule organic semiconductors (OSCs) have rapidly improved in recent years, rendering them highly promising for various low-cost large-area electronic applications. However, practical applications of organic electronics require patterned and precisely registered OSC films within the transistor channel region with uniform electrical properties over a large area, a task that remains a significant challenge. Here, we present a technique termed “controlled OSC nucleation and extension for circuits” (CONNECT), which uses differential surface energy and solution shearing to simultaneously generate patterned and precisely registered OSC thin films within the channel region and with aligned crystalline domains, resulting in low device-to-device variability. We have fabricated transistor density as high as 840 dpi, with a yield of 99%. We have successfully built various logic gates and a 2-bit half-adder circuit, demonstrating the practical applicability of our technique for large-scale circuit fabrication. PMID:25902502

  13. Challenges and opportunities : One stop processing of automatic large-scale base map production using airborne lidar data within gis environment case study: Makassar City, Indonesia

    NARCIS (Netherlands)

    Widyaningrum, E.; Gorte, B.G.H.

    2017-01-01

LiDAR data acquisition is recognized as one of the fastest solutions to provide basis data for large-scale topographical base maps worldwide. Automatic LiDAR processing is believed to be one possible scheme to accelerate large-scale topographic base map provision by the Geospatial Information

  14. Initial crystallization and growth in melt processing of large-domain YBa2Cu3Ox for magnetic levitation

    International Nuclear Information System (INIS)

    Shi, D.

    1994-10-01

Crystallization temperature in YBa2Cu3Ox (123) during the peritectic reaction has been studied by differential thermal analysis (DTA) and optical microscopy. It has been found that YBa2Cu3Ox experiences partial melting near 1,010 °C during heating, while crystallization takes place at a much lower temperature range upon cooling, indicating a delayed nucleation process. A series of experiments has been conducted to search for the initial crystallization temperature in the Y2BaCuOx + liquid phase field. The authors have found that the slow-cool period (1 °C/h) for the 123 grain texturing can start as low as 960 °C. This novel processing has resulted in high-quality, large-domain, strongly pinned 123 magnetic levitators.

  15. Red Assembly: the work remains

    Directory of Open Access Journals (Sweden)

    Leslie Witz

    installed. What to do at this limit, at the transgressive encounter between saying yes and no to history, remains the challenge. It is the very challenge of what insistently remains.

  16. Crystallization process of a three-dimensional complex plasma

    Science.gov (United States)

    Steinmüller, Benjamin; Dietz, Christopher; Kretschmer, Michael; Thoma, Markus H.

    2018-05-01

Characteristic timescales and length scales for phase transitions of real materials are in ranges where direct visualization is infeasible. Therefore, model systems can be useful. Here, the crystallization process of a three-dimensional complex plasma under gravity conditions is considered, where the system extends to a large degree into the bulk plasma. Time-resolved measurements exhibit the process down to the single-particle level. Primary clusters, consisting of particles in the solid state, grow vertically and, secondarily, horizontally. The box-counting method shows a fractal dimension of df≈2.72 for the clusters. This value gives a hint that the formation process is a combination of local epitaxial and diffusion-limited growth. The particle density and the interparticle distance to the nearest neighbor remain constant within the clusters during crystallization. All results are in good agreement with former observations of a single-particle layer.
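The box-counting estimate mentioned above can be sketched in a few lines. This is a generic illustration, not the authors' analysis code; the grid sizes and the synthetic point cloud are arbitrary choices. Counting the occupied grid boxes N(eps) at several box edge lengths eps and fitting the slope of log N(eps) against log(1/eps) yields the fractal dimension:

```python
import math
import random

def box_counting_dimension(points, sizes):
    """Estimate the fractal (box-counting) dimension of a 3-D point set.

    For each box edge length eps, count the number N(eps) of grid boxes
    containing at least one point; the dimension is the least-squares
    slope of log N(eps) against log(1/eps)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    zs = [p[2] for p in points]
    x0, y0, z0 = min(xs), min(ys), min(zs)
    log_inv_eps, log_counts = [], []
    for eps in sizes:
        occupied = {(int((x - x0) / eps), int((y - y0) / eps), int((z - z0) / eps))
                    for x, y, z in points}
        log_inv_eps.append(math.log(1.0 / eps))
        log_counts.append(math.log(len(occupied)))
    # least-squares slope of log N(eps) versus log(1/eps)
    n = len(sizes)
    mx = sum(log_inv_eps) / n
    my = sum(log_counts) / n
    return (sum((a - mx) * (b - my) for a, b in zip(log_inv_eps, log_counts))
            / sum((a - mx) ** 2 for a in log_inv_eps))

# Sanity check: random points filling a unit cube should give a dimension near 3;
# a fractal cluster like those in the paper would give a value below 3 (df≈2.72).
random.seed(0)
cube = [(random.random(), random.random(), random.random()) for _ in range(20000)]
d = box_counting_dimension(cube, [0.5, 0.25, 0.125, 0.0625])
print(round(d, 2))
```

In practice the choice of box sizes matters: eps must stay well above the interparticle distance and well below the cluster extent for the log-log fit to be in the scaling regime.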

  17. Process for producing curved surface of membrane rings for large containers, particulary for prestressed concrete pressure vessels of nuclear reactors

    International Nuclear Information System (INIS)

    Kumpf, H.

    1977-01-01

Membrane rings for large pressure vessels, particularly for prestressed-concrete pressure vessels, often have curved surfaces. The invention describes a process for producing these on site, which is particularly advantageous as the forming and installation of the vessel component coincide. According to the invention, the originally flat membrane ring is set in a predetermined position, then pressed in sections by a forming tool (with a preformed support ring as the opposite tool) and shaped. After this, the shaped parts are welded to the ring-shaped wall parts of the large vessel. The manufacture of single and double membrane ring arrangements is described. (HP) [de

  18. The synthesis of alternatives for the bioconversion of waste-monoethanolamine from large-scale CO{sub 2}-removal processes

    Energy Technology Data Exchange (ETDEWEB)

    Ohtaguchi, Kazuhisa; Yokoyama, Takahisa [Tokyo Inst. of Tech. (Japan). Dept. of Chemical Engineering

    1998-12-31

Alternatives for the bioconversion of monoethanolamine (MEA), which would appear in large quantities in the industrial effluent of power companies' CO{sub 2}-removal processes, have been proposed by investigating the ability of some microorganisms to deaminate MEA. An evaluation of biotechnologies, including the production from MEA of acetic acid and acetaldehyde with Escherichia coli and of formic and acetic acids with Clostridium formicoaceticum, confirms and extends our earlier remarks on the availability of ecotechnology for solving the above problem. (Author)

  19. Human Remains from the Pleistocene-Holocene Transition of Southwest China Suggest a Complex Evolutionary History for East Asians

    Science.gov (United States)

    Curnoe, Darren; Xueping, Ji; Herries, Andy I. R.; Kanning, Bai; Taçon, Paul S. C.; Zhende, Bao; Fink, David; Yunsheng, Zhu; Hellstrom, John; Yun, Luo; Cassis, Gerasimos; Bing, Su; Wroe, Stephen; Shi, Hong; Parr, William C. H.; Shengmin, Huang; Rogers, Natalie

    2012-01-01

    Background Later Pleistocene human evolution in East Asia remains poorly understood owing to a scarcity of well described, reliably classified and accurately dated fossils. Southwest China has been identified from genetic research as a hotspot of human diversity, containing ancient mtDNA and Y-DNA lineages, and has yielded a number of human remains thought to derive from Pleistocene deposits. We have prepared, reconstructed, described and dated a new partial skull from a consolidated sediment block collected in 1979 from the site of Longlin Cave (Guangxi Province). We also undertook new excavations at Maludong (Yunnan Province) to clarify the stratigraphy and dating of a large sample of mostly undescribed human remains from the site. Methodology/Principal Findings We undertook a detailed comparison of cranial, including a virtual endocast for the Maludong calotte, mandibular and dental remains from these two localities. Both samples probably derive from the same population, exhibiting an unusual mixture of modern human traits, characters probably plesiomorphic for later Homo, and some unusual features. We dated charcoal with AMS radiocarbon dating and speleothem with the Uranium-series technique and the results show both samples to be from the Pleistocene-Holocene transition: ∼14.3-11.5 ka. Conclusions/Significance Our analysis suggests two plausible explanations for the morphology sampled at Longlin Cave and Maludong. First, it may represent a late-surviving archaic population, perhaps paralleling the situation seen in North Africa as indicated by remains from Dar-es-Soltane and Temara, and maybe also in southern China at Zhirendong. Alternatively, East Asia may have been colonised during multiple waves during the Pleistocene, with the Longlin-Maludong morphology possibly reflecting deep population substructure in Africa prior to modern humans dispersing into Eurasia. PMID:22431968

  20. An analysis of the alleged skeletal remains of Carin Göring.

    Directory of Open Access Journals (Sweden)

    Anna Kjellström

Full Text Available In 1991, treasure hunters found skeletal remains in an area close to the destroyed country residence of former Nazi leader Hermann Göring in northeastern Berlin. The remains, which were believed to belong to Carin Göring, who was buried at the site, were examined to determine whether it was possible to make a positive identification. The anthropological analysis showed that the remains come from an adult woman. The DNA analysis of several bone elements showed female sex, and a reference sample from Carin's son revealed mtDNA sequences identical to the remains. The profile has one nucleotide difference from the Cambridge reference sequence (rCRS), the common variant 263G. A database search resulted in a frequency of this mtDNA sequence of about 10% out of more than 7,000 European haplotypes. The mtDNA sequence found in the ulna, the cranium and the reference sample is, thus, very common among Europeans. Therefore, nuclear DNA analysis was attempted. The remains as well as a sample from Carin's son were successfully analysed for the three nuclear markers TH01, D7S820 and D8S1179. The nuclear DNA analysis of the two samples revealed one shared allele for each of the three markers, supporting a mother and son relationship. This genetic information together with anthropological and historical files provides an additional piece of circumstantial evidence in our efforts to identify the remains of Carin Göring.

  1. Does hypertension remain after kidney transplantation?

    Directory of Open Access Journals (Sweden)

    Gholamreza Pourmand

    2015-05-01

Full Text Available Hypertension is a common complication of kidney transplantation, with a prevalence of 80%. Studies in adults have shown a high prevalence of hypertension (HTN) in the first three months after transplantation, while this rate is reduced to 50-60% at the end of the first year. HTN remains a major risk factor for cardiovascular diseases, lower graft survival rates and poor function of the transplanted kidney in adults and children. In this retrospective study, medical records of 400 kidney transplantation patients of Sina Hospital were evaluated. Patients were followed monthly for the 1st year, every two months in the 2nd year and every three months after that. In this study 244 (61%) patients were male. Mean ± SD age of recipients was 39.3 ± 13.8 years. In most patients (40.8%) the cause of end-stage renal disease (ESRD) was unknown, followed by HTN (26.3%). A total of 166 (41.5%) patients had been hypertensive before transplantation and 234 (58.5%) had normal blood pressure. Among these 234 individuals, 94 (40.2%) developed post-transplantation HTN. On the other hand, among the 166 pre-transplant hypertensive patients, 86 (56.8%) remained hypertensive after transplantation. In total, 180 (45%) patients had post-transplantation HTN and 220 (55%) did not develop HTN. Based on the findings, the incidence of post-transplantation hypertension is high, and kidney transplantation does not lead to remission of hypertension. On the other hand, hypertension is one of the main causes of ESRD. Thus, early screening of hypertension can prevent kidney damage and reduce further problems in renal transplant recipients.

  2. Global variations of large megathrust earthquake rupture characteristics

    Science.gov (United States)

    Kanamori, Hiroo

    2018-01-01

    Despite the surge of great earthquakes along subduction zones over the last decade and advances in observations and analysis techniques, it remains unclear whether earthquake complexity is primarily controlled by persistent fault properties or by dynamics of the failure process. We introduce the radiated energy enhancement factor (REEF), given by the ratio of an event’s directly measured radiated energy to the calculated minimum radiated energy for a source with the same seismic moment and duration, to quantify the rupture complexity. The REEF measurements for 119 large [moment magnitude (Mw) 7.0 to 9.2] megathrust earthquakes distributed globally show marked systematic regional patterns, suggesting that the rupture complexity is strongly influenced by persistent geological factors. We characterize this as the existence of smooth and rough rupture patches with varying interpatch separation, along with failure dynamics producing triggering interactions that augment the regional influences on large events. We present an improved asperity scenario incorporating both effects and categorize global subduction zones and great earthquakes based on their REEF values and slip patterns. Giant earthquakes rupturing over several hundred kilometers can occur in regions with low-REEF patches and small interpatch spacing, such as for the 1960 Chile, 1964 Alaska, and 2011 Tohoku earthquakes, or in regions with high-REEF patches and large interpatch spacing as in the case for the 2004 Sumatra and 1906 Ecuador-Colombia earthquakes. Thus, combining seismic magnitude Mw and REEF, we provide a quantitative framework to better represent the span of rupture characteristics of great earthquakes and to understand global seismicity. PMID:29750186

  3. Vertebral Adaptations to Large Body Size in Theropod Dinosaurs.

    Directory of Open Access Journals (Sweden)

    John P Wilson

    Full Text Available Rugose projections on the anterior and posterior aspects of vertebral neural spines appear throughout Amniota and result from the mineralization of the supraspinous and interspinous ligaments via metaplasia, the process of permanent tissue-type transformation. In mammals, this metaplasia is generally pathological or stress induced, but is a normal part of development in some clades of birds. Such structures, though phylogenetically sporadic, appear throughout the fossil record of non-avian theropod dinosaurs, yet their physiological and adaptive significance has remained unexamined. Here we show novel histologic and phylogenetic evidence that neural spine projections were a physiological response to biomechanical stress in large-bodied theropod species. Metaplastic projections also appear to vary between immature and mature individuals of the same species, with immature animals either lacking them or exhibiting smaller projections, supporting the hypothesis that these structures develop through ontogeny as a result of increasing bending stress subjected to the spinal column. Metaplastic mineralization of spinal ligaments would likely affect the flexibility of the spinal column, increasing passive support for body weight. A stiff spinal column would also provide biomechanical support for the primary hip flexors and, therefore, may have played a role in locomotor efficiency and mobility in large-bodied species. This new association of interspinal ligament metaplasia in Theropoda with large body size contributes additional insight to our understanding of the diverse biomechanical coping mechanisms developed throughout Dinosauria, and stresses the significance of phylogenetic methods when testing for biological trends, evolutionary or not.

  4. Vertebral Adaptations to Large Body Size in Theropod Dinosaurs.

    Science.gov (United States)

    Wilson, John P; Woodruff, D Cary; Gardner, Jacob D; Flora, Holley M; Horner, John R; Organ, Chris L

    2016-01-01

    Rugose projections on the anterior and posterior aspects of vertebral neural spines appear throughout Amniota and result from the mineralization of the supraspinous and interspinous ligaments via metaplasia, the process of permanent tissue-type transformation. In mammals, this metaplasia is generally pathological or stress induced, but is a normal part of development in some clades of birds. Such structures, though phylogenetically sporadic, appear throughout the fossil record of non-avian theropod dinosaurs, yet their physiological and adaptive significance has remained unexamined. Here we show novel histologic and phylogenetic evidence that neural spine projections were a physiological response to biomechanical stress in large-bodied theropod species. Metaplastic projections also appear to vary between immature and mature individuals of the same species, with immature animals either lacking them or exhibiting smaller projections, supporting the hypothesis that these structures develop through ontogeny as a result of increasing bending stress subjected to the spinal column. Metaplastic mineralization of spinal ligaments would likely affect the flexibility of the spinal column, increasing passive support for body weight. A stiff spinal column would also provide biomechanical support for the primary hip flexors and, therefore, may have played a role in locomotor efficiency and mobility in large-bodied species. This new association of interspinal ligament metaplasia in Theropoda with large body size contributes additional insight to our understanding of the diverse biomechanical coping mechanisms developed throughout Dinosauria, and stresses the significance of phylogenetic methods when testing for biological trends, evolutionary or not.

  5. Imaging of the optic disk in caring for patients with glaucoma: ophthalmoscopy and photography remain the gold standard.

    Science.gov (United States)

    Spaeth, George L; Reddy, Swathi C

    2014-01-01

    Optic disk imaging is integral to the diagnosis and treatment of patients with glaucoma. We discuss the various forms of imaging the optic nerve, including ophthalmoscopy, photography, and newer imaging modalities, including optical coherence tomography (OCT), confocal scanning laser ophthalmoscopy (HRT), and scanning laser polarimetry (GDx), specifically highlighting their benefits and disadvantages. We argue that ophthalmoscopy and photography remain the gold standard of imaging due to portability, ease of interpretation, and the presence of a large database of images for comparison. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Lung Abscess Remains a Life-Threatening Condition in Pediatrics – A Case Report

    Directory of Open Access Journals (Sweden)

    Chirteș Ioana Raluca

    2017-07-01

Full Text Available Pulmonary abscess or lung abscess is a lung infection which destroys the lung parenchyma, leading to cavitation and central necrosis in localised areas filled with thick-walled purulent material. It can be primary or secondary. Lung abscesses can occur at any age, but paediatric pulmonary abscess morbidity seems to be lower than in adults. We present the case of a one-year-and-five-month-old male child admitted to our clinic for fever, loss of appetite and an overall altered general status. Laboratory tests revealed elevated inflammatory biomarkers, leukocytosis with neutrophilia, anaemia, thrombocytosis, low serum iron concentration and an increased lactate dehydrogenase level. Despite wide-spectrum antibiotic therapy, the patient’s progress remained poor after seven days of treatment, and a CT scan established the diagnosis of a large lung abscess. Despite a change of antibiotic therapy, surgical intervention was eventually needed. There was a slow but steady improvement and, eventually, the patient was discharged after approximately five weeks.

  7. A Study on the regulation improvement through the analysis of domestic and international categorization and licensing process for large particle accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Gwon, Da-Yeong; Jeon, Yeo-Ryeong; Kim, Yong-Min [Catholic University of Daegu, Gyeongsan (Korea, Republic of); Jung, Nam-Suk; Lee, Hee-Seock [POSTECH, Pohang (Korea, Republic of)

    2016-10-15

Many foreign countries use separate criteria and regulation procedures according to the categorization of accelerators. In Korea, nuclear and radiation related facilities are divided into 4 groups: 1) nuclear reactors and related facilities, 2) nuclear fuel cycle and nuclear material facilities, 3) disposal and transport, 4) radioisotope and radiation generating device related facilities. All accelerator facilities are categorized as group 4 regardless of their size and type. For facilities that belong to groups 1 and 2, a Radiation Environmental Impact Assessment Report (REIR) and a Preliminary Decommissioning Plan Report (PDPR) should be submitted at the construction licensing stage, but there are no rules about these documents for large particle accelerator facilities. For facilities that belong to group 4) RI and RG, only two documents, the Radiation Safety Report (RSR) and the Safety Control Regulation (SCR), are submitted at the licensing stage. Because there are no detailed guidelines according to facility type, the properties of each facility are not considered in the preparation and licensing process. If we set up a categorization of accelerator facilities, we can expect effective and safe construction and operation of large accelerator facilities through the licensing and operation process. Similarly to other countries' criteria, 50 MeV of particle energy could be used as the energy criterion for large particle accelerators. According to the categorization, it is necessary to adopt graded licensing stages and separate safety documents. In the case of large particle accelerators, it is appropriate to divide the licensing stages into construction and operation. We currently submit a PDPR in the case of reactors and related facilities, the nuclear fuel cycle, and nuclear material facilities. Depending on the energy of particle accelerators, it is necessary to prepare for decontamination and decommissioning to decrease the current and future burden from radioactive waste. From the arrangement of separated guidelines on

  8. A Study on the regulation improvement through the analysis of domestic and international categorization and licensing process for large particle accelerator

    International Nuclear Information System (INIS)

    Gwon, Da-Yeong; Jeon, Yeo-Ryeong; Kim, Yong-Min; Jung, Nam-Suk; Lee, Hee-Seock

    2016-01-01

    Many foreign countries apply separate criteria and regulatory procedures according to the categorization of accelerators. In Korea, nuclear and radiation related facilities are divided into four groups: 1) nuclear reactors and related facilities, 2) nuclear fuel cycle and nuclear material facilities, 3) disposal and transport, and 4) facilities related to radioisotopes (RI) and radiation generating (RG) devices. All accelerator facilities are categorized as group 4 regardless of their size and type. For facilities belonging to groups 1 and 2, a Radiation Environmental Impact Assessment Report (REIR) and a Preliminary Decommissioning Plan Report (PDPR) must be submitted at the construction licensing stage, but there are no rules requiring these documents for large particle accelerator facilities. For facilities belonging to group 4 (RI and RG), only two documents, a Radiation Safety Report (RSR) and a Safety Control Regulation (SCR), are submitted at the licensing stage. Because there are no detailed guidelines by facility type, the properties of each facility are not considered in the preparation and licensing process. If a categorization of accelerator facilities is established, effective and safe construction and operation of large accelerator facilities can be expected throughout the licensing and operation process. In line with other countries' criteria, a particle energy of 50 MeV could serve as the boundary defining a large particle accelerator. According to this categorization, it is necessary to adopt graded licensing stages and separate safety documents. For large particle accelerators, it is appropriate to divide licensing into construction and operation stages. A PDPR is currently submitted for reactors and related facilities, the nuclear fuel cycle, and nuclear material facilities. Depending on the energy of the particle accelerator, it is necessary to prepare for decontamination and decommissioning to reduce the current and future burden of radioactive waste. From the arrangement of separated guidelines on

  9. An On-Board Remaining Useful Life Estimation Algorithm for Lithium-Ion Batteries of Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Xiaoyu Li

    2017-05-01

    Full Text Available Battery remaining useful life (RUL) estimation is critical to battery management and performance optimization of electric vehicles (EVs). In this paper, we present an effective way to estimate RUL online by using the support vector machine (SVM) algorithm. By studying the characteristics of the battery degradation process, the rise of the terminal voltage and the changing characteristics of the voltage derivative (DV) during the charging process are introduced as the training variables of the SVM algorithm to determine the battery RUL. The SVM is then applied to build the battery degradation model and predict the battery's real cycle number. Experimental results prove that the built battery degradation model shows higher accuracy and less computation time compared with those of the neural network (NN) method, thereby making it a potential candidate for realizing online RUL estimation in a battery management system (BMS).
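    The charging-curve feature idea above can be sketched in a few lines. This is a hypothetical illustration on synthetic data: a plain least-squares fit stands in for the paper's SVM, and the `dv_feature` helper, the end-of-life cycle count, and the synthetic degradation model are all invented for the example.

```python
def dv_feature(voltages, dt=1.0):
    """Mean derivative of terminal voltage over a charging segment (the DV feature)."""
    n = len(voltages) - 1
    return sum((voltages[i + 1] - voltages[i]) / dt for i in range(n)) / n

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b; a stand-in for the paper's SVM."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

EOL_CYCLES = 500  # hypothetical end-of-life cycle count

# Synthetic ageing data: the charging voltage rises more steeply as cycles accumulate.
train_curves = {c: [3.0 + (0.2 + 0.001 * c) * t for t in range(10)] for c in range(0, 500, 50)}
a, b = fit_linear([dv_feature(v) for v in train_curves.values()], list(train_curves))

def estimate_rul(voltages):
    """Predicted remaining cycles = end-of-life count minus estimated cycles used."""
    return EOL_CYCLES - (a * dv_feature(voltages) + b)
```

    With the synthetic curves the feature-to-cycle mapping is exactly linear, so the fit recovers it; in practice an SVM regressor would be trained on measured charging curves to capture the noisy, nonlinear relationship.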

  10. And the Dead Remain Behind

    Directory of Open Access Journals (Sweden)

    Peter Read

    2013-08-01

    Full Text Available In most cultures the dead and their living relatives are held in a dialogic relationship. The dead have made it clear, while living, what they expect from their descendants. The living, for their part, wish to honour the tombs of their ancestors; at the least, to keep the graves of the recent dead from disrepair. Despite the strictures, the living can fail their responsibilities, for example, by migration to foreign countries. The peripatetic Chinese are one of the few cultures able to overcome the dilemma of the wanderer or the exile. With the help of a priest, an Australian Chinese migrant may summon the soul of an ancestor from an Asian grave to a Melbourne temple, where the spirit, though removed from its earthly vessel, will rest and remain at peace. Amongst cultures in which such practices are not culturally appropriate, to fail to honour the family dead can be exquisitely painful. Violence is the cause of most failure.

  11. Palmar, Patellar, and Pedal Human Remains from Pavlov

    Czech Academy of Sciences Publication Activity Database

    Trinkaus, E.; Wojtal, P.; Wilczyński, J.; Sázelová, Sandra; Svoboda, Jiří

    2017-01-01

    Roč. 2017, June (2017), s. 73-101 ISSN 1545-0031 Institutional support: RVO:68081758 Keywords : Gravettian * human remains * isolated bones * anatomically modern humans * Upper Paleolithic Subject RIV: AC - Archeology, Anthropology, Ethnology OBOR OECD: Archaeology http://paleoanthro.org/media/journal/content/PA20170073.pdf

  12. Robotics to enable older adults to remain living at home.

    Science.gov (United States)

    Pearce, Alan J; Adair, Brooke; Miller, Kimberly; Ozanne, Elizabeth; Said, Catherine; Santamaria, Nick; Morris, Meg E

    2012-01-01

    Given the rapidly ageing population, interest is growing in robots to enable older people to remain living at home. We conducted a systematic review and critical evaluation of the scientific literature, from 1990 to the present, on the use of robots in aged care. The key research questions were as follows: (1) what is the range of robotic devices available to enable older people to remain mobile, independent, and safe? and, (2) what is the evidence demonstrating that robotic devices are effective in enabling independent living in community dwelling older people? Following database searches for relevant literature an initial yield of 161 articles was obtained. Titles and abstracts of articles were then reviewed by 2 independent people to determine suitability for inclusion. Forty-two articles met the criteria for question 1. Of these, 4 articles met the criteria for question 2. Results showed that robotics is currently available to assist older healthy people and people with disabilities to remain independent and to monitor their safety and social connectedness. Most studies were conducted in laboratories and hospital clinics. Currently limited evidence demonstrates that robots can be used to enable people to remain living at home, although this is an emerging smart technology that is rapidly evolving.

  13. Robotics to Enable Older Adults to Remain Living at Home

    Directory of Open Access Journals (Sweden)

    Alan J. Pearce

    2012-01-01

    Full Text Available Given the rapidly ageing population, interest is growing in robots to enable older people to remain living at home. We conducted a systematic review and critical evaluation of the scientific literature, from 1990 to the present, on the use of robots in aged care. The key research questions were as follows: (1) what is the range of robotic devices available to enable older people to remain mobile, independent, and safe? and, (2) what is the evidence demonstrating that robotic devices are effective in enabling independent living in community dwelling older people? Following database searches for relevant literature an initial yield of 161 articles was obtained. Titles and abstracts of articles were then reviewed by 2 independent people to determine suitability for inclusion. Forty-two articles met the criteria for question 1. Of these, 4 articles met the criteria for question 2. Results showed that robotics is currently available to assist older healthy people and people with disabilities to remain independent and to monitor their safety and social connectedness. Most studies were conducted in laboratories and hospital clinics. Currently limited evidence demonstrates that robots can be used to enable people to remain living at home, although this is an emerging smart technology that is rapidly evolving.

  14. Dinosaur remains from the type Maastrichtian: An update

    NARCIS (Netherlands)

    Weishampel, David B.; Mulder, Eric W A; Dortangs, Rudi W.; Jagt, John W M; Jianu, Coralia Maria; Kuypers, Marcel M M; Peeters, Hans H G; Schulp, Anne S.

    1999-01-01

    Isolated cranial and post-cranial remains of hadrosaurid dinosaurs have been collected from various outcrops in the type area of the Maastrichtian stage during the last few years. In the present contribution, dentary and maxillary teeth are recorded from the area for the first time. Post-cranial

  15. Seropositive abdominal and thoracic donor organs are largely underutilized.

    Science.gov (United States)

    Taylor, R M; Pietroski, R E; Hagan, M; Eisenbrey, A B; Fontana, R J

    2010-12-01

    The aim of this study was to describe the epidemiology and utilization of anti-hepatitis B core protein(+) and anti-hepatitis C virus(+) organ donor referrals in a large organ procurement organization. Between 1995 and 2006, 3,134 deceased organ donor referrals were tested for anti-HBc and anti-HCV using commercial assays. The prevalence of anti-HCV(+) organ donor referrals significantly increased from 3.4% in 1994-1996 to 8.1% in 2003-2005 (P organ donor referrals remained unchanged at 3%-4% (P = .20). The 112 anti-HBc(+) (3.5%) and 173 anti-HCV(+) (5.5%) organ donor referrals were significantly older and more likely to be noncaucasian than seronegative organ donor referrals (P donor organs were significantly lower compared with seronegative organ donors (P donors over time (21% vs 46%; P = .026), whereas utilization of anti-HCV(+) liver donors remained unchanged over time (5% vs 18%; P = .303). In summary, the proportion of anti-HCV(+) organ donor referrals has significantly increased and the proportion of anti-HBc(+) organ donor referrals has remained stable. Both thoracic and abdominal organs from seropositive donors are largely underutilized. Copyright © 2010 Elsevier Inc. All rights reserved.

  16. UCLALES-SALSA v1.0: a large-eddy model with interactive sectional microphysics for aerosol, clouds and precipitation

    Science.gov (United States)

    Tonttila, Juha; Maalick, Zubair; Raatikainen, Tomi; Kokkola, Harri; Kühn, Thomas; Romakkaniemi, Sami

    2017-01-01

    Challenges in understanding the aerosol-cloud interactions and their impacts on global climate highlight the need for improved knowledge of the underlying physical processes and feedbacks as well as their interactions with cloud and boundary layer dynamics. To pursue this goal, increasingly sophisticated cloud-scale models are needed to complement the limited supply of observations of the interactions between aerosols and clouds. For this purpose, a new large-eddy simulation (LES) model, coupled with an interactive sectional description for aerosols and clouds, is introduced. The new model builds and extends upon the well-characterized UCLA Large-Eddy Simulation Code (UCLALES) and the Sectional Aerosol module for Large-Scale Applications (SALSA), hereafter denoted as UCLALES-SALSA. Novel strategies for the aerosol, cloud and precipitation bin discretisation are presented. These enable tracking the effects of cloud processing and wet scavenging on the aerosol size distribution as accurately as possible, while keeping the computational cost of the model as low as possible. The model is tested with two different simulation set-ups: a marine stratocumulus case in the DYCOMS-II campaign and another case focusing on the formation and evolution of a nocturnal radiation fog. It is shown that, in both cases, the size-resolved interactions between aerosols and clouds have a critical influence on the dynamics of the boundary layer. The results demonstrate the importance of accurately representing the wet scavenging of aerosol in the model. Specifically, in a case with marine stratocumulus, precipitation and the subsequent removal of cloud activating particles lead to thinning of the cloud deck and the formation of a decoupled boundary layer structure. In radiation fog, the growth and sedimentation of droplets strongly affect their radiative properties, which in turn drive new droplet formation. The size-resolved diagnostics provided by the model enable investigations of these

  17. 76 FR 14057 - Notice of Inventory Completion: University of Wyoming, Anthropology Department, Human Remains...

    Science.gov (United States)

    2011-03-15

    ...: University of Wyoming, Anthropology Department, Human Remains Repository, Laramie, WY AGENCY: National Park... Anthropology Department, Human Remains Repository, Laramie, WY. The human remains and associated funerary... the human remains was made by University of Wyoming, Anthropology Department, Human Remains Repository...

  18. Large-area, lightweight and thick biomimetic composites with superior material properties via fast, economic, and green pathways.

    Science.gov (United States)

    Walther, Andreas; Bjurhager, Ingela; Malho, Jani-Markus; Pere, Jaakko; Ruokolainen, Janne; Berglund, Lars A; Ikkala, Olli

    2010-08-11

    Although remarkable success has been achieved to mimic the mechanically excellent structure of nacre in laboratory-scale models, it remains difficult to foresee mainstream applications due to time-consuming sequential depositions or energy-intensive processes. Here, we introduce a surprisingly simple and rapid methodology for large-area, lightweight, and thick nacre-mimetic films and laminates with superior material properties. Nanoclay sheets with soft polymer coatings are used as ideal building blocks with intrinsic hard/soft character. They are forced to rapidly self-assemble into aligned nacre-mimetic films via paper-making, doctor-blading or simple painting, giving rise to strong and thick films with tensile modulus of 45 GPa and strength of 250 MPa, that is, partly exceeding nacre. The concepts are environmentally friendly, energy-efficient, and economic and are ready for scale-up via continuous roll-to-roll processes. Excellent gas barrier properties, optical translucency, and extraordinary shape-persistent fire-resistance are demonstrated. We foresee advanced large-scale biomimetic materials, relevant for lightweight sustainable construction and energy-efficient transportation.

  19. Process component inventory in a large commercial reprocessing facility

    International Nuclear Information System (INIS)

    Canty, M.J.; Berliner, A.; Spannagel, G.

    1983-01-01

    Using a computer simulation program, the equilibrium operation of the Pu-extraction and purification processes of a reference commercial reprocessing facility was investigated. Particular attention was given to the long-term net fluctuations of Pu inventories in hard-to-measure components such as the solvent extraction contactors. Comparing the variance of these inventories with the measurement variance for Pu contained in feed, analysis and buffer tanks, it was concluded that direct or indirect periodic estimation of contactor inventories would not contribute significantly to improving the quality of closed material balances over the process MBA.
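    The variance argument above can be illustrated with simple error propagation: when independent error terms add in quadrature, a contactor hold-up fluctuation that is small relative to the tank and flow measurement errors barely changes the overall balance uncertainty. All numbers below are hypothetical, chosen only to show the shape of the argument.

```python
import math

# Hypothetical 1-sigma uncertainties (kg Pu) for one balance period
SIGMA_FEED, SIGMA_PRODUCT, SIGMA_TANKS = 0.50, 0.50, 0.50
SIGMA_CONTACTOR = 0.05  # net long-term fluctuation of the contactor hold-up

def balance_sigma(contactor_unmeasured):
    """Standard deviation of the closed material balance (independent error terms)."""
    terms = [SIGMA_FEED, SIGMA_PRODUCT, SIGMA_TANKS]
    if contactor_unmeasured:
        terms.append(SIGMA_CONTACTOR)  # unmeasured drift enters the balance as an extra error
    return math.sqrt(sum(s * s for s in terms))

# Fractional improvement from a (hypothetically perfect) contactor inventory estimate
improvement = 1.0 - balance_sigma(False) / balance_sigma(True)
```

    With these figures a perfect contactor estimate improves the closed-balance standard deviation by less than 0.2%, mirroring the paper's conclusion that periodic contactor estimation adds little.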

  20. Using value stream mapping technique through the lean production transformation process: An implementation in a large-scaled tractor company

    Directory of Open Access Journals (Sweden)

    Mehmet Rıza Adalı

    2017-04-01

    Full Text Available In today's world, manufacturing industries must sustain their development and continuity in an increasingly competitive environment by decreasing their costs. The first step in the lean production transformation process is to analyze value-adding and non-value-adding activities. This study aims at applying the concepts of Value Stream Mapping (VSM) in a large-scale tractor company in Sakarya. Waste and process time are identified by mapping the current state of the platform production line. A future state is suggested, with improvements for the elimination of waste and the reduction of lead time, which fell from 13.08 to 4.35 days. Analyses of the current and future states support the suggested improvements, and the cycle time of the platform production line improved by 8%. Results showed that VSM is a good alternative when deciding on production process change.
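    The VSM bookkeeping behind these figures is a sum of value-adding process times and non-value-adding waits. The sketch below uses hypothetical (process, wait) pairs scaled so the totals match the abstract's 13.08- and 4.35-day lead times; the study's real maps have their own step breakdown.

```python
def lead_time(steps):
    """Lead time: value-adding process time plus non-value-adding waiting (days)."""
    return sum(p + w for p, w in steps)

def efficiency(steps):
    """Process-cycle efficiency: share of lead time that actually adds value."""
    return sum(p for p, _ in steps) / lead_time(steps)

# Hypothetical (process, wait) pairs in days, scaled to the abstract's totals
current = [(0.10, 3.50), (0.15, 4.00), (0.08, 3.00), (0.05, 2.20)]   # 13.08 days
future  = [(0.10, 1.20), (0.14, 1.40), (0.07, 1.00), (0.04, 0.40)]   #  4.35 days

reduction = 1.0 - lead_time(future) / lead_time(current)
```

    The lead-time reduction works out to about 66.7%, consistent with the drop from 13.08 to 4.35 days; the efficiency gain comes almost entirely from cutting waiting time, which is the typical VSM outcome.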

  1. A review of sex estimation techniques during examination of skeletal remains in forensic anthropology casework.

    Science.gov (United States)

    Krishan, Kewal; Chatterjee, Preetika M; Kanchan, Tanuj; Kaur, Sandeep; Baryah, Neha; Singh, R K

    2016-04-01

    Sex estimation is considered one of the essential parameters in forensic anthropology casework, and requires foremost consideration in the examination of skeletal remains. Forensic anthropologists frequently employ morphologic and metric methods for sex estimation of human remains. These methods remain imperative in the identification process despite the advent and accomplishment of molecular techniques. A steady increase in the use of imaging techniques in forensic anthropology research has helped to derive as well as revise the available population data. These methods, however, are less reliable owing to high variance and indistinct landmark details. The present review discusses the reliability and reproducibility of various analytical approaches: morphological, metric, molecular and radiographic methods for sex estimation of skeletal remains. Numerous studies have shown higher reliability and reproducibility for measurements taken directly on the bones, and hence such direct methods of sex estimation are considered more reliable than the others. The geometric morphometric (GM) method and the Diagnose Sexuelle Probabiliste (DSP) method are emerging as valid and widely used techniques in forensic anthropology in terms of accuracy and reliability. Besides, the newer 3D methods have been shown to exhibit specific sexual dimorphism patterns not readily revealed by traditional methods. Development of newer and better methodologies for sex estimation, as well as re-evaluation of the existing ones, will continue in the endeavour of forensic researchers for more accurate results. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  2. Multitrophic microbial interactions for eco- and agro-biotechnological processes: theory and practice.

    Science.gov (United States)

    Saleem, Muhammad; Moe, Luke A

    2014-10-01

    Multitrophic level microbial loop interactions mediated by protist predators, bacteria, and viruses drive eco- and agro-biotechnological processes such as bioremediation, wastewater treatment, plant growth promotion, and ecosystem functioning. To what extent these microbial interactions are context-dependent in performing biotechnological and ecosystem processes remains largely unstudied. Theory-driven research may advance the understanding of eco-evolutionary processes underlying the patterns and functioning of microbial interactions for successful development of microbe-based biotechnologies for real world applications. This could also be a great avenue to test the validity or limitations of ecology theory for managing diverse microbial resources in an era of altering microbial niches, multitrophic interactions, and microbial diversity loss caused by climate and land use changes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Control system for technological processes in tritium processing plants with process analysis

    International Nuclear Information System (INIS)

    Retevoi, Carmen Maria; Stefan, Iuliana; Balteanu, Ovidiu; Stefan, Liviu; Bucur, Ciprian

    2005-01-01

    Integration of a large variety of installations and equipment into a unitary system for controlling the technological process in tritium processing nuclear facilities is a rather complex undertaking, particularly when experimental or new technologies are developed. Ensuring a high degree of versatility that allows easy modification of configurations and process parameters is a major requirement imposed on experimental installations. The large amount of data which must be processed, stored and easily accessed for subsequent analyses requires the development of a large information network based on a highly integrated system containing the acquisition, control and technological process analysis data as well as a database system. On such a basis, integrated computation and control systems able to conduct the technological process can be developed, as well as protection systems for cases of failure or breakdown. The integrated system responds to the control and security requirements, in case of emergency, of the technological processes specific to industries that process radioactive or toxic substances with severe consequences in case of technological failure, as in a tritium processing nuclear plant. To lower the risk of technological failure of these processes, an integrated software, database and process analysis system is developed which, based on an identification algorithm for the parameters important to the protection and security systems, displays the process evolution trend. The system was checked on an existing plant that includes a tritium removal unit, ultimately to be used in a nuclear power plant, by simulating failure events as well as the process itself. The system will also include a complete database monitoring all the parameters and process analysis software for the main modules of the tritium processing plant, namely isotope separation, catalytic purification and cryogenic distillation.
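    The trend-display idea above, following a safety-relevant parameter's evolution toward an alarm limit, can be sketched minimally. The `ParameterMonitor` class, its window size and its limits are hypothetical illustrations; a real tritium-plant system would sit on the acquisition network behind qualified safety logic.

```python
from collections import deque

class ParameterMonitor:
    """Hypothetical sketch: rolling window per monitored parameter, with a
    simple trend estimate and a hard alarm threshold."""

    def __init__(self, limit, window=5):
        self.limit = limit
        self.values = deque(maxlen=window)

    def add(self, value):
        """Record a new sample (oldest sample drops out of the window)."""
        self.values.append(value)

    def trend(self):
        """Average change per sample over the window (positive = rising)."""
        v = list(self.values)
        if len(v) < 2:
            return 0.0
        return (v[-1] - v[0]) / (len(v) - 1)

    def alarm(self):
        """True once the latest sample reaches the alarm limit."""
        return bool(self.values) and self.values[-1] >= self.limit
```

    A supervisory loop would keep one monitor per identified parameter and display the trend values as the "process evolution trend" the abstract describes.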

  4. Large, but not small, antigens require time- and temperature-dependent processing in accessory cells before they can be recognized by T cells

    DEFF Research Database (Denmark)

    Buus, S; Werdelin, O

    1986-01-01

    We have studied if antigens of different size and structure all require processing in antigen-presenting cells of guinea-pigs before they can be recognized by T cells. The method of mild paraformaldehyde fixation was used to stop antigen-processing in the antigen-presenting cells. As a measure...... of antigen presentation we used the proliferative response of appropriately primed T cells during a co-culture with the paraformaldehyde-fixed and antigen-exposed presenting cells. We demonstrate that the large synthetic polypeptide antigen, dinitrophenyl-poly-L-lysine, requires processing. After an initial......-dependent and consequently energy-requiring. Processing is strongly inhibited by the lysosomotrophic drug, chloroquine, suggesting a lysosomal involvement in antigen processing. The existence of a minor, non-lysosomal pathway is suggested, since small amounts of antigen were processed even at 10 degrees C, at which...

  5. Performance of the front-end signal processing electronics for the drift chambers of the Stanford Large Detector

    International Nuclear Information System (INIS)

    Honma, A.; Haller, G.M.; Usher, T.; Shypit, R.

    1990-10-01

    This paper reports on the performance of the front-end analog and digital signal processing electronics for the drift chambers of the Stanford Large Detector (SLD) at the Stanford Linear Collider. The electronics, mounted on printed circuit boards, include up to 64 channels of transimpedance amplification, analog sampling, A/D conversion, and associated control circuitry. Measurements of the time resolution, gain, noise, linearity, crosstalk, and stability of the readout electronics are described and presented. The expected contribution of the electronics to the relevant drift chamber measurement resolutions (i.e., timing and charge division) is given.

  6. Ocean acidification induces biochemical and morphological changes in the calcification process of large benthic foraminifera.

    Science.gov (United States)

    Prazeres, Martina; Uthicke, Sven; Pandolfi, John M

    2015-03-22

    Large benthic foraminifera are significant contributors to sediment formation on coral reefs, yet they are vulnerable to ocean acidification. Here, we assessed the biochemical and morphological impacts of acidification on the calcification of Amphistegina lessonii and Marginopora vertebralis exposed to different pH conditions. We measured growth rates (surface area and buoyant weight) and Ca-ATPase and Mg-ATPase activities and calculated shell density using micro-computed tomography images. In A. lessonii, we detected a significant decrease in buoyant weight, a reduction in the density of inner skeletal chambers, and an increase of Ca-ATPase and Mg-ATPase activities at pH 7.6 when compared with ambient conditions of pH 8.1. By contrast, M. vertebralis showed an inhibition in Mg-ATPase activity under lowered pH, with growth rate and skeletal density remaining constant. While M. vertebralis is considered to be more sensitive than A. lessonii owing to its high-Mg-calcite skeleton, it appears to be less affected by changes in pH, based on the parameters assessed in this study. We suggest differences in biochemical pathways of calcification as the main factor influencing response to changes in pH levels, and that A. lessonii and M. vertebralis have the ability to regulate biochemical functions to cope with short-term increases in acidity. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  7. Evaluation of drought propagation in an ensemble mean of large-scale hydrological models

    Directory of Open Access Journals (Sweden)

    A. F. Van Loon

    2012-11-01

    underestimation of wet-to-dry-season droughts and snow-related droughts. Furthermore, almost no composite droughts were simulated for slowly responding areas, while many multi-year drought events were expected in these systems.

    We conclude that most drought propagation processes are reasonably well reproduced by the ensemble mean of large-scale models in contrasting catchments in Europe. Challenges, however, remain in catchments with cold and semi-arid climates and catchments with large storage in aquifers or lakes. This leads to a high uncertainty in hydrological drought simulation at large scales. Improvement of drought simulation in large-scale models should focus on a better representation of hydrological processes that are important for drought development, such as evapotranspiration, snow accumulation and melt, and especially storage. Besides the more explicit inclusion of storage in large-scale models, also parametrisation of storage processes requires attention, for example through a global-scale dataset on aquifer characteristics, improved large-scale datasets on other land characteristics (e.g. soils, land cover), and calibration/evaluation of the models against observations of storage (e.g. in snow, groundwater).

  8. How engineering data management and system support the main process[-oriented] functions of a large-scale project

    CERN Document Server

    Hameri, A P

    1999-01-01

    By dividing the development process into successive functional operations, this paper studies the benefits of establishing configuration management procedures and of using an engineering data management system (EDMS) to execute the tasks. The underlying environment is that of CERN and the ongoing, decade-long Large Hadron Collider (LHC) project. By identifying the main functional groups who will use the EDMS, the paper outlines the basic motivations and services provided by such a system to each process function. The implications of strict configuration management for the daily operation of each functional user group are also discussed. The main argument of the paper is that each and every user of the EDMS must act in compliance with the configuration management procedures to guarantee the overall benefits of the system. The pilot EDMS being developed at CERN, which serves as a test-bed to discover the organisation's real functional needs for an EDMS, supports the conclusions. The preliminary ...

  9. Methodology for Extraction of Remaining Sodium of Used Sodium Containers

    International Nuclear Information System (INIS)

    Jung, Minhwan; Kim, Jongman; Cho, Youngil; Jeong, Jiyoung

    2014-01-01

    Sodium used as a coolant in the SFR (Sodium-cooled Fast Reactor) reacts easily with most elements due to its high reactivity. If sodium at high temperature leaks outside a system boundary and makes contact with oxygen, it starts to burn and toxic aerosols are produced. In addition, it generates flammable hydrogen gas through a reaction with water; hydrogen gas can be explosive within the range of 4-75 vol%. Therefore, sodium should be handled carefully in accordance with standard procedures even when only a small amount of sodium remains inside the containers and drums used for an experiment. After an experiment, all sodium experimental apparatuses that are no longer reused should be dismantled carefully through a series of draining, residual sodium extraction, and cleaning steps. In this work, carried out as part of sodium facility maintenance, a system for extracting the sodium remaining in used sodium drums has been developed and an operating procedure for the system has been established. The sodium extraction system was designed and tested successfully. This work will contribute to the establishment of sodium handling technology for the PGSFR (Prototype Gen-IV Sodium-cooled Fast Reactor).

  10. Large-Scale Consumption and Zero-Waste Recycling Method of Red Mud in Steel Making Process

    Directory of Open Access Journals (Sweden)

    Guoshan Ning

    2018-03-01

    Full Text Available To release the environmental pressure from the massive discharge of bauxite residue (red mud), a novel method for recycling red mud in the steel making process was investigated through high-temperature experiments and thermodynamic analysis. The results showed that after reduction roasting of the carbon-bearing red mud pellets at 1100–1200 °C for 12–20 min, metallic pellets were obtained with a metallization ratio of ≥88%. Separation of slag and iron was then achieved from the metallic pellets at 1550 °C, after composition adjustment targeting the primary crystal region of the 12CaO·7Al2O3 phase. After iron removal and composition adjustment, the smelting-separation slag had good smelting performance and desulfurization capability, meeting the demands of a desulfurization flux in the steel making process. The pig iron quality meets the requirements of a high-quality raw material for steel making. By virtue of the huge scale and output of the steel industry, a large-scale consumption and zero-waste recycling method for red mud was proposed, comprising roasting of the carbon-bearing red mud pellets in a rotary hearth furnace and smelting separation in an electric arc furnace after composition adjustment.
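    The metallization ratio quoted above has a standard definition: the fraction of total iron that has been reduced to metallic iron. A minimal sketch, with illustrative masses (not taken from the study):

```python
def metallization_ratio(metallic_fe, total_fe):
    """Metallization ratio: metallic iron as a percentage of total iron in the pellet."""
    return 100.0 * metallic_fe / total_fe

def meets_target(metallic_fe, total_fe, target_pct=88.0):
    """Check a roasted pellet batch against the >=88% figure reported in the abstract."""
    return metallization_ratio(metallic_fe, total_fe) >= target_pct
```

    For example, a pellet batch containing 44 kg of metallic iron out of 50 kg of total iron is exactly at an 88% metallization ratio.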

  11. Distributed processing and network of data acquisition and diagnostics control for Large Helical Device (LHD)

    International Nuclear Information System (INIS)

    Nakanishi, H.; Kojima, M.; Hidekuma, S.

    1997-11-01

    The LHD (Large Helical Device) data processing system has been designed to deal with the huge amount of diagnostics data, 600-900 MB per 10-second short-pulse experiment. It is being prepared for the first plasma experiment in March 1998. The recent increase in data volume forced the adoption of a fully distributed system structure which uses multiple data transfer paths in parallel and separates all of the computer functions into clients and servers. The fundamental element installed for every diagnostic device consists of two kinds of server computers: the data acquisition PC/Windows NT and the real-time diagnostics control VME/VxWorks. To cope with diversified kinds of both device control channels and diagnostics data, object-oriented methods are utilized throughout the development of this system. This not only reduces the development burden, but also widens software portability and flexibility. 100 Mbps FDDI-based fast networks will re-integrate the distributed server computers so that they can behave as one virtual macro-machine for users. The network methods applied in the LHD data processing system are based entirely on TCP/IP internet technology, providing remote collaborators the same access as local participants. (author)
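    The quoted data volumes explain the need for parallel transfer paths. A back-of-the-envelope calculation (idealized, ignoring protocol overhead; the eight-path figure is only illustrative, not from the abstract) shows that a single 100 Mbps link is slow for 900 MB shots:

```python
def transfer_seconds(shot_megabytes, link_mbps, n_paths=1):
    """Idealized time to move one shot's data over n parallel links (no overhead)."""
    return shot_megabytes * 8.0 / (link_mbps * n_paths)

# One 900 MB shot over a single 100 Mbps link vs. eight parallel paths
single = transfer_seconds(900, 100)       # 72.0 s
parallel = transfer_seconds(900, 100, 8)  #  9.0 s
```

    Moving a worst-case shot in 72 seconds over one link versus 9 seconds over eight idealized parallel paths illustrates why the design distributes transfers across multiple paths.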

  12. Colorful niches of phytoplankton shaped by the spatial connectivity in a large river ecosystem: a riverscape perspective.

    Directory of Open Access Journals (Sweden)

    Jean-Jacques Frenette

    Full Text Available Large rivers represent a significant component of inland waters and are considered sentinels and integrators of terrestrial and atmospheric processes. They represent hotspots for the transport and processing of organic and inorganic material from the surrounding landscape, which ultimately impacts the bio-optical properties and food webs of the rivers. In large rivers, hydraulic connectivity operates as a major forcing variable to structure the functioning of the riverscape, and, despite increasing interest in large-river studies, riverscape structural properties, such as the underwater spectral regime, and their impact on autotrophic ecological processes remain poorly studied. Here we used the St. Lawrence River to identify the mechanisms structuring the underwater spectral environment and their consequences on pico- and nanophytoplankton communities, which are good biological tracers of environmental changes. Our results, obtained from a 450 km sampling transect, demonstrate that tributaries exert a profound impact on the receiving river's photosynthetic potential. This occurs mainly through injection of chromophoric dissolved organic matter (CDOM and non-algal material (tripton. CDOM and tripton in the water column selectively absorbed wavelengths in a gradient from blue to red, and the resulting underwater light climate was in turn a strong driver of the phytoplankton community structure (prokaryote/eukaryote relative and absolute abundances) at scales of many kilometers from the tributary confluence. Our results conclusively demonstrate the proximal impact of watershed properties on underwater spectral composition in a highly dynamic river environment characterized by unique structuring properties such as high directional connectivity, numerous sources and forms of carbon, and a rapidly varying hydrodynamic regime. 
We surmise that the underwater spectral composition represents a key integrating and structural property of large, heterogeneous

  13. Cyclododecane as support material for clean and facile transfer of large-area few-layer graphene

    International Nuclear Information System (INIS)

    Capasso, A.; Leoni, E.; Dikonimos, T.; Buonocore, F.; Lisi, N.; De Francesco, M.; Lancellotti, L.; Bobeico, E.; Sarto, M. S.; Tamburrano, A.; De Bellis, G.

    2014-01-01

    The transfer of chemical vapor deposited graphene is a crucial process, which can affect the quality of the transferred films and compromise their application in devices. Finding a robust and intrinsically clean material capable of easing the transfer of graphene without interfering with its properties remains a challenge. We here propose the use of an organic compound, cyclododecane, as a transfer material. This material can be easily spin coated on graphene and assist the transfer, leaving no residues and requiring no further removal processes. The effectiveness of this transfer method for few-layer graphene on a large area was evaluated and confirmed by microscopy, Raman spectroscopy, x-ray photoemission spectroscopy, and four-point probe measurements. Schottky-barrier solar cells with few-layer graphene were fabricated on silicon wafers by using the cyclododecane transfer method and outperformed reference cells made by standard methods.

  14. Robotics to Enable Older Adults to Remain Living at Home

    OpenAIRE

    Pearce, Alan J.; Adair, Brooke; Miller, Kimberly; Ozanne, Elizabeth; Said, Catherine; Santamaria, Nick; Morris, Meg E.

    2012-01-01

    Given the rapidly ageing population, interest is growing in robots to enable older people to remain living at home. We conducted a systematic review and critical evaluation of the scientific literature, from 1990 to the present, on the use of robots in aged care. The key research questions were as follows: (1) what is the range of robotic devices available to enable older people to remain mobile, independent, and safe? and, (2) what is the evidence demonstrating that robotic devices are effec...

  15. State and municipal innovations in obesity policy: why localities remain a necessary laboratory for innovation.

    Science.gov (United States)

    Reeve, Belinda; Ashe, Marice; Farias, Ruben; Gostin, Lawrence

    2015-03-01

    Municipal and state governments are surging ahead in obesity prevention, providing a testing ground for innovative policies and shifting social norms in the process. Though high-profile measures such as New York City's soda portion rule attract significant media attention, we catalog the broader array of initiatives in less-known localities. Local innovation advances prevention policy, but faces legal and political constraints: constitutional challenges, preemption, charges of paternalism, lack of evidence, and widening health inequalities. These arguments can be met with astute framing, empirical evidence, and policy design, enabling local governments to remain at the forefront in transforming obesogenic environments.

  16. New implementation of OGC Web Processing Service in Python programming language. PyWPS-4 and issues we are facing with processing of large raster data using OGC WPS

    Directory of Open Access Journals (Sweden)

    J. Čepický

    2016-06-01

    Full Text Available The OGC® Web Processing Service (WPS) Interface Standard provides rules for standardizing inputs and outputs (requests and responses) for geospatial processing services, such as polygon overlay. The standard also defines how a client can request the execution of a process, and how the output from the process is handled. It defines an interface that facilitates the publishing of geospatial processes, client discovery of processes, and binding to those processes in workflows. Data required by a WPS can be delivered across a network or be available at the server. PyWPS was one of the first server-side implementations of OGC WPS. It is written in the Python programming language and tries to connect to all existing tools for geospatial data analysis available on the Python platform. During the last two years, the PyWPS development team has written a new version (called PyWPS-4) completely from scratch. The analysis of large raster datasets poses several technical issues in implementing the WPS standard. The data format has to be defined and validated on the server side, and binary data have to be encoded using some numeric representation. Pulling raster data from remote servers introduces security risks; in addition, running several processes in parallel has to be possible, so that system resources are used efficiently while preserving security. Here we discuss these topics and illustrate some of the solutions adopted within the PyWPS implementation.
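    The WPS standard's key-value-pair (KVP) request encoding can be sketched with nothing but the Python standard library. The server URL, the "buffer" process identifier, and the input names below are hypothetical illustrations, not part of PyWPS itself:

```python
from urllib.parse import urlencode

def build_execute_url(base_url, identifier, inputs):
    """Build a WPS 1.0.0 Execute request URL in KVP (GET) encoding.

    `inputs` is a dict mapping input identifiers to literal values;
    the standard joins them with ';' in the `datainputs` parameter.
    """
    datainputs = ";".join(f"{key}={value}" for key, value in inputs.items())
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": identifier,
        "datainputs": datainputs,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical server and process, for illustration only.
url = build_execute_url(
    "http://example.org/wps", "buffer",
    {"poly_in": "point.gml", "buffer": "1"},
)
print(url)
```

A real client would then issue an HTTP GET on this URL and parse the returned Execute response document; POST/XML encoding is the usual alternative for large inputs.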

  17. Plasma processing conditions substantially influence circulating microRNA biomarker levels.

    Science.gov (United States)

    Cheng, Heather H; Yi, Hye Son; Kim, Yeonju; Kroh, Evan M; Chien, Jason W; Eaton, Keith D; Goodman, Marc T; Tait, Jonathan F; Tewari, Muneesh; Pritchard, Colin C

    2013-01-01

    Circulating, cell-free microRNAs (miRNAs) are promising candidate biomarkers, but optimal conditions for processing blood specimens for miRNA measurement remain to be established. Our previous work showed that the majority of plasma miRNAs are likely blood cell-derived. In the course of profiling lung cancer cases versus healthy controls, we observed a broad increase in circulating miRNA levels in cases compared to controls and that higher miRNA expression correlated with higher platelet and particle counts. We therefore hypothesized that the quantity of residual platelets and microparticles remaining after plasma processing might impact miRNA measurements. To systematically investigate this, we subjected matched plasma from healthy individuals to stepwise processing with differential centrifugation and 0.22 µm filtration and performed miRNA profiling. We found a major effect on circulating miRNAs, with the majority (72%) of detectable miRNAs substantially affected by processing alone. Specifically, 10% of miRNAs showed 4-30x variation, 46% showed 30-1,000x variation, and 15% showed >1,000x variation in expression solely from processing. This was predominantly due to platelet contamination, which persisted despite using standard laboratory protocols. Importantly, we show that platelet contamination in archived samples could largely be eliminated by additional centrifugation, even in frozen samples stored for six years. To minimize confounding effects, additional steps to limit platelet contamination are necessary in circulating miRNA biomarker studies. We provide specific practical recommendations to help minimize confounding variation attributable to plasma processing and platelet contamination.

  18. Inducing a health-promoting change process within an organization: the effectiveness of a large-scale intervention on social capital, openness, and autonomous motivation toward health.

    Science.gov (United States)

    van Scheppingen, Arjella R; de Vroome, Ernest M M; Ten Have, Kristin C J M; Bos, Ellen H; Zwetsloot, Gerard I J M; van Mechelen, W

    2014-11-01

    To examine the effectiveness of an organizational large-scale intervention applied to induce a health-promoting organizational change process. A quasi-experimental, "as-treated" design was used. Regression analyses on data of employees of a Dutch dairy company (n = 324) were used to examine the effects on bonding social capital, openness, and autonomous motivation toward health and on employees' lifestyle, health, vitality, and sustainable employability. Also, the sensitivity of the intervention components was examined. Intervention effects were found for bonding social capital, openness toward health, smoking, healthy eating, and sustainable employability. The effects were primarily attributable to the intervention's dialogue component. The change process initiated by the large-scale intervention contributed to a social climate in the workplace that promoted health and ownership toward health. The study confirms the relevance of collective change processes for health promotion.

  19. Enhanced microbial coalbed methane generation: A review of research, commercial activity, and remaining challenges

    Science.gov (United States)

    Ritter, Daniel J.; Vinson, David S.; Barnhart, Elliott P.; Akob, Denise M.; Fields, Matthew W.; Cunningham, Al B.; Orem, William H.; McIntosh, Jennifer C.

    2015-01-01

    Coalbed methane (CBM) makes up a significant portion of the world’s natural gas resources. The discovery that approximately 20% of natural gas is microbial in origin has led to interest in microbially enhanced CBM (MECoM), which involves stimulating microorganisms to produce additional CBM from existing production wells. This paper reviews current laboratory and field research on understanding processes and reservoir conditions which are essential for microbial CBM generation, the progress of efforts to stimulate microbial methane generation in coal beds, and key remaining knowledge gaps. Research has been primarily focused on identifying microbial communities present in areas of CBM generation and attempting to determine their function, in-situ reservoir conditions that are most favorable for microbial CBM generation, and geochemical indicators of metabolic pathways of methanogenesis (i.e., acetoclastic or hydrogenotrophic methanogenesis). Meanwhile, researchers at universities, government agencies, and companies have focused on four primary MECoM strategies: 1) microbial stimulation (i.e., addition of nutrients to stimulate native microbes); 2) microbial augmentation (i.e., addition of microbes not native to or abundant in the reservoir of interest); 3) physically increasing microbial access to coal and distribution of amendments; and 4) chemically increasing the bioavailability of coal organics. Most companies interested in MECoM have pursued microbial stimulation: Luca Technologies, Inc., successfully completed a pilot scale field test of their stimulation strategy, while two others, Ciris Energy and Next Fuel, Inc., have undertaken smaller scale field tests. Several key knowledge gaps remain that need to be addressed before MECoM strategies can be implemented commercially. Little is known about the bacterial community responsible for coal biodegradation and how these microorganisms may be stimulated to enhance microbial methanogenesis. In addition, research

  20. Career Motivation in Newly Licensed Registered Nurses: What Makes Them Remain

    Science.gov (United States)

    Banks, Zarata Mann; Bailey, Jessica H.

    2010-01-01

    Despite vast research on newly licensed registered nurses (RNs), we don't know why some newly licensed registered nurses remain in their current jobs and others leave the nursing profession early in their career. Job satisfaction, the most significant factor emerging from the literature, plays a significant role in nurses' decisions to remain in…

  1. Calculating the price of tanks, vessels and process equipment in the petrochemical industry according to the integrity and remaining-life criteria of API RP 579 (Fitness for service); Calculo do preco de tanques, vasos e equipamentos de processo da industria petroquimica segundo criterios de integridade e sobrevida remanescente do API RP 579 (Fitness for service)

    Energy Technology Data Exchange (ETDEWEB)

    Morato, Paulo Cesar Vidal Morato [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    As the owner of many tanks, vessels and process equipment, PETROBRAS has developed the concept of 'Fitness-For-Service' (suitability for use) under standard API RP 579, i.e. verifying the structural integrity and remaining useful life of equipment in service. In this paper we discuss how to calculate the remaining useful life of in-service equipment in accordance with such criteria and, with this technical data, calculate the depreciated price. Steps: verification of applicability; survey of the technical data of the equipment; survey of the minimum plate thicknesses of the equipment over the years; calculation of the average annual corrosion rate (tc); calculation of the required minimum thickness according to the criteria of API RP 579 (tr); calculation of the remaining useful life (nr); calculation of the depreciated price (Vd) of the equipment. Conclusions: the method is intended for evaluating the price of tanks, vessels and process equipment according to API RP 579 concepts. It estimates the remaining useful life of in-service equipment and calculates the depreciated price. The method is scientifically grounded, consistent and robust, because the price calculation rests on the established remaining useful life. (author)
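    The steps enumerated above can be sketched numerically. The relations below are the usual thin-wall screening formulas, not the exact API RP 579 assessment procedure, and the thickness values, new-equipment price, and straight-line depreciation rule are illustrative assumptions:

```python
def corrosion_rate(t_initial, t_current, years_in_service):
    """Average annual corrosion rate tc (mm/year) from thickness surveys."""
    return (t_initial - t_current) / years_in_service

def remaining_life(t_current, t_required, tc):
    """Remaining useful life nr (years): remaining thickness margin over tc."""
    return (t_current - t_required) / tc

def depreciated_price(price_new, years_in_service, nr):
    """Straight-line depreciated price Vd over total (past + remaining) life."""
    total_life = years_in_service + nr
    return price_new * nr / total_life

# Illustrative vessel: 12.5 mm nominal plate, 10.9 mm measured after 10 years,
# 8.0 mm required minimum thickness (tr), new price 500,000.
tc = corrosion_rate(12.5, 10.9, 10)   # 0.16 mm/year
nr = remaining_life(10.9, 8.0, tc)    # 18.125 years
vd = depreciated_price(500_000.0, 10, nr)
print(round(tc, 3), round(nr, 3), round(vd, 2))
```

In a real assessment, tr comes from the applicable design-code calculation and API RP 579 flaw-assessment rules rather than a single fixed number.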

  2. Micromorphological Aspects of Forensic Geopedology: can vivianite be a marker of human remains permanence in soil?

    Science.gov (United States)

    Ern, Stephania Irmgard Elena; Trombino, Luca; Cattaneo, Cristina

    2010-05-01

    The number of death cases of forensic interest grows every year. When decomposed or skeletal remains emerge from the soil, the bones fall under anthropological competence and the crime scene under the competence of soil specialists. The present study concerns real cases of remains buried/hidden in clandestine graves, which have been studied in order to prove permanence in soil even if the soil particles have been washed away or the body is no longer buried. One hypothesis has been taken into account, related to the evidence of vivianite crystallization on the bones. Vivianite is a hydrated iron phosphate (Fe3(PO4)2·8(H2O)) that usually forms under anoxic, reducing, organic-matter-rich conditions. Under these conditions the iron in the soil is in reduced form (Fe2+) and associates with the phosphorus present in the environment, as attested in archaeological contexts. Returning to the cases of buried/hidden remains, the soil can be a source of iron, while the bones can supply phosphorus, and the decomposition process induces anoxic/reducing conditions in the burial area. In this light, the presence of vivianite crystallization on the bones could be a method to discriminate burial (i.e. permanence in soil) even if the remains are found in a context other than a clandestine grave. Analyses have been performed using a petrographic microscope and scanning electron microscope microanalysis (SEM-EDS) on bones, and point out the presence of vivianite crystallizations on the bones. Given the significance of vivianite in archaeological contexts, this evidence can be regarded as a marker of the permanence of human remains in the soil, a 'buried' testimonial; on the contrary, the absence of vivianite is not indicative of a 'non-buried' status. Further studies and new experiments are in progress in order to clarify the pathways of vivianite crystallization on different skeletal districts, in different

  3. New insight in the template decomposition process of large zeolite ZSM-5 crystals: an in situ UV-Vis/fluorescence micro-spectroscopy study

    NARCIS (Netherlands)

    Karwacki, L.|info:eu-repo/dai/nl/304824283; Weckhuysen, B.M.|info:eu-repo/dai/nl/285484397

    2011-01-01

    A combination of in situ UV-Vis and confocal fluorescence micro-spectroscopy was used to study the template decomposition process in large zeolite ZSM-5 crystals. Correlation of polarized light dependent UV-Vis absorption spectra with confocal fluorescence emission spectra in the 400–750 nm region

  4. Karhunen-Loève (PCA) based detection of multiple oscillations in multiple measurement signals from large-scale process plants

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Wickerhauser, M.V.

    2007-01-01

    In the perspective of optimizing the control and operation of large-scale process plants, it is important to detect and locate oscillations in the plants. This paper presents a scheme for detecting and localizing multiple oscillations in multiple measurements from such a large-scale power plant. The scheme is based on a Karhunen-Loève analysis of the data from the plant. The proposed scheme is subsequently tested on two sets of data: a set of synthetic data and a set of data from a coal-fired power plant. In both cases the scheme detects the beginning of the oscillation within only a few samples. In addition, the oscillation localization has also shown its potential by localizing the oscillations in both data sets.
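    The core idea, projecting multiple measurement signals onto their dominant Karhunen-Loève (principal) component and looking for oscillatory energy there, can be sketched in a two-signal toy form. The signal model and the zero-crossing check below are illustrative simplifications, not the paper's actual detection scheme:

```python
import math
import random

def dominant_component(x, y):
    """Project two signals onto the leading eigenvector of their 2x2
    covariance matrix (the first Karhunen-Loeve / principal component)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    xc = [v - mx for v in x]
    yc = [v - my for v in y]
    sxx = sum(v * v for v in xc) / n
    syy = sum(v * v for v in yc) / n
    sxy = sum(a * b for a, b in zip(xc, yc)) / n
    # Closed-form angle of the leading eigenvector of [[sxx, sxy], [sxy, syy]].
    theta = 0.5 * math.atan2(2.0 * sxy, sxx - syy)
    c, s = math.cos(theta), math.sin(theta)
    return [c * a + s * b for a, b in zip(xc, yc)]

def zero_crossings(z):
    """Count sign changes; a sine with k full cycles gives about 2*k."""
    return sum(1 for a, b in zip(z, z[1:]) if a * b < 0)

# Two simulated plant measurements sharing one 8-cycle oscillation plus noise.
random.seed(0)
n, cycles = 400, 8
osc = [math.sin(2 * math.pi * cycles * i / n) for i in range(n)]
sig1 = [o + random.gauss(0, 0.05) for o in osc]
sig2 = [0.5 * o + random.gauss(0, 0.05) for o in osc]

pc1 = dominant_component(sig1, sig2)
print(zero_crossings(pc1))  # close to 2 * cycles = 16
```

Because the common oscillation dominates the covariance, the first component concentrates it while averaging down independent sensor noise, which is what makes a PCA-based detector sensitive after only a few samples.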

  5. The future of large old trees in urban landscapes.

    Science.gov (United States)

    Le Roux, Darren S; Ikin, Karen; Lindenmayer, David B; Manning, Adrian D; Gibbons, Philip

    2014-01-01

    Large old trees are disproportionate providers of structural elements (e.g. hollows, coarse woody debris), which are crucial habitat resources for many species. The decline of large old trees in modified landscapes is of global conservation concern. Once large old trees are removed, they are difficult to replace in the short term due to typically prolonged time periods needed for trees to mature (i.e. centuries). Few studies have investigated the decline of large old trees in urban landscapes. Using a simulation model, we predicted the future availability of native hollow-bearing trees (a surrogate for large old trees) in an expanding city in southeastern Australia. In urban greenspace, we predicted that the number of hollow-bearing trees is likely to decline by 87% over 300 years under existing management practices. Under a worst case scenario, hollow-bearing trees may be completely lost within 115 years. Conversely, we predicted that the number of hollow-bearing trees will likely remain stable in semi-natural nature reserves. Sensitivity analysis revealed that the number of hollow-bearing trees perpetuated in urban greenspace over the long term is most sensitive to the: (1) maximum standing life of trees; (2) number of regenerating seedlings ha(-1); and (3) rate of hollow formation. We tested the efficacy of alternative urban management strategies and found that the only way to arrest the decline of large old trees requires a collective management strategy that ensures: (1) trees remain standing for at least 40% longer than currently tolerated lifespans; (2) the number of seedlings established is increased by at least 60%; and (3) the formation of habitat structures provided by large old trees is accelerated by at least 30% (e.g. artificial structures) to compensate for short term deficits in habitat resources. Immediate implementation of these recommendations is needed to avert long term risk to urban biodiversity.
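    The kind of simulation described above can be illustrated with a deliberately simple age-structured toy model. The lifespan, hollow-formation age, recruitment rate, and initial age structure below are illustrative assumptions, not the paper's calibrated parameters:

```python
# Toy model of hollow-bearing tree availability in urban greenspace.
MAX_AGE = 400          # assumed maximum standing life (years)
HOLLOW_AGE = 200       # assumed age at which hollows typically form
RECRUITS_PER_YEAR = 1  # assumed seedlings per year that become established

def simulate(initial_ages, years):
    """Advance an age-structured tree population year by year."""
    ages = list(initial_ages)
    for _ in range(years):
        ages = [a + 1 for a in ages if a + 1 < MAX_AGE]  # ageing and death
        ages.extend([0] * RECRUITS_PER_YEAR)             # regeneration
    return ages

start = list(range(150, 350))  # 200 established trees, ages 150-349
hollow0 = sum(a >= HOLLOW_AGE for a in start)
after = simulate(start, 300)
hollow300 = sum(a >= HOLLOW_AGE for a in after)
print(hollow0, hollow300)
```

Even this crude sketch reproduces the qualitative point of the paper: because hollow formation takes centuries, every initial cohort dies out within the projection window, and the long-run supply of hollow-bearing trees is set by the recruitment rate and the maximum standing life, the same levers the sensitivity analysis identifies.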

  6. On random age and remaining lifetime for populations of items

    DEFF Research Database (Denmark)

    Finkelstein, M.; Vaupel, J.

    2015-01-01

    We consider items that are incepted into operation already having a random (initial) age, and define the corresponding remaining lifetime. We show that these lifetimes are identically distributed when the age distribution is equal to the equilibrium distribution of renewal theory. We then develop the population studies approach to the problem and generalize the setting in terms of stationary and stable populations of items. We obtain new stochastic comparisons for the corresponding population ages and remaining lifetimes that can be useful in applications. Copyright (c) 2014 John Wiley
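    The equilibrium distribution invoked above is the standard one from renewal theory. For a lifetime distribution function F with finite mean mu, it is

```latex
% Equilibrium (stationary excess) distribution of renewal theory:
% F is the lifetime cdf, \mu = \int_0^\infty (1 - F(t))\,dt its mean.
F_e(x) = \frac{1}{\mu} \int_0^x \bigl(1 - F(t)\bigr)\, dt
```

    When the initial age is distributed according to F_e, the remaining lifetime is again distributed as F_e; this classical stationarity property is the starting point that the abstract's population-based setting generalizes.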

  7. Beyond single syllables: large-scale modeling of reading aloud with the Connectionist Dual Process (CDP++) model.

    Science.gov (United States)

    Perry, Conrad; Ziegler, Johannes C; Zorzi, Marco

    2010-09-01

    Most words in English have more than one syllable, yet the most influential computational models of reading aloud are restricted to processing monosyllabic words. Here, we present CDP++, a new version of the Connectionist Dual Process model (Perry, Ziegler, & Zorzi, 2007). CDP++ is able to simulate the reading aloud of mono- and disyllabic words and nonwords, and learns to assign stress in exactly the same way as it learns to associate graphemes with phonemes. CDP++ is able to simulate the monosyllabic benchmark effects its predecessor could, and therefore shows full backwards compatibility. CDP++ also accounts for a number of novel effects specific to disyllabic words, including the effects of stress regularity and syllable number. In terms of database performance, CDP++ accounts for over 49% of the reaction time variance on items selected from the English Lexicon Project, a very large database of several thousand words. With its lexicon of over 32,000 words, CDP++ is therefore a notable example of the successful scaling-up of a connectionist model to a size that more realistically approximates the human lexical system. Copyright © 2010 Elsevier Inc. All rights reserved.

  8. Technical basis and programmatic requirements for large block testing of coupled thermal-mechanical-hydrological-chemical processes

    International Nuclear Information System (INIS)

    Lin, Wunan.

    1993-09-01

    This document contains the technical basis and programmatic requirements for a scientific investigation plan that governs tests on a large block of tuff for understanding the coupled thermal-mechanical-hydrological-chemical (TMHC) processes. This study is part of the field testing described in Section 8.3.4.2.4.4.1 of the Site Characterization Plan (SCP) for the Yucca Mountain Project. The first, and most important, objective is to understand the coupled TMHC processes in order to develop models that will predict the performance of a nuclear waste repository. The block and fracture properties (including hydrology and geochemistry) can be well characterized from at least five exposed surfaces, and the block can be dismantled for post-test examinations. The second objective is to provide preliminary data for the development of models that will predict the quality and quantity of water in the near-field environment of a repository over the current 10,000 year regulatory period of radioactive decay. The third objective is to develop and evaluate the various measurement systems and techniques that will later be employed in the Engineered Barrier System Field Tests (EBSFT)

  9. Engaging the public with low-carbon energy technologies: Results from a Scottish large group process

    International Nuclear Information System (INIS)

    Howell, Rhys; Shackley, Simon; Mabon, Leslie; Ashworth, Peta; Jeanneret, Talia

    2014-01-01

    This paper presents the results of a large group process conducted in Edinburgh, Scotland investigating public perceptions of climate change and low-carbon energy technologies, specifically carbon dioxide capture and storage (CCS). The quantitative and qualitative results reported show that the participants were broadly supportive of efforts to reduce carbon dioxide emissions, and that there is an expressed preference for renewable energy technologies to be employed to achieve this. CCS was considered in detail during the research due to its climate mitigation potential; results show that the workshop participants were cautious about its deployment. The paper discusses a number of interrelated factors which appear to influence perceptions of CCS; factors such as the perceived costs and benefits of the technology, and people's personal values and trust in others all impacted upon participants’ attitudes towards the technology. The paper thus argues for the need to provide the public with broad-based, balanced and trustworthy information when discussing CCS, and to take seriously the full range of factors that influence public perceptions of low-carbon technologies. - Highlights: • We report the results of a Scottish large group workshop on energy technologies. • There is strong public support for renewable energy and mixed opinions towards CCS. • The workshop was successful in initiating discussion around climate change and energy technologies. • Issues of trust, uncertainty, costs, benefits, values and emotions all inform public perceptions. • Need to take seriously the full range of factors that inform perceptions

  10. Mineral remains of early life on Earth? On Mars?

    Science.gov (United States)

    Iberall, Robbins E.; Iberall, A.S.

    1991-01-01

    The oldest sedimentary rocks on Earth, the 3.8-Ga Isua Iron-Formation in southwestern Greenland, are metamorphosed past the point where organic-walled fossils would remain. Acid residues and thin sections of these rocks reveal ferric microstructures that have filamentous, hollow rod, and spherical shapes not characteristic of crystalline minerals. Instead, they resemble ferric-coated remains of bacteria. Because there are no earlier sedimentary rocks to study on Earth, it may be necessary to expand the search elsewhere in the solar system for clues to any biotic precursors or other types of early life. A study of morphologies of iron oxide minerals collected in the southern highlands during a Mars sample return mission may therefore help to fill in important gaps in the history of Earth's earliest biosphere. -from Authors

  11. Process for recovering yttrium and lanthanides from wet-process phosphoric acid

    Energy Technology Data Exchange (ETDEWEB)

    Janssen, J.A.; Weterings, C.A.

    1983-06-28

    Process for recovering yttrium and lanthanides from wet-process phosphoric acid by adding a flocculant to the phosphoric acid, separating out the resultant precipitate and then recovering yttrium and lanthanides from the precipitate. Uranium is recovered from the remaining phosphoric acid.

  12. Large packages for reactor decommissioning waste

    International Nuclear Information System (INIS)

    Price, M.S.T.

    1991-01-01

    This study was carried out jointly by the Atomic Energy Establishment at Winfrith (now called the Winfrith Technology Centre), Windscale Laboratory and Ove Arup and Partners. The work involved the investigation of the design of large transport containers for intermediate level reactor decommissioning waste, i.e. waste which requires shielding, and is aimed at European requirements (i.e. for both LWR and gas cooled reactors). It proposes a design methodology for such containers covering the whole lifetime of a waste disposal package. The design methodology presented takes account of the various relevant constraints. Both large self-shielded and returnable shielded concepts were developed. The work was generic rather than specific; the results obtained, and the lessons learned, remain to be applied in practice

  13. USING THE BUSINESS ENGINEERING APPROACH IN THE DEVELOPMENT OF A STRATEGIC MANAGEMENT PROCESS FOR A LARGE CORPORATION: A CASE STUDY

    Directory of Open Access Journals (Sweden)

    C.M. Moll

    2012-01-01

    Full Text Available Most South African organisations were historically part of a closed competitive system with little global competition and a relatively stable economy (Manning: 18, Sunter: 32). Since the political transformation, the globalisation of the world economy, the decline of world economic fundamentals and specific challenges in the South African scenario such as GEAR and employment equity, the whole playing field has changed. With these changes, new challenges appear. A significant challenge for organisations within this scenario is to think, plan and manage strategically. In order to do so, the organisation must understand its relationship with its environment and establish innovative new strategies to manipulate, interact with, and ultimately survive in the environment. The legacy of the past has, in many organisations, implanted an operational short-term focus because the planning horizon was stable. It was sufficient to construct annual plans rather than strategies. These plans were typically internally focused rather than driven by the external environment. Strategic planning in this environment tended to be a form of team building through which the various members of the organisation's management team discussed and documented the problems of the day. A case study is presented of the development of a strategic management process for a large South African mining company. The authors believe that the approach is a new and different way of addressing a problem that exists in many organisations - the establishment of a process of strategic thinking, while at the same time ensuring that a formal process of strategic planning is followed in order to prompt the management of the organisation to strategic action. The lessons drawn from this process are applicable to a larger audience due to the homogeneous nature of the management style of a large number of South African organisations.

  14. The anaerobic digestion process

    Energy Technology Data Exchange (ETDEWEB)

    Rivard, C.J. [National Renewable Energy Lab., Golden, CO (United States); Boone, D.R. [Oregon Graduate Inst., Portland, OR (United States)

    1996-01-01

    The microbial process of converting organic matter into methane and carbon dioxide is so complex that anaerobic digesters have long been treated as "black boxes." Research into this process during the past few decades has gradually unraveled this complexity, but many questions remain. The major biochemical reactions for forming methane by methanogens are largely understood, and evolutionary studies indicate that these microbes are as different from bacteria as they are from plants and animals. In anaerobic digesters, methanogens are at the terminus of a metabolic web, in which the reactions of myriads of other microbes produce a very limited range of compounds - mainly acetate, hydrogen, and formate - on which the methanogens grow and from which they form methane. "Interspecies hydrogen transfer" and "interspecies formate transfer" are major mechanisms by which methanogens obtain their substrates and by which volatile fatty acids are degraded. Present understanding of these reactions and other complex interactions among the bacteria involved in anaerobic digestion is only now to the point where anaerobic digesters need no longer be treated as black boxes.

  15. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

Full Text Available A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction. Whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly in the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities, which lead to the formation of large eddies, is also key to understanding the behavior of large-scale fires. Here, modeling tools can be effectively exploited in order to investigate the fluid flow phenomena, including RANS- and LES-based computational fluid dynamics codes. The latter are well suited to representation of the turbulent motions, but a number of challenges remain with their practical application. Massively parallel computational resources are likely to be necessary in order to adequately address the complex coupled phenomena to the level of detail that is necessary.

  16. Shotgun microbial profiling of fossil remains

    DEFF Research Database (Denmark)

    Der Sarkissian, Clio; Ermini, Luca; Jónsson, Hákon

    2014-01-01

    the specimen of interest, but instead reflect environmental organisms that colonized the specimen after death. Here, we characterize the microbial diversity recovered from seven c. 200- to 13 000-year-old horse bones collected from northern Siberia. We use a robust, taxonomy-based assignment approach...... to identify the microorganisms present in ancient DNA extracts and quantify their relative abundance. Our results suggest that molecular preservation niches exist within ancient samples that can potentially be used to characterize the environments from which the remains are recovered. In addition, microbial...... community profiling of the seven specimens revealed site-specific environmental signatures. These microbial communities appear to comprise mainly organisms that colonized the fossils recently. Our approach significantly extends the amount of useful data that can be recovered from ancient specimens using...

  17. Platelet-rich plasma and chronic wounds: remaining fibronectin may influence matrix remodeling and regeneration success.

    Science.gov (United States)

    Moroz, Andrei; Deffune, Elenice

    2013-11-01

Platelet-rich plasma has been largely used as a therapeutic option for the treatment of chronic wounds of different etiologies. The enhanced regeneration observed after the use of platelet-rich plasma has been systematically attributed to the growth factors present inside platelets' granules. We hypothesize that the remaining plasma- and platelet-bound fibronectin may act as a further bioactive protein in platelet-rich plasma preparations. Recent reports were analyzed and presented as direct evidence of this hypothesis. Fibronectin may directly influence extracellular matrix remodeling during wound repair, probably through matrix metalloproteinase expression, thus exerting an extra effect on chronic wound regeneration. Physicians should be well aware of the possible fibronectin-induced effects in their future endeavors with PRP in chronic wound treatment. Copyright © 2013 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  18. Bio-inspired wooden actuators for large scale applications.

    Science.gov (United States)

    Rüggeberg, Markus; Burgert, Ingo

    2015-01-01

Implementing programmable actuation into materials and structures is a major topic in the field of smart materials. In particular, the bilayer principle has been employed to develop actuators that respond to various kinds of stimuli. A multitude of small-scale applications down to micrometer size have been developed, but up-scaling remains challenging due to limitations either in the mechanical stiffness of the material or in the manufacturing processes. Here, we demonstrate the actuation of wooden bilayers in response to changes in relative humidity, making use of the high material stiffness and good machinability to reach large-scale actuation and application. The amplitude and response time of the actuation were measured and can be predicted and controlled by adapting the geometry and the constitution of the bilayers. Field tests in full weathering conditions revealed long-term stability of the actuation. The potential of the concept is shown by a first demonstrator. With the sensor and actuator intrinsically incorporated in the wooden bilayers, the daily change in relative humidity is exploited for an autonomous and solar-powered movement of a tracker for solar modules.
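
The geometry-based prediction of actuation amplitude mentioned above can be illustrated with the classical Timoshenko bimetal curvature formula, substituting a differential swelling strain for the thermal mismatch. This is a generic sketch with made-up wood parameters (layer thicknesses, stiffness ratio, strain), not the authors' actual model:

```python
def timoshenko_curvature(d_strain, t1, t2, e1, e2):
    """Curvature (1/m) of a bilayer strip from Timoshenko's bimetal formula,
    with the thermal mismatch replaced by a differential swelling strain."""
    m = t1 / t2          # thickness ratio
    n = e1 / e2          # stiffness (modulus) ratio
    h = t1 + t2          # total thickness
    return (6.0 * d_strain * (1.0 + m) ** 2 /
            (h * (3.0 * (1.0 + m) ** 2 + (1.0 + m * n) * (m ** 2 + 1.0 / (m * n)))))

# Hypothetical wooden bilayer: two 5 mm layers, 10:1 stiffness ratio across the
# grain, and 0.5 % differential swelling strain from a humidity change.
kappa = timoshenko_curvature(0.005, 0.005, 0.005, 10e9, 1e9)
radius = 1.0 / kappa
```

With these assumed numbers the bending radius comes out at roughly 2 m, showing how thickness and stiffness ratio set the actuation amplitude.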

  19. Production of High Quality Die Steels from Large ESR Slab Ingots

    Science.gov (United States)

    Geng, Xin; Jiang, Zhou-hua; Li, Hua-bing; Liu, Fu-bin; Li, Xing

With the rapid development of the manufacturing industry in China, there is great demand for high-quality, large-tonnage slab ingots of die steels such as P20 and WSM718R. The solidification structure and size of large slab ingots produced by conventional methods are not satisfactory. However, large slab ingots manufactured by the ESR process have a good solidification structure and sufficient section size. In the present research, the new slab ESR process was used to produce large die-steel slab ingots with a maximum size of 980 × 2000 × 3200 mm. Compact and sound ingots can be manufactured by the slab ESR process. Ultra-heavy plates with a maximum thickness of 410 mm can be obtained after rolling the 49-ton ingots. By eliminating the cogging and forging steps, the ESR process for large slab ingots can greatly increase yield and production efficiency, and markedly cut product costs.

  20. Neanderthal infant and adult infracranial remains from Marillac (Charente, France).

    Science.gov (United States)

    Dolores Garralda, María; Maureille, Bruno; Vandermeersch, Bernard

    2014-09-01

At the site of Marillac, near the Ligonne River in Marillac-le-Franc (Charente, France), a remarkable stratigraphic sequence has yielded a wealth of archaeological information, palaeoenvironmental data, as well as faunal and human remains. Marillac must have been a sinkhole used by Neanderthal groups as a hunting camp during MIS 4 (TL date 57,600 ± 4,600 BP), where Quina Mousterian lithics and fragmented bones of reindeer predominate. This article describes three infracranial skeleton fragments. Two of them are from adults and consist of the incomplete shafts of a right radius (Marillac 24) and a left fibula (Marillac 26). The third fragment is the diaphysis of the right femur of an immature individual (Marillac 25), the size and shape of which resemble those from Teshik-Tash and could be assigned to a child of a similar age. The three fossils have been compared with the remains of other Neanderthals or anatomically Modern Humans (AMH). Furthermore, the comparison of the infantile femora, Marillac 25 and Teshik-Tash, with the remains of several European children from the early Middle Ages clearly demonstrates the robustness and rounded shape of both Neanderthal diaphyses. Evidence of peri-mortem manipulations has been identified on all three bones, with spiral fractures, percussion pits and, in the case of the radius and femur, unquestionable cutmarks made with flint implements, probably during defleshing. Traces of periostosis appear on the fibula fragment and on the immature femoral diaphysis, although their aetiology remains unknown. Copyright © 2014 Wiley Periodicals, Inc.

  1. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

    Science.gov (United States)

    Cheong, Siew Ann; Tan, Teck Liang; Chen, Chien-Chih; Chang, Wu-Lung; Liu, Zheng; Chew, Lock Yue; Sloot, Peter M. A.; Johnson, Neil F.

    2014-01-01

Predicting how large an earthquake can be, and where and when it will strike, remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting from catalog data. We show how the equilibrium dynamics of this model very naturally explains the Gutenberg-Richter law. Using the high-resolution earthquake catalog of Taiwan between Jan 1994 and Feb 2009, we illustrate how out-of-equilibrium spatio-temporal signatures in the time interval between earthquakes and the integrated energy released by earthquakes can be used to reliably determine the times, magnitudes, and locations of large earthquakes, as well as the maximum numbers of large aftershocks that would follow. PMID:24406467
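
The Gutenberg-Richter law referenced above says that earthquake counts fall off exponentially with magnitude, log10 N(≥M) = a − bM. As a hedged, generic illustration of working with catalog data (unrelated to the authors' fusion-fission model), the b-value of a catalog can be estimated with Aki's standard maximum-likelihood formula, shown here on a synthetic catalog:

```python
import math
import random

def aki_b_value(magnitudes, m_min):
    """Maximum-likelihood b-value (Aki): b = log10(e) / (mean(M) - Mc),
    using only events at or above the completeness magnitude Mc."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - m_min)

# Synthetic catalog: magnitudes above Mc = 2.0 drawn from an exponential
# distribution whose rate corresponds to a true b-value of 1.0.
random.seed(0)
beta = math.log(10) * 1.0
catalog = [2.0 + random.expovariate(beta) for _ in range(5000)]
b = aki_b_value(catalog, m_min=2.0)
```

With 5000 synthetic events the estimate lands close to the true value of 1.0; on a real catalog the choice of completeness magnitude Mc dominates the result.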

  2. Large Eddy Simulation of Transient Flow, Solidification, and Particle Transport Processes in Continuous-Casting Mold

    Science.gov (United States)

    Liu, Zhongqiu; Li, Linmin; Li, Baokuan; Jiang, Maofa

    2014-07-01

The current study developed a coupled computational model to simulate the transient fluid flow, solidification, and particle transport processes in a slab continuous-casting mold. Transient flow of molten steel in the mold is calculated using large eddy simulation. An enthalpy-porosity approach is used for the analysis of solidification processes. The transport of bubbles and non-metallic inclusions inside the liquid pool is calculated using a Lagrangian approach based on the transient flow field. A criterion for particle entrapment in the solidified shell is developed using the user-defined functions of FLUENT software (ANSYS, Inc., Canonsburg, PA). The predicted results of this model are compared with measurements from ultrasonic testing of the rolled steel plates and from water model experiments. The transient asymmetrical flow pattern inside the liquid pool shows quite satisfactory agreement with the corresponding measurements. The predicted complex instantaneous velocity field is composed of various small recirculation zones and multiple vortices. The transport of particles inside the liquid pool and the entrapment of particles in the solidified shell are not symmetric. The Magnus force can reduce the entrapment ratio of particles in the solidified shell, especially for smaller particles, but the effect is slight. The Marangoni force can play an important role in controlling the motion of particles, markedly increasing the entrapment ratio of particles in the solidified shell.

  3. Remote Methodology used at B Plant Hanford to Map High Radiation and Contamination Fields and Document Remaining Hazards

    Energy Technology Data Exchange (ETDEWEB)

    SIMMONS, F.M.

    2000-01-01

A remote radiation mapping system using the Gammacam™ (AIL Systems Inc. trademark) with real-time response was used in deactivating the B Plant at Hanford to produce digitized images showing actual radiation fields and dose rates. Deployment of this technology has significantly reduced labor requirements, decreased personnel exposure, and increased the accuracy of the measurements. Personnel entries into the high radiation/contamination areas were minimized, for a dose savings of 30 rem (0.3 sievert) and a cost savings of $640K. In addition, the data gathered were utilized along with historical information to estimate the amount of remaining hazardous waste in the process cells. The B Plant facility is a canyon facility containing 40 process cells which were used to separate cesium and strontium from high-level waste. The cells and vessels are contaminated with chemicals used in the separation and purification processes. Most of the contaminants have been removed, but residual contamination from spills in the cells and heels in the tanks contributes to the localized high radioactivity. The Gammacam™ system consists of a high-density terbium-activated scintillating glass detector coupled with a digitized video camera. Composite images generated by the system are presented in pseudo color over a black-and-white image. Exposure times can be set from 10 milliseconds to 1 hour depending on the field intensity. This information, coupled with process knowledge, is then used to document the hazardous waste remaining in each cell. Additional uses for this radiation mapping system would be in support of facility stabilization and deactivation activities at Hanford or other DOE sites. The system is currently scheduled for installation and mapping of the U Plant in 1999. This system is unique due to its portability and its suitability for use in high dose rate areas.

  4. Remote Methodology used at B Plant Hanford to Map High Radiation and Contamination Fields and Document Remaining Hazards

    International Nuclear Information System (INIS)

    SIMMONS, F.M.

    2000-01-01

A remote radiation mapping system using the Gammacam™ (AIL Systems Inc. trademark) with real-time response was used in deactivating the B Plant at Hanford to produce digitized images showing actual radiation fields and dose rates. Deployment of this technology has significantly reduced labor requirements, decreased personnel exposure, and increased the accuracy of the measurements. Personnel entries into the high radiation/contamination areas were minimized, for a dose savings of 30 rem (0.3 sievert) and a cost savings of $640K. In addition, the data gathered were utilized along with historical information to estimate the amount of remaining hazardous waste in the process cells. The B Plant facility is a canyon facility containing 40 process cells which were used to separate cesium and strontium from high-level waste. The cells and vessels are contaminated with chemicals used in the separation and purification processes. Most of the contaminants have been removed, but residual contamination from spills in the cells and heels in the tanks contributes to the localized high radioactivity. The Gammacam™ system consists of a high-density terbium-activated scintillating glass detector coupled with a digitized video camera. Composite images generated by the system are presented in pseudo color over a black-and-white image. Exposure times can be set from 10 milliseconds to 1 hour depending on the field intensity. This information, coupled with process knowledge, is then used to document the hazardous waste remaining in each cell. Additional uses for this radiation mapping system would be in support of facility stabilization and deactivation activities at Hanford or other DOE sites. The system is currently scheduled for installation and mapping of the U Plant in 1999. This system is unique due to its portability and its suitability for use in high dose rate areas.

  5. Endogenous System Microbes as Treatment Process ...

    Science.gov (United States)

    Monitoring the efficacy of treatment strategies to remove pathogens in decentralized systems remains a challenge. Evaluating log reduction targets by measuring pathogen levels is hampered by their sporadic and low occurrence rates. Fecal indicator bacteria are used in centralized systems to indicate the presence of fecal pathogens, but are ineffective decentralized treatment process indicators as they generally occur at levels too low to assess log reduction targets. System challenge testing by spiking with high loads of fecal indicator organisms, like MS2 coliphage, has limitations, especially for large systems. Microbes that are endogenous to the decentralized system, occur in high abundances and mimic removal rates of bacterial, viral and/or parasitic protozoan pathogens during treatment could serve as alternative treatment process indicators to verify log reduction targets. To identify abundant microbes in wastewater, the bacterial and viral communities were examined using deep sequencing. Building infrastructure-associated bacteria, like Zoogloea, were observed as dominant members of the bacterial community in graywater. In blackwater, bacteriophage of the order Caudovirales constituted the majority of contiguous sequences from the viral community. This study identifies candidate treatment process indicators in decentralized systems that could be used to verify log removal during treatment. The association of the presence of treatment process indic

  6. Remaining useful life estimation based on discriminating shapelet extraction

    International Nuclear Information System (INIS)

    Malinowski, Simon; Chebel-Morello, Brigitte; Zerhouni, Noureddine

    2015-01-01

In the Prognostics and Health Management domain, estimating the remaining useful life (RUL) of critical machinery is a challenging task. Various research topics including data acquisition, fusion, diagnostics and prognostics are involved in this domain. This paper presents an approach, based on shapelet extraction, to estimate the RUL of equipment. This approach extracts, in an offline step, discriminative rul-shapelets from a history of run-to-failure data. These rul-shapelets are patterns that are selected for their correlation with the remaining useful life of the equipment. In other words, every selected rul-shapelet conveys its own information about the RUL of the equipment. In an online step, these rul-shapelets are compared to testing units and the ones that match these units are used to estimate their RULs. Therefore, RUL estimation is based on patterns that have been selected for their high correlation with the RUL. This approach is different from classical similarity-based approaches that attempt to match complete testing units (or only late instants of testing units) with training ones to estimate the RUL. The performance of our approach is evaluated on a case study on the remaining useful life estimation of turbofan engines, and its performance is compared with that of other similarity-based approaches. - Highlights: • A data-driven RUL estimation technique based on pattern extraction is proposed. • Patterns are extracted for their correlation with the RUL. • The proposed method shows good performance compared to other techniques
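
The online matching step described above can be sketched as follows. This is a simplified, hypothetical illustration (made-up shapelets and sensor data, and a plain nearest-shapelet rule rather than the authors' exact selection procedure): each rul-shapelet is stored as a subsequence paired with the RUL observed when it occurred, and a test unit inherits the RUL labels of its best-matching shapelets:

```python
def sliding_distance(series, shapelet):
    """Minimum Euclidean distance between a shapelet and any window of the series."""
    L = len(shapelet)
    best = float("inf")
    for i in range(len(series) - L + 1):
        d = sum((series[i + j] - shapelet[j]) ** 2 for j in range(L)) ** 0.5
        best = min(best, d)
    return best

def estimate_rul(series, rul_shapelets, k=1):
    """Average the RUL labels of the k shapelets that match the series best."""
    ranked = sorted(rul_shapelets, key=lambda s: sliding_distance(series, s["pattern"]))
    top = ranked[:k]
    return sum(s["rul"] for s in top) / len(top)

# Hypothetical rul-shapelets extracted offline from run-to-failure histories:
shapelets = [
    {"pattern": [1.0, 1.2, 1.5], "rul": 80.0},  # early-degradation signature
    {"pattern": [3.0, 3.8, 4.5], "rul": 10.0},  # near-failure signature
]
test_unit = [0.9, 1.0, 1.1, 1.3, 1.6, 1.8]      # resembles the early signature
rul = estimate_rul(test_unit, shapelets)
```

Here the test unit matches the early-degradation pattern, so the estimate is that shapelet's RUL label; a real system would average over many shapelets and normalize the subsequences first.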

  7. A comparison of parallel dust and fibre measurements of airborne chrysotile asbestos in a large mine and processing factories in the Russian Federation

    NARCIS (Netherlands)

    Feletto, Eleonora; Schonfeld, Sara J; Kovalevskiy, Evgeny V; Bukhtiyarov, Igor V; Kashanskiy, Sergey V; Moissonnier, Monika; Straif, Kurt; Kromhout, Hans

    2017-01-01

    INTRODUCTION: Historic dust concentrations are available in a large-scale cohort study of workers in a chrysotile mine and processing factories in Asbest, Russian Federation. Parallel dust (gravimetric) and fibre (phase-contrast optical microscopy) concentrations collected in 1995, 2007 and 2013/14

  8. Signal Formation Processes in Micromegas Detectors and Quality Control for large size Detector Construction for the ATLAS New Small Wheel

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00387450; Rembser, Christoph

    2017-08-04

The Micromegas technology is one of the most successful modern gaseous detector concepts and is widely utilized in nuclear and particle physics experiments. Twenty years of R&D rendered the technology sufficiently mature to be selected as the precision tracking detector for the New Small Wheel (NSW) upgrade of the ATLAS Muon spectrometer. This will be the first large-scale application of Micromegas in one of the major LHC experiments. However, many of the fundamental microscopic processes in these gaseous detectors are still not fully understood, and studies on several detector aspects, like the micromesh geometry, have never been addressed systematically. The studies on signal formation in Micromegas, presented in the first part of this thesis, focus on the microscopic signal-electron loss mechanisms and the amplification processes in electron-gas interactions. Based on a detailed model of detector parameter dependencies, these processes are scrutinized in an iterating comparison between experimental result...

  9. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

The recent trend toward large-scale simulations of fusion plasmas and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  10. A High Density Low Cost Digital Signal Processing Module for Large Scale Radiation Detectors

    International Nuclear Information System (INIS)

    Tan, Hui; Hennig, Wolfgang; Walby, Mark D.; Breus, Dimitry; Harris, Jackson T.; Grudberg, Peter M.; Warburton, William K.

    2013-06-01

A 32-channel digital spectrometer, PIXIE-32, is being developed for nuclear physics and other radiation detection applications requiring digital signal processing with a large number of channels at relatively low cost. A single PIXIE-32 provides spectrometry and waveform acquisition for 32 input signals per module, and multiple modules can be combined into larger systems. It is based on the PCI Express standard, which allows data transfer rates to the host computer of up to 800 MB/s. Each of the 32 channels in a PIXIE-32 module accepts signals directly from a detector preamplifier or photomultiplier. Digitally controlled offsets can be individually adjusted for each channel. Signals are digitized in 12-bit, 50 MHz multi-channel ADCs. Triggering, pile-up inspection and filtering of the data stream are performed in real time, and pulse heights and other event data are calculated on an event-by-event basis. The hardware architecture, internal and external triggering features, and the spectrometry and waveform acquisition capabilities of the PIXIE-32, as well as its capability to distribute clocks and triggers among multiple modules, are presented. (authors)
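
Pulse heights in digital spectrometers of this kind are commonly obtained with a trapezoidal shaping filter applied to the digitized preamplifier signal. The following is a hedged, software-only sketch of that generic technique on a synthetic trace (not PIXIE-32's actual firmware; the window lengths are arbitrary):

```python
def trapezoidal_filter(samples, rise, gap):
    """Difference of two length-`rise` moving averages separated by `gap`
    samples. For a clean step of amplitude A, the flat top of the output
    equals A, so the maximum gives the pulse height."""
    out = []
    for n in range(len(samples)):
        lo = n - (2 * rise + gap) + 1
        if lo < 0:
            out.append(0.0)          # not enough history yet
            continue
        leading = sum(samples[n - rise + 1 : n + 1]) / rise
        trailing = sum(samples[lo : lo + rise]) / rise
        out.append(leading - trailing)
    return out

# Hypothetical digitized preamplifier trace: flat baseline, then a step of
# height 100 (an idealized event with no noise or decay).
trace = [0.0] * 20 + [100.0] * 40
shaped = trapezoidal_filter(trace, rise=4, gap=2)
pulse_height = max(shaped)
```

Real firmware uses a recursive form of this filter plus baseline tracking and pole-zero correction for the exponential preamplifier decay, but the shaping idea is the same.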

  11. Accelerating solidification process simulation for large-sized system of liquid metal atoms using GPU with CUDA

    Energy Technology Data Exchange (ETDEWEB)

    Jie, Liang [School of Information Science and Engineering, Hunan University, Changshang, 410082 (China); Li, KenLi, E-mail: lkl@hnu.edu.cn [School of Information Science and Engineering, Hunan University, Changshang, 410082 (China); National Supercomputing Center in Changsha, 410082 (China); Shi, Lin [School of Information Science and Engineering, Hunan University, Changshang, 410082 (China); Liu, RangSu [School of Physics and Micro Electronic, Hunan University, Changshang, 410082 (China); Mei, Jing [School of Information Science and Engineering, Hunan University, Changshang, 410082 (China)

    2014-01-15

Molecular dynamics (MD) simulation is a powerful tool for simulating and analyzing complex physical processes and phenomena at the atomic scale, predicting the natural time evolution of a system of atoms. Precise simulation of physical processes places strong demands on both simulation size and computing timescale; therefore, finding available computing resources is crucial to accelerating the computation. General-purpose graphics processing units (GPGPUs) are now widely utilized for general-purpose computing owing to their high floating-point performance, wide memory bandwidth and enhanced programmability. For the most time-consuming components of MD simulations of liquid metal solidification, this paper presents a fine-grained spatial decomposition method that accelerates the neighbor-list update and the interaction force calculation by taking advantage of modern graphics processing units (GPUs), enlarging the simulation to a system of 10 000 000 atoms. In addition, a number of evaluations and tests are discussed, ranging from executions on different precision-enabled CUDA versions, over various types of GPU (NVIDIA 480GTX, 580GTX and M2050), to CPU clusters with different numbers of CPU cores. The experimental results demonstrate that GPU-based calculations are typically 9∼11 times faster than the corresponding sequential execution and approximately 1.5∼2 times faster than 16-core CPU cluster implementations. On the basis of the simulated results, comparisons between the theoretical and experimental results show good agreement, and more complete and larger cluster structures, as in the actual macroscopic materials, are observed. Moreover, different nucleation and evolution mechanisms of nano-clusters and nano-crystals formed in the processes of metal solidification are observed with large
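
The neighbor-list update that dominates such MD runs is usually organized as a spatial (cell-list) decomposition: atoms are binned into cells at least one cutoff wide, so each atom is compared only against atoms in its own and the 26 adjacent cells. A hedged, CPU-side Python sketch of this generic technique with made-up coordinates (not the paper's CUDA kernels; no minimum-image convention is applied to distances):

```python
from collections import defaultdict
from itertools import product

def build_neighbor_lists(positions, cutoff, box):
    """Cell-list neighbor search in a cubic box of side `box`.
    Returns, for each atom i, the sorted-index neighbors j > i within cutoff."""
    ncell = max(1, int(box / cutoff))    # cells are at least `cutoff` wide
    side = box / ncell
    cells = defaultdict(list)
    for idx, (x, y, z) in enumerate(positions):
        cells[(int(x / side) % ncell, int(y / side) % ncell,
               int(z / side) % ncell)].append(idx)
    neighbors = {i: [] for i in range(len(positions))}
    cut2 = cutoff * cutoff
    for (cx, cy, cz), members in cells.items():
        # Scan the home cell and its 26 adjacent cells (wrapping indices).
        for dx, dy, dz in product((-1, 0, 1), repeat=3):
            other = cells.get(((cx + dx) % ncell, (cy + dy) % ncell,
                               (cz + dz) % ncell), [])
            for i in members:
                for j in other:
                    if i < j:        # count each pair once
                        d2 = sum((positions[i][k] - positions[j][k]) ** 2
                                 for k in range(3))
                        if d2 <= cut2:
                            neighbors[i].append(j)
    return neighbors

# Three hypothetical atoms in a 10x10x10 box; only the first two are close.
pos = [(1.0, 1.0, 1.0), (2.0, 1.0, 1.0), (8.0, 8.0, 8.0)]
nbrs = build_neighbor_lists(pos, cutoff=2.5, box=10.0)
```

The cost drops from O(N²) pair checks to roughly O(N) at fixed density, and the per-cell independence is what maps naturally onto one GPU thread (or block) per atom or cell.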

  12. MODELING OF MANAGEMENT PROCESSES IN AN ORGANIZATION

    Directory of Open Access Journals (Sweden)

    Stefan Iovan

    2016-05-01

Full Text Available When driving any major change within an organization, strategy and execution are intrinsic to a project’s success. Nevertheless, closing the gap between strategy and execution remains a challenge for many organizations [1]. Companies tend to focus more on execution than strategy for quick results, instead of taking the time needed to understand the parts that make up the whole, so the right execution plan can be put in place to deliver the best outcomes. A large part of this is understanding that business operations don’t fit neatly within the traditional organizational hierarchy. Business processes are often messy, collaborative efforts that cross teams, departments and systems, making them difficult to manage within a hierarchical structure [2]. Business process management (BPM) fills this gap by redefining an organization according to its end-to-end processes, so opportunities for improvement can be identified and processes streamlined for growth, revenue and transformation. This white paper provides guidelines on what to consider when using business process applications to solve your BPM initiatives, and the unique capabilities software systems provide that can help ensure both your project’s success and the success of your organization as a whole, whether it is among the majority of medium and small businesses, a big company or even a governmental organization [2].

  13. Archosauriform remains from the Late Triassic of San Luis province, Argentina, Quebrada del Barro Formation, Marayes-El Carrizal Basin

    Science.gov (United States)

    Gianechini, Federico A.; Codorniú, Laura; Arcucci, Andrea B.; Castillo Elías, Gabriela; Rivarola, David

    2016-03-01

Here we present archosauriform remains from 'Abra de los Colorados', a fossiliferous locality at Sierra de Guayaguas, NW San Luis Province. Two fossiliferous levels were identified in outcrops of the Quebrada del Barro Formation (Norian), which represent the southernmost outcrops of the Marayes-El Carrizal Basin. These levels are composed of massive muddy lithofacies, interpreted as floodplain deposits. The specimens consist of one incomplete maxilla (MIC-V718), one caudal vertebra (MIC-V719), one metatarsal (MIC-V720) and one indeterminate appendicular bone (MIC-V721). The materials can be assigned to Archosauriformes, but their fragmentary nature and lack of unambiguous synapomorphies preclude a more precise taxonomic assignment. The maxilla is remarkably large and robust and represents the posterior process. It preserves one partially erupted tooth with ziphodont morphology. This bone shows anatomical traits and a size that match those of 'rauisuchians' and theropods. MIC-V719 corresponds to a proximal caudal vertebra. It has a high centrum, a ventral longitudinal furrow, expanded articular processes for the chevrons, a posteriorly displaced diapophysis located below the level of the prezygapophyses, and short prezygapophyses. This vertebra would belong to an indeterminate archosauriform. MIC-V720 presents a cylindrical diaphysis with a well-developed distal trochlea, which resembles the metatarsals of theropods, pseudosuchians, and silesaurids, although its size matches better with theropods. MIC-V721 has a slender diaphysis and a convex triangular articular surface, and corresponds to an indeterminate archosauriform. Despite being fragmentary, these materials indicate the presence of a diverse archosauriform association in the Late Triassic beds of San Luis. Thus, they add to the faunal assemblage recently reported from this basin in San Juan Province, which is much richer and more diverse than the coeval paleofauna well known from the Los Colorados Formation in the

  14. Modeling of a Large-Scale High Temperature Regenerative Sulfur Removal Process

    DEFF Research Database (Denmark)

    Konttinen, Jukka T.; Johnsson, Jan Erik

    1999-01-01

    model that does not account for bed hydrodynamics. The pilot-scale test run results, obtained in the test runs of the sulfur removal process with real coal gasifier gas, have been used for parameter estimation. The validity of the reactor model for commercial-scale design applications is discussed.......Regenerable mixed metal oxide sorbents are prime candidates for the removal of hydrogen sulfide from hot gasifier gas in the simplified integrated gasification combined cycle (IGCC) process. As part of the regenerative sulfur removal process development, reactor models are needed for scale......-up. Steady-state kinetic reactor models are needed for reactor sizing, and dynamic models can be used for process control design and operator training. The regenerative sulfur removal process to be studied in this paper consists of two side-by-side fluidized bed reactors operating at temperatures of 400...

  15. Highly efficient DNA extraction method from skeletal remains

    Directory of Open Access Journals (Sweden)

    Irena Zupanič Pajnič

    2011-03-01

Full Text Available Background: This paper precisely describes the method of DNA extraction developed to acquire high-quality DNA from Second World War skeletal remains. The same method is also used for molecular genetic identification of unknown decomposed bodies in routine forensic casework where only bones and teeth are suitable for DNA typing. We analysed 109 bones and two teeth from WWII mass graves in Slovenia. Methods: We cleaned the bones and teeth, removed surface contaminants and ground the bones into powder using liquid nitrogen. Prior to isolating the DNA in parallel using the BioRobot EZ1 (Qiagen), the powder was decalcified for three days. The nuclear DNA of the samples was quantified by a real-time PCR method. We acquired autosomal genetic profiles and Y-chromosome haplotypes of the bones and teeth by PCR amplification of microsatellites, and mtDNA haplotypes. For the purpose of traceability in the event of contamination, we prepared elimination databases including genetic profiles of the nuclear and mtDNA of all persons who had been in touch with the skeletal remains in any way. Results: We extracted up to 55 ng DNA/g from the teeth, up to 100 ng DNA/g from the femurs, up to 30 ng DNA/g from the tibias and up to 0.5 ng DNA/g from the humeri. The typing of autosomal and Y-STR loci was successful in all of the teeth, in 98 % of the femurs, and in 75 % to 81 % of the tibias and humeri. The typing of mtDNA was successful in all of the teeth, and in 96 % to 98 % of the bones. Conclusions: We managed to obtain nuclear DNA for successful STR typing from skeletal remains that were over 60 years old. The method of DNA extraction described here has proved to be highly efficient. We obtained 0.8 to 100 ng DNA/g of teeth or bones and complete genetic profiles of autosomal DNA, Y-STR haplotypes, and mtDNA haplotypes from only 0.5 g bone and teeth samples.

  16. "SINCE I MUST PLEASE THOSE BELOW": HUMAN SKELETAL REMAINS RESEARCH AND THE LAW.

    Science.gov (United States)

    Holland, Thomas D

    2015-01-01

    The ethics of non-invasive scientific research on human skeletal remains are poorly articulated and lack a single, definitive analogue in western law. Laws governing invasive research on human fleshed remains, as well as bio-ethical principles established for research on living subjects, provide effective models for the establishment of ethical guidelines for non-invasive research on human skeletal remains. Specifically, non-invasive analysis of human remains is permissible provided that the analysis and collection of resulting data (1) are accomplished with respect for the dignity of the individual, (2) do not violate the last-known desire of the deceased, (3) do not adversely impact the right of the next of kin to perform a ceremonious and decent disposal of the remains, and (4) do not unduly or maliciously violate the privacy interests of the next of kin.

  17. Supersize me: Remains of three white-tailed deer (Odocoileus virginianus) in an invasive Burmese python (Python molurus bivittatus) in Florida

    Science.gov (United States)

    Boback, Scott M.; Snow, Ray W.; Hsu, Teresa; Peurach, Suzanne C.; Dove, Carla J.; Reed, Robert N.

    2016-01-01

    Snakes have become successful invaders in a wide variety of ecosystems worldwide. In southern Florida, USA, the Burmese python (Python molurus bivittatus) has become established across thousands of square kilometers including all of Everglades National Park (ENP). Both experimental and correlative data have supported a relationship between Burmese python predation and declines or extirpations of mid- to large-sized mammals in ENP. In June 2013 a large python (4.32 m snout-vent length, 48.3 kg) was captured and removed from the park. Subsequent necropsy revealed a massive amount of fecal matter (79 cm in length, 6.5 kg) within the snake’s large intestine. A comparative examination of bone, teeth, and hooves extracted from the fecal contents revealed that this snake consumed three white-tailed deer (Odocoileus virginianus). This is the first report of an invasive Burmese python containing the remains of multiple white-tailed deer in its gut. Because the largest snakes native to southern Florida are not capable of consuming even mid-sized mammals, pythons likely represent a novel predatory threat to white-tailed deer in these habitats. This work highlights the potential impact of this large-bodied invasive snake and supports the need for more work on invasive predator-native prey relationships.

  18. Utilization of Workflow Process Maps to Analyze Gaps in Critical Event Notification at a Large, Urban Hospital.

    Science.gov (United States)

    Bowen, Meredith; Prater, Adam; Safdar, Nabile M; Dehkharghani, Seena; Fountain, Jack A

    2016-08-01

Stroke care is a time-sensitive workflow involving multiple specialties acting in unison, often relying on one-way paging systems to alert care providers. The goal of this study was to map and quantitatively evaluate such a system and address communication gaps with system improvements. A workflow process map of the stroke notification system at a large, urban hospital was created via observation and interviews with hospital staff. We recorded pager communication regarding 45 patients in the emergency department (ED), neuroradiology reading room (NRR), and a clinician residence (CR), categorizing transmissions as successful or unsuccessful (dropped or unintelligible). Data analysis and consultation with information technology staff and the vendor informed a quality intervention: replacing one paging antenna and adding another. Data from a 1-month post-intervention period were collected. Error rates before and after were compared using a chi-squared test. Seventy-five pages regarding 45 patients were recorded pre-intervention; 88 pages regarding 86 patients were recorded post-intervention. Initial transmission error rates in the ED, NRR, and CR were 40.0, 22.7, and 12.0 %. Post-intervention, error rates were 5.1, 18.8, and 1.1 %, a statistically significant improvement in the ED. The workflow process map effectively defined communication failure parameters, allowing for systematic testing and intervention to improve communication in essential clinical locations.
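The before/after comparison described above is a standard 2×2 chi-squared test. As a minimal sketch (the per-location page counts below are hypothetical, back-derived from the reported ED rates, not taken from the study):

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-squared test (no continuity correction) for a 2x2 table
    [[a, b], [c, d]]; returns (statistic, p-value) for 1 degree of freedom."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        exp = row * col / n
        stat += (obs - exp) ** 2 / exp
    # Chi-squared survival function with 1 df via the error function.
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Hypothetical ED counts consistent with the reported 40.0% -> 5.1% rates:
# 30 failed pages of 75 before the antenna fix, 4 of 78 after.
stat, p = chi2_2x2(30, 75 - 30, 4, 78 - 4)
print(f"chi2 = {stat:.2f}, p = {p:.2e}")
```

With counts of this magnitude the statistic is far beyond the 3.84 critical value at alpha = 0.05, matching the study's finding of a significant ED improvement.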

  19. Reflections on Teaching a Large Class.

    Science.gov (United States)

    Miner, Rick

    1992-01-01

    Uses an analysis of small- and large-class differences as a framework for planning for and teaching a large class. Analyzes the process of developing and offering an organizational behavior class to 141 college students. Suggests ways to improve teaching effectiveness by minimizing psychological and physical distances, redistributing resources,…

  20. Discovering Reference Process Models by Mining Process Variants

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    Recently, a new generation of adaptive Process-Aware Information Systems (PAIS) has emerged, which allows for dynamic process and service changes (e.g., to insert, delete, and move activities and service executions in a running process). This, in turn, has led to a large number of process variants

  1. Lama guanicoe remains from the Chaco ecoregion (Córdoba, Argentina): An osteological approach to the characterization of a relict wild population.

    Science.gov (United States)

    Costa, Thiago; Barri, Fernando

    2018-01-01

    Guanacos (Lama guanicoe) are large ungulates that have been valued by human populations in South America since the Late Pleistocene. Even though they were very abundant until the end of the 19th century (before the high deforestation rate of the last decades), guanacos have nearly disappeared in the Gran Chaco ecoregion, with relicts and isolated populations surviving in some areas, such as the shrubland area near the saline depressions of Córdoba province, Argentina. In this report, we present the first data from a locally endangered guanaco wild population, through the study of skeletal remains recovered in La Providencia ranch. Our results showed that most of the elements belonged to adults aged between 36 and 96 months; sex evaluation showed similar numbers of males and females. Statistical analysis of the body size of modern samples from Córdoba demonstrated that guanacos from the Chaco had large dimensions and presented lower size variability than the modern and archaeological specimens in our database. Moreover, they exhibited dimensions similar to those of modern guanacos from Patagonia and San Juan, and to archaeological specimens from Ongamira and Cerro Colorado, although further genetic studies are needed to corroborate a possible phylogenetic relationship. Finally, we used archaeozoological techniques to provide a first characterization of a relict guanaco population from the Chaco ecoregion, demonstrating its value to the study of modern skeletal remains and species conservation biology.

  2. A digital gigapixel large-format tile-scan camera.

    Science.gov (United States)

    Ben-Ezra, M

    2011-01-01

Although the resolution of single-lens reflex (SLR) and medium-format digital cameras has increased in recent years, applications for cultural-heritage preservation and computational photography require even higher resolutions. Addressing this issue, a large-format camera's large image plane can achieve very high resolution without compromising pixel size and thus can provide high-quality, high-resolution images. This digital large-format tile-scan camera can acquire high-quality, high-resolution images of static scenes. It employs unique calibration techniques and a simple algorithm for focal-stack processing of very large images with significant magnification variations. The camera automatically collects overlapping focal stacks and processes them into a high-resolution, extended-depth-of-field image.
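The abstract does not spell out the camera's focal-stack algorithm; the usual approach is to pick, per pixel, the stack slice with the highest local sharpness. A minimal Python/NumPy sketch, assuming a Laplacian-energy sharpness measure (an illustrative choice, not necessarily the one used by this camera):

```python
import numpy as np

def fuse_focal_stack(stack):
    """Fuse a focal stack of shape (k, h, w) into one extended-depth-of-field
    image by picking, per pixel, the slice with the largest |Laplacian|."""
    lap = np.zeros_like(stack)
    # Discrete 4-neighbour Laplacian on the interior as a sharpness measure.
    lap[:, 1:-1, 1:-1] = (stack[:, :-2, 1:-1] + stack[:, 2:, 1:-1] +
                          stack[:, 1:-1, :-2] + stack[:, 1:-1, 2:] -
                          4 * stack[:, 1:-1, 1:-1])
    sharpest = np.abs(lap).argmax(axis=0)                 # (h, w) index map
    return np.take_along_axis(stack, sharpest[None], axis=0)[0]

# Two synthetic slices: one sharp on the left half, one sharp on the right.
rng = np.random.default_rng(0)
detail = rng.random((32, 32))
blurry = np.full((32, 32), detail.mean())
s0 = np.where(np.arange(32) < 16, detail, blurry)   # left columns sharp
s1 = np.where(np.arange(32) >= 16, detail, blurry)  # right columns sharp
fused = fuse_focal_stack(np.stack([s0, s1]))
```

In the fused result each region comes from whichever slice was in focus there, which is the essence of extended-depth-of-field processing.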

  3. Run-of-River Impoundments Can Remain Unfilled While Transporting Gravel Bedload: Numerical Modeling Results

    Science.gov (United States)

    Pearson, A.; Pizzuto, J. E.

    2015-12-01

Previous work at run-of-river (ROR) dams in northern Delaware has shown that bedload supplied to ROR impoundments can be transported over the dam when impoundments remain unfilled. Transport is facilitated by high levels of sand in the impoundment that lowers the critical shear stresses for particle entrainment, and an inversely sloping sediment ramp connecting the impoundment bed (where the water depth is typically equal to the dam height) with the top of the dam (Pearson and Pizzuto, in press). We demonstrate with one-dimensional bed material transport modeling that bed material can move through impoundments and that equilibrium transport (i.e., a balance between supply to and export from the impoundment, with a constant bed elevation) is possible even when the bed elevation is below the top of the dam. Based on our field work and previous HEC-RAS modeling, we assess bed material transport capacity at the base of the sediment ramp (and ignore detailed processes carrying sediment up the ramp and over the dam). The hydraulics at the base of the ramp are computed using a weir equation, providing estimates of water depth, velocity, and friction, based on the discharge and sediment grain size distribution of the impoundment. Bedload transport rates are computed using the Wilcock-Crowe equation, and changes in the impoundment's bed elevation are determined by sediment continuity. Our results indicate that impoundments pass the gravel supplied from upstream with deep pools when gravel supply rate is low, gravel grain sizes are relatively small, sand supply is high, and discharge is high. Conversely, impoundments will tend to fill their pools when gravel supply rate is high, gravel grain sizes are relatively large, sand supply is low, and discharge is low. The rate of bedload supplied to an impoundment is the primary control on how fast equilibrium transport is reached, with discharge having almost no influence on the timing of equilibrium.
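The hydraulics step described above, a weir equation giving head and velocity at the dam from discharge, can be illustrated with the standard rectangular sharp-crested weir formula; the discharge coefficient, dam width, and pool depth below are hypothetical values for illustration, not the authors' calibrated numbers:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def weir_head(Q, width, cd=0.6):
    """Head H (m) over a rectangular sharp-crested weir, inverting
    Q = Cd * (2/3) * sqrt(2g) * b * H^(3/2) for a given discharge Q (m^3/s)."""
    coeff = cd * (2.0 / 3.0) * math.sqrt(2.0 * G) * width
    return (Q / coeff) ** (2.0 / 3.0)

def approach_velocity(Q, width, depth):
    """Mean flow velocity (m/s) in the impoundment just upstream of the dam."""
    return Q / (width * depth)

H = weir_head(Q=12.0, width=20.0)        # hypothetical discharge and dam width
v = approach_velocity(12.0, 20.0, 2.0)   # hypothetical 2 m pool depth
```

Higher discharge raises both the head over the dam and the approach velocity, which is why discharge appears as a control on whether the pool passes or traps gravel.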

  4. Predicting the Remaining Useful Life of Rolling Element Bearings

    DEFF Research Database (Denmark)

    Hooghoudt, Jan Otto; Jantunen, E; Yi, Yang

    2018-01-01

    Condition monitoring of rolling element bearings is of vital importance in order to keep the industrial wheels running. In wind industry this is especially important due to the challenges in practical maintenance. The paper presents an attempt to improve the capability of prediction of remaining...

  5. Computer-Controlled Cylindrical Polishing Process for Large X-Ray Mirror Mandrels

    Science.gov (United States)

    Khan, Gufran S.; Gubarev, Mikhail; Speegle, Chet; Ramsey, Brian

    2010-01-01

We are developing high-energy grazing-incidence shell optics for hard-x-ray telescopes. The resolution of the mirror shells depends on the quality of the cylindrical mandrel from which they are replicated. Mid-spatial-frequency axial figure error is a dominant contributor in the error budget of the mandrel. This paper presents our efforts to develop a deterministic cylindrical polishing process in order to keep the mid-spatial-frequency axial figure errors to a minimum. Simulation software was developed to model the residual surface figure errors of a mandrel due to the polishing process parameters and the tools used, as well as to compute the optical performance of the optics. The study carried out using the developed software focused on establishing a relationship between the polishing process parameters and the mid-spatial-frequency error generation. The process parameters modeled are the speeds of the lap and the mandrel, the tool's influence function, the contour path (dwell) of the tools, their shape, and the distribution of the tools on the polishing lap. Using the inputs from the mathematical model, a mandrel having a conically approximated Wolter-1 geometry has been polished on a newly developed computer-controlled cylindrical polishing machine. The preliminary results of a series of polishing experiments demonstrate qualitative agreement with the developed model. We report our first experimental results and discuss plans for further improvements in the polishing process. The ability to simulate the polishing process is critical to optimizing the polishing, improving the mandrel quality, and significantly reducing the cost of mandrel production.
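The core of such a deterministic-polishing model is that predicted material removal is the tool influence function convolved with the dwell-time map. A one-dimensional axial sketch (the Gaussian influence function and dwell values are illustrative assumptions, not the paper's model):

```python
import numpy as np

def removal_profile(dwell, tif):
    """Predicted removal along the mandrel axis: the convolution of the
    dwell-time map with the tool influence function (removal per unit time)."""
    return np.convolve(dwell, tif, mode="same")

# Hypothetical Gaussian influence function, normalized to unit removal.
x = np.linspace(-3, 3, 31)
tif = np.exp(-x**2)
tif /= tif.sum()

# Dwell map with 50% extra time over a region of high measured figure error.
dwell = np.ones(200)
dwell[80:120] += 0.5
removal = removal_profile(dwell, tif)
```

The smoothing effect of the influence function in this convolution is exactly what limits how sharply the process can correct mid-spatial-frequency errors.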

  6. Quality Function Deployment for Large Systems

    Science.gov (United States)

    Dean, Edwin B.

    1992-01-01

    Quality Function Deployment (QFD) is typically applied to small subsystems. This paper describes efforts to extend QFD to large scale systems. It links QFD to the system engineering process, the concurrent engineering process, the robust design process, and the costing process. The effect is to generate a tightly linked project management process of high dimensionality which flushes out issues early to provide a high quality, low cost, and, hence, competitive product. A pre-QFD matrix linking customers to customer desires is described.
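The customer-to-characteristic linkage in a QFD matrix reduces to an importance-weighted matrix product. A minimal sketch with hypothetical customer desires, engineering characteristics, and the conventional 9/3/1 relationship scores:

```python
import numpy as np

# Hypothetical house-of-quality fragment: rows = customer desires,
# columns = engineering characteristics; 9 = strong, 3 = medium, 1 = weak.
relationships = np.array([
    [9, 3, 0],   # "low cost"
    [3, 9, 1],   # "high reliability"
    [0, 1, 9],   # "easy maintenance"
])
importance = np.array([5, 4, 3])   # customer-assigned importance weights

# Technical priorities are importance-weighted column sums.
priorities = importance @ relationships
ranking = priorities.argsort()[::-1]   # characteristics, most critical first
```

Scaling this matrix up, and chaining matrices stage to stage, is the mechanical part of deploying QFD across a large system as the paper describes.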

  7. Glyphosate-tolerant soybeans remain compositionally equivalent to conventional soybeans (Glycine max L.) during three years of field testing.

    Science.gov (United States)

    McCann, Melinda C; Liu, Keshun; Trujillo, William A; Dobert, Raymond C

    2005-06-29

    Previous studies have shown that the composition of glyphosate-tolerant soybeans (GTS) and selected processed fractions was substantially equivalent to that of conventional soybeans over a wide range of analytes. This study was designed to determine if the composition of GTS remains substantially equivalent to conventional soybeans over the course of several years and when introduced into multiple genetic backgrounds. Soybean seed samples of both GTS and conventional varieties were harvested during 2000, 2001, and 2002 and analyzed for the levels of proximates, lectin, trypsin inhibitor, and isoflavones. The measured analytes are representative of the basic nutritional and biologically active components in soybeans. Results show a similar range of natural variability for the GTS soybeans as well as conventional soybeans. It was concluded that the composition of commercial GTS over the three years of breeding into multiple varieties remains equivalent to that of conventional soybeans.

  8. High-Throughput Tabular Data Processor - Platform independent graphical tool for processing large data sets.

    Science.gov (United States)

    Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz

    2018-01-01

High-throughput technologies generate a considerable amount of data which often requires bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform-independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command-line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease-predisposing variants in next-generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merging, reduction, and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long-term potential functionality in high-throughput analysis pipelines, as the program is not limited by currently existing applications and data formats. HTDP is available as open-source software (https://github.com/pmadanecki/htdp).
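HTDP itself is a Java GUI program, but the kind of filter-and-merge step it automates on character-delimited columns can be sketched in a few lines of Python; the variant tables below are invented for illustration:

```python
import csv
import io

# Two hypothetical tab-delimited tables keyed on a variant ID, mimicking the
# merge/filter operations HTDP performs on character-delimited column data.
calls = "id\tgene\tdepth\nv1\tNF1\t44\nv2\tNF1\t9\nv3\tTP53\t60\n"
notes = "id\tclass\nv1\tpathogenic\nv3\tbenign\n"

def read_tsv(text):
    """Parse tab-delimited text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text), delimiter="\t"))

annot = {row["id"]: row["class"] for row in read_tsv(notes)}

# Keep well-covered calls and attach the annotation where one exists.
merged = [
    {**row, "class": annot.get(row["id"], "NA")}
    for row in read_tsv(calls)
    if int(row["depth"]) >= 20
]
```

The depth threshold plays the role of an HTDP filter condition, and the ID-keyed join plays the role of its merge across multiple input files.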

  9. The neural basis of novelty and appropriateness in processing of creative chunk decomposition.

    Science.gov (United States)

    Huang, Furong; Fan, Jin; Luo, Jing

    2015-06-01

    Novelty and appropriateness have been recognized as the fundamental features of creative thinking. However, the brain mechanisms underlying these features remain largely unknown. In this study, we used event-related functional magnetic resonance imaging (fMRI) to dissociate these mechanisms in a revised creative chunk decomposition task in which participants were required to perform different types of chunk decomposition that systematically varied in novelty and appropriateness. We found that novelty processing involved functional areas for procedural memory (caudate), mental rewarding (substantia nigra, SN), and visual-spatial processing, whereas appropriateness processing was mediated by areas for declarative memory (hippocampus), emotional arousal (amygdala), and orthography recognition. These results indicate that non-declarative and declarative memory systems may jointly contribute to the two fundamental features of creative thinking. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Screw Remaining Life Prediction Based on Quantum Genetic Algorithm and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Xiaochen Zhang

    2017-01-01

Full Text Available To predict the remaining life of a ball screw, a screw remaining-life prediction method based on a quantum genetic algorithm (QGA) and a support vector machine (SVM) is proposed. A screw accelerated-test bench is introduced. Accelerometers are installed to monitor the performance degradation of the ball screw. Combined with wavelet packet decomposition and isometric mapping (Isomap), the sensitive feature vectors are obtained and stored in a database. Meanwhile, sensitive feature vectors are randomly chosen from the database to constitute training samples and testing samples. Then the optimal kernel function parameter and penalty factor of the SVM are searched with the QGA. Finally, the training samples are used to train the optimized SVM while the testing samples are adopted to test the prediction accuracy of the trained SVM, so that the screw remaining-life prediction model can be obtained. The experimental results show that the model can effectively predict screw remaining life.
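The paper's quantum genetic algorithm is not reproduced here; as an illustrative stand-in, a plain real-coded genetic algorithm searching the (log C, log gamma) plane against a surrogate objective shows the shape of the parameter-search step. In a real pipeline the fitness function would be SVM cross-validation accuracy; the surrogate below is an invented smooth function with a known peak:

```python
import random

random.seed(1)

def cv_accuracy(log_c, log_gamma):
    """Stand-in for SVM cross-validation accuracy; a real pipeline would
    train/test an SVM here. The surrogate peaks at log C = 1, log gamma = -1."""
    return 1.0 / (1.0 + (log_c - 1.0) ** 2 + (log_gamma + 1.0) ** 2)

def genetic_search(fitness, bounds, pop=30, gens=40, mut=0.3):
    """Minimal real-coded genetic algorithm (a simplified stand-in for the
    QGA): elitist selection, blend crossover, Gaussian mutation."""
    lo, hi = bounds
    popn = [[random.uniform(lo, hi), random.uniform(lo, hi)] for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(popn, key=lambda p: -fitness(*p))
        elite = scored[: pop // 2]             # keep the best half
        children = []
        while len(children) < pop - len(elite):
            a, b = random.sample(elite, 2)
            child = [(x + y) / 2 + random.gauss(0, mut) for x, y in zip(a, b)]
            children.append([min(hi, max(lo, v)) for v in child])
        popn = elite + children
    return max(popn, key=lambda p: fitness(*p))

best_log_c, best_log_gamma = genetic_search(cv_accuracy, (-3.0, 3.0))
```

Because the best individual is never discarded, the search converges toward the fitness peak; the QGA in the paper plays the same role with a quantum-bit encoding of the candidate parameters.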

  11. Authentic leadership: becoming and remaining an authentic nurse leader.

    Science.gov (United States)

    Murphy, Lin G

    2012-11-01

    This article explores how chief nurse executives became and remained authentic leaders. Using narrative inquiry, this qualitative study focused on the life stories of participants. Results demonstrate the importance of reframing, reflection in alignment with values, and the courage needed as nurse leaders progress to authenticity.

  12. Liquid fuels from food waste: An alternative process to co-digestion

    Science.gov (United States)

    Sim, Yoke-Leng; Ch'ng, Boon-Juok; Mok, Yau-Cheng; Goh, Sok-Yee; Hilaire, Dickens Saint; Pinnock, Travis; Adams, Shemlyn; Cassis, Islande; Ibrahim, Zainab; Johnson, Camille; Johnson, Chantel; Khatim, Fatima; McCormack, Andrece; Okotiuero, Mary; Owens, Charity; Place, Meoak; Remy, Cristine; Strothers, Joel; Waithe, Shannon; Blaszczak-Boxe, Christopher; Pratt, Lawrence M.

    2017-04-01

    Waste from uneaten, spoiled, or otherwise unusable food is an untapped source of material for biofuels. A process is described to recover the oil from mixed food waste, together with a solid residue. This process includes grinding the food waste to an aqueous slurry, skimming off the oil, a combined steam treatment of the remaining solids concurrent with extrusion through a porous cylinder to release the remaining oil, a second oil skimming step, and centrifuging the solids to obtain a moist solid cake for fermentation. The water, together with any resulting oil from the centrifuging step, is recycled back to the grinding step, and the cycle is repeated. The efficiency of oil extraction increases with the oil content of the waste, and greater than 90% of the oil was collected from waste containing at least 3% oil based on the wet mass. Fermentation was performed on the solid cake to obtain ethanol, and the dried solid fermentation residue was a nearly odorless material with potential uses of biochar, gasification, or compost production. This technology has the potential to enable large producers of food waste to comply with new laws which require this material to be diverted from landfills.
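The two skimming steps explain how overall recovery can exceed 90% even with imperfect per-pass capture: the second pass works on whatever the first missed. A toy mass balance with hypothetical 70% per-pass efficiencies (the paper does not report per-stage figures):

```python
def two_stage_recovery(oil_kg, first_skim=0.7, second_skim=0.7):
    """Oil captured by two sequential skimming steps, where the second pass
    operates only on the oil the first pass missed (hypothetical efficiencies)."""
    first = oil_kg * first_skim
    second = (oil_kg - first) * second_skim
    return first + second

total = two_stage_recovery(50.0)   # 50 kg of oil present in the waste
frac = total / 50.0                # combined recovery fraction
```

Two 70% passes combine to 1 - 0.3^2 = 91%, consistent with the greater-than-90% recovery reported for waste containing at least 3% oil.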

  13. Large deviations for noninteracting infinite-particle systems

    International Nuclear Information System (INIS)

    Donsker, M.D.; Varadhan, S.R.S.

    1987-01-01

    A large deviation property is established for noninteracting infinite particle systems. Previous large deviation results obtained by the authors involved a single I-function because the cases treated always involved a unique invariant measure for the process. In the context of this paper there is an infinite family of invariant measures and a corresponding infinite family of I-functions governing the large deviations
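For context, the I-function referred to here is the rate function of a large deviation principle, which in the standard Donsker-Varadhan form reads, schematically, for the empirical measure $L_n$ of the process:

```latex
P\left( L_n \in A \right) \asymp \exp\left( -n \inf_{\mu \in A} I(\mu) \right)
```

With a unique invariant measure a single rate function $I$ governs the asymptotics; the setting described in this paper, with an infinite family of invariant measures, correspondingly yields an infinite family of I-functions.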

  14. Recovery of human remains after shark attack.

    Science.gov (United States)

    Byard, Roger W; James, Ross A; Heath, Karen J

    2006-09-01

    Two cases of fatal shark attack are reported where the only tissues recovered were fragments of lung. Case 1: An 18-year-old male who was in the sea behind a boat was observed by friends to be taken by a great white shark (Carcharodon carcharias). The shark dragged him under the water and then, with a second shark, dismembered the body. Witnesses noted a large amount of blood and unrecognizable body parts coming to the surface. The only tissues recovered despite an intensive beach and sea search were 2 fragments of lung. Case 2: A 19-year-old male was attacked by a great white shark while diving. A witness saw the shark swim away with the victim's body in its mouth. Again, despite intensive beach and sea searches, the only tissue recovered was a single piece of lung, along with pieces of wetsuit and diving equipment. These cases indicate that the only tissue to escape being consumed or lost in fatal shark attacks, where there is a significant attack with dismemberment and disruption of the integrity of the body, may be lung. The buoyancy of aerated pulmonary tissue ensures that it rises quickly to the surface, where it may be recovered by searchers soon after the attack. Aeration of the lung would be in keeping with death from trauma rather than from drowning and may be a useful marker in unwitnessed deaths to separate ante- from postmortem injury, using only relatively small amounts of tissues. Early organ recovery enhances the identification of human tissues as the extent of morphologic alterations by putrefactive processes and sea scavengers will have been minimized. DNA testing is also possible on such recovered fragments, enabling confirmation of the identity of the victim.

  15. 20 CFR 408.330 - How long will your application remain in effect?

    Science.gov (United States)

    2010-04-01

§ 408.330 How long will your application remain in effect? (20 CFR, Employees' Benefits, Social Security Administration, Special Benefits for Certain World War II Veterans, Filing Applications.) Your application for SVB will remain in effect from the date it is filed until...

  16. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    Science.gov (United States)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be available at regional to national levels and cover long time periods. As a result of the large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain in developing operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter-resolution Landsat (16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision-support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive, combined with other satellite, climate, and weather data, is creating previously unimagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field scales.
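At the heart of SSEBop is a per-pixel ET fraction scaled between a cold (well-watered) and a hot (dry) boundary temperature. A simplified sketch of that scaling; the land-surface temperatures, boundary spread, and reference ET below are illustrative numbers, not SSEBop's calibrated values:

```python
import numpy as np

def ssebop_et_fraction(lst, t_cold, dt):
    """Simplified SSEBop-style ET fraction: 1 at the cold boundary
    temperature t_cold, 0 at the hot boundary t_cold + dt, clipped to [0, 1]."""
    return np.clip((t_cold + dt - lst) / dt, 0.0, 1.0)

# Hypothetical Landsat land-surface temperatures (K) for three pixels,
# with an assumed 15 K spread between the cold and hot boundaries.
lst = np.array([300.0, 307.5, 320.0])
etf = ssebop_et_fraction(lst, t_cold=300.0, dt=15.0)
eta = etf * 6.0   # actual ET (mm/day), assuming 6 mm/day reference ET
```

Because the operation is a per-pixel arithmetic expression, it maps directly onto Earth Engine's image algebra and parallelizes trivially across the Landsat archive.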

  17. LHCb Online event processing and filtering

    CERN Document Server

    Alessio, F; Brarda, L; Frank, M; Franek, B; Galli, D; Gaspar, C; Van Herwijnen, E; Jacobsson, R; Jost, B; Köstner, S; Moine, G; Neufeld, N; Somogyi, P; Stoica, R; Suman, S

    2008-01-01

    The first level trigger of LHCb accepts one million events per second. After preprocessing in custom FPGA-based boards these events are distributed to a large farm of PC-servers using a high-speed Gigabit Ethernet network. Synchronisation and event management is achieved by the Timing and Trigger system of LHCb. Due to the complex nature of the selection of B-events, which are the main interest of LHCb, a full event-readout is required. Event processing on the servers is parallelised on an event basis. The reduction factor is typically 1/500. The remaining events are forwarded to a formatting layer, where the raw data files are formed and temporarily stored. A small part of the events is also forwarded to a dedicated farm for calibration and monitoring. The files are subsequently shipped to the CERN Tier0 facility for permanent storage and from there to the various Tier1 sites for reconstruction. In parallel files are used by various monitoring and calibration processes running within the LHCb Online system. ...

  18. Structural remains at the early mediaeval fort at Raibania, Orissa

    Directory of Open Access Journals (Sweden)

    Bratati Sen

    2013-11-01

Full Text Available The fortifications of mediaeval India occupy an eminent position in the history of military architecture. The present paper deals with a preliminary study of the structural remains at the early mediaeval fort at Raibania in the district of Balasore in Orissa. The fort was built of stone blocks kept together very loosely. The three-walled fortification interspersed by two consecutive moats, a feature evidenced at Raibania, is unparalleled in the history of ancient and mediaeval forts and fortifications in India. Several other structures, like the Jay-Chandi Temple Complex, a huge well, numerous tanks and the remains of an ancient bridge, add to the uniqueness of the fort in the entire eastern region.

  19. Behavioral and electrophysiological signatures of word translation processes.

    Science.gov (United States)

    Jost, Lea B; Radman, Narges; Buetler, Karin A; Annoni, Jean-Marie

    2018-01-31

Translation is a demanding process during which a message is analyzed, translated and communicated from one language to another. Despite numerous studies on translation mechanisms, the electrophysiological processes underlying translation with overt production remain largely unexplored. Here, we investigated how behavioral response patterns and spatial-temporal brain dynamics differ in a translation compared to a control within-language word-generation task. We also investigated how forward and backward translation differ on the behavioral and electrophysiological level. To address these questions, healthy late-bilingual subjects performed a translation and a within-language control task while 128-channel EEG was recorded. Behavioral data showed faster responses for translation compared to within-language word generation and faster responses for backward than forward translation. The ERP analysis revealed stronger early processing for between- than within-language word generation. Later (424-630 ms) differences were characterized by distinct engagement of domain-general control networks, namely self-monitoring and lexical access interference. Language asymmetry effects occurred at a later stage (600 ms), reflecting differences in conceptual processing characterized by a larger involvement of areas implicated in attention, arousal and awareness for forward versus backward translation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Large scale synthesis of α-Si3N4 nanowires through a kinetically favored chemical vapour deposition process

    Science.gov (United States)

    Liu, Haitao; Huang, Zhaohui; Zhang, Xiaoguang; Fang, Minghao; Liu, Yan-gai; Wu, Xiaowen; Min, Xin

    2018-01-01

Understanding the kinetic barrier and driving force for crystal nucleation and growth is decisive for the synthesis of nanowires with controllable yield and morphology. In this research, we developed an effective reaction system to synthesize very large scale α-Si3N4 nanowires (hundreds of milligrams) and carried out a comparative study to characterize the kinetic influence of gas-precursor supersaturation and the liquid metal catalyst. The phase composition, morphology, microstructure and photoluminescence properties of the as-synthesized products were characterized by X-ray diffraction, Fourier-transform infrared spectroscopy, field-emission scanning electron microscopy, transmission electron microscopy and room-temperature photoluminescence measurement. The yield of the products relates not only to the reaction temperature (thermodynamic condition) but also to the distribution of gas precursors (kinetic condition). As revealed in this research, by controlling the gas diffusion process the yield of the nanowire products can be greatly improved. The experimental results indicate that supersaturation is the dominant factor in the as-designed system, rather than the catalyst. With excellent non-flammability and high thermal stability, the large-scale α-Si3N4 products have potential applications in improving the strength of high-temperature ceramic composites. The photoluminescence spectrum of the α-Si3N4 shows a blue shift which could be valuable for future applications in blue-green emitting devices. Large-scale products are the basis of all of these applications.

  1. Genic regions of a large salamander genome contain long introns and novel genes

    Directory of Open Access Journals (Sweden)

    Bryant Susan V

    2009-01-01

Full Text Available Abstract Background The basis of genome size variation remains an outstanding question because DNA sequence data are lacking for organisms with large genomes. Sixteen BAC clones from the Mexican axolotl (Ambystoma mexicanum; c-value = 32 × 10⁹ bp) were isolated and sequenced to characterize the structure of genic regions. Results Annotation of genes within BACs showed that axolotl introns are on average 10× longer than orthologous vertebrate introns and they are predicted to contain more functional elements, including miRNAs and snoRNAs. Loci were discovered within BACs for two novel EST transcripts that are differentially expressed during spinal cord regeneration and skin metamorphosis. Unexpectedly, a third novel gene was also discovered while manually annotating BACs. Analysis of human-axolotl protein-coding sequences suggests there are 2% more lineage-specific genes in the axolotl genome than the human genome, but the great majority (86%) of genes between axolotl and human are predicted to be 1:1 orthologs. Considering that axolotl genes are on average 5× larger than human genes, the genic component of the salamander genome is estimated to be incredibly large, approximately 2.8 gigabases! Conclusion This study shows that a large salamander genome has a correspondingly large genic component, primarily because genes have incredibly long introns. These intronic sequences may harbor novel coding and non-coding sequences that regulate biological processes that are unique to salamanders.

  2. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large-scale measurement channel allows the processing of the signal coming from a single neutronic sensor during three different running modes: pulses, fluctuations and current. The study described in this note includes three parts: - A theoretical study of the large-scale channel and a brief description of it; the results obtained so far in that domain are presented. - The fluctuation mode is thoroughly studied and the improvements to be made are defined. The study of a linear fluctuation channel with automatic scale commutation is described and the results of the tests are given. In this large-scale channel, the data processing is analogue. - To become independent of the problems generated by analogue processing of the fluctuation signal, a digital data-processing method is tested and its validity demonstrated. The results obtained on a test system realized according to this method are given and a preliminary plan for further research is defined [fr

  3. 76 FR 14058 - Notice of Inventory Completion: University of Wyoming, Anthropology Department, Human Remains...

    Science.gov (United States)

    2011-03-15

    ...: University of Wyoming, Anthropology Department, Human Remains Repository, Laramie, WY AGENCY: National Park... in the possession and control of the University of Wyoming Anthropology Department, Human Remains... made by University of Wyoming, Anthropology Department, Human Remains Repository, professional staff in...

  4. Large Data Visualization with Open-Source Tools

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Visualization and post-processing of large data have become increasingly challenging and require more and more tools to support the diversity of data to process. In this seminar, we will present a suite of open-source tools supported and developed by Kitware to perform large-scale data visualization and analysis. In particular, we will present ParaView, an open-source tool for parallel visualization of massive datasets, the Visualization Toolkit (VTK), an open-source toolkit for scientific visualization, and Tangelohub, a suite of tools for large data analytics. About the speaker Julien Jomier is directing Kitware's European subsidiary in Lyon, France, where he focuses on European business development. Julien works on a variety of projects in the areas of parallel and distributed computing, mobile computing, image processing, and visualization. He is one of the developers of the Insight Toolkit (ITK), the Visualization Toolkit (VTK), and ParaView. Julien is also leading the CDash project, an open-source co...

  5. Ensemble coding remains accurate under object and spatial visual working memory load.

    Science.gov (United States)

    Epstein, Michael L; Emmanouil, Tatiana A

    2017-10-01

    A number of studies have provided evidence that the visual system statistically summarizes large amounts of information that would exceed the limitations of attention and working memory (ensemble coding). However, the necessity of working memory resources for ensemble coding has not yet been tested directly. In the current study, we used a dual task design to test the effect of object and spatial visual working memory load on size averaging accuracy. In Experiment 1, we tested participants' accuracy in comparing the mean size of two sets under various levels of object visual working memory load. Although the accuracy of average size judgments depended on the difference in mean size between the two sets, we found no effect of working memory load. In Experiment 2, we tested the same average size judgment while participants were under spatial visual working memory load, again finding no effect of load on averaging accuracy. Overall, our results reveal that ensemble coding can proceed unimpeded and highly accurately under both object and spatial visual working memory load, providing further evidence that ensemble coding reflects a basic perceptual process distinct from that of individual object processing.

  6. The effects of Strongylus vulgaris parasitism on eosinophil distribution and accumulation in equine large intestinal mucosa.

    Science.gov (United States)

    Rötting, A K; Freeman, D E; Constable, P D; Moore, R M; Eurell, J C; Wallig, M A; Hubert, J D

    2008-06-01

    Eosinophilic granulocytes have been associated with parasitic or immune-mediated diseases, but their functions in other disease processes remain unclear. The cause and timing of eosinophil migration into the equine gastrointestinal mucosa are also unknown. To determine the effects of intestinal parasitism on eosinophils in equine large intestinal mucosa. Large intestinal mucosal samples were collected from horses and ponies (n = 16) from the general veterinary hospital population, ponies (n = 3) raised in a parasite-free environment, ponies experimentally infected with 500 infective Strongylus vulgaris larvae and treated with a proprietary anthelmintic drug (n = 14), and a similar group of ponies (n = 7) that received no anthelmintic treatment. Total eosinophil counts and eosinophil distribution in the mucosa were determined by histological examination. A mixed model analysis was performed and appropriate Bonferroni-adjusted P values used for each family of comparisons. Eosinophil counts were similar between ponies infected with S. vulgaris and those raised in a parasite-free environment. Experimental infection with S. vulgaris, with or without subsequent anthelmintic treatment, did not change eosinophil counts, and counts were similar to those for horses from the general population. Migration of eosinophils to the equine large intestinal mucosa appears to be independent of exposure to parasites. Large intestinal mucosal eosinophils may have functions in addition to their role in defence against parasites.

  7. Factors affecting the performance of large-aperture microphone arrays

    Science.gov (United States)

    Silverman, Harvey F.; Patterson, William R.; Sachar, Joshua

    2002-05-01

    Large arrays of microphones have been proposed and studied as a possible means of acquiring data in offices, conference rooms, and auditoria without requiring close-talking microphones. When such an array essentially surrounds all possible sources, it is said to have a large aperture. Large-aperture arrays have attractive properties of spatial resolution and signal-to-noise enhancement. This paper presents a careful comparison of theoretical and measured performance for an array of 256 microphones using simple delay-and-sum beamforming. This is the largest currently functional, all digital-signal-processing array that we know of. The array is wall-mounted in the moderately adverse environment of a general-purpose laboratory (8 m×8 m×3 m). The room has a T60 reverberation time of 550 ms. Reverberation effects in this room severely impact the array's performance. However, the width of the main lobe remains comparable to that of a simplified prediction. Broadband spatial resolution shows a single central peak with 10 dB gain about 0.4 m in diameter at the -3 dB level. Away from that peak, the response is approximately flat over most of the room. Optimal weighting for signal-to-noise enhancement degrades the spatial resolution minimally. Experimentally, we verify that signal-to-noise gain is less than proportional to the square root of the number of microphones probably due to the partial correlation of the noise between channels, to variation of signal intensity with polar angle about the source, and to imperfect correlation of the signal over the array caused by reverberations. We show measurements of the relative importance of each effect in our environment.
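
    The simple delay-and-sum beamforming named above can be sketched in a few lines: advance each channel by its propagation delay to the focal point, then average. The geometry, sample rate, and integer-sample delays below are illustrative assumptions, not details of the array described in the record.

```python
import numpy as np

C = 343.0  # nominal speed of sound in air, m/s

def delay_and_sum(signals, mic_positions, focus, fs):
    """Steer an array toward `focus` by advancing each channel by its
    relative propagation delay (rounded to whole samples) and averaging,
    so sound emitted at the focal point adds coherently.

    signals: (n_mics, n_samples) recordings; mic_positions: (n_mics, 3)
    coordinates in metres; focus: (3,) in metres; fs: sample rate in Hz.
    """
    dists = np.linalg.norm(mic_positions - np.asarray(focus), axis=1)
    # Relative delays: the closest microphone gets shift 0.
    rel = np.round((dists - dists.min()) * fs / C).astype(int)
    n = signals.shape[1] - rel.max()
    out = np.zeros(n)
    for sig, d in zip(signals, rel):
        out += sig[d:d + n]
    return out / len(signals)
```

    Signals from other locations arrive with mismatched delays and average toward zero, which is the source of the spatial selectivity (and, with uncorrelated noise, of the signal-to-noise gain) discussed in the abstract.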

  8. Hybrid Laser Welding of Large Steel Structures

    DEFF Research Database (Denmark)

    Farrokhi, Farhang

    Manufacturing of large steel structures requires the processing of thick-section steels. Welding is one of the main processes during the manufacturing of such structures and accounts for a significant part of the production costs. One of the ways to reduce the production costs is to use hybrid laser welding technology instead of the conventional arc welding methods. However, hybrid laser welding is a complicated process that involves several complex, highly coupled physical phenomena. Understanding of the process is very important for obtaining quality welds in an efficient way. This thesis investigates two different challenges related to the hybrid laser welding of thick-section steel plates. Employing empirical and analytical approaches, it attempts to provide further knowledge towards obtaining quality welds in the manufacturing of large steel structures.

  9. TO BE OR NOT TO BE: AN INFORMATIVE NON-SYMBOLIC NUMERICAL MAGNITUDE PROCESSING STUDY ABOUT SMALL VERSUS LARGE NUMBERS IN INFANTS

    Directory of Open Access Journals (Sweden)

    Annelies CEULEMANS

    2014-03-01

    Full Text Available Many studies have tested the association between numerical magnitude processing and mathematical achievement, with conflicting findings reported for individuals with mathematical learning disorders. Some of the inconsistencies might be explained by the number of non-symbolic stimuli or dot collections used in the studies. It has been hypothesized that there is an object-file system for ‘small’ and an analogue magnitude system for ‘large’ numbers. This two-system account has been supported by the set size limit of the object-file system (three items). A boundary was defined accordingly, categorizing numbers below four as ‘small’ and from four upwards as ‘large’. However, data on ‘small’ number processing and on the ‘boundary’ between small and large numbers are missing. In this contribution we provide data from infants discriminating between the number sets 4 vs. 8 and 1 vs. 4, both containing the number four combined with a large and a small number respectively. Participants were 25 and 26 full-term 9-month-olds for 4 vs. 8 and 1 vs. 4 respectively. The stimuli (dots) were controlled for continuous variables. Eye-tracking was combined with the habituation paradigm. The results showed that the infants succeeded in discriminating 1 from 4, but failed to discriminate 4 from 8 dots. This finding supports the assumption of the number four as a ‘small’ number and enlarges the object-file system’s limit. It might also help to explain inconsistencies between studies. Moreover, the information may be useful in answering parents’ questions about challenges that vulnerable children with number-processing problems, such as children with mathematical learning disorders, might encounter. In addition, the study gives some indication of the stimuli that can be used to effectively foster children’s magnitude processing skills.

  10. GPU-based large-scale visualization

    KAUST Repository

    Hadwiger, Markus

    2013-11-19

    Recent advances in image and volume acquisition as well as computational advances in simulation have led to an explosion of the amount of data that must be visualized and analyzed. Modern techniques combine the parallel processing power of GPUs with out-of-core methods and data streaming to enable the interactive visualization of giga- and terabytes of image and volume data. A major enabler for interactivity is making both the computational and the visualization effort proportional to the amount of data that is actually visible on screen, decoupling it from the full data size. This leads to powerful display-aware multi-resolution techniques that enable the visualization of data of almost arbitrary size. The course consists of two major parts: An introductory part that progresses from fundamentals to modern techniques, and a more advanced part that discusses details of ray-guided volume rendering, novel data structures for display-aware visualization and processing, and the remote visualization of large online data collections. You will learn how to develop efficient GPU data structures and large-scale visualizations, implement out-of-core strategies and concepts such as virtual texturing that have only been employed recently, as well as how to use modern multi-resolution representations. These approaches reduce the GPU memory requirements of extremely large data to a working set size that fits into current GPUs. You will learn how to perform ray-casting of volume data of almost arbitrary size and how to render and process gigapixel images using scalable, display-aware techniques. We will describe custom virtual texturing architectures as well as recent hardware developments in this area. We will also describe client/server systems for distributed visualization, on-demand data processing and streaming, and remote visualization. We will describe implementations using OpenGL as well as CUDA, exploiting parallelism on GPUs combined with additional asynchronous
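
    The display-aware principle described above — computational and visualization effort proportional to the pixels actually visible, not to the dataset size — can be illustrated with a toy tile-pyramid lookup. The tile size, level-selection rule, and function are illustrative assumptions, not ParaView or VTK internals.

```python
import math

TILE = 256  # tile edge length in texels (illustrative choice)

def visible_tiles(img_w, img_h, view_x, view_y, view_w, view_h, zoom):
    """Return the (level, i, j) tile keys needed to draw a viewport over a
    tiled image pyramid. `zoom` is screen pixels per full-resolution texel;
    the level is chosen so one stored texel maps to roughly one pixel."""
    level = max(0, math.floor(math.log2(1.0 / zoom))) if zoom < 1.0 else 0
    scale = 2 ** level  # each level halves the resolution
    lw, lh = math.ceil(img_w / scale), math.ceil(img_h / scale)
    x0 = max(0, int(view_x / scale) // TILE)
    y0 = max(0, int(view_y / scale) // TILE)
    x1 = min((lw - 1) // TILE, int((view_x + view_w / zoom) / scale) // TILE)
    y1 = min((lh - 1) // TILE, int((view_y + view_h / zoom) / scale) // TILE)
    return [(level, i, j) for j in range(y0, y1 + 1) for i in range(x0, x1 + 1)]
```

    A 1024×768 viewport needs the same handful of tiles whether the underlying image is one megapixel or one hundred gigapixels — the working set depends only on what is on screen, which is the essence of virtual texturing and ray-guided approaches.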

  11. Manufacturing process to reduce large grain growth in zirconium alloys

    International Nuclear Information System (INIS)

    Rosecrans, P.M.

    1987-01-01

    A method is described of treating cold worked zirconium alloys to reduce large grain growth during thermal treatment above its recrystallization temperature. The method comprises heating the zirconium alloy at a temperature of about 1300°F to 1350°F for about 1 to 3 hours subsequent to cold working the zirconium alloy and prior to the thermal treatment at a temperature of between 1450°F and 1550°F, the thermal treatment temperature being above the recrystallization temperature.

  12. Support vector machine based estimation of remaining useful life: current research status and future trends

    International Nuclear Information System (INIS)

    Huang, Hong Zhong; Wang, Hai Kun; Li, Yan Feng; Zhang, Longlong; Liu, Zhiliang

    2015-01-01

    Estimation of remaining useful life (RUL) is helpful for managing the life cycles of machines and reducing maintenance cost. The support vector machine (SVM) is a promising algorithm for RUL estimation because it can easily process small training sets and multi-dimensional data. Many SVM-based methods have been proposed to predict the RUL of key components. We review the literature on SVM-based RUL estimation from the past decade. The references reviewed are classified into two categories: improved SVM algorithms and their applications to RUL estimation. The latter category can be further divided into two types: methods that predict the future condition state and then build a relationship between state and RUL, and methods that establish a direct relationship between the current state and RUL. However, SVM is seldom used to track the degradation process and build an accurate relationship between the current health condition state and RUL. Based on this review, we identify the continued improvement of SVM algorithms, and novel SVM-based approaches to RUL prediction, as directions for future work.
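
    The second (direct) mapping described above — from current condition state to RUL — can be sketched with scikit-learn's SVR on a synthetic run-to-failure history. The degradation model, kernel, and hyperparameters below are illustrative assumptions, not any method from the reviewed literature.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical run-to-failure record: a health indicator that grows with
# degradation plus measurement noise; RUL counts down to failure.
life = 200
t = np.arange(life)
indicator = 0.01 * t**1.5 + rng.normal(0.0, 0.2, life)
rul = (life - 1 - t).astype(float)

# Direct mapping: current condition state -> remaining useful life.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1.0))
model.fit(indicator.reshape(-1, 1), rul)
pred = model.predict(indicator.reshape(-1, 1))
```

    In practice the feature vector would hold several condition-monitoring signals per time step, and the model would be validated on held-out run-to-failure histories rather than the training run.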

  13. Large deviations in the presence of cooperativity and slow dynamics

    Science.gov (United States)

    Whitelam, Stephen

    2018-06-01

    We study simple models of intermittency, involving switching between two states, within the dynamical large-deviation formalism. Singularities appear in the formalism when switching is cooperative or when its basic time scale diverges. In the first case the unbiased trajectory distribution undergoes a symmetry breaking, leading to a change in shape of the large-deviation rate function for a particular dynamical observable. In the second case the symmetry of the unbiased trajectory distribution remains unbroken. Comparison of these models suggests that singularities of the dynamical large-deviation formalism can signal the dynamical equivalent of an equilibrium phase transition but do not necessarily do so.
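
    For two-state switching models of this kind, the dynamical large-deviation rate function can be computed numerically from the tilted generator. The rates and the choice of observable below (number of switches per unit time, without cooperativity) are a generic illustration of the formalism, not the specific models of the paper.

```python
import numpy as np

def scgf(s, a, b):
    """Scaled cumulant generating function theta(s) for the number of
    switches K of a two-state Markov process with rates a (0->1) and
    b (1->0): the largest eigenvalue of the tilted generator, whose
    off-diagonal (jump) entries are biased by exp(-s)."""
    W = np.array([[-a, b * np.exp(-s)],
                  [a * np.exp(-s), -b]])
    return np.linalg.eigvals(W).real.max()

a, b = 1.0, 2.0
s_grid = np.linspace(-2.0, 2.0, 401)
theta = np.array([scgf(s, a, b) for s in s_grid])

def rate(k):
    """Legendre-Fenchel transform I(k) = sup_s [-s*k - theta(s)],
    approximated on the grid of s values."""
    return np.max(-s_grid * k - theta)

k_typ = 2 * a * b / (a + b)  # mean switching rate in the steady state
```

    At the typical value k_typ the rate function vanishes, while atypical switching rates carry a positive exponential cost; singularities of theta(s), when present, are the signatures discussed in the abstract.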

  14. Hidden supersymmetry and large N

    International Nuclear Information System (INIS)

    Alfaro, J.

    1988-01-01

    In this paper we present a new method to deal with the leading order in the large-N expansion of a quantum field theory. The method uses explicitly the hidden supersymmetry that is present in the path-integral formulation of a stochastic process. In addition to this we derive a new relation that is valid in the leading order of the large-N expansion of the hermitian-matrix model for any spacetime dimension. (orig.)

  15. Cytology of DNA Replication Reveals Dynamic Plasticity of Large-Scale Chromatin Fibers.

    Science.gov (United States)

    Deng, Xiang; Zhironkina, Oxana A; Cherepanynets, Varvara D; Strelkova, Olga S; Kireev, Igor I; Belmont, Andrew S

    2016-09-26

    In higher eukaryotic interphase nuclei, the 100- to >1,000-fold linear compaction of chromatin is difficult to reconcile with its function as a template for transcription, replication, and repair. It is challenging to imagine how DNA and RNA polymerases with their associated molecular machinery would move along the DNA template without transient decondensation of observed large-scale chromatin "chromonema" fibers [1]. Transcription or "replication factory" models [2], in which polymerases remain fixed while DNA is reeled through, are similarly difficult to conceptualize without transient decondensation of these chromonema fibers. Here, we show how a dynamic plasticity of chromatin folding within large-scale chromatin fibers allows DNA replication to take place without significant changes in the global large-scale chromatin compaction or shape of these large-scale chromatin fibers. Time-lapse imaging of lac-operator-tagged chromosome regions shows no major change in the overall compaction of these chromosome regions during their DNA replication. Improved pulse-chase labeling of endogenous interphase chromosomes yields a model in which the global compaction and shape of large-Mbp chromatin domains remains largely invariant during DNA replication, with DNA within these domains undergoing significant movements and redistribution as they move into and then out of adjacent replication foci. In contrast to hierarchical folding models, this dynamic plasticity of large-scale chromatin organization explains how localized changes in DNA topology allow DNA replication to take place without an accompanying global unfolding of large-scale chromatin fibers while suggesting a possible mechanism for maintaining epigenetic programming of large-scale chromatin domains throughout DNA replication. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Role of Sediment Size and Biostratinomy on the Development of Biofilms in Recent Avian Vertebrate Remains

    Directory of Open Access Journals (Sweden)

    Joseph E. Peterson

    2017-04-01

    Full Text Available Microscopic soft tissues have been identified in fossil vertebrate remains collected from various lithologies. However, the diagenetic mechanisms that preserve such tissues have remained elusive. While previous studies have described infiltration of biofilms in Haversian and Volkmann's canals, biostratinomic alteration (e.g., trampling), and iron derived from hemoglobin as playing roles in the preservation processes, the influence of sediment texture has not previously been investigated. This study uses a Kolmogorov-Smirnov goodness-of-fit test to explore the influence of biostratinomic variability and burial media on the infiltration of biofilms in bone samples. Controlled columns of sediment with bone samples were used to simulate burial and subsequent groundwater flow. Sediments used in this study include clay-, silt-, and sand-sized particles modeled after various fluvial facies commonly associated with fossil vertebrates. Extant limb bone samples obtained from Gallus gallus domesticus (Domestic Chicken) buried in clay-rich sediment exhibit heavy biofilm infiltration, while bones buried in sands and silts exhibit moderate levels. Crushed bones exhibit significantly lower biofilm infiltration than whole bone samples. Strong interactions between biostratinomic alteration and sediment size are also identified with respect to biofilm development. Sediments modeling crevasse splay deposits exhibit considerable variability; whole-bone crevasse splay samples exhibit higher frequencies of high-level biofilm infiltration, and crushed-bone samples in modeled crevasse splay deposits display relatively high frequencies of low-level biofilm infiltration. These results suggest that sediment size, depositional setting, and biostratinomic condition play key roles in biofilm infiltration in vertebrate remains, and may influence soft tissue preservation in fossil vertebrates.

  17. Role of sediment size and biostratinomy on the development of biofilms in recent avian vertebrate remains

    Science.gov (United States)

    Peterson, Joseph E.; Lenczewski, Melissa E.; Clawson, Steven R.; Warnock, Jonathan P.

    2017-04-01

    Microscopic soft tissues have been identified in fossil vertebrate remains collected from various lithologies. However, the diagenetic mechanisms that preserve such tissues have remained elusive. While previous studies have described infiltration of biofilms in Haversian and Volkmann’s canals, biostratinomic alteration (e.g., trampling), and iron derived from hemoglobin as playing roles in the preservation processes, the influence of sediment texture has not previously been investigated. This study uses a Kolmogorov-Smirnov goodness-of-fit test to explore the influence of biostratinomic variability and burial media on the infiltration of biofilms in bone samples. Controlled columns of sediment with bone samples were used to simulate burial and subsequent groundwater flow. Sediments used in this study include clay-, silt-, and sand-sized particles modeled after various fluvial facies commonly associated with fossil vertebrates. Extant limb bone samples obtained from Gallus gallus domesticus (Domestic Chicken) buried in clay-rich sediment exhibit heavy biofilm infiltration, while bones buried in sands and silts exhibit moderate levels. Crushed bones exhibit significantly lower biofilm infiltration than whole bone samples. Strong interactions between biostratinomic alteration and sediment size are also identified with respect to biofilm development. Sediments modeling crevasse splay deposits exhibit considerable variability; whole-bone crevasse splay samples exhibit higher frequencies of high-level biofilm infiltration, and crushed-bone samples in modeled crevasse splay deposits display relatively high frequencies of low-level biofilm infiltration. These results suggest that sediment size, depositional setting, and biostratinomic condition play key roles in biofilm infiltration in vertebrate remains, and may influence soft tissue preservation in fossil vertebrates.
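
    The Kolmogorov-Smirnov test used in the study compares full distributions rather than just means. A minimal sketch of the two-sample form with SciPy follows; the infiltration scores are synthetic, invented purely to illustrate the call, not the paper's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Illustrative (synthetic) biofilm-infiltration scores, e.g. percent of
# canal area colonized, for whole vs. crushed bone samples.
whole = rng.normal(70.0, 10.0, 30)
crushed = rng.normal(40.0, 10.0, 30)

# H0: both samples are drawn from the same distribution. The statistic is
# the maximum distance between the two empirical CDFs.
stat, p = stats.ks_2samp(whole, crushed)
```

    A small p-value rejects the null that the two treatments yield the same infiltration distribution, which is how interactions such as biostratinomic condition × sediment size can be screened.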

  18. A probable case of gigantism/acromegaly in skeletal remains from the Jewish necropolis of "Ronda Sur" (Lucena, Córdoba, Spain; VIII-XII centuries CE).

    Science.gov (United States)

    Viciano, Joan; De Luca, Stefano; López-Lázaro, Sandra; Botella, Daniel; Diéguez-Ramírez, Juan Pablo

    2015-01-01

    Pituitary gigantism is a rare endocrine disorder caused by hypersecretion of growth hormone during the growth period. Individuals with this disorder show enormous growth in height and associated degenerative changes. Continued hypersecretion of growth hormone during adulthood leads to acromegaly, a condition characterized by disproportionate bone growth of the skull, hands and feet. The skeletal remains studied belong to a young adult male from the Jewish necropolis of "Ronda Sur" in Lucena (Córdoba, Spain; VIII-XII centuries CE). The individual shows a very large and thick neurocranium, pronounced supraorbital ridges, an extremely prominent occipital protuberance, and an extremely large and massive mandible. Additional pathologies include enlargement of the vertebral bodies with degenerative changes, thickened ribs, and a slightly increased diaphyseal length with increased cortical bone thickness of the lower limbs. Comparative metric analysis of the mandible against other individuals from the same population and a contemporary Mediterranean population shows a trend toward acromegalic morphology. This case is an important contribution to the paleopathological literature because it is a rare condition that has not been widely documented in ancient skeletal remains.

  19. Large-area aligned growth of single-crystalline organic nanowire arrays for high-performance photodetectors

    International Nuclear Information System (INIS)

    Wu Yiming; Zhang Xiujuan; Pan Huanhuan; Zhang Xiwei; Zhang Yuping; Zhang Xiaozhen; Jie Jiansheng

    2013-01-01

    Due to their extraordinary properties, single-crystalline organic nanowires (NWs) are important building blocks for future low-cost and efficient nano-optoelectronic devices. However, it remains a critical challenge to assemble organic NWs rationally in an orientation-, dimensionality- and location-controlled manner. Herein, we demonstrate a feasible method for the aligned growth of single-crystalline copper phthalocyanine (CuPc) NW arrays with high density, large-area uniformity and perfect crossed alignment, using an Au film as a template. The growth process was investigated in detail. The Au film was found to be critical to the aligned growth of the NWs, but appears to serve only as the active site for NW nucleation, owing to its large surface energy, while also directing the subsequent aligned growth. The as-prepared NWs were then transferred to construct single-NW-based photoconductive devices, which demonstrated excellent photoresponse properties with robust stability and reproducibility; the devices showed a high switching ratio of ∼180 and a fast response speed of ∼100 ms, and withstood continuous operation for up to 2 h. Importantly, this strategy can be extended to other organic molecules for the synthesis of NW arrays, revealing great potential for the construction of large-scale, high-performance functional nano-optoelectronic devices. (paper)

  20. Political, energy events will remain interwoven

    International Nuclear Information System (INIS)

    Jones, D.P.

    1991-01-01

    This paper reports that it is possible to discuss the significance of political and energy events separately, but, in truth, they are intricately interwoven. Furthermore, there are those who will argue that since the two are inseparable, the future is not predictable; so why bother with the endeavor? It is possible that the central point of the exercise has been missed: yes, the future is unpredictable! However, the objective of prediction is secondary. The objective of understanding the dynamic forces of change is primary! With this view of recent history, it is perhaps appropriate to pause and think about the future of the petroleum industry. The future, as shaped by political, energy, economic, environmental and technological forces, will direct our lives and markets during this decade. Most importantly, what direction will successful businesses take to remain competitive in a global environment? These are interesting issues worthy of provocative thoughts and innovative ideas