WorldWideScience

Sample records for cumulative frequency curves

  1. Prediction of a photovoltaic system performance using cumulative frequency curves of radiation

    Energy Technology Data Exchange (ETDEWEB)

    Lasnier, F; Sivoththaman, S [Asian Inst. of Technology, Bangkok (TH). Div. of Energy Technology]

    1990-01-01

    The system performance of stand-alone photovoltaic systems is analysed. From the hourly radiation data for Bangkok (from 1984 to 1987) the cumulative frequency curves of radiation are generated and a typical meteorological day (TMD) is created for each year. The system performance is determined using both the TMD radiation and the actual radiation values. The comparison results show that the TMD method can be applied for the sizing of stand-alone photovoltaic systems. Storage batteries of realistic sizes usually exhibit a daily cyclic variation in state-of-charge, with constant load consumption. Only very large and unrealistic battery sizes show a seasonal variation in state-of-charge. This fact prompted the attempt to predict the system performance for a season by using a single representative day (TMD) of that season. Apart from giving reliable results, the TMD method significantly reduces the computation time and simplifies the process. (author).
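
    The TMD construction described above can be sketched as follows. This is a minimal illustration, assuming the representative day is built hour-by-hour from the cumulative frequency distribution of that hour's observations (here the median); the radiation data are synthetic, not the Bangkok record:

```python
import numpy as np

def typical_meteorological_day(hourly_radiation, percentile=50):
    """Build a typical meteorological day (TMD) from hourly radiation.

    hourly_radiation: array of shape (n_days, 24), W/m^2.
    For each hour of the day, a representative value is drawn from the
    cumulative frequency distribution of that hour's observations; the
    median (50th percentile) is used here as a simple choice.
    """
    data = np.asarray(hourly_radiation, dtype=float)
    return np.percentile(data, percentile, axis=0)  # 24 hourly values

# Synthetic year of data: a crude clear-sky bell with random cloudiness.
rng = np.random.default_rng(0)
hours = np.arange(24)
clear_sky = np.clip(np.sin((hours - 6) / 12 * np.pi), 0, None) * 800  # W/m^2
year = np.clip(clear_sky + rng.normal(0, 50, size=(365, 24)), 0, None)
tmd = typical_meteorological_day(year)
print(tmd.shape)  # (24,)
```

    Running a system simulation once over the 24 TMD values, instead of over every day of the season, is what yields the reported savings in computation time.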

  2. Eating behavior in humans, characterized by cumulative food intake curves--a review.

    Science.gov (United States)

    Westerterp-Plantenga, M S

    2000-03-01

    Cumulative food intake curves have been obtained by monitoring eating from a plate placed on a scale built into a table and connected to a digital computer. They describe and integrate parameters of consumption of an ad lib single-course meal, i.e. meal size, meal duration, eating rate, change in eating rate, bite size and bite frequency. It is concluded that they are an adequate tool for analyzing dietary and clinical interventions on meal size, because the cumulative food intake curve parameters are stable and consistent within subjects; show a clear relationship with the subject characteristics of dietary restraint and obesity; show a clear relationship with the physiological parameters of satiation, diet-induced thermogenesis and body temperature near the liver, and with the cognitive parameter of estimated forthcoming ingestion; and are sensitive to instructions, to clinical and dietary interventions (preloads, palatability, energy density, macronutrient composition), and to a state of negative energy balance. Because of possible compensatory post-prandial effects, it is suggested that assessment of meal size should be part of a 24 h appetite profile and food intake observation.

  3. Lyapunov exponent of the random frequency oscillator: cumulant expansion approach

    International Nuclear Information System (INIS)

    Anteneodo, C; Vallejos, R O

    2010-01-01

    We consider a one-dimensional harmonic oscillator with a random frequency, focusing on both the standard and the generalized Lyapunov exponents, λ and λ* respectively. We discuss the difficulties that arise in the numerical calculation of λ* in the case of strong intermittency. When the frequency corresponds to an Ornstein-Uhlenbeck process, we compute λ* analytically by using a cumulant expansion including terms up to the fourth order. Connections with the problem of finding an analytical estimate for the largest Lyapunov exponent of a many-body system with smooth interactions are discussed.

  4. The cumulative ash curve: a best tool to evaluate complete mill performance.

    Science.gov (United States)

    Sakhare, Suresh D; Inamdar, Aashitosh A

    2014-04-01

    The slick test is carried out by a flour miller to qualitatively segregate the flour from different streams in a roller flour mill. The test is done manually by pressing flour samples on a tray using a thin-bladed paddle (the slick) and inspecting the color or dress of the sample. However, the test is subjective and depends entirely on human judgment. The cumulative ash curve relates cumulative flour ash content to cumulative flour yield, which can help a flour miller be more precise when selecting flour streams for different needs. In this study, cleaning and conditioning of wheat were carried out in the pilot plant of the International School of Milling Technology (ISMT). Roller flour milling of the wheat was then done, and flour from the different streams (four breaks, five reductions) was collected. Each flour stream was analyzed for ash content using standard AACC methods. The analytical values of ash content were used to plot the cumulative ash curve. It was found that ash content increased in the break passages from the first to the last break, with the exception of the first break (ash content 0.71%). An increase in the percentage of ash was observed in the reduction passages (C1 to C5); however, the C3 ash content (0.76%) was slightly higher than that of C4 (0.65%). A higher yield of flour with minimum ash content was obtained from the front reduction passages C1 and C2, whereas the break passages and the tail-end reduction passages produced less flour with higher ash content.
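
    The cumulative ash curve itself is straightforward to compute. The sketch below assumes hypothetical stream yields and ash contents (not the paper's measurements): streams are sorted by ascending ash, and each point on the curve gives the average ash of all flour selected up to that cumulative yield:

```python
import numpy as np

# Hypothetical stream data: yield as % of total flour, ash in %.
# Values are illustrative only, not the study's measurements.
streams = {
    "C1": (30.0, 0.45), "C2": (20.0, 0.50), "B1": (15.0, 0.71),
    "B2": (10.0, 0.60), "C3": (8.0, 0.76), "C4": (7.0, 0.65),
    "B3": (5.0, 0.90), "C5": (3.0, 1.10), "B4": (2.0, 1.40),
}

def cumulative_ash_curve(streams):
    """Sort streams by ascending ash, then accumulate yield and
    yield-weighted ash: each point gives the average ash content of
    all flour selected up to that cumulative yield."""
    items = sorted(streams.items(), key=lambda kv: kv[1][1])
    yields = np.array([y for _, (y, _) in items])
    ashes = np.array([a for _, (_, a) in items])
    cum_yield = np.cumsum(yields)
    cum_ash = np.cumsum(yields * ashes) / cum_yield
    return [name for name, _ in items], cum_yield, cum_ash

names, cum_yield, cum_ash = cumulative_ash_curve(streams)
for n, y, a in zip(names, cum_yield, cum_ash):
    print(f"{n}: {y:5.1f}% flour, cumulative ash {a:.3f}%")
```

    Reading the curve at a target cumulative yield tells the miller the best attainable average ash, and which streams to blend to reach it.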

  5. Learning curve for robotic-assisted surgery for rectal cancer: use of the cumulative sum method.

    Science.gov (United States)

    Yamaguchi, Tomohiro; Kinugasa, Yusuke; Shiomi, Akio; Sato, Sumito; Yamakawa, Yushi; Kagawa, Hiroyasu; Tomioka, Hiroyuki; Mori, Keita

    2015-07-01

    Few data are available to assess the learning curve for robotic-assisted surgery for rectal cancer. The aim of the present study was to evaluate the learning curve for robotic-assisted surgery for rectal cancer by a surgeon at a single institute. From December 2011 to August 2013, a total of 80 consecutive patients who underwent robotic-assisted surgery for rectal cancer performed by the same surgeon were included in this study. The learning curve was analyzed using the cumulative sum method. This method was used for all 80 cases, taking into account operative time. Operative procedures included anterior resections in 6 patients, low anterior resections in 46 patients, intersphincteric resections in 22 patients, and abdominoperineal resections in 6 patients. Lateral lymph node dissection was performed in 28 patients. Median operative time was 280 min (range 135-683 min), and median blood loss was 17 mL (range 0-690 mL). No postoperative complications of Clavien-Dindo classification Grade III or IV were encountered. We arranged operative times and calculated cumulative sum values, allowing differentiation of three phases: phase I, Cases 1-25; phase II, Cases 26-50; and phase III, Cases 51-80. Our data suggested three phases of the learning curve in robotic-assisted surgery for rectal cancer. The first 25 cases formed the learning phase.
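
    The cumulative sum method used here can be illustrated with a minimal sketch: plotting the running sum of deviations of operative time from the overall mean makes learning phases visible as slope changes. The case counts and times below are synthetic, not the study's data:

```python
import numpy as np

def cusum(values):
    """Cumulative sum of deviations from the overall mean.
    A rising segment means above-average operative times (learning);
    a falling segment means below-average times (proficiency)."""
    v = np.asarray(values, dtype=float)
    return np.cumsum(v - v.mean())

# Illustrative operative times (min) for 80 consecutive cases:
# longer early cases, a plateau, then faster cases.
rng = np.random.default_rng(1)
times = np.concatenate([
    rng.normal(340, 40, 25),   # phase I: learning
    rng.normal(280, 40, 25),   # phase II: plateau
    rng.normal(240, 40, 30),   # phase III: mastery
])
c = cusum(times)
peak = int(np.argmax(c)) + 1  # case where the curve turns over
print(f"CUSUM peaks around case {peak}")
```

    The curve always returns to zero at the last case by construction; it is the locations of the slope changes, not the absolute values, that delimit the phases.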

  6. Percent relative cumulative frequency analysis in indirect calorimetry: application to studies of transgenic mice.

    Science.gov (United States)

    Riachi, Marc; Himms-Hagen, Jean; Harper, Mary-Ellen

    2004-12-01

    Indirect calorimetry is commonly used in research and clinical settings to assess characteristics of energy expenditure. Respiration chambers in indirect calorimetry allow measurements over long periods of time (e.g., hours to days) and thus the collection of large sets of data. Current methods of data analysis usually involve the extraction of only a selected small proportion of the data, most commonly the data that reflect resting metabolic rate. Here, we describe a simple quantitative approach for the analysis of large data sets that is capable of detecting small differences in energy metabolism. We refer to it as the percent relative cumulative frequency (PRCF) approach and have applied it to the study of uncoupling protein-1 (UCP1)-deficient and control mice. The approach involves sorting data in ascending order, calculating their cumulative frequency, and expressing the frequencies in the form of percentile curves. Results demonstrate the sensitivity of the PRCF approach for analyses of oxygen consumption (VO2) as well as respiratory exchange ratio data. Statistical comparisons of PRCF curves are based on the 50th percentile values and curve slopes (H values). The application of the PRCF approach revealed that energy expenditure in UCP1-deficient mice housed and studied at room temperature (24°C) is on average 10% lower than in controls. At a lower environmental temperature, there were no differences in VO2 between groups. The latter is likely due to augmented shivering thermogenesis in UCP1-deficient mice compared with controls. With the increased availability of murine models of metabolic disease, indirect calorimetry is increasingly used, and the PRCF approach provides a novel and powerful means for data analysis.
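
    A minimal sketch of the PRCF computation described above, using synthetic data in place of the calorimetry measurements:

```python
import numpy as np

def prcf(values):
    """Percent relative cumulative frequency: sort the data in ascending
    order and return (sorted values, cumulative frequency in percent)."""
    x = np.sort(np.asarray(values, dtype=float))
    pct = 100.0 * np.arange(1, x.size + 1) / x.size
    return x, pct

# Synthetic VO2-like samples for two groups (values are illustrative only).
rng = np.random.default_rng(2)
control = rng.normal(100, 10, 1000)          # arbitrary units
ucp1_deficient = rng.normal(90, 10, 1000)    # ~10% lower energy expenditure

x_c, p_c = prcf(control)
x_k, p_k = prcf(ucp1_deficient)
# Compare the curves at the 50th percentile, as in the PRCF approach:
median_shift = np.median(control) - np.median(ucp1_deficient)
print(round(median_shift, 1))
```

    Plotting `p_c` against `x_c` and `p_k` against `x_k` gives the percentile curves; the horizontal offset at the 50% level and the curve slopes are the comparison statistics the abstract refers to.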

  7. Morphology of the cumulative logistic distribution when used as a model of radiologic film characteristic curves

    International Nuclear Information System (INIS)

    Prince, J.R.

    1988-01-01

    The cumulative logistic distribution (CLD) is an empiric model for film characteristic curves. Characterizing the shape parameters of the CLD in terms of contrast, latitude and speed is required. The CLD is written as Y − F = D/[1 + exp(−(κ0 + κ1X))], where Y is the optical density (OD) at log exposure X, F is the fog level, D is a constant equal to Dm − F, κ0 and κ1 are shape parameters, and Dm is the maximum attainable OD. Further analysis demonstrates that when κ0 is held constant, κ1 characterizes contrast (the larger κ1, the greater the contrast) and hence latitude; when κ1 is held constant, κ0 characterizes film speed (the larger κ0, the faster the film). These equations and concepts are further illustrated with examples from radioscintigraphy, diagnostic radiology, and light sensitometry.
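
    The model is easy to evaluate numerically. The sketch below writes kappa0 and kappa1 for the two shape parameters; the parameter values are arbitrary illustrations, not fitted film data:

```python
import numpy as np

def cld(X, D, F, kappa0, kappa1):
    """Cumulative logistic distribution model of a film characteristic
    curve: optical density Y at log exposure X, with fog level F,
    D = Dm - F, and shape parameters kappa0, kappa1."""
    return F + D / (1.0 + np.exp(-(kappa0 + kappa1 * X)))

X = np.linspace(-2.0, 2.0, 5)
low = cld(X, D=3.0, F=0.2, kappa0=0.0, kappa1=1.0)
high = cld(X, D=3.0, F=0.2, kappa0=0.0, kappa1=3.0)  # larger kappa1
# Larger kappa1 -> steeper curve (greater contrast, smaller latitude);
# larger kappa0 -> curve shifted toward lower exposures (faster film).
print(high[-1] > low[-1], high[0] < low[0])  # prints "True True"
```

    Both curves pass through the same midpoint (F + D/2 at kappa0 + kappa1*X = 0); only the steepness, and hence the contrast, differs.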

  8. Highway travel time information system based on cumulative count curves and new tracking technologies

    Energy Technology Data Exchange (ETDEWEB)

    Soriguera Marti, F.; Martinez-Diaz, M.; Perez Perez, I.

    2016-07-01

    Travel time is probably the most important indicator of the level of service of a highway, and it is also the information most appreciated by its users. Administrations and private companies make increasing efforts to improve its real-time estimation. The appearance of new technologies makes the precise measurement of travel times easier than ever before. However, direct measurements of travel time are, by nature, outdated in real time and lack the desired forecasting capabilities. This paper introduces a new methodology to improve the real-time estimation of travel times by using the equipment usually present on most highways, i.e., loop detectors, in combination with automatic vehicle identification or tracking technologies. One of the most important features of the method is the use of cumulative counts at detectors as an input, avoiding the drawbacks of common spot-speed methodologies. Cumulative count curves have great potential for freeway travel time information systems, as they provide spatial measurements and thus allow the calculation of instantaneous travel times. In addition, they exhibit predictive capabilities. Nevertheless, they have not been used extensively, mainly because of the error introduced by the accumulation of the detector drift. The proposed methodology solves this problem by correcting the deviations using direct travel time measurements. The method proves highly beneficial for its accuracy as well as for its low implementation cost. (Author)
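
    The core of the cumulative-count (N-curve) idea can be sketched as follows: under FIFO and with drift-free counts, the travel time of the n-th vehicle is the horizontal distance between the upstream and downstream cumulative count curves at height n. The counts below are synthetic, and the drift correction described in the paper is omitted:

```python
import numpy as np

def n_curve_travel_times(counts_up, counts_down, dt_s=60.0):
    """Travel times from cumulative count (N-) curves at two detectors.
    Under FIFO and conservation (no detector drift), the n-th vehicle's
    travel time is the horizontal distance between the curves at height n."""
    n_up = np.cumsum(counts_up)
    n_down = np.cumsum(counts_down)
    t = np.arange(1, len(counts_up) + 1) * dt_s
    levels = np.arange(1, int(min(n_up[-1], n_down[-1])) + 1)
    t_up = np.interp(levels, n_up, t)      # time n-th vehicle passes upstream
    t_down = np.interp(levels, n_down, t)  # time it passes downstream
    return t_down - t_up

# Synthetic 1-min counts: downstream sees the same demand ~5 min later.
up = np.array([10, 12, 15, 18, 20, 18, 15, 12, 10, 8, 6, 5, 4, 3, 2])
down = np.concatenate([np.zeros(5, dtype=int), up[:-5]])
tt = n_curve_travel_times(up, down)
print(np.median(tt))  # ~300 s, the imposed 5-min offset
```

    In practice the accumulated detector drift distorts `n_down` relative to `n_up`; the paper's contribution is to re-anchor the curves using direct AVI/tracking travel time measurements.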

  9. Evaluation of the learning curve for external cephalic version using cumulative sum analysis.

    Science.gov (United States)

    Kim, So Yun; Han, Jung Yeol; Chang, Eun Hye; Kwak, Dong Wook; Ahn, Hyun Kyung; Ryu, Hyun Mi; Kim, Moon Young

    2017-07-01

    We evaluated the learning curve for external cephalic version (ECV) using learning curve-cumulative sum (LC-CUSUM) analysis. This was a retrospective study involving 290 consecutive cases between October 2013 and March 2017. We evaluated the learning curve for ECV in the nullipara and para ≥1 groups using LC-CUSUM analysis, assuming ECV success rates of 50% and 70%, by describing a quadratic trend line with reliable R² values. The overall success rate for ECV was 64.8% (188/290), while the success rates for the nullipara and para ≥1 groups were 56.2% (100/178) and 78.6% (88/112), respectively. The 'H' value, at which the actual failure rate does not differ from the acceptable failure rate, was -3.27 and -1.635 for assumed ECV success rates of 50% and 70%, respectively. Consequently, in order to obtain a consistent 50% success rate, we would require 57 nullipara cases, and in order to obtain a consistent 70% success rate, we would require 130 nullipara cases. In contrast, 8 to 10 para ≥1 cases would be required for expected success rates of 50% and 70% in the para ≥1 group. Even a relatively inexperienced physician can achieve success with multipara cases and, after accumulating experience, can progress to nullipara cases. Further research involving several practitioners instead of a single practitioner is required for LC-CUSUM. This will lead to the gradual implementation of standard learning curve guidelines for ECV.

  10. Low frequency oscillatory flow in a rotating curved pipe

    Institute of Scientific and Technical Information of China (English)

    陈华军; 章本照; 苏霄燕

    2003-01-01

    The low frequency oscillatory flow in a rotating curved pipe was studied by using the method of bi-parameter perturbation. Perturbation solutions up to the second order were obtained and the effects of rotation on the low frequency oscillatory flow were examined in detail. The results indicated that there are evident differences between the low frequency oscillatory flow in a rotating curved pipe and that in a curved pipe without rotation. During a period, four secondary vortexes may exist on the circular cross-section, and the distributions of axial velocity and wall shear stress are related to the ratio of the Coriolis force to the centrifugal force and to the axial pressure gradient.

  12. Natural frequencies of the frames having curved member

    International Nuclear Information System (INIS)

    Tekelioglu, M.; Ozyigit, H.A.; Ridvan, H.

    2001-01-01

    In-plane and out-of-plane vibrations of a frame having a curved member are studied. Although the analysis is carried out on a frame having a straight and a curved beam, it is applicable to all frame-type structures. Different end conditions are considered for the system. Rotary inertia and extensional effects are included for the curved member. The finite element method is used as the analysis tool. Natural frequencies of the curved beams for different end conditions are calculated first, and then the frequencies of the frames are investigated. The transformation from local to global coordinates for curved beams needs special attention in the analysis. The results are compared with those of other methods. (author)

  13. Relationship between mutation frequency of GPA locus and cumulative dose among medical diagnostic X-ray workers

    International Nuclear Information System (INIS)

    Wang Jixian; Yu Wenru; Li Benxiao; Fan Tiqiang; Li Zhen; Gao Zhiwei; Chen Zhenjun; Zhao Yongcheng

    2000-01-01

    Objective: To explore the feasibility of using the GPA locus mutation assay as a bio-dosimeter for occupational exposure to ionizing radiation. Methods: An improved technique of the GPA locus mutation assay was used in this study. The frequencies of mutant RBC in the peripheral blood of 55 medical X-ray workers and 50 controls employed in different calendar-year periods were detected. The relationships between mutation frequencies (MFs) and period of entry, working years and cumulative doses were analyzed. Results: The MFs were significantly elevated among X-ray workers employed before 1970. This finding is similar to the result of a cancer epidemiological study among medical X-ray workers, in which the cancer risk was significantly increased only among X-ray workers employed before 1970. The MFs of GPA increased with increasing cumulative dose. The dose-effect relationship of the Nφ MF with cumulative dose was closer than that of the NN MF. Conclusion: Many problems remain to be solved before the GPA MF assay can be used as a bio-dosimeter, such as individual variation, specificity and the calibration curve of the dose-effect relationship.

  14. Integration between cumulative frequency curves for water quantity and quality as a tool for effective water resources management

    Directory of Open Access Journals (Sweden)

    Davi Gasparini Fernandes Cunha

    2012-12-01

    The security of the different water uses, environmental services and ecological balance depends upon a well-weighted combination of quantitative and qualitative aspects of rivers. This research describes applications of a new approach to flow duration curves, which were associated with cumulative frequency curves for water quality. Data on total phosphorus (2005 to 2009) and monthly average flow (1959 to 2003) from the Paraíba do Sul and Sorocaba Rivers were compiled to illustrate the concept. The integration of the curves for water quantity and quality was considered desirable as it can aid in the planning of water concessions, charging for water use, environmental monitoring, and the establishment of water quality standards and frameworks. Moreover, these curves can accommodate variations in climate and land use, allowing the establishment of environmental scenarios.

  15. A spot-matching method using cumulative frequency matrix in 2D gel images

    Science.gov (United States)

    Han, Chan-Myeong; Park, Joon-Ho; Chang, Chu-Seok; Ryoo, Myung-Chun

    2014-01-01

    A new method for spot matching in two-dimensional gel electrophoresis images using a cumulative frequency matrix is proposed. The method improves on the weak points of the previous method called 'spot matching by topological patterns of neighbour spots'. It accumulates the frequencies of neighbour spot pairs produced through the entire matching process and determines spot pairs one by one in order of decreasing frequency. Spot matching by frequencies of neighbour spot pairs shows fairly better performance. Moreover, it can give researchers an indication of whether the matching results are trustworthy, which can save considerable effort in verifying the results. PMID:26019609

  16. Pulse frequency and soil-litter mixing alter the control of cumulative precipitation over litter decomposition.

    Science.gov (United States)

    Joly, François-Xavier; Kurupas, Kelsey L; Throop, Heather L

    2017-09-01

    Macroclimate has traditionally been considered the predominant driver of litter decomposition. However, in drylands, cumulative monthly or annual precipitation typically fails to predict decomposition. In these systems, the windows of opportunity for decomposer activity may instead depend on precipitation frequency and on local factors affecting litter desiccation, such as soil-litter mixing. We used a full-factorial microcosm experiment to disentangle the relative importance of cumulative precipitation, pulse frequency, and soil-litter mixing for litter decomposition. Decomposition, measured as litter carbon loss, saturated with increasing cumulative precipitation when pulses were large and infrequent, suggesting that litter moisture no longer increased and/or microbial activity was no longer limited by water availability above a certain pulse size. More frequent precipitation pulses led to increased decomposition at high levels of cumulative precipitation. Soil-litter mixing consistently increased decomposition, with the greatest relative increase (+194%) under the driest conditions. Collectively, our results highlight the need to consider precipitation at finer temporal scales and to incorporate soil-litter mixing as a key driver of decomposition in drylands. © 2017 by the Ecological Society of America.

  17. Regionalisation of low flow frequency curves for the Peninsular Malaysia

    Science.gov (United States)

    Mamun, Abdullah A.; Hashim, Alias; Daoud, Jamal I.

    2010-02-01

    Regional maps and equations for the magnitude and frequency of 1, 7 and 30-day low flows were derived and are presented in this paper. The river gauging stations of neighbouring catchments that produced similar low flow frequency curves were grouped together. As such, the Peninsular Malaysia was divided into seven low flow regions. Regional equations were developed using the multivariate regression technique. An empirical relationship was developed for mean annual minimum flow as a function of catchment area, mean annual rainfall and mean annual evaporation. The regional equations exhibited good coefficients of determination (R² > 0.90). Three low flow frequency curves showing the low, mean and high limits for each region were proposed based on a graphical best-fit technique. Knowing the catchment area, mean annual rainfall and evaporation in the region, design low flows of different durations can be easily estimated for ungauged catchments. This procedure is expected to overcome the problem of data unavailability in estimating low flows in the Peninsular Malaysia.
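
    The regional equations themselves are not given in the abstract. As an illustration only, a power-law relationship of the kind described (flow as a function of catchment area, rainfall and evaporation) can be fitted by multivariate regression in log space; the data and exponents below are synthetic assumptions, not the paper's results:

```python
import numpy as np

# Assumed power-law form: MAM = a * Area^b * Rain^c * Evap^d,
# fitted as a linear model in log space.
rng = np.random.default_rng(3)
n = 40
area = rng.uniform(50, 5000, n)          # catchment area, km^2
rain = rng.uniform(1500, 3000, n)        # mean annual rainfall, mm
evap = rng.uniform(1000, 1600, n)        # mean annual evaporation, mm
true = 1e-6 * area**0.95 * rain**1.5 * evap**-0.8
mam = true * rng.lognormal(0, 0.05, n)   # mean annual minimum flow, m^3/s

X = np.column_stack([np.ones(n), np.log(area), np.log(rain), np.log(evap)])
coef, *_ = np.linalg.lstsq(X, np.log(mam), rcond=None)
pred = X @ coef
ss_res = np.sum((np.log(mam) - pred) ** 2)
ss_tot = np.sum((np.log(mam) - np.log(mam).mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(coef[1:], r2)
```

    With the fitted exponents in hand, a design low flow for an ungauged catchment follows from its area, rainfall and evaporation alone, which is the use case the abstract describes.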

  18. Study of cumulative fatigue damage detection for used parts with nonlinear output frequency response functions based on NARMAX modelling

    Science.gov (United States)

    Huang, Honglan; Mao, Hanying; Mao, Hanling; Zheng, Weixue; Huang, Zhenfeng; Li, Xinxin; Wang, Xianghong

    2017-12-01

    Cumulative fatigue damage detection for used parts plays a key role in the process of remanufacturing engineering and is related to the service safety of the remanufactured parts. In light of the nonlinear properties of used parts caused by cumulative fatigue damage, the based nonlinear output frequency response functions detection approach offers a breakthrough to solve this key problem. First, a modified PSO-adaptive lasso algorithm is introduced to improve the accuracy of the NARMAX model under impulse hammer excitation, and then, an effective new algorithm is derived to estimate the nonlinear output frequency response functions under rectangular pulse excitation, and a based nonlinear output frequency response functions index is introduced to detect the cumulative fatigue damage in used parts. Then, a novel damage detection approach that integrates the NARMAX model and the rectangular pulse is proposed for nonlinear output frequency response functions identification and cumulative fatigue damage detection of used parts. Finally, experimental studies of fatigued plate specimens and used connecting rod parts are conducted to verify the validity of the novel approach. The obtained results reveal that the new approach can detect cumulative fatigue damages of used parts effectively and efficiently and that the various values of the based nonlinear output frequency response functions index can be used to detect the different fatigue damages or working time. Since the proposed new approach can extract nonlinear properties of systems by only a single excitation of the inspected system, it shows great promise for use in remanufacturing engineering applications.

  19. An application of the learning curve-cumulative summation test to evaluate training for endotracheal intubation in emergency medicine.

    Science.gov (United States)

    Je, Sangmo; Cho, Youngsuk; Choi, Hyuk Joong; Kang, Boseung; Lim, Taeho; Kang, Hyunggoo

    2015-04-01

    The learning curve-cumulative summation (LC-CUSUM) test allows for quantitative and individual assessment of the learning process. In this study, we evaluated the process of skill acquisition for performing endotracheal intubation (ETI) by three emergency medicine (EM) residents over the first 2 years of their EM residency. We evaluated 342 ETI cases performed by the three EM residents using the LC-CUSUM test according to their rate of success or failure of ETI. A 90% success rate (SR) was chosen to define adequate performance, and an SR of 80% was considered inadequate. After the learning phase, the standard CUSUM test was applied to ensure that performance was maintained. The mean number of ETI cases required to reach the predefined level of performance was 74.7 (95% CI 62.0 to 87.3). CUSUM tests confirmed that performance was maintained after the learning phase. By using the LC-CUSUM test, we were able to quantitatively monitor the acquisition of the skill of ETI by EM residents. The LC-CUSUM could be useful for monitoring the learning process in the training of airway management in EM practice. Published by the BMJ Publishing Group Limited.
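
    A minimal LC-CUSUM sketch for binary success/failure data, following the general log-likelihood-ratio formulation: the statistic rises with successes and is reset no lower than zero, and competence is declared when it crosses a limit h. The failure rates and h below are illustrative assumptions, not the study's exact design:

```python
import math

def lc_cusum(outcomes, p_inadequate=0.20, p_adequate=0.10, h=2.0):
    """LC-CUSUM sketch: test the null hypothesis that the failure rate
    equals p_inadequate; declare competence (return the case index) the
    first time the statistic crosses the limit h, else return None.
    outcomes: iterable of 0 (success) / 1 (failure)."""
    s, signal_at = 0.0, None
    for i, failure in enumerate(outcomes, start=1):
        if failure:
            w = math.log(p_adequate / p_inadequate)       # failure: penalty
        else:
            w = math.log((1 - p_adequate) / (1 - p_inadequate))  # small gain
        s = max(0.0, s + w)
        if signal_at is None and s >= h:
            signal_at = i
    return signal_at

# A trainee who fails 1 case in 10 accumulates evidence of competence:
ok = lc_cusum(([0] * 9 + [1]) * 10)
print(ok)  # 38 with these illustrative parameters
```

    A trainee failing 1 case in 5 (the inadequate rate) never signals, e.g. `lc_cusum(([0]*4 + [1]) * 20)` returns `None`; after the signal, a standard CUSUM with reversed roles monitors for performance decay, as the abstract describes.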

  20. Nonlinear Filtering Effects of Reservoirs on Flood Frequency Curves at the Regional Scale

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Wei; Li, Hong-Yi; Leung, Lai-Yung; Yigzaw, Wondmagegn Y.; Zhao, Jianshi; Lu, Hui; Deng, Zhiqun; Demissie, Yonas; Bloschl, Gunter

    2017-10-01

    Anthropogenic activities, e.g., reservoir operation, may alter the characteristics of the Flood Frequency Curve (FFC) and challenge the basic assumption of stationarity used in flood frequency analysis. This paper presents a combined data-modeling analysis of the nonlinear filtering effects of reservoirs on the FFCs over the contiguous United States. A dimensionless Reservoir Impact Index (RII), defined as the total upstream reservoir storage capacity normalized by the annual streamflow volume, is used to quantify reservoir regulation effects. Analyses are performed for 388 river stations with an average record length of 50 years. The first two moments of the FFC, the mean annual maximum flood (MAF) and the coefficient of variation (CV), are calculated for the pre- and post-dam periods and compared to elucidate the reservoir regulation effects as a function of RII. It is found that MAF generally decreases with increasing RII but stabilizes when RII exceeds a threshold value, and that CV increases with RII until a threshold value beyond which CV decreases with RII. The processes underlying the nonlinear threshold behavior of MAF and CV are investigated using three reservoir models with different levels of complexity. All models capture the nonlinear relationships of MAF and CV with RII, suggesting that the basic flood control function of reservoirs is key to these relationships. The relative roles of reservoir storage capacity, operation objectives, available storage prior to a flood event, and reservoir inflow pattern are systematically investigated. Our findings may help improve flood-risk assessment and mitigation in regulated river systems at the regional scale.
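
    The RII and the first two FFC moments are simple to compute. The sketch below uses hypothetical pre-/post-dam annual maxima and storage figures (all values are illustrative, not the paper's data):

```python
import numpy as np

def reservoir_impact_index(storage_capacity_m3, annual_flow_volume_m3):
    """RII: total upstream reservoir storage capacity normalized by the
    annual streamflow volume (dimensionless)."""
    return storage_capacity_m3 / annual_flow_volume_m3

def ffc_moments(annual_max_floods):
    """First two moments of the flood frequency curve: mean annual
    maximum flood (MAF) and coefficient of variation (CV)."""
    q = np.asarray(annual_max_floods, dtype=float)
    maf = q.mean()
    cv = q.std(ddof=1) / maf
    return maf, cv

# Hypothetical pre-/post-dam annual maxima (m^3/s); regulation typically
# lowers MAF, while the CV response is non-monotonic in RII in the paper.
pre = np.array([820.0, 950.0, 700.0, 1100.0, 640.0, 880.0, 1020.0, 760.0])
post = pre * 0.6 + 80.0  # hypothetical attenuation by flood-control storage
maf_pre, cv_pre = ffc_moments(pre)
maf_post, cv_post = ffc_moments(post)
rii = reservoir_impact_index(2.0e8, 1.5e9)
print(rii, maf_pre, maf_post)
```

    Repeating this pre-/post-dam comparison across many stations and plotting the changes in MAF and CV against RII is the data-analysis step the abstract summarizes.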

  1. Rainfall Intensity and Frequency Explain Production Basis Risk in Cumulative Rain Index Insurance

    Science.gov (United States)

    Muneepeerakul, Chitsomanus P.; Muneepeerakul, Rachata; Huffaker, Ray G.

    2017-12-01

    With minimal moral hazard and adverse selection, weather index insurance promises financial resilience to farmers struck by harsh weather conditions through swift compensation at an affordable premium. Despite these advantages, the very nature of indexing gives rise to production basis risk, as the selected weather indexes do not sufficiently correspond to actual damages. To address this problem, we develop a stochastic yield model, built upon a stochastic soil moisture model driven by marked Poisson rainfall. Our analysis shows that even under similar temperature and rainfall amounts, yields can differ significantly; this was empirically supported by a 2-year field experiment in which rain-fed maize was grown under very similar total rainfall. Here, the year with more intense, less frequent rainfall produced a better yield: a rare piece of counterevidence to most climate change projections. Through the stochastic yield model, we demonstrate the crucial roles of rainfall intensity and frequency in determining yield. Importantly, the model allows us to compute the rainfall-pattern-related basis risk inherent in cumulative rain index insurance. The model results and a case study herein clearly show that total rainfall is a poor indicator of yield, imposing unnecessary production basis risk on farmers and false-positive payouts on insurers. Incorporating rainfall intensity and frequency in the design of rain index insurance can offer farmers better protection, while maintaining the attractive features of weather index insurance and thus fulfilling its promise of financial resilience.
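
    The role of intensity versus frequency under a fixed rainfall budget can be illustrated with a marked Poisson sketch: two synthetic regimes with the same expected seasonal total but different storm frequency and mean depth (all parameters are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)

def marked_poisson_rainfall(days, freq_per_day, mean_depth_mm):
    """Marked Poisson rainfall: storm arrivals drawn daily with rate
    freq_per_day; storm depths are exponential with mean mean_depth_mm."""
    rain = np.zeros(days)
    storms = rng.random(days) < freq_per_day   # daily arrival draw
    rain[storms] = rng.exponential(mean_depth_mm, storms.sum())
    return rain

season = 120  # days
# Same expected seasonal total (~240 mm), different temporal structure:
frequent_light = marked_poisson_rainfall(season, 0.40, 5.0)   # many small storms
intense_rare = marked_poisson_rainfall(season, 0.10, 20.0)    # few large storms
print(frequent_light.sum(), intense_rare.sum())
```

    A cumulative rain index sees these two regimes as nearly identical, yet the soil moisture dynamics, and hence yields, they produce can differ sharply, which is the source of the basis risk the abstract quantifies.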

  2. A personal radio-frequency dosimeter with cumulative-dose recording capabilities

    International Nuclear Information System (INIS)

    Rochelle, R.W.; Moore, M.R.; Thomas, R.S.; Ewing, P.D.; Hess, R.A.; Hoffheins, B.S.

    1990-01-01

    The radio-frequency (rf) dosimeter developed by the Oak Ridge National Laboratory is a portable, pocket-sized cumulative-dose recording device designed to detect and record the strengths and durations of electric fields present in the work areas of naval vessels. The device measures an integrated dose and records the electric fields that exceed the permissible levels set by the American National Standards Institute. Features of the rf dosimeter include a frequency range of 30 MHz to 10 GHz and a three-dimensional sensor. Data obtained with the rf dosimeter will be used to determine the ambient field-strength profile for shipboard personnel over an extended time. Readings are acquired and averaged over a 6-min period corresponding to the rise time of the core body temperature. These values are stored for up to 6 months, after which the data are transferred to a computer via the dosimeter's serial port. The rf dosimeter should increase knowledge of the levels of electric fields to which individuals are exposed. 13 refs., 16 figs., 2 tabs

  3. Use of the cumulative sum method (CUSUM) to assess the learning curves of ultrasound-guided continuous femoral nerve block.

    Science.gov (United States)

    Kollmann-Camaiora, A; Brogly, N; Alsina, E; Gilsanz, F

    2017-10-01

    Although ultrasound guidance is a basic competence for anaesthesia residents (AR), few data are available on the learning process. This prospective observational study aims to assess the learning process of ultrasound-guided continuous femoral nerve block and to determine the number of procedures a resident needs to perform in order to reach proficiency, using the cumulative sum (CUSUM) method. We recruited 19 AR without previous experience. Learning curves were constructed using the CUSUM method for ultrasound-guided continuous femoral nerve block, considering 2 success criteria: a decrease in pain score of >2 on a [0-10] scale after 15 minutes, and the time required to perform the block. We analysed data from 17 AR for a total of 237 ultrasound-guided continuous femoral nerve blocks. 8/17 AR became proficient for pain relief; however, all the AR who did more than 12 blocks (8/8) became proficient. As for performance time, 5/17 AR achieved the objective of 12 minutes; however, all the AR who did more than 20 blocks (4/4) achieved it. The number of procedures needed to achieve proficiency seems to be 12; however, it takes more procedures to reduce performance time. The CUSUM methodology could be useful in training programs to allow early interventions in case of repeated failures and to develop a competence-based curriculum. Copyright © 2017 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Publicado por Elsevier España, S.L.U. All rights reserved.
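    A minimal sketch of the CUSUM construction behind such learning curves, assuming the standard Wald-type formulation with illustrative acceptable/unacceptable failure rates (the outcome sequence is invented, not the study's data):

```python
import math

# CUSUM learning curve (Wald-type), with illustrative rates: acceptable
# failure rate p0, unacceptable rate p1, type I/II error levels a and b.
p0, p1, a, b = 0.20, 0.40, 0.10, 0.10
P = math.log(p1 / p0)
Q = math.log((1 - p0) / (1 - p1))
s = Q / (P + Q)                       # decrement applied on each success
h = math.log((1 - b) / a) / (P + Q)   # decision limit

# Hypothetical sequence of block attempts: 1 = failure, 0 = success.
outcomes = [0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
S, proficient_at = 0.0, None
for i, fail in enumerate(outcomes, start=1):
    S += (1 - s) if fail else -s
    if proficient_at is None and S <= -h:
        proficient_at = i             # curve crossed the lower limit
print(f"s = {s:.3f}, h = {h:.2f}, proficient after attempt {proficient_at}")
```

    Plotting S against attempt number gives the learning curve: a sustained downward trend that crosses the lower limit signals proficiency, while repeated failures push the curve upward and flag a trainee needing intervention.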

  4. Influence of horizontally curved roadway section characteristics on motorcycle-to-barrier crash frequency.

    Science.gov (United States)

    Gabauer, Douglas J; Li, Xiaolong

    2015-04-01

    The purpose of this study was to investigate motorcycle-to-barrier crash frequency on horizontally curved roadway sections in Washington State using police-reported crash data linked with roadway data and augmented with barrier presence information. Data included 4915 horizontally curved roadway sections, 252 of which experienced 329 motorcycle-to-barrier crashes between 2002 and 2011. Negative binomial regression was used to predict motorcycle-to-barrier crash frequency using horizontal curvature and other roadway characteristics. Based on the model results, the strongest predictor of crash frequency was curve radius. This supports a motorcycle-to-barrier crash countermeasure placement criterion based, at the very least, on horizontal curve radius. With respect to the existing horizontal curve criterion of 820 feet or less, curves meeting this criterion were found to increase motorcycle-to-barrier crash frequency by a factor of 10 compared to curves not meeting it. Other statistically significant predictors were curve length, traffic volume and the location of adjacent curves. Assuming curves of identical radius, the model results suggest that longer curves, those with higher traffic volume, and those with no adjacent curved sections within 300 feet of either curve end would likely be better candidates for a motorcycle-to-barrier crash countermeasure. Copyright © 2015 Elsevier Ltd. All rights reserved.
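    The log-linear mean structure of a negative binomial crash-frequency model can be sketched as below; the coefficients are hypothetical, set only to reproduce the reported roughly 10-fold rate for curves of radius 820 ft or less:

```python
import math

# A negative binomial crash-frequency model has the log-linear mean form
# mu = exp(b0 + b1*x1 + ...). The coefficients below are hypothetical,
# chosen to illustrate the ~10x rate for curves with radius <= 820 ft.
b0 = -6.0               # intercept (hypothetical)
b_sharp = math.log(10)  # indicator coefficient: radius <= 820 ft
b_aadt = 0.8            # coefficient on log traffic volume (hypothetical)

def expected_crashes(sharp_curve, aadt, years=10):
    eta = b0 + b_sharp * (1 if sharp_curve else 0) + b_aadt * math.log(aadt)
    return years * math.exp(eta)

gentle = expected_crashes(sharp_curve=False, aadt=8000)
sharp = expected_crashes(sharp_curve=True, aadt=8000)
print(f"expected crashes per decade: gentle={gentle:.2f}, sharp={sharp:.2f}, "
      f"ratio={sharp/gentle:.1f}")
```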

  5. Effects of mistuning and matrix structure on the topology of frequency response curves

    Science.gov (United States)

    Afolabi, Dare

    1989-01-01

    The stability of a frequency response curve under mild perturbations of the system's matrix is investigated. Using recent developments in the theory of singularities of differentiable maps, it is shown that the stability of a response curve depends on the structure of the system's matrix. In particular, the frequency response curves of a cyclic system are shown to be unstable. Consequently, slight parameter variations engendered by mistuning will induce a significant difference in the topology of the forced response curves if the mistuning transformation crosses the bifurcation set.

  6. Cumulative Clearness Index Frequency Distributions on the Territory of the Russian Federation

    Science.gov (United States)

    Frid, S. E.; Lisitskaya, N. V.; Popel, O. S.

    2018-02-01

    Cumulative distributions of clearness index values are constructed for the territory of Russia based on ground observation results and NASA POWER data. The obtained distributions lie close to each other, which means that the NASA POWER data can be used in the simulation of solar power installations at temperate and high latitudes. An approximation of the obtained distributions is carried out, and the equation coefficients for the cumulative clearness index distributions are determined for a wide range of climatic conditions. Equation forms originally proposed for a tropical climate were found to apply in these calculations as well, so they can be regarded as universal.
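    An empirical cumulative clearness-index distribution of this kind can be sketched as follows; the beta-distributed daily series is synthetic, standing in for a year of observations rather than the Russian or NASA POWER data:

```python
import numpy as np

# Empirical cumulative distribution of the daily clearness index
# k_t = H / H0 (surface over extraterrestrial radiation).
rng = np.random.default_rng(1)
kt = rng.beta(4, 3, 365) * 0.85  # synthetic daily k_t values

kt_sorted = np.sort(kt)
F = np.arange(1, kt_sorted.size + 1) / kt_sorted.size  # F(k) = P(K <= k)

# Fraction of days with k_t at or below 0.4 (mostly cloudy skies):
p_cloudy = F[np.searchsorted(kt_sorted, 0.4, side="right") - 1]
print(f"P(k_t <= 0.4) ≈ {p_cloudy:.2f}")
```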

  7. Modelling and assessment of urban flood hazards based on rainfall intensity-duration-frequency curves reformation

    OpenAIRE

    Ghazavi, Reza; Moafi Rabori, Ali; Ahadnejad Reveshty, Mohsen

    2016-01-01

    Estimate design storm based on rainfall intensity–duration–frequency (IDF) curves is an important parameter for hydrologic planning of urban areas. The main aim of this study was to estimate rainfall intensities of Zanjan city watershed based on overall relationship of rainfall IDF curves and appropriate model of hourly rainfall estimation (Sherman method, Ghahreman and Abkhezr method). Hydrologic and hydraulic impacts of rainfall IDF curves change in flood properties was evaluated via Stormw...

  8. Modelling precipitation extremes in the Czech Republic: update of intensity–duration–frequency curves

    Directory of Open Access Journals (Sweden)

    Michal Fusek

    2016-11-01

    Full Text Available Precipitation records from six stations of the Czech Hydrometeorological Institute were subjected to statistical analysis with the objectives of updating the intensity–duration–frequency (IDF) curves by applying extreme value distributions, and comparing the updated curves against those produced by an empirical procedure in 1958. Another objective was to investigate differences between the two sets of curves that could be explained by such factors as different measuring instruments, measuring station altitudes and data analysis methods. It has been shown that the differences between the two sets of IDF curves are significantly influenced by the chosen method of data analysis.
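    A minimal sketch of fitting an extreme value distribution to annual maxima for this kind of IDF update, using the Gumbel distribution with method-of-moments estimates (the 40-year series is synthetic, for illustration only):

```python
import numpy as np

# Method-of-moments Gumbel fit to synthetic annual-maximum 60-min rainfall
# depths (mm); 40 "years" of record.
rng = np.random.default_rng(7)
annual_max = 15 + 8 * rng.gumbel(size=40)

mean, std = annual_max.mean(), annual_max.std(ddof=1)
beta = std * np.sqrt(6) / np.pi      # Gumbel scale
mu = mean - 0.5772 * beta            # Gumbel location (Euler-Mascheroni constant)

# Return level: depth exceeded on average once every T years.
depths = {T: mu - beta * np.log(-np.log(1 - 1 / T)) for T in (2, 10, 50, 100)}
for T, x in depths.items():
    print(f"T = {T:>3} yr: depth ≈ {x:.1f} mm")
```

    Repeating the fit for each storm duration and plotting depth (or intensity) against duration for each return period yields the IDF curves themselves.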

  9. Dose-response curve for translocation frequency with single pair of painted chromosome. A comparison with dicentric and micronuclei frequency

    Energy Technology Data Exchange (ETDEWEB)

    Venkatachalam, P.; Paul, S.F.D.; Mohankumar, M.N.; Prabhu, B.K.; Gajendiran, N.; Jeevanram, R.K.

    2000-07-01

    A translocation dose-response curve using a single pair of painted chromosomes was constructed. The translocation frequencies observed at different doses were compared to those obtained for dicentrics (DC) and micronuclei (MN). The translocation and DC frequency followed the Poisson distribution and MN showed over-dispersion. The translocation and DC frequencies were nearly the same for each dose point. Micronuclei showed a comparatively lower frequency. The alpha/beta ratio for translocations (0.916) and DC (0.974) were comparable, whereas the value for MN (1.526) was much higher. The equal frequencies of translocations and DC observed for a given dose indicated that genomic translocation frequency estimated using a single pair of painted chromosomes provides a reliable and easy method to measure translocation frequency. (author)

  10. Dose-response curve for translocation frequency with single pair of painted chromosome. A comparison with dicentric and micronuclei frequency

    International Nuclear Information System (INIS)

    Venkatachalam, P.; Paul, S.F.D.; Mohankumar, M.N.; Prabhu, B.K.; Gajendiran, N.; Jeevanram, R.K.

    2000-01-01

    A translocation dose-response curve using a single pair of painted chromosomes was constructed. The translocation frequencies observed at different doses were compared to those obtained for dicentrics (DC) and micronuclei (MN). The translocation and DC frequency followed the Poisson distribution and MN showed over-dispersion. The translocation and DC frequencies were nearly the same for each dose point. Micronuclei showed a comparatively lower frequency. The alpha/beta ratio for translocations (0.916) and DC (0.974) were comparable, whereas the value for MN (1.526) was much higher. The equal frequencies of translocations and DC observed for a given dose indicated that genomic translocation frequency estimated using a single pair of painted chromosomes provides a reliable and easy method to measure translocation frequency. (author)
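    The linear-quadratic fit behind such alpha/beta ratios can be sketched as a no-intercept least-squares problem; the dose points and yields below are synthetic, constructed to match the reported translocation ratio of 0.916:

```python
import numpy as np

# Fit the linear-quadratic model Y = a*D + b*D^2 used for dicentric and
# translocation yields. The data are synthetic, built so a/b = 0.916.
dose = np.array([0.25, 0.5, 1.0, 2.0, 3.0, 4.0])     # Gy
yields = 0.916 * 0.02 * dose + 0.02 * dose**2        # aberrations per cell

# Least squares with no intercept: design columns are D and D^2.
A = np.column_stack([dose, dose**2])
(a, b), *_ = np.linalg.lstsq(A, yields, rcond=None)
print(f"alpha = {a:.4f}, beta = {b:.4f}, alpha/beta = {a/b:.3f}")
```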

  11. Bayesian Inference of Nonstationary Precipitation Intensity-Duration-Frequency Curves for Infrastructure Design

    Science.gov (United States)

    2016-03-01

    This technical note presents a probability-based means by which to develop local precipitation Intensity-Duration-Frequency (IDF) curves using historical rainfall time series data, with Bayesian inference accounting for nonstationarity. Ensemble rainfall events derived from each IDF curve are subsequently used to force a calibrated and validated precipitation-runoff model in support of probability-based, risk-informed hydrologic design. (ERDC/CHL CHETN-X-2, March 2016. Approved for public release; distribution is unlimited.)

  12. Use of regionalisation approach to develop fire frequency curves for Victoria, Australia

    Science.gov (United States)

    Khastagir, Anirban; Jayasuriya, Niranjali; Bhuyian, Muhammed A.

    2017-11-01

    It is important to perform fire frequency analysis to obtain fire frequency curves (FFCs) based on fire intensity at different parts of Victoria. In this paper, FFCs were derived based on the forest fire danger index (FFDI), a measure related to fire initiation, spreading speed and containment difficulty. The mean temperature (T), relative humidity (RH) and areal extent of open water (LC2) during the summer months (Dec-Feb) were identified as the most important parameters for assessing the risk of bushfire occurrence. Based on these parameters, the Andrews curve equation was applied to 40 selected meteorological stations to identify homogeneous stations forming unique clusters. A methodology using peak FFDI from cluster-averaged FFDIs was developed by applying the Log Pearson Type III (LPIII) distribution to generate FFCs. A total of nine homogeneous clusters across Victoria were identified, and their FFCs were subsequently developed in order to estimate the regionalised fire occurrence characteristics.
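    The Andrews curve construction used to group stations can be sketched for a 3-feature vector (T, RH, LC2); the station values below are hypothetical and assumed standardized:

```python
import numpy as np

# Andrews curves map each station's feature vector x = (x1, x2, x3) to a
# function f(t); stations whose curves stay close across t form a cluster.
def andrews_curve(x, t):
    """f(t) = x1/sqrt(2) + x2*sin(t) + x3*cos(t) for a 3-feature vector."""
    return x[0] / np.sqrt(2) + x[1] * np.sin(t) + x[2] * np.cos(t)

t = np.linspace(-np.pi, np.pi, 200)
stations = {
    "A": np.array([0.9, 1.1, -0.2]),
    "B": np.array([1.0, 1.0, -0.3]),   # similar to A -> same cluster
    "C": np.array([-1.2, -0.8, 1.5]),  # dissimilar station
}
curves = {name: andrews_curve(x, t) for name, x in stations.items()}

# Mean absolute separation between curves as a crude similarity measure:
d_ab = np.mean(np.abs(curves["A"] - curves["B"]))
d_ac = np.mean(np.abs(curves["A"] - curves["C"]))
print(f"A-B separation = {d_ab:.2f}, A-C separation = {d_ac:.2f}")
```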

  13. Fitness costs of increased cataract frequency and cumulative radiation dose in natural mammalian populations from Chernobyl

    OpenAIRE

    Lehmann, Philipp; Boratyński, Zbyszek; Mappes, Tapio; Mousseau, Timothy A.; Møller, Anders P.

    2016-01-01

    A cataract is a clouding of the lens that reduces light transmission to the retina, and it decreases the visual acuity of the bearer. The prevalence of cataracts in natural populations of mammals, and their potential ecological significance, is poorly known. Cataracts have been reported to arise from high levels of oxidative stress and a major cause of oxidative stress is ionizing radiation. We investigated whether elevated frequencies of cataracts are found in eyes of bank voles Myodes glare...

  14. Fitness costs of increased cataract frequency and cumulative radiation dose in natural mammalian populations from Chernobyl.

    Science.gov (United States)

    Lehmann, Philipp; Boratyński, Zbyszek; Mappes, Tapio; Mousseau, Timothy A; Møller, Anders P

    2016-01-27

    A cataract is a clouding of the lens that reduces light transmission to the retina, and it decreases the visual acuity of the bearer. The prevalence of cataracts in natural populations of mammals, and their potential ecological significance, is poorly known. Cataracts have been reported to arise from high levels of oxidative stress and a major cause of oxidative stress is ionizing radiation. We investigated whether elevated frequencies of cataracts are found in eyes of bank voles Myodes glareolus collected from natural populations in areas with varying levels of background radiation in Chernobyl. We found high frequencies of cataracts in voles collected from different areas in Chernobyl. The frequency of cataracts was positively correlated with age, and in females also with the accumulated radiation dose. Furthermore, the number of offspring in female voles was negatively correlated with cataract severity. The results suggest that cataracts primarily develop as a function of ionizing background radiation, most likely as a plastic response to high levels of oxidative stress. It is therefore possible that the elevated levels of background radiation in Chernobyl affect the ecology and fitness of local mammals both directly through, for instance, reduced fertility and indirectly, through increased cataractogenesis.

  15. Recurrence and frequency of disturbance have cumulative effect on methanotrophic activity, abundance, and community structure.

    Directory of Open Access Journals (Sweden)

    Adrian eHo

    2016-01-01

    Full Text Available Alternating prolonged drought and heavy rainfall is predicted to intensify with global warming. Desiccation-rewetting events alter the soil quality and nutrient concentrations which drive microbial-mediated processes, including methane oxidation, a key biogeochemical process catalyzed by methanotrophic bacteria. Although aerobic methanotrophs have shown remarkable resilience to a suite of physical disturbances induced as a single event, their resilience to recurring disturbances is less known. Here, using a rice field soil in a microcosm study, we determined whether the recurrence and frequency of desiccation-rewetting impose an accumulating effect on methanotrophic activity. The response of key aerobic methanotroph subgroups (type Ia, Ib, and II) was monitored using qPCR assays, supported by a t-RFLP analysis. The methanotrophic activity was resilient to recurring desiccation-rewetting, but increasing the frequency of the disturbance two-fold significantly decreased the methane uptake rate. Both the qPCR and t-RFLP analyses were congruent, showing the dominance of type Ia/Ib methanotrophs prior to disturbance; after disturbance, the recovering community was predominantly comprised of type Ia (Methylobacter) methanotrophs. Both type Ib and type II (Methylosinus/Methylocystis) methanotrophs were adversely affected by the disturbance, but type II methanotrophs showed recovery over time, indicating relatively higher resilience to the disturbance. This revealed distinct, yet unrecognized traits among the methanotroph community members. Our results show that recurring desiccation-rewetting before a recovery in community abundance had an accumulated effect, compromising methanotrophic activity. While methanotrophs may recover well following sporadic disturbances, their resilience may reach a 'tipping point' at which activity no longer recovers if disturbances persist and increase in frequency.

  16. Next-Generation Intensity-Duration-Frequency Curves for Hydrologic Design in Snow-Dominated Environments

    Science.gov (United States)

    Yan, Hongxiang; Sun, Ning; Wigmosta, Mark; Skaggs, Richard; Hou, Zhangshuan; Leung, Ruby

    2018-02-01

    There is a renewed focus on the design of infrastructure resilient to extreme hydrometeorological events. While precipitation-based intensity-duration-frequency (IDF) curves are commonly used as part of infrastructure design, a large percentage of peak runoff events in snow-dominated regions are caused by snowmelt, particularly during rain-on-snow (ROS) events. In these regions, precipitation-based IDF curves may lead to substantial overestimation/underestimation of design basis events and subsequent overdesign/underdesign of infrastructure. To overcome this deficiency, we proposed next-generation IDF (NG-IDF) curves, which characterize the actual water reaching the land surface. We compared NG-IDF curves to standard precipitation-based IDF curves for estimates of extreme events at 376 Snowpack Telemetry (SNOTEL) stations across the western United States that each had at least 30 years of high-quality records. We found standard precipitation-based IDF curves at 45% of the stations were subject to underdesign, many with significant underestimation of 100 year extreme events, for which the precipitation-based IDF curves can underestimate water potentially available for runoff by as much as 125% due to snowmelt and ROS events. The regions with the greatest potential for underdesign were in the Pacific Northwest, the Sierra Nevada Mountains, and the Middle and Southern Rockies. We also found the potential for overdesign at 20% of the stations, primarily in the Middle Rockies and Arizona mountains. These results demonstrate the need to consider snow processes in the development of IDF curves, and they suggest use of the more robust NG-IDF curves for hydrologic design in snow-dominated environments.

  17. Next-Generation Intensity-Duration-Frequency Curves for Hydrologic Design in Snow-Dominated Environments

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Hongxiang [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland Washington United States; Sun, Ning [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland Washington United States; Wigmosta, Mark [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland Washington United States; Distinguished Faculty Fellow, Department of Civil and Environmental Engineering, University of Washington, Seattle Washington United States; Skaggs, Richard [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland Washington United States; Hou, Zhangshuan [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland Washington United States; Leung, Ruby [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland Washington United States

    2018-02-01

    There is a renewed focus on the design of infrastructure resilient to extreme hydrometeorological events. While precipitation-based intensity-duration-frequency (IDF) curves are commonly used as part of infrastructure design, a large percentage of peak runoff events in snow-dominated regions are caused by snowmelt, particularly during rain-on-snow (ROS) events. In these regions, precipitation-based IDF curves may lead to substantial over-/under-estimation of design basis events and subsequent over-/under-design of infrastructure. To overcome this deficiency, we proposed next-generation IDF (NG-IDF) curves, which characterize the actual water reaching the land surface. We compared NG-IDF curves to standard precipitation-based IDF curves for estimates of extreme events at 376 Snowpack Telemetry (SNOTEL) stations across the western United States that each had at least 30 years of high-quality records. We found standard precipitation-based IDF curves at 45% of the stations were subject to under-design, many with significant under-estimation of 100-year extreme events, for which the precipitation-based IDF curves can underestimate water potentially available for runoff by as much as 125% due to snowmelt and ROS events. The regions with the greatest potential for under-design were in the Pacific Northwest, the Sierra Nevada Mountains, and the Middle and Southern Rockies. We also found the potential for over-design at 20% of the stations, primarily in the Middle Rockies and Arizona mountains. These results demonstrate the need to consider snow processes in the development of IDF curves, and they suggest use of the more robust NG-IDF curves for hydrologic design in snow-dominated environments.

  18. A comparison of a novel time-based summary measure of dairy cow health against cumulative disease frequency.

    Science.gov (United States)

    McConnel, Craig S; McNeil, Ashleigh A; Hadrich, Joleen C; Lombard, Jason E; Heller, Jane; Garry, Franklyn B

    2018-01-01

    There is an increasing push for dairy production to be scientifically grounded and ethically responsible in the oversight of animal health and well-being. Addressing underlying challenges affecting the quality and length of productive life necessitates novel assessment and accountability metrics. Human medical epidemiologists developed the Disability-Adjusted Life Year metric as a summary measure of health addressing the complementary nature of disease and death. The goal of this project was to develop and implement a dairy Disease-Adjusted Lactation (DALact) summary measure of health, as a comparison against cumulative disease frequency. A total of 5694 cows were enrolled at freshening from January 1st, 2014 through May 26th, 2015 on 3 similarly managed U.S. Midwestern Plains' region dairies. Eleven health categories of interest were tracked from enrollment until culling, death, or the study's completion date. The DALact accounted for the days of life lost due to illness, forced removal, and death relative to the average lactation length across the participating farms. The DALact consistently identified mastitis as the primary disease of concern on all 3 dairies (19,007-23,955 days lost). Secondary issues included musculoskeletal injuries (19,559 days), pneumonia (11,034 days), or lameness (8858 days). By comparison, cumulative frequency measures pointed to mastitis (31-50%) and lameness (25-54%) as the 2 most frequent diseases. Notably, the DALact provided a robust accounting of health events such as musculoskeletal injuries (5010-19,559 days) and calving trauma (2952-5868 days) otherwise overlooked by frequency measures (0-3%). The DALact provides a time-based method for assessing the overall burden of disease on dairies. It is important to emphasize that a summary measure of dairy health goes beyond simply linking morbidity to culling and mortality in a standardized fashion. A summary measure speaks to the burden of disease on both the well-being and

  19. Historical Cost Curves for Hydrogen Masers and Cesium Beam Frequency and Timing Standards

    Science.gov (United States)

    Remer, D. S.; Moore, R. C.

    1985-01-01

    Historical cost curves were developed for hydrogen masers and cesium beam standards used for frequency and timing calibration in the Deep Space Network. These curves may be used to calculate the cost of future hydrogen masers or cesium beam standards in either future or current dollars. The cesium beam standards are decreasing in cost by about 2.3% per year since 1966, and hydrogen masers are decreasing by about 0.8% per year since 1978 relative to the National Aeronautics and Space Administration inflation index.
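    The constant-percentage declines described by such historical cost curves are simple compound arithmetic; the base prices below are hypothetical, only the decline rates come from the abstract:

```python
# Future-cost projection from a historical decline rate, in constant dollars.
# Rates from the abstract: cesium beam standards fell ~2.3%/yr (since 1966),
# hydrogen masers ~0.8%/yr (since 1978), relative to the NASA inflation index.
def projected_cost(cost_now, annual_decline, years):
    """Constant-percentage decline: cost * (1 - r)**n."""
    return cost_now * (1 - annual_decline) ** years

cesium_in_10yr = projected_cost(100_000, 0.023, 10)  # hypothetical $100k unit
maser_in_10yr = projected_cost(250_000, 0.008, 10)   # hypothetical $250k unit
print(f"cesium: ${cesium_in_10yr:,.0f}; maser: ${maser_in_10yr:,.0f}")
```

    Converting to then-year dollars would additionally multiply by the inflation index over the same horizon.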

  20. Displacement sensing based on resonant frequency monitoring of electrostatically actuated curved micro beams

    International Nuclear Information System (INIS)

    Krakover, Naftaly; Krylov, Slava; Ilic, B Robert

    2016-01-01

    The ability to control nonlinear interactions of suspended mechanical structures offers a unique opportunity to engineer rich dynamical behavior that extends the dynamic range and ultimate device sensitivity. We demonstrate a displacement sensing technique based on resonant frequency monitoring of curved, doubly clamped, bistable micromechanical beams interacting with a movable electrode. In this configuration, the electrode displacement influences the nonlinear electrostatic interactions, effective stiffness and frequency of the curved beam. Increased sensitivity is made possible by dynamically operating the beam near the snap-through bistability onset. Various in-plane device architectures were fabricated from single crystal silicon and measured under ambient conditions using laser Doppler vibrometry. In agreement with the reduced order Galerkin-based model predictions, our experimental results show a significant resonant frequency reduction near critical snap-through, followed by a frequency increase within the post-buckling configuration. Interactions with a stationary electrode yield a voltage sensitivity of up to ≈560 Hz V⁻¹, and results with a movable electrode allow motion sensitivity of up to ≈1.5 Hz nm⁻¹. Our theoretical and experimental results collectively reveal the potential of displacement sensing using nonlinear interactions of geometrically curved beams near instabilities, with possible applications ranging from highly sensitive resonant inertial detectors to complex optomechanical platforms providing an interface between the classical and quantum domains. (paper)

  1. Probabilistic Rainfall Intensity-Duration-Frequency Curves for the October 2015 Flooding in South Carolina

    Science.gov (United States)

    Phillips, R.; Samadi, S. Z.; Meadows, M.

    2017-12-01

    The potential for the intensity of extreme rainfall to increase with climate change nonstationarity has emerged as a prevailing issue for the design of engineering infrastructure, underscoring the need to better characterize the statistical assumptions underlying hydrological frequency analysis. The focus of this study is on developing probabilistic rainfall intensity-duration-frequency (IDF) curves for the major catchments in South Carolina (SC), where the October 02-05, 2015 floods caused infrastructure damage and the loss of several lives. Candidate probability distributions, including the Weibull, generalized extreme value (GEV), generalized Pareto (GP), Gumbel, Fréchet, normal, and log-normal functions, were fitted to the short-duration (i.e., 24-hr) intense rainfall. Analysis suggests that the GEV probability distribution provided the most adequate fit to the rainfall records. Rainfall frequency analysis indicated return periods above 500 years for urban drainage systems, with a maximum return level of approximately 2,744 years, whereas rainfall magnitude was much lower in rural catchments. Further, the return levels (i.e., 2, 20, 50, 100, 500, and 1000 years) computed by the Monte Carlo method were consistently higher than the NOAA design IDF curves. Given the potential increase in the magnitude of intense rainfall, current IDF curves can substantially underestimate the frequency of extremes, indicating the susceptibility of the storm drainage and flood control structures in SC that were designed under assumptions of a stationary climate.
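    Monte Carlo return levels of the kind compared against the NOAA IDF curves can be sketched with a parametric bootstrap; for brevity this uses a Gumbel model rather than the study's GEV, and the location, scale, and record length are illustrative:

```python
import numpy as np

# Parametric bootstrap (Monte Carlo) interval for a 100-year return level.
rng = np.random.default_rng(3)
mu, beta, n_years = 60.0, 18.0, 50  # hypothetical fitted params, record length

def gumbel_return_level(loc, scale, T):
    return loc - scale * np.log(-np.log(1 - 1 / T))

levels = []
for _ in range(2000):
    sample = rng.gumbel(mu, beta, n_years)   # resample a synthetic record
    s = sample.std(ddof=1) * np.sqrt(6) / np.pi   # method-of-moments scale
    m = sample.mean() - 0.5772 * s                # method-of-moments location
    levels.append(gumbel_return_level(m, s, 100))

lo, hi = np.percentile(levels, [2.5, 97.5])
point = gumbel_return_level(mu, beta, 100)
print(f"100-yr level ≈ {point:.0f} mm (95% Monte Carlo interval {lo:.0f}-{hi:.0f})")
```

    The spread of the bootstrap interval is what makes the probabilistic IDF curve: short records yield wide intervals, and the upper bound, not the point estimate, is what a risk-averse design would use.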

  2. Development of Intensity-Duration-Frequency curves at ungauged sites: risk management under changing climate

    Science.gov (United States)

    Liew, San Chuin; Raghavan, Srivatsan V.; Liong, Shie-Yui

    2014-12-01

    The impact of a changing climate is already being felt in several hydrological systems at both regional and sub-regional scales of the globe. Southeast Asia is one of the regions strongly affected by climate change. With climate change, one of the anticipated impacts is an increase in the intensity and frequency of extreme rainfall, which further increases the region's flood catastrophes, human casualties and economic losses. Optimal mitigation measures can be undertaken only when stormwater systems are designed using rainfall Intensity-Duration-Frequency (IDF) curves derived from long, good-quality rainfall records. Developing IDF curves for the future climate can be even more challenging, especially for ungauged sites. The current practice for deriving the current climate's IDF curves at ungauged sites is, for example, to 'borrow' or 'interpolate' data from regions of climatologically similar characteristics. A recent approach derived IDF curves for the present climate by extracting rainfall data from a high-spatial-resolution Regional Climate Model driven by the ERA-40 reanalysis dataset. This approach was demonstrated on an ungauged site (Java, Indonesia) and the results were quite promising. In this paper, the authors extend the application of the approach to other ungauged sites, particularly in Peninsular Malaysia. The results of the study have significant implications for local and regional hydrology (Malaysia and other Southeast Asian countries). The anticipated impacts of climate change, especially increases in rainfall intensity and frequency, motivate the derivation of future IDF curves in this study. The study also provides policy makers with better information on the adequacy of storm drainage design for the current climate at the ungauged sites, and on the adequacy of the existing storm drainage to cope with the impacts of climate change.

  3. The Roles of Macrobenthic Mollusks as Bioindicator in Response to Environmental Disturbance : Cumulative k-dominance curves and bubble plots ordination approaches

    Science.gov (United States)

    Putro, Sapto P.; Muhammad, Fuad; Aininnur, Amalia; Widowati; Suhartana

    2017-02-01

    Floating net cage culture is an aquaculture practice operated in Indonesian coastal areas that has grown rapidly over the last two decades. This study aimed to assess the roles of macrobenthic mollusks as bioindicators in response to environmental disturbance caused by fish farming activities, and to compare the samples within the locations using graphical methods. The research was done at the floating net cage fish farming area in the Awerange Gulf, South Sulawesi, Indonesia, at the coordinates between 79°0500‧- 79°1500‧ LS and 953°1500‧- 953°2000‧ BT, at the polyculture and reference areas, the latter located 1 km away from the farming area. Sampling was conducted between October 2014 and June 2015. Sediment samples were taken from the two locations, with two sampling times and three replicates, using a Van Veen grab for biotic and abiotic assessment. Mollusks, the biotic parameter, were sieved through a 1 mm mesh, fixed using a 4% formalin solution, and preserved using a 70% ethanol solution. A total of 15 species of macrobenthic mollusks were found, comprising 14 families and 2 classes (gastropods and bivalves). Based on the cumulative k-dominance analysis projected for each station, the lines of stations K3T1 (reference area; first sampling time) and KJAB P3T2 (polyculture area; second sampling time) lie below the other curves, indicating the highest evenness and diversity, whereas stations K2T1 (reference area; first sampling time) and K3T2 (polyculture area; second sampling time) lie on top, indicating the lowest evenness and diversity. Based on the bubble plots NMDS ordination, the four dominant taxa/species did not clearly show involvement in driving/shifting the ordinate positions of stations on the graph, except T. agilis. However, two species showed involvement in driving/shifting the ordinate position of two stations of the reference areas from the first sampling time by Rynoclavis sordidula

  4. An Experimental Study on the Impact of Different-frequency Elastic Waves on Water Retention Curve

    Science.gov (United States)

    Deng, J. H.; Dai, J. Y.; Lee, J. W.; Lo, W. C.

    2017-12-01

    Over the past few decades, theoretical and experimental studies on the connection between elastic wave attributes and the physical properties of a fluid-bearing porous medium have attracted the attention of many scholars in the fields of porous media flow and hydrogeology. It has previously been determined that the transmission of elastic waves in a porous medium containing two immiscible fluids affects the water retention curve, but it has not been established whether the water retention curve depends on the frequency of the elastic waves, or whether the effect on the soil is temporary or permanent. This research is based on a sand box test in which the soil is divided into three layers (lower, middle, and upper). We discuss the impacts on the water retention curve during the drying process under sound waves (elastic waves) at three frequencies (150 Hz, 300 Hz, and 450 Hz), and the change in the water retention curve before and after exposure. In addition, we observe how sound waves affect the water retention curve at different depths. According to the experimental results, sound waves can cause soil either to expand or to contract. When soil is induced to expand by sound waves, it contracts naturally afterwards and returns to its condition before the exposure. On the contrary, when soil is induced to contract, it is unable to return to its initial condition. Based on these results, it is suggested that sound waves causing soil to expand have a temporary impact, while those causing soil to contract have a permanent impact. In addition, our experimental results show how sound waves affect the water retention curve at different depths: the degree of soil expansion and contraction caused by the sound waves differs at various soil depths. 
Nevertheless, the expanding or contracting of soil is only subject to the

  5. Influence of the turbulence typing scheme upon the cumulative frequency distribution of the calculated relative concentrations for different averaging times

    Energy Technology Data Exchange (ETDEWEB)

    Kretzschmar, J.G.; Mertens, I.

    1984-01-01

    Over the period 1977-1979, hourly meteorological measurements at the Nuclear Energy Research Centre, Mol, Belgium and simultaneous synoptic observations at the nearby military airport of Kleine Brogel were compiled as input data for a bi-Gaussian dispersion model. The available information was first used to determine hourly stability classes in ten widely used turbulent diffusion typing schemes. Systematic correlations between different systems were rare. Twelve different combinations of diffusion typing scheme and dispersion parameters were then used to calculate cumulative frequency distributions of 1 h, 8 h, 16 h, 3 d, and 26 d average ground-level concentrations at receptors at 500 m, 1 km, 2 km, 4 km and 8 km from a continuous ground-level release and an elevated release at 100 m height. Major differences were noted in the extreme values and the higher percentiles, as well as in the annual mean concentrations. These differences are almost entirely due to differences in the numerical values (as a function of distance) of the various sets of dispersion parameters actually in use for impact assessment studies. Dispersion parameter sets giving the lowest normalized ground-level concentration values for ground-level releases give the highest results for elevated releases, and vice versa. While it was illustrated once again that the applicability of a given set of dispersion parameters is restricted to the specific conditions under which the set was derived, it was also concluded that systematic experimental work to validate certain assumptions is urgently needed.
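
    The effect of averaging time on the upper tail can be sketched numerically: block-averaging a concentration series over longer windows flattens the upper percentiles of its cumulative frequency distribution. The sketch below uses synthetic lognormal hourly concentrations (an illustrative assumption, not the Mol data):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical hourly ground-level concentrations for one year; a lognormal is
# a common rough assumption for dispersion-model output (values illustrative).
hourly = rng.lognormal(mean=0.0, sigma=1.0, size=24 * 365)

def averaged_series(x, window):
    """Non-overlapping block averages of length `window` (in hours)."""
    n = len(x) // window
    return x[: n * window].reshape(n, window).mean(axis=1)

def cumulative_frequency(x):
    """Sorted values and their empirical non-exceedance frequencies."""
    xs = np.sort(x)
    freq = np.arange(1, len(xs) + 1) / (len(xs) + 1)  # Weibull plotting position
    return xs, freq

for window in (1, 8, 24):  # 1 h, 8 h and daily averaging times
    vals, freq = cumulative_frequency(averaged_series(hourly, window))
    p98 = np.interp(0.98, freq, vals)
    print(f"{window:>2} h averages: 98th percentile = {p98:.2f}")
```

    Longer averaging times smooth out peaks, so the upper percentiles drop; this is why the choice of averaging time (and of dispersion parameter set) matters most for the extremes.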

  6. Obtaining DDF Curves of Extreme Rainfall Data Using Bivariate Copula and Frequency Analysis

    DEFF Research Database (Denmark)

    Sadri, Sara; Madsen, Henrik; Mikkelsen, Peter Steen

    2009-01-01

    … with duration for a given return period and name them DDF (depth-duration-frequency) curves. The copula approach does not assume the rainfall variables are independent or jointly normally distributed. Rainfall series are extracted in three ways: (1) by maximum mean intensity; (2) by depth and duration … situated near Copenhagen in Denmark. For rainfall extracted using method 2, the marginal distribution of depth was found to fit the Generalized Pareto distribution while duration was found to fit the Gamma distribution, using the method of L-moments. The volume was fit with a generalized Pareto distribution and the duration was fit with a Pearson type III distribution for rainfall extracted using method 3. The Clayton copula was found to be appropriate for bivariate analysis of rainfall depth and duration for both methods 2 and 3. DDF curves derived using the Clayton copula for depth and duration …
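
    A moment-style Clayton copula fit via Kendall's tau can be sketched as follows. The data here are synthetic uniform margins sampled from a Clayton copula with a known parameter (illustrative values only, not the Copenhagen record):

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(42)

def clayton_sample(theta, n, rng):
    """Draw n dependent uniform pairs from a Clayton copula (conditional method)."""
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    v = (u ** -theta * (w ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)
    return u, v

# For the Clayton family, Kendall's tau satisfies tau = theta / (theta + 2),
# giving the simple estimator theta_hat = 2*tau / (1 - tau).
true_theta = 2.0
u, v = clayton_sample(true_theta, 5000, rng)
tau, _ = kendalltau(u, v)
theta_hat = 2 * tau / (1 - tau)
print(f"tau = {tau:.3f}, theta_hat = {theta_hat:.2f}")  # theta_hat ≈ 2
```

    In a DDF application, `u` and `v` would be the probability-integral transforms of depth and duration under their fitted marginals.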

  7. Cumulative response curves to enhance interpretation of treatment differences on the Self-Esteem And Relationship questionnaire for men with erectile dysfunction.

    Science.gov (United States)

    Cappelleri, Joseph C; Zou, Kelly H; Bushmakin, Andrew G; Carlsson, Martin O; Symonds, Tara

    2013-03-01

    What's known on the subject? and What does the study add? Studies on erectile dysfunction (ED) therapies rely heavily on patient-reported outcomes (PROs) to measure efficacy on treatment response. A challenge when using PROs is interpretation of the clinical meaning of changes in scores. A responder analysis provides a threshold score to indicate whether a change in score qualifies a patient as a responder. However, a major consideration with responder analysis is the sometimes arbitrary nature of defining the threshold for a response. By contrast, cumulative response curves (CRCs) display patient response rates over a continuum of possible thresholds, thus eliminating problems with a rigid threshold definition, allowing for a variety of response thresholds to be examined simultaneously, and encompassing all data. With respect to the psychosocial factors addressed in the Self-Esteem And Relationship questionnaire in ED, CRCs clearly, distinctly, and meaningfully highlighted the favourable profiles of responses to sildenafil compared with placebo. CRCs for PROs in urology can provide a clear, transparent and meaningful visual depiction of efficacy data that can supplement and complement other analyses. To use cumulative response curves (CRCs) to enrich meaning and enhance interpretation of scores on the Self-Esteem And Relationship (SEAR) questionnaire with respect to treatment differences for men with erectile dysfunction (ED). This post hoc analysis used data from all patients who took at least one dose of study drug and had at least one post-baseline efficacy evaluation in a previously published 12-week, multicentre, randomized, double-blind, placebo-controlled trial of flexible-dose (25, 50, or 100 mg) sildenafil citrate (Viagra) in adult men with ED who had scored ≤ 75 out of 100 on the Self-Esteem subscale of the SEAR questionnaire. CRCs were used on the numeric change in transformed SEAR scores from baseline to end-of-study for each SEAR component. The

  8. Geomorphology and Geology of the Southwestern Margaritifer Sinus and Argyre Regions of Mars. Part 2: Crater Size-frequency Distribution Curves and Geomorphic Unit Ages

    Science.gov (United States)

    Parker, T. J.; Pieri, D. C.

    1985-01-01

    In assessing the relative ages of the geomorphic/geologic units, crater counts of the entire unit or nearly the entire unit were made and summed in order to get a more accurate value than obtainable by counts of isolated sections of each unit. Cumulative size-frequency counts show some interesting relationships. Most of the units show two distinct crater populations, with a flattening out of the distribution curve at and below 10 km diameter craters. Above this crater size the curves for the different units diverge most notably. In general, the variance may reflect the relative ages of these units. At times, however, in the larger crater size range, these curves can overlap and cross one another. Also, the error bars at these larger sizes are broader (and thus more suspect), since counts of larger craters show more scatter, whereas the unit areas remain constant. Occasional clusters of relatively large craters within a given unit, particularly one of limited areal extent, can affect the curve so that the unit might seem to be older than units which it overlies or cuts.

  9. Development of Sub-Daily Intensity Duration Frequency (IDF) Curves for Major Urban Areas in India

    Science.gov (United States)

    Ali, H.; Mishra, V.

    2014-12-01

    Extreme precipitation events disrupt urban transportation and cause enormous damage to infrastructure. Urban areas are fast-responding catchments due to their significant impervious surface, so stormwater designs based on daily rainfall data provide inadequate information. We therefore develop intensity-duration-frequency curves using sub-daily (1 hour to 12 hour) rainfall data for 57 major urban areas in India. While rain gauge data from urban areas would be most suitable, stations are unevenly distributed and their records have gaps and inconsistencies. Therefore, we used hourly rainfall data from the Modern Era Retrospective-analysis for Research and Applications (MERRA), which provides long-term data (1979 onwards). Since reanalysis products have uncertainty associated with them, we need to enhance their accuracy before application. We compared daily rain gauge station data obtained from the Global Surface Summary of the Day (GSOD), available for 65 stations for the period 2000-2010, with gridded daily rainfall data provided by the Indian Meteorological Department (IMD). Three-hourly data from the NOAA/Climate Prediction Center morphing technique (CMORPH), Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN), and the Tropical Rainfall Measuring Mission (TRMM) Multisatellite Precipitation Analysis (TMPA) were aggregated to daily values for comparison with the GSOD station data. TMPA was found to be best correlated with the GSOD data. We used the TMPA data to correct MERRA's hourly precipitation, which was then applied to develop IDF curves. We compared the results with IDF curves from empirical methods and found substantial disparities in the existing stormwater designs in India.
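
    A minimal sketch of this kind of bias correction is empirical quantile mapping between a reference product and a biased reanalysis. Synthetic gamma-distributed rainfall stands in for TMPA and MERRA below, and generic quantile mapping is shown as an assumption, not necessarily the exact correction the authors applied:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical daily rainfall: a "reference" product (stand-in for TMPA) and a
# biased "reanalysis" (stand-in for MERRA) that underestimates rainfall.
reference = rng.gamma(shape=0.6, scale=12.0, size=3000)
reanalysis = 0.6 * rng.gamma(shape=0.6, scale=12.0, size=3000)

# Empirical quantile mapping: replace each reanalysis value by the reference
# value at the same empirical quantile.
ref_sorted = np.sort(reference)
rea_sorted = np.sort(reanalysis)

def quantile_map(x):
    q = np.searchsorted(rea_sorted, x) / len(rea_sorted)
    return np.interp(q, np.linspace(0, 1, len(ref_sorted)), ref_sorted)

corrected = quantile_map(reanalysis)
print(f"reference mean  {reference.mean():.2f}")
print(f"raw mean        {reanalysis.mean():.2f}")
print(f"corrected mean  {corrected.mean():.2f}")
```

    The corrected series reproduces the reference distribution, including the heavy upper tail that drives the IDF curves.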

  10. Quantifying Changes in Future Intensity-Duration-Frequency Curves Using Multimodel Ensemble Simulations

    Science.gov (United States)

    Ragno, Elisa; AghaKouchak, Amir; Love, Charlotte A.; Cheng, Linyin; Vahedifard, Farshid; Lima, Carlos H. R.

    2018-03-01

    During the last century, we have observed a warming climate with more intense precipitation extremes in some regions, likely due to increases in the atmosphere's water holding capacity. Traditionally, infrastructure design and rainfall-triggered landslide models rely on the notion of stationarity, which assumes that the statistics of extremes do not change significantly over time. However, in a warming climate, infrastructures and natural slopes will likely face more severe climatic conditions, with potential human and socioeconomical consequences. Here we outline a framework for quantifying climate change impacts based on the magnitude and frequency of extreme rainfall events using bias corrected historical and multimodel projected precipitation extremes. The approach evaluates changes in rainfall Intensity-Duration-Frequency (IDF) curves and their uncertainty bounds using a nonstationary model based on Bayesian inference. We show that highly populated areas across the United States may experience extreme precipitation events up to 20% more intense and twice as frequent, relative to historical records, despite the expectation of unchanged annual mean precipitation. Since IDF curves are widely used for infrastructure design and risk assessment, the proposed framework offers an avenue for assessing resilience of infrastructure and landslide hazard in a warming climate.

  11. Developing Intensity–Duration–Frequency (IDF) Curves under Climate Change Uncertainty: The Case of Bangkok, Thailand

    Directory of Open Access Journals (Sweden)

    Ashish Shrestha

    2017-02-01

    The magnitude and frequency of hydrological events are expected to increase in coming years due to climate change in the megacities of Asia. Intensity–Duration–Frequency (IDF) curves represent essential means to study effects on the performance of drainage systems, so the need for updating IDF curves comes from the necessity to gain a better understanding of climate change effects. The present paper explores an approach based on a spatial downscaling-temporal disaggregation method (DDM) to develop future IDFs using a stochastic weather generator, the Long Ashton Research Station Weather Generator (LARS-WG), and the rainfall disaggregation tool Hyetos. The work was carried out for the case of Bangkok, Thailand. The application of LARS-WG to project extreme rainfalls showed promising results, and nine global climate models (GCMs) were used to estimate changes in IDF characteristics for the future time periods 2011–2030 and 2046–2065 under climate change scenarios. The IDFs derived from this approach were corrected using a higher-order equation to mitigate biases. IDFs from all GCMs showed increasing intensities in the future for all return periods. The work presented demonstrates the potential of this approach in projecting future climate scenarios for urban catchments where long-term hourly rainfall data are not readily available.

  12. Repeated exposure to high-frequency spanking and child externalizing behavior across the first decade: a moderating role for cumulative risk.

    Science.gov (United States)

    MacKenzie, Michael J; Nicklas, Eric; Brooks-Gunn, Jeanne; Waldfogel, Jane

    2014-12-01

    This study used the Fragile Families and Child Well-Being Study to examine the effects of repeated exposure to harsh parenting on child externalizing behavior across the first decade of life, and a moderating role for cumulative ecological risk. Maternal report of harsh parenting, defined as high-frequency spanking, was assessed at ages 1, 3, 5, and 9, along with child externalizing at age 9 (N=2,768). Controlling for gender, race, maternal nativity, and city of residence, we found a cumulative risk index to significantly moderate the effects of repeated harsh parenting on child behavior, with the effects of repeated high-frequency spanking amplified for those experiencing greater levels of cumulative risk. Harsh parenting, in the form of high-frequency spanking, remains an all-too-common experience for children, and the results demonstrate that the effects of repeated exposure to harsh parenting across the first decade are amplified for those children already facing the most burden. Copyright © 2014. Published by Elsevier Ltd.

  13. Precipitation intensity-duration-frequency curves and their uncertainties for Ghaap plateau

    Directory of Open Access Journals (Sweden)

    C.M. Tfwala

    2017-01-01

    Engineering infrastructures such as stormwater drains and bridges are commonly designed using the concept of Intensity-Duration-Frequency (IDF) curves, which assume that the occurrence of precipitation patterns and distributions is spatially similar within the drainage area and remains unchanged throughout the lifespan of the infrastructure (stationarity). Because climate change will alter the spatial and temporal variability of precipitation patterns, inaccuracy in the estimation of IDF curves may occur. As such, prior to developing IDF curves, it is crucial to analyse trends in the annual precipitation maxima. The objective of this study was to estimate precipitation intensities and their uncertainties (lower and upper limits) for durations of 0.125, 0.25, 0.5, 1, 2, 4, and 6 h and return periods of 2, 10, 25, 50 and 100 years on the Ghaap plateau, Northern Cape Province, South Africa, using the Generalized Extreme Value (GEV) distribution. The annual precipitation maxima were extracted from long-term (1918–2014) precipitation data for four meteorological stations (Postmasburg, Douglas, Kuruman and Groblershoop) sourced from the South African Weather Service (SAWS). On average, the estimated extreme precipitation intensities for the plateau ranged from 4.2 mm/h for a 6 h storm duration to 55.8 mm/h for 0.125 h at a 2-year return period. At a 100-year return period, the intensity ranged from 13.3 mm/h for a 6 h duration to 175.5 mm/h for a duration of 0.125 h. The lower limit of uncertainty ranged from 11.7% at the 2-year return period to 26% at the 100-year return period, and the upper limit from 12.8% to 58.4% for the respective return periods. This methodology can be integrated into policy formulation for the design of stormwater and flood management infrastructure on the Ghaap plateau, where mining is the main economic activity.
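
    The core GEV step can be sketched in a few lines: fit the distribution to an annual maximum series and read off return levels as quantiles. The sample below is synthetic, drawn from a known GEV rather than the Ghaap plateau record; note that scipy's shape parameter `c` is the negative of the usual GEV shape ξ:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)
# Hypothetical annual maximum rainfall intensities (mm/h) for one duration,
# drawn from a known GEV so the example is self-contained.
true = genextreme(c=-0.1, loc=20.0, scale=6.0)  # scipy c = -xi
annual_maxima = true.rvs(size=97, random_state=rng)  # ~1918-2014 record length

# Fit a GEV by maximum likelihood.
c, loc, scale = genextreme.fit(annual_maxima)
fitted = genextreme(c=c, loc=loc, scale=scale)

# A T-year return level is the quantile with non-exceedance prob 1 - 1/T.
for T in (2, 10, 25, 50, 100):
    print(f"T = {T:>3} yr: intensity = {fitted.ppf(1 - 1 / T):.1f} mm/h")
```

    Uncertainty limits such as those quoted above could then be obtained, for example, by bootstrapping the annual maxima and repeating the fit.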

  14. An Empirical Method to Derive Hourly Temperature Frequencies for Locations Possessing Only Summarized Climate Information

    National Research Council Canada - National Science Library

    Krause, Paul

    1997-01-01

    ... thresholds are equaled or exceeded. Prior research focused on both estimating temperature frequencies at any point along the cumulative temperature frequency curve, and on estimation at fixed frequency points within the tails of the distribution...

  15. Analysis of LDPE-ZnO-clay nanocomposites using novel cumulative rheological parameters

    Science.gov (United States)

    Kracalik, Milan

    2017-05-01

    Polymer nanocomposites exhibit complex rheological behaviour due to physical, and possibly also chemical, interactions between the individual phases. Up to now, the rheology of dispersive polymer systems has usually been described by evaluating the viscosity curve (shear-thinning phenomenon) or the storage modulus curve (formation of a secondary plateau), or by plotting information about damping behaviour (e.g. the Van Gurp-Palmen plot, or a comparison of the loss factor tan δ). In contrast to the evaluation of damping behaviour, values of cot δ were calculated and termed the "storage factor", by analogy with the loss factor. Values of the storage factor were then integrated over a specific frequency range and termed the "cumulative storage factor". In this contribution, LDPE-ZnO-clay nanocomposites with different dispersion grades (physical networks) have been prepared and characterized by both the conventional and the novel analysis approach. Next to the cumulative storage factor, further cumulative rheological parameters such as cumulative complex viscosity, cumulative complex modulus and cumulative storage modulus have been introduced.
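
    The cumulative storage factor described above reduces to a numerical integration of cot δ = G′/G″ over the measured frequency window. A sketch with purely illustrative power-law moduli (not the LDPE-ZnO-clay data):

```python
import numpy as np

# Hypothetical small-amplitude oscillatory shear data: angular frequency
# (rad/s), storage modulus G' and loss modulus G'' (Pa); the power-law forms
# are illustrative assumptions, not measured nanocomposite values.
omega = np.logspace(-1, 2, 16)
G_p = 200.0 * omega ** 0.4     # G', elastic part
G_pp = 300.0 * omega ** 0.6    # G'', viscous part

# Loss factor tan(delta) = G''/G'; the "storage factor" is its reciprocal,
# cot(delta) = G'/G''. Integrating it over the measured frequency range
# (trapezoidal rule) gives the cumulative storage factor.
storage_factor = G_p / G_pp
cumulative_storage_factor = np.sum(
    (storage_factor[:-1] + storage_factor[1:]) / 2 * np.diff(omega)
)
print(f"cumulative storage factor = {cumulative_storage_factor:.1f}")
```

    A better-dispersed physical network raises G′ relative to G″ across the window, so this single cumulative number increases with dispersion grade.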

  16. The frequency content of Double-Mode Cepheids light curves and the importance of the cross-coupling terms

    OpenAIRE

    Poretti, Ennio

    1997-01-01

    The recent results (Pardo & Poretti 1997, A&A 324, 121; Poretti & Pardo 1997, A&A 324, 133) obtained on the frequency content of Double-Mode Cepheids light curves and the properties of their Fourier parameters are reviewed. Some points briefly discussed in previous papers (no third periodicity, methodological aspects on the true peaks detection, the action of the cross coupling terms and the impact on theoretical models) are described.

  17. Dealing with Non-stationarity in Intensity-Frequency-Duration Curve

    Science.gov (United States)

    Rengaraju, S.; Rajendran, V.; C T, D.

    2017-12-01

    Extremes such as floods and droughts are becoming more frequent and more severe in recent times, a change generally attributed to climate change. One of the main concerns is whether present infrastructures such as dams and storm water drainage networks, which were designed under the so-called `stationary' assumption, are capable of withstanding the expected severe extremes. The stationarity assumption holds that the statistics of extremes do not change with respect to time. However, recent studies have shown that climate change has altered climate extremes both temporally and spatially. Traditionally, observed non-stationarity in extreme precipitation is incorporated into the extreme value distributions in terms of changing parameters. Nevertheless, this raises the question of which parameter needs to change, i.e. location, scale or shape, since one or more of these parameters may vary at a given location. Hence, this study aims to detect the changing parameters to reduce the complexity involved in the development of non-stationary IDF curves, and to provide the uncertainty bound of the estimated return level using a Bayesian Differential Evolutionary Monte Carlo (DE-MC) algorithm. Firstly, the extreme precipitation series is extracted using Peaks Over Threshold. Then, the time-varying parameter(s) is (are) detected for the extracted series using Generalized Additive Models for Location Scale and Shape (GAMLSS). The IDF curve is then constructed using the Generalized Pareto Distribution, incorporating non-stationarity only if the parameter(s) is (are) changing with respect to time; otherwise the IDF curve follows the stationary assumption. Finally, the posterior probability intervals of the estimated return level are computed through the Bayesian DE-MC approach, and the non-stationary IDF curve is compared with the stationary IDF curve.
The results of this study emphasize that the time-varying parameters also change spatially and the IDF curves should incorporate non
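
    The stationary building blocks of such an analysis (peaks-over-threshold extraction and a GPD fit) can be sketched as below. The rainfall record is synthetic, declustering of exceedances is omitted, and no time-varying terms are included; this is the stationary baseline that the study generalizes:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
# Hypothetical 40-year daily rainfall record (mm); gamma-distributed values
# stand in for real observations.
rain = rng.gamma(shape=0.4, scale=8.0, size=365 * 40)

# Peaks over threshold: keep excesses above a high empirical quantile.
# (Real analyses also decluster exceedances; omitted here.)
threshold = np.quantile(rain, 0.99)
excesses = rain[rain > threshold] - threshold

# Fit a Generalized Pareto Distribution to the excesses, location fixed at 0.
shape, _, scale = genpareto.fit(excesses, floc=0)

# T-year return level: with lam exceedances per year on average, the level is
# the threshold plus the GPD quantile at probability 1 - 1/(lam*T).
lam = len(excesses) / 40
for T in (10, 50, 100):
    level = threshold + genpareto.ppf(1 - 1 / (lam * T), shape, loc=0, scale=scale)
    print(f"T = {T:>3} yr: {level:.1f} mm")
```

    A non-stationary variant would let the GPD scale (or shape) depend on time, which is exactly the choice the GAMLSS detection step is meant to inform.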

  18. Letting the (energy) Gini out of the bottle: Lorenz curves of cumulative electricity consumption and Gini coefficients as metrics of energy distribution and equity

    International Nuclear Information System (INIS)

    Jacobson, Arne; Milman, Anita D.; Kammen, Daniel M.

    2005-01-01

    Energy services are fundamental determinants of the quality of life as well as the economic vitality of both industrialized and developing nations. Few analytic tools exist, however, to explore changes in individual, household, and national levels of energy consumption and utilization. In order to contribute to such analyses, we extend the application of Lorenz curves to energy consumption. We examined the distribution of residential electricity consumption in five countries: Norway, USA, El Salvador, Thailand, and Kenya. These countries exhibit a dramatic range of energy profiles, with electricity consumption far more evenly distributed across the population in some industrialized nations than others, and with further significant differences in the Lorenz distribution between industrialized and industrializing economies. The metric also provides critical insights into the temporal evolution of energy management in different states and nations. We illustrate this with a preliminary longitudinal study of commercial and industrial electricity use in California during the economically volatile 1990s. Finally, we explore the limits of Lorenz analyses for understanding energy equity through a discussion of the roles that variations in energy conversion efficiency and climate play in shaping distributions of energy consumption. The Lorenz method, which is widely employed by economists to analyze income distribution, is largely unused in energy analysis, but provides a powerful new tool for estimating the distributional dimensions of energy consumption. Its widespread use can make significant contributions to scientific and policy debates about energy equity in the context of climate change mitigation, electric power industry deregulation and restructuring, and the development of national infrastructure
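
    The Lorenz/Gini computation itself is compact: sort per-household consumption, accumulate shares, and take one minus twice the area under the Lorenz curve. A sketch with hypothetical consumption figures (not the five-country data):

```python
import numpy as np

def lorenz_gini(consumption):
    """Gini coefficient of a set of per-household consumption values."""
    x = np.sort(np.asarray(consumption, dtype=float))
    # Lorenz curve: cumulative share of consumption vs. share of population.
    lorenz = np.insert(np.cumsum(x) / x.sum(), 0, 0.0)
    # Gini = 1 - 2 * (area under the Lorenz curve); trapezoidal rule on the
    # evenly spaced population axis.
    area = np.sum((lorenz[:-1] + lorenz[1:]) / 2) / len(x)
    return 1 - 2 * area

equal = np.full(1000, 3000.0)  # every household consumes 3000 kWh/yr
skewed = np.concatenate([np.full(900, 500.0), np.full(100, 25000.0)])
print(f"equal:  Gini = {lorenz_gini(equal):.2f}")   # 0: perfect equality
print(f"skewed: Gini = {lorenz_gini(skewed):.2f}")  # high: consumption concentrated
```

    Applied to national electricity data, the same few lines yield the cross-country comparisons the paper describes.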

  19. Derivation of flood frequency curves in poorly gauged Mediterranean catchments using a simple stochastic hydrological rainfall-runoff model

    Science.gov (United States)

    Aronica, G. T.; Candela, A.

    2007-12-01

    In this paper a Monte Carlo procedure for deriving frequency distributions of peak flows using a semi-distributed stochastic rainfall-runoff model is presented. The rainfall-runoff model used here is a very simple one, with a limited number of parameters, and requires practically no calibration, resulting in a robust tool for catchments which are partially or poorly gauged. The procedure is based on three modules: a stochastic rainfall generator module, a hydrologic loss module and a flood routing module. In the rainfall generator module the rainfall storm, i.e. the maximum rainfall depth for a fixed duration, is assumed to follow the two-component extreme value (TCEV) distribution, whose parameters have been estimated at regional scale for Sicily. The catchment response has been modelled using the Soil Conservation Service Curve Number (SCS-CN) method, in semi-distributed form, for the transformation of total rainfall to effective rainfall, and a simple form of IUH for the flood routing. Here, the SCS-CN method is implemented in probabilistic form with respect to prior-to-storm conditions, allowing the classical iso-frequency assumption between rainfall and peak flow to be relaxed. The procedure is tested on six practical case studies where synthetic FFCs (flood frequency curves) were obtained from the model variable distributions by simulating 5000 flood events, combining 5000 values of total rainfall depth for the storm duration with AMC (antecedent moisture condition) values. The application of this procedure showed how the Monte Carlo simulation technique can reproduce the observed flood frequency curves with reasonable accuracy over a wide range of return periods, using a simple and parsimonious approach, limited data input and no calibration of the rainfall-runoff model.
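
    The probabilistic SCS-CN idea can be sketched as a toy Monte Carlo: sample a storm depth and a curve number (the CN spread standing in for variable prior-to-storm conditions), convert to effective rainfall with the standard CN equations, and read quantiles off the simulated sample. All distributions below are illustrative assumptions, not the calibrated Sicilian TCEV parameters:

```python
import numpy as np

rng = np.random.default_rng(11)

def scs_cn_runoff(P, CN):
    """SCS-CN effective rainfall (mm) with the standard Ia = 0.2*S abstraction."""
    S = 25400.0 / CN - 254.0            # potential retention (mm), CN in (0, 100]
    Ia = 0.2 * S                        # initial abstraction (mm)
    return np.where(P > Ia, (P - Ia) ** 2 / (P + 0.8 * S), 0.0)

# Toy Monte Carlo: sample a design-storm depth and a curve number.
n = 5000
P = rng.gumbel(loc=40.0, scale=15.0, size=n).clip(min=0.0)  # storm depth (mm)
CN = rng.uniform(60.0, 90.0, size=n)                        # AMC variability
Q = scs_cn_runoff(P, CN)

# Empirical frequency curve: effective-rainfall value exceeded on average once
# every T years, assuming one design storm per year in this toy setup.
for T in (10, 100):
    print(f"T = {T:>3} yr: Q = {np.quantile(Q, 1 - 1 / T):.1f} mm")
```

    In the paper's full procedure the effective rainfall would additionally be routed through an IUH to produce peak flows before the frequency curve is read off.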

  20. Precipitation extremes on multiple timescales - Bartlett-Lewis rectangular pulse model and intensity-duration-frequency curves

    Science.gov (United States)

    Ritschel, Christoph; Ulbrich, Uwe; Névir, Peter; Rust, Henning W.

    2017-12-01

    For several hydrological modelling tasks, precipitation time series with a high (i.e. sub-daily) resolution are indispensable. The data are, however, not always available, and thus model simulations are used to compensate. A canonical class of stochastic models for sub-daily precipitation are Poisson cluster processes, with the original Bartlett-Lewis (OBL) model as a prominent representative. The OBL model has been shown to well reproduce certain characteristics found in observations. Our focus is on intensity-duration-frequency (IDF) relationships, which are of particular interest in risk assessment. Based on a high-resolution precipitation time series (5 min) from Berlin-Dahlem, OBL model parameters are estimated and IDF curves are obtained on the one hand directly from the observations and on the other hand from OBL model simulations. Comparing the resulting IDF curves suggests that the OBL model is able to reproduce the main features of IDF statistics across several durations but cannot capture rare events (here an event with a return period larger than 1000 years on the hourly timescale). In this paper, IDF curves are estimated based on a parametric model for the duration dependence of the scale parameter in the generalized extreme value distribution; this allows us to obtain a consistent set of curves over all durations. We use the OBL model to investigate the validity of this approach based on simulated long time series.

  1. Use of radar QPE for the derivation of Intensity-Duration-Frequency curves in a range of climatic regimes

    Science.gov (United States)

    Marra, Francesco; Morin, Efrat

    2015-12-01

    Intensity-Duration-Frequency (IDF) curves are widely used in flood risk management because they provide an easy link between the characteristics of a rainfall event and the probability of its occurrence. Weather radars provide distributed rainfall estimates with high spatial and temporal resolutions and overcome the scarce representativeness of point-based rainfall for regions characterized by large gradients in rainfall climatology. This work explores the use of radar quantitative precipitation estimation (QPE) for the identification of IDF curves over a region with steep climatic transitions (Israel) using a unique radar data record (23 yr) and combined physical and empirical adjustment of the radar data. IDF relationships were derived by fitting a generalized extreme value distribution to the annual maximum series for durations of 20 min, 1 h and 4 h. Arid, semi-arid and Mediterranean climates were explored using 14 study cases. IDF curves derived from the study rain gauges were compared to those derived from radar and from nearby rain gauges characterized by similar climatology, taking into account the uncertainty linked with the fitting technique. Radar annual maxima and IDF curves were generally overestimated but in 70% of the cases (60% for a 100 yr return period), they lay within the rain gauge IDF confidence intervals. Overestimation tended to increase with return period, and this effect was enhanced in arid climates. This was mainly associated with radar estimation uncertainty, even if other effects, such as rain gauge temporal resolution, cannot be neglected. Climatological classification remained meaningful for the analysis of rainfall extremes and radar was able to discern climatology from rainfall frequency analysis.
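
    Before any GEV fitting, a radar-based IDF analysis needs annual maximum series at several durations, obtained as moving-window accumulations of the fine-resolution record. A sketch on a synthetic 20-min series (not the Israeli radar QPE):

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical 23-year record of 20-min rainfall accumulations (mm); a very
# skewed gamma mimics mostly-dry steps with occasional bursts.
steps_per_year = 3 * 24 * 365           # 20-min steps in a year
series = rng.gamma(shape=0.05, scale=4.0, size=23 * steps_per_year)

def annual_maxima(series, window, steps_per_year):
    """Annual maxima of moving-window accumulations (window in steps)."""
    sums = np.convolve(series, np.ones(window), mode="valid")
    years = len(series) // steps_per_year
    return np.array([sums[y * steps_per_year:(y + 1) * steps_per_year].max()
                     for y in range(years)])

# Durations of 20 min, 1 h and 4 h; the intensities (mm/h) feed the GEV fit.
for window, hours in ((1, 1 / 3), (3, 1.0), (12, 4.0)):
    ams = annual_maxima(series, window, steps_per_year) / hours
    print(f"{hours:4.2f} h: mean annual-max intensity = {ams.mean():.1f} mm/h")
```

    Fitting a GEV to each duration's annual maxima then yields one IDF quantile curve per return period, with shorter durations giving the higher intensities.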

  2. Age frequency distribution and revised stable isotope curves for New Zealand speleothems: palaeoclimatic implications

    Directory of Open Access Journals (Sweden)

    Williams Paul W.

    2010-07-01

    Full Text Available The occurrence of speleothems in New Zealand with reversed magnetism indicates that secondary calcite deposition in caves has occurred for more than 780 thousand years (ka. 394 uranium-series dates on 148 speleothems show that such deposition has taken place somewhere in the country with little interruption for more than 500 ka. A relative probability distribution of speleothem ages indicates that most growth occurred in mild, moist interglacial and interstadial intervals, a conclusion reinforced by comparing peaks and troughs in the distribution with time series curves of speleothem δ18O and δ13C values. The stable isotope time series were constructed using data from 15 speleothems from two different regions of the country. The greater the number of overlapping speleothem series (i.e. the greater the sample depth for any one region, the more confidence is justified in considering the stacked record to be representative of the region. Revising and extending earlier work, composite records are produced for central-west North Island (CWNI and north-west South Island (NWSI. Both demonstrate that over the last 15 ka the regions responded similarly to global climatic events, but that the North Island site was also influenced by the waxing and waning of regional subtropical marine influences that penetrated from the north but did not reach the higher latitudes of the South Island. Cooling marking the commencement of the last glacial maximum (LGM was evident from about 28 ka. There was a mid-LGM interstadial at 23-21.7 ka and Termination 1 occurred around 18.1 ka. The glacial-interglacial transition was marked by a series of negative excursions in δ18O that coincide with dated recessional moraines in South Island glaciers. A late glacial cooling event, the NZ Late Glacial Reversal, occurred from 13.4-11.2 ka and this was followed by an early Holocene optimum at 10.8 ka. Comparison of δ18O records from NWSI and EPICA DML ice-core shows climatic

  3. Effect of training frequency on the learning curve on the da Vinci Skills Simulator.

    Science.gov (United States)

    Walliczek, Ute; Förtsch, Arne; Dworschak, Philipp; Teymoortash, Afshin; Mandapathil, Magis; Werner, Jochen; Güldner, Christian

    2016-04-01

    The purpose of this study was to evaluate the effect of training frequency on the performance outcome with the da Vinci Skills Simulator. Forty novices were enrolled in a prospective training curriculum. Participants were separated into 2 groups: group 1 performed 4 training sessions and group 2 performed 2 training sessions over a 4-week period. Five exercises were performed 3 times consecutively. On the last training day, a new exercise was added. A significant skills gain from the first to the final practice day in overall performance, time to complete, and economy of motion was seen for both groups. Group 1 had a significantly better outcome in overall performance, time to complete, and economy of motion in all exercises. There was no significant difference between group 1 and group 2 on the new exercise in nearly all parameters. Longer intervals between training sessions appear to play a secondary role, whereas total repetition frequency is crucial for improvement of technical performance. © 2015 Wiley Periodicals, Inc. Head Neck 38: E1762-E1769, 2016.

  4. An Optical Low-frequency Quasi-Periodic Oscillation in the Kepler Light Curve of an Active Galaxy

    Science.gov (United States)

    Mushotzky, Richard; Smith, Krista Lynne; Boyd, Patricia; Wagoner, Robert

    2018-01-01

    We report the discovery of a candidate quasi-periodic oscillation (QPO) in the optical light curve of KIC 9650712, a Seyfert 1 galaxy in the original Kepler field. After the development and application of a pipeline for Kepler data specific to active galactic nuclei (AGN), one of our sample of 21 AGN selected by infrared photometry and X-ray flux demonstrates a peak in the power spectrum at 10^-6.58 Hz, corresponding to a temporal period of 44 days. From optical spectroscopy, we measure the black hole mass of this AGN as log(M/M_sun) = 8.17. Despite this high mass, the optical spectrum of KIC 9650712 bears many similarities to Narrow-Line Seyfert 1 (NLS1) galaxies, including strong Fe II emission and a low [O III]/Hβ ratio. So far, X-ray QPOs have primarily been seen in NLS1 galaxies. Finally, we find that this frequency lies along a correlation between low-frequency QPOs and black hole mass from stellar and intermediate-mass black holes to AGN, similar to the known correlation in high-frequency QPOs.
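
    Detecting such a low-frequency peak amounts to locating the maximum of a power spectrum. A sketch with a synthetic, evenly sampled light curve containing a weak 44-day sinusoid (toy white-noise data; a real Kepler AGN pipeline must also handle gaps and red noise):

```python
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(9)
# Hypothetical evenly sampled light curve: 30-min cadence over ~4 years, with
# a weak sinusoid at the 44-day period buried in white noise (toy data only).
dt = 30 * 60.0                          # sampling interval in seconds
n = int(4 * 365 * 24 * 3600 / dt)
t = np.arange(n) * dt
f_qpo = 1.0 / (44 * 24 * 3600.0)        # 44 days ~ 10^-6.58 Hz
flux = 1.0 + 0.01 * np.sin(2 * np.pi * f_qpo * t) + rng.normal(0.0, 0.05, n)

# Power spectrum; the QPO shows up as the dominant non-zero-frequency peak.
freqs, power = periodogram(flux, fs=1.0 / dt)
peak = freqs[1:][np.argmax(power[1:])]  # skip the DC bin
print(f"peak at {peak:.2e} Hz (~{1 / peak / 86400:.0f} d)")
```

    The frequency resolution here is 1/(n·dt), so a 4-year baseline resolves a 44-day period (about 33 cycles) comfortably.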

  5. Aspirin plus dipyridamole has the highest surface under the cumulative ranking curves (SUCRA) values in terms of mortality, intracranial hemorrhage, and adverse event rate among 7 drug therapies in the treatment of cerebral infarction.

    Science.gov (United States)

    Zhang, Jian-Jun; Liu, Xin

    2018-03-01

    The standardization of the clinical use of drug therapy for cerebral infarction (CI) has not yet been determined in some aspects. In this paper, we discuss the efficacies of different drug therapies (aspirin, aspirin plus dipyridamole, aspirin plus clopidogrel, aspirin plus warfarin, cilostazol, warfarin, and ticlopidine) for CI. We searched the PubMed and Cochrane Library databases from inception to April 2017; randomized controlled trials (RCTs) that met the inclusion and exclusion criteria were enrolled in this study. The network meta-analysis integrated evidence from direct and indirect comparisons to assess odds ratios (OR) and surface under the cumulative ranking curve (SUCRA) values. Thirteen eligible RCTs covering 7 drug therapies were included in this network meta-analysis. The network meta-analysis results showed that CI patients who received aspirin plus dipyridamole presented lower mortality than those who received aspirin plus clopidogrel (OR = 0.46, 95% CI = 0.18-0.99), indicating that aspirin plus dipyridamole therapy had better efficacy for CI. As for intracranial hemorrhage (ICH), stroke recurrence, and adverse event (AE) rate, there were no significant differences in efficacy among the 7 drug therapies. Besides, SUCRA values demonstrated that among the 7 drug therapies, aspirin plus dipyridamole was more effective than the others (mortality: 80.67%; ICH: 76.6%; AE rate: 90.2%). Our findings reveal that aspirin plus dipyridamole therapy may be the optimal choice for patients with CI, which could help to improve the survival of CI patients.
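
    For readers unfamiliar with the SUCRA statistic: given a treatments and rank probabilities, SUCRA for treatment j is the mean of its cumulative ranking probabilities over the first a-1 ranks. A minimal sketch with made-up rank probabilities (not the paper's data):

```python
import numpy as np

def sucra(rank_probs):
    """rank_probs[j, k]: probability that treatment j has rank k+1 (rank 1 = best)."""
    rank_probs = np.asarray(rank_probs, dtype=float)
    a = rank_probs.shape[1]              # number of treatments
    cum = np.cumsum(rank_probs, axis=1)  # P(rank of j <= k)
    return cum[:, : a - 1].sum(axis=1) / (a - 1)

rank_probs = np.array([
    [0.70, 0.20, 0.10],   # usually ranked best  -> SUCRA near 1
    [0.20, 0.60, 0.20],
    [0.10, 0.20, 0.70],   # usually ranked worst -> SUCRA near 0
])
scores = sucra(rank_probs)   # [0.80, 0.50, 0.20]
```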

  6. Intensity-Duration-Frequency curves from remote sensing datasets: direct comparison of weather radar and CMORPH over the Eastern Mediterranean

    Science.gov (United States)

    Morin, Efrat; Marra, Francesco; Peleg, Nadav; Mei, Yiwen; Anagnostou, Emmanouil N.

    2017-04-01

    Rainfall frequency analysis is used to quantify the probability of occurrence of extreme rainfall and is traditionally based on rain gauge records. The limited spatial coverage of rain gauges is insufficient to sample the spatiotemporal variability of extreme rainfall and to provide the areal information required by management and design applications. Conversely, remote sensing instruments, even if quantitatively uncertain, offer coverage and spatiotemporal detail that allow these issues to be overcome. In recent years, remote sensing datasets have begun to be used for frequency analyses, taking advantage of increased record lengths and quantitative adjustments of the data. However, studies so far have made use of concepts and techniques developed for rain gauge (i.e. point or multiple-point) data and have been validated by comparison with gauge-derived analyses. These procedures add further sources of uncertainty, prevent data uncertainties from being isolated from methodological ones, and prevent the available information from being fully exploited. In this study, we step out of the gauge-centered concept, presenting a direct comparison between at-site Intensity-Duration-Frequency (IDF) curves derived from different remote sensing datasets on corresponding spatial scales, temporal resolutions and records. We analyzed 16 years of homogeneously corrected and gauge-adjusted C-Band weather radar estimates, high-resolution CMORPH and gauge-adjusted high-resolution CMORPH over the Eastern Mediterranean. Results of this study include: (a) good spatial correlation between radar and satellite IDFs (~0.7 for 2-5 year return periods); (b) consistent correlation and dispersion in the raw and gauge-adjusted CMORPH; (c) bias that is almost uniform with return period for 12-24 h durations; (d) radar identifies thicker-tailed distributions than CMORPH, and the tail of the distributions depends on the spatial and temporal scales. These results demonstrate the potential of remote sensing datasets for rainfall

  7. Equivalent distributed capacitance model of oxide traps on frequency dispersion of C-V curve for MOS capacitors

    Science.gov (United States)

    Lu, Han-Han; Xu, Jing-Ping; Liu, Lu; Lai, Pui-To; Tang, Wing-Man

    2016-11-01

    An equivalent distributed capacitance model is established by considering only the gate oxide-trap capacitance to explain the frequency dispersion in the C-V curve of MOS capacitors measured for a frequency range from 1 kHz to 1 MHz. The proposed model is based on the Fermi-Dirac statistics and the charging/discharging effects of the oxide traps induced by a small ac signal. The validity of the proposed model is confirmed by the good agreement between the simulated results and experimental data. Simulations indicate that the capacitance dispersion of an MOS capacitor under accumulation and near flatband is mainly caused by traps adjacent to the oxide/semiconductor interface, with negligible effects from the traps far from the interface, and the relevant distance from the interface at which the traps can still contribute to the gate capacitance is also discussed. In addition, by excluding the negligible effect of oxide-trap conductance, the model avoids the use of imaginary numbers and complex calculations, and thus is simple and intuitive. Project supported by the National Natural Science Foundation of China (Grant Nos. 61176100 and 61274112), the University Development Fund of the University of Hong Kong, China (Grant No. 00600009), and the Hong Kong Polytechnic University, China (Grant No. 1-ZVB1).

  8. Generation of intensity duration frequency curves and intensity temporal variability pattern of intense rainfall for Lages/SC

    Directory of Open Access Journals (Sweden)

    Célio Orli Cardoso

    2014-04-01

    The objective of this work was to analyze the frequency distribution and intensity temporal variability of intense rainfall for Lages/SC from daily pluviograph data. Annual series of maximum rainfalls from rain gauges of the CAV-UDESC Weather Station in Lages/SC were used, covering 2000 to 2009. The Gumbel statistical distribution was applied in order to obtain the rainfall depth and intensity for the following return periods: 2, 5, 10, 15 and 20 years. Results yielded intensity-duration-frequency (I-D-F) curves for those return periods, as well as the I-D-F equation i = 2050·Tr^0.20·(t + 30)^-0.89, where i is the intensity, Tr the return period and t the rainfall duration. As for the temporal variability pattern of intensity over the rainfall duration, the convective, or advanced, pattern was predominant, with most of the rainfall falling in the first half of the duration; this pattern occurred most often in the spring and summer seasons.
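
    A fitted I-D-F equation of this form can be applied directly. A small sketch, assuming the conventional units for such equations (i in mm/h, Tr in years, t in minutes):

```python
def idf_intensity(tr_years, duration_min):
    """Rainfall intensity from the I-D-F equation reported for Lages/SC."""
    return 2050.0 * tr_years**0.20 * (duration_min + 30.0) ** -0.89

i10_60 = idf_intensity(10, 60)   # 10-year return period, 60-minute storm
```

    Intensity increases with return period and decreases with duration, as the exponents' signs dictate; the 10-year, 60-minute value comes out near 59 mm/h.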

  9. Equivalent distributed capacitance model of oxide traps on frequency dispersion of C – V curve for MOS capacitors

    International Nuclear Information System (INIS)

    Lu Han-Han; Xu Jing-Ping; Liu Lu; Lai Pui-To; Tang Wing-Man

    2016-01-01

    An equivalent distributed capacitance model is established by considering only the gate oxide-trap capacitance to explain the frequency dispersion in the C – V curve of MOS capacitors measured for a frequency range from 1 kHz to 1 MHz. The proposed model is based on the Fermi–Dirac statistics and the charging/discharging effects of the oxide traps induced by a small ac signal. The validity of the proposed model is confirmed by the good agreement between the simulated results and experimental data. Simulations indicate that the capacitance dispersion of an MOS capacitor under accumulation and near flatband is mainly caused by traps adjacent to the oxide/semiconductor interface, with negligible effects from the traps far from the interface, and the relevant distance from the interface at which the traps can still contribute to the gate capacitance is also discussed. In addition, by excluding the negligible effect of oxide-trap conductance, the model avoids the use of imaginary numbers and complex calculations, and thus is simple and intuitive. (paper)

  10. Rainfall and runoff Intensity-Duration-Frequency Curves for Washington State considering the change and uncertainty of observed and anticipated extreme rainfall and snow events

    Science.gov (United States)

    Demissie, Y. K.; Mortuza, M. R.; Li, H. Y.

    2015-12-01

    The observed and anticipated increasing trends in extreme storm magnitude and frequency, as well as the associated flooding risk in the Pacific Northwest, highlight the need for revising and updating the local intensity-duration-frequency (IDF) curves commonly used for designing critical water infrastructure. In Washington State, much of the drainage system installed in the last several decades uses IDF curves that are outdated by as much as half a century, making the system inadequate and vulnerable to flooding, as seen more frequently in recent years. In this study, we developed new and forward-looking rainfall and runoff IDF curves for each county in Washington State using recently observed and projected precipitation data. Regional frequency analysis coupled with Bayesian uncertainty quantification and model averaging methods was used to develop and update the rainfall IDF curves, which were then used in watershed and snow models to develop runoff IDF curves that explicitly account for the effects of snow and drainage characteristics on the IDF curves and related designs. The resulting rainfall and runoff IDF curves provide more reliable, forward-looking, and spatially resolved characteristics of storm events that can assist local decision makers and engineers in thoroughly reviewing and/or updating the current design standards for urban and rural stormwater management infrastructure, in order to reduce the potential ramifications of increasingly severe storms and resulting floods on existing and planned storm drainage and flood management systems in the state.

  11. Investigation of the SCS-CN initial abstraction ratio using a Monte Carlo simulation for the derived flood frequency curves

    Science.gov (United States)

    Caporali, E.; Chiarello, V.; Galeati, G.

    2014-12-01

    Peak discharge estimates for a given return period are of primary importance in engineering practice for risk assessment and hydraulic structure design. Different statistical methods are chosen here for the assessment of the flood frequency curve: an indirect technique based on extreme rainfall event analysis, and the Peak Over Threshold (POT) model and the Annual Maxima approach as direct techniques using river discharge data. In the framework of the indirect method, a Monte Carlo simulation approach is adopted to determine a derived frequency distribution of peak runoff, using a probabilistic formulation of the SCS-CN method as the stochastic rainfall-runoff model. A Monte Carlo simulation is used to generate a sample of different runoff events from different stochastic combinations of rainfall depth, storm duration, and initial loss inputs. The distribution of the rainfall storm events is assumed to follow the GP law, whose parameters are estimated through the GEV parameters of annual maximum data. The evaluation of the initial abstraction ratio is investigated, since it is one of the most questionable assumptions in the SCS-CN model and plays a key role in river basins characterized by high-permeability soils, mainly governed by the infiltration excess mechanism. In order to take into account the uncertainty of the model parameters, a modified approach that is able to revise and re-evaluate the original value of the initial abstraction ratio is implemented. In the POT model, the choice of the threshold is an essential issue, based mainly on a compromise between bias and variance. The Generalized Extreme Value (GEV) distribution fitted to the annual maximum discharges is therefore compared with the Pareto-distributed peaks to check the suitability of the frequency-of-occurrence representation. The methodology is applied to a large dam in the Serchio river basin, located in the Tuscany Region. The application has shown how the Monte Carlo simulation technique can be a useful
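
    The indirect approach can be caricatured in a few lines: draw storm depths from an assumed distribution, route them through the SCS-CN runoff equation Q = (P - Ia)² / (P - Ia + S) with Ia = λS, and read design quantiles off the simulated runoff sample. The curve number, λ, and rainfall distribution below are illustrative, not those of the study:

```python
import numpy as np

def scs_cn_runoff(p_mm, cn=75.0, lam=0.2):
    """SCS-CN runoff depth (mm): Q = (P - Ia)**2 / (P - Ia + S), Ia = lam*S."""
    s = 25400.0 / cn - 254.0               # potential retention S (mm)
    excess = np.maximum(p_mm - lam * s, 0.0)
    return excess**2 / (excess + s)        # zero wherever P <= Ia

rng = np.random.default_rng(0)
p = rng.gamma(shape=2.0, scale=30.0, size=100_000)  # synthetic storm depths (mm)
q = scs_cn_runoff(p)
q100 = np.quantile(q, 1.0 - 1.0 / 100.0)  # ~100-year runoff depth
```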

  12. Methods for estimating flow-duration curve and low-flow frequency statistics for ungaged locations on small streams in Minnesota

    Science.gov (United States)

    Ziegeweid, Jeffrey R.; Lorenz, David L.; Sanocki, Chris A.; Czuba, Christiana R.

    2015-12-24

    Knowledge of the magnitude and frequency of low flows in streams, which are flows in a stream during prolonged dry weather, is fundamental for water-supply planning and design; waste-load allocation; reservoir storage design; and maintenance of water quality and quantity for irrigation, recreation, and wildlife conservation. This report presents the results of a statewide study for which regional regression equations were developed for estimating 13 flow-duration curve statistics and 10 low-flow frequency statistics at ungaged stream locations in Minnesota. The 13 flow-duration curve statistics estimated by regression equations include the 0.0001, 0.001, 0.02, 0.05, 0.1, 0.25, 0.50, 0.75, 0.9, 0.95, 0.99, 0.999, and 0.9999 exceedance-probability quantiles. The low-flow frequency statistics include annual and seasonal (spring, summer, fall, winter) 7-day mean low flows, seasonal 30-day mean low flows, and summer 122-day mean low flows for a recurrence interval of 10 years. Estimates of the 13 flow-duration curve statistics and the 10 low-flow frequency statistics are provided for 196 U.S. Geological Survey continuous-record streamgages using streamflow data collected through September 30, 2012.
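
    For orientation, an exceedance-probability quantile of a flow-duration curve is simply the (1 - p) quantile of the flow record. A minimal sketch with synthetic flows (the report's regression equations for ungaged sites are a separate matter):

```python
import numpy as np

def flow_duration_quantiles(flows, exceed_probs):
    """Flow exceeded with probability p = the (1 - p) quantile of the record."""
    flows = np.asarray(flows, dtype=float)
    return {p: float(np.quantile(flows, 1.0 - p)) for p in exceed_probs}

rng = np.random.default_rng(1)
flows = rng.lognormal(mean=2.0, sigma=1.0, size=3650)  # ~10 yr of daily flows
fdc = flow_duration_quantiles(flows, [0.05, 0.50, 0.95])
# the high-flow statistic fdc[0.05] exceeds the median, which exceeds fdc[0.95]
```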

  13. Climatic and basin factors affecting the flood frequency curve: PART I – A simple sensitivity analysis based on the continuous simulation approach

    Directory of Open Access Journals (Sweden)

    A. M. Hashemi

    2000-01-01

    Regionalized and at-site flood frequency curves exhibit considerable variability in their shapes, but the factors controlling this variability (other than sampling effects) are not well understood. An application of the Monte Carlo simulation-based derived distribution approach is presented in this two-part paper to explore the influence of climate, described by simulated rainfall and evapotranspiration time series, and of basin factors on the flood frequency curve (ffc). The sensitivity analysis conducted in the paper should not be interpreted as reflecting possible climate changes, but the results can provide an indication of the changes to which the flood frequency curve might be sensitive. A single-site Neyman-Scott point process model of rainfall, with convective and stratiform cells (Cowpertwait, 1994; 1995), has been employed to generate synthetic rainfall inputs to a rainfall-runoff model. The time series of the potential evapotranspiration (ETp) demand has been represented through an AR(n) model with a seasonal component, while a simplified version of the ARNO rainfall-runoff model (Todini, 1996) has been employed to simulate the continuous discharge time series. All these models have been parameterised in a realistic manner using observed data and results from previous applications, to obtain ‘reference’ parameter sets for a synthetic case study. Subsequently, perturbations to the model parameters have been made one at a time, and the sensitivities of the generated annual maximum rainfall and flood frequency curves (unstandardised, and standardised by the mean) have been assessed. Overall, the sensitivity analysis described in this paper suggests that the soil moisture regime and, in particular, the probability distribution of soil moisture content at the storm arrival time can be considered as a unifying link between the perturbations to the several parameters and their effects on the standardised and unstandardised ffcs, thus revealing the

  14. Dual frequency modulation with two cantilevers in series: a possible means to rapidly acquire tip–sample interaction force curves with dynamic AFM

    International Nuclear Information System (INIS)

    Solares, Santiago D; Chawla, Gaurav

    2008-01-01

    One common application of atomic force microscopy (AFM) is the acquisition of tip–sample interaction force curves. However, this can be a slow process when the user is interested in studying non-uniform samples, because existing contact- and dynamic-mode methods require that the measurement be performed at one fixed surface point at a time. This paper proposes an AFM method based on dual frequency modulation using two cantilevers in series, which could be used to measure the tip–sample interaction force curves and topography of the entire sample with a single surface scan, in a time that is comparable to the time needed to collect a topographic image with current AFM imaging modes. Numerical simulation results are provided along with recommended parameters to characterize tip–sample interactions resembling those of conventional silicon tips and carbon nanotube tips tapping on silicon surfaces

  15. Cumulative Poisson Distribution Program

    Science.gov (United States)

    Bowerman, Paul N.; Scheuer, Ernest M.; Nolty, Robert

    1990-01-01

    Overflow and underflow in sums prevented. Cumulative Poisson Distribution Program, CUMPOIS, one of two computer programs that make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), used independently of one another. CUMPOIS determines cumulative Poisson distribution, used to evaluate cumulative distribution function (cdf) for gamma distributions with integer shape parameters and cdf for chi-square distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Written in C.
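
    The quantity CUMPOIS computes is the Poisson cdf. A short sketch of the recursive-summation idea, in Python rather than the original C and without the program's overflow/underflow safeguards:

```python
from math import exp

def poisson_cdf(k, mu):
    """P(X <= k) for X ~ Poisson(mu), building terms recursively."""
    term = exp(-mu)              # P(X = 0)
    total = term
    for i in range(1, k + 1):
        term *= mu / i           # P(X = i) from P(X = i - 1), no factorials
        total += term
    return total

p = poisson_cdf(3, 2.0)   # ~0.857
```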

  16. Activation energy and R. Chen frequency factor in the Randall and Wilkins original equation for second order kinetics. Emission curve simulated in Microsoft Excel algebra

    International Nuclear Information System (INIS)

    Moreno M, A.; Moreno B, A.

    2000-01-01

    In this work, the incorporation of the activation energy and frequency factor parameters proposed by R. Chen into the original Randall-Wilkins formulation of second-order kinetics is presented. The concordance of the results is compared between calculations following the R. Chen methodology and those obtained by direct incorporation of the above parameters into the Randall-Wilkins-Levy expression, for a simulated thermoluminescent emission curve with two peaks at maximum peak temperatures Tm1 = 120 and Tm2 = 190. (Author)
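
    A second-order (Garlick-Gibson-type) glow curve of the kind simulated here can be sketched numerically: I(T) ∝ s·exp(-E/kT) / (1 + (s/β)·∫exp(-E/kT')dT')². The activation energy, frequency factor, and heating rate below are illustrative values, not those used in the paper:

```python
import numpy as np

K_BOLTZ = 8.617e-5   # Boltzmann constant, eV/K

def second_order_glow(temps, e_ev=1.0, s=1e12, beta=1.0):
    """Normalized second-order glow curve for a linear heating rate beta (K/s)."""
    boltz = np.exp(-e_ev / (K_BOLTZ * temps))
    # cumulative trapezoid integral of exp(-E/kT') from temps[0] to each T
    integral = np.concatenate(
        ([0.0], np.cumsum(0.5 * (boltz[1:] + boltz[:-1]) * np.diff(temps))))
    return s * boltz / (1.0 + (s / beta) * integral) ** 2

temps = np.linspace(300.0, 550.0, 2501)   # heating from 300 K to 550 K
intensity = second_order_glow(temps)
t_max = temps[np.argmax(intensity)]       # maximum-peak temperature Tm
```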

  17. High resolution sea-level curve for the latest Frasnian and earliest Famennian derived for high frequency sequences in the Appalachian Basin

    Energy Technology Data Exchange (ETDEWEB)

    Filer, J.K. (Washington and Lee Univ., Lexington, VA (United States). Dept. of Geology)

    1992-01-01

    Siliciclastic sequences have been mapped in the subsurface and outcrop of much of the Appalachian basin in facies ranging from shale in the basin plain to shelf sandstone. Eleven transgressive/regressive cycles have been defined in an estimated 1.5 to 2.0 Ma period in the latest Frasnian and earliest Famennian, ranging in duration from about 75,000 to 400,000 years. Lithofacies maps, covering most of the basin, were prepared for each sequence. These maps show both the area of basinal black shale deposition, which defines the base of each cycle, and the areal extent of subsequent clinoform siltstone and shelf sandstone deposition in the upper portion of each cycle. The stratigraphic patterns show two stacked sets of progradational basinwide sequences. The geographic scale of the study precludes autocyclic controls on the cycles. Sea-level/climate cycles, probably superimposed on longer-term tectonic cycles, are the proposed cause of the observed depositional patterns. Removal of the long-term progradational trend of Upper Devonian basin filling yields a proposed eustatic sea-level curve; comparison with the curve of Johnson and others (1985) reveals correspondence of three regressive maxima in both models. The curve presented here reveals that an ongoing process of higher-frequency sea-level modification was active at this time. Higher-frequency sea-level events, nested within previously interpreted lower-frequency global events, are inferred to also be eustatic. Models of the biotic crisis that occurred at this time should consider the implications of these high-frequency sea-level cycles. The patterns observed are consistent with latest Frasnian initiation of glaciation in South America. This would be somewhat earlier than has generally been accepted.

  18. Complexity of the ultraviolet mutation frequency response curve in Escherichia coli B/r: SOS induction, one-lesion and two-lesion mutagenesis

    International Nuclear Information System (INIS)

    Doudney, C.O.

    1976-01-01

    Three distinct sections of the ultraviolet mutation frequency response (MFR) curve toward tryptophan prototrophy have been demonstrated in Escherichia coli B/r WP2 trp thy and its uvrA derivative in log-phase growth in minimal medium. The initial section, which appears fluence-squared, may reflect the necessity, if mutation is to result, for induction of two lesions, one located within the potentially mutated genetic locus and the other damaging deoxyribonucleic acid replication and resulting in induction of the error-prone SOS repair function. A second linear section is ascribed to the continued induction, after exposure above that sufficient for complete SOS expression, of isolated lesions which lead to mutation in potentially mutated loci. The third section demonstrates an increased rate of mutagenesis and suggests the induction of two lesions in proximity which result in additional mutations. Split-exposure studies support the inducible nature of the SOS function and suggest that mutation frequency decline (MFD) is due to excision resulting from or related to the prevention of SOS induction by inhibition of protein synthesis. Preirradiation tryptophan starvation of the uvr + strain for 30 min decreases MFR in the first and second sections of the curve. Reduction of MFR in the third section requires more prestarvation time and is blocked by nalidixic acid. The decreased MFR of the first and second sections is ascribed to promotion of postirradiation MFD based on excision and that of the third section to completion of the chromosome during the prestarvation period

  19. Automatic Feature Selection and Weighting for the Formation of Homogeneous Groups for Regional Intensity-Duration-Frequency (IDF) Curve Estimation

    Science.gov (United States)

    Yang, Z.; Burn, D. H.

    2017-12-01

    Extreme rainfall events can have devastating impacts on society. To quantify the associated risk, the IDF curve has been used to provide the essential rainfall-related information for urban planning. However, the recent changes in the rainfall climatology caused by climate change and urbanization have made the estimates provided by the traditional regional IDF approach increasingly inaccurate. This inaccuracy is mainly caused by two problems: 1) The ineffective choice of similarity indicators for the formation of a homogeneous group at different regions; and 2) An inadequate number of stations in the pooling group that does not adequately reflect the optimal balance between group size and group homogeneity or achieve the lowest uncertainty in the rainfall quantiles estimates. For the first issue, to consider the temporal difference among different meteorological and topographic indicators, a three-layer design is proposed based on three stages in the extreme rainfall formation: cloud formation, rainfall generation and change of rainfall intensity above urban surface. During the process, the impacts from climate change and urbanization are considered through the inclusion of potential relevant features at each layer. Then to consider spatial difference of similarity indicators for the homogeneous group formation at various regions, an automatic feature selection and weighting algorithm, specifically the hybrid searching algorithm of Tabu search, Lagrange Multiplier and Fuzzy C-means Clustering, is used to select the optimal combination of features for the potential optimal homogenous groups formation at a specific region. For the second issue, to compare the uncertainty of rainfall quantile estimates among potential groups, the two sample Kolmogorov-Smirnov test-based sample ranking process is used. During the process, linear programming is used to rank these groups based on the confidence intervals of the quantile estimates. 
The proposed methodology fills the gap

  20. A New Approach to Analysing GRB Light Curves

    International Nuclear Information System (INIS)

    Varga, B.; Horvath, I.

    2005-01-01

    We estimated the Txx quantiles of the cumulative GRB light curves using our recalculated background. The basic information in the light curves was extracted by multivariate statistical methods. The possible classes of light curves are also briefly discussed.

  1. Precipitation intensity-duration-frequency curves for central Belgium with an ensemble of EURO-CORDEX simulations, and associated uncertainties

    Science.gov (United States)

    Hosseinzadehtalaei, Parisa; Tabari, Hossein; Willems, Patrick

    2018-02-01

    An ensemble of 88 regional climate model (RCM) simulations at 0.11° and 0.44° spatial resolutions from the EURO-CORDEX project is analyzed for central Belgium to investigate the projected impact of climate change on precipitation intensity-duration-frequency (IDF) relationships and extreme precipitation quantiles typically used in water engineering designs. The uncertainty arising from the choice of RCM, driving GCM, and representative concentration pathway (RCP4.5 & RCP8.5) is quantified using a variance decomposition technique after reconstruction of missing data in GCM × RCM combinations. A comparative analysis between the historical simulations of the EURO-CORDEX 0.11° and 0.44° RCMs shows higher precipitation intensities in the finer-resolution runs, leading to a larger overestimation of the observation-based IDFs by the 0.11° runs. The results reveal that making a temporal stationarity assumption for the climate system may lead to underestimation of precipitation quantiles by up to 70% by the end of this century. This projected increase is generally larger for the 0.11° RCMs than for the 0.44° RCMs. The relative changes in extreme precipitation depend on return period and duration, indicating an amplification for larger return periods and for smaller durations. The variance decomposition approach generally identifies the RCM as the most dominant component of uncertainty in changes of more extreme precipitation (return period of 10 years) for both 0.11° and 0.44° resolutions, followed by the GCM and RCP scenario. The uncertainties associated with cross-contributions of RCMs, GCMs, and RCPs play a non-negligible role in the associated uncertainties of the changes.

  2. Divergent Cumulative Cultural Evolution

    OpenAIRE

    Marriott, Chris; Chebib, Jobran

    2016-01-01

    Divergent cumulative cultural evolution occurs when the cultural evolutionary trajectory diverges from the biological evolutionary trajectory. We consider the conditions under which divergent cumulative cultural evolution can occur. We hypothesize that two conditions are necessary. First that genetic and cultural information are stored separately in the agent. Second cultural information must be transferred horizontally between agents of different generations. We implement a model with these ...

  3. Estimating a population cumulative incidence under calendar time trends

    DEFF Research Database (Denmark)

    Hansen, Stefan N; Overgaard, Morten; Andersen, Per K

    2017-01-01

    BACKGROUND: The risk of a disease or psychiatric disorder is frequently measured by the age-specific cumulative incidence. Cumulative incidence estimates are often derived in cohort studies with individuals recruited over calendar time and with the end of follow-up governed by a specific date. It is common practice to apply the Kaplan-Meier or Aalen-Johansen estimator to the total sample and report either the estimated cumulative incidence curve or just a single point on the curve as a description of the disease risk. METHODS: We argue that, whenever the disease or disorder of interest is influenced...
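
    The Kaplan-Meier step mentioned here, applied to the total sample, can be sketched as follows (toy data; ties, competing risks, and the calendar-time issues the authors raise are ignored):

```python
def kaplan_meier_cuminc(times, events):
    """Cumulative incidence 1 - S(t) at each event time (1 = event, 0 = censored)."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, surv, curve = len(times), 1.0, []
    for i in order:
        if events[i] == 1:
            surv *= 1.0 - 1.0 / at_risk        # product-limit survival update
            curve.append((times[i], 1.0 - surv))
        at_risk -= 1                           # leaves risk set after time i
    return curve

curve = kaplan_meier_cuminc([2, 3, 5, 8, 12], [1, 0, 1, 1, 0])
# first event at t=2 with 5 at risk -> cumulative incidence 0.2
```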

  4. CUMBIN - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, CUMBIN, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), can be used independently of one another. CUMBIN can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CUMBIN calculates the probability that a system of n components has at least k operating if the probability that any one is operating is p and the components are independent. Equivalently, this is the reliability of a k-out-of-n system having independent components with common reliability p. CUMBIN can evaluate the incomplete beta distribution for two positive integer arguments. CUMBIN can also evaluate the cumulative F distribution and the negative binomial distribution, and can determine the sample size in a test design. CUMBIN is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. The CUMBIN program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMBIN was developed in 1988.
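
    What CUMBIN computes — the reliability of a k-out-of-n system of independent components with common reliability p — can be sketched in Python (the original is in C; this version builds the binomial terms recursively and assumes 0 < p < 1):

```python
def k_out_of_n_reliability(k, n, p):
    """P(at least k of n independent components work), each with reliability p."""
    term = (1.0 - p) ** n                         # P(exactly 0 working)
    total = term if k == 0 else 0.0
    for i in range(1, n + 1):
        term *= (n - i + 1) / i * p / (1.0 - p)   # P(exactly i) from P(exactly i-1)
        if i >= k:
            total += term
    return total

r = k_out_of_n_reliability(2, 3, 0.9)   # 3*0.9**2*0.1 + 0.9**3 = 0.972
```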

  5. Climatic and basin factors affecting the flood frequency curve: PART II – A full sensitivity analysis based on the continuous simulation approach combined with a factorial experimental design

    Directory of Open Access Journals (Sweden)

    M. Franchini

    2000-01-01

    Full Text Available The sensitivity analysis described in Hashemi et al. (2000 is based on one-at-a-time perturbations to the model parameters. This type of analysis cannot highlight the presence of parameter interactions which might indeed affect the characteristics of the flood frequency curve (ffc even more than the individual parameters. For this reason, the effects of the parameters of the rainfall, rainfall runoff models and of the potential evapotranspiration demand on the ffc are investigated here through an analysis of the results obtained from a factorial experimental design, where all the parameters are allowed to vary simultaneously. This latter, more complex, analysis confirms the results obtained in Hashemi et al. (2000 thus making the conclusions drawn there of wider validity and not related strictly to the reference set selected. However, it is shown that two-factor interactions are present not only between different pairs of parameters of an individual model, but also between pairs of parameters of different models, such as rainfall and rainfall-runoff models, thus demonstrating the complex interaction between climate and basin characteristics affecting the ffc and in particular its curvature. Furthermore, the wider range of climatic regime behaviour produced within the factorial experimental design shows that the probability distribution of soil moisture content at the storm arrival time is no longer sufficient to explain the link between the perturbations to the parameters and their effects on the ffc, as was suggested in Hashemi et al. (2000. Other factors have to be considered, such as the probability distribution of the soil moisture capacity, and the rainfall regime, expressed through the annual maximum rainfalls over different durations. Keywords: Monte Carlo simulation; factorial experimental design; analysis of variance (ANOVA

  6. Cumulation of light nuclei

    International Nuclear Information System (INIS)

    Baldin, A.M.; Bondarev, V.K.; Golovanov, L.B.

    1977-01-01

    Limit fragmentation of light nuclei (deuterium, helium) bombarded with 8.6 GeV/c protons was investigated. Fragments (pions, protons and deuterons) were detected within the emission angle range 50-150 deg with respect to the primary protons and within the momentum range 150-180 MeV/c. By the kinematics of a collision of a primary proton with a target at rest, the fragments observed correspond to a target mass of up to 3 GeV. Thus, the data obtained correspond to cumulation up to the third order

  7. CROSSER - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, CROSSER, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), can be used independently of one another. CROSSER can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CROSSER calculates the point at which the reliability of a k-out-of-n system equals the common reliability of the n components. It is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The CROSSER program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CROSSER was developed in 1988.
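
The crossing point that CROSSER computes can be sketched in a few lines: the k-out-of-n system reliability is a cumulative binomial sum, and Newton's method finds the nontrivial p in (0, 1) where it equals the common component reliability. The sketch below is an illustrative Python reimplementation, not the original C program; the closed-form derivative identity used is standard.

```python
from math import comb

def system_reliability(p, k, n):
    """Cumulative binomial tail: reliability of a k-out-of-n system
    whose n components each have reliability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def crossing_point(k, n, p0=0.5, tol=1e-12):
    """Newton's method for the nontrivial root of R(p) - p = 0 in (0, 1),
    i.e. the point where system and component reliabilities coincide."""
    p = p0
    for _ in range(100):
        f = system_reliability(p, k, n) - p
        # dR/dp has the closed form k * C(n, k) * p^(k-1) * (1-p)^(n-k)
        df = k * comb(n, k) * p**(k - 1) * (1 - p)**(n - k) - 1
        step = f / df
        p = min(max(p - step, 1e-9), 1 - 1e-9)  # keep the iterate inside (0, 1)
        if abs(step) < tol:
            return p
    return p
```

For a 2-out-of-3 system the crossing point is exactly 0.5, which makes a convenient sanity check.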

  8. The Effect of Total Cumulative Dose, Number of Treatment Cycles, Interval between Injections, and Length of Treatment on the Frequency of Occurrence of Antibodies to Botulinum Toxin Type A in the Treatment of Muscle Spasticity

    Science.gov (United States)

    Bakheit, Abdel Magid O.; Liptrot, Anthea; Newton, Rachel; Pickett, Andrew M.

    2012-01-01

    A large cumulative dose of botulinum toxin type A (BoNT-A), frequent injections, a short interval between treatment cycles, and a long duration of treatment have all been suggested, but not confirmed, to be associated with a high incidence of neutralizing antibodies to the neurotoxin. The aim of this study was to investigate whether these…

  9. Cumulative environmental effects. Summary

    International Nuclear Information System (INIS)

    2012-01-01

    This report presents a compilation of knowledge about the state of the environment and human activity in the Norwegian part of the North Sea and Skagerrak. The report gives an overview of pressures and impacts on the environment from normal activity and in the event of accidents. This is used to assess the cumulative environmental effects, which factors have most impact and where the impacts are greatest, and to indicate which problems are expected to be most serious in the future. The report is intended to provide relevant information that can be used in the management of the marine area in the future. It also provides input for the identification of environmental targets and management measures for the North Sea and Skagerrak. (Author)

  10. Cumulative environmental effects. Summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    This report presents a compilation of knowledge about the state of the environment and human activity in the Norwegian part of the North Sea and Skagerrak. The report gives an overview of pressures and impacts on the environment from normal activity and in the event of accidents. This is used to assess the cumulative environmental effects, which factors have most impact and where the impacts are greatest, and to indicate which problems are expected to be most serious in the future. The report is intended to provide relevant information that can be used in the management of the marine area in the future. It also provides input for the identification of environmental targets and management measures for the North Sea and Skagerrak. (Author)

  11. NEWTONP - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, NEWTONP, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557), can be used independently of one another. NEWTONP can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. NEWTONP calculates the probability p required to yield a given system reliability V for a k-out-of-n system. It can also be used to determine the Clopper-Pearson confidence limits (either one-sided or two-sided) for the parameter p of a Bernoulli distribution. NEWTONP can determine Bayesian probability limits for a proportion (if the beta prior has positive integer parameters). It can determine the percentiles of incomplete beta distributions with positive integer parameters. It can also determine the percentiles of F distributions and the median plotting positions in probability plotting. NEWTONP is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. NEWTONP is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The NEWTONP program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. NEWTONP was developed in 1988.
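
NEWTONP's core task, inverting the cumulative binomial to find the component reliability p that yields a target system reliability V, can be sketched as follows. This is a Python illustration of the same Newton iteration, not the original C code:

```python
from math import comb

def kofn_reliability(p, k, n):
    """Cumulative binomial: P(at least k of n components work),
    each component having reliability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def solve_p(V, k, n, p0=0.5, tol=1e-12):
    """Newton iteration for the component reliability p that gives a
    k-out-of-n system reliability of V."""
    p = p0
    for _ in range(200):
        f = kofn_reliability(p, k, n) - V
        df = k * comb(n, k) * p**(k - 1) * (1 - p)**(n - k)  # dR/dp, closed form
        p_new = min(max(p - f / df, 1e-12), 1 - 1e-12)  # stay inside (0, 1)
        if abs(p_new - p) < tol:
            return p_new
        p = p_new
    return p
```

For example, a 2-out-of-3 system built from components with p = 0.9 has reliability 0.972, so inverting V = 0.972 should recover p = 0.9.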

  12. Cumulative stress and autonomic dysregulation in a community sample.

    Science.gov (United States)

    Lampert, Rachel; Tuit, Keri; Hong, Kwang-Ik; Donovan, Theresa; Lee, Forrester; Sinha, Rajita

    2016-05-01

    Whether cumulative stress, including both chronic stress and adverse life events, is associated with decreased heart rate variability (HRV), a non-invasive measure of autonomic status which predicts poor cardiovascular outcomes, is unknown. Healthy community-dwelling volunteers (N = 157, mean age 29 years) participated in the Cumulative Stress/Adversity Interview (CAI), a 140-item event interview measuring cumulative adversity including major life events, life trauma, recent life events and chronic stressors, and underwent 24-h ambulatory ECG monitoring. HRV was analyzed in the frequency domain and the standard deviation of NN intervals (SDNN) was calculated. Initial simple regression analyses revealed that total cumulative stress score, chronic stressors and cumulative adverse life events (CALE) were all inversely associated with ultra low-frequency (ULF), very low-frequency (VLF) and low-frequency (LF) power and SDNN (all p accounting for additional appreciable variance. For VLF and LF, both total cumulative stress and chronic stress significantly contributed to the variance alone but were no longer significant after adjusting for race and health behaviors. In summary, total cumulative stress and its components of adverse life events and chronic stress were associated with decreased cardiac autonomic function as measured by HRV. Findings suggest one potential mechanism by which stress may exert adverse effects on mortality in healthy individuals. Primary preventive strategies including stress management may prove beneficial.

  13. Cumulative radiation effect

    International Nuclear Information System (INIS)

    Kirk, J.; Gray, W.M.; Watson, E.R.

    1977-01-01

    In five previous papers, the concept of Cumulative Radiation Effect (CRE) has been presented as a scale of accumulative sub-tolerance radiation damage, with a unique value of the CRE describing a specific level of radiation effect. Simple nomographic and tabular methods for the solution of practical problems in radiotherapy are now described. An essential feature of solving a CRE problem is firstly to present it in a concise and readily appreciated form, and, to do this, nomenclature has been introduced to describe schedules and regimes as compactly as possible. Simple algebraic equations have been derived to describe the CRE achieved by multi-schedule regimes. In these equations, the equivalence conditions existing at the junctions between schedules are not explicit and the equations are based on the CREs of the constituent schedules assessed individually without reference to their context in the regime as a whole. This independent evaluation of CREs for each schedule has resulted in a considerable simplification in the calculation of complex problems. The calculations are further simplified by the use of suitable tables and nomograms, so that the mathematics involved is reduced to simple arithmetical operations which require at the most the use of a slide rule but can be done by hand. The order of procedure in the presentation and calculation of CRE problems can be summarised in an evaluation procedure sheet. The resulting simple methods for solving practical problems of any complexity on the CRE-system are demonstrated by a number of examples. (author)

  14. Cumulative radiation effect

    International Nuclear Information System (INIS)

    Kirk, J.; Cain, O.; Gray, W.M.

    1977-01-01

    Cumulative Radiation Effect (CRE) represents a scale of accumulative sub-tolerance radiation damage, with a unique value of the CRE describing a specific level of radiation effect. Computer calculations have been used to simplify the evaluation of problems associated with the applications of the CRE-system in radiotherapy. In a general appraisal of the applications of computers to the CRE-system, the various problems encountered in clinical radiotherapy have been categorised into those involving the evaluation of a CRE at a point in tissue and those involving the calculation of CRE distributions. As a general guide, the computer techniques adopted at the Glasgow Institute of Radiotherapeutics for the solution of CRE problems are presented, and consist basically of a package of three interactive programs for point CRE calculations and a Fortran program which calculates CRE distributions for iso-effect treatment planning. Many examples are given to demonstrate the applications of these programs, and special emphasis has been laid on the problem of treating a point in tissue with different doses per fraction on alternate treatment days. The wide range of possible clinical applications of the CRE-system has been outlined and described under the categories of routine clinical applications, retrospective and prospective surveys of patient treatment, and experimental and theoretical research. Some of these applications such as the results of surveys and studies of time optimisation of treatment schedules could have far-reaching consequences and lead to significant improvements in treatment and cure rates with the minimum damage to normal tissue. (author)

  15. Secant cumulants and toric geometry

    NARCIS (Netherlands)

    Michalek, M.; Oeding, L.; Zwiernik, P.W.

    2012-01-01

    We study the secant line variety of the Segre product of projective spaces using special cumulant coordinates adapted for secant varieties. We show that the secant variety is covered by open normal toric varieties. We prove that in cumulant coordinates its ideal is generated by binomial quadrics. We

  16. Utilization of the lower inflection point of the pressure-volume curve results in protective conventional ventilation comparable to high frequency oscillatory ventilation in an animal model of acute respiratory distress syndrome

    Directory of Open Access Journals (Sweden)

    Felipe S. Rossi

    2008-01-01

    Full Text Available INTRODUCTION: Studies comparing high frequency oscillatory and conventional ventilation in acute respiratory distress syndrome have used low values of positive end-expiratory pressure and identified a need for better recruitment and pulmonary stability with high frequency. OBJECTIVE: To compare conventional and high frequency ventilation using the lower inflection point of the pressure-volume curve as the determinant of positive end-expiratory pressure, to obtain similar levels of recruitment and alveolar stability. METHODS: After lung lavage of adult rabbits and lower inflection point determination, two groups were randomized: conventional ventilation (positive end-expiratory pressure = lower inflection point; tidal volume = 6 ml/kg) and high frequency ventilation (mean airway pressure = lower inflection point + 4 cmH2O). Blood gas and hemodynamic data were recorded over 4 h. After sacrifice, protein analysis from lung lavage and histologic evaluation were performed. RESULTS: The oxygenation parameters, protein and histological data were similar, except that significantly more normal alveoli were observed upon protective ventilation. High frequency ventilation led to lower PaCO2 levels. DISCUSSION: Determination of the lower inflection point of the pressure-volume curve is important for setting the minimum end-expiratory pressure needed to keep the airways open. This is useful when comparing different strategies to treat severe respiratory insufficiency, optimizing conventional ventilation, improving oxygenation and reducing lung injury. CONCLUSIONS: Utilization of the lower inflection point of the pressure-volume curve in the ventilation strategies considered in this study resulted in comparable efficacy with regard to oxygenation and hemodynamics, a high PaCO2 level and a lower pH. In addition, a greater number of normal alveoli were found after protective conventional ventilation in an animal model of acute respiratory distress syndrome.

  17. The challenge of cumulative impacts

    Energy Technology Data Exchange (ETDEWEB)

    Masden, Elisabeth

    2011-07-01

    Full text: As governments pledge to combat climate change, wind turbines are becoming a common feature of terrestrial and marine environments. Although wind power is a renewable energy source and a means of reducing carbon emissions, there is a need to ensure that the wind farms themselves do not damage the environment. There is particular concern over the impacts of wind farms on bird populations, and with increasing numbers of wind farm proposals, the concern focuses on cumulative impacts. Individually, a wind farm, or indeed any activity/action, may have minor effects on the environment, but collectively these may be significant, potentially greater than the sum of the individual parts acting alone. Cumulative impact assessment is a legislative requirement of environmental impact assessment, but such assessments are rarely adequate, restricting the acquisition of basic knowledge about the cumulative impacts of wind farms on bird populations. Reasons for this are numerous, but a recurring theme is the lack of clear definitions and guidance on how to perform cumulative assessments. Here we present a conceptual framework and include illustrative examples to demonstrate how the framework can be used to improve the planning and execution of cumulative impact assessments. The core concept is that explicit definitions of impacts, actions and scales of assessment are required to reduce uncertainty in the process of assessment and improve communication between stakeholders. Only when it is clear what has been included within a cumulative assessment is it possible to make comparisons between developments. Our framework requires improved legislative guidance on the actions to include in assessments, and advice on the appropriate baselines against which to assess impacts. Cumulative impacts are currently considered on restricted scales (spatial and temporal) relating to individual development assessments. We propose that benefits would be gained from elevating cumulative

  18. Spaces of positive and negative frequency solutions of field equations in curved space--times. I. The Klein--Gordon equation in stationary space--times

    International Nuclear Information System (INIS)

    Moreno, C.

    1977-01-01

    In stationary space-times V_n × R with a compact space-section manifold without boundary V_n, the Klein-Gordon equation is solved by the one-parameter group of unitary operators generated by the energy operator i^(-1)T^(-1) in the Sobolev spaces H^l(V_n) × H^l(V_n). The canonical symplectic and complex structures of the associated dynamical system are calculated. The existence and the uniqueness of the Lichnerowicz kernel are established. The Hilbert spaces of positive and negative frequency-part solutions defined by means of this kernel are constructed

  19. Deriving Area-storage Curves of Global Reservoirs

    Science.gov (United States)

    Mu, M.; Tang, Q.

    2017-12-01

    Basic information on global reservoirs and dams, including capacity, dam height, and largest water area, is well documented in databases such as GRanD (Global Reservoirs and Dams) and ICOLD (International Commission on Large Dams). However, although they play a critical role in estimating reservoir storage variations from remote sensing or hydrological models, area-storage (or elevation-storage) curves of reservoirs are not publicly shared. In this paper, we combine Landsat surface water extent, the 1 arc-minute global relief model (ETOPO1) and the GRanD database to derive area-storage curves for global reservoirs whose area is larger than 1 km2 (more than 6,000 reservoirs are included). First, the coverage polygon of each reservoir in GRanD is extended to where water was detected by Landsat during 1985-2015. Second, the elevation of each pixel in the reservoir is extracted from resampled 30-meter ETOPO1, and the relative depth and frequency of each depth value are calculated. Third, cumulative storage is calculated with increasing water area by every one percent of reservoir coverage area, yielding the uncalibrated area-storage curve. Finally, the area-storage curve is linearly calibrated by the ratio of calculated capacity to reported capacity in GRanD. The derived curves are compared with in-situ reservoir data collected in the Great Plains Region of the US, and the results show that in-situ records are well captured by the derived curves even for relatively small reservoirs (several square kilometers). The newly derived area-storage curves have the potential to be employed in global monitoring or modelling of reservoir storage and area variations.
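
The derivation steps above can be sketched as follows. This is a simplified illustration that assumes the bed elevations clipped to the maximum observed water extent are already available as an array; the function names and the fixed number of water levels are our own, not the paper's:

```python
import numpy as np

def area_storage_curve(bed_elev, pixel_area_km2, n_levels=100):
    """Derive an area-storage curve from bed elevations (e.g. DEM pixels
    clipped to the maximum Landsat-observed water extent).
    bed_elev: 1-D array of bed elevations (m) inside the reservoir polygon.
    Returns (areas_km2, storages_km3) for n_levels water levels."""
    levels = np.linspace(bed_elev.min(), bed_elev.max(), n_levels)
    areas, storages = [], []
    for h in levels:
        wet = bed_elev < h
        areas.append(wet.sum() * pixel_area_km2)
        # storage: integrate water depth over wet pixels (m * km^2 -> km^3)
        storages.append((h - bed_elev[wet]).sum() * pixel_area_km2 / 1e3)
    return np.array(areas), np.array(storages)

def calibrate(storages, reported_capacity_km3):
    """Linearly scale the curve so its full-pool storage matches the
    reported capacity (as done against GRanD)."""
    return storages * reported_capacity_km3 / storages[-1]
```

A toy reservoir with a linearly sloping bed makes the behaviour easy to verify by hand.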

  20. Estimating a population cumulative incidence under calendar time trends

    DEFF Research Database (Denmark)

    Hansen, Stefan N; Overgaard, Morten; Andersen, Per K

    2017-01-01

    BACKGROUND: The risk of a disease or psychiatric disorder is frequently measured by the age-specific cumulative incidence. Cumulative incidence estimates are often derived in cohort studies with individuals recruited over calendar time and with the end of follow-up governed by a specific date...... by calendar time trends, the total sample Kaplan-Meier and Aalen-Johansen estimators do not provide useful estimates of the general risk in the target population. We present some alternatives to this type of analysis. RESULTS: We show how a proportional hazards model may be used to extrapolate disease risk...... estimates if proportionality is a reasonable assumption. If not reasonable, we instead advocate that a more useful description of the disease risk lies in the age-specific cumulative incidence curves across strata given by time of entry or perhaps just the end of follow-up estimates across all strata...
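
In the simplest setting, without competing risks, the Aalen-Johansen estimator mentioned above reduces to one minus the Kaplan-Meier survival estimate. A minimal sketch of that total-sample cumulative incidence estimate (our own illustration, not the authors' code):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of S(t); returns (t, 1 - S(t)) pairs as the
    age-specific cumulative incidence (valid when there are no competing
    risks). times: observed times; events: 1 = event, 0 = censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = at = 0
        while i < len(data) and data[i][0] == t:  # group ties at time t
            at += 1
            d += data[i][1]
            i += 1
        if d:  # only event times contribute a factor
            surv *= 1 - d / n_at_risk
            curve.append((t, 1 - surv))
        n_at_risk -= at
    return curve
```

With censored subjects removed from the risk set but contributing no factor, the estimate stays consistent under independent censoring.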

  1. Electro-Mechanical Resonance Curves

    Science.gov (United States)

    Greenslade, Thomas B., Jr.

    2018-01-01

    Recently I have been investigating the frequency response of galvanometers. These are direct-current devices used to measure small currents. By using a low-frequency function generator to supply the alternating-current signal and a stopwatch smartphone app to measure the period, I was able to take data to allow a resonance curve to be drawn. This…

  2. Cumulative risk, cumulative outcome: a 20-year longitudinal study.

    Directory of Open Access Journals (Sweden)

    Leslie Atkinson

    Full Text Available Cumulative risk (CR models provide some of the most robust findings in the developmental literature, predicting numerous and varied outcomes. Typically, however, these outcomes are predicted one at a time, across different samples, using concurrent designs, longitudinal designs of short duration, or retrospective designs. We predicted that a single CR index, applied within a single sample, would prospectively predict diverse outcomes, i.e., depression, intelligence, school dropout, arrest, smoking, and physical disease from childhood to adulthood. Further, we predicted that number of risk factors would predict number of adverse outcomes (cumulative outcome; CO. We also predicted that early CR (assessed at age 5/6 explains variance in CO above and beyond that explained by subsequent risk (assessed at ages 12/13 and 19/20. The sample consisted of 284 individuals, 48% of whom were diagnosed with a speech/language disorder. Cumulative risk, assessed at 5/6-, 12/13-, and 19/20-years-old, predicted aforementioned outcomes at age 25/26 in every instance. Furthermore, number of risk factors was positively associated with number of negative outcomes. Finally, early risk accounted for variance beyond that explained by later risk in the prediction of CO. We discuss these findings in terms of five criteria posed by these data, positing a "mediated net of adversity" model, suggesting that CR may increase some central integrative factor, simultaneously augmenting risk across cognitive, quality of life, psychiatric and physical health outcomes.

  3. Boundary curves of individual items in the distribution of total depressive symptom scores approximate an exponential pattern in a general population

    OpenAIRE

    Tomitaka, Shinichiro; Kawasaki, Yohei; Ide, Kazuki; Akutagawa, Maiko; Yamada, Hiroshi; Furukawa, Toshiaki A.; Ono, Yutaka

    2016-01-01

    [Background]Previously, we proposed a model for ordinal scale scoring in which individual thresholds for each item constitute a distribution by each item. This led us to hypothesize that the boundary curves of each depressive symptom score in the distribution of total depressive symptom scores follow a common mathematical model, which is expressed as the product of the frequency of the total depressive symptom scores and the probability of the cumulative distribution function of each item th...

  4. Learning curve estimation techniques for nuclear industry

    International Nuclear Information System (INIS)

    Vaurio, Jussi K.

    1983-01-01

    Statistical techniques are developed to estimate the progress made by the nuclear industry in learning to prevent accidents. Learning curves are derived for accident occurrence rates based on actuarial data, predictions are made for the future, and compact analytical equations are obtained for the statistical accuracies of the estimates. Both maximum likelihood estimation and the method of moments are applied to obtain parameters for the learning models, and results are compared to each other and to earlier graphical and analytical results. An effective statistical test is also derived to assess the significance of trends. The models used associate learning directly to accidents, to the number of plants and to the cumulative number of operating years. Using as a data base nine core damage accidents in electricity-producing plants, it is estimated that the probability of a plant to have a serious flaw has decreased from 0.1 to 0.01 during the developmental phase of the nuclear industry. At the same time the frequency of accidents has decreased from 0.04 per reactor year to 0.0004 per reactor year
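
One way to associate learning directly with cumulative operating years, as the abstract describes, is a nonhomogeneous Poisson model with exponentially decaying accident intensity fitted by maximum likelihood. The model form, the profile-likelihood step, and the grid search below are our illustrative assumptions, not the paper's exact formulation:

```python
from math import exp, log

def fit_learning_curve(event_times, T, b_grid):
    """Profile-likelihood fit of a nonhomogeneous Poisson intensity
    lambda(t) = a * exp(-b t) to accident times t_i on [0, T], with t
    measured in cumulative operating years. Returns (a_hat, b_hat)."""
    n = len(event_times)
    best = None
    for b in b_grid:
        integral = (1 - exp(-b * T)) / b if b > 0 else T
        a = n / integral  # MLE of a for fixed b (from dL/da = 0)
        # log-likelihood: sum log lambda(t_i) - integral of lambda over [0, T]
        loglik = sum(log(a) - b * t for t in event_times) - a * integral
        if best is None or loglik > best[0]:
            best = (loglik, a, b)
    return best[1], best[2]
```

With all events clustered early in the observation window, the fit should prefer a decaying intensity (b > 0) over a constant one.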

  5. The Algebra of the Cumulative Percent Operation.

    Science.gov (United States)

    Berry, Andrew J.

    2002-01-01

    Discusses how to help students avoid some pervasive reasoning errors in solving cumulative percent problems. Discusses the meaning of "a% + b%," the additive inverse of "a%," and other useful applications. Emphasizes the operational aspect of the cumulative percent concept. (KHR)
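
The cumulative percent operation the article discusses can be made concrete with two one-liners (our own illustration of the concept):

```python
def cum_pct(a, b):
    """Cumulative percent operation: an a% change followed by a b% change
    equals a single (a + b + ab/100)% change."""
    return a + b + a * b / 100

def inv_pct(a):
    """Additive inverse of a% under the cumulative operation: the change
    that exactly undoes an a% change."""
    return -100 * a / (100 + a)
```

For example, two successive 10% raises amount to a 21% raise, and the inverse of +25% is -20%, which is exactly the kind of non-obvious result behind the reasoning errors the article targets.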

  6. Adaptive strategies for cumulative cultural learning.

    Science.gov (United States)

    Ehn, Micael; Laland, Kevin

    2012-05-21

    The demographic and ecological success of our species is frequently attributed to our capacity for cumulative culture. However, it is not yet known how humans combine social and asocial learning to generate effective strategies for learning in a cumulative cultural context. Here we explore how cumulative culture influences the relative merits of various pure and conditional learning strategies, including pure asocial and social learning, critical social learning, conditional social learning and individual refiner strategies. We replicate Rogers' paradox in the cumulative setting. However, our analysis suggests that strategies that resolved Rogers' paradox in a non-cumulative setting may not necessarily evolve in a cumulative setting; thus, different strategies will optimize cumulative and non-cumulative cultural learning. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. 32 CFR 651.16 - Cumulative impacts.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 4 2010-07-01 2010-07-01 true Cumulative impacts. 651.16 Section 651.16... § 651.16 Cumulative impacts. (a) NEPA analyses must assess cumulative effects, which are the impact on the environment resulting from the incremental impact of the action when added to other past, present...

  8. A paradox of cumulative culture.

    Science.gov (United States)

    Kobayashi, Yutaka; Wakano, Joe Yuichiro; Ohtsuki, Hisashi

    2015-08-21

    Culture can grow cumulatively if socially learnt behaviors are improved by individual learning before being passed on to the next generation. Previous authors showed that this kind of learning strategy is unlikely to be evolutionarily stable in the presence of a trade-off between learning and reproduction. This is because culture is a public good that is freely exploited by any member of the population in their model (cultural social dilemma). In this paper, we investigate the effect of vertical transmission (transmission from parents to offspring), which decreases the publicness of culture, on the evolution of cumulative culture in both infinite and finite population models. In the infinite population model, we confirm that culture accumulates largely as long as transmission is purely vertical. It turns out, however, that introduction of even slight oblique transmission drastically reduces the equilibrium level of culture. Even more surprisingly, if the population size is finite, culture hardly accumulates even under purely vertical transmission. This occurs because stochastic extinction due to random genetic drift prevents a learning strategy from accumulating enough culture. Overall, our theoretical results suggest that introducing vertical transmission alone does not really help solve the cultural social dilemma problem. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Lagrangian Curves on Spectral Curves of Monopoles

    International Nuclear Information System (INIS)

    Guilfoyle, Brendan; Khalid, Madeeha; Ramon Mari, Jose J.

    2010-01-01

    We study Lagrangian points on smooth holomorphic curves in TP^1 equipped with a natural neutral Kaehler structure, and prove that they must form real curves. By virtue of the identification of TP^1 with the space L(E^3) of oriented affine lines in Euclidean 3-space, these Lagrangian curves give rise to ruled surfaces in E^3, which we prove have zero Gauss curvature. Each ruled surface is shown to be the tangent lines to a curve in E^3, called the edge of regression of the ruled surface. We give an alternative characterization of these curves as the points in E^3 where the number of oriented lines in the complex curve Σ that pass through the point is less than the degree of Σ. We then apply these results to the spectral curves of certain monopoles and construct the ruled surfaces and edges of regression generated by the Lagrangian curves.

  10. Comparison of different methodologies to estimate intensity-duration-frequency curves for Pelotas - RS, Brazil

    Directory of Open Access Journals (Sweden)

    Rita de C. F. Damé

    2008-06-01

    Full Text Available In agricultural hydraulic-engineering projects for which no observed streamflow data are available, the information contained in Intensity-Duration-Frequency (IDF) curves must be exploited to the fullest. It is therefore necessary to develop methodologies for estimating IDF curves at locations with little or no pluviograph data. The aim of this work was to compare methodologies that disaggregate daily precipitation, verifying the gain of information in terms of IDF curves relative to the curve obtained from observed (historical) data. The methods used were: (a) the Relations Method (CETESB, 1979); (b) BELTRAME et al. (1991); (c) ROBAINA & PEITER (1992); (d) the Modified Bartlett-Lewis Rectangular Pulse model (DAMÉ, 2001). A daily precipitation series from Pelotas - RS, Brazil, covering the period 1982-1998 was used. To estimate the IDF curves from the historical records, durations of 15, 30, 60, 360, 720 and 1,440 minutes and return periods of 2, 5 and 10 years were adopted. The maximum-intensity values were compared by Student's t-test on the linear and angular coefficients and by the mean square relative error. The method that best represented the maximum precipitation intensities for the 2- and 10-year return periods was the Relations Method (CETESB, 1979).
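
The relations (disaggregation) method converts an annual-maximum daily rainfall into shorter-duration intensities using fixed ratios between durations. A sketch, with placeholder ratio values rather than the published CETESB (1979) table:

```python
def disaggregate_daily(p_daily_mm, ratios):
    """Relations method: disaggregate an annual-maximum daily rainfall (mm)
    into shorter-duration intensities (mm/h) with fixed duration ratios.
    The ratio keys and values used in the test below are placeholders,
    not the CETESB (1979) table."""
    depth = {"24h": p_daily_mm * ratios["24h/1day"]}
    depth["1h"] = depth["24h"] * ratios["1h/24h"]
    depth["30min"] = depth["1h"] * ratios["30min/1h"]
    depth["15min"] = depth["30min"] * ratios["15min/30min"]
    hours = {"24h": 24.0, "1h": 1.0, "30min": 0.5, "15min": 0.25}
    # convert depth (mm) to mean intensity (mm/h) over each duration
    return {d: depth[d] / hours[d] for d in depth}
```

In practice the daily depth for each return period would come from a frequency analysis (e.g. a Gumbel fit) of the annual maxima, and the same ratios are then applied for every return period.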

  11. Exact probability distribution function for the volatility of cumulative production

    Science.gov (United States)

    Zadourian, Rubina; Klümper, Andreas

    2018-04-01

    In this paper we study the volatility and its probability distribution function for the cumulative production based on the experience curve hypothesis. This work presents a generalization of the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Due to its wide applicability in industrial and technological activities, we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the way for future research on forecasting the production process.
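
The experience-curve-with-noise setting studied here can be simulated directly. The lognormal noise form and the volatility definition below (standard deviation of successive log cost changes) are our illustrative choices, not necessarily the paper's:

```python
import math
import random

def experience_curve_costs(c0, alpha, sigma, n, seed=0):
    """Simulate unit costs under an experience (learning) curve with noise:
    log c_t = log c0 - alpha * log Q_t + eps_t, where cumulative
    production Q_t = t and eps_t ~ N(0, sigma^2)."""
    rng = random.Random(seed)
    return [c0 * t ** (-alpha) * math.exp(rng.gauss(0, sigma))
            for t in range(1, n + 1)]

def volatility(costs):
    """Sample standard deviation of successive log cost changes."""
    diffs = [math.log(b / a) for a, b in zip(costs, costs[1:])]
    m = sum(diffs) / len(diffs)
    return (sum((d - m) ** 2 for d in diffs) / (len(diffs) - 1)) ** 0.5
```

With sigma = 0 the costs decline deterministically along the experience curve; adding noise spreads the log changes and inflates the measured volatility.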

  12. Cumulative trauma disorders: A review.

    Science.gov (United States)

    Iqbal, Zaheen A; Alghadir, Ahmad H

    2017-08-03

    Cumulative trauma disorder (CTD) is a term for various injuries of the musculoskeletal and nervous systems that are caused by repetitive tasks, forceful exertions, vibrations, mechanical compression or sustained postures. Although there are many studies citing the incidence of CTDs, there are fewer articles about their etiology, pathology and management. The aim of our study was to discuss the etiology, pathogenesis, prevention and management of CTDs. A literature search was performed using various electronic databases. The search was limited to articles in the English language pertaining to randomized clinical trials, cohort studies and systematic reviews of CTDs. A total of 180 relevant papers published since 1959 were identified. Of these, 125 papers reported on incidence and 50 on conservative treatment. Workplace environment, repetition of the same task with little variability, decreased time for rest and increased expectations are major factors in developing CTDs. Preventing its causes and early diagnosis are the best ways to decrease its incidence and severity. For effective management of CTDs, treatment should be divided into Primordial, Primary, Secondary and Tertiary prevention.

  13. Complete cumulative index (1963-1983)

    International Nuclear Information System (INIS)

    1983-01-01

    This complete cumulative index covers all regular and special issues and supplements published by Atomic Energy Review (AER) during its lifetime (1963-1983). The complete cumulative index consists of six Indexes: the Index of Abstracts, the Subject Index, the Title Index, the Author Index, the Country Index and the Table of Elements Index. The complete cumulative index supersedes the Cumulative Indexes for Volumes 1-7: 1963-1969 (1970), and for Volumes 1-10: 1963-1972 (1972); this Index also finalizes Atomic Energy Review, the publication of which has recently been terminated by the IAEA

  14. ECM using Edwards curves

    DEFF Research Database (Denmark)

    Bernstein, Daniel J.; Birkner, Peter; Lange, Tanja

    2013-01-01

    -arithmetic level are as follows: (1) use Edwards curves instead of Montgomery curves; (2) use extended Edwards coordinates; (3) use signed-sliding-window addition-subtraction chains; (4) batch primes to increase the window size; (5) choose curves with small parameters and base points; (6) choose curves with large...

  15. A simple Lissajous curves experimental setup

    Science.gov (United States)

    Şahin Kızılcık, Hasan; Damlı, Volkan

    2018-05-01

    The aim of this study is to develop an experimental setup to produce Lissajous curves. The setup was made using a smartphone, a powered speaker (computer speaker), a balloon, a laser pointer and a piece of mirror. Lissajous curves are formed as follows: a piece of mirror is attached to a balloon. The balloon is vibrated with the sound signal provided by the speaker that is connected to a smartphone. The laser beam is reflected off the mirror and the reflection is shaped as a Lissajous curve. These curves are formed by the interaction of two frequencies (the frequency of the sound signal and the natural vibration frequency of the balloon). They can be used to measure the ratio of frequencies.
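The curve itself is easy to reproduce numerically. A short sketch of the standard parametric form x = sin(at + δ), y = sin(bt); the frequency ratio and phase below are illustrative, not the setup's measured values:

```python
# Sketch: generate the Lissajous figure traced when two perpendicular
# oscillations with frequency ratio a:b and phase shift delta are combined.
import math

def lissajous(a=3, b=2, delta=math.pi / 2, n=1000):
    """Return n (x, y) samples of x = sin(a*t + delta), y = sin(b*t)."""
    pts = []
    for i in range(n):
        t = 2 * math.pi * i / n
        pts.append((math.sin(a * t + delta), math.sin(b * t)))
    return pts

points = lissajous()
print(len(points))  # the curve closes because a/b is rational
```

Plotting `points` with any 2D plotting tool reproduces the familiar 3:2 figure; an irrational frequency ratio would instead fill the unit square densely.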

  16. Transmission of wave energy in curved ducts

    Science.gov (United States)

    Rostafinski, W.

    1973-01-01

    A formulation of wave energy flow was developed for motion in curved ducts. A parametric study over a range of frequencies determined the ability of circular bends to transmit energy for the case of perfectly rigid walls.

  17. System-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, NEWTONP, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557), used independently of one another. Program finds probability required to yield given system reliability. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.
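A hedged sketch of the kind of computation such a program performs: finding the common component reliability p that makes a k-out-of-n system meet a target reliability. Bisection is used here for simplicity; NEWTONP's actual algorithm (presumably Newton's method, per its name) is not reproduced:

```python
# Sketch: solve for the component reliability that yields a given
# k-out-of-n system reliability. Illustration only, not NEWTONP's code.
import math

def system_reliability(p, k, n):
    """P(at least k of n independent components with reliability p work)."""
    return sum(math.comb(n, j) * p**j * (1 - p)**(n - j)
               for j in range(k, n + 1))

def required_component_reliability(target, k, n, tol=1e-10):
    """Bisection on p in (0, 1); system reliability is monotone in p."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if system_reliability(mid, k, n) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

p = required_component_reliability(0.99, 2, 3)  # 2-out-of-3 system
print(round(p, 4))
```

For a 2-out-of-3 system this solves 3p²(1-p) + p³ = 0.99 for the component reliability p.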

  18. Common-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CROSSER, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), used independently of one another. Point of equality between reliability of system and common reliability of components found. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.

  19. Cumulative human impacts on marine predators

    DEFF Research Database (Denmark)

    Maxwell, Sara M; Hazen, Elliott L; Bograd, Steven J

    2013-01-01

    Stressors associated with human activities interact in complex ways to affect marine ecosystems, yet we lack spatially explicit assessments of cumulative impacts on ecologically and economically key components such as marine predators. Here we develop a metric of cumulative utilization and impact (CUI) on marine predators.

  20. Cumulative Student Loan Debt in Minnesota, 2015

    Science.gov (United States)

    Williams-Wyche, Shaun

    2016-01-01

    To better understand student debt in Minnesota, the Minnesota Office of Higher Education (the Office) gathers information on cumulative student loan debt from Minnesota degree-granting institutions. These data detail the number of students with loans by institution, the cumulative student loan debt incurred at that institution, and the percentage…

  1. Contractibility of curves

    Directory of Open Access Journals (Sweden)

    Janusz Charatonik

    1991-11-01

    Full Text Available Results concerning contractibility of curves (equivalently: of dendroids) are collected and discussed in the paper. Interrelations between various conditions which are either sufficient or necessary for a curve to be contractible are studied.

  2. The Relationship between Gender, Cumulative Adversities and ...

    African Journals Online (AJOL)

    The Relationship between Gender, Cumulative Adversities and Mental Health of Employees in ... CAs were measured in three forms (family adversities (CAFam), personal adversities ... Age of employees ranged between 18-65 years.

  3. Cumulative cultural learning: Development and diversity

    Science.gov (United States)

    2017-01-01

    The complexity and variability of human culture is unmatched by any other species. Humans live in culturally constructed niches filled with artifacts, skills, beliefs, and practices that have been inherited, accumulated, and modified over generations. A causal account of the complexity of human culture must explain its distinguishing characteristics: It is cumulative and highly variable within and across populations. I propose that the psychological adaptations supporting cumulative cultural transmission are universal but are sufficiently flexible to support the acquisition of highly variable behavioral repertoires. This paper describes variation in the transmission practices (teaching) and acquisition strategies (imitation) that support cumulative cultural learning in childhood. Examining flexibility and variation in caregiver socialization and children’s learning extends our understanding of evolution in living systems by providing insight into the psychological foundations of cumulative cultural transmission—the cornerstone of human cultural diversity. PMID:28739945

  4. Complexity and demographic explanations of cumulative culture

    NARCIS (Netherlands)

    Querbes, A.; Vaesen, K.; Houkes, W.N.

    2014-01-01

    Formal models have linked prehistoric and historical instances of technological change (e.g., the Upper Paleolithic transition, cultural loss in Holocene Tasmania, scientific progress since the late nineteenth century) to demographic change. According to these models, cumulation of technological

  5. Cumulative human impacts on marine predators.

    Science.gov (United States)

    Maxwell, Sara M; Hazen, Elliott L; Bograd, Steven J; Halpern, Benjamin S; Breed, Greg A; Nickel, Barry; Teutschel, Nicole M; Crowder, Larry B; Benson, Scott; Dutton, Peter H; Bailey, Helen; Kappes, Michelle A; Kuhn, Carey E; Weise, Michael J; Mate, Bruce; Shaffer, Scott A; Hassrick, Jason L; Henry, Robert W; Irvine, Ladd; McDonald, Birgitte I; Robinson, Patrick W; Block, Barbara A; Costa, Daniel P

    2013-01-01

    Stressors associated with human activities interact in complex ways to affect marine ecosystems, yet we lack spatially explicit assessments of cumulative impacts on ecologically and economically key components such as marine predators. Here we develop a metric of cumulative utilization and impact (CUI) on marine predators by combining electronic tracking data of eight protected predator species (n=685 individuals) in the California Current Ecosystem with data on 24 anthropogenic stressors. We show significant variation in CUI with some of the highest impacts within US National Marine Sanctuaries. High variation in underlying species and cumulative impact distributions means that neither alone is sufficient for effective spatial management. Instead, comprehensive management approaches accounting for both cumulative human impacts and trade-offs among multiple stressors must be applied in planning the use of marine resources.

  6. Cumulative cultural learning: Development and diversity.

    Science.gov (United States)

    Legare, Cristine H

    2017-07-24

    The complexity and variability of human culture is unmatched by any other species. Humans live in culturally constructed niches filled with artifacts, skills, beliefs, and practices that have been inherited, accumulated, and modified over generations. A causal account of the complexity of human culture must explain its distinguishing characteristics: It is cumulative and highly variable within and across populations. I propose that the psychological adaptations supporting cumulative cultural transmission are universal but are sufficiently flexible to support the acquisition of highly variable behavioral repertoires. This paper describes variation in the transmission practices (teaching) and acquisition strategies (imitation) that support cumulative cultural learning in childhood. Examining flexibility and variation in caregiver socialization and children's learning extends our understanding of evolution in living systems by providing insight into the psychological foundations of cumulative cultural transmission-the cornerstone of human cultural diversity.

  7. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CUMBIN, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.
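The underlying cumulative binomial calculation can be sketched as follows; this illustrates the mathematics of a k-out-of-n reliability evaluation, not CUMBIN's actual implementation:

```python
# Sketch: cumulative binomial probability that at least k of n components
# function, computed with a term recurrence to avoid large factorials.
def k_out_of_n(p, k, n):
    """P(X >= k) for X ~ Binomial(n, p). Assumes 0 < p < 1."""
    term = (1 - p) ** n                       # P(X = 0)
    total = term if k <= 0 else 0.0
    for j in range(1, n + 1):
        # P(X = j) from P(X = j-1) via the binomial term ratio
        term *= (n - j + 1) / j * p / (1 - p)
        if j >= k:
            total += term
    return total

print(round(k_out_of_n(0.9, 2, 3), 4))  # prints 0.972
```

With component reliability 0.9, a 2-out-of-3 system has reliability 3(0.9)²(0.1) + (0.9)³ = 0.972, matching the recurrence.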

  8. About the cumulants of periodic signals

    Science.gov (United States)

    Barrau, Axel; El Badaoui, Mohammed

    2018-01-01

    This note studies cumulants of time series. These functions originating from the probability theory being commonly used as features of deterministic signals, their classical properties are examined in this modified framework. We show additivity of cumulants, ensured in the case of independent random variables, requires here a different hypothesis. Practical applications are proposed, in particular an analysis of the failure of the JADE algorithm to separate some specific periodic signals.
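Computing cumulants of a finite time series reduces to the standard moment-to-cumulant relations. A small sketch up to fourth order; note these are plain sample estimates (no bias correction), used here only to show the algebra:

```python
# Sketch: sample cumulants of a series from its raw sample moments,
# using the standard moment-to-cumulant relations up to fourth order.
def cumulants(x):
    n = len(x)
    m1, m2, m3, m4 = (sum(v**r for v in x) / n for r in range(1, 5))
    k1 = m1                                   # mean
    k2 = m2 - m1**2                           # variance
    k3 = m3 - 3*m1*m2 + 2*m1**3               # third cumulant
    k4 = m4 - 4*m1*m3 - 3*m2**2 + 12*m1**2*m2 - 6*m1**4  # fourth cumulant
    return k1, k2, k3, k4

# For a symmetric two-point series the odd cumulants vanish.
print(cumulants([-1.0, 1.0, -1.0, 1.0]))
```

Evaluating two periodic signals separately and then their sum makes it easy to check numerically when the additivity discussed in the note holds and when it fails.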

  9. Cumulative effects assessment: Does scale matter?

    International Nuclear Information System (INIS)

    Therivel, Riki; Ross, Bill

    2007-01-01

    Cumulative effects assessment (CEA) is (or should be) an integral part of environmental assessment at both the project and the more strategic level. CEA helps to link the different scales of environmental assessment in that it focuses on how a given receptor is affected by the totality of plans, projects and activities, rather than on the effects of a particular plan or project. This article reviews how CEAs consider, and could consider, scale issues: spatial extent, level of detail, and temporal issues. It is based on an analysis of Canadian project-level CEAs and UK strategic-level CEAs. Based on a review of literature and, especially, case studies with which the authors are familiar, it concludes that scale issues are poorly considered at both levels, with particular problems being unclear or non-existent cumulative effects scoping methodologies; poor consideration of past or likely future human activities beyond the plan or project in question; attempts to apportion 'blame' for cumulative effects; and, at the plan level, limited management of cumulative effects caused particularly by the absence of consent regimes. Scale issues are important in most of these problems. However, both strategic-level and project-level CEA have much potential for managing cumulative effects through better siting and phasing of development, demand reduction and other behavioural changes, and particularly through setting development consent rules for projects. The lack of strategic resource-based thresholds constrains the robust management of strategic-level cumulative effects.

  10. A learning curve for solar thermal power

    Science.gov (United States)

    Platzer, Werner J.; Dinter, Frank

    2016-05-01

    Photovoltaics started its success story by predicting cost degression as a function of cumulated installed capacity. This so-called learning curve was published and used for predictions of PV module cost first; predictions of system cost decrease were developed later. This approach is less sensitive to political decisions and changing market situations than predictions on the time axis. Cost degression due to innovation, scaling effects, improved project management, standardised procedures, the search for better sites and optimisation of project size are learning effects which can only be utilised when projects are developed. Therefore a presentation of CAPEX versus cumulated installed capacity is proposed in order to show politics and the market the possible future advancement of the technology. However, it is difficult to derive a learning curve from the wide range of publications on CSP cost. A logical cost structure for direct and indirect capital expenditure is needed as the basis for further analysis. Using derived reference costs for typical power plant configurations, predictions of future cost have been made. Only on the basis of that cost structure and the learning curve should levelised cost of electricity for solar thermal power plants be calculated for individual projects with different capacity factors in various locations.
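The single-factor experience curve behind such analyses is easy to state: cost falls by a fixed "learning rate" for every doubling of cumulative installed capacity. A minimal sketch with purely illustrative parameter values (not CSP reference costs):

```python
# Sketch of a single-factor experience curve: C(Q) = C0 * (Q/Q0)^-b,
# where b is derived from the learning rate per capacity doubling.
# Parameter values are illustrative only.
import math

def experience_curve_cost(capacity, c0=5000.0, capacity0=1.0,
                          learning_rate=0.10):
    """CAPEX (e.g. $/kW) at cumulative `capacity`, same units as capacity0."""
    b = -math.log2(1.0 - learning_rate)  # elasticity from the learning rate
    return c0 * (capacity / capacity0) ** (-b)

# Each doubling of cumulative capacity cuts cost by the learning rate (10 %).
c1 = experience_curve_cost(8.0)
c2 = experience_curve_cost(16.0)
print(round(c2 / c1, 3))  # prints 0.9
```

Plotting CAPEX against cumulated capacity on log-log axes, as the abstract proposes, turns this relation into a straight line with slope -b.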

  11. Predicting Cumulative Incidence Probability: Marginal and Cause-Specific Modelling

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2005-01-01

    Cumulative incidence probability; cause-specific hazards; subdistribution hazard; binomial modelling.

  12. Predicting Cumulative Incidence Probability by Direct Binomial Regression

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    Binomial modelling; cumulative incidence probability; cause-specific hazards; subdistribution hazard.

  13. Managing cumulative impacts: A key to sustainability?

    Energy Technology Data Exchange (ETDEWEB)

    Hunsaker, C.T.

    1994-12-31

    This paper addresses how science can be used more effectively in creating policy to manage cumulative effects on ecosystems. The paper focuses on the scientific techniques that we have to identify and to assess cumulative impacts on ecosystems. The term "sustainable development" was brought into common use by the World Commission on Environment and Development (the Brundtland Commission) in 1987. The Brundtland Commission report highlighted the need to address developmental and environmental imperatives simultaneously by calling for development that "meets the needs of the present generation without compromising the needs of future generations." We cannot claim to be working toward sustainable development until we can quantitatively assess cumulative impacts on the environment: the two concepts are inextricably linked in that the elusiveness of cumulative effects likely has the greatest potential of keeping us from achieving sustainability. In this paper, assessment and management frameworks relevant to cumulative impacts are discussed along with recent literature on how to improve such assessments. When possible, examples are given for marine ecosystems.

  14. JUMPING THE CURVE

    Directory of Open Access Journals (Sweden)

    René Pellissier

    2012-01-01

    Full Text Available This paper explores the notion of jumping the curve, following from Handy's S-curve onto a new curve with new rules, policies and procedures. It claims that the curve does not generally lie in wait but has to be invented by leadership. The focus of this paper is the identification (mathematically and inferentially) of that point in time, known as the cusp in catastrophe theory, when it is time to change - pro-actively, pre-actively or reactively. These three scenarios are addressed separately and discussed in terms of the relevance of each.

  15. Perspectives on cumulative risks and impacts.

    Science.gov (United States)

    Faust, John B

    2010-01-01

    Cumulative risks and impacts have taken on different meanings in different regulatory and programmatic contexts at federal and state government levels. Traditional risk assessment methodologies, with considerable limitations, can provide a framework for the evaluation of cumulative risks from chemicals. Under an environmental justice program in California, cumulative impacts are defined to include exposures, public health effects, or environmental effects in a geographic area from the emission or discharge of environmental pollution from all sources, through all media. Furthermore, the evaluation of these effects should take into account sensitive populations and socioeconomic factors where possible and to the extent data are available. Key aspects of this potential approach include the consideration of exposures (versus risk), socioeconomic factors, the geographic or community-level assessment scale, and the inclusion of not only health effects but also environmental effects as contributors to impact. Assessments of this type extend the boundaries of the types of information that toxicologists generally provide for risk management decisions.

  16. Cumulative processes and quark distribution in nuclei

    International Nuclear Information System (INIS)

    Kondratyuk, L.; Shmatikov, M.

    1984-01-01

    Assuming existence of multiquark (mainly 12q) bags in nuclei, the spectra of cumulative nucleons and mesons produced in high-energy particle-nucleus collisions are discussed. The exponential form of the quark momentum distribution in the 12q-bag (agreeing well with the experimental data on lepton-nucleus interactions at large q 2 ) is shown to result in a quasi-exponential distribution of cumulative particles over the light-cone variable αsub(B). The dependence of f(αsub(B); psub(perpendicular)) (where psub(perpendicular) is the transverse momentum of the bag) upon psub(perpendicular) is considered. The yields of cumulative resonances, as well as effects related to the u- and d-quark distributions in N > Z nuclei being different, are discussed.

  17. Incorporating experience curves in appliance standards analysis

    International Nuclear Information System (INIS)

    Desroches, Louis-Benoit; Garbesi, Karina; Kantner, Colleen; Van Buskirk, Robert; Yang, Hung-Chia

    2013-01-01

    There exists considerable evidence that manufacturing costs and consumer prices of residential appliances have decreased in real terms over the last several decades. This phenomenon is generally attributable to manufacturing efficiency gained with cumulative experience producing a certain good, and is modeled by an empirical experience curve. The technical analyses conducted in support of U.S. energy conservation standards for residential appliances and commercial equipment have, until recently, assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. This assumption does not reflect real market price dynamics. Using price data from the Bureau of Labor Statistics, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These experience curves were incorporated into recent energy conservation standards analyses for these products. Including experience curves increases the national consumer net present value of potential standard levels. In some cases a potential standard level exhibits a net benefit when considering experience, whereas without experience it exhibits a net cost. These results highlight the importance of modeling more representative market prices. - Highlights: ► Past appliance standards analyses have assumed constant equipment prices. ► There is considerable evidence of consistent real price declines. ► We incorporate experience curves for several large appliances into the analysis. ► The revised analyses demonstrate larger net present values of potential standards. ► The results imply that past standards analyses may have undervalued benefits.

  18. Cumulative Culture and Future Thinking: Is Mental Time Travel a Prerequisite to Cumulative Cultural Evolution?

    Science.gov (United States)

    Vale, G. L.; Flynn, E. G.; Kendal, R. L.

    2012-01-01

    Cumulative culture denotes the, arguably, human capacity to build on the cultural behaviors of one's predecessors, allowing increases in cultural complexity to occur such that many of our cultural artifacts, products and technologies have progressed beyond what a single individual could invent alone. This process of cumulative cultural evolution…

  19. EXAFS cumulants of CdSe

    International Nuclear Information System (INIS)

    Diop, D.

    1997-04-01

    EXAFS functions had been extracted from measurements on the K edge of Se at different temperatures between 20 and 300 K. The analysis of the EXAFS of the filtered first two shells has been done in the wavevector range laying between 2 and 15.5 A -1 in terms of the cumulants of the effective distribution of distances. The cumulants C 3 and C 4 obtained from the phase difference and the amplitude ratio methods have shown the anharmonicity in the vibrations of atoms around their equilibrium position. (author). 13 refs, 3 figs

  20. Cumulative effects of wind turbines. A guide to assessing the cumulative effects of wind energy development

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-07-01

    This guidance provides advice on how to assess the cumulative effects of wind energy developments in an area and is aimed at developers, planners, and stakeholders interested in the development of wind energy in the UK. The principles of cumulative assessment, wind energy development in the UK, cumulative assessment of wind energy development, and best practice conclusions are discussed. The identification and assessment of cumulative effects are examined in terms of global environmental sustainability, local environmental quality and socio-economic activity. Supplementary guidance for assessing the principal cumulative effects on the landscape, on birds, and on visual amenity is provided. The consensus building approach behind the preparation of this guidance is outlined in the annexes of the report.

  1. Tornado-Shaped Curves

    Science.gov (United States)

    Martínez, Sol Sáez; de la Rosa, Félix Martínez; Rojas, Sergio

    2017-01-01

    In Advanced Calculus, our students wonder if it is possible to graphically represent a tornado by means of a three-dimensional curve. In this paper, we show it is possible by providing the parametric equations of such tornado-shaped curves.
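An illustrative sketch of such a three-dimensional curve (not the authors' equations): a conical helix whose radius grows with height, which is the simplest tornado-shaped parametric form:

```python
# Illustrative tornado-shaped curve: a conical helix with radius widening
# toward the top. The specific parameter values are assumptions.
import math

def tornado_curve(turns=8, n=2000, r0=0.05, growth=0.4):
    """Points (x, y, z) of x = r(t) cos t, y = r(t) sin t, with the radius
    r = r0 + growth * z growing linearly in the normalised height z."""
    pts = []
    for i in range(n):
        t = 2 * math.pi * turns * i / (n - 1)
        z = t / (2 * math.pi * turns)      # height normalised to [0, 1]
        r = r0 + growth * z                # radius widens with height
        pts.append((r * math.cos(t), r * math.sin(t), z))
    return pts

pts = tornado_curve()
# Radius at the top exceeds the radius at the base.
print(round(math.hypot(pts[-1][0], pts[-1][1]), 2),
      round(math.hypot(pts[0][0], pts[0][1]), 2))
```

Feeding `pts` to any 3D plotting tool renders the funnel shape; adding a small vertical wobble to z would give the curve a more turbulent look.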

  2. Simulating Supernova Light Curves

    International Nuclear Information System (INIS)

    Even, Wesley Paul; Dolence, Joshua C.

    2016-01-01

    This report discusses supernova light simulations. A brief review of supernovae, basics of supernova light curves, simulation tools used at LANL, and supernova results are included. Further, it happens that many of the same methods used to generate simulated supernova light curves can also be used to model the emission from fireballs generated by explosions in the earth's atmosphere.

  3. Simulating Supernova Light Curves

    Energy Technology Data Exchange (ETDEWEB)

    Even, Wesley Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dolence, Joshua C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-05

    This report discusses supernova light simulations. A brief review of supernovae, basics of supernova light curves, simulation tools used at LANL, and supernova results are included. Further, it happens that many of the same methods used to generate simulated supernova light curves can also be used to model the emission from fireballs generated by explosions in the earth’s atmosphere.

  4. Image scaling curve generation

    NARCIS (Netherlands)

    2012-01-01

    The present invention relates to a method of generating an image scaling curve, where local saliency is detected in a received image. The detected local saliency is then accumulated in the first direction. A final scaling curve is derived from the detected local saliency, and the image is then scaled according to this curve.

  5. Image scaling curve generation.

    NARCIS (Netherlands)

    2011-01-01

    The present invention relates to a method of generating an image scaling curve, where local saliency is detected in a received image. The detected local saliency is then accumulated in the first direction. A final scaling curve is derived from the detected local saliency, and the image is then scaled according to this curve.

  6. Tempo curves considered harmful

    NARCIS (Netherlands)

    Desain, P.; Honing, H.

    1993-01-01

    In the literature of musicology, computer music research and the psychology of music, timing or tempo measurements are mostly presented in the form of continuous curves. The notion of these tempo curves is dangerous, despite its widespread use, because it lulls its users into the false impression

  7. Observable Zitterbewegung in curved spacetimes

    Science.gov (United States)

    Kobakhidze, Archil; Manning, Adrian; Tureanu, Anca

    2016-06-01

    Zitterbewegung, as it was originally described by Schrödinger, is an unphysical, non-observable effect. We verify whether the effect can be observed in non-inertial reference frames/curved spacetimes, where the ambiguity in defining particle states results in a mixing of positive and negative frequency modes. We explicitly demonstrate that such a mixing is in fact necessary to obtain the correct classical value for a particle's velocity in a uniformly accelerated reference frame, whereas in cosmological spacetime a particle does indeed exhibit Zitterbewegung.

  8. Observable Zitterbewegung in curved spacetimes

    Energy Technology Data Exchange (ETDEWEB)

    Kobakhidze, Archil, E-mail: archilk@physics.usyd.edu.au [ARC Centre of Excellence for Particle Physics at the Terascale, School of Physics, The University of Sydney, NSW 2006 (Australia); Manning, Adrian, E-mail: a.manning@physics.usyd.edu.au [ARC Centre of Excellence for Particle Physics at the Terascale, School of Physics, The University of Sydney, NSW 2006 (Australia); Tureanu, Anca, E-mail: anca.tureanu@helsinki.fi [Department of Physics, University of Helsinki, P.O. Box 64, 00014 Helsinki (Finland)

    2016-06-10

    Zitterbewegung, as it was originally described by Schrödinger, is an unphysical, non-observable effect. We verify whether the effect can be observed in non-inertial reference frames/curved spacetimes, where the ambiguity in defining particle states results in a mixing of positive and negative frequency modes. We explicitly demonstrate that such a mixing is in fact necessary to obtain the correct classical value for a particle's velocity in a uniformly accelerated reference frame, whereas in cosmological spacetime a particle does indeed exhibit Zitterbewegung.

  9. Boundary curves of individual items in the distribution of total depressive symptom scores approximate an exponential pattern in a general population.

    Science.gov (United States)

    Tomitaka, Shinichiro; Kawasaki, Yohei; Ide, Kazuki; Akutagawa, Maiko; Yamada, Hiroshi; Furukawa, Toshiaki A; Ono, Yutaka

    2016-01-01

    Previously, we proposed a model for ordinal scale scoring in which individual thresholds for each item constitute a distribution by each item. This led us to hypothesize that the boundary curves of each depressive symptom score in the distribution of total depressive symptom scores follow a common mathematical model, which is expressed as the product of the frequency of the total depressive symptom scores and the probability of the cumulative distribution function of each item threshold. To verify this hypothesis, we investigated the boundary curves of the distribution of total depressive symptom scores in a general population. Data collected from 21,040 subjects who had completed the Center for Epidemiologic Studies Depression Scale (CES-D) questionnaire as part of a national Japanese survey were analyzed. The CES-D consists of 20 items (16 negative items and four positive items). The boundary curves of adjacent item scores in the distribution of total depressive symptom scores for the 16 negative items were analyzed using log-normal scales and curve fitting. The boundary curves of adjacent item scores for a given symptom approximated a common linear pattern on a log normal scale. Curve fitting showed that an exponential fit had a markedly higher coefficient of determination than either linear or quadratic fits. With negative affect items, the gap between the total score curve and boundary curve continuously increased with increasing total depressive symptom scores on a log-normal scale, whereas the boundary curves of positive affect items, which are not considered manifest variables of the latent trait, did not exhibit such increases in this gap. The results of the present study support the hypothesis that the boundary curves of each depressive symptom score in the distribution of total depressive symptom scores commonly follow the predicted mathematical model, which was verified to approximate an exponential mathematical pattern.
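The curve-fitting step described above can be sketched as follows: an exponential model is fitted by linear least squares on the log of the frequencies, and the coefficient of determination of the log fit is checked. The data here are synthetic and purely illustrative (the CES-D frequencies are not reproduced):

```python
# Sketch: fit y = A * exp(B x) by least squares on log y and report the
# coefficient of determination of the log-scale fit. Synthetic data only.
import math

def fit_exponential(xs, ys):
    """Return (A, B, R^2 of the log-linear fit) for y = A * exp(B x)."""
    n = len(xs)
    ls = [math.log(y) for y in ys]
    mx, ml = sum(xs) / n, sum(ls) / n
    b = (sum((x - mx) * (l - ml) for x, l in zip(xs, ls))
         / sum((x - mx) ** 2 for x in xs))
    a = ml - b * mx
    ss_res = sum((l - (a + b * x)) ** 2 for x, l in zip(xs, ls))
    ss_tot = sum((l - ml) ** 2 for l in ls)
    return math.exp(a), b, 1.0 - ss_res / ss_tot

xs = list(range(10))
ys = [1000.0 * math.exp(-0.4 * x) for x in xs]  # perfectly exponential data
A, B, r2 = fit_exponential(xs, ys)
print(round(A), round(B, 2), round(r2, 3))
```

On real boundary-curve frequencies, comparing this R² against those of linear and quadratic fits mirrors the comparison reported in the abstract.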

  10. Boundary curves of individual items in the distribution of total depressive symptom scores approximate an exponential pattern in a general population

    Directory of Open Access Journals (Sweden)

    Shinichiro Tomitaka

    2016-10-01

    Full Text Available Background Previously, we proposed a model for ordinal scale scoring in which individual thresholds for each item constitute a distribution by each item. This led us to hypothesize that the boundary curves of each depressive symptom score in the distribution of total depressive symptom scores follow a common mathematical model, which is expressed as the product of the frequency of the total depressive symptom scores and the probability of the cumulative distribution function of each item threshold. To verify this hypothesis, we investigated the boundary curves of the distribution of total depressive symptom scores in a general population. Methods Data collected from 21,040 subjects who had completed the Center for Epidemiologic Studies Depression Scale (CES-D questionnaire as part of a national Japanese survey were analyzed. The CES-D consists of 20 items (16 negative items and four positive items. The boundary curves of adjacent item scores in the distribution of total depressive symptom scores for the 16 negative items were analyzed using log-normal scales and curve fitting. Results The boundary curves of adjacent item scores for a given symptom approximated a common linear pattern on a log normal scale. Curve fitting showed that an exponential fit had a markedly higher coefficient of determination than either linear or quadratic fits. With negative affect items, the gap between the total score curve and boundary curve continuously increased with increasing total depressive symptom scores on a log-normal scale, whereas the boundary curves of positive affect items, which are not considered manifest variables of the latent trait, did not exhibit such increases in this gap. Discussion The results of the present study support the hypothesis that the boundary curves of each depressive symptom score in the distribution of total depressive symptom scores commonly follow the predicted mathematical model, which was verified to approximate an exponential mathematical pattern.

  11. The curve shortening problem

    CERN Document Server

    Chou, Kai-Seng

    2001-01-01

    Although research in curve shortening flow has been very active for nearly 20 years, the results of those efforts have remained scattered throughout the literature. For the first time, The Curve Shortening Problem collects and illuminates those results in a comprehensive, rigorous, and self-contained account of the fundamental results. The authors present a complete treatment of the Gage-Hamilton theorem, a clear, detailed exposition of Grayson's convexity theorem, a systematic discussion of invariant solutions, applications to the existence of simple closed geodesics on a surface, and a new, almost convexity theorem for the generalized curve shortening problem. Many questions regarding curve shortening remain outstanding. With its careful exposition and complete guide to the literature, The Curve Shortening Problem provides not only an outstanding starting point for graduate students and new investigators, but a superb reference that presents intriguing new results for those already active in the field.

  12. Learning-curve estimation techniques for nuclear industry

    Energy Technology Data Exchange (ETDEWEB)

    Vaurio, J.K.

    1983-01-01

    Statistical techniques are developed to estimate the progress made by the nuclear industry in learning to prevent accidents. Learning curves are derived for accident occurrence rates based on actuarial data, predictions are made for the future, and compact analytical equations are obtained for the statistical accuracies of the estimates. Both maximum likelihood estimation and the method of moments are applied to obtain parameters for the learning models, and results are compared to each other and to earlier graphical and analytical results. An effective statistical test is also derived to assess the significance of trends. The models used associate learning directly to accidents, to the number of plants and to the cumulative number of operating years. Using as a data base nine core damage accidents in electricity-producing plants, it is estimated that the probability that a plant has a serious flaw has decreased from 0.1 to 0.01 during the developmental phase of the nuclear industry. At the same time the frequency of accidents has decreased from 0.04 per reactor year to 0.0004 per reactor year.
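A minimal sketch of the kind of estimation the abstract describes: a Poisson likelihood with an exponentially decaying accident rate, fitted by maximum likelihood. The counts, exposures, model form, and starting values are illustrative assumptions, not the paper's data or its specific learning models:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical accident counts per calendar period and the exposure
# (reactor-years) accumulated in each period -- not the paper's data.
accidents = np.array([2, 2, 1, 1, 1, 1, 0, 1, 0, 0])
exposure = np.array([50.0, 80, 120, 160, 200, 240, 280, 320, 360, 400])
t = np.arange(accidents.size)

def neg_log_lik(params):
    """Negative Poisson log-likelihood (constant terms dropped) with an
    exponentially decaying accident rate lam0 * exp(-b * t)."""
    lam0, b = params
    mu = lam0 * np.exp(-b * t) * exposure   # expected accidents per period
    return np.sum(mu - accidents * np.log(mu))

res = minimize(neg_log_lik, x0=[0.05, 0.1],
               bounds=[(1e-6, 1.0), (0.0, 2.0)])
lam0_hat, b_hat = res.x
print(f"initial rate {lam0_hat:.4f}/reactor-yr, learning rate {b_hat:.3f}/period")
```

The method-of-moments alternative mentioned in the abstract would instead match the observed counts' first two moments to the model's; the MLE above is the more common route when counts are this sparse.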

  13. Learning curve estimation techniques for the nuclear industry

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    1983-01-01

    Statistical techniques are developed to estimate the progress made by the nuclear industry in learning to prevent accidents. Learning curves are derived for accident occurrence rates based on actuarial data, predictions are made for the future, and compact analytical equations are obtained for the statistical accuracies of the estimates. Both maximum likelihood estimation and the method of moments are applied to obtain parameters for the learning models, and results are compared to each other and to earlier graphical and analytical results. An effective statistical test is also derived to assess the significance of trends. The models used associate learning directly to accidents, to the number of plants and to the cumulative number of operating years. Using as a data base nine core damage accidents in electricity-producing plants, it is estimated that the probability that a plant has a serious flaw has decreased from 0.1 to 0.01 during the developmental phase of the nuclear industry. At the same time the frequency of accidents has decreased from 0.04 per reactor year to 0.0004 per reactor year.

  14. Learning-curve estimation techniques for nuclear industry

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    1983-01-01

    Statistical techniques are developed to estimate the progress made by the nuclear industry in learning to prevent accidents. Learning curves are derived for accident occurrence rates based on actuarial data, predictions are made for the future, and compact analytical equations are obtained for the statistical accuracies of the estimates. Both maximum likelihood estimation and the method of moments are applied to obtain parameters for the learning models, and results are compared to each other and to earlier graphical and analytical results. An effective statistical test is also derived to assess the significance of trends. The models used associate learning directly to accidents, to the number of plants and to the cumulative number of operating years. Using as a data base nine core damage accidents in electricity-producing plants, it is estimated that the probability that a plant has a serious flaw has decreased from 0.1 to 0.01 during the developmental phase of the nuclear industry. At the same time the frequency of accidents has decreased from 0.04 per reactor year to 0.0004 per reactor year.

  15. Multiparty correlation measure based on the cumulant

    International Nuclear Information System (INIS)

    Zhou, D. L.; Zeng, B.; Xu, Z.; You, L.

    2006-01-01

    We propose a genuine multiparty correlation measure for a multiparty quantum system as the trace norm of the cumulant of the state. The legitimacy of our multiparty correlation measure is explicitly demonstrated by proving it satisfies the five basic conditions required for a correlation measure. As an application we construct an efficient algorithm for the calculation of our measures for all stabilizer states

  16. Decision analysis with cumulative prospect theory.

    Science.gov (United States)

    Bayoumi, A M; Redelmeier, D A

    2000-01-01

    Individuals sometimes express preferences that do not follow expected utility theory. Cumulative prospect theory adjusts for some phenomena by using decision weights rather than probabilities when analyzing a decision tree. The authors examined how probability transformations from cumulative prospect theory might alter a decision analysis of a prophylactic therapy in AIDS, eliciting utilities from patients with HIV infection (n = 75) and calculating expected outcomes using an established Markov model. They next focused on transformations of three sets of probabilities: 1) the probabilities used in calculating standard-gamble utility scores; 2) the probabilities of being in discrete Markov states; 3) the probabilities of transitioning between Markov states. The same prophylaxis strategy yielded the highest quality-adjusted survival under all transformations. For the average patient, prophylaxis appeared relatively less advantageous when standard-gamble utilities were transformed. Prophylaxis appeared relatively more advantageous when state probabilities were transformed and relatively less advantageous when transition probabilities were transformed. Transforming standard-gamble and transition probabilities simultaneously decreased the gain from prophylaxis by almost half. Sensitivity analysis indicated that even near-linear probability weighting transformations could substantially alter quality-adjusted survival estimates. The magnitude of benefit estimated in a decision-analytic model can change significantly after using cumulative prospect theory. Incorporating cumulative prospect theory into decision analysis can provide a form of sensitivity analysis and may help describe when people deviate from expected utility theory.
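The probability transformation at the core of this approach can be sketched with the Tversky-Kahneman weighting function applied to cumulative (rank-ordered) probabilities. The parameter value and the gain-only simplification are common textbook assumptions, not the authors' Markov model:

```python
import numpy as np

def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman (1992) probability weighting function."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def cpt_value(outcomes, probs, gamma=0.61):
    """Rank-dependent value of a gain-only gamble: decision weights are
    differences of weighted cumulative probabilities, best outcome first."""
    order = np.argsort(outcomes)[::-1]
    o = np.asarray(outcomes, float)[order]
    p = np.asarray(probs, float)[order]
    cum = np.cumsum(p)
    w = tk_weight(cum, gamma)
    weights = np.diff(np.concatenate([[0.0], w]))
    return float(np.sum(weights * o))

# Small probabilities are overweighted: a 1% chance at 100 is valued
# well above its expected value of 1 under these parameters.
print(cpt_value([100, 0], [0.01, 0.99]))
```

This is exactly the kind of near-linear-looking transformation the authors found could still substantially alter quality-adjusted survival estimates when applied to standard-gamble, state, or transition probabilities.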

  17. Cumulative watershed effects: a research perspective

    Science.gov (United States)

    Leslie M. Reid; Robert R. Ziemer

    1989-01-01

    A cumulative watershed effect (CWE) is any response to multiple land-use activities that is caused by, or results in, altered watershed function. The CWE issue is politically defined, as is the significance of particular impacts. But the processes generating CWEs are the traditional focus of geomorphology and ecology, and have thus been studied for decades. The CWE...

  18. An evaluation paradigm for cumulative impact analysis

    Science.gov (United States)

    Stakhiv, Eugene Z.

    1988-09-01

    Cumulative impact analysis is examined from a conceptual decision-making perspective, focusing on its implicit and explicit purposes as suggested within the policy and procedures for environmental impact analysis of the National Environmental Policy Act of 1969 (NEPA) and its implementing regulations. In this article it is also linked to different evaluation and decision-making conventions, contrasting a regulatory context with a comprehensive planning framework. The specific problems that make the application of cumulative impact analysis a virtually intractable evaluation requirement are discussed in connection with the federal regulation of wetlands uses. The relatively familiar US Army Corps of Engineers' (the Corps) permit program, in conjunction with the Environmental Protection Agency's (EPA) responsibilities in managing its share of the Section 404 regulatory program requirements, is used throughout as the realistic context for highlighting certain pragmatic evaluation aspects of cumulative impact assessment. To understand the purposes of cumulative impact analysis (CIA), a key distinction must be made between the implied comprehensive and multiobjective evaluation purposes of CIA, promoted through the principles and policies contained in NEPA, and the more commonly conducted and limited assessment of cumulative effects (ACE), which focuses largely on the ecological effects of human actions. Based on current evaluation practices within the Corps' and EPA's permit programs, it is shown that the commonly used screening approach to regulating wetlands uses is not compatible with the purposes of CIA, nor is the environmental impact statement (EIS) an appropriate vehicle for evaluating the variety of objectives and trade-offs needed as part of CIA. 
A heuristic model that incorporates the basic elements of CIA is developed, including the idea of trade-offs among social, economic, and environmental protection goals carried out within the context of environmental

  19. Learning Curve? Which One?

    Directory of Open Access Journals (Sweden)

    Paulo Prochno

    2004-07-01

    Full Text Available Learning curves have been studied for a long time. These studies provided strong support to the hypothesis that, as organizations produce more of a product, unit costs of production decrease at a decreasing rate (see Argote, 1999 for a comprehensive review of learning curve studies. But the organizational mechanisms that lead to these results are still underexplored. We know some drivers of learning curves (ADLER; CLARK, 1991; LAPRE et al., 2000, but we still lack a more detailed view of the organizational processes behind those curves. Through an ethnographic study, I bring a comprehensive account of the first year of operations of a new automotive plant, describing what was taking place in the assembly area during the most relevant shifts of the learning curve. The emphasis is then on how learning occurs in that setting. My analysis suggests that the overall learning curve is in fact the result of an integration process that puts together several individual ongoing learning curves in different areas throughout the organization. In the end, I propose a model to understand the evolution of these learning processes and their supporting organizational mechanisms.

  20. The crime Kuznets curve

    OpenAIRE

    Buonanno, Paolo; Fergusson, Leopoldo; Vargas, Juan Fernando

    2014-01-01

    We document the existence of a Crime Kuznets Curve in US states since the 1970s. As income levels have risen, crime has followed an inverted U-shaped pattern, first increasing and then dropping. The Crime Kuznets Curve is not explained by income inequality. In fact, we show that during the sample period inequality has risen monotonically with income, ruling out the traditional Kuznets Curve. Our finding is robust to adding a large set of controls that are used in the literature to explain the...

  1. Bond yield curve construction

    Directory of Open Access Journals (Sweden)

    Kožul Nataša

    2014-01-01

    Full Text Available In the broadest sense, the yield curve indicates the market's view of the evolution of interest rates over time. However, given that the cost of borrowing is closely linked to creditworthiness (ability to repay), different yield curves will apply to different currencies, market sectors, or even individual issuers. As government borrowing is indicative of interest rate levels available to other market players in a particular country, and considering that bond issuance still remains the dominant form of sovereign debt, this paper describes yield curve construction using bonds. The relationship between zero-coupon yield, par yield and yield to maturity is given and their usage in determining curve discount factors is described. Their usage in deriving forward rates and pricing related derivative instruments is also discussed.
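The bootstrapping relationship the abstract alludes to can be sketched directly: a par bond prices at face value, which pins down each successive discount factor, and zero and forward rates then fall out. The annual-coupon convention and the par yields themselves are assumptions for illustration:

```python
import numpy as np

# Hypothetical annual par yields for maturities of 1..5 years.
par = np.array([0.020, 0.024, 0.028, 0.031, 0.033])

# Bootstrap: a par bond with coupon c prices at 1, so
#   1 = c * sum(df[1..n]) + df[n]  =>  df[n] = (1 - c * sum(df[1..n-1])) / (1 + c)
df = []
for c in par:
    df.append((1 - c * sum(df)) / (1 + c))
df = np.array(df)

zero = df ** (-1.0 / np.arange(1, 6)) - 1   # zero-coupon (spot) rates
fwd = df[:-1] / df[1:] - 1                  # one-year forward rates f(n, n+1)
print(np.round(zero, 5), np.round(fwd, 5))
```

For the one-year maturity the zero rate equals the par yield (a single cash flow), and with an upward-sloping par curve the forwards sit above the zeros, as expected.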

  2. SRHA calibration curve

    Data.gov (United States)

    U.S. Environmental Protection Agency — a UV calibration curve for SRHA quantitation. This dataset is associated with the following publication: Chang, X., and D. Bouchard. Surfactant-Wrapped Multiwalled...

  3. Bragg Curve Spectroscopy

    International Nuclear Information System (INIS)

    Gruhn, C.R.

    1981-05-01

    An alternative utilization is presented for the gaseous ionization chamber in the detection of energetic heavy ions, which is called Bragg Curve Spectroscopy (BCS). Conceptually, BCS involves using the maximum data available from the Bragg curve of the stopping heavy ion (HI) for purposes of identifying the particle and measuring its energy. A detector has been designed that measures the Bragg curve with high precision. From the Bragg curve are determined: the range from the length of the track, the total energy from the integral of the specific ionization over the track, the dE/dx from the specific ionization at the beginning of the track, and the Bragg peak from the maximum of the specific ionization of the HI. This last signal measures the atomic number, Z, of the HI unambiguously.
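The four BCS observables can be read off a sampled specific-ionization profile in a few lines. The profile shape below is a toy stand-in, not a physical stopping-power model:

```python
import numpy as np

# Toy specific-ionization profile dE/dx versus depth for a stopping heavy
# ion; the functional form is illustrative only.
x = np.linspace(0.0, 10.0, 1001)          # depth in the gas, arbitrary units
range_true = 8.0
dEdx = np.where(x < range_true, 1.0 / (0.15 + (range_true - x) / 40.0), 0.0)

# The four BCS observables named in the abstract:
track_range = x[np.nonzero(dEdx)[0][-1]]                      # track length
total_E = np.sum(0.5 * (dEdx[1:] + dEdx[:-1]) * np.diff(x))   # energy integral
entrance_dEdx = dEdx[0]                   # specific ionization at track start
bragg_peak = dEdx.max()                   # maximum specific ionization (~Z)
print(round(track_range, 2), round(total_E, 1))
```

The essential physics is captured qualitatively: ionization rises toward the end of the track, so the Bragg peak exceeds the entrance dE/dx, and the energy is the trapezoidal integral of the curve.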

  4. ROBUST DECLINE CURVE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Sutawanir Darwis

    2012-05-01

    Full Text Available Empirical decline curve analysis of oil production data gives reasonable answers in hyperbolic type-curve situations; however, the methodology has limitations in fitting real historical production data in the presence of unusual observations due to the effect of treatments applied to the well in order to increase production capacity. The development of robust least squares offers new possibilities for better fitting production data using decline curve analysis by down-weighting the unusual observations. This paper proposes a robust least squares fitting lmRobMM approach to estimate the decline rate of daily production data and compares the results with reservoir simulation results. For a case study, we use the oil production data at TBA Field, West Java. The results demonstrate that the approach is suitable for decline curve fitting and offers new insight into decline curve analysis in the presence of unusual observations.
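The paper's lmRobMM is an S-Plus/R robust MM-estimator; as a rough stand-in for the same idea, one can fit an Arps hyperbolic decline with scipy's robust (Huber) loss, which likewise down-weights treatment spikes. All data and parameters below are synthetic:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

def hyperbolic(t, qi, di, b):
    """Arps hyperbolic decline curve q(t) = qi / (1 + b*di*t)^(1/b)."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

# Synthetic daily production with noise plus a few upward spikes standing
# in for the "unusual observations" caused by well treatments.
t = np.arange(0.0, 300.0)
q = hyperbolic(t, 1000.0, 0.01, 0.8) * (1 + 0.02 * rng.standard_normal(t.size))
q[[50, 120, 200]] *= 1.6                  # treatment spikes (outliers)

def residuals(p):
    return hyperbolic(t, *p) - q

fit = least_squares(residuals, x0=[900.0, 0.02, 0.5], loss="huber",
                    f_scale=20.0, bounds=([1.0, 1e-4, 0.01], [1e4, 1.0, 2.0]))
qi_hat, di_hat, b_hat = fit.x
print(round(qi_hat), round(di_hat, 4), round(b_hat, 2))
```

With `loss="huber"`, residuals beyond `f_scale` contribute only linearly to the objective, so the spikes barely pull the fitted decline rate, which is the practical point of the robust approach.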

  5. Power Curve Measurements FGW

    DEFF Research Database (Denmark)

    Georgieva Yankova, Ginka; Federici, Paolo

    This report describes power curve measurements carried out on a given turbine in a chosen period. The measurements are carried out in accordance with IEC 61400-12-1 Ed. 1 and FGW Teil 2.

  6. Curves and Abelian varieties

    CERN Document Server

    Alexeev, Valery; Clemens, C Herbert; Beauville, Arnaud

    2008-01-01

    This book is devoted to recent progress in the study of curves and abelian varieties. It discusses both classical aspects of this deep and beautiful subject as well as two important new developments, tropical geometry and the theory of log schemes. In addition to original research articles, this book contains three surveys devoted to singularities of theta divisors, of compactified Jacobians of singular curves, and of "strange duality" among moduli spaces of vector bundles on algebraic varieties.

  7. Activation energy and R. Chen frequency factor in the Randall and Wilkins original equation for second order kinetics. Emission curve simulated in Microsoft Excel algebra; Energia de activacion y factor de frecuencia de R. Chen en la ecuacion original de Randall y Wilkins para cinetica de segundo orden. Curva de emision simulada en algebra de Microsoft Excel

    Energy Technology Data Exchange (ETDEWEB)

    Moreno M, A. [Departamento de Apoyo en Ciencias Aplicadas, Benemerita Universidad Autonoma de Puebla, 4 Sur 104, Centro Historico, 72000 Puebla (Mexico); Moreno B, A

    2000-07-01

    In this work the incorporation of the activation energy and frequency factor parameters proposed by R. Chen is presented in the original formulation of Randall and Wilkins second-order kinetics. The concordance of the results is compared between calculations following the R. Chen methodology and those obtained by direct incorporation of the previously indicated parameters in the Randall-Wilkins-Levy expression, for a simulated thermoluminescent emission curve of two peaks with maximum peak temperatures T_m1 = 120 and T_m2 = 190. (Author)
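A second-order glow curve of the kind simulated in the paper can be sketched numerically with the Garlick-Gibson form (a common parametrization of second-order kinetics). The activation energies, frequency factors, and the two-peak composition below are illustrative assumptions; the peaks are only roughly placed near the abstract's T_m1 and T_m2 (assumed to be in deg C):

```python
import numpy as np

k = 8.617e-5                              # Boltzmann constant, eV/K

def glow_curve_2nd(T, E, s, beta=1.0, T0=300.0):
    """Second-order (Garlick-Gibson) TL glow curve, peak-normalized to 1.
    E in eV, s in 1/s, beta = heating rate in K/s; the temperature
    integral is evaluated numerically with a cumulative trapezoid rule."""
    boltz = np.exp(-E / (k * T))
    integ = np.concatenate(
        [[0.0], np.cumsum(0.5 * (boltz[1:] + boltz[:-1]) * np.diff(T))])
    I = boltz / (1.0 + (s / beta) * integ) ** 2
    return I / I.max()

T = np.linspace(300.0, 600.0, 3001)
# Two hypothetical peaks with maxima landing roughly in the 120-190 deg C
# region (about 400-460 K); parameter values are assumptions.
I = glow_curve_2nd(T, E=1.0, s=2e11) + 0.6 * glow_curve_2nd(T, E=1.2, s=1e12)
print(T[np.argmax(I)])
```

The same cumulative-trapezoid trick is what a spreadsheet implementation (as in the paper's Excel algebra) would do column by column.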

  8. Sharing a quota on cumulative carbon emissions

    International Nuclear Information System (INIS)

    Raupach, Michael R.; Davis, Steven J.; Peters, Glen P.; Andrew, Robbie M.; Canadell, Josep G.; Ciais, Philippe

    2014-01-01

    Any limit on future global warming is associated with a quota on cumulative global CO2 emissions. We translate this global carbon quota to regional and national scales, on a spectrum of sharing principles that extends from continuation of the present distribution of emissions to an equal per-capita distribution of cumulative emissions. A blend of these endpoints emerges as the most viable option. For a carbon quota consistent with a 2 °C warming limit (relative to pre-industrial levels), the necessary long-term mitigation rates are very challenging (typically over 5% per year), both because of strong limits on future emissions from the global carbon quota and also the likely short-term persistence in emissions growth in many regions. (authors)
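The spectrum of sharing principles reduces to a one-line blend of an inertia share and an equity share. The three-region numbers are invented for illustration:

```python
import numpy as np

def share_quota(current_emissions, population, w):
    """Split a global cumulative CO2 quota among regions.
    w = 1: pure inertia (shares follow present emissions);
    w = 0: pure equity (equal per-capita cumulative emissions);
    intermediate w blends the two endpoint sharing principles."""
    e = np.asarray(current_emissions, float)
    p = np.asarray(population, float)
    return w * e / e.sum() + (1 - w) * p / p.sum()

# Hypothetical three-region world: quota shares under a 50/50 blend.
shares = share_quota([10.0, 5.0, 1.0], [1.0, 4.0, 3.0], w=0.5)
print(shares)   # shares sum to 1 by construction
```

A high-emitting, low-population region gets a smaller share as w moves from 1 toward 0, which is the trade-off the blended option negotiates.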

  9. Complexity and demographic explanations of cumulative culture.

    Science.gov (United States)

    Querbes, Adrien; Vaesen, Krist; Houkes, Wybo

    2014-01-01

    Formal models have linked prehistoric and historical instances of technological change (e.g., the Upper Paleolithic transition, cultural loss in Holocene Tasmania, scientific progress since the late nineteenth century) to demographic change. According to these models, cumulation of technological complexity is inhibited by decreasing--while favoured by increasing--population levels. Here we show that these findings are contingent on how complexity is defined: demography plays a much more limited role in sustaining cumulative culture in case formal models deploy Herbert Simon's definition of complexity rather than the particular definitions of complexity hitherto assumed. Given that currently available empirical evidence doesn't afford discriminating proper from improper definitions of complexity, our robustness analyses put into question the force of recent demographic explanations of particular episodes of cultural change.

  10. Complexity and demographic explanations of cumulative culture.

    Directory of Open Access Journals (Sweden)

    Adrien Querbes

    Full Text Available Formal models have linked prehistoric and historical instances of technological change (e.g., the Upper Paleolithic transition, cultural loss in Holocene Tasmania, scientific progress since the late nineteenth century to demographic change. According to these models, cumulation of technological complexity is inhibited by decreasing--while favoured by increasing--population levels. Here we show that these findings are contingent on how complexity is defined: demography plays a much more limited role in sustaining cumulative culture in case formal models deploy Herbert Simon's definition of complexity rather than the particular definitions of complexity hitherto assumed. Given that currently available empirical evidence doesn't afford discriminating proper from improper definitions of complexity, our robustness analyses put into question the force of recent demographic explanations of particular episodes of cultural change.

  11. Conceptual models for cumulative risk assessment.

    Science.gov (United States)

    Linder, Stephen H; Sexton, Ken

    2011-12-01

    In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive "family" of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects.

  12. Childhood Cumulative Risk and Later Allostatic Load

    DEFF Research Database (Denmark)

    Doan, Stacey N; Dich, Nadya; Evans, Gary W

    2014-01-01

    Objective: The present study investigated the long-term impact of exposure to poverty-related stressors during childhood on allostatic load, an index of physiological dysregulation, and the potential mediating role of substance use. Method: Participants (n = 162) were rural children from New York State, followed for 8 years (between the ages 9 and 17). Poverty-related stress was computed using the cumulative risk approach, assessing stressors across 9 domains, including environmental, psychosocial, and demographic factors. Allostatic load captured a range of physiological responses, including cardiovascular, hypothalamic pituitary adrenal axis, sympathetic adrenal medullary system, and metabolic activity. Smoking and alcohol/drug use were tested as mediators of the hypothesized childhood risk-adolescent allostatic load relationship. Results: Cumulative risk exposure at age 9 predicted increases...

  13. Fuzzy set theory for cumulative trauma prediction

    OpenAIRE

    Fonseca, Daniel J.; Merritt, Thomas W.; Moynihan, Gary P.

    2001-01-01

    A widely used fuzzy reasoning algorithm was modified and implemented via an expert system to assess the potential risk of employee repetitive strain injury in the workplace. This fuzzy relational model, known as the Priority First Cover Algorithm (PFC), was adapted to describe the relationship between 12 cumulative trauma disorders (CTDs) of the upper extremity, and 29 identified risk factors. The algorithm, which finds a suboptimal subset from a group of variables based on the criterion of...

  14. Sikap Kerja Duduk Terhadap Cumulative Trauma Disorder

    OpenAIRE

    Rahmawati, Yulita; Sugiharto, -

    2011-01-01

    The problem studied was whether there is a relationship between seated working posture and the occurrence of Cumulative Trauma Disorder (CTD) among workers in the sanding section at PT. Geromar Jepara. The aim was to determine the relationship between seated working posture and the occurrence of CTD among sanding-section workers. This was an explanatory study using a cross-sectional approach. The population in this study consisted of 30 workers in the sanding section. The technique ...

  15. Power Reactor Docket Information. Annual cumulation (citations)

    International Nuclear Information System (INIS)

    1977-12-01

    An annual cumulation of the citations to the documentation associated with civilian nuclear power plants is presented. This material is that which is submitted to the U.S. Nuclear Regulatory Commission in support of applications for construction and operating licenses. Citations are listed by Docket number in accession number sequence. The Table of Contents is arranged both by Docket number and by nuclear power plant name

  16. Cumulative Effect of Depression on Dementia Risk

    OpenAIRE

    Olazarán, J.; Trincado, R.; Bermejo-Pareja, F.

    2013-01-01

    Objective. To analyze a potential cumulative effect of life-time depression on dementia and Alzheimer's disease (AD), with control of vascular factors (VFs). Methods. This study was a subanalysis of the Neurological Disorders in Central Spain (NEDICES) study. Past and present depression, VFs, dementia status, and dementia due to AD were documented at study inception. Dementia status was also documented after three years. Four groups were created according to baseline data: never depression (n...

  17. Cumulative release to the accessible environment

    International Nuclear Information System (INIS)

    Kanehiro, B.

    1985-01-01

    The Containment and Isolation Working Group considered issues related to the postclosure behavior of repositories in crystalline rock. This working group was further divided into subgroups to consider the progress since the 1978 GAIN Symposium and identify research needs in the individual areas of regional ground-water flow, ground-water travel time, fractional release, and cumulative release. The analysis and findings of the Fractional Release Subgroup are presented

  18. EPA Workshop on Epigenetics and Cumulative Risk ...

    Science.gov (United States)

    The workshop included presentations and discussions by scientific experts pertaining to three topics (i.e., epigenetic changes associated with diverse stressors, key science considerations in understanding epigenetic changes, and practical application of epigenetic tools to address cumulative risks from environmental stressors), to address several questions under each topic, and included an opportunity for attendees to participate in break-out groups, provide comments and ask questions. Workshop Goals The workshop seeks to examine the opportunity for use of aggregate epigenetic change as an indicator in cumulative risk assessment for populations exposed to multiple stressors that affect epigenetic status. Epigenetic changes are specific molecular changes around DNA that alter expression of genes. Epigenetic changes include DNA methylation, formation of histone adducts, and changes in micro RNAs. Research today indicates that epigenetic changes are involved in many chronic diseases (cancer, cardiovascular disease, obesity, diabetes, mental health disorders, and asthma). Research has also linked a wide range of stressors including pollution and social factors with occurrence of epigenetic alterations. Epigenetic changes have the potential to reflect impacts of risk factors across multiple stages of life. Only recently receiving attention is the nexus between the factors of cumulative exposure to environmental

  19. Higher order cumulants in colorless partonic plasma

    Energy Technology Data Exchange (ETDEWEB)

    Cherif, S. [Sciences and Technologies Department, University of Ghardaia, Ghardaia, Algiers (Algeria); Laboratoire de Physique et de Mathématiques Appliquées (LPMA), ENS-Kouba (Bachir El-Ibrahimi), Algiers (Algeria); Ahmed, M. A. A. [Department of Physics, College of Science, Taibah University Al-Madinah Al-Mounawwarah KSA (Saudi Arabia); Department of Physics, Taiz University in Turba, Taiz (Yemen); Laboratoire de Physique et de Mathématiques Appliquées (LPMA), ENS-Kouba (Bachir El-Ibrahimi), Algiers (Algeria); Ladrem, M., E-mail: mladrem@yahoo.fr [Department of Physics, College of Science, Taibah University Al-Madinah Al-Mounawwarah KSA (Saudi Arabia); Laboratoire de Physique et de Mathématiques Appliquées (LPMA), ENS-Kouba (Bachir El-Ibrahimi), Algiers (Algeria)

    2016-06-10

    Any physical system considered to study the QCD deconfinement phase transition certainly has a finite volume, so finite-size effects are inevitably present. This renders the location of the phase transition and the determination of its order an extremely difficult task, even in the simplest known cases. In order to identify and locate the colorless QCD deconfinement transition point in finite volume T₀(V), a new approach based on the finite-size cumulant expansion of the order parameter and the ℒ_{m,n}-Method is used. We have shown that both cumulants of higher order and their ratios, associated with the thermodynamical fluctuations of the order parameter, behave in a distinctive way in the QCD deconfinement phase transition, revealing pronounced oscillations in the transition region. The sign structure and the oscillatory behavior of these in the vicinity of the deconfinement phase transition point might be a sensitive probe and may allow one to elucidate their relation to the QCD phase transition point. In the context of our model, we have shown that the finite volume transition point is always associated with the appearance of a particular point in all higher order cumulants under consideration.
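Sample cumulants and their ratios, the quantities tracked through the transition region, can be estimated from order-parameter samples with Fisher k-statistics. The Gaussian sample below is only a stand-in for model output:

```python
import numpy as np
from scipy.stats import kstat

# Hypothetical order-parameter sample; a real study would draw these from
# simulations of the partonic system at each temperature and volume.
rng = np.random.default_rng(1)
sample = rng.normal(loc=0.3, scale=0.05, size=100_000)

# Unbiased sample cumulants k1..k4 (Fisher k-statistics) and the
# standardized ratios commonly used as transition probes.
k = {n: kstat(sample, n) for n in (1, 2, 3, 4)}
skew_ratio = k[3] / k[2] ** 1.5   # ~ skewness
kurt_ratio = k[4] / k[2] ** 2     # ~ excess kurtosis; near 0 for a Gaussian
print(round(kurt_ratio, 3))
```

Near a transition, the sign changes and oscillations of ratios like k[4]/k[2]**2 across the control parameter are precisely the structures the abstract describes.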

  20. Cumulative irritation potential of topical retinoid formulations.

    Science.gov (United States)

    Leyden, James J; Grossman, Rachel; Nighland, Marge

    2008-08-01

    Localized irritation can limit treatment success with topical retinoids such as tretinoin and adapalene. The factors that influence irritant reactions have been shown to include individual skin sensitivity, the particular retinoid and concentration used, and the vehicle formulation. To compare the cutaneous tolerability of tretinoin 0.04% microsphere gel (TMG) with that of adapalene 0.3% gel and a standard tretinoin 0.025% cream. The results of 2 randomized, investigator-blinded studies of 2 to 3 weeks' duration, which utilized a split-face method to compare cumulative irritation scores induced by topical retinoids in subjects with healthy skin, were combined. Study 1 compared TMG 0.04% with adapalene 0.3% gel over 2 weeks, while study 2 compared TMG 0.04% with tretinoin 0.025% cream over 3 weeks. In study 1, TMG 0.04% was associated with significantly lower cumulative scores for erythema, dryness, and burning/stinging than adapalene 0.3% gel. However, in study 2, there were no significant differences in cumulative irritation scores between TMG 0.04% and tretinoin 0.025% cream. Measurements of erythema by a chromameter showed no significant differences between the test formulations in either study. Cutaneous tolerance of TMG 0.04% on the face was superior to that of adapalene 0.3% gel and similar to that of a standard tretinoin cream containing a lower concentration of the drug (0.025%).

  1. Approximation by planar elastic curves

    DEFF Research Database (Denmark)

    Brander, David; Gravesen, Jens; Nørbjerg, Toke Bjerge

    2016-01-01

    We give an algorithm for approximating a given plane curve segment by a planar elastic curve. The method depends on an analytic representation of the space of elastic curve segments, together with a geometric method for obtaining a good initial guess for the approximating curve. A gradient-driven optimization is then used to find the approximating elastic curve.

  2. Power Curve Measurements REWS

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula; Vesth, Allan

    This report describes the power curve measurements carried out on a given wind turbine in a chosen period. The measurements were carried out following the measurement procedure in the draft of IEC 61400-12-1 Ed. 2 [1], with some deviations mostly regarding uncertainty calculation. Here, the reference wind speed used in the power curve is the equivalent wind speed obtained from lidar measurements at several heights between lower and upper blade tip, in combination with a hub height meteorological mast. The measurements have been performed using DTU's measurement equipment, the analysis...

  3. Curved electromagnetic missiles

    International Nuclear Information System (INIS)

    Myers, J.M.; Shen, H.M.; Wu, T.T.

    1989-01-01

Transient electromagnetic fields can exhibit interesting behavior in the limit of great distances from their sources. In situations of finite total radiated energy, the energy reaching a distant receiver can decrease with distance much more slowly than the usual r⁻². Cases of such slow decrease have been referred to as electromagnetic missiles. All of the wide variety of known missiles propagate in essentially straight lines. A sketch is presented here of a missile that can follow a path that is strongly curved. An example of a curved electromagnetic missile is explicitly constructed and some of its properties are discussed. References to details available elsewhere are given.

  4. Algebraic curves and cryptography

    CERN Document Server

    Murty, V Kumar

    2010-01-01

It is by now a well-known paradigm that public-key cryptosystems can be built using finite Abelian groups and that algebraic geometry provides a supply of such groups through Abelian varieties over finite fields. Of special interest are the Abelian varieties that are Jacobians of algebraic curves. All of the articles in this volume are centered on the theme of point counting and explicit arithmetic on the Jacobians of curves over finite fields. The topics covered include Schoof's ℓ-adic point counting algorithm, the p-adic algorithms of Kedlaya and Denef-Vercauteren, explicit arithmetic on

  5. IGMtransmission: Transmission curve computation

    Science.gov (United States)

    Harrison, Christopher M.; Meiksin, Avery; Stock, David

    2015-04-01

    IGMtransmission is a Java graphical user interface that implements Monte Carlo simulations to compute the corrections to colors of high-redshift galaxies due to intergalactic attenuation based on current models of the Intergalactic Medium. The effects of absorption due to neutral hydrogen are considered, with particular attention to the stochastic effects of Lyman Limit Systems. Attenuation curves are produced, as well as colors for a wide range of filter responses and model galaxy spectra. Photometric filters are included for the Hubble Space Telescope, the Keck telescope, the Mt. Palomar 200-inch, the SUBARU telescope and UKIRT; alternative filter response curves and spectra may be readily uploaded.

  6. A bivariate optimal replacement policy with cumulative repair cost ...

    Indian Academy of Sciences (India)

    Min-Tsai Lai

Keywords: shock model; cumulative damage model; cumulative repair cost limit; preventive maintenance model. The system considered is subject to two types of shocks: one type is failure shock, and the other type is damage...

  7. Learning from uncertain curves

    DEFF Research Database (Denmark)

    Mallasto, Anton; Feragen, Aasa

    2017-01-01

    We introduce a novel framework for statistical analysis of populations of nondegenerate Gaussian processes (GPs), which are natural representations of uncertain curves. This allows inherent variation or uncertainty in function-valued data to be properly incorporated in the population analysis. Us...

  8. Power Curve Measurements

    DEFF Research Database (Denmark)

    Federici, Paolo; Kock, Carsten Weber

This report describes the power curve measurements performed with a nacelle LIDAR on a given wind turbine in a wind farm and during a chosen measurement period. The measurements and analysis are carried out in accordance with the guidelines in the procedure “DTU Wind Energy-E-0019” [1]. The reporting...

  9. Power Curve Measurements, FGW

    DEFF Research Database (Denmark)

    Vesth, Allan; Kock, Carsten Weber

The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out, see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of the power performance of the turbine.

  10. Power Curve Measurements

    DEFF Research Database (Denmark)

    Federici, Paolo; Vesth, Allan

The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of the power performance of the turbine.

  11. Power Curve Measurements

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Gómez Arranz, Paula

The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of the power performance of the turbine.

  12. Carbon Lorenz Curves

    NARCIS (Netherlands)

    Groot, L.F.M.|info:eu-repo/dai/nl/073642398

    2008-01-01

The purpose of this paper is twofold. First, it exhibits that standard tools in the measurement of income inequality, such as the Lorenz curve and the Gini-index, can successfully be applied to the issues of inequality measurement of carbon emissions and the equity of abatement policies across countries.

  13. The Axial Curve Rotator.

    Science.gov (United States)

    Hunter, Walter M.

This document contains detailed directions for constructing a device that mechanically produces the three-dimensional shape resulting from the rotation of any algebraic line or curve around either axis on the coordinate plane. The device was developed in response to student difficulty in visualizing, and thus grasping, the mathematical principles…

  14. Nacelle lidar power curve

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula; Wagner, Rozenn

This report describes the power curve measurements performed with a nacelle LIDAR on a given wind turbine in a wind farm and during a chosen measurement period. The measurements and analysis are carried out in accordance with the guidelines in the procedure “DTU Wind Energy-E-0019” [1]. The reporting...

  15. Power curve report

    DEFF Research Database (Denmark)

    Vesth, Allan; Kock, Carsten Weber

The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of the power performance of the turbine.

  16. Textbook Factor Demand Curves.

    Science.gov (United States)

    Davis, Joe C.

    1994-01-01

    Maintains that teachers and textbook graphics follow the same basic pattern in illustrating changes in demand curves when product prices increase. Asserts that the use of computer graphics will enable teachers to be more precise in their graphic presentation of price elasticity. (CFR)

  17. ECM using Edwards curves

    NARCIS (Netherlands)

    Bernstein, D.J.; Birkner, P.; Lange, T.; Peters, C.P.

    2013-01-01

    This paper introduces EECM-MPFQ, a fast implementation of the elliptic-curve method of factoring integers. EECM-MPFQ uses fewer modular multiplications than the well-known GMP-ECM software, takes less time than GMP-ECM, and finds more primes than GMP-ECM. The main improvements above the
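The unified addition law on Edwards curves, which underlies the multiplication-count savings mentioned above, can be sketched over a toy prime field. The modulus, curve coefficient d, and the point used below are illustrative values chosen by hand, not EECM-MPFQ parameters:

```python
def edwards_add(p1, p2, d, q):
    """Unified addition on the Edwards curve x^2 + y^2 = 1 + d*x^2*y^2 over GF(q).

    The same formula both doubles a point and adds distinct points, which is
    one reason Edwards arithmetic needs fewer special cases than the
    traditional Weierstrass form.
    """
    x1, y1 = p1
    x2, y2 = p2
    t = d * x1 * x2 * y1 * y2 % q
    x3 = (x1 * y2 + y1 * x2) * pow(1 + t, -1, q) % q   # modular inverse (Python 3.8+)
    y3 = (y1 * y2 - x1 * x2) * pow(1 - t, -1, q) % q
    return x3, y3

# Toy example over GF(101) with d = 32: P = (2, 2) lies on the curve
# (2^2 + 2^2 = 8 and 1 + 32*2^2*2^2 = 513 = 8 mod 101), and (0, 1) is
# the neutral element.
P = (2, 2)
assert edwards_add((0, 1), P, d=32, q=101) == P
```

The neutral element (0, 1) works without any special-case code, in contrast to the point at infinity on Weierstrass curves.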

  18. Power Curve Measurements FGW

    DEFF Research Database (Denmark)

    Federici, Paolo; Kock, Carsten Weber

The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of the power performance of the turbine.

  19. On interference of cumulative proton production mechanisms

    International Nuclear Information System (INIS)

    Braun, M.A.; Vechernin, V.V.

    1993-01-01

The dynamical picture of cumulative proton production in hA-collisions is considered by means of a diagram analysis, with the NN interaction described by a non-relativistic NN potential. The contributions of the various mechanisms (spectator, direct and rescattering) to backward-hemisphere proton production are calculated within the framework of this common approach. The emphasis is on the comparison of the relative contributions of these mechanisms for various angles, taking into account the interference of these contributions. Comparison with experimental data is also presented. (author)

  20. Preserved cumulative semantic interference despite amnesia

    Directory of Open Access Journals (Sweden)

    Gary Michael Oppenheim

    2015-05-01

As predicted by Oppenheim et al.'s (2010) implicit incremental learning account, WRP's BCN RTs demonstrated strong (and significant) repetition priming and semantic blocking effects (Figure 1). Similar to typical results from neurally intact undergraduates, WRP took longer to name pictures presented in semantically homogeneous blocks than in heterogeneous blocks, an effect that increased with each cycle. This result challenges accounts that ascribe cumulative semantic interference in this task to explicit memory mechanisms, instead suggesting that the effect has the sort of implicit learning bases that are typically spared in hippocampal amnesia.

  1. Is cumulated pyrethroid exposure associated with prediabetes?

    DEFF Research Database (Denmark)

    Hansen, Martin Rune; Jørs, Erik; Lander, Flemming

    2014-01-01

The aim was to investigate an association between exposure to pyrethroids and abnormal glucose regulation (prediabetes or diabetes). A cross-sectional study was performed among 116 pesticide sprayers from public vector control programs in Bolivia and 92 nonexposed controls. Pesticide exposure (duration, intensity… pyrethroids, a significant positive trend was observed between cumulative pesticide exposure (total number of hours sprayed) and the adjusted OR of abnormal glucose regulation, with OR 14.7 [0.9-235] in the third exposure quintile. The study found a severely increased prevalence of prediabetes among Bolivian…

  2. F(α) curves: Experimental results

    International Nuclear Information System (INIS)

    Glazier, J.A.; Gunaratne, G.; Libchaber, A.

    1988-01-01

We study the transition to chaos at the golden and silver means for forced Rayleigh-Bénard (RB) convection in mercury. We present f(α) curves below, at, and above the transition, and provide comparisons to the curves calculated for the one-dimensional circle map. We find good agreement at both the golden and silver means. This confirms our earlier observation that for low amplitude forcing, forced RB convection is well described by the one-dimensional circle map and indicates that the f(α) curve is a good measure of the approach to criticality. For selected subcritical experimental data sets we calculate the degree of subcriticality. We also present both experimental and calculated results for f(α) in the presence of a third frequency. Again we obtain agreement: the presence of random noise or a third frequency narrows the right-hand (negative q) side of the f(α) curve. Subcriticality results in symmetrically narrowed curves. We can also distinguish these cases by examining the power spectra and Poincaré sections of the time series.

  3. Incorporating Experience Curves in Appliance Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Garbesi, Karina; Chan, Peter; Greenblatt, Jeffery; Kantner, Colleen; Lekov, Alex; Meyers, Stephen; Rosenquist, Gregory; Buskirk, Robert Van; Yang, Hung-Chia; Desroches, Louis-Benoit

    2011-10-31

    The technical analyses in support of U.S. energy conservation standards for residential appliances and commercial equipment have typically assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. There is, however, considerable evidence that this assumption does not reflect real market prices. Costs and prices generally fall in relation to cumulative production, a phenomenon known as experience and modeled by a fairly robust empirical experience curve. Using price data from the Bureau of Labor Statistics, and shipment data obtained as part of the standards analysis process, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These allow us to develop more representative appliance price projections than the assumption-based approach of constant prices. These experience curves were incorporated into recent energy conservation standards for these products. The impact on the national modeling can be significant, often increasing the net present value of potential standard levels in the analysis. In some cases a previously cost-negative potential standard level demonstrates a benefit when incorporating experience. These results imply that past energy conservation standards analyses may have undervalued the economic benefits of potential standard levels.
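The experience-curve relationship described above — prices falling by a fixed fraction with each doubling of cumulative production — can be sketched in a few lines. The price, quantity, and 15% learning rate below are hypothetical illustrations, not values from the standards analysis:

```python
import math

def experience_price(p0, q0, q, learning_rate):
    """Project the price at cumulative production q from a known price p0 at q0.

    learning_rate is the fractional price drop per doubling of cumulative
    production, so the experience exponent is b = -log2(1 - learning_rate)
    and price follows p(q) = p0 * (q / q0) ** (-b).
    """
    b = -math.log2(1.0 - learning_rate)
    return p0 * (q / q0) ** (-b)

# Hypothetical appliance: $500 at 1M cumulative units, 15% learning rate.
# Two doublings of cumulative production multiply the price by 0.85 twice:
projected = experience_price(p0=500.0, q0=1e6, q=4e6, learning_rate=0.15)
# projected = 500 * 0.85**2, i.e. about $361
```

Replacing a constant-price assumption with such a projection is what shifts the net present value of a candidate standard level, as the abstract notes.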

  4. Smooth time-dependent receiver operating characteristic curve estimators.

    Science.gov (United States)

    Martínez-Camblor, Pablo; Pardo-Fernández, Juan Carlos

    2018-03-01

The receiver operating characteristic curve is a popular graphical method often used to study the diagnostic capacity of continuous (bio)markers. When the considered outcome is a time-dependent variable, two main extensions have been proposed: the cumulative/dynamic receiver operating characteristic curve and the incident/dynamic receiver operating characteristic curve. In both cases, the main problem for developing appropriate estimators is the estimation of the joint distribution of the variables time-to-event and marker. As usual, different approximations lead to different estimators. In this article, the authors explore the use of a bivariate kernel density estimator which accounts for censored observations in the sample and produces smooth estimators of the time-dependent receiver operating characteristic curves. The performance of the resulting cumulative/dynamic and incident/dynamic receiver operating characteristic curves is studied by means of Monte Carlo simulations. Additionally, the influence of the choice of the required smoothing parameters is explored. Finally, two real applications are considered. An R package is also provided as a complement to this article.
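A minimal empirical version of the cumulative/dynamic construction can be sketched as below. It deliberately ignores censoring, which the kernel estimators in the paper are designed to handle, so it is only an illustration of the case/control split at a time horizon:

```python
import numpy as np

def cumulative_dynamic_roc(times, marker, horizon):
    """Empirical cumulative/dynamic ROC at a time horizon t.

    Cases are subjects with event time <= t, controls those with event
    time > t; the curve traces (FPR, TPR) over all marker thresholds.
    NOTE: this toy version assumes fully observed (uncensored) event times.
    """
    times = np.asarray(times, dtype=float)
    marker = np.asarray(marker, dtype=float)
    cases = marker[times <= horizon]
    controls = marker[times > horizon]
    thresholds = np.sort(np.unique(marker))[::-1]   # high marker = high risk
    tpr = np.array([(cases >= c).mean() for c in thresholds])
    fpr = np.array([(controls >= c).mean() for c in thresholds])
    return fpr, tpr

# A marker that perfectly orders event times gives AUC(t) = 1:
fpr, tpr = cumulative_dynamic_roc([1, 2, 3, 4], [4, 3, 2, 1], horizon=2.5)
auc = float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2))   # trapezoid rule
```

With censoring, the case/control means above would be replaced by estimates of the joint distribution of time-to-event and marker, which is exactly the problem the smooth estimators address.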

  5. Chapter 19. Cumulative watershed effects and watershed analysis

    Science.gov (United States)

    Leslie M. Reid

    1998-01-01

Cumulative watershed effects are environmental changes that are affected by more than one land-use activity and that are influenced by processes involving the generation or transport of water. Almost all environmental changes are cumulative effects, and almost all land-use activities contribute to cumulative effects

  6. Original and cumulative prospect theory: a discussion of empirical differences

    NARCIS (Netherlands)

    Wakker, P.P.; Fennema, H.

    1997-01-01

    This note discusses differences between prospect theory and cumulative prospect theory. It shows that cumulative prospect theory is not merely a formal correction of some theoretical problems in prospect theory, but it also gives different predictions. Experiments are described that favor cumulative

  7. Implementation Learning and Forgetting Curve to Scheduling in Garment Industry

    Science.gov (United States)

    Muhamad Badri, Huda; Deros, Baba Md; Syahri, M.; Saleh, Chairul; Fitria, Aninda

    2016-02-01

The learning curve shows the relationship between time and the cumulative number of units produced, using a mathematical description of the performance of workers carrying out repetitive work. The problem addressed in this study is the difference in worker performance before and after a break, which affects the company's production scheduling. The study was conducted in the garment industry, with the aim of predicting the company's production scheduling using the learning curve and the forgetting curve. By applying the learning curve, the maximum output over the 3 productive hours before the break is 15 units, with a learning-curve percentage of 93.24%. Applying the forgetting curve, the maximum output over the 3 productive hours after the break is 11 units, with a forgetting-curve percentage of 92.96%. The resulting 26 units per day of productive working hours are then used as the basis for production scheduling.
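Wright's learning-curve model, the standard formulation behind such scheduling, can be sketched as follows. The 12-minute base unit time is a hypothetical value; the 93.24% learning percentage is the figure reported in the abstract:

```python
import math

def unit_time(t1, n, curve_pct):
    """Time for the n-th repetition under Wright's model T_n = T_1 * n**b,
    with b = log2(curve_pct): each doubling of cumulative output cuts the
    unit time to curve_pct of its previous value."""
    return t1 * n ** math.log2(curve_pct)

def units_within(t1, curve_pct, horizon):
    """Units completed before `horizon` minutes elapse (cumulative unit times)."""
    total, n = 0.0, 0
    while total + unit_time(t1, n + 1, curve_pct) <= horizon:
        total += unit_time(t1, n + 1, curve_pct)
        n += 1
    return n

# Hypothetical 12-minute first unit over a 180-minute block: the 93.24%
# learning curve yields more units than a flat (100%) curve would.
flat = units_within(12.0, 1.0, 180.0)        # 15 units
learned = units_within(12.0, 0.9324, 180.0)  # more than 15
```

A forgetting curve can be modeled the same way, restarting the count after the break from a slower first-unit time.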

  8. Vibrational Analysis of Curved Single-Walled Carbon Nanotube on a Pasternak Elastic Foundation

    DEFF Research Database (Denmark)

    Mehdipour, I.; Barari, Amin; Kimiaeifar, Amin

    2012-01-01

By utilizing He’s Energy Balance Method (HEBM), the relationships of the nonlinear amplitude and frequency were expressed for a curved, single-walled carbon nanotube. The amplitude-frequency response curves of the nonlinear free vibration were obtained for a curved, single-walled carbon nanotube embedded...

  9. Cumulative Environmental Management Association : Wood Buffalo Region

    International Nuclear Information System (INIS)

    Friesen, B.

    2001-01-01

The recently announced oil sands development of the Wood Buffalo Region in Alberta was the focus of this PowerPoint presentation. Both mining and in situ development is expected to total $26 billion and 2.6 million barrels per day of bitumen production. This paper described the economic, social and environmental challenges facing the resource development of this region. In addition to the proposed oil sands projects, this region will accommodate the needs of conventional oil and gas production, forestry, building of pipelines and power lines, municipal development, recreation, tourism, mining exploration and open cast mining. The Cumulative Environmental Management Association (CEMA) was inaugurated as a non-profit association in April 2000, and includes 41 members from all sectors. Its major role is to ensure a sustainable ecosystem and to avoid any cumulative impacts on wildlife. Other work underway includes the study of soil and plant species diversity, and the effects of air emissions on human health, wildlife and vegetation. The bioaccumulation of heavy metals and their impacts on surface water and fish is also under consideration to ensure the quality and quantity of surface water and ground water. 3 figs

  10. IDF-curves for precipitation In Belgium

    International Nuclear Information System (INIS)

    Mohymont, Bernard; Demarde, Gaston R.

    2004-01-01

The Intensity-Duration-Frequency (IDF) curves for precipitation constitute a relationship between the intensity, the duration and the frequency of rainfall amounts. The intensity of precipitation is expressed in mm/h, the duration or aggregation time is the length of the interval considered, while the frequency stands for the probability of occurrence of the event. IDF-curves constitute a classical and useful tool that is primarily used to dimension hydraulic structures in general, e.g. sewer systems, and which are consequently used to assess the risk of inundation. In this presentation, the IDF relation for precipitation is studied for different locations in Belgium. These locations correspond to two long-term, high-quality precipitation networks of the RMIB: (a) the daily precipitation depths of the climatological network (more than 200 stations, 1951-2001 baseline period); (b) the high-frequency 10-minutes precipitation depths of the hydrometeorological network (more than 30 stations, 15 to 33 years baseline period). For the station of Uccle, an uninterrupted time-series of more than one hundred years of 10-minutes rainfall data is available. The proposed technique for assessing the curves is based on maximum annual values of precipitation. A new analytical formula for the IDF-curves was developed such that these curves stay valid for aggregation times ranging from 10 minutes to 30 days (when fitted with appropriate data). Moreover, all parameters of this formula have physical dimensions. Finally, adequate spatial interpolation techniques are used to provide nationwide extreme values of precipitation depths for short- to long-term durations with a given return period. These values are estimated on the grid points of the Belgian ALADIN-domain used in the operational weather forecasts at the RMIB. (Author)
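The general shape of an IDF relationship — intensity increasing with return period and decreasing with duration — can be illustrated with a classical empirical form. All four coefficients below are purely illustrative placeholders, not the RMIB fit described in the abstract:

```python
def idf_intensity(duration_h, return_period_y, a=30.0, m=0.2, b=0.3, n=0.8):
    """Classical empirical IDF form i = a * T**m / (d + b)**n, with intensity
    i in mm/h, duration d in hours, and return period T in years.
    The coefficients a, m, b, n here are illustrative values only."""
    return a * return_period_y ** m / (duration_h + b) ** n

# Intensity falls with duration and rises with return period:
i_short_rare = idf_intensity(duration_h=10 / 60, return_period_y=100)
i_long_rare = idf_intensity(duration_h=24.0, return_period_y=100)
i_short_common = idf_intensity(duration_h=10 / 60, return_period_y=2)
```

A fitted version of such a formula is what lets a designer read off, say, the 10-minute intensity with a 20-year return period when dimensioning a sewer system.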

  11. Model-checking techniques based on cumulative residuals.

    Science.gov (United States)

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
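The basic object — the cumulative sum of residuals over an ordered coordinate, compared against realizations under the null — can be sketched as below. Random sign-flipping of the residuals is used here as a simple stand-in for the zero-mean Gaussian-process simulation the paper describes:

```python
import numpy as np

def cusum_process(x, residuals):
    """Observed cumulative-sum-of-residuals process, ordered by covariate x."""
    return np.cumsum(np.asarray(residuals, dtype=float)[np.argsort(x)])

def sup_cusum_pvalue(x, residuals, n_sim=999, seed=0):
    """Monte Carlo p-value for the supremum statistic sup|W| under random
    sign flips of the residuals -- a crude surrogate for comparing the
    observed process with simulated null realizations."""
    rng = np.random.default_rng(seed)
    residuals = np.asarray(residuals, dtype=float)
    observed = np.abs(cusum_process(x, residuals)).max()
    exceed = 0
    for _ in range(n_sim):
        flipped = residuals * rng.choice([-1.0, 1.0], size=residuals.size)
        if np.abs(cusum_process(x, flipped)).max() >= observed:
            exceed += 1
    return (exceed + 1) / (n_sim + 1)
```

A systematic trend in the residuals against the covariate inflates the observed supremum relative to the simulated ones, yielding a small p-value; natural variation does not.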

  12. Cumulative exposure to carbon monoxide during the day

    Energy Technology Data Exchange (ETDEWEB)

    Joumard, R. (INRETS, 69 - Bron (FR))

Carbon monoxide (CO) has the advantage of being very easily and accurately measured under various conditions. In addition, it allows the translation of CO concentrations into their biological effects. The cumulative CO exposure should be considered according to current environment conditions during a given period of life, e.g. the day. In addition, the translation of concentrations and exposure times of CO fixed on blood haemoglobin (carboxyhaemoglobin) depends on physiological factors such as age, size, sex, or physical activity. This paper gives some examples of CO exposure translated into curves of carboxyhaemoglobin: the case of 92 persons whose schedules were studied in detail, of customs officers whose exposure was measured during one week, and other theoretical cases. In all the cases studied, smoking is by far the first factor of pollution by carbon monoxide. Setting smoking aside, the CO contents observed are of concern for sensitive subjects (in particular children) only in very rare cases. Furthermore, this approach allows the assessment of maximal allowable concentrations during specific exposures (e.g. work in a tunnel) by integrating them into normal life conditions and the population's current exposure.

  13. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...

  14. Carbon Lorenz Curves

    Energy Technology Data Exchange (ETDEWEB)

    Groot, L. [Utrecht University, Utrecht School of Economics, Janskerkhof 12, 3512 BL Utrecht (Netherlands)

    2008-11-15

    The purpose of this paper is twofold. First, it exhibits that standard tools in the measurement of income inequality, such as the Lorenz curve and the Gini-index, can successfully be applied to the issues of inequality measurement of carbon emissions and the equity of abatement policies across countries. These tools allow policy-makers and the general public to grasp at a single glance the impact of conventional distribution rules such as equal caps or grandfathering, or more sophisticated ones, on the distribution of greenhouse gas emissions. Second, using the Samuelson rule for the optimal provision of a public good, the Pareto-optimal distribution of carbon emissions is compared with the distribution that follows if countries follow Nash-Cournot abatement strategies. It is shown that the Pareto-optimal distribution under the Samuelson rule can be approximated by the equal cap division, represented by the diagonal in the Lorenz curve diagram.
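Applying the Lorenz curve and Gini index to per-country emissions, as the paper proposes, takes only a few lines. The emission figures in the example are made up for illustration:

```python
import numpy as np

def lorenz_points(emissions):
    """Cumulative population share vs. cumulative emission share, with
    countries sorted from lowest to highest emitter (equal weights)."""
    e = np.sort(np.asarray(emissions, dtype=float))
    pop_share = np.linspace(0.0, 1.0, e.size + 1)
    emission_share = np.concatenate(([0.0], np.cumsum(e) / e.sum()))
    return pop_share, emission_share

def gini(emissions):
    """Gini index = 1 - 2 * (area under the Lorenz curve), via trapezoids."""
    pop, share = lorenz_points(emissions)
    area = np.sum(np.diff(pop) * (share[1:] + share[:-1]) / 2.0)
    return 1.0 - 2.0 * float(area)

# An 'equal caps' world gives a Gini of 0; concentration pushes it toward 1.
equal = gini([5.0, 5.0, 5.0, 5.0])       # 0.0
skewed = gini([1.0, 2.0, 8.0, 29.0])     # strictly between 0 and 1
```

This is why the equal-cap division appears as the diagonal in the Lorenz curve diagram: every cumulative population share holds exactly the same cumulative emission share.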

  15. Dynamics of curved fronts

    CERN Document Server

    Pelce, Pierre

    1989-01-01

In recent years, much progress has been made in the understanding of interface dynamics of various systems: hydrodynamics, crystal growth, chemical reactions, and combustion. Dynamics of Curved Fronts is an important contribution to this field and will be an indispensable reference work for researchers and graduate students in physics, applied mathematics, and chemical engineering. The book consists of a 100-page introduction by the editor and 33 seminal articles from various disciplines.

  16. International Wage Curves

    OpenAIRE

    David G. Blanchflower; Andrew J. Oswald

    1992-01-01

The paper provides evidence for the existence of a negatively sloped locus linking the level of pay to the rate of regional (or industry) unemployment. This "wage curve" is estimated using microeconomic data for Britain, the US, Canada, Korea, Austria, Italy, Holland, Switzerland, Norway, and Germany. The average unemployment elasticity of pay is approximately -0.1. The paper sets out a multi-region efficiency wage model and argues that its predictions are consistent with the data.

  17. Anatomical curve identification

    Science.gov (United States)

    Bowman, Adrian W.; Katina, Stanislav; Smith, Joanna; Brown, Denise

    2015-01-01

Methods for capturing images in three dimensions are now widely available, with stereo-photogrammetry and laser scanning being two common approaches. In anatomical studies, a number of landmarks are usually identified manually from each of these images and these form the basis of subsequent statistical analysis. However, landmarks express only a very small proportion of the information available from the images. Anatomically defined curves have the advantage of providing a much richer expression of shape. This is explored in the context of identifying the boundary of breasts from an image of the female torso and the boundary of the lips from a facial image. The curves of interest are characterised by ridges or valleys. Key issues in estimation are the ability to navigate across the anatomical surface in three dimensions, the ability to recognise the relevant boundary and the need to assess the evidence for the presence of the surface feature of interest. The first issue is addressed by the use of principal curves, as an extension of principal components, the second by suitable assessment of curvature and the third by change-point detection. P-spline smoothing is used as an integral part of the methods but adaptations are made to the specific anatomical features of interest. After estimation of the boundary curves, the intermediate surfaces of the anatomical feature of interest can be characterised by surface interpolation. This allows shape variation to be explored using standard methods such as principal components. These tools are applied to a collection of images of women where one breast has been reconstructed after mastectomy and where interest lies in shape differences between the reconstructed and unreconstructed breasts. They are also applied to a collection of lip images where possible differences in shape between males and females are of interest. PMID:26041943

  18. Estimating Corporate Yield Curves

    OpenAIRE

    Antionio Diaz; Frank Skinner

    2001-01-01

    This paper represents the first study of retail deposit spreads of UK financial institutions using stochastic interest rate modelling and the market comparable approach. By replicating quoted fixed deposit rates using the Black Derman and Toy (1990) stochastic interest rate model, we find that the spread between fixed and variable rates of interest can be modeled (and priced) using an interest rate swap analogy. We also find that we can estimate an individual bank deposit yield curve as a spr...

  19. LCC: Light Curves Classifier

    Science.gov (United States)

    Vo, Martin

    2017-08-01

Light Curves Classifier uses data mining and machine learning to obtain and classify desired objects. This task can be accomplished by attributes of light curves or any time series, including shapes, histograms, or variograms, or by other available information about the inspected objects, such as color indices, temperatures, and abundances. After specifying features which describe the objects to be searched, the software trains on a given training sample, and can then be used for unsupervised clustering for visualizing the natural separation of the sample. The package can also be used for automatic tuning of the parameters of the methods used (for example, the number of hidden neurons or the binning ratio). Trained classifiers can be used for filtering outputs from astronomical databases or data stored locally. The Light Curve Classifier can also be used for simple downloading of light curves and all available information on queried stars. It can natively connect to OgleII, OgleIII, ASAS, CoRoT, Kepler, Catalina and MACHO, and new connectors or descriptors can be implemented. In addition to direct usage of the package and a command-line UI, the program can be used through a web interface. Users can create jobs for "training" methods on given objects, querying databases and filtering outputs by trained filters. Preimplemented descriptors, classifiers and connectors can be picked by simple clicks and their parameters can be tuned by giving ranges of these values. All combinations are then calculated and the best one is used for creating the filter. Natural separation of the data can be visualized by unsupervised clustering.

  20. Evolution model with a cumulative feedback coupling

    Science.gov (United States)

    Trimper, Steffen; Zabrocki, Knud; Schulz, Michael

    2002-05-01

The paper is concerned with a toy model that generalizes the standard Lotka-Volterra equation for a certain population by introducing a competition between instantaneous and accumulative, history-dependent nonlinear feedback, the origin of which could be a contribution from any kind of mismanagement in the past. The results depend on the sign of that additional cumulative loss or gain term of strength λ. In case of a positive coupling the system offers a maximum gain achieved after a finite time but the population will die out in the long time limit. In this case the instantaneous loss term of strength u is irrelevant and the model exhibits an exact solution. In the opposite case λ<0 the time evolution of the system is terminated in a crash after a finite time t_s, provided u=0. This singularity after a finite time can be avoided if u≠0. The approach may well be of relevance for the qualitative understanding of more realistic descriptions.
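One plausible reading of such a model — a logistic loss term of strength u plus a history-dependent term of strength λ acting on the time-integral of the population — can be integrated numerically. The equation form and all parameter values below are assumptions for illustration, not the paper's exact model:

```python
import numpy as np

def cumulative_lv(lam, u, n0=0.1, r=1.0, dt=1e-3, t_max=25.0):
    """Euler integration of dn/dt = n * (r - u*n - lam*C(t)), where C(t) is
    the running time-integral of n (the cumulative feedback term).
    Stops early if the solution blows up (the crash regime, lam < 0, u = 0)."""
    steps = round(t_max / dt)
    traj = np.empty(steps)
    n, cum = float(n0), 0.0
    for i in range(steps):
        traj[i] = n
        cum += n * dt                      # accumulate the history term
        n += dt * n * (r - u * n - lam * cum)
        if not np.isfinite(n) or n > 1e12:
            return traj[: i + 1]           # crashed in finite time
    return traj

# Positive coupling lam > 0: the population peaks at a finite time,
# then dies out in the long-time limit, as the abstract describes.
pop = cumulative_lv(lam=0.5, u=0.0)
```

With λ < 0 and u = 0, the cumulative term feeds back as a gain and the same integrator terminates early in a finite-time blow-up, mirroring the crash at t_s described above.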

  1. Psychometric properties of the Cumulated Ambulation Score

    DEFF Research Database (Denmark)

    Ferriero, Giorgio; Kristensen, Morten T; Invernizzi, Marco

    2018-01-01

    INTRODUCTION: In the geriatric population, independent mobility is a key factor in determining readiness for discharge following acute hospitalization. The Cumulated Ambulation Score (CAS) is a potentially valuable score that allows day-to-day measurement of basic mobility. The CAS was developed and validated in older patients with hip fracture as an early postoperative predictor of short-term outcome, but it is also used to assess geriatric in-patients with acute medical illness. Despite the fast-accumulating literature on the CAS, to date no systematic review synthesizing its psychometric properties has been published. Of 49 studies identified, 17 examined the psychometric properties of the CAS. EVIDENCE SYNTHESIS: Most papers dealt with patients after hip fracture surgery, and only 4 studies assessed the CAS psychometric characteristics also in geriatric in-patients with acute medical illness. Two versions of CAS…

  2. Uniformization of elliptic curves

    OpenAIRE

    Ülkem, Özge; Ulkem, Ozge

    2015-01-01

    Every elliptic curve E defined over C is analytically isomorphic to C*/q^Z for some q ∈ C*. Similarly, Tate has shown that if E is defined over a p-adic field K, then E is analytically isomorphic to K*/q^Z for some q ∈ K*. Further, the isomorphism E(K̄) ≅ K̄*/q^Z respects the action of the Galois group G_{K̄/K}, where K̄ is the algebraic closure of K. I will explain the construction of this isomorphism.

  3. Roc curves for continuous data

    CERN Document Server

    Krzanowski, Wojtek J

    2009-01-01

    Since ROC curves have become ubiquitous in many application areas, the various advances have been scattered across disparate articles and texts. ROC Curves for Continuous Data is the first book solely devoted to the subject, bringing together all the relevant material to provide a clear understanding of how to analyze ROC curves. The fundamental theory of ROC curves: the book first discusses the relationship between the ROC curve and numerous performance measures and then extends the theory into practice by describing how ROC curves are estimated. Further building on the theory, the authors prese

  4. Curves and graphs

    International Nuclear Information System (INIS)

    Bitelli, T.

    1976-01-01

    The study of a function f(x) in a Cartesian system is made, and guidance is given on the least-squares method. The use of logarithmic and dilogarithmic systems is analysed. In the graphic representation, the linear diagram, histograms and the cumulative frequency polygon are analysed. Throughout the study, several examples applied to nuclear medicine are given [pt

  5. Cumulative radiation dose of multiple trauma patients during their hospitalization

    International Nuclear Information System (INIS)

    Wang Zhikang; Sun Jianzhong; Zhao Zudan

    2012-01-01

    Objective: To study the cumulative radiation dose of multiple trauma patients during their hospitalization and to analyze the factors influencing dose. Methods: The DLP values for CT and DR were retrospectively collected for patients treated between June 2009 and April 2011 at a university-affiliated hospital. The cumulative radiation doses were calculated by summing the typical effective doses of the anatomic regions scanned. Results: The cumulative radiation doses of 113 patients were collected. The maximum, minimum and mean cumulative effective doses were 153.3 mSv, 16.48 mSv and (52.3 ± 26.6) mSv, respectively. Conclusions: Multiple trauma patients have high cumulative radiation exposure. Therefore, the management of cumulative radiation doses should be enhanced. Establishing individualized radiation exposure archives will help clinicians and technicians decide whether to image again and how to select the imaging parameters. (authors)

  6. Modelling the effect of non-uniform radon progeny activities on transformation frequencies in human bronchial airways

    International Nuclear Information System (INIS)

    Fakir, H.; Hofmann, W.; Aubineau-Laniece, I.

    2006-01-01

    The effects of radiological and morphological source heterogeneities in straight and Y-shaped bronchial airways on hit frequencies and microdosimetric quantities in epithelial cells have been investigated previously. The goal of the present study is to relate these physical quantities to transformation frequencies in sensitive target cells and to radon-induced lung cancer risk. Based on an effect-specific track length model, computed linear energy transfer (LET) spectra were converted to corresponding transformation frequencies for different activity distributions and source-target configurations. Average transformation probabilities were considerably enhanced for radon progeny accumulations and target cells at the carinal ridge, relative to uniform activity distributions and target cells located along the curved and straight airway portions at the same exposure level. Although uncorrelated transformation probabilities produce a linear dose-effect relationship, correlated transformations first increase, depending on the LET, but then decrease significantly when a defined number of hits or cumulative exposure level is exceeded. (authors)

  7. Low Birth Weight, Cumulative Obesity Dose, and the Risk of Incident Type 2 Diabetes

    OpenAIRE

    Feng, Cindy; Osgood, Nathaniel D.; Dyck, Roland F.

    2018-01-01

    Background. Obesity history may provide a better understanding of the contribution of obesity to T2DM risk. Methods. 17,634 participants from the 1958 National Child Development Study were followed from birth to 50 years. Cumulative obesity dose, a measure of obesity history, was calculated by subtracting the upper cut-off of the normal BMI from the actual BMI at each follow-up and summing the areas under the obesity dose curve. Hazard ratios (HRs) for diabetes were calculated using Cox regre...
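
    The cumulative obesity dose described above (BMI excess above the upper normal cutoff, integrated over follow-ups) can be sketched as follows; the trapezoidal integration between visits and the cutoff of 25 kg/m² are assumptions for illustration, not details taken from the paper.

    ```python
    def cumulative_obesity_dose(ages, bmis, bmi_cutoff=25.0):
        """Area (in BMI-years) under the curve of BMI excess above the
        normal cutoff, accumulated across follow-up visits."""
        excess = [max(0.0, b - bmi_cutoff) for b in bmis]
        dose = 0.0
        for i in range(1, len(ages)):
            # trapezoid between consecutive follow-ups
            dose += 0.5 * (excess[i - 1] + excess[i]) * (ages[i] - ages[i - 1])
        return dose
    ```

    For example, a subject measured at ages 23, 33 and 42 with BMIs 24.0, 27.0 and 30.0 accumulates a dose of 41.5 BMI-years.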

  8. 7 CFR 42.132 - Determining cumulative sum values.

    Science.gov (United States)

    2010-01-01

    (a) The parameters for the on-line cumulative sum sampling plans for AQL's... (b) At the beginning of the basic inspection period, the CuSum value is set equal to...
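
    The CuSum bookkeeping the regulation describes can be illustrated with a generic one-sided CUSUM accumulator; the reset-at-zero rule and the parameters k (allowance) and h (action limit) below are illustrative conventions from standard CUSUM charts, not the values or procedure of 7 CFR 42.132.

    ```python
    def cusum_signal(counts, k, h):
        """Generic one-sided CUSUM: accumulate defects in excess of the
        allowance k; return the index of the first sample at which the
        running sum exceeds the action limit h, or None if it never does."""
        s = 0.0  # CuSum value at the start of the inspection period
        for i, x in enumerate(counts):
            s = max(0.0, s + x - k)  # never let the sum go negative
            if s > h:
                return i
        return None
    ```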

  9. Improving cumulative effects assessment in Alberta: Regional strategic assessment

    International Nuclear Information System (INIS)

    Johnson, Dallas; Lalonde, Kim; McEachern, Menzie; Kenney, John; Mendoza, Gustavo; Buffin, Andrew; Rich, Kate

    2011-01-01

    The Government of Alberta, Canada is developing a regulatory framework to better manage cumulative environmental effects from development in the province. A key component of this effort is regional planning, which will lay the primary foundation for cumulative effects management into the future. Alberta Environment has considered the information needs of regional planning and has concluded that Regional Strategic Assessment may offer significant advantages if integrated into the planning process, including the overall improvement of cumulative environmental effects assessment in the province.

  10. Children neglected: Where cumulative risk theory fails.

    Science.gov (United States)

    O'Hara, Mandy; Legano, Lori; Homel, Peter; Walker-Descartes, Ingrid; Rojas, Mary; Laraque, Danielle

    2015-07-01

    Neglected children, by far the majority of children maltreated, experience an environment most deficient in cognitive stimulation and language exchange. When physical abuse co-occurs with neglect, there is more stimulation through negative parent-child interaction, which may lead to better cognitive outcomes, contrary to Cumulative Risk Theory. The purpose of the current study was to assess whether children only neglected perform worse on cognitive tasks than children neglected and physically abused. Utilizing LONGSCAN archived data, 271 children only neglected and 101 children neglected and physically abused in the first four years of life were compared. The two groups were assessed at age 6 on the WPPSI-R vocabulary and block design subtests, correlates of cognitive intelligence. Regression analyses were performed, controlling for additional predictors of poor cognitive outcome, including socioeconomic variables and caregiver depression. Children only neglected scored significantly worse than children neglected and abused on the WPPSI-R vocabulary subtest (p=0.03). The groups did not differ on the block design subtest (p=0.4). This study shows that for neglected children, additional abuse may not additively accumulate risk when considering intelligence outcomes. Children experiencing only neglect may need to be referred for services that address cognitive development, with emphasis on the linguistic environment, in order to best support the developmental challenges of neglected children. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Standardization of the cumulative absolute velocity

    International Nuclear Information System (INIS)

    O'Hara, T.F.; Jacobson, J.P.

    1991-12-01

    EPRI NP-5930, ''A Criterion for Determining Exceedance of the Operating Basis Earthquake,'' was published in July 1988. As defined in that report, the Operating Basis Earthquake (OBE) is exceeded when both a response spectrum parameter and a second damage parameter, referred to as the Cumulative Absolute Velocity (CAV), are exceeded. In the review process of the above report, it was noted that the calculation of CAV could be confounded by time history records of long duration containing low (nondamaging) accelerations. Therefore, it is necessary to standardize the method of calculating CAV to account for record length. This standardized methodology allows consistent comparisons between future CAV calculations and the adjusted CAV threshold value based upon applying the standardized methodology to the data set presented in EPRI NP-5930. The recommended method to standardize the CAV calculation is to window its calculation on a second-by-second basis for a given time history: a one-second interval contributes to CAV only if the absolute acceleration exceeds 0.025g at some time during that interval. The earthquake records used in EPRI NP-5930 have been reanalyzed, and the adjusted threshold of damage for CAV was found to be 0.16 g-sec
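
    A sketch of this windowed ("standardized") CAV, assuming a uniformly sampled acceleration record in units of g:

    ```python
    def standardized_cav(accel_g, dt, window_s=1.0, threshold_g=0.025):
        """Sum |a|*dt over one-second windows, counting only windows whose
        peak absolute acceleration exceeds the threshold. Returns g-sec."""
        n_per_win = max(1, int(round(window_s / dt)))
        cav = 0.0
        for start in range(0, len(accel_g), n_per_win):
            window = accel_g[start:start + n_per_win]
            if max(abs(a) for a in window) > threshold_g:
                cav += sum(abs(a) for a in window) * dt
        return cav
    ```

    Windows of low-level shaking are thereby discarded, so a long record of nondamaging motion no longer inflates the CAV.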

  12. Curved Josephson junction

    International Nuclear Information System (INIS)

    Dobrowolski, Tomasz

    2012-01-01

    A Josephson junction of constant curvature, in one and quasi-one dimensions, is considered. On the basis of the Maxwell equations, a sine–Gordon equation describing the influence of curvature on kink motion was obtained. It is shown that the method of geometrical reduction of the sine–Gordon model from three dimensions to a lower-dimensional manifold leads to an identical form of the sine–Gordon equation. - Highlights: ► The dynamics of the phase in a curved Josephson junction is investigated. ► Geometrical reduction is applied to the sine–Gordon model. ► The results of geometrical reduction and of the fundamental derivation are compared.

  13. Analysis of Memory Codes and Cumulative Rehearsal in Observational Learning

    Science.gov (United States)

    Bandura, Albert; And Others

    1974-01-01

    The present study examined the influence of memory codes varying in meaningfulness and retrievability and cumulative rehearsal on retention of observationally learned responses over increasing temporal intervals. (Editor)

  14. Curved-Duct

    Directory of Open Access Journals (Sweden)

    Je Hyun Baekt

    2000-01-01

    Full Text Available A numerical study is conducted on the fully-developed laminar flow of an incompressible viscous fluid in a square duct rotating about an axis perpendicular to the axial direction of the duct. In a straight duct, the rotation produces vortices due to the Coriolis force. Generally two vortex cells are formed, and the axial velocity distribution is distorted by the effect of this Coriolis force. When the convective force is weak, two counter-rotating vortices with a quasi-parabolic axial velocity profile appear at weak rotation rates. As the rotation rate increases, the axial velocity on the vertical centreline of the duct begins to flatten and the vorticity centers move toward the wall under the effect of the Coriolis force. When the convective inertia force is strong, a double-vortex secondary flow appears in the transverse planes of the duct at weak rotation rates, but as the speed of rotation increases the secondary flow splits into an asymmetric configuration of four counter-rotating vortices. If the rotation rate is increased further, the secondary flow restabilizes to a slightly asymmetric double-vortex configuration. A numerical study is also conducted on the laminar flow of an incompressible viscous fluid in a 90°-bend square duct that rotates about an axis parallel to the axial direction of the inlet. In the 90°-bend duct, the flow is shaped by both the Coriolis and the centrifugal force: the secondary flow is driven by the centrifugal force in the curved region and by the Coriolis force in the downstream region, since each force dominates in its respective region.

  15. Elliptic curves for applications (Tutorial)

    NARCIS (Netherlands)

    Lange, T.; Bernstein, D.J.; Chatterjee, S.

    2011-01-01

    More than 25 years ago, elliptic curves over finite fields were suggested as a group in which the Discrete Logarithm Problem (DLP) can be hard. Since then many researchers have scrutinized the security of the DLP on elliptic curves with the result that for suitably chosen curves only exponential

  16. Titration Curves: Fact and Fiction.

    Science.gov (United States)

    Chamberlain, John

    1997-01-01

    Discusses ways in which datalogging equipment can enable titration curves to be measured accurately and how computing power can be used to predict the shape of curves. Highlights include sources of error, use of spreadsheets to generate titration curves, titration of a weak acid with a strong alkali, dibasic acids, weak acid and weak base, and…

  17. Peak oil analyzed with a logistic function and idealized Hubbert curve

    International Nuclear Information System (INIS)

    Gallagher, Brian

    2011-01-01

    A logistic function is used to characterize peak and ultimate production of global crude oil and petroleum-derived liquid fuels. Annual oil production data were incrementally summed to construct a logistic curve in its initial phase. Using a curve-fitting approach, a population-growth logistic function was applied to complete the cumulative production curve. The simulated curve was then deconstructed into a set of annual oil production data producing an 'idealized' Hubbert curve. An idealized Hubbert curve (IHC) is defined as having properties of production data resulting from a constant growth-rate under fixed resource limits. An IHC represents a potential production curve constructed from cumulative production data and provides a new perspective for estimating peak production periods and remaining resources. The IHC model data show that idealized peak oil production occurred in 2009 at 83.2 Mb/d (30.4 Gb/y). IHC simulations of truncated historical oil production data produced similar results and indicate that this methodology can be useful as a prediction tool. - Research Highlights: →Global oil production data were analyzed by a simple curve fitting method. →Best fit-curve results were obtained using two logistic functions on select data. →A broad potential oil production peak is forecast for the years from 2004 to 2014. →Similar results were obtained using historical data from about 10 to 30 years ago. →Two potential oil production decline scenarios were presented and compared.
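
    The construction the abstract describes — represent cumulative production with a logistic function, then deconstruct it into annual production to obtain an idealized Hubbert curve — can be sketched as follows. The parameters (ultimate recovery K, growth rate r, midpoint year t0) are illustrative, not the paper's fitted values; an actual fit to production data would use a curve-fitting routine such as scipy.optimize.curve_fit.

    ```python
    import math

    def logistic_cumulative(t, K, r, t0):
        """Cumulative production Q(t) following a population-growth logistic."""
        return K / (1.0 + math.exp(-r * (t - t0)))

    def idealized_hubbert(years, K, r, t0):
        """Deconstruct the cumulative logistic into annual production data."""
        return [logistic_cumulative(y, K, r, t0) -
                logistic_cumulative(y - 1, K, r, t0) for y in years]

    years = list(range(1900, 2101))
    annual = idealized_hubbert(years, K=2400.0, r=0.05, t0=2009.0)  # K in Gb
    peak_year = years[annual.index(max(annual))]  # peak of the Hubbert curve
    ```

    By construction, the annual-production curve peaks at the logistic midpoint t0, which is how the IHC locates the idealized peak production period.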

  18. Consideration of uncertainties in CCDF risk curves in safety oriented decision making processes

    International Nuclear Information System (INIS)

    Stern, E.; Tadmor, J.

    1988-01-01

    In recent years, some of the results of Probabilistic Risk Assessment (i.e. the magnitudes of the various adverse health effects and other effects of potential accidents in nuclear power plants) have usually been presented in Complementary Cumulative Distribution Function curves, widely known as CCDF risk curves. CCDF curves are characteristic of probabilistic accident analyses and consequence calculations, although, in many cases, the codes producing the CCDF curves consist of a mixture of both probabilistic and deterministic calculations. One of the main difficulties in the process of PRA is the problem of uncertainties associated with the risk assessments. The uncertainties, as expressed in CCDF risk curves, can be classified into two main categories: (a) uncertainties expressed by the CCDF risk curve itself due to its probabilistic nature and (b) the uncertainty band of CCDF risk curves. The band consists of a ''family of CCDF curves'' which represents the risks (e.g. early/late fatalities) evaluated at various levels of confidence for a specific Plant-Site Combination (PSC), i.e. a certain nuclear power plant located at a certain site. The reasons why a family of curves rather than a single curve represents the risk of a certain PSC have been discussed. Generally, the uncertainty band of CCDF curves is limited by the 95% (''conservative'') and the 5% curves. In most cases the 50% (median, ''best estimate'') curve is also shown because scientists tend to believe that it represents the ''realistic'' (or real) risk of the plant.
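
    The empirical CCDF underlying such risk curves — the fraction of simulated outcomes whose consequence exceeds a given magnitude — can be sketched as:

    ```python
    def empirical_ccdf(samples, x):
        """Complementary cumulative distribution: estimate P(consequence > x)
        from a list of simulated consequence samples."""
        return sum(1 for s in samples if s > x) / len(samples)
    ```

    Evaluating this at a grid of consequence magnitudes traces one CCDF risk curve; repeating it over resampled input-parameter sets would trace the family of curves that forms the uncertainty band.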

  19. Several problems of cumulative effective mass fraction in anti-seismic analysis

    International Nuclear Information System (INIS)

    Wang Wei; Sheng Feng; Li Hailong; Wen Jing; Luan Lin

    2005-01-01

    The Cumulative Effective Mass Fraction (CEMF) is one of the important quantities that indicate the accuracy of an anti-seismic analysis. Based on the primary theory of CEMF, the paper shows the influence of CEMF on the accuracy of anti-seismic analysis. Moreover, some advice and methods are given for solving common problems in anti-seismic analysis, such as how to increase the CEMF, how to avoid loss of mass when a torsional frequency lies close to the frequency corresponding to the peak of the seismic response spectrum, how to avoid loss of mass due to the constraints, and so on. (authors)

  20. A Journey Between Two Curves

    Directory of Open Access Journals (Sweden)

    Sergey A. Cherkis

    2007-03-01

    Full Text Available A typical solution of an integrable system is described in terms of a holomorphic curve and a line bundle over it. The curve provides the action variables while the time evolution is a linear flow on the curve's Jacobian. Even though the system of Nahm equations is closely related to the Hitchin system, the curves appearing in these two cases have very different nature. The former can be described in terms of some classical scattering problem while the latter provides a solution to some Seiberg-Witten gauge theory. This note identifies the setup in which one can formulate the question of relating the two curves.

  1. Cumulative Effect of Depression on Dementia Risk

    Directory of Open Access Journals (Sweden)

    J. Olazarán

    2013-01-01

    Full Text Available Objective. To analyze a potential cumulative effect of life-time depression on dementia and Alzheimer's disease (AD), with control of vascular factors (VFs). Methods. This study was a subanalysis of the Neurological Disorders in Central Spain (NEDICES) study. Past and present depression, VFs, dementia status, and dementia due to AD were documented at study inception. Dementia status was also documented after three years. Four groups were created according to baseline data: never depression (nD), past depression (pD), present depression (prD), and present and past depression (prpD). Logistic regression was used. Results. Data of 1,807 subjects were investigated at baseline (mean age 74.3, 59.3% women), and 1,376 (81.6%) subjects were evaluated after three years. The prevalence of dementia at baseline was 6.7%, and dementia incidence was 6.3%. An effect of depression was observed on dementia prevalence (OR [95% CI] 1.84 [1.01–3.35] for prD and 2.73 [1.08–6.87] for prpD) and on dementia due to AD (OR 1.98 [0.98–3.99] for prD and OR 3.98 [1.48–10.71] for prpD) (fully adjusted models, nD as reference). Depression did not influence dementia incidence. Conclusions. Present depression and, particularly, present and past depression are associated with dementia at old age. Multiple mechanisms, including a toxic effect of depression on hippocampal neurons, plausibly explain these associations.

  2. Quantitative cumulative biodistribution of antibodies in mice

    Science.gov (United States)

    Yip, Victor; Palma, Enzo; Tesar, Devin B; Mundo, Eduardo E; Bumbaca, Daniela; Torres, Elizabeth K; Reyes, Noe A; Shen, Ben Q; Fielder, Paul J; Prabhu, Saileta; Khawli, Leslie A; Boswell, C Andrew

    2014-01-01

    The neonatal Fc receptor (FcRn) plays an important and well-known role in antibody recycling in endothelial and hematopoietic cells, and thus it influences the systemic pharmacokinetics (PK) of immunoglobulin G (IgG). However, considerably less is known about FcRn's role in the metabolism of IgG within individual tissues after intravenous administration. To elucidate the organ distribution and gain insight into the metabolism of humanized IgG1 antibodies with different binding affinities for FcRn, comparative biodistribution studies in normal CD-1 mice were conducted. Here, we generated variants of a herpes simplex virus glycoprotein D-specific antibody (humanized anti-gD) with increased and decreased FcRn binding affinity by genetic engineering without affecting antigen specificity. These antibodies were expressed in Chinese hamster ovary cell lines, purified, and radiolabeled with iodine-125 and indium-111. Equal amounts of I-125-labeled and In-111-labeled antibodies were mixed and intravenously administered into mice at 5 mg/kg. This approach allowed us to measure both the real-time IgG uptake (I-125) and the cumulative uptake of IgG and catabolites (In-111) in individual tissues up to 1 week post-injection. The PK and distribution of the wild-type IgG and the variant with enhanced binding for FcRn were largely similar to each other, but vastly different from those of the rapidly cleared low-FcRn-binding variant. Uptake in individual tissues varied across time, FcRn binding affinity, and radiolabeling method. The liver and spleen emerged as the most concentrated sites of IgG catabolism in the absence of FcRn protection. These data provide an increased understanding of FcRn's role in antibody PK and catabolism at the tissue level. PMID:24572100

  3. Soil Water Retention Curve

    Science.gov (United States)

    Johnson, L. E.; Kim, J.; Cifelli, R.; Chandra, C. V.

    2016-12-01

    Potential water retention, S, is one of the parameters commonly used in hydrologic modeling for soil moisture accounting. Physically, S indicates the total amount of water which can be stored in the soil and is expressed in units of depth. S can be represented as a change of soil moisture content and in this context is commonly used to estimate direct runoff, especially in the Soil Conservation Service (SCS) curve number (CN) method. Generally, both lumped and distributed hydrologic models can easily use the SCS-CN method to estimate direct runoff. Changes in potential water retention have been used in previous SCS-CN studies; however, these studies have focused on long-term hydrologic simulations where S is allowed to vary at the daily time scale. While useful for hydrologic events that span multiple days, the resolution is too coarse for short-term applications such as flash flood events where S may not recover its full potential. In this study, a new method for estimating a time-variable potential water retention at hourly time scales is presented. The methodology is applied to the Napa River basin, California. The streamflow gage at St Helena, located in the upper reaches of the basin, is used as the control gage site to evaluate the model performance, as it is minimally influenced by reservoirs and diversions. Rainfall events from 2011 to 2012 are used for estimating the event-based SCS CN, which is then converted to S. As a result, we have derived the potential water retention curve, which is classified into three sections depending on the relative change in S. The first is a negative-slope section arising from the difference in the rate of moving water through the soil column, the second is a zero-change section representing the initial recovery of the potential water retention, and the third is a positive-change section representing the full recovery of the potential water retention. Also, we found that the soil water movement has a traffic jam within 24 hours after finished first
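
    The SCS-CN direct-runoff estimate that the study builds on uses the standard relations S = 1000/CN − 10 (inches) and Q = (P − Ia)² / (P − Ia + S), with the conventional initial abstraction Ia = 0.2·S; a minimal sketch:

    ```python
    def scs_cn_runoff(precip_in, cn):
        """Direct runoff Q (inches) from rainfall P (inches) via the SCS
        curve number method, using the conventional Ia = 0.2 * S."""
        s = 1000.0 / cn - 10.0   # potential water retention (inches)
        ia = 0.2 * s             # initial abstraction
        if precip_in <= ia:
            return 0.0           # all rainfall absorbed before runoff begins
        return (precip_in - ia) ** 2 / (precip_in - ia + s)
    ```

    Allowing S to vary hour by hour, as the study proposes, amounts to recomputing s in this relation at each time step rather than once per day.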

  4. NOISY DISPERSION CURVE PICKING (NDCP): a Matlab friendly suite package for fully control dispersion curve picking

    Science.gov (United States)

    Granados, I.; Calo, M.; Ramos, V.

    2017-12-01

    We developed a Matlab suite package (NDCP, Noisy Dispersion Curve Picking) that allows full control over the parameters needed to correctly identify group velocity dispersion curves in two types of datasets: correlograms between two stations, and surface wave records from earthquakes. With the frequency-time analysis (FTAN), the procedure to obtain dispersion curves from records with a high noise level becomes difficult, and sometimes the picked curve has a misinterpreted character. For correlogram functions, obtained by cross-correlation of noise records or of earthquake codas, a non-homogeneous distribution of noise sources yields a non-symmetric Green's function (GF); to retrieve the complete information contained there, NDCP allows picking the dispersion curve in the time domain in both the causal and the non-causal part of the GF. The picked dispersion curve is then displayed on the FTAN diagram in order to check that it matches the maximum of the signal energy, avoiding confusion with overtones or noise spikes. To illustrate how NDCP performs, we show examples using: i) local correlogram functions obtained from sensors deployed in a volcanic caldera (Los Humeros, in Puebla, Mexico); ii) regional correlogram functions between two stations of the National Seismological Service (SSN, Servicio Sismológico Nacional in Spanish); and iii) a surface wave seismic record for an earthquake located on the Pacific coast of Mexico and recorded by the SSN. This work is supported by the GEMEX project (Geothermal Europe-Mexico consortium).

  5. Fermions in curved spacetimes

    Energy Technology Data Exchange (ETDEWEB)

    Lippoldt, Stefan

    2016-01-21

    In this thesis we study a formulation of Dirac fermions in curved spacetime that respects general coordinate invariance as well as invariance under local spin base transformations. We emphasize the advantages of the spin base invariant formalism both from a conceptual as well as from a practical viewpoint. This suggests that local spin base invariance should be added to the list of (effective) properties of (quantum) gravity theories. We find support for this viewpoint by the explicit construction of a global realization of the Clifford algebra on a 2-sphere which is impossible in the spin-base non-invariant vielbein formalism. The natural variables for this formulation are spacetime-dependent Dirac matrices subject to the Clifford-algebra constraint. In particular, a coframe, i.e. vielbein field is not required. We disclose the hidden spin base invariance of the vielbein formalism. Explicit formulas for the spin connection as a function of the Dirac matrices are found. This connection consists of a canonical part that is completely fixed in terms of the Dirac matrices and a free part that can be interpreted as spin torsion. The common Lorentz symmetric gauge for the vielbein is constructed for the Dirac matrices, even for metrics which are not linearly connected. Under certain criteria, it constitutes the simplest possible gauge, demonstrating why this gauge is so useful. Using the spin base formulation for building a field theory of quantized gravity and matter fields, we show that it suffices to quantize the metric and the matter fields. This observation is of particular relevance for field theory approaches to quantum gravity, as it can serve for a purely metric-based quantization scheme for gravity even in the presence of fermions. Hence, in the second part of this thesis we critically examine the gauge, and the field-parametrization dependence of renormalization group flows in the vicinity of non-Gaussian fixed points in quantum gravity. While physical
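
    The Clifford-algebra constraint that the spacetime-dependent Dirac matrices are subject to is the standard anticommutation relation:

    ```latex
    \{\gamma^{\mu}(x), \gamma^{\nu}(x)\} = 2\, g^{\mu\nu}(x)\, \mathbb{1}
    ```

    In the spin base invariant formulation described above, the γ^μ(x) satisfying this relation are taken as the natural variables in place of a vielbein, and the spin connection is then expressed as a function of these Dirac matrices.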

  6. A Framework for Treating Cumulative Trauma with Art Therapy

    Science.gov (United States)

    Naff, Kristina

    2014-01-01

    Cumulative trauma is relatively undocumented in art therapy practice, although there is growing evidence that art therapy provides distinct benefits for resolving various traumas. This qualitative study proposes an art therapy treatment framework for cumulative trauma derived from semi-structured interviews with three art therapists and artistic…

  7. Cumulative effects of forest management activities: how might they occur?

    Science.gov (United States)

    R. M. Rice; R. B. Thomas

    1985-01-01

    Concerns are often voiced about possible environmental damage as the result of the cumulative sedimentation effects of logging and forest road construction. In response to these concerns, National Forests are developing procedures to reduce the possibility that their activities may lead to unacceptable cumulative effects.

  8. Cumulative effect in multiple production processes on nuclei

    International Nuclear Information System (INIS)

    Golubyatnikova, E.S.; Shmonin, V.L.; Kalinkin, B.N.

    1989-01-01

    It is shown that the cumulative effect is a natural result of the process of hadron multiple production in nuclear reactions. The universality of the slopes of the inclusive spectra and of other characteristics of cumulative hadrons is interpreted. The character of the information obtainable from such reactions, which could be helpful in studying the mechanism of multiparticle production, is discussed. 27 refs.; 4 figs

  9. Cumulative particle production in the quark recombination model

    International Nuclear Information System (INIS)

    Gavrilov, V.B.; Leksin, G.A.

    1987-01-01

    The production of cumulative particles in hadron-nucleus interactions at high energies is considered within the framework of the quark recombination model. Predictions are made for the inclusive cross sections of cumulative particles and of different resonances containing quarks in the s state.

  10. On the analysis of Canadian Holstein dairy cow lactation curves using standard growth functions

    NARCIS (Netherlands)

    López, S.; France, J.; Odongo, N.E.; McBride, R.A.; Kebreab, E.; Alzahal, O.; McBride, B.W.; Dijkstra, J.

    2015-01-01

    Six classical growth functions (monomolecular, Schumacher, Gompertz, logistic, Richards, and Morgan) were fitted to individual and average (by parity) cumulative milk production curves of Canadian Holstein dairy cows. The data analyzed consisted of approximately 91,000 daily milk yield records
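
    As an illustration of one of the six functions, a Gompertz curve for cumulative milk yield can be evaluated and differenced to recover daily yield; the parameter values below are purely illustrative, not estimates from the Canadian Holstein data.

    ```python
    import math

    def gompertz_cumulative(t, a, b, k):
        """Cumulative milk yield at day t on a Gompertz growth curve:
        asymptote a, displacement b, rate constant k."""
        return a * math.exp(-b * math.exp(-k * t))

    # Daily yield as the day-to-day increment of the cumulative curve
    days = range(1, 306)  # a 305-day standard lactation
    daily = [gompertz_cumulative(t, a=9000.0, b=4.0, k=0.02) -
             gompertz_cumulative(t - 1, a=9000.0, b=4.0, k=0.02) for t in days]
    ```

    Fitting any of the six functions to recorded cumulative yields would proceed by nonlinear least squares on (t, Q(t)) pairs, with the daily lactation curve recovered by differencing as above.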

  11. High cumulants of conserved charges and their statistical uncertainties

    Science.gov (United States)

    Li-Zhu, Chen; Ye-Yin, Zhao; Xue, Pan; Zhi-Ming, Li; Yuan-Fang, Wu

    2017-10-01

    We study the influence of measured high cumulants of conserved charges on their associated statistical uncertainties in relativistic heavy-ion collisions. With a given number of events, the measured cumulants randomly fluctuate with an approximately normal distribution, while the estimated statistical uncertainties are found to be correlated with corresponding values of the obtained cumulants. Generally, with a given number of events, the larger the cumulants we measure, the larger the statistical uncertainties that are estimated. The error-weighted averaged cumulants are dependent on statistics. Despite this effect, however, it is found that the three sigma rule of thumb is still applicable when the statistics are above one million. Supported by NSFC (11405088, 11521064, 11647093), Major State Basic Research Development Program of China (2014CB845402) and Ministry of Science and Technology (MoST) (2016YFE0104800)
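
    The first four cumulants that such analyses measure can be computed from event-by-event samples via central moments (C1 = mean, C2 = variance, C3 = third central moment, C4 = m4 − 3·m2²); a minimal sketch:

    ```python
    def cumulants(samples):
        """First four cumulants of a sample of conserved-charge values."""
        n = len(samples)
        mean = sum(samples) / n
        # central moments m2, m3, m4
        m = [sum((x - mean) ** k for x in samples) / n for k in (2, 3, 4)]
        c1, c2, c3 = mean, m[0], m[1]
        c4 = m[2] - 3.0 * m[0] ** 2
        return c1, c2, c3, c4
    ```

    The statistical uncertainty of these estimators, whose correlation with the measured cumulant values the paper studies, is typically estimated by resampling (e.g. bootstrap) over the event sample.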

  12. Detection of flaws below curved surfaces

    International Nuclear Information System (INIS)

    Elsley, R.K.; Addison, R.C.; Graham, L.J.

    1983-01-01

    A measurement model has been developed to describe ultrasonic measurements made with circular piston transducers in parts with flat or cylindrically curved surfaces. The model includes noise terms to describe electrical noise, scatterer noise and echo noise as well as effects of attenuation, diffraction and Fresnel loss. An experimental procedure for calibrating the noise terms of the model was developed. Experimental measurements were made on a set of known flaws located beneath a cylindrically curved surface. The model was verified by using it to correct the experimental measurements to obtain the absolute scattering amplitude of the flaws. For longitudinal wave propagation within the part, the derived scattering amplitudes were consistent with predictions at internal angles of less than 30°. At larger angles, focusing and aberrations caused a lack of agreement; the model needs further refinement in this case. For shear waves, it was found that the frequency for optimum flaw detection in the presence of material noise is lower than that for longitudinal waves; lower frequency measurements are currently in progress. The measurement model was then used to make preliminary predictions of the best experimental measurement technique for the detection of cracks located under cylindrically curved surfaces.

  13. [Customized and non-customized French intrauterine growth curves. II - Comparison with existing curves and benefits of customization].

    Science.gov (United States)

    Ego, A; Prunet, C; Blondel, B; Kaminski, M; Goffinet, F; Zeitlin, J

    2016-02-01

    Our aim is to compare the new French EPOPé intrauterine growth curves, developed in response to the 2013 guidelines of the French College of Obstetricians and Gynecologists, with the reference curves currently used in France, and to evaluate the consequences of their adjustment for fetal sex and maternal characteristics. Eight intrauterine and birthweight curves used in France were compared to the EPOPé curves using data from the French Perinatal Survey 2010. The influence of adjustment on the rate of SGA births and the characteristics of these births was analysed. Due to their birthweight values and distribution, the selected intrauterine curves are less suitable for births in France than the new curves. Birthweight curves led to low rates of SGA births, from 4.3 to 8.5%, compared to 10.0% with the EPOPé curves. The adjustment for maternal and fetal characteristics avoids the over-representation of girls among SGA births and reclassifies 4% of births. Among births reclassified as SGA, the frequency of medical and obstetrical risk factors for growth restriction, smoking (≥10 cigarettes/day), and neonatal transfer is higher than among non-SGA births (P<0.01). The EPOPé curves are more suitable for French births than the curves currently used, and their adjustment improves the identification of mothers and babies at risk of growth restriction and poor perinatal outcomes. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  14. A damage cumulation method for crack initiation prediction under non proportional loading and overloading

    International Nuclear Information System (INIS)

    Taheri, S.

    1992-04-01

    For a sequence of constant-amplitude cyclic loadings containing overloads, we propose a method for damage cumulation under non-proportional loading. The method uses as data the cyclically stabilized states under non-proportional loading and the initiation (fatigue) curve in the uniaxial case. To this end, we take into account the dependence of the Cyclic Strain Stress Curves (C.S.S.C.) and of the mean cell size on prehardening, and we define a stabilized uniaxial state cyclically equivalent to a non-proportional stabilized state through a family of C.S.S.C. Although simple assumptions such as a linear damage function and linear cumulation are used, we obtain a sequence effect for difficult cross-slip materials such as 316 stainless steel, but recover the Miner rule for easy cross-slip materials. We then show differences between a load-controlled test and a strain-controlled test: for 316 stainless steel in a load-controlled test, non-proportional loading at each cycle is less damaging than uniaxial loading at the same equivalent stress, while the result is the opposite in a strain-controlled test. We also show that an overload retards initiation in a load-controlled test while it accelerates initiation in a strain-controlled test. (author). 26 refs., 8 figs

  15. Towards Greenland Glaciation: cumulative or abrupt transition?

    Science.gov (United States)

    Ramstein, Gilles; Tan, Ning; Ladant, Jean-baptiste; Dumas, Christophe; Contoux, Camille

    2017-04-01

    During the mid-Pliocene warm period (3-3.3 Ma BP), global annual mean temperatures inferred from data and model studies were 2-3 °C warmer than pre-industrial values. Accordingly, the Greenland ice sheet is thought to have reached, at most, only half of its present-day volume [Haywood et al. 2010]. Around 2.7-2.6 Ma BP, only ~500 kyr after the mid-Pliocene warming peak, the Greenland ice sheet had reached its full size [Lunt et al. 2008]. A crucial question therefore concerns the evolution of the Greenland ice sheet from half to full size during the 3-2.5 Ma period. Data show a decreasing trend in atmospheric CO2 concentration from 3 Ma to 2.5 Ma [Seki et al. 2010; Bartoli et al. 2011; Martinez et al. 2015]. However, a recent study [Contoux et al. 2015] suggests that a lowering of CO2 is not sufficient to initiate a perennial glaciation on Greenland and must be combined with low summer insolation to preserve the ice sheet during insolation maxima. This suggests a cumulative process rather than an abrupt event. In order to diagnose the evolution of the ice-sheet build-up, we carry out, for the first time, a transient simulation of climate and ice-sheet evolution from 3 Ma to 2.5 Ma. This strategy enables us to investigate the waxing and waning of the ice sheet during several orbital cycles. We use a three-dimensional interpolation method designed by Ladant et al. (2014), which allows the evolution of CO2 concentration, of the orbital parameters, and of the Greenland ice-sheet size to be taken into account. By interpolating climatic snapshot simulations run with various possible combinations of CO2, orbits and ice-sheet sizes, we can build a continuous climatic forcing that is then used to drive 500 kyr-long ice-sheet simulations. With such a tool, we can offer a physically based answer for different CO2 reconstruction scenarios and analyse which one is most consistent with the Greenland ice-sheet build-up.

  16. An Analysis of Cumulative Risks Indicated by Biomonitoring Data of Six Phthalates Using the Maximum Cumulative Ratio

    Science.gov (United States)

    The Maximum Cumulative Ratio (MCR) quantifies the degree to which a single component of a chemical mixture drives the cumulative risk of a receptor.1 This study used the MCR, the Hazard Index (HI) and Hazard Quotient (HQ) to evaluate co-exposures to six phthalates using biomonito...

  17. An analysis of cumulative risks based on biomonitoring data for six phthalates using the Maximum Cumulative Ratio

    Science.gov (United States)

    The Maximum Cumulative Ratio (MCR) quantifies the degree to which a single chemical drives the cumulative risk of an individual exposed to multiple chemicals. Phthalates are a class of chemicals with ubiquitous exposures in the general population that have the potential to cause ...
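The MCR described in the two records above can be computed directly from hazard quotients: it is the Hazard Index (the sum of the HQs) divided by the largest single HQ. A minimal sketch, using purely illustrative values rather than data from these studies:

```python
# Maximum Cumulative Ratio: MCR = HI / max(HQ).
# MCR near 1 means a single chemical dominates the cumulative risk;
# MCR near n means risk is spread evenly across the n chemicals.
def mcr(hazard_quotients):
    hi = sum(hazard_quotients)       # Hazard Index
    hq_max = max(hazard_quotients)   # largest single Hazard Quotient
    return hi / hq_max

# Hypothetical HQs for six phthalates (illustrative values only)
hqs = [0.30, 0.05, 0.10, 0.02, 0.04, 0.01]
print(round(mcr(hqs), 3))
```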

  18. Predictive Value of Cumulative Blood Pressure for All-Cause Mortality and Cardiovascular Events

    Science.gov (United States)

    Wang, Yan Xiu; Song, Lu; Xing, Ai Jun; Gao, Ming; Zhao, Hai Yan; Li, Chun Hui; Zhao, Hua Ling; Chen, Shuo Hua; Lu, Cheng Zhi; Wu, Shou Ling

    2017-02-01

    The predictive value of cumulative blood pressure (BP) for all-cause mortality and cardiovascular and cerebrovascular events (CCE) has hardly been studied. In this prospective cohort study including 52,385 participants from the Kailuan Group who attended three medical examinations and were free of CCE, the impact of cumulative systolic BP (cumSBP) and cumulative diastolic BP (cumDBP) on all-cause mortality and CCEs was investigated. For the study population, the mean (standard deviation) age was 48.82 (11.77) years, and 40,141 (76.6%) were male. The follow-up for all-cause mortality and CCEs was 3.96 (0.48) and 2.98 (0.41) years, respectively. Multivariate Cox proportional hazards regression analysis showed that, for every 10 mm Hg·year increase in cumSBP and 5 mm Hg·year increase in cumDBP, the hazard ratios for all-cause mortality were 1.013 (1.006, 1.021) and 1.012 (1.006, 1.018); for CCEs, 1.018 (1.010, 1.027) and 1.017 (1.010, 1.024); for stroke, 1.021 (1.011, 1.031) and 1.018 (1.010, 1.026); and for myocardial infarction (MI), 1.013 (0.996, 1.030) and 1.015 (1.000, 1.029). Using natural spline function analysis, cumSBP and cumDBP showed a J-curve relationship with CCEs and a U-curve relationship with stroke (ischemic and hemorrhagic). Therefore, increases in cumSBP and cumDBP were predictive of all-cause mortality, CCEs, and stroke.
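A cumulative BP in mm Hg·year can be sketched as a time-weighted sum of BP readings across examination intervals. This is an assumption about the general form of such a metric, not the Kailuan study's exact definition, and the readings below are hypothetical:

```python
# Cumulative BP (mm Hg·year), sketched as the mean of adjacent exam
# readings weighted by the interval length between exams.
# NOTE: illustrative form only; the study's precise formula may differ.
def cumulative_bp(readings, interval_years):
    """readings: BP values at successive exams; interval_years: gap between exams."""
    return sum((a + b) / 2 * interval_years
               for a, b in zip(readings, readings[1:]))

sbp = [120, 130, 140]            # hypothetical systolic readings at three exams
print(cumulative_bp(sbp, 2.0))   # mm Hg·year accumulated over 4 years
```

With this form, a persistently elevated reading contributes linearly with both its magnitude and the time it is sustained, which is what makes the measure "cumulative".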

  19. Risk of therapy-related leukaemia and preleukaemia after Hodgkin's disease. Relation to age, cumulative dose of alkylating agents, and time from chemotherapy

    DEFF Research Database (Denmark)

    Pedersen-Bjergaard, J.; Specht, L.; Larsen, S.O.

    1987-01-01

    391 patients treated intensively for Hodgkin's disease were followed for up to 15 years to evaluate the risk of therapy-related acute non-lymphocytic leukaemia (t-ANLL) and preleukaemia. Only two independent factors, patient age and cumulative dose of alkylating agents, were related to the risk of t-ANLL. The hazard rate of t-ANLL was roughly proportional to the square of patient age and to the total cumulative dose of alkylating agents. In 320 patients treated with alkylating agents the cumulative risk of t-ANLL increased steadily from 1 year after the start of treatment and reached 13.0% (SE 3.0) at 10 years, after which time there were no further cases. Calculated from cessation of therapy with alkylating agents, however, the cumulative risk curve increased steeply during the first 1-2 years, then gradually levelled out, and no new cases were observed beyond 7 years. With a 15-year...

  20. THE SIGNIFICANCE OF CUMULATIVE WATER BALANCE IN THE DEVELOPMENT OF EARLY COMPLICATIONS AFTER MAJOR ABDOMINAL SURGERY.

    Science.gov (United States)

    Musaeva, T S; Karipidi, M K; Zabolotskikh, I B

    2016-11-01

    To perform a comprehensive assessment of water balance on the basis of the daily balance, the cumulative balance and a 10% body-weight gain, and of their role in the development of early complications after major abdominal surgery. A retrospective study of the perioperative period was performed in 150 patients who underwent major abdominal surgery. The physical condition of the patients corresponded to ASA class 3. The average age was 46 (38-62) years. The stages of the research were: an analysis of the daily and cumulative balance in the complicated and uncomplicated groups and their role in the development of complications; the timing of the development of complications and a possible relationship between fluid overload and complications; and changes in the albumin level within 10 days of the postoperative period. The analysis of complications did not show significant differences between the complicated and uncomplicated groups according to the water balance during surgery and by the end of the first day. Construction of the area under the ROC curve (AUROC) showed low resolution of the balance in the intraoperative period, on the first day, and on the second day for predicting complications. Significant differences according to the cumulative balance were observed from the third day of the postoperative period. Also from the third day of the postoperative period there was good resolution for prediction of postoperative complications according to the cumulative balance, with a cut-off point of >50.7 ml/kg. Excessive infusion therapy is a predictor of adverse outcome in patients after major abdominal surgery. Therefore, after 3 days of the postoperative period it is important to maintain mechanisms for the excretion of excess fluid or to limit infusion therapy.

  1. Cumulative occupational shoulder exposures and surgery for subacromial impingement syndrome: a nationwide Danish cohort study.

    Science.gov (United States)

    Dalbøge, Annett; Frost, Poul; Andersen, Johan Hviid; Svendsen, Susanne Wulff

    2014-11-01

    The primary aim was to examine exposure-response relationships between cumulative occupational shoulder exposures and surgery for subacromial impingement syndrome (SIS), and to compare sex-specific exposure-response relationships. The secondary aim was to examine the time window of relevant exposures. We conducted a nationwide register study of all persons born in Denmark (1933-1977), who had at least 5 years of full-time employment. In the follow-up period (2003-2008), we identified first-time events of surgery for SIS. Cumulative exposure estimates for a 10-year exposure time window with a 1-year lag time were obtained by linking occupational codes with a job exposure matrix. The exposure estimates were expressed as, for example, arm-elevation-years in accordance with the pack-year concept of tobacco consumption. We used a multivariable logistic regression technique equivalent to discrete survival analysis. The adjusted OR (ORadj) increased to a maximum of 2.1 for arm-elevation-years, repetition-years and force-years, and to 1.5 for hand-arm-vibration-years. Sex-specific exposure-response relationships were similar for men and women, when assessed using a relative risk scale. The ORadj increased gradually with the number of years contributing to the cumulative exposure estimates. The excess fraction was 24%. Cumulative occupational shoulder exposures carried an increase in risk of surgery for SIS with similar exposure-response curves for men and women. The risk of surgery for SIS increased gradually, when the period of exposure assessment was extended. In the general working population, a substantial fraction of all first-time operations for SIS could be related to occupational exposures. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  2. Models of genus one curves

    OpenAIRE

    Sadek, Mohammad

    2010-01-01

    In this thesis we give insight into the minimisation problem of genus one curves defined by equations other than Weierstrass equations. We are interested in genus one curves given as double covers of P1, plane cubics, or complete intersections of two quadrics in P3. By minimising such a curve we mean making the invariants associated to its defining equations as small as possible using a suitable change of coordinates. We study the non-uniqueness of minimisations of the genus one curves des...

  3. Learning curve for intracranial angioplasty and stenting in single center.

    Science.gov (United States)

    Cai, Qiankun; Li, Yongkun; Xu, Gelin; Sun, Wen; Xiong, Yunyun; Sun, Wenshan; Bao, Yuanfei; Huang, Xianjun; Zhang, Yao; Zhou, Lulu; Zhu, Wusheng; Liu, Xinfeng

    2014-01-01

    To identify the specific caseload needed to overcome the learning-curve effect, based on data from consecutive patients treated with intracranial angioplasty and stenting (IAS) in our center. The Stenting and Aggressive Medical Management for Preventing Recurrent Stroke and Intracranial Stenosis trial was prematurely terminated owing to the high rate of periprocedural complications in the endovascular arm. To date, there are no data available for determining the caseload sufficient to overcome the learning effect and perform IAS with an acceptable level of complications. Between March 2004 and May 2012, 188 consecutive patients with 194 lesions who underwent IAS were analyzed retrospectively. The outcome variable used to assess the learning curve was periprocedural complications (including transient ischemic attack, ischemic stroke, vessel rupture, cerebral hyperperfusion syndrome, and vessel perforation). Multivariable logistic regression analysis was employed to demonstrate the existence of a learning-curve effect in IAS. A risk-adjusted cumulative sum (CUSUM) chart was used to identify the specific caseload needed to overcome the learning-curve effect. The overall rate of 30-day periprocedural complications was 12.4% (24/194). After adjusting for case mix, multivariate logistic regression analysis showed that operator experience was an independent predictor of periprocedural complications. The learning curve of IAS, assessed in a risk-adjusted manner, was overcome after 21 cases. The operator's level of experience significantly affected the outcome of IAS. Moreover, we observed that the amount of experience sufficient for performing IAS in our center was 21 cases. Copyright © 2013 Wiley Periodicals, Inc.
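A risk-adjusted CUSUM of the kind this record describes can be sketched as a running sum of observed outcomes minus expected, case-mix-adjusted risks: the curve drifts upward while complications exceed expectation and flattens once the operator passes the learning phase. All values below are hypothetical, not the study's data:

```python
# Risk-adjusted CUSUM sketch: accumulate (observed - expected) per case.
# y = 1 if a periprocedural complication occurred, 0 otherwise;
# p = the case-mix-adjusted predicted risk for that patient.
def cusum(outcomes, expected_risks):
    s, curve = 0.0, []
    for y, p in zip(outcomes, expected_risks):
        s += y - p
        curve.append(s)
    return curve

outcomes = [1, 0, 1, 0, 0, 0, 0, 0]     # hypothetical complication indicators
risks = [0.12] * len(outcomes)          # hypothetical adjusted risks per case
print([round(v, 2) for v in cusum(outcomes, risks)])
```

In practice the inflection point of such a curve (here, where it stops rising) is what identifies the caseload at which performance reaches the expected level.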

  4. The role of experience curves for setting MEPS for appliances

    International Nuclear Information System (INIS)

    Siderius, Hans-Paul

    2013-01-01

    Minimum efficiency performance standards (MEPS) are an important policy instrument for raising the efficiency of products. In most schemes the concept of life cycle costs (LCC) is used to guide the setting of MEPS levels. Although a large body of literature shows that product cost decreases with increasing cumulative production (the experience curve), this is currently not used for setting MEPS. This article shows how to integrate the concept of the experience curve into LCC calculations for setting MEPS in the European Union, and applies this to household laundry driers, refrigerator-freezers and televisions. The results indicate that for driers and refrigerator-freezers at least twice the energy savings of the current approach can be achieved. These products also show that energy-label classes can successfully be used for setting MEPS. For televisions an experience curve is provided, showing a learning rate of 29%. However, television prices show no relation with energy efficiency and are to a large extent determined by the time the product is placed on the market. This suggests to policy makers that, for televisions and other products with a short (re)design and market cycle, timing is more important than the MEPS levels themselves. - Highlights: • We integrate experience curves into life cycle cost calculations for MEPS. • For driers and refrigerators this results in at least twice the energy savings. • For flat panel televisions an experience curve is provided
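The experience curve underlying such calculations is conventionally a power law in cumulative production, cost = a·Q^(-b), with the learning rate defined as the fractional cost drop per doubling of Q: LR = 1 - 2^(-b). A short sketch of that standard relationship (the function names are our own, not the article's):

```python
import math

# Standard experience-curve relations:
#   cost(Q) = a * Q**(-b)          (Q = cumulative production)
#   learning rate LR = 1 - 2**(-b) (cost drop per doubling of Q)
def learning_rate(b):
    return 1 - 2 ** (-b)

def elasticity_from_lr(lr):
    # invert LR = 1 - 2**(-b)  =>  b = -log2(1 - LR)
    return -math.log2(1 - lr)

# The 29% learning rate reported for televisions implies b ≈ 0.494
b = elasticity_from_lr(0.29)
print(round(b, 3), round(learning_rate(b), 2))
```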

  5. Multiphoton absorption coefficients in solids: a universal curve

    International Nuclear Information System (INIS)

    Brandi, H.S.; Araujo, C.B. de

    1983-04-01

    A universal curve for the frequency dependence of the multiphoton absorption coefficient is proposed, based on a 'non-perturbative' approach. Specific applications have been made to obtain the two-, three-, four- and five-photon absorption coefficients in different materials. Proper scaling of the two-photon absorption coefficient and use of the universal curve yield results for the higher-order absorption coefficients in good agreement with the experimental data. (Author)

  6. Cumulants in perturbation expansions for non-equilibrium field theory

    International Nuclear Information System (INIS)

    Fauser, R.

    1995-11-01

    The formulation of perturbation expansions for a quantum field theory of strongly interacting systems in a general non-equilibrium state is discussed. Non-vanishing initial correlations are included in the formulation of the perturbation expansion in terms of cumulants. The cumulants are shown to be the suitable candidate for summing up the perturbation expansion. Also a linked-cluster theorem for the perturbation series with cumulants is presented. Finally a generating functional of the perturbation series with initial correlations is studied. We apply the methods to a simple model of a fermion-boson system. (orig.)

  7. Multilayer Strip Dipole Antenna Using Stacking Technique and Its Application for Curved Surface

    Directory of Open Access Journals (Sweden)

    Charinsak Saetiaw

    2013-01-01

    This paper presents the design of a multilayer strip dipole antenna made by stacking flexible copper-clad laminate, intended for curved surfaces on cylindrical objects. The designed antenna reduces the effects of curving because the relative length changes in each stacked laminate layer. Since the curvature differs from layer to layer, the resonance frequency of the extended antenna is more stable when the antenna is curved or attached to cylindrical objects than that of a conventional antenna. The multilayer antenna is designed at 920 MHz for UHF RFID applications.

  8. Quantum fields in curved space

    International Nuclear Information System (INIS)

    Birrell, N.D.; Davies, P.C.W.

    1982-01-01

    The book presents a comprehensive review of the subject of gravitational effects in quantum field theory. Quantum field theory in Minkowski space, quantum field theory in curved spacetime, flat spacetime examples, curved spacetime examples, stress-tensor renormalization, applications of renormalization techniques, quantum black holes and interacting fields are all discussed in detail. (U.K.)

  9. Experimental Observation of Cumulative Second-Harmonic Generation of Circumferential Guided Wave Propagation in a Circular Tube

    International Nuclear Information System (INIS)

    Deng Ming-Xi; Gao Guang-Jian; Li Ming-Liang

    2015-01-01

    The experimental observation of cumulative second-harmonic generation of primary circumferential guided wave propagation is reported. A pair of wedge transducers is used to generate the desired primary circumferential guided wave and to detect its fundamental-frequency and second-harmonic amplitudes on the outside surface of the circular tube. The amplitudes of the fundamental waves and the second harmonics are measured for different separations between the two wedge transducers. At the driving frequency where the primary and the double-frequency circumferential guided waves have the same linear phase velocities, clear second-harmonic signals can be observed. The quantitative relationships between the second-harmonic amplitudes and the circumferential angle are analyzed. It is experimentally verified that the second harmonics of primary circumferential guided waves do exhibit a cumulative growth effect with circumferential angle. (paper)

  10. Extended analysis of cooling curves

    International Nuclear Information System (INIS)

    Djurdjevic, M.B.; Kierkus, W.T.; Liliac, R.E.; Sokolowski, J.H.

    2002-01-01

    Thermal Analysis (TA) is the measurement of changes in a physical property of a material that is heated through a phase transformation temperature range. The temperature changes in the material are recorded as a function of the heating or cooling time in such a manner that allows for the detection of phase transformations. In order to increase accuracy, characteristic points on the cooling curve have been identified using the first derivative curve plotted versus time. In this paper, an alternative approach to the analysis of the cooling curve has been proposed. The first derivative curve has been plotted versus temperature and all characteristic points have been identified with the same accuracy achieved using the traditional method. The new cooling curve analysis also enables the Dendrite Coherency Point (DCP) to be detected using only one thermocouple. (author)
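The alternative analysis described above, plotting the first derivative against temperature rather than against time, can be sketched on synthetic data. The cooling profile below (linear cooling plus a recalescence-like bump) is invented for illustration; real input would be thermocouple measurements:

```python
import math

# Synthetic cooling curve: steady cooling plus a transformation bump.
# (Invented profile; stands in for recorded thermocouple data.)
def cooling_temperature(t):
    return 700.0 - 1.5 * t + 20.0 * math.exp(-((t - 40.0) / 5.0) ** 2)

ts = [i * 0.1 for i in range(1001)]            # time, s
Ts = [cooling_temperature(t) for t in ts]      # temperature, °C

# Central-difference first derivative dT/dt
dTdt = [(Ts[i + 1] - Ts[i - 1]) / (ts[i + 1] - ts[i - 1])
        for i in range(1, len(ts) - 1)]

# Characteristic points appear where dT/dt departs from the baseline
# cooling rate; here we locate the maximum of the derivative, which in
# the T-vs-dT/dt view marks the transformation temperature directly.
i_max = max(range(len(dTdt)), key=lambda i: dTdt[i])
print(round(ts[i_max + 1], 1), round(Ts[i_max + 1], 1))
```

Plotting `dTdt` against `Ts` (instead of `ts`) gives the temperature-axis view the record proposes, so characteristic points are read off as temperatures without a second thermocouple.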

  11. Cumulative Environmental Impacts: Science and Policy to Protect Communities.

    Science.gov (United States)

    Solomon, Gina M; Morello-Frosch, Rachel; Zeise, Lauren; Faust, John B

    2016-01-01

    Many communities are located near multiple sources of pollution, including current and former industrial sites, major roadways, and agricultural operations. Populations in such locations are predominantly low-income, with a large percentage of minorities and non-English speakers. These communities face challenges that can affect the health of their residents, including limited access to health care, a shortage of grocery stores, poor housing quality, and a lack of parks and open spaces. Environmental exposures may interact with social stressors, thereby worsening health outcomes. Age, genetic characteristics, and preexisting health conditions increase the risk of adverse health effects from exposure to pollutants. There are existing approaches for characterizing cumulative exposures, cumulative risks, and cumulative health impacts. Although such approaches have merit, they also have significant constraints. New developments in exposure monitoring, mapping, toxicology, and epidemiology, especially when informed by community participation, have the potential to advance the science on cumulative impacts and to improve decision making.

  12. Pesticide Cumulative Risk Assessment: Framework for Screening Analysis

    Science.gov (United States)

    This document provides guidance on how to screen groups of pesticides for cumulative evaluation using a two-step approach: begin with evaluation of available toxicological information and, if necessary, follow up with a risk-based screening approach.

  13. Online Scheduling in Manufacturing A Cumulative Delay Approach

    CERN Document Server

    Suwa, Haruhiko

    2013-01-01

    Online scheduling is recognized as the crucial decision-making process of production control in the “being in production” phase, i.e. after the shop-floor schedule has been released. Online scheduling can also be considered one of the key enablers of prompt capable-to-promise and available-to-promise commitments to customers, along with reduced production lead times, in today's globalized competitive markets. Online Scheduling in Manufacturing introduces new approaches to online scheduling based on the concept of cumulative delay. The cumulative delay is regarded as consolidated information about uncertainties in a dynamic manufacturing environment and can be collected constantly, without much effort, at any point in time during schedule execution. In this approach, the cumulative delay of the schedule serves as the criterion for deciding whether or not a schedule revision is carried out. The cumulative delay approach to triggering schedule revisions has the following capabilities for the ...

  14. Considering Environmental and Occupational Stressors in Cumulative Risk Assessments

    Science.gov (United States)

    While definitions vary across the global scientific community, cumulative risk assessments (CRAs) typically are described as exhibiting a population focus and analyzing the combined risks posed by multiple stressors. CRAs also may consider risk management alternatives as an anal...

  15. Peer tutors as learning and teaching partners: a cumulative ...

    African Journals Online (AJOL)

    ... paper explores the kinds of development in tutors' thinking and action that are possible when training and development is theoretically informed, coherent, and oriented towards improving practice. Keywords: academic development, academic literacies, cumulative learning, higher education, peer tutoring, writing centres.

  16. CTD Information Guide. Preventing Cumulative Trauma Disorders in the Workplace

    National Research Council Canada - National Science Library

    1992-01-01

    The purpose of this report is to provide Army occupational safety and health (OSH) professionals with a primer that explains the basic principles of ergonomic-hazard recognition for common cumulative trauma disorders...

  17. Cumulative radiation exposure in children with cystic fibrosis.

    LENUS (Irish Health Repository)

    O'Reilly, R

    2010-02-01

    This retrospective study calculated the cumulative radiation dose for children with cystic fibrosis (CF) attending a tertiary CF centre. Information on 77 children with a mean age of 9.5 years, a follow-up time of 658 person-years and 1757 studies, including 1485 chest radiographs, 215 abdominal radiographs and 57 computed tomography (CT) scans (of which 51 were thoracic CT scans), was analysed. The average cumulative radiation dose was 6.2 (0.04-25) mSv per CF patient. Cumulative radiation dose increased with increasing age and number of CT scans and was greater in children who presented with meconium ileus. No correlation was identified between cumulative radiation dose and either lung function or patient microbiology cultures. Radiation carries a risk of malignancy, and children are particularly susceptible. Every effort must be made to avoid unnecessary radiation exposure in these patients, whose life expectancy is increasing.

  18. Cumulative query method for influenza surveillance using search engine data.

    Science.gov (United States)

    Seo, Dong-Woo; Jo, Min-Woo; Sohn, Chang Hwan; Shin, Soo-Yong; Lee, JaeHo; Yu, Maengsoo; Kim, Won Young; Lim, Kyoung Soo; Lee, Sang-Il

    2014-12-16

    Internet search queries have become an important data source in syndromic surveillance systems. However, there is currently no syndromic surveillance system using Internet search query data in South Korea. The objective of this study was to examine correlations between our cumulative query method and national influenza surveillance data. Our study was based on the local search engine, Daum (approximately 25% market share), and influenza-like illness (ILI) data from the Korea Centers for Disease Control and Prevention. A quota sampling survey was conducted with 200 participants to obtain popular queries. We divided the study period into two sets: Set 1 (the 2009/10 epidemiological year for development set 1 and 2010/11 for validation set 1) and Set 2 (2010/11 for development set 2 and 2011/12 for validation set 2). Pearson's correlation coefficients were calculated between the Daum data and the ILI data for the development sets. We selected the combined queries for which the correlation coefficients were .7 or higher and listed them in descending order. Then, we created a cumulative query method, with n representing the number of cumulative combined queries in descending order of the correlation coefficient. In validation set 1, 13 cumulative query methods were applied, and 8 had higher correlation coefficients (min=.916, max=.943) than that of the highest single combined query. Further, 11 of 13 cumulative query methods had an r value of ≥.7, whereas only 4 of 13 combined queries did. In validation set 2, 8 of 15 cumulative query methods showed higher correlation coefficients (min=.975, max=.987) than that of the highest single combined query. All 15 cumulative query methods had an r value of ≥.7, whereas only 6 of 15 combined queries did. The cumulative query method showed relatively higher correlation with national influenza surveillance data than single combined queries in both the development and validation sets.
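The cumulative query method can be sketched as follows: rank query series by their individual correlation with ILI, then correlate the running mean of the top-n series with ILI for n = 1, 2, …. All series below are toy values standing in for Daum query volumes and ILI rates:

```python
import statistics

# Plain Pearson correlation coefficient.
def pearson(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    dx = sum((a - mx) ** 2 for a in x) ** 0.5
    dy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (dx * dy)

ili = [1.0, 1.4, 2.2, 3.1, 2.5, 1.6]      # weekly ILI rate (toy data)
queries = [                                # query series, pre-ranked by
    [10, 14, 21, 30, 26, 17],              # individual correlation with ILI
    [5, 6, 11, 14, 12, 9],                 # (hypothetical volumes)
    [2, 2, 3, 5, 4, 3],
]

# Cumulative query method n: correlate the mean of the first n series.
for n in range(1, len(queries) + 1):
    combined = [statistics.mean(col) for col in zip(*queries[:n])]
    print(n, round(pearson(combined, ili), 3))
```

The study's selection step then keeps the n whose combined series correlates best with the surveillance data on the development set before checking it on the validation set.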

  19. Steps and pips in the history of the cumulative recorder.

    OpenAIRE

    Lattal, Kennon A

    2004-01-01

    From its inception in the 1930s until very recent times, the cumulative recorder was the most widely used measurement instrument in the experimental analysis of behavior. It was an essential instrument in the discovery and analysis of schedules of reinforcement, providing the first real-time analysis of operant response rates and patterns. This review traces the evolution of the cumulative recorder from Skinner's early modified kymographs through various models developed by Skinner and his co...

  20. Mapping Cumulative Impacts of Human Activities on Marine Ecosystems

    OpenAIRE

    Seaplan

    2018-01-01

    Given the diversity of human uses and natural resources that converge in coastal waters, the potential independent and cumulative impacts of those uses on marine ecosystems are important to consider during ocean planning. This study was designed to support the development and implementation of the 2009 Massachusetts Ocean Management Plan. Its goal was to estimate and visualize the cumulative impacts of human activities on coastal and marine ecosystems in the state and federal waters off of Ma...

  1. Testing the stationarity of white dwarf light-curves

    International Nuclear Information System (INIS)

    Molnar, L; Kollath, Z; Plachy, E; Paparo, M

    2009-01-01

    Long-period white dwarfs show changes in their frequency spectra from one observing season to another, i.e. their light-curves cannot be considered stationary multiperiodic variations on long timescales. However, due to the complex frequency spectra of these stars and the narrow frequency spacing, it is still unknown what the shortest timescale is on which real physical modulation exists. We present tests on artificial data resembling the observations, using time-frequency distributions (TFDs), Fourier analysis and the analytical signal method.

  2. Computational aspects of algebraic curves

    CERN Document Server

    Shaska, Tanush

    2005-01-01

    The development of new computational techniques and better computing power has made it possible to attack some classical problems of algebraic geometry. The main goal of this book is to highlight such computational techniques related to algebraic curves. The area of research in algebraic curves is receiving more interest not only from the mathematics community, but also from engineers and computer scientists, because of the importance of algebraic curves in applications including cryptography, coding theory, error-correcting codes, digital imaging, computer vision, and many more.This book cove

  3. Phonon dispersion curves of fcc La

    International Nuclear Information System (INIS)

    Stassis, C.; Loong, C.; Zarestky, J.

    1982-01-01

    Large single crystals of fcc La were grown in situ and were used to study the lattice dynamics of this phase of La by coherent inelastic neutron scattering. The phonon dispersion curves have been measured along the [ξ00], [ξξ0], [ξξξ], and [0ξ1] symmetry directions at 660 and 1100 K. The T[ξξξ] branch exhibits anomalous dispersion for ξ>0.25 and, in addition, close to the zone boundary, the phonon frequencies of this branch decrease with decreasing temperature. This soft-mode behavior may be related to the β→α transformation in La, an assumption supported by recent band-theoretical calculations of the generalized susceptibility of fcc La. At X the frequencies of the L[ξ00] branch are considerably lower than those of the corresponding branch of β-Ce; a similar but not as pronounced effect is observed for the frequencies of the L[ξξξ] branch close to the point L. Since the calculated generalized susceptibility of fcc La exhibits strong peaks at X and L, these anomalies may be due to the renormalization of the phonon frequencies by virtual f↔d transitions to the unoccupied 4f level in La. The data were used to evaluate the elastic constants, the phonon density of states, and the lattice specific heat at constant pressure C_P.

  4. 51Cr - erythrocyte survival curves

    International Nuclear Information System (INIS)

    Paiva Costa, J. de.

    1982-07-01

    Sixteen patients were studied: fifteen patients in a hemolytic state, and a normal individual as a control. The aim was to obtain better techniques for the analysis of erythrocyte survival curves, according to the recommendations of the International Committee of Hematology. The radiochromium method was used as a tracer. A review of the international literature on the aspects inherent to this work was first made, making it possible to establish comparisons and to clarify phenomena observed in our investigation. Several parameters were considered in this study, concerning both the exponential and the linear curves. The analysis of the survival curves of the erythrocytes in the studied group revealed that the elution factor did not present a quantitatively homogeneous response in all patients; nevertheless, the results of the analysis of these curves were established through programs listed in the electronic calculator. (Author) [pt

  5. Melting curves of gamma-irradiated DNA

    International Nuclear Information System (INIS)

    Hofer, H.; Altmann, H.; Kehrer, M.

    1978-08-01

    Melting curves of gamma-irradiated DNA, and data derived from them, are reported. The diminished stability is explained by base destruction. DNA denatures completely at room temperature if at least every fifth base pair is broken or weakened by irradiation. (author)

  6. Management of the learning curve

    DEFF Research Database (Denmark)

    Pedersen, Peter-Christian; Slepniov, Dmitrij

    2016-01-01

    Purpose – This paper focuses on the management of the learning curve in overseas capacity expansions. The purpose of this paper is to unravel the direct as well as indirect influences on the learning curve and to advance the understanding of how these affect its management. Design...... the dimensions of the learning process involved in a capacity expansion project and identified the direct and indirect labour influences on the production learning curve. On this basis, the study proposes solutions to managing learning curves in overseas capacity expansions. Furthermore, the paper concludes...... with measures that have the potential to significantly reduce the non-value-added time when establishing new capacities overseas. Originality/value – The paper uses a longitudinal in-depth case study of a Danish wind turbine manufacturer and goes beyond a simplistic treatment of the lead time and learning...

  7. Growth curves for Laron syndrome.

    OpenAIRE

    Laron, Z; Lilos, P; Klinger, B

    1993-01-01

    Growth curves for children with Laron syndrome were constructed on the basis of repeated measurements made throughout infancy, childhood, and puberty in 24 (10 boys, 14 girls) of the 41 patients with this syndrome investigated in our clinic. Growth retardation was already noted at birth, the birth length ranging from 42 to 46 cm in the 12/20 available measurements. The postnatal growth curves deviated sharply from the normal from infancy on. Both sexes showed no clear pubertal spurt. Girls co...

  8. Flow over riblet curved surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Loureiro, J B R; Freire, A P Silva, E-mail: atila@mecanica.ufrj.br [Mechanical Engineering Program, Federal University of Rio de Janeiro (COPPE/UFRJ), C.P. 68503, 21.941-972, Rio de Janeiro, RJ (Brazil)

    2011-12-22

    The present work studies the mechanics of turbulent drag reduction over curved surfaces by riblets. The effects of surface modification on flow separation over steep and smooth curved surfaces are investigated. Four types of two-dimensional surfaces are studied based on the morphometric parameters that describe the body of a blue whale. Local measurements of mean velocity and turbulence profiles are obtained through laser Doppler anemometry (LDA) and particle image velocimetry (PIV).

  9. Intersection numbers of spectral curves

    CERN Document Server

    Eynard, B.

    2011-01-01

    We compute the symplectic invariants of an arbitrary spectral curve with only one branchpoint in terms of integrals of characteristic classes in the moduli space of curves. Our formula associates to any spectral curve a characteristic class, which is determined by the Laplace transform of the spectral curve. This is a hint to the key role of the Laplace transform in mirror symmetry. When the spectral curve is y=\sqrt{x}, the formula gives the Kontsevich-Witten intersection numbers; when the spectral curve is chosen to be the Lambert function \exp{x}=y\exp{-y}, the formula gives the ELSV formula for Hurwitz numbers; and when one chooses the mirror of C^3 with framing f, i.e. \exp{-x}=\exp{-yf}(1-\exp{-y}), the formula gives the Marino-Vafa formula, i.e. the generating function of Gromov-Witten invariants of C^3. In some sense this formula generalizes the ELSV formula, the Marino-Vafa formula, and Mumford's formula.

  10. Dissolution glow curve in LLD

    International Nuclear Information System (INIS)

    Haverkamp, U.; Wiezorek, C.; Poetter, R.

    1990-01-01

    Lyoluminescence dosimetry is based upon light emission during dissolution of previously irradiated dosimetric materials. The lyoluminescence signal is expressed in the dissolution glow curve. These curves begin, depending on the dissolution system, with a high peak followed by an exponentially decreasing intensity. System parameters that influence the shape of the dissolution glow curve are, for example, the injection speed, the temperature and pH value of the solution, and the design of the dissolution cell. The initial peak does not significantly correlate with the absorbed dose; it is mainly an effect of the injection. The decay of the curve consists of two exponential components: one fast and one slow. The components depend on the absorbed dose and the dosimetric materials used. In particular, the slow component correlates with the absorbed dose. In contrast to the fast component, the argument of the exponential function of the slow component is independent of the dosimetric materials investigated: trehalose, glucose and mannitol. The maximum value following the peak of the curve and the integral light output are a measure of the absorbed dose. The reason for the different light outputs of various dosimetric materials after irradiation with the same dose is their differing solubility. The character of the dissolution glow curves is the same following irradiation with photons, electrons or neutrons. (author)

  11. Curve Boxplot: Generalization of Boxplot for Ensembles of Curves.

    Science.gov (United States)

    Mirzargar, Mahsa; Whitaker, Ross T; Kirby, Robert M

    2014-12-01

    In simulation science, computational scientists often study the behavior of their simulations by repeated solutions with variations in parameters and/or boundary values or initial conditions. Through such simulation ensembles, one can try to understand or quantify the variability or uncertainty in a solution as a function of the various inputs or model assumptions. In response to a growing interest in simulation ensembles, the visualization community has developed a suite of methods for allowing users to observe and understand the properties of these ensembles in an efficient and effective manner. An important aspect of visualizing simulations is the analysis of derived features, often represented as points, surfaces, or curves. In this paper, we present a novel, nonparametric method for summarizing ensembles of 2D and 3D curves. We propose an extension of a method from descriptive statistics, data depth, to curves. We also demonstrate a set of rendering and visualization strategies for showing rank statistics of an ensemble of curves, which is a generalization of traditional whisker plots or boxplots to multidimensional curves. Results are presented for applications in neuroimaging, hurricane forecasting and fluid dynamics.
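    The data-depth idea behind the curve boxplot can be sketched with the band depth statistic for an ensemble of 1D curves. This is an illustrative reduction of the paper's method (which extends to 2D/3D curves); the j=2 band-depth formulation used here is a standard choice from the functional data-depth literature, not necessarily the authors' exact variant.

    ```python
    import numpy as np
    from itertools import combinations

    def band_depth(curves):
        """Band depth (j = 2) of each curve in an ensemble.
        curves: (n_curves, n_samples) array.  A curve's depth is the
        fraction of curve pairs whose pointwise envelope (band) fully
        contains it; the deepest curve plays the role of the median."""
        curves = np.asarray(curves, float)
        n = len(curves)
        depth = np.zeros(n)
        for i, j in combinations(range(n), 2):
            lo = np.minimum(curves[i], curves[j])
            hi = np.maximum(curves[i], curves[j])
            # which ensemble members lie entirely inside this band?
            inside = np.all((curves >= lo) & (curves <= hi), axis=1)
            depth += inside
        return depth / (n * (n - 1) / 2)
    ```

    Ranking curves by depth yields the median, the central 50% band, and outlier candidates, exactly the ingredients a boxplot generalization needs.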

  12. Recurrence and frequency of disturbance have cumulative effect on methanotrophic activity, abundance, and community structure.

    NARCIS (Netherlands)

    Ho, A.; van den Brink, E.; Reim, A.; Krause, S.; Bodelier, P.L.E.

    2016-01-01

    Alternate prolonged drought and heavy rainfall is predicted to intensify with global warming. Desiccation-rewetting events alter the soil quality and nutrient concentrations which drive microbial-mediated processes, including methane oxidation, a key biogeochemical process catalyzed by

  13. Growth curves in Down syndrome with congenital heart disease

    Directory of Open Access Journals (Sweden)

    Caroline D’Azevedo Sica

    Full Text Available SUMMARY Introduction: To assess dietary habits, nutritional status and food frequency in children and adolescents with Down syndrome (DS) and congenital heart disease (CHD). Additionally, we attempted to compare body mass index (BMI) classifications according to the World Health Organization (WHO) curves and curves developed for individuals with DS. Method: Cross-sectional study including individuals with DS and CHD treated at a referral center for cardiology, aged 2 to 18 years. Weight, height, BMI, total energy and food frequency were measured. Nutritional status was assessed using BMI for age and gender, using both the curves developed for the evaluation of patients with DS and those set by the WHO. Results: 68 subjects with DS and CHD were evaluated. Atrioventricular septal defect (AVSD) was the most common heart disease (52.9%). There were differences in BMI classification between the curves proposed for patients with DS and those proposed by the WHO. There was an association between consumption of vitamin E and polyunsaturated fatty acids. Conclusion: Results showed that individuals with DS are mostly of normal weight for age when evaluated using DS-specific curves. The use of DS-specific curves would be the recommended practice for health professionals, so as to avoid premature diagnosis of overweight and/or obesity in this population.

  14. Maintenance hemodialysis patients have high cumulative radiation exposure.

    LENUS (Irish Health Repository)

    Kinsella, Sinead M

    2010-10-01

    Hemodialysis is associated with an increased risk of neoplasms which may result, at least in part, from exposure to ionizing radiation associated with frequent radiographic procedures. In order to estimate the average radiation exposure of those on hemodialysis, we conducted a retrospective study of 100 patients in a university-based dialysis unit followed for a median of 3.4 years. The number and type of radiological procedures were obtained from a central radiology database, and the cumulative effective radiation dose was calculated using standardized, procedure-specific radiation levels. The median annual radiation dose was 6.9 millisieverts (mSv) per patient-year. However, 14 patients had an annual cumulative effective radiation dose over 20 mSv, the upper average annual limit for occupational exposure. The median total cumulative effective radiation dose per patient over the study period was 21.7 mSv, and 13 patients had a total cumulative effective radiation dose over 75 mSv, a value reported to be associated with a 7% increased risk of cancer-related mortality. Two-thirds of the total cumulative effective radiation dose was due to CT scanning. The average radiation exposure was significantly associated with the cause of end-stage renal disease, history of ischemic heart disease, transplant waitlist status, number of in-patient hospital days over follow-up, and death during the study period. These results highlight the substantial exposure to ionizing radiation in hemodialysis patients.

  15. Considerations for reference pump curves

    International Nuclear Information System (INIS)

    Stockton, N.B.

    1992-01-01

    This paper examines problems associated with inservice testing (IST) of pumps to assess their hydraulic performance using reference pump curves to establish acceptance criteria. Safety-related pumps at nuclear power plants are tested under the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (the Code), Section 11. The Code requires testing pumps at specific reference points of differential pressure or flow rate that can be readily duplicated during subsequent tests. There are many cases where test conditions cannot be duplicated. For some pumps, such as service water or component cooling pumps, the flow rate at any time depends on plant conditions and the arrangement of multiple independent and constantly changing loads. System conditions cannot be controlled to duplicate a specific reference value. In these cases, utilities frequently request to use pump curves for comparison of test data for acceptance. There is no prescribed method for developing a pump reference curve. The methods vary and may yield substantially different results. Some results are conservative when compared to the Code requirements; some are not. The errors associated with different curve testing techniques should be understood and controlled within reasonable bounds. Manufacturer's pump curves, in general, are not sufficiently accurate to use as reference pump curves for IST. Testing using reference curves generated with polynomial least squares fits over limited ranges of pump operation, cubic spline interpolation, or cubic spline least squares fits can provide a measure of pump hydraulic performance that is at least as accurate as the Code required method. Regardless of the test method, error can be reduced by using more accurate instruments, by correcting for systematic errors, by increasing the number of data points, and by taking repetitive measurements at each data point
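    The two curve-fitting approaches the paper compares, polynomial least squares over a limited operating range and cubic-spline interpolation through baseline points, can be sketched as below. The flow and differential-pressure values are hypothetical baseline test data, not from the paper.

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline

    # Hypothetical baseline points: flow rate vs. differential pressure
    flow = np.array([100., 200., 300., 400., 500., 600.])   # e.g. gpm
    dp   = np.array([95., 92., 86., 77., 65., 50.])          # e.g. psid

    # Reference curve 1: polynomial least-squares fit over the range
    poly_ref = np.poly1d(np.polyfit(flow, dp, deg=2))

    # Reference curve 2: cubic-spline interpolation through the points
    spline_ref = CubicSpline(flow, dp)

    # A later IST measurement at an arbitrary flow is then compared
    # against the reference-curve value at that same flow
    measured_flow, measured_dp = 350.0, 80.5
    deviation = measured_dp - float(spline_ref(measured_flow))
    ```

    Either curve lets a test at a non-repeatable flow rate be judged against a consistent baseline, which is the point the paper makes about systems whose loads cannot be held at a fixed reference value.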

  16. The GO Cygni system: photoelectric observations and light curves analysis

    International Nuclear Information System (INIS)

    Rovithis, P.; Rovithis-Livaniou, H.; Niarchos, P.G.

    1990-01-01

    Photoelectric observations, in B and V, of the system GO Cygni obtained during 1985 at the Kryonerion Astronomical Station of the National Observatory of Greece are given. The corresponding light curves (typical β Lyrae) are analysed using frequency domain techniques. New photoelectric and absolute elements for the system are given, and its period was found to continue increasing.

  17. Curve Digitizer – A software for multiple curves digitizing

    Directory of Open Access Journals (Sweden)

    Florentin ŞPERLEA

    2010-06-01

    Full Text Available The Curve Digitizer is software that extracts data from an image file representing a graphic and returns them as pairs of numbers which can then be used for further analysis and applications. Numbers can be read on a computer screen, stored in files, or copied on paper. The final result is a data set that can be used with other tools such as MS EXCEL. Curve Digitizer provides a useful tool for any researcher or engineer interested in quantifying data displayed graphically. The image file can be obtained by scanning a document

  18. Developing Novel Reservoir Rule Curves Using Seasonal Inflow Projections

    Science.gov (United States)

    Tseng, Hsin-yi; Tung, Ching-pin

    2015-04-01

    Due to significant seasonal rainfall variations, reservoirs and their flexible operational rules are indispensable to Taiwan. Furthermore, with the intensifying impacts of climate change on extreme climate, the frequency of droughts in Taiwan has been increasing in recent years. Drought is a creeping phenomenon; its slow onset makes it difficult to detect at an early stage and causes delays in making the best decisions about allocating water. For these reasons, novel reservoir rule curves using projected seasonal streamflow are proposed in this study, which can potentially reduce the adverse effects of drought. This study is dedicated to establishing new rule curves which consider both the current available storage and the anticipated monthly inflows, with a lead time of two months, to reduce the risk of water shortage. The monthly inflows are projected based on the seasonal climate forecasts from the Central Weather Bureau (CWB), where a weather generation model is used to produce daily weather data for the hydrological component of the GWLF model. To incorporate future monthly inflow projections into rule curves, this study designs a decision flow index which is a linear combination of the current available storage and the inflow projections with a lead time of 2 months. By optimizing the linear combination coefficients of the decision flow index, the shape of the rule curves and the percentage of water supplied in each zone, the best rule curves to decrease water shortage risk and impacts can be developed. The Shimen Reservoir in northern Taiwan is used as a case study to demonstrate the proposed method. The existing rule curves (M5 curves) of Shimen Reservoir are compared with two cases of new rule curves, including hindcast simulations and historic seasonal forecasts. The results show the new rule curves can decrease the total water shortage ratio and, in addition, can also allocate the shortage amount to preceding months to avoid extreme shortage events. Even though some uncertainties in
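    The decision flow index described above can be sketched as a linear combination of storage and projected inflows. The coefficients, zone thresholds, and supply fractions below are illustrative placeholders; in the study they would be optimized against shortage risk.

    ```python
    def decision_flow_index(storage, inflow_t1, inflow_t2,
                            a=1.0, b=0.7, c=0.4):
        """Linear combination of current available storage and the
        projected inflows one and two months ahead (coefficients a, b, c
        are assumed values, to be optimized in practice)."""
        return a * storage + b * inflow_t1 + c * inflow_t2

    def supply_fraction(dfi, lower, upper):
        """Map the index onto rule-curve zones: full supply above the
        upper curve, moderate rationing between the curves, deeper
        rationing below (zone percentages assumed)."""
        if dfi >= upper:
            return 1.0      # normal operation
        elif dfi >= lower:
            return 0.85     # moderate rationing
        return 0.7          # severe rationing
    ```

    Anticipating low inflows two months ahead lowers the index early, which is how shortage is spread over preceding months instead of arriving as a single extreme event.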

  19. Regional flow duration curves for ungauged sites in Sicily

    Directory of Open Access Journals (Sweden)

    F. Viola

    2011-01-01

    Full Text Available Flow duration curves are simple and powerful tools to deal with many hydrological and environmental problems related to water quality assessment, water-use assessment and water allocation. Unfortunately, the scarcity of streamflow data limits the use of these instruments to gauged basins. A regional model is developed here for estimating flow duration curves at ungauged basins in Sicily, Italy. Due to the complex ephemeral behavior of the examined region, this study distinguishes dry periods, when flows are zero, from wet periods, using a three-parameter power law to describe the frequency distribution of flows. A large dataset of streamflows has been analyzed and the parameters of flow duration curves have been derived for about fifty basins. Regional regression equations have been developed to derive flow duration curves starting from morphological basin characteristics.
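    The dry/wet separation for an ephemeral stream can be sketched empirically: zero-flow days set the wet-period fraction, and exceedance probabilities of the nonzero flows are rescaled by it. This is an illustrative construction, not the paper's fitted three-parameter power law; the Weibull plotting position used here is an assumption.

    ```python
    import numpy as np

    def flow_duration_curve(flows):
        """Empirical flow duration curve for an ephemeral record.
        Returns (exceedance, sorted_flows) for the nonzero flows, with
        exceedance probabilities scaled by the fraction of days with
        flow, so the curve terminates at the zero-flow probability."""
        flows = np.asarray(flows, float)
        wet = np.sort(flows[flows > 0])[::-1]      # descending nonzero flows
        p_wet = len(wet) / len(flows)              # fraction of flowing days
        # Weibull plotting position within the wet period, rescaled
        exceedance = p_wet * np.arange(1, len(wet) + 1) / (len(wet) + 1)
        return exceedance, wet
    ```

    A parametric model (such as the power law used in the study) would then be fitted to the wet-period portion of this empirical curve.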

  20. Cumulative Trauma Among Mayas Living in Southeast Florida.

    Science.gov (United States)

    Millender, Eugenia I; Lowe, John

    2017-06-01

    Mayas, having experienced genocide, exile, and severe poverty, are at high risk for the consequences of cumulative trauma that continually resurfaces through current fear of an uncertain future. Little is known about the mental health and alcohol use status of this population. This correlational study explored the relationship of cumulative trauma as it relates to social determinants of health (years in the United States, education, health insurance status, marital status, and employment), psychological health (depression symptoms), and health behaviors (alcohol use) of 102 Guatemalan Mayas living in Southeast Florida. The results of this study indicated that, as specific social determinants of health and cumulative trauma increased, depression symptoms (particularly among women) and the risk for harmful alcohol use (particularly among men) increased. Identifying risk factors at an early stage, before serious disease or problems are manifest, provides room for early screening, leading to early identification, early treatment, and better outcomes.

  1. Session: What do we know about cumulative or population impacts

    Energy Technology Data Exchange (ETDEWEB)

    Kerlinger, Paul; Manville, Al; Kendall, Bill

    2004-09-01

    This session at the Wind Energy and Birds/Bats workshop consisted of a panel discussion followed by a discussion/question and answer period. The panelists were Paul Kerlinger, Curry and Kerlinger, LLC; Al Manville, U.S. Fish and Wildlife Service; and Bill Kendall, U.S. Geological Survey. The panel addressed the potential cumulative impacts of wind turbines on bird and bat populations over time. Panel members gave brief presentations that touched on what is currently known, what laws apply, and the usefulness of population modeling. Topics addressed included which sources of modeling should be included in cumulative impacts, comparison of impacts from different modes of energy generation, as well as what research is still needed regarding cumulative impacts of wind energy development on bird and bat populations.

  2. Evolutionary neural network modeling for software cumulative failure time prediction

    International Nuclear Information System (INIS)

    Tian Liang; Noore, Afzel

    2005-01-01

    An evolutionary neural network modeling approach for software cumulative failure time prediction, based on a multiple-delayed-input single-output architecture, is proposed. A genetic algorithm is used to globally optimize the number of delayed input neurons and the number of neurons in the hidden layer of the neural network architecture. A modification of the Levenberg-Marquardt algorithm with Bayesian regularization is used to improve the ability to predict software cumulative failure time. The performance of our proposed approach has been compared using real-time control and flight dynamic application data sets. Numerical results show that both the goodness-of-fit and the next-step predictability of our proposed approach are more accurate in predicting software cumulative failure time than existing approaches

  3. Baltic Sea biodiversity status vs. cumulative human pressures

    DEFF Research Database (Denmark)

    Andersen, Jesper H.; Halpern, Benjamin S.; Korpinen, Samuli

    2015-01-01

    Abstract Many studies have tried to explain spatial and temporal variations in biodiversity status of marine areas from a single-issue perspective, such as fishing pressure or coastal pollution, yet most continental seas experience a wide range of human pressures. Cumulative impact assessments have...... been developed to capture the consequences of multiple stressors for biodiversity, but the ability of these assessments to accurately predict biodiversity status has never been tested or ground-truthed. This relationship has similarly been assumed for the Baltic Sea, especially in areas with impaired...... status, but has also never been documented. Here we provide a first tentative indication that cumulative human impacts relate to ecosystem condition, i.e. biodiversity status, in the Baltic Sea. Thus, cumulative impact assessments offer a promising tool for informed marine spatial planning, designation...

  4. Benefit and cost curves for typical pollination mutualisms.

    Science.gov (United States)

    Morris, William F; Vázquez, Diego P; Chacoff, Natacha P

    2010-05-01

    Mutualisms provide benefits to interacting species, but they also involve costs. If costs come to exceed benefits as population density or the frequency of encounters between species increases, the interaction will no longer be mutualistic. Thus curves that represent benefits and costs as functions of interaction frequency are important tools for predicting when a mutualism will tip over into antagonism. Currently, most of what we know about benefit and cost curves in pollination mutualisms comes from highly specialized pollinating seed-consumer mutualisms, such as the yucca moth-yucca interaction. There, benefits to female reproduction saturate as the number of visits to a flower increases (because the amount of pollen needed to fertilize all the flower's ovules is finite), but costs continue to increase (because pollinator offspring consume developing seeds), leading to a peak in seed production at an intermediate number of visits. But for most plant-pollinator mutualisms, costs to the plant are more subtle than consumption of seeds, and how such costs scale with interaction frequency remains largely unknown. Here, we present reasonable benefit and cost curves that are appropriate for typical pollinator-plant interactions, and we show how they can result in a wide diversity of relationships between net benefit (benefit minus cost) and interaction frequency. We then use maximum-likelihood methods to fit net-benefit curves to measures of female reproductive success for three typical pollination mutualisms from two continents, and for each system we chose the most parsimonious model using information-criterion statistics. We discuss the implications of the shape of the net-benefit curve for the ecology and evolution of plant-pollinator mutualisms, as well as the challenges that lie ahead for disentangling the underlying benefit and cost curves for typical pollination mutualisms.
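    The benefit and cost shapes described for pollinating seed-consumer mutualisms can be sketched as a saturating benefit minus a linearly increasing cost, which produces the peak in net benefit at an intermediate visit number. The functional forms and parameter values below are illustrative assumptions, not the fitted curves from the paper.

    ```python
    import numpy as np

    def net_benefit(visits, bmax=100.0, h=3.0, cost_per_visit=4.0):
        """Saturating benefit (all ovules eventually fertilized) minus a
        cost that keeps rising with visits (e.g. seeds consumed by
        pollinator offspring).  All parameters are illustrative."""
        benefit = bmax * visits / (h + visits)   # saturates at bmax
        cost = cost_per_visit * visits           # grows without bound
        return benefit - cost

    visits = np.arange(0, 30)
    nb = net_benefit(visits)
    peak = int(visits[np.argmax(nb)])   # interior peak: mutualism tips
                                        # toward antagonism beyond it
    ```

    With these parameters the net benefit peaks at six visits and eventually goes negative, the point at which the interaction would no longer be mutualistic.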

  5. Stochastic evaluation of the dynamic response and the cumulative damage of nuclear power plant piping

    International Nuclear Information System (INIS)

    Suzuki, Kohei; Aoki, Shigeru; Hanaoka, Masaaki

    1981-01-01

    This report deals with a fundamental study of the uncertainties in the response and cumulative damage of nuclear piping under excess-earthquake loadings. The main purposes of this study cover the following problems. (1) Experimental estimation of the uncertainties concerning the dynamic response and cumulative failure, using a piping test model. (2) Numerical simulation by the Monte Carlo method under the assumption that the relation between restoring force and deformation is perfectly elasto-plastic (checking the mathematical model). (3) Development of a conventional uncertainty estimating method by introducing a perturbation technique based on an appropriate equivalently linearized approach (checking the estimation technique). (4) An application of this method to more realistic cases. Through the above-mentioned procedures some important results were obtained, as follows. First, fundamental statistical properties of the natural frequencies and the number of cycles to failure crack initiation are evaluated. Second, the effects of the frequency fluctuation and the yielding fluctuation are estimated and examined through the Monte Carlo simulation technique. It has become clear that the yielding fluctuation has a significant effect on the piping power response up to failure initiation. Finally, some results obtained through the proposed perturbation technique are discussed. The statistical properties estimated coincide fairly well with those obtained through numerical simulation. (author)

  6. Cumulative carbon as a policy framework for achieving climate stabilization

    Science.gov (United States)

    Matthews, H. Damon; Solomon, Susan; Pierrehumbert, Raymond

    2012-01-01

    The primary objective of the United Nations Framework Convention on Climate Change is to stabilize greenhouse gas concentrations at a level that will avoid dangerous climate impacts. However, greenhouse gas concentration stabilization is an awkward framework within which to assess dangerous climate change on account of the significant lag between a given concentration level and the eventual equilibrium temperature change. By contrast, recent research has shown that global temperature change can be well described by a given cumulative carbon emissions budget. Here, we propose that cumulative carbon emissions represent an alternative framework that is applicable both as a tool for climate mitigation as well as for the assessment of potential climate impacts. We show first that both atmospheric CO2 concentration at a given year and the associated temperature change are generally associated with a unique cumulative carbon emissions budget that is largely independent of the emissions scenario. The rate of global temperature change can therefore be related to first order to the rate of increase of cumulative carbon emissions. However, transient warming over the next century will also be strongly affected by emissions of shorter lived forcing agents such as aerosols and methane. Non-CO2 emissions therefore contribute to uncertainty in the cumulative carbon budget associated with near-term temperature targets, and may suggest the need for a mitigation approach that considers separately short- and long-lived gas emissions. By contrast, long-term temperature change remains primarily associated with total cumulative carbon emissions owing to the much longer atmospheric residence time of CO2 relative to other major climate forcing agents. PMID:22869803
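    The first-order relationship the paper builds on, global temperature change scaling approximately linearly with cumulative carbon emissions, can be sketched as below. The proportionality constant (the transient climate response to cumulative emissions) used here is an illustrative mid-range figure, not a value taken from the paper.

    ```python
    def warming_from_cumulative_emissions(cumulative_pgc, tcre=1.65e-3):
        """CO2-induced warming (deg C) as a linear function of cumulative
        carbon emissions (PgC).  tcre ~ 1.65 C per 1000 PgC is an assumed
        illustrative value; the true figure is uncertain."""
        return tcre * cumulative_pgc

    # Cumulative budget consistent with a 2 C CO2-induced warming target
    # under this assumed TCRE
    budget_pgc = 2.0 / 1.65e-3   # roughly 1200 PgC
    ```

    The linearity is what makes a cumulative budget a scenario-independent policy quantity: the same budget corresponds to roughly the same peak CO2-induced warming regardless of the emissions pathway, with non-CO2 forcers adding the near-term uncertainty the abstract notes.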

  7. The role of factorial cumulants in reactor neutron noise theory

    International Nuclear Information System (INIS)

    Colombino, A.; Pacilio, N.; Sena, G.

    1979-01-01

    The physical meaning and the combinatorial implications of the factorial cumulant of a state variable, such as the number of neutrons or the number of neutron counts, are specified. Features of the presentation are: (a) the fission process is treated in its entirety, without the customary binary-emission restriction; (b) the introduction of the factorial cumulants helps in reducing the complexity of the mathematical problems; (c) all the solutions can be obtained analytically. Only the ergodic hypothesis for the neutron population evolution is dealt with. (author)

  8. Super-Resolution Algorithm in Cumulative Virtual Blanking

    Science.gov (United States)

    Montillet, J. P.; Meng, X.; Roberts, G. W.; Woolfson, M. S.

    2008-11-01

    The proliferation of mobile devices and the emergence of wireless location-based services have generated consumer demand for precise positioning. In this paper, the MUSIC super-resolution algorithm is applied to time-delay estimation for positioning purposes in cellular networks. The goal is to position a Mobile Station using UMTS technology. The Base-Station hearability problem is addressed using Cumulative Virtual Blanking. A simple simulator using DS-SS signals is presented. The results show that the MUSIC algorithm improves the time-delay estimation both with and without Cumulative Virtual Blanking.
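    As a sketch of the delay-estimation step (a simulated frequency-domain multipath channel with hypothetical delays, not the paper's simulator), MUSIC projects candidate steering vectors onto the noise subspace of the sample covariance and peaks where the projection vanishes:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Frequency-domain multipath model: each snapshot is
    #   x(f) = sum_k a_k * exp(-2j*pi*f*tau_k) + noise,
    # with independent complex path gains a_k per snapshot (fading).
    n_freq, n_snap = 64, 400
    freqs = np.arange(n_freq)
    taus_true = np.array([0.050, 0.110])  # normalized path delays (hypothetical)

    def steering(tau):
        return np.exp(-2j * np.pi * freqs * tau)

    A = np.column_stack([steering(t) for t in taus_true])
    gains = rng.normal(size=(2, n_snap)) + 1j * rng.normal(size=(2, n_snap))
    noise = 0.05 * (rng.normal(size=(n_freq, n_snap))
                    + 1j * rng.normal(size=(n_freq, n_snap)))
    X = A @ gains + noise

    # MUSIC: eigendecompose the sample covariance; the smallest eigenvectors
    # span the noise subspace, orthogonal to the true steering vectors.
    R = X @ X.conj().T / n_snap
    eigvals, eigvecs = np.linalg.eigh(R)          # ascending eigenvalues
    En = eigvecs[:, : n_freq - len(taus_true)]    # noise subspace

    tau_grid = np.linspace(0.0, 0.2, 2001)
    spectrum = np.array([1.0 / np.linalg.norm(En.conj().T @ steering(t)) ** 2
                         for t in tau_grid])
    tau_hat = tau_grid[np.argmax(spectrum)]
    print(round(tau_hat, 3))  # near one of the true delays
    ```

    The pseudospectrum has sharp maxima at delays much closer together than the Fourier resolution limit would allow, which is the "super-resolution" property the abstract refers to.
    
    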

  9. Calibration curves for biological dosimetry

    International Nuclear Information System (INIS)

    Guerrero C, C.; Brena V, M. . E-mail cgc@nuclear.inin.mx

    2004-01-01

    The information generated by investigations in different laboratories around the world, including the ININ, which establish that certain classes of chromosomal aberrations increase as a function of dose and radiation type, has resulted in calibration curves that are applied in the technique known as biological dosimetry. This work presents a summary of the work carried out in the laboratory, including the calibration curves for cobalt-60 gamma radiation and 250 kVp X-rays, examples of presumed exposures to ionizing radiation resolved by means of aberration analysis and the corresponding dose estimates obtained through the equations of the respective curves, and finally a comparison between the dose estimates for the people affected by the Ciudad Juarez accident carried out by the Oak Ridge group (USA) and those obtained in this laboratory. (Author)
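    Dose-response calibration in biological dosimetry is commonly modelled with a linear-quadratic yield Y = c + αD + βD²; the abstract does not give the ININ coefficients, so the sketch below fits hypothetical aberration-yield data with numpy and inverts the fitted curve to estimate a dose from an observed yield:

    ```python
    import numpy as np

    # Hypothetical dicentric yields (aberrations per cell) at known doses (Gy).
    doses = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 3.0, 4.0])
    yields = np.array([0.001, 0.012, 0.03, 0.08, 0.25, 0.52, 0.88])

    # Fit the linear-quadratic model Y = c + a*D + b*D^2 (least squares).
    b, a, c = np.polyfit(doses, yields, 2)

    def dose_from_yield(y):
        """Invert Y = c + a*D + b*D^2 for the positive root D."""
        disc = a * a - 4.0 * b * (c - y)
        return (-a + np.sqrt(disc)) / (2.0 * b)

    # Estimate the dose behind an observed yield of 0.30 dicentrics/cell.
    print(round(dose_from_yield(0.30), 2))
    ```

    In practice laboratories fit such curves by maximum likelihood with Poisson-distributed counts; the least-squares fit here is only to show how the curve, once calibrated, is inverted to estimate dose.
    
    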

  10. Vertex algebras and algebraic curves

    CERN Document Server

    Frenkel, Edward

    2004-01-01

    Vertex algebras are algebraic objects that encapsulate the concept of operator product expansion from two-dimensional conformal field theory. Vertex algebras are fast becoming ubiquitous in many areas of modern mathematics, with applications to representation theory, algebraic geometry, the theory of finite groups, modular functions, topology, integrable systems, and combinatorics. This book is an introduction to the theory of vertex algebras with a particular emphasis on the relationship with the geometry of algebraic curves. The notion of a vertex algebra is introduced in a coordinate-independent way, so that vertex operators become well defined on arbitrary smooth algebraic curves, possibly equipped with additional data, such as a vector bundle. Vertex algebras then appear as the algebraic objects encoding the geometric structure of various moduli spaces associated with algebraic curves. Therefore they may be used to give a geometric interpretation of various questions of representation theory. The book co...

  11. Curve collection, extension of databases

    International Nuclear Information System (INIS)

    Gillemot, F.

    1992-01-01

    Full text: Databases generally contain calculated data only, while the original measurements are diagrams; information is lost between them. Expensive research (e.g. irradiation, ageing, creep) means that the original curves should be stored for reanalysis. Format of the stored curves: (a) data as numbers in ASCII files; (b) other information as strings in a second file with the same name but a different extension, where the extension shows the type of the test and the type of the file. Examples: TEN is tensile information, TED is tensile data, CHN is Charpy information, CHD is Charpy data. Storing techniques: digitized measurements, and digitizing old curves stored on paper. Use: making catalogues, reanalysis, comparison with new data. Tools: mathematical software packages such as Quattro, Genplot, Excel, MathCAD, QBasic, Pascal, Fortran, MATLAB, Grapher etc. (author)

  12. Rational points on elliptic curves

    CERN Document Server

    Silverman, Joseph H

    2015-01-01

    The theory of elliptic curves involves a pleasing blend of algebra, geometry, analysis, and number theory. This book stresses this interplay as it develops the basic theory, thereby providing an opportunity for advanced undergraduates to appreciate the unity of modern mathematics. At the same time, every effort has been made to use only methods and results commonly included in the undergraduate curriculum. This accessibility, the informal writing style, and a wealth of exercises make Rational Points on Elliptic Curves an ideal introduction for students at all levels who are interested in learning about Diophantine equations and arithmetic geometry. Most concretely, an elliptic curve is the set of zeroes of a cubic polynomial in two variables. If the polynomial has rational coefficients, then one can ask for a description of those zeroes whose coordinates are either integers or rational numbers. It is this number theoretic question that is the main subject of this book. Topics covered include the geometry and ...

  13. Theoretical melting curve of caesium

    International Nuclear Information System (INIS)

    Simozar, S.; Girifalco, L.A.; Pennsylvania Univ., Philadelphia

    1983-01-01

    A statistical-mechanical model is developed to account for the complex melting curve of caesium. The model assumes the existence of three different species of caesium defined by three different electronic states. On the basis of this model, the free energy of melting and the melting curve are computed up to 60 kbar, using the solid-state data and the initial slope of the fusion curve as input parameters. The calculated phase diagram agrees with experiment to within the experimental error. Other thermodynamic properties including the entropy and volume of melting were also computed, and they agree with experiment. Since the theory requires only one adjustable constant, this is taken as strong evidence that the three-species model is satisfactory for caesium. (author)

  14. Migration and the Wage Curve:

    DEFF Research Database (Denmark)

    Brücker, Herbert; Jahn, Elke J.

    Based on a wage curve approach we examine the labor market effects of migration in Germany. The wage curve relies on the assumption that wages respond to a change in the unemployment rate, albeit imperfectly. This allows one to derive the wage and employment effects of migration simultaneously in a general equilibrium framework. For the empirical analysis we employ the IABS, a two percent sample of the German labor force. We find that the elasticity of the wage curve is particularly high for young workers and workers with a university degree, while it is low for older workers and workers with a vocational degree. The wage and employment effects of migration are moderate: a 1 percent increase in the German labor force through immigration increases the aggregate unemployment rate by less than 0.1 percentage points and reduces average wages by less than 0.1 percent. While native workers benefit from...

  15. Laffer Curves and Home Production

    Directory of Open Access Journals (Sweden)

    Kotamäki Mauri

    2017-06-01

    In the earlier related literature, the consumption tax rate Laffer curve is found to be strictly increasing (see Trabandt and Uhlig (2011)). In this paper, a general equilibrium macro model is augmented by introducing a substitute for private consumption in the form of home production. The introduction of home production brings about an additional margin of adjustment: an increase in the consumption tax rate not only decreases labor supply and reduces the consumption tax base but also allows a substitution of market goods with home-produced goods. The main objective of this paper is to show that, after the introduction of home production, the consumption tax Laffer curve exhibits an inverse U-shape. The income tax Laffer curves are also significantly altered. The result shown in this paper casts doubt on some of the earlier results in the literature.
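    The inverse-U mechanism can be illustrated with a deliberately stylized sketch (not the paper's model): suppose the consumption tax base shrinks in the tax rate τ both through lower labor supply and through substitution into home production, e.g. base(τ) = (1 − τ)^η with a hypothetical combined elasticity η > 1. Revenue R(τ) = τ·base(τ) then peaks at an interior rate:

    ```python
    import numpy as np

    ETA = 2.0  # combined elasticity of the tax base; hypothetical, not from the paper

    def revenue(tau, eta=ETA):
        """Toy consumption tax revenue: rate times a shrinking base (1 - tau)^eta."""
        return tau * (1.0 - tau) ** eta

    taus = np.linspace(0.0, 1.0, 1001)
    rev = revenue(taus)
    peak = taus[np.argmax(rev)]
    print(round(peak, 2))  # interior peak at tau = 1/(1 + eta)
    ```

    With η = 2 the analytic peak is at τ = 1/3: revenue rises, tops out, and then falls as the base erodes faster than the rate grows — the inverse U-shape the paper establishes in a full general equilibrium setting.
    
    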

  16. Complexity of Curved Glass Structures

    Science.gov (United States)

    Kosić, T.; Svetel, I.; Cekić, Z.

    2017-11-01

    Despite the increasing amount of research on architectural structures of curvilinear form and the technological and practical improvement of glass production observed over recent years, there is still a lack of comprehensive codes and standards, recommendations, and experience data linked to real-life curved glass structure applications regarding design, manufacture, use, performance and economy. However, more and more complex buildings and structures with large areas of geometrically complex glass envelopes are built every year. The aim of the presented research is to collect data on the existing design philosophy from curved glass structure cases. The investigation includes a survey of how architects and engineers deal with different design aspects of curved glass structures, with a special focus on the design and construction process, glass types, and structural and fixing systems. The current paper gives a brief overview of the survey findings.

  17. Predicting expressway crash frequency using a random effect negative binomial model: A case study in China.

    Science.gov (United States)

    Ma, Zhuanglin; Zhang, Honglu; Chien, Steven I-Jy; Wang, Jin; Dong, Chunjiao

    2017-01-01

    To investigate the relationship between crash frequency and potential influence factors, the accident data for events occurring on a 50 km long expressway in China, including 567 crash records (2006-2008), were collected and analyzed. Both the fixed-length and the homogeneous longitudinal grade methods were applied to divide the study expressway section into segments. A negative binomial (NB) model and a random effect negative binomial (RENB) model were developed to predict crash frequency. The parameters of both models were determined using the maximum likelihood (ML) method, and the mixed stepwise procedure was applied to examine the significance of explanatory variables. Three explanatory variables, including longitudinal grade, road width, and ratio of longitudinal grade and curve radius (RGR), were found to significantly affect crash frequency. The marginal effects of the significant explanatory variables on crash frequency were analyzed. The model performance was determined by the relative prediction error and the cumulative standardized residual. The results show that the RENB model outperforms the NB model. It was also found that the model performance with the fixed-length segment method is superior to that with the homogeneous longitudinal grade segment method. Copyright © 2016. Published by Elsevier Ltd.
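    As a sketch of the modelling step (the paper's segment data and covariates are not public, so the data below are simulated, and the over-dispersion parameter is held fixed for brevity), a negative binomial crash-frequency model can be fitted by maximum likelihood with scipy; the random effect extension of the RENB model is omitted:

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import gammaln

    rng = np.random.default_rng(0)

    # Simulated segments: intercept, longitudinal grade (%), road width (m).
    n = 2000
    X = np.column_stack([np.ones(n), rng.uniform(0, 6, n), rng.uniform(7, 13, n)])
    beta_true = np.array([-1.0, 0.25, -0.05])
    alpha = 0.5  # NB2 over-dispersion (held fixed here for simplicity)

    mu = np.exp(X @ beta_true)
    # NB2 draws: Poisson counts with gamma-distributed rates (mean mu).
    y = rng.poisson(rng.gamma(1.0 / alpha, alpha * mu))

    def nb_negloglik(beta):
        m = np.exp(X @ beta)
        r = 1.0 / alpha
        ll = (gammaln(y + r) - gammaln(r) - gammaln(y + 1)
              + r * np.log(r / (r + m)) + y * np.log(m / (r + m)))
        return -ll.sum()

    fit = minimize(nb_negloglik, np.zeros(3), method="BFGS")
    print(np.round(fit.x, 2))  # should be close to beta_true
    ```

    The fitted coefficients recover the simulated effects; marginal effects of a covariate on expected crash frequency follow directly as ∂μ/∂x_j = β_j·μ under the log link.
    
    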

  18. Optimization on Spaces of Curves

    DEFF Research Database (Denmark)

    Møller-Andersen, Jakob

    in Rd, and methods to solve the initial and boundary value problem for geodesics allowing us to compute the Karcher mean and principal components analysis of data of curves. We apply the methods to study shape variation in synthetic data in the Kimia shape database, in HeLa cell nuclei and cycles...... of cardiac deformations. Finally we investigate a new application of Riemannian shape analysis in shape optimization. We setup a simple elliptic model problem, and describe how to apply shape calculus to obtain directional derivatives in the manifold of planar curves. We present an implementation based...

  19. Tracing a planar algebraic curve

    International Nuclear Information System (INIS)

    Chen Falai; Kozak, J.

    1994-09-01

    In this paper, an algorithm that determines a real algebraic curve is outlined. Its basic step is to divide the plane into subdomains that include only simple branches of the algebraic curve without singular points. Each of the branches is then stably and efficiently traced in the particular subdomain. Except for the tracing, the algorithm requires only a couple of simple operations on polynomials that can be carried out exactly if the coefficients are rational, and the determination of zeros of several polynomials of one variable. (author). 5 refs, 4 figs
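    The subdivision idea can be illustrated with a crude sketch (not the authors' algorithm, which traces branches exactly): evaluate the polynomial on a grid and flag cells where its sign changes, which localizes the simple branches of the curve:

    ```python
    import numpy as np

    def f(x, y):
        # Example curve: the circle x^2 + y^2 - 1 = 0.
        return x * x + y * y - 1.0

    # Sample f on a grid over [-2, 2]^2.
    xs = np.linspace(-2, 2, 201)
    ys = np.linspace(-2, 2, 201)
    vals = f(xs[None, :], ys[:, None])

    # A cell is near a branch if f changes sign across one of its edges.
    s = np.sign(vals)
    cells = (s[:-1, :-1] * s[1:, :-1] < 0) | (s[:-1, :-1] * s[:-1, 1:] < 0)
    iy, ix = np.nonzero(cells)
    points = np.column_stack([xs[ix], ys[iy]])  # cell corners near the curve

    # All flagged corners lie close to the unit circle.
    print(np.abs(np.hypot(points[:, 0], points[:, 1]) - 1.0).max() < 0.1)
    ```

    Sign tests alone cannot separate branches near singular points; that is exactly why the paper's algorithm first subdivides the plane into subdomains containing only simple, singularity-free branches before tracing.
    
    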

  20. The New Keynesian Phillips Curve

    DEFF Research Database (Denmark)

    Ólafsson, Tjörvi

    This paper provides a survey of the recent literature on the new Keynesian Phillips curve: the controversies surrounding its microfoundation and estimation, the approaches that have been tried to improve its empirical fit, and the challenges it faces adapting to the open-economy framework. The new......, learning or state-dependent pricing. The introduction of open-economy factors into the new Keynesian Phillips curve complicates matters further, as it must capture the nexus between price setting, inflation and the exchange rate. This is nevertheless a crucial feature for any model to be used for inflation...... forecasting in a small open economy like Iceland....

  1. Frequency standards

    CERN Document Server

    Riehle, Fritz

    2006-01-01

    Of all measurement units, frequency is the one that may be determined with the highest degree of accuracy. It equally allows precise measurements of other physical and technical quantities, whenever they can be measured in terms of frequency. This volume covers the central methods and techniques relevant for frequency standards developed in physics, electronics, quantum electronics, and statistics. After a review of the basic principles, the book looks at the realisation of commonly used components. It then continues with the description and characterisation of important frequency standards

  2. Signature Curves Statistics of DNA Supercoils

    OpenAIRE

    Shakiban, Cheri; Lloyd, Peter

    2004-01-01

    In this paper we describe the Euclidean signature curves for two dimensional closed curves in the plane and their generalization to closed space curves. The focus will be on discrete numerical methods for approximating such curves. Further we will apply these numerical methods to plot the signature curves related to three-dimensional simulated DNA supercoils. Our primary focus will be on statistical analysis of the data generated for the signature curves of the supercoils. We will try to esta...
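    A discrete Euclidean signature curve plots curvature κ against its arc-length derivative κ_s along the curve; as a sketch (assuming a simple finite-difference discretization, not the authors' numerical method), for a circle the signature collapses to a single point (κ = 1/r, κ_s = 0):

    ```python
    import numpy as np

    def discrete_curvature(pts):
        """Signed curvature at each vertex of a closed planar curve, via the
        circumscribed circle of consecutive point triples (Menger curvature)."""
        prev = np.roll(pts, 1, axis=0)
        nxt = np.roll(pts, -1, axis=0)
        a = np.linalg.norm(pts - prev, axis=1)
        b = np.linalg.norm(nxt - pts, axis=1)
        c = np.linalg.norm(nxt - prev, axis=1)
        # Twice the signed triangle area via the 2D cross product.
        cross = ((pts - prev)[:, 0] * (nxt - pts)[:, 1]
                 - (pts - prev)[:, 1] * (nxt - pts)[:, 0])
        return 2.0 * cross / (a * b * c)

    # Closed test curve: a circle of radius 2 (true curvature 0.5).
    t = np.linspace(0, 2 * np.pi, 400, endpoint=False)
    circle = np.column_stack([2 * np.cos(t), 2 * np.sin(t)])

    kappa = discrete_curvature(circle)
    ds = np.linalg.norm(circle - np.roll(circle, 1, axis=0), axis=1)
    kappa_s = (np.roll(kappa, -1) - np.roll(kappa, 1)) / (2 * ds)

    print(round(kappa.mean(), 3), round(np.abs(kappa_s).max(), 3))  # ≈ 0.5 and ≈ 0
    ```

    For a simulated supercoil one would feed its 2D projection (or a space-curve generalization of the invariants) through the same pipeline and study the statistics of the resulting (κ, κ_s) point cloud, which is the analysis the abstract describes.
    
    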

  3. Cumulative effects of wind turbines. Volume 3: Report on results of consultations on cumulative effects of wind turbines on birds

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-07-01

    This report gives details of the consultations held in developing the consensus approach taken in assessing the cumulative effects of wind turbines. Contributions on bird issues, and views of stakeholders, the Countryside Council for Wales, electric utilities, Scottish Natural Heritage, and the National Wind Power Association are reported. The scoping of key species groups, where cumulative effects might be expected, consideration of other developments, the significance of any adverse effects, mitigation, regional capacity assessments, and predictive models are discussed. Topics considered at two stakeholder workshops are outlined in the appendices.

  4. Cumulative impacts: current research and current opinions at PSW

    Science.gov (United States)

    R. M. Rice

    1987-01-01

    Consideration of cumulative watershed effects (CWEs) has both political and physical aspects. Regardless of the practical usefulness of present methods of dealing with CWEs, the legal requirement to address them remains. Management of federal land is regulated by the National Environmental Policy Act (NEPA) and the Federal Water Pollution Control Act of 1972. The...

  5. Cumulative Risks of Foster Care Placement for Danish Children

    DEFF Research Database (Denmark)

    Fallesen, Peter; Emanuel, Natalia; Wildeman, Christopher

    2014-01-01

    children. Our results also show some variations by parental ethnicity and sex, but these differences are small. Indeed, they appear quite muted relative to racial/ethnic differences in these risks in the United States. Last, though cumulative risks are similar between Danish and American children...

  6. Disintegration of a profiled shock wave at the cumulation point

    International Nuclear Information System (INIS)

    Kaliski, S.

    1978-01-01

    The disintegration, at the cumulation point, of a shock wave generated with the aid of a profiled pressure is analyzed. The quantitative relations for the disintegration waves are analyzed for typical compression parameters in systems of thermonuclear microfusion. Quantitative conclusions are drawn for the application of simplifying approximate calculations in problems of microfusion. (author)

  7. Cumulative Prospect Theory, Option Returns, and the Variance Premium

    NARCIS (Netherlands)

    Baele, Lieven; Driessen, Joost; Ebert, Sebastian; Londono Yarce, J.M.; Spalt, Oliver

    The variance premium and the pricing of out-of-the-money (OTM) equity index options are major challenges to standard asset pricing models. We develop a tractable equilibrium model with Cumulative Prospect Theory (CPT) preferences that can overcome both challenges. The key insight is that the

  8. Steps and Pips in the History of the Cumulative Recorder

    Science.gov (United States)

    Lattal, Kennon A.

    2004-01-01

    From its inception in the 1930s until very recent times, the cumulative recorder was the most widely used measurement instrument in the experimental analysis of behavior. It was an essential instrument in the discovery and analysis of schedules of reinforcement, providing the first real-time analysis of operant response rates and patterns. This…

  9. The effects of cumulative practice on mathematics problem solving.

    Science.gov (United States)

    Mayfield, Kristin H; Chase, Philip N

    2002-01-01

    This study compared three different methods of teaching five basic algebra rules to college students. All methods used the same procedures to teach the rules and included four 50-question review sessions interspersed among the training of the individual rules. The differences among methods involved the kinds of practice provided during the four review sessions. Participants who received cumulative practice answered 50 questions covering a mix of the rules learned prior to each review session. Participants who received a simple review answered 50 questions on one previously trained rule. Participants who received extra practice answered 50 extra questions on the rule they had just learned. Tests administered after each review included new questions for applying each rule (application items) and problems that required novel combinations of the rules (problem-solving items). On the final test, the cumulative group outscored the other groups on application and problem-solving items. In addition, the cumulative group solved the problem-solving items significantly faster than the other groups. These results suggest that cumulative practice of component skills is an effective method of training problem solving.

  10. Anti-irritants II: Efficacy against cumulative irritation

    DEFF Research Database (Denmark)

    Andersen, Flemming; Hedegaard, Kathryn; Petersen, Thomas Kongstad

    2006-01-01

    window of opportunity in which to demonstrate efficacy. Therefore, the effect of AI was studied in a cumulative irritation model by inducing irritant dermatitis with 10 min daily exposures for 5+4 days (no irritation on weekend) to 1% sodium lauryl sulfate on the right and 20% nonanoic acid on the left...

  11. Cumulative Beam Breakup with Time-Dependent Parameters

    CERN Document Server

    Delayen, J R

    2004-01-01

    A general analytical formalism developed recently for cumulative beam breakup (BBU) in linear accelerators with arbitrary beam current profile and misalignments [1] is extended to include time-dependent parameters such as energy chirp or rf focusing in order to reduce BBU-induced instabilities and emittance growth. Analytical results are presented and applied to practical accelerator configurations.

  12. On the mechanism of hadron cumulative production on nucleus

    International Nuclear Information System (INIS)

    Efremov, A.V.

    1976-01-01

    A mechanism of cumulative production of hadrons on nuclei is proposed which is similar to that of large-perpendicular-momentum hadron production. The cross section obtained describes the main qualitative features of such processes, e.g., the initial energy dependence, the atomic number behaviour, and the dependence on the rest mass of the produced particle and on its production angle

  13. Hyperscaling breakdown and Ising spin glasses: The Binder cumulant

    Science.gov (United States)

    Lundow, P. H.; Campbell, I. A.

    2018-02-01

    Among the Renormalization Group Theory scaling rules relating critical exponents, there are hyperscaling rules involving the dimension of the system. It is well known that in Ising models hyperscaling breaks down above the upper critical dimension. It was shown by Schwartz (1991) that the standard Josephson hyperscaling rule can also break down in Ising systems with quenched random interactions. A related Renormalization Group Theory hyperscaling rule links the critical exponents for the normalized Binder cumulant and the correlation length in the thermodynamic limit. An appropriate scaling approach for analyzing measurements from criticality to infinite temperature is first outlined. Numerical data on the scaling of the normalized correlation length and the normalized Binder cumulant are shown for the canonical Ising ferromagnet model in dimension three where hyperscaling holds, for the Ising ferromagnet in dimension five (so above the upper critical dimension) where hyperscaling breaks down, and then for Ising spin glass models in dimension three where the quenched interactions are random. For the Ising spin glasses there is a breakdown of the normalized Binder cumulant hyperscaling relation in the thermodynamic limit regime, with a return to size independent Binder cumulant values in the finite-size scaling regime around the critical region.
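    The (unnormalized) Binder cumulant of an order parameter m is U = 1 − ⟨m⁴⟩/(3⟨m²⟩²); as a quick illustration (independent of any particular spin model or of the hyperscaling analysis above), U → 0 for Gaussian-distributed m, as in the high-temperature limit, and U → 2/3 for a sharply two-peaked m, as in the ordered limit:

    ```python
    import numpy as np

    def binder_cumulant(m):
        """U = 1 - <m^4> / (3 <m^2>^2) for samples of an order parameter m."""
        m = np.asarray(m, dtype=float)
        return 1.0 - np.mean(m ** 4) / (3.0 * np.mean(m ** 2) ** 2)

    rng = np.random.default_rng(1)

    # Disordered phase: m is Gaussian, so <m^4> = 3 <m^2>^2 and U -> 0.
    u_disordered = binder_cumulant(rng.normal(0.0, 1.0, 200_000))

    # Ordered phase: m = ±m0, so <m^4> = <m^2>^2 and U -> 2/3.
    u_ordered = binder_cumulant(rng.choice([-0.8, 0.8], 200_000))

    print(round(u_disordered, 2), round(u_ordered, 2))
    ```

    Because U interpolates between these limits and its finite-size curves cross near the critical temperature, it is a standard locator of phase transitions; the paper's point concerns how its normalized form scales with system size in the thermodynamic limit.
    
    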

  14. How to manage the cumulative flood safety of catchment dams ...

    African Journals Online (AJOL)

    Dam safety is a significant issue being taken seriously worldwide. However, in Australia, although much attention is being devoted to the medium- to large-scale dams, minimal attention is being paid to the serious potential problems associated with smaller dams, particularly the potential cumulative safety threats they pose ...

  15. Cumulative Beam Breakup due to Resistive-Wall Wake

    International Nuclear Information System (INIS)

    Wang, J.-M.

    2004-01-01

    The cumulative beam breakup problem excited by the resistive-wall wake is formulated. An approximate analytic method of finding the asymptotic behavior for the transverse bunch displacement is developed and solved. Comparison between the asymptotic analytical expression and the direct numerical solution is presented. Good agreement is found. The criterion of using the asymptotic analytical expression is discussed

  16. Analysis of sensory ratings data with cumulative link models

    DEFF Research Database (Denmark)

    Christensen, Rune Haubo Bojesen; Brockhoff, Per B.

    2013-01-01

    Examples of categorical rating scales include discrete preference, liking and hedonic rating scales. Data obtained on these scales are often analyzed with normal linear regression methods or with omnibus Pearson χ² tests. In this paper we propose to use cumulative link models that allow for reg...
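    A cumulative link model with a logit link (the proportional-odds model) specifies P(Y ≤ j | x) = logistic(θ_j − xβ); as a sketch (simulated ratings and a scipy-based maximum-likelihood fit, not the authors' implementation), one can recover a regression effect β that an omnibus chi-square test would not quantify:

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import expit  # the logistic CDF

    rng = np.random.default_rng(2)

    # Simulated hedonic ratings on a 4-point scale driven by one covariate.
    n = 3000
    x = rng.normal(size=n)
    theta_true = np.array([-1.0, 0.0, 1.0])  # ordered thresholds
    beta_true = 0.8
    # Latent-variable form: Y = j when theta_{j-1} < x*beta + eps <= theta_j.
    latent = x * beta_true + rng.logistic(size=n)
    y = np.searchsorted(theta_true, latent)  # categories 0..3

    def negloglik(params):
        t1, d1, d2, beta = params
        # Log-increment parametrization keeps the thresholds ordered.
        theta = np.array([t1, t1 + np.exp(d1), t1 + np.exp(d1) + np.exp(d2)])
        cum = expit(theta[None, :] - (x * beta)[:, None])   # P(Y <= j | x)
        cum = np.column_stack([np.zeros(n), cum, np.ones(n)])
        p = cum[np.arange(n), y + 1] - cum[np.arange(n), y]  # cell probabilities
        return -np.sum(np.log(np.clip(p, 1e-12, None)))

    fit = minimize(negloglik, [-1.0, 0.0, 0.0, 0.0], method="Nelder-Mead",
                   options={"maxiter": 5000, "xatol": 1e-8, "fatol": 1e-8})
    print(round(fit.x[3], 2))  # estimated beta, close to 0.8
    ```

    Treating the ratings as ordinal in this way respects the scale's ordering, which is the paper's central argument against normal linear regression on category numbers.
    
    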

  17. Tests of Cumulative Prospect Theory with graphical displays of probability

    Directory of Open Access Journals (Sweden)

    Michael H. Birnbaum

    2008-10-01

    Recent research reported evidence that contradicts cumulative prospect theory and the priority heuristic. The same body of research also violates two editing principles of original prospect theory: cancellation (the principle that people delete any attribute that is the same in both alternatives before deciding between them) and combination (the principle that people combine branches leading to the same consequence by adding their probabilities). This study was designed to replicate previous results and to test whether the violations of cumulative prospect theory might be eliminated or reduced by using formats for the presentation of risky gambles in which cancellation and combination could be facilitated visually. Contrary to the idea that decision behavior contradicting cumulative prospect theory and the priority heuristic would be altered by use of these formats, however, data with two new graphical formats as well as fresh replication data continued to show the patterns of evidence that violate cumulative prospect theory, the priority heuristic, and the editing principles of combination and cancellation. Systematic violations of restricted branch independence also contradicted predictions of "stripped" prospect theory (subjectively weighted additive utility without the editing rules).

  18. Implications of applying cumulative risk assessment to the workplace.

    Science.gov (United States)

    Fox, Mary A; Spicer, Kristen; Chosewood, L Casey; Susi, Pam; Johns, Douglas O; Dotson, G Scott

    2018-06-01

    Multiple changes are influencing work, workplaces and workers in the US including shifts in the main types of work and the rise of the 'gig' economy. Work and workplace changes have coincided with a decline in unions and associated advocacy for improved safety and health conditions. Risk assessment has been the primary method to inform occupational and environmental health policy and management for many types of hazards. Although often focused on one hazard at a time, risk assessment frameworks and methods have advanced toward cumulative risk assessment recognizing that exposure to a single chemical or non-chemical stressor rarely occurs in isolation. We explore how applying cumulative risk approaches may change the roles of workers and employers as they pursue improved health and safety and elucidate some of the challenges and opportunities that might arise. Application of cumulative risk assessment should result in better understanding of complex exposures and health risks with the potential to inform more effective controls and improved safety and health risk management overall. Roles and responsibilities of both employers and workers are anticipated to change with potential for a greater burden of responsibility on workers to address risk factors both inside and outside the workplace that affect health at work. A range of policies, guidance and training have helped develop cumulative risk assessment for the environmental health field and similar approaches are available to foster the practice in occupational safety and health. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Hierarchical Bayesian parameter estimation for cumulative prospect theory

    NARCIS (Netherlands)

    Nilsson, H.; Rieskamp, J.; Wagenmakers, E.-J.

    2011-01-01

    Cumulative prospect theory (CPT; Tversky & Kahneman, 1992) has provided one of the most influential accounts of how people make decisions under risk. CPT is a formal model with parameters that quantify psychological processes such as loss aversion, subjective values of gains and losses, and
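    CPT's core quantities are compact; the sketch below uses the Tversky-Kahneman (1992) functional forms with their published median parameter estimates (α = 0.88, λ = 2.25, γ = 0.61 for gains) to value a simple two-outcome gamble — an illustration of the model whose parameters the hierarchical Bayesian procedure estimates, not of the estimation procedure itself:

    ```python
    def value(x, alpha=0.88, lam=2.25):
        """CPT value function: concave for gains, convex and steeper for losses."""
        return x ** alpha if x >= 0 else -lam * (-x) ** alpha

    def weight(p, gamma=0.61):
        """Tversky-Kahneman probability weighting function."""
        return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

    def cpt_value_two_outcome_gain(x_hi, p_hi, x_lo):
        """CPT value of a gamble (x_hi, p_hi; x_lo) with x_hi > x_lo >= 0.
        Rank-dependent weighting: the best outcome receives w(p_hi)."""
        w_hi = weight(p_hi)
        return w_hi * value(x_hi) + (1 - w_hi) * value(x_lo)

    # A risk-averse pattern: the gamble (100, 0.5; 0) is valued below the
    # subjective value of its expected value, 50.
    v_gamble = cpt_value_two_outcome_gain(100, 0.5, 0)
    print(round(v_gamble, 1), round(value(50), 1))
    ```

    The hierarchical Bayesian approach of the paper places group-level distributions over α, λ, and γ so that each participant's parameters shrink toward the group means rather than being estimated in isolation.
    
    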

  20. An Axiomatization of Cumulative Prospect Theory for Decision under Risk

    NARCIS (Netherlands)

    Wakker, P.P.; Chateauneuf, A.

    1999-01-01

    Cumulative prospect theory was introduced by Tversky and Kahneman so as to combine the empirical realism of their original prospect theory with the theoretical advantages of Quiggin's rank-dependent utility. Preference axiomatizations were provided in several papers. All those axiomatizations,

  1. Cumulative assessment: does it improve students’ knowledge acquisition and retention?

    NARCIS (Netherlands)

    Cecilio Fernandes, Dario; Nagtegaal, Manouk; Noordzij, Gera; Tio, Rene

    2017-01-01

    Introduction Assessment for learning means changing students’ behaviour regarding their learning. Cumulative assessment has been shown to increase students’ self-study time and spread their study time throughout a course. However, there was no difference regarding students’ knowledge at the end of

  2. Dual Smarandache Curves of a Timelike Curve lying on Unit dual Lorentzian Sphere

    OpenAIRE

    Kahraman, Tanju; Hüseyin Ugurlu, Hasan

    2016-01-01

    In this paper, we give a Darboux approximation for dual Smarandache curves of a timelike curve on the unit dual Lorentzian sphere. Firstly, we define the four types of dual Smarandache curves of a timelike curve lying on the dual Lorentzian sphere.

  3. Texas curve margin of safety.

    Science.gov (United States)

    2013-01-01

    This software can be used to assist with the assessment of margin of safety for a horizontal curve. It is intended for use by engineers and technicians responsible for safety analysis or management of rural highway pavement or traffic control devices...

  4. Principal Curves on Riemannian Manifolds.

    Science.gov (United States)

    Hauberg, Soren

    2016-09-01

    Euclidean statistics are often generalized to Riemannian manifolds by replacing straight-line interpolations with geodesic ones. While these Riemannian models are familiar-looking, they are restricted by the inflexibility of geodesics, and they rely on constructions which are optimal only in Euclidean domains. We consider extensions of Principal Component Analysis (PCA) to Riemannian manifolds. Classic Riemannian approaches seek a geodesic curve passing through the mean that optimizes a criterion of interest. The requirements that the solution both is geodesic and must pass through the mean tend to imply that the methods only work well when the manifold is mostly flat within the support of the generating distribution. We argue that instead of generalizing linear Euclidean models, it is more fruitful to generalize non-linear Euclidean models. Specifically, we extend the classic Principal Curves of Hastie & Stuetzle to data residing on a complete Riemannian manifold. We show that for elliptical distributions in the tangent space of spaces of constant curvature, the standard principal geodesic is a principal curve. The proposed model is simple to compute and avoids many of the pitfalls of traditional geodesic approaches. We empirically demonstrate the effectiveness of the Riemannian principal curves on several manifolds and datasets.

  5. Elliptic curves and primality proving

    Science.gov (United States)

    Atkin, A. O. L.; Morain, F.

    1993-07-01

    The aim of this paper is to describe the theory and implementation of the Elliptic Curve Primality Proving algorithm. "The problem of distinguishing prime numbers from composite numbers, and of resolving the latter into their prime factors, is known to be one of the most important and useful in arithmetic. It has engaged the industry and wisdom of ancient and modern geometers to such an extent that it would be superfluous to discuss the problem at length." (Gauss, Disquisitiones Arithmeticae)
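    ECPP itself involves Atkin-Morain curve construction with complex multiplication and is beyond a short sketch, but the group operation it relies on — point addition on a curve y² = x³ + ax + b over Z/p — is compact. The example below is a generic short-Weierstrass implementation on a small hypothetical curve, not the paper's code:

    ```python
    def ec_add(P, Q, a, p):
        """Add points P, Q (None is the identity) on y^2 = x^3 + a*x + b mod p."""
        if P is None:
            return Q
        if Q is None:
            return P
        x1, y1 = P
        x2, y2 = Q
        if x1 == x2 and (y1 + y2) % p == 0:
            return None  # P + (-P) = identity
        if P == Q:
            s = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p  # tangent slope
        else:
            s = (y2 - y1) * pow(x2 - x1, -1, p) % p  # chord slope
        x3 = (s * s - x1 - x2) % p
        return (x3, (s * (x1 - x3) - y1) % p)

    def ec_mul(k, P, a, p):
        """Scalar multiple k*P by double-and-add."""
        R = None
        while k:
            if k & 1:
                R = ec_add(R, P, a, p)
            P = ec_add(P, P, a, p)
            k >>= 1
        return R

    # Curve y^2 = x^3 + 2x + 3 over F_97; (3, 6) lies on it since 36 = 27 + 6 + 3.
    P = (3, 6)
    print(ec_mul(2, P, 2, 97))  # (80, 10)
    print(ec_mul(5, P, 2, 97))  # None: P has order 5 on this curve
    ```

    ECPP exploits exactly such order computations: finding a curve whose point group has order m = k·q with q a probable prime large enough that exhibiting a point of order q certifies the primality of p, then recursing on q.
    
    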

  6. A Curve for all Reasons

    Indian Academy of Sciences (India)

    from biology, feel that every pattern in the living world, ranging from the folding of ... curves b and c have the same rate of increase but reach different asymptotes. If these .... not at x = 0, but at x0, which is the minimum size at birth that will permit ...

  7. Survival curves for irradiated cells

    International Nuclear Information System (INIS)

    Gibson, D.K.

    1975-01-01

    The subject of the lecture is the probability of survival of biological cells which have been subjected to ionising radiation. The basic mathematical theories of cell survival as a function of radiation dose are developed. A brief comparison with observed survival curves is made. (author)

  8. Mentorship, learning curves, and balance.

    Science.gov (United States)

    Cohen, Meryl S; Jacobs, Jeffrey P; Quintessenza, James A; Chai, Paul J; Lindberg, Harald L; Dickey, Jamie; Ungerleider, Ross M

    2007-09-01

    Professionals working in the arena of health care face a variety of challenges as their careers evolve and develop. In this review, we analyze the role of mentorship, learning curves, and balance in overcoming challenges that all such professionals are likely to encounter. These challenges can exist both in professional and personal life. As any professional involved in health care matures, complex professional skills must be mastered, and new professional skills must be acquired. These skills are both technical and judgmental. In most circumstances, these skills must be learned. In 2007, despite the continued need for obtaining new knowledge and learning new skills, the professional and public tolerance for a "learning curve" is much less than in previous decades. Mentorship is the key to success in these endeavours. The success of mentorship is two-sided, with responsibilities for both the mentor and the mentee. The benefits of this relationship must be bidirectional. It is the responsibility of both the student and the mentor to assure this bidirectional exchange of benefit. This relationship requires time, patience, dedication, and to some degree selflessness. This mentorship will ultimately be the best tool for mastering complex professional skills and maturing through various learning curves. Professional mentorship also requires that mentors identify and explicitly teach their mentees the relational skills and abilities inherent in learning the management of the triad of self, relationships with others, and professional responsibilities. Up to two decades ago, a learning curve was tolerated, and even expected, while professionals involved in healthcare developed the techniques that allowed for the treatment of previously untreatable diseases. Outcomes have now improved to the point that this type of learning curve is no longer acceptable to the public. Still, professionals must learn to perform and develop independence and confidence. The responsibility to

  9. Frequency Synthesiser

    NARCIS (Netherlands)

    Drago, Salvatore; Sebastiano, Fabio; Leenaerts, Dominicus M.W.; Breems, Lucien J.; Nauta, Bram

    2016-01-01

    A low power frequency synthesiser circuit (30) for a radio transceiver, the synthesiser circuit comprising: a digital controlled oscillator configured to generate an output signal having a frequency controlled by an input digital control word (DCW); a feedback loop connected between an output and an

  10. Frequency synthesiser

    NARCIS (Netherlands)

    Drago, S.; Sebastiano, Fabio; Leenaerts, Dominicus Martinus Wilhelmus; Breems, Lucien Johannes; Nauta, Bram

    2010-01-01

    A low power frequency synthesiser circuit (30) for a radio transceiver, the synthesiser circuit comprising: a digital controlled oscillator configured to generate an output signal having a frequency controlled by an input digital control word (DCW); a feedback loop connected between an output and an

  11. The Use of the Kurtosis-Adjusted Cumulative Noise Exposure Metric in Evaluating the Hearing Loss Risk for Complex Noise.

    Science.gov (United States)

    Xie, Hong-Wei; Qiu, Wei; Heyer, Nicholas J; Zhang, Mei-Bian; Zhang, Peng; Zhao, Yi-Ming; Hamernik, Roger P

    2016-01-01

    To test a kurtosis-adjusted cumulative noise exposure (CNE) metric for use in evaluating the risk of hearing loss among workers exposed to industrial noises. Specifically, to evaluate whether the kurtosis-adjusted CNE (1) provides a better association with observed industrial noise-induced hearing loss, and (2) provides a single metric applicable to both complex (non-Gaussian [non-G]) and continuous or steady state (Gaussian [G]) noise exposures for predicting noise-induced hearing loss (dose-response curves). Audiometric and noise exposure data were acquired on a population of screened workers (N = 341) from two steel manufacturing plants located in Zhejiang province and a textile manufacturing plant located in Henan province, China. All the subjects from the two steel manufacturing plants (N = 178) were exposed to complex noise, whereas the subjects from textile manufacturing plant (N = 163) were exposed to a G continuous noise. Each subject was given an otologic examination to determine their pure-tone HTL and had their personal 8-hr equivalent A-weighted noise exposure (LAeq) and full-shift noise kurtosis statistic (which is sensitive to the peaks and temporal characteristics of noise exposures) measured. For each subject, an unadjusted and kurtosis-adjusted CNE index for the years worked was created. Multiple linear regression analysis controlling for age was used to determine the relationship between CNE (unadjusted and kurtosis adjusted) and the mean HTL at 3, 4, and 6 kHz (HTL346) among the complex noise-exposed group. In addition, each subject's HTLs from 0.5 to 8.0 kHz were age and sex adjusted using Annex A (ISO-1999) to determine whether they had adjusted high-frequency noise-induced hearing loss (AHFNIHL), defined as an adjusted HTL shift of 30 dB or greater at 3.0, 4.0, or 6.0 kHz in either ear. Dose-response curves for AHFNIHL were developed separately for workers exposed to G and non-G noise using both unadjusted and adjusted CNE as the exposure
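
    The CNE index combines exposure level and duration. A minimal sketch, assuming the common form CNE = LAeq,8h + 10·log10(years) and, for the kurtosis adjustment, the published proposal of raising LAeq by λ·log10(β/3) with λ = 4.02 (Goley et al.); both the form and λ are assumptions here, not necessarily the exact parameterisation used in this study.

```python
import math

def cumulative_noise_exposure(laeq_dba, years):
    """Unadjusted CNE in dB(A)-years: CNE = LAeq,8h + 10*log10(years)."""
    return laeq_dba + 10.0 * math.log10(years)

def kurtosis_adjusted_cne(laeq_dba, years, kurtosis, lam=4.02, beta_gauss=3.0):
    """Kurtosis-adjusted CNE: LAeq is first raised by lam*log10(beta/beta_G).

    lam = 4.02 and the log-ratio form are assumed from one published proposal;
    Gaussian noise (kurtosis = 3) receives no adjustment.
    """
    adjusted_laeq = laeq_dba + lam * math.log10(kurtosis / beta_gauss)
    return cumulative_noise_exposure(adjusted_laeq, years)
```

    With these assumptions, a worker at 85 dB(A) for 10 years has an unadjusted CNE of 95 dB(A)-years, and a complex-noise exposure with kurtosis above 3 yields a strictly larger adjusted index.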

  12. Theory and experiments on Peano and Hilbert curve RFID tags

    Science.gov (United States)

    McVay, John; Hoorfar, Ahmad; Engheta, Nader

    2006-05-01

    Recently, there has been considerable interest in the area of Radio Frequency Identification (RFID) and Radio Frequency Tagging (RFTAG). This emerging area of interest can be applied for inventory control (commercial) as well as friend/foe identification (military) to name but a few. The current technology can be broken down into two main groups, namely passive and active RFID tags. Utilization of Space-Filling Curve (SFC) geometries, such as the Peano and Hilbert curves, has been recently investigated for use in completely passive RFID applications [1, 2]. In this work, we give an overview of our work on the space-filling curves and the potential for utilizing the electrically small, resonant characteristics of these curves for use in RFID technologies with an emphasis on the challenging issues involved when attempting to tag conductive objects. In particular, we investigate the possible use of these tags in conjunction with high impedance ground-planes made of Hilbert or Peano curve inclusions [3, 4] to develop electrically small RFID tags that may also radiate efficiently, within close proximity of large conductive objects [5].

  13. The challenges and opportunities in cumulative effects assessment

    Energy Technology Data Exchange (ETDEWEB)

    Foley, Melissa M., E-mail: mfoley@usgs.gov [U.S. Geological Survey, Pacific Coastal and Marine Science Center, 400 Natural Bridges, Dr., Santa Cruz, CA 95060 (United States); Center for Ocean Solutions, Stanford University, 99 Pacific St., Monterey, CA 93940 (United States); Mease, Lindley A., E-mail: lamease@stanford.edu [Center for Ocean Solutions, Stanford University, 473 Via Ortega, Stanford, CA 94305 (United States); Martone, Rebecca G., E-mail: rmartone@stanford.edu [Center for Ocean Solutions, Stanford University, 99 Pacific St., Monterey, CA 93940 (United States); Prahler, Erin E. [Center for Ocean Solutions, Stanford University, 473 Via Ortega, Stanford, CA 94305 (United States); Morrison, Tiffany H., E-mail: tiffany.morrison@jcu.edu.au [ARC Centre of Excellence for Coral Reef Studies, James Cook University, Townsville, QLD, 4811 (Australia); Murray, Cathryn Clarke, E-mail: cmurray@pices.int [WWF-Canada, 409 Granville Street, Suite 1588, Vancouver, BC V6C 1T2 (Canada); Wojcik, Deborah, E-mail: deb.wojcik@duke.edu [Nicholas School for the Environment, Duke University, 9 Circuit Dr., Durham, NC 27708 (United States)

    2017-01-15

    The cumulative effects of increasing human use of the ocean and coastal zone have contributed to a rapid decline in ocean and coastal resources. As a result, scientists are investigating how multiple, overlapping stressors accumulate in the environment and impact ecosystems. These investigations are the foundation for the development of new tools that account for and predict cumulative effects in order to more adequately prevent or mitigate negative effects. Despite scientific advances, legal requirements, and management guidance, those who conduct assessments—including resource managers, agency staff, and consultants—continue to struggle to thoroughly evaluate cumulative effects, particularly as part of the environmental assessment process. Even though 45 years have passed since the United States National Environmental Policy Act was enacted, which set a precedent for environmental assessment around the world, defining impacts, baseline, scale, and significance are still major challenges associated with assessing cumulative effects. In addition, we know little about how practitioners tackle these challenges or how assessment aligns with current scientific recommendations. To shed more light on these challenges and gaps, we undertook a comparative study on how cumulative effects assessment (CEA) is conducted by practitioners operating under some of the most well-developed environmental laws around the globe: California, USA; British Columbia, Canada; Queensland, Australia; and New Zealand. We found that practitioners used a broad and varied definition of impact for CEA, which led to differences in how baseline, scale, and significance were determined. We also found that practice and science are not closely aligned and, as such, we highlight opportunities for managers, policy makers, practitioners, and scientists to improve environmental assessment.

  14. The challenges and opportunities in cumulative effects assessment

    International Nuclear Information System (INIS)

    Foley, Melissa M.; Mease, Lindley A.; Martone, Rebecca G.; Prahler, Erin E.; Morrison, Tiffany H.; Murray, Cathryn Clarke; Wojcik, Deborah

    2017-01-01

    The cumulative effects of increasing human use of the ocean and coastal zone have contributed to a rapid decline in ocean and coastal resources. As a result, scientists are investigating how multiple, overlapping stressors accumulate in the environment and impact ecosystems. These investigations are the foundation for the development of new tools that account for and predict cumulative effects in order to more adequately prevent or mitigate negative effects. Despite scientific advances, legal requirements, and management guidance, those who conduct assessments—including resource managers, agency staff, and consultants—continue to struggle to thoroughly evaluate cumulative effects, particularly as part of the environmental assessment process. Even though 45 years have passed since the United States National Environmental Policy Act was enacted, which set a precedent for environmental assessment around the world, defining impacts, baseline, scale, and significance are still major challenges associated with assessing cumulative effects. In addition, we know little about how practitioners tackle these challenges or how assessment aligns with current scientific recommendations. To shed more light on these challenges and gaps, we undertook a comparative study on how cumulative effects assessment (CEA) is conducted by practitioners operating under some of the most well-developed environmental laws around the globe: California, USA; British Columbia, Canada; Queensland, Australia; and New Zealand. We found that practitioners used a broad and varied definition of impact for CEA, which led to differences in how baseline, scale, and significance were determined. We also found that practice and science are not closely aligned and, as such, we highlight opportunities for managers, policy makers, practitioners, and scientists to improve environmental assessment.

  15. The challenges and opportunities in cumulative effects assessment

    Science.gov (United States)

    Foley, Melissa M.; Mease, Lindley A; Martone, Rebecca G; Prahler, Erin E; Morrison, Tiffany H; Clarke Murray, Cathryn; Wojcik, Deborah

    2016-01-01

    The cumulative effects of increasing human use of the ocean and coastal zone have contributed to a rapid decline in ocean and coastal resources. As a result, scientists are investigating how multiple, overlapping stressors accumulate in the environment and impact ecosystems. These investigations are the foundation for the development of new tools that account for and predict cumulative effects in order to more adequately prevent or mitigate negative effects. Despite scientific advances, legal requirements, and management guidance, those who conduct assessments—including resource managers, agency staff, and consultants—continue to struggle to thoroughly evaluate cumulative effects, particularly as part of the environmental assessment process. Even though 45 years have passed since the United States National Environmental Policy Act was enacted, which set a precedent for environmental assessment around the world, defining impacts, baseline, scale, and significance are still major challenges associated with assessing cumulative effects. In addition, we know little about how practitioners tackle these challenges or how assessment aligns with current scientific recommendations. To shed more light on these challenges and gaps, we undertook a comparative study on how cumulative effects assessment (CEA) is conducted by practitioners operating under some of the most well-developed environmental laws around the globe: California, USA; British Columbia, Canada; Queensland, Australia; and New Zealand. We found that practitioners used a broad and varied definition of impact for CEA, which led to differences in how baseline, scale, and significance were determined. We also found that practice and science are not closely aligned and, as such, we highlight opportunities for managers, policy makers, practitioners, and scientists to improve environmental assessment.

  16. Lorenz curves in a new science-funding model

    Science.gov (United States)

    Huang, Ding-wei

    2017-12-01

    We propose an agent-based model to theoretically and systematically explore the implications of a new approach to fund science, which has been suggested recently by J. Bollen et al. We introduce various parameters and examine their effects. The concentration of funding is shown by the Lorenz curve and the Gini coefficient. In this model, all scientists are treated equally and follow the well-intended regulations. All scientists give a fixed ratio of their funding to others. The fixed ratio becomes an upper bound for the Gini coefficient. We observe two distinct regimes in the parameter space: valley and plateau. In the valley regime, the fluidity of funding is significant. The Lorenz curve is smooth. The Gini coefficient is well below the upper bound. The funding distribution is the desired result. In the plateau regime, the cumulative advantage is significant. The Lorenz curve has a sharp turn. The Gini coefficient saturates to the upper bound. The undue concentration of funding happens swiftly. The funding distribution is the undesired result, where a minority of scientists take the majority of funding. Phase transitions between these two regimes are discussed.
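
    The concentration measures used here are standard. A minimal numpy sketch of the Lorenz curve and Gini coefficient for a vector of per-scientist funding (illustrative only, not the agent-based model itself; `lorenz_gini` is a name chosen for this example):

```python
import numpy as np

def lorenz_gini(funding):
    """Return Lorenz curve points and the Gini coefficient for a funding vector."""
    x = np.sort(np.asarray(funding, dtype=float))
    cum = np.cumsum(x)
    # Lorenz curve: cumulative share of funding vs. cumulative share of scientists
    lorenz = np.insert(cum / cum[-1], 0, 0.0)
    n = x.size
    # Gini via the sorted-rank (mean absolute difference) formulation
    gini = 2.0 * np.sum(np.arange(1, n + 1) * x) / (n * cum[-1]) - (n + 1) / n
    return lorenz, gini

lorenz, gini = lorenz_gini([1, 1, 1, 1])  # perfectly equal allocation -> Gini 0
```

    A fully equal allocation gives a diagonal Lorenz curve and Gini 0, while concentrating all funding on one scientist out of n gives the maximum (n-1)/n.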

  17. Carbon Intensities of Economies from the Perspective of Learning Curves

    Directory of Open Access Journals (Sweden)

    Henrique Pacini

    2014-03-01

    While some countries have achieved considerable development, many others still lack access to the goods and services considered standard in modern society. As CO2 emissions and development are often correlated, this paper employs the theoretical background of the Environmental Kuznets Curve (EKC) and the learning curves toolkit to analyze how carbon intensities have changed as countries move towards higher development (and cumulative wealth) levels. The EKC concept is then tested with the methodology of learning curves for the period between 1971 and 2010, so as to capture a dynamic picture of emissions trends and development. Results of both analyses reveal that empirical data fails to provide direct evidence of an EKC for emissions and development. The data does show, however, an interesting pattern in the dispersion of emissions levels for countries within the same HDI categories. While data does not show that countries grow more polluting during intermediary development stages, it does provide evidence that countries become more heterogeneous in their emission intensities as they develop, later re-converging to lower emission intensities at higher HDI levels. Learning rates also indicate heterogeneity among developing countries and relative convergence among developed countries. Given the heterogeneity of development paths among countries, the experiences of those which are managing to develop at low carbon intensities can prove valuable examples for ongoing efforts in climate change mitigation, especially in the developing world.

  18. Measurement of four-particle cumulants and symmetric cumulants with subevent methods in small collision systems with the ATLAS detector

    CERN Document Server

    Derendarz, Dominik; The ATLAS collaboration

    2018-01-01

    Measurements of symmetric cumulants SC(n,m) = ⟨v_n^2 v_m^2⟩ − ⟨v_n^2⟩⟨v_m^2⟩ for (n,m) = (2,3) and (2,4), and of asymmetric cumulants AC(n), are presented in pp, p+Pb and peripheral Pb+Pb collisions at various collision energies, aiming to probe the long-range collective nature of multi-particle production in small systems. Results are obtained using the standard cumulant method, as well as the two-subevent and three-subevent cumulant methods. Results from the standard method are found to be strongly biased by non-flow correlations, as indicated by a strong sensitivity to the chosen event-class definition. A systematic reduction of non-flow effects is observed when using the two-subevent method, and the results become independent of the event-class definition when the three-subevent method is used. The measured SC(n,m) shows an anti-correlation between v_2 and v_3, and a positive correlation between v_2 and v_4. The magnitude of SC(n,m) is constant with N_ch in pp collisions, but increases with N_ch in p+Pb and Pb+Pb collisions. ...
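
    The event-wise definition can be sketched as a toy estimator. Real analyses compute SC(n,m) from multi-particle Q-cumulants with subevents to suppress non-flow; here the per-event coefficients v_2 and v_3 are simply drawn from a synthetic distribution with an anti-correlation built in, so the sign of SC(2,3) mirrors the measured one.

```python
import numpy as np

rng = np.random.default_rng(1)
n_events = 50_000
# Toy per-event flow coefficients; v3 is constructed to anti-correlate with v2
v2 = rng.normal(0.06, 0.02, n_events)
v3 = 0.05 - 0.3 * v2 + rng.normal(0.0, 0.01, n_events)

def symmetric_cumulant(vn, vm):
    """SC(n,m) = <vn^2 vm^2> - <vn^2><vm^2> from event-wise coefficients."""
    return np.mean(vn**2 * vm**2) - np.mean(vn**2) * np.mean(vm**2)

sc23 = symmetric_cumulant(v2, v3)  # negative for anti-correlated v2, v3
```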

  19. Blood flow in curved pipe with radiative heat transfer

    International Nuclear Information System (INIS)

    Ogulu, A.; Bestman, A.R.

    1992-03-01

    Blood flow in a curved pipe such as the aorta is modelled in this study. The aorta is modelled as a curved pipe of slowly varying cross-section. Asymptotic series expansions about a small parameter δ, which is a measure of the curvature ratio, are employed to obtain the velocity and temperature distributions. The study simulates the effect of radio-frequency heating, for instance during physiotherapy, on the flow of blood in the cardiovascular system assuming an external constant pressure gradient; our results agree very well with results obtained by Pedley. (author). 9 refs, 2 figs

  20. A catalog of special plane curves

    CERN Document Server

    Lawrence, J Dennis

    2014-01-01

    Among the largest, finest collections available: illustrated not only once for each curve, but also for various values of any parameters present. Covers general properties of curves and types of derived curves. Curves illustrated by a CalComp digital incremental plotter. 12 illustrations.

  1. Computation of undulator tuning curves

    International Nuclear Information System (INIS)

    Dejus, Roger J.

    1997-01-01

    Computer codes for fast computation of on-axis brilliance tuning curves and flux tuning curves have been developed. They are valid for an ideal device (regular planar device or a helical device) using the Bessel function formalism. The effects of the particle beam emittance and the beam energy spread on the spectrum are taken into account. The applicability of the codes and the importance of magnetic field errors of real insertion devices are addressed. The validity of the codes has been experimentally verified at the APS and observed discrepancies are in agreement with predicted reduction of intensities due to magnetic field errors. The codes are distributed as part of the graphical user interface XOP (X-ray OPtics utilities), which simplifies execution and viewing of the results

  2. Curved canals: Ancestral files revisited

    Directory of Open Access Journals (Sweden)

    Jain Nidhi

    2008-01-01

    Full Text Available The aim of this article is to provide an insight into different techniques of cleaning and shaping of curved root canals with hand instruments. Although a plethora of root canal instruments like ProFile, ProTaper, LightSpeed ® etc dominate the current scenario, the inexpensive conventional root canal hand files such as K-files and flexible files can be used to get optimum results when handled meticulously. Special emphasis has been put on the modifications in biomechanical canal preparation in a variety of curved canal cases. This article compiles a series of clinical cases of root canals with curvatures in the middle and apical third and with S-shaped curvatures that were successfully completed by employing only conventional root canal hand instruments.

  3. Invariance for Single Curved Manifold

    KAUST Repository

    Castro, Pedro Machado Manhaes de

    2012-01-01

    Recently, it has been shown that, for the Lambert illumination model, only scenes composed of developable objects with a very particular albedo distribution produce a (2D) image with isolines that are (almost) invariant to light direction change. In this work, we provide and investigate a more general framework, and we show that, in general, the requirement for such invariances is quite strong and is related to the differential geometry of the objects. More precisely, it is proved that single curved manifolds, i.e., manifolds such that at each point there is at most one principal curvature direction, produce invariant isosurfaces for a certain relevant family of energy functions. In the three-dimensional case, the associated energy function corresponds to the classical Lambert illumination model with albedo. This result is also extended to finite-dimensional scenes composed of single curved objects. © 2012 IEEE.

  4. Invariance for Single Curved Manifold

    KAUST Repository

    Castro, Pedro Machado Manhaes de

    2012-08-01

    Recently, it has been shown that, for the Lambert illumination model, only scenes composed of developable objects with a very particular albedo distribution produce a (2D) image with isolines that are (almost) invariant to light direction change. In this work, we provide and investigate a more general framework, and we show that, in general, the requirement for such invariances is quite strong and is related to the differential geometry of the objects. More precisely, it is proved that single curved manifolds, i.e., manifolds such that at each point there is at most one principal curvature direction, produce invariant isosurfaces for a certain relevant family of energy functions. In the three-dimensional case, the associated energy function corresponds to the classical Lambert illumination model with albedo. This result is also extended to finite-dimensional scenes composed of single curved objects. © 2012 IEEE.

  5. Gliomas: Application of Cumulative Histogram Analysis of Normalized Cerebral Blood Volume on 3 T MRI to Tumor Grading

    Science.gov (United States)

    Kim, Hyungjin; Choi, Seung Hong; Kim, Ji-Hoon; Ryoo, Inseon; Kim, Soo Chin; Yeom, Jeong A.; Shin, Hwaseon; Jung, Seung Chai; Lee, A. Leum; Yun, Tae Jin; Park, Chul-Kee; Sohn, Chul-Ho; Park, Sung-Hye

    2013-01-01

    Background Glioma grading assumes significant importance in that low- and high-grade gliomas display different prognoses and are treated with dissimilar therapeutic strategies. The objective of our study was to retrospectively assess the usefulness of a cumulative normalized cerebral blood volume (nCBV) histogram for glioma grading based on 3 T MRI. Methods From February 2010 to April 2012, 63 patients with astrocytic tumors underwent 3 T MRI with dynamic susceptibility contrast perfusion-weighted imaging. Regions of interest containing the entire tumor volume were drawn on every section of the co-registered relative CBV (rCBV) maps and T2-weighted images. The percentile values from the cumulative nCBV histograms and the other histogram parameters were correlated with tumor grades. Cochran's Q test and the McNemar test were used to compare the diagnostic accuracies of the histogram parameters after the receiver operating characteristic curve analysis. Using the parameter offering the highest diagnostic accuracy, a validation process was performed with an independent test set of nine patients. Results The 99th percentile of the cumulative nCBV histogram (nCBV C99), mean and peak height differed significantly between low- and high-grade gliomas. Conclusions Cumulative histogram analysis of nCBV using 3 T MRI can be a useful method for preoperative glioma grading. The nCBV C99 value is helpful in distinguishing high- from low-grade gliomas and grade IV from grade III gliomas. PMID:23704910
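
    Reading a high percentile such as nCBV C99 off a cumulative histogram can be sketched as follows. The ROI values are synthetic, and `cumulative_percentile` is a hypothetical helper: for fine-enough bins it agrees with `np.percentile` to within one bin width, but it makes the cumulative-histogram construction explicit.

```python
import numpy as np

def cumulative_percentile(values, pct, bins=256):
    """Percentile read off a cumulative histogram of ROI values."""
    counts, edges = np.histogram(values, bins=bins)
    cdf = np.cumsum(counts) / counts.sum()
    # First bin whose cumulative frequency reaches pct/100; return its right edge
    idx = np.searchsorted(cdf, pct / 100.0)
    return edges[min(idx + 1, len(edges) - 1)]

rng = np.random.default_rng(0)
ncbv = rng.lognormal(mean=0.5, sigma=0.4, size=10_000)  # synthetic ROI nCBV values
c99 = cumulative_percentile(ncbv, 99)
```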

  6. Curved Folded Plate Timber Structures

    OpenAIRE

    Buri, Hans Ulrich; Stotz, Ivo; Weinand, Yves

    2011-01-01

    This work investigates the development of a Curved Origami Prototype made with timber panels. In the last fifteen years the timber industry has developed new, large size, timber panels. Composition and dimensions of these panels and the possibility of milling them with Computer Numerical Controlled machines shows great potential for folded plate structures. To generate the form of these structures we were inspired by Origami, the Japanese art of paper folding. Common paper tessellations are c...

  7. Projection-based curve clustering

    International Nuclear Information System (INIS)

    Auder, Benjamin; Fischer, Aurelie

    2012-01-01

    This paper focuses on unsupervised curve classification in the context of nuclear industry. At the Commissariat a l'Energie Atomique (CEA), Cadarache (France), the thermal-hydraulic computer code CATHARE is used to study the reliability of reactor vessels. The code inputs are physical parameters and the outputs are time evolution curves of a few other physical quantities. As the CATHARE code is quite complex and CPU time-consuming, it has to be approximated by a regression model. This regression process involves a clustering step. In the present paper, the CATHARE output curves are clustered using a k-means scheme, with a projection onto a lower dimensional space. We study the properties of the empirically optimal cluster centres found by the clustering method based on projections, compared with the 'true' ones. The choice of the projection basis is discussed, and an algorithm is implemented to select the best projection basis among a library of orthonormal bases. The approach is illustrated on a simulated example and then applied to the industrial problem. (authors)
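
    The projection-plus-k-means step can be sketched with numpy alone. A cosine basis and a deterministic farthest-point initialisation stand in for the paper's library of orthonormal bases and clustering details; all names and the synthetic curves are illustrative.

```python
import numpy as np

def project_curves(curves, n_basis):
    """Project sampled curves onto a cosine orthonormal basis (dimension reduction)."""
    n_pts = curves.shape[1]
    t = (np.arange(n_pts) + 0.5) / n_pts
    basis = np.array([np.cos(np.pi * k * t) for k in range(n_basis)])
    basis /= np.linalg.norm(basis, axis=1, keepdims=True)
    return curves @ basis.T  # coefficient matrix, shape (n_curves, n_basis)

def kmeans(X, k, n_iter=50):
    """Lloyd's k-means with deterministic farthest-point initialisation."""
    centres = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(-1) for c in centres], axis=0)
        centres.append(X[np.argmax(d)])
    centres = np.array(centres)
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None] - centres[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = X[labels == j].mean(axis=0)
    return labels, centres

# Two synthetic families of "code output" curves, clustered in coefficient space
t = np.linspace(0.0, 1.0, 100)
rng = np.random.default_rng(2)
curves = np.vstack([np.sin(2 * np.pi * t) + 0.1 * rng.normal(size=(30, 100)),
                    np.exp(-3 * t) + 0.1 * rng.normal(size=(30, 100))])
labels, _ = kmeans(project_curves(curves, 8), k=2)
```

    Clustering in the 8-dimensional coefficient space rather than on the raw 100-point curves is the point of the projection step: it denoises and makes the k-means distances cheap.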

  8. Growth curves for Laron syndrome.

    Science.gov (United States)

    Laron, Z; Lilos, P; Klinger, B

    1993-01-01

    Growth curves for children with Laron syndrome were constructed on the basis of repeated measurements made throughout infancy, childhood, and puberty in 24 (10 boys, 14 girls) of the 41 patients with this syndrome investigated in our clinic. Growth retardation was already noted at birth, the birth length ranging from 42 to 46 cm in the 12/20 available measurements. The postnatal growth curves deviated sharply from the normal from infancy on. Both sexes showed no clear pubertal spurt. Girls completed their growth between the age of 16-19 years to a final mean (SD) height of 119 (8.5) cm whereas the boys continued growing beyond the age of 20 years, achieving a final height of 124 (8.5) cm. At all ages the upper to lower body segment ratio was more than 2 SD above the normal mean. These growth curves constitute a model not only for primary, hereditary insulin-like growth factor-I (IGF-I) deficiency (Laron syndrome) but also for untreated secondary IGF-I deficiencies such as growth hormone gene deletion and idiopathic congenital isolated growth hormone deficiency. They should also be useful in the follow up of children with Laron syndrome treated with biosynthetic recombinant IGF-I. PMID:8333769

  9. Elementary particles in curved spaces

    International Nuclear Information System (INIS)

    Lazanu, I.

    2004-01-01

    The theories in particle physics are currently developed in Minkowski space-time, starting from the Poincare group. A physical theory in flat space can be seen as the limit of a more general physical theory in a curved space. At the present time, a theory of particles in curved space does not exist, and thus the only possibility is to extend the existent theories to these spaces. A formidable obstacle to the extension of physical models is the absence of groups of motion in more general Riemann spaces. A space of constant curvature has a group of motion that, although it differs from that of a flat space, has the same number of parameters and could permit some generalisations. In this contribution we try to investigate some physical implications of the presumable existence of elementary particles in curved space. In de Sitter space (dS) the invariant rest mass is a combination of the Poincare rest mass and the generalised angular momentum of a particle, and it permits establishing a correlation with the vacuum energy and with the cosmological constant. The consequences are significant because in an experiment the local structure of space-time departs from Minkowski space and becomes a dS or AdS space-time. Discrete symmetry characteristics of the dS/AdS group suggest some arguments for the possible existence of 'mirror matter'. (author)

  10. An appraisal of the learning curve in robotic general surgery.

    Science.gov (United States)

    Pernar, Luise I M; Robertson, Faith C; Tavakkoli, Ali; Sheu, Eric G; Brooks, David C; Smink, Douglas S

    2017-11-01

    Robotic-assisted surgery is used with increasing frequency in general surgery for a variety of applications. In spite of this increase in usage, the learning curve is not yet defined. This study reviews the literature on the learning curve in robotic general surgery to inform adopters of the technology. PubMed and EMBASE searches yielded 3690 abstracts published between July 1986 and March 2016. The abstracts were evaluated based on the following inclusion criteria: written in English, reporting original work, focus on general surgery operations, and with explicit statistical methods. Twenty-six full-length articles were included in final analysis. The articles described the learning curves in colorectal (9 articles, 35%), foregut/bariatric (8, 31%), biliary (5, 19%), and solid organ (4, 15%) surgery. Eighteen of 26 (69%) articles report single-surgeon experiences. Time was used as a measure of the learning curve in all studies (100%); outcomes were examined in 10 (38%). In 12 studies (46%), the authors identified three phases of the learning curve. Numbers of cases needed to achieve plateau performance were wide-ranging but overlapping for different kinds of operations: 19-128 cases for colorectal, 8-95 for foregut/bariatric, 20-48 for biliary, and 10-80 for solid organ surgery. Although robotic surgery is increasingly utilized in general surgery, the literature provides few guidelines on the learning curve for adoption. In this heterogeneous sample of reviewed articles, the number of cases needed to achieve plateau performance varies by case type and the learning curve may have multiple phases as surgeons add more complex cases to their case mix with growing experience. Time is the most common determinant for the learning curve. The literature lacks a uniform assessment of outcomes and complications, which would arguably reflect expertise in a more meaningful way than time to perform the operation alone.

  11. Decline curve based models for predicting natural gas well performance

    Directory of Open Access Journals (Sweden)

    Arash Kamari

    2017-06-01

    The productivity of a gas well declines over its production life, eventually to the point where revenues can no longer cover economic costs. To anticipate such problems, the production performance of gas wells should be predicted by applying reliable methods to analyse the decline trend. Therefore, reliable models are developed in this study on the basis of powerful artificial intelligence techniques, viz. the artificial neural network (ANN) modelling strategy, the least-squares support vector machine (LSSVM) approach, the adaptive neuro-fuzzy inference system (ANFIS), and the decision tree (DT) method, for the prediction of cumulative gas production as well as initial decline rate multiplied by time, as a function of the Arps decline-curve exponent and the ratio of initial gas flow rate to total gas flow rate. It was concluded that the results obtained from the models developed in the current study are in satisfactory agreement with actual gas well production data. Furthermore, the results of the comparative study performed demonstrate that the LSSVM strategy is superior to the other models investigated for the prediction of both cumulative gas production and initial decline rate multiplied by time.
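
    The decline-curve framework such models build on is Arps' family of rate-time relations, parameterized by the decline-curve exponent b mentioned above. As a rough illustration (not the paper's code; function names and parameter values are mine), the hyperbolic Arps rate and its analytic cumulative can be sketched as:

```python
import math

def arps_rate(t, qi, di, b):
    """Arps decline: flow rate at time t, from initial rate qi,
    initial decline rate di, and decline-curve exponent b.
    b = 0 gives exponential decline; b = 1 gives harmonic decline."""
    if b == 0:
        return qi * math.exp(-di * t)
    return qi / (1.0 + b * di * t) ** (1.0 / b)

def arps_cumulative(t, qi, di, b):
    """Cumulative production up to time t (analytic integral of the rate)."""
    if b == 0:
        return (qi / di) * (1.0 - math.exp(-di * t))
    if b == 1:
        return (qi / di) * math.log(1.0 + di * t)
    return qi / (di * (1.0 - b)) * (1.0 - (1.0 + b * di * t) ** (1.0 - 1.0 / b))
```

    A data-driven model like those in the study would be trained to reproduce such cumulative-production values from the decline parameters rather than evaluate them analytically.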

  12. Dual Smarandache Curves and Smarandache Ruled Surfaces

    OpenAIRE

    Tanju KAHRAMAN; Mehmet ÖNDER; H. Hüseyin UGURLU

    2013-01-01

    In this paper, by considering dual geodesic trihedron (dual Darboux frame) we define dual Smarandache curves lying fully on dual unit sphere S^2 and corresponding to ruled surfaces. We obtain the relationships between the elements of curvature of dual spherical curve (ruled surface) x(s) and its dual Smarandache curve (Smarandache ruled surface) x1(s) and we give an example for dual Smarandache curves of a dual spherical curve.

  13. On the analysis of Canadian Holstein dairy cow lactation curves using standard growth functions.

    Science.gov (United States)

    López, S; France, J; Odongo, N E; McBride, R A; Kebreab, E; AlZahal, O; McBride, B W; Dijkstra, J

    2015-04-01

    Six classical growth functions (monomolecular, Schumacher, Gompertz, logistic, Richards, and Morgan) were fitted to individual and average (by parity) cumulative milk production curves of Canadian Holstein dairy cows. The data analyzed consisted of approximately 91,000 daily milk yield records corresponding to 122 first, 99 second, and 92 third parity individual lactation curves. The functions were fitted using nonlinear regression procedures, and their performance was assessed using goodness-of-fit statistics (coefficient of determination, residual mean squares, Akaike information criterion, and the correlation and concordance coefficients between observed and adjusted milk yields at several days in milk). Overall, all the growth functions evaluated showed an acceptable fit to the cumulative milk production curves, with the Richards equation ranking first (smallest Akaike information criterion) followed by the Morgan equation. Differences among the functions in their goodness-of-fit were enlarged when fitted to average curves by parity, where the sigmoidal functions with a variable point of inflection (Richards and Morgan) outperformed the other 4 equations. All the functions provided satisfactory predictions of milk yield (calculated from the first derivative of the functions) at different lactation stages, from early to late lactation. The Richards and Morgan equations provided the most accurate estimates of peak yield and total milk production per 305-d lactation, whereas the least accurate estimates were obtained with the logistic equation. In conclusion, classical growth functions (especially sigmoidal functions with a variable point of inflection) proved to be feasible alternatives to fit cumulative milk production curves of dairy cows, resulting in suitable statistical performance and accurate estimates of lactation traits. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
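
    As an illustration of how milk yield follows from the first derivative of a fitted cumulative curve, here is a minimal sketch of the Gompertz case (parameter values are invented for illustration, not estimates from the paper):

```python
import math

def gompertz_cumulative(t, a, b, c):
    """Gompertz cumulative milk yield at day in milk t:
    a = asymptotic total yield, b and c control shape and timing."""
    return a * math.exp(-b * math.exp(-c * t))

def daily_yield(t, a, b, c):
    """Daily milk yield = analytic first derivative of the Gompertz curve."""
    return a * b * c * math.exp(-c * t) * math.exp(-b * math.exp(-c * t))

# For Gompertz, peak daily yield occurs at the inflection point t* = ln(b)/c.
a, b, c = 9000.0, 3.0, 0.02      # hypothetical parameters
t_peak = math.log(b) / c
```

    The sigmoidal functions favored in the paper (Richards, Morgan) differ by allowing this inflection point to vary more freely, but peak yield and 305-d totals are read off the fitted curve in the same way.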

  14. Cumulants of heat transfer across nonlinear quantum systems

    Science.gov (United States)

    Li, Huanan; Agarwalla, Bijay Kumar; Li, Baowen; Wang, Jian-Sheng

    2013-12-01

    We consider thermal conduction across a general nonlinear phononic junction. Based on a two-time observation protocol and the nonequilibrium Green's function method, heat transfer in steady-state regimes is studied, and practical formulas for the calculation of the cumulant generating function are obtained. As an application, the general formalism is used to study anharmonic effects on the fluctuation of steady-state heat transfer across a single-site junction with a quartic nonlinear on-site pinning potential. An explicit nonlinear modification to the cumulant generating function, exact up to first order, is given, for which the Gallavotti-Cohen fluctuation symmetry is found to remain valid. Numerically, a self-consistent procedure is introduced, which works well for strong nonlinearity.

  15. A cumulant functional for static and dynamic correlation

    International Nuclear Information System (INIS)

    Hollett, Joshua W.; Hosseini, Hessam; Menzies, Cameron

    2016-01-01

    A functional for the cumulant energy is introduced. The functional is composed of a pair-correction and static and dynamic correlation energy components. The pair-correction and static correlation energies are functionals of the natural orbitals and the occupancy transferred between near-degenerate orbital pairs, rather than the orbital occupancies themselves. The dynamic correlation energy is a functional of the statically correlated on-top two-electron density. The on-top density functional used in this study is the well-known Colle-Salvetti functional. Using the cc-pVTZ basis set, the functional effectively models the bond dissociation of H₂, LiH, and N₂ with equilibrium bond lengths and dissociation energies comparable to those provided by multireference second-order perturbation theory. The performance of the cumulant functional is less impressive for HF and F₂, mainly due to an underestimation of the dynamic correlation energy by the Colle-Salvetti functional.

  16. Fragmentation of tensor polarized deuterons into cumulative pions

    International Nuclear Information System (INIS)

    Afanas'ev, S.; Arkhipov, V.; Bondarev, V.

    1998-01-01

    The tensor analyzing power T₂₀ of the reaction d(polarized) + A → π⁻(0°) + X has been measured in the fragmentation of 9 GeV tensor-polarized deuterons into pions with momenta from 3.5 to 5.3 GeV/c on hydrogen, beryllium and carbon targets. This kinematic range corresponds to the region of cumulative hadron production, with the cumulative variable x_c from 1.08 to 1.76. The values of T₂₀ have been found to be small and consistent with positive values. This contradicts predictions based on a direct mechanism assuming an NN collision between a high-momentum nucleon in the deuteron and a target nucleon (NN → NNπ)

  17. Experience of cumulative effects assessment in the UK

    Directory of Open Access Journals (Sweden)

    Piper Jake

    2004-01-01

    Cumulative effects assessment (CEA) is a development of environmental impact assessment which attempts to take into account the wider picture of what impacts may affect the environment as a result of multiple or linear projects, or development plans. CEA is seen as a further valuable tool in promoting sustainable development. The broader canvas upon which the assessment is made leads to a suite of issues such as complexity in methods and assessment of significance, the desirability of co-operation between developers and other parties, and new ways of addressing mitigation and monitoring. After outlining the legislative position and the process of CEA, this paper looks at three case studies in the UK where cumulative assessment has been carried out; the cases concern wind farms, major infrastructure and off-shore developments.

  18. Ecosystem assessment methods for cumulative effects at the regional scale

    International Nuclear Information System (INIS)

    Hunsaker, C.T.

    1989-01-01

    Environmental issues such as nonpoint-source pollution, acid rain, reduced biodiversity, land use change, and climate change have widespread ecological impacts and require an integrated assessment approach. Since 1978, the implementing regulations for the National Environmental Policy Act (NEPA) have required assessment of potential cumulative environmental impacts. Current environmental issues have encouraged ecologists to improve their understanding of ecosystem process and function at several spatial scales. However, management activities usually occur at the local scale, and there is little consideration of the potential impacts to the environmental quality of a region. This paper proposes that regional ecological risk assessment provides a useful approach for assisting scientists in accomplishing the task of assessing cumulative impacts. Critical issues such as spatial heterogeneity, boundary definition, and data aggregation are discussed. Examples from an assessment of acidic deposition effects on fish in Adirondack lakes illustrate the importance of integrated databases, associated modeling efforts, and boundary definition at the regional scale.

  19. Polarization in high p_T and cumulative hadron production

    International Nuclear Information System (INIS)

    Efremov, A.V.

    1978-01-01

    The final hadron polarization in high-p_T processes is analyzed in the parton hard-scattering picture. A scaling assumption allows a correct qualitative description to be given for the p_T behaviour of polarization, and for the escape-angle behaviour in cumulative production. Energy scaling and a weak dependence on the beam and target type are predicted. A method is proposed for measuring the polarization of hadron jets

  20. Seasonal climate change patterns due to cumulative CO2 emissions

    Science.gov (United States)

    Partanen, Antti-Ilari; Leduc, Martin; Damon Matthews, H.

    2017-07-01

    Cumulative CO2 emissions are near-linearly related to both global and regional changes in annual-mean surface temperature. These relationships are known as the transient climate response to cumulative CO2 emissions (TCRE) and the regional TCRE (RTCRE), and have been shown to remain approximately constant over a wide range of cumulative emissions. Here, we assessed how well this relationship holds for seasonal patterns of temperature change, as well as for annual-mean and seasonal precipitation patterns. We analyzed an idealized scenario with CO2 concentration growing at an annual rate of 1% using data from 12 Earth system models from the Coupled Model Intercomparison Project Phase 5 (CMIP5). Seasonal RTCRE values for temperature varied considerably, with the highest seasonal variation evident in the Arctic, where RTCRE was about 5.5 °C per Tt C for boreal winter and about 2.0 °C per Tt C for boreal summer. The precipitation response in the Arctic was also stronger during boreal winter than during other seasons. We found that emission-normalized seasonal patterns of temperature change were relatively robust with respect to time, though they were sub-linear with respect to emissions, particularly near the Arctic. Moreover, RTCRE patterns for precipitation could not be quantified robustly due to the large internal variability of precipitation. Our results suggest that cumulative CO2 emissions are a useful metric to predict regional and seasonal changes in precipitation and temperature. This extension of the TCRE framework to seasonal and regional climate change is helpful for communicating the link between emissions and climate change to policy-makers and the general public, and is well-suited for impact studies that could make use of estimated regional-scale climate changes that are consistent with the carbon budgets associated with global temperature targets.
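
    The (R)TCRE itself is just the slope of temperature change against cumulative emissions. A minimal estimate of it from a pair of series could look like this (synthetic illustrative numbers, not CMIP5 output):

```python
def rtcre_slope(cum_emissions, delta_t):
    """Least-squares slope of temperature change (°C) versus
    cumulative CO2 emissions (Tt C): an estimate of the (regional) TCRE."""
    n = len(cum_emissions)
    mx = sum(cum_emissions) / n
    my = sum(delta_t) / n
    sxx = sum((x - mx) ** 2 for x in cum_emissions)
    sxy = sum((x - mx) * (y - my) for x, y in zip(cum_emissions, delta_t))
    return sxy / sxx

# Hypothetical Arctic-winter series with ~5.5 °C per Tt C, as in the abstract
emissions = [0.1 * i for i in range(11)]     # 0 to 1 Tt C
warming = [5.5 * e for e in emissions]
```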

  1. Firm heterogeneity, Rules of Origin and Rules of Cumulation

    OpenAIRE

    Bombarda , Pamela; Gamberoni , Elisa

    2013-01-01

    We analyse the impact of relaxing rules of origin (ROOs) in a simple setting with heterogeneous firms that buy intermediate inputs from domestic and foreign sources. In particular, we consider the impact of switching from bilateral to diagonal cumulation when using preferences (instead of paying the MFN tariff), which requires respecting the rules of origin. We find that relaxing the restrictiveness of the ROOs leads the least productive exporters to stop exporting. The empirical part confirms thes...

  2. Cumulant approach to dynamical correlation functions at finite temperatures

    International Nuclear Information System (INIS)

    Tran Minhtien.

    1993-11-01

    A new theoretical approach, based on the introduction of cumulants, to calculating thermodynamic averages and dynamical correlation functions at finite temperatures is developed. The method is formulated in Liouville space instead of Hilbert space and can be applied to operators which are not required to satisfy fermion or boson commutation relations. The application of the partitioning and projection methods to the dynamical correlation functions is discussed. The present method can be applied to weakly as well as strongly correlated systems. (author). 9 refs

  3. Severe occupational hand eczema, job stress and cumulative sickness absence.

    Science.gov (United States)

    Böhm, D; Stock Gissendanner, S; Finkeldey, F; John, S M; Werfel, T; Diepgen, T L; Breuer, K

    2014-10-01

    Stress is known to activate or exacerbate dermatoses, but the relationships between chronic stress, job-related stress and sickness absence among occupational hand eczema (OHE) patients are inadequately understood. We aimed to determine whether chronic stress or burnout symptoms were associated with cumulative sickness absence in patients with OHE, and which factors predicted sickness absence in a model including measures of job-related and chronic stress. We investigated correlations of these factors in employed adult inpatients with a history of sickness absence due to OHE in a retrospective cross-sectional explorative study, which assessed chronic stress (Trier Inventory for the Assessment of Chronic Stress), burnout (Shirom Melamed Burnout Measure), clinical symptom severity (Osnabrück Hand Eczema Severity Index), perceived symptom severity, demographic characteristics and cumulative days of sickness absence. The study group consisted of 122 patients. OHE symptoms were not more severe among patients experiencing greater stress and burnout. Women reported higher levels of chronic stress on some measures. Cumulative days of sickness absence correlated with individual dimensions of job-related stress and, in multiple regression analysis, with an overall measure of chronic stress. Chronic stress is an additional factor predicting cumulative sickness absence among severely affected OHE patients. Other relevant factors for this study sample included the 'cognitive weariness' subscale of the Shirom Melamed Burnout Measure and the physical component summary score of the SF-36, a measure of health-related quality of life. Prevention and rehabilitation should take job stress into consideration in multidisciplinary treatment strategies for severely affected OHE patients. © The Author 2014. Published by Oxford University Press on behalf of the Society of Occupational Medicine. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  4. Finite-volume cumulant expansion in QCD-colorless plasma

    Energy Technology Data Exchange (ETDEWEB)

    Ladrem, M. [Taibah University, Physics Department, Faculty of Science, Al-Madinah, Al-Munawwarah (Saudi Arabia); Physics Department, Algiers (Algeria); ENS-Vieux Kouba (Bachir El-Ibrahimi), Laboratoire de Physique et de Mathematiques Appliquees (LPMA), Algiers (Algeria); Ahmed, M.A.A. [Taibah University, Physics Department, Faculty of Science, Al-Madinah, Al-Munawwarah (Saudi Arabia); ENS-Vieux Kouba (Bachir El-Ibrahimi), Laboratoire de Physique et de Mathematiques Appliquees (LPMA), Algiers (Algeria); Taiz University in Turba, Physics Department, Taiz (Yemen); Alfull, Z.Z. [Taibah University, Physics Department, Faculty of Science, Al-Madinah, Al-Munawwarah (Saudi Arabia); Cherif, S. [ENS-Vieux Kouba (Bachir El-Ibrahimi), Laboratoire de Physique et de Mathematiques Appliquees (LPMA), Algiers (Algeria); Ghardaia University, Sciences and Technologies Department, Ghardaia (Algeria)

    2015-09-15

    Due to finite-size effects, the localization of the phase transition in finite systems and the determination of its order become an extremely difficult task, even in the simplest known cases. In order to identify and locate the finite-volume transition point T₀(V) of the QCD deconfinement phase transition to a colorless QGP, we have developed a new approach using the finite-size cumulant expansion of the order parameter and the L_mn-method. The first six cumulants C₁,₂,₃,₄,₅,₆ with the corresponding under-normalized ratios (skewness Σ, kurtosis κ, pentosis Π±, and hexosis H₁,₂,₃) and three unnormalized combinations of them (O = σ²κΣ⁻¹, U = σ⁻²Σ⁻¹, N = σ²κ) are calculated and studied as functions of (T, V). A new approach, unifying in a clear and consistent way the definitions of cumulant ratios, is proposed. A numerical FSS analysis of the obtained results has allowed us to locate accurately the finite-volume transition point. The extracted transition temperature value T₀(V) agrees with that expected, T₀ᴺ(V), from the order parameter and the thermal susceptibility χ_T(T, V), according to the standard procedure of localization, to within about 2%. In addition, a very good correlation factor is obtained, proving the validity of our cumulants method. The agreement of our results with those obtained by means of other models is remarkable. (orig.)
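
    For reference, low-order cumulants and the ratios named above can be computed from sample moments. A plain-Python sketch, using the common normalizations Σ = C₃/C₂^(3/2) and κ = C₄/C₂² (the paper's "under-normalized" definitions may differ):

```python
def cumulants4(xs):
    """First four cumulants of a sample from its central moments:
    C1 = mean, C2 = mu2, C3 = mu3, C4 = mu4 - 3*mu2**2."""
    n = len(xs)
    mean = sum(xs) / n
    mu = lambda k: sum((x - mean) ** k for x in xs) / n
    return mean, mu(2), mu(3), mu(4) - 3 * mu(2) ** 2

def skewness(xs):
    _, c2, c3, _ = cumulants4(xs)
    return c3 / c2 ** 1.5

def kurtosis(xs):
    _, c2, _, c4 = cumulants4(xs)
    return c4 / c2 ** 2
```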

  5. Science and Societal Partnerships to Address Cumulative Impacts

    OpenAIRE

    Lundquist, Carolyn J.; Fisher, Karen T.; Le Heron, Richard; Lewis, Nick I.; Ellis, Joanne I.; Hewitt, Judi E.; Greenaway, Alison J.; Cartner, Katie J.; Burgess-Jones, Tracey C.; Schiel, David R.; Thrush, Simon F.

    2016-01-01

    Funding and priorities for ocean research are not separate from the underlying sociological, economic, and political landscapes that determine values attributed to ecological systems. Here we present a variation on science prioritization exercises, focussing on inter-disciplinary research questions with the objective of shifting broad scale management practices to better address cumulative impacts and multiple users. Marine scientists in New Zealand from a broad range of scientific and social...

  6. Cumulative prospect theory and mean variance analysis. A rigorous comparison

    OpenAIRE

    Hens, Thorsten; Mayer, Janos

    2012-01-01

    We compare asset allocations derived for cumulative prospect theory (CPT) based on two different methods: maximizing CPT along the mean-variance efficient frontier and maximizing it without that restriction. We find that with normally distributed returns the difference is negligible. However, using standard asset allocation data of pension funds, the difference is considerable. Moreover, with derivatives like call options, the restriction to the mean-variance efficient frontier results in a siza...
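
    CPT maximization as compared above needs a prospect-valuation routine. A compact sketch with the Tversky-Kahneman (1992) functional forms (using one weighting parameter for both gains and losses for brevity; the paper's exact specification may differ):

```python
def weight(p, gamma):
    """Tversky-Kahneman probability weighting function."""
    if p <= 0.0:
        return 0.0
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

def cpt_value(prospect, alpha=0.88, lam=2.25, gamma=0.61):
    """CPT value of a discrete prospect given as (outcome, probability)
    pairs with probabilities summing to 1. Gains are weighted cumulatively
    from the best outcome down, losses from the worst outcome up."""
    gains = sorted(((x, p) for x, p in prospect if x >= 0), reverse=True)
    losses = sorted((x, p) for x, p in prospect if x < 0)
    total, cum = 0.0, 0.0
    for x, p in gains:                    # best gain first
        total += (weight(cum + p, gamma) - weight(cum, gamma)) * x ** alpha
        cum += p
    cum = 0.0
    for x, p in losses:                   # worst loss first
        total += (weight(cum + p, gamma) - weight(cum, gamma)) * (-lam) * (-x) ** alpha
        cum += p
    return total
```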

  7. Signal anomaly detection using modified CUSUM [cumulative sum] method

    International Nuclear Information System (INIS)

    Morgenstern, V.; Upadhyaya, B.R.; Benedetti, M.

    1988-01-01

    An important aspect of detection of anomalies in signals is the identification of changes in signal behavior caused by noise, jumps, changes in band-width, sudden pulses and signal bias. A methodology is developed to identify, isolate and characterize these anomalies using a modification of the cumulative sum (CUSUM) approach. The new algorithm performs anomaly detection at three levels and is implemented on a general purpose computer. 7 refs., 4 figs
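
    The classical one-dimensional CUSUM scheme the authors modify is easy to state. A minimal two-sided detector (the textbook algorithm, not the paper's three-level variant; parameter values are illustrative):

```python
def cusum(signal, target, k, h):
    """Two-sided CUSUM change detector.
    target = in-control mean, k = allowable slack, h = decision threshold.
    Returns the index of the first alarm, or None if no alarm fires."""
    gp = gn = 0.0
    for i, x in enumerate(signal):
        gp = max(0.0, gp + (x - target - k))   # accumulates upward shifts
        gn = max(0.0, gn + (target - x - k))   # accumulates downward shifts
        if gp > h or gn > h:
            return i
    return None
```

    Anomalies such as jumps and bias changes show up as alarms; distinguishing them (as the paper's three-level method does) requires examining which statistic fired and how fast it grew.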

  8. Problems of describing the cumulative effect in relativistic nuclear physics

    International Nuclear Information System (INIS)

    Baldin, A.M.

    1979-01-01

    The problem of describing the cumulative effect, i.e., particle production on nuclei in the range kinematically forbidden for one-nucleon collisions, is studied. Discrimination of events containing cumulative particles fixes configurations in the wave function of a nucleus in which several nucleons are closely spaced and their quark-parton components are collectivized. For the cumulative processes under consideration, large distances between quarks are very important. The fundamental facts and theoretical interpretation of quantum field theory and condensed-media theory in relativistic nuclear physics are presented in brief. Collisions of relativistic nuclei with low momentum transfers are considered in a fast-moving coordinate system. The basic parameter determining this type of collision is the energy of nucleon binding in nuclei. It has been shown that the short-range correlation model provides a good representation of many characteristics of multiple particle production, and it may be regarded as an approximate universal property of hadron interactions

  9. Dynamic prediction of cumulative incidence functions by direct binomial regression.

    Science.gov (United States)

    Grand, Mia K; de Witte, Theo J M; Putter, Hein

    2018-03-25

    In recent years there have been a series of advances in the field of dynamic prediction. Among those is the development of methods for dynamic prediction of the cumulative incidence function in a competing risk setting. These models enable the predictions to be updated as time progresses and more information becomes available, for example when a patient comes back for a follow-up visit after completing a year of treatment, the risk of death, and adverse events may have changed since treatment initiation. One approach to model the cumulative incidence function in competing risks is by direct binomial regression, where right censoring of the event times is handled by inverse probability of censoring weights. We extend the approach by combining it with landmarking to enable dynamic prediction of the cumulative incidence function. The proposed models are very flexible, as they allow the covariates to have complex time-varying effects, and we illustrate how to investigate possible time-varying structures using Wald tests. The models are fitted using generalized estimating equations. The method is applied to bone marrow transplant data and the performance is investigated in a simulation study. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
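
    As a baseline for what the regression models above predict, the nonparametric cumulative incidence function for competing risks can be sketched in a few lines (an Aalen-Johansen-style estimator; names and data layout are mine, not from the paper):

```python
def cumulative_incidence(data, cause=1):
    """Nonparametric (Aalen-Johansen) cumulative incidence function for
    one cause in a competing-risks setting.
    data: (time, event) pairs with event 0 = censored, 1, 2, ... = cause.
    Returns a list of (event_time, CIF) pairs."""
    times = sorted({t for t, e in data if e != 0})
    surv = 1.0            # overall event-free survival just before t
    cif = 0.0
    out = []
    for t in times:
        at_risk = sum(1 for u, _ in data if u >= t)
        d_all = sum(1 for u, e in data if u == t and e != 0)
        d_cause = sum(1 for u, e in data if u == t and e == cause)
        cif += surv * d_cause / at_risk     # uses survival just before t
        surv *= 1.0 - d_all / at_risk
        out.append((t, cif))
    return out
```

    Landmarking, as used in the paper, would refit such predictions on the subset still at risk at each landmark time, so the estimate is updated as follow-up information accrues.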

  10. Cumulative Risk Assessment Toolbox: Methods and Approaches for the Practitioner

    Directory of Open Access Journals (Sweden)

    Margaret M. MacDonell

    2013-01-01

    The historical approach to assessing health risks of environmental chemicals has been to evaluate them one at a time. In fact, we are exposed every day to a wide variety of chemicals and are increasingly aware of potential health implications. Although considerable progress has been made in the science underlying risk assessments for real-world exposures, implementation has lagged because many practitioners are unaware of methods and tools available to support these analyses. To address this issue, the US Environmental Protection Agency developed a toolbox of cumulative risk resources for contaminated sites, as part of a resource document that was published in 2007. This paper highlights information for nearly 80 resources from the toolbox and provides selected updates, with practical notes for cumulative risk applications. Resources are organized according to the main elements of the assessment process: (1) planning, scoping, and problem formulation; (2) environmental fate and transport; (3) exposure analysis extending to human factors; (4) toxicity analysis; and (5) risk and uncertainty characterization, including presentation of results. In addition to providing online access, plans for the toolbox include addressing nonchemical stressors and applications beyond contaminated sites, and further strengthening resource accessibility to support evolving analyses for cumulative risk and sustainable communities.

  11. Energy Current Cumulants in One-Dimensional Systems in Equilibrium

    Science.gov (United States)

    Dhar, Abhishek; Saito, Keiji; Roy, Anjan

    2018-06-01

    A recent theory based on fluctuating hydrodynamics predicts that one-dimensional interacting systems with particle, momentum, and energy conservation exhibit anomalous transport that falls into two main universality classes. The classification is based on behavior of equilibrium dynamical correlations of the conserved quantities. One class is characterized by sound modes with Kardar-Parisi-Zhang scaling, while the second class has diffusive sound modes. The heat mode follows Lévy statistics, with different exponents for the two classes. Here we consider heat current fluctuations in two specific systems, which are expected to be in the above two universality classes, namely, a hard particle gas with Hamiltonian dynamics and a harmonic chain with momentum conserving stochastic dynamics. Numerical simulations show completely different system-size dependence of current cumulants in these two systems. We explain this numerical observation using a phenomenological model of Lévy walkers with inputs from fluctuating hydrodynamics. This consistently explains the system-size dependence of heat current fluctuations. For the latter system, we derive the cumulant-generating function from a more microscopic theory, which also gives the same system-size dependence of cumulants.

  12. Preference, resistance to change, and the cumulative decision model.

    Science.gov (United States)

    Grace, Randolph C

    2018-01-01

    According to behavioral momentum theory (Nevin & Grace, 2000a), preference in concurrent chains and resistance to change in multiple schedules are independent measures of a common construct representing reinforcement history. Here I review the original studies on preference and resistance to change in which reinforcement variables were manipulated parametrically, conducted by Nevin, Grace and colleagues between 1997 and 2002, as well as more recent research. The cumulative decision model proposed by Grace and colleagues for concurrent chains is shown to provide a good account of both preference and resistance to change, and is able to predict the increased sensitivity to reinforcer rate and magnitude observed with constant-duration components. Residuals from fits of the cumulative decision model to preference and resistance to change data were positively correlated, supporting the prediction of behavioral momentum theory. Although some questions remain, the learning process assumed by the cumulative decision model, in which outcomes are compared against a criterion that represents the average outcome value in the current context, may provide a plausible model for the acquisition of differential resistance to change. © 2018 Society for the Experimental Analysis of Behavior.

  13. Stakeholder attitudes towards cumulative and aggregate exposure assessment of pesticides.

    Science.gov (United States)

    Verbeke, Wim; Van Loo, Ellen J; Vanhonacker, Filiep; Delcour, Ilse; Spanoghe, Pieter; van Klaveren, Jacob D

    2015-05-01

    This study evaluates the attitudes and perspectives of different stakeholder groups (agricultural producers, pesticide manufacturers, trading companies, retailers, regulators, food safety authorities, scientists and NGOs) towards the concepts of cumulative and aggregate exposure assessment of pesticides by means of qualitative in-depth interviews (n = 15) and a quantitative stakeholder survey (n = 65). The stakeholders involved generally agreed that the use of chemical pesticides is needed, primarily for meeting the need of feeding the growing world population, while clearly acknowledging the problematic nature of human exposure to pesticide residues. Current monitoring was generally perceived to be adequate, but the timeliness and consistency of monitoring practices across countries were questioned. The concept of cumulative exposure assessment was better understood by stakeholders than the concept of aggregate exposure assessment. Identified pitfalls were data availability, data limitations, sources and ways of dealing with uncertainties, as well as information and training needs. Regulators and food safety authorities were perceived as the stakeholder groups for whom cumulative and aggregate pesticide exposure assessment methods and tools would be most useful and acceptable. Insights obtained from this exploratory study have been integrated in the development of targeted and stakeholder-tailored dissemination and training programmes that were implemented within the EU-FP7 project ACROPOLIS. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Long-term cumulative depressive symptom burden and risk of cognitive decline and dementia among very old women.

    Science.gov (United States)

    Zeki Al Hazzouri, Adina; Vittinghoff, Eric; Byers, Amy; Covinsky, Ken; Blazer, Dan; Diem, Susan; Ensrud, Kristine E; Yaffe, Kristine

    2014-05-01

    Depressive symptoms and cognitive outcomes are strongly interrelated. Although rates of depressive symptoms fluctuate during late life, little is known about the impact of long-term cumulative depressive symptom burden on cognitive decline and dementia in older adults. This study examines the association of nearly 20 years of cumulative depressive symptoms with cognitive outcomes in a cohort of older women. We assessed depressive symptoms in 7,240 women using the Geriatric Depression Scale (GDS) at serial visits. We used a Poisson model with random slopes to estimate GDS trajectories for each participant from baseline to death or end of follow-up, and then characterized depressive symptom burden by quartile of the area under the curve. We assessed cognitive outcomes using repeated measures of the Mini-Mental State Examination (MMSE) and Trails B score over 20 years, a Year-20 neuropsychological test battery, and adjudicated dementia and mild cognitive impairment (MCI). Adjusting for potential confounders, compared with women in the lowest quartile of cumulative depressive symptom burden, women in the highest quartile had 21% more MMSE errors over time (95% CI = 17%, 26%), 20% worse Trails B scores over time (95% CI = 17%, 23%), worse scores on most of the Year-20 cognitive tests, and a twofold greater likelihood of developing dementia or MCI (95% CI = 1.48, 3.11). Long-term cumulative depressive symptom burden was associated with cognitive decline and risk of dementia or MCI. Older adults with a history of depression should be closely monitored for recurrent episodes or unresolved depressive symptoms as well as any cognitive deficits.
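
    The burden measure described, area under each participant's symptom trajectory followed by quartile assignment across the cohort, reduces to a small computation. A sketch with invented scores (the study first estimated each trajectory with a random-slope Poisson model):

```python
import math

def trajectory_auc(times, scores):
    """Trapezoidal area under a symptom-score trajectory:
    a cumulative symptom-burden summary for one participant."""
    area = 0.0
    for i in range(1, len(times)):
        area += (times[i] - times[i - 1]) * (scores[i] + scores[i - 1]) / 2.0
    return area

def quartile(value, cohort_values):
    """Quartile (1-4) of one participant's AUC within the cohort."""
    ranked = sorted(cohort_values)
    rank = sum(1 for v in ranked if v <= value)
    return max(1, math.ceil(4 * rank / len(ranked)))
```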

  15. Frequency spirals

    International Nuclear Information System (INIS)

    Ottino-Löffler, Bertrand; Strogatz, Steven H.

    2016-01-01

    We study the dynamics of coupled phase oscillators on a two-dimensional Kuramoto lattice with periodic boundary conditions. For coupling strengths just below the transition to global phase-locking, we find localized spatiotemporal patterns that we call “frequency spirals.” These patterns cannot be seen under time averaging; they become visible only when we examine the spatial variation of the oscillators' instantaneous frequencies, where they manifest themselves as two-armed rotating spirals. In the more familiar phase representation, they appear as wobbly periodic patterns surrounding a phase vortex. Unlike the stationary phase vortices seen in magnetic spin systems, or the rotating spiral waves seen in reaction-diffusion systems, frequency spirals librate: the phases of the oscillators surrounding the central vortex move forward and then backward, executing a periodic motion with zero winding number. We construct the simplest frequency spiral and characterize its properties using analytical and numerical methods. Simulations show that frequency spirals in large lattices behave much like this simple prototype.
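
    A minimal numerical setup for the system described, phase oscillators on a periodic 2D lattice with nearest-neighbour Kuramoto coupling, might look like this (a toy Euler integration, far smaller than the simulations in the paper; all parameter values are illustrative):

```python
import math, random

def kuramoto_lattice(n=8, coupling=1.0, dt=0.05, steps=200, seed=1):
    """Euler integration of dtheta/dt = omega + K * sum_nb sin(theta_nb - theta)
    on an n x n torus. Returns (theta, freq, omega), where freq holds the
    instantaneous frequencies used at the last integration step."""
    rng = random.Random(seed)
    omega = [[rng.gauss(0.0, 0.1) for _ in range(n)] for _ in range(n)]
    theta = [[rng.uniform(0.0, 2.0 * math.pi) for _ in range(n)] for _ in range(n)]
    freq = [[0.0] * n for _ in range(n)]
    for _ in range(steps):
        for i in range(n):
            for j in range(n):
                nb = (theta[(i - 1) % n][j], theta[(i + 1) % n][j],
                      theta[i][(j - 1) % n], theta[i][(j + 1) % n])
                freq[i][j] = omega[i][j] + coupling * sum(
                    math.sin(p - theta[i][j]) for p in nb)
        theta = [[theta[i][j] + dt * freq[i][j] for j in range(n)]
                 for i in range(n)]
    return theta, freq, omega
```

    The paper's point is that frequency spirals appear only in the spatial pattern of freq, the instantaneous frequencies, and are invisible in time-averaged quantities.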

  16. Frequency spirals

    Energy Technology Data Exchange (ETDEWEB)

    Ottino-Löffler, Bertrand; Strogatz, Steven H., E-mail: strogatz@cornell.edu [Center for Applied Mathematics, Cornell University, Ithaca, New York 14853 (United States)

    2016-09-15

    We study the dynamics of coupled phase oscillators on a two-dimensional Kuramoto lattice with periodic boundary conditions. For coupling strengths just below the transition to global phase-locking, we find localized spatiotemporal patterns that we call “frequency spirals.” These patterns cannot be seen under time averaging; they become visible only when we examine the spatial variation of the oscillators' instantaneous frequencies, where they manifest themselves as two-armed rotating spirals. In the more familiar phase representation, they appear as wobbly periodic patterns surrounding a phase vortex. Unlike the stationary phase vortices seen in magnetic spin systems, or the rotating spiral waves seen in reaction-diffusion systems, frequency spirals librate: the phases of the oscillators surrounding the central vortex move forward and then backward, executing a periodic motion with zero winding number. We construct the simplest frequency spiral and characterize its properties using analytical and numerical methods. Simulations show that frequency spirals in large lattices behave much like this simple prototype.

  17. Evolution of costly explicit memory and cumulative culture.

    Science.gov (United States)

    Nakamaru, Mayuko

    2016-06-21

    Humans can acquire new information and modify it (cumulative culture) based on their learning and memory abilities, especially explicit memory, through the processes of encoding, consolidation, storage, and retrieval. Explicit memory is categorized into semantic and episodic memories. Animals have semantic memory, while episodic memory is unique to humans and essential for innovation and the evolution of culture. As both episodic and semantic memory are needed for innovation, the evolution of explicit memory influences the evolution of culture. However, previous theoretical studies have shown that environmental fluctuations influence the evolution of imitation (social learning) and innovation (individual learning) and assume that memory is not an evolutionary trait. If individuals can store and retrieve acquired information properly, they can modify it and innovate new information. Therefore, being able to store and retrieve information is essential from the perspective of cultural evolution. However, if both storage and retrieval were too costly, forgetting and relearning would have an advantage over storing and retrieving acquired information. In this study, using mathematical analysis and individual-based simulations, we investigate whether cumulative culture can promote the coevolution of costly memory and social and individual learning, assuming that cumulative culture improves the fitness of each individual. The conclusions are: (1) without cumulative culture, a social learning cost is essential for the evolution of storage-retrieval. Costly storage-retrieval can evolve with individual learning but costly social learning does not evolve. When low-cost social learning evolves, the repetition of forgetting and learning is favored more than the evolution of costly storage-retrieval, even though a cultural trait improves the fitness. 
(2) When cumulative culture exists and improves fitness, storage-retrieval can evolve with social and/or individual learning, which

  18. A note on families of fragility curves

    International Nuclear Information System (INIS)

    Kaplan, S.; Bier, V.M.; Bley, D.C.

    1989-01-01

    In the quantitative assessment of seismic risk, uncertainty in the fragility of a structural component is usually expressed by putting forth a family of fragility curves, with probability serving as the parameter of the family. Commonly, a lognormal shape is used both for the individual curves and for the expression of uncertainty over the family. A so-called composite single curve can also be drawn and used for purposes of approximation. This composite curve is often regarded as equivalent to the mean curve of the family. The equality seems intuitively reasonable but, according to the authors, has never been proven. The paper proves this equivalence hypothesis mathematically. Moreover, the authors show that this equivalence hypothesis between fragility curves is itself equivalent to an identity property of the standard normal probability curve. Thus, in the course of proving the fragility curve hypothesis, the authors have also proved a rather obscure, but interesting and perhaps previously unrecognized, property of the standard normal curve.
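
    The equivalence described — the composite curve (a single lognormal with combined logarithmic standard deviation sqrt(beta_r^2 + beta_u^2)) equals the mean of the lognormal family — can be checked numerically. This sketch is not the authors' proof; it simply integrates the family over the uncertainty variable with a trapezoid rule and compares against the composite curve:

```python
import numpy as np
from math import erf, sqrt, log

def Phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def mean_family_fragility(a, A_m, beta_r, beta_u, n=4001):
    """Mean of the fragility family at ground acceleration a: the median
    capacity A_m is lognormally uncertain (log-std beta_u); each member
    curve is a lognormal fragility with log-std beta_r."""
    z = np.linspace(-8.0, 8.0, n)                    # uncertainty variable
    w = np.exp(-0.5 * z**2) / np.sqrt(2.0 * np.pi)   # standard normal pdf
    member = np.array([Phi((log(a) - log(A_m) - beta_u * zi) / beta_r)
                       for zi in z])
    y = member * w
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(z)))

def composite_fragility(a, A_m, beta_r, beta_u):
    """Composite single curve with combined log-std sqrt(br^2 + bu^2)."""
    return Phi((log(a) - log(A_m)) / sqrt(beta_r**2 + beta_u**2))
```

    For illustrative parameters (say A_m = 0.4 g, beta_r = 0.3, beta_u = 0.4) the two agree to within quadrature error at any a — the identity property of the standard normal curve that the authors mention.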

  19. El Carreto o Cumulá - Aspidosperma Dugandii Standl

    Directory of Open Access Journals (Sweden)

    Dugand Armando

    1944-03-01

    Full Text Available Common names: Carreto (Atlántico, Bolívar, Magdalena); Cumulá (Cundinamarca, Tolima). According to Dr. Emilio Robledo (Lecciones de Bot. ed. 3, 2: 544. 1939), the name Carreto is also used in Puerto Berrío (Antioquia). The same author (loc. cit.) gives the name Comulá for an undetermined species of Viburnum in Mariquita (Tolima), and J. M. Duque, referring to the same plant and locality (in Bot. Gen. Colomb. 340, 356. 1943), attributes this common name to Aspidosperma ellipticum Rusby. However, the wood samples of Cumulá or Comulá that I have examined, from the Mariquita region — one of which was recently sent to me by the distinguished ichthyologist Mr. Cecil Miles — belong without any doubt to A. Dugandii Standl. Moreover, Santiago Cortés (Fl. Colomb. 206. 1898; ed. 2: 239. 1912) cites the Cumulá "of Anapoima and other places on the (río) Magdalena", saying that it belongs to the Leguminosae, but this author's very brief description of the wood, "orange-coloured and notable for its density, hardness and resistance to humidity", leads me to believe that it is the same Cumulá recently collected in Tocaima, since that town lies only a few kilometres from Anapoima.

  20. Differential geometry curves, surfaces, manifolds

    CERN Document Server

    Kohnel, Wolfgang

    2002-01-01

    This carefully written book is an introduction to the beautiful ideas and results of differential geometry. The first half covers the geometry of curves and surfaces, which provide much of the motivation and intuition for the general theory. Special topics that are explored include Frenet frames, ruled surfaces, minimal surfaces and the Gauss-Bonnet theorem. The second part is an introduction to the geometry of general manifolds, with particular emphasis on connections and curvature. The final two chapters are insightful examinations of the special cases of spaces of constant curvature and Einstein manifolds. The text is illustrated with many figures and examples. The prerequisites are undergraduate analysis and linear algebra.

  1. LINS Curve in Romanian Economy

    Directory of Open Access Journals (Sweden)

    Emilian Dobrescu

    2016-02-01

    Full Text Available The paper presents theoretical considerations and empirical evidence to test the validity of the Laffer in Narrower Sense (LINS) curve as a parabola with a maximum. Attention is focused on the so-called legal-effective tax gap (letg). The econometric application is based on statistical data (1990-2013) for Romania as an emerging European economy. Three cointegrating regressions (fully modified least squares, canonical cointegrating regression, and dynamic least squares) and three algorithms based on instrumental variables (two-stage least squares, generalized method of moments, and limited information maximum likelihood) are involved.

  2. Parametric study of guided waves dispersion curves for composite plates

    Science.gov (United States)

    Predoi, Mihai Valentin; Petre, Cristian Cǎtǎlin; Kettani, Mounsif Ech Cherif El; Leduc, Damien

    2018-02-01

    Nondestructive testing of composite panels benefits from the relatively long-range propagation of guided waves in sandwich structures. The guided waves are sensitive to delamination, air-bubble inclusions and cracks, and can thus bring information about hidden defects in the composite panel. The preliminary data in all such inspections are the dispersion curves, which give the dependency of the phase/group velocity on frequency for the propagating modes. In fact, all modes are more or less attenuated, so it is all the more important to compute dispersion curves that also provide the modal attenuation as a function of frequency. Another important aspect is the sensitivity of the dispersion curves to each of the elastic constants of the composite, which is orthotropic in most cases. All these aspects are investigated in the present work, based on our specially developed finite element numerical model implemented in Comsol, which has several advantages over existing methods. The dispersion curves and modal displacements are computed for an example of a composite plate. Comparison with literature data validates the accuracy of our results.
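
    Once a dispersion branch ω(k) has been computed — by a finite element model such as the one described, or any other method — phase-velocity and group-velocity curves follow by simple post-processing. A generic sketch, not tied to the authors' Comsol model:

```python
import numpy as np

def phase_and_group_velocity(k, omega):
    """Phase velocity c_p = omega/k and group velocity c_g = d(omega)/dk
    for one sampled dispersion branch (k in rad/m, omega in rad/s)."""
    k = np.asarray(k, float)
    omega = np.asarray(omega, float)
    c_phase = omega / k
    c_group = np.gradient(omega, k)   # central finite differences
    return c_phase, c_group
```

    For a nondispersive branch ω = c·k the two velocities coincide; for a real plate mode they separate, and the attenuation-vs-frequency curve comes out the same way from the imaginary part of the wavenumber.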

  3. Cumulative Effect of Obesogenic Behaviours on Adiposity in Spanish Children and Adolescents

    Science.gov (United States)

    Schröder, Helmut; Bawaked, Rowaedh Ahmed; Ribas-Barba, Lourdes; Izquierdo-Pulido, Maria; Roman-Viñas, Blanca; Fíto, Montserrat; Serra-Majem, Lluis

    2018-01-01

    Objective: Little is known about the cumulative effect of obesogenic behaviours on childhood obesity risk. We determined the cumulative effect on BMI z-score, waist-to-height ratio (WHtR), overweight and abdominal obesity of four lifestyle behaviours that have been linked to obesity. Methods: In this cross-sectional analysis, data were obtained from the EnKid study, a representative sample of Spanish youth. The study included 1,614 boys and girls aged 5-18 years. Weight, height and waist circumference were measured. Physical activity (PA), screen time, breakfast consumption and meal frequency were self-reported on structured questionnaires. Obesogenic behaviours were defined as 1 SD from the mean of the WHO reference population. Abdominal obesity was defined as a WHtR ≥ 0.5. Results: High screen time was the most prominent obesogenic behaviour (49.7%), followed by low physical activity (22.4%), low meal frequency (14.4%), and skipping breakfast (12.5%). Although 33% of participants were free of all 4 obesogenic behaviours, 1, 2, and 3 or 4 behaviours were reported by 44.5%, 19.3%, and 5.0%, respectively. BMI z-score and WHtR were positively associated (p < 0.001) with increasing numbers of concurrent obesogenic behaviours. The odds of presenting with obesogenic behaviours were significantly higher in children who were overweight (OR 2.68; 95% CI 1.50; 4.80) or had abdominal obesity (OR 2.12; 95% CI 1.28; 3.52) when they reported more than 2 obesogenic behaviours. High maternal and paternal education was inversely associated (p = 0.004 and p < 0.001, respectively) with increasing presence of obesogenic behaviours. Surrogate markers of adiposity increased with the number of concurrent obesogenic behaviours; the opposite was true for high maternal and paternal education. PMID:29207394
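
    The exposure variable used here — a count of concurrent obesogenic behaviours, each flagged when it lies 1 SD beyond the reference mean in the risky direction — can be sketched as below. The function names and direction conventions are illustrative assumptions:

```python
import numpy as np

def obesogenic_flag(values, ref_mean, ref_sd, high_is_risky=True):
    """Flag one behaviour as obesogenic when it lies more than 1 SD from
    the reference mean in the risky direction (e.g. high screen time,
    low physical activity)."""
    v = np.asarray(values, float)
    return v > ref_mean + ref_sd if high_is_risky else v < ref_mean - ref_sd

def concurrent_behaviours(*flag_arrays):
    """Per-child count of concurrent obesogenic behaviours (the 0-4
    exposure scale analysed in the study)."""
    return np.sum([np.asarray(f, bool) for f in flag_arrays], axis=0)
```

    The resulting 0-4 count is then related to the adiposity markers (BMI z-score, WHtR) and to overweight/abdominal-obesity status by regression, as in the abstract.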

  4. Differential geometry and topology of curves

    CERN Document Server

    Animov, Yu

    2001-01-01

    Differential geometry is an actively developing area of modern mathematics. This volume presents a classical approach to the general topics of the geometry of curves, including the theory of curves in n-dimensional Euclidean space. The author investigates problems for special classes of curves and gives the working method used to obtain the conditions for closed polygonal curves. The proof of the Bakel-Werner theorem on conditions of boundedness for curves with periodic curvature and torsion is also presented. This volume also highlights the contributions made by great geometers, past and present, to differential geometry and the topology of curves.

  5. Cumulative beam break-up study of the spallation neutron source superconducting linac

    CERN Document Server

    Jeon, D; Krafft, G A; Yunn, B; Sundelin, R; Delayen, J; Kim, S; Doleans, M

    2002-01-01

    Beam instabilities due to High Order Modes (HOMs) are a concern to superconducting (SC) linacs such as the Spallation Neutron Source (SNS) linac. The effects of pulsed mode operation on transverse and longitudinal beam breakup instability are studied for an H⁻ beam in a consistent manner for the first time. Numerical simulation indicates that cumulative transverse beam breakup instabilities are not a concern in the SNS SC linac, primarily due to the heavy mass of the H⁻ beam and the HOM frequency spread resulting from manufacturing tolerances. As little as ±0.1 MHz HOM frequency spread stabilizes all the instabilities from both transverse HOMs, and also acts to stabilize the longitudinal HOMs. Such an assumed frequency spread of ±0.1 MHz is small, and hence conservative, compared with measured values of σ = 0.00109 for (f_HOM − f_0)/f_0 obtained from Cornell and the Jefferson Lab Free Electron Laser cavities. However, a few cavities may hit resonance lines and generate a high heat lo...

  6. Curved Radio Spectra of Weak Cluster Shocks

    Science.gov (United States)

    Kang, Hyesung; Ryu, Dongsu

    2015-08-01

    In order to understand certain observed features of arc-like giant radio relics such as the rareness, uniform surface brightness, and curved integrated spectra, we explore a diffusive shock acceleration (DSA) model for radio relics in which a spherical shock impinges on a magnetized cloud containing fossil relativistic electrons. Toward this end, we perform DSA simulations of spherical shocks with the parameters relevant for the Sausage radio relic in cluster CIZA J2242.8+5301, and calculate the ensuing radio synchrotron emission from re-accelerated electrons. Three types of fossil electron populations are considered: a delta-function-like population with the shock injection momentum, a power-law distribution, and a power law with an exponential cutoff. The surface brightness profile of the radio-emitting postshock region and the volume-integrated radio spectrum are calculated and compared with observations. We find that the observed width of the Sausage relic can be explained reasonably well by shocks with speed u_s ~ 3 × 10^3 km s^-1 and sonic Mach number M_s ~ 3. These shocks produce curved radio spectra that steepen gradually over (0.1-10)ν_br with a break frequency ν_br ~ 1 GHz if the duration of electron acceleration is ~60-80 Myr. However, the abrupt increase in the spectral index above ~1.5 GHz observed in the Sausage relic seems to indicate that additional physical processes, other than radiative losses, operate for electrons with γ_e ≳ 10^4.

  7. Evaluating changes in stable chromosomal translocation frequency in patients receiving radioimmunotherapy

    International Nuclear Information System (INIS)

    Wong, Jeffrey Y.C.; Wang Jianyi; Liu An; Odom-Maryon, Tamara; Shively, John E.; Raubitschek, Andrew A.; Williams, Lawrence E.

    2000-01-01

    Purpose: The lack of any consistent correlation between radioimmunotherapy (RIT) dose and observed hematologic toxicity has made it difficult to validate RIT radiation dose estimates to marrow. Stable chromosomal translocations (SCT) which result after radiation exposure may be a biologic parameter that more closely correlates with RIT radiation dose. Increases in the frequency of SCT are observed after radiation exposure and are highly correlated with absorbed radiation dose. SCT are cumulative after multiple radiation doses and conserved through an extended number of cell divisions. The purpose of this study was to evaluate whether increases in SCT frequency were detectable in peripheral lymphocytes after RIT and whether the magnitude of these increases correlated with estimated radiation dose to marrow and whole body. Methods and Materials: Patients entered in a Phase I dose escalation therapy trial each received 1-3 intravenous cycles of the radiolabeled anti-carcinoembryonic antigen (CEA) monoclonal antibody, ⁹⁰Y-chimeric T84.66. Five mCi of ¹¹¹In-chimeric T84.66 was co-administered for imaging and biodistribution purposes. Blood samples were collected immediately prior to the start of therapy and 5-6 weeks after each therapy cycle. Peripheral lymphocytes were harvested after 72 hours of phytohemagglutinin stimulation and metaphase spreads prepared. Spreads were then stained by fluorescence in situ hybridization (FISH) using commercially available chromosome paint probes to chromosomes 3 and 4. Approximately 1000 spreads were evaluated for each chromosome sample. Red marrow radiation doses were estimated using the AAPM algorithm and blood clearance curves. Results: Eighteen patients were studied, each receiving at least one cycle of therapy ranging from 5-22 mCi/m². Three patients received 2 cycles and two patients received 3 cycles of therapy. Cumulative estimated marrow doses ranged from 9.2 to 310 cGy. Increases in SCT frequencies were observed after

  8. Acoustic energy harvesting by piezoelectric curved beams in the cavity of a sonic crystal

    International Nuclear Information System (INIS)

    Wang, Wei-Chung; Wu, Liang-Yu; Chen, Lien-Wen; Liu, Chia-Ming

    2010-01-01

    Acoustic energy harvesting by piezoelectric curved beams in the cavity of a sonic crystal is investigated. A resonant cavity of the sonic crystal is used to localize the acoustic wave when acoustic waves are incident on the sonic crystal at the resonant frequency. The piezoelectric curved beam is placed in the resonant cavity and vibrated by the acoustic wave, so energy harvesting is achieved when the incident waves are at the resonant frequency. A model for energy harvesting by the piezoelectric curved beam is also developed to predict the output voltage and power of the harvester. The experimental results are compared with the theoretical

  9. Flow characteristics of curved ducts

    Directory of Open Access Journals (Sweden)

    Rudolf P.

    2007-10-01

    Full Text Available Curved channels are very often present in real hydraulic systems, e.g. curved diffusers of hydraulic turbines, S-shaped bulb turbines, fittings, etc. Curvature brings a change of velocity profile, generation of vortices and production of hydraulic losses. Flow simulations using CFD techniques were performed to understand these phenomena. Cases ranging from a single elbow to coupled elbows in U, S and spatial right-angle configurations, all with circular cross-section, were modeled for Re = 60000. Spatial development of the flow was studied, and it was deduced that minor losses are connected with the transformation of pressure energy into kinetic energy and vice versa. This transformation is a dissipative process and is reflected in the amount of energy irreversibly lost. The smallest loss coefficient is associated with flow in U-shape elbows, the largest with flow in S-shape elbows. Finally, the extent of the flow domain influenced by the presence of curvature was examined. This is important for proper placement of manometers and flowmeters during experimental tests. Simulations were verified against experimental results presented in the literature.

  10. Improved capacitive melting curve measurements

    International Nuclear Information System (INIS)

    Sebedash, Alexander; Tuoriniemi, Juha; Pentti, Elias; Salmela, Anssi

    2009-01-01

    Sensitivity of the capacitive method for determining the melting pressure of helium can be enhanced by loading the empty side of the capacitor with helium at a pressure nearly equal to that desired to be measured and by using a relatively thin and flexible membrane in between. This way one can achieve a nanobar resolution at the level of 30 bar, which is two orders of magnitude better than that of the best gauges with vacuum reference. This extends the applicability of melting curve thermometry to lower temperatures and would allow detecting tiny anomalies in the melting pressure, which must be associated with any phenomena contributing to the entropy of the liquid or solid phases. We demonstrated this principle in measurements of the crystallization pressure of isotopic helium mixtures at millikelvin temperatures by using partly solid pure ⁴He as the reference substance providing the best possible universal reference pressure. The achieved sensitivity was good enough for melting curve thermometry on mixtures down to 100 μK. A similar system can be used on pure isotopes by virtue of a blocked capillary giving a stable reference condition with liquid slightly below the melting pressure in the reference volume. This was tested with pure ⁴He at temperatures 0.08-0.3 K. To avoid spurious heating effects, one must carefully choose and arrange any dielectric materials close to the active capacitor. We observed some 100 pW loading at moderate excitation voltages.

  11. Classical optics and curved spaces

    International Nuclear Information System (INIS)

    Bailyn, M.; Ragusa, S.

    1976-01-01

    In the eikonal approximation of classical optics, the unit polarization 3-vector of light satisfies an equation that depends only on the index of refraction, n. It is known that if the original 3-space line element is dσ², then this polarization direction propagates parallelly in the fictitious space n²dσ². Since the equation depends only on n, it is possible to invent a fictitious curved 4-space in which the light performs a null geodesic and the polarization 3-vector behaves as the 'shadow' of a parallelly propagated 4-vector. The inverse, namely the reduction of Maxwell's equations on a curved (dielectric-free) space to a classical space with dielectric constant n = (−g₀₀)^(−1/2), is well known, but in the latter the dielectric constant ε and permeability μ must also equal (−g₀₀)^(−1/2). The rotation of polarization as light bends around the sun is calculated by utilizing the reduction to the classical space. This (non-)rotation may then be interpreted as parallel transport in the 3-space n²dσ².

  12. Investigating low-frequency compression using the Grid method

    DEFF Research Database (Denmark)

    Fereczkowski, Michal; Dau, Torsten; MacDonald, Ewen

    2016-01-01

    There is an ongoing discussion about whether the amount of cochlear compression in humans at low frequencies (below 1 kHz) is as high as that at higher frequencies. It is controversial whether the compression affects the slope of the off-frequency forward masking curves at those frequencies. Here, the Grid method with a 2-interval 1-up 3-down tracking rule was applied to estimate forward masking curves at two characteristic frequencies: 500 Hz and 4000 Hz. The resulting curves and the corresponding basilar membrane input-output (BM I/O) functions were found to be comparable to those reported in literature. Moreover, slopes of the low-level portions of the BM I/O functions estimated at 500 Hz were examined, to determine whether the 500-Hz off-frequency forward masking curves were affected by compression. Overall, the collected data showed a trend confirming the compressive behaviour. However...

  13. Fluid Overload and Cumulative Thoracostomy Output Are Associated With Surgical Site Infection After Pediatric Cardiothoracic Surgery.

    Science.gov (United States)

    Sochet, Anthony A; Nyhan, Aoibhinn; Spaeder, Michael C; Cartron, Alexander M; Song, Xiaoyan; Klugman, Darren; Brown, Anna T

    2017-08-01

    To determine the impact of cumulative postoperative thoracostomy output, amount of bolus IV fluids, and peak fluid overload on the incidence and odds of developing a deep surgical site infection following pediatric cardiothoracic surgery. A single-center, nested, retrospective, matched case-control study. A 26-bed cardiac ICU in a 303-bed tertiary care pediatric hospital. Cases with deep surgical site infection following cardiothoracic surgery were identified retrospectively from January 2010 through December 2013 and individually matched to controls at a ratio of 1:2 by age, gender, Risk Adjustment for Congenital Heart Surgery score, Society of Thoracic Surgeons-European Association for Cardiothoracic Surgery category, primary cardiac diagnosis, and procedure. None. Twelve cases with deep surgical site infection were identified and matched to 24 controls without detectable differences in perioperative clinical characteristics. Deep surgical site infection cases had larger thoracostomy output and bolus IV fluid volumes at 6, 24, and 48 hours postoperatively compared with controls. For every 1 mL/kg of thoracostomy output, the odds of developing a deep surgical site infection increase by 13%. By receiver operating characteristic curve analysis, a cutoff of 49 mL/kg of thoracostomy output at 48 hours best discriminates the development of deep surgical site infection (sensitivity 83%, specificity 83%). Peak fluid overload was greater in cases than matched controls (12.5% vs 6%). By receiver operating characteristic curve analysis, a threshold value of 10% peak fluid overload was observed to identify deep surgical site infection (sensitivity 67%, specificity 79%). Conditional logistic regression of peak fluid overload greater than 10% on the development of deep surgical site infection yielded an odds ratio of 9.4 (95% CI, 2-46.2). Increased postoperative peak fluid overload and cumulative thoracostomy output were associated with deep surgical site infection after pediatric
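
    The cutoff selection reported here (e.g. 49 mL/kg of thoracostomy output discriminating deep surgical site infection at 83%/83% sensitivity/specificity) is typically done by scanning the ROC curve for the point maximizing Youden's J. A self-contained sketch with hypothetical names; the study states only that the cutoff "best discriminates", so the J criterion is an assumption:

```python
import numpy as np

def best_roc_cutoff(marker, has_infection):
    """Scan all observed values of a continuous marker (e.g. 48-h
    thoracostomy output in mL/kg) as candidate cutoffs and return the
    one maximizing Youden's J = sensitivity + specificity - 1."""
    x = np.asarray(marker, float)
    y = np.asarray(has_infection, bool)
    best_cut, best_j, best_sens, best_spec = None, -1.0, 0.0, 0.0
    for c in np.unique(x):
        pred = x >= c                        # classify as case above cutoff
        sens = float(np.mean(pred[y]))       # true positive rate
        spec = float(np.mean(~pred[~y]))     # true negative rate
        if sens + spec - 1.0 > best_j:
            best_cut, best_j = float(c), sens + spec - 1.0
            best_sens, best_spec = sens, spec
    return best_cut, best_sens, best_spec
```

    With only 12 cases and 24 controls, as in this study, the cutoff and its sensitivity/specificity carry wide uncertainty, which is worth keeping in mind when reading the 49 mL/kg and 10% thresholds.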

  14. Test-Anchored Vibration Response Predictions for an Acoustically Energized Curved Orthogrid Panel with Mounted Components

    Science.gov (United States)

    Frady, Gregory P.; Duvall, Lowery D.; Fulcher, Clay W. G.; Laverde, Bruce T.; Hunt, Ronald A.

    2011-01-01

    A rich body of vibroacoustic test data was recently generated at Marshall Space Flight Center for component-loaded curved orthogrid panels typical of launch vehicle skin structures. The test data were used to anchor computational predictions of a variety of spatially distributed responses including acceleration, strain and component interface force. Transfer functions relating the responses to the input pressure field were generated from finite element based modal solutions and test-derived damping estimates. A diffuse acoustic field model was applied to correlate the measured input sound pressures across the energized panel. This application quantifies the ability to quickly and accurately predict a variety of responses to acoustically energized skin panels with mounted components. Favorable comparisons between the measured and predicted responses were established. The validated models were used to examine vibration response sensitivities to relevant modeling parameters such as pressure patch density, mesh density, weight of the mounted component and model form. Convergence metrics include spectral densities and cumulative root-mean-squared (RMS) functions for acceleration, velocity, displacement, strain and interface force. Minimum frequencies for response convergence were established as well as recommendations for modeling techniques, particularly in the early stages of a component design when accurate structural vibration requirements are needed relatively quickly. The results were compared with long-established guidelines for modeling accuracy of component-loaded panels. A theoretical basis for the Response/Pressure Transfer Function (RPTF) approach provides insight into trends observed in the response predictions and confirmed in the test data. The software developed for the RPTF method allows easy replacement of the diffuse acoustic field with other pressure fields such as a turbulent boundary layer (TBL) model suitable for vehicle ascent.
Structural responses

  15. Mismatch or cumulative stress : Toward an integrated hypothesis of programming effects

    NARCIS (Netherlands)

    Nederhof, Esther; Schmidt, Mathias V.

    2012-01-01

    This paper integrates the cumulative stress hypothesis with the mismatch hypothesis, taking into account individual differences in sensitivity to programming. According to the cumulative stress hypothesis, individuals are more likely to suffer from disease as adversity accumulates. According to the

  16. The association of patient characteristics and spinal curve parameters with Lenke classification types.

    Science.gov (United States)

    Sponseller, Paul D; Flynn, John M; Newton, Peter O; Marks, Michelle C; Bastrom, Tracey P; Petcharaporn, Maty; McElroy, Mark J; Lonner, Baron S; Betz, Randal R

    2012-06-01

    Retrospective review. To determine the association of patient characteristics and spinal curve parameters with Lenke curve types. The Lenke curve classification may be used for surgical planning and clinical research. We retrospectively reviewed the records of 1912 patients with adolescent idiopathic scoliosis who underwent initial surgery at 21 years of age or younger; collected data on patient's age, patient's sex, primary curve magnitude, and Scoliosis Research Society (SRS) outcomes questionnaire (SRS-22) score; and compared those data by Lenke curve type. Analysis of variance and χ² tests were used as appropriate (significance level, P ≤ 0.005). Results: Lenke types vary by sex: male patients had more major thoracic (types 1-4) than major thoracolumbar/lumbar (types 5 and 6) curves, fewer lumbar C-modifiers (32% vs. 44%), and less apical lumbar translation (1.1 vs. 1.7 cm). Lenke types vary by frequency: the most common type was 1 (50%); the least common, 4 (4%). Lenke types vary by magnitude: type 4 had the greatest percentage of large curves (52% of curves >75°), most smaller curves were types 1 and 5, and type 4 had the largest mean magnitude (78° ± 17°). Lenke types vary by patient age: type 5 curves occurred in the oldest patients (average age at surgery: 15.4 ± 2.2 vs. 14.3 ± 14.6 years for all others) despite having the lowest mean magnitude (P = 0.001); curve size was negatively correlated with age at surgery (r = -0.16, P = 0.001). Lenke types vary by patient self-image: patients with type 4 curves had lower preoperative SRS outcome scores for self-image than did patients with type 1 curves (P = 0.005). Lenke types vary by sex, frequency, magnitude, patient age, and patient self-image, which should be considered in designing studies.

  17. Maximally Informative Stimuli and Tuning Curves for Sigmoidal Rate-Coding Neurons and Populations

    Science.gov (United States)

    McDonnell, Mark D.; Stocks, Nigel G.

    2008-08-01

    A general method for deriving maximally informative sigmoidal tuning curves for neural systems with small normalized variability is presented. The optimal tuning curve is a nonlinear function of the cumulative distribution function of the stimulus and depends on the mean-variance relationship of the neural system. The derivation is based on a known relationship between Shannon's mutual information and Fisher information, and the optimality of Jeffreys' prior. It relies on the existence of closed-form solutions to the converse problem of optimizing the stimulus distribution for a given tuning curve. It is shown that maximum mutual information corresponds to constant Fisher information only if the stimulus is uniformly distributed. As an example, the case of sub-Poisson binomial firing statistics is analyzed in detail.
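
    The central result — the optimal tuning curve is a nonlinear function of the stimulus CDF — has a simple closed form in the plain Poisson case (a simplification of the paper's sub-Poisson binomial setting): Fisher information is J(s) = h'(s)²/h(s), and requiring sqrt(J(s)) ∝ p(s) (Jeffreys-prior optimality) gives h(s) = h_max·F(s)². A numerical sketch under that assumed variability model:

```python
import numpy as np
from math import erf, sqrt

def stimulus_cdf(s):
    """CDF of a standard Gaussian stimulus distribution."""
    return np.array([0.5 * (1.0 + erf(x / sqrt(2.0))) for x in s])

def stimulus_pdf(s):
    return np.exp(-0.5 * s**2) / np.sqrt(2.0 * np.pi)

def infomax_tuning_curve(s, h_max=100.0):
    """Optimal sigmoidal tuning curve for Poisson variability:
    h(s) = h_max * F(s)**2, a nonlinear function of the stimulus CDF."""
    return h_max * stimulus_cdf(s) ** 2

# Check the optimality condition sqrt(J(s)) proportional to p(s):
s = np.linspace(-2.0, 2.0, 4001)
h = infomax_tuning_curve(s)
sqrt_J = np.abs(np.gradient(h, s)) / np.sqrt(h)   # sqrt(h'^2 / h)
ratio = sqrt_J / stimulus_pdf(s)                  # ~ constant 2*sqrt(h_max)
```

    The constant ratio (2·sqrt(h_max) here) is the condition that maximizes mutual information; with a uniform stimulus it reduces to constant Fisher information, matching the statement in the abstract.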

  18. Atlas of stress-strain curves

    CERN Document Server

    2002-01-01

    The Atlas of Stress-Strain Curves, Second Edition is substantially bigger in page dimensions, number of pages, and total number of curves than the previous edition. It contains over 1,400 curves, almost three times as many as in the 1987 edition. The curves are normalized in appearance to aid making comparisons among materials. All diagrams include metric (SI) units, and many also include U.S. customary units. All curves are captioned in a consistent format with valuable information including (as available) standard designation, the primary source of the curve, mechanical properties (including hardening exponent and strength coefficient), condition of sample, strain rate, test temperature, and alloy composition. Curve types include monotonic and cyclic stress-strain, isochronous stress-strain, and tangent modulus. Curves are logically arranged and indexed for fast retrieval of information. The book also includes an introduction that provides background information on methods of stress-strain determination, on...

  19. Transition curves for highway geometric design

    CERN Document Server

    Kobryń, Andrzej

    2017-01-01

    This book provides concise descriptions of the various solutions of transition curves, which can be used in geometric design of roads and highways. It presents mathematical methods and curvature functions for defining transition curves.

  20. Comparison and evaluation of mathematical lactation curve ...

    African Journals Online (AJOL)

    A mathematical model of the lactation curve provides summary information about culling and milking strategies ..... Table 2 Statistics of the edited data for first lactation Holstein cows ..... Application of different models to the lactation curves of.

  1. Science and societal partnerships to address cumulative impacts

    Directory of Open Access Journals (Sweden)

    Carolyn J Lundquist

    2016-02-01

    Funding and priorities for ocean research are not separate from the underlying sociological, economic, and political landscapes that determine the values attributed to ecological systems. Here we present a variation on science prioritisation exercises, focussing on inter-disciplinary research questions with the objective of shifting broad-scale management practices to better address cumulative impacts and multiple users. Marine scientists in New Zealand from a broad range of scientific and social-scientific backgrounds ranked 48 statements of research priorities. At a follow-up workshop, participants discussed five over-arching themes based on the survey results. These themes were used to develop mechanisms to increase the relevance and efficiency of scientific research while acknowledging the socio-economic and political drivers of research agendas in New Zealand's ocean ecosystems. Overarching messages included the need to: (1) determine the conditions under which 'surprises' (sudden and substantive undesirable changes) are likely to occur and the socio-ecological implications of such changes; (2) develop methodologies to reveal the complex and cumulative effects of change in marine systems, and their implications for resource use, stewardship, and restoration; (3) assess potential solutions to management issues that balance long-term and short-term benefits and encompass societal engagement in decision-making; (4) establish effective and appropriately resourced institutional networks to foster collaborative, solution-focused marine science; and (5) establish cross-disciplinary dialogues to translate diverse scientific and social-scientific knowledge into innovative regulatory, social and economic practice. In the face of multiple uses and cumulative stressors, ocean management frameworks must be adapted to build a collaborative framework across science, governance and society that can help stakeholders navigate uncertainties and socio-ecological surprises.

  2. Cumulative risk hypothesis: Predicting and preventing child maltreatment recidivism.

    Science.gov (United States)

    Solomon, David; Åsberg, Kia; Peer, Samuel; Prince, Gwendolyn

    2016-08-01

    Although Child Protective Services (CPS) and other child welfare agencies aim to prevent further maltreatment in cases of child abuse and neglect, recidivism is common. Having a better understanding of recidivism predictors could aid in preventing additional instances of maltreatment. A previous study identified two CPS interventions that predicted recidivism: psychotherapy for the parent, which was related to a reduced risk of recidivism, and temporary removal of the child from the parent's custody, which was related to an increased recidivism risk. However, counter to expectations, this previous study did not identify any other specific risk factors related to maltreatment recidivism. For the current study, it was hypothesized that (a) cumulative risk (i.e., the total number of risk factors) would significantly predict maltreatment recidivism above and beyond intervention variables in a sample of CPS case files and that (b) therapy for the parent would be related to a reduced likelihood of recidivism. Because it was believed that the relation between temporary removal of a child from the parent's custody and maltreatment recidivism is explained by cumulative risk, the study also hypothesized that that the relation between temporary removal of the child from the parent's custody and recidivism would be mediated by cumulative risk. After performing a hierarchical logistic regression analysis, the first two hypotheses were supported, and an additional predictor, psychotherapy for the child, also was related to reduced chances of recidivism. However, Hypothesis 3 was not supported, as risk did not significantly mediate the relation between temporary removal and recidivism. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Cumulative or delayed nephrotoxicity after cisplatin (DDP) treatment.

    Science.gov (United States)

    Pinnarò, P; Ruggeri, E M; Carlini, P; Giovannelli, M; Cognetti, F

    1986-04-30

    The present retrospective study reports data regarding renal toxicity in 115 patients (63 males, 52 females; median age, 56 years) who received cumulative doses of cisplatin (DDP) greater than or equal to 200 mg/m2. DDP was administered alone or in combination at a dose of 50-70 mg/m2 in 91 patients, and at a dose of 100 mg/m2 in 22 patients. Two patients after progression of ovarian carcinoma treated with conventional doses of DDP received 4 and 2 courses, respectively, of high-dose DDP (40 mg/m2 for 5 days) in hypertonic saline. The median number of DDP courses was 6 (range 2-14), and the median cumulative dose was 350 mg/m2 (range, 200-1200). Serum creatinine and urea nitrogen were determined before initiating the treatment and again 13-16 days after each administration. The incidence of azotemia (creatinine levels that exceeded 1.5 mg/dl) was similar before (7.8%) and after (6.1%) DDP doses of 200 mg/m2. Azotemia appears to be related to the association of DDP with other potentially nephrotoxic antineoplastic drugs (methotrexate) more than to the dose per course of DDP. Of 59 patients followed for 2 months or more after discontinuing the DDP treatment, 3 (5.1%) presented creatinine values higher than 1.5 mg/dl. The data do not support a higher incidence of nephrotoxicity in patients receiving higher cumulative doses of DDP, and confirm that increases in serum creatinine levels may occur some time after discontinuation of the drug.

  4. The proportional odds cumulative incidence model for competing risks

    DEFF Research Database (Denmark)

    Eriksson, Frank; Li, Jianing; Scheike, Thomas

    2015-01-01

    We suggest an estimator for the proportional odds cumulative incidence model for competing risks data. The key advantage of this model is that the regression parameters have the simple and useful odds ratio interpretation. The model has been considered by many authors, but it is rarely used in practice due to the lack of reliable estimation procedures. We suggest such procedures and show that their performance improves considerably on existing methods. We also suggest a goodness-of-fit test for the proportional odds assumption. We derive the large sample properties and provide estimators...
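
    The odds-ratio interpretation mentioned above can be sketched numerically. Assuming the model form F(t|x)/(1 − F(t|x)) = exp(β′x)·g(t) with a hypothetical increasing baseline odds function g, the factor exp(β′x) multiplies the odds of the cumulative incidence uniformly over time:

```python
import numpy as np

def cumulative_incidence(t, x, beta, baseline_odds):
    """Sketch of the proportional odds cumulative incidence model.

    Assumes F(t|x) / (1 - F(t|x)) = exp(beta' x) * baseline_odds(t),
    so exp(beta' x) is the odds ratio for the cumulative incidence.
    baseline_odds is a hypothetical increasing function of time.
    """
    odds = np.exp(np.dot(beta, x)) * baseline_odds(t)
    return odds / (1.0 + odds)

baseline = lambda t: 0.5 * t          # illustrative baseline odds
F_treated = cumulative_incidence(2.0, np.array([1.0]), np.array([0.7]), baseline)
F_control = cumulative_incidence(2.0, np.array([0.0]), np.array([0.7]), baseline)
```

    Taking the ratio of the two groups' odds of incidence at any fixed time recovers exp(β) exactly, which is the interpretability advantage the abstract refers to.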

  5. Cumulative exposure to phthalates from phthalate-containing drug products

    DEFF Research Database (Denmark)

    Ennis, Zandra Nymand; Broe, Anne; Pottegård, Anton

    2018-01-01

    European regulatory limit of exposure ranging between 380-1710 mg/year throughout the study period. Lithium-products constituted the majority of dibutyl phthalate exposure. Diethyl phthalate exposure, mainly caused by erythromycin, theophylline and diclofenac products, did not exceed the EMA regulatory … to quantify annual cumulated phthalate exposure from drug products among users of phthalate-containing oral medications in Denmark throughout the period of 2004-2016. METHODS: We conducted a Danish nationwide cohort study using The Danish National Prescription Registry and an internal database held…

  6. Numerical simulation of explosive magnetic cumulative generator EMG-720

    Energy Technology Data Exchange (ETDEWEB)

    Deryugin, Yu N; Zelenskij, D K; Kazakova, I F; Kargin, V I; Mironychev, P V; Pikar, A S; Popkov, N F; Ryaslov, E A; Ryzhatskova, E G [All-Russian Research Inst. of Experimental Physics, Sarov (Russian Federation)

    1997-12-31

    The paper discusses the methods and results of numerical simulations used in the development of a helical-coaxial explosive magnetic cumulative generator (EMG) with the stator up to 720 mm in diameter. In the process of designing, separate units were numerically modeled, as was the generator operation with a constant inductive-ohmic load. The 2-D processes of the armature acceleration by the explosion products were modeled as well as those of the formation of the sliding high-current contact between the armature and stator's insulated turns. The problem of the armature integrity in the region of the detonation waves collision was numerically analyzed. 8 figs., 2 refs.

  7. Cumulative exergy losses associated with the production of lead metal

    Energy Technology Data Exchange (ETDEWEB)

    Szargut, J [Technical Univ. of Silesia, Gliwice (PL). Inst. of Thermal-Engineering; Morris, D R [New Brunswick Univ., Fredericton, NB (Canada). Dept. of Chemical Engineering

    1990-08-01

    Cumulative exergy losses result from the irreversibility of the links of a technological network leading from raw materials and fuels extracted from nature to the product under consideration. The sum of these losses can be apportioned into partial exergy losses (associated with particular links of the technological network) or into constituent exergy losses (associated with constituent subprocesses of the network). The methods of calculation of the partial and constituent exergy losses are presented, taking into account the useful byproducts substituting the major products of other processes. Analyses of partial and constituent exergy losses are made for the technological network of lead metal production. (author).

  8. The Effects of Semantic Transparency and Base Frequency on the Recognition of English Complex Words

    Science.gov (United States)

    Xu, Joe; Taft, Marcus

    2015-01-01

    A visual lexical decision task was used to examine the interaction between base frequency (i.e., the cumulative frequencies of morphologically related forms) and semantic transparency for a list of derived words. Linear mixed effects models revealed that high base frequency facilitates the recognition of the complex word (i.e., a "base…

  9. Gelfond–Bézier curves

    KAUST Repository

    Ait-Haddou, Rachid; Sakane, Yusuke; Nomura, Taishin

    2013-01-01

    We show that the generalized Bernstein bases in Müntz spaces defined by Hirschman and Widder (1949) and extended by Gelfond (1950) can be obtained as pointwise limits of the Chebyshev–Bernstein bases in Müntz spaces with respect to an interval [a, 1] as the positive real number a converges to zero. Such a realization allows for concepts of curve design such as de Casteljau algorithm, blossom, dimension elevation to be transferred from the general theory of Chebyshev blossoms in Müntz spaces to these generalized Bernstein bases that we termed here as Gelfond–Bernstein bases. The advantage of working with Gelfond–Bernstein bases lies in the simplicity of the obtained concepts and algorithms as compared to their Chebyshev–Bernstein bases counterparts.
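
    The de Casteljau algorithm mentioned above, in its classical Bézier form (the Müntz-space generalization discussed in the record replaces these plain affine combinations with weighted ones), can be sketched as:

```python
def de_casteljau(control_points, t):
    """Sketch of the classical de Casteljau algorithm for evaluating a
    Bezier curve at parameter t: repeatedly replace adjacent control
    points by their affine combination (1-t)*P_i + t*P_{i+1} until one
    point remains.
    """
    pts = list(control_points)
    while len(pts) > 1:
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]

# Quadratic Bezier curve with three 2-D control points, evaluated at t = 0.5
mid = de_casteljau([(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)], 0.5)
```

    For this quadratic example the midpoint lands at (1.0, 1.0), halfway along the curve.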

  10. Bubble Collision in Curved Spacetime

    International Nuclear Information System (INIS)

    Hwang, Dong-il; Lee, Bum-Hoon; Lee, Wonwoo; Yeom, Dong-han

    2014-01-01

    We study vacuum bubble collisions in curved spacetime, in which vacuum bubbles were nucleated in the initial metastable vacuum state by quantum tunneling. The bubbles materialize randomly at different times and then start to grow. It is known that the percolation by true vacuum bubbles is not possible due to the exponential expansion of the space among the bubbles. In this paper, we consider two bubbles of the same size with a preferred axis and assume that two bubbles form very near each other to collide. The two bubbles have the same field value. When the bubbles collide, the collided region oscillates back-and-forth and then the collided region eventually decays and disappears. We discuss radiation and gravitational wave resulting from the collision of two bubbles

  11. Bacterial streamers in curved microchannels

    Science.gov (United States)

    Rusconi, Roberto; Lecuyer, Sigolene; Guglielmini, Laura; Stone, Howard

    2009-11-01

    Biofilms, generally identified as microbial communities embedded in a self-produced matrix of extracellular polymeric substances, are involved in a wide variety of health-related problems ranging from implant-associated infections to disease transmissions and dental plaque. The usual picture of these bacterial films is that they grow and develop on surfaces. However, suspended biofilm structures, or streamers, have been found in natural environments (e.g., rivers, acid mines, hydrothermal hot springs) and are always suggested to stem from a turbulent flow. We report the formation of bacterial streamers in curved microfluidic channels. By using confocal laser microscopy we are able to directly image and characterize the spatial and temporal evolution of these filamentous structures. Such streamers, which always connect the inner corners of opposite sides of the channel, are always located in the middle plane. Numerical simulations of the flow provide evidences for an underlying hydrodynamic mechanism behind the formation of the streamers.

  12. Gelfond–Bézier curves

    KAUST Repository

    Ait-Haddou, Rachid

    2013-02-01

    We show that the generalized Bernstein bases in Müntz spaces defined by Hirschman and Widder (1949) and extended by Gelfond (1950) can be obtained as pointwise limits of the Chebyshev–Bernstein bases in Müntz spaces with respect to an interval [a, 1] as the positive real number a converges to zero. Such a realization allows for concepts of curve design such as de Casteljau algorithm, blossom, dimension elevation to be transferred from the general theory of Chebyshev blossoms in Müntz spaces to these generalized Bernstein bases that we termed here as Gelfond–Bernstein bases. The advantage of working with Gelfond–Bernstein bases lies in the simplicity of the obtained concepts and algorithms as compared to their Chebyshev–Bernstein bases counterparts.

  13. Sibling curves of quadratic polynomials | Wiggins | Quaestiones ...

    African Journals Online (AJOL)

    Sibling curves were demonstrated in [1, 2] as a novel way to visualize the zeroes of real valued functions. In [3] it was shown that a polynomial of degree n has n sibling curves. This paper focuses on the algebraic and geometric properties of the sibling curves of real and complex quadratic polynomials. Key words: Quadratic ...

  14. GLOBAL AND STRICT CURVE FITTING METHOD

    NARCIS (Netherlands)

    Nakajima, Y.; Mori, S.

    2004-01-01

    To find a global and smooth curve fitting, the cubic B-spline method and gathering-line methods are investigated. When segmenting and recognizing a contour curve of a character shape, some global method is required. If we want to connect contour curves around a singular point like crossing points,

  15. Trigonometric Characterization of Some Plane Curves

    Indian Academy of Sciences (India)

    IAS Admin

    (Figure 1). A relation between tan θ and tan ψ gives the trigonometric equation of the family of curves. In this article, trigonometric equations of some known plane curves are deduced and it is shown that these equations reveal some geometric characteristics of the families of the curves under consideration. In Section 2, ...

  16. M-curves and symmetric products

    Indian Academy of Sciences (India)

    Indranil Biswas

    2017-08-03

    Aug 3, 2017 ... is bounded above by g + 1, where g is the genus of X [11]. Curves which have exactly the maximum number (i.e., genus +1) of components of the real part are called M-curves. Classifying real algebraic curves up to homeomorphism is straightforward, however, classifying even planar non-singular real ...

  17. Holomorphic curves in exploded manifolds: Kuranishi structure

    OpenAIRE

    Parker, Brett

    2013-01-01

    This paper constructs a Kuranishi structure for the moduli stack of holomorphic curves in exploded manifolds. To avoid some technicalities of abstract Kuranishi structures, we embed our Kuranishi structure inside a moduli stack of curves. The construction also works for the moduli stack of holomorphic curves in any compact symplectic manifold.

  18. Automated Blazar Light Curves Using Machine Learning

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Spencer James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-07-27

    This presentation describes a problem and methodology pertaining to automated blazar light curves. Namely, optical variability patterns for blazars require the construction of light curves and in order to generate the light curves, data must be filtered before processing to ensure quality.

  19. Expansion formulae for characteristics of cumulative cost in finite horizon production models

    NARCIS (Netherlands)

    Ayhan, H.; Schlegel, S.

    2001-01-01

    We consider the expected value and the tail probability of cumulative shortage and holding cost (i.e. the probability that cumulative cost is more than a certain value) in finite horizon production models. An exact expression is provided for the expected value of the cumulative cost for general
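
    A minimal Monte Carlo sketch of the quantities discussed above (the expected cumulative shortage/holding cost and its tail probability over a finite horizon), with illustrative cost parameters and Poisson demand rather than the authors' exact model:

```python
import numpy as np

def simulate_cumulative_cost(horizon, production, h, b, demand_sampler,
                             n_runs=10_000, seed=0):
    """Monte Carlo sketch of cumulative shortage/holding cost in a
    finite-horizon production model (all parameters illustrative).

    Each period: inventory += production - demand; holding cost h per
    unit of positive inventory, shortage cost b per unit of backlog.
    Returns the estimated expected cumulative cost and a function
    giving the estimated tail probability P(cost > c).
    """
    rng = np.random.default_rng(seed)
    costs = np.empty(n_runs)
    for r in range(n_runs):
        inventory, total = 0.0, 0.0
        for _ in range(horizon):
            inventory += production - demand_sampler(rng)
            total += h * max(inventory, 0.0) + b * max(-inventory, 0.0)
        costs[r] = total
    return costs.mean(), lambda c: float(np.mean(costs > c))

mean_cost, tail = simulate_cumulative_cost(
    horizon=12, production=10.0, h=1.0, b=4.0,
    demand_sampler=lambda rng: rng.poisson(10))
```

    The closure `tail` is the empirical counterpart of the tail probability the abstract analyzes; the expansion formulae of the paper would replace this simulation with closed-form approximations.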

  20. Age- and gender-specific estimates of cumulative CT dose over 5 years using real radiation dose tracking data in children

    International Nuclear Information System (INIS)

    Lee, Eunsol; Goo, Hyun Woo; Lee, Jae-Yeong

    2015-01-01

    It is necessary to develop a mechanism to estimate and analyze cumulative radiation risks from multiple CT exams in various clinical scenarios in children. To identify major contributors to high cumulative CT dose estimates using actual dose-length product values collected for 5 years in children. Between August 2006 and July 2011 we reviewed 26,937 CT exams in 13,803 children. Among them, we included 931 children (median age 3.5 years, age range 0 days-15 years; M:F = 533:398) who had 5,339 CT exams. Each child underwent at least three CT scans and had accessible radiation dose reports. Dose-length product values were automatically extracted from DICOM files and we used recently updated conversion factors for age, gender, anatomical region and tube voltage to estimate CT radiation dose. We tracked the calculated CT dose estimates to obtain a 5-year cumulative value for each child. The study population was divided into three groups according to the cumulative CT dose estimates: high, ≥30 mSv; moderate, 10-30 mSv; and low, <10 mSv. We reviewed clinical data and CT protocols to identify major contributors to high and moderate cumulative CT dose estimates. Median cumulative CT dose estimate was 5.4 mSv (range 0.5-71.1 mSv), and median number of CT scans was 4 (range 3-36). High cumulative CT dose estimates were most common in children with malignant tumors (57.9%, 11/19). High frequency of CT scans was attributed to high cumulative CT dose estimates in children with ventriculoperitoneal shunt (35 in 1 child) and malignant tumors (range 18-49). Moreover, high-dose CT protocols, such as multiphase abdomen CT (median 4.7 mSv) contributed to high cumulative CT dose estimates even in children with a low number of CT scans. Disease group, number of CT scans, and high-dose CT protocols are major contributors to higher cumulative CT dose estimates in children. (orig.)
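
    The dose arithmetic described above (dose-length product times an age-, sex-, region- and voltage-dependent conversion factor, summed per child and bucketed into high/moderate/low groups) can be sketched as follows; the conversion factor value is illustrative only:

```python
def classify_cumulative_dose(dlp_values_mgy_cm, conversion_factor):
    """Sketch of cumulative CT dose estimation as described above.

    Effective dose per exam is approximated as DLP x k, where the
    conversion factor k (mSv per mGy*cm) depends on age, sex, body
    region and tube voltage; the value used here is illustrative.
    Returns the cumulative dose in mSv and its risk group, using the
    study's cutoffs (high >= 30 mSv, moderate 10-30 mSv, low < 10 mSv).
    """
    cumulative_msv = sum(dlp * conversion_factor for dlp in dlp_values_mgy_cm)
    if cumulative_msv >= 30:
        group = "high"
    elif cumulative_msv >= 10:
        group = "moderate"
    else:
        group = "low"
    return cumulative_msv, group

# Example: four exams with a hypothetical k = 0.015 mSv/(mGy*cm)
dose, group = classify_cumulative_dose([250, 300, 280, 270], 0.015)
```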

  1. Method of construction spatial transition curve

    Directory of Open Access Journals (Sweden)

    S.V. Didanov

    2013-04-01

    Purpose. The movement of rail transport (speed of rolling stock, traffic safety, etc.) is largely dependent on the quality of the track. A special role here belongs to the transition curve, which ensures a smooth transition from a straight to a circular section of road. The article deals with the modeling of a spatial transition curve based on a parabolic distribution of the curvature and torsion. This is a continuation of research conducted by the authors regarding the spatial modeling of curved contours. Methodology. The spatial transition curve is constructed by numerical methods for solving nonlinear integral equations, where the initial data are the coordinates of the starting and ending points of the future curve, together with the inclination of the tangent and the deviation of the curve from the tangent plane at these points. The system solved by the numerical method consists of the partial derivatives of the equations with respect to the unknown parameters of the law of change of torsion and the length of the transition curve. Findings. The parametric equations of the spatial transition curve are calculated by finding the unknown coefficients of the parabolic distribution of the curvature and torsion, as well as the spatial length of the transition curve. Originality. A method for constructing the spatial transition curve is devised, and on its basis software for the geometric modeling of spatial transition curves of railway track with specified deviations of the curve from the tangent plane. Practical value. The resulting curve can be applied in any sector of the economy where it is necessary to ensure a smooth transition from a straight to a circular section of a curved spatial bypass. Examples include the transition curve in the construction of a railway line, road, pipe, profile, flat section of the working blades of a turbine and compressor, ship, plane, car, etc.
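
    The construction described above can be illustrated with a much simpler numerical sketch: integrating the Frenet-Serret equations with an assumed parabolic curvature and torsion distribution (a forward-Euler toy, not the authors' integral-equation method; all parameter values are illustrative):

```python
import numpy as np

def transition_curve(length, kappa_end, tau_end, n=1000):
    """Sketch: integrate the Frenet-Serret equations for a spatial
    transition curve whose curvature and torsion grow parabolically
    from zero to their end values (an assumed distribution, echoing
    the parabolic law described above). Returns the sampled points.
    """
    s, ds = np.linspace(0.0, length, n, retstep=True)
    kappa = kappa_end * (s / length) ** 2       # parabolic curvature
    tau = tau_end * (s / length) ** 2           # parabolic torsion
    r = np.zeros(3)
    T = np.array([1.0, 0.0, 0.0])               # tangent
    N = np.array([0.0, 1.0, 0.0])               # normal
    B = np.array([0.0, 0.0, 1.0])               # binormal
    points = [r.copy()]
    for i in range(n - 1):
        # Euler step of dT/ds = kN, dN/ds = -kT + tB, dB/ds = -tN
        T, N, B = (T + ds * kappa[i] * N,
                   N + ds * (-kappa[i] * T + tau[i] * B),
                   B + ds * (-tau[i] * N))
        T = T / np.linalg.norm(T)
        N = N / np.linalg.norm(N)
        B = B / np.linalg.norm(B)
        r = r + ds * T
        points.append(r.copy())
    return np.array(points)

pts = transition_curve(length=100.0, kappa_end=0.01, tau_end=0.001)
```

    Because curvature starts at zero, the curve leaves the origin along the initial tangent and bends progressively, which is exactly the smooth straight-to-circular insertion the record describes.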

  2. Cumulative hierarchies and computability over universes of sets

    Directory of Open Access Journals (Sweden)

    Domenico Cantone

    2008-05-01

    Various metamathematical investigations, beginning with Fraenkel's historical proof of the independence of the axiom of choice, called for suitable definitions of hierarchical universes of sets. This led to the discovery of such important cumulative structures as the one singled out by von Neumann (generally taken as the universe of all sets) and Gödel's universe of the so-called constructibles. Variants of those are exploited occasionally in studies concerning the foundations of analysis (according to Abraham Robinson's approach), or concerning non-well-founded sets. We hence offer a systematic presentation of these many structures, partly motivated by their relevance and pervasiveness in mathematics. As we report, numerous properties of hierarchy-related notions, such as rank, have been verified with the assistance of the ÆtnaNova proof-checker. Through SETL and Maple implementations of procedures which effectively handle Ackermann's hereditarily finite sets, we illustrate a particularly significant case among those in which the entities which form a universe of sets can be algorithmically constructed and manipulated; hereby, the fruitful bearing of cumulative set hierarchies on pure mathematics ramifies into the realms of theoretical computer science and algorithmics.
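
    The hierarchy-related notion of rank mentioned above has a compact computational reading for hereditarily finite sets, sketched here with nested frozensets (an illustration only, not the SETL/Maple procedures of the record):

```python
def rank(hf_set):
    """Sketch: the von Neumann rank of a hereditarily finite set,
    modeled as nested frozensets (the empty frozenset is the empty set).
    rank(x) = 0 for the empty set, else 1 + max rank of its members,
    i.e. the first level of the cumulative hierarchy containing x as
    a subset.
    """
    if not hf_set:
        return 0
    return 1 + max(rank(member) for member in hf_set)

empty = frozenset()
one = frozenset({empty})          # {0}, the von Neumann ordinal 1
two = frozenset({empty, one})     # {0, {0}}, the von Neumann ordinal 2
```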

  3. Cumulative Effects Assessment: Linking Social, Ecological, and Governance Dimensions

    Directory of Open Access Journals (Sweden)

    Marian Weber

    2012-06-01

    Setting social, economic, and ecological objectives is ultimately a process of social choice informed by science. In this special feature we provide a multidisciplinary framework for the use of cumulative effects assessment in land use planning. Forest ecosystems are facing considerable challenges driven by population growth and increasing demands for resources. In a suite of case studies that span from the boreal forest of Western Canada to the interior Atlantic forest of Paraguay, we show how transparent and defensible methods for scenario analysis can be applied in data-limited regions and how social dimensions of land use change can be incorporated in these methods, particularly in aboriginal communities that have lived in these ecosystems for generations. The case studies explore how scenario analysis can be used to evaluate various land use options and highlight specific challenges with identifying social and ecological responses, determining thresholds and targets for land use, and integrating local and traditional knowledge in land use planning. Given that land use planning is ultimately a value-laden and often politically charged process, we also provide some perspective on various collective and expert-based processes for identifying cumulative impacts and thresholds. The need for good science to inform and be informed by culturally appropriate democratic processes calls for well-planned and multifaceted approaches, both to give residents and governments an informed understanding of the interactive and additive changes caused by development, and to design action agendas to influence such change at the ecological and social level.

  4. Maternal distress and parenting in the context of cumulative disadvantage.

    Science.gov (United States)

    Arditti, Joyce; Burton, Linda; Neeves-Botelho, Sara

    2010-06-01

    This article presents an emergent conceptual model of the features and links between cumulative disadvantage, maternal distress, and parenting practices in low-income families in which parental incarceration has occurred. The model emerged from the integration of extant conceptual and empirical research with grounded theory analysis of longitudinal ethnographic data from Welfare, Children, and Families: A Three-City Study. Fourteen exemplar family cases were used in the analysis. Results indicated that mothers in these families experienced life in the context of cumulative disadvantage, reporting a cascade of difficulties characterized by neighborhood worries, provider concerns, bureaucratic difficulties, violent intimate relationships, and the inability to meet children's needs. Mothers, however, also had an intense desire to protect their children, and to make up for past mistakes. Although, in response to high levels of maternal distress and disadvantage, most mothers exhibited harsh discipline of their children, some mothers transformed their distress by advocating for their children under difficult circumstances. Women's use of harsh discipline and advocacy was not necessarily an "either/or" phenomenon as half of the mothers included in our analysis exhibited both harsh discipline and care/advocacy behaviors. Maternal distress characterized by substance use, while connected to harsh disciplinary behavior, did not preclude mothers engaging in positive parenting behaviors.

  5. Cumulative phase delay imaging for contrast-enhanced ultrasound tomography

    International Nuclear Information System (INIS)

    Demi, Libertario; Van Sloun, Ruud J G; Wijkstra, Hessel; Mischi, Massimo

    2015-01-01

    Standard dynamic-contrast enhanced ultrasound (DCE-US) imaging detects and estimates ultrasound-contrast-agent (UCA) concentration based on the amplitude of the nonlinear (harmonic) components generated during ultrasound (US) propagation through UCAs. However, harmonic components generation is not specific to UCAs, as it also occurs for US propagating through tissue. Moreover, nonlinear artifacts affect standard DCE-US imaging, causing contrast to tissue ratio reduction, and resulting in possible misclassification of tissue and misinterpretation of UCA concentration. Furthermore, no contrast-specific modality exists for DCE-US tomography; in particular, speed-of-sound changes due to UCAs are well within those caused by different tissue types. Recently, a new marker for UCAs has been introduced. A cumulative phase delay (CPD) between the second harmonic and fundamental component is in fact observable for US propagating through UCAs, and is absent in tissue. In this paper, tomographic US images based on CPD are for the first time presented and compared to speed-of-sound US tomography. Results show the applicability of this marker for contrast-specific US imaging, with cumulative phase delay imaging (CPDI) showing superior capabilities in detecting and localizing UCA, as compared to speed-of-sound US tomography. Cavities filled with UCA down to 1 mm in diameter were clearly detectable. Moreover, CPDI is free of the above mentioned nonlinear artifacts. These results open important possibilities to DCE-US tomography, with potential applications to breast imaging for cancer localization. (fast track communication)
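
    The CPD marker described above compares the phase of the second harmonic against the fundamental. A simple DFT-based sketch of such an estimator (an assumed implementation, not the authors' method) applied to a synthetic two-tone pulse:

```python
import numpy as np

def cumulative_phase_delay(signal, fs, f0):
    """Sketch: estimate the phases of the fundamental and second
    harmonic of a received pulse via the DFT, and return the second
    harmonic phase referenced to twice the fundamental phase, wrapped
    to (-pi, pi]. This estimator is an assumption for illustration.
    """
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spec = np.fft.rfft(signal)
    i1 = np.argmin(np.abs(freqs - f0))        # fundamental bin
    i2 = np.argmin(np.abs(freqs - 2 * f0))    # second-harmonic bin
    phi1, phi2 = np.angle(spec[i1]), np.angle(spec[i2])
    return np.angle(np.exp(1j * (phi2 - 2 * phi1)))

fs, f0 = 100e6, 2e6                  # illustrative sampling rate / frequency
t = np.arange(1000) / fs             # 20 full fundamental periods
sig = np.cos(2*np.pi*f0*t) + 0.2*np.cos(2*np.pi*2*f0*t + 0.3)
delay = cumulative_phase_delay(sig, fs, f0)
```

    On this synthetic signal the injected second-harmonic phase offset of 0.3 rad is recovered; in the tomography application that phase accumulates along propagation paths through UCA.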

  6. Cumulant expansions for measuring water exchange using diffusion MRI

    Science.gov (United States)

    Ning, Lipeng; Nilsson, Markus; Lasič, Samo; Westin, Carl-Fredrik; Rathi, Yogesh

    2018-02-01

    The rate of water exchange across cell membranes is a parameter of biological interest and can be measured by diffusion magnetic resonance imaging (dMRI). In this work, we investigate a stochastic model for the diffusion-and-exchange of water molecules. This model provides a general solution for the temporal evolution of dMRI signal using any type of gradient waveform, thereby generalizing the signal expressions for the Kärger model. Moreover, we also derive a general nth order cumulant expansion of the dMRI signal accounting for water exchange, which has not been explored in earlier studies. Based on this analytical expression, we compute the cumulant expansion for dMRI signals for the special case of single diffusion encoding (SDE) and double diffusion encoding (DDE) sequences. Our results provide a theoretical guideline on optimizing experimental parameters for SDE and DDE sequences, respectively. Moreover, we show that DDE signals are more sensitive to water exchange at short-time scale but provide less attenuation at long-time scale than SDE signals. Our theoretical analysis is also validated using Monte Carlo simulations on synthetic structures.
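
    For context, the two-compartment Kärger model that the record generalizes can be sketched directly: the compartmental magnetization evolves under a matrix combining diffusion attenuation and exchange, and the signal is the sum of its components (parameter names, values and units here are illustrative):

```python
import numpy as np

def karger_signal(q, t, f1, D1, D2, k12, k21):
    """Sketch of the two-compartment Karger model for the dMRI signal
    with water exchange (narrow-pulse single diffusion encoding).
    q is the diffusion wave number, t the diffusion time, f1 the
    fraction of compartment 1, k12/k21 the exchange rates; all values
    below are illustrative.
    """
    A = np.array([[-q**2 * D1 - k12, k21],
                  [k12, -q**2 * D2 - k21]])
    # Matrix exponential exp(A t) applied to the initial fractions,
    # computed via eigen-decomposition of A
    w, V = np.linalg.eig(A)
    M0 = np.array([f1, 1.0 - f1])
    M = V @ (np.exp(w * t) * np.linalg.solve(V, M0))
    return float(M.sum().real)

S = karger_signal(q=0.1, t=50.0, f1=0.6, D1=2.0e-3, D2=0.5e-3,
                  k12=0.01, k21=0.015)
```

    With no diffusion weighting (q = 0) exchange only redistributes magnetization between compartments, so the total signal stays at 1; for q > 0 the signal is attenuated below 1.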

  7. A Cumulant-based Analysis of Nonlinear Magnetospheric Dynamics

    International Nuclear Information System (INIS)

    Johnson, Jay R.; Wing, Simon

    2004-01-01

    Understanding magnetospheric dynamics and predicting future behavior of the magnetosphere is of great practical interest because it could potentially help to avert catastrophic loss of power and communications. In order to build good predictive models it is necessary to understand the most critical nonlinear dependencies among observed plasma and electromagnetic field variables in the coupled solar wind/magnetosphere system. In this work, we apply a cumulant-based information dynamical measure to characterize the nonlinear dynamics underlying the time evolution of the Dst and Kp geomagnetic indices, given solar wind magnetic field and plasma input. We examine the underlying dynamics of the system, the temporal statistical dependencies, the degree of nonlinearity, and the rate of information loss. We find a significant solar cycle dependence in the underlying dynamics of the system with greater nonlinearity for solar minimum. The cumulant-based approach also has the advantage that it is reliable even in the case of small data sets and therefore it is possible to avoid the assumption of stationarity, which allows for a measure of predictability even when the underlying system dynamics may change character. Evaluations of several leading Kp prediction models indicate that their performances are sub-optimal during active times. We discuss possible improvements of these models based on this nonparametric approach

  8. Strategy for an assessment of cumulative ecological impacts

    International Nuclear Information System (INIS)

    Boucher, P.; Collins, J.; Nelsen, J.

    1995-01-01

    The US Department of Energy (DOE) has developed a strategy to conduct an assessment of the cumulative ecological impact of operations at the 300-square-mile Savannah River Site. This facility has over 400 identified waste units and contains several large watersheds. In addition to individual waste units, residual contamination must be evaluated in terms of its contribution to ecological risks at zonal and site-wide levels. DOE must be able to generate sufficient information to facilitate cleanup in the immediate future within the context of a site-wide ecological risk assessment that may not be completed for many years. The strategy superimposes a more global perspective on ecological assessments of individual waste units and provides strategic underpinnings for conducting individual screening-level and baseline risk assessments at the operable unit and zonal or watershed levels. It identifies ecological endpoints and risk assessment tools appropriate for each level of the risk assessment. In addition, it provides a clear mechanism for identifying clean sites through screening-level risk assessments and for elevating sites with residual contamination to the next level of assessment. Whereas screening-level and operable unit-level risk assessments relate directly to cleanup, zonal and site-wide assessments verify or confirm the overall effectiveness of remediation. The latter assessments must show, for example, whether multiple small areas with residual pesticide contamination that have minimal individual impact would pose a cumulative risk from bioaccumulation because they are within the habitat range of an ecological receptor.

  9. SCEW: a Microsoft Excel add-in for easy creation of survival curves.

    Science.gov (United States)

    Khan, Haseeb Ahmad

    2006-07-01

    Survival curves are frequently used for reporting survival or mortality outcomes of experimental pharmacological/toxicological studies and of clinical trials. Microsoft Excel is a simple and widely used tool for creating numerous types of graphic presentations; however, it is difficult to create step-wise survival curves in Excel. Considering the familiarity of clinicians and biomedical scientists with Excel, an algorithm, Survival Curves in Excel Worksheets (SCEW), has been developed for easy creation of survival curves directly in Excel worksheets. The algorithm has been integrated in the form of an Excel add-in for easy installation and usage. The program is based on modification of frequency data for binary break-up using the spreadsheet formula functions, whereas a macro subroutine automates the creation of survival curves. The advantages of this program are simple data input, minimal procedural steps and the creation of survival curves in the familiar confines of Excel.
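
The step-wise shape that is hard to obtain from Excel's default line charts comes from duplicating the x-value at each event so the curve drops vertically; a minimal sketch of that idea (illustrative function, not SCEW's actual macro):

```python
def survival_steps(event_times, n_subjects):
    """Convert event (death) times into step-wise survival curve points.
    Each event time appears twice in xs so a line plot renders as steps,
    which is the effect SCEW achieves with spreadsheet formulas."""
    xs, ys = [0.0], [1.0]
    alive = n_subjects
    for t in sorted(event_times):
        frac_before = alive / n_subjects
        alive -= 1
        frac_after = alive / n_subjects
        xs.extend([t, t])          # duplicated x-value makes a vertical drop
        ys.extend([frac_before, frac_after])
    return xs, ys

# Example: 4 subjects, deaths at t = 2 and t = 5
xs, ys = survival_steps([2.0, 5.0], 4)
```

Plotting `xs` against `ys` with a straight-line chart then reproduces the familiar staircase survival curve.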

  10. Unraveling the photovoltaic technology learning curve by incorporation of input price changes and scale effects

    International Nuclear Information System (INIS)

    Yu, C.F.; van Sark, W.G.J.H.M.; Alsema, E.A.

    2011-01-01

    In a large number of energy models, the use of learning curves for estimating technological improvements has become popular. This is based on the assumption that technological development can be monitored by following cost development as a function of market size. However, recent data show that in some stages of photovoltaic (PV) technology production, the market price of PV modules stabilizes even though the cumulative capacity increases. This implies that no technological improvement takes place in these periods: the cost predicted by the learning curve is lower than the market price. We propose that this bias results from ignoring input-price and scale effects, and that incorporating them into learning curve theory is an important step towards making cost predictions more reliable. In this paper, a methodology is described to incorporate scale and input-price effects as additional variables into the one-factor learning curve, which leads to the definition of the multi-factor learning curve. This multi-factor learning curve is not only derived from economic theories, but also supported by an empirical study. The results clearly show that input prices and scale effects are to be included, and that, although market prices are stabilizing, learning is still taking place. (author)
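
A multi-factor learning curve of the kind described can be sketched as a power law in cumulative capacity, an input price, and scale; the function names and exponents below are illustrative, not the paper's fitted values:

```python
def multi_factor_cost(c0, cum_capacity, input_price, scale,
                      learn_exp, price_exp, scale_exp):
    """Multi-factor learning curve sketch: unit cost as a power law in
    cumulative capacity (learning-by-doing), an input price (e.g. silicon),
    and production scale. All parameter names/values are illustrative."""
    return (c0 * cum_capacity ** (-learn_exp)
            * input_price ** price_exp
            * scale ** (-scale_exp))

def learning_rate(learn_exp):
    """Fractional cost reduction per doubling of cumulative capacity."""
    return 1.0 - 2.0 ** (-learn_exp)
```

Holding input price and scale fixed, doubling cumulative capacity multiplies cost by 2^(-learn_exp), recovering the familiar one-factor learning rate as a special case.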

  11. Logistic curves, extraction costs and effective peak oil

    International Nuclear Information System (INIS)

    Brecha, Robert J.

    2012-01-01

    Debates about the possibility of a near-term maximum in world oil production have become increasingly prominent over the past decade, with the focus often being on the quantification of geologically available and technologically recoverable amounts of oil in the ground. Economically, the important parameter is not a physical limit to resources in the ground, but whether market price signals and costs of extraction will indicate the efficiency of extracting conventional or nonconventional resources as opposed to making substitutions over time for other fuels and technologies. We present a hybrid approach to the peak-oil question with two models in which the use of logistic curves for cumulative production are supplemented with data on projected extraction costs and historical rates of capacity increase. While not denying the presence of large quantities of oil in the ground, even with foresight, rates of production of new nonconventional resources are unlikely to be sufficient to make up for declines in availability of conventional oil. Furthermore we show how the logistic-curve approach helps to naturally explain high oil prices even when there are significant quantities of low-cost oil yet to be extracted. - Highlights: ► Extraction cost information together with logistic curves to model oil extraction. ► Two models of extraction sequence for different oil resources. ► Importance of time-delay and extraction rate limits for new resources. ► Model results qualitatively reproduce observed extraction cost dynamics. ► Confirmation of “effective” peak oil, even though resources are in ground.
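
The logistic-curve building block used for cumulative production can be sketched directly (illustrative parameter names; not the paper's calibrated hybrid model):

```python
import math

def logistic_cumulative(t, urr, k, t_mid):
    """Logistic curve for cumulative production Q(t): approaches the
    ultimately recoverable resource (URR) with growth rate k and midpoint
    t_mid (the year of peak production)."""
    return urr / (1.0 + math.exp(-k * (t - t_mid)))

def annual_production(t, urr, k, t_mid):
    """dQ/dt of the logistic: a bell-shaped production curve peaking
    at t_mid, where cumulative production reaches URR/2."""
    q = logistic_cumulative(t, urr, k, t_mid)
    return k * q * (1.0 - q / urr)
```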

  12. Power forward curves: a managerial perspective

    International Nuclear Information System (INIS)

    Nagarajan, Shankar

    1999-01-01

    This chapter concentrates on managerial applications of power forward curves, and examines the determinants of electricity prices, such as transmission constraints, electricity's inability to be stored in a conventional way, its seasonality and weather dependence, the generation stack, and the swing risk. The electricity forward curve, classical arbitrage, constructing a forward curve, volatilities, and electricity forward curve models such as the jump-diffusion model, the mean-reverting heteroscedastic volatility model, and an econometric model of forward prices are examined. A managerial perspective of the applications of the forward curve is presented, covering plant valuation, capital budgeting, performance measurement, product pricing and structuring, asset optimisation, valuation of transmission options, and risk management.

  13. Retrograde curves of solidus and solubility

    International Nuclear Information System (INIS)

    Vasil'ev, M.V.

    1979-01-01

    The investigation was concerned with constitutional diagrams of the eutectic type with a ''retrograde solidus'' and ''retrograde solubility curve'', which must be considered as diagrams with a degenerate monotectic transformation. The solidus and solubility curves form a retrograde curve with a common retrograde point representing the solubility maximum. The two branches of the retrograde curve can be described with the aid of two similar equations. Corresponding equations are presented for the Cd-Zn system, and the possibility of predicting the course of the solubility curve is shown.

  14. Direct assessment of cumulative aryl hydrocarbon receptor agonist activity in sera from experimentally exposed mice and environmentally exposed humans

    DEFF Research Database (Denmark)

    Schlezinger, Jennifer J; Bernard, Pamela L; Haas, Amelia

    2010-01-01

    (PCB)-exposed Faroe Islanders using an AhR-driven reporter cell line. To validate relationships between serum AhR agonist levels and biological outcomes, AhR agonist activity in mouse sera correlated with toxic end points. AhR agonist activity in unmanipulated ("neat") human sera was compared......, was associated with the frequency of recent pilot whale dinners, but did not correlate with levels of PCBs quantified by GC/MS. Surprisingly, significant "baseline" AhR activity was found in commercial human sera. CONCLUSIONS: An AhR reporter assay revealed cumulative levels of AhR activation potential in neat...

  15. Variant at serotonin transporter gene predicts increased imitation in toddlers: relevance to the human capacity for cumulative culture.

    Science.gov (United States)

    Schroeder, Kari Britt; Asherson, Philip; Blake, Peter R; Fenstermacher, Susan K; Saudino, Kimberly J

    2016-04-01

    Cumulative culture ostensibly arises from a set of sociocognitive processes which includes high-fidelity production imitation, prosociality and group identification. The latter processes are facilitated by unconscious imitation or social mimicry. The proximate mechanisms of individual variation in imitation may thus shed light on the evolutionary history of the human capacity for cumulative culture. In humans, a genetic component to variation in the propensity for imitation is likely. A functional length polymorphism in the serotonin transporter gene, the short allele at 5HTTLPR, is associated with heightened responsiveness to the social environment as well as anatomical and activational differences in the brain's imitation circuitry. Here, we evaluate whether this polymorphism contributes to variation in production imitation and social mimicry. Toddlers with the short allele at 5HTTLPR exhibit increased social mimicry and increased fidelity of demonstrated novel object manipulations. Thus, the short allele is associated with two forms of imitation that may underlie the human capacity for cumulative culture. The short allele spread relatively recently, possibly due to selection, and its frequency varies dramatically on a global scale. Diverse observations can be unified via conceptualization of 5HTTLPR as influencing the propensity to experience others' emotions, actions and sensations, potentially through the mirror mechanism. © 2016 The Author(s).

  16. The association between high levels of cumulative life stress and aberrant resting state EEG dynamics in old age.

    Science.gov (United States)

    Marshall, Amanda C; Cooper, Nicholas R

    2017-07-01

    Cumulative experienced stress produces shortcomings in older adults' cognitive performance. These are reflected in electrophysiological changes tied to task execution. This study explored whether stress-related aberrations in older adults' electroencephalographic (EEG) activity were also apparent in the system at rest. To this effect, the number of stressful life events experienced by 60 young and 60 elderly participants was assessed in conjunction with resting state power changes in the delta, theta, alpha, and beta frequencies during a resting EEG recording. Findings revealed elevated levels of delta power among elderly individuals reporting high levels of cumulative life stress. These differed significantly from young high and low stress individuals and from older adults with low levels of stress. Increases of delta activity have been linked to the emergence of conditions such as Alzheimer's Disease and Mild Cognitive Impairment. Thus, a potential interpretation of our findings associates large amounts of cumulative stress with an increased risk of developing age-related cognitive pathologies in later life. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. High Frequency Oscillatory Ventilation

    Directory of Open Access Journals (Sweden)

    AC Bryan

    1996-01-01

    High frequency oscillatory (HFO) ventilation using low tidal volumes and peak airway pressures is extremely efficient at eliminating carbon dioxide and raising pH in the newborn infant with acute respiratory failure. Improvement in oxygenation requires a strategy of sustained or repetitive inflations to 25 to 30 cm H2O in order to place the lung on the deflation limb of the pressure-volume curve. This strategy has also been shown to decrease the amount of secondary lung injury in animal models. Experience with the use of HFO ventilation as a rescue therapy, as well as several published controlled trials, has shown improved outcomes and a decrease in the use of extracorporeal membrane oxygenation when it has been used in newborns.

  18. The writhe of open and closed curves

    International Nuclear Information System (INIS)

    Berger, Mitchell A; Prior, Chris

    2006-01-01

    Twist and writhe measure basic geometric properties of a ribbon or tube. While these measures have applications in molecular biology, materials science, fluid mechanics and astrophysics, they are under-utilized because they are often considered difficult to compute. In addition, many applications involve curves with endpoints (open curves); but for these curves the definition of writhe can be ambiguous. This paper provides simple expressions for the writhe of closed curves, and provides a new definition of writhe for open curves. The open curve definition is especially appropriate when the curve is anchored at endpoints on a plane or stretches between two parallel planes. This definition can be especially useful for magnetic flux tubes in the solar atmosphere, and for isotropic rods with ends fixed to a plane
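
For closed curves, the writhe can be approximated numerically by discretizing the Gauss double integral over segment midpoints; the sketch below is a generic midpoint-rule discretization, not the authors' closed-form expressions:

```python
import math

def writhe_closed_polyline(points):
    """Discretized Gauss double integral for the writhe of a closed polygonal
    curve: Wr = (1/4*pi) * double-integral (dr1 x dr2) . (r1 - r2) / |r1 - r2|^3.
    'points' is a list of 3D vertices; the curve closes from the last vertex
    back to the first. Adjacent segment pairs are skipped (singular terms)."""
    n = len(points)
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    total = 0.0
    for i in range(n):
        p1, p2 = points[i], points[(i + 1) % n]
        t1 = sub(p2, p1)                                  # segment vector
        m1 = tuple((p1[k] + p2[k]) / 2 for k in range(3))  # segment midpoint
        for j in range(n):
            if j in (i, (i - 1) % n, (i + 1) % n):
                continue  # skip self and adjacent segments
            q1, q2 = points[j], points[(j + 1) % n]
            t2 = sub(q2, q1)
            m2 = tuple((q1[k] + q2[k]) / 2 for k in range(3))
            r = sub(m1, m2)
            d = math.sqrt(dot(r, r))
            total += dot(cross(t1, t2), r) / d ** 3
    return total / (4.0 * math.pi)
```

A planar closed curve has zero writhe, which makes a convenient sanity check for the discretization.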

  19. [Analgesic quality in a postoperative pain service: continuous assessment with the cumulative sum (cusum) method].

    Science.gov (United States)

    Baptista Macaroff, W M; Castroman Espasandín, P

    2007-01-01

    The aim of this study was to assess the cumulative sum (cusum) method for evaluating the performance of our hospital's acute postoperative pain service. The period of analysis was 7 months. Analgesic failure was defined as a score of 3 points or more on a simple numerical scale. Acceptable failure (p0) was set at 20% of patients upon admission to the postanesthetic recovery unit and at 7% 24 hours after surgery. Unacceptable failure was set at double the p0 rate at each time (40% and 14%, respectively). The unit's patient records were used to generate a cusum graph for each evaluation. Nine hundred four records were included. The rate of failure was 31.6% upon admission to the unit and 12.1% at the 24-hour postoperative assessment. The curve rose rapidly to the value set for p0 at both evaluation times (n = 14 and n = 17, respectively), later leveled off, and began to fall after 721 and 521 cases, respectively. Our study shows the efficacy of the cusum method for monitoring a proposed quality standard. The graph also showed periods of suboptimal performance that would not have been evident from analyzing the data en bloc. Thus the cusum method would facilitate rapid detection of periods in which quality declines.
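
One common form of the cusum chart for binary failure data uses log-likelihood-ratio increments between an acceptable rate p0 and an unacceptable rate p1; the sketch below is that standard variant, not necessarily the exact scheme used in the study:

```python
import math

def cusum_curve(outcomes, p0, p1):
    """CUSUM for monitoring a failure rate: each failure adds a positive
    log-likelihood-ratio increment, each success a negative one, for
    Bernoulli rates p0 (acceptable) vs p1 (unacceptable). A persistently
    rising curve signals failure rates closer to p1 than to p0."""
    s_fail = math.log(p1 / p0)              # > 0 when p1 > p0
    s_ok = math.log((1 - p1) / (1 - p0))    # < 0 when p1 > p0
    curve, c = [], 0.0
    for failed in outcomes:
        c += s_fail if failed else s_ok
        curve.append(c)
    return curve

# Example with the study's admission thresholds: p0 = 0.20, p1 = 0.40
chart = cusum_curve([True, False, False, True, False], 0.20, 0.40)
```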

  20. Cumulative fatigue and creep-fatigue damage at 350 °C on recrystallized zircaloy-4

    International Nuclear Information System (INIS)

    Brun, G.; Pelchat, J.; Floze, J.C.; Galimberti, M.

    1985-06-01

    An experimental programme undertaken by C.E.A., E.D.F. and FRAGEMA with the aim of characterizing the fatigue and creep-fatigue behaviour of zircaloy-4 following annealing treatments (recrystallized, stress-relieved) is in progress. The results given below concern only the recrystallized material. Cyclic properties, low-cycle fatigue curves and creep behaviour laws under stress have been established. Sequential tests of pure fatigue and creep-fatigue were performed. The cumulative life fractions at fracture depend on the sequence of loading, stress history and number of cycles of prestressing. Miner's rule appears to be conservative with regard to a low-high loading sequence, whereas it is not for the reverse high-low loading sequence. Fatigue and creep damage are not interchangeable. Pre-creep improves the fatigue resistance. Pre-fatigue improves the creep strength as long as the beneficial effect of cyclic hardening overcomes the damaging effect of surface cracking. The introduction of a tension hold time into the fatigue cycle slightly increases cyclic hardening and reduces the number of cycles to failure. For hold times of less than one hour, the sum of fatigue and creep life fractions is close to one.

  1. Using non-time-series to determine supply elasticity: how far do prices change the Hubbert curve?

    International Nuclear Information System (INIS)

    Reynolds, D.B.

    2002-01-01

    An important concern of OPEC's work is to understand how much oil supply exists in different countries, in order to help better conserve oil. This paper extends M. King Hubbert's oil production and discovery forecasting model (Hubbert, 1962), using a non-time-series quadratic Hubbert curve of cumulative discovery and production, with structural shift variables to model technology and regulation changes. The model can be used to better determine world oil supplies. Price is tested to see how powerful it is in increasing or decreasing oil supply. Using a trend of cumulative production, instead of time, helps to better fix the supply elasticity with respect to price, which is shown to be very inelastic. An interesting question is whether cumulative discovery or production constitutes an I(2) variable. This paper explains that they are not I(2) variables. (Author)
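
The quadratic Hubbert relation between production and cumulative production underlies "Hubbert linearization", a standard way to estimate ultimately recoverable resources; the sketch below is a generic illustration of that technique, not the paper's econometric estimator:

```python
def hubbert_linearization_urr(cum_q, prod):
    """Estimate ultimately recoverable resources (URR) by Hubbert
    linearization: the quadratic Hubbert curve P = a*Q*(1 - Q/URR)
    implies P/Q = a - (a/URR)*Q, a straight line in Q. Fit it by least
    squares; the x-intercept of the fitted line is the URR estimate."""
    ys = [p / q for p, q in zip(prod, cum_q)]
    n = len(cum_q)
    mx = sum(cum_q) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(cum_q, ys))
             / sum((x - mx) ** 2 for x in cum_q))
    intercept = my - slope * mx
    return -intercept / slope  # x-intercept of the fitted line
```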

  2. Path integrals on curved manifolds

    International Nuclear Information System (INIS)

    Grosche, C.; Steiner, F.

    1987-01-01

    A general framework for treating path integrals on curved manifolds is presented. We also show how to perform general coordinate and space-time transformations in path integrals. The main result is that one has to subtract a quantum correction ΔV ∝ ℏ² from the classical Lagrangian L, i.e. the correct effective Lagrangian to be used in the path integral is L_eff = L − ΔV. A general prescription for calculating the quantum correction ΔV is given. It is based on a canonical approach using Weyl-ordering and the Hamiltonian path integral defined by the midpoint prescription. The general framework is illustrated by several examples: the d-dimensional rotator, i.e. the motion on the sphere S^(d−1); the path integral in d-dimensional polar coordinates; the exact treatment of the hydrogen atom in R² and R³ by performing a Kustaanheimo-Stiefel transformation; the Langer transformation; and the path integral for the Morse potential. (orig.)

  3. Page curves for tripartite systems

    International Nuclear Information System (INIS)

    Hwang, Junha; Lee, Deok Sang; Nho, Dongju; Oh, Jeonghun; Park, Hyosub; Zoe, Heeseung; Yeom, Dong-han

    2017-01-01

    We investigate information flow and Page curves for tripartite systems. We prepare a tripartite system (say, A , B , and C ) of a given number of states and calculate information and entropy contents by assuming random states. Initially, every particle was in A (this means a black hole), and as time goes on, particles move to either B (this means Hawking radiation) or C (this means a broadly defined remnant, including a non-local transport of information, the last burst, an interior large volume, or a bubble universe, etc). If the final number of states of the remnant is smaller than that of Hawking radiation, then information will be stored by both the radiation and the mutual information between the radiation and the remnant, while the remnant itself does not contain information. On the other hand, if the final number of states of the remnant is greater than that of Hawking radiation, then the radiation contains negligible information, while the remnant and the mutual information between the radiation and the remnant contain information. Unless the number of states of the remnant is large enough compared to the entropy of the black hole, Hawking radiation must contain information; and we meet the menace of black hole complementarity again. Therefore, this contrasts the tension between various assumptions and candidates of the resolution of the information loss problem. (paper)
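
The random-state entropies underlying Page curves can be evaluated from Page's average-entropy formula for a bipartite split, which is exact and easy to compute (a generic sketch, not the paper's tripartite calculation):

```python
import math

def page_entropy(m, n):
    """Page's exact average entanglement entropy of an m-dimensional
    subsystem of a random pure state on an (m*n)-dimensional Hilbert
    space, for m <= n:
        S(m, n) = sum_{k=n+1}^{m*n} 1/k  -  (m - 1) / (2*n)."""
    if m > n:
        m, n = n, m  # the formula is symmetric under swapping subsystems
    return sum(1.0 / k for k in range(n + 1, m * n + 1)) - (m - 1) / (2.0 * n)

# For nearly equal subsystems the average entropy approaches, but stays
# below, the maximal value ln(m) -- the "information deficit" of a
# typical random state.
s_half = page_entropy(16, 16)
```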

  4. Vacuum polarization in curved spacetime

    International Nuclear Information System (INIS)

    Guy, R.W.

    1979-01-01

    A necessary step in the process of understanding the quantum theory of gravity is the calculation of the stress-energy tensor of quantized fields in curved space-times. The determination of the stress tensor, a formally divergent object, is made possible in this dissertation by utilizing the zeta-function method of regularization and renormalization. By employing this scheme's representation of the renormalized effective action functional, an expression for the stress tensor of a massless, conformally invariant scalar field, first given by DeWitt, is derived. The form of the renormalized stress tensor is first tested in various examples of flat space-times. It is shown to vanish in Minkowski space and to yield the accepted value of the energy density in the Casimir effect. Next, the stress tensor is calculated in two space-times of constant curvature, the Einstein universe and the de Sitter universe, and the results are shown to agree with those given by an expression of the stress tensor that is valid in conformally flat space-times. This work culminates in the determination of the stress tensor on the horizon of a Schwarzschild black hole. This is accomplished by approximating the radial part of the eigenfunctions and the metric in the vicinity of the horizon. The stress tensor at this level of approximation is found to be pure trace. The approximated form of the Schwarzschild metric describes a conformally flat space-time that possesses horizons.

  5. Cumulative effects in Swedish EIA practice - difficulties and obstacles

    International Nuclear Information System (INIS)

    Waernbaeck, Antoienette; Hilding-Rydevik, Tuija

    2009-01-01

    The importance of considering cumulative effects (CE) in the context of environmental assessment is manifested in the EU regulations. The demands on the contents of Environmental Impact Assessment (EIA) and Strategic Environmental Assessment (SEA) documents explicitly ask for CE to be described. In Swedish environmental assessment documents, however, CE are rarely described or included. The aim of this paper is to look into the reasons behind this fact in the Swedish context. The paper describes and analyses how actors implementing the EIA and SEA legislation in Sweden perceive the current situation in relation to the legislative demands and the inclusion of cumulative effects. Through semi-structured interviews the following questions have been explored: Is the phenomenon of CE discussed and included in the EIA/SEA process? What do the actors include in, and what is their knowledge of, the term and concept of CE? Which difficulties and obstacles do these actors experience, and what possibilities for inclusion of CE do they see in the EIA/SEA process? A large number of obstacles and hindrances emerged from the interviews conducted. It can be concluded from the analysis that the will to act does seem to exist. A lack of knowledge of how to include cumulative effects and a lack of clear regulations concerning how this should be done seem to be perceived as the main obstacles. Knowledge of the term and the phenomenon is furthermore quite narrow and not all-encompassing. The interviewees experience that there is a lack of procedures in place. They also seem to lack knowledge of methods for how to actually work, in practice, with CE and how to include CE in the EIA/SEA process.
It can be stated that the existence of this poor picture in relation to practice concerning CE in the context of impact assessment mirrors the existing and so far rather vague demands in respect of the inclusion and assessment of CE in Swedish EIA and SEA legislation, regulations, guidelines and

  6. Technical Note: SCUDA: A software platform for cumulative dose assessment

    Energy Technology Data Exchange (ETDEWEB)

    Park, Seyoun; McNutt, Todd; Quon, Harry; Wong, John; Lee, Junghoon, E-mail: rshekhar@childrensnational.org, E-mail: junghoon@jhu.edu [Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins University, Baltimore, Maryland 21231 (United States); Plishker, William [IGI Technologies, Inc., College Park, Maryland 20742 (United States); Shekhar, Raj, E-mail: rshekhar@childrensnational.org, E-mail: junghoon@jhu.edu [IGI Technologies, Inc., College Park, Maryland 20742 and Sheikh Zayed Institute for Pediatric Surgical Innovation, Children’s National Health System, Washington, DC 20010 (United States)

    2016-10-15

    Purpose: Accurate tracking of anatomical changes and computation of actually delivered dose to the patient are critical for successful adaptive radiation therapy (ART). Additionally, efficient data management and fast processing are practically important for the adoption in clinic as ART involves a large amount of image and treatment data. The purpose of this study was to develop an accurate and efficient Software platform for CUmulative Dose Assessment (SCUDA) that can be seamlessly integrated into the clinical workflow. Methods: SCUDA consists of deformable image registration (DIR), segmentation, dose computation modules, and a graphical user interface. It is connected to our image PACS and radiotherapy informatics databases from which it automatically queries/retrieves patient images, radiotherapy plan, beam data, and daily treatment information, thus providing an efficient and unified workflow. For accurate registration of the planning CT and daily CBCTs, the authors iteratively correct CBCT intensities by matching local intensity histograms during the DIR process. Contours of the target tumor and critical structures are then propagated from the planning CT to daily CBCTs using the computed deformations. The actual delivered daily dose is computed using the registered CT and patient setup information by a superposition/convolution algorithm, and accumulated using the computed deformation fields. Both DIR and dose computation modules are accelerated by a graphics processing unit. Results: The cumulative dose computation process has been validated on 30 head and neck (HN) cancer cases, showing 3.5 ± 5.0 Gy (mean±STD) absolute mean dose differences between the planned and the actually delivered doses in the parotid glands. On average, DIR, dose computation, and segmentation take 20 s/fraction and 17 min for a 35-fraction treatment including additional computation for dose accumulation. Conclusions: The authors developed a unified software platform that provides

  7. Scenario analysis for estimating the learning rate of photovoltaic power generation based on learning curve theory in South Korea

    International Nuclear Information System (INIS)

    Hong, Sungjun; Chung, Yanghon; Woo, Chungwon

    2015-01-01

    South Korea, as the 9th-largest energy-consuming country in 2013 and the 7th-largest greenhouse gas emitting country in 2011, established ‘Low Carbon Green Growth’ as its national vision in 2008, and is announcing various active energy policies that are set to gain the attention of the world. In this paper, we estimated the decrease of photovoltaic power generation cost in Korea based on learning curve theory. Photovoltaic energy is one of the leading renewable energy sources, and countries all over the world are currently expanding R and D, demonstration and deployment of photovoltaic technology. In order to estimate the learning rate of photovoltaic energy in Korea, both the conventional 1FLC (one-factor learning curve), which considers only cumulative power generation, and the 2FLC, which also considers R and D investment, were applied. The 1FLC analysis showed that the cost of power generation decreased by 3.1% as cumulative power generation doubled. The 2FLC analysis showed that the cost decreases by 2.33% every time cumulative photovoltaic power generation is doubled and by 5.13% every time R and D investment is doubled. Moreover, the effect of R and D investment on photovoltaic technology took effect after around 3 years, and the depreciation rate of R and D investment was around 20%. - Highlights: • We analyze the learning effects of photovoltaic energy technology in Korea. • In order to calculate the learning rate, we use 1FLC (one-factor learning curve) and 2FLC methods, respectively. • 1FLC method considers only the cumulative power generation. • 2FLC method considers both cumulative power generation and knowledge stock. • We analyze a variety of scenarios by time lag and depreciation rate of R and D investment
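
The 2FLC ingredients, a knowledge stock built from lagged and depreciated R&D outlays, and a two-factor cost power law, can be sketched as follows; all names and example numbers are illustrative, not the paper's estimates:

```python
def knowledge_stock(rd_spend, dep_rate, lag):
    """Knowledge stock per period: past R&D outlays enter after `lag`
    periods and depreciate geometrically at rate `dep_rate` thereafter."""
    ks = []
    for t in range(len(rd_spend)):
        s = 0.0
        for tau in range(t - lag + 1):
            s += rd_spend[tau] * (1 - dep_rate) ** (t - lag - tau)
        ks.append(s)
    return ks

def two_factor_cost(c0, cum_gen, ks, lbd_exp, lbs_exp):
    """2FLC sketch: cost = c0 * cum_gen^(-lbd_exp) * ks^(-lbs_exp).
    Learning-by-doing rate per doubling of generation: 1 - 2^(-lbd_exp);
    learning-by-searching rate per doubling of knowledge: 1 - 2^(-lbs_exp)."""
    return c0 * cum_gen ** (-lbd_exp) * ks ** (-lbs_exp)
```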

  8. The cumulative impact of annual coral bleaching can turn some coral species winners into losers.

    Science.gov (United States)

    Grottoli, Andréa G; Warner, Mark E; Levas, Stephen J; Aschaffenburg, Matthew D; Schoepf, Verena; McGinley, Michael; Baumann, Justin; Matsui, Yohei

    2014-12-01

    Mass coral bleaching events caused by elevated seawater temperatures result in extensive coral loss throughout the tropics, and are projected to increase in frequency and severity. If bleaching becomes an annual event later in this century, more than 90% of coral reefs worldwide may be at risk of long-term degradation. While corals can recover from single isolated bleaching and can acclimate to recurring bleaching events that are separated by multiple years, it is currently unknown if and how they will survive and possibly acclimatize to annual coral bleaching. Here, we demonstrate for the first time that annual coral bleaching can dramatically alter thermal tolerance in Caribbean corals. We found that high coral energy reserves and changes in the dominant algal endosymbiont type (Symbiodinium spp.) facilitated rapid acclimation in Porites divaricata, whereas low energy reserves and a lack of algal phenotypic plasticity significantly increased susceptibility in Porites astreoides to bleaching the following year. Phenotypic plasticity in the dominant endosymbiont type of Orbicella faveolata did not prevent repeat bleaching, but may have facilitated rapid recovery. Thus, coral holobiont response to an isolated single bleaching event is not an accurate predictor of its response to bleaching the following year. Rather, the cumulative impact of annual coral bleaching can turn some coral species 'winners' into 'losers', and can also facilitate acclimation and turn some coral species 'losers' into 'winners'. Overall, these findings indicate that cumulative impact of annual coral bleaching could result in some species becoming increasingly susceptible to bleaching and face a long-term decline, while phenotypically plastic coral species will acclimatize and persist. Thus, annual coral bleaching and recovery could contribute to the selective loss of coral diversity as well as the overall decline of coral reefs in the Caribbean. © 2014 John Wiley & Sons Ltd.

  9. Negative impact of high cumulative glucocorticoid dose on bone metabolism of patients with myasthenia gravis.

    Science.gov (United States)

    Braz, Nayara Felicidade Tomaz; Rocha, Natalia Pessoa; Vieira, Érica Leandro Marciano; Gomez, Rodrigo Santiago; Barbosa, Izabela Guimarães; Malheiro, Olívio Brito; Kakehasi, Adriana Maria; Teixeira, Antonio Lucio

    2017-08-01

    This current study aimed to evaluate the frequency of low bone mass, osteopenia, and osteoporosis in patients with myasthenia gravis (MG) and to investigate the possible association between bone mineral density (BMD) and plasma levels of bone metabolism markers. Eighty patients with MG and 62 controls were included; BMD was measured at the right femoral neck and lumbar spine by dual-energy X-ray absorptiometry. Plasma concentrations of osteocalcin, osteopontin, osteoprotegerin, tumor necrosis factor (TNF-α), interleukin (IL)-1β, IL-6, dickkopf (DKK-1), sclerostin, insulin, leptin, adrenocorticotropic hormone, parathyroid hormone, and fibroblast growth factor (FGF-23) were analyzed by Luminex®. The mean age of patients was 41.9 years, with a mean length of illness of 13.5 years and a mean cumulative glucocorticoid dose of 38,123 mg. Patients had a significant reduction in BMD of the lumbar spine, the femoral neck, and the whole body when compared with controls. Fourteen percent of MG patients had osteoporosis at the lumbar spine and 2.5% at the femoral neck. In comparison with controls, patients with MG presented lower levels of osteocalcin, adrenocorticotropic hormone, parathyroid hormone, sclerostin, TNF-α, and DKK-1 and higher levels of FGF-23, leptin, and IL-6. There was a significant negative correlation between cumulative glucocorticoid dose and serum calcium, lumbar spine T-score, femoral neck BMD, T-score, and Z-score. After multivariate analysis, higher TNF-α levels increased the likelihood of presenting low bone mass by 2.62. MG patients under corticotherapy presented low BMD and altered levels of bone markers.

  10. CDF-XL: computing cumulative distribution functions of reaction time data in Excel.

    Science.gov (United States)

    Houghton, George; Grange, James A

    2011-12-01

    In experimental psychology, central tendencies of reaction time (RT) distributions are used to compare different experimental conditions. This emphasis on the central tendency ignores additional information that may be derived from the RT distribution itself. One method for analysing RT distributions is to construct cumulative distribution frequency plots (CDFs; Ratcliff, Psychological Bulletin 86:446-461, 1979). However, this method is difficult to implement in widely available software, severely restricting its use. In this report, we present an Excel-based program, CDF-XL, for constructing and analysing CDFs, with the aim of making such techniques more readily accessible to researchers, including students (CDF-XL can be downloaded free of charge from the Psychonomic Society's online archive). CDF-XL functions as an Excel workbook and starts from the raw experimental data, organised into three columns (Subject, Condition, and RT) on an Input Data worksheet (a point-and-click utility is provided for achieving this format from a broader data set). No further preprocessing or sorting of the data is required. With one click of a button, CDF-XL will generate two forms of cumulative analysis: (1) "standard" CDFs, based on percentiles of participant RT distributions (by condition), and (2) a related analysis employing the participant means of rank-ordered RT bins. Both analyses involve partitioning the data in similar ways, but the first uses a "median"-type measure at the participant level, while the latter uses the mean. The results are presented in three formats: (i) by participants, suitable for entry into further statistical analysis; (ii) grand means by condition; and (iii) completed CDF plots in Excel charts.
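The percentile-based CDF analysis described above is straightforward to reproduce outside Excel. Below is a minimal Python sketch of the "standard" analysis (percentiles per participant and condition, then averaged across participants, often called Vincentizing); the function names and the five-percentile grid are illustrative assumptions, not part of CDF-XL itself.

```python
import numpy as np

def participant_cdf(rts, percentiles=(10, 30, 50, 70, 90)):
    """RT values at the given percentiles for one participant/condition."""
    return np.percentile(np.asarray(rts, dtype=float), percentiles)

def group_cdf(rts_by_participant, percentiles=(10, 30, 50, 70, 90)):
    """Group CDF: average each percentile point across participants."""
    per_subject = [participant_cdf(r, percentiles) for r in rts_by_participant]
    return np.mean(per_subject, axis=0)
```

Plotting the averaged percentile values against their cumulative probabilities then yields the kind of CDF curves CDF-XL charts.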

  11. Consistent Valuation across Curves Using Pricing Kernels

    Directory of Open Access Journals (Sweden)

    Andrea Macrina

    2018-03-01

    The general problem of asset pricing when the discount rate differs from the rate at which an asset's cash flows accrue is considered. A pricing kernel framework is used to model an economy that is segmented into distinct markets, each identified by a yield curve having its own market, credit and liquidity risk characteristics. The proposed framework precludes arbitrage within each market, while the definition of a curve-conversion factor process links all markets in a consistent arbitrage-free manner. A pricing formula is then derived, referred to as the across-curve pricing formula, which enables consistent valuation and hedging of financial instruments across curves (and markets). As a natural application, a consistent multi-curve framework is formulated for emerging and developed inter-bank swap markets, which highlights an important dual feature of the curve-conversion factor process. Given this multi-curve framework, existing multi-curve approaches based on HJM and rational pricing kernel models are recovered, reviewed and generalised, and single-curve models are extended. In another application, inflation-linked, currency-based and fixed-income hybrid securities are shown to be consistently valued using the across-curve valuation method.

  12. Analysis of normalized-characteristic curves and determination of the granulometric state of dissolved uranium dioxides

    International Nuclear Information System (INIS)

    Melichar, F.; Neumann, L.

    1977-01-01

    Methods are presented for the analysis of normalized-characteristic curves, which make it possible to determine the granulometric composition of a dissolved polydispersion - the cumulative mass distribution of particles - as a function of the relative particle size. If the size of the largest particle in the dissolved polydispersion is known, these methods allow the determination of the dependence of cumulative mass ratios of particles on their absolute sizes. In the inverse method of the geometrical model for determining the granulometric composition of a dissolved polydispersion, the polydispersion is represented by a finite number of monodispersions. An accurate analysis of normalized-characteristic equations leads to the Akselrud dissolution model. In contrast to the other two methods, the latter allows the determination of the granulometric composition for an arbitrary number of particle sizes. The method of the granulometric atlas is a method for estimating the granulometric composition of a dissolved polydispersion and is based on comparison of a normalized-characteristic curve for an unknown granulometric composition with an atlas of normalized-characteristic curves for selected granulometric spectra of polydispersions. (author)

  13. The Value of Hydrograph Partitioning Curves for Calibrating Hydrological Models in Glacierized Basins

    Science.gov (United States)

    He, Zhihua; Vorogushyn, Sergiy; Unger-Shayesteh, Katy; Gafurov, Abror; Kalashnikova, Olga; Omorova, Elvira; Merz, Bruno

    2018-03-01

    This study refines the method for calibrating a glacio-hydrological model based on Hydrograph Partitioning Curves (HPCs), and evaluates its value in comparison to multidata set optimization approaches which use glacier mass balance, satellite snow cover images, and discharge. The HPCs are extracted from the observed flow hydrograph using catchment precipitation and temperature gradients. They indicate the periods when the various runoff processes, such as glacier melt or snow melt, dominate the basin hydrograph. The annual cumulative curve of the difference between average daily temperature and melt threshold temperature over the basin, as well as the annual cumulative curve of average daily snowfall on the glacierized areas are used to identify the starting and end dates of snow and glacier ablation periods. Model parameters characterizing different runoff processes are calibrated on different HPCs in a stepwise and iterative way. Results show that the HPC-based method (1) delivers model-internal consistency comparably to the tri-data set calibration method; (2) improves the stability of calibrated parameter values across various calibration periods; and (3) estimates the contributions of runoff components similarly to the tri-data set calibration method. Our findings indicate the potential of the HPC-based approach as an alternative for hydrological model calibration in glacierized basins where other calibration data sets than discharge are often not available or very costly to obtain.
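The partitioning idea behind the HPCs can be illustrated with a toy computation: accumulate the positive excess of daily temperature over the melt threshold and read the ablation period off the cumulative curve. This is a minimal sketch only; the function name and the `onset_sum` crossing threshold are illustrative assumptions, not parameters from the study.

```python
import numpy as np

def ablation_period(temps, t_melt=0.0, onset_sum=10.0):
    """Locate a melt season from the annual cumulative curve of
    positive (temperature - melt threshold) excesses.

    temps: daily mean temperatures over one year (deg C).
    onset_sum: cumulative degree-days marking onset/end (illustrative).
    Returns (start_index, end_index) of the ablation period.
    """
    excess = np.maximum(np.asarray(temps, dtype=float) - t_melt, 0.0)
    cum = np.cumsum(excess)                       # annual cumulative curve
    total = cum[-1]
    start = int(np.argmax(cum >= onset_sum))      # first crossing
    end = int(np.argmax(cum >= total - onset_sum))
    return start, end
```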

  14. The Kepler Light Curves of AGN: A Detailed Analysis

    Science.gov (United States)

    Smith, Krista Lynne; Mushotzky, Richard F.; Boyd, Patricia T.; Malkan, Matt; Howell, Steve B.; Gelino, Dawn M.

    2018-04-01

    We present a comprehensive analysis of 21 light curves of Type 1 active galactic nuclei (AGN) from the Kepler spacecraft. First, we describe the necessity and development of a customized pipeline for treating Kepler data of stochastically variable sources like AGN. We then present the light curves, power spectral density functions (PSDs), and flux histograms. The light curves display an astonishing variety of behaviors, many of which would not be detected in ground-based studies, including switching between distinct flux levels. Six objects exhibit PSD flattening at characteristic timescales that roughly correlate with black hole mass. These timescales are consistent with orbital timescales or free-fall accretion timescales. We check for correlations of variability and high-frequency PSD slope with accretion rate, black hole mass, redshift, and luminosity. We find that bolometric luminosity is anticorrelated with both variability and steepness of the PSD slope. We do not find evidence of the linear rms–flux relationships or lognormal flux distributions found in X-ray AGN light curves, indicating that reprocessing is not a significant contributor to optical variability at the 0.1%–10% level.

  15. Estimation of Typhoon Wind Hazard Curves for Nuclear Sites

    Energy Technology Data Exchange (ETDEWEB)

    Choun, Young-Sun; Kim, Min-Kyu [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Typhoons that can influence the Korean Peninsula are intensifying owing to the rapid climate change of the Northwest Pacific Ocean, so nuclear facilities should be prepared against future super-typhoons. Currently, the U.S. Nuclear Regulatory Commission requires that a new NPP be designed to endure design-basis hurricane wind speeds corresponding to an annual exceedance frequency of 10^-7 (return period of 10 million years). A typical technique used to estimate typhoon wind speeds is to sample the key parameters of typhoon wind models from distribution functions obtained by fitting statistical distributions to observation data. Thus, the estimated wind speeds for long return periods include an unavoidable uncertainty owing to the limited observations. This study estimates typhoon wind speeds for nuclear sites using a Monte Carlo simulation and derives wind hazard curves using a logic-tree framework to reduce the epistemic uncertainty. Typhoon wind speeds were estimated for different return periods through Monte Carlo simulation using typhoon observation data, and wind hazard curves were derived with a logic-tree framework for three nuclear sites. The hazard curves for the simulated and probable maximum winds were obtained; the mean hazard curves can be used for the design and risk assessment of an NPP.
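The Monte Carlo step can be illustrated with a heavily simplified sketch. Instead of sampling full typhoon wind-field model parameters as the study does, the toy below fits a Gumbel extreme-value distribution to observed annual-maximum winds by the method of moments, simulates synthetic years, and reads off return-period quantiles; every name and the choice of fit are assumptions for illustration, not the study's procedure.

```python
import numpy as np

def return_period_wind(annual_max, return_periods, n_sims=100_000, seed=0):
    """Estimate wind speeds for long return periods by simulation.

    annual_max: observed annual-maximum wind speeds (m/s).
    Fits a Gumbel distribution by the method of moments, simulates
    n_sims synthetic years, and returns the simulated quantiles
    corresponding to the requested return periods.
    """
    x = np.asarray(annual_max, dtype=float)
    beta = np.sqrt(6.0) * x.std(ddof=1) / np.pi   # Gumbel scale
    mu = x.mean() - 0.5772 * beta                 # Gumbel location
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n_sims)
    sims = mu - beta * np.log(-np.log(u))         # inverse-CDF sampling
    probs = 1.0 - 1.0 / np.asarray(return_periods, dtype=float)
    return np.quantile(sims, probs)
```

A return period of T years corresponds to the annual non-exceedance probability 1 - 1/T, which is where the simulated distribution is queried.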

  16. Cumulative trauma and symptom complexity in children: a path analysis.

    Science.gov (United States)

    Hodges, Monica; Godbout, Natacha; Briere, John; Lanktree, Cheryl; Gilbert, Alicia; Kletzka, Nicole Taylor

    2013-11-01

    Multiple trauma exposures during childhood are associated with a range of psychological symptoms later in life. In this study, we examined whether the total number of different types of trauma experienced by children (cumulative trauma) is associated with the complexity of their subsequent symptomatology, where complexity is defined as the number of different symptom clusters simultaneously elevated into the clinical range. Children's symptoms in six different trauma-related areas (e.g., depression, anger, posttraumatic stress) were reported both by child clients and their caretakers in a clinical sample of 318 children. Path analysis revealed that accumulated exposure to multiple different trauma types predicts symptom complexity as reported by both children and their caretakers. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Near-Field Source Localization Using a Special Cumulant Matrix

    Science.gov (United States)

    Cui, Han; Wei, Gang

    A new near-field source localization algorithm based on a uniform linear array is proposed. The algorithm estimates each parameter separately and does not require parameter pairing. It can be divided into two steps: the first is a bearing-related electric angle estimation based on the ESPRIT algorithm, obtained by constructing a special cumulant matrix; the second estimates the other electric angle from the 1-D MUSIC spectrum. The algorithm offers much lower computational complexity than the traditional near-field 2-D MUSIC algorithm and better performance than the high-order ESPRIT algorithm. Simulation results demonstrate that the performance of the proposed algorithm is close to the Cramer-Rao Bound (CRB).

  18. Cumulative growth of minor hysteresis loops in the Kolmogorov model

    International Nuclear Information System (INIS)

    Meilikhov, E. Z.; Farzetdinova, R. M.

    2013-01-01

    The phenomenon of nonrepeatability of successive remagnetization cycles in Co/M (M = Pt, Pd, Au) multilayer film structures is explained in the framework of the Kolmogorov crystallization model. It is shown that this model of phase transitions can be adapted so as to adequately describe the process of magnetic relaxation in the indicated systems with “memory.” For this purpose, it is necessary to introduce some additional elements into the model, in particular, (i) to take into account the fact that every cycle starts from a state “inherited” from the preceding cycle and (ii) to assume that the rate of growth of a new magnetic phase depends on the cycle number. This modified model provides a quite satisfactory qualitative and quantitative description of all features of successive magnetic relaxation cycles in the system under consideration, including the surprising phenomenon of cumulative growth of minor hysteresis loops.

  19. Cumulative protons in 12C fragmentation at intermediate energy

    International Nuclear Information System (INIS)

    Abramov, B.M.; Alekseev, P.N.; Borodin, Y.A.; Bulychjov, S.A.; Dukhovskoi, I.A.; Khanov, A.I.; Krutenkova, A.P.; Kulikov, V.V.; Martemianov, M.A.; Matsuk, M.A.; Turdakina, E.N.

    2014-01-01

    In the FRAGM experiment at the TWAC-ITEP heavy ion accelerator complex, proton yields at an angle of 3.5 degrees have been measured in the fragmentation of carbon ions on a beryllium target at T0 = 0.3, 0.6, 0.95 and 2.0 GeV/nucleon. The data are presented as invariant proton yields versus the cumulative variable x in the range 0.9 < x < 2.4. The proton spectra cover six orders of magnitude of invariant cross section. They have been analyzed in the framework of a quark cluster fragmentation model, using the fragmentation functions of the quark-gluon string model. The probabilities of the existence of multi-quark clusters in carbon nuclei are estimated to be 8 - 12% for six-quark clusters and 0.2 - 0.6% for nine-quark clusters. (authors)

  20. Ratcheting up the ratchet: on the evolution of cumulative culture.

    Science.gov (United States)

    Tennie, Claudio; Call, Josep; Tomasello, Michael

    2009-08-27

    Some researchers have claimed that chimpanzee and human culture rest on homologous cognitive and learning mechanisms. While clearly there are some homologous mechanisms, we argue here that there are some different mechanisms at work as well. Chimpanzee cultural traditions represent behavioural biases of different populations, all within the species' existing cognitive repertoire (what we call the 'zone of latent solutions') that are generated by founder effects, individual learning and mostly product-oriented (rather than process-oriented) copying. Human culture, in contrast, has the distinctive characteristic that it accumulates modifications over time (what we call the 'ratchet effect'). This difference results from the facts that (i) human social learning is more oriented towards process than product and (ii) unique forms of human cooperation lead to active teaching, social motivations for conformity and normative sanctions against non-conformity. Together, these unique processes of social learning and cooperation lead to humans' unique form of cumulative cultural evolution.

  1. EXPERIMENTAL VALIDATION OF CUMULATIVE SURFACE LOCATION ERROR FOR TURNING PROCESSES

    Directory of Open Access Journals (Sweden)

    Adam K. Kiss

    2016-02-01

    The aim of this study is to create a mechanical model which is suitable to investigate the surface quality in turning processes, based on the Cumulative Surface Location Error (CSLE), which describes the series of consecutive Surface Location Errors (SLE) in roughing operations. In the established model, the investigated CSLE depends on the currently and previously resulting SLE by means of the variation of the width of cut. The phenomenon of the system can be described as an implicit discrete map. The stationary Surface Location Error and its bifurcations were analysed, and a flip-type bifurcation was observed for CSLE. Experimental verification of the theoretical results was carried out.

  2. Ratcheting up the ratchet: on the evolution of cumulative culture

    Science.gov (United States)

    Tennie, Claudio; Call, Josep; Tomasello, Michael

    2009-01-01

    Some researchers have claimed that chimpanzee and human culture rest on homologous cognitive and learning mechanisms. While clearly there are some homologous mechanisms, we argue here that there are some different mechanisms at work as well. Chimpanzee cultural traditions represent behavioural biases of different populations, all within the species’ existing cognitive repertoire (what we call the ‘zone of latent solutions’) that are generated by founder effects, individual learning and mostly product-oriented (rather than process-oriented) copying. Human culture, in contrast, has the distinctive characteristic that it accumulates modifications over time (what we call the ‘ratchet effect’). This difference results from the facts that (i) human social learning is more oriented towards process than product and (ii) unique forms of human cooperation lead to active teaching, social motivations for conformity and normative sanctions against non-conformity. Together, these unique processes of social learning and cooperation lead to humans’ unique form of cumulative cultural evolution. PMID:19620111

  3. Cumulative neutrino background from quasar-driven outflows

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Xiawei; Loeb, Abraham, E-mail: xiawei.wang@cfa.harvard.edu, E-mail: aloeb@cfa.harvard.edu [Department of Astronomy, Harvard University, 60 Garden Street, Cambridge, MA 02138 (United States)

    2016-12-01

    Quasar-driven outflows naturally account for the missing component of the extragalactic γ-ray background through neutral pion production in interactions between protons accelerated by the forward outflow shock and interstellar protons. We study the simultaneous neutrino emission by the same protons. We adopt outflow parameters that best fit the extragalactic γ-ray background data and derive a cumulative neutrino background of ∼10^-7 GeV cm^-2 s^-1 sr^-1 at neutrino energies E_ν ≳ 10 TeV, which naturally explains the most recent IceCube data without tuning any free parameters. The link between the γ-ray and neutrino emission from quasar outflows can be used to constrain the high-energy physics of strong shocks at cosmological distances.

  4. Using Fuzzy Probability Weights in Cumulative Prospect Theory

    Directory of Open Access Journals (Sweden)

    Užga-Rebrovs Oļegs

    2016-12-01

    During the past years, rapid growth has been seen in descriptive approaches to decision choice. As opposed to normative expected utility theory, these approaches are based on the subjective perception of probabilities by individuals, which takes place in real situations of risky choice. Perceptions of this kind are modelled on the basis of probability weighting functions. In cumulative prospect theory, which is the focus of this paper, decision prospect outcome weights are calculated using the obtained probability weights. If the value functions are constructed on the sets of positive and negative outcomes, then, based on the outcome value evaluations and outcome decision weights, generalised evaluations of prospect value are calculated, which are the basis for choosing an optimal prospect.
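The probability weighting step can be made concrete with a short sketch. It uses the Tversky-Kahneman (1992) weighting function for gains with their published estimate γ = 0.61, and the standard CPT convention of ordering outcomes from best to worst; these are assumptions drawn from the general CPT literature, not from this particular paper.

```python
def w(p, gamma=0.61):
    """Tversky-Kahneman (1992) probability weighting function for gains."""
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

def decision_weights(probs, gamma=0.61):
    """Cumulative decision weights for a gain prospect.

    probs: outcome probabilities ordered from best outcome to worst.
    pi_i = w(p_1 + ... + p_i) - w(p_1 + ... + p_{i-1}).
    """
    weights, cum_prev = [], 0.0
    for p in probs:
        cum = cum_prev + p
        weights.append(w(cum, gamma) - w(cum_prev, gamma))
        cum_prev = cum
    return weights
```

Because the weights telescope, they always sum to w(1) - w(0) = 1, while extreme outcomes receive more than their objective probability share.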

  5. Modelling the evolution and diversity of cumulative culture

    Science.gov (United States)

    Enquist, Magnus; Ghirlanda, Stefano; Eriksson, Kimmo

    2011-01-01

    Previous work on mathematical models of cultural evolution has mainly focused on the diffusion of simple cultural elements. However, a characteristic feature of human cultural evolution is the seemingly limitless appearance of new and increasingly complex cultural elements. Here, we develop a general modelling framework to study such cumulative processes, in which we assume that the appearance and disappearance of cultural elements are stochastic events that depend on the current state of culture. Five scenarios are explored: evolution of independent cultural elements, stepwise modification of elements, differentiation or combination of elements and systems of cultural elements. As one application of our framework, we study the evolution of cultural diversity (in time as well as between groups). PMID:21199845

  6. Optimal execution with price impact under Cumulative Prospect Theory

    Science.gov (United States)

    Zhao, Jingdong; Zhu, Hongliang; Li, Xindan

    2018-01-01

    Optimal execution of a stock (or portfolio) trade has been widely studied in academia and in practice over the past decade, and minimizing transaction costs is a critical concern. However, few researchers consider the traders' psychological factors. What are traders truly concerned with: buying low in their paper accounts, or buying lower than others? We consider optimal trading strategies in terms of price impact and Cumulative Prospect Theory and identify some specific properties. Our analyses indicate that a large proportion of the execution volume is distributed at both ends of the transaction time, but the trader's optimal strategies may not be implemented at the same transaction size and speed in different market environments.

  7. Practical management of cumulative anthropogenic impacts with working marine examples

    DEFF Research Database (Denmark)

    Kyhn, Line Anker; Wright, Andrew J.

    2014-01-01

    for petroleum. Human disturbances, including the noise almost ubiquitously associated with human activity, are likely to increase the incidence, magnitude, and duration of adverse effects on marine life, including stress responses. Stress responses have the potential to induce fitness consequences… on impact can be facilitated through implementation of regular application cycles for project authorization or improved programmatic and aggregated impact assessments that simultaneously consider multiple projects. Cross-company collaborations and a better incorporation of uncertainty into decision making… could also help limit, if not reduce, cumulative impacts of multiple human activities. These simple management steps may also form the basis of a rudimentary form of marine spatial planning and could be used in support of future ecosystem-based management efforts…

  8. Practical management of cumulative anthropogenic impacts with working marine examples.

    Science.gov (United States)

    Wright, Andrew J; Kyhn, Line A

    2015-04-01

    Human pressure on the environment is expanding and intensifying, especially in coastal and offshore areas. Major contributors to this are the current push for offshore renewable energy sources, which are thought of as environmentally friendly sources of power, as well as the continued demand for petroleum. Human disturbances, including the noise almost ubiquitously associated with human activity, are likely to increase the incidence, magnitude, and duration of adverse effects on marine life, including stress responses. Stress responses have the potential to induce fitness consequences for individuals, which add to more obvious directed takes (e.g., hunting or fishing) to increase the overall population-level impact. To meet the requirements of marine spatial planning and ecosystem-based management, many efforts are ongoing to quantify the cumulative impacts of all human actions on marine species or populations. Meanwhile, regulators face the challenge of managing these accumulating and interacting impacts with limited scientific guidance. We believe there is scientific support for capping the level of impact for (at a minimum) populations in decline or with unknown statuses. This cap on impact can be facilitated through implementation of regular application cycles for project authorization or improved programmatic and aggregated impact assessments that simultaneously consider multiple projects. Cross-company collaborations and a better incorporation of uncertainty into decision making could also help limit, if not reduce, cumulative impacts of multiple human activities. These simple management steps may also form the basis of a rudimentary form of marine spatial planning and could be used in support of future ecosystem-based management efforts. © 2014 Society for Conservation Biology.

  9. County-level cumulative environmental quality associated with cancer incidence.

    Science.gov (United States)

    Jagai, Jyotsna S; Messer, Lynne C; Rappazzo, Kristen M; Gray, Christine L; Grabich, Shannon C; Lobdell, Danelle T

    2017-08-01

    Individual environmental exposures are associated with cancer development; however, environmental exposures occur simultaneously. The Environmental Quality Index (EQI) is a county-level measure of cumulative environmental exposures that occur in 5 domains. The EQI was linked to county-level annual age-adjusted cancer incidence rates from the Surveillance, Epidemiology, and End Results (SEER) Program state cancer profiles. All-site cancer and the top 3 site-specific cancers for male and female subjects were considered. Incident rate differences (IRDs; annual rate difference per 100,000 persons) and 95% confidence intervals (CIs) were estimated using fixed-slope, random intercept multilevel linear regression models. Associations were assessed with domain-specific indices and analyses were stratified by rural/urban status. Comparing the highest quintile/poorest environmental quality with the lowest quintile/best environmental quality for overall EQI, all-site county-level cancer incidence rate was positively associated with poor environmental quality overall (IRD, 38.55; 95% CI, 29.57-47.53) and for male (IRD, 32.60; 95% CI, 16.28-48.91) and female (IRD, 30.34; 95% CI, 20.47-40.21) subjects, indicating a potential increase in cancer incidence with decreasing environmental quality. Rural/urban stratified models demonstrated positive associations comparing the highest with the lowest quintiles for all strata except the thinly populated/rural stratum and the metropolitan/urbanized stratum. Prostate and breast cancer demonstrated the strongest positive associations with poor environmental quality. We observed strong positive associations between the EQI and all-site cancer incidence rates, and associations differed by rural/urban status and environmental domain. Research focusing on single environmental exposures in cancer development may not address the broader environmental context in which cancers develop, and future research should address cumulative environmental

  10. Economic and policy implications of the cumulative carbon budget

    Science.gov (United States)

    Allen, M. R.; Otto, F. E. L.; Otto, A.; Hepburn, C.

    2014-12-01

    The importance of cumulative carbon emissions in determining long-term risks of climate change presents considerable challenges to policy makers. The traditional notion of "total CO2-equivalent emissions", which forms the backbone of agreements such as the Kyoto Protocol and the European Emissions Trading System, is fundamentally flawed. Measures to reduce short-lived climate pollutants benefit the current generation, while measures to reduce long-lived climate pollutants benefit future generations, so there is no sense in which they can ever be considered equivalent. Debates over the correct metric used to compute CO2-equivalence are thus entirely moot: both long-lived and short-lived emissions will need to be addressed if all generations are to be protected from dangerous climate change. As far as long-lived climate pollutants are concerned, the latest IPCC report highlights the overwhelming importance of carbon capture and storage in determining the cost of meeting the goal of limiting anthropogenic warming to two degrees. We will show that this importance arises directly from the cumulative carbon budget and the role of CCS as the technology of last resort before economic activity needs to be restricted to meet ambitious climate targets. It highlights the need to increase the rate of CCS deployment by orders of magnitude if the option of avoiding two degrees is to be retained. The difficulty of achieving this speed of deployment through conventional incentives and carbon-pricing mechanisms suggests a need for a much more direct mandatory approach. Despite their theoretical economic inefficiency, the success of recent regulatory measures in achieving greenhouse gas emissions reductions in jurisdictions such as the United States suggests an extension of the regulatory approach could be a more effective and politically acceptable means of achieving adequately rapid CCS deployment than conventional carbon taxes or cap-and-trade systems.

  11. Construction of calibration curve for accountancy tank

    International Nuclear Information System (INIS)

    Kato, Takayuki; Goto, Yoshiki; Nidaira, Kazuo

    2009-01-01

    Tanks are equipped in a reprocessing plant for accounting solution of nuclear material. Careful measurement of the volume in tanks is very important to implement rigorous accounting of nuclear material. A calibration curve relating the volume and level of solution needs to be constructed, where the level is determined from the differential pressure of dip tubes. Several calibration curves are usually employed, but it is not explicitly decided how many segments are used, where to place the segment boundaries, or what the degree of the polynomial curve should be. These parameters, i.e., the segments and the degree of the polynomial curve, are mutually interrelated in determining the performance of the calibration curve. Here we present a technique for constructing optimum calibration curves and describe their characteristics. (author)
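The segmented polynomial calibration described above can be sketched in a few lines: fit one least-squares polynomial of volume against level per segment, with the breakpoints and polynomial degree as the interrelated free parameters the abstract refers to. The function names and the plain least-squares approach are illustrative assumptions, not the authors' construction technique.

```python
import numpy as np

def segmented_calibration(level, volume, breakpoints, degree=2):
    """Fit one least-squares polynomial volume(level) per level segment.

    breakpoints: interior level values splitting the range into segments;
    together with `degree`, these are the free parameters to be tuned.
    Returns a function mapping a level reading to an estimated volume.
    """
    level = np.asarray(level, dtype=float)
    volume = np.asarray(volume, dtype=float)
    edges = [-np.inf, *breakpoints, np.inf]
    fits = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (level >= lo) & (level < hi)
        fits.append((lo, hi, np.polyfit(level[mask], volume[mask], degree)))

    def predict(x):
        for lo, hi, coef in fits:
            if lo <= x < hi:
                return float(np.polyval(coef, x))

    return predict
```

In practice the segment layout and degree would be chosen by comparing calibration residuals, since the two choices trade off against each other.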

  12. MICA: Multiple interval-based curve alignment

    Science.gov (United States)

    Mann, Martin; Kahle, Hans-Peter; Beck, Matthias; Bender, Bela Johannes; Spiecker, Heinrich; Backofen, Rolf

    2018-01-01

    MICA enables the automatic synchronization of discrete data curves. To this end, characteristic points of the curves' shapes are identified. These landmarks are used within a heuristic curve registration approach to align profile pairs by mapping similar characteristics onto each other. In combination with a progressive alignment scheme, this enables the computation of multiple curve alignments. Multiple curve alignments are needed to derive meaningful representative consensus data of measured time or data series. MICA was already successfully applied to generate representative profiles of tree growth data based on intra-annual wood density profiles or cell formation data. The MICA package provides a command-line and graphical user interface. The R interface enables the direct embedding of multiple curve alignment computation into larger analyses pipelines. Source code, binaries and documentation are freely available at https://github.com/BackofenLab/MICA

  13. Inverse Diffusion Curves Using Shape Optimization.

    Science.gov (United States)

    Zhao, Shuang; Durand, Fredo; Zheng, Changxi

    2018-07-01

    The inverse diffusion curve problem focuses on automatic creation of diffusion curve images that resemble user provided color fields. This problem is challenging since the 1D curves have a nonlinear and global impact on resulting color fields via a partial differential equation (PDE). We introduce a new approach complementary to previous methods by optimizing curve geometry. In particular, we propose a novel iterative algorithm based on the theory of shape derivatives. The resulting diffusion curves are clean and well-shaped, and the final image closely approximates the input. Our method provides a user-controlled parameter to regularize curve complexity, and generalizes to handle input color fields represented in a variety of formats.

  14. Low velocity target detection based on time-frequency image for high frequency ground wave radar

    Institute of Scientific and Technical Information of China (English)

    YAN Songhua; WU Shicai; WEN Biyang

    2007-01-01

    The Doppler spectral broadening resulting from non-stationary target motion and radio-frequency interference decreases the accuracy of target detection by high frequency ground wave (HFGW) radar. By displaying the change of signal energy on two-dimensional time-frequency images based on time-frequency analysis, a new mathematical morphology method to distinguish the target from nonlinear time-frequency curves is presented. The results from measured data verify that with this new method the target can be detected correctly from a wide Doppler spectrum.
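
    As an illustration of the general idea (not the paper's specific morphology operator), the sketch below separates a constant-Doppler ridge from a sweeping interference trace in a spectrogram: a binary opening with a time-elongated structuring element keeps only features that persist at a fixed frequency. All signal parameters and thresholds are invented for the example.

```python
import numpy as np
from scipy.ndimage import binary_opening
from scipy.signal import stft

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(0)
# Weak target at a constant 125 Hz plus a strong frequency sweep standing
# in for non-stationary interference.
target = 0.5 * np.sin(2 * np.pi * 125 * t)
sweep = np.sin(2 * np.pi * (50 * t + 100 * t ** 2))  # 50 -> 450 Hz chirp
x = target + sweep + 0.1 * rng.standard_normal(t.size)

f, frames, Z = stft(x, fs=fs, nperseg=256)
img = np.abs(Z) > 0.1 * np.abs(Z).max()              # binary TF image
# A structuring element elongated along the time axis keeps ridges that
# persist at one frequency and erases the fast-moving sweep.
kept = binary_opening(img, structure=np.ones((1, 9)))
detected = f[np.argmax((np.abs(Z) * kept).sum(axis=1))]
```

    Here `detected` recovers the stationary 125 Hz line even though the sweep is twice as strong, because the sweep never dwells long enough in any one frequency bin to survive the opening.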

  15. String Sigma Models on Curved Supermanifolds

    Directory of Open Access Journals (Sweden)

    Roberto Catenacci

    2018-04-01

    Full Text Available We use the techniques of integral forms to analyze the easiest example of two-dimensional sigma models on a supermanifold. We write the action as an integral of a top integral form over a D = 2 supermanifold, and we show how to interpolate between different superspace actions. Then, we consider curved supermanifolds, and we show that the definitions used for flat supermanifolds can also be used for curved supermanifolds. We prove it by first considering the case of a curved rigid supermanifold and then the case of a generic curved supermanifold described by a single superfield E.

  16. Regional Marginal Abatement Cost Curves for NOx

    Data.gov (United States)

    U.S. Environmental Protection Agency — Data underlying the figures included in the manuscript "Marginal abatement cost curve for NOx incorporating controls, renewable electricity, energy efficiency and...

  17. Rating curve estimation of nutrient loads in Iowa rivers

    Science.gov (United States)

    Stenback, G.A.; Crumpton, W.G.; Schilling, K.E.; Helmers, M.J.

    2011-01-01

    Accurate estimation of nutrient loads in rivers and streams is critical for many applications, including determination of the sources of nutrient loads in watersheds, evaluation of long-term trends in loads, and estimation of loading to downstream waterbodies. Since nutrient concentrations are often measured at a weekly or monthly frequency, there is a need to estimate concentration and loads during periods when no data are available. The objectives of this study were to: (i) document the performance of a multiple regression model to predict loads of nitrate and total phosphorus (TP) in Iowa rivers and streams; (ii) determine whether there is any systematic bias in the load prediction estimates for nitrate and TP; and (iii) evaluate streamflow and concentration factors that could affect the load prediction efficiency. A commonly cited rating curve regression is utilized to estimate riverine nitrate and TP loads for rivers in Iowa with watershed areas ranging from 17.4 to over 34,600 km². Forty-nine nitrate and 44 TP datasets, each comprising 5-22 years of approximately weekly to monthly concentrations, were examined. Three nitrate datasets had sample collection frequencies averaging about three samples per week. The accuracy and precision of annual and long-term riverine load prediction were assessed by direct comparison of rating curve load predictions with observed daily loads. Significant positive bias of annual and long-term nitrate loads was detected. Long-term rating curve nitrate load predictions exceeded observed loads by 25% or more at 33% of the 49 measurement sites. No bias was found for TP load prediction, although 15% of the 44 cases either underestimated or overestimated observed long-term loads by more than 25%. The rating curve was found to poorly characterize nitrate and phosphorus variation in some rivers. © 2010.
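
    The rating-curve approach referred to here is, in its simplest form, a log-log regression of concentration on discharge with a bias correction for the back-transform from log space. The sketch below keeps only the discharge term and uses Duan's smearing estimator; the regression cited in the study includes additional seasonal and trend terms.

```python
import numpy as np

def rating_curve_loads(q_sampled, c_sampled, q_daily):
    """Estimate daily loads from sparse concentration samples.

    Fits ln(C) = b0 + b1 * ln(Q) by least squares, then applies Duan's
    smearing estimator to correct the bias introduced by exponentiating
    the log-space predictions. Load is concentration times discharge.
    """
    X = np.column_stack([np.ones_like(q_sampled), np.log(q_sampled)])
    beta, *_ = np.linalg.lstsq(X, np.log(c_sampled), rcond=None)
    resid = np.log(c_sampled) - X @ beta
    smear = np.mean(np.exp(resid))          # Duan bias correction factor
    c_daily = smear * np.exp(beta[0] + beta[1] * np.log(q_daily))
    return c_daily * q_daily                # daily load series
```

    The positive nitrate bias reported above is exactly the kind of systematic error this study checks for: even with a smearing correction, a rating curve can misstate aggregate loads when the concentration-discharge relationship drifts.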

  18. The cumulative burden of surviving childhood cancer: an initial report from the St Jude Lifetime Cohort Study (SJLIFE).

    Science.gov (United States)

    Bhakta, Nickhill; Liu, Qi; Ness, Kirsten K; Baassiri, Malek; Eissa, Hesham; Yeo, Frederick; Chemaitilly, Wassim; Ehrhardt, Matthew J; Bass, Johnnie; Bishop, Michael W; Shelton, Kyla; Lu, Lu; Huang, Sujuan; Li, Zhenghong; Caron, Eric; Lanctot, Jennifer; Howell, Carrie; Folse, Timothy; Joshi, Vijaya; Green, Daniel M; Mulrooney, Daniel A; Armstrong, Gregory T; Krull, Kevin R; Brinkman, Tara M; Khan, Raja B; Srivastava, Deo K; Hudson, Melissa M; Yasui, Yutaka; Robison, Leslie L

    2017-12-09

    Survivors of childhood cancer develop early and severe chronic health conditions (CHCs). A quantitative landscape of morbidity of survivors, however, has not been described. We aimed to describe the cumulative burden of curative cancer therapy in a clinically assessed, ageing population of long-term survivors of childhood cancer. The St Jude Lifetime Cohort Study (SJLIFE) retrospectively collected data on CHCs in all patients treated for childhood cancer at St Jude Children's Research Hospital who survived 10 years or longer from initial diagnosis and were 18 years or older as of June 30, 2015. Age-matched and sex-frequency-matched community controls were used for comparison. 21 treatment exposure variables were included in the analysis, with data abstracted from medical records. 168 CHCs for all participants were graded for severity using a modified Common Terminology Criteria for Adverse Events. Multiple imputation with predictive mean matching was used for missing occurrences and grades of CHCs in the survivors who were not clinically evaluable. Mean cumulative count was used for descriptive cumulative burden analysis and marked-point-process regression was used for inferential cumulative burden analysis. Of 5522 patients treated for childhood cancer at St Jude Children's Research Hospital who had complete records, survived 10 years or longer, and were 18 years or older at the time of study, 3010 (54·5%) were alive, had enrolled, and had had prospective clinical assessment. 2512 (45·5%) of the 5522 patients were not clinically evaluable. The cumulative incidence of CHCs at age 50 years was 99·9% (95% CI 99·9-99·9) for grade 1-5 CHCs and 96·0% (95% CI 95·3-96·8) for grade 3-5 CHCs. By age 50 years, a survivor had experienced, on average, 17·1 (95% CI 16·2-18·1) CHCs of any grade, of which 4·7 (4·6-4·9) were CHCs of grade 3-5. The cumulative burden in matched community controls of grade 1-5 CHCs was 9·2 (95% CI 7·9-10·6) and that of grade 3-5 CHCs was 2·3 (1
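
    The mean cumulative count used for the descriptive burden analysis reduces, in the special case of complete follow-up and no competing risks, to the average number of events experienced per person by each age. A minimal sketch of that special case (the full estimator, which SJLIFE uses, also accounts for censoring and death):

```python
import numpy as np

def mean_cumulative_count(event_ages, n_subjects, grid):
    """Mean cumulative number of events per person by each age in `grid`.

    With complete follow-up and no competing risks, MCC(t) is simply the
    total number of events occurring at or before age t divided by the
    cohort size; recurrent events in the same person all count.
    """
    ages = np.sort(np.asarray(event_ages))
    return np.searchsorted(ages, grid, side="right") / n_subjects
```

    Unlike cumulative incidence, which saturates at 100% once everyone has had at least one event, this count keeps growing with each additional condition, which is why it can reach 17·1 CHCs per survivor by age 50.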

  19. Cumulative Effect of Obesogenic Behaviours on Adiposity in Spanish Children and Adolescents

    Directory of Open Access Journals (Sweden)

    Helmut Schröder

    2017-12-01

    Full Text Available Objective: Little is known about the cumulative effect of obesogenic behaviours on childhood obesity risk. We determined the cumulative effect on BMI z-score, waist-to-height ratio (WHtR), overweight and abdominal obesity of four lifestyle behaviours that have been linked to obesity. Methods: In this cross-sectional analysis, data were obtained from the EnKid study, a representative sample of Spanish youth. The study included 1,614 boys and girls aged 5-18 years. Weight, height and waist circumference were measured. Physical activity (PA), screen time, breakfast consumption and meal frequency were self-reported on structured questionnaires. Each behaviour was dichotomized as obesogenic or not. BMI z-score was computed using age- and sex-specific reference values from the World Health Organization (WHO). Overweight including obesity was defined as a BMI > 1 SD from the mean of the WHO reference population. Abdominal obesity was defined as a WHtR ≥ 0.5. Results: High screen time was the most prominent obesogenic behaviour (49.7%), followed by low physical activity (22.4%), low meal frequency (14.4%), and skipping breakfast (12.5%). Although 33% of participants were free of all 4 obesogenic behaviours, 1, 2, and 3 or 4 behaviours were reported by 44.5%, 19.3%, and 5.0%, respectively. BMI z-score and WHtR were positively associated (p < 0.001) with increasing numbers of concurrent obesogenic behaviours. The odds of presenting with more than 2 obesogenic behaviours were significantly higher in children who were overweight (OR 2.68; 95% CI 1.50-4.80) or had abdominal obesity (OR 2.12; 95% CI 1.28-3.52). High maternal and paternal education was inversely associated (p = 0.004 and p < 0.001, respectively) with increasing presence of obesogenic behaviours. Surrogate markers of adiposity increased with the number of concurrent obesogenic behaviours. The opposite was true for high maternal and paternal education.
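
    The behaviour-counting step behind such results is straightforward: each child gets a count of concurrent obesogenic flags, and group contrasts can be summarized with an odds ratio from a 2x2 table. The flag matrix and table counts below are illustrative values, not the EnKid data.

```python
import numpy as np

# One row per child, one column per behaviour (low PA, high screen time,
# low meal frequency, skipped breakfast); 1 = behaviour present.
behaviours = np.array([
    [1, 1, 0, 0],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
])
n_obesogenic = behaviours.sum(axis=1)   # concurrent behaviours per child

def odds_ratio(exp_cases, exp_controls, unexp_cases, unexp_controls):
    """Odds ratio from a 2x2 table (exposed vs unexposed cases/controls)."""
    return (exp_cases / exp_controls) / (unexp_cases / unexp_controls)
```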

  20. Shaking Table Tests of Curved Bridge considering Bearing Friction Sliding Isolation

    Directory of Open Access Journals (Sweden)

    Lei Yan

    2016-01-01

    Full Text Available To address the severe damage suffered by curved bridges in earthquakes, caused by excessive forces in the fixed bearings and piers, a new seismic design method for curved bridges considering bearing friction sliding isolation is proposed in this paper. A seismic model bridge and an isolation model bridge with a similarity ratio of 1/20 were built, and a shaking table comparison test was conducted. The experimental results show that the isolation model curved bridge suffered less seismic damage than the seismic model curved bridge. The fundamental frequencies of the seismic model bridge and the isolation model bridge decreased, and the damping ratios increased, with increasing seismic intensity. Compared with the seismic model curved bridge, the maximum reduction rates of peak acceleration along the radial and tangential directions on the top of the pier of the isolation model curved bridge were 47.3% and 55.5%, respectively, and the maximum reduction rate of peak strain at the bottom of the pier of the isolation model curved bridge was 43.4%. For the isolation model curved bridge, the maximum reduction rate of peak acceleration on the top of the pier was 24.6% compared with that at the bottom of the pier. These results can provide an experimental basis for the seismic design of curved bridges.