WorldWideScience

Sample records for optimal acceptance threshold

  1. Calculation of Pareto-optimal solutions to multiple-objective problems using threshold-of-acceptability constraints

    Science.gov (United States)

    Giesy, D. P.

    1978-01-01

    A technique is presented for the calculation of Pareto-optimal solutions to a multiple-objective constrained optimization problem by solving a series of single-objective problems. Threshold-of-acceptability constraints are placed on the objective functions at each stage both to limit the area of search and to provide a mathematical guarantee of convergence to a Pareto optimum.

  2. Faster magnet sorting with a threshold acceptance algorithm

    International Nuclear Information System (INIS)

    Lidia, S.; Carr, R.

    1995-01-01

    We introduce here a new technique for sorting magnets to minimize the field errors in permanent magnet insertion devices. Simulated annealing has been used in this role, but we find the technique of threshold acceptance produces results of equal quality in less computer time. Threshold accepting would be of special value in designing very long insertion devices, such as long free electron lasers (FELs). Our application of threshold acceptance to magnet sorting showed that it converged to equivalently low values of the cost function, but that it converged significantly faster. We present typical cases showing time to convergence for various error tolerances, magnet numbers, and temperature schedules.

  3. Faster magnet sorting with a threshold acceptance algorithm

    International Nuclear Information System (INIS)

    Lidia, S.

    1994-08-01

    The authors introduce here a new technique for sorting magnets to minimize the field errors in permanent magnet insertion devices. Simulated annealing has been used in this role, but they find the technique of threshold acceptance produces results of equal quality in less computer time. Threshold accepting would be of special value in designing very long insertion devices, such as long FELs. Their application of threshold acceptance to magnet sorting showed that it converged to equivalently low values of the cost function, but that it converged significantly faster. They present typical cases showing time to convergence for various error tolerances, magnet numbers, and temperature schedules.
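
    The two records above describe threshold accepting, a simulated-annealing-like heuristic that accepts any candidate move whose cost increase stays below a shrinking threshold. A minimal sketch of the idea is given below; the permutation move set, toy cost function and threshold schedule are illustrative assumptions, not the magnet-sorting cost model used by these authors.

```python
import random

def threshold_accepting(cost, initial, thresholds, moves_per_threshold=1000, seed=0):
    """Generic threshold-accepting search over permutations.

    cost:        function mapping a permutation (list) to a scalar cost
    initial:     starting permutation
    thresholds:  decreasing sequence of acceptance thresholds (the "temperature schedule")
    """
    rng = random.Random(seed)
    current = list(initial)
    current_cost = cost(current)
    best, best_cost = list(current), current_cost

    for T in thresholds:                      # thresholds play the role of temperatures
        for _ in range(moves_per_threshold):
            i, j = rng.sample(range(len(current)), 2)
            candidate = list(current)
            candidate[i], candidate[j] = candidate[j], candidate[i]   # swap two elements
            c = cost(candidate)
            # Key difference from simulated annealing: accept deterministically
            # whenever the cost increase is below the current threshold.
            if c - current_cost < T:
                current, current_cost = candidate, c
                if c < best_cost:
                    best, best_cost = list(candidate), c
    return best, best_cost

# Toy usage: order numbers by minimizing the count of adjacent inversions.
def inversions(p):
    return sum(p[k] > p[k + 1] for k in range(len(p) - 1))

data = list(range(20))
random.shuffle(data)
schedule = [5, 3, 2, 1, 0.5, 0.1, 0.0]
solution, final_cost = threshold_accepting(inversions, data, schedule)
print(final_cost, solution)
```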

  4. Automated backbone assignment of labeled proteins using the threshold accepting algorithm

    International Nuclear Information System (INIS)

    Leutner, Michael; Gschwind, Ruth M.; Liermann, Jens; Schwarz, Christian; Gemmecker, Gerd; Kessler, Horst

    1998-01-01

    The sequential assignment of backbone resonances is the first step in the structure determination of proteins by heteronuclear NMR. For larger proteins, an assignment strategy based on proton side-chain information is no longer suitable for use in an automated procedure. Our program PASTA (Protein ASsignment by Threshold Accepting) is therefore designed to partially or fully automate the sequential assignment of proteins, based on the analysis of NMR backbone resonances plus Cβ information. In order to overcome the problems caused by peak overlap and missing signals in an automated assignment process, PASTA uses threshold accepting, a combinatorial optimization strategy, which is superior to simulated annealing due to generally faster convergence and better solutions. The reliability of this algorithm is shown by reproducing the complete sequential backbone assignment of several proteins from published NMR data. The robustness of the algorithm against misassigned signals, noise, spectral overlap and missing peaks is shown by repeating the assignment with reduced sequential information and increased chemical shift tolerances. The performance of the program on real data is finally demonstrated with automatically picked peak lists of human nonpancreatic synovial phospholipase A2, a protein with 124 residues.

  5. Toward optimizing patient-specific IMRT QA techniques in the accurate detection of dosimetrically acceptable and unacceptable patient plans.

    Science.gov (United States)

    McKenzie, Elizabeth M; Balter, Peter A; Stingo, Francesco C; Jones, Jimmy; Followill, David S; Kry, Stephen F

    2014-12-01

    The authors investigated the performance of several patient-specific intensity-modulated radiation therapy (IMRT) quality assurance (QA) dosimeters in terms of their ability to correctly identify dosimetrically acceptable and unacceptable IMRT patient plans, as determined by an in-house-designed multiple ion chamber phantom used as the gold standard. A further goal was to examine optimal threshold criteria that were consistent and based on the same criteria among the various dosimeters. The authors used receiver operating characteristic (ROC) curves to determine the sensitivity and specificity of (1) a 2D diode array undergoing anterior irradiation with field-by-field evaluation, (2) a 2D diode array undergoing anterior irradiation with composite evaluation, (3) a 2D diode array using planned irradiation angles with composite evaluation, (4) a helical diode array, (5) radiographic film, and (6) an ion chamber. This was done with a variety of evaluation criteria for a set of 15 dosimetrically unacceptable and 9 acceptable clinical IMRT patient plans, where acceptability was defined on the basis of multiple ion chamber measurements using independent ion chambers and a phantom. The area under the curve (AUC) on the ROC curves was used to compare dosimeter performance across all thresholds. Optimal threshold values were obtained from the ROC curves while incorporating considerations for cost and prevalence of unacceptable plans. Using common clinical acceptance thresholds, most devices performed very poorly in terms of identifying unacceptable plans. Grouping the detector performance based on AUC showed two significantly different groups. The ion chamber, radiographic film, helical diode array, and anterior-delivered composite 2D diode array were in the better-performing group, whereas the anterior-delivered field-by-field and planned gantry angle delivery using the 2D diode array performed less well. Additionally, based on the AUCs, there was no significant difference

  6. Optimizing Systems of Threshold Detection Sensors

    National Research Council Canada - National Science Library

    Banschbach, David C

    2008-01-01

    .... Below the threshold all signals are ignored. We develop a mathematical model for setting individual sensor thresholds to obtain optimal probability of detecting a significant event, given a limit on the total number of false positives allowed...

  7. Automatic Multi-Level Thresholding Segmentation Based on Multi-Objective Optimization

    Directory of Open Access Journals (Sweden)

    L. DJEROU,

    2012-01-01

    Full Text Available In this paper, we present a new multi-level image thresholding technique, called Automatic Threshold based on Multi-objective Optimization "ATMO", that combines the flexibility of multi-objective fitness functions with the power of a Binary Particle Swarm Optimization algorithm "BPSO", to search for the "optimum" number of thresholds and, simultaneously, the optimal threshold values under three criteria: the between-class variance criterion, the minimum error criterion and the entropy criterion. Some examples of test images are presented to compare our segmentation method, based on the multi-objective optimization approach, with Otsu’s, Kapur’s and Kittler’s methods. Our experimental results show that the thresholding method based on multi-objective optimization is more efficient than the classical Otsu’s, Kapur’s and Kittler’s methods.
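
    The between-class variance criterion mentioned above is Otsu's: for a single threshold t, the histogram is split into two classes and the threshold maximizing the weighted squared distance between class means is selected. A minimal single-threshold NumPy sketch is shown below; the multi-objective BPSO search described in the record is not reproduced, and the synthetic image is an illustrative assumption.

```python
import numpy as np

def otsu_threshold(image, bins=256):
    """Return the single threshold maximizing Otsu's between-class variance."""
    hist, _ = np.histogram(image.ravel(), bins=bins, range=(0, bins))
    p = hist.astype(float) / hist.sum()          # gray-level probabilities
    levels = np.arange(bins)

    omega = np.cumsum(p)                         # class-0 probability for each candidate t
    mu = np.cumsum(p * levels)                   # cumulative mean
    mu_total = mu[-1]

    # Between-class variance: sigma_b^2(t) = (mu_T * omega - mu)^2 / (omega * (1 - omega))
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mu_total * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b2 = np.nan_to_num(sigma_b2)
    return int(np.argmax(sigma_b2))

# Usage on a synthetic bimodal gray-level sample.
img = np.concatenate([np.random.normal(60, 10, 5000),
                      np.random.normal(180, 15, 5000)]).clip(0, 255)
print("Otsu threshold:", otsu_threshold(img))
```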

  8. Perceptibility and acceptability thresholds for colour differences in dentistry

    NARCIS (Netherlands)

    Khashayar, G.; Bain, P.A.; Salari, S.; Dozic, A.; Kleverlaan, C.J.; Feilzer, A.J.

    2014-01-01

    Introduction Data on acceptability thresholds (AT) and perceptibility thresholds (PT) for colour differences vary in the dental literature. There is consensus that the determination of ΔE* is appropriate to define AT and PT; however, there is no consensus regarding the values that should be used. The aim of this

  9. Modified Discrete Grey Wolf Optimizer Algorithm for Multilevel Image Thresholding

    Directory of Open Access Journals (Sweden)

    Linguo Li

    2017-01-01

    Full Text Available The computation involved in image segmentation grows with the number of thresholds, and the selection and application of thresholds in image thresholding has, at the same time, become an NP problem. The paper puts forward the modified discrete grey wolf optimizer algorithm (MDGWO), which improves the optimal-solution updating mechanism of the search agents by means of weights. Taking Kapur’s entropy as the optimized function and exploiting the discreteness of thresholds in image segmentation, the paper first discretizes the grey wolf optimizer (GWO) and then proposes a new attack strategy that uses the weight coefficient to replace the search formula for the optimal solution used in the original algorithm. The experimental results show that MDGWO can search out the optimal thresholds efficiently and precisely, and that these are very close to the results obtained by exhaustive search. In comparison with electromagnetism optimization (EMO), differential evolution (DE), the Artificial Bee Colony (ABC), and the classical GWO, it is concluded that MDGWO has advantages over the latter four in terms of image segmentation quality, objective function values, and their stability.
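
    Kapur's entropy, used as the fitness function above, selects the threshold that maximizes the sum of the entropies of the background and foreground gray-level distributions. A minimal single-threshold version is sketched below; the exhaustive loop stands in for the grey wolf search, which is omitted, and the histogram handling mirrors the Otsu sketch earlier.

```python
import numpy as np

def kapur_threshold(image, bins=256):
    """Single threshold maximizing Kapur's entropy H_background(t) + H_foreground(t)."""
    hist, _ = np.histogram(image.ravel(), bins=bins, range=(0, bins))
    p = hist.astype(float) / hist.sum()
    eps = 1e-12                                  # avoid log(0)
    cum = np.cumsum(p)

    best_t, best_h = 0, -np.inf
    for t in range(1, bins - 1):
        w0, w1 = cum[t], 1.0 - cum[t]
        if w0 < eps or w1 < eps:
            continue
        p0 = p[: t + 1] / w0                     # normalized class distributions
        p1 = p[t + 1:] / w1
        h = -np.sum(p0 * np.log(p0 + eps)) - np.sum(p1 * np.log(p1 + eps))
        if h > best_h:
            best_t, best_h = t, h
    return best_t
```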

  10. Acceptance threshold hypothesis is supported by chemical similarity of cuticular hydrocarbons in a stingless bee, Melipona asilvai.

    Science.gov (United States)

    Nascimento, D L; Nascimento, F S

    2012-11-01

    The ability to discriminate nestmates from non-nestmates in insect societies is essential to protect colonies from conspecific invaders. The acceptance threshold hypothesis predicts that organisms whose recognition systems classify recipients without errors should optimize the balance between acceptance and rejection. In this process, cuticular hydrocarbons play an important role as cues of recognition in social insects. The aims of this study were to determine whether guards exhibit a restrictive level of rejection towards chemically distinct individuals, becoming more permissive during the encounters with either nestmate or non-nestmate individuals bearing chemically similar profiles. The study demonstrates that Melipona asilvai (Hymenoptera: Apidae: Meliponini) guards exhibit a flexible system of nestmate recognition according to the degree of chemical similarity between the incoming forager and its own cuticular hydrocarbons profile. Guards became less restrictive in their acceptance rates when they encounter non-nestmates with highly similar chemical profiles, which they probably mistake for nestmates, hence broadening their acceptance level.

  11. Threshold-driven optimization for reference-based auto-planning

    Science.gov (United States)

    Long, Troy; Chen, Mingli; Jiang, Steve; Lu, Weiguo

    2018-02-01

    We study threshold-driven optimization methodology for automatically generating a treatment plan that is motivated by a reference DVH for IMRT treatment planning. We present a framework for threshold-driven optimization for reference-based auto-planning (TORA). Commonly used voxel-based quadratic penalties have two components for penalizing under- and over-dosing of voxels: a reference dose threshold and associated penalty weight. Conventional manual- and auto-planning using such a function involves iteratively updating the preference weights while keeping the thresholds constant, an unintuitive and often inconsistent method for planning toward some reference DVH. However, driving a dose distribution by threshold values instead of preference weights can achieve similar plans with less computational effort. The proposed methodology spatially assigns reference DVH information to threshold values, and iteratively improves the quality of that assignment. The methodology effectively handles both sub-optimal and infeasible DVHs. TORA was applied to a prostate case and a liver case as a proof-of-concept. Reference DVHs were generated using a conventional voxel-based objective, then altered to be either infeasible or easy-to-achieve. TORA was able to closely recreate reference DVHs in 5-15 iterations of solving a simple convex sub-problem. TORA has the potential to be effective for auto-planning based on reference DVHs. As dose prediction and knowledge-based planning becomes more prevalent in the clinical setting, incorporating such data into the treatment planning model in a clear, efficient way will be crucial for automated planning. A threshold-focused objective tuning should be explored over conventional methods of updating preference weights for DVH-guided treatment planning.

  12. Characteristics of Omega-Optimized Portfolios at Different Levels of Threshold Returns

    Directory of Open Access Journals (Sweden)

    Renaldas Vilkancas

    2014-12-01

    Full Text Available There is little literature considering the effects that the loss-gain threshold used for dividing good and bad outcomes by all downside (upside) risk measures has on portfolio optimization and performance. The purpose of this study is to assess the performance of portfolios optimized with respect to the Omega function developed by Keating and Shadwick at different levels of the threshold return. The most common choices of the threshold values used in various Omega studies cover the risk-free rate and the average market return, or simply a zero return, even though the inventors of this risk measure warn that “using the values of the Omega function at particular points can be critically misleading” and that “only the entire Omega function contains information on distribution”. The obtained results demonstrate the importance of the selected threshold return for portfolio performance: higher levels of the threshold lead to an increase in portfolio returns, albeit at the expense of higher risk. In fact, within a certain threshold interval, Omega-optimized portfolios achieved the highest net return compared with all other portfolio optimization strategies on three different test datasets. However, beyond a certain limit, high threshold values actually start hurting portfolio performance; meta-heuristic optimizers will typically still produce a solution at any threshold level, but the results are then most likely financially meaningless.
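
    The Omega function of Keating and Shadwick evaluated at a threshold r is the ratio of probability-weighted gains above r to probability-weighted losses below r. A small sketch of the empirical Omega ratio of a return series at different thresholds follows; the simulated return series are illustrative assumptions, and the portfolio optimizer itself is not shown.

```python
import numpy as np

def omega_ratio(returns, threshold):
    """Empirical Omega(threshold) = E[(R - threshold)^+] / E[(threshold - R)^+]."""
    r = np.asarray(returns, dtype=float)
    gains = np.clip(r - threshold, 0.0, None).mean()
    losses = np.clip(threshold - r, 0.0, None).mean()
    return np.inf if losses == 0 else gains / losses

# Usage: the ranking of two assets can flip as the threshold return changes.
rng = np.random.default_rng(1)
a = rng.normal(0.006, 0.02, 1000)    # lower mean, lower volatility
b = rng.normal(0.010, 0.06, 1000)    # higher mean, higher volatility
for thr in (0.0, 0.005, 0.02):
    print(thr, round(omega_ratio(a, thr), 3), round(omega_ratio(b, thr), 3))
```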

  13. Ultrafuzziness Optimization Based on Type II Fuzzy Sets for Image Thresholding

    Directory of Open Access Journals (Sweden)

    Hudan Studiawan

    2010-11-01

    Full Text Available Image thresholding is one of the processing techniques used to provide a high-quality preprocessed image. Image vagueness and bad illumination are common obstacles that yield poor thresholding output. By treating the image as fuzzy sets, several different fuzzy thresholding techniques have been proposed to remove these obstacles during threshold selection. In this paper, we propose an image thresholding algorithm that uses ultrafuzziness optimization with type II fuzzy sets to decrease the uncertainty present in ordinary fuzzy sets. Optimization is conducted by measuring ultrafuzziness for the background and object fuzzy sets separately. Experimental results demonstrate that the proposed image thresholding method performs well for images with high vagueness, low contrast, and grayscale ambiguity.

  14. Estimating the optimal growth-maximising public debt threshold for ...

    African Journals Online (AJOL)

    This paper attempts to estimate an optimal growth-maximising public debt threshold for Zimbabwe. The public debt threshold is estimated by assessing the relationship between public debt and economic growth. The analysis is undertaken to determine the tipping point beyond which increases in public debt adversely affect ...

  15. Social acceptance of comparative optimism and realism.

    Science.gov (United States)

    Milhabet, I; Verlhiac, J F

    2011-10-01

    Studies of optimism and realism (the accuracy of people's outlook on the future) seek to understand the respective effects of these elements on social approbation. Two experiments examined how comparative optimism (vs. pessimism) and realism (vs. unrealism) interacted to influence the targets' social acceptance based on their perceptions about the future. The results showed that realism, or accuracy of prediction, increased the positive social effects of a comparatively optimistic outlook on the future. In contrast, targets who exhibited comparative pessimism were more socially acceptable when their predictions were unrealistic rather than realistic. This phenomenon was examined by also considering the polarity of the events about which judgments were expressed. These results contribute to the body of research about the relationship between optimism and pessimism and the relationship between optimism and realism.

  16. Hard decoding algorithm for optimizing thresholds under general Markovian noise

    Science.gov (United States)

    Chamberland, Christopher; Wallman, Joel; Beale, Stefanie; Laflamme, Raymond

    2017-04-01

    Quantum error correction is instrumental in protecting quantum systems from noise in quantum computing and communication settings. Pauli channels can be efficiently simulated and threshold values for Pauli error rates under a variety of error-correcting codes have been obtained. However, realistic quantum systems can undergo noise processes that differ significantly from Pauli noise. In this paper, we present an efficient hard decoding algorithm for optimizing thresholds and lowering failure rates of an error-correcting code under general completely positive and trace-preserving (i.e., Markovian) noise. We use our hard decoding algorithm to study the performance of several error-correcting codes under various non-Pauli noise models by computing threshold values and failure rates for these codes. We compare the performance of our hard decoding algorithm to decoders optimized for depolarizing noise and show improvements in thresholds and reductions in failure rates by several orders of magnitude. Our hard decoding algorithm can also be adapted to take advantage of a code's non-Pauli transversal gates to further suppress noise. For example, we show that using the transversal gates of the 5-qubit code allows arbitrary rotations around certain axes to be perfectly corrected. Furthermore, we show that Pauli twirling can increase or decrease the threshold depending upon the code properties. Lastly, we show that even if the physical noise model differs slightly from the hypothesized noise model used to determine an optimized decoder, failure rates can still be reduced by applying our hard decoding algorithm.

  17. Image Segmentation using a Refined Comprehensive Learning Particle Swarm Optimizer for Maximum Tsallis Entropy Thresholding

    OpenAIRE

    L. Jubair Ahmed; A. Ebenezer Jeyakumar

    2013-01-01

    Thresholding is one of the most important techniques for performing image segmentation. In this paper to compute optimum thresholds for Maximum Tsallis entropy thresholding (MTET) model, a new hybrid algorithm is proposed by integrating the Comprehensive Learning Particle Swarm Optimizer (CPSO) with the Powell’s Conjugate Gradient (PCG) method. Here the CPSO will act as the main optimizer for searching the near-optimal thresholds while the PCG method will be used to fine tune the best solutio...

  18. OPTIMAL PRE-MERGER NOTIFICATION THRESHOLDS: A CONTRIBUTION TO THE ITALIAN DEBATE

    Directory of Open Access Journals (Sweden)

    Paolo Buccirossi

    2014-12-01

    Full Text Available This paper outlines a theoretical framework to define the optimal notification thresholds so as to minimize the sum of Type I and Type II error costs. Results suggest that, when the notification rule takes into account in a cumulative way both the aggregate turnover of the merging parties and their individual turnover, the optimal values of these turnovers are interdependent. The model is then applied to the Italian case. The value of the threshold for the aggregate turnover has been obtained by benchmarking the rules set in the EU Member States through a simple econometric exercise. The value of the threshold for the individual turnover is then calculated applying the theoretical framework and the estimated costs of Type I and Type II errors.

  19. What Is the Optimal Threshold at Which to Recommend Breast Biopsy?

    OpenAIRE

    Burnside, Elizabeth S.; Chhatwal, Jagpreet; Alagoz, Oguzhan

    2012-01-01

    Background A 2% threshold, traditionally used as a level above which breast biopsy is recommended, has been generalized to all patients from several specific situations analyzed in the literature. We use a sequential decision analytic model considering clinical and mammography features to determine the optimal general threshold for image guided breast biopsy and the sensitivity of this threshold to variation of these features. Methodology/Principal Findings We built a decision analytical model c...

  20. Fuzzy 2-partition entropy threshold selection based on Big Bang–Big Crunch Optimization algorithm

    Directory of Open Access Journals (Sweden)

    Baljit Singh Khehra

    2015-03-01

    Full Text Available The fuzzy 2-partition entropy approach has been widely used to select threshold values for image segmentation. This approach uses two parameterized fuzzy membership functions to form a fuzzy 2-partition of the image. The optimal threshold is selected by searching for an optimal combination of parameters of the membership functions such that the entropy of the fuzzy 2-partition is maximized. In this paper, a new fuzzy 2-partition entropy thresholding approach based on Big Bang–Big Crunch Optimization (BBBCO) is proposed. The new thresholding approach is called the BBBCO-based fuzzy 2-partition entropy thresholding algorithm. BBBCO is used to search for an optimal combination of parameters of the membership functions that maximizes the entropy of the fuzzy 2-partition. BBBCO is inspired by the theory of the evolution of the universe, namely the Big Bang and Big Crunch Theory. The proposed algorithm is tested on a number of standard test images. For comparison, three other approaches are also implemented: a Genetic Algorithm (GA)-based approach, a Biogeography-Based Optimization (BBO)-based approach, and a recursive approach. The experimental results show that the proposed algorithm is more effective than the GA-based, BBO-based and recursive approaches.

  1. Otsu Based Optimal Multilevel Image Thresholding Using Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    N. Sri Madhava Raja

    2014-01-01

    Full Text Available A histogram-based multilevel thresholding approach is proposed using a Brownian distribution (BD) guided firefly algorithm (FA). A bounded search technique is also presented to improve the optimization accuracy with fewer search iterations. Otsu’s between-class variance function is maximized to obtain optimal threshold levels for grayscale images. The performance of the proposed algorithm is demonstrated on twelve benchmark images and compared with existing FA variants such as the Lévy flight (LF) guided FA and the random operator guided FA. The performance assessment between the proposed and existing firefly algorithms is carried out using prevailing parameters such as the objective function, standard deviation, peak signal-to-noise ratio (PSNR), structural similarity (SSIM) index, and CPU search time. The results show that the BD guided FA provides better objective function values, PSNR, and SSIM, whereas the LF based FA provides faster convergence with relatively lower CPU time.

  2. Optimal threshold estimation for binary classifiers using game theory.

    Science.gov (United States)

    Sanchez, Ignacio Enrique

    2016-01-01

    Many bioinformatics algorithms can be understood as binary classifiers. They are usually compared using the area under the receiver operating characteristic (ROC) curve. On the other hand, choosing the best threshold for practical use is a complex task, due to uncertain and context-dependent skews in the abundance of positives in nature and in the yields/costs for correct/incorrect classification. We argue that considering a classifier as a player in a zero-sum game allows us to use the minimax principle from game theory to determine the optimal operating point. The proposed classifier threshold corresponds to the intersection between the ROC curve and the descending diagonal in ROC space and yields a minimax accuracy of 1-FPR. Our proposal can be readily implemented in practice, and reveals that the empirical condition for threshold estimation of "specificity equals sensitivity" maximizes robustness against uncertainties in the abundance of positives in nature and classification costs.
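
    The proposed operating point is where the ROC curve crosses the descending diagonal TPR = 1 − FPR, i.e. where sensitivity equals specificity. A small sketch of how that threshold could be located from scores and labels is given below; the helper function is hypothetical, not the authors' code, and it assumes both classes are present in the data.

```python
import numpy as np

def minimax_threshold(scores, labels):
    """Return the score threshold at which sensitivity is closest to specificity."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    candidates = np.unique(scores)
    best_t, best_gap = candidates[0], np.inf
    for t in candidates:
        pred = scores >= t
        tpr = (pred & labels).sum() / labels.sum()         # sensitivity
        tnr = (~pred & ~labels).sum() / (~labels).sum()    # specificity
        if abs(tpr - tnr) < best_gap:
            best_t, best_gap = t, abs(tpr - tnr)
    return best_t
```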

  3. Optimal Policies for Random and Periodic Garbage Collections with Tenuring Threshold

    Science.gov (United States)

    Zhao, Xufeng; Nakamura, Syouji; Nakagawa, Toshio

    It is an important problem to determine the tenuring threshold that meets the pause time goal of a generational garbage collector. From this viewpoint, this paper proposes two stochastic models based on the working schemes of a generational garbage collector: one is random collection, which occurs according to a nonhomogeneous Poisson process, and the other is periodic collection, which occurs at fixed times. Since the cost incurred by minor collection increases as the amount of surviving objects accumulates, a tenuring minor collection should be made at some tenuring threshold. Using the techniques of cumulative processes and reliability theory, expected cost rates with a tenuring threshold are obtained, and optimal policies that minimize them are discussed analytically and computed numerically.

  4. A Framework for Optimizing Phytosanitary Thresholds in Seed Systems.

    Science.gov (United States)

    Choudhury, Robin Alan; Garrett, Karen A; Klosterman, Steven J; Subbarao, Krishna V; McRoberts, Neil

    2017-10-01

    Seedborne pathogens and pests limit production in many agricultural systems. Quarantine programs help prevent the introduction of exotic pathogens into a country, but few regulations directly apply to reducing the reintroduction and spread of endemic pathogens. Use of phytosanitary thresholds helps limit the movement of pathogen inoculum through seed, but the costs associated with rejected seed lots can be prohibitive for voluntary implementation of phytosanitary thresholds. In this paper, we outline a framework to optimize thresholds for seedborne pathogens, balancing the cost of rejected seed lots and benefit of reduced inoculum levels. The method requires relatively small amounts of data, and the accuracy and robustness of the analysis improves over time as data accumulate from seed testing. We demonstrate the method first and illustrate it with a case study of seedborne oospores of Peronospora effusa, the causal agent of spinach downy mildew. A seed lot threshold of 0.23 oospores per seed could reduce the overall number of oospores entering the production system by 90% while removing 8% of seed lots destined for distribution. Alternative mitigation strategies may result in lower economic losses to seed producers, but have uncertain efficacy. We discuss future challenges and prospects for implementing this approach.
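
    The tradeoff described above, fraction of total inoculum removed versus fraction of seed lots rejected at a candidate threshold, can be tabulated directly from historical seed-test data. The sketch below is hypothetical: the lot data, field names and candidate thresholds are illustrative assumptions, not the paper's dataset or its optimization procedure.

```python
import numpy as np

def threshold_tradeoff(oospores_per_seed, lot_sizes, thresholds):
    """For each candidate threshold, report % of oospores removed and % of lots rejected."""
    counts = np.asarray(oospores_per_seed, dtype=float)   # per-lot contamination level
    sizes = np.asarray(lot_sizes, dtype=float)            # seeds per lot
    total_oospores = (counts * sizes).sum()
    rows = []
    for t in thresholds:
        rejected = counts > t
        removed = (counts[rejected] * sizes[rejected]).sum() / total_oospores
        rows.append((t, 100 * removed, 100 * rejected.mean()))
    return rows

# Example with made-up lot data.
levels = [0.01, 0.05, 0.1, 0.2, 0.3, 0.8, 1.5]
sizes = [1e6] * 7
for t, removed, rejected in threshold_tradeoff(levels, sizes, [0.1, 0.23, 0.5]):
    print(f"threshold={t}: removes {removed:.0f}% of oospores, rejects {rejected:.0f}% of lots")
```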

  5. Robust optimization of the laser induced damage threshold of dielectric mirrors for high power lasers.

    Science.gov (United States)

    Chorel, Marine; Lanternier, Thomas; Lavastre, Éric; Bonod, Nicolas; Bousquet, Bruno; Néauport, Jérôme

    2018-04-30

    We report on a numerical optimization of the laser induced damage threshold of multi-dielectric high reflection mirrors in the sub-picosecond regime. We highlight the interplay between the electric field distribution, refractive index and intrinsic laser induced damage threshold of the materials on the overall laser induced damage threshold (LIDT) of the multilayer. We describe an optimization method of the multilayer that minimizes the field enhancement in high refractive index materials while preserving a near perfect reflectivity. This method yields a significant improvement of the damage resistance since a maximum increase of 40% can be achieved on the overall LIDT of the multilayer.

  6. Limitations of acceptability curves for presenting uncertainty in cost-effectiveness analysis

    NARCIS (Netherlands)

    Groot Koerkamp, Bas; Hunink, M. G. Myriam; Stijnen, Theo; Hammitt, James K.; Kuntz, Karen M.; Weinstein, Milton C.

    2007-01-01

    Clinical journals increasingly illustrate uncertainty about the cost and effect of health care interventions using cost-effectiveness acceptability curves (CEACs). CEACs present the probability that each competing alternative is optimal for a range of values of the cost-effectiveness threshold. The

  7. Mate choice when males are in patches: optimal strategies and good rules of thumb.

    Science.gov (United States)

    Hutchinson, John M C; Halupka, Konrad

    2004-11-07

    In standard mate-choice models, females encounter males sequentially and decide whether to inspect the quality of another male or to accept a male already inspected. What changes when males are clumped in patches and there is a significant cost to travel between patches? We use stochastic dynamic programming to derive optimum strategies under various assumptions. With zero costs to returning to a male in the current patch, the optimal strategy accepts males above a quality threshold which is constant whenever one or more males in the patch remain uninspected; this threshold drops when inspecting the last male in the patch, so returns may occur only then and are never to a male in a previously inspected patch. With non-zero within-patch return costs, such a two-threshold rule still performs extremely well, but a more gradual decline in acceptance threshold is optimal. Inability to return at all need not decrease performance by much. The acceptance threshold should also decline if it gets harder to discover the last males in a patch. Optimal strategies become more complex when mean male quality varies systematically between patches or years, and females estimate this in a Bayesian manner through inspecting male qualities. It can then be optimal to switch patch before inspecting all males on a patch, or, exceptionally, to return to an earlier patch. We compare performance of various rules of thumb in these environments and in ones without a patch structure. A two-threshold rule performs excellently, as do various simplifications of it. The best-of-N rule outperforms threshold rules only in non-patchy environments with between-year quality variation. The cutoff rule performs poorly.

  8. Intelligent Network Flow Optimization (INFLO) prototype acceptance test summary.

    Science.gov (United States)

    2015-05-01

    This report summarizes the results of System Acceptance Testing for the implementation of the Intelligent Network Flow Optimization (INFLO) Prototype bundle within the Dynamic Mobility Applications (DMA) portion of the Connected Vehicle Program. ...

  9. A Jackson network model and threshold policy for joint optimization of energy and delay in multi-hop wireless networks

    KAUST Repository

    Xia, Li; Shihada, Basem

    2014-01-01

    This paper studies the joint optimization problem of energy and delay in a multi-hop wireless network. The optimization variables are the transmission rates, which are adjustable according to the packet queueing length in the buffer. The optimization goal is to minimize the energy consumption of energy-critical nodes and the packet transmission delay throughout the network. In this paper, we aim at understanding the well-known decentralized algorithms which are threshold based from a different research angle. By using a simplified network model, we show that we can adopt the semi-open Jackson network model and study this optimization problem in closed form. This simplified network model further allows us to establish some significant optimality properties. We prove that the system performance is monotonic with respect to (w.r.t.) the transmission rate. We also prove that the threshold-type policy is optimal, i.e., when the number of packets in the buffer is larger than a threshold, transmit with the maximal rate (power); otherwise, no transmission. With these optimality properties, we develop a heuristic algorithm to iteratively find the optimal threshold. Finally, we conduct some simulation experiments to demonstrate the main idea of this paper.

  10. A Jackson network model and threshold policy for joint optimization of energy and delay in multi-hop wireless networks

    KAUST Repository

    Xia, Li

    2014-11-20

    This paper studies the joint optimization problem of energy and delay in a multi-hop wireless network. The optimization variables are the transmission rates, which are adjustable according to the packet queueing length in the buffer. The optimization goal is to minimize the energy consumption of energy-critical nodes and the packet transmission delay throughout the network. In this paper, we aim at understanding the well-known decentralized algorithms which are threshold based from a different research angle. By using a simplified network model, we show that we can adopt the semi-open Jackson network model and study this optimization problem in closed form. This simplified network model further allows us to establish some significant optimality properties. We prove that the system performance is monotonic with respect to (w.r.t.) the transmission rate. We also prove that the threshold-type policy is optimal, i.e., when the number of packets in the buffer is larger than a threshold, transmit with the maximal rate (power); otherwise, no transmission. With these optimality properties, we develop a heuristic algorithm to iteratively find the optimal threshold. Finally, we conduct some simulation experiments to demonstrate the main idea of this paper.
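
    The threshold-type policy that the two records above prove optimal, transmit at the maximal rate when the buffer exceeds a threshold and otherwise stay silent, can be explored with a toy discrete-time queue simulation. The arrival model, per-slot transmission cost and delay weight below are illustrative assumptions, not the authors' Jackson-network formulation.

```python
import random

def simulate(threshold, horizon=200_000, arrival_p=0.4, max_service=3,
             slot_tx_cost=1.0, delay_weight=0.1, seed=0):
    """Average cost of a threshold policy: pay slot_tx_cost whenever we transmit
    (at the maximal rate), plus delay_weight per packet waiting in the buffer."""
    rng = random.Random(seed)
    queue, total_cost = 0, 0.0
    for _ in range(horizon):
        queue += rng.random() < arrival_p                  # Bernoulli arrival
        if queue > threshold:                              # threshold-type policy
            served = min(max_service, queue)
            queue -= served
            total_cost += slot_tx_cost                     # energy spent this slot
        total_cost += delay_weight * queue                 # holding/delay cost
    return total_cost / horizon

# Sweep candidate thresholds to find the cheapest one for these toy parameters.
costs = {t: round(simulate(t), 4) for t in range(0, 8)}
print(min(costs, key=costs.get), costs)
```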

  11. Air Traffic Controller Acceptability of Unmanned Aircraft System Detect-and-Avoid Thresholds

    Science.gov (United States)

    Mueller, Eric R.; Isaacson, Douglas R.; Stevens, Derek

    2016-01-01

    A human-in-the-loop experiment was conducted with 15 retired air traffic controllers to investigate two research questions: (a) what procedures are appropriate for the use of unmanned aircraft system (UAS) detect-and-avoid systems, and (b) how long in advance of a predicted close encounter should pilots request or execute a separation maneuver. The controller participants managed a busy Oakland air route traffic control sector with mixed commercial/general aviation and manned/UAS traffic, providing separation services, miles-in-trail restrictions and issuing traffic advisories. Controllers filled out post-scenario and post-simulation questionnaires, and metrics were collected on the acceptability of procedural options and temporal thresholds. The states of aircraft were also recorded when controllers issued traffic advisories. Subjective feedback indicated a strong preference for pilots to request maneuvers to remain well clear from intruder aircraft rather than deviate from their IFR clearance. Controllers also reported that maneuvering at 120 seconds until closest point of approach (CPA) was too early; maneuvers executed with less than 90 seconds until CPA were more acceptable. The magnitudes of the requested maneuvers were frequently judged to be too large, indicating a possible discrepancy between the quantitative UAS well clear standard and the one employed subjectively by manned pilots. The ranges between pairs of aircraft and the times to CPA at which traffic advisories were issued were used to construct empirical probability distributions of those metrics. Given these distributions, we propose that UAS pilots wait until an intruder aircraft is approximately 80 seconds to CPA or 6 nmi away before requesting a maneuver, and maneuver immediately if the intruder is within 60 seconds and 4 nmi. These thresholds should make the use of UAS detect and avoid systems compatible with current airspace procedures and controller expectations.

  12. Gain optimization in fiber optical parametric amplifiers by combining standard and high-SBS threshold highly nonlinear fibers

    DEFF Research Database (Denmark)

    Da Ros, Francesco; Rottwitt, Karsten; Peucheret, Christophe

    2012-01-01

    Combining Al-doped and Ge-doped HNLFs as gain media in FOPAs is proposed and optimized, resulting in efficient SBS mitigation while circumventing the additional loss of the high SBS threshold Al-doped fiber.

  13. Energy-Specific Optimization of Attenuation Thresholds for Low-Energy Virtual Monoenergetic Images in Renal Lesion Evaluation.

    Science.gov (United States)

    Patel, Bhavik N; Farjat, Alfredo; Schabel, Christoph; Duvnjak, Petar; Mileto, Achille; Ramirez-Giraldo, Juan Carlos; Marin, Daniele

    2018-05-01

    The purpose of this study was to determine in vitro and in vivo the optimal threshold for renal lesion vascularity at low-energy (40-60 keV) virtual monoenergetic imaging. A rod simulating unenhanced renal parenchymal attenuation (35 HU) was fitted with a syringe containing water. Three iodinated solutions (0.38, 0.57, and 0.76 mg I/mL) were inserted into another rod that simulated enhanced renal parenchyma (180 HU). Rods were inserted into cylindric phantoms of three different body sizes and scanned with single- and dual-energy MDCT. In addition, 102 patients (32 men, 70 women; mean age, 66.8 ± 12.9 [SD] years) with 112 renal lesions (67 nonvascular, 45 vascular) measuring 1.1-8.9 cm underwent single-energy unenhanced and contrast-enhanced dual-energy CT. Optimal threshold attenuation values that differentiated vascular from nonvascular lesions at 40-60 keV were determined. Mean optimal threshold values were 30.2 ± 3.6 (standard error), 20.9 ± 1.3, and 16.1 ± 1.0 HU in the phantom, and 35.9 ± 3.6, 25.4 ± 1.8, and 17.8 ± 1.8 HU in the patients at 40, 50, and 60 keV. Sensitivity and specificity for the thresholds did not change significantly between low-energy and 70-keV virtual monoenergetic imaging (sensitivity, 87-98%; specificity, 90-91%). The AUC from 40 to 70 keV was 0.96 (95% CI, 0.93-0.99) to 0.98 (95% CI, 0.95-1.00). Low-energy virtual monoenergetic imaging at energy-specific optimized attenuation thresholds can be used for reliable characterization of renal lesions.

  14. Optimal Investment Under Transaction Costs: A Threshold Rebalanced Portfolio Approach

    Science.gov (United States)

    Tunc, Sait; Donmez, Mehmet Ali; Kozat, Suleyman Serdar

    2013-06-01

    We study optimal investment in a financial market having a finite number of assets from a signal processing perspective. We investigate how an investor should distribute capital over these assets and when he should reallocate the distribution of the funds over these assets to maximize the cumulative wealth over any investment period. In particular, we introduce a portfolio selection algorithm that maximizes the expected cumulative wealth in i.i.d. two-asset discrete-time markets where the market levies proportional transaction costs in buying and selling stocks. We achieve this using "threshold rebalanced portfolios", where trading occurs only if the portfolio breaches certain thresholds. Under the assumption that the relative price sequences have log-normal distribution from the Black-Scholes model, we evaluate the expected wealth under proportional transaction costs and find the threshold rebalanced portfolio that achieves the maximal expected cumulative wealth over any investment period. Our derivations can be readily extended to markets having more than two stocks, where these extensions are pointed out in the paper. As predicted from our derivations, we significantly improve the achieved wealth over portfolio selection algorithms from the literature on historical data sets.
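
    A threshold rebalanced portfolio trades only when the realized weight of an asset drifts outside a band around its target, which caps turnover and hence transaction costs. The toy two-asset backtest below is a sketch under illustrative assumptions (price model, cost rate and band are not the paper's values); the log-normal relative prices echo the Black-Scholes setting cited in the abstract.

```python
import numpy as np

def threshold_rebalance(prices, target=0.5, band=0.05, cost_rate=0.001, wealth0=1.0):
    """Grow wealth over a two-asset price path, rebalancing to `target` weight in
    asset 0 only when its weight leaves [target - band, target + band].
    `prices` is a (T, 2) array of relative price ratios per period."""
    wealth = wealth0
    w = np.array([target, 1.0 - target])           # current portfolio weights
    for x in np.asarray(prices, dtype=float):
        growth = w @ x                              # period growth factor
        wealth *= growth
        w = w * x / growth                          # weights drift with prices
        if abs(w[0] - target) > band:               # threshold breached: rebalance
            turnover = abs(w[0] - target)
            wealth *= 1.0 - cost_rate * turnover    # proportional transaction cost
            w = np.array([target, 1.0 - target])
    return wealth

# Usage with synthetic log-normal relative prices.
rng = np.random.default_rng(0)
rel = np.exp(rng.normal([0.0004, 0.0002], [0.02, 0.01], size=(2500, 2)))
print(threshold_rebalance(rel))
```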

  15. Adaptive Wavelet Threshold Denoising Method for Machinery Sound Based on Improved Fruit Fly Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Jing Xu

    2016-07-01

    Full Text Available As the sound signal of a machine contains abundant information and is easy to measure, acoustic-based monitoring or diagnosis systems exhibit obvious superiority, especially in some extreme conditions. However, the sound collected directly in an industrial field is always polluted. In order to eliminate noise components from machinery sound, a wavelet threshold denoising method optimized by an improved fruit fly optimization algorithm (WTD-IFOA) is proposed in this paper. The sound is first decomposed by the wavelet transform (WT) to obtain the coefficients of each level. As the wavelet threshold functions proposed by Donoho are discontinuous, many modified functions with continuous first- and second-order derivatives have been presented to realize adaptive denoising. However, the function-based denoising process is time-consuming and it is difficult to find optimal thresholds. To overcome these problems, the fruit fly optimization algorithm (FOA) was introduced to the process. Moreover, to avoid falling into local extremes, an improved fly distance range obeying a normal distribution was proposed on the basis of the original FOA. Then, the sound signal of a motor was recorded in a soundproof laboratory, and Gaussian white noise was added to the signal. The simulation results illustrated the effectiveness and superiority of the proposed approach through a comprehensive comparison among five typical methods. Finally, an industrial application on a shearer in a coal mining working face was performed to demonstrate the practical effect.
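
    The core operation being tuned above, thresholding of wavelet detail coefficients, is available in PyWavelets. The sketch below substitutes Donoho's universal threshold for the FOA-optimized thresholds of the paper, so it illustrates the denoising step rather than the proposed method; the test signal is an illustrative assumption.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Soft-threshold detail coefficients with the universal threshold
    sigma * sqrt(2 log N), with sigma estimated from the finest-scale coefficients."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # robust noise estimate
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
    denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(signal)]

# Usage: recover a tone buried in Gaussian white noise.
t = np.linspace(0, 1, 4096)
clean = np.sin(2 * np.pi * 12 * t)
noisy = clean + 0.4 * np.random.randn(t.size)
print(np.std(noisy - clean), np.std(wavelet_denoise(noisy) - clean))
```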

  16. Optimal post-warranty maintenance policy with repair time threshold for minimal repair

    International Nuclear Information System (INIS)

    Park, Minjae; Mun Jung, Ki; Park, Dong Ho

    2013-01-01

    In this paper, we consider a renewable minimal repair–replacement warranty policy and propose an optimal maintenance model after the warranty is expired. Such model adopts the repair time threshold during the warranty period and follows with a certain type of system maintenance policy during the post-warranty period. As for the criteria for optimality, we utilize the expected cost rate per unit time during the life cycle of the system, which has been frequently used in many existing maintenance models. Based on the cost structure defined for each failure of the system, we formulate the expected cost rate during the life cycle of the system, assuming that a renewable minimal repair–replacement warranty policy with the repair time threshold is provided to the user during the warranty period. Once the warranty is expired, the maintenance of the system is the user's sole responsibility. The life cycle of the system is defined on the perspective of the user and the expected cost rate per unit time is derived in this context. We obtain the optimal maintenance policy during the maintenance period following the expiration of the warranty period by minimizing such a cost rate. Numerical examples using actual failure data are presented to exemplify the applicability of the methodologies proposed in this paper.

  17. Graph Analysis and Modularity of Brain Functional Connectivity Networks: Searching for the Optimal Threshold

    Directory of Open Access Journals (Sweden)

    Cécile Bordier

    2017-08-01

    Full Text Available Neuroimaging data can be represented as networks of nodes and edges that capture the topological organization of the brain connectivity. Graph theory provides a general and powerful framework to study these networks and their structure at various scales. By way of example, community detection methods have been widely applied to investigate the modular structure of many natural networks, including brain functional connectivity networks. Sparsification procedures are often applied to remove the weakest edges, which are the most affected by experimental noise, and to reduce the density of the graph, thus making it theoretically and computationally more tractable. However, weak links may also contain significant structural information, and procedures to identify the optimal tradeoff are the subject of active research. Here, we explore the use of percolation analysis, a method grounded in statistical physics, to identify the optimal sparsification threshold for community detection in brain connectivity networks. By using synthetic networks endowed with a ground-truth modular structure and realistic topological features typical of human brain functional connectivity networks, we show that percolation analysis can be applied to identify the optimal sparsification threshold that maximizes information on the networks' community structure. We validate this approach using three different community detection methods widely applied to the analysis of brain connectivity networks: Newman's modularity, InfoMap and Asymptotical Surprise. Importantly, we test the effects of noise and data variability, which are critical factors to determine the optimal threshold. This data-driven method should prove particularly useful in the analysis of the community structure of brain networks in populations characterized by different connectivity strengths, such as patients and controls.
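
    Percolation analysis, as used above, looks for the highest sparsification threshold at which the network still holds together as a giant component. The NetworkX sketch below uses a simplified criterion (the thresholded graph must remain fully connected) and a random symmetric matrix as stand-in data; it is an illustration of the idea, not the authors' pipeline.

```python
import numpy as np
import networkx as nx

def percolation_threshold(weights, candidates=None):
    """Return the largest edge-weight threshold at which the thresholded graph
    is still connected (a simplified stand-in for giant-component analysis)."""
    w = np.asarray(weights, dtype=float)
    n = w.shape[0]
    if candidates is None:
        candidates = np.unique(w[np.triu_indices(n, k=1)])[::-1]   # descending weights
    for t in candidates:
        adj = ((w >= t) & ~np.eye(n, dtype=bool)).astype(int)
        g = nx.from_numpy_array(adj)
        if nx.is_connected(g):
            return t            # first (largest) threshold that keeps the graph connected
    return None

# Usage on a random symmetric "connectivity" matrix.
rng = np.random.default_rng(2)
a = rng.random((30, 30))
a = (a + a.T) / 2
print(percolation_threshold(a))
```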

  18. Multilevel Thresholding Segmentation Based on Harmony Search Optimization

    Directory of Open Access Journals (Sweden)

    Diego Oliva

    2013-01-01

    Full Text Available In this paper, a multilevel thresholding (MT) algorithm based on the harmony search algorithm (HSA) is introduced. HSA is an evolutionary method inspired by musicians improvising new harmonies while playing. Unlike other evolutionary algorithms, HSA exhibits interesting search capabilities while keeping a low computational overhead. The proposed algorithm encodes random samples from a feasible search space inside the image histogram as candidate solutions, and their quality is evaluated using the objective functions employed by Otsu’s or Kapur’s methods. Guided by these objective values, the set of candidate solutions is evolved through the HSA operators until an optimal solution is found. Experimental results demonstrate the high performance of the proposed method for the segmentation of digital images.

  19. Optimization Problems on Threshold Graphs

    Directory of Open Access Journals (Sweden)

    Elena Nechita

    2010-06-01

    Full Text Available During the last three decades, different types of decompositions have been studied in the field of graph theory. Among these we mention: decompositions based on the additivity of some characteristics of the graph, decompositions where the adjacency law between the subsets of the partition is known, decompositions where the subgraph induced by every subset of the partition must have predetermined properties, as well as combinations of such decompositions. In this paper we characterize threshold graphs using the weakly decomposition, and determine the density, stability number, Wiener index and Wiener polynomial for threshold graphs.

  20. An n-material thresholding method for improving integerness of solutions in topology optimization

    International Nuclear Information System (INIS)

    Watts, Seth; Tortorelli, Daniel A.

    2016-01-01

    It is common in solving topology optimization problems to replace an integer-valued characteristic function design field with the material volume fraction field, a real-valued approximation of the design field that permits "fictitious" mixtures of materials during intermediate iterations in the optimization process. This is reasonable so long as one can interpolate properties for such materials and so long as the final design is integer valued. For this purpose, we present a method for smoothly thresholding the volume fractions of an arbitrary number of material phases which specify the design. This method is trivial for two-material design problems, for example, the canonical topology design problem of specifying the presence or absence of a single material within a domain, but it becomes more complex when three or more materials are used, as often occurs in material design problems. We take advantage of the similarity in properties between the volume fractions and the barycentric coordinates on a simplex to derive a thresholding method which is applicable to an arbitrary number of materials. As we show in a sensitivity analysis, this method has smooth derivatives, allowing it to be used in gradient-based optimization algorithms. Finally, we present results, which show synergistic effects when used with Solid Isotropic Material with Penalty and Rational Approximation of Material Properties material interpolation functions, popular methods of ensuring integerness of solutions.

  1. Effect of threshold quantization in opportunistic splitting algorithm

    KAUST Repository

    Nam, Haewoon

    2011-12-01

    This paper discusses algorithms to find the optimal threshold and also investigates the impact of threshold quantization on the scheduling outage performance of the opportunistic splitting scheduling algorithm. Since this algorithm aims at finding the user with the highest channel quality within the minimal number of mini-slots by adjusting the threshold every mini-slot, optimizing the threshold is of paramount importance. Hence, in this paper we first discuss how to compute the optimal threshold along with two tight approximations for the optimal threshold. Closed-form expressions are provided for those approximations for simple calculations. Then, we consider linear quantization of the threshold to take the limited number of bits for signaling messages in practical systems into consideration. Due to the limited granularity for the quantized threshold value, an irreducible scheduling outage floor is observed. The numerical results show that the two approximations offer lower scheduling outage probability floors compared to the conventional algorithm when the threshold is quantized. © 2006 IEEE.

  2. Joint optimization of maintenance, buffers and machines in manufacturing lines

    Science.gov (United States)

    Nahas, Nabil; Nourelfath, Mustapha

    2018-01-01

    This article considers a series manufacturing line composed of several machines separated by intermediate buffers of finite capacity. The goal is to find the optimal number of preventive maintenance actions performed on each machine, the optimal selection of machines and the optimal buffer allocation plan that minimize the total system cost, while providing the desired system throughput level. The mean times between failures of all machines are assumed to increase when applying periodic preventive maintenance. To estimate the production line throughput, a decomposition method is used. The decision variables in the formulated optimal design problem are buffer levels, types of machines and times between preventive maintenance actions. Three heuristic approaches are developed to solve the formulated combinatorial optimization problem. The first heuristic consists of a genetic algorithm, the second is based on the nonlinear threshold accepting metaheuristic and the third is an ant colony system. The proposed heuristics are compared and their efficiency is shown through several numerical examples. It is found that the nonlinear threshold accepting algorithm outperforms the genetic algorithm and ant colony system, while the genetic algorithm provides better results than the ant colony system for longer manufacturing lines.

  3. ANALYSING ACCEPTANCE SAMPLING PLANS BY MARKOV CHAINS

    Directory of Open Access Journals (Sweden)

    Mohammad Mirabi

    2012-01-01

    Full Text Available

    In this research, a Markov analysis of acceptance sampling plans in a single stage and in two stages is proposed, based on the quality of the items inspected. In a stage of this policy, if the number of defective items in a sample of inspected items is more than the upper threshold, the batch is rejected. However, the batch is accepted if the number of defective items is less than the lower threshold. Nonetheless, when the number of defective items falls between the upper and lower thresholds, the decision-making process continues to inspect the items and collect further samples. The primary objective is to determine the optimal values of the upper and lower thresholds using a Markov process to minimise the total cost associated with a batch acceptance policy. A solution method is presented, along with a numerical demonstration of the application of the proposed methodology.

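    The accept/continue/reject logic above defines a double sampling plan, and its operating characteristic can be written down with binomial probabilities. The sketch below is a generic illustration: the sample sizes, thresholds and the second-stage acceptance rule are illustrative assumptions, not the paper's Markov-chain formulation or its cost model.

```python
from scipy.stats import binom

def acceptance_probability(p, n1, n2, lower, upper):
    """P(accept batch | defect rate p) for a two-stage plan:
    stage 1: accept if defects < lower, reject if defects > upper, else take stage 2;
    stage 2: accept if cumulative defects over both samples stay <= upper."""
    prob = binom.cdf(lower - 1, n1, p)                        # accepted outright
    for d1 in range(lower, upper + 1):                        # continue region
        prob += binom.pmf(d1, n1, p) * binom.cdf(upper - d1, n2, p)
    return prob

# Operating characteristic at a few incoming defect rates.
for p in (0.01, 0.03, 0.05, 0.10):
    print(p, round(acceptance_probability(p, n1=50, n2=50, lower=1, upper=3), 3))
```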

  4. Time and Covariance Threshold Triggered Optimal Uncooperative Rendezvous Using Angles-Only Navigation

    Directory of Open Access Journals (Sweden)

    Yue You

    2017-01-01

    Full Text Available A time and covariance threshold triggered optimal maneuver planning method is proposed for orbital rendezvous using angles-only navigation (AON). In the context of the Yamanaka-Ankersen orbital relative motion equations, a square root unscented Kalman filter (SRUKF) AON algorithm is developed to compute the relative state estimates from the observations of a low-volume/mass, power-saving, and low-cost optical/infrared camera. A multi-impulsive Hill guidance law is employed in a closed-loop linear covariance analysis model, based on which quantitative relative position robustness and relative velocity robustness indices are defined. By balancing fuel consumption, relative position robustness, and relative velocity robustness, we developed a time and covariance threshold triggered two-level optimal maneuver planning method, showing how these results correlate to past methods and missions and how they could potentially influence future ones. Numerical simulation proved that it is feasible to control the spacecraft, from an initial relative state with two-line-element (TLE) level uncertainty (34.6% of range), to a 100 m v-bar relative station-keeping point, where the trajectory dispersion reduces to 3.5% of range, under a 30% data gap per revolution on account of the eclipse. Compared with the traditional time triggered maneuver planning method, the final relative position accuracy is improved by one order of magnitude, and the relative trajectory robustness and collision probability are obviously improved and reduced, respectively.

  5. Optimizing Precipitation Thresholds for Best Correlation Between Dry Lightning and Wildfires

    Science.gov (United States)

    Vant-Hull, Brian; Thompson, Tollisha; Koshak, William

    2018-03-01

    This work examines how to adjust the definition of "dry lightning" in order to optimize the correlation between dry lightning flash count and the climatology of large (>400 km2) lightning-ignited wildfires over the contiguous United States (CONUS). The National Lightning Detection Network™ and National Centers for Environmental Prediction Stage IV radar-based, gauge-adjusted precipitation data are used to form climatic data sets. For a 13 year analysis period over CONUS, a correlation of 0.88 is found between annual totals of wildfires and dry lightning. This optimal correlation is found by defining dry lightning as follows: on a 0.1° hourly grid, a precipitation threshold of no more than 0.3 mm may accumulate during any hour over a period of 3-4 days preceding the flash. Regional optimized definitions vary. When annual totals are analyzed as done here, no clear advantage is found by weighting positive polarity cloud-to-ground (+CG) lightning differently than -CG lightning. The high variability of dry lightning relative to the precipitation and lightning from which it is derived suggests it would be an independent and useful climate indicator.
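
    Under the optimized definition above, a flash is "dry" if no grid-hour in the 3-4 days preceding it accumulated more than 0.3 mm at its location. The small sketch below classifies a flash that way; the array layout, the 96-hour window and the synthetic precipitation series are illustrative assumptions.

```python
import numpy as np

def is_dry_flash(precip_hourly, flash_hour, window_hours=96, threshold_mm=0.3):
    """precip_hourly: 1-D array of hourly precipitation (mm) at the flash's grid cell;
    flash_hour: index of the hour the flash occurred.
    A flash is "dry" if every hour in the preceding window stayed at or below the threshold."""
    start = max(0, flash_hour - window_hours)
    return bool(np.all(precip_hourly[start:flash_hour] <= threshold_mm))

# Usage on synthetic hourly precipitation.
rng = np.random.default_rng(3)
precip = rng.exponential(0.02, size=24 * 10)   # ten days of very light rain
precip[100] = 2.0                              # one wet hour
print(is_dry_flash(precip, flash_hour=150), is_dry_flash(precip, flash_hour=230))
```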

  6. Image quality, threshold contrast and mean glandular dose in CR mammography

    International Nuclear Information System (INIS)

    Jakubiak, R R; Gamba, H R; Neves, E B; Peixoto, J E

    2013-01-01

    In many countries, computed radiography (CR) systems represent the majority of equipment used in digital mammography. This study presents a method for optimizing image quality and dose in CR mammography of patients with breast thicknesses between 45 and 75 mm. Initially, clinical images of 67 patients (group 1) were analyzed by three experienced radiologists, who reported on anatomical structures, noise and contrast in low and high pixel value areas, and image sharpness and contrast. The exposure parameters (kV, mAs and target/filter combination) used in the examinations of these patients were reproduced to determine the contrast-to-noise ratio (CNR) and mean glandular dose (MGD). The parameters were also used to radiograph a CDMAM (version 3.4) phantom (Artinis Medical Systems, The Netherlands) for image threshold contrast evaluation. After that, different breast thicknesses were simulated with polymethylmethacrylate layers and various sets of exposure parameters were used in order to determine the optimal radiographic parameters. For each simulated breast thickness, the optimal beam quality was defined as that giving a target CNR reaching the threshold contrast of the CDMAM images at an acceptable MGD. These results were used for adjustments of the automatic exposure control (AEC) by the maintenance team. Using the optimized exposure parameters, clinical images of 63 patients (group 2) were evaluated as described above. Threshold contrast, CNR and MGD for these exposure parameters were also determined. Results showed that the proposed optimization method was effective for all breast thicknesses studied in phantoms. The best result was found for breasts of 75 mm. While in group 1 there was no detection of the 0.1 mm critical diameter detail with threshold contrast below 23%, after the optimization detection occurred in 47.6% of the images. There was also an average MGD reduction of 7.5%. The clinical image quality criteria were met in 91.7% for all breast thicknesses evaluated in both

  7. Considering a Threshold Energy in Reactive Transport Modeling of Microbially Mediated Redox Reactions in an Arsenic-Affected Aquifer

    Directory of Open Access Journals (Sweden)

    Marco Rotiroti

    2018-01-01

    Full Text Available The reductive dissolution of Fe-oxide driven by organic matter oxidation is the primary mechanism accepted for As mobilization in several alluvial aquifers. These processes are often mediated by microorganisms that require a minimum available Gibbs energy to conduct the reaction in order to sustain their life functions. Implementing this threshold energy in reactive transport modeling is rare in the existing literature. This work presents a 1D reactive transport model of As mobilization by the reductive dissolution of Fe-oxide and subsequent immobilization by co-precipitation in iron sulfides, considering a threshold energy for the following terminal electron accepting processes: (a) Fe-oxide reduction, (b) sulfate reduction, and (c) methanogenesis. The model is then extended by implementing a threshold energy on both reaction directions for the redox reaction pairs Fe(III) reduction/Fe(II) oxidation and methanogenesis/methane oxidation. The optimal threshold energies were fitted at 4.50, 3.76, and 1.60 kJ/mol e− for sulfate reduction, Fe(III) reduction/Fe(II) oxidation, and methanogenesis/methane oxidation, respectively. The use of models implementing a bidirectional threshold energy is needed when a redox reaction pair can be transported between domains with different redox potentials. This may often occur in 2D or 3D simulations.

  8. 'Outbreak Gold Standard' selection to provide optimized threshold for infectious diseases early-alert based on China Infectious Disease Automated-alert and Response System.

    Science.gov (United States)

    Wang, Rui-Ping; Jiang, Yong-Gen; Zhao, Gen-Ming; Guo, Xiao-Qin; Michael, Engelgau

    2017-12-01

    The China Infectious Disease Automated-alert and Response System (CIDARS) was successfully implemented and became operational nationwide in 2008. The CIDARS plays an important role in, and has been integrated into, the routine outbreak monitoring efforts of the Center for Disease Control (CDC) at all levels in China. In the CIDARS, thresholds were initially determined using the "Mean + 2SD" method, which has limitations. This study compared the performance of optimized thresholds defined using the "Mean + 2SD" method with the performance of 5 novel algorithms in order to select the optimal "Outbreak Gold Standard" (OGS) and corresponding thresholds for outbreak detection. Data for infectious diseases were organized by calendar week and year. The "Mean + 2SD", C1, C2, moving average (MA), seasonal model (SM), and cumulative sum (CUSUM) algorithms were applied. Outbreak signals for the predicted value (Px) were calculated using a percentile-based moving window. When the outbreak signals generated by an algorithm were in line with a Px-generated outbreak signal for each week, this Px was defined as the optimized threshold for that algorithm. In this study, six infectious diseases were selected and classified into TYPE A (chickenpox and mumps), TYPE B (influenza and rubella) and TYPE C [hand foot and mouth disease (HFMD) and scarlet fever]. Optimized thresholds for chickenpox (P55), mumps (P50), influenza (P40, P55, and P75), rubella (P45 and P75), HFMD (P65 and P70), and scarlet fever (P75 and P80) were identified. The C1, C2, CUSUM, SM, and MA algorithms were appropriate for TYPE A. All 6 algorithms were appropriate for TYPE B. The C1 and CUSUM algorithms were appropriate for TYPE C. It is critical to incorporate more flexible algorithms as the OGS into the CIDARS and to identify the proper OGS and corresponding recommended optimized threshold for different infectious disease types.
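    To make the percentile-based moving-window idea concrete, here is a minimal sketch that, for a given calendar week, pools counts from comparable weeks of previous years and returns a chosen percentile (e.g., P75) as the alert threshold. The window construction, years of history, and toy data are invented for illustration and are not the CIDARS implementation.

```python
# Hedged sketch of a percentile-based moving-window alert threshold.
import numpy as np

def percentile_threshold(weekly_counts, week_index, percentile=75,
                         half_window=2, years_back=3):
    """Collect counts from the same +/- half_window calendar weeks of previous
    years and return the requested percentile as this week's alert threshold."""
    history = []
    for y in range(1, years_back + 1):
        center = week_index - 52 * y
        lo, hi = center - half_window, center + half_window + 1
        if lo >= 0:
            history.extend(weekly_counts[lo:hi])
    return np.percentile(history, percentile) if history else np.inf

# Toy data: 4 years of weekly counts with a seasonal pattern plus noise.
rng = np.random.default_rng(1)
weeks = np.arange(52 * 4)
counts = 20 + 10 * np.sin(2 * np.pi * weeks / 52) + rng.poisson(3, weeks.size)
week = 52 * 3 + 10
t = percentile_threshold(counts, week_index=week)
print(f"threshold={t:.1f}, observed={counts[week]:.1f}, signal={counts[week] > t}")
```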

  9. Optimal threshold functions for fault detection and isolation

    DEFF Research Database (Denmark)

    Stoustrup, J.; Niemann, Hans Henrik; Cour-Harbo, A. la

    2003-01-01

    Fault diagnosis systems usually comprise two parts: a filtering part and a decision part, the latter typically based on threshold functions. In this paper, systematic ways to choose the threshold values are proposed. Two different test functions for the filtered signals are discussed and a method

  10. Optimal replacement policy of products with repair-cost threshold after the extended warranty

    Institute of Scientific and Technical Information of China (English)

    Lijun Shang; Zhiqiang Cai

    2017-01-01

    The reliability of the product sold under a warranty is usually maintained by the manufacturer during the warranty period. After the expiry of the warranty, however, the consumer confronts a problem about how to maintain the reliability of the product. This paper proposes, from the consumer's perspective, a replacement policy after the extended warranty, under the assumption that the product is sold under the renewable free replacement warranty (RFRW) policy in which the replacement is dependent on the repair-cost threshold. The proposed replacement policy is the replacement after the extended warranty is performed by the consumer based on the repair-cost threshold or preventive replacement (PR) age, which are decision variables. The expected cost rate model is derived from the consumer's perspective. The existence and uniqueness of the optimal solution that minimizes the expected cost rate per unit time are offered. Finally, a numerical example is presented to exemplify the proposed model.

  11. The optimal trackability threshold of fractional anisotropy for diffusion tensor tractography of the corticospinal tract

    International Nuclear Information System (INIS)

    Kunimatsu, Akira; Aoki, Shigeki; Masutani, Yoshitaka; Abe, Osamu; Hayashi, Naoto; Mori, Harushi; Masumoto, Tomohiko; Ohtomo, Kuni

    2004-01-01

    In order to ensure that three-dimensional diffusion tensor tractography (3D-DTT) of the corticospinal tract (CST) is performed accurately and efficiently, we set out to find the optimal lower threshold of fractional anisotropy (FA) below which tract elongation is terminated (trackability threshold). Thirteen patients with acute or early subacute ischemic stroke causing motor deficits were enrolled in this study. We performed 3D-DTT of the CST with diffusion tensor MR (magnetic resonance) imaging. We segmented the CST and established a cross-section of the CST in a transaxial plane as a region of interest. Thus, we selectively measured the FA values of the right and left corticospinal tracts at the level of the cerebral peduncle, the posterior limb of the internal capsule, and the centrum semiovale. The FA values of the CST were also measured on the affected side at the level where the clinically relevant infarction was present in isotropic diffusion-weighted imaging. 3D-DTT allowed us to selectively measure the FA values of the CST. Among the 267 regions of interest we measured, the minimum FA value was 0.22. The FA values of the CST were smaller and more variable in the centrum semiovale than in the other regions. The mean minus twice the standard deviation of the FA values of the CST in the centrum semiovale was calculated at 0.22 on the normal unaffected side and 0.16 on the affected side. An FA value of about 0.20 was found to be the optimal trackability threshold. (author)
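    The reported trackability threshold comes from a simple statistic, the mean FA along the tract minus twice its standard deviation; a minimal sketch with made-up FA samples is shown below.

```python
# Hedged sketch of the mean - 2*SD rule behind the reported trackability threshold.
# The FA values are invented for illustration.
import numpy as np

def trackability_threshold(fa_values):
    """Return mean(FA) - 2*SD(FA); tracking stops where FA falls below this value."""
    fa = np.asarray(fa_values, dtype=float)
    return fa.mean() - 2.0 * fa.std(ddof=1)

example_fa = [0.38, 0.41, 0.35, 0.30, 0.33, 0.29, 0.36, 0.40]
print(f"suggested FA threshold: {trackability_threshold(example_fa):.2f}")
```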

  12. Optimal Threshold Determination for Discriminating Driving Anger Intensity Based on EEG Wavelet Features and ROC Curve Analysis

    Directory of Open Access Journals (Sweden)

    Ping Wan

    2016-08-01

    Full Text Available Driving anger, often called "road rage", has become increasingly common nowadays and affects road safety. A few studies have focused on how to identify driving anger; however, there is still a gap in driving anger grading, especially in real traffic environments, which would make it possible to take intervening measures matched to the anger intensity. This study proposes a method for discriminating driving anger states of different intensity based on electroencephalogram (EEG) spectral features. First, thirty drivers were recruited to conduct on-road experiments on a busy route in Wuhan, China, where anger could be induced by various road events, e.g., vehicles weaving/cutting in line, jaywalking/cyclists crossing, traffic congestion and waiting at red lights, with extra pay offered for completing the route ahead of a baseline time. Subsequently, significance analysis was used to select the relative energy spectrum of the β band (β%) and the relative energy spectrum of the θ band (θ%) for discriminating the different driving anger states. Finally, according to receiver operating characteristic (ROC) curve analysis, the optimal thresholds (best cut-off points) of β% and θ% were determined as follows: the non-anger (neutral) state corresponds to 0.2183 ≤ θ% < 1 and 0 < β% < 0.2586; the low anger state to 0.1539 ≤ θ% < 0.2183 and 0.2586 ≤ β% < 0.3269; the moderate anger state to 0.1216 ≤ θ% < 0.1539 and 0.3269 ≤ β% < 0.3674; and the high anger state to 0 < θ% < 0.1216 and 0.3674 ≤ β% < 1. Moreover, the discrimination performance of the verification indicates that the overall accuracy (Acc) of the optimal thresholds of β% for discriminating the four driving anger states is 80.21%, while that of θ% is 75.20%. The results can provide a theoretical foundation for developing driving anger detection or warning devices based on the relevant optimal thresholds.
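    A common way to pick the "best cut-off point" from a ROC curve is to maximize the Youden index (sensitivity + specificity - 1); the abstract does not state which ROC criterion was used, so the sketch below is only an illustrative stand-in, with simulated β% values rather than real EEG data.

```python
# Hedged sketch: choose a cut-off by scanning candidate thresholds and maximizing
# the Youden index. The beta% scores below are simulated, not from the study.
import numpy as np

def best_cutoff(scores, labels):
    """Return (threshold, Youden index) maximizing TPR - FPR over observed scores."""
    scores, labels = np.asarray(scores, float), np.asarray(labels, int)
    best_t, best_j = None, -np.inf
    for t in np.unique(scores):
        pred = scores >= t
        tpr = (pred & (labels == 1)).sum() / max((labels == 1).sum(), 1)
        fpr = (pred & (labels == 0)).sum() / max((labels == 0).sum(), 1)
        if tpr - fpr > best_j:
            best_t, best_j = t, tpr - fpr
    return best_t, best_j

rng = np.random.default_rng(0)
beta_ratio = np.concatenate([rng.normal(0.22, 0.03, 200),   # "neutral" drivers
                             rng.normal(0.30, 0.03, 200)])  # "angry" drivers
labels = np.concatenate([np.zeros(200), np.ones(200)])
print(best_cutoff(beta_ratio, labels))
```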

  13. An optimal maintenance policy for machine replacement problem using dynamic programming

    Directory of Open Access Journals (Sweden)

    Mohsen Sadegh Amalnik

    2017-06-01

    Full Text Available In this article, we present an acceptance sampling plan for the machine replacement problem based on a backward dynamic programming model. Discounted dynamic programming is used to solve a two-state machine replacement problem. We design a model for maintenance that considers the quality of the items produced. The purpose of the proposed model is to determine the optimal threshold policy for maintenance in a finite time horizon. We create a decision tree based on sequential sampling with the options renew, repair and do nothing, and we seek an optimal threshold for choosing among renew, repair and continuing production in order to minimize the expected cost. Results show that the optimal policy is sensitive to the data, namely the probability of defective machines and the parameters defined in the model. This can be clearly demonstrated by a sensitivity analysis technique.
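    A minimal sketch of the backward, discounted dynamic program underlying such a policy is given below for a two-state machine with the actions do nothing, repair, and renew; the transition probabilities, costs, horizon, and discount factor are invented assumptions and do not come from the article.

```python
# Hedged sketch: finite-horizon discounted backward DP for a two-state machine.
GOOD, BAD = 0, 1
ACTIONS = ("nothing", "repair", "renew")
COST = {"nothing": 0.0, "repair": 40.0, "renew": 100.0}   # action costs
RUN_COST = {GOOD: 10.0, BAD: 60.0}                        # per-period operating/quality cost
# P_GOOD[action][state] = probability the machine is GOOD next period
P_GOOD = {"nothing": {GOOD: 0.80, BAD: 0.10},
          "repair":  {GOOD: 0.90, BAD: 0.70},
          "renew":   {GOOD: 0.95, BAD: 0.95}}

def solve(horizon=12, discount=0.9):
    value = {GOOD: 0.0, BAD: 0.0}
    policy = []
    for _ in range(horizon):                              # backward induction
        new_value, stage_policy = {}, {}
        for s in (GOOD, BAD):
            best = None
            for a in ACTIONS:
                p = P_GOOD[a][s]
                q = COST[a] + RUN_COST[s] + discount * (p * value[GOOD] + (1 - p) * value[BAD])
                if best is None or q < best[0]:
                    best = (q, a)
            new_value[s], stage_policy[s] = best
        value = new_value
        policy.append(stage_policy)
    return value, policy[::-1]

values, plan = solve()
print(values, plan[0])   # expected costs-to-go and the first-stage decisions
```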

  14. Novel method to achieve price-optimized, fully nutritious, health-promoting and acceptable national food baskets

    DEFF Research Database (Denmark)

    Parlesak, Alexandr; Robertson, Aileen

    2015-01-01

    Purpose: The purpose of this study was to generate a framework for the development of health-promoting, fully nutritious, socially acceptable, and affordable national food baskets to be used as an advocacy tool by governments. In addition to containing all (micro-)nutrient requirements, food baskets should also reflect dietary guidelines to prevent non-communicable diseases and be optimized to achieve the highest possible social acceptance. So far, integrative approaches that include all these aspects are lacking. Methods: Food composition, local availability, food prices, nationally available foods. The study was designed to obtain healthy, affordable, and socially acceptable diets for three European countries (Denmark, Slovenia, and Romania) and for three regions within Canada, Argentina, and Switzerland. Moreover, the costs for the "limiting" micronutrients and relative price ...

  15. Optimization of multi-branch switched diversity systems

    KAUST Repository

    Nam, Haewoon

    2009-10-01

    A performance optimization based on the optimal switching threshold(s) for a multi-branch switched diversity system is discussed in this paper. For the conventional multi-branch switched diversity system with a single switching threshold, the optimal switching threshold is a function of both the average channel SNR and the number of diversity branches, and computing it is not a simple task when the number of diversity branches is high. The newly proposed multi-branch switched diversity system is based on a sequence of switching thresholds, instead of a single switching threshold, where each diversity branch uses a different switching threshold for signal comparison. Thanks to the fact that each switching threshold in the sequence can be optimized based only on the number of remaining diversity branches, the proposed system makes it easy to find these switching thresholds. Furthermore, selected numerical and simulation results show that the proposed switched diversity system with the sequence of optimal switching thresholds outperforms the conventional system with the single optimal switching threshold. © 2009 IEEE.
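    The sketch below only illustrates the switch-and-examine mechanism with a per-branch threshold sequence; the threshold values, Rayleigh fading model, and branch count are assumptions, and no claim is made that they are the optimized values derived in the paper.

```python
# Hedged sketch: switch-and-examine selection with a per-branch threshold sequence.
import numpy as np

def switch_and_examine(branch_snrs, thresholds):
    """Examine branches in order; keep the first whose SNR meets its own threshold,
    otherwise fall back to the last branch examined."""
    for snr, t in zip(branch_snrs, thresholds):
        if snr >= t:
            return snr
    return branch_snrs[-1]

rng = np.random.default_rng(3)
avg_snr, n_branches, n_trials = 10.0, 4, 20_000
thresholds = [9.0, 7.0, 5.0, 0.0]      # later branches get less demanding thresholds
# Rayleigh fading gives exponentially distributed instantaneous SNR per branch.
snrs = rng.exponential(avg_snr, size=(n_trials, n_branches))
out = np.array([switch_and_examine(row, thresholds) for row in snrs])
print(f"mean output SNR: {out.mean():.2f} (single branch average: {avg_snr:.2f})")
```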

  16. An evolutionary algorithm for port-of-entry security optimization considering sensor thresholds

    International Nuclear Information System (INIS)

    Concho, Ana Lisbeth; Ramirez-Marquez, Jose Emmanuel

    2010-01-01

    According to the US Customs and Border Protection (CBP), the number of offloaded ship cargo containers arriving at US seaports each year amounts to more than 11 million. The cost of locating an undetonated terrorist weapon at one US port, or even worse, the cost caused by a detonated weapon of mass destruction, would amount to billions of dollars. These costs do not yet account for the devastating consequences for the ability to keep the supply chain operating, nor for the sociological and psychological effects. As such, this paper is concerned with developing a container inspection strategy that minimizes the total cost of inspection while maintaining a user-specified detection rate for 'suspicious' containers. In this respect, and based on a general decision-tree model, this paper presents a holistic evolutionary algorithm for finding: (1) optimal threshold values for every sensor and (2) the optimal configuration of the inspection strategy. The algorithm works under the assumption that different sensors with different reliability and cost characteristics can be used. Testing and experimentation show that the proposed approach consistently finds high-quality solutions in reduced computational time.

  17. Utilization threshold of surface water and groundwater based on the system optimization of crop planting structure

    Directory of Open Access Journals (Sweden)

    Qiang FU, Jiahong LI, Tianxiao LI, Dong LIU, Song CUI

    2016-09-01

    Full Text Available Based on the diversity of the agricultural system, this research calculates the planting structures of rice, maize and soybean considering optimal economic, social and ecological aspects. Then, based on the uncertainty and randomness of the water resources system, the interval two-stage stochastic programming method, which introduces the uncertainty of interval numbers, is used to calculate groundwater exploitation and the use efficiency of surface water. The method takes the minimum cost of water as the objective of the uncertainty model for joint surface water and groundwater scheduling optimization under different planting structures. Finally, by calculating harmonious entropy, the optimal exploitation utilization interval of surface water and groundwater is determined for optimal cultivation in the Sanjiang Plain. The optimal planting structure under the economic system is suitable when the surface water exploitation ratio is in the range 44.13% to 45.45% and the groundwater exploitation utilization is in the range 54.82% to 66.86%; the optimal planting structure under the social system is suitable when the surface water exploitation ratio is in the range 47.84% to 48.04% and the groundwater exploitation threshold is in the range 67.07% to 72.00%. This article optimizes the economic-social-ecological-water system, which is important for developing a water- and food-conserving society and providing a more accurate management environment.

  18. Nutritionally Optimized, Culturally Acceptable, Cost-Minimized Diets for Low Income Ghanaian Families Using Linear Programming.

    Science.gov (United States)

    Nykänen, Esa-Pekka A; Dunning, Hanna E; Aryeetey, Richmond N O; Robertson, Aileen; Parlesak, Alexandr

    2018-04-07

    The Ghanaian population suffers from a double burden of malnutrition. The cost of food is considered a barrier to achieving a health-promoting diet. Food prices were collected in major cities and in rural areas in southern Ghana. Linear programming (LP) was used to calculate nutritionally optimized diets (food baskets (FBs)) for a low-income Ghanaian family of four that fulfilled energy and nutrient recommendations in both rural and urban settings. Calculations included implementing cultural acceptability for families living in extreme and moderate poverty (food budget under USD 1.9 and 3.1 per day, respectively). Energy-appropriate FBs minimized for cost, following Food Balance Sheets (FBS), lacked key micronutrients such as iodine, vitamin B12 and iron for the mothers. Nutritionally adequate FBs were achieved in all settings when optimizing for a diet cheaper than USD 3.1. However, when limiting the cost to USD 1.9 in rural areas, wild foods had to be included in order to meet nutritional adequacy. Optimization suggested reducing roots, tubers and fruits and increasing cereals, vegetables and oil-bearing crops compared with the FBS. LP is a useful tool for designing culturally acceptable diets at minimum cost for low-income Ghanaian families and for helping advise national authorities on how to overcome the double burden of malnutrition.
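    A stripped-down version of the cost-minimizing linear program behind such food-basket work can be written with scipy; the foods, prices, nutrient contents, and requirements below are invented placeholders, not the Ghanaian data, and cultural-acceptability constraints are omitted.

```python
# Hedged sketch: minimize daily food cost subject to lower bounds on energy and
# selected nutrients. All numbers are illustrative placeholders.
import numpy as np
from scipy.optimize import linprog

# Columns: maize meal, beans, leafy greens (amounts in 100 g units).
price = np.array([0.05, 0.12, 0.08])        # USD per 100 g
energy = np.array([360.0, 340.0, 30.0])     # kcal per 100 g
iron = np.array([1.2, 5.0, 2.5])            # mg per 100 g
protein = np.array([8.0, 21.0, 3.0])        # g per 100 g

# Daily requirements: at least 2000 kcal, 18 mg iron, 50 g protein.
A_ub = -np.vstack([energy, iron, protein])  # linprog enforces A_ub @ x <= b_ub
b_ub = -np.array([2000.0, 18.0, 50.0])

res = linprog(price, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3, method="highs")
print(res.x, f"daily cost = {res.fun:.2f} USD")
```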

  19. Design optimization of radiation-hardened CMOS integrated circuits

    International Nuclear Information System (INIS)

    1975-01-01

    Ionizing-radiation-induced threshold voltage shifts in CMOS integrated circuits will drastically degrade circuit performance unless the design parameters related to the fabrication process are properly chosen. To formulate an approach to CMOS design optimization, experimentally observed analytical relationships showing strong dependences between threshold voltage shifts and silicon dioxide thickness are utilized. These measurements were made using radiation-hardened aluminum-gate CMOS inverter circuits and have been corroborated by independent data taken from MOS capacitor structures. Knowledge of these relationships allows one to define ranges of acceptable CMOS design parameters based upon radiation-hardening capabilities and post-irradiation performance specifications. Furthermore, they permit actual design optimization of CMOS integrated circuits which results in optimum pre- and post-irradiation performance with respect to speed, noise margins, and quiescent power consumption. Theoretical and experimental results of these procedures, the applications of which can mean the difference between failure and success of a CMOS integrated circuit in a radiation environment, are presented

  20. Hierarchical Artificial Bee Colony Optimizer with Divide-and-Conquer and Crossover for Multilevel Threshold Image Segmentation

    Directory of Open Access Journals (Sweden)

    Maowei He

    2014-01-01

    Full Text Available This paper presents a novel optimization algorithm, namely hierarchical artificial bee colony optimization (HABC), for multilevel threshold image segmentation, which employs a pool of optimal foraging strategies to extend the classical artificial bee colony framework into a cooperative and hierarchical fashion. In the proposed hierarchical model, the higher-level species incorporates an enhanced information exchange mechanism based on a crossover operator to strengthen the global search ability between species. At the bottom level, with a divide-and-conquer approach, each subpopulation runs the original ABC method in parallel on part of the dimensions, and the partial solutions are aggregated into a complete solution for the upper level. The experimental results comparing HABC with several successful EA and SI algorithms on a set of benchmarks demonstrate the effectiveness of the proposed algorithm. Furthermore, we applied HABC to the multilevel image segmentation problem. Experimental results of the new algorithm on a variety of images demonstrate its performance superiority.

  1. Effect of threshold quantization in opportunistic splitting algorithm

    KAUST Repository

    Nam, Haewoon; Alouini, Mohamed-Slim

    2011-01-01

    This paper discusses algorithms to find the optimal threshold and also investigates the impact of threshold quantization on the scheduling outage performance of the opportunistic splitting scheduling algorithm. Since this algorithm aims at finding

  2. Does more energy consumption bolster economic growth? An application of the nonlinear threshold regression model

    International Nuclear Information System (INIS)

    Huang, B.-N.; Hwang, M.J.; Yang, C.W.

    2008-01-01

    This paper separates data extending from 1971 to 2002 into the energy crisis period (1971-1980) and the post-energy crisis period (1981-2000) for 82 countries. The cross-sectional data (yearly averages) in these two periods are used to investigate the nonlinear relationships between energy consumption growth and economic growth when threshold variables are used. If threshold variables are higher than certain optimal threshold levels, there is either no significant relationship or else a significant negative relationship between energy consumption and economic growth. However, when these threshold variables are lower than certain optimal levels, there is a significant positive relationship between the two. In 48 out of the 82 countries studied, none of the four threshold variables is found to be higher than the optimal levels. It is inferred that these 48 countries should adopt a more aggressive energy policy. As for the other 34 countries, at least one threshold variable is higher than the optimal threshold level and thus these countries should adopt energy policies with varying degrees of conservation based on the number of threshold variables that are higher than the optimal threshold levels

  3. Social psychological approach to the problem of threshold

    International Nuclear Information System (INIS)

    Nakayachi, Kazuya

    1999-01-01

    This paper discusses the threshold of carcinogen risk from the viewpoint of social psychology. First, the results of a survey suggesting that renunciation of the Linear No-Threshold (LNT) hypothesis would have no influence on the public acceptance (PA) of nuclear power plants are reported. Second, the relationship between the adoption of the LNT hypothesis and the standardization of management for various risks are discussed. (author)

  4. Risk Acceptance Criteria and/or Decision optimization

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1996-01-01

    Acceptance criteria applied in practical risk analysis are recapitulated, including the concept of risk profile. Modelling of risk profiles is illustrated on the basis of compound Poisson process models. The current practice of authoritative acceptance criteria formulation is discussed from a decision theoretical point of view. It is argued that the phenomenon of risk aversion, rather than being of concern to the authority, should be of concern to the owner. Finally, it is discussed whether there is an ethical problem in formally capitalising human lives with a positive interest rate. Keywords: Risk acceptance, Risk profile, Compound Poisson model for risk profile, Capitalization of human life, Risk aversion.

  5. Automatic Threshold Setting and Its Uncertainty Quantification in Wind Turbine Condition Monitoring System

    DEFF Research Database (Denmark)

    Marhadi, Kun Saptohartyadi; Skrimpas, Georgios Alexandros

    2015-01-01

    Setting optimal alarm thresholds in vibration-based condition monitoring systems is inherently difficult. There are no established thresholds for many vibration-based measurements. Most of the time, thresholds are set based on statistics of the collected data available. Often, the underlying probability distribution that describes the data is not known. Choosing an incorrect distribution to describe the data and then setting up thresholds based on the chosen distribution could result in sub-optimal thresholds. Moreover, in wind turbine applications the collected data available may not represent the whole range of operating conditions of a turbine, which results in uncertainty in the parameters of the fitted probability distribution and in the thresholds calculated. In this study, Johnson, Normal, and Weibull distributions are investigated to determine which distribution can best fit the vibration data collected
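    A minimal sketch of the general workflow discussed above: fit several candidate distributions to a monitored vibration feature, keep the best-fitting one, and place the alarm threshold at a high quantile of it. The synthetic data, the Kolmogorov-Smirnov ranking, and the 99.9% quantile are illustrative assumptions rather than the study's procedure.

```python
# Hedged sketch: distribution fitting and quantile-based alarm thresholding.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
feature = rng.weibull(1.8, 5000) * 2.0 + 0.5   # pretend vibration indicator

candidates = {"normal": stats.norm,
              "weibull": stats.weibull_min,
              "johnson_su": stats.johnsonsu}
fits = {}
for name, dist in candidates.items():
    params = dist.fit(feature)                              # maximum-likelihood fit
    ks = stats.kstest(feature, dist.name, args=params).statistic
    fits[name] = (ks, params, dist)

best_name, (ks, params, dist) = min(fits.items(), key=lambda kv: kv[1][0])
alarm = dist.ppf(0.999, *params)                            # 99.9% quantile as alarm level
print(f"best fit: {best_name} (KS={ks:.3f}), alarm threshold = {alarm:.2f}")
```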

  6. Application of threshold concepts to ecological management problems: occupancy of Golden Eagles in Denali National Park, Alaska: Chapter 5

    Science.gov (United States)

    Eaton, Mitchell J.; Martin, Julien; Nichols, James D.; McIntyre, Carol; McCluskie, Maggie C.; Schmutz, Joel A.; Lubow, Bruce L.; Runge, Michael C.; Edited by Guntenspergen, Glenn R.

    2014-01-01

    In this chapter, we demonstrate the application of the various classes of thresholds, detailed in earlier chapters and elsewhere, via an actual but simplified natural resource management case study. We intend our example to provide the reader with the ability to recognize and apply the theoretical concepts of utility, ecological and decision thresholds to management problems through a formalized decision-analytic process. Our case study concerns the management of human recreational activities in Alaska’s Denali National Park, USA, and the possible impacts of such activities on nesting Golden Eagles, Aquila chrysaetos. Managers desire to allow visitors the greatest amount of access to park lands, provided that eagle nesting-site occupancy is maintained at a level determined to be acceptable by the managers themselves. As these two management objectives are potentially at odds, we treat minimum desired occupancy level as a utility threshold which, then, serves to guide the selection of annual management alternatives in the decision process. As human disturbance is not the only factor influencing eagle occupancy, we model nesting-site dynamics as a function of both disturbance and prey availability. We incorporate uncertainty in these dynamics by considering several hypotheses, including a hypothesis that site occupancy is affected only at a threshold level of prey abundance (i.e., an ecological threshold effect). By considering competing management objectives and accounting for two forms of thresholds in the decision process, we are able to determine the optimal number of annual nesting-site restrictions that will produce the greatest long-term benefits for both eagles and humans. Setting a utility threshold of 75 occupied sites, out of a total of 90 potential nesting sites, the optimization specified a decision threshold at approximately 80 occupied sites. At the point that current occupancy falls below 80 sites, the recommended decision is to begin restricting

  7. Determining color difference thresholds in denture base acrylic resin.

    Science.gov (United States)

    Ren, Jiabao; Lin, Hong; Huang, Qingmei; Zheng, Gang

    2015-11-01

    In restorative prostheses, color is important, but the choice of color difference formula used to quantify color change in acrylic resins is not straightforward. The purpose of this in vitro study was to choose a color difference formula that best represented differences between the calculated color and the observed imperceptible to unacceptable color, and to determine the corresponding perceptibility and acceptability thresholds of color stability for denture base acrylic resins. A total of 291 acrylic resin denture base plates were fabricated and subjected to radiation tests from zero to 42 hours in accordance with ISO 7491:2000. Color was measured with a portable spectrophotometer, and color differences were calculated with 3 International Commission on Illumination (CIE) formulas: CIELab, CMC(1:1), and CIEDE2000. Thirty-four observers with no deficiencies in color perception participated in psychophysical perceptibility and acceptability assessments under controlled conditions in vitro. These 2 types of assessments were regressed for each observer and each formula to generate receiver operating characteristic (ROC) curves. Areas under the curves (AUCs) were then calculated and analyzed to exclude observers with poor color discrimination. AUCs were subjected to 1-way ANOVA (α=.05) to determine the statistical significance of discriminability among the 3 formulas in terms of perceptibility and acceptability judgments. Student-Newman-Keuls tests (α=.05) were used for post hoc comparison. The CMC(1:1) and CIEDE2000 formulas performed better for imperceptible to unacceptable color differences, with corresponding CMC(1:1) and CIEDE2000 values for perceptibility of 2.52 and 1.72, respectively, and acceptability thresholds of 6.21 and 4.08, respectively. The CMC(1:1) and CIEDE2000 formulas possess higher discriminability than CIELab in the assessment of the perceptible color difference threshold of denture base acrylic resin. A statistically significant difference exists
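    Of the three formulas compared above, the CIELab (CIE76) difference is simply the Euclidean distance in L*a*b* space; CMC(1:1) and CIEDE2000 add weighting terms that are omitted in this minimal sketch, and the coordinate values are invented for illustration.

```python
# Hedged sketch: CIE76 (CIELab) color difference between two L*a*b* measurements.
import math

def delta_e_cie76(lab1, lab2):
    """Euclidean distance between two (L*, a*, b*) coordinates."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

baseline = (78.2, 2.1, 15.4)   # illustrative L*a*b* of an unaged resin specimen
aged = (75.9, 3.0, 18.1)       # illustrative values after irradiation
print(f"delta E (CIE76) = {delta_e_cie76(baseline, aged):.2f}")
```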

  8. Measurements of NN → dπ near threshold

    International Nuclear Information System (INIS)

    Hutcheon, D.A.

    1990-09-01

    New, precise measurements of the differential cross sections for np → dπ⁰ and π⁺d → pp and of analyzing powers for pp → dπ⁺ have been made at energies within 10 MeV (c.m.) of threshold. They allow the pion s-wave and p-wave parts of the production strength to be distinguished unambiguously, yielding an s-wave strength at threshold which is significantly smaller than the previously accepted value. There is no evidence for charge independence breaking nor for πNN resonances near threshold. (Author) (17 refs., 17 figs., tab.)

  9. Optimization of Second Fault Detection Thresholds to Maximize Mission POS

    Science.gov (United States)

    Anzalone, Evan

    2018-01-01

    both magnitude and time. As such, the Navigation team is taking advantage of the INS's capability to schedule and change fault detection thresholds in flight. These values are optimized along a nominal trajectory in order to maximize the probability of mission success and to reduce the probability of false positives (defined as cases in which the INS would report a second fault condition resulting in loss of mission, although the vehicle would still meet insertion requirements within system-level margins). This paper will describe an optimization approach using genetic algorithms to tune the threshold parameters to maximize vehicle resilience to second-fault events as a function of potential fault magnitude and time of fault over an ascent mission profile. The analysis approach and a performance assessment of the results will be presented to demonstrate the applicability of this process to second-fault detection to maximize mission probability of success.

  10. Decision-making when data and inferences are not conclusive: risk-benefit and acceptable regret approach.

    Science.gov (United States)

    Hozo, Iztok; Schell, Michael J; Djulbegovic, Benjamin

    2008-07-01

    The absolute truth in research is unobtainable, as no evidence or research hypothesis is ever 100% conclusive. Therefore, all data and inferences can in principle be considered as "inconclusive." Scientific inference and decision-making need to take into account errors, which are unavoidable in the research enterprise. The errors can occur at the level of conclusions that aim to discern the truthfulness of research hypothesis based on the accuracy of research evidence and hypothesis, and decisions, the goal of which is to enable optimal decision-making under present and specific circumstances. To optimize the chance of both correct conclusions and correct decisions, the synthesis of all major statistical approaches to clinical research is needed. The integration of these approaches (frequentist, Bayesian, and decision-analytic) can be accomplished through formal risk:benefit (R:B) analysis. This chapter illustrates the rational choice of a research hypothesis using R:B analysis based on decision-theoretic expected utility theory framework and the concept of "acceptable regret" to calculate the threshold probability of the "truth" above which the benefit of accepting a research hypothesis outweighs its risks.
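    The basic expected-value algebra behind a risk:benefit threshold can be written in a few lines: accept when p·B − (1 − p)·R > 0, i.e., when p exceeds R/(R + B). This is a generic illustration of that algebra, not the authors' exact acceptable-regret formulation.

```python
# Hedged sketch of a generic risk:benefit decision threshold on the probability
# that a hypothesis is "true". R and B are stand-in risk/benefit magnitudes.
def threshold_probability(risk, benefit):
    """Probability of 'truth' above which expected benefit outweighs expected risk."""
    return risk / (risk + benefit)

def accept(p_truth, risk, benefit):
    return p_truth > threshold_probability(risk, benefit)

print(threshold_probability(risk=1.0, benefit=4.0))   # 0.2
print(accept(p_truth=0.35, risk=1.0, benefit=4.0))    # True
```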

  11. Does sensory stimulation threshold affect lumbar facet radiofrequency denervation outcomes? A prospective clinical correlational study.

    Science.gov (United States)

    Cohen, Steven P; Strassels, Scott A; Kurihara, Connie; Lesnick, Ivan K; Hanling, Steven R; Griffith, Scott R; Buckenmaier, Chester C; Nguyen, Conner

    2011-11-01

    Radiofrequency facet denervation is one of the most frequently performed procedures for chronic low back pain. Although sensory stimulation is generally used as a surrogate measure to denote sufficient proximity of the electrode to the nerve, no study has examined whether stimulation threshold influences outcome. We prospectively recorded data in 61 consecutive patients undergoing lumbar facet radiofrequency denervation who experienced significant pain relief after medial branch blocks. For each nerve lesioned, multiple attempts were made to maximize the sensory stimulation threshold (SST). Mean SST was calculated on the basis of the lowest stimulation perceived at 0.1-V increments for each medial branch. A positive outcome was defined as a ≥50% reduction in back pain coupled with a positive satisfaction score lasting ≥3 months. The relationship between mean SST and denervation outcomes was evaluated via a receiver operating characteristic (ROC) curve and by stratifying outcomes on the basis of various cutoff values. No correlation was noted between mean SST and pain relief at rest (Pearson's r=-0.01, 95% confidence interval [CI]: -0.24 to 0.23, P=0.97), with activity (r=-0.17, 95% CI: -0.40 to 0.07, P=0.20), or a successful outcome. No optimal SST could be identified. There is no significant relationship between mean SST during lumbar facet radiofrequency denervation and treatment outcome, which may be due to differences in general sensory perception. Because the stimulation threshold was optimized for each patient, these data cannot be interpreted to suggest that sensory testing should not be performed, or that high sensory stimulation thresholds obtained on the first attempt should be deemed acceptable.

  12. Simplified Threshold RSA with Adaptive and Proactive Security

    DEFF Research Database (Denmark)

    Almansa Guerra, Jesus Fernando; Damgård, Ivan Bjerre; Nielsen, Jesper Buus

    2006-01-01

    We present the currently simplest, most efficient, optimally resilient, adaptively secure, and proactive threshold RSA scheme. A main technical contribution is a new rewinding strategy for analysing threshold signature schemes. This new rewinding strategy allows us to prove adaptive security of a proactive threshold signature scheme which was previously assumed to be only statically secure. As a separate contribution we prove that our protocol is secure in the UC framework.

  13. Influence of Optimization of Process Parameters on Threshold Voltage for Development of HfO2/TiSi2 18 nm PMOS

    Directory of Open Access Journals (Sweden)

    Atan N.

    2016-01-01

    Full Text Available Manufacturing an 18-nm transistor requires a variety of parameters, materials, temperatures, and methods. In this research, HfO2 was used as the gate dielectric and TiSi2 was used as the gate material. The HfO2/TiSi2 18-nm PMOS transistor was developed using SILVACO TCAD. Ion implantation was adopted in the fabrication process for the method's practicality and its ability to suppress short channel effects. The study involved the following ion implantation steps: compensation implantation, halo implantation energy, halo tilt, and source–drain implantation. The Taguchi method is the best optimization process for the threshold voltage of the HfO2/TiSi2 18-nm PMOS. In this case, the Taguchi orthogonal array L9 was adopted. The process parameters (ion implantations) and noise factors were evaluated by examining Taguchi's signal-to-noise ratio (SNR) and the nominal-the-best criterion for the threshold voltage (VTH). After optimization, the result showed that the VTH of the 18-nm PMOS device was -0.291339 V.
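    For reference, the nominal-the-best signal-to-noise ratio commonly used with a Taguchi L9 array can be computed as 10·log10(mean²/variance) over replicated measurements; the sketch below uses invented VTH replicates, not the paper's data, and there are variants of the nominal-the-best statistic.

```python
# Hedged sketch: Taguchi nominal-the-best SNR for one row of an L9 experiment.
import numpy as np

def snr_nominal_the_best(values):
    """SNR = 10*log10(mean^2 / sample variance) of replicated responses."""
    y = np.asarray(values, float)
    return 10.0 * np.log10(y.mean() ** 2 / y.var(ddof=1))

# Replicated threshold-voltage measurements (V) for one parameter setting (invented).
vth_replicates = [-0.292, -0.290, -0.293, -0.291]
print(f"SNR = {snr_nominal_the_best(vth_replicates):.1f} dB")
```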

  14. ‘Soglitude’- introducing a method of thinking thresholds

    Directory of Open Access Journals (Sweden)

    Tatjana Barazon

    2010-04-01

    philosophical, artistic or scientific, it tends to free itself from rigid or fixed models and accepts change and development as the fundamental nature of things. Thinking thresholds as a method of thought progress cannot be done in a single process and therefore asks for participation in its proper nature. The soglitude springs namely from the acceptance of a multitude of points of view, as it is shown by the numerous contributions we present in this issue ‘Seuils, Thresholds, Soglitudes’ of Conserveries mémorielles.

  15. The relationship of VOI threshold, volume and B/S on DISA images

    International Nuclear Information System (INIS)

    Song Liejing; Wang Mingming; Si Hongwei; Li Fei

    2011-01-01

    Objective: To explore the relationship between VOI threshold, volume and B/S on DISA phantom images. Methods: Ten hollow spheres were placed in a cylinder phantom. According to B/S ratios of 1:7, 1:5 and 1:4, 99mTcO4- and 18F-FDG were filled into the container and spheres simultaneously and separately. Images were acquired with the DISA and SIDA protocols. The volume of interest (VOI) for each sphere was analyzed by the threshold method, and an expression was fitted individually to validate the relationship. Results: The equation for the estimation of the optimal threshold was as follows: Tm = d + c × Bm/(e + f × Vm) + b/Vm. For the majority of the data, the calculated threshold fell within the 1% interval in which the optimal threshold really lay; those that did not fell in the adjacent lower or upper intervals. Conclusions: For both DISA and SIDA images, based on the relationship between VOI threshold, volume and B/S and the real volume, this method could accurately calculate the optimal threshold with an error of less than 1% for spheres whose volumes ranged from 3.3 to 30.8 ml. (authors)
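    The sketch below simply evaluates the reported threshold expression Tm = d + c × Bm/(e + f × Vm) + b/Vm for a given background-to-source ratio and sphere volume; the coefficient values are placeholders, since the fitted coefficients are not reproduced in the abstract.

```python
# Hedged sketch: evaluate the reported VOI threshold model with placeholder coefficients.
def optimal_threshold_percent(Bm, Vm, b=30.0, c=50.0, d=35.0, e=1.0, f=0.1):
    """Return an illustrative percentage threshold for delineating a sphere's VOI.
    Bm: background-to-source ratio, Vm: sphere volume in ml."""
    return d + c * Bm / (e + f * Vm) + b / Vm

for volume_ml in (3.3, 10.0, 30.8):
    print(volume_ml, round(optimal_threshold_percent(Bm=1 / 7, Vm=volume_ml), 1))
```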

  16. Optimization of multi-branch switched diversity systems

    KAUST Repository

    Nam, Haewoon; Alouini, Mohamed-Slim

    2009-01-01

    A performance optimization based on the optimal switching threshold(s) for a multi-branch switched diversity system is discussed in this paper. For the conventional multi-branch switched diversity system with a single switching threshold

  17. Threshold effect under nonlinear limitation of the intensity of high-power light

    International Nuclear Information System (INIS)

    Tereshchenko, S A; Podgaetskii, V M; Gerasimenko, A Yu; Savel'ev, M S

    2015-01-01

    A model is proposed to describe the properties of limiters of high-power laser radiation which takes into account the threshold character of the nonlinear interaction of radiation with the working medium of the limiter. The generally accepted non-threshold model is a particular case of the threshold model in which the threshold radiation intensity is zero. Experimental z-scan data are used to determine the nonlinear optical characteristics of media with carbon nanotubes, polymethine and pyran dyes, zinc selenide, porphyrin-graphene and fullerene-graphene. A threshold effect of the nonlinear interaction between laser radiation and some of the investigated working media of the limiters is revealed. It is shown that the threshold model describes the experimental z-scan data more adequately. (nonlinear optical phenomena)

  18. Threshold selection for classification of MR brain images by clustering method

    Energy Technology Data Exchange (ETDEWEB)

    Moldovanu, Simona [Faculty of Sciences and Environment, Department of Chemistry, Physics and Environment, Dunărea de Jos University of Galaţi, 47 Domnească St., 800008, Romania, Phone: +40 236 460 780 (Romania); Dumitru Moţoc High School, 15 Milcov St., 800509, Galaţi (Romania); Obreja, Cristian; Moraru, Luminita, E-mail: luminita.moraru@ugal.ro [Faculty of Sciences and Environment, Department of Chemistry, Physics and Environment, Dunărea de Jos University of Galaţi, 47 Domnească St., 800008, Romania, Phone: +40 236 460 780 (Romania)

    2015-12-07

    Given a grey-intensity image, our method detects the optimal threshold for a suitable binarization of MR brain images. In MR brain image processing, the grey levels of pixels belonging to the object are not substantially different from the grey levels belonging to the background. Threshold optimization is an effective tool to separate objects from the background and, further, in classification applications. This paper gives a detailed investigation of the selection of thresholds. Our method does not use the well-known method for binarization. Instead, we perform a simple threshold optimization which, in turn, allows the best classification of the analyzed images into healthy and multiple sclerosis classes. The dissimilarity (or the distance between classes) has been established using a clustering method based on dendrograms. We tested our method using two classes of images: the first consists of 20 T2-weighted and 20 proton density (PD)-weighted scans from two healthy subjects and from two patients with multiple sclerosis. For each image and for each threshold, the number of white pixels (or the area of white objects in the binary image) was determined. These pixel numbers represent the objects in the clustering operation. The following optimum threshold values were obtained: T = 80 for PD images and T = 30 for T2w images. Each threshold clearly separates the clusters belonging to the studied groups, healthy subjects and patients with multiple sclerosis.

  19. An optimal maintenance policy for machine replacement problem using dynamic programming

    OpenAIRE

    Mohsen Sadegh Amalnik; Morteza Pourgharibshahi

    2017-01-01

    In this article, we present an acceptance sampling plan for machine replacement problem based on the backward dynamic programming model. Discount dynamic programming is used to solve a two-state machine replacement problem. We plan to design a model for maintenance by considering the quality of the item produced. The purpose of the proposed model is to determine the optimal threshold policy for maintenance in a finite time horizon. We create a decision tree based on a sequential sampling inc...

  20. Developing thresholds of potential concern for invasive alien species: Hypotheses and concepts

    Directory of Open Access Journals (Sweden)

    Llewellyn C. Foxcroft

    2009-03-01

    Conservation implication: In accepting that species and systems are variable, and that flux is inevitable and desirable, these TPCs, developed specifically for invasive alien species, provide end points against which monitoring can be assessed. Once a threshold is reached, the cause of the threshold being exceeded is examined and management interventions are recommended.

  1. Differentiation of Adrenal Adenoma and Nonadenoma in Unenhanced CT: New Optimal Threshold Value and the Usefulness of Size Criteria for Differentiation

    International Nuclear Information System (INIS)

    Park, Sung Hee; Kim, Myeong Jin; Kim, Joo Hee; Lim, Joon Seok; Kim, Ki Whang

    2007-01-01

    To determine the optimal threshold for attenuation values in unenhanced computed tomography (CT) and to assess the value of size criteria for differentiating between an adrenal adenoma and a nonadenoma. The unenhanced CT images of 45 patients at our institution who underwent surgical resection of an adrenal mass between January 2001 and July 2005 were retrospectively reviewed. The 45 adrenal masses, confirmed by pathology, comprised 25 cortical adenomas, 12 pheochromocytomas, three lymphomas, and five metastases. The CT images were obtained at a slice thickness of 2 mm to 3 mm. The mAs varied from 100 to 160 and from 200 to 280, while 120 kVp was maintained in all cases. The mean attenuation values of the adrenal adenomas and nonadenomas were compared using an unpaired t test. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy at thresholds of 10 HU, 20 HU, and 25 HU were compared. The diagnostic accuracy according to size criteria from 2 cm to 6 cm was also compared. The twenty-five adenomas showed significantly lower attenuation values than the nonadenomas; a sensitivity above 90% was obtained only at the cost of a specificity below 70%. Size criteria of 2 or 3 cm had a high specificity of 100% and 80% but a low sensitivity of 20% and 60%. The threshold attenuation values of 20 or 25 HU in unenhanced CT appear optimal for discriminating an adrenal adenoma from a nonadenoma. The size criteria are of little value in differentiating adrenal masses because of their low specificity or low sensitivity.

  2. A hybrid flower pollination algorithm based modified randomized location for multi-threshold medical image segmentation.

    Science.gov (United States)

    Wang, Rui; Zhou, Yongquan; Zhao, Chengyan; Wu, Haizhou

    2015-01-01

    Multi-threshold image segmentation is a powerful image processing technique that is used for the preprocessing of pattern recognition and computer vision. However, traditional multilevel thresholding methods are computationally expensive because they involve exhaustively searching the optimal thresholds to optimize the objective functions. To overcome this drawback, this paper proposes a flower pollination algorithm with a randomized location modification. The proposed algorithm is used to find optimal threshold values for maximizing Otsu's objective functions with regard to eight medical grayscale images. When benchmarked against other state-of-the-art evolutionary algorithms, the new algorithm proves itself to be robust and effective through numerical experimental results including Otsu's objective values and standard deviations.
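    The quantity such algorithms maximize is Otsu's between-class variance evaluated for a candidate set of thresholds; the sketch below computes that objective on a synthetic histogram and uses a brute-force search over two thresholds as a stand-in for the flower pollination search itself.

```python
# Hedged sketch: Otsu's multilevel between-class variance and a brute-force search.
import itertools
import numpy as np

def between_class_variance(hist, thresholds):
    """Otsu criterion for a candidate threshold set on a 256-bin histogram."""
    p = hist / hist.sum()
    levels = np.arange(hist.size)
    edges = [0, *sorted(thresholds), hist.size]
    total_mean = (p * levels).sum()
    var = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        w = p[lo:hi].sum()
        if w > 0:
            mu = (p[lo:hi] * levels[lo:hi]).sum() / w
            var += w * (mu - total_mean) ** 2
    return var

rng = np.random.default_rng(5)
pixels = np.concatenate([rng.normal(60, 10, 4000),
                         rng.normal(130, 12, 5000),
                         rng.normal(200, 8, 3000)])
hist, _ = np.histogram(np.clip(pixels, 0, 255), bins=256, range=(0, 256))
# Coarse exhaustive search (step 2) over pairs of thresholds; metaheuristics such as
# the flower pollination algorithm replace this loop for many thresholds.
best = max(itertools.combinations(range(1, 255, 2), 2),
           key=lambda t: between_class_variance(hist, t))
print("best thresholds:", best)
```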

  3. Adiabatic theory of Wannier threshold laws and ionization cross sections

    International Nuclear Information System (INIS)

    Macek, J.H.; Ovchinnikov, S.Y.

    1994-01-01

    Adiabatic energy eigenvalues of H2+ are computed for complex values of the internuclear distance R. The infinite number of bound-state eigenenergies are represented by a function ε(R) that is single valued on a multisheeted Riemann surface. A region is found where ε(R) and the corresponding eigenfunctions exhibit harmonic-oscillator structure characteristic of electron motion on a potential saddle. The Schroedinger equation is solved in the adiabatic approximation along a path in the complex R plane to compute ionization cross sections. The cross section thus obtained joins the Wannier threshold region with the keV energy region, but the exponent near the ionization threshold disagrees with well-accepted values. Accepted values are obtained when a lowest-order diabatic correction is employed, indicating that adiabatic approximations do not give the correct zero velocity limit for ionization cross sections. Semiclassical eigenvalues for general top-of-barrier motion are given and the theory is applied to the ionization of atomic hydrogen by electron impact. The theory with a first diabatic correction gives the Wannier threshold law even for this case.

  4. Relationship Between Consumer Acceptability and Pungency-Related Flavor Compounds of Vidalia Onions.

    Science.gov (United States)

    Kim, Ha-Yeon; Jackson, Daniel; Adhikari, Koushik; Riner, Cliff; Sanchez-Brambila, Gabriela

    2017-10-01

    A consumer study was conducted to evaluate preferences in Vidalia onions, and define consumer acceptability thresholds for commonly analyzed flavor compounds associated with pungency. Two varieties of Vidalia onions (Plethora and Sapelo Sweet) were grown at 3 fertilizer application rates (37.5 and 0; 134.5 and 59.4; and 190 and 118.8 kg/ha of nitrogen and sulfur, respectively), creating 6 treatments with various flavor attributes to use in the study. Bulb soluble solids, sugars, pyruvic acid, lachrymatory factor (LF; propanethial S-oxide), and methyl thiosulfinate (MT) content were determined and compared to sensory responses for overall liking, intensity of the sharp/pungent/burning sensation (SPB), and intent to buy provided by 142 consumers. Onion pyruvate, LF, MT, and sugar content increased as fertilization rate increased, regardless of onion variety. Consumer responses showed participants preferred onions with low SPB, which correlated positively to lower pyruvate, LF and MT concentrations, but showed no relationship to total sugars in the onion bulb. Regression analyses revealed that the majority of consumers (≥55%) found the flavor of Vidalia onions acceptable when the concentrations of LF, pyruvic acid, and MT within the bulbs were below 2.21, 4.83, and 0.43 nmol/mL, respectively. These values will support future studies aimed at identifying the optimal cultivation practices for production of sweet Vidalia onions, and can serve as an industry benchmark for quality control, thus ensuring the flavor of Vidalia onions will be acceptable to the majority of consumers. This study identified the relationship between consumer preferences and commonly analyzed flavor compounds in Vidalia onions, and established thresholds for these compounds at concentrations which the majority of consumers will find desirable. These relationships and thresholds will support future research investigating how cultural practices impact onion quality, and can be used to assist

  5. The distribution choice for the threshold of solid state relay

    International Nuclear Information System (INIS)

    Sun Beiyun; Zhou Hui; Cheng Xiangyue; Mao Congguang

    2009-01-01

    Either the normal distribution or the Weibull distribution can be accepted as the sample distribution of the threshold of a solid state relay. Using the goodness-of-fit method, the bootstrap method and the Bayesian method, the Weibull distribution is ultimately chosen. (authors)

  6. Regional rainfall thresholds for landslide occurrence using a centenary database

    Science.gov (United States)

    Vaz, Teresa; Luís Zêzere, José; Pereira, Susana; Cruz Oliveira, Sérgio; Garcia, Ricardo A. C.; Quaresma, Ivânia

    2018-04-01

    This work proposes a comprehensive method to assess rainfall thresholds for landslide initiation using a centenary landslide database associated with a single centenary daily rainfall data set. The method is applied to the Lisbon region and includes the rainfall return period analysis that was used to identify the critical rainfall combination (cumulated rainfall duration) related to each landslide event. The spatial representativeness of the reference rain gauge is evaluated and the rainfall thresholds are assessed and calibrated using the receiver operating characteristic (ROC) metrics. Results show that landslide events located up to 10 km from the rain gauge can be used to calculate the rainfall thresholds in the study area; however, these thresholds may be used with acceptable confidence up to 50 km from the rain gauge. The rainfall thresholds obtained using linear and potential regression perform well in ROC metrics. However, the intermediate thresholds based on the probability of landslide events established in the zone between the lower-limit threshold and the upper-limit threshold are much more informative as they indicate the probability of landslide event occurrence given rainfall exceeding the threshold. This information can be easily included in landslide early warning systems, especially when combined with the probability of rainfall above each threshold.

  7. Rejection thresholds in solid chocolate-flavored compound coating.

    Science.gov (United States)

    Harwood, Meriel L; Ziegler, Gregory R; Hayes, John E

    2012-10-01

    Classical detection thresholds do not predict liking, as they focus on the presence or absence of a sensation. Recently however, Prescott and colleagues described a new method, the rejection threshold, where a series of forced choice preference tasks are used to generate a dose-response function to determine hedonically acceptable concentrations. That is, how much is too much? To date, this approach has been used exclusively in liquid foods. Here, we determined group rejection thresholds in solid chocolate-flavored compound coating for bitterness. The influences of self-identified preferences for milk or dark chocolate, as well as eating style (chewers compared to melters) on rejection thresholds were investigated. Stimuli included milk chocolate-flavored compound coating spiked with increasing amounts of sucrose octaacetate, a bitter and generally recognized as safe additive. Paired preference tests (blank compared to spike) were used to determine the proportion of the group that preferred the blank. Across pairs, spiked samples were presented in ascending concentration. We were able to quantify and compare differences between 2 self-identified market segments. The rejection threshold for the dark chocolate preferring group was significantly higher than the milk chocolate preferring group (P= 0.01). Conversely, eating style did not affect group rejection thresholds (P= 0.14), although this may reflect the amount of chocolate given to participants. Additionally, there was no association between chocolate preference and eating style (P= 0.36). Present work supports the contention that this method can be used to examine preferences within specific market segments and potentially individual differences as they relate to ingestive behavior. This work makes use of the rejection threshold method to study market segmentation, extending its use to solid foods. We believe this method has broad applicability to the sensory specialist and product developer by providing a

  8. Operations Acceptance Management

    OpenAIRE

    Suchá, Ivana

    2010-01-01

    This paper examines the process of Operations Acceptance Management, whose main task is to control Operations Acceptance Tests (OAT). In the first part the author focuses on the theoretical ground for the problem in the context of ITSM best practices framework ITIL. Benefits, process pitfalls and possibilities for automation are discussed in this part. The second part contains a case study of DHL IT Services (Prague), where a solution optimizing the overall workflow was implemented using simp...

  9. Optimization of the acceptance of prebiotic beverage made from cashew nut kernels and passion fruit juice.

    Science.gov (United States)

    Rebouças, Marina Cabral; Rodrigues, Maria do Carmo Passos; Afonso, Marcos Rodrigues Amorim

    2014-07-01

    The aim of this research was to develop a prebiotic beverage from a hydrosoluble extract of broken cashew nut kernels and passion fruit juice, using response surface methodology to optimize acceptance of its sensory attributes. A 2² central composite rotatable design was used, which produced nine formulations that were then evaluated using different concentrations of hydrosoluble cashew nut kernel extract, passion fruit juice, oligofructose, and 3% sugar. The use of response surface methodology to interpret the sensory data made it possible to obtain a formulation with satisfactory acceptance which met the criteria of bifidogenic action and use of hydrosoluble cashew nut kernels by using 14% oligofructose and 33% passion fruit juice. As a result of this study, it was possible to obtain a new functional prebiotic product, which combined the nutritional and functional properties of cashew nut kernels and oligofructose with the sensory properties of passion fruit juice in a beverage with satisfactory sensory acceptance. This new product emerges as a new alternative for the industrial processing of broken cashew nut kernels, which have very low market value, enabling this sector to increase its profits. © 2014 Institute of Food Technologists®

  10. Weighted-noise threshold based channel estimation for OFDM ...

    Indian Academy of Sciences (India)

    Existing optimal time-domain thresholds exhibit suboptimal behavior for completely unavailable KCS ... Compared with the no-truncation case, truncation improved the MSE ... channel estimation errors has been studied.

  11. Robust Adaptive Thresholder For Document Scanning Applications

    Science.gov (United States)

    Hsing, To R.

    1982-12-01

    In document scanning applications, thresholding is used to obtain binary data from a scanner. However, due to (1) a wide range of different color backgrounds, (2) density variations of printed text information, and (3) the shading effect caused by the optical systems, the use of adaptive thresholding to enhance the useful information is highly desired. This paper describes a new robust adaptive thresholder for obtaining valid binary images. It is basically a memory-type algorithm which dynamically updates the black and white reference levels to optimize a local adaptive threshold function. High image quality can be obtained with this algorithm for different types of simulated test patterns. The software algorithm is described and experimental results are presented to illustrate the procedure. Results also show that the techniques described here can be used for real-time signal processing in varied applications.
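
    As a rough illustration of the memory-type idea described above, the sketch below tracks black and white reference levels with exponential smoothing and thresholds each pixel at their midpoint; the smoothing factor and initial levels are illustrative assumptions, not parameters from the paper.

```python
# Minimal sketch of a memory-type adaptive thresholder (assumed exponential smoothing
# of the black/white reference levels; alpha and the initial levels are illustrative).
import numpy as np

def adaptive_threshold(gray_image, alpha=0.05, init_black=0.0, init_white=255.0):
    """Binarize a grayscale image by tracking local black/white reference levels."""
    black, white = init_black, init_white
    binary = np.zeros_like(gray_image, dtype=np.uint8)
    for i, row in enumerate(gray_image):
        for j, g in enumerate(row):
            threshold = 0.5 * (black + white)               # local adaptive threshold
            if g > threshold:
                binary[i, j] = 1                            # background (white)
                white = (1 - alpha) * white + alpha * g     # update white reference
            else:
                binary[i, j] = 0                            # text (black)
                black = (1 - alpha) * black + alpha * g     # update black reference
    return binary
```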

  12. A low threshold nanocavity in a two-dimensional 12-fold photonic quasicrystal

    Science.gov (United States)

    Ren, Jie; Sun, XiaoHong; Wang, Shuai

    2018-05-01

    In this article, a low-threshold nanocavity is built and investigated in a two-dimensional 12-fold holographic photonic quasicrystal (PQC). The cavity is formed by using the method of multi-beam common-path interference. By finely adjusting the structure parameters of the cavity, the Q factor and the mode volume are optimized, which are the two keys to a low threshold on the basis of the Purcell effect. Finally, an optimal cavity is obtained with a Q value of 6023 and a mode volume of 1.24 × 10⁻¹² cm³. On the other hand, by Fourier transformation of the electric field components in the cavity, the in-plane wave vectors are calculated and fitted to evaluate the cavity performance. The performance analysis of the cavity further proves the effectiveness of the optimization process. This has guiding significance for research on low-threshold nano-lasers.

  13. Color image Segmentation using automatic thresholding techniques

    International Nuclear Information System (INIS)

    Harrabi, R.; Ben Braiek, E.

    2011-01-01

    In this paper, entropy- and between-class-variance-based thresholding methods for color image segmentation are studied. The maximization of the between-class variance (MVI) and of the entropy (ME) has been used as criterion functions to determine an optimal threshold to segment images into nearly homogeneous regions. Segmentation results from the two methods are validated, the segmentation sensitivity for the available test data is evaluated, and a comparative study between these methods in different color spaces is presented. The experimental results demonstrate the superiority of the MVI method for color image segmentation.
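
    The between-class-variance criterion can be sketched as follows for a single 8-bit channel; applying it per channel of a chosen color space is an assumption made for illustration, since the paper compares several color spaces.

```python
# Minimal sketch of between-class-variance (Otsu-style) threshold selection on one channel.
import numpy as np

def otsu_threshold(channel):
    """Return the threshold maximizing the between-class variance of a uint8 channel."""
    hist = np.bincount(channel.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                     # probability of the "dark" class
    mu = np.cumsum(prob * np.arange(256))       # cumulative mean
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mu_total * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b2 = np.nan_to_num(sigma_b2)          # ignore degenerate splits
    return int(np.argmax(sigma_b2))
```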

  14. Structured decision making as a conceptual framework to identify thresholds for conservation and management

    Science.gov (United States)

    Martin, J.; Runge, M.C.; Nichols, J.D.; Lubow, B.C.; Kendall, W.L.

    2009-01-01

    Thresholds and their relevance to conservation have become a major topic of discussion in the ecological literature. Unfortunately, in many cases the lack of a clear conceptual framework for thinking about thresholds may have led to confusion in attempts to apply the concept of thresholds to conservation decisions. Here, we advocate a framework for thinking about thresholds in terms of a structured decision making process. The purpose of this framework is to promote a logical and transparent process for making informed decisions for conservation. Specification of such a framework leads naturally to consideration of definitions and roles of different kinds of thresholds in the process. We distinguish among three categories of thresholds. Ecological thresholds are values of system state variables at which small changes bring about substantial changes in system dynamics. Utility thresholds are components of management objectives (determined by human values) and are values of state or performance variables at which small changes yield substantial changes in the value of the management outcome. Decision thresholds are values of system state variables at which small changes prompt changes in management actions in order to reach specified management objectives. The approach that we present focuses directly on the objectives of management, with an aim to providing decisions that are optimal with respect to those objectives. This approach clearly distinguishes the components of the decision process that are inherently subjective (management objectives, potential management actions) from those that are more objective (system models, estimates of system state). Optimization based on these components then leads to decision matrices specifying optimal actions to be taken at various values of system state variables. Values of state variables separating different actions in such matrices are viewed as decision thresholds. Utility thresholds are included in the objectives

  15. Optimization of airport security process

    Science.gov (United States)

    Wei, Jianan

    2017-05-01

    In order to facilitate passenger travel while ensuring public safety, the airport security process and its scheduling are optimized. A stochastic Petri net is used to simulate the single-channel security process, draw the reachability graph, and construct the homogeneous Markov chain, which enables performance analysis of the security process network and identification of the bottleneck that limits passenger throughput. Starting from an initial state with one open security channel, the change in passenger flow is tracked over time. When passengers arrive at a rate that exceeds the processing capacity of the open security channels, they are queued. The moment at which the passenger queuing time reaches the acceptable threshold is taken as the time to open (or close) the next channel, and the number of security channels is scheduled dynamically in simulation to reduce passenger queuing time.
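
    A minimal sketch of the threshold-based channel scheduling idea, using a simple discrete-time queue rather than the stochastic Petri net of the paper; the arrival rate, service rate and waiting-time threshold are illustrative values, not figures from the study.

```python
# Open another channel whenever the head-of-queue waiting time exceeds the threshold.
import random

def simulate(arrival_rate=2.0, service_per_channel=1.0, wait_threshold=10.0,
             horizon=480, max_channels=6, seed=0):
    random.seed(seed)
    open_channels = 1
    queue = []                       # arrival times (minutes) of waiting passengers
    events = []                      # (time, channels open) log
    for t in range(horizon):         # one-minute time steps
        arrivals = sum(1 for _ in range(10) if random.random() < arrival_rate / 10)
        queue.extend([t] * arrivals)
        served = min(len(queue), int(open_channels * service_per_channel))
        del queue[:served]           # FIFO service
        wait = (t - queue[0]) if queue else 0.0
        if wait > wait_threshold and open_channels < max_channels:
            open_channels += 1       # waiting time hit the acceptable threshold
            events.append((t, open_channels))
        elif not queue and open_channels > 1:
            open_channels -= 1       # close an idle channel
            events.append((t, open_channels))
    return events
```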

  16. Using instrumental (CIE and reflectance) measures to predict consumers' acceptance of beef colour.

    Science.gov (United States)

    Holman, Benjamin W B; van de Ven, Remy J; Mao, Yanwei; Coombs, Cassius E O; Hopkins, David L

    2017-05-01

    We aimed to establish colorimetric thresholds based upon the capacity for instrumental measures to predict consumer satisfaction with beef colour. A web-based survey was used to distribute standardised photographs of beef M. longissimus lumborum with known colorimetrics (L*, a*, b*, hue, chroma, ratio of reflectance at 630nm and 580nm, and estimated deoxymyoglobin, oxymyoglobin and metmyoglobin concentrations) for scrutiny. Consumer demographics and perceived importance of colour to beef value were also evaluated. It was found that a* provided the simplest and most robust prediction of beef colour acceptability. Beef colour was considered acceptable (with 95% acceptance) when a* values were equal to or above 14.5. Demographic effects on this threshold were negligible, but consumer nationality and gender did contribute to variation in the relative importance of colour to beef value. These results provide future beef colour studies with context to interpret objective colour measures in terms of consumer acceptance and market appeal. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  17. THRESHOLD DETERMINATION FOR LOCAL INSTANTANEOUS SEA SURFACE HEIGHT DERIVATION WITH ICEBRIDGE DATA IN BEAUFORT SEA

    Directory of Open Access Journals (Sweden)

    C. Zhu

    2018-05-01

    Full Text Available The NASA Operation IceBridge (OIB) mission, initiated in 2009, is currently the largest airborne program in Earth's polar remote sensing science observation; it collects airborne remote sensing measurements to bridge the gap between NASA's ICESat and the upcoming ICESat-2 mission. This paper develops an improved method that optimizes the selection of Digital Mapping System (DMS) images and uses an optimal threshold, obtained from experiments in the Beaufort Sea, to calculate the local instantaneous sea surface height in this area. The optimal threshold was determined by comparing manual selections with the lowest 2 %, 1 %, 0.5 %, 0.2 %, 0.1 % and 0.05 % of Airborne Topographic Mapper (ATM) L1B elevations in sections A, B and C; the corresponding means of the mean differences are 0.166 m, 0.124 m, 0.083 m, 0.018 m, 0.002 m and −0.034 m. Our study shows that the lowest 0.1 % of the L1B data is the optimal threshold. The optimal threshold and the manual selections were also used to calculate the instantaneous sea surface height over images with leads, and we find that the improved method has closer agreement with the L1B manual selections. For images without leads, the local instantaneous sea surface height is estimated using the linear relations between distance and the sea surface height calculated over images with leads.

  18. Multiuser switched scheduling systems with per-user threshold and post-user selection

    KAUST Repository

    Nam, Haewoon

    2010-06-01

    A multiuser switched diversity scheduling scheme with per-user feedback thresholds is proposed in this paper. Unlike the conventional multiuser switched diversity scheduling scheme, where a single threshold is used by all the users in order to determine whether to transmit a feedback, the proposed scheme deploys per-user thresholds, where each user may use a different threshold from the other users. This paper first provides a generic analytical framework for the optimal feedback thresholds in closed form. Then we investigate the impact of user-sequence strategies and post-selection strategies on the performance of the multiuser switched scheduling scheme with per-user thresholds. Numerical and simulation results show that the proposed scheme provides a higher system capacity compared to the conventional scheme. © 2010 IEEE.

  19. Multiuser switched scheduling systems with per-user threshold and post-user selection

    KAUST Repository

    Nam, Haewoon; Alouini, Mohamed-Slim

    2010-01-01

    A multiuser switched diversity scheduling scheme with per-user feedback thresholds is proposed in this paper. Unlike the conventional multiuser switched diversity scheduling scheme, where a single threshold is used by all the users in order to determine whether to transmit a feedback, the proposed scheme deploys per-user thresholds, where each user may use a different threshold from the other users. This paper first provides a generic analytical framework for the optimal feedback thresholds in closed form. Then we investigate the impact of user-sequence strategies and post-selection strategies on the performance of the multiuser switched scheduling scheme with per-user thresholds. Numerical and simulation results show that the proposed scheme provides a higher system capacity compared to the conventional scheme. © 2010 IEEE.

  20. Optimization of prophylaxis for hemophilia A.

    Directory of Open Access Journals (Sweden)

    Robert D Herbert

    Full Text Available Prophylactic injections of factor VIII reduce the incidence of bleeds and slow the development of joint damage in people with hemophilia. The aim of this study was to identify optimal person-specific prophylaxis regimens for children with hemophilia A. Analytic and numerical methods were used to identify prophylaxis regimens which maximize the time for which plasma factor VIII concentrations exceed a threshold, maximize the lowest plasma factor VIII concentrations, and minimize risk of bleeds. It was demonstrated analytically that, for any injection schedule, the regimen that maximizes the lowest factor VIII concentration involves sharing doses between injections so that all of the trough concentrations in a prophylaxis cycle are equal. Numerical methods were used to identify optimal prophylaxis schedules and explore the trade-offs between efficacy and acceptability of different prophylaxis regimens. The prophylaxis regimen which minimizes risk of bleeds depends on the person's pattern of physical activity and may differ greatly from prophylaxis regimens that optimize pharmacokinetic parameters. Prophylaxis regimens which minimize risk of bleeds also differ from prophylaxis regimens that are typically prescribed. Predictions about which regimen is optimal are sensitive to estimates of the effects on risk of bleeds of factor VIII concentration and physical activity. The methods described here can be used to identify optimal, person-specific prophylaxis regimens for children with hemophilia A.
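
    The equal-trough result can be illustrated with a sketch that assumes one-compartment, first-order elimination kinetics and a single weekly cycle (carry-over from earlier cycles neglected); the half-life, schedule and total dose are hypothetical, not values from the study.

```python
# Split a fixed weekly dose across injections so all pre-injection troughs are equal.
import numpy as np

def equal_trough_doses(injection_times_h, total_dose_iu, half_life_h=12.0, cycle_h=168.0):
    k = np.log(2.0) / half_life_h
    times = np.asarray(injection_times_h, dtype=float)
    n = len(times)
    # A[i, j] = decay factor from dose j to the trough just before the next injection
    # (single-cycle approximation: residual from the previous cycle is neglected)
    A = np.zeros((n, n))
    for i in range(n):
        t_trough = times[(i + 1) % n] + (cycle_h if i == n - 1 else 0.0)
        for j in range(n):
            dt = t_trough - times[j]
            A[i, j] = np.exp(-k * dt) if dt > 0 else 0.0
    # equal troughs: (A[i] - A[0]) @ d = 0 for all i, plus the total-dose constraint
    M = np.vstack([A[1:] - A[0], np.ones(n)])
    rhs = np.append(np.zeros(n - 1), total_dose_iu)
    doses, *_ = np.linalg.lstsq(M, rhs, rcond=None)
    return doses

# e.g. injections at 0, 48 and 96 h: the dose before the longest gap comes out largest
print(equal_trough_doses([0.0, 48.0, 96.0], total_dose_iu=3000.0))
```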

  1. Omega-optimized portfolios: applying stochastic dominance criterion for the selection of the threshold return

    Directory of Open Access Journals (Sweden)

    Renaldas Vilkancas

    2016-05-01

    Full Text Available Purpose of the article: When using asymmetric risk-return measures, an important role is played by the selection of the investor's required, or threshold, rate of return. The scientific literature usually states that every investor should define this rate according to their degree of risk aversion. In this paper, it is attempted to look at the problem from a different perspective: empirical research is aimed at determining the influence of the threshold rate of return on the portfolio characteristics. Methodology/methods: In order to determine the threshold rate of return, a stochastic dominance criterion was used. The results are verified using the commonly applied method of backtesting. Scientific aim: The aim of this paper is to propose a method allowing the threshold rate of return to be selected reliably and objectively. Findings: Empirical research confirms that stochastic dominance criteria can be successfully applied to determine the rate of return preferred by the investor. Conclusions: A risk-free investment rate, or simply a zero rate of return, commonly used in practice, is often justified by neither theoretical nor empirical studies. This work suggests determining the threshold rate of return by applying the stochastic dominance criterion.

  2. Adaptively optimizing stochastic resonance in visual system

    Science.gov (United States)

    Yang, Tao

    1998-08-01

    Recent psychophysics experiments have shown that noise strength can affect perceived image quality. This work presents an adaptive process for achieving optimal perceived image quality in a simple image-perception array, which is a simple model of an image sensor. A reference image from memory is used to construct a cost function and to define the optimal noise strength as the point where the cost function reaches its minimum. The reference image is a binary image, which is used to define the background and the object. Finally, an adaptive algorithm is proposed for searching for the optimal noise strength. Computer experimental results show that if the reference image is a thresholded version of the sub-threshold input image, then the output of the sensor array is optimal, in the sense that the background and the object have the largest contrast. If the reference image differs from a thresholded version of the sub-threshold input image, then the output usually gives only a sub-optimal contrast between the object and the background.
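
    A minimal sketch of the adaptive search, assuming a grid search over candidate noise strengths and a mean-squared mismatch with the binary reference image as the cost function; the paper's exact adaptive algorithm may differ.

```python
# Stochastic-resonance sensor array: average of hard-threshold units driven by the
# sub-threshold image plus independent noise; pick the noise level matching a reference.
import numpy as np

def sensor_array_output(sub_threshold_image, noise_std, threshold=1.0, trials=64, rng=None):
    rng = rng if rng is not None else np.random.default_rng(0)
    out = np.zeros_like(sub_threshold_image, dtype=float)
    for _ in range(trials):
        noisy = sub_threshold_image + rng.normal(0.0, noise_std, sub_threshold_image.shape)
        out += (noisy > threshold)          # each unit fires only above its threshold
    return out / trials

def optimal_noise_strength(sub_threshold_image, reference_binary,
                           candidates=np.linspace(0.05, 2.0, 40)):
    costs = [np.mean((sensor_array_output(sub_threshold_image, s) - reference_binary) ** 2)
             for s in candidates]
    return candidates[int(np.argmin(costs))]   # noise level where the cost is minimal
```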

  3. Accept or Decline? An Analytics-Based Decision Tool for Kidney Offer Evaluation.

    Science.gov (United States)

    Bertsimas, Dimitris; Kung, Jerry; Trichakis, Nikolaos; Wojciechowski, David; Vagefi, Parsia A

    2017-12-01

    When a deceased-donor kidney is offered to a waitlisted candidate, the decision to accept or decline the organ relies primarily upon a practitioner's experience and intuition. Such decisions must achieve a delicate balance between estimating the immediate benefit of transplantation and the potential for future higher-quality offers. However, the current experience-based paradigm lacks scientific rigor and is subject to the inaccuracies that plague anecdotal decision-making. A data-driven analytics-based model was developed to predict whether a patient will receive an offer for a deceased-donor kidney at Kidney Donor Profile Index thresholds of 0.2, 0.4, and 0.6, and at timeframes of 3, 6, and 12 months. The model accounted for Organ Procurement Organization, blood group, wait time, DR antigens, and prior offer history to provide accurate and personalized predictions. Performance was evaluated on data sets spanning various lengths of time to understand the adaptability of the method. Using United Network for Organ Sharing match-run data from March 2007 to June 2013, out-of-sample area under the receiver operating characteristic curve was approximately 0.87 for all Kidney Donor Profile Index thresholds and timeframes considered for the 10 most populous Organ Procurement Organizations. As more data becomes available, area under the receiver operating characteristic curve values increase and subsequently level off. The development of a data-driven analytics-based model may assist transplant practitioners and candidates during the complex decision of whether to accept or forgo a current kidney offer in anticipation of a future high-quality offer. The latter holds promise to facilitate timely transplantation and optimize the efficiency of allocation.

  4. Non-periodic preventive maintenance with reliability thresholds for complex repairable systems

    International Nuclear Information System (INIS)

    Lin, Zu-Liang; Huang, Yeu-Shiang; Fang, Chih-Chiang

    2015-01-01

    In general, a non-periodic condition-based PM policy with different condition variables is often more effective than a periodic age-based policy for deteriorating complex repairable systems. In this study, system reliability is estimated and used as the condition variable, and three reliability-based PM models are then developed with consideration of different scenarios which can assist in evaluating the maintenance cost for each scenario. The proposed approach provides the optimal reliability thresholds and PM schedules in advance, by which the system availability and quality can be ensured and the organizational resources can be well prepared and managed. The results of the sensitivity analysis indicate that PM activities performed at a high reliability threshold can not only significantly improve the system availability but also efficiently extend the system lifetime, although such a PM strategy is more costly than that for a low reliability threshold. The optimal reliability threshold increases along with the number of PM activities to prevent future breakdowns caused by severe deterioration, and thus substantially reduces repair costs. - Highlights: • The PM problems for repairable deteriorating systems are formulated. • The structural properties of the proposed PM models are investigated. • The corresponding algorithms to find the optimal PM strategies are provided. • Imperfect PM activities are allowed to reduce the occurrences of breakdowns. • Provide managers with insights about the critical factors in the planning stage

  5. A Comparative Study of Improved Artificial Bee Colony Algorithms Applied to Multilevel Image Thresholding

    Directory of Open Access Journals (Sweden)

    Kanjana Charansiriphaisan

    2013-01-01

    Full Text Available Multilevel thresholding is a highly useful tool for the application of image segmentation. Otsu's method, a common exhaustive search for finding optimal thresholds, involves a high computational cost. There has been a lot of recent research into various meta-heuristic searches in the area of optimization research. This paper analyses and discusses the use of a family of artificial bee colony algorithms, namely the standard ABC, ABC/best/1, ABC/best/2, IABC/best/1, IABC/rand/1, and CABC, and some particle swarm optimization-based algorithms for searching for multilevel thresholds. The strategy for an onlooker bee to select an employed bee was modified to serve our purposes. The metrics used to compare the algorithms are the maximum number of function calls, success rate, and success performance. The ranking was performed by Friedman ranks. The experimental results showed that IABC/best/1 outperformed the other techniques when all of them were applied to multilevel image thresholding. Furthermore, the experiments confirmed that IABC/best/1 is a simple, general, and high-performance algorithm.

  6. Mate choice and the evolutionary stability of a fixed threshold in a sequential search strategy

    Directory of Open Access Journals (Sweden)

    Raymond Cheng

    2014-06-01

    Full Text Available The sequential search strategy is a prominent model of searcher behavior, derived as a rule by which females might sample and choose a mate from a distribution of prospective partners. The strategy involves a threshold criterion against which prospective mates are evaluated. The optimal threshold depends on the attributes of prospective mates, which are likely to vary across generations or within the lifetime of searchers due to stochastic environmental events. The extent of this variability and the cost to acquire information on the distribution of the quality of prospective mates determine whether a learned or environmentally canalized threshold is likely to be favored. In this paper, we determine conditions on cross-generational perturbations of the distribution of male phenotypes that allow for the evolutionary stability of an environmentally canalized threshold. In particular, we derive conditions under which there is a genetically determined threshold that is optimal over an evolutionary time scale in comparison to any other unlearned threshold. These considerations also reveal a simple algorithm by which the threshold could be learned.

  7. Bilevel thresholding of sliced image of sludge floc.

    Science.gov (United States)

    Chu, C P; Lee, D J

    2004-02-15

    This work examined the feasibility of employing various thresholding algorithms to determine the optimal bilevel thresholding value for estimating the geometric parameters of sludge flocs from microtome-sliced images and from confocal laser scanning microscope images. Morphological information extracted from images depends on the bilevel thresholding value. According to the evaluation on the luminescence-inverted images and fractal curves (quadric Koch curve and Sierpinski carpet), Otsu's method yields more stable performance than other histogram-based algorithms and is chosen to obtain the porosity. The maximum convex perimeter method, however, can probe the shapes and spatial distribution of the pores among the biomass granules in real sludge flocs. A combined algorithm is recommended for probing the sludge floc structure.

  8. Optimizing CT radiation dose based on patient size and image quality: the size-specific dose estimate method

    Energy Technology Data Exchange (ETDEWEB)

    Larson, David B. [Stanford University School of Medicine, Department of Radiology, Stanford, CA (United States)

    2014-10-15

    The principle of ALARA (dose as low as reasonably achievable) calls for dose optimization rather than dose reduction, per se. Optimization of CT radiation dose is accomplished by producing images of acceptable diagnostic image quality using the lowest dose method available. Because it is image quality that constrains the dose, CT dose optimization is primarily a problem of image quality rather than radiation dose. Therefore, the primary focus in CT radiation dose optimization should be on image quality. However, no reliable direct measure of image quality has been developed for routine clinical practice. Until such measures become available, size-specific dose estimates (SSDE) can be used as a reasonable image-quality estimate. The SSDE method of radiation dose optimization for CT abdomen and pelvis consists of plotting SSDE for a sample of examinations as a function of patient size, establishing an SSDE threshold curve based on radiologists' assessment of image quality, and modifying protocols to consistently produce doses that are slightly above the threshold SSDE curve. Challenges in operationalizing CT radiation dose optimization include data gathering and monitoring, managing the complexities of the numerous protocols, scanners and operators, and understanding the relationship of the automated tube current modulation (ATCM) parameters to image quality. Because CT manufacturers currently maintain their ATCM algorithms as secret for proprietary reasons, prospective modeling of SSDE for patient populations is not possible without reverse engineering the ATCM algorithm and, hence, optimization by this method requires a trial-and-error approach. (orig.)

  9. Optimizing Policymakers' Loss Functions in Crisis Prediction: Before, Within or After?

    OpenAIRE

    Sarlin, Peter; von Schweinitz, Gregor

    2015-01-01

    Early-warning models most commonly optimize signaling thresholds on crisis probabilities. The ex-post threshold optimization is based upon a loss function accounting for preferences between forecast errors, but comes with two crucial drawbacks: unstable thresholds in recursive estimations and an in-sample overfit at the expense of out-of-sample performance. We propose two alternatives for threshold setting: (i) including preferences in the estimation itself and (ii) setting thresholds ex-ante ...

  10. High-frequency (8 to 16 kHz) reference thresholds and intrasubject threshold variability relative to ototoxicity criteria using a Sennheiser HDA 200 earphone.

    Science.gov (United States)

    Frank, T

    2001-04-01

    The first purpose of this study was to determine high-frequency (8 to 16 kHz) thresholds for standardizing reference equivalent threshold sound pressure levels (RETSPLs) for a Sennheiser HDA 200 earphone. The second and perhaps more important purpose of this study was to determine whether repeated high-frequency thresholds using a Sennheiser HDA 200 earphone had a lower intrasubject threshold variability than the ASHA 1994 significant threshold shift criteria for ototoxicity. High-frequency thresholds (8 to 16 kHz) were obtained for 100 (50 male, 50 female) normally hearing (0.25 to 8 kHz) young adults (mean age of 21.2 yr) in four separate test sessions using a Sennheiser HDA 200 earphone. The mean and median high-frequency thresholds were similar for each test session and increased as frequency increased. At each frequency, the high-frequency thresholds were not significantly (p > 0.05) different for gender, test ear, or test session. The median thresholds at each frequency were similar to the 1998 interim ISO RETSPLs; however, large standard deviations and wide threshold distributions indicated very high intersubject threshold variability, especially at 14 and 16 kHz. Threshold repeatability was determined by finding the threshold differences between each possible test session comparison (N = 6). About 98% of all of the threshold differences were within a clinically acceptable range of +/-10 dB from 8 to 14 kHz. The threshold differences between each subject's second, third, and fourth minus their first test session were also found to determine whether intrasubject threshold variability was less than the ASHA 1994 criteria for determining a significant threshold shift due to ototoxicity. The results indicated a false-positive rate of 0% for a threshold shift > or = 20 dB at any frequency and a false-positive rate of 2% for a threshold shift >10 dB at two consecutive frequencies. This study verified that the output of high-frequency audiometers at 0 dB HL using

  11. Sparse electromagnetic imaging using nonlinear iterative shrinkage thresholding

    KAUST Repository

    Desmal, Abdulla; Bagci, Hakan

    2015-01-01

    A sparse nonlinear electromagnetic imaging scheme is proposed for reconstructing dielectric contrast of investigation domains from measured fields. The proposed approach constructs the optimization problem by introducing the sparsity constraint to the data misfit between the scattered fields expressed as a nonlinear function of the contrast and the measured fields and solves it using the nonlinear iterative shrinkage thresholding algorithm. The thresholding is applied to the result of every nonlinear Landweber iteration to enforce the sparsity constraint. Numerical results demonstrate the accuracy and efficiency of the proposed method in reconstructing sparse dielectric profiles.
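
    For a linearized forward model y = A x, the shrinkage-thresholding loop can be sketched as follows; the nonlinear Landweber step of the paper is replaced here by its linear counterpart, and the step size and sparsity weight are illustrative assumptions.

```python
# Iterative shrinkage thresholding (ISTA): a Landweber (gradient) update followed by
# soft thresholding to enforce the sparsity constraint.
import numpy as np

def soft_threshold(x, tau):
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def ista(A, y, sparsity_weight=0.01, n_iter=200):
    """Minimize 0.5*||A x - y||^2 + sparsity_weight*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2              # guarantees convergence for linear A
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x - step * A.T @ (A @ x - y)                # Landweber update on the data misfit
        x = soft_threshold(x, step * sparsity_weight)   # shrinkage enforces sparsity
    return x
```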

  12. Sparse electromagnetic imaging using nonlinear iterative shrinkage thresholding

    KAUST Repository

    Desmal, Abdulla

    2015-04-13

    A sparse nonlinear electromagnetic imaging scheme is proposed for reconstructing dielectric contrast of investigation domains from measured fields. The proposed approach constructs the optimization problem by introducing the sparsity constraint to the data misfit between the scattered fields expressed as a nonlinear function of the contrast and the measured fields and solves it using the nonlinear iterative shrinkage thresholding algorithm. The thresholding is applied to the result of every nonlinear Landweber iteration to enforce the sparsity constraint. Numerical results demonstrate the accuracy and efficiency of the proposed method in reconstructing sparse dielectric profiles.

  13. Scheduling Appliances with GA, TLBO, FA, OSR and Their Hybrids Using Chance Constrained Optimization for Smart Homes

    Directory of Open Access Journals (Sweden)

    Zunaira Nadeem

    2018-04-01

    Full Text Available In this paper, we design a controller for home energy management based on the following meta-heuristic algorithms: teaching learning-based optimization (TLBO), genetic algorithm (GA), firefly algorithm (FA) and optimal stopping rule (OSR) theory. The principal goal of designing this controller is to reduce the energy consumption of residential sectors while reducing the consumer's electricity bill and maximizing user comfort. Additionally, we propose three hybrid schemes, OSR-GA, OSR-TLBO and OSR-FA, by combining the best features of the existing algorithms. We have also optimized the desired parameters: peak-to-average ratio, energy consumption, cost, and user comfort (appliance waiting time) for 20, 50, 100 and 200 heterogeneous homes, in two steps. In the first step, we obtain the optimal scheduling of home appliances by implementing the aforementioned hybrid schemes for single and multiple homes while considering user preferences and a threshold-based policy. In the second step, we formulate our problem through chance-constrained optimization. Simulation results show that the proposed hybrid scheduling schemes outperform the others for single and multiple homes, and that they shift the consumer load demand exceeding a predefined threshold to the hours when the electricity price is low, thus following the threshold-based policy. This helps to reduce electricity cost while considering the comfort of the user by minimizing delay and peak-to-average ratio. In addition, chance-constrained optimization is used to ensure the scheduling of appliances while considering the uncertainties of the load, hence smoothing the load curtailment. The major focus is to keep the appliances' power consumption within the power constraint, below a pre-defined acceptable level. Moreover, the feasible regions of appliance electricity consumption are calculated, showing the relationship between cost and energy consumption and between cost and waiting time.

  14. Color-discrimination threshold determination using pseudoisochromatic test plates

    Directory of Open Access Journals (Sweden)

    Kaiva eJurasevska

    2014-11-01

    Full Text Available We produced a set of pseudoisochromatic plates for determining individual color-difference thresholds to assess test performance and test properties, and analyzed the results. We report a high test validity and classification ability for the deficiency type and severity level (comparable to that of the fourth edition of the Hardy–Rand–Rittler (HRR) test). We discuss changes of the acceptable chromatic shifts from the protan and deutan confusion lines along the CIE xy diagram, and the high correlation of individual color-difference thresholds and the red–green discrimination index. Color vision was tested using an Oculus HMC anomaloscope, a Farnsworth D15, and an HRR test on 273 schoolchildren, and 57 other subjects with previously diagnosed red–green color-vision deficiency.

  15. Optimizing edge detectors for robust automatic threshold selection : Coping with edge curvature and noise

    NARCIS (Netherlands)

    Wilkinson, M.H.F.

    The Robust Automatic Threshold Selection algorithm was introduced as a threshold selection based on a simple image statistic. The statistic is an average of the grey levels of the pixels in an image weighted by the response at each pixel of a specific edge detector. Other authors have suggested that
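
    The statistic described above (grey levels averaged with edge-detector weights) can be sketched as follows; the finite-difference gradient used as the edge detector here is one possible choice, not necessarily the detector optimized in the paper.

```python
# Robust Automatic Threshold Selection sketch: edge-weighted mean grey level.
import numpy as np

def rats_threshold(image):
    """Return the edge-weighted average grey level of a 2-D grayscale image."""
    img = image.astype(float)
    gy, gx = np.gradient(img)                  # simple finite-difference edge detector
    weight = np.hypot(gx, gy)                  # edge strength at each pixel
    return (weight * img).sum() / max(weight.sum(), 1e-12)

# usage: binary = image > rats_threshold(image)
```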

  16. A practical threshold concept for simple and reasonable radiation protection

    International Nuclear Information System (INIS)

    Kaneko, Masahito

    2002-01-01

    A half century ago it was assumed, for the purpose of protection, that radiation risks are linearly proportional to dose at all dose levels. The Linear No-Threshold (LNT) hypothesis has greatly contributed to the minimization of doses received by workers and members of the public, while it has brought about 'radiophobia' and unnecessary over-regulation. Now that the existence of bio-defensive mechanisms such as DNA repair, apoptosis and adaptive response is well recognized, the linearity assumption can be said to be 'unscientific'. Evidence increasingly implies that there are threshold effects in radiation risk. A concept of 'practical' thresholds is proposed, and the classification of 'stochastic' and 'deterministic' radiation effects should be abandoned. 'Practical' thresholds are dose levels below which induction of detectable radiogenic cancers or hereditary effects is not expected. There seems to be no evidence of deleterious health effects from radiation exposures at the current dose limits (50 mSv/y for workers and 5 mSv/y for members of the public), which have been adopted worldwide in the latter half of the 20th century. Those limits are assumed to have been set below certain 'practical' thresholds. As workers and members of the public do not gain benefits from being exposed, excepting intentional irradiation for medical purposes, their radiation exposures should be kept below 'practical' thresholds. There is no need for the 'justification' and 'optimization' (ALARA) principles, because there are no 'radiation detriments' as far as exposures are maintained below 'practical' thresholds. Accordingly, the ethical issue of 'justification', which allows benefit to society to offset radiation detriments to individuals, can be resolved. Likewise, the ethical issue of 'optimization', which exchanges health or safety for economic gain, can be resolved. The ALARA principle should be applied to the probability (risk) of exceeding relevant dose limits instead of to normal exposures.

  17. Rainfall thresholds and flood warning: an operative case study

    Directory of Open Access Journals (Sweden)

    V. Montesarchio

    2009-02-01

    Full Text Available An operative methodology for the definition of rainfall thresholds is illustrated, in order to provide optimal flood warnings at critical river sections. Exceeding a threshold could indicate a critical situation in river sites exposed to alluvial risk and trigger the alert of the prevention and emergency system. The procedure for the definition of critical rainfall threshold values is based both on the observed quantitative precipitation and on the hydrological response of the basin. Threshold values specify the precipitation amount, for a given duration, that generates a critical discharge in a given cross section, and they are estimated by hydrological modelling for several scenarios (e.g., modifying the soil moisture conditions). Some preliminary results, in terms of reliability analysis (presence of false alarms and missed alarms, evaluated using indicators like hit rate and false alarm rate), are presented for the case study of the Mignone River.

  18. Validation and evaluation of epistemic uncertainty in rainfall thresholds for regional scale landslide forecasting

    Science.gov (United States)

    Gariano, Stefano Luigi; Brunetti, Maria Teresa; Iovine, Giulio; Melillo, Massimo; Peruccacci, Silvia; Terranova, Oreste Giuseppe; Vennari, Carmela; Guzzetti, Fausto

    2015-04-01

    Prediction of rainfall-induced landslides can rely on empirical rainfall thresholds. These are obtained from the analysis of past rainfall events that have (or have not) resulted in slope failures. Accurate prediction requires reliable thresholds, which need to be validated before their use in operational landslide warning systems. Despite the clear relevance of validation, only a few studies have addressed the problem, and have proposed and tested robust validation procedures. We propose a validation procedure that allows for the definition of optimal thresholds for early warning purposes. The validation is based on contingency table, skill scores, and receiver operating characteristic (ROC) analysis. To establish the optimal threshold, which maximizes the correct landslide predictions and minimizes the incorrect predictions, we propose an index that results from the linear combination of three weighted skill scores. Selection of the optimal threshold depends on the scope and the operational characteristics of the early warning system. The choice is made by selecting appropriately the weights, and by searching for the optimal (maximum) value of the index. We discuss weakness in the validation procedure caused by the inherent lack of information (epistemic uncertainty) on landslide occurrence typical of large study areas. When working at the regional scale, landslides may have occurred and may have not been reported. This results in biases and variations in the contingencies and the skill scores. We introduce two parameters to represent the unknown proportion of rainfall events (above and below the threshold) for which landslides occurred and went unreported. We show that even a very small underestimation in the number of landslides can result in a significant decrease in the performance of a threshold measured by the skill scores. We show that the variations in the skill scores are different for different uncertainty of events above or below the threshold. This
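
    A sketch of the contingency-table scores and the weighted index used to pick an optimal threshold is given below; the particular scores and weights are illustrative stand-ins for those combined in the study.

```python
# Contingency-table skill scores and a weighted index for rainfall-threshold selection.
import numpy as np

def skill_scores(rain, landslide, threshold):
    """Hit rate (POD), false alarm rate (POFD) and critical success index at one threshold."""
    rain = np.asarray(rain, dtype=float)
    landslide = np.asarray(landslide, dtype=bool)
    pred = rain >= threshold
    tp = np.sum(pred & landslide)          # correct alarms
    fp = np.sum(pred & ~landslide)         # false alarms
    fn = np.sum(~pred & landslide)         # missed events
    tn = np.sum(~pred & ~landslide)        # correct non-alarms
    pod = tp / max(tp + fn, 1)
    pofd = fp / max(fp + tn, 1)
    csi = tp / max(tp + fp + fn, 1)
    return pod, pofd, csi

def optimal_threshold(rain, landslide, candidates, weights=(0.4, 0.3, 0.3)):
    """Maximize a linear combination of POD, (1 - POFD) and CSI over candidate thresholds."""
    def index(thr):
        pod, pofd, csi = skill_scores(rain, landslide, thr)
        return weights[0] * pod + weights[1] * (1.0 - pofd) + weights[2] * csi
    return max(candidates, key=index)
```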

  19. Automatic segmentation of coronary arteries from computed tomography angiography data cloud using optimal thresholding

    Science.gov (United States)

    Ansari, Muhammad Ahsan; Zai, Sammer; Moon, Young Shik

    2017-01-01

    Manual analysis of the bulk data generated by computed tomography angiography (CTA) is time consuming, and interpretation of such data requires previous knowledge and expertise of the radiologist. Therefore, an automatic method that can isolate the coronary arteries from a given CTA dataset is required. We present an automatic yet effective segmentation method to delineate the coronary arteries from a three-dimensional CTA data cloud. Instead of a region growing process, which is usually time consuming and prone to leakages, the method is based on the optimal thresholding, which is applied globally on the Hessian-based vesselness measure in a localized way (slice by slice) to track the coronaries carefully to their distal ends. Moreover, to make the process automatic, we detect the aorta using the Hough transform technique. The proposed segmentation method is independent of the starting point to initiate its process and is fast in the sense that coronary arteries are obtained without any preprocessing or postprocessing steps. We used 12 real clinical datasets to show the efficiency and accuracy of the presented method. Experimental results reveal that the proposed method achieves 95% average accuracy.

  20. The heritability of acceptability in South African Merino sheep ...

    African Journals Online (AJOL)

    Selection for production and reproduction in South African Merino sheep is always combined with selection based on visual appraisal and will, in all probability, remain so for many years to come. Heritabilities for acceptability were estimated using a threshold model to analyse data from two parent Merino studs. Effects ...

  1. FOXP3-stained image analysis for follicular lymphoma: optimal adaptive thresholding with maximal nucleus coverage

    Science.gov (United States)

    Senaras, C.; Pennell, M.; Chen, W.; Sahiner, B.; Shana'ah, A.; Louissaint, A.; Hasserjian, R. P.; Lozanski, G.; Gurcan, M. N.

    2017-03-01

    Immunohistochemical detection of FOXP3 antigen is a usable marker for detection of regulatory T lymphocytes (TR) in formalin-fixed and paraffin-embedded sections of different types of tumor tissue. TR play a major role in the homeostasis of normal immune systems, where they prevent autoreactivity of the immune system towards the host. This beneficial effect of TR is frequently "hijacked" by malignant cells, where tumor-infiltrating regulatory T cells are recruited by the malignant cells to inhibit the beneficial immune response of the host against the tumor cells. In the majority of human solid tumors, an increased number of tumor-infiltrating FOXP3 positive TR is associated with worse outcome. However, in follicular lymphoma (FL) the impact of the number and distribution of TR on the outcome still remains controversial. In this study, we present a novel method to detect and enumerate nuclei from FOXP3 stained images of FL biopsies. The proposed method defines a new adaptive thresholding procedure, namely the optimal adaptive thresholding (OAT) method, which aims to minimize under-segmented and over-segmented nuclei for coarse segmentation. Next, we integrate a parameter-free elliptical arc and line segment detector (ELSD) as additional information to refine segmentation results and to split most of the merged nuclei. Finally, we utilize a state-of-the-art super-pixel method, Simple Linear Iterative Clustering (SLIC), to split the rest of the merged nuclei. Our dataset consists of 13 region-of-interest images containing 769 negative and 88 positive nuclei. Three expert pathologists evaluated the method and reported sensitivity values in detecting negative and positive nuclei ranging from 83-100% and 90-95%, and precision values of 98-100% and 99-100%, respectively. The proposed solution can be used to investigate the impact of FOXP3 positive nuclei on the outcome and prognosis in FL.

  2. Optimization of the general acceptability through affective tests and response surface methodology of a dry cacao powder mixture based beverage

    Directory of Open Access Journals (Sweden)

    Elena Chau Loo Kung

    2013-09-01

    Full Text Available The main objective of this research was to optimize, through affective tests and response surface methodology, the general acceptability of a beverage based on a dry cacao powder mixture. Formulations were prepared with cacao powder concentrations of 15%, 17.5% and 20% and lecithin concentrations of 0.1%, 0.3% and 0.5%, while maintaining constant contents of sugar (25%) and vanillin (1%); the cacao powder had different pH values, natural (pH 5) and alkalinized (pH 6.5 and pH 8), with water added by difference to 100%. This generated a total of fifteen treatments to be evaluated, according to the Box-Behnken design for three factors. The treatments underwent satisfaction-level tests to establish the general acceptability. The treatment that included cacao powder at a concentration of 17.5%, pH 6.5 and a lecithin concentration of 0.3% obtained the best acceptability. The software Statgraphics Plus 5.1 was used to obtain the treatment with maximum acceptability, which corresponded to cacao powder at pH 6.81 and a concentration of 18.24% with soy lecithin at 0.28%, consistent with the results of the satisfaction-level tests. Finally, the optimal formulation was characterized physicochemically and microbiologically and evaluated sensorially, obtaining an acceptability score of 6.17.

  3. A PC microsimulation of a gap acceptance model for turning left at a T-junction

    NARCIS (Netherlands)

    Schaap, Nina; Dijck, T.; van Arem, Bart; Morsink, Peter L.J.

    2009-01-01

    In a microsimulation model, vehicles are controlled by sub-behavioral models; these include the gap acceptance model, in which the decision about how to cross a junction is made. The critical gap in these models must serve as a threshold value for accepting or rejecting the space between two successive vehicles.
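
    A minimal sketch of such a gap acceptance rule, with an assumed critical gap of 6 s purely for illustration:

```python
# Critical-gap acceptance for a left turn at a T-junction: accept the first gap in the
# oncoming stream that exceeds the driver's critical-gap threshold.
def accepts_gap(time_gap_s: float, critical_gap_s: float = 6.0) -> bool:
    return time_gap_s > critical_gap_s

def first_accepted_gap(oncoming_arrival_times_s, critical_gap_s=6.0):
    """Return the index of the first inter-vehicle gap the turning driver accepts."""
    for i in range(len(oncoming_arrival_times_s) - 1):
        gap = oncoming_arrival_times_s[i + 1] - oncoming_arrival_times_s[i]
        if accepts_gap(gap, critical_gap_s):
            return i
    return None  # no acceptable gap in the observed stream
```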

  4. [Loudness optimized registration of compound action potential in cochlear implant recipients].

    Science.gov (United States)

    Berger, Klaus; Hocke, Thomas; Hessel, Horst

    2017-11-01

    Background: Postoperative measurements of compound action potentials are not always possible owing to insufficient acceptance by CI recipients. This study investigated the impact of different parameters on the acceptance of the measurements. Methods: Compound action potentials of 16 CI recipients were measured with different pulse widths. Recipients performed a loudness rating at the potential thresholds for the different sequences. Results: Compound action potentials obtained with larger pulse widths were rated softer than those obtained with smaller pulse widths. Conclusions: Compound action potentials measured with larger pulse widths create a gap between the loudest acceptable presentation level and the potential threshold. This gap contributes to a higher acceptance of postoperative measurements. Georg Thieme Verlag KG Stuttgart · New York.

  5. An NMR log echo data de-noising method based on the wavelet packet threshold algorithm

    International Nuclear Information System (INIS)

    Meng, Xiangning; Xie, Ranhong; Li, Changxi; Hu, Falong; Li, Chaoliu; Zhou, Cancan

    2015-01-01

    To improve the de-noising effects for low signal-to-noise ratio (SNR) nuclear magnetic resonance (NMR) log echo data, this paper applies the wavelet packet threshold algorithm to the data. The principle of the algorithm is elaborated in detail. By comparing the properties of a series of wavelet packet bases and their relevance to the NMR log echo train signal, 'sym7' is found to be the optimal wavelet packet basis of the wavelet packet threshold algorithm for de-noising the NMR log echo train signal. A new method is presented to determine the optimal wavelet packet decomposition scale, within the range up to its maximum, using the modulus-maxima and Shannon-entropy-minimum criteria to determine the global and local optimal wavelet packet decomposition scales, respectively. The results of applying the method to simulated and actual NMR log echo data indicate that, compared with the wavelet threshold algorithm, the wavelet packet threshold algorithm shows higher decomposition accuracy and a better de-noising effect, and is much more suitable for de-noising low-SNR NMR log echo data. (paper)
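
    Assuming the PyWavelets library is available, a wavelet packet soft-thresholding pass with the 'sym7' basis might be sketched as follows; the universal threshold and the fixed decomposition level stand in for the paper's optimal-scale selection procedure.

```python
# Wavelet packet soft-threshold de-noising of a 1-D echo train (sketch).
import numpy as np
import pywt

def wp_denoise(echo_train, wavelet="sym7", level=4, mode="soft"):
    wp = pywt.WaveletPacket(data=echo_train, wavelet=wavelet, mode="symmetric", maxlevel=level)
    # noise level estimated from the highest-frequency packet (median absolute deviation)
    detail = wp["d" * level].data
    sigma = np.median(np.abs(detail)) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(echo_train)))   # universal threshold
    out = pywt.WaveletPacket(data=None, wavelet=wavelet, mode="symmetric")
    for node in wp.get_level(level, order="natural"):
        out[node.path] = pywt.threshold(node.data, thr, mode=mode)
    return out.reconstruct(update=False)[: len(echo_train)]
```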

  6. Performance improvement of per-user threshold based multiuser switched scheduling system

    KAUST Repository

    Nam, Haewoon

    2013-01-01

    SUMMARY This letter proposes a multiuser switched scheduling scheme with per-user threshold and post user selection and provides a generic analytical framework for determining the optimal feedback thresholds. The proposed scheme applies an individual feedback threshold for each user rather than a single common threshold for all users to achieve some capacity gain due to the flexibility of threshold selection as well as a lower scheduling outage probability. In addition, since scheduling outage may occur with a non-negligible probability, the proposed scheme employs post user selection in order to further improve the ergodic capacity, where the user with the highest potential for a higher channel quality than other users is selected. Numerical and simulation results show that the capacity gain by post user selection is significant when random sequence is used. Copyright © 2013 The Institute of Electronics, Information and Communication Engineers.

  7. Value of information and pricing new healthcare interventions.

    Science.gov (United States)

    Willan, Andrew R; Eckermann, Simon

    2012-06-01

    Previous application of value-of-information methods to optimal clinical trial design have predominantly taken a societal decision-making perspective, implicitly assuming that healthcare costs are covered through public expenditure and trial research is funded by government or donation-based philanthropic agencies. In this paper, we consider the interaction between interrelated perspectives of a societal decision maker (e.g. the National Institute for Health and Clinical Excellence [NICE] in the UK) charged with the responsibility for approving new health interventions for reimbursement and the company that holds the patent for a new intervention. We establish optimal decision making from societal and company perspectives, allowing for trade-offs between the value and cost of research and the price of the new intervention. Given the current level of evidence, there exists a maximum (threshold) price acceptable to the decision maker. Submission for approval with prices above this threshold will be refused. Given the current level of evidence and the decision maker's threshold price, there exists a minimum (threshold) price acceptable to the company. If the decision maker's threshold price exceeds the company's, then current evidence is sufficient since any price between the thresholds is acceptable to both. On the other hand, if the decision maker's threshold price is lower than the company's, then no price is acceptable to both and the company's optimal strategy is to commission additional research. The methods are illustrated using a recent example from the literature.

  8. Fluorescently labeled bevacizumab in human breast cancer: defining the classification threshold

    Science.gov (United States)

    Koch, Maximilian; de Jong, Johannes S.; Glatz, Jürgen; Symvoulidis, Panagiotis; Lamberts, Laetitia E.; Adams, Arthur L. L.; Kranendonk, Mariëtte E. G.; Terwisscha van Scheltinga, Anton G. T.; Aichler, Michaela; Jansen, Liesbeth; de Vries, Jakob; Lub-de Hooge, Marjolijn N.; Schröder, Carolien P.; Jorritsma-Smit, Annelies; Linssen, Matthijs D.; de Boer, Esther; van der Vegt, Bert; Nagengast, Wouter B.; Elias, Sjoerd G.; Oliveira, Sabrina; Witkamp, Arjen J.; Mali, Willem P. Th. M.; Van der Wall, Elsken; Garcia-Allende, P. Beatriz; van Diest, Paul J.; de Vries, Elisabeth G. E.; Walch, Axel; van Dam, Gooitzen M.; Ntziachristos, Vasilis

    2017-07-01

    In-vivo fluorescently labelled drug (bevacizumab) breast cancer specimens were obtained from patients. We propose a new structured method to determine the optimal classification threshold in targeted fluorescence intra-operative imaging.

  9. Combining two strategies to optimize biometric decisions against spoofing attacks

    Science.gov (United States)

    Li, Weifeng; Poh, Norman; Zhou, Yicong

    2014-09-01

    Spoof attack by replicating biometric traits represents a real threat to an automatic biometric verification/authentication system. This is because the system, originally designed to distinguish genuine users from impostors, simply cannot distinguish a replicated biometric sample (replica) from a live sample. An effective solution is to obtain measures that can indicate whether or not a biometric trait has been tampered with, e.g., liveness detection measures. These measures are referred to as evidence of spoofing or anti-spoofing measures. In order to make the final accept/reject decision, a straightforward solution is to define two thresholds: one for the anti-spoofing measure, and another for the verification score. We compared two variants of a method that relies on applying two thresholds, one to the verification (matching) score and another to the anti-spoofing measure. Our experiments, carried out using a signature database as well as by simulation, show that the brute-force variant and its probabilistic variant turn out to be optimal under different operating conditions.
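
    The two-threshold decision rule itself is simple; the sketch below uses illustrative threshold values and accepts only if both the verification score and the anti-spoofing (liveness) measure clear their thresholds.

```python
# Two-threshold biometric decision: reject impostors (low match score) or replicas
# (low liveness evidence); the 0.5 thresholds are illustrative placeholders.
def accept(verification_score: float, liveness_score: float,
           score_threshold: float = 0.5, liveness_threshold: float = 0.5) -> bool:
    return verification_score >= score_threshold and liveness_score >= liveness_threshold
```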

  10. Practical determination of aortic valve calcium volume score on contrast-enhanced computed tomography prior to transcatheter aortic valve replacement and impact on paravalvular regurgitation: Elucidating optimal threshold cutoffs.

    Science.gov (United States)

    Bettinger, Nicolas; Khalique, Omar K; Krepp, Joseph M; Hamid, Nadira B; Bae, David J; Pulerwitz, Todd C; Liao, Ming; Hahn, Rebecca T; Vahl, Torsten P; Nazif, Tamim M; George, Isaac; Leon, Martin B; Einstein, Andrew J; Kodali, Susheel K

    The threshold for the optimal computed tomography (CT) number in Hounsfield Units (HU) to quantify aortic valvular calcium on contrast-enhanced scans has not been standardized. Our aim was to find the most accurate threshold to predict paravalvular regurgitation (PVR) after transcatheter aortic valve replacement (TAVR). 104 patients who underwent TAVR with the CoreValve prosthesis were studied retrospectively. Luminal attenuation (LA) in HU was measured at the level of the aortic annulus. Calcium volume score for the aortic valvular complex was measured using 6 threshold cutoffs (650 HU, 850 HU, LA × 1.25, LA × 1.5, LA+50, LA+100). Receiver-operating characteristic (ROC) analysis was performed to assess the predictive value for > mild PVR (n = 16). Multivariable analysis was performed to determine the accuracy to predict > mild PVR after adjustment for depth and perimeter oversizing. ROC analysis showed lower area under the curve (AUC) values for fixed threshold cutoffs (650 or 850 HU) compared to thresholds relative to LA. The LA+100 threshold had the highest AUC (0.81), and AUC was higher than all studied protocols, other than the LA x 1.25 and LA + 50 protocols, where the difference approached statistical significance (p = 0.05, and 0.068, respectively). Multivariable analysis showed calcium volume determined by the LAx1.25, LAx1.5, LA+50, and LA+ 100 HU protocols to independently predict PVR. Calcium volume scoring thresholds which are relative to LA are more predictive of PVR post-TAVR than those which use fixed cutoffs. A threshold of LA+100 HU had the highest predictive value. Copyright © 2017 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
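
    A sketch of a calcium volume score computed with a cutoff set relative to luminal attenuation (e.g., LA + 100 HU, the best-performing cutoff above); the voxel volume and region-of-interest handling are simplified assumptions, not the clinical scoring software.

```python
# Calcium volume score with a luminal-attenuation-relative threshold.
import numpy as np

def calcium_volume_score(valve_roi_hu, luminal_attenuation_hu, offset_hu=100.0,
                         voxel_volume_mm3=0.5):
    """Volume (mm^3) of ROI voxels whose attenuation exceeds LA + offset."""
    threshold = luminal_attenuation_hu + offset_hu
    n_calcified = int(np.count_nonzero(np.asarray(valve_roi_hu) >= threshold))
    return n_calcified * voxel_volume_mm3
```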

  11. Data-Driven Jump Detection Thresholds for Application in Jump Regressions

    Directory of Open Access Journals (Sweden)

    Robert Davies

    2018-03-01

    This paper develops a method to select the threshold in threshold-based jump detection methods. The method is motivated by an analysis of threshold-based jump detection methods in the context of jump-diffusion models. We show that, over the range of sampling frequencies a researcher is most likely to encounter, the usual in-fill asymptotics provide a poor guide for selecting the jump threshold. Because of this we develop a sample-based method. Our method estimates the number of jumps over a grid of thresholds and selects the optimal threshold at what we term the ‘take-off’ point in the estimated number of jumps. We show that this method consistently estimates the jumps and their indices as the sampling interval goes to zero. In several Monte Carlo studies we evaluate the performance of our method based on its ability to accurately locate jumps and its ability to distinguish between true jumps and large diffusive moves. In one of these Monte Carlo studies we evaluate the performance of our method in a jump regression context. Finally, we apply our method in two empirical studies. In one we estimate the number of jumps and report the jump threshold our method selects for three commonly used market indices. In the other empirical application we perform a series of jump regressions using our method to select the jump threshold.
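
    The paper's exact 'take-off' criterion is not reproduced here, but the selection logic can be sketched: estimate the jump count over a grid of thresholds (in units of a robust volatility estimate) and pick the threshold at the sharpest kink of that curve. The kink rule and all parameter values below are stand-in assumptions:

      import numpy as np

      def jump_count(returns, u, sigma):
          """Number of returns flagged as jumps at threshold u (in units of sigma)."""
          return int(np.sum(np.abs(returns) > u * sigma))

      def select_threshold(returns, grid=np.linspace(1.0, 8.0, 71)):
          """Proxy 'take-off' rule: the threshold at the largest second difference
          (sharpest kink) of the estimated-jump-count curve."""
          sigma = 1.4826 * np.median(np.abs(returns - np.median(returns)))  # robust scale
          counts = np.array([jump_count(returns, u, sigma) for u in grid], dtype=float)
          kink = np.argmax(np.abs(np.diff(counts, n=2))) + 1
          return grid[kink], counts

      rng = np.random.default_rng(1)
      r = rng.normal(0, 0.01, 2000)
      r[rng.choice(2000, 5, replace=False)] += 0.08          # plant a few large jumps
      u_star, _ = select_threshold(r)
      print(u_star)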

  12. COMPARISON OF NONLINEAR DYNAMICS OPTIMIZATION METHODS FOR APS-U

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Y.; Borland, Michael

    2017-06-25

    Many different objectives and genetic algorithms have been proposed for storage ring nonlinear dynamics performance optimization. These optimization objectives include nonlinear chromaticities and driving/detuning terms, on-momentum and off-momentum dynamic acceptance, chromatic detuning, local momentum acceptance, variation of transverse invariant, Touschek lifetime, etc. In this paper, the effectiveness of several different optimization methods and objectives is compared for the nonlinear beam dynamics optimization of the Advanced Photon Source upgrade (APS-U) lattice. The optimized solutions from these different methods are preliminarily compared in terms of the dynamic acceptance, local momentum acceptance, chromatic detuning, and other performance measures.

  13. A new temperature threshold detector - Application to missile monitoring

    Science.gov (United States)

    Coston, C. J.; Higgins, E. V.

    Comprehensive thermal surveys within the case of solid propellant ballistic missile flight motors are highly desirable. For example, a problem involving motor failures due to insulator cracking at motor ignition, which took several years to solve, could have been identified immediately on the basis of a suitable thermal survey. Using conventional point measurements, such as those utilizing typical thermocouples, for such a survey on a full scale motor is not feasible because of the great number of sensors and measurements required. An alternate approach recognizes that temperatures below a threshold (which depends on the material being monitored) are acceptable, but higher temperatures exceed design margins. In this case hot spots can be located by a grid of wire-like sensors which are sensitive to temperatures above the threshold anywhere along the sensor. A new type of temperature threshold detector is being developed for flight missile use. The device consists of KNO3 separating copper and Constantan metals. Above the melting point of KNO3, galvanic action provides a voltage output of a few tenths of a volt.

  14. Framework for determining airport daily departure and arrival delay thresholds: statistical modelling approach.

    Science.gov (United States)

    Wesonga, Ronald; Nabugoomu, Fabian

    2016-01-01

    The study derives a framework for assessing airport efficiency through evaluating optimal arrival and departure delay thresholds. Assumptions of airport efficiency measurements, though based upon minimum numeric values such as 15 min of turnaround time, cannot be extrapolated to determine proportions of delay-days of an airport. This study explored the concept of delay threshold to determine the proportion of delay-days as an expansion of the theory of delay and our previous work. A data-driven approach using statistical modelling was applied to a limited set of determinants of daily delay at an airport. For the purpose of testing the efficacy of the threshold levels, operational data for Entebbe International Airport were used as a case study. Findings show differences in the proportions of delay at departure (μ = 0.499; 95 % CI = 0.023) and arrival (μ = 0.363; 95 % CI = 0.022). A multivariate logistic model confirmed an optimal daily departure and arrival delay threshold of 60 % for the airport given the four probable thresholds {50, 60, 70, 80}. The decision for the threshold value was based on the number of significant determinants, the goodness-of-fit statistics based on the Wald test, and the area under the receiver operating curves. These findings propose a modelling framework that generates information relevant to Air Traffic Management for planning and for measuring airport operational efficiency.
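
    A hedged sketch of that selection logic: for each candidate threshold, define a delay-day indicator, fit a logistic model on the daily determinants and compare the area under the ROC curve. The determinants, synthetic data and use of scikit-learn below are illustrative assumptions, not the study's data or software:

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      def score_threshold(daily_determinants, delay_proportion, threshold):
          """Fit a logistic model for 'delay-day' defined by one candidate threshold
          and return its ROC AUC (one of the criteria mentioned above)."""
          y = (delay_proportion > threshold).astype(int)
          model = LogisticRegression(max_iter=1000).fit(daily_determinants, y)
          return roc_auc_score(y, model.predict_proba(daily_determinants)[:, 1])

      rng = np.random.default_rng(2)
      X = rng.normal(size=(365, 4))                               # placeholder daily determinants
      z = X @ np.array([0.8, -0.5, 0.3, 0.0]) + rng.normal(0, 0.5, 365)
      delay_prop = 1.0 / (1.0 + np.exp(-z))                       # synthetic daily delay proportions
      for t in (0.5, 0.6, 0.7, 0.8):
          print(t, round(score_threshold(X, delay_prop, t), 3))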

  15. Color difference thresholds in dentistry.

    Science.gov (United States)

    Paravina, Rade D; Ghinea, Razvan; Herrera, Luis J; Bona, Alvaro D; Igiel, Christopher; Linninger, Mercedes; Sakai, Maiko; Takahashi, Hidekazu; Tashkandi, Esam; Perez, Maria del Mar

    2015-01-01

    The aim of this prospective multicenter study was to determine the 50:50% perceptibility threshold (PT) and 50:50% acceptability threshold (AT) of dental ceramic under simulated clinical settings. The spectral radiance of 63 monochromatic ceramic specimens was determined using a non-contact spectroradiometer. A total of 60 specimen pairs, divided into 3 sets of 20 specimen pairs (medium to light shades, medium to dark shades, and dark shades), were selected for the psychophysical experiment. The coordinating center and seven research sites obtained Institutional Review Board (IRB) approvals prior to the beginning of the experiment. Each research site had 25 observers, divided into five groups of five observers: dentists-D, dental students-S, dental auxiliaries-A, dental technicians-T, and lay persons-L. There were 35 observers per group (five observers per group at each site × 7 sites), for a total of 175 observers. Visual color comparisons were performed using a viewing booth. Takagi-Sugeno-Kang (TSK) fuzzy approximation was used for fitting the data points. The 50:50% PT and 50:50% AT were determined in CIELAB and CIEDE2000. The t-test was used to evaluate the statistical significance of threshold differences. The CIELAB 50:50% PT was ΔEab = 1.2, whereas the 50:50% AT was ΔEab = 2.7. Corresponding CIEDE2000 (ΔE00) values were 0.8 and 1.8, respectively. The 50:50% PT by observer group revealed differences among groups D, A, T, and L as compared with the 50:50% PT for all observers. The 50:50% AT for all observers was statistically different from the 50:50% AT in groups T and L. The 50:50% PT and AT were significantly different. The same is true for differences between the two color difference formulas (ΔE00/ΔEab). Observer groups and sites showed a high level of statistical difference in all thresholds. Visual color difference thresholds can serve as a quality control tool to guide the selection of esthetic dental materials, evaluate clinical performance, and
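
    Applying the reported thresholds only requires a CIELAB color difference; a short sketch using the classic ΔE*ab formula together with the study's 50:50% values (the sample Lab triplets are invented):

      import math

      def delta_e_ab(lab1, lab2):
          """CIE76 color difference (Delta E*ab) between two CIELAB triplets."""
          return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

      def judge(lab1, lab2, pt=1.2, at=2.7):
          """Classify a color pair against the 50:50% perceptibility (PT) and
          acceptability (AT) thresholds reported above (CIELAB units)."""
          de = delta_e_ab(lab1, lab2)
          if de <= pt:
              return de, "likely imperceptible"
          if de <= at:
              return de, "perceptible but likely acceptable"
          return de, "likely unacceptable"

      print(judge((72.1, 2.3, 18.4), (71.0, 2.9, 20.1)))   # ~2.1 -> perceptible but acceptable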

  16. Using Johnson Distribution for Automatic Threshold Setting in Wind Turbine Condition Monitoring System

    DEFF Research Database (Denmark)

    Marhadi, Kun Saptohartyadi; Skrimpas, Georgios Alexandros

    2014-01-01

    not represent the whole operating conditions of a turbine, which results in uncertainty in the parameters of the fitted probability distribution and the thresholds calculated. In this study the Johnson distribution is used to identify the shape, location, and scale parameters of the distribution that can best fit the vibration data. This study shows that using the Johnson distribution can eliminate testing or fitting various distributions to the data, and gives a more direct approach to obtaining optimal thresholds. To quantify uncertainty in the thresholds due to limited data, implementations with the bootstrap method
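
    A hedged sketch of the idea: fit a Johnson SU distribution to a vibration feature, place the alarm threshold at a high quantile, and bootstrap the fit to quantify uncertainty from limited data. The SU family, the 99.5% quantile and the stand-in data are assumptions, not the study's exact settings:

      import numpy as np
      from scipy import stats

      def johnson_threshold(feature, q=0.995):
          """Alarm threshold as a high quantile of a Johnson SU fit to the feature."""
          a, b, loc, scale = stats.johnsonsu.fit(feature)
          return stats.johnsonsu.ppf(q, a, b, loc=loc, scale=scale)

      def bootstrap_thresholds(feature, q=0.995, n_boot=50, seed=0):
          """Bootstrap the threshold to quantify its uncertainty."""
          rng = np.random.default_rng(seed)
          return np.array([johnson_threshold(rng.choice(feature, size=feature.size, replace=True), q)
                           for _ in range(n_boot)])

      x = stats.lognorm.rvs(0.4, size=500, random_state=3)   # stand-in vibration feature
      thr = bootstrap_thresholds(x)
      print(round(float(np.median(thr)), 3), np.round(np.percentile(thr, [2.5, 97.5]), 3))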

  17. Defect Detection of Steel Surfaces with Global Adaptive Percentile Thresholding of Gradient Image

    Science.gov (United States)

    Neogi, Nirbhar; Mohanta, Dusmanta K.; Dutta, Pranab K.

    2017-12-01

    Steel strips are used extensively for white goods, auto bodies and other purposes where surface defects are not acceptable. On-line surface inspection systems can effectively detect and classify defects and help in taking corrective actions. For the detection of defects, the use of gradients is very popular for highlighting and subsequently segmenting areas of interest in a surface inspection system. Most of the time, segmentation by a fixed-value threshold leads to unsatisfactory results. As defects can be both very small and large in size, segmentation of a gradient image based on percentile thresholding can lead to inadequate or excessive segmentation of defective regions. A global adaptive percentile thresholding of the gradient image has been formulated for blister defects and water deposits (a pseudo defect) in steel strips. The developed method adaptively changes the percentile value used for thresholding depending on the number of pixels above some specific values of gray level of the gradient image. The method is able to segment defective regions selectively, preserving the characteristics of defects irrespective of the size of the defects. The developed method performs better than the Otsu method of thresholding and an adaptive thresholding method based on local properties.
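
    A hedged sketch of the idea only: build the gradient-magnitude image, then let the percentile used for thresholding depend on how much strong-gradient content is present, so that small defects are not washed out and large ones are not over-segmented. The piecewise adaptation rule and all numbers below are illustrative, not the paper's calibrated values:

      import numpy as np
      from scipy import ndimage

      def adaptive_percentile_threshold(image, ref_level=40.0, small=0.002, large=0.02,
                                        strict_pct=99.8, medium_pct=99.0, loose_pct=98.0):
          """Segment candidate defect pixels from the gradient image with a percentile
          that adapts to the fraction of strong-gradient pixels (illustrative rule)."""
          gx = ndimage.sobel(image.astype(float), axis=0)
          gy = ndimage.sobel(image.astype(float), axis=1)
          grad = np.hypot(gx, gy)
          frac_strong = np.mean(grad > ref_level)
          # Few strong-gradient pixels -> any defect is small -> use a stricter percentile.
          if frac_strong < small:
              pct = strict_pct
          elif frac_strong < large:
              pct = medium_pct
          else:
              pct = loose_pct
          return grad > np.percentile(grad, pct)

      img = np.random.default_rng(4).integers(0, 255, size=(256, 256))
      print(adaptive_percentile_threshold(img).mean())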

  18. Search for H-dibaryon at J-PARC with a Large Acceptance TPC

    Directory of Open Access Journals (Sweden)

    Sako H.

    2014-03-01

    H-dibaryon has been predicted as a stable 6-quark color-singlet state. It has been searched for by many experiments but has never been discovered. Recent lattice QCD calculations predict the H-dibaryon as a weakly bound or a resonant state close to the ΛΛ threshold. The E224 and E522 experiments at KEK observed peaks in ΛΛ invariant mass spectra near the threshold in (K-, K+) reactions, which were statistically not significant. Therefore, we proposed a new experiment, E42, at J-PARC. It will measure decay products of ΛΛ and Λπ-p in a (K-, K+) reaction. We design a large acceptance spectrometer based on a Time Projection Chamber (TPC) immersed in a dipole magnetic field. The TPC surrounds a target to cover nearly 4π acceptance, and accepts K- beams up to 10^6 counts per second. To suppress drift field distortion at high beam rates, we adopt Gas Electron Multipliers (GEMs) for electron amplification and a gating grid. We show an overview of the experiment, the design of the spectrometer, and the R&D status of the TPC prototype.

  19. Comparison of memory thresholds for planar qudit geometries

    Science.gov (United States)

    Marks, Jacob; Jochym-O'Connor, Tomas; Gheorghiu, Vlad

    2017-11-01

    We introduce and analyze a new type of decoding algorithm called general color clustering, based on renormalization group methods, to be used in qudit color codes. The performance of this decoder is analyzed under a generalized bit-flip error model, and is used to obtain the first memory threshold estimates for qudit 6-6-6 color codes. The proposed decoder is compared with similar decoding schemes for qudit surface codes as well as the current leading qubit decoders for both sets of codes. We find that, as with surface codes, clustering performs sub-optimally for qubit color codes, giving a threshold of 5.6% compared to the 8.0% obtained through surface projection decoding methods. However, the threshold rate increases by up to 112% for large qudit dimensions, plateauing around 11.9%. All the analysis is performed using QTop, a new open-source software for simulating and visualizing topological quantum error correcting codes.

  20. Improved Bat Algorithm Applied to Multilevel Image Thresholding

    Directory of Open Access Journals (Sweden)

    Adis Alihodzic

    2014-01-01

    Multilevel image thresholding is a very important image processing technique that is used as a basis for image segmentation and further higher level processing. However, the required computational time for exhaustive search grows exponentially with the number of desired thresholds. Swarm intelligence metaheuristics are well known as successful and efficient optimization methods for intractable problems. In this paper, we adjusted one of the latest swarm intelligence algorithms, the bat algorithm, for the multilevel image thresholding problem. The results of testing on standard benchmark images show that the bat algorithm is comparable with other state-of-the-art algorithms. We improved the standard bat algorithm, where our modifications add some elements from differential evolution and from the artificial bee colony algorithm. Our proposed improved bat algorithm proved to be better than five other state-of-the-art algorithms, improving the quality of results in all cases and significantly improving convergence speed.
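
    The fitness such metaheuristics typically maximize is a multilevel extension of Otsu's between-class variance (or Kapur's entropy); the sketch below shows only that objective, evaluated on a gray-level histogram, not the bat algorithm itself, and the two thresholds are arbitrary examples:

      import numpy as np

      def between_class_variance(hist, thresholds):
          """Multilevel Otsu objective: between-class variance of the histogram
          split at the given thresholds (a metaheuristic would maximize this)."""
          p = hist / hist.sum()
          levels = np.arange(p.size)
          mu_total = np.sum(levels * p)
          edges = [0] + sorted(int(t) for t in thresholds) + [p.size]
          var = 0.0
          for lo, hi in zip(edges[:-1], edges[1:]):
              w = p[lo:hi].sum()
              if w > 0:
                  mu = np.sum(levels[lo:hi] * p[lo:hi]) / w
                  var += w * (mu - mu_total) ** 2
          return var

      hist = np.bincount(np.random.default_rng(5).integers(0, 256, 100000), minlength=256)
      print(between_class_variance(hist, thresholds=(85, 170)))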

  1. Threshold-based Adaptive Detection for WSN

    KAUST Repository

    Abuzaid, Abdulrahman I.

    2014-01-06

    Efficient receiver designs for wireless sensor networks (WSNs) are becoming increasingly important. Cooperative WSNs communicate with the use of L sensors. As the receiver is constrained, it can only process U out of L sensors. Channel shortening and reduced-rank techniques were employed to design the preprocessing matrix. In this work, a receiver structure is proposed which combines the joint iterative optimization (JIO) algorithm and our proposed threshold selection criteria. This receiver structure assists in determining the optimal Uopt. It also provides the freedom to choose U

  2. Threshold-based Adaptive Detection for WSN

    KAUST Repository

    Abuzaid, Abdulrahman I.; Ahmed, Qasim Zeeshan; Alouini, Mohamed-Slim

    2014-01-01

    Efficient receiver designs for wireless sensor networks (WSNs) are becoming increasingly important. Cooperative WSNs communicate with the use of L sensors. As the receiver is constrained, it can only process U out of L sensors. Channel shortening and reduced-rank techniques were employed to design the preprocessing matrix. In this work, a receiver structure is proposed which combines the joint iterative optimization (JIO) algorithm and our proposed threshold selection criteria. This receiver structure assists in determining the optimal Uopt. It also provides the freedom to choose U

  3. Relationship between colorimetric (instrumental) evaluation and consumer-defined beef colour acceptability.

    Science.gov (United States)

    Holman, Benjamin W B; Mao, Yanwei; Coombs, Cassius E O; van de Ven, Remy J; Hopkins, David L

    2016-11-01

    The relationship between instrumental colorimetric values (L*, a*, b*, the ratio of reflectance at 630 nm and 580 nm) and consumer perception of acceptable beef colour was evaluated using a web-based survey and standardised photographs of beef m. longissimus lumborum with known colorimetrics. Only L* and b* were found to relate to average consumer opinions of beef colour acceptability. Respondent nationality was also identified as a source of variation in beef colour acceptability score. Although this is a preliminary study with the findings necessitating additional investigation, these results suggest L* and b* as candidates for developing instrumental thresholds for consumer beef colour expectations. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  4. Appropriate threshold levels of cardiac beat-to-beat variation in semi-automatic analysis of equine ECG recordings

    DEFF Research Database (Denmark)

    Madsen, Mette Flethøj; Kanters, Jørgen K.; Pedersen, Philip Juul

    2016-01-01

    considerably with heart rate (HR), and an adaptable model consisting of three different HR ranges with separate threshold levels of maximum acceptable RR deviation was consequently defined. For resting HRs

  5. The Acceptance of Background Noise in Adult Cochlear Implant Users

    Science.gov (United States)

    Plyler, Patrick N.; Bahng, Junghwa; von Hapsburg, Deborah

    2008-01-01

    Purpose: The purpose of this study was to determine (a) if acceptable noise levels (ANLs) are different in cochlear implant (CI) users than in listeners with normal hearing, (b) if ANLs are related to sentence reception thresholds in noise in CI users, and (c) if ANLs and subjective outcome measures are related in CI users. Method: ANLs and the…

  6. 1.3μm low threshold distributed feedback lasers for high bit-rate applications

    International Nuclear Information System (INIS)

    Artigue, C.; Louis, Y.; Padioleau, C.; Poingt, F.; Sigogne, D.; Starck, C.; Benoit, J.

    1985-01-01

    A low threshold current (≅ 30 mA) 1.3μm (InGaAsP) second order DFB laser with a ridge structure made by liquid phase epitaxy is reported. The low threshold results from: optimized heterostructure and grating profile, good tuning of the DFB wavelength with the peak gain wavelength, and the proper LPE regrowth conditions on the grating

  7. Photoproduction of the φ(1020) near threshold in CLAS

    International Nuclear Information System (INIS)

    Tedeschi, D.J.

    2002-01-01

    The differential cross section for the photoproduction of the φ(1020) near threshold (Eγ = 1.57 GeV) is predicted to be sensitive to production mechanisms other than diffraction. However, the existing low energy data are of limited statistics and kinematical coverage. Complete measurements of φ meson production on the proton have been performed at the Thomas Jefferson National Accelerator Facility using a liquid hydrogen target and the CEBAF Large Acceptance Spectrometer (CLAS). The φ was identified by missing mass using a proton and positive kaon detected by CLAS in coincidence with an electron in the photon tagger. The energy of the tagged bremsstrahlung photons ranged from the φ threshold to 2.4 GeV. A description of the data set and the differential cross section at Eγ = 2.0 GeV will be presented and compared with present theoretical calculations. (author)

  8. The problem of the detection threshold in radiation measurement

    International Nuclear Information System (INIS)

    Rose, E.; Wueneke, C.D.

    1983-01-01

    In all cases encountered in practical radiation measurement, the basic problem is to differentiate between the lowest measured value and the zero value (background, natural background radiation, etc.). For this purpose, on the mathematical side, hypothesis tests are to be applied. These will show the probability of differentiation between two values having the same random spread. By means of these tests and the corresponding error theory, a uniform treatment of the subject, applicable to all problems relating to measuring technique alike, can be found. Two basic concepts are found in this process, which have to be defined in terms of semantics and nomenclature: decision threshold and detection threshold, or 'minimum detectable mean value'. At the decision threshold, one has to decide (with a given statistical error probability) whether a measured value is to be attributed to the background radiation, accepting the zero hypothesis, or whether this value differs significantly from the background radiation (error of the 1st kind). The minimum detectable mean value is the value which, with a given decision threshold, can be determined with sufficient significance to be a measured value and thus cannot be mistaken for background radiation (alternative hypothesis, error of the 2nd kind). Normally, the two error types are of equal importance. It may happen, however, that one type of error gains more importance, depending on the approach. (orig.) [de]
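
    These two quantities are commonly formalized along the lines of Currie's classic treatment; the sketch below shows that textbook formulation for paired background counting with a Gaussian approximation and equal errors of the 1st and 2nd kind, which is an assumption and not necessarily the exact formulation used by the authors:

      from scipy.stats import norm

      def currie_thresholds(background_counts, alpha=0.05):
          """Decision threshold L_C and minimum detectable net counts L_D
          (Currie-style, paired background, Gaussian approximation, alpha = beta)."""
          k = norm.ppf(1 - alpha)
          l_c = k * (2.0 * background_counts) ** 0.5   # above L_C: reject the zero hypothesis
          l_d = k ** 2 + 2.0 * l_c                     # a true net signal of L_D is detected with prob. 1 - alpha
          return l_c, l_d

      print(currie_thresholds(400))   # e.g. B = 400 background counts -> approx. (46.5, 95.8)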

  9. The adaptive value of gluttony: predators mediate the life history trade-offs of satiation threshold.

    Science.gov (United States)

    Pruitt, J N; Krauel, J J

    2010-10-01

    Animals vary greatly in their tendency to consume large meals. Yet, whether or how meal size influences fitness in wild populations is infrequently considered. Using a predator exclusion, mark-recapture experiment, we estimated selection on the amount of food accepted during an ad libitum feeding bout (hereafter termed 'satiation threshold') in the wolf spider Schizocosa ocreata. Individually marked, size-matched females of known satiation threshold were assigned to predator exclusion and predator inclusion treatments and tracked for a 40-day period. We also estimated the narrow-sense heritability of satiation threshold using dam-on-female-offspring regression. In the absence of predation, high satiation threshold was positively associated with larger and faster egg case production. However, these selective advantages were lost when predators were present. We estimated the heritability of satiation threshold to be 0.56. Taken together, our results suggest that satiation threshold can respond to selection and begets a life history trade-off in this system: high satiation threshold individuals tend to produce larger egg cases but also suffer increased susceptibility to predation. © 2010 The Authors. Journal Compilation © 2010 European Society For Evolutionary Biology.

  10. Optimally frugal foraging

    Science.gov (United States)

    Bénichou, O.; Bhat, U.; Krapivsky, P. L.; Redner, S.

    2018-02-01

    We introduce the frugal foraging model in which a forager performs a discrete-time random walk on a lattice in which each site initially contains S food units. The forager metabolizes one unit of food at each step and starves to death when it last ate S steps in the past. Whenever the forager eats, it consumes all food at its current site and this site remains empty forever (no food replenishment). The crucial property of the forager is that it is frugal and eats only when encountering food within at most k steps of starvation. We compute the average lifetime analytically as a function of the frugality threshold and show that there exists an optimal strategy, namely, an optimal frugality threshold k* that maximizes the forager lifetime.
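
    The model is simple enough to simulate directly; a hedged 1D sketch (the paper treats the model analytically and its lattice and initial conditions may differ), in which the forager eats only when it is within k steps of starvation:

      import numpy as np

      def frugal_lifetime(S=8, k=2, n_sites=20001, seed=None):
          """One frugal forager on a 1D lattice: every site starts with food, the
          forager dies S steps after its last meal, and it eats (emptying the site
          forever) only when it is within k steps of starvation. Returns its lifetime."""
          rng = np.random.default_rng(seed)
          has_food = np.ones(n_sites, dtype=bool)
          pos, since_meal, t = n_sites // 2, 0, 0
          while since_meal < S:
              pos += rng.choice((-1, 1))
              t += 1
              since_meal += 1
              if has_food[pos] and since_meal >= S - k:   # frugal rule: eat only near starvation
                  has_food[pos] = False
                  since_meal = 0
          return t

      # Crude estimate of the average lifetime as a function of the frugality threshold k.
      for k in (0, 1, 2, 4, 8):
          print(k, np.mean([frugal_lifetime(S=8, k=k, seed=i) for i in range(200)]))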

  11. Threshold quantum cryptography

    International Nuclear Information System (INIS)

    Tokunaga, Yuuki; Okamoto, Tatsuaki; Imoto, Nobuyuki

    2005-01-01

    We present the concept of threshold collaborative unitary transformation or threshold quantum cryptography, which is a kind of quantum version of threshold cryptography. Threshold quantum cryptography states that classical shared secrets are distributed to several parties and a subset of them, whose number is greater than a threshold, collaborates to compute a quantum cryptographic function, while keeping each share secretly inside each party. The shared secrets are reusable if no cheating is detected. As a concrete example of this concept, we show a distributed protocol (with threshold) of conjugate coding

  12. Optimal grasp planning for a dexterous robotic hand using the volume of a generalized force ellipsoid during accepted flattening

    Directory of Open Access Journals (Sweden)

    Peng Jia

    2017-01-01

    A grasp planning method based on the volume and flattening of a generalized force ellipsoid is proposed to improve the grasping ability of a dexterous robotic hand. First, according to the general solution of joint torques for a dexterous robotic hand, a grasping indicator for the dexterous hand—the maximum volume of a generalized external force ellipsoid and the minimum volume of a generalized contact internal force ellipsoid during accepted flattening—is proposed. Second, an optimal grasp planning method based on a task is established using the grasping indicator as an objective function. Finally, a simulation analysis and grasping experiment are performed. Results show that when the grasping experiment is conducted with the grasping configuration and positions of contact points optimized using the proposed grasping indicator, the root-mean-square values of the joint torques and contact internal forces of the dexterous hand are at a minimum. The effectiveness of the proposed grasping planning method is thus demonstrated.

  13. OPTIMIZATION OF ETHANOL CONCENTRATION, GLYCEROL CONCENTRATION AND TEMPERATURE CONDITIONS OF GRAPE-MAHUA WINE TO MAXIMIZE THE QUALITY AND OVERALL ACCEPTABILITY

    Directory of Open Access Journals (Sweden)

    Mandeep Kaur

    2013-06-01

    Black grape (Vitis vinifera) and mahua (Madhuca longifolia) extract was used in a 90:10 grape-mahua ratio for fermentation for 15 days and subjected to clarification using bentonite and gelatin as fining agents. Ageing was allowed for three months and studies were conducted using response surface methodology to assess the effect of ethanol, glycerol and temperature on viscosity, color, specific gravity, pH and overall acceptability. Experimental designs were conducted and 20 samples were prepared containing varying concentrations of ethanol (7.55%-13.44%), glycerol (6.19-18.8 g/l) and temperature (5.6-22.4 °C), respectively. The maximum desirability of 93% was obtained for wine under the optimized conditions of 13.44% ethanol, 6.19 g/l glycerol and 14 °C temperature, having viscosity (efflux time) of 12.9 s, color absorbance of 4.61, SG of 1.0012, pH of 3.34 and overall acceptability of 8.47.

  14. Optimal threshold detection for Málaga turbulent optical links

    DEFF Research Database (Denmark)

    Jurado-Navas, Antonio; Garrido-Balsellss, José María; del Castillo Vázquez, Miguel

    2016-01-01

    A new and generalized statistical model, called Málaga distribution (M distribution), has been derived recently to characterize the irradiance fluctuations of an unbounded optical wave front propagating through a turbulent medium under all irradiance fluctuation conditions. As great advantages associated to that model, we can indicate that it is written in a simple tractable closed-form expression and that it is able to unify most of the proposed statistical models for free-space optical communications derived until now in the scientific literature. Based on that Málaga model, we have analyzed in this paper the role of the detection threshold in a free-space optical system employing an on-off keying modulation technique and involved in different scenarios, and taking into account the extinction ratio associated to the employed laser. First we have derived some analytical expressions for the lower...

  15. Interface Engineering for Precise Threshold Voltage Control in Multilayer-Channel Thin Film Transistors

    KAUST Repository

    Park, Jihoon

    2016-11-29

    A multilayer channel structure is used to effectively manipulate the threshold voltage of zinc oxide transistors without degrading their field-effect mobility. Transistors operating in enhancement mode with good mobility are fabricated by optimizing the structure of the multilayer channel. The optimization is attributed to the formation of an additional channel and suppression of the diffusion of absorbed water molecules and oxygen vacancies.

  16. Interface Engineering for Precise Threshold Voltage Control in Multilayer-Channel Thin Film Transistors

    KAUST Repository

    Park, Jihoon; Alshammari, Fwzah Hamud; Wang, Zhenwei; Alshareef, Husam N.

    2016-01-01

    A multilayer channel structure is used to effectively manipulate the threshold voltage of zinc oxide transistors without degrading their field-effect mobility. Transistors operating in enhancement mode with good mobility are fabricated by optimizing the structure of the multilayer channel. The optimization is attributed to the formation of an additional channel and suppression of the diffusion of absorbed water molecules and oxygen vacancies.

  17. Optimal threshold of error decision related to non-uniform phase distribution QAM signals generated from MZM based on OCS

    Science.gov (United States)

    Han, Xifeng; Zhou, Wen

    2018-03-01

    Optical vector radio-frequency (RF) signal generation based on optical carrier suppression (OCS) in one Mach-Zehnder modulator (MZM) can realize frequency doubling. In order to match the phase or amplitude of the recovered quadrature amplitude modulation (QAM) signal, phase or amplitude pre-coding is necessary at the transmitter side. The detected QAM signals usually have a non-uniform phase distribution after square-law detection at the photodiode because of the imperfect characteristics of the optical and electrical devices. We propose to use an optimal threshold of error decision for the non-uniform phase distribution to reduce the bit error rate (BER). By employing this scheme, the BER of a 16 Gbaud (32 Gbit/s) quadrature-phase-shift-keying (QPSK) millimeter wave signal at 36 GHz is improved from 1 × 10^-3 to 1 × 10^-4 at -4.6 dBm input power into the photodiode.

  18. Some problems in the acceptability of implementing radiation protection programs

    International Nuclear Information System (INIS)

    Neill, R.H.

    1997-01-01

    The three fundamentals that radiation protection programs are based upon are: 1) establishing a quantitative correlation between radiation exposure and biological effects in people; 2) determining a level of acceptable risk of exposure; and 3) establishing systems to measure the radiation dose to ensure compliance with the regulations or criteria. The paper discusses the interrelationship of these fundamentals and difficulties in obtaining a consensus on acceptable risk, and gives some examples of problems in identifying the most critical population-at-risk and in measuring dose. Despite such problems, it is recommended that we proceed with the existing conservative structure of radiation protection programs based upon a linear no-threshold model for low radiation doses to ensure public acceptability of various potential radiation risks. Voluntary compliance as well as regulatory requirements should continue to be pursued to maintain minimal exposure to ionizing radiation. (author)

  19. Shape optimization for Stokes problem with threshold slip

    Czech Academy of Sciences Publication Activity Database

    Haslinger, J.; Stebel, Jan; Taoufik, S.

    2014-01-01

    Roč. 59, č. 6 (2014), s. 631-652 ISSN 0862-7940 R&D Projects: GA ČR GA201/09/0917; GA ČR(CZ) GAP201/12/0671 Institutional support: RVO:67985840 Keywords : Stokes problem * friction boundary condition * shape optimization Subject RIV: BA - General Mathematics Impact factor: 0.400, year: 2014 http://link.springer.com/article/10.1007%2Fs10492-014-0077-z

  20. Near-Threshold Computing and Minimum Supply Voltage of Single-Rail MCML Circuits

    Directory of Open Access Journals (Sweden)

    Ruiping Cao

    2014-01-01

    In high-speed applications, MOS current mode logic (MCML) is a good alternative. Scaling down the supply voltage of MCML circuits can achieve a low power-delay product (PDP). However, currently almost all MCML circuits are realized with a dual-rail scheme, where the NMOS configuration in series limits the minimum supply voltage. In this paper, single-rail MCML (SRMCML) circuits are described, which can avoid the device configuration in series, since their logic evaluation block can be realized by only using MOS devices in parallel. The relationship between the minimum supply voltage of the SRMCML circuits and the model parameters of MOS transistors is derived, so that the minimum supply voltage can be estimated before circuit design. An MCML dynamic flip-flop based on SRMCML is also proposed. The optimization algorithm for near-threshold sequential circuits is presented. A near-threshold SRMCML mode-10 counter based on the optimization algorithm is verified. Scaling down the supply voltage of the SRMCML circuits is also investigated. The power dissipation, delay, and power-delay products of these circuits are evaluated. The results show that the near-threshold SRMCML circuits can obtain low delay and small power-delay product.

  1. Dynamic-thresholding level set: a novel computer-aided volumetry method for liver tumors in hepatic CT images

    Science.gov (United States)

    Cai, Wenli; Yoshida, Hiroyuki; Harris, Gordon J.

    2007-03-01

    Measurement of the volume of focal liver tumors, called liver tumor volumetry, is indispensable for assessing the growth of tumors and for monitoring the response of tumors to oncology treatments. Traditional edge models, such as the maximum gradient and zero-crossing methods, often fail to detect the accurate boundary of a fuzzy object such as a liver tumor. As a result, the computerized volumetry based on these edge models tends to differ from manual segmentation results performed by physicians. In this study, we developed a novel computerized volumetry method for fuzzy objects, called dynamic-thresholding level set (DT level set). An optimal threshold value computed from a histogram tends to shift, relative to the theoretical threshold value obtained from a normal distribution model, toward a smaller region in the histogram. We thus designed a mobile shell structure, called a propagating shell, which is a thick region encompassing the level set front. The optimal threshold calculated from the histogram of the shell drives the level set front toward the boundary of a liver tumor. When the volume ratio between the object and the background in the shell approaches one, the optimal threshold value best fits the theoretical threshold value and the shell stops propagating. Application of the DT level set to 26 hepatic CT cases with 63 biopsy-confirmed hepatocellular carcinomas (HCCs) and metastases showed that the computer measured volumes were highly correlated with those of tumors measured manually by physicians. Our preliminary results showed that DT level set was effective and accurate in estimating the volumes of liver tumors detected in hepatic CT images.
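
    The core update is re-estimating a threshold from the histogram of a thick shell around the current front and watching the object/background volume ratio inside the shell as the stopping signal. A hedged sketch of a single such step (the full level-set propagation is omitted, and the shell construction via binary dilation/erosion is an implementation assumption):

      import numpy as np
      from scipy import ndimage

      def shell_threshold(image, segmentation, shell_width=3):
          """One dynamic-thresholding step: Otsu threshold computed only from the
          voxels in a thick shell around the current front, plus the object/background
          volume ratio in the shell (propagation stops as the ratio approaches one)."""
          dilated = ndimage.binary_dilation(segmentation, iterations=shell_width)
          eroded = ndimage.binary_erosion(segmentation, iterations=shell_width)
          values = image[dilated & ~eroded]
          t = _otsu(values)
          ratio = np.count_nonzero(values > t) / max(np.count_nonzero(values <= t), 1)
          return t, ratio

      def _otsu(values, bins=256):
          """Plain Otsu threshold on a 1D sample (maximizes between-class variance)."""
          hist, edges = np.histogram(values, bins=bins)
          p = hist / hist.sum()
          centers = 0.5 * (edges[:-1] + edges[1:])
          best_t, best_var = centers[0], -1.0
          for i in range(1, bins):
              w0, w1 = p[:i].sum(), p[i:].sum()
              if w0 > 0 and w1 > 0:
                  mu0 = np.sum(centers[:i] * p[:i]) / w0
                  mu1 = np.sum(centers[i:] * p[i:]) / w1
                  var = w0 * w1 * (mu0 - mu1) ** 2
                  if var > best_var:
                      best_t, best_var = centers[i], var
          return best_t

      img = np.random.default_rng(6).normal(0, 0.1, (64, 64))
      img[20:44, 20:44] += 1.0                                  # bright synthetic "tumor"
      seg = np.zeros((64, 64), dtype=bool); seg[24:40, 24:40] = True
      print(shell_threshold(img, seg))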

  2. Cost-effectiveness analysis of the optimal threshold of an automated immunochemical test for colorectal cancer screening: performances of immunochemical colorectal cancer screening.

    Science.gov (United States)

    Berchi, Célia; Guittet, Lydia; Bouvier, Véronique; Launoy, Guy

    2010-01-01

    Most industrialized countries, including France, have undertaken to generalize colorectal cancer screening using guaiac fecal occult blood tests (G-FOBT). However, recent researches demonstrate that immunochemical fecal occult blood tests (I-FOBT) are more effective than G-FOBT. Moreover, new generation I-FOBT benefits from a quantitative reading technique allowing the positivity threshold to be chosen, hence offering the best balance between effectiveness and cost. We aimed at comparing the cost and the clinical performance of one round of screening using I-FOBT at different positivity thresholds to those obtained with G-FOBT to determine the optimal cut-off for I-FOBT. Data were derived from an experiment conducted from June 2004 to December 2005 in Calvados (France) where 20,322 inhabitants aged 50-74 years performed both I-FOBT and G-FOBT. Clinical performance was assessed by the number of advanced tumors screened, including large adenomas and cancers. Costs were assessed by the French Social Security Board and included only direct costs. Screening using I-FOBT resulted in better health outcomes and lower costs than screening using G-FOBT for thresholds comprised between 75 and 93 ng/ml. I-FOBT at 55 ng/ml also offers a satisfactory alternative to G-FOBT, because it is 1.8-fold more effective than G-FOBT, without increasing the number of unnecessary colonoscopies, and at an extra cost of 2,519 euros per advanced tumor screened. The use of an automated I-FOBT at 75 ng/ml would guarantee more efficient screening than currently used G-FOBT. Health authorities in industrialized countries should consider the replacement of G-FOBT by an automated I-FOBT test in the near future.

  3. Atomic and molecular threshold photoelectron spectroscopy

    Directory of Open Access Journals (Sweden)

    Maria Cristina Andreolli Lopes

    2006-02-01

    A threshold photoelectron spectrometer applied to the study of atomic and molecular threshold photoionization processes is described. The spectrometer has been used in conjunction with a toroidal grating monochromator at the National Synchrotron Radiation Laboratory (LNLS, Brazil). It can be tuned to accept threshold electrons (< 20 meV) and work with a resolving power of 716 (~18 meV) at 12 eV with a high signal/noise ratio. The performance of this apparatus and some characteristics of the TGM (Toroidal Grating Monochromator) beam line of the LNLS are described and discussed by means of argon, O2 and N2 threshold photoelectron spectra.

  4. Is skin penetration a determining factor in skin sensitization potential and potency? Refuting the notion of a LogKow threshold for Skin Sensitization

    Science.gov (United States)

    Summary: Background. It is widely accepted that substances that cannot penetrate through the skin will not be sensitisers. Thresholds based on relevant physicochemical parameters, such as a LogKow > 1 and a MW < 500, are assumed and widely accepted as self-evident truths. Objective...

  5. On the implications of thresholds for economic science and environmental policy

    Energy Technology Data Exchange (ETDEWEB)

    Aalbers, R.

    1999-05-11

    This thesis consists of four chapters on the topic of thresholds. Chapter 2 deals with the issue of a truly catastrophic threshold. Society has perfect information on both the location and the impact of crossing the threshold. The assumption is made that crossing the threshold will result in the destruction of all human utility on earth (which is not the same as the destruction of all human life on earth). On the basis of a simple neoclassical growth model the question is posed under what conditions society would like to cross the threshold, and hence, initiate a catastrophe. The trade-off for society is to have a relatively low consumption level for an infinite period of time, or to have a relatively high consumption level for a short period of time. Perhaps surprisingly, it turns out that the doomsday scenario may be optimal (in the sense of maximizing human utility). Chapter 3 extends the analysis of the second chapter by allowing society to spend resources either on consumption or on abatement. In chapter 4 the assumption of certainty of information about both the location and the impact of a threshold is relaxed. Instead it is assumed that a society has information - in the form of a probability density function - about both the location of the threshold and its impact. What then is the optimal strategy for that society? In addition, it is analysed how society's optimal strategy changes if the uncertainty increases. We will see that if uncertainty about the impact of the threshold increases, society's strategy will become more prudent. Moreover, it is argued that any form of cost-benefit analysis must - whenever thresholds cannot be excluded a priori from the analysis - necessarily be based on arbitrary, in the sense of non-empirically verifiable, assumptions about the shape of the damage function. Finally, the author examines the case in which society does not have enough - quantitative or qualitative - information in order to obtain, or estimate, a

  6. Discriminating the precipitation phase based on different temperature thresholds in the Songhua River Basin, China

    Science.gov (United States)

    Zhong, Keyuan; Zheng, Fenli; Xu, Ximeng; Qin, Chao

    2018-06-01

    Different precipitation phases (rain, snow or sleet) differ greatly in their hydrological and erosional processes. Therefore, accurate discrimination of the precipitation phase is highly important when researching hydrologic processes and climate change at high latitudes and in mountainous regions. The objective of this study was to identify suitable temperature thresholds for discriminating the precipitation phase in the Songhua River Basin (SRB) based on 20 years of daily precipitation collected from 60 meteorological stations located in and around the basin. Two methods, the air temperature method (AT method) and the wet bulb temperature method (WBT method), were used to discriminate the precipitation phase. Thirteen temperature thresholds were used to discriminate snowfall in the SRB. These thresholds included air temperatures from 0 to 5.5 °C at intervals of 0.5 °C and the wet bulb temperature (WBT). Three evaluation indices, the error percentage of discriminated snowfall days (Ep), the relative error of discriminated snowfall (Re) and the determination coefficient (R2), were applied to assess the discrimination accuracy. The results showed that 2.5 °C was the optimum threshold temperature for discriminating snowfall at the scale of the entire basin. Due to differences in the landscape conditions at the different stations, the optimum threshold varied by station. The optimal threshold ranged from 1.5 to 4.0 °C, and 19 stations, 17 stations and 18 stations had optimal thresholds of 2.5 °C, 3.0 °C, and 3.5 °C respectively, accounting for 90% of all stations. Compared with using a single suitable temperature threshold to discriminate snowfall throughout the basin, it was more accurate to use the optimum threshold at each station to estimate snowfall in the basin. In addition, snowfall was underestimated when the temperature threshold was the WBT and when the temperature threshold was below 2.5 °C, whereas snowfall was overestimated when the temperature threshold exceeded 4
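
    The comparison boils down to classifying each precipitation day as snow whenever the chosen temperature (air or wet-bulb) is at or below a threshold, then scoring the classification against the observed phase. The sketch below uses plain-language versions of the Ep and Re indices and synthetic data, both of which are assumptions rather than the study's exact definitions:

      import numpy as np

      def evaluate(temp_c, precip_mm, observed_snow, threshold_c):
          """Ep here: fraction of days whose predicted phase disagrees with observation;
          Re here: relative error of the total discriminated snowfall amount."""
          pred = np.asarray(temp_c) <= threshold_c
          obs = np.asarray(observed_snow, dtype=bool)
          precip_mm = np.asarray(precip_mm, dtype=float)
          ep = float(np.mean(pred != obs))
          re = float((precip_mm[pred].sum() - precip_mm[obs].sum()) / max(precip_mm[obs].sum(), 1e-9))
          return ep, re

      rng = np.random.default_rng(7)
      t = rng.normal(2.0, 4.0, 500)                     # daily temperature, deg C
      p = rng.gamma(2.0, 3.0, 500)                      # daily precipitation, mm
      obs = (t + rng.normal(0, 1.0, 500)) <= 2.5        # noisy "observed" snow days
      for thr in (1.5, 2.5, 3.5):
          print(thr, np.round(evaluate(t, p, obs, thr), 3))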

  7. On the controlling parameters for fatigue-crack threshold at low homologous temperatures

    International Nuclear Information System (INIS)

    Yu, W.; Gerberich, W.W.

    1983-01-01

    Fatigue crack propagation phenomena near the threshold stress intensity level ΔK_TH have been a vigorously studied topic in recent years. Near threshold the crack propagates rather slowly, thus giving enough time for various physical and chemical reactions to take place. Room air, which is the most commonly encountered environment, can still supply various ingredients such as oxygen, water vapor (and thus hydrogen) to support these reactions. Much effort has been directed toward the environmental aspects of near-threshold fatigue crack growth. By conducting tests under vacuum, Suresh and coworkers found that the crack propagation rate in a 2-1/4 Cr-1Mo steel was higher in vacuum than in air. An oxide-induced closure, which served to reduce the effective stress intensity at the crack tip, seems to furnish a good explanation. Neumann and coworkers proposed that during the fatigue process, extrusion-intrusion pairs can develop as a consequence of reversed slip around the crack tip when the crack is propagated near the threshold stress intensity. Beevers demonstrated that fatigue fracture surfaces contact each other during unloading even under tension-tension cycling. Kanninen and Atkinson also reached the conclusion that the compressive stress acting at the crack tip due to residual plasticity can induce closure. Microstructural effects have also been cited as important factors in near-threshold crack growth. It is generally accepted that coarser grains have a beneficial effect on the resistance to near-threshold crack propagation

  8. Optimization methodology for large scale fin geometry on the steel containment of a Public Acceptable Simple SMR (PASS)

    International Nuclear Information System (INIS)

    Kim, Do Yun; NO, Hee Cheon; Kim, Ho Sik

    2015-01-01

    Highlights: • Optimization methodology for fin geometry on the steel containment is established. • Optimum spacing is 7 cm in the PASS containment. • Optimum thickness is 0.9–1.8 cm when the fin height is 10–25 cm. • Optimal fin geometry is determined for a given fin height by an overall effectiveness correlation. • 13% of material volume and 43% of containment volume are reduced by using fins. - Abstract: Heat removal capability through a steel containment is important in accident situations to preserve the integrity of a nuclear power plant which adopts a steel containment concept. The heat transfer rate will be enhanced by using fins on the external surface of the steel containment. The fins, however, increase flow resistance and can thereby deteriorate the heat transfer rate at the same time. Therefore, this study investigates an optimization methodology for large scale fin geometry for a vertical base where the natural convection flow regime is turbulent. Rectangular plate fins adopted in the steel containment of a Public Acceptable Simple SMR (PASS) are used as a reference. The heat transfer rate through the fins is obtained from CFD tools. In order to optimize fin geometry, an overall effectiveness concept is introduced as a fin performance parameter. The optimization procedure starts by finding the optimum spacing. Then, the optimum thickness is calculated and finally the optimal fin geometry is suggested. Scale analysis is conducted to show the existence of an optimum spacing, which turns out to be 7 cm in the case of PASS. The optimum thickness is obtained from the overall effectiveness correlation, which is derived from a total heat transfer coefficient correlation. The total heat transfer coefficient correlation of a vertical fin array is suggested considering both natural convection and radiation. However, the optimum thickness changes as the fin height varies. Therefore, the optimal fin geometry is obtained as a function of the fin height. With the assumption that the heat

  9. Optimal Quality Strategy and Matching Service on Crowdfunding Platforms

    Directory of Open Access Journals (Sweden)

    Wenqing Wu

    2018-04-01

    This paper develops a crowdfunding platform model incorporating quality and a matching service from the perspective of a two-sided market. It aims to explore the impact of different factors on the optimal quality threshold and matching service in a crowdfunding context. Two important influential factors are considered simultaneously: the quality threshold for admission and the matching efficiency on crowdfunding platforms. The paper develops a two-sided market model incorporating quality, a matching service, and the characteristics of crowdfunding campaigns. Solving the model by the derivative method, it identifies the mechanism by which the parameters influence the optimal quality threshold and matching service. Additionally, it compares the platform profits in scenarios with and without an exclusion policy. The results demonstrate that excluding low-quality projects is profitable when funder preference for project quality is substantial enough. Crowdfunding platform managers would be unwise to impose a quality threshold on crowdfunding projects and charge entrance fees when the parameter of funder preference for project quality is small.

  10. CARA Risk Assessment Thresholds

    Science.gov (United States)

    Hejduk, M. D.

    2016-01-01

    Warning remediation threshold (Red threshold): Pc level at which warnings are issued, and active remediation considered and usually executed. Analysis threshold (Green to Yellow threshold): Pc level at which analysis of event is indicated, including seeking additional information if warranted. Post-remediation threshold: Pc level to which remediation maneuvers are sized in order to achieve event remediation and obviate any need for immediate follow-up maneuvers. Maneuver screening threshold: Pc compliance level for routine maneuver screenings (more demanding than regular Red threshold due to additional maneuver uncertainty).

  11. Optimization of contrast-enhanced spectral mammography depending on clinical indication.

    Science.gov (United States)

    Dromain, Clarisse; Canale, Sandra; Saab-Puong, Sylvie; Carton, Ann-Katherine; Muller, Serge; Fallenberg, Eva Maria

    2014-10-01

    The objective is to optimize low-energy (LE) and high-energy (HE) exposure parameters of contrast-enhanced spectral mammography (CESM) examinations in four different clinical applications for which different levels of average glandular dose (AGD) and ratios between LE and total doses are required. The optimization was performed on a Senographe DS with a SenoBright® upgrade. Simulations were performed to find the optima by maximizing the contrast-to-noise ratio (CNR) on the recombined CESM image using different targeted doses and LE image quality. The linearity between iodine concentration and CNR as well as the minimal detectable iodine concentration was assessed. The image quality of the LE image was assessed on the CDMAM contrast-detail phantom. Experiments confirmed the optima found in simulation. The CNR was higher for each clinical indication than for SenoBright®, including the screening indication, for which the total AGD was 22% lower. Minimal iodine concentrations detectable in the case of a 3-mm-diameter round tumor were 12.5% lower than those obtained for the same dose in the clinical routine. LE image quality satisfied EUREF acceptable limits for threshold contrast. This newly optimized set of acquisition parameters allows increased contrast detectability compared to parameters currently used, without a significant loss in LE image quality.

  12. Ideal Standards, Acceptance, and Relationship Satisfaction: Latitudes of Differential Effects

    Directory of Open Access Journals (Sweden)

    Asuman Buyukcan-Tetik

    2017-09-01

    We examined whether the relations of consistency between ideal standards and perceptions of a current romantic partner with partner acceptance and relationship satisfaction level off, or decelerate, above a threshold. We tested our hypothesis using a 3-year longitudinal data set collected from heterosexual newlywed couples. We used two indicators of consistency: pattern correspondence (within-person correlation between ideal standards and perceived partner ratings) and mean-level match (difference between ideal standards score and perceived partner score). Our results revealed that pattern correspondence had no relation with partner acceptance, but a positive linear/exponential association with relationship satisfaction. Mean-level match had a significant positive association with actor's acceptance and relationship satisfaction up to the point where perceived partner score equaled ideal standards score. Partner effects did not show a consistent pattern. The results suggest that the consistency between ideal standards and perceived partner attributes has a non-linear association with acceptance and relationship satisfaction, although the results were more conclusive for mean-level match.

  13. Formulation, evaluation and 3^2 full factorial design-based optimization of ondansetron hydrochloride incorporated taste masked microspheres.

    Science.gov (United States)

    Kharb, Vandana; Saharan, Vikas Anand; Dev, Kapil; Jadhav, Hemant; Purohit, Suresh

    2014-11-01

    Masking the bitter taste of Ondansetron hydrochloride (ONS) may improve palatability, acceptance and compliance of ONS products. ONS-loaded, taste-masked microspheres were prepared with a polycationic pH-sensitive polymer, and a 3^2 full factorial design (FFD) was applied to optimize the microsphere batches. Solvent evaporation, in an acetone-methanol/liquid paraffin system, was used to prepare the taste-masked ONS microspheres. The effect of varying drug/polymer (D/P) ratios on microsphere characteristics was studied by the 3^2 FFD. A desirability function was used to search for the optimum formulation. Microspheres were evaluated by FTIR, XRD and DSC to examine interactions and the effect of the microencapsulation process. An in vitro taste assessment approach based on bitterness threshold and drug release was used to assess bitterness scores. The prepared ONS microspheres were spherical and their surface was wrinkled. ONS was molecularly dispersed in the microspheres without any incompatibility with EE100. In hydrochloric acid buffer pH 1.2, ONS released completely from the microspheres in just 10 min. Contrary to this, ONS release in the initial 5 min from the taste-masked microspheres was less than the bitterness threshold. The full factorial design and in vitro taste assessment approach, coupled together, was successfully applied to develop and optimize batches of ONS incorporated taste-masked microspheres.

  14. A problem of finding an acceptable variant in generalized project networks

    Directory of Open Access Journals (Sweden)

    David Blokh

    2005-01-01

    A project network often has some activities or groups of activities which can be performed at different stages of the project. Then, the problem of finding an optimal/acceptable time and/or optimal/acceptable order of such an activity or a group of activities arises. Such a problem emerges, in particular, in house-building management, when the beginnings of some activities may vary in time and/or order. We consider a mathematical formulation of the problem, show its computational complexity, and describe an algorithm for solving the problem.

  15. A derivation of the stable cavitation threshold accounting for bubble-bubble interactions.

    Science.gov (United States)

    Guédra, Matthieu; Cornu, Corentin; Inserra, Claude

    2017-09-01

    The subharmonic emission of sound coming from the nonlinear response of a bubble population is the most used indicator for stable cavitation. When driven at twice their resonance frequency, bubbles can exhibit subharmonic spherical oscillations if the acoustic pressure amplitude exceeds a threshold value. Although various theoretical derivations exist for the subharmonic emission by free or coated bubbles, they all rest on the single bubble model. In this paper, we propose an analytical expression of the subharmonic threshold for interacting bubbles in a homogeneous, monodisperse cloud. This theory predicts a shift of the subharmonic resonance frequency and a decrease of the corresponding pressure threshold due to the interactions. For a given sonication frequency, these results show that an optimal value of the interaction strength (i.e. the number density of bubbles) can be found for which the subharmonic threshold is minimum, which is consistent with recently published experiments conducted on ultrasound contrast agents. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Decoding suprathreshold stochastic resonance with optimal weights

    International Nuclear Information System (INIS)

    Xu, Liyan; Vladusich, Tony; Duan, Fabing; Gunn, Lachlan J.; Abbott, Derek; McDonnell, Mark D.

    2015-01-01

    We investigate an array of stochastic quantizers for converting an analog input signal into a discrete output in the context of suprathreshold stochastic resonance. A new optimal weighted decoding is considered for different threshold level distributions. We show that for particular noise levels and choices of the threshold levels optimally weighting the quantizer responses provides a reduced mean square error in comparison with the original unweighted array. However, there are also many parameter regions where the original array provides near optimal performance, and when this occurs, it offers a much simpler approach than optimally weighting each quantizer's response. - Highlights: • A weighted summing array of independently noisy binary comparators is investigated. • We present an optimal linearly weighted decoding scheme for combining the comparator responses. • We solve for the optimal weights by applying least squares regression to simulated data. • We find that the MSE distortion of weighting before summation is superior to unweighted summation of comparator responses. • For some parameter regions, the decrease in MSE distortion due to weighting is negligible
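
    A hedged numpy sketch of the approach in the highlights: simulate an array of independently noisy binary comparators, fit linear decoding weights by least squares on the simulated data, and compare the mean-square error against an unweighted (scaled-sum) decoder. The Gaussian signal/noise model and the spread of threshold levels are illustrative assumptions:

      import numpy as np

      def comparator_responses(x, thresholds, noise_sd, rng):
          """Independently noisy binary comparators: y_i = 1 if x + noise_i > theta_i."""
          noise = rng.normal(0, noise_sd, size=(x.size, thresholds.size))
          return (x[:, None] + noise > thresholds[None, :]).astype(float)

      rng = np.random.default_rng(8)
      n, m, noise_sd = 20000, 15, 0.6
      x = rng.normal(0, 1, n)                           # analog input signal
      thetas = np.linspace(-1.5, 1.5, m)                # spread threshold levels
      Y = comparator_responses(x, thetas, noise_sd, rng)

      # Unweighted decoding: affine rescaling of the plain sum of comparator outputs.
      s = Y.sum(axis=1)
      a, b = np.polyfit(s, x, 1)
      mse_unweighted = np.mean((a * s + b - x) ** 2)

      # Optimal linear weights (plus bias) by least squares on the simulated data.
      A = np.hstack([Y, np.ones((n, 1))])
      w, *_ = np.linalg.lstsq(A, x, rcond=None)
      mse_weighted = np.mean((A @ w - x) ** 2)
      print(round(mse_unweighted, 4), round(mse_weighted, 4))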

  17. Numerical investigation of the inertial cavitation threshold under multi-frequency ultrasound.

    Science.gov (United States)

    Suo, Dingjie; Govind, Bala; Zhang, Shengqi; Jing, Yun

    2018-03-01

    Through the introduction of multi-frequency sonication in High Intensity Focused Ultrasound (HIFU), enhanced efficiency has been noted in several applications including thrombolysis, tissue ablation, sonochemistry, and sonoluminescence. One key experimental observation is that multi-frequency ultrasound can help lower the inertial cavitation threshold, thereby improving the power efficiency. However, this has not been well corroborated by theory. In this paper, a numerical investigation of the inertial cavitation threshold of microbubbles (MBs) under multi-frequency ultrasound irradiation is conducted. The relationships between the cavitation threshold and MB size at various frequencies and in different media are investigated. The results of single-, dual- and triple-frequency sonication show reduced inertial cavitation thresholds when additional frequencies are introduced, which is consistent with previous experimental work. In addition, no significant difference is observed between dual-frequency sonications with various frequency differences. This study not only reaffirms the benefit of using multi-frequency ultrasound for various applications, but also provides a possible route for optimizing ultrasound excitations to initiate inertial cavitation. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Norm based Threshold Selection for Fault Detectors

    DEFF Research Database (Denmark)

    Rank, Mike Lind; Niemann, Henrik

    1998-01-01

    The design of fault detectors for fault detection and isolation (FDI) in dynamic systems is considered from a norm-based point of view. An analysis of norm-based threshold selection is given for different formulations of FDI problems. Both the nominal FDI problem as well as the uncertain FDI...... problem are considered. Based on this analysis, a performance index based on norms of the involved transfer functions is given. The performance index also allows us to optimize the structure of the fault detection filter directly...

  19. Optimal Network-Topology Design

    Science.gov (United States)

    Li, Victor O. K.; Yuen, Joseph H.; Hou, Ting-Chao; Lam, Yuen Fung

    1987-01-01

    Candidate network designs tested for acceptability and cost. Optimal Network Topology Design computer program developed as part of study on topology design and analysis of performance of Space Station Information System (SSIS) network. Uses efficient algorithm to generate candidate network designs consisting of subsets of set of all network components, in increasing order of total costs and checks each design to see whether it forms acceptable network. Technique gives true cost-optimal network and particularly useful when network has many constraints and not too many components. Program written in PASCAL.

  20. Optimal Control and Optimization of Stochastic Supply Chain Systems

    CERN Document Server

    Song, Dong-Ping

    2013-01-01

    Optimal Control and Optimization of Stochastic Supply Chain Systems examines its subject in the context of the presence of a variety of uncertainties. Numerous examples with intuitive illustrations and tables are provided, to demonstrate the structural characteristics of the optimal control policies in various stochastic supply chains and to show how to make use of these characteristics to construct easy-to-operate sub-optimal policies. In Part I, a general introduction to stochastic supply chain systems is provided. Analytical models for various stochastic supply chain systems are formulated and analysed in Part II. In Part III the structural knowledge of the optimal control policies obtained in Part II is utilized to construct easy-to-operate sub-optimal control policies for various stochastic supply chain systems accordingly. Finally, Part IV discusses the optimisation of threshold-type control policies and their robustness. A key feature of the book is its tying together of ...

  1. Theory of threshold phenomena

    International Nuclear Information System (INIS)

    Hategan, Cornel

    2002-01-01

    Theory of Threshold Phenomena in Quantum Scattering is developed in terms of Reduced Scattering Matrix. Relationships of different types of threshold anomalies both to nuclear reaction mechanisms and to nuclear reaction models are established. Magnitude of threshold effect is related to spectroscopic factor of zero-energy neutron state. The Theory of Threshold Phenomena, based on Reduced Scattering Matrix, does establish relationships between different types of threshold effects and nuclear reaction mechanisms: the cusp and non-resonant potential scattering, s-wave threshold anomaly and compound nucleus resonant scattering, p-wave anomaly and quasi-resonant scattering. A threshold anomaly related to resonant or quasi resonant scattering is enhanced provided the neutron threshold state has large spectroscopic amplitude. The Theory contains, as limit cases, Cusp Theories and also results of different nuclear reactions models as Charge Exchange, Weak Coupling, Bohr and Hauser-Feshbach models. (author)

  2. Precipitation thresholds for landslide occurrence near Seattle, Mukilteo, and Everett, Washington

    Science.gov (United States)

    Scheevel, Caroline R.; Baum, Rex L.; Mirus, Benjamin B.; Smith, Joel B.

    2017-04-27

    Shallow landslides along coastal bluffs frequently occur in the railway corridor between Seattle and Everett, Washington. These slides disrupt passenger rail service, both because of required track maintenance and because the railroad owner, Burlington Northern Santa Fe Railway, does not allow passenger travel for 48 hours after a disruptive landslide. Sound Transit, which operates commuter trains in the corridor, is interested in a decision-making tool to help preemptively cancel passenger railway service in dangerous conditions and reallocate resources to alternative transportation. Statistical analysis showed that a majority of landslides along the Seattle-Everett Corridor are strongly correlated with antecedent rainfall, but that 21-37 percent of recorded landslide dates experienced less than 1 inch of precipitation in the 3 days preceding the landslide and less than 4 inches of rain in the 15 days prior to the preceding 3 days. We developed two empirical thresholds to identify precipitation conditions correlated with landslide occurrence. The two thresholds are defined as P3 = 2.16-0.44P15 and P3 = 2.16-0.22P32, where P3 is the cumulative precipitation in the 3 days prior to the considered date and P15 or P32 is the cumulative precipitation in the 15 days or 32 days prior to P3 (all measurements given in inches). The two thresholds, when compared to a previously developed threshold, quantitatively improve the prediction rate. We also investigated rainfall intensity-duration (ID) thresholds to determine whether revision would improve identification of moderate-intensity, landslide-producing storms. New, optimized ID thresholds evaluate rainstorms lasting at least 12 hours and identify landslide-inducing storms that were typically missed by previously published ID thresholds. The main advantage of the ID thresholds appears when they are combined with recent-antecedent thresholds because rainfall conditions that exceed both threshold types are more likely to induce
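
    As an illustration of the recent/antecedent thresholds quoted above, the sketch below evaluates both lines for the most recent day of a daily precipitation record. The synthetic record and the convention that exceedance means plotting above the threshold line are assumptions for the example only.

```python
import numpy as np

def recent_antecedent_check(daily_precip_inches):
    """Evaluate both thresholds from the abstract for the most recent day of a
    daily precipitation record (inches): P3 vs 2.16 - 0.44*P15 and 2.16 - 0.22*P32."""
    p = np.asarray(daily_precip_inches, dtype=float)
    p3 = p[-3:].sum()            # cumulative precipitation over the last 3 days
    p15 = p[-18:-3].sum()        # the 15 days immediately before that 3-day window
    p32 = p[-35:-3].sum()        # the 32 days immediately before that 3-day window
    # "Exceedance" is taken here as plotting above the threshold line.
    return {
        "P3": p3, "P15": p15, "P32": p32,
        "exceeds_15day_threshold": p3 > 2.16 - 0.44 * p15,
        "exceeds_32day_threshold": p3 > 2.16 - 0.22 * p32,
    }

# Hypothetical 35-day record (inches/day) ending on the day being assessed.
rng = np.random.default_rng(1)
record = rng.gamma(shape=0.5, scale=0.3, size=35)
print(recent_antecedent_check(record))
```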

  3. The effect of acupuncture duration on analgesia and peripheral sensory thresholds

    Directory of Open Access Journals (Sweden)

    Schulteis Gery

    2008-05-01

    Full Text Available Abstract. Background: Acupuncture provides a means of peripheral stimulation for pain relief. However, the detailed neuronal mechanisms by which acupuncture relieves pain are still poorly understood, and information regarding optimal treatment settings is still inadequate. Previous studies with a short burst of unilateral electroacupuncture (EA) in the Tendinomuscular Meridians (TMM) treatment model for pain demonstrated a transient, dermatomally correlated, bilateral analgesic effect with corresponding peripheral modality-specific sensory threshold alterations. However, the impact of EA duration on the analgesic effect in this particular treatment model is unknown. To obtain mechanistically and clinically important information regarding EA analgesia, this prospective cross-over study assesses the effects of EA duration on analgesia and thermal sensory thresholds in the TMM treatment model. Methods: Baseline peripheral sensory thresholds were measured at pre-marked testing sites along the medial aspects (liver and spleen meridians) of the bilateral lower extremities. A 5-second hot pain stimulation was delivered to the testing sites and the corresponding pain Visual Analog Scale (VAS) scores were recorded. Three different EA (5 Hz) stimulation durations (5, 15 and 30 minutes) were randomly tested at least one week apart. During the last 10 seconds of each EA session, 5 seconds of subject-specific hot pain stimulation was delivered to the testing sites. The corresponding pain and EA VAS scores of de qi sensation (tingling) during and after the EA were recorded. The measurements were repeated immediately, 30 and 60 minutes after the EA stimulation. A four-factor repeated-measures ANOVA was used to assess the effect of stimulation duration, time, location (thigh vs. calf) and side (ipsilateral vs. contralateral) of EA on sensory thresholds and hot pain VAS scores. Results: A significant effect of stimulation duration was observed. Conclusion: Longer durations of EA stimulation provide a more sustainable analgesic benefit

  4. Software thresholds alter the bias of actigraphy for monitoring sleep in team-sport athletes.

    Science.gov (United States)

    Fuller, Kate L; Juliff, Laura; Gore, Christopher J; Peiffer, Jeremiah J; Halson, Shona L

    2017-08-01

    Actical® actigraphy is commonly used to monitor athlete sleep. The proprietary software, called Actiware®, processes data with three different sleep-wake thresholds (Low, Medium or High), but there is no standardisation regarding their use. The purpose of this study was to examine the validity and bias of the sleep-wake thresholds for processing Actical® sleep data in team sport athletes. Validation study comparing the actigraph against the accepted gold standard, polysomnography (PSG). Sixty-seven nights of sleep were recorded simultaneously with polysomnography and Actical® devices. Individual night data were compared across five sleep measures for each sleep-wake threshold using Actiware® software. Accuracy of each sleep-wake threshold compared with PSG was evaluated from the mean bias with 95% confidence limits, the Pearson product-moment correlation and the associated standard error of the estimate. The Medium threshold generated the smallest mean bias compared with polysomnography for total sleep time (8.5 min), sleep efficiency (1.8%) and wake after sleep onset (-4.1 min); whereas the Low threshold had the smallest bias (7.5 min) for wake bouts. Bias in sleep onset latency was the same across thresholds (-9.5 min). The standard error of the estimate was similar across all thresholds: total sleep time ∼25 min, sleep efficiency ∼4.5%, wake after sleep onset ∼21 min, and wake bouts ∼8 counts. Sleep parameters measured by the Actical® device are greatly influenced by the sleep-wake threshold applied. In the present study the Medium threshold produced the smallest bias for most parameters compared with PSG. Given the magnitude of measurement variability, confidence limits should be employed when interpreting changes in sleep parameters. Copyright © 2017 Sports Medicine Australia. All rights reserved.
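
    The agreement statistics reported in this record (mean bias with 95% limits, Pearson correlation, standard error of the estimate) can be computed from paired device/PSG data as in the sketch below. The paired values are invented for illustration, and the 95% limits are computed here as bias ± 1.96·SD of the differences, which may differ from the paper's exact convention.

```python
import numpy as np
from scipy import stats

def agreement_metrics(device, criterion):
    """Mean bias with 95% limits, Pearson r and standard error of the estimate
    (SEE) for a device measure against a criterion measure (paired arrays).
    The 95% limits are computed as bias +/- 1.96*SD of the differences."""
    device = np.asarray(device, float)
    criterion = np.asarray(criterion, float)
    diff = device - criterion
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    r, _ = stats.pearsonr(device, criterion)
    slope, intercept = np.polyfit(criterion, device, 1)
    resid = device - (slope * criterion + intercept)
    see = np.sqrt(np.sum(resid ** 2) / (len(device) - 2))
    return bias, (bias - half_width, bias + half_width), r, see

# Hypothetical paired total-sleep-time values (minutes): actigraph vs. PSG.
psg = np.array([412, 388, 450, 365, 430, 401, 377, 442], float)
actigraph = np.array([420, 395, 462, 370, 441, 415, 380, 455], float)
print(agreement_metrics(actigraph, psg))
```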

  5. ISTA-Net: Iterative Shrinkage-Thresholding Algorithm Inspired Deep Network for Image Compressive Sensing

    KAUST Repository

    Zhang, Jian; Ghanem, Bernard

    2017-01-01

    and the performance/speed of network-based ones. We propose a novel structured deep network, dubbed ISTA-Net, which is inspired by the Iterative Shrinkage-Thresholding Algorithm (ISTA) for optimizing a general $l_1$ norm CS reconstruction model. ISTA-Net essentially

  6. Radar rainfall estimation for the identification of debris-flow precipitation thresholds

    Science.gov (United States)

    Marra, Francesco; Nikolopoulos, Efthymios I.; Creutin, Jean-Dominique; Borga, Marco

    2014-05-01

    variogram) of the triggering rainfall. These results show that weather radar has the potential to effectively increase the accuracy of rainfall thresholds for debris flow occurrence. However, these benefits may only be achieved if the same monitoring instrumentation is used both to derive the rainfall thresholds and for use of thresholds for real-time identification of debris flows occurrence. References Nikolopoulos, E.I., Borga M., Crema S., Marchi L, Marra F. & Guzzetti F., 2014. Impact of uncertainty in rainfall estimation on the identification of rainfall thresholds for debris-flow occurrence. Geomorphology (conditionally accepted) Peruccacci, S., Brunetti, M.T., Luciani, S., Vennari, C., and Guzzetti, F., 2012. Lithological and seasonal control of rainfall thresholds for the possible initiation of landslides in central Italy, Geomorphology, 139-140, 79-90, 2012.

  7. Consumers' perception and acceptance of boiled and fermented sausages from strongly boar tainted meat.

    Science.gov (United States)

    Meier-Dinkel, Lisa; Gertheiss, Jan; Schnäckel, Wolfram; Mörlein, Daniel

    2016-08-01

    Characteristic off-flavours may occur in uncastrated male pigs depending on the accumulation of androstenone and skatole. Feasible processing of strongly tainted carcasses is challenging but gains in importance due to the European ban on piglet castration in 2018. This paper investigates consumers' acceptability of two sausage types: (a) emulsion-type (BOILED) and (b) smoked raw-fermented (FERM). Liking (9 point scales) and flavour perception (check-all-that-apply with both, typical and negatively connoted sensory terms) were evaluated by 120 consumers (within-subject design). Proportion of tainted boar meat (0, 50, 100%) affected overall liking of BOILED, F (2, 238)=23.22, P<.001, but not of FERM sausages, F (2, 238)=0.89, P=.414. Consumers described the flavour of BOILED-100 as strong and sweaty. In conclusion, FERM products seem promising for processing of tainted carcasses whereas formulations must be optimized for BOILED in order to eliminate perceptible off-flavours. Boar taint rejection thresholds may be higher for processed than those suggested for unprocessed meat cuts. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Particles near threshold

    International Nuclear Information System (INIS)

    Bhattacharya, T.; Willenbrock, S.

    1993-01-01

    We propose returning to the definition of the width of a particle in terms of the pole in the particle's propagator. Away from thresholds, this definition of width is equivalent to the standard perturbative definition, up to next-to-leading order; however, near a threshold, the two definitions differ significantly. The width as defined by the pole position provides more information in the threshold region than the standard perturbative definition and, in contrast with the perturbative definition, does not vanish when a two-particle s-wave threshold is approached from below

  9. Study of the p + p → π+ + d reaction close to threshold

    International Nuclear Information System (INIS)

    Drochner, M.; Kemmerling, G.; Zwoll, K.; Frekers, D.; Garske, W.; Klimala, W.; Kolev, D.; Tsenov, R.; Kutsarova, T.

    1996-01-01

    The p + p → π+ + d reaction has been studied at excess energies between 0.275 MeV and 3.86 MeV. The experiments were performed with the external proton beam of the COoler SYnchrotron (COSY) in Jülich. Differential and total cross sections were measured employing a high-resolution magnetic spectrometer with nearly 4π acceptance in the centre-of-mass system. The values of the total cross sections are - when corrected for the Coulomb effects - in agreement with the results obtained from the time-reversed reactions as well as from isospin-related reactions. The measured anisotropies between 0.008 and 0.29 indicate that the p-wave is not negligible even this close to threshold. The s-wave and p-wave contributions at threshold are deduced. (author)

  10. Optimization of tactical decisions: subjective and objective conditionality

    Directory of Open Access Journals (Sweden)

    Олег Юрійович Булулуков

    2016-06-01

    Full Text Available The article investigates the "human" and "objective" factors that influence the optimization of tactical decisions. Attention is focused on the dependence of the information obtained about the circumstances of a crime on the investigator making correct decisions, and the connection between the efficiency of an investigation and the acceptance of optimal tactical decisions is underlined. The stated problem has not been sufficiently investigated in the literature; some of its aspects are reflected in the works of D. A. Solodov, S. Yu. Yakushin and others, while questions related to the optimization of investigation and the investigator's decision-making appear in the works of R. S. Belkin, V. A. Juravel, V. E. Konovalova, V. L. Sinchuk, B. V. Shur and V. Yu. Shepitko. The aim of the article is to define the term "optimization" as it applies to tactical decisions in criminalistics, and to consider the influence of human and objective factors on the making of optimal decisions during the investigation of crimes. The etymology of the term "optimization" is considered and its interpretation is given as it applies to the making of tactical decisions. The types of human and objective factors conditioning the optimization of tactical decisions are identified; such optimization contributes to the effectiveness of crime investigation tactics. In considering the "human factors" influencing the optimization of decisions, attention is paid to the "psychological traps" that can occur in decision-making, among them: anchoring; status quo; sunk costs; the desired versus the actual; incorrect formulation; overconfidence; reinsurance; and constancy of memory. The absence of unambiguity in the given list of "objective factors" influencing the choice of a tactical decision is noted, and different understandings of "tactical risk" as a factor influencing the acceptance of tactical decisions are discussed. The analysis of the "human" and "objective" factors influencing ...

  11. The variance of length of stay and the optimal DRG outlier payments.

    Science.gov (United States)

    Felder, Stefan

    2009-09-01

    Prospective payment schemes in health care often include supply-side insurance for cost outliers. In hospital reimbursement, prospective payments for patient discharges, based on their classification into diagnosis related group (DRGs), are complemented by outlier payments for long stay patients. The outlier scheme fixes the length of stay (LOS) threshold, constraining the profit risk of the hospitals. In most DRG systems, this threshold increases with the standard deviation of the LOS distribution. The present paper addresses the adequacy of this DRG outlier threshold rule for risk-averse hospitals with preferences depending on the expected value and the variance of profits. It first shows that the optimal threshold solves the hospital's tradeoff between higher profit risk and lower premium loading payments. It then demonstrates for normally distributed truncated LOS that the optimal outlier threshold indeed decreases with an increase in the standard deviation.

  12. Response threshold variance as a basis of collective rationality.

    Science.gov (United States)

    Yamamoto, Tatsuhiro; Hasegawa, Eisuke

    2017-04-01

    Determining the optimal choice among multiple options is necessary in various situations, and the collective rationality of groups has recently become a major topic of interest. Social insects are thought to make such optimal choices by collecting individuals' responses relating to an option's value (=a quality-graded response). However, this behaviour cannot explain the collective rationality of brains because neurons can make only 'yes/no' responses on the basis of the response threshold. Here, we elucidate the basic mechanism underlying the collective rationality of such simple units and show that an ant species uses this mechanism. A larger number of units respond 'yes' to the best option available to a collective decision-maker using only the yes/no mechanism; thus, the best option is always selected by majority decision. Colonies of the ant Myrmica kotokui preferred the better option in a binary choice experiment. The preference of a colony was demonstrated by the workers, which exhibited variable thresholds between two options' qualities. Our results demonstrate how a collective decision-maker comprising simple yes/no judgement units achieves collective rationality without using quality-graded responses. This mechanism has broad applicability to collective decision-making in brain neurons, swarm robotics and human societies.
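
    A toy simulation of the yes/no mechanism described above: each unit answers "yes" to an option only if the option's quality exceeds the unit's own threshold, and the collective picks the option with the most "yes" responses. The threshold distribution and quality values below are assumptions chosen only to show that the better option tends to gather more votes.

```python
import numpy as np

rng = np.random.default_rng(0)

def collective_choice(option_qualities, n_units=100, threshold_sd=1.0):
    """Majority decision by units that can only answer yes/no: a unit answers
    'yes' to an option when the option's quality exceeds the unit's threshold.
    Thresholds vary across units (here drawn around the mid-range quality)."""
    q = np.asarray(option_qualities, float)
    thresholds = rng.normal(q.mean(), threshold_sd, size=n_units)
    yes_votes = (q[:, None] > thresholds[None, :]).sum(axis=1)
    return yes_votes, int(np.argmax(yes_votes))

votes, chosen = collective_choice([4.8, 5.3])   # two options, the second slightly better
print("yes votes per option:", votes, "-> option chosen:", chosen)
```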

  13. Research of the mode instability threshold in high power double cladding Yb-doped fiber amplifiers

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yanshan; Ma, Yi; Sun, Yinhong; Peng, Wanjing; Tang, Chun [Institute of Applied Electronics, CAEP, Mianyang, Sichuan (China); The Key Laboratory of Science and Technology on High Energy Laser, CAEP, Mianyang, Sichuan (China); Liu, Qinyong; Ke, Weiwei [Institute of Applied Physics and Computational Mathematics, CAEP, Beijing (China); Wang, Xiaojun [Institute of Applied Physics and Computational Mathematics, CAEP, Beijing (China); Technical Institute of Physics and Chemistry, CAS, Beijing (China)

    2017-08-15

    We experimentally investigate the behavior of the mode instability (MI) threshold in a double cladding Yb-doped fiber amplifier when the amplifier is pumped by broad linewidth laser diodes and narrow linewidth laser diodes, respectively. It is found that the MI threshold increases by 26% when the amplifier is pumped by the broad linewidth laser diodes. Experimental results show that the MI threshold is affected by the local heat load rather than the average or total heat load. The calculation shows that the local heat deposit actually plays the key role in stimulating the MI behaviour. At the MI threshold position in the fiber, the local heat deposit also changes dramatically. The effect of the thermal conductivity on the MI threshold is also studied. Our investigation shows that the MI threshold increases from 1269 W to 1950 W when the thermal conductivity of the fiber amplifier is increased from 0.3 W/(m·K) to 5 W/(m·K). Through optimizing the pump linewidth and the cooling efficiency of the gain fiber, the MI threshold is doubled in our experiment. (copyright 2017 by WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  14. Research of the mode instability threshold in high power double cladding Yb-doped fiber amplifiers

    International Nuclear Information System (INIS)

    Wang, Yanshan; Ma, Yi; Sun, Yinhong; Peng, Wanjing; Tang, Chun; Liu, Qinyong; Ke, Weiwei; Wang, Xiaojun

    2017-01-01

    We experimentally investigate the behavior of the mode instability (MI) threshold in a double cladding Yb-doped fiber amplifier when the amplifier is pumped by broad linewidth laser diodes and narrow linewidth laser diodes, respectively. It is found that the MI threshold increases by 26% when the amplifier is pumped by the broad linewidth laser diodes. Experimental results show that the MI threshold is affected by the local heat load rather than the average or total heat load. The calculation shows that the local heat deposit actually plays the key role in stimulating the MI behaviour. At the MI threshold position in the fiber, the local heat deposit also changes dramatically. The effect of the thermal conductivity on the MI threshold is also studied. Our investigation shows that the MI threshold increases from 1269 W to 1950 W when the thermal conductivity of the fiber amplifier is increased from 0.3 W/(m·K) to 5 W/(m·K). Through optimizing the pump linewidth and the cooling efficiency of the gain fiber, the MI threshold is doubled in our experiment. (copyright 2017 by WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  15. Diagnostic thresholds for ambulatory blood pressure moving lower: a review based on a meta-analysis-clinical implications

    DEFF Research Database (Denmark)

    Hansen, T.W.; Kikuya, M.; Thijs, L.

    2008-01-01

    Upper limits of normal ambulatory blood pressure (ABP) have been a matter of debate in recent years. Current diagnostic thresholds for ABP rely mainly on statistical parameters derived from reference populations. Recent findings from the International Database of Ambulatory Blood Pressure...... in Relation to Cardiovascular Outcome (IDACO) provide outcome-driven thresholds for ABP. Rounded systolic/diastolic thresholds for optimal ABP were found to be 115/75 mm Hg for 24 hours, 120/80 mm Hg for daytime, and 100/65 mm Hg for nighttime. The corresponding rounded thresholds for normal ABP were 125...... database is therefore being updated with additional population cohorts to enable the construction of multifactorial risk score charts, which also include ABP. Publication date: 2008/5

  16. Influence of pulse-height discrimination threshold for photon counting on the accuracy of singlet oxygen luminescence measurement

    International Nuclear Information System (INIS)

    Lin, Huiyun; Chen, Defu; Wang, Min; Lin, Juqiang; Li, Buhong; Xie, Shusen

    2011-01-01

    Direct measurement of near-infrared (NIR) luminescence around 1270 nm is the gold standard for singlet oxygen (¹O₂) identification. In this study, the influence of the pulse-height discrimination threshold on the measurement accuracy of the ¹O₂ luminescence generated from the photoirradiation of meso-tetra(N-methyl-4-pyridyl)porphine tetra-tosylate (TMPyP) in aqueous solution was investigated using our custom-developed detection system. Our results indicate that the discrimination threshold has a significant influence on the absolute ¹O₂ luminescence counts, and the optimal threshold for our detection system is found to be about −41.2 mV for signal discrimination. After optimization, the derived triplet-state and ¹O₂ lifetimes of TMPyP in aqueous solution are found to be 1.73 ± 0.03 and 3.70 ± 0.04 µs, respectively, and the accuracy of the measurement was further independently demonstrated using the laser flash photolysis technique.

  17. Multimodal distribution of human cold pain thresholds.

    Science.gov (United States)

    Lötsch, Jörn; Dimova, Violeta; Lieb, Isabel; Zimmermann, Michael; Oertel, Bruno G; Ultsch, Alfred

    2015-01-01

    It is assumed that different pain phenotypes are based on varying molecular pathomechanisms. Distinct ion channels seem to be associated with the perception of cold pain, in particular TRPM8 and TRPA1 have been highlighted previously. The present study analyzed the distribution of cold pain thresholds with focus at describing the multimodality based on the hypothesis that it reflects a contribution of distinct ion channels. Cold pain thresholds (CPT) were available from 329 healthy volunteers (aged 18 - 37 years; 159 men) enrolled in previous studies. The distribution of the pooled and log-transformed threshold data was described using a kernel density estimation (Pareto Density Estimation (PDE)) and subsequently, the log data was modeled as a mixture of Gaussian distributions using the expectation maximization (EM) algorithm to optimize the fit. CPTs were clearly multi-modally distributed. Fitting a Gaussian Mixture Model (GMM) to the log-transformed threshold data revealed that the best fit is obtained when applying a three-model distribution pattern. The modes of the identified three Gaussian distributions, retransformed from the log domain to the mean stimulation temperatures at which the subjects had indicated pain thresholds, were obtained at 23.7 °C, 13.2 °C and 1.5 °C for Gaussian #1, #2 and #3, respectively. The localization of the first and second Gaussians was interpreted as reflecting the contribution of two different cold sensors. From the calculated localization of the modes of the first two Gaussians, the hypothesis of an involvement of TRPM8, sensing temperatures from 25 - 24 °C, and TRPA1, sensing cold from 17 °C can be derived. In that case, subjects belonging to either Gaussian would possess a dominance of the one or the other receptor at the skin area where the cold stimuli had been applied. The findings therefore support a suitability of complex analytical approaches to detect mechanistically determined patterns from pain phenotype data.
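
    The model-fitting step described above (a Gaussian mixture fitted by expectation maximization to log-transformed thresholds, with the number of components chosen by fit quality) can be sketched as follows. The data are synthetic stand-ins for the 329 log-transformed thresholds, and BIC is used here as the selection criterion, whereas the study combined a Pareto density estimate with EM fits.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic stand-in for 329 log-transformed cold pain thresholds drawn from
# three latent groups (the group means/SDs below are invented for illustration).
sample = np.concatenate([
    rng.normal(3.2, 0.10, 140),
    rng.normal(2.6, 0.12, 120),
    rng.normal(0.4, 0.30, 69),
])[:, None]                       # sklearn expects shape (n_samples, n_features)

# Fit 1- to 4-component mixtures by EM and pick the component count by BIC.
models = {k: GaussianMixture(n_components=k, n_init=5, random_state=0).fit(sample)
          for k in range(1, 5)}
best_k = min(models, key=lambda k: models[k].bic(sample))
best = models[best_k]

order = np.argsort(best.means_.ravel())
print("components selected by BIC:", best_k)
print("component means (log scale):", best.means_.ravel()[order])
print("component weights:", best.weights_[order])
```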

  18. Phased arrays: A strategy to lower the energy threshold for neutrinos

    Directory of Open Access Journals (Sweden)

    Wissel Stephanie

    2017-01-01

    Full Text Available In-ice radio arrays are optimized for detecting the highest-energy, cosmogenic neutrinos expected to be produced through cosmic ray interactions with background photons. However, there are two expected populations of high energy neutrinos: the astrophysical flux observed by IceCube (~1 PeV) and the cosmogenic flux (~10^17 eV, or 100 PeV). Typical radio arrays employ a noise-riding trigger, which limits their minimum energy threshold based on the background noise temperature of the ice. Phased radio arrays could lower the energy threshold by combining the signals from several channels before triggering, thereby improving the signal-to-noise ratio at the trigger level. Reducing the energy threshold would allow radio experiments to overlap more efficiently with optical Cherenkov neutrino telescopes and to search more efficiently for cosmogenic neutrinos. We discuss the proposed technique and prototypical phased arrays deployed in an anechoic chamber and at Greenland's Summit Station.

  19. Music effect on pain threshold evaluated with current perception threshold

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    AIM: Music relieves anxiety and psychological tension, an effect that is applied during surgical operations in hospitals and dental offices. It is still unclear whether the effect of music is limited to the psychological aspect or extends to the physical aspect, and whether it is influenced by the mood or emotion of the listener. To elucidate these issues, we evaluated the effect of music on pain threshold with the current perception threshold (CPT) and the Profile of Mood States (POMS) test. METHODS: Thirty healthy subjects were tested (12 men, 18 women, 25-49 years old, mean age 34.9). (1) After the POMS test, the pain thresholds of all subjects were evaluated with CPT using a Neurometer (Radionics, USA) under 6 conditions: silence, and listening to slow-tempo classical music, nursery music, hard rock music, classical piano music and relaxation music, with 30-second intervals. (2) After the Stroop color-word test as the stressor, pain threshold was evaluated with CPT under 2 conditions: silence and listening to slow-tempo classical music. RESULTS: While listening to music, CPT scores increased, especially at the 2000 Hz level, which is related to compression, warmth and pain sensation. The type of music, preference for the music and stress also affected the CPT score. CONCLUSION: The present study demonstrated that concentration on music raises the pain threshold and that stress and mood influence the effect of music on pain threshold.

  20. Rational optimization of reliability and safety policies

    International Nuclear Information System (INIS)

    Melchers, Robert E.

    2001-01-01

    Optimization of structures for design has a long history, including optimization using numerical methods and optimality criteria. Much of this work has considered a subset of the complete design optimization problem--that of the technical issues alone. The more general problem must consider also non-technical issues and, importantly, the interplay between them and the parameters which influence them. Optimization involves optimal setting of design or acceptance criteria and, separately, optimal design within the criteria. In the modern context of probability based design codes this requires probabilistic acceptance criteria. The determination of such criteria involves more than the nominal code failure probability approach used for design code formulation. A more general view must be taken and a clear distinction must be made between those matters covered by technical reliability and non-technical reliability. The present paper considers this issue and outlines a framework for rational optimization of structural and other systems given the socio-economic and political systems within which optimization must be performed

  1. Why does society accept a higher risk for alcohol than for other voluntary or involuntary risks?

    Science.gov (United States)

    Rehm, Jürgen; Lachenmeier, Dirk W; Room, Robin

    2014-10-21

    Societies tend to accept much higher risks for voluntary behaviours, those based on individual decisions (for example, to smoke, to consume alcohol, or to ski), than for involuntary exposure such as exposure to risks in soil, drinking water or air. In high-income societies, an acceptable risk to those voluntarily engaging in a risky behaviour seems to be about one death in 1,000 on a lifetime basis. However, drinking more than 20 g pure alcohol per day over an adult lifetime exceeds a threshold of one in 100 deaths, based on a calculation from World Health Organization data of the odds in six European countries of dying from alcohol-attributable causes at different levels of drinking. The voluntary mortality risk of alcohol consumption exceeds the risks of other lifestyle risk factors. In addition, evidence shows that the involuntary risks resulting from customary alcohol consumption far exceed the acceptable threshold for other involuntary risks (such as those established by the World Health Organization or national environmental agencies), and would be judged as not acceptable. Alcohol's exceptional status reflects vagaries of history, which have so far resulted in alcohol being exempted from key food legislation (no labelling of ingredients and nutritional information) and from international conventions governing all other psychoactive substances (both legal and illegal). This is along with special treatment of alcohol in the public health field, in part reflecting overestimation of its beneficial effect on ischaemic disease when consumed in moderation. A much higher mortality risk from alcohol than from other risk factors is currently accepted by high income countries.

  2. Impact of Thresholds and Load Patterns when Executing HPC Applications with Cloud Elasticity

    Directory of Open Access Journals (Sweden)

    Vinicius Facco Rodrigues

    2016-04-01

    Full Text Available Elasticity is one of the best-known capabilities related to cloud computing, and it is largely deployed reactively using thresholds. In this approach, maximum and minimum limits are used to drive resource allocation and deallocation actions, leading to the following problem statements: How can cloud users set the threshold values to enable elasticity in their cloud applications? And what is the impact of the application's load pattern on elasticity? This article tries to answer these questions for iterative high performance computing applications, showing the impact of both thresholds and load patterns on application performance and resource consumption. To accomplish this, we developed a reactive and PaaS-based elasticity model called AutoElastic and employed it over a private cloud to execute a numerical integration application. We present an analysis of best practices and possible optimizations regarding the combination of elasticity and HPC. Considering the results, we observed that the maximum threshold influences the application time more than the minimum one. We concluded that threshold values close to 100% of CPU load are directly related to weaker reactivity, postponing resource reconfiguration even when earlier activation could reduce the application runtime.
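
    The reactive, threshold-driven allocation rule discussed in this record reduces to a few lines: compare the observed load against an upper and a lower limit and scale out or in accordingly. The threshold values, VM bounds and load trace below are illustrative assumptions and are not AutoElastic's actual settings.

```python
def elastic_step(cpu_load, n_vms, upper=0.85, lower=0.40, min_vms=1, max_vms=16):
    """One step of a reactive threshold rule: scale out when load is above the
    upper threshold, scale in when it is below the lower one, otherwise do nothing.
    The threshold values and VM bounds are illustrative, not AutoElastic's."""
    if cpu_load > upper and n_vms < max_vms:
        return n_vms + 1, "scale out"
    if cpu_load < lower and n_vms > min_vms:
        return n_vms - 1, "scale in"
    return n_vms, "no action"

# Replay a hypothetical CPU-load trace of an iterative HPC application.
trace = [0.55, 0.70, 0.88, 0.93, 0.90, 0.75, 0.50, 0.30]
vms = 2
for load in trace:
    vms, action = elastic_step(load, vms)
    print(f"load={load:.2f} -> {action:9s} (VMs={vms})")
```

    Replaying the same trace with an upper threshold closer to 100% illustrates how scale-out is postponed, the weaker reactivity noted in the record.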

  3. An optimal scheme for top quark mass measurement near the $t\bar{t}$ threshold at future $e^{+}e^{-}$ colliders

    Science.gov (United States)

    Chen, Wei-Guo; Wan, Xia; Wang, You-Kai

    2018-05-01

    A top quark mass measurement scheme near the $t\bar{t}$ production threshold in future $e^{+}e^{-}$ colliders, e.g. the Circular Electron Positron Collider (CEPC), is simulated. A $\chi^2$ fitting method is adopted to determine the number of energy points to be taken and their locations. Our results show that the optimal energy point is located near the largest slope of the cross section v. beam energy plot, and the most efficient scheme is to concentrate all luminosity on this single energy point in the case of one-parameter top mass fitting. This suggests that the so-called data-driven method could be the best choice for future real experimental measurements. Conveniently, the top mass statistical uncertainty can also be calculated directly from the error matrix, even without any sampling and fitting. The agreement of the above two optimization methods has been checked. Our conclusion is that by taking 50 fb⁻¹ of total effective integrated luminosity data, the statistical uncertainty of the top potential-subtracted mass can be suppressed to about 7 MeV, and the total uncertainty is about 30 MeV. This precision will help to identify the stability of the electroweak vacuum at the Planck scale. Supported by National Science Foundation of China (11405102) and the Fundamental Research Funds for the Central Universities of China (GK201603027, GK201803019)
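
    The "error matrix without sampling" statement can be made concrete with a one-parameter Fisher-information estimate: for expected counts N_i = eff·L_i·σ(E_i; m), the expected statistical variance of the fitted mass is the inverse of Σ_i eff·L_i·(dσ_i/dm)²/σ_i. The cross-section model, efficiency and numbers below are toy assumptions, not the threshold curve or run plan used in the paper.

```python
import numpy as np

def toy_xsec_pb(E_gev, m_gev, width=1.0, peak=0.6):
    """Toy threshold turn-on curve (in pb) standing in for the t-tbar cross
    section near E ~ 2*m; purely illustrative, not the curve used in the paper."""
    return peak / (1.0 + np.exp(-(E_gev - 2.0 * m_gev) / width))

def mass_stat_uncertainty(energies_gev, lumis_fb, m=172.5, eff=1.0):
    """Expected statistical uncertainty on m from the (here 1x1) error matrix:
    Var(m) ~ 1 / sum_i [ eff * L_i * (d sigma_i/dm)^2 / sigma_i ], i.e. the
    inverse Fisher information of Poisson counts in the Gaussian limit."""
    E = np.asarray(energies_gev, float)
    L_pb = np.asarray(lumis_fb, float) * 1e3          # fb^-1 -> pb^-1
    sigma = toy_xsec_pb(E, m)
    dm = 1e-3                                         # numerical derivative step (GeV)
    dsigma = (toy_xsec_pb(E, m + dm) - toy_xsec_pb(E, m - dm)) / (2 * dm)
    fisher_info = np.sum(eff * L_pb * dsigma ** 2 / sigma)
    return 1.0 / np.sqrt(fisher_info)                 # GeV

# Concentrating all luminosity near the steepest part of the turn-on vs. spreading it.
print("single point :", mass_stat_uncertainty([345.4], [50.0]))
print("four points  :", mass_stat_uncertainty([344.0, 345.0, 346.0, 347.0], [12.5] * 4))
```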

  4. Optimal Energy Efficiency Fairness of Nodes in Wireless Powered Communication Networks.

    Science.gov (United States)

    Zhang, Jing; Zhou, Qingjie; Ng, Derrick Wing Kwan; Jo, Minho

    2017-09-15

    In wireless powered communication networks (WPCNs), it is essential to study energy efficiency fairness in order to evaluate how nodes balance receiving information and harvesting energy. In this paper, we propose an efficient iterative algorithm for optimal energy efficiency proportional fairness in WPCNs. The main idea is to use stochastic geometry to derive the mean proportional-fairness utility function with respect to the user association probability and the receive threshold. We then prove that the relaxed proportional-fairness utility function is concave in the user association probability and in the receive threshold, respectively, and we propose a sub-optimal algorithm that exploits an alternating optimization approach. Through numerical simulations, we demonstrate that our sub-optimal algorithm obtains a result close to optimal energy efficiency proportional fairness with a significant reduction in computational complexity.

  5. A Threshold Pseudorandom Function Construction and Its Applications

    DEFF Research Database (Denmark)

    Nielsen, Jesper Buus

    2002-01-01

    We give the first construction of a practical threshold pseudorandom function. The protocol for evaluating the function is efficient enough that it can be used to replace random oracles in some protocols relying on such oracles. In particular, we show how to transform the efficient...... cryptographically secure Byzantine agreement protocol by Cachin, Kursawe and Shoup for the random oracle model into a cryptographically secure protocol for the complexity-theoretic model without losing efficiency or resilience, thereby constructing an efficient and optimally resilient Byzantine agreement protocol...

  6. Optimal and efficient decoding of concatenated quantum block codes

    International Nuclear Information System (INIS)

    Poulin, David

    2006-01-01

    We consider the problem of optimally decoding a quantum error correction code--that is, to find the optimal recovery procedure given the outcomes of partial "check" measurements on the system. In general, this problem is NP-hard. However, we demonstrate that for concatenated block codes, the optimal decoding can be efficiently computed using a message-passing algorithm. We compare the performance of the message-passing algorithm to that of the widespread blockwise hard decoding technique. Our Monte Carlo results using the five-qubit and Steane's code on a depolarizing channel demonstrate significant advantages of the message-passing algorithms in two respects: (i) Optimal decoding increases by as much as 94% the error threshold below which the error correction procedure can be used to reliably send information over a noisy channel; and (ii) for noise levels below these thresholds, the probability of error after optimal decoding is suppressed at a significantly higher rate, leading to a substantial reduction of the error correction overhead

  7. Response Surface Method and Linear Programming in the development of mixed nectar of acceptability high and minimum cost

    Directory of Open Access Journals (Sweden)

    Enrique López Calderón

    2012-06-01

    Full Text Available The aim of this study was to develop a mixed nectar of high acceptability and minimum cost. To obtain the mixed nectar, different amounts of passion fruit, sweet pepino and sucrose were considered, completing 100% with water, following a two-stage design: screening (using a 2^3 design plus 4 center points) and optimization (using a 2^2 + 2*2 design plus 4 center points); these stages allow a high-acceptability formulation to be explored. The technique of Linear Programming was then used to minimize the cost of the high-acceptability nectar. As a result of this process, a mixed nectar of optimal acceptability (score of 7) was obtained when the formulation contains between 9 and 14% passion fruit, 4 and 5% sucrose, 73.5% sweet pepino juice, and water to complete 100%. Linear Programming made it possible to reduce the cost of the mixed nectar with optimal acceptability to S/.174 for a production of 1000 L/day.

  8. The Comparison Study of Quadratic Infinite Beam Program on Optimization Instensity Modulated Radiation Therapy Treatment Planning (IMRTP) between Threshold and Exponential Scatter Method with CERR® In The Case of Lung Cancer

    International Nuclear Information System (INIS)

    Hardiyanti, Y; Haekal, M; Waris, A; Haryanto, F

    2016-01-01

    This research compares a quadratic optimization program for Intensity Modulated Radiation Therapy Treatment Planning (IMRTP) with the Computational Environment for Radiotherapy Research (CERR) software. The number of beams assumed for the treatment plans was 9 and 13. The case used an energy of 6 MV with a Source Skin Distance (SSD) of 100 cm from the target volume. Dose calculation used the Quadratic Infinite Beam (QIB) method from CERR. CERR was used for the comparison study between the Gauss Primary threshold method and the Gauss Primary exponential method. In the lung cancer case, threshold values of 0.01 and 0.004 were used. The dose distribution output was analysed in the form of DVHs from CERR. With the exponential dose calculation method and 9 beams, the maximum dose distributions were obtained on the Planning Target Volume (PTV), Clinical Target Volume (CTV), Gross Tumor Volume (GTV), liver, and skin. With the threshold dose calculation method and 13 beams, the maximum dose distributions were obtained on the PTV, GTV, heart, and skin. (paper)

  9. ISTA-Net: Iterative Shrinkage-Thresholding Algorithm Inspired Deep Network for Image Compressive Sensing

    KAUST Repository

    Zhang, Jian

    2017-06-24

    Traditional methods for image compressive sensing (CS) reconstruction solve a well-defined inverse problem that is based on a predefined CS model, which defines the underlying structure of the problem and is generally solved by employing convergent iterative solvers. These optimization-based CS methods face the challenge of choosing optimal transforms and tuning parameters in their solvers, while also suffering from high computational complexity in most cases. Recently, some deep network based CS algorithms have been proposed to improve CS reconstruction performance, while dramatically reducing time complexity as compared to optimization-based methods. Despite their impressive results, the proposed networks (either with fully-connected or repetitive convolutional layers) lack any structural diversity and they are trained as a black box, void of any insights from the CS domain. In this paper, we combine the merits of both types of CS methods: the structure insights of optimization-based method and the performance/speed of network-based ones. We propose a novel structured deep network, dubbed ISTA-Net, which is inspired by the Iterative Shrinkage-Thresholding Algorithm (ISTA) for optimizing a general $l_1$ norm CS reconstruction model. ISTA-Net essentially implements a truncated form of ISTA, where all ISTA-Net parameters are learned end-to-end to minimize a reconstruction error in training. Borrowing more insights from the optimization realm, we propose an accelerated version of ISTA-Net, dubbed FISTA-Net, which is inspired by the fast iterative shrinkage-thresholding algorithm (FISTA). Interestingly, this acceleration naturally leads to skip connections in the underlying network design. Extensive CS experiments demonstrate that the proposed ISTA-Net and FISTA-Net outperform existing optimization-based and network-based CS methods by large margins, while maintaining a fast runtime.
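
    The classical ISTA recursion that ISTA-Net unrolls is compact enough to state directly: a gradient step on the data-fidelity term followed by soft-thresholding, the proximal operator of the l1 norm. The sketch below applies plain ISTA with hand-picked step size and regularization weight to a synthetic sparse-recovery problem; ISTA-Net instead learns these quantities (and the transform) end-to-end.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of the l1 norm (element-wise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, y, lam=0.01, n_iter=500):
    """Plain ISTA for  min_x 0.5*||A x - y||^2 + lam*||x||_1 : a gradient step on
    the quadratic term followed by soft-thresholding. ISTA-Net unrolls a truncated
    form of this recursion and learns its parameters; here they are fixed by hand."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - A.T @ (A @ x - y) / L, lam / L)
    return x

# Small synthetic compressive-sensing example: recover a sparse vector.
rng = np.random.default_rng(0)
n, m, k = 256, 96, 8                       # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.normal(0.0, 1.0, k)
A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))
y = A @ x_true
x_rec = ista(A, y)
print("relative reconstruction error:",
      np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```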

  10. Developing optimized CT scan protocols: Phantom measurements of image quality

    International Nuclear Information System (INIS)

    Zarb, Francis; Rainford, Louise; McEntee, Mark F.

    2011-01-01

    Purpose: The increasing frequency of computerized tomography (CT) examinations is well documented, leading to concern about potential radiation risks for patients. However, the consequences of not performing the CT examination and missing injuries and disease are potentially serious, impacting upon correct patient management. The ALARA principle of dose optimization must be employed for all justified CT examinations. Dose indicators displayed on the CT console as either CT dose index (CTDI) and/or dose length product (DLP), are used to indicate dose and can quantify improvements achieved through optimization. Key scan parameters contributing to dose have been identified in previous literature and in previous work by our group. The aim of this study was to optimize the scan parameters of mA; kV and pitch, whilst maintaining image quality and reducing dose. This research was conducted using psychophysical image quality measurements on a CT quality assurance (QA) phantom establishing the impact of dose optimization on image quality parameters. Method: Current CT scan parameters for head (posterior fossa and cerebrum), abdomen and chest examinations were collected from 57% of CT suites available nationally in Malta (n = 4). Current scan protocols were used to image a Catphan 600 CT QA phantom whereby image quality was assessed. Each scan parameter: mA; kV and pitch were systematically reduced until the contrast resolution (CR), spatial resolution (SR) and noise were significantly lowered. The Catphan 600 images, produced by the range of protocols, were evaluated by 2 expert observers assessing CR, SR and noise. The protocol considered as the optimization threshold was just above the setting that resulted in a significant reduction in CR and noise but not affecting SR at the 95% confidence interval. Results: The limit of optimization threshold was determined for each CT suite. Employing optimized parameters, CTDI and DLP were both significantly reduced (p ≤ 0.001) by

  11. Personal values and pain tolerance: does a values intervention add to acceptance?

    Science.gov (United States)

    Branstetter-Rost, Ann; Cushing, Christopher; Douleh, Tanya

    2009-08-01

    Previous research suggests that acceptance is a promising alternative to distraction and control techniques in successfully coping with pain. Acceptance interventions based upon Acceptance and Commitment Therapy (ACT) have been shown to lead to greater tolerance of acute pain as well as increased adjustment and less disability among individuals with chronic pain. However, in these previous intervention studies, the ACT component of values has either not been included or not specifically evaluated. The current study compares the effects of an ACT-based acceptance intervention with and without the values component among individuals completing the cold-pressor task. Results indicate that inclusion of the values component (n = 34) of ACT leads to significantly greater pain tolerance than acceptance alone (n = 30). Consistent with previous research, both conditions were associated with greater pain tolerance than control (n = 35). Despite the difference in tolerance, pain threshold did not differ, and participants in the control condition provided lower ratings of pain severity. The findings from this study support the important role of values and values clarification in acceptance-based interventions such as ACT, and provide direction for clinicians working with individuals with chronic pain conditions. This article evaluates the additive effect of including a personalized-values exercise in an acceptance-based treatment for pain. Results indicate that values interventions make a significant contribution and improvement to acceptance interventions, which may be of interest to clinicians who provide psychological treatment to individuals with chronic pain.

  12. Speech-in-Noise Tests and Supra-threshold Auditory Evoked Potentials as Metrics for Noise Damage and Clinical Trial Outcome Measures.

    Science.gov (United States)

    Le Prell, Colleen G; Brungart, Douglas S

    2016-09-01

    In humans, the accepted clinical standards for detecting hearing loss are the behavioral audiogram, based on the absolute detection threshold of pure-tones, and the threshold auditory brainstem response (ABR). The audiogram and the threshold ABR are reliable and sensitive measures of hearing thresholds in human listeners. However, recent results from noise-exposed animals demonstrate that noise exposure can cause substantial neurodegeneration in the peripheral auditory system without degrading pure-tone audiometric thresholds. It has been suggested that clinical measures of auditory performance conducted with stimuli presented above the detection threshold may be more sensitive than the behavioral audiogram in detecting early-stage noise-induced hearing loss in listeners with audiometric thresholds within normal limits. Supra-threshold speech-in-noise testing and supra-threshold ABR responses are reviewed here, given that they may be useful supplements to the behavioral audiogram for assessment of possible neurodegeneration in noise-exposed listeners. Supra-threshold tests may be useful for assessing the effects of noise on the human inner ear, and the effectiveness of interventions designed to prevent noise trauma. The current state of the science does not necessarily allow us to define a single set of best practice protocols. Nonetheless, we encourage investigators to incorporate these metrics into test batteries when feasible, with an effort to standardize procedures to the greatest extent possible as new reports emerge.

  13. Readiness Factors and Consumer Acceptance of Technology in Mobile Telephony

    Directory of Open Access Journals (Sweden)

    Lucilla Andrade Sousa Cunha

    2014-06-01

    Full Text Available This study analyses users' patterns of use of, and contact with, mobile phone technology products and services. Technological innovation is central to the enterprise, enabling growth and the creation of new products and services. The theoretical framework draws on the technology readiness model, built from users' beliefs and feelings and represented by four dimensions: optimism, innovativeness, discomfort and insecurity. It also highlights the technology acceptance model, which rests on two important constructs, perceived usefulness and perceived ease of use. In short, the factors that support the technology readiness model may precede those of the technology acceptance model. A field survey of students at the Federal University of Uberlândia / Campus FACIP was carried out. Results indicate optimism as the main factor in accepting new mobile technology, since mobile devices provide benefits to people's lives and prompt a positive attitude toward mobile technology. Users' perception when purchasing a mobile device is directly related to its usefulness and the ease of handling the technology.

  14. Accelerator based production of fissile nuclides, threshold uranium price and perspectives

    International Nuclear Information System (INIS)

    Djordjevic, D.; Knapp, V.

    1988-01-01

    Accelerator breeder system characteristics are considered in this work. One such system producing fissile nuclides can supply several thermal reactors with fissile fuel, so the system is analogous to a uranium enrichment facility, with the difference that fissile nuclides are produced by conversion of U-238 rather than separated from natural uranium. This concept offers a long-term perspective for fission technology on the basis of developing only one, simpler technology. The influence of the basic system characteristics on the threshold uranium price is examined, and conditions for economically acceptable production are established. (author)

  15. A study for proposal of use of regulatory T cells as a prognostic marker and establishing an optimal threshold level for their expression in chronic lymphocytic leukemia.

    Science.gov (United States)

    Dasgupta, Alakananda; Mahapatra, Manoranjan; Saxena, Renu

    2015-06-01

    Although regulatory T cells (Tregs) have been extensively studied in chronic lymphocytic leukemia, there is no uniform guideline or consensus regarding their use as a prognostic marker. This study describes the methodology used to develop an optimal threshold level for Tregs in these patients. Treg levels were assessed in the peripheral blood of 130 patients and 150 controls. Treg frequencies were linked to established prognostic markers as well as overall survival and time to first treatment. The cut-offs for Treg positivity were assessed by receiver operating characteristic (ROC) analysis. A cut-off of 5.7% for Treg cell percentage and of 35 cells/μL for absolute Treg cell count were determined as optimal in patients with CLL along with a median Treg percentage of 15.5% used to separate patients with low- and high-risk disease. The experiments presented here will possibly aid in the use of Treg frequencies as a potential prognostic marker in CLL.
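
    The cut-off selection step can be reproduced generically with an ROC analysis: sweep candidate cut-offs, compute sensitivity and specificity, and pick the point that best separates the groups. The patient/control distributions below are invented, and Youden's J is used as one common optimality criterion; the record does not restate the study's exact criterion.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical Treg percentages for 150 controls and 130 patients; the values
# are invented and do not reproduce the study's distributions.
controls = rng.normal(4.0, 1.5, 150).clip(min=0.1)
patients = rng.normal(9.0, 4.0, 130).clip(min=0.1)

scores = np.concatenate([controls, patients])
labels = np.concatenate([np.zeros(controls.size), np.ones(patients.size)])

fpr, tpr, cutoffs = roc_curve(labels, scores)
j = tpr - fpr                            # Youden's J statistic at each cut-off
best = int(np.argmax(j))

print("AUC                      : %.3f" % roc_auc_score(labels, scores))
print("cut-off maximising J     : %.2f %% Tregs" % cutoffs[best])
print("sensitivity / specificity: %.2f / %.2f" % (tpr[best], 1.0 - fpr[best]))
```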

  16. The asymmetry of the impact of oil price shocks on economic activities: an application of the multivariate threshold model

    International Nuclear Information System (INIS)

    Bwo-Nung Huang; National Chia-Yi University; Hwang, M.J.; Hsiao-Ping Peng

    2005-01-01

    This paper applies the multivariate threshold model to investigate the impacts of an oil price change and its volatility on economic activities (changes in industrial production and real stock returns). The statistical test on the existence of a threshold effect indicates that a threshold value does exist. Using monthly data of the US, Canada, and Japan during the period from 1970 to 2002, we conclude: (i) the optimal threshold level seems to vary according to how an economy depends on imported oil and the attitude towards adopting energy-saving technology; (ii) an oil price change or its volatility has a limited impact on the economies if the change is below the threshold levels; (iii) if the change is above threshold levels, it appears that the change in oil price better explains macroeconomic variables than the volatility of the oil price; and (iv) if the change is above threshold levels, a change in oil price or its volatility explains the model better than the real interest rate. (author)
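
    Setting the existence test aside, the core estimation idea of a threshold model is a grid search: split the sample at each candidate threshold value, fit each regime, and keep the split that fits best. The sketch below does this for a single-equation, two-regime threshold regression with synthetic data; the paper's multivariate model additionally involves VAR dynamics, lag selection and formal threshold tests.

```python
import numpy as np

def fit_threshold_value(z, X, y, trim=0.15):
    """Two-regime threshold regression: split the sample according to whether the
    threshold variable z exceeds a candidate value, fit OLS in each regime, and
    return the candidate that minimises the total sum of squared residuals.
    Candidates are trimmed so each regime keeps at least a `trim` share of data."""
    z = np.asarray(z, float)
    X = np.asarray(X, float).reshape(len(z), -1)
    y = np.asarray(y, float)
    candidates = np.sort(z)[int(trim * len(z)): int((1 - trim) * len(z))]
    best_ssr, best_c = np.inf, None
    for c in candidates:
        ssr = 0.0
        for mask in (z <= c, z > c):
            Xr = np.column_stack([np.ones(mask.sum()), X[mask]])
            beta, *_ = np.linalg.lstsq(Xr, y[mask], rcond=None)
            ssr += float(np.sum((y[mask] - Xr @ beta) ** 2))
        if ssr < best_ssr:
            best_ssr, best_c = ssr, c
    return best_c

# Synthetic example: the response of "growth" to an "oil price change" switches
# regime once the change exceeds an (unknown to the fit) threshold of 0.5.
rng = np.random.default_rng(0)
oil = rng.normal(0.0, 1.0, 400)
growth = np.where(oil <= 0.5, 0.1 * oil, -0.8 * oil) + rng.normal(0.0, 0.2, 400)
print("estimated threshold:", fit_threshold_value(oil, oil, growth))
```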

  17. Threshold-Switchable Particles (TSP’s) to Control Internal Hemorrhage

    Science.gov (United States)

    2011-12-01

    Ismagilov and Morrissey groups for threshold testing. TSPs will be formed from porous and solid nanoparticles with a low-size dispersion optimized for... [Figure: uptake of Ag nanoparticle, kaolin and MCF-26 samples by HUVEC (Human Umbilical Vein Endothelial Cells) at Ag NP 10 µg/ml, kaolin 10 µg/ml and MCF-26 100 µg/ml, and (right) by HDFs at Ag NP 20 µg/ml, kaolin 20 µg/ml and MCF-26 100 µg/ml.]

  18. Quick-low-density parity check and dynamic threshold voltage optimization in 1X nm triple-level cell NAND flash memory with comprehensive analysis of endurance, retention-time, and temperature variation

    Science.gov (United States)

    Doi, Masafumi; Tokutomi, Tsukasa; Hachiya, Shogo; Kobayashi, Atsuro; Tanakamaru, Shuhei; Ning, Sheyang; Ogura Iwasaki, Tomoko; Takeuchi, Ken

    2016-08-01

    NAND flash memory's reliability degrades with increasing endurance, retention-time and/or temperature. After a comprehensive evaluation of 1X nm triple-level cell (TLC) NAND flash, two highly reliable techniques are proposed. The first proposal, quick low-density parity check (Quick-LDPC), requires only one cell read in order to accurately estimate a bit-error rate (BER) that includes the effects of temperature, write and erase (W/E) cycles and retention-time. As a result, 83% read latency reduction is achieved compared to conventional AEP-LDPC. Also, W/E cycling is extended by 100% compared with conventional Bose-Chaudhuri-Hocquenghem (BCH) error-correcting code (ECC). The second proposal, dynamic threshold voltage optimization (DVO), has two parts, adaptive V_Ref shift (AVS) and V_TH space control (VSC). AVS reduces read error and latency by adaptively optimizing the reference voltage (V_Ref) based on temperature, W/E cycles and retention-time. AVS stores the optimal V_Ref values in a table in order to enable one cell read. VSC further improves AVS by optimizing the voltage margins between V_TH states. DVO reduces BER by 80%.
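
    The adaptive V_Ref shift described above amounts to a table lookup keyed by operating condition. The paper's actual table contents and bin edges are not given, so the values below are placeholders that only illustrate the mechanism:

        import bisect

        # Placeholder bin edges for the three condition axes (not the paper's values).
        TEMP_EDGES      = [0, 25, 55, 85]          # deg C
        WE_CYCLE_EDGES  = [0, 1000, 3000]          # write/erase cycles
        RETENTION_EDGES = [0, 30, 180]             # days

        # Placeholder table of V_Ref offsets in mV, indexed by (temp bin, W/E bin, retention bin).
        VREF_OFFSET_MV = {}
        for t in range(len(TEMP_EDGES)):
            for w in range(len(WE_CYCLE_EDGES)):
                for r in range(len(RETENTION_EDGES)):
                    VREF_OFFSET_MV[(t, w, r)] = -10 * t - 15 * w - 20 * r   # illustrative trend only

        def vref_shift(temp_c, we_cycles, retention_days):
            """Return the stored V_Ref offset for the current condition (one lookup, one cell read)."""
            t = bisect.bisect_right(TEMP_EDGES, temp_c) - 1
            w = bisect.bisect_right(WE_CYCLE_EDGES, we_cycles) - 1
            r = bisect.bisect_right(RETENTION_EDGES, retention_days) - 1
            return VREF_OFFSET_MV[(t, w, r)]

        print(vref_shift(temp_c=60, we_cycles=2500, retention_days=90), "mV")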

  19. Selection Strategies for Social Influence in the Threshold Model

    Science.gov (United States)

    Karampourniotis, Panagiotis; Szymanski, Boleslaw; Korniss, Gyorgy

    The ubiquity of online social networks makes the study of social influence extremely significant for its applications to marketing, politics and security. Maximizing the spread of influence by strategically selecting nodes as initiators of a new opinion or trend is a challenging problem. We study the performance of various strategies for selection of large fractions of initiators on a classical social influence model, the Threshold model (TM). Under the TM, a node adopts a new opinion only when the fraction of its first neighbors possessing that opinion exceeds a pre-assigned threshold. The strategies we study are of two kinds: strategies based solely on the initial network structure (Degree-rank, Dominating Sets, PageRank etc.) and strategies that take into account the change of the states of the nodes during the evolution of the cascade, e.g. the greedy algorithm. We find that the performance of these strategies depends largely on both the network structure properties, e.g. the assortativity, and the distribution of the thresholds assigned to the nodes. We conclude that the optimal strategy needs to combine the network specifics and the model specific parameters to identify the most influential spreaders. Supported in part by ARL NS-CTA, ARO, and ONR.
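
    A compact simulation of the Threshold Model described above makes the selection problem concrete: given a set of initiators, the final cascade size is the quantity the strategies try to maximize. A minimal sketch on a toy graph (the adjacency structure and thresholds are illustrative, not taken from the study):

        def threshold_cascade(adj, thresholds, initiators):
            """Run the Threshold Model: a node adopts when the adopting fraction of its
            neighbors exceeds its threshold. Returns the final set of adopters."""
            adopted = set(initiators)
            changed = True
            while changed:
                changed = False
                for node, neighbors in adj.items():
                    if node in adopted or not neighbors:
                        continue
                    frac = sum(n in adopted for n in neighbors) / len(neighbors)
                    if frac > thresholds[node]:
                        adopted.add(node)
                        changed = True
            return adopted

        # Toy undirected graph as an adjacency dict, uniform threshold 0.4.
        adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4], 4: [3, 5], 5: [4]}
        thresholds = {n: 0.4 for n in adj}
        print(sorted(threshold_cascade(adj, thresholds, initiators={0, 1})))

    A greedy initiator-selection strategy of the kind mentioned in the abstract would repeatedly add the node whose inclusion most increases the size of this cascade.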

  20. Sub-threshold Post Traumatic Stress Disorder in the WHO World Mental Health Surveys

    Science.gov (United States)

    McLaughlin, Katie A.; Koenen, Karestan C.; Friedman, Matthew J.; Ruscio, Ayelet Meron; Karam, Elie G.; Shahly, Victoria; Stein, Dan J.; Hill, Eric D.; Petukhova, Maria; Alonso, Jordi; Andrade, Laura Helena; Angermeyer, Matthias C.; Borges, Guilherme; de Girolamo, Giovanni; de Graaf, Ron; Demyttenaere, Koen; Florescu, Silvia E.; Mladenova, Maya; Posada-Villa, Jose; Scott, Kate M.; Takeshima, Tadashi; Kessler, Ronald C.

    2014-01-01

    Background Although only a minority of people exposed to a traumatic event (TE) develops PTSD, symptoms not meeting full PTSD criteria are common and often clinically significant. Individuals with these symptoms have sometimes been characterized as having sub-threshold PTSD, but no consensus exists on the optimal definition of this term. Data from a large cross-national epidemiological survey are used to provide a principled basis for such a definition. Methods The WHO World Mental Health (WMH) Surveys administered fully-structured psychiatric diagnostic interviews to community samples in 13 countries containing assessments of PTSD associated with randomly selected TEs. Focusing on the 23,936 respondents reporting lifetime TE exposure, associations of approximated DSM-5 PTSD symptom profiles with six outcomes (including distress-impairment, suicidality, comorbid fear-distress disorders, and PTSD symptom duration) were examined to investigate implications of different sub-threshold definitions. Results The highest distress-impairment, suicidality, comorbidity, and symptom duration were consistently observed among the 3.0% of respondents with DSM-5 PTSD, but the additional 3.6% of respondents meeting two or three of DSM-5 Criteria B-E also had significantly elevated scores for most outcomes. The proportion of cases with threshold versus sub-threshold PTSD varied depending on TE type, with threshold PTSD more common following interpersonal violence and sub-threshold PTSD more common following events happening to loved ones. Conclusions Sub-threshold DSM-5 PTSD is most usefully defined as meeting two or three of the DSM-5 Criteria B-E. Use of a consistent definition is critical to advance understanding of the prevalence, predictors, and clinical significance of sub-threshold PTSD. PMID:24842116

  1. Rainfall thresholds and susceptibility mapping for shallow landslides and debris flows in Scotland

    Science.gov (United States)

    Postance, Benjamin; Hillier, John; Dijkstra, Tom; Dixon, Neil

    2017-04-01

    Shallow translational slides and debris flows (hereafter 'landslides') pose a significant threat to life and cause significant annual economic impacts (e.g. by damage and disruption of infrastructure). The focus of this research is on the definition of objective rainfall thresholds using a weather radar system and on landslide susceptibility mapping. In the study area, Scotland, an inventory of 75 known landslides was used for the period 2003 to 2016. First, the effect of using different rain records (i.e. time series length) on two threshold selection techniques in receiver operating characteristic (ROC) analysis was evaluated. The results show that thresholds selected by 'Threat Score' (minimising false alarms) are sensitive to rain record length, which is not routinely considered, whereas thresholds selected using 'Optimal Point' (minimising failed alarms) are not; the latter may therefore be suited to establishing lower-limit thresholds and be of interest to those developing early warning systems. Robust thresholds are found for combinations of normalised rain duration and accumulation at 1 and 12 days' antecedence, respectively; these are normalised using the rainy-day normal and an equivalent measure for rain intensity. This research indicates that, in Scotland, rain accumulation provides a better indicator than rain intensity and that landslides may be generated by threshold conditions lower than previously thought. Second, a landslide susceptibility map is constructed using a cross-validated logistic regression model. A novel element of the approach is that landslide susceptibility is calculated for individual hillslope sections. The developed thresholds and susceptibility map are combined to assess potential hazards and impacts posed to the national highway network in Scotland.
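
    The two selection techniques named above can be reproduced from a rainfall-threshold contingency table: 'Threat Score' maximizes the critical success index (penalizing false alarms), while 'Optimal Point' minimizes the distance to the perfect (0, 1) corner of ROC space (penalizing failed alarms). A minimal sketch with synthetic rain and event labels (all data and the simple single-variable threshold are illustrative):

        import numpy as np

        def contingency(rain, events, thr):
            hits   = np.sum((rain >= thr) & (events == 1))
            misses = np.sum((rain <  thr) & (events == 1))
            false  = np.sum((rain >= thr) & (events == 0))
            corr_n = np.sum((rain <  thr) & (events == 0))
            return hits, misses, false, corr_n

        def select_thresholds(rain, events):
            best_ts, best_op = None, None
            best_ts_val, best_op_val = -1.0, np.inf
            for thr in np.unique(rain):
                h, m, f, c = contingency(rain, events, thr)
                ts = h / (h + m + f) if (h + m + f) else 0.0    # critical success index ("Threat Score")
                tpr = h / (h + m) if (h + m) else 0.0
                fpr = f / (f + c) if (f + c) else 0.0
                dist = np.hypot(fpr, 1.0 - tpr)                 # distance to ROC corner ("Optimal Point")
                if ts > best_ts_val:
                    best_ts_val, best_ts = ts, thr
                if dist < best_op_val:
                    best_op_val, best_op = dist, thr
            return best_ts, best_op

        rng = np.random.default_rng(1)
        rain = rng.gamma(2.0, 10.0, 500)                             # daily rain accumulation (mm)
        events = (rain + rng.normal(0, 8, 500) > 35).astype(int)     # synthetic landslide days
        print(select_thresholds(rain, events))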

  2. Threshold Signature Schemes Application

    Directory of Open Access Journals (Sweden)

    Anastasiya Victorovna Beresneva

    2015-10-01

    This work investigates threshold signature schemes. The schemes were systematized, and cryptographic constructions based on Lagrange interpolation polynomials, elliptic curves and bilinear pairings were examined. Different methods of generating and verifying threshold signatures were explored, and the practical applicability of threshold schemes to mobile agents, Internet banking and e-currency was shown. Topics for further investigation were identified that could reduce the level of counterfeit electronic documents signed by a group of users.
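
    The Lagrange-interpolation constructions surveyed here build on Shamir's classic (t, n) secret sharing, in which any t shares reconstruct the signing secret. A minimal sketch of share generation and reconstruction over a prime field (a toy secret and a Mersenne prime; not a production scheme and not any specific scheme from the article):

        import random

        P = 2**127 - 1   # a known prime modulus

        def make_shares(secret, t, n):
            """Split `secret` into n shares, any t of which reconstruct it (Shamir)."""
            coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
            shares = []
            for x in range(1, n + 1):
                y = sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
                shares.append((x, y))
            return shares

        def reconstruct(shares):
            """Lagrange interpolation at x = 0 over GF(P)."""
            secret = 0
            for i, (xi, yi) in enumerate(shares):
                num, den = 1, 1
                for j, (xj, _) in enumerate(shares):
                    if i != j:
                        num = (num * -xj) % P
                        den = (den * (xi - xj)) % P
                secret = (secret + yi * num * pow(den, -1, P)) % P
            return secret

        shares = make_shares(secret=123456789, t=3, n=5)
        print(reconstruct(shares[:3]))   # any 3 of the 5 shares suffice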

  3. Threshold-based detection for amplify-and-forward cooperative communication systems with channel estimation error

    KAUST Repository

    Abuzaid, Abdulrahman I.

    2014-09-01

    Efficient receiver designs for cooperative communication systems are becoming increasingly important. In previous work, cooperative networks communicated with the use of $L$ relays. As the receiver is constrained, it can only process $U$ out of $L$ relays. Channel shortening and reduced-rank techniques were employed to design the preprocessing matrix. In this paper, a receiver structure is proposed which combines the joint iterative optimization (JIO) algorithm and our proposed threshold selection criteria. This receiver structure assists in determining the optimal $U_{opt}$. Furthermore, this receiver provides the freedom to choose $U ≤ U_{opt}$ for each frame depending upon the tolerable difference allowed for mean square error (MSE). Our study and simulation results show that by choosing an appropriate threshold, it is possible to gain in terms of complexity savings without affecting the BER performance of the system. Furthermore, in this paper the effect of channel estimation errors is investigated on the MSE performance of the amplify-and-forward (AF) cooperative relaying system.

  4. Thresholds in radiobiology

    International Nuclear Information System (INIS)

    Katz, R.; Hofmann, W.

    1982-01-01

    Interpretations of biological radiation effects frequently use the word 'threshold'. The meaning of this word is explored together with its relationship to the fundamental character of radiation effects and to the question of perception. It is emphasised that although the existence of either a dose or an LET threshold can never be settled by experimental radiobiological investigations, it may be argued on fundamental statistical grounds that for all statistical processes, and especially where the number of observed events is small, the concept of a threshold is logically invalid. (U.K.)

  5. Optimized FFTF Acceptance Test Program covering Phases III, IV, and V

    International Nuclear Information System (INIS)

    Wykoff, W.R.; Jones, D.H.

    1977-03-01

    A detailed review of Phases III, IV, and V of the FFTF Acceptance Test Program has been completed. The purpose of this review was to formulate the test sequence which not only meets requirements for safe, reliable and useful operation of the plant, but also results in the earliest prudent demonstration of full-power performance. A test sequence based on the underlying assumption that sodium flows into the secondary sodium storage tank (T-44) no later than August 31, 1978, is described in detail. A time-scale has been superimposed on the test sequence that allows extra time for placing systems and equipment into operation for the first time, for debugging, and for learning how to operate most effectively. Time is not included for major equipment malfunctions. This test plan provides the basis for coordinating the many and varied activities and interfaces necessary for successful and timely execution of the FFTF Acceptance Test Program. In this report, the need dates have been identified for presently scheduled test articles and standard core components.

  6. Optimal Claiming Strategies in Bonus Malus Systems and Implied Markov Chains

    Directory of Open Access Journals (Sweden)

    Arthur Charpentier

    2017-11-01

    In this paper, we investigate the impact of the accident reporting strategy of drivers within a Bonus-Malus system. We exhibit the induced modification of the corresponding class-level transition matrix and derive the optimal reporting strategy for rational drivers. The hunger for bonuses induces optimal thresholds under which drivers do not claim their losses. Mathematical properties of the induced level class process are studied. A convergent numerical algorithm is provided for computing such thresholds, and realistic numerical applications are discussed.

  7. Energetic Constraints on H-2-Dependent Terminal Electron Accepting Processes in Anoxic Environments

    DEFF Research Database (Denmark)

    Heimann, Axel Colin; Jakobsen, Rasmus; Blodau, C.

    2010-01-01

    Microbially mediated terminal electron accepting processes (TEAPs) to a large extent control the fate of redox-reactive elements and associated reactions in anoxic soils, sediments, and aquifers. This review focuses on thermodynamic controls and regulation of H-2-dependent TEAPs, case studies illustrating this concept, and the quantitative description of thermodynamic controls in modeling. Other electron transfer processes are considered where appropriate. The work reviewed shows that thermodynamics and microbial kinetics are connected near thermodynamic equilibrium. Free energy thresholds … and sulfate reduction are under direct thermodynamic control in soils and sediments and generally approach theoretical minimum energy thresholds. If H-2 concentrations are lowered by thermodynamically more potent TEAPs, these processes are inhibited. This principle is also valid for TEAPs providing more free …

  8. Acceptability of Early Antiretroviral Therapy Among South African Women.

    Science.gov (United States)

    Garrett, Nigel; Norman, Emily; Leask, Kerry; Naicker, Nivashnee; Asari, Villeshni; Majola, Nelisile; Karim, Quarraisha Abdool; Karim, Salim S Abdool

    2018-03-01

    WHO guidelines recommend immediate initiation of antiretroviral therapy (ART) for all individuals at HIV diagnosis regardless of CD4 count, but concerns remain about potential low uptake or poor adherence among healthy patients with high CD4 counts, especially in resource-limited settings. This study assessed the acceptability of earlier treatment among HIV-positive South African women, median age at enrollment 25 (IQR 22-30), in a 10 year prospective cohort study by (i) describing temporal CD4 count trends at initiation in relation to WHO guidance, (ii) virological suppression rates post-ART initiation at different CD4 count thresholds, and (iii) administration of a standardized questionnaire. 158/232 (68.1%) participants initiated ART between 2006 and 2015. Mean CD4 count at initiation was 217 cells/µl (range 135-372) before 2010, and increased to 531 cells/µl (range 272-1095) by 2015. Virological suppression rates at 3, 6, 12 and 18 months were consistently above 85% with no statistically significant differences for participants starting ART at different CD4 count thresholds. A questionnaire assessing uptake of early ART amongst ART-naïve women, median age 28 (IQR 24-33), revealed that 40/51 (78.4%) were willing to start ART at CD4 ≥500. Of those unwilling, 6/11 (54.5%) started ART within 6 months of questionnaire administration. Temporal increases in CD4 counts, comparable virological suppression rates, and positive patient perceptions confirm high acceptability of earlier ART initiation for the majority of patients.

  9. Segment and fit thresholding: a new method for image analysis applied to microarray and immunofluorescence data.

    Science.gov (United States)

    Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E; Allen, Peter J; Sempere, Lorenzo F; Haab, Brian B

    2015-10-06

    Experiments involving the high-throughput quantification of image data require algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multicolor, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu's method for selected images. SFT promises to advance the goal of full automation in image analysis.
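
    The SFT procedure is described above only at a high level; the sketch below captures the general idea under stated assumptions (tile the image into small segments, use segment statistics to flag background, derive a threshold from the background, then call signal pixels), not the authors' exact segment-fitting rules:

        import numpy as np

        def sft_like_threshold(img, seg=16, bg_quantile=0.5, k=3.0):
            """Toy segment-and-fit style thresholding.
            1) tile the image into seg x seg blocks and record block mean/std;
            2) treat the flattest blocks (lowest std) as background;
            3) threshold = background mean + k * background std."""
            h, w = img.shape
            stats = []
            for r in range(0, h - seg + 1, seg):
                for c in range(0, w - seg + 1, seg):
                    block = img[r:r + seg, c:c + seg]
                    stats.append((block.mean(), block.std()))
            stats = np.array(stats)
            bg = stats[stats[:, 1] <= np.quantile(stats[:, 1], bg_quantile)]
            thr = bg[:, 0].mean() + k * bg[:, 1].mean()
            return img > thr, thr

        rng = np.random.default_rng(2)
        img = rng.normal(100, 5, (256, 256))
        img[100:120, 100:140] += 60          # a synthetic bright "spot" signal
        mask, thr = sft_like_threshold(img)
        print(f"threshold = {thr:.1f}, signal pixels = {int(mask.sum())}")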

  10. Parameters optimization for wavelet denoising based on normalized spectral angle and threshold constraint machine learning

    Science.gov (United States)

    Li, Hao; Ma, Yong; Liang, Kun; Tian, Yong; Wang, Rui

    2012-01-01

    Wavelet parameters (e.g., wavelet type, level of decomposition) affect the performance of the wavelet denoising algorithm in hyperspectral applications. Current studies select the best wavelet parameters for a single spectral curve by comparing similarity criteria such as spectral angle (SA). However, the method to find the best parameters for a spectral library that contains multiple spectra has not been studied. In this paper, a criterion named normalized spectral angle (NSA) is proposed. By comparing NSA, the best combination of parameters for a spectral library can be selected. Moreover, a fast algorithm based on threshold constraint and machine learning is developed to reduce the time of a full search. After several iterations of learning, the combination of parameters that constantly surpasses a threshold is selected. The experiments proved that by using the NSA criterion, the SA values decreased significantly, and the fast algorithm could save 80% time consumption, while the denoising performance was not obviously impaired.
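
    The NSA criterion can be sketched directly: denoise every spectrum in the library with a candidate (wavelet, level) pair, compute the spectral angle per spectrum against its reference, and average after normalization. The sketch below assumes the PyWavelets package, uses a plain exhaustive search rather than the paper's threshold-constrained learning speed-up, and the normalization constant is a placeholder:

        import numpy as np
        import pywt

        def spectral_angle(a, b):
            return np.arccos(np.clip(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)), -1, 1))

        def denoise(spectrum, wavelet, level):
            coeffs = pywt.wavedec(spectrum, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745            # noise estimate from finest scale
            thr = sigma * np.sqrt(2 * np.log(len(spectrum)))
            coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)[:len(spectrum)]

        def nsa(library, reference, wavelet, level):
            angles = [spectral_angle(denoise(s, wavelet, level), r) for s, r in zip(library, reference)]
            return np.mean(angles) / np.pi                             # placeholder normalization to [0, 1]

        rng = np.random.default_rng(3)
        x = np.linspace(0, 1, 256)
        clean = [np.exp(-((x - m) ** 2) / 0.01) for m in (0.3, 0.5, 0.7)]   # toy spectral library
        noisy = [c + rng.normal(0, 0.05, x.size) for c in clean]

        best = min(((nsa(noisy, clean, w, lv), w, lv)
                    for w in ("db4", "sym6", "coif3") for lv in (2, 3)))
        print("best (NSA, wavelet, level):", best)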

  11. Hybrid carbon incentive mechanisms and political acceptability

    International Nuclear Information System (INIS)

    Vollebergh, H.R.J.; De Vries, J.L.; Koutstaal, P.R.

    1997-01-01

    This paper analyzes how hybrid systems of carbon taxes and tradeable permits optimize some conflicting dimensions of political acceptability related to the design of these instruments. Pure systems like taxes without exemptions or auctioned tradeable permits cause problems for political acceptability in open economies due to high overall costs (abatement cost plus payments on the tax or auctions) for current polluters. Unfortunately, pure systems based on grandfathering of emission rights across the board do not provide a feasible alternative because of monitoring and enforcement problems. In contrast, consciously designed hybrid systems employ grandfathering of emission rights together with either carbon taxes or auctioned carbon permits in order to overcome acceptability problems of pure systems, while leaving incentives to reduce emissions at the margin untouched. Moreover, monitoring and enforcement costs of the hybrid systems are less due to the lower number of participating agents compared with the pure systems, while opportunities for cost- or burden-sharing exist as well. 3 figs., 4 tabs., 23 refs

  12. Shifts in the relationship between motor unit recruitment thresholds versus derecruitment thresholds during fatigue.

    Science.gov (United States)

    Stock, Matt S; Mota, Jacob A

    2017-12-01

    Muscle fatigue is associated with diminished twitch force amplitude. We examined changes in the motor unit recruitment versus derecruitment threshold relationship during fatigue. Nine men (mean age = 26 years) performed repeated isometric contractions at 50% maximal voluntary contraction (MVC) knee extensor force until exhaustion. Surface electromyographic signals were detected from the vastus lateralis, and were decomposed into their constituent motor unit action potential trains. Motor unit recruitment and derecruitment thresholds and firing rates at recruitment and derecruitment were evaluated at the beginning, middle, and end of the protocol. On average, 15 motor units were studied per contraction. For the initial contraction, three subjects showed greater recruitment thresholds than derecruitment thresholds for all motor units. Five subjects showed greater recruitment thresholds than derecruitment thresholds for only low-threshold motor units at the beginning, with a mean cross-over of 31.6% MVC. As the muscle fatigued, many motor units were derecruited at progressively higher forces. In turn, decreased slopes and increased y-intercepts were observed. These shifts were complemented by increased firing rates at derecruitment relative to recruitment. As the vastus lateralis fatigued, the central nervous system's compensatory adjustments resulted in a shift of the regression line of the recruitment versus derecruitment threshold relationship. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.

  13. Acceptance of sugar reduction in flavored yogurt.

    Science.gov (United States)

    Chollet, M; Gille, D; Schmid, A; Walther, B; Piccinali, P

    2013-09-01

    To investigate what level of sugar reduction is accepted in flavored yogurt, we conducted a hedonic test focusing on the degree of liking of the products and on optimal sweetness and aroma levels. For both flavorings (strawberry and coffee), consumers preferred yogurt containing 10% added sugar. However, yogurt containing 7% added sugar was also acceptable. On the just-about-right scale, yogurt containing 10% sugar was more often described as too sweet compared with yogurt containing 7% sugar. On the other hand, the sweetness and aroma intensity for yogurt containing 5% sugar was judged as too low. A second test was conducted to determine the effect of flavoring concentration on the acceptance of yogurt containing 7% sugar. Yogurts containing the highest concentrations of flavoring (11% strawberry, 0.75% coffee) were less appreciated. Additionally, the largest percentage of consumers perceived these yogurts as "not sweet enough." These results indicate that consumers would accept flavored yogurts with 7% added sugar instead of 10%, but 5% sugar would be too low. Additionally, an increase in flavor concentration is undesirable for yogurt containing 7% added sugar. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  14. Stochastic optimal control of single neuron spike trains

    DEFF Research Database (Denmark)

    Iolov, Alexandre; Ditlevsen, Susanne; Longtin, Andrë

    2014-01-01

    … stimulation of a neuron to achieve a target spike train under the physiological constraint not to damage tissue. Approach. We pose a stochastic optimal control problem to precisely specify the spike times in a leaky integrate-and-fire (LIF) model of a neuron with noise assumed to be of intrinsic or synaptic origin. In particular, we allow for the noise to be of arbitrary intensity. The optimal control problem is solved using dynamic programming when the controller has access to the voltage (closed-loop control), and using a maximum principle for the transition density when the controller only has access to the spike times (open-loop control). Main results. We have developed a stochastic optimal control algorithm to obtain precise spike times. It is applicable in both the supra-threshold and sub-threshold regimes, under open-loop and closed-loop conditions and with an arbitrary noise intensity; the accuracy …
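
    The plant in this control problem is the noisy LIF neuron: the membrane voltage drifts toward an equilibrium, diffuses with noise, and emits a spike with reset when it crosses the firing threshold. A minimal Euler-Maruyama simulation of that model only (parameters and the exact parameterization are illustrative, and no controller is included):

        import numpy as np

        def lif_spike_times(mu=1.5, tau=0.02, sigma=0.3, v_th=1.0, v_reset=0.0,
                            dt=1e-4, t_end=1.0, seed=0):
            """Leaky integrate-and-fire: dV = ((mu - V)/tau) dt + sigma dW; spike and reset at v_th."""
            rng = np.random.default_rng(seed)
            v, t, spikes = v_reset, 0.0, []
            while t < t_end:
                v += ((mu - v) / tau) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
                t += dt
                if v >= v_th:
                    spikes.append(round(t, 4))
                    v = v_reset
            return spikes

        print(lif_spike_times()[:5])

    Setting mu above v_th gives the supra-threshold (drift-driven) regime, and mu below v_th the sub-threshold (noise-driven) regime mentioned in the abstract.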

  15. Optimal Control of Sensor Threshold for Autonomous Wide Area Search Munitions

    National Research Council Canada - National Science Library

    Kish, Brian A; Jacques, David R; Pachter, Meir

    2005-01-01

    The optimal employment of autonomous wide area search munitions is addressed. The scenario considered involves an airborne munition searching a battle space for stationary targets in the presence of false targets...

  16. Optimization of Key Parameters of Energy Management Strategy for Hybrid Electric Vehicle Using DIRECT Algorithm

    Directory of Open Access Journals (Sweden)

    Jingxian Hao

    2016-11-01

    The rule-based logic threshold control strategy has been frequently used in energy management strategies for hybrid electric vehicles (HEVs) owing to its convenience in adjusting parameters, real-time performance, stability, and robustness. However, the logic threshold control parameters cannot usually ensure the best vehicle performance at different driving cycles and conditions. For this reason, the optimization of key parameters is important to improve the fuel economy, dynamic performance, and drivability. In principle, this is a multiparameter nonlinear optimization problem. The logic threshold energy management strategy for an all-wheel-drive HEV is comprehensively analyzed and developed in this study. Seven key parameters to be optimized are extracted. The optimization model of key parameters is proposed from the perspective of fuel economy. The global optimization method, the DIRECT algorithm, which has good real-time performance, low computational burden, and rapid convergence, is selected to optimize the extracted key parameters globally. The results show that with the optimized parameters, the engine operates more in the high-efficiency range, resulting in fuel savings of 7% compared with the non-optimized parameters. The proposed method can provide guidance for calibrating the parameters of the vehicle energy management strategy from the perspective of fuel economy.
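
    As a rough analogue of the parameter-tuning step, the sketch below minimizes a stand-in fuel-consumption objective over a box of logic-threshold parameters with the DIRECT algorithm. It assumes a recent SciPy that provides scipy.optimize.direct; the objective, the four parameters and their bounds are purely illustrative and not the paper's vehicle model or its seven parameters:

        import numpy as np
        from scipy.optimize import direct, Bounds

        # Illustrative logic-threshold parameters: [SOC_low, SOC_high, torque_split, engine_on_speed_rpm]
        TARGET = np.array([0.3, 0.8, 0.5, 1200.0])   # pretend optimum of the drive-cycle simulator
        SCALE  = np.array([1.0, 1.0, 1.0, 1000.0])

        def fuel_consumption(params):
            """Placeholder for a drive-cycle simulation returning fuel use (L/100 km)."""
            return 6.0 + float(np.sum(((np.asarray(params) - TARGET) / SCALE) ** 2))

        bounds = Bounds([0.2, 0.6, 0.0, 800.0], [0.5, 0.9, 1.0, 2000.0])
        res = direct(fuel_consumption, bounds, maxfun=2000)
        print(np.round(res.x, 3), round(res.fun, 3))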

  17. A Probabilistic Model to Evaluate the Optimal Density of Stations Measuring Snowfall.

    Science.gov (United States)

    Schneebeli, Martin; Laternser, Martin

    2004-05-01

    Daily new snow measurements are very important for avalanche forecasting and tourism. A dense network of manual or automatic stations measuring snowfall is necessary to have spatially reliable data. Snow stations in Switzerland were built at partially subjective locations. A probabilistic model based on the frequency and spatial extent of areas covered by heavy snowfalls was developed to quantify the probability that snowfall events are measured by the stations. Area probability relations were calculated for different thresholds of daily accumulated snowfall. A probabilistic model, including autocorrelation, was used to calculate the optimal spacing of stations based on simulated triangular grids and to compare the capture probability of different networks and snowfall thresholds. The Swiss operational snow-stations network captured snowfall events with high probability, but the distribution of the stations could be optimized. The spatial variability increased with higher thresholds of daily accumulated snowfall, and the capture probability decreased with increasing thresholds. The method can be used for other areas where the area probability relation for threshold values of snow or rain can be calculated.

  18. Threshold factorization redux

    Science.gov (United States)

    Chay, Junegone; Kim, Chul

    2018-05-01

    We reanalyze the factorization theorems for the Drell-Yan process and for deep inelastic scattering near threshold, as constructed in the framework of the soft-collinear effective theory (SCET), from a new, consistent perspective. In order to formulate the factorization near threshold in SCET, we should include an additional degree of freedom with small energy, collinear to the beam direction. The corresponding collinear-soft mode is included to describe the parton distribution function (PDF) near threshold. The soft function is modified by subtracting the contribution of the collinear-soft modes in order to avoid double counting on the overlap region. As a result, the proper soft function becomes infrared finite, and all the factorized parts are free of rapidity divergence. Furthermore, the separation of the relevant scales in each factorized part becomes manifest. We apply the same idea to the dihadron production in e+e- annihilation near threshold, and show that the resultant soft function is also free of infrared and rapidity divergences.

  19. Human Health Risk Assessment and Safety Threshold of Harmful Trace Elements in the Soil Environment of the Wulantuga Open-Cast Coal Mine

    Directory of Open Access Journals (Sweden)

    Jianli Jia

    2015-11-01

    In this study, soil samples were collected from a large-scale open-cast coal mine area in Inner Mongolia, China. Arsenic (As), cadmium (Cd), beryllium (Be) and nickel (Ni) in soil samples were detected using novel collision/reaction cell technology (CCT) with inductively-coupled plasma mass spectrometry (ICP-MS; collectively ICP-CCT-MS) after closed-vessel microwave digestion. Human health risk from As, Cd, Be and Ni was assessed via three exposure pathways: inhalation, skin contact and soil particle ingestion. The comprehensive carcinogenic risk from As in Wulantuga open-cast coal mine soil is 6.29 to 87.70 times the acceptable risk, and the highest total hazard quotient of As in soils in this area can reach 4.53 times the acceptable level. The carcinogenic risk and hazard quotient of Cd, Be and Ni are acceptable. The main exposure route of As from open-cast coal mine soils is soil particle ingestion, accounting for 76.64% of the total carcinogenic risk. Considering different control values for each exposure pathway, the minimum control value (1.59 mg/kg) could be selected as the strict reference safety threshold for As in the soil environment of coal-chemical industry areas. However, acceptable levels of carcinogenic risk are not unanimous; thus, the safety threshold identified here, calculated under a 1.00 × 10−6 acceptable carcinogenic risk level, needs further consideration.
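
    The ingestion-pathway calculation behind such numbers follows the standard US EPA intake equations. A hedged sketch of that calculation; the exposure factors are generic adult defaults taken as assumptions, and the arsenic slope factor shown is the commonly cited oral value rather than anything quoted in this record:

        def ingestion_cdi(conc_mg_kg, ing_rate_mg_day=100, ef_days_yr=350, ed_years=24,
                          bw_kg=70, at_days=70 * 365):
            """Chronic daily intake (mg/kg-day) for soil ingestion: C * IngR * EF * ED / (BW * AT)."""
            return conc_mg_kg * 1e-6 * ing_rate_mg_day * ef_days_yr * ed_years / (bw_kg * at_days)

        def carcinogenic_risk(conc_mg_kg, slope_factor=1.5):   # assumed oral slope factor for As, (mg/kg-day)^-1
            return ingestion_cdi(conc_mg_kg) * slope_factor

        # Soil concentration giving a 1e-6 acceptable risk under these assumptions (risk is linear in C):
        c_limit = 1e-6 / carcinogenic_risk(1.0)
        print(f"risk at 10 mg/kg As: {carcinogenic_risk(10):.2e};  C for 1e-6 risk: {c_limit:.2f} mg/kg")

    Under these generic assumptions the derived concentration limit lands in the same range as the 1.59 mg/kg reference value reported above, although the record's own exposure factors and pathways differ.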

  20. CEAMF study, volume 2 : cumulative effects indicators, thresholds, and case studies : final

    International Nuclear Information System (INIS)

    2003-03-01

    The four types of cumulative effects on the environment are: alteration, loss, and fragmentation of habitat; disturbance; barriers to movement; and direct and indirect mortality. Defining where and how human activities can be continued without irreversible net harm to the environment is part of cumulative effects management. Various land-use and habitat indicators were tested in the Blueberry and Sukunka study areas of British Columbia, to address the environmental effects associated with oil and gas development. As recommended, a tiered threshold approach was used to allow for flexibility in different land management regimes and ecological settings. Success will depend on defining acceptable change, threshold values, standard public database, standard processes to calculate indicator values using the database, and project-specific and cooperative management actions. A pilot study was suggested to test the candidate thresholds and implementation process. The two areas proposed for consideration were the Jedney Enhanced Resource Development Resource Management Zone in the Fort St. John Forest District, and the Etsho Enhanced Resource Development Resource Management Zone in the Fort Nelson Forest District. Both are of interest to the petroleum and forest sectors, and support the woodland caribou, a species which is extremely sensitive to cumulative effects of habitat fragmentation and disturbance. 117 refs., 11 tabs., 39 figs.

  1. Comparison between intensity- duration thresholds and cumulative rainfall thresholds for the forecasting of landslide

    Science.gov (United States)

    Lagomarsino, Daniela; Rosi, Ascanio; Rossi, Guglielmo; Segoni, Samuele; Catani, Filippo

    2014-05-01

    This work makes a quantitative comparison between the results of landslide forecasting obtained using two different rainfall threshold models, one using intensity-duration thresholds and the other based on cumulative rainfall thresholds, in an area of northern Tuscany of 116 km2. The first methodology identifies rainfall intensity-duration thresholds by means of software called MaCumBA (Massive CUMulative Brisk Analyzer) that analyzes rain-gauge records, extracts the intensities (I) and durations (D) of the rainstorms associated with the initiation of landslides, plots these values on a diagram, and identifies thresholds that define the lower bounds of the I-D values. A back analysis using data from past events can be used to identify the threshold conditions associated with the least amount of false alarms. The second method (SIGMA) is based on the hypothesis that anomalous or extreme values of rainfall are responsible for landslide triggering: the statistical distribution of the rainfall series is analyzed, and multiples of the standard deviation (σ) are used as thresholds to discriminate between ordinary and extraordinary rainfall events. The name of the model, SIGMA, reflects the central role of the standard deviations in the proposed methodology. The definition of intensity-duration rainfall thresholds requires the combined use of rainfall measurements and an inventory of dated landslides, whereas the SIGMA model can be implemented using only rainfall data. These two methodologies were applied in an area of 116 km2 where a database of 1200 landslides was available for the period 2000-2012. The results obtained are compared and discussed. Although several examples of visual comparisons between different intensity-duration rainfall thresholds are reported in the international literature, a quantitative comparison between thresholds obtained in the same area using different techniques and approaches is a relatively undebated research topic.

  2. Integration of ecological-biological thresholds in conservation decision making.

    Science.gov (United States)

    Mavrommati, Georgia; Bithas, Kostas; Borsuk, Mark E; Howarth, Richard B

    2016-12-01

    In the Anthropocene, coupled human and natural systems dominate and only a few natural systems remain relatively unaffected by human influence. On the one hand, conservation criteria based on areas of minimal human impact are not relevant to much of the biosphere. On the other hand, conservation criteria based on economic factors are problematic with respect to their ability to arrive at operational indicators of well-being that can be applied in practice over multiple generations. Coupled human and natural systems are subject to economic development which, under current management structures, tends to affect natural systems and cross planetary boundaries. Hence, designing and applying conservation criteria applicable in real-world systems where human and natural systems need to interact and sustainably coexist is essential. By recognizing the criticality of satisfying basic needs as well as the great uncertainty over the needs and preferences of future generations, we sought to incorporate conservation criteria based on minimal human impact into economic evaluation. These criteria require the conservation of environmental conditions such that the opportunity for intergenerational welfare optimization is maintained. Toward this end, we propose the integration of ecological-biological thresholds into decision making and use as an example the planetary-boundaries approach. Both conservation scientists and economists must be involved in defining operational ecological-biological thresholds that can be incorporated into economic thinking and reflect the objectives of conservation, sustainability, and intergenerational welfare optimization. © 2016 Society for Conservation Biology.

  3. Low-threshold optical bistability with multilayer graphene-covering Otto configuration

    International Nuclear Information System (INIS)

    Wang, Hengliang; Wu, Jipeng; Xiang, Yuanjiang; Wen, Shuangchun; Guo, Jun; Jiang, Leyong

    2016-01-01

    In this paper, we propose a modified Otto configuration to realize tunable and low-threshold optical bistability at terahertz frequencies by attaching multilayer graphene sheets to a nonlinear substrate interface. Our work demonstrates that the threshold of optical bistability can be markedly reduced (three orders of magnitude) by covering the nonlinear substrate with multilayer graphene sheets, due to strong local field enhancement with the excitation of surface plasmons. We present the influences of the Fermi energy of graphene, the incident angle, the thickness of air gap and the relaxation time of graphene on the hysteresis phenomenon and give a way to optimize the surface plasmon resonance, which will enable us to further lower the minimal power requirements for realizing optical bistability due to the strong interaction of light with graphene sheets. These results are promising for realization of terahertz optical switches, optical modulators and logical devices. (paper)

  4. Detection thresholds of macaque otolith afferents.

    Science.gov (United States)

    Yu, Xiong-Jie; Dickman, J David; Angelaki, Dora E

    2012-06-13

    The vestibular system is our sixth sense and is important for spatial perception functions, yet the sensory detection and discrimination properties of vestibular neurons remain relatively unexplored. Here we have used signal detection theory to measure detection thresholds of otolith afferents using 1 Hz linear accelerations delivered along three cardinal axes. Direction detection thresholds were measured by comparing mean firing rates centered on response peak and trough (full-cycle thresholds) or by comparing peak/trough firing rates with spontaneous activity (half-cycle thresholds). Thresholds were similar for utricular and saccular afferents, as well as for lateral, fore/aft, and vertical motion directions. When computed along the preferred direction, full-cycle direction detection thresholds were 7.54 and 3.01 cm/s(2) for regular and irregular firing otolith afferents, respectively. Half-cycle thresholds were approximately double, with excitatory thresholds being half as large as inhibitory thresholds. The variability in threshold among afferents was directly related to neuronal gain and did not depend on spike count variance. The exact threshold values depended on both the time window used for spike count analysis and the filtering method used to calculate mean firing rate, although differences between regular and irregular afferent thresholds were independent of analysis parameters. The fact that minimum thresholds measured in macaque otolith afferents are of the same order of magnitude as human behavioral thresholds suggests that the vestibular periphery might determine the limit on our ability to detect or discriminate small differences in head movement, with little noise added during downstream processing.

  5. Threshold guidance update

    International Nuclear Information System (INIS)

    Wickham, L.E.

    1986-01-01

    The Department of Energy (DOE) is developing the concept of threshold quantities for use in determining which waste materials must be handled as radioactive waste and which may be disposed of as nonradioactive waste at its sites. Waste above this concentration level would be managed as radioactive or mixed waste (if hazardous chemicals are present); waste below this level would be handled as sanitary waste. Last year's activities (1984) included the development of a threshold guidance dose, the development of threshold concentrations corresponding to the guidance dose, the development of supporting documentation, review by a technical peer review committee, and review by the DOE community. As a result of the comments, areas have been identified for more extensive analysis, including an alternative basis for selection of the guidance dose and the development of quality assurance guidelines. Development of quality assurance guidelines will provide a reasonable basis for determining that a given waste stream qualifies as a threshold waste stream and can then be the basis for a more extensive cost-benefit analysis. The threshold guidance and supporting documentation will be revised, based on the comments received. The revised documents will be provided to DOE by early November. DOE-HQ has indicated that the revised documents will be available for review by DOE field offices and their contractors.

  6. Radio-over-fiber linearization with optimized genetic algorithm CPWL model.

    Science.gov (United States)

    Mateo, Carlos; Carro, Pedro L; García-Dúcar, Paloma; De Mingo, Jesús; Salinas, Íñigo

    2017-02-20

    This article proposes an optimized version of a canonical piece-wise-linear (CPWL) digital predistorter in order to enhance the linearity of a radio-over-fiber (RoF) LTE mobile fronthaul. In this work, we propose a threshold allocation optimization process carried out by a genetic algorithm (GA) in order to optimize the CPWL model (GA-CPWL). Firstly, experiments show how the CPWL model outperforms the classical memory polynomial DPD in an intensity modulation/direct detection (IM/DD) RoF link. Then, the GA-CPWL predistorter is compared with the CPWL model in several scenarios, in order to verify that the proposed DPD offers better performance in different optical transmission conditions. Experimental results reveal that with a proper threshold allocation, the GA-CPWL predistorter offers very promising outcomes.

  7. Proposed diagnostic thresholds for gestational diabetes mellitus according to a 75-g oral glucose tolerance test

    DEFF Research Database (Denmark)

    Jensen, Dorte Møller; Damm, P; Sørensen, B

    2003-01-01

    AIMS: To study if established diagnostic threshold values for gestational diabetes based on a 75-g, 2-h oral glucose tolerance test can be supported by maternal and perinatal outcomes. METHODS: Historical cohort study of 3260 pregnant women examined for gestational diabetes on the basis of risk indicators. Information on oral glucose tolerance test results and clinical outcomes was collected from medical records. RESULTS: There was an increased risk of delivering a macrosomic infant in women with 2-h capillary blood glucose of 7.8-8.9 mmol/l compared with women with 2-h glucose … mellitus. Until these results are available, a 2-h threshold level of 9.0 mmol/l after a 75-g oral glucose tolerance test seems acceptable.

  8. Influence of process parameters on threshold voltage and leakage current in 18nm NMOS device

    Science.gov (United States)

    Atan, Norani Binti; Ahmad, Ibrahim Bin; Majlis, Burhanuddin Bin Yeop; Fauzi, Izzati Binti Ahmad

    2015-04-01

    Process parameters are crucial factors in the development of transistors, and many of them influence device behavior. In this research, we investigate the effects of process parameter variation on response characteristics such as threshold voltage (VTH) and sub-threshold leakage current (IOFF) in an 18nm NMOS device. The technique to identify the semiconductor process parameters whose variability would impact most on the device characteristics is realized through the Taguchi robust design method. This paper presents the process parameters that influence threshold voltage (VTH) and sub-threshold leakage current (IOFF), which include the halo implantation, compensation implantation, threshold voltage adjustment implantation and source/drain implantation. The design, fabrication and characterization of the 18nm HfO2/TiSi2 NMOS device are simulated and performed via a Virtual Wafer Fabrication (VWF) Silvaco TCAD tool using the ATHENA and ATLAS simulators. These two simulators were combined with the Taguchi L9 orthogonal array method to aid in the design and optimization of the process parameters, in order to achieve the optimum average threshold voltage (VTH) and sub-threshold leakage current (IOFF) in the 18nm device. From the results, the halo implantation dose was identified as the process parameter with the strongest effect on the response characteristics, whereas the compensation implantation dose was identified as an adjustment factor used to obtain the nominal values of threshold voltage (VTH) and sub-threshold leakage current (IOFF) for the 18nm NMOS device, equal to 0.302849 volts and 1.9123×10^-16 A/μm, respectively. The design values refer to the ITRS 2011 prediction.

  9. Prognostic threshold levels of NT-proBNP testing in primary care

    DEFF Research Database (Denmark)

    Rosenberg, J.; Schou, M.; Gustafsson, F.

    2008-01-01

    AIMS: Chronic heart failure (HF) is a common condition with a poor prognosis. As delayed diagnosis and treatment of HF patients in primary care can be detrimental, risk-stratified waiting lists for echocardiography might optimize resource utilization. We investigated whether a prognostic threshold level of the cardiac peptide NT-proBNP could be identified. METHODS AND RESULTS: From 2003-2005, 5875 primary care patients with suspected HF (median age 73 years) had NT-proBNP analysed in the Copenhagen area. Eighteen percent died and 20% had a cardiovascular (CV) hospitalization (median follow-up …). An NT-proBNP level above the identified threshold (…/mL) was associated with an 80% (95% CI: 20-190, P = 0.01) increased mortality risk after adjustment for age, sex, previous hospitalization, CV diseases, and chronic diseases. CONCLUSION: We identified prognostic threshold levels for mortality and CV hospitalization for NT-proBNP in primary care patients with suspected HF.

  10. Optimization of Segmentation Quality of Integrated Circuit Images

    Directory of Open Access Journals (Sweden)

    Gintautas Mušketas

    2012-04-01

    The paper presents an investigation into the application of genetic algorithms to the segmentation of the active regions of integrated circuit images. The article gives a theoretical examination of the applied morphological methods (dilation, erosion, hit-and-miss, threshold) and describes genetic algorithms and image segmentation as an optimization problem. The genetic optimization of the parameters of a predefined filter sequence is carried out; it improves segmentation accuracy by 6% compared with the non-optimized filter sequence. Article in Lithuanian.

  11. Automatic treatment plan re-optimization for adaptive radiotherapy guided with the initial plan DVHs

    International Nuclear Information System (INIS)

    Li, Nan; Zarepisheh, Masoud; Uribe-Sanchez, Andres; Moore, Kevin; Tian, Zhen; Zhen, Xin; Graves, Yan Jiang; Gautier, Quentin; Mell, Loren; Jia, Xun; Jiang, Steve; Zhou, Linghong

    2013-01-01

    Adaptive radiation therapy (ART) can reduce normal tissue toxicity and/or improve tumor control through treatment adaptations based on the current patient anatomy. Developing an efficient and effective re-planning algorithm is an important step toward the clinical realization of ART. For the re-planning process, manual trial-and-error approach to fine-tune planning parameters is time-consuming and is usually considered unpractical, especially for online ART. It is desirable to automate this step to yield a plan of acceptable quality with minimal interventions. In ART, prior information in the original plan is available, such as dose–volume histogram (DVH), which can be employed to facilitate the automatic re-planning process. The goal of this work is to develop an automatic re-planning algorithm to generate a plan with similar, or possibly better, DVH curves compared with the clinically delivered original plan. Specifically, our algorithm iterates the following two loops. An inner loop is the traditional fluence map optimization, in which we optimize a quadratic objective function penalizing the deviation of the dose received by each voxel from its prescribed or threshold dose with a set of fixed voxel weighting factors. In outer loop, the voxel weighting factors in the objective function are adjusted according to the deviation of the current DVH curves from those in the original plan. The process is repeated until the DVH curves are acceptable or maximum iteration step is reached. The whole algorithm is implemented on GPU for high efficiency. The feasibility of our algorithm has been demonstrated with three head-and-neck cancer IMRT cases, each having an initial planning CT scan and another treatment CT scan acquired in the middle of treatment course. Compared with the DVH curves in the original plan, the DVH curves in the resulting plan using our algorithm with 30 iterations are better for almost all structures. The re-optimization process takes about 30
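
    The two-loop structure described above can be summarized schematically: an inner weighted least-squares fluence optimization with fixed voxel weights, and an outer loop that raises the weights of voxels in structures whose DVH metric falls short of the reference. The sketch below shows control flow only, with projected gradient descent standing in for the paper's GPU solver and a simplified per-structure DVH comparison; all names and data are illustrative:

        import numpy as np

        def replan(D, d_ref, dvh_ref, structures, n_outer=15, n_inner=40):
            """Schematic DVH-guided re-optimization.
            D: dose-influence matrix (voxels x beamlets); d_ref: per-voxel prescribed/threshold dose;
            dvh_ref: reference DVH metric per structure; structures: dict of voxel-index arrays."""
            n_vox, n_blt = D.shape
            w = np.ones(n_vox)                                   # voxel weighting factors
            x = np.zeros(n_blt)                                  # fluence map
            lip = np.linalg.norm(D, 2) ** 2                      # Lipschitz constant of the gradient (w = 1)
            for _ in range(n_outer):
                step = 1.0 / (lip * w.max())                     # keep gradient descent stable
                for _ in range(n_inner):                         # inner loop: fixed-weight quadratic objective
                    grad = D.T @ (w * (D @ x - d_ref))
                    x = np.maximum(x - step * grad, 0.0)         # non-negative fluence
                d = D @ x
                for s, vox in structures.items():                # outer loop: adjust weights from DVH deviation
                    achieved = np.median(d[vox])                 # toy DVH metric (median structure dose)
                    if achieved > dvh_ref[s]:                    # toy rule: structure exceeds its reference
                        w[vox] *= 1.0 + (achieved - dvh_ref[s]) / dvh_ref[s]
                w *= n_vox / w.sum()                             # keep weights on a comparable scale
            return x

        rng = np.random.default_rng(5)
        D = rng.random((200, 30))
        structures = {"PTV": np.arange(0, 100), "OAR": np.arange(100, 200)}
        d_ref = np.where(np.arange(200) < 100, 60.0, 20.0)       # e.g. 60 Gy target, 20 Gy OAR cap
        x = replan(D, d_ref, {"PTV": 60.0, "OAR": 18.0}, structures)
        print("fluence non-zeros:", int((x > 0).sum()))

    In the actual algorithm the comparison is made curve-wise against the original plan's DVHs, and the iteration stops once the DVH curves are acceptable or a maximum iteration count is reached.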

  12. Children's acceptance learning of New Nordic components and potential challenges

    DEFF Research Database (Denmark)

    Hartvig, Ditte Luise

    It has been suggested that dietary recommendations should be tailored to regional conditions to bridge gastronomy, health and sustainability. The New Nordic Diet (NND) has been defined as part of the OPUS project, "Optimal well-being, development and health of school children through a New Nordic Diet". … children's food preferences. In the first part of the project it was investigated how a five-week intervention with Nordic foods and food engagement affected the acceptance of sea-buckthorn berry products not included in the intervention. The effect of the intervention was compared to the effect of eight product … The results showed that repeated exposure as well as food engagement constitute efficient methods to enhance the acceptance of Nordic foods. Furthermore, the importance of follow-up tests and initial liking was highlighted. Many different factors affect acceptance and acceptance learning of food products, and some of those may even …

  13. 3.05 kW monolithic fiber laser oscillator with simultaneous optimizations of stimulated Raman scattering and transverse mode instability

    Science.gov (United States)

    Yang, Baolai; Zhang, Hanwei; Shi, Chen; Tao, Rumao; Su, Rongtao; Ma, Pengfei; Wang, Xiaolin; Zhou, Pu; Xu, Xiaojun; Lu, Qisheng

    2018-01-01

    We report a high power monolithic ytterbium-doped fiber laser oscillator with an output power of 3.05 kW, which is achieved by simultaneous optimizations of the stimulated Raman scattering (SRS) and transverse mode instability (TMI). The optimizations of the SRS are designed and utilized in the construction of the fiber laser oscillator, while the TMI threshold is optimized with the study of the dependence of TMI threshold on the pump distribution. In the fiber laser oscillator, the TMI threshold is enhanced by ˜30% when the counter-pump scheme is employed instead of the co-pump scheme. By applying bidirectional-pump scheme and appropriately distributing the pump power, the TMI threshold is further enhanced and the monolithic fiber laser oscillator achieves an output power of 3.05 kW with near diffraction limited beam quality.

  14. Acceptability of the Predicting Abusive Head Trauma (PredAHT) clinical prediction tool: A qualitative study with child protection professionals.

    Science.gov (United States)

    Cowley, Laura E; Maguire, Sabine; Farewell, Daniel M; Quinn-Scoggins, Harriet D; Flynn, Matthew O; Kemp, Alison M

    2018-05-09

    The validated Predicting Abusive Head Trauma (PredAHT) tool estimates the probability of abusive head trauma (AHT) based on combinations of six clinical features: head/neck bruising; apnea; seizures; rib fractures; long-bone fractures; and retinal hemorrhages. We aimed to determine the acceptability of PredAHT to child protection professionals. We conducted qualitative semi-structured interviews with 56 participants: clinicians (25), child protection social workers (10), legal practitioners (9, including 4 judges), police officers (8), and pathologists (4), purposively sampled across southwest United Kingdom. Interviews were recorded, transcribed and imported into NVivo for thematic analysis (38% double-coded). We explored participants' evaluations of PredAHT, their opinions about the optimal way to present the calculated probabilities, and their interpretation of probabilities in the context of suspected AHT. Clinicians, child protection social workers and police thought PredAHT would be beneficial as an objective adjunct to their professional judgment, to give them greater confidence in their decisions. Lawyers and pathologists appreciated its value for prompting multidisciplinary investigations, but were uncertain of its usefulness in court. Perceived disadvantages included: possible over-reliance and false reassurance from a low score. Interpretations regarding which percentages equate to 'low', 'medium' or 'high' likelihood of AHT varied; participants preferred a precise % probability over these general terms. Participants would use PredAHT with provisos: if they received multi-agency training to define accepted risk thresholds for consistent interpretation; with knowledge of its development; if it was accepted by colleagues. PredAHT may therefore increase professionals' confidence in their decision-making when investigating suspected AHT, but may be of less value in court. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Simulated annealing method for electronic circuits design: adaptation and comparison with other optimization methods

    International Nuclear Information System (INIS)

    Berthiau, G.

    1995-10-01

    The circuit design problem consists in determining acceptable parameter values (resistors, capacitors, transistor geometries ...) which allow the circuit to meet various user-given operational criteria (DC consumption, AC bandwidth, transient times ...). This task is equivalent to a multidimensional and/or multi-objective optimization problem: n-variable functions have to be minimized in a hyper-rectangular domain; equality constraints can eventually be specified. A similar problem consists in fitting component models. In this case, the optimization variables are the model parameters and one aims at minimizing a cost function built on the error between the model response and the data measured on the component. The optimization method chosen for this kind of problem is the simulated annealing method. This method, which comes from the combinatorial optimization domain, has been adapted and compared with other global optimization methods for continuous-variable problems. An efficient strategy of variable discretization and a set of complementary stopping criteria have been proposed. The different parameters of the method have been adjusted with analytical functions whose minima are known, classically used in the literature. Our simulated annealing algorithm has been coupled with an open electrical simulator, SPICE-PAC, whose modular structure allows the chaining of simulations required by the circuit optimization process. We proposed, for high-dimensional problems, a partitioning technique which ensures proportionality between CPU time and the number of variables. To compare our method with others, we have adapted three other methods coming from the combinatorial optimization domain (the threshold method, a genetic algorithm and the Tabu search method). The tests have been performed on the same set of test functions and the results allow a first comparison between these methods applied to continuous optimization variables. Finally, our simulated annealing program

  16. Intermediate structure and threshold phenomena

    International Nuclear Information System (INIS)

    Hategan, Cornel

    2004-01-01

    The Intermediate Structure, evidenced through microstructures of the neutron strength function, is reflected in open reaction channels as fluctuations in excitation function of nuclear threshold effects. The intermediate state supporting both neutron strength function and nuclear threshold effect is a micro-giant neutron threshold state. (author)

  17. Asymmetric underlap optimization of sub-10nm finfets for realizing energy-efficient logic and robust memories

    Science.gov (United States)

    Akkala, Arun Goud

    Leakage currents in CMOS transistors have risen dramatically with technology scaling, leading to a significant increase in standby power consumption. Among the various transistor candidates, the excellent short-channel immunity of silicon double-gate FinFETs has made them the best contender for successful scaling to sub-10nm nodes. For sub-10nm FinFETs, new quantum mechanical leakage mechanisms, such as direct source-to-drain tunneling (DSDT) of charge carriers through the channel potential energy barrier arising from the proximity of the source/drain regions coupled with the high transport-direction electric field, are expected to dominate overall leakage. To counter the effects of DSDT and worsening short-channel effects, and to maintain Ion/Ioff, performance and power consumption at reasonable values, device optimization techniques are necessary for deeply scaled transistors. In this work, source/drain underlapping of FinFETs has been explored using quantum mechanical device simulations as a potentially promising method to lower DSDT while maintaining the Ion/Ioff ratio at acceptable levels. By adopting a device/circuit/system level co-design approach, it is shown that asymmetric underlapping, where the drain-side underlap is longer than the source-side underlap, results in optimal energy efficiency for logic circuits in near-threshold as well as standard, super-threshold operating regimes. In addition, read/write conflict in 6T SRAMs and the degradation in cell noise margins due to the low supply voltage can be mitigated by using optimized asymmetric-underlapped n-FinFETs for the access transistor, thereby leading to robust cache memories. When gate-workfunction tuning is possible, using asymmetric-underlapped n-FinFETs for both access and pull-down devices in an SRAM bit cell can lead to high-speed and low-leakage caches. Further, it is shown that threshold voltage degradation in the presence of Hot Carrier Injection (HCI) is less severe in asymmetric underlap n-FinFETs.

  18. MULTIMODAL BIOMETRIC AUTHENTICATION USING PARTICLE SWARM OPTIMIZATION ALGORITHM WITH FINGERPRINT AND IRIS

    Directory of Open Access Journals (Sweden)

    A. Muthukumar

    2012-02-01

    In general, identification and verification are done by passwords, PIN numbers, etc., which are easily cracked by others. In order to overcome this issue, biometrics is a unique tool for authenticating an individual. Nevertheless, unimodal biometrics suffers from noise, intra-class variations, spoof attacks, non-universality and other attacks. In order to avoid these attacks, multimodal biometrics, i.e. the combination of several modalities, is adopted. In a biometric authentication system, the acceptance or rejection of an entity depends on the similarity score falling above or below the threshold. Hence this paper focuses on the security of the biometric system, because compromised biometric templates cannot be revoked or reissued, and it proposes a multimodal system based on an evolutionary algorithm, Particle Swarm Optimization, that adapts to varying security environments. With these two concerns, a design incorporating adaptability, authenticity and security has been developed.
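    As a purely illustrative sketch of the accept/reject step described above, in which a fused similarity score is compared against a decision threshold, the snippet below assumes two hypothetical normalized match scores (fingerprint and iris) and a weighted-sum fusion rule; the weights and the threshold value are placeholders, not parameters from the paper.

```python
def fuse_scores(fingerprint_score: float, iris_score: float,
                w_fp: float = 0.5, w_iris: float = 0.5) -> float:
    """Weighted-sum fusion of two normalized similarity scores in [0, 1]."""
    return w_fp * fingerprint_score + w_iris * iris_score

def authenticate(fingerprint_score: float, iris_score: float,
                 threshold: float = 0.6) -> bool:
    """Accept the claimed identity if the fused score clears the threshold."""
    return fuse_scores(fingerprint_score, iris_score) >= threshold

# Example: a genuine attempt with a strong iris match but weaker fingerprint match.
print(authenticate(0.55, 0.80))   # True: fused score 0.675 clears the placeholder 0.6
print(authenticate(0.30, 0.40))   # False: fused score 0.35 falls below the threshold
```

    Raising the threshold lowers the false-acceptance rate at the cost of more false rejections, which is the trade-off the proposed PSO-based system tunes adaptively.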

  19. Nuclear threshold effects and neutron strength function

    International Nuclear Information System (INIS)

    Hategan, Cornel; Comisel, Horia

    2003-01-01

    One proves that a Nuclear Threshold Effect is dependent, via the Neutron Strength Function, on the Spectroscopy of the Ancestral Neutron Threshold State. The magnitude of the Nuclear Threshold Effect is proportional to the Neutron Strength Function. Evidence for the relation of Nuclear Threshold Effects to Neutron Strength Functions is obtained from the Isotopic Threshold Effect and the Deuteron Stripping Threshold Anomaly. The empirical and computational analysis of the Isotopic Threshold Effect and of the Deuteron Stripping Threshold Anomaly demonstrates their close relationship to Neutron Strength Functions. It is established that Nuclear Threshold Effects depend, in addition to genuine Nuclear Reaction Mechanisms, on the Spectroscopy of the (Ancestral) Neutron Threshold State. The magnitude of the effect is proportional to the Neutron Strength Function, in its dependence on mass number. This result also constitutes a proof that the origins of these threshold effects are Neutron Single Particle States at zero energy. (author)

  20. Review of issues relevant to acceptable risk criteria for nuclear waste management

    International Nuclear Information System (INIS)

    Cohen, J.J.

    1978-01-01

    Development of acceptable risk criteria for nuclear waste management requires the translation of publicly determined goals and objectives into definitive issues which, in turn, require resolution. Since these issues are largely of a subjective nature, they cannot be resolved by technological methods. Development of acceptable risk criteria might best be accomplished by application of a systematic methodology for the optimal implementation of subjective values. Multi-attribute decision analysis is well suited for this purpose

  1. Modeling jointly low, moderate, and heavy rainfall intensities without a threshold selection

    KAUST Repository

    Naveau, Philippe

    2016-04-09

    In statistics, extreme events are often defined as excesses above a given large threshold. This definition allows hydrologists and flood planners to apply Extreme-Value Theory (EVT) to their time series of interest. Even in the stationary univariate context, this approach has at least two main drawbacks. First, working with excesses implies that a lot of observations (those below the chosen threshold) are completely disregarded. The range of precipitation is artificially chopped into two pieces, namely large intensities and the rest, which necessarily imposes different statistical models for each piece. Second, this strategy raises a nontrivial and very practical difficulty: how to choose the optimal threshold which correctly discriminates between low and heavy rainfall intensities. To address these issues, we propose a statistical model in which EVT results apply not only to heavy, but also to low precipitation amounts (zeros excluded). Our model is in compliance with EVT on both ends of the spectrum and allows a smooth transition between the two tails, while keeping a low number of parameters. In terms of inference, we have implemented and tested two classical methods of estimation: likelihood maximization and probability weighted moments. Last but not least, there is no need to choose a threshold to define low and high excesses. The performance and flexibility of this approach are illustrated on simulated data and on hourly precipitation recorded in Lyon, France.
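    For contrast, the classical peaks-over-threshold approach that the record argues against looks roughly like the sketch below: choose a high threshold, keep only the excesses, and fit a generalized Pareto distribution to them. The synthetic data and the 95th-percentile threshold are assumptions made purely for illustration; the paper's whole point is to avoid this threshold choice.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for hourly rainfall: many small values plus a heavy upper tail.
rain = rng.gamma(shape=0.4, scale=2.0, size=20_000)

# Classical peaks-over-threshold: pick a high threshold (here the 95th percentile,
# an arbitrary choice) and model only the excesses above it with a GPD.
u = np.quantile(rain, 0.95)
excesses = rain[rain > u] - u
shape, loc, scale = stats.genpareto.fit(excesses, floc=0)
print(f"threshold u = {u:.2f}, GPD shape = {shape:.3f}, scale = {scale:.3f}")

# Tail probability of exceeding some level x > u implied by the fitted model.
x = u + 5.0
p_exceed_u = excesses.size / rain.size
print(f"P(rain > {x:.2f}) ~ {p_exceed_u * stats.genpareto.sf(x - u, shape, loc=0, scale=scale):.5f}")
```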

  2. Optimal diagnostic measures and thresholds for hypogonadism in men with HIV/AIDS: comparison between 2 transdermal testosterone replacement therapy gels.

    Science.gov (United States)

    Blick, Gary

    2013-03-01

    Testim®. Patients treated with Testim® showed significantly greater improvements in libido, sexual performance, nighttime energy, focus/concentration, and abdominal girth, and trends for greater improvement in fatigue and erectile dysfunction than patients treated with AndroGel®. No patients discontinued therapy due to adverse events. The most useful serum testosterone measurement and threshold for diagnosing hypogonadism in men with HIV/AIDS was FT level < 100 pg/mL, which identified 64% of men as hypogonadal with the presence of ≥ 1 hypogonadal symptom. This is above currently accepted thresholds. Criteria using TT level < 300 ng/dL and FT level < 50 pg/mL only diagnosed 24% and 19% of patients, respectively, as having hypogonadism. Testim® was more effective than AndroGel® in increasing TT and FT levels and improving hypogonadal symptoms.

  3. Preoperative thresholds for pulmonary valve replacement in patients with corrected tetralogy of Fallot using cardiovascular magnetic resonance.

    NARCIS (Netherlands)

    Oosterhof, T.; Straten, A. van; Vliegen, H.W.; Meijboom, F.J.; Dijk, A.P.J. van; Spijkerboer, A.M.; Bouma, B.J.; Zwinderman, A.H.; Hazekamp, M.G.; Roos, A.; Mulder, B.J.M.

    2007-01-01

    BACKGROUND: To facilitate the optimal timing of pulmonary valve replacement, we analyzed preoperative thresholds of right ventricular (RV) volumes above which no decrease or normalization of RV size takes place after surgery. METHODS AND RESULTS: Between 1993 and 2006, 71 adult patients with

  4. Preoperative thresholds for pulmonary valve replacement in patients with corrected tetralogy of Fallot using cardiovascular magnetic resonance

    NARCIS (Netherlands)

    Oosterhof, Thomas; van Straten, Alexander; Vliegen, Hubert W.; Meijboom, Folkert J.; van Dijk, Arie P. J.; Spijkerboer, Anje M.; Bouma, Berto J.; Zwinderman, Aeilko H.; Hazekamp, Mark G.; de Roos, Albert; Mulder, Barbara J. M.

    2007-01-01

    Background - To facilitate the optimal timing of pulmonary valve replacement, we analyzed preoperative thresholds of right ventricular (RV) volumes above which no decrease or normalization of RV size takes place after surgery. Methods and Results - Between 1993 and 2006, 71 adult patients with

  5. Optimization and Characterization of CMOS for Ultra Low Power Applications

    International Nuclear Information System (INIS)

    Ajmal Kafeel, M.; Hasan, M.; Shah Alalm, M; Pable, S.D.

    2015-01-01

    Aggressive voltage scaling into the subthreshold operating region holds great promise for applications with strict energy budgets. However, it has been established that a higher-speed super-threshold device is not suitable for moderate-performance subthreshold circuits. The design constraints for selecting V_th and T_ox are much more flexible for subthreshold circuits at low voltage levels than for super-threshold circuits. In order to obtain better performance from a device under subthreshold conditions, it is necessary to investigate and optimize the process and geometry parameters of a Si MOSFET at the nanometer technology node. This paper calibrates the fabrication process parameters and electrical characteristics of n- and p-MOSFETs with a 35 nm physical gate length. Thereafter, the device calibrated for super-threshold applications is optimized for better performance under subthreshold conditions using TCAD simulation. The device simulated in this work shows a 9.89% improvement in subthreshold slope and a 34% advantage in the I_on/I_off ratio for the same drive current.

  6. High-resolution modeling of thermal thresholds and environmental influences on coral bleaching for local and regional reef management.

    Science.gov (United States)

    Kumagai, Naoki H; Yamano, Hiroya

    2018-01-01

    Coral reefs are one of the world's most threatened ecosystems, with global and local stressors contributing to their decline. Excessive sea-surface temperatures (SSTs) can cause coral bleaching, resulting in coral death and decreases in coral cover. An SST threshold of 1 °C over the climatological maximum is widely used to predict coral bleaching. In this study, we refined thermal indices predicting coral bleaching at high spatial resolution (1 km) by statistically optimizing thermal thresholds, as well as considering other environmental influences on bleaching such as ultraviolet (UV) radiation, water turbidity, and cooling effects. We used a coral bleaching dataset derived from the web-based monitoring system Sango Map Project, at scales appropriate for the local and regional conservation of Japanese coral reefs. We recorded coral bleaching events in Japan in the years 2004-2016. We revealed the influence of multiple factors on the ability to predict coral bleaching, including the selection of thermal indices, statistical optimization of thermal thresholds, quantification of multiple environmental influences, and the use of multiple modeling methods (generalized linear models and random forests). After optimization, differences in predictive ability among thermal indices were negligible. The thermal index, UV radiation, water turbidity, and cooling effects were important predictors of the occurrence of coral bleaching. Predictions based on the best model revealed that coral reefs in Japan have experienced recent and widespread bleaching. A practical method to reduce bleaching frequency by screening UV radiation was also demonstrated in this paper.
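    A minimal sketch of what "statistically optimizing a thermal threshold" can amount to in practice (not the authors' code): given per-observation SST anomalies and observed bleaching outcomes, scan candidate thresholds and keep the one that best separates bleached from unbleached records. The synthetic data and the balanced-accuracy criterion are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic data: SST anomaly above the climatological maximum (degrees C) and
# whether bleaching was observed at each site/date.
sst_anom = rng.normal(0.8, 0.6, size=500)
bleached = (sst_anom + rng.normal(0, 0.4, size=500)) > 1.0

def balanced_accuracy(threshold):
    pred = sst_anom > threshold
    tpr = np.mean(pred[bleached])      # sensitivity
    tnr = np.mean(~pred[~bleached])    # specificity
    return 0.5 * (tpr + tnr)

candidates = np.arange(0.0, 2.01, 0.05)
best = max(candidates, key=balanced_accuracy)
print(f"optimized threshold ~ {best:.2f} degrees C (standard value would be 1.0)")
```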

  7. Intelligent Mechanical Fault Diagnosis Based on Multiwavelet Adaptive Threshold Denoising and MPSO

    Directory of Open Access Journals (Sweden)

    Hao Sun

    2014-01-01

    The condition diagnosis of rotating machinery depends largely on feature analysis of the measured vibration signals. However, the signals measured from rotating machinery are usually nonstationary, nonlinear and contaminated with noise, so the useful fault features are hidden in heavy background noise. In this paper, a novel fault diagnosis method for rotating machinery based on multiwavelet adaptive threshold denoising and mutation particle swarm optimization (MPSO) is proposed. The Geronimo, Hardin, and Massopust (GHM) multiwavelet is employed for extracting weak fault features from the background noise, and a method for adaptively selecting an appropriate multiwavelet threshold based on the energy ratio of the multiwavelet coefficients is presented. Six nondimensional symptom parameters (SPs) in the frequency domain are defined to reflect the features of the vibration signals measured in each state. A detection index (DI) based on statistical theory is also defined to evaluate the sensitivity of each SP for condition diagnosis. An MPSO algorithm with adaptive inertia-weight adjustment and particle mutation is proposed for condition identification. The MPSO algorithm effectively solves the local-optimum and premature-convergence problems of the conventional particle swarm optimization (PSO) algorithm and can provide a more accurate estimate for fault diagnosis. Practical examples of fault diagnosis for rolling element bearings are given to verify the effectiveness of the proposed method.

  8. A New Wavelet Threshold Function and Denoising Application

    Directory of Open Access Journals (Sweden)

    Lu Jing-yi

    2016-01-01

    In order to improve denoising performance, this paper reviews the basic principles of wavelet threshold denoising and the traditional threshold functions, and proposes an improved wavelet threshold function and an improved fixed-threshold formula. First, the problems of the traditional wavelet threshold functions are studied and adjustment factors are introduced to construct a new threshold function based on the soft-threshold function. Then, the fixed threshold is studied and the logarithm of the wavelet decomposition level is introduced to design a new fixed-threshold formula. Finally, the hard threshold, soft threshold, Garrote threshold, and the improved threshold function are used to denoise different signals, and the signal-to-noise ratio (SNR) and mean square error (MSE) after denoising are calculated for each. Theoretical analysis and experimental results showed that the proposed approach overcomes the constant bias of the soft-threshold function and the discontinuity of the hard-threshold function, allows different decomposition scales to use appropriate threshold values, effectively filters the noise in the signals, and improves the SNR and reduces the MSE of the output signals.
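    For readers unfamiliar with the threshold functions being compared, the sketch below implements the standard hard, soft and Garrote rules on wavelet detail coefficients, together with the classical universal fixed threshold sigma*sqrt(2*ln N); the paper's improved threshold function and its modified fixed-threshold formula are not reproduced here.

```python
import numpy as np

def hard_threshold(w, t):
    """Keep coefficients with magnitude above t, zero the rest."""
    return np.where(np.abs(w) > t, w, 0.0)

def soft_threshold(w, t):
    """Shrink surviving coefficients toward zero by t (introduces a constant bias)."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def garrote_threshold(w, t):
    """Non-negative garrote: continuous like soft thresholding, less biased for large w."""
    with np.errstate(divide="ignore", invalid="ignore"):
        shrunk = w - t**2 / w
    return np.where(np.abs(w) > t, shrunk, 0.0)

def universal_threshold(w):
    """Donoho-Johnstone universal threshold sigma*sqrt(2 ln N), with sigma
    estimated from the median absolute deviation of the coefficients."""
    sigma = np.median(np.abs(w)) / 0.6745
    return sigma * np.sqrt(2.0 * np.log(w.size))

# Toy example: mostly noise coefficients plus a few large signal coefficients.
rng = np.random.default_rng(0)
w = np.concatenate([rng.normal(0, 1, 1000), [8.0, -6.5, 7.2]])
t = universal_threshold(w)
print(t, hard_threshold(w, t)[-3:], soft_threshold(w, t)[-3:], garrote_threshold(w, t)[-3:])
```

    Soft thresholding is continuous but biases large coefficients by a constant t, hard thresholding is unbiased but discontinuous, and the Garrote rule sits between the two; the function proposed in the paper aims for a similar compromise.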

  9. Explanatory style, dispositional optimism, and reported parental behavior.

    Science.gov (United States)

    Hjelle, L A; Busch, E A; Warren, J E

    1996-12-01

    The relationship between two cognitive personality constructs (explanatory style and dispositional optimism) and retrospective self-reports of maternal and paternal behavior were investigated. College students (62 men and 145 women) completed the Life Orientation Test, Attributional Style Questionnaire, and Parental Acceptance-Rejection Questionnaire in a single session. As predicted, dispositional optimism was positively correlated with reported maternal and paternal warmth/acceptance and negatively correlated with aggression/hostility, neglect/indifference, and undifferentiated rejection during middle childhood. Unexpectedly, explanatory style was found to be more strongly associated with retrospective reports of paternal as opposed to maternal behavior. The implications of these results for future research concerning the developmental antecedents of differences in explanatory style and dispositional optimism are discussed.

  10. Improvement of the drift chamber system in the SAPHIR detector and first measurements of the Φ meson production at threshold

    International Nuclear Information System (INIS)

    Scholmann, J.N.

    1996-09-01

    The SAPHIR detector at ELSA enables the measurement of photon-induced Φ meson production from threshold up to 3 GeV over the full kinematical range. A considerable improvement of the drift chamber system is a precondition for achieving the necessary data rate in an acceptable time. The work focuses on the choice of the chamber gas and on a different mechanical construction, so as to minimize the negative influence of the photon beam crossing the sensitive volume of the drift chamber system. In addition, first preliminary results for the total and differential cross sections of Φ meson production close to threshold were evaluated. (orig.)

  11. Threshold behavior in electron-atom scattering

    International Nuclear Information System (INIS)

    Sadeghpour, H.R.; Greene, C.H.

    1996-01-01

    Ever since the classic work of Wannier in 1953, the process of treating two threshold electrons in the continuum of a positively charged ion has been an active field of study. The authors have developed a treatment motivated by the physics below the double ionization threshold. By modeling the double ionization as a series of Landau-Zener transitions, they obtain an analytical formulation of the absolute threshold probability which has a leading power law behavior, akin to Wannier's law. Some of the noteworthy aspects of this derivation are that the derivation can be conveniently continued below threshold, giving rise to a "cusp" at threshold, and that on both sides of the threshold, absolute values of the cross sections are obtained.

  12. Temporary threshold shift after impulse-noise during video game play: laboratory data.

    Science.gov (United States)

    Spankovich, C; Griffiths, S K; Lobariñas, E; Morgenstein, K E; de la Calle, S; Ledon, V; Guercio, D; Le Prell, C G

    2014-03-01

    Prevention of temporary threshold shift (TTS) after laboratory-based exposure to pure-tones, broadband noise, and narrowband noise signals has been achieved, but prevention of TTS under these experimental conditions may not accurately reflect protection against hearing loss following impulse noise. This study used a controlled laboratory-based TTS paradigm that incorporated impulsive stimuli into the exposure protocol; development of this model could provide a novel platform for assessing proposed therapeutics. Participants played a video game that delivered gunfire-like sound through headphones as part of a target practice game. Effects were measured using audiometric threshold evaluations and distortion product otoacoustic emissions (DPOAEs). The sound level and number of impulses presented were sequentially increased throughout the study. Participants were normal-hearing students at the University of Florida who provided written informed consent prior to participation. TTS was not reliably induced by any of the exposure conditions assessed here. However, there was significant individual variability, and a subset of subjects showed TTS under some exposure conditions. A subset of participants demonstrated reliable threshold shifts under some conditions. Additional experiments are needed to better understand and optimize stimulus parameters that influence TTS after simulated impulse noise.

  13. Thresholds for HLB vector control in infected commercial citrus and compatibility with biological control

    OpenAIRE

    Monzo, C.; Hendricks, K.; Roberts, P.; Stansly, P. A.

    2014-01-01

    Control of the HLB vector, Diaphorina citri Kuwayama, is considered a basic component of the management of this disease, even in a high-HLB-incidence scenario. Such control is mostly chemically oriented. However, overuse of insecticides would increase costs and be incompatible with biological control. Establishment of economic thresholds for psyllid control under different price scenarios could optimize returns on investment.

  14. Double Photoionization Near Threshold

    Science.gov (United States)

    Wehlitz, Ralf

    2007-01-01

    The threshold region of the double-photoionization cross section is of particular interest because both ejected electrons move slowly in the Coulomb field of the residual ion. Near threshold both electrons have time to interact with each other and with the residual ion. Also, different theoretical models compete to describe the double-photoionization cross section in the threshold region. We have investigated that cross section for lithium and beryllium and have analyzed our data with respect to the latest results in the Coulomb-dipole theory. We find that our data support the idea of a Coulomb-dipole interaction.

  15. Longitudinal Single-Bunch Instability in the ILC Damping Rings: Estimate of Current Threshold

    International Nuclear Information System (INIS)

    Venturini, Marco

    2008-01-01

    Characterization of single-bunch instabilities in the International Linear Collider (ILC) damping rings (DRs) has been indicated as a high-priority activity toward completion of an engineering design. In this paper we report on a first estimate of the current thresholds for the instability using numerical and analytical models of the wake potentials associated with the various machine components. The numerical models were derived (upon appropriate scaling) from designs of the corresponding components installed in existing machines. The current thresholds for instabilities were determined by numerical solution of the Vlasov equation for the longitudinal dynamics. For the DR baseline lattice as of Feb. 2007 we find the critical current for instability to be safely above the design specifications leaving room for further optimization of the choice of the momentum compaction

  16. Optimal Sequential Rules for Computer-Based Instruction.

    Science.gov (United States)

    Vos, Hans J.

    1998-01-01

    Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…

  17. The impact of uncertainty on optimal emission policies

    Science.gov (United States)

    Botta, Nicola; Jansson, Patrik; Ionescu, Cezar

    2018-05-01

    We apply a computational framework for specifying and solving sequential decision problems to study the impact of three kinds of uncertainty on optimal emission policies in a stylized sequential emission problem. We find that uncertainties about the implementability of decisions on emission reductions (or increases) have a greater impact on optimal policies than uncertainties about the availability of effective emission reduction technologies and uncertainties about the implications of trespassing critical cumulated emission thresholds. The results show that uncertainties about the implementability of decisions on emission reductions (or increases) call for more precautionary policies. In other words, delaying emission reductions to the point in time when effective technologies will become available is suboptimal when these uncertainties are accounted for rigorously. By contrast, uncertainties about the implications of exceeding critical cumulated emission thresholds tend to make early emission reductions less rewarding.

  18. Gene expression analysis supports tumor threshold over 2.0 cm for T-category breast cancer.

    Science.gov (United States)

    Solvang, Hiroko K; Frigessi, Arnoldo; Kaveh, Fateme; Riis, Margit L H; Lüders, Torben; Bukholm, Ida R K; Kristensen, Vessela N; Andreassen, Bettina K

    2016-12-01

    Tumor size, as indicated by the T-category, is known as a strong prognostic indicator for breast cancer. It is common practice to distinguish the T1 and T2 groups at a tumor size of 2.0 cm. We investigated the 2.0-cm rule from a new point of view. Here, we try to find the optimal threshold based on the differences between the gene expression profiles of the T1 and T2 groups (as defined by the threshold). We developed a numerical algorithm to measure the overall differential gene expression between patients with smaller tumors and those with larger tumors among multiple expression datasets from different studies. We confirmed the performance of the proposed algorithm by a simulation study and then applied it to three different studies conducted at two Norwegian hospitals. We found that the maximum difference in gene expression is obtained at a threshold of 2.2-2.4 cm, and we confirmed that the optimum threshold was over 2.0 cm, as indicated by a validation study using five publicly available expression datasets. Furthermore, we observed a significant differentiation between the two threshold groups in terms of time to local recurrence for the Norwegian datasets. In addition, we performed an associated network and canonical pathway analyses for the genes differentially expressed between tumors below and above the given thresholds, 2.0 and 2.4 cm, using the Norwegian datasets. The associated network function illustrated a cellular assembly of the genes for the 2.0-cm threshold: an energy production for the 2.4-cm threshold and an enrichment in lipid metabolism based on the genes in the intersection for the 2.0- and 2.4-cm thresholds.

  19. Why women apologize more than men: gender differences in thresholds for perceiving offensive behavior.

    Science.gov (United States)

    Schumann, Karina; Ross, Michael

    2010-11-01

    Despite wide acceptance of the stereotype that women apologize more readily than men, there is little systematic evidence to support this stereotype or its supposed bases (e.g., men's fragile egos). We designed two studies to examine whether gender differences in apology behavior exist and, if so, why. In Study 1, participants reported in daily diaries all offenses they committed or experienced and whether an apology had been offered. Women reported offering more apologies than men, but they also reported committing more offenses. There was no gender difference in the proportion of offenses that prompted apologies. This finding suggests that men apologize less frequently than women because they have a higher threshold for what constitutes offensive behavior. In Study 2, we tested this threshold hypothesis by asking participants to evaluate both imaginary and recalled offenses. As predicted, men rated the offenses as less severe than women did. These different ratings of severity predicted both judgments of whether an apology was deserved and actual apology behavior.

  20. Validity of Lactate Thresholds in Inline Speed Skating.

    Science.gov (United States)

    Hecksteden, Anne; Heinze, Tobias; Faude, Oliver; Kindermann, Wilfried; Meyer, Tim

    2015-09-01

    Lactate thresholds are commonly used as estimates of the highest workload where lactate production and elimination are in equilibrium (maximum lactate steady state [MLSS]). However, because of the high static load on propulsive muscles, lactate kinetics in inline speed skating may differ significantly from other endurance exercise modes. Therefore, the discipline-specific validity of lactate thresholds has to be verified. Sixteen competitive inline-speed skaters (age: 30 ± 10 years; training per week: 10 ± 4 hours) completed an exhaustive stepwise incremental exercise test (start 24 km·h⁻¹, step duration 3 minutes, increment 2 km·h⁻¹) to determine the individual anaerobic threshold (IAT) and the workload corresponding to a blood lactate concentration of 4 mmol·L⁻¹ (LT4), and 2-5 continuous load tests of (up to) 30 minutes to determine MLSS. The IAT and LT4 correlated significantly with MLSS, and the mean differences were almost negligible (MLSS 29.5 ± 2.5 km·h⁻¹; IAT 29.2 ± 2.0 km·h⁻¹; LT4 29.6 ± 2.3 km·h⁻¹; p > 0.1 for all differences). However, the variability of differences was considerable, resulting in 95% limits of agreement in the upper range of values known from other endurance disciplines (2.6 km·h⁻¹ [8.8%] for IAT and 3.1 km·h⁻¹ [10.3%] for LT4). Consequently, IAT and LT4 may be considered valid estimates of the MLSS in inline speed skating, but verification by means of a constant load test should be considered in cases of doubt or when optimal accuracy is needed (e.g., in elite athletes or scientific studies).

  1. On the need for a time- and location-dependent estimation of the NDSI threshold value for reducing existing uncertainties in snow cover maps at different scales

    Science.gov (United States)

    Härer, Stefan; Bernhardt, Matthias; Siebers, Matthias; Schulz, Karsten

    2018-05-01

    Knowledge of current snow cover extent is essential for characterizing energy and moisture fluxes at the Earth's surface. The snow-covered area (SCA) is often estimated by using optical satellite information in combination with the normalized-difference snow index (NDSI). The NDSI thereby uses a threshold to decide whether a satellite pixel is classified as snow covered or snow free. The spatiotemporal representativeness of the standard threshold of 0.4 is, however, questionable at the local scale. Here, we use local snow cover maps derived from ground-based photography to continuously calibrate the NDSI threshold values (NDSIthr) of Landsat satellite images at two European mountain sites over the period from 2010 to 2015. The Research Catchment Zugspitzplatt (RCZ, Germany) and the Vernagtferner area (VF, Austria) are both located within a single Landsat scene. Nevertheless, the long-term analysis demonstrated that the NDSIthr values at these sites are not correlated (r = 0.17) and differ from the standard threshold of 0.4. For further comparison, a dynamic and locally optimized NDSI threshold was used, as well as another locally optimized literature threshold value (0.7). It was shown that large uncertainties in the prediction of the SCA, of up to 24.1 %, exist in satellite snow cover maps in cases where the standard threshold of 0.4 is used, but a newly developed calibrated quadratic polynomial model which accounts for seasonal threshold dynamics can reduce this error. The model reduces the SCA uncertainties at the calibration site VF by 50 % in the evaluation period and was also able to improve the results at RCZ in a significant way. Additionally, a scaling experiment shows that the positive effect of a locally adapted threshold diminishes at pixel sizes of 500 m or larger, underlining the general applicability of the standard threshold at larger scales.
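    A minimal sketch of the NDSI-based snow classification being calibrated above, assuming the green and shortwave-infrared reflectances are already available as arrays (band selection, atmospheric correction and file handling are omitted):

```python
import numpy as np

def ndsi(green: np.ndarray, swir: np.ndarray) -> np.ndarray:
    """Normalized-Difference Snow Index from green and SWIR reflectances."""
    return (green - swir) / (green + swir + 1e-12)

def snow_mask(green: np.ndarray, swir: np.ndarray, threshold: float = 0.4) -> np.ndarray:
    """Pixel-wise snow/no-snow decision; 0.4 is the widely used standard
    threshold that the study above replaces with a locally calibrated,
    seasonally varying value."""
    return ndsi(green, swir) > threshold

# Toy 2x2 scene: bright snow has high green and low SWIR reflectance.
green = np.array([[0.70, 0.30], [0.65, 0.20]])
swir  = np.array([[0.10, 0.25], [0.12, 0.18]])
print(snow_mask(green, swir))            # standard threshold 0.4
print(snow_mask(green, swir, 0.7))       # stricter literature threshold
```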

  2. Sensitivity and Specificity of Swedish Interactive Threshold Algorithm and Standard Full Threshold Perimetry in Primary Open-angle Glaucoma.

    Science.gov (United States)

    Bamdad, Shahram; Beigi, Vahid; Sedaghat, Mohammad Reza

    2017-01-01

    Perimetry is one of the mainstays of glaucoma diagnosis and treatment. Different strategies offer different accuracies in glaucoma testing. Our aim was to determine and compare the diagnostic sensitivity and specificity of the Swedish Interactive Threshold Algorithm (SITA) Fast and Standard Full Threshold (SFT) strategies of the Humphrey Field Analyzer (HFA) in identifying patients with visual field defects in glaucoma. This prospective observational case series study was conducted in a university-based eye hospital. A total of 37 eyes of 20 patients with glaucoma were evaluated using the central 30-2 program and both the SITA Fast and SFT strategies. Both strategies were performed in each session, four times over a 2-week period. Data were analyzed using Student's t-test, analysis of variance, and the chi-square test. The SITA Fast and SFT strategies had a similar sensitivity of 93.3%. The specificity of the SITA Fast and SFT strategies was 57.4% and 71.4%, respectively. The mean duration of SFT tests was 14.6 minutes, and that of SITA Fast tests was 5.45 minutes (a statistically significant 62.5% reduction). In gray-scale plots, the visual field defect was less deep in SITA Fast than in SFT; however, more points showed significant defects (p < 0.5%) in the deviation plots in SITA Fast than in SFT; these differences were not clinically significant. In conclusion, the SITA Fast strategy showed higher sensitivity for the detection of glaucoma compared to the SFT strategy, yet with reduced specificity; however, the shorter test duration makes it a more acceptable choice in many clinical situations, especially for children, the elderly, and those with musculoskeletal diseases.

  3. Optimized non relativistic potential for quarkonium

    International Nuclear Information System (INIS)

    Rekab, S.; Zenine, N.

    2006-01-01

    For a non-relativistic description of quarkonia, we consider a wide class of quark-antiquark potentials of power-law form. A systematic study is made by optimizing the potential parameters with a fit to the quarkonium vector mesons that lie below the threshold for strong decays. Implications of the obtained results are discussed.

  4. Small-threshold behaviour of two-loop self-energy diagrams: two-particle thresholds

    International Nuclear Information System (INIS)

    Berends, F.A.; Davydychev, A.I.; Moskovskij Gosudarstvennyj Univ., Moscow; Smirnov, V.A.; Moskovskij Gosudarstvennyj Univ., Moscow

    1996-01-01

    The behaviour of two-loop two-point diagrams at non-zero thresholds corresponding to two-particle cuts is analyzed. The masses involved in a cut and the external momentum are assumed to be small as compared to some of the other masses of the diagram. By employing general formulae of asymptotic expansions of Feynman diagrams in momenta and masses, we construct an algorithm to derive analytic approximations to the diagrams. In such a way, we calculate several first coefficients of the expansion. Since no conditions on relative values of the small masses and the external momentum are imposed, the threshold irregularities are described analytically. Numerical examples, using diagrams occurring in the standard model, illustrate the convergence of the expansion below the first large threshold. (orig.)

  5. Study on control method of the actuators accepting commands from different classifications in nuclear power plant

    International Nuclear Information System (INIS)

    Tang Lixue; Zhang Nan; Fan Jin; Li Liang

    2015-01-01

    The distributed control system has become the main control system for nuclear power plants, consisting of 1E and non-1E parts. Because safety actuators accept commands from different safety classifications, controlling these actuators is a difficulty in nuclear power plants. This article discusses a control method for safety actuators that accept commands from different classifications. Firstly, a control method adopted in new nuclear power projects is introduced. Then, based on this, an optimized method is proposed. The new method mainly adds two points to the adopted method: 1. the concept of a 'local control mode' is introduced into the signal priority logic modules, and the priority logic module switches to local mode for the non-1E control system once it receives a safety signal; 2. a 'remote control mode' is added to the module of the safety actuator in the non-1E control system, which makes the non-1E control system relinquish control of the safety actuator when the relevant priority logic module receives the safety signal. After verifying the correctness of the modified scheme, comparisons between the original and modified schemes are provided to summarize the merits of the optimized method. It is concluded that the optimized scheme is better in terms of reliability, safety and economy. (authors)
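    Purely as an illustration of the arbitration idea described above (this is not the plant's actual logic, and the mode names and commands are hypothetical), the sketch below models a priority module that latches into a 'local control mode' whenever a 1E safety command is present, so that non-1E commands are ignored until the module is reset.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PriorityLogicModule:
    """Toy arbitration between 1E (safety) and non-1E (operational) commands."""
    local_mode: bool = False  # hypothetical 'local control mode' flag

    def arbitrate(self, safety_cmd: Optional[str], non1e_cmd: Optional[str]) -> Optional[str]:
        if safety_cmd is not None:
            # A safety demand forces local mode; non-1E commands are ignored.
            self.local_mode = True
            return safety_cmd
        if self.local_mode:
            # Remain in local mode until explicitly reset by the safety system.
            return None
        return non1e_cmd

    def reset(self) -> None:
        self.local_mode = False

plm = PriorityLogicModule()
print(plm.arbitrate(None, "OPEN_VALVE"))           # non-1E command passes through
print(plm.arbitrate("CLOSE_VALVE", "OPEN_VALVE"))  # safety command wins, local mode latched
print(plm.arbitrate(None, "OPEN_VALVE"))           # still local mode: non-1E command blocked
```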

  6. Two-dimensional threshold voltage model and design considerations for gate electrode work function engineered recessed channel nanoscale MOSFET: I

    International Nuclear Information System (INIS)

    Chaujar, Rishu; Kaur, Ravneet; Gupta, Mridula; Gupta, R S; Saxena, Manoj

    2009-01-01

    This paper discusses a threshold voltage model for a novel device structure: the gate electrode work function engineered recessed channel (GEWE-RC) nanoscale MOSFET, which combines the advantages of both the RC and GEWE structures. In part I, the model accurately predicts (a) the surface potential, (b) the threshold voltage and (c) the sub-threshold slope for the single material gate recessed channel (SMG-RC) and GEWE-RC structures. Part II focuses on the development of a compact analytical drain current model taking into account the transition regimes from sub-threshold to saturation. Furthermore, the drain conductance has also been evaluated, reflecting the relevance of the proposed device for analogue design. The analysis takes into account the effect of gate length and groove depth in order to develop a compact model suitable for device design. The analytical results predicted by the model agree well with the simulated results. The results in part I also provide valuable design insights into the performance of the nanoscale GEWE-RC MOSFET with optimum threshold voltage and negative junction depth (NJD), and hence serve as a tool to optimize important device and technological parameters for the 40 nm technology node.

  7. The Analysis of Closed-form Solution for Energy Detector Dynamic Threshold Adaptation in Cognitive Radio

    Directory of Open Access Journals (Sweden)

    R. Bozovic

    2017-12-01

    Spectrum sensing is the most important process in cognitive radio, since it ensures that interference to primary users is avoided. For optimal performance of a cognitive radio, it is essential to monitor and promptly react to dynamic changes in its operating environment. In this paper, energy-detector-based spectrum sensing is considered. Under the assumption that the detected signal can be modelled by an autoregressive model, the noise variance is estimated from that noisy signal, as well as the primary user signal power. A closed-form solution for the optimal decision threshold in a dynamic electromagnetic environment is proposed and analyzed.
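    A stylized version of energy detection with a threshold derived from the (estimated) noise variance is sketched below. The constant-false-alarm-rate threshold used here, lambda = sigma^2 * (1 + Qinv(P_fa) * sqrt(2/N)), is a standard large-sample approximation assumed for illustration; it is not the closed-form dynamic threshold derived in the paper.

```python
import numpy as np
from scipy import stats

def energy_threshold(noise_var: float, n_samples: int, p_fa: float = 0.01) -> float:
    """CFAR threshold for the average-energy statistic under Gaussian noise
    (large-sample approximation, real-valued samples)."""
    return noise_var * (1.0 + stats.norm.isf(p_fa) * np.sqrt(2.0 / n_samples))

def energy_detect(x: np.ndarray, noise_var: float, p_fa: float = 0.01) -> bool:
    """Declare the channel occupied if the average energy exceeds the threshold."""
    test_stat = np.mean(x ** 2)
    return test_stat > energy_threshold(noise_var, x.size, p_fa)

rng = np.random.default_rng(0)
N, sigma2 = 2000, 1.0
noise_only = rng.normal(0, np.sqrt(sigma2), N)
with_signal = noise_only + 0.5 * np.sin(2 * np.pi * 0.05 * np.arange(N))
print(energy_detect(noise_only, sigma2))    # usually False (P_fa set to 1%)
print(energy_detect(with_signal, sigma2))   # usually True: signal at roughly -9 dB SNR
```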

  8. The acceptability of waiting times for elective general surgery and the appropriateness of prioritising patients

    Directory of Open Access Journals (Sweden)

    Knol Dirk L

    2007-02-01

    Abstract Background Problematic waiting lists in public health care threaten the equity and timeliness of care provision in several countries. This study assesses different stakeholders' views on the acceptability of waiting lists in health care, their preferences for priority care of patients, and their judgements on acceptable waiting times for surgical patients. Methods A questionnaire survey was conducted among 257 former patients (82 with varicose veins, 86 with inguinal hernia, and 89 with gallstones), 101 surgeons, 95 occupational physicians, and 65 GPs. Judgements on acceptable waiting times were assessed using vignettes of patients with varicose veins, inguinal hernia, and gallstones. Results Participants endorsed the prioritisation of patients based on clinical need, but not on ability to benefit. The groups had significantly different opinions. Acceptable waiting times ranged between 2 and 25 weeks, depending on the type of disorder. Conclusion The explicit prioritisation of patients seems an accepted means of reducing the overall burden from waiting lists. The disagreement about appropriate prioritisation criteria and the need for uniformity, however, raises concern about equity when implementing prioritisation in daily practice. Single-factor waiting time thresholds seem insufficient for securing timely care provision in the presence of long waiting lists, as they do not account for the different consequences of waiting between patients.

  9. Bridging the Gap between Social Acceptance and Ethical Acceptability.

    Science.gov (United States)

    Taebi, Behnam

    2017-10-01

    New technology brings great benefits, but it can also create new and significant risks. When evaluating those risks in policymaking, there is a tendency to focus on social acceptance. By solely focusing on social acceptance, we could, however, overlook important ethical aspects of technological risk, particularly when we evaluate technologies with transnational and intergenerational risks. I argue that good governance of risky technology requires analyzing both social acceptance and ethical acceptability. Conceptually, these two notions are mostly complementary. Social acceptance studies are not capable of sufficiently capturing all the morally relevant features of risky technologies; ethical analyses do not typically include stakeholders' opinions, and they therefore lack the relevant empirical input for a thorough ethical evaluation. Only when carried out in conjunction are these two types of analysis relevant to national and international governance of risky technology. I discuss the Rawlsian wide reflective equilibrium as a method for marrying social acceptance and ethical acceptability. Although the rationale of my argument is broadly applicable, I will examine the case of multinational nuclear waste repositories in particular. This example will show how ethical issues may be overlooked if we focus only on social acceptance, and will provide a test case for demonstrating how the wide reflective equilibrium can help to bridge the proverbial acceptance-acceptability gap. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  10. Hyper-arousal decreases human visual thresholds.

    Directory of Open Access Journals (Sweden)

    Adam J Woods

    Arousal has long been known to influence behavior and serves as an underlying component of cognition and consciousness. However, the consequences of hyper-arousal for visual perception remain unclear. The present study evaluates the impact of hyper-arousal on two aspects of visual sensitivity: visual stereoacuity and contrast thresholds. Sixty-eight participants took part in two experiments. Thirty-four participants were randomly divided into two groups in each experiment: Arousal Stimulation or Sham Control. The Arousal Stimulation group underwent a 50-second cold pressor stimulation (immersing the foot in 0-2 °C water), a technique known to increase arousal. In contrast, the Sham Control group immersed their foot in room-temperature water. Stereoacuity thresholds (Experiment 1) and contrast thresholds (Experiment 2) were measured before and after stimulation. The Arousal Stimulation groups demonstrated significantly lower stereoacuity and contrast thresholds following cold pressor stimulation, whereas the Sham Control groups showed no difference in thresholds. These results provide the first evidence that hyper-arousal from sensory stimulation can lower visual thresholds. Hyper-arousal's ability to decrease visual thresholds has important implications for survival, sports, and everyday life.

  11. A numerical study of threshold states

    International Nuclear Information System (INIS)

    Ata, M.S.; Grama, C.; Grama, N.; Hategan, C.

    1979-01-01

    There is some experimental evidence of charged-particle threshold states. On the statistical background of levels, some simple structures were observed in the excitation spectrum. They occur near the Coulomb threshold and have a large reduced width for decay into the threshold channel. These states were identified as charged-cluster threshold states. Such threshold states were observed in ^(15,16,17,18)O, ^(18,19)F, ^(19,20)Ne, ^(24)Mg and ^(32)S. The types of clusters involved were d, t, ^(3)He, α and even ^(12)C. They were observed in heavy-ion transfer reactions as strongly excited levels in the residual nucleus. The charged-particle threshold states occur as simple structures at high excitation energy. They could be interesting both from the nuclear structure and from the nuclear reaction mechanism point of view. They could be excited as simple structures both in the compound and in the residual nucleus. (author)

  12. Sub-Volumetric Classification and Visualization of Emphysema Using a Multi-Threshold Method and Neural Network

    Science.gov (United States)

    Tan, Kok Liang; Tanaka, Toshiyuki; Nakamura, Hidetoshi; Shirahata, Toru; Sugiura, Hiroaki

    Chronic Obstructive Pulmonary Disease is a disease in which the airways and tiny air sacs (alveoli) inside the lung are partially obstructed or destroyed. Emphysema is what occurs as more and more of the walls between air sacs are destroyed. The goal of this paper is to produce a more practical emphysema-quantification algorithm that has a higher correlation with the parameters of pulmonary function tests than classical methods. The use of a threshold range from approximately -900 Hounsfield Units to -990 Hounsfield Units for extracting emphysema from CT has been reported in many papers. From our experiments, we realize that a threshold which is optimal for a particular CT data set might not be optimal for other CT data sets due to the subtle radiographic variations in the CT images. Consequently, we propose a multi-threshold method that utilizes ten thresholds between and including -900 Hounsfield Units and -990 Hounsfield Units for identifying the different potential emphysematous regions in the lung. Subsequently, we divide the lung into eight sub-volumes. From each sub-volume, we calculate the ratio of the voxels with intensity below a certain threshold. The respective ratios of the voxels below the ten thresholds are employed as the features for classifying the sub-volumes into four emphysema severity classes. A neural network is used as the classifier. The neural network is trained using 80 training sub-volumes. The performance of the classifier is assessed by classifying 248 test sub-volumes of the lung obtained from 31 subjects. Actual diagnoses of the sub-volumes are hand-annotated and consensus-classified by radiologists. The four-class classification accuracy of the proposed method is 89.82%. The sub-volumetric classification results produced in this study encompass not only the information of emphysema severity but also the distribution of emphysema severity from the top to the bottom of the lung. We hypothesize that besides emphysema severity, the
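    The feature-extraction step described above can be sketched as follows: for each lung sub-volume, compute the fraction of voxels whose value falls below each of ten thresholds between -900 HU and -990 HU, giving a ten-dimensional feature vector per sub-volume. The classifier itself is omitted and the synthetic HU values are placeholders.

```python
import numpy as np

THRESHOLDS_HU = np.linspace(-900, -990, 10)  # ten thresholds from -900 HU to -990 HU

def emphysema_features(subvolume_hu: np.ndarray) -> np.ndarray:
    """Fraction of voxels below each threshold; the ten ratios form the feature
    vector fed to the (neural-network) severity classifier."""
    voxels = subvolume_hu.ravel()
    return np.array([(voxels < t).mean() for t in THRESHOLDS_HU])

# Placeholder sub-volume: parenchyma around -850 HU with some near-air
# (emphysematous, ~-950 HU) voxels mixed in.
rng = np.random.default_rng(0)
subvolume = np.where(rng.random((32, 32, 32)) < 0.2,
                     rng.normal(-950, 15, (32, 32, 32)),
                     rng.normal(-850, 30, (32, 32, 32)))
print(emphysema_features(subvolume).round(3))
```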

  13. Conceptions of nuclear threshold status

    International Nuclear Information System (INIS)

    Quester, G.H.

    1991-01-01

    This paper reviews some alternative definitions of nuclear threshold status. Each of them is important, and major analytical confusion would result if one sense of the term were mistaken for another. The motives for nations entering into such threshold status are a blend of civilian and military gains, and of national interests versus parochial or bureaucratic interests. A portion of the rationale for threshold status emerges inevitably from the pursuit of economic goals, and another portion is made more attractive by the drives of the domestic political process. Yet the impact on international security cannot be dismissed, especially where conflicts among the states remain real. Among the military or national security motives are basic deterrence, psychological warfare, war-fighting and, more generally, national prestige. In the end, as the threshold phenomenon is assayed for lessons concerning the role of nuclear weapons more generally in international relations and security, one might conclude that threshold status and outright proliferation converge to a degree in the motives of the states involved and in the advantages attained. As this paper has illustrated, nuclear threshold status is more subtle and more ambiguous than outright proliferation, and it takes considerable time to sort out the complexities. Yet the world has now had a substantial amount of time to deal with this ambiguous status, and this may tempt more states to exploit it.

  14. Identification of threshold prostate specific antigen levels to optimize the detection of clinically significant prostate cancer by magnetic resonance imaging/ultrasound fusion guided biopsy.

    Science.gov (United States)

    Shakir, Nabeel A; George, Arvin K; Siddiqui, M Minhaj; Rothwax, Jason T; Rais-Bahrami, Soroush; Stamatakis, Lambros; Su, Daniel; Okoro, Chinonyerem; Raskolnikov, Dima; Walton-Diaz, Annerleim; Simon, Richard; Turkbey, Baris; Choyke, Peter L; Merino, Maria J; Wood, Bradford J; Pinto, Peter A

    2014-12-01

    Prostate specific antigen sensitivity increases with lower threshold values but with a corresponding decrease in specificity. Magnetic resonance imaging/ultrasound targeted biopsy detects prostate cancer more efficiently and of higher grade than standard 12-core transrectal ultrasound biopsy but the optimal population for its use is not well defined. We evaluated the performance of magnetic resonance imaging/ultrasound targeted biopsy vs 12-core biopsy across a prostate specific antigen continuum. We reviewed the records of all patients enrolled in a prospective trial who underwent 12-core transrectal ultrasound and magnetic resonance imaging/ultrasound targeted biopsies from August 2007 through February 2014. Patients were stratified by each of 4 prostate specific antigen cutoffs. The greatest Gleason score using either biopsy method was compared in and across groups as well as across the population prostate specific antigen range. Clinically significant prostate cancer was defined as Gleason 7 (4 + 3) or greater. Univariate and multivariate analyses were performed. A total of 1,003 targeted and 12-core transrectal ultrasound biopsies were performed, of which 564 diagnosed prostate cancer for a 56.2% detection rate. Targeted biopsy led to significantly more upgrading to clinically significant disease compared to 12-core biopsy. This trend increased more with increasing prostate specific antigen, specifically in patients with prostate specific antigen 4 to 10 and greater than 10 ng/ml. Prostate specific antigen 5.2 ng/ml or greater captured 90% of upgrading by targeted biopsy, corresponding to 64% of patients who underwent multiparametric magnetic resonance imaging and subsequent fusion biopsy. Conversely a greater proportion of clinically insignificant disease was detected by 12-core vs targeted biopsy overall. These differences persisted when controlling for potential confounders on multivariate analysis. Prostate cancer upgrading with targeted biopsy increases

  15. The Optimal Timing of Adoption of a Green Technology

    International Nuclear Information System (INIS)

    Cunha-e-Sa, M.A.; Reis, A.B.

    2007-01-01

    We study the optimal timing of adoption of a cleaner technology and its effects on the rate of growth of an economy in the context of an AK endogenous growth model. We show that the results depend upon the behavior of the marginal utility of environmental quality with respect to consumption. When it is increasing, we derive the capital level at the optimal timing of adoption. We show that this capital threshold is independent of the initial conditions on the stock of capital, implying that capital-poor countries tend to take longer to adopt. Also, country-specific characteristics, such as the existence of high barriers to adoption, may lead to different capital thresholds for different countries. If the marginal utility of environmental quality decreases with consumption, a country should never delay adoption; the optimal policy is either to adopt immediately or, if adoption costs are too high, to never adopt. The policy implications of these results are discussed in the context of the international debate surrounding the environmental political agenda.

  16. Temperature thresholds and thermal requirements for development of Nasonovia ribisnigri (Hemiptera: Aphididae).

    Science.gov (United States)

    Diaz, Beatriz Maria; Muñiz, Mariano; Barrios, Laura; Fereres, Alberto

    2007-08-01

    Early detection of Nasonovia ribisnigri (Mosley) (Hemiptera: Aphididae) on lettuce is of primary importance for its effective control. Temperature thresholds for the development of this pest were estimated using developmental rates [r(T)] at different constant temperatures (8, 12, 16, 20, 24, 26, and 28 degrees C). The observed developmental rates were fitted as a function of temperature to two linear models (Campbell; Muñiz and Gil) and one nonlinear model (Lactin). The lower temperature threshold estimated by the Campbell model was 3.6 degrees C for the apterous morph, 4.1 degrees C for the alate morph, and 3.1 degrees C for both aphid adult morphs together. Similar values of the lower temperature threshold were obtained with the Muñiz and Gil model for the apterous morph (4.0 degrees C), the alate morph (4.2 degrees C), and both adult morphs together (3.7 degrees C) of N. ribisnigri. The thermal requirements of N. ribisnigri to complete development were estimated by the Campbell and the Muñiz and Gil models as 125 and 129 DD for the apterous morph, and as 143 and 139 DD for both adult morphs together, respectively. For complete development from birth to adulthood, the alate morph needed 15-18 DD more than the apterous morph. The lower temperature threshold determined by the Lactin model was 5.3 degrees C for the alate morph, 2.3 degrees C for the apterous morph, and 1.9 degrees C for both adult morphs together. The optimal and upper temperature thresholds were 25.2 and 33.6 degrees C, respectively, for the alate morph; 27 and 35.9 degrees C, respectively, for the apterous morph; and 26.1 and 35.3 degrees C, respectively, for the two adult morphs together. The Campbell model provided the best fit to the observed developmental rates of N. ribisnigri. This information could be incorporated into forecasting models of this pest.
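    The linear (degree-day) models referred to above reduce to fitting r(T) = a + bT over the linear range and reading off the lower temperature threshold T0 = -a/b and the thermal constant K = 1/b (in degree-days). A minimal sketch with illustrative, made-up rate data (not the measured N. ribisnigri values, although they are chosen to land in the same ballpark):

```python
import numpy as np

# Hypothetical mean developmental rates (1/days) at constant temperatures (degrees C);
# the values are illustrative only.
temps = np.array([8, 12, 16, 20, 24], dtype=float)
rates = np.array([0.031, 0.062, 0.094, 0.126, 0.158])

# Linear degree-day model r(T) = a + b*T fitted by least squares.
b, a = np.polyfit(temps, rates, 1)

lower_threshold = -a / b       # temperature at which development stops
thermal_constant = 1.0 / b     # degree-days needed to complete development
print(f"T0 ~ {lower_threshold:.1f} degrees C, K ~ {thermal_constant:.0f} DD")
```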

  17. Exercise increases pressure pain tolerance but not pressure and heat pain thresholds in healthy young men.

    Science.gov (United States)

    Vaegter, H B; Hoeger Bement, M; Madsen, A B; Fridriksson, J; Dasa, M; Graven-Nielsen, T

    2017-01-01

    Exercise causes an acute decrease in pain sensitivity known as exercise-induced hypoalgesia (EIH), but its specificity to certain pain modalities remains unknown. This study aimed to compare the effect of isometric exercise on heat and pressure pain sensitivity. On three different days, 20 healthy young men performed two submaximal isometric knee extensions (30% maximal voluntary contraction for 3 min) and a control condition (quiet rest). Before and immediately after exercise and rest, the sensitivity to heat pain and pressure pain was assessed in randomized and counterbalanced order. Cuff pressure pain threshold (cPPT) and pain tolerance (cPTT) were assessed on the ipsilateral lower leg by computer-controlled cuff algometry. Heat pain threshold (HPT) was recorded on the ipsilateral foot by a computer-controlled thermal stimulator. Cuff pressure pain tolerance was significantly increased after exercise compared with baseline and rest, and the cuff measures showed higher reliability (intraclass correlation > 0.77) than HPT (intraclass correlation = 0.54). The results indicate that hypoalgesia after submaximal isometric exercise primarily affects the tolerance of pressure pain rather than the pain threshold. These data contribute to the understanding of how isometric exercise influences pain perception, which is necessary to optimize the clinical utility of exercise in the management of chronic pain. The effect of isometric exercise on pain tolerance may be relevant for patients in chronic musculoskeletal pain as a pain-coping strategy. WHAT DOES THIS STUDY ADD?: The results indicate that hypoalgesia after submaximal isometric exercise primarily affects the tolerance of pressure pain rather than the heat and pressure pain thresholds. These data contribute to the understanding of how isometric exercise influences pain perception, which is necessary to optimize the clinical utility of exercise in the management of chronic pain. © 2016 European Pain Federation - EFIC®.

  18. Threshold Concepts in Finance: Student Perspectives

    Science.gov (United States)

    Hoadley, Susan; Kyng, Tim; Tickle, Leonie; Wood, Leigh N.

    2015-01-01

    Finance threshold concepts are the essential conceptual knowledge that underpin well-developed financial capabilities and are central to the mastery of finance. In this paper we investigate threshold concepts in finance from the point of view of students, by establishing the extent to which students are aware of threshold concepts identified by…

  19. Thresholding magnetic resonance images of human brain

    Institute of Scientific and Technical Information of China (English)

    Qing-mao HU; Wieslaw L NOWINSKI

    2005-01-01

    In this paper, methods are proposed and validated to determine the low and high thresholds needed to segment out gray matter and white matter from MR images of the human brain acquired with different pulse sequences. First, a two-dimensional reference image is determined to represent the intensity characteristics of the original three-dimensional data. Then a region of interest of the reference image is determined where brain tissues are present. Unsupervised fuzzy c-means clustering is employed to determine the threshold for obtaining the head mask, the low threshold for T2-weighted and PD-weighted images, and the high threshold for T1-weighted, SPGR and FLAIR images. Supervised range-constrained thresholding is employed to determine the low threshold for T1-weighted, SPGR and FLAIR images. Thresholding based on pairs of boundary pixels is proposed to determine the high threshold for T2- and PD-weighted images. Quantification against public data sets with various noise and inhomogeneity levels shows that the proposed methods can yield segmentation that is robust to noise and intensity inhomogeneity. Qualitatively, the proposed methods work well with real clinical data.
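    As a rough sketch of how unsupervised fuzzy c-means on one-dimensional intensities can yield a threshold (this is not the paper's full pipeline, which also uses a reference slice, a region of interest, range-constrained thresholding and boundary-pixel pairs), the snippet below clusters toy intensities into two classes and takes the midpoint between the two cluster centers as the threshold:

```python
import numpy as np

def fcm_1d(values, n_clusters=2, m=2.0, iters=100, seed=0):
    """Basic fuzzy c-means on 1-D intensities; returns sorted cluster centers."""
    rng = np.random.default_rng(seed)
    u = rng.random((n_clusters, values.size))
    u /= u.sum(axis=0)
    for _ in range(iters):
        um = u ** m
        centers = um @ values / um.sum(axis=1)            # weighted cluster centers
        d = np.abs(values[None, :] - centers[:, None]) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))                    # update fuzzy memberships
        u /= u.sum(axis=0)
    return np.sort(centers)

# Toy bimodal intensities standing in for background vs brain tissue.
rng = np.random.default_rng(1)
intensities = np.concatenate([rng.normal(40, 10, 5000), rng.normal(140, 20, 5000)])
c_low, c_high = fcm_1d(intensities)
threshold = (c_low + c_high) / 2  # simplest choice: midpoint between the two centers
print(round(threshold, 1))
```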

  20. Near threshold fatigue testing

    Science.gov (United States)

    Freeman, D. C.; Strum, M. J.

    1993-01-01

    Measurement of the near-threshold fatigue crack growth rate (FCGR) behavior provides a basis for the design and evaluation of components subjected to high-cycle fatigue. Typically, the near-threshold fatigue regime describes crack growth rates below approximately 10⁻⁵ mm/cycle (4 × 10⁻⁷ inch/cycle). One such evaluation was recently performed for the binary alloy U-6Nb. The procedures developed for this evaluation are described in detail to provide a general test method for near-threshold FCGR testing. In particular, techniques for high-resolution measurements of crack length performed in situ through a direct-current potential drop (DCPD) apparatus, and a method which eliminates crack closure effects through the use of loading cycles with constant maximum stress intensity, are described.

  1. Optimization of Charge/Discharge Coordination to Satisfy Network Requirements Using Heuristic Algorithms in Vehicle-to-Grid Concept

    Directory of Open Access Journals (Sweden)

    DOGAN, A.

    2018-02-01

    Full Text Available Image thresholding is the most crucial step in microscopic image analysis to distinguish the bacilli objects that cause tuberculosis. Therefore, several bi-level thresholding algorithms are widely used to increase the bacilli segmentation accuracy. However, the bi-level microscopic image thresholding problem has not been solved using optimization algorithms. This paper introduces a novel approach to the segmentation problem using heuristic algorithms and presents visual and quantitative comparisons of heuristic and state-of-the-art thresholding algorithms. In this study, well-known heuristic algorithms such as the Firefly Algorithm, Particle Swarm Optimization, Cuckoo Search and Flower Pollination are used to solve the bi-level microscopic image thresholding problem, and the results are compared with state-of-the-art thresholding algorithms such as K-Means, Fuzzy C-Means and Fast Marching. Kapur's entropy is chosen as the entropy measure to be maximized. Experiments are performed to make comparisons in terms of evaluation metrics and execution time. The quantitative results are calculated based on ground-truth segmentation. According to the visual results, the heuristic algorithms have better performance, and the quantitative results are in accord with the visual results. Furthermore, experimental time comparisons show the superiority and effectiveness of the heuristic algorithms over traditional thresholding algorithms.
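
    To make the optimization target concrete, the sketch below evaluates Kapur's entropy for every candidate grey level and returns the bi-level threshold that maximizes it; a metaheuristic such as Particle Swarm Optimization or Cuckoo Search would search this same objective rather than enumerate it. The 8-bit histogram setup is an assumption, not a detail from the record.

```python
import numpy as np

def kapur_threshold(image, n_bins=256):
    """Bi-level threshold that maximizes Kapur's entropy (exhaustive search over grey levels)."""
    hist, _ = np.histogram(image, bins=n_bins, range=(0, n_bins))
    p = hist.astype(float) / hist.sum()
    best_t, best_h = 1, -np.inf
    for t in range(1, n_bins - 1):
        p0, p1 = p[:t], p[t:]
        w0, w1 = p0.sum(), p1.sum()
        if w0 == 0 or w1 == 0:
            continue
        q0, q1 = p0[p0 > 0] / w0, p1[p1 > 0] / w1
        h = -(q0 * np.log(q0)).sum() - (q1 * np.log(q1)).sum()  # H(background) + H(object)
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# Usage on an 8-bit image array `img`: pixels >= t are labelled as object
# t = kapur_threshold(img)
# mask = img >= t
```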

  2. Optimization of Overflow Policies in Call Centers

    DEFF Research Database (Denmark)

    Koole, G.M.; Nielsen, B.F.; Nielsen, T.B.

    2015-01-01

    . A Markov decision chain is used to determine the optimal policy. This policy outperforms considerably the ones used most often in practice, which use a fixed threshold. The present method can be used also for other call-center models and other situations where performance is based on actual waiting times...

  3. An investigation of the effects of technology readiness on technology acceptance in e-HRM

    OpenAIRE

    Erdoğmuş, Nihat; Esen, Murat

    2011-01-01

    The aim of this paper is to investigate the effects of technology readiness on technology acceptance in e-HRM field. The data for this study were collected from 65 Human Resource (HR) managers representing top 500 largest private sector companies in Turkey. The research model based on two theories: Parasuraman's technology readiness and Davis’ technology acceptance model. The results of the study showed that optimism and innovativeness dimensions of technology readiness positively influenced ...

  4. Study of a spherical gaseous detector for research of rare events at low energy threshold

    International Nuclear Information System (INIS)

    Dastgheibi-Fard, Ali

    2014-01-01

    The spherical gaseous detector (or Spherical Proportional Counter, SPC) is a novel type of particle detector with a broad range of applications. Its main features include a very low energy threshold that is independent of the volume (due to its very low capacitance), good energy resolution, robustness and a single detection readout channel. SEDINE, a low-background detector installed at the underground site of the Laboratoire Souterrain de Modane, is currently being operated and aims at measuring events at a very low energy threshold, around 40 eV. The sensitivity for rare-event detection at low energy is correlated with the detector background and with lowering the energy threshold, which was the main point of this thesis. A major effort has been devoted to operating the experimental detector. Several detection parameters were optimized: the homogeneity of the electric field in the sphere, the avoidance of sparks, the electronic noise level and the leak rate of the detector. The detector is optimized for operation with a stable gain at high pressure. The modification of the shield, cleanings of the detector and the addition of an anti-radon tent have significantly reduced the background of SEDINE. This progress has increased the sensitivity of the detector at low energy to a value comparable with the results of other underground experiments searching for low-mass WIMPs. We present results with a measured background in the keV region, which allowed us to show a competitive exclusion figure for light dark matter. (author) [fr]

  5. The optimal structure-conductivity relation in epoxy-phthalocyanine nanocomposites

    NARCIS (Netherlands)

    Huijbregts, L.J.; Brom, H.B.; Brokken-Zijp, J.C.M.; Kemerink, M.; Chen, Z.; Goeje, de M.P.; Yuan, M.; Michels, M.A.J.

    2006-01-01

    Phthalcon-11 (aquocyanophthalocyaninatocobalt (III)) forms semiconducting nanocrystals that can be dispersed in epoxy coatings to obtain a semiconducting material with a low percolation threshold. We investigated the structure-conductivity relation in this composite and the deviation from its optimal

  6. At-Risk-of-Poverty Threshold

    Directory of Open Access Journals (Sweden)

    Táňa Dvornáková

    2012-06-01

    Full Text Available European Statistics on Income and Living Conditions (EU-SILC) is a survey on households’ living conditions. The main aim of the survey is to get long-term comparable data on the social and economic situation of households. Data collected in the survey are used mainly in connection with the evaluation of income poverty and determination of the at-risk-of-poverty rate. This article deals with the calculation of the at-risk-of-poverty threshold based on data from EU-SILC 2009. The main task is to compare two approaches to the computation of the at-risk-of-poverty threshold. The first approach is based on the calculation of the threshold for each country separately, while the second one is based on the calculation of the threshold for all states together. The introduction summarizes common attributes in the calculation of the at-risk-of-poverty threshold, such as disposable household income and equivalised household income. Further, the different approaches to both calculations are introduced and their advantages and disadvantages are stated. Finally, the at-risk-of-poverty rate calculation is described and a comparison of the at-risk-of-poverty rates based on these two different approaches is made.
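
    A minimal sketch of the two computations being compared is given below: the threshold is taken as 60% of the median equivalised disposable income (the conventional EU-SILC cut-off), computed either per country or on the pooled sample, and the corresponding at-risk-of-poverty rates follow. Survey weights, which the official calculation uses, are omitted, and the incomes are synthetic.

```python
import numpy as np

POVERTY_FRACTION = 0.60  # conventional EU-SILC cut-off: 60% of median equivalised income

def poverty_threshold(equivalised_income):
    return POVERTY_FRACTION * np.median(equivalised_income)

def at_risk_rate(equivalised_income, threshold):
    income = np.asarray(equivalised_income, dtype=float)
    return float((income < threshold).mean())

# Synthetic incomes for two hypothetical countries
rng = np.random.default_rng(0)
incomes = {"A": rng.lognormal(9.5, 0.5, 5000), "B": rng.lognormal(9.0, 0.6, 5000)}

# Approach 1: a separate threshold per country
per_country = {c: at_risk_rate(x, poverty_threshold(x)) for c, x in incomes.items()}

# Approach 2: one threshold computed on the pooled sample of all countries
pooled_thr = poverty_threshold(np.concatenate(list(incomes.values())))
pooled = {c: at_risk_rate(x, pooled_thr) for c, x in incomes.items()}
```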

  7. A Robust Bayesian Approach to an Optimal Replacement Policy for Gas Pipelines

    Directory of Open Access Journals (Sweden)

    José Pablo Arias-Nicolás

    2015-06-01

    Full Text Available In the paper, we address Bayesian sensitivity issues when integrating experts’ judgments with available historical data in a case study about strategies for the preventive maintenance of low-pressure cast iron pipelines in an urban gas distribution network. We are interested in replacement priorities, as determined by the failure rates of pipelines deployed under different conditions. We relax the assumptions, made in previous papers, about the prior distributions on the failure rates and study changes in replacement priorities under different choices of generalized moment-constrained classes of priors. We focus on the set of non-dominated actions, and among them, we propose the least sensitive action as the optimal choice to rank different classes of pipelines, providing a sound approach to the sensitivity problem. Moreover, we are also interested in determining which classes have a failure rate exceeding a given acceptable value, considered as the threshold determining no need for replacement. Graphical tools are introduced to help decision-makers determine if pipelines are to be replaced and the corresponding priorities.

  8. Summary of DOE threshold limits efforts

    International Nuclear Information System (INIS)

    Wickham, L.E.; Smith, C.F.; Cohen, J.J.

    1987-01-01

    The Department of Energy (DOE) has been developing the concept of threshold quantities for use in determining which waste materials may be disposed of as nonradioactive waste in DOE sanitary landfills. Waste above a threshold level could be managed as radioactive or mixed waste (if hazardous chemicals are present); waste below this level would be handled as sanitary waste. After extensive review of a draft threshold guidance document in 1985, a second draft threshold background document was produced in March 1986. The second draft included a preliminary cost-benefit analysis and quality assurance considerations. The review of the second draft has been completed. Final changes to be incorporated include an in-depth cost-benefit analysis of two example sites and recommendations of how to further pursue (i.e. employ) the concept of threshold quantities within the DOE. 3 references

  9. Joint Optimization of Preventive Maintenance and Spare Parts Inventory with Appointment Policy

    Directory of Open Access Journals (Sweden)

    Jing Cai

    2017-01-01

    Full Text Available Under the background of the wide application of condition-based maintenance (CBM in maintenance practice, the joint optimization of maintenance and spare parts inventory is becoming a hot research to take full advantage of CBM and reduce the operational cost. In order to avoid both the high inventory level and the shortage of spare parts, an appointment policy of spare parts is first proposed based on the prediction of remaining useful lifetime, and then a corresponding joint optimization model of preventive maintenance and spare parts inventory is established. Due to the complexity of the model, the combination method of genetic algorithm and Monte Carlo is presented to get the optimal maximum inventory level, safety inventory level, potential failure threshold, and appointment threshold to minimize the cost rate. Finally, the proposed model is studied through a case study and compared with both the separate optimization and the joint optimization without appointment policy, and the results show that the proposed model is more effective. In addition, the sensitivity analysis shows that the proposed model is consistent with the actual situation of maintenance practices and inventory management.

  10. Study of the Production of Single Pions in Pion-proton Collisions near Threshold

    CERN Multimedia

    2002-01-01

    This experiment aims at a complete-kinematics measurement of the processes π⁻p → π⁻π⁺n and π⁺p → π⁺π⁺n in the region of incident momenta between 300 MeV/c and 460 MeV/c. It uses the Omicron Spectrometer with detectors placed in the magnetic field close to a 12 atm hydrogen gas target. The apparatus has an acceptance of 4% for the processes to be studied. Their threshold is at 279 MeV/c, and pion production in this region is interesting from the point of view of the determination of transformation properties of the chiral-symmetry-breaking part of the Lagrangian.

  11. The reaction np→ pp π- from threshold up to 570 MeV

    International Nuclear Information System (INIS)

    Daum, M.; Finger, M.; Slunecka, M.; Finger, M. Jr.; Janata, A.; Franz, J.; Heinsius, F.H.; Koenigsmann, K.; Lacker, H.; Schmitt, H.; Schweiger, W.; Sereni, P.

    2002-01-01

    The reaction np→ppπ⁻ has been studied in a kinematically complete measurement with a large-acceptance time-of-flight spectrometer for incident neutron energies between threshold and 570 MeV. The proton-proton invariant mass distributions show a strong enhancement due to the pp(¹S₀) final-state interaction. A large anisotropy was found in the pion angular distributions, in contrast to the reaction pp→ppπ⁰. At small energies, a large forward/backward asymmetry has been observed. From the measured integrated cross section σ(np→ppπ⁻), the isoscalar cross section σ₀₁ has been extracted. Its energy dependence indicates that mainly partial waves with Sp final states contribute. (orig.)

  12. Optimal information transmission in organizations: search and congestion

    Energy Technology Data Exchange (ETDEWEB)

    Arenas, A.; Cabrales, A.; Danon, L.; Diaz-Guilera, A.; Guimera, R.; Vega-Redondo, F.

    2008-01-01

    We propose a stylized model of a problem-solving organization whose internal communication structure is given by a fixed network. Problems arrive randomly anywhere in this network and must find their way to their respective specialized solvers by relying on local information alone. The organization handles multiple problems simultaneously. For this reason, the process may be subject to congestion. We provide a characterization of the threshold of collapse of the network and of the stock of floating problems (or average delay) that prevails below that threshold. We build upon this characterization to address a design problem: the determination of what kind of network architecture optimizes performance for any given problem arrival rate. We conclude that, for low arrival rates, the optimal network is very polarized (i.e. star-like or centralized), whereas it is largely homogeneous (or decentralized) for high arrival rates. These observations are in line with a common transformation experienced by information-intensive organizations as their work flow has risen in recent years.

  13. Parton distributions with threshold resummation

    CERN Document Server

    Bonvini, Marco; Rojo, Juan; Rottoli, Luca; Ubiali, Maria; Ball, Richard D.; Bertone, Valerio; Carrazza, Stefano; Hartland, Nathan P.

    2015-01-01

    We construct a set of parton distribution functions (PDFs) in which fixed-order NLO and NNLO calculations are supplemented with soft-gluon (threshold) resummation up to NLL and NNLL accuracy respectively, suitable for use in conjunction with any QCD calculation in which threshold resummation is included at the level of partonic cross sections. These resummed PDF sets, based on the NNPDF3.0 analysis, are extracted from deep-inelastic scattering, Drell-Yan, and top quark pair production data, for which resummed calculations can be consistently used. We find that, close to threshold, the inclusion of resummed PDFs can partially compensate the enhancement in resummed matrix elements, leading to resummed hadronic cross-sections closer to the fixed-order calculation. On the other hand, far from threshold, resummed PDFs reduce to their fixed-order counterparts. Our results demonstrate the need for a consistent use of resummed PDFs in resummed calculations.

  14. An open trial of Acceptance-based Separated Family Treatment (ASFT) for adolescents with anorexia nervosa.

    Science.gov (United States)

    Timko, C Alix; Zucker, Nancy L; Herbert, James D; Rodriguez, Daniel; Merwin, Rhonda M

    2015-06-01

    Family-based treatments have the most empirical support in the treatment of adolescent anorexia nervosa; yet, a significant percentage of adolescents and their families do not respond to manualized family-based treatment (FBT). The aim of this open trial was to conduct a preliminary evaluation of an innovative family-based approach to the treatment of anorexia: Acceptance-based Separated Family Treatment (ASFT). Treatment was grounded in Acceptance and Commitment Therapy (ACT), delivered in a separated format, and included an ACT-informed skills program. Adolescents (ages 12-18) with anorexia or sub-threshold anorexia and their families received 20 treatment sessions over 24 weeks. Outcome indices included eating disorder symptomatology reported by the parent and adolescent, percentage of expected body weight achieved, and changes in psychological acceptance/avoidance. Half of the adolescents (48.0%) met criteria for full remission at the end of treatment, 29.8% met criteria for partial remission, and 21.3% did not improve. Overall, adolescents had a significant reduction in eating disorder symptoms and reached expected body weight. Treatment resulted in changes in psychological acceptance in the expected direction for both parents and adolescents. This open trial provides preliminary evidence for the feasibility, acceptability, and efficacy of ASFT for adolescents with anorexia. Directions for future research are discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Log canonical thresholds of smooth Fano threefolds

    International Nuclear Information System (INIS)

    Cheltsov, Ivan A; Shramov, Konstantin A

    2008-01-01

    The complex singularity exponent is a local invariant of a holomorphic function determined by the integrability of fractional powers of the function. The log canonical thresholds of effective Q-divisors on normal algebraic varieties are algebraic counterparts of complex singularity exponents. For a Fano variety, these invariants have global analogues. In the former case, it is the so-called α-invariant of Tian; in the latter case, it is the global log canonical threshold of the Fano variety, which is the infimum of log canonical thresholds of all effective Q-divisors numerically equivalent to the anticanonical divisor. An appendix to this paper contains a proof that the global log canonical threshold of a smooth Fano variety coincides with its α-invariant of Tian. The purpose of the paper is to compute the global log canonical thresholds of smooth Fano threefolds (altogether, there are 105 deformation families of such threefolds). The global log canonical thresholds are computed for every smooth threefold in 64 deformation families, and the global log canonical thresholds are computed for a general threefold in 20 deformation families. Some bounds for the global log canonical thresholds are computed for 14 deformation families. Appendix A is due to J.-P. Demailly.

  16. Optimizing Soft Tissue Management and Spacer Design in Segmental Bone Defects

    Science.gov (United States)

    2016-12-01

    of HA using an air/water/hydroxyapatite phantom scanned under the same conditions. Bone threshold was set at 1300 mg HA/cm3 (747 HU). The analyzed...that can be used to manipulate the Masquelet induced membrane to create a graft bed that optimizes bone regeneration. The effect of surgical

  17. At-risk and intervention thresholds of occupational stress using a visual analogue scale

    Science.gov (United States)

    Pereira, Bruno; Moustafa, Farès; Naughton, Geraldine; Lesage, François-Xavier; Lambert, Céline

    2017-01-01

    Background The visual analogue scale (VAS) is widely used in clinical practice by occupational physicians to assess perceived stress in workers. However, a single cut-off (black-or-white decision) inadequately discriminates between workers with and without stress. We explored an innovative statistical approach to distinguish an at-risk population among stressed workers, and to establish a threshold over which an action is urgently required, via the use of two cut-offs. Methods Participants were recruited during annual work medical examinations by a random sample of workers from five occupational health centres. We previously proposed a single cut-off of VAS stress in comparison with the Perceived Stress Scale (PSS14). Similar methodology was used in the current study, along with a gray zone approach. The lower limit of the gray zone supports sensitivity (“at-risk” threshold; interpreted as requiring closer surveillance) and the upper limit supports specificity (i.e. “intervention” threshold–emergency action required). Results We included 500 workers (49.6% males), aged 40±11 years, with a PSS14 score of 3.8±1.4 and a VAS score of 4.0±2.4. Using a receiver operating characteristic curve and the PSS cut-off score of 7.2, the optimal VAS threshold was 6.8 (sensitivity = 0.89, specificity = 0.87). The lower and upper thresholds of the gray zone were 5 and 8.2, respectively. Conclusions We identified two clinically relevant cut-offs on the VAS of stress: a first cut-off of 5.0 for an at-risk population, and a second cut-off of 8.2 over which an action is urgently required. Future investigations into the relationships between this upper threshold and deleterious events are required. PMID:28586383
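
    A small, hedged sketch of how the two reported cut-offs could be applied in screening is shown below; the three zone labels and the inclusive handling of scores exactly at a cut-off are illustrative choices, not part of the study.

```python
def classify_vas_stress(vas_score, at_risk_cutoff=5.0, intervention_cutoff=8.2):
    """Three-zone reading of a 0-10 VAS stress score using the two reported cut-offs."""
    if vas_score >= intervention_cutoff:
        return "intervention"      # emergency action required
    if vas_score >= at_risk_cutoff:
        return "at-risk"           # closer surveillance advised
    return "below threshold"

for score in (3.5, 6.0, 9.1):
    print(score, classify_vas_stress(score))
```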

  18. Threshold Concepts and Information Literacy

    Science.gov (United States)

    Townsend, Lori; Brunetti, Korey; Hofer, Amy R.

    2011-01-01

    What do we teach when we teach information literacy in higher education? This paper describes a pedagogical approach to information literacy that helps instructors focus content around transformative learning thresholds. The threshold concept framework holds promise for librarians because it grounds the instructor in the big ideas and underlying…

  19. Threshold Assessment: Definition of Acceptable Sites as Part of Site Selection for the Japanese HLW Program

    International Nuclear Information System (INIS)

    McKenna, S.A.; Wakasugi, Keiichiro; Webb, E.K.; Makino, Hitoshi; Ishihara, Yoshinao; Ijiri, Yuji; Sawada, Atsushi; Baba, Tomoko; Ishiguro, Katsuhiko; Umeki, Hiroyuki

    2000-01-01

    For the last ten years, the Japanese High-Level Nuclear Waste (HLW) repository program has focused on assessing the feasibility of a basic repository concept, which resulted in the recently published H12 Report. As Japan enters the implementation phase, a new organization must identify, screen and choose potential repository sites. Thus, a rapid mechanism for determining the likelihood of site suitability is critical. The threshold approach, described here, is a simple mechanism for defining the likelihood that a site is suitable given estimates of several critical parameters. We rely on the results of a companion paper, which described a probabilistic performance assessment simulation of the HLW reference case in the H12 report. The most critical two or three input parameters are plotted against each other and treated as spatial variables. Geostatistics is used to interpret the spatial correlation, which in turn is used to simulate multiple realizations of the parameter value maps. By combining an array of realizations, we can look at the probability that a given site, as represented by estimates of this combination of parameters, would be good host for a repository site

  20. Perceptual thresholds for detecting modifications applied to the acoustical properties of a violin.

    Science.gov (United States)

    Fritz, Claudia; Cross, Ian; Moore, Brian C J; Woodhouse, Jim

    2007-12-01

    This study is the first step in the psychoacoustic exploration of perceptual differences between the sounds of different violins. A method was used which enabled the same performance to be replayed on different "virtual violins," so that the relationships between acoustical characteristics of violins and perceived qualities could be explored. Recordings of real performances were made using a bridge-mounted force transducer, giving an accurate representation of the signal from the violin string. These were then played through filters corresponding to the admittance curves of different violins. Initially, limits of listener performance in detecting changes in acoustical characteristics were characterized. These consisted of shifts in frequency or increases in amplitude of single modes or frequency bands that have been proposed previously to be significant in the perception of violin sound quality. Thresholds were significantly lower for musically trained than for nontrained subjects but were not significantly affected by the violin used as a baseline. Thresholds for the musicians typically ranged from 3 to 6 dB for amplitude changes and 1.5%-20% for frequency changes. Interpretation of the results using excitation patterns showed that thresholds for the best subjects were quite well predicted by a multichannel model based on optimal processing.

  1. Optimal Threshold for a Positive Hybrid Capture 2 Test for Detection of Human Papillomavirus: Data from the ARTISTIC Trial▿

    Science.gov (United States)

    Sargent, A.; Bailey, A.; Turner, A.; Almonte, M.; Gilham, C.; Baysson, H.; Peto, J.; Roberts, C.; Thomson, C.; Desai, M.; Mather, J.; Kitchener, H.

    2010-01-01

    We present data on the use of the Hybrid Capture 2 (HC2) test for the detection of high-risk human papillomavirus (HR HPV) with different thresholds for positivity within a primary screening setting and as a method of triage for low-grade cytology. In the ARTISTIC population-based trial, 18,386 women were screened by cytology and for HPV. Cervical intraepithelial neoplasia lesions of grade two and higher (CIN2+ lesions) were identified for 453 women within 30 months of an abnormal baseline sample. When a relative light unit/cutoff (RLU/Co) ratio of ≥1 was used as the threshold for considering an HC2 result positive, 15.6% of results were positive, and the proportion of CIN2+ lesions in this group was 14.7%. The relative sensitivity for CIN2+ lesion detection was 93.4%. When an RLU/Co ratio of ≥2 was used as the threshold, there was a 2.5% reduction in positivity, with an increase in the proportion of CIN2+ lesions detected. The relative sensitivity decreased slightly, to 90.3%. Among women with low-grade cytology, HPV prevalences were 43.7% and 40.3% at RLU/Co ratios of ≥1 and ≥2, respectively. The proportions of CIN2+ lesions detected were 17.3% and 18.0%, with relative sensitivities of 87.7% at an RLU/Co ratio of ≥1 and 84.2% at an RLU/Co ratio of ≥2. At an RLU/Co ratio of ≥1, 68.3% of HC2-positive results were confirmed by the Roche line blot assay, compared to 77.2% of those at an RLU/Co ratio of ≥2. Fewer HC2-positive results were confirmed for 35- to 64-year-olds (50.3% at an RLU/Co ratio of ≥1 and 63.2% at an RLU/Co ratio of >2) than for 20- to 34-year-olds (78.7% at an RLU/Co ratio of ≥1 and 83.7% at an RLU/Co ratio of >2). If the HC2 test is used for routine screening as an initial test or as a method of triage for low-grade cytology, we would suggest increasing the threshold for positivity from the RLU/Co ratio of ≥1, recommended by the manufacturer, to an RLU/Co ratio of ≥2, since this study has shown that a beneficial balance

  2. A Threshold Continuum for Aeolian Sand Transport

    Science.gov (United States)

    Swann, C.; Ewing, R. C.; Sherman, D. J.

    2015-12-01

    The threshold of motion for aeolian sand transport marks the initial entrainment of sand particles by the force of the wind. This is typically defined and modeled as a singular wind speed for a given grain size and is based on field and laboratory experimental data. However, the definition of threshold varies significantly between these empirical models, largely because the definition is based on visual observations of initial grain movement. For example, in his seminal experiments, Bagnold defined threshold of motion when he observed that 100% of the bed was in motion. Others have used 50% and lesser values. Differences in threshold models, in turn, result in large errors in predicting the fluxes associated with sand and dust transport. Here we use a wind tunnel and novel sediment trap to capture the fractions of sand in creep, reptation and saltation at Earth and Mars pressures and show that the threshold of motion for aeolian sand transport is best defined as a continuum in which grains progress through stages defined by the proportion of grains in creep and saltation. We propose the use of scale-dependent thresholds modeled by distinct probability distribution functions that differentiate the threshold based on micro to macro scale applications. For example, a geologic timescale application corresponds to a threshold when 100% of the bed is in motion, whereas a sub-second application corresponds to a threshold when a single particle is set in motion. We provide quantitative measurements (number and mode of particle movement) corresponding to visual observations, percent of bed in motion and degrees of transport intermittency for Earth and Mars. Understanding transport as a continuum provides a basis for re-evaluating sand transport thresholds on Earth, Mars and Titan.

  3. Identification of Optimal Preventive Maintenance Decisions for Composite Components

    NARCIS (Netherlands)

    Laks, P.; Verhagen, W.J.C.; Gherman, B.; Porumbel, I.

    2018-01-01

    This research proposes a decision support tool which identifies cost-optimal maintenance decisions for a given planning period. Simultaneously, the reliability state of the component is kept at or below a given reliability threshold: a failure limit policy applies. The tool is developed to support

  4. Performance breakdown in optimal stimulus decoding.

    Science.gov (United States)

    Lubomir Kostal; Lansky, Petr; Pilarski, Stevan

    2015-06-01

    One of the primary goals of neuroscience is to understand how neurons encode and process information about their environment. The problem is often approached indirectly by examining the degree to which the neuronal response reflects the stimulus feature of interest. In this context, the methods of signal estimation and detection theory provide the theoretical limits on the decoding accuracy with which the stimulus can be identified. The Cramér-Rao lower bound on the decoding precision is widely used, since it can be evaluated easily once the mathematical model of the stimulus-response relationship is determined. However, little is known about the behavior of different decoding schemes with respect to the bound if the neuronal population size is limited. We show that under broad conditions the optimal decoding displays a threshold-like shift in performance in dependence on the population size. The onset of the threshold determines a critical range where a small increment in size, signal-to-noise ratio or observation time yields a dramatic gain in the decoding precision. We demonstrate the existence of such threshold regions in early auditory and olfactory information coding. We discuss the origin of the threshold effect and its impact on the design of effective coding approaches in terms of relevant population size.

  5. Effects of battery charge acceptance and battery aging in complete vehicle energy management

    NARCIS (Netherlands)

    Khalik, Z.; Romijn, T.C.J.; Donkers, M.C.F.; Weiland, S.

    2017-01-01

    In this paper, we propose a solution to the complete vehicle energy management problem with battery charge acceptance limitations and battery aging limitations. The problem is solved using distributed optimization for a case study of a hybrid heavy-duty vehicle, equipped with a refrigerated

  6. Iran: the next nuclear threshold state?

    OpenAIRE

    Maurer, Christopher L.

    2014-01-01

    Approved for public release; distribution is unlimited A nuclear threshold state is one that could quickly operationalize its peaceful nuclear program into one capable of producing a nuclear weapon. This thesis compares two known threshold states, Japan and Brazil, with Iran to determine if the Islamic Republic could also be labeled a threshold state. Furthermore, it highlights the implications such a status could have on U.S. nonproliferation policy. Although Iran's nuclear program is mir...

  7. Responsible technology acceptance

    DEFF Research Database (Denmark)

    Toft, Madeleine Broman; Schuitema, Geertje; Thøgersen, John

    2014-01-01

    As a response to climate change and the desire to gain independence from imported fossil fuels, there is pressure to increase the proportion of electricity from renewable sources, which is one of the reasons why electricity grids are currently being turned into Smart Grids. In this paper, we focus on private consumers’ acceptance of having Smart Grid technology installed in their home. We analyse acceptance in a combined framework of the Technology Acceptance Model and the Norm Activation Model. We propose that individuals are only likely to accept Smart Grid technology if they assess usefulness in terms of a positive impact for society and the environment. Therefore, we expect that Smart Grid technology acceptance can be better explained when the well-known technology acceptance parameters included in the Technology Acceptance Model are supplemented by moral norms as suggested by the Norm Activation Model.

  8. Objective lens simultaneously optimized for pupil ghosting, wavefront delivery and pupil imaging

    Science.gov (United States)

    Olczak, Eugene G (Inventor)

    2011-01-01

    An objective lens includes multiple optical elements disposed between a first end and a second end, each optical element oriented along an optical axis. Each optical surface of the multiple optical elements provides an angle of incidence to a marginal ray that is above a minimum threshold angle. This threshold angle minimizes pupil ghosts that may enter an interferometer. The objective lens also optimizes wavefront delivery and pupil imaging onto an optical surface under test.

  9. Power optimized variation aware dual-threshold SRAM cell design technique

    Directory of Open Access Journals (Sweden)

    Aminul Islam

    2011-02-01

    Full Text Available Bulk complementary metal-oxide semiconductor (CMOS) technology is facing enormous challenges at channel lengths below 45 nm, such as gate tunneling, device mismatch, random dopant fluctuations, and mobility degradation. Although multiple gate transistors and strained silicon devices overcome some of the bulk CMOS problems, it is sensible to look for revolutionary new materials and devices to replace silicon. It is obvious that future technology materials should exhibit higher mobility, better channel electrostatics, scalability, and robustness against process variations. Carbon nanotube-based technology is very promising because it has most of these desired features. There is a need to explore the potential of this emerging technology by designing circuits based on this technology and comparing their performance with that of existing bulk CMOS technology. In this paper, we propose a low-power variation-immune dual-threshold voltage carbon nanotube field effect transistor (CNFET)-based seven-transistor (7T) static random access memory (SRAM) cell. The proposed CNFET-based 7T SRAM cell offers ~1.2× improvement in standby power, ~1.3× improvement in read delay, and ~1.1× improvement in write delay. It offers narrower spread in write access time (1.4× at optimum energy point [OEP] and 1.2× at 1 V). It features 56.3% improvement in static noise margin and 40% improvement in read static noise margin. All the simulation measurements are taken at the proposed OEP decided by the optimum results obtained after extensive simulation in the HSPICE (high-performance simulation program with integrated circuit emphasis) environment. Keywords: carbon nanotube field effect transistor (CNFET), chirality vector, random dopant

  10. Adaptive Spot Detection With Optimal Scale Selection in Fluorescence Microscopy Images.

    Science.gov (United States)

    Basset, Antoine; Boulanger, Jérôme; Salamero, Jean; Bouthemy, Patrick; Kervrann, Charles

    2015-11-01

    Accurately detecting subcellular particles in fluorescence microscopy is of primary interest for further quantitative analysis such as counting, tracking, or classification. Our primary goal is to segment vesicles likely to share nearly the same size in fluorescence microscopy images. Our method termed adaptive thresholding of Laplacian of Gaussian (LoG) images with autoselected scale (ATLAS) automatically selects the optimal scale corresponding to the most frequent spot size in the image. Four criteria are proposed and compared to determine the optimal scale in a scale-space framework. Then, the segmentation stage amounts to thresholding the LoG of the intensity image. In contrast to other methods, the threshold is locally adapted given a probability of false alarm (PFA) specified by the user for the whole set of images to be processed. The local threshold is automatically derived from the PFA value and local image statistics estimated in a window whose size is not a critical parameter. We also propose a new data set for benchmarking, consisting of six collections of one hundred images each, which exploits backgrounds extracted from real microscopy images. We have carried out an extensive comparative evaluation on several data sets with ground-truth, which demonstrates that ATLAS outperforms existing methods. ATLAS does not need any fine parameter tuning and requires very low computation time. Convincing results are also reported on real total internal reflection fluorescence microscopy images.
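
    The sketch below conveys the general idea of LoG filtering followed by a locally adapted threshold derived from a user-specified probability of false alarm, assuming Gaussian background statistics in the local window and a fixed scale; ATLAS additionally auto-selects the scale with its own criteria, so this is an approximation rather than the authors' implementation.

```python
import numpy as np
from scipy import ndimage
from scipy.stats import norm

def log_spot_mask(image, sigma=2.0, pfa=1e-3, window=31):
    """LoG response thresholded locally so a Gaussian background exceeds it with probability pfa."""
    response = -ndimage.gaussian_laplace(image.astype(float), sigma)  # bright blobs -> positive
    local_mean = ndimage.uniform_filter(response, size=window)
    local_sq = ndimage.uniform_filter(response ** 2, size=window)
    local_std = np.sqrt(np.maximum(local_sq - local_mean ** 2, 1e-12))
    z = norm.isf(pfa)                       # one-sided Gaussian quantile for the chosen PFA
    return response > local_mean + z * local_std

# Synthetic example: a small bright spot on a noisy background
rng = np.random.default_rng(0)
img = rng.normal(100.0, 5.0, (256, 256))
img[100:103, 100:103] += 60.0
mask = log_spot_mask(img, sigma=2.0, pfa=1e-4)
```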

  11. Hydrometeorological threshold conditions for debris flow initiation in Norway

    Directory of Open Access Journals (Sweden)

    N. K. Meyer

    2012-10-01

    Full Text Available Debris flows, triggered by extreme precipitation events and rapid snow melt, cause considerable damage to the Norwegian infrastructure every year. To define intensity-duration (ID) thresholds for debris flow initiation, critical water supply conditions arising from intensive rainfall or snow melt were assessed on the basis of daily hydro-meteorological information for 502 documented debris flow events. Two threshold types were computed: one based on absolute ID relationships and one using ID relationships normalized by the local precipitation day normal (PDN). For each threshold type, minimum, medium and maximum threshold values were defined by fitting power law curves along the 10th, 50th and 90th percentiles of the data population. Depending on the duration of the event, the absolute threshold intensities needed for debris flow initiation vary between 15 and 107 mm day⁻¹. Since the PDN changes locally, the normalized thresholds show spatial variations. Depending on location, duration and threshold level, the normalized threshold intensities vary between 6 and 250 mm day⁻¹. The thresholds obtained were used for a frequency analysis of over-threshold events, giving an estimation of the exceedance probability and thus the potential for debris flow events in different parts of Norway. The absolute thresholds are most often exceeded along the west coast, while the normalized thresholds are most frequently exceeded on the west-facing slopes of the Norwegian mountain ranges. The minimum thresholds derived in this study are in the range of other thresholds obtained for regions with a climate comparable to Norway. Statistics reveal that the normalized threshold is more reliable than the absolute threshold as the former shows no spatial clustering of debris flows related to water supply events captured by the threshold.
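
    As a hedged sketch of the threshold form used above, the snippet below fits a power-law curve I = a·D^b through a chosen percentile of intensity per duration bin and then tests whether an observed intensity-duration pair exceeds it; the integer-day binning and the minimum sample count per bin are assumptions, not details from the study.

```python
import numpy as np

def fit_percentile_power_law(duration_days, intensity_mm_day, q=50, min_per_bin=5):
    """Fit I = a * D**b through the q-th percentile of intensity per (integer) duration bin."""
    d = np.round(np.asarray(duration_days, dtype=float))
    i = np.asarray(intensity_mm_day, dtype=float)
    d_pts, i_pts = [], []
    for dur in np.unique(d):
        sel = d == dur
        if dur > 0 and sel.sum() >= min_per_bin:
            d_pts.append(dur)
            i_pts.append(np.percentile(i[sel], q))
    b, log_a = np.polyfit(np.log(d_pts), np.log(i_pts), 1)   # least squares in log-log space
    return np.exp(log_a), b

def exceeds_threshold(intensity_mm_day, duration_days, a, b):
    """True where an event lies on or above the I = a * D**b curve."""
    return intensity_mm_day >= a * np.asarray(duration_days, dtype=float) ** b

# e.g. a_min, b_min = fit_percentile_power_law(D, I, q=10)   # "minimum" threshold
```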

  12. Optimization of Approximate Inhibitory Rules Relative to Number of Misclassifications

    KAUST Repository

    Alsolami, Fawaz

    2013-10-04

    In this work, we consider so-called nonredundant inhibitory rules, containing an expression “attribute:F value” on the right-hand side, for which the number of misclassifications is at most a threshold γ. We study a dynamic programming approach for the description of the considered set of rules. This approach also allows the optimization of nonredundant inhibitory rules relative to the length and coverage. The aim of this paper is to investigate an additional possibility of optimization relative to the number of misclassifications. The results of experiments with decision tables from the UCI Machine Learning Repository show that this additional optimization achieves fewer misclassifications. Thus, the proposed optimization procedure is promising.

  13. 11 CFR 9036.1 - Threshold submission.

    Science.gov (United States)

    2010-01-01

    ... credit or debit card, including one made over the Internet, the candidate shall provide sufficient... section shall not count toward the threshold amount. (c) Threshold certification by Commission. (1) After...

  14. Acceptance of Others, Feeling of Being Accepted and Striving for Being Accepted Among the Representatives of Different Kinds of Occupations

    Directory of Open Access Journals (Sweden)

    Gergana Stanoeva

    2012-05-01

    Full Text Available This paper deals with an important issue related to human attitudes and needs in interpersonal and professional aspects. The theoretical part deals with several psychological components of self-esteem and esteem of others – acceptance of others, feeling of being accepted, need for approval. Some gender differences in manifestations of acceptance and feeling of being accepted at the workplace are discussed. This article presents some empirical data on the degree of acceptance of others, feeling of being accepted and striving for being accepted among the representatives of helping, pedagogical, administrative and economic occupations, as well as non-qualified workers. The goals of the study were to reveal the interdependency between these constructs and to find significant differences between the representatives of the four groups of occupations. The methods of the first study were W. Fey’s scales “Acceptance of others” and “How do I feel accepted by others”. The method of the second study was the Crowne and Marlowe Scale for Social Desirability. The results indicated some significant differences in acceptance of others and feeling of being accepted between the non-qualified workers and the representatives of helping, administrative and economic occupations. There was no significant difference in striving for being accepted between the four occupational groups.

  15. Advantages of binaural amplification to acceptable noise level of directional hearing aid users.

    Science.gov (United States)

    Kim, Ja-Hee; Lee, Jae Hee; Lee, Ho-Ki

    2014-06-01

    The goal of the present study was to examine whether Acceptable Noise Levels (ANLs) would be lower (greater acceptance of noise) in binaural listening than in monaural listening condition and also whether meaningfulness of background speech noise would affect ANLs for directional microphone hearing aid users. In addition, any relationships between the individual binaural benefits on ANLs and the individuals' demographic information were investigated. Fourteen hearing aid users (mean age, 64 years) participated for experimental testing. For the ANL calculation, listeners' most comfortable listening levels and background noise level were measured. Using Korean ANL material, ANLs of all participants were evaluated under monaural and binaural amplification with a counterbalanced order. The ANLs were also compared across five types of competing speech noises, consisting of 1- through 8-talker background speech maskers. Seven young normal-hearing listeners (mean age, 27 years) participated for the same measurements as a pilot testing. The results demonstrated that directional hearing aid users accepted more noise (lower ANLs) with binaural amplification than with monaural amplification, regardless of the type of competing speech. When the background speech noise became more meaningful, hearing-impaired listeners accepted less amount of noise (higher ANLs), revealing that ANL is dependent on the intelligibility of the competing speech. The individuals' binaural advantages in ANLs were significantly greater for the listeners with longer experience of hearing aids, yet not related to their age or hearing thresholds. Binaural directional microphone processing allowed hearing aid users to accept a greater amount of background noise, which may in turn improve listeners' hearing aid success. Informational masking substantially influenced background noise acceptance. Given a significant association between ANLs and duration of hearing aid usage, ANL measurement can be useful for

  16. Thermotactile perception thresholds measurement conditions.

    Science.gov (United States)

    Maeda, Setsuo; Sakakibara, Hisataka

    2002-10-01

    The purpose of this paper is to investigate the effects of posture, push force and rate of temperature change on thermotactile thresholds and to clarify suitable measuring conditions for Japanese people. Thermotactile (warm and cold) thresholds on the right middle finger were measured with an HVLab thermal aesthesiometer. Subjects were eight healthy male Japanese students. The effects of posture in measurement were examined in the posture of a straight hand and forearm placed on a support, the same posture without a support, and the fingers and hand flexed at the wrist with the elbow placed on a desk. The finger push force applied to the applicator of the thermal aesthesiometer was controlled at 0.5, 1.0, 2.0 and 3.0 N. The applicator temperature was changed at rates of 0.5, 1.0, 1.5, 2.0 and 2.5 degrees C/s. After each measurement, subjects were asked about comfort under the measuring conditions. Three series of experiments were conducted on different days to evaluate repeatability. Repeated measures ANOVA showed that warm thresholds were affected by the push force and the rate of temperature change and that cold thresholds were influenced by posture and push force. The comfort assessment indicated that the measurement posture of a straight hand and forearm laid on a support was the most comfortable for the subjects. Relatively high repeatability was obtained under measurement conditions of a 1 degrees C/s temperature change rate and a 0.5 N push force. Measurement posture, push force and rate of temperature change can affect the thermal threshold. Judging from the repeatability, a push force of 0.5 N and a temperature change rate of 1.0 degrees C/s in the posture with the straight hand and forearm laid on a support are recommended for warm and cold threshold measurements.

  17. DOE approach to threshold quantities

    International Nuclear Information System (INIS)

    Wickham, L.E.; Kluk, A.F.; Department of Energy, Washington, DC)

    1985-01-01

    The Department of Energy (DOE) is developing the concept of threshold quantities for use in determining which waste materials must be handled as radioactive waste and which may be disposed of as nonradioactive waste at its sites. Waste above this concentration level would be managed as radioactive or mixed waste (if hazardous chemicals are present); waste below this level would be handled as sanitary waste. Ideally, the threshold must be set high enough to significantly reduce the amount of waste requiring special handling. It must also be low enough so that waste at the threshold quantity poses a very small health risk and multiple exposures to such waste would still constitute a small health risk. It should also be practical to segregate waste above or below the threshold quantity using available instrumentation. Guidance is being prepared to aid DOE sites in establishing threshold quantity values based on pathways analysis using site-specific parameters (waste stream characteristics, maximum exposed individual, population considerations, and site specific parameters such as rainfall, etc.). A guidance dose of between 0.001 to 1.0 mSv/y (0.1 to 100 mrem/y) was recommended with 0.3 mSv/y (30 mrem/y) selected as the guidance dose upon which to base calculations. Several tasks were identified, beginning with the selection of a suitable pathway model for relating dose to the concentration of radioactivity in the waste. Threshold concentrations corresponding to the guidance dose were determined for waste disposal sites at a selected humid and arid site. Finally, cost-benefit considerations at the example sites were addressed. The results of the various tasks are summarized and the relationship of this effort with related developments at other agencies discussed

  18. Muscle Weakness Thresholds for Prediction of Diabetes in Adults.

    Science.gov (United States)

    Peterson, Mark D; Zhang, Peng; Choksi, Palak; Markides, Kyriakos S; Al Snih, Soham

    2016-05-01

    Despite the known links between weakness and early mortality, what remains to be fully understood is the extent to which strength preservation is associated with protection from cardiometabolic diseases, such as diabetes. The purposes of this study were to determine the association between muscle strength and diabetes among adults, and to identify age- and sex-specific thresholds of low strength for detection of risk. A population-representative sample of 4066 individuals, aged 20-85 years, was included from the combined 2011-2012 National Health and Nutrition Examination Survey (NHANES) data sets. Strength was assessed using a handheld dynamometer, and the single highest reading from either hand was normalized to body mass. A logistic regression model was used to assess the association between normalized grip strength and risk of diabetes, as determined by haemoglobin A1c levels ≥6.5 % (≥48 mmol/mol), while controlling for sociodemographic characteristics, anthropometric measures and television viewing time. For every 0.05 decrement in normalized strength, there were 1.26 times increased adjusted odds for diabetes in men and women. Women were at lower odds of having diabetes (odds ratio 0.49; 95 % confidence interval 0.29-0.82). Age, waist circumference and lower income were also associated with diabetes. The optimal sex- and age-specific weakness thresholds to detect diabetes were 0.56, 0.50 and 0.45 for men at ages of 20-39, 40-59 and 60-80 years, respectively, and 0.42, 0.38 and 0.33 for women at ages of 20-39, 40-59 and 60-80 years, respectively. We present thresholds of strength that can be incorporated into a clinical setting for identifying adults who are at risk of developing diabetes and might benefit from lifestyle interventions to reduce risk.
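
    A short sketch of how the reported cut-points could be applied is shown below; the lookup table copies the thresholds quoted in the abstract (grip strength normalized to body mass), while the function name, kilogram units and inclusive age boundaries are illustrative assumptions.

```python
# Age- and sex-specific weakness cut-points (grip strength divided by body mass),
# copied from the thresholds reported in the abstract.
WEAKNESS_THRESHOLDS = {
    "male":   [(20, 39, 0.56), (40, 59, 0.50), (60, 80, 0.45)],
    "female": [(20, 39, 0.42), (40, 59, 0.38), (60, 80, 0.33)],
}

def is_weak(sex, age, grip_kg, body_mass_kg):
    """Flag weakness when normalized grip strength falls below the age/sex cut-point."""
    normalized = grip_kg / body_mass_kg
    for lo, hi, cutoff in WEAKNESS_THRESHOLDS[sex]:
        if lo <= age <= hi:
            return normalized < cutoff
    raise ValueError("age outside the 20-80 year range covered by the thresholds")

print(is_weak("male", 45, grip_kg=38.0, body_mass_kg=90.0))  # 0.42 < 0.50 -> True
```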

  19. Doubler system quench detection threshold

    International Nuclear Information System (INIS)

    Kuepke, K.; Kuchnir, M.; Martin, P.

    1983-01-01

    The experimental study leading to the determination of the sensitivity needed for protecting the Fermilab Doubler from damage during quenches is presented. The quench voltage thresholds involved were obtained from measurements made on Doubler cable of resistance x temperature and voltage x time during quenches under several currents and from data collected during operation of the Doubler Quench Protection System as implemented in the B-12 string of 20 magnets. At 4 kA, a quench voltage threshold in excess of 5.0 V will limit the peak Doubler cable temperature to 452 K for quenches originating in the magnet coils, whereas a threshold of 0.5 V is required for quenches originating outside of coils

  20. Application of the threshold of toxicological concern concept to pharmaceutical manufacturing operations.

    Science.gov (United States)

    Dolan, David G; Naumann, Bruce D; Sargent, Edward V; Maier, Andrew; Dourson, Michael

    2005-10-01

    A scientific rationale is provided for estimating acceptable daily intake values (ADIs) for compounds with limited or no toxicity information to support pharmaceutical manufacturing operations. These ADIs are based on application of the "thresholds of toxicological concern" (TTC) principle, in which levels of human exposure are estimated that pose no appreciable risk to human health. The same concept has been used by the US Food and Drug Administration (FDA) to establish "thresholds of regulation" for indirect food additives and adopted by the Joint FAO/WHO Expert Committee on Food Additives for flavoring substances. In practice, these values are used as a statement of safety and indicate when no actions need to be taken in a given exposure situation. Pharmaceutical manufacturing relies on ADIs for cleaning validation of process equipment and atypical extraneous matter investigations. To provide practical guidance for handling situations where relatively unstudied compounds with limited or no toxicity data are encountered, recommendations are provided on ADI values that correspond to three categories of compounds: (1) compounds that are likely to be carcinogenic, (2) compounds that are likely to be potent or highly toxic, and (3) compounds that are not likely to be potent, highly toxic or carcinogenic. Corresponding ADIs for these categories of materials are 1, 10, and 100 microg/day, respectively.
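
    The decision logic amounts to a small lookup, sketched below with the three default ADI values given in the abstract; the category keys are illustrative names rather than terms from the paper.

```python
# Default ADIs (micrograms/day) for the three TTC-based categories in the abstract.
TTC_ADI_UG_PER_DAY = {
    "likely_carcinogenic": 1,
    "likely_potent_or_highly_toxic": 10,
    "not_potent_highly_toxic_or_carcinogenic": 100,
}

def default_adi(category):
    """Return the default ADI (micrograms/day) for a compound with little or no toxicity data."""
    return TTC_ADI_UG_PER_DAY[category]

print(default_adi("likely_potent_or_highly_toxic"))  # 10
```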

  1. SINR balancing in the downlink of cognitive radio networks with imperfect channel knowledge

    KAUST Repository

    Hanif, Muhammad Fainan; Smith, Peter J.; Alouini, Mohamed-Slim

    2010-01-01

    an acceptable threshold with uncertain channel state information available at the CR base-station (BS). We optimize the beamforming vectors at the CR BS so that the worst user SINR is maximized and transmit power constraints at the CR BS and interference

  2. The Acceptance Strategy for Nuclear Power Plant In Indonesia

    Science.gov (United States)

    Suhaemi, Tjipta; Syaukat, Achmad

    2010-06-01

    THE ACCEPTANCE STRATEGY FOR NUCLEAR POWER PLANT IN INDONESIA. Indonesia has planned to build nuclear power plants. Some feasibility studies have been conducted intensively. However, the processes of NPP introduction are still uncertain. The National Energy Plan in Indonesia, which has been made by several governmental agencies, does not yet give a positive impact on the government decision to construct a nuclear power plant (NPP). This paper discusses the process of NPP introduction in Indonesia, which has been colored by stakeholder debate and a delayed decision to go nuclear. The technology paradigm is used to promote NPP as an alternative among reliable energy resources. This paradigm should be complemented with an international political-economic point of view. The international political-economic point of view shows that structural powers, consisting of security, production, finance, and knowledge structures, within which the NPP is introduced, have dynamic characteristics. The process of NPP introduction in Indonesia contains some infrastructure development (R&D, legislation, regulation, energy planning, site study, public acceptance efforts, etc.), but it needs a more coherent NPP implementation program and NPP Acceptance Program. Strategic patterns for NPP acceptance described in this paper are made by considering nuclear regulation development and the interest in basic domestic participation. The first NPP program in Indonesia, having proven technology and basic domestic participation, is an important milestone toward an optimal national energy-mix.

  3. Reaction thresholds in doubly special relativity

    International Nuclear Information System (INIS)

    Heyman, Daniel; Major, Seth; Hinteleitner, Franz

    2004-01-01

    Two theories of special relativity with an additional invariant scale, 'doubly special relativity', are tested with calculations of particle process kinematics. Using the Judes-Visser modified conservation laws, thresholds are studied in both theories. In contrast with some linear approximations, which allow for particle processes forbidden in special relativity, both the Amelino-Camelia and Magueijo-Smolin frameworks allow no additional processes. To first order, the Amelino-Camelia framework thresholds are lowered and the Magueijo-Smolin framework thresholds may be raised or lowered

  4. Network meta-analysis of diagnostic test accuracy studies identifies and ranks the optimal diagnostic tests and thresholds for health care policy and decision-making.

    Science.gov (United States)

    Owen, Rhiannon K; Cooper, Nicola J; Quinn, Terence J; Lees, Rosalind; Sutton, Alex J

    2018-07-01

    Network meta-analyses (NMA) have extensively been used to compare the effectiveness of multiple interventions for health care policy and decision-making. However, methods for evaluating the performance of multiple diagnostic tests are less established. In a decision-making context, we are often interested in comparing and ranking the performance of multiple diagnostic tests, at varying levels of test thresholds, in one simultaneous analysis. Motivated by an example of cognitive impairment diagnosis following stroke, we synthesized data from 13 studies assessing the efficiency of two diagnostic tests: Mini-Mental State Examination (MMSE) and Montreal Cognitive Assessment (MoCA), at two test thresholds: MMSE accounting for the correlations between multiple test accuracy measures from the same study. We developed and successfully fitted a model comparing multiple tests/threshold combinations while imposing threshold constraints. Using this model, we found that MoCA at threshold decision making. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  5. Thresholds in chemical respiratory sensitisation.

    Science.gov (United States)

    Cochrane, Stella A; Arts, Josje H E; Ehnes, Colin; Hindle, Stuart; Hollnagel, Heli M; Poole, Alan; Suto, Hidenori; Kimber, Ian

    2015-07-03

    There is a continuing interest in determining whether it is possible to identify thresholds for chemical allergy. Here allergic sensitisation of the respiratory tract by chemicals is considered in this context. This is an important occupational health problem, being associated with rhinitis and asthma, and in addition provides toxicologists and risk assessors with a number of challenges. In common with all forms of allergic disease, chemical respiratory allergy develops in two phases. In the first (induction) phase, exposure to a chemical allergen (by an appropriate route of exposure) causes immunological priming and sensitisation of the respiratory tract. The second (elicitation) phase is triggered if a sensitised subject is exposed subsequently to the same chemical allergen via inhalation. A secondary immune response will be provoked in the respiratory tract resulting in inflammation and the signs and symptoms of a respiratory hypersensitivity reaction. In this article, attention is focused on the identification of threshold values during the acquisition of sensitisation. Current mechanistic understanding of allergy is such that it can be assumed that the development of sensitisation (and also the elicitation of an allergic reaction) is a threshold phenomenon; there will be levels of exposure below which sensitisation will not be acquired. That is, all immune responses, including allergic sensitisation, have a threshold requirement for the availability of antigen/allergen, below which a response will fail to develop. The issue addressed here is whether there are methods available or clinical/epidemiological data that permit the identification of such thresholds. This document briefly reviews relevant human studies of occupational asthma, and experimental models that have been developed (or are being developed) for the identification and characterisation of chemical respiratory allergens. The main conclusion drawn is that although there is evidence that the

  6. How many ELNs are optimal for breast cancer patients with more than three PLNs who underwent MRM? A large population-based study

    Directory of Open Access Journals (Sweden)

    Wang X

    2018-02-01

    Full Text Available Xiaohui Wang,1 Changbin Ji,2 Huiying Chi,3 Haiyong Wang4 1Research Service Office, Shandong Liaocheng People’s Hospital, Liaocheng, China; 2Orthopedics Department, Shandong Liaocheng People’s Hospital, Liaocheng, China; 3Shanghai Geriatrics Institute of Traditional Chinese Medicine, Shanghai, China; 4Department of Internal Medicine-Oncology, Shandong Cancer Hospital and Institute, Shandong Cancer Hospital affiliated to Shandong University, Shandong Academy of Medical Sciences, Jinan, China Background: Few studies have focused on the optimal threshold of examined lymph nodes (ELNs) for breast cancer patients with more than three positive lymph nodes after modified radical mastectomy. Materials and methods: The X-tile and the minimum P-value models were applied to determine the optimal threshold. Cox proportional hazard analysis was used to analyze the cancer-specific survival and perform subgroup analysis. Results: The results showed that 12 ELNs was the optimal threshold for these patients, and the patients with >12 ELNs had a better cancer-specific survival benefit compared with the patients with <12 ELNs (P<0.001). Conclusion: The number 12 can be selected as the optimal threshold of ELNs for breast cancer patients with >3 positive lymph nodes after modified radical mastectomy. Keywords: breast cancer, mastectomy, ELNs, positive lymph nodes, X-tile 
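
    The minimum P-value model referenced above can be sketched as a simple cutpoint scan: split the cohort at each candidate ELN count, run a log-rank test on cancer-specific survival, and keep the cutoff with the smallest p-value. The sketch below is illustrative only; it assumes a pandas DataFrame with hypothetical columns 'eln', 'time' and 'event', relies on the lifelines package, and omits the multiple-testing correction that should accompany this approach in practice.

    ```python
    # Minimal sketch of the minimum P-value approach to choosing an ELN cutoff.
    # Assumes a pandas DataFrame `df` with hypothetical columns 'eln' (examined
    # lymph nodes), 'time' (follow-up time) and 'event' (1 = cancer-specific death).
    import pandas as pd
    from lifelines.statistics import logrank_test

    def min_pvalue_cutoff(df: pd.DataFrame, candidates=range(5, 30)):
        """Return (cutoff, p-value) with the smallest log-rank p-value."""
        results = []
        for c in candidates:
            low, high = df[df["eln"] < c], df[df["eln"] >= c]
            if len(low) < 20 or len(high) < 20:          # skip very unbalanced splits
                continue
            p = logrank_test(low["time"], high["time"],
                             event_observed_A=low["event"],
                             event_observed_B=high["event"]).p_value
            results.append((c, p))
        return min(results, key=lambda t: t[1])

    # Note: scanning many cutoffs inflates the type-I error, so the minimum
    # p-value should be corrected before being reported.
    ```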

  7. Feasibility and acceptability of workers' health surveillance for fire fighters.

    Science.gov (United States)

    Plat, Marie-Christine J; Frings-Dresen, Monique Hw; Sluiter, Judith K

    2011-09-01

    The objective of this study was to test the feasibility and acceptability of a new workers' health surveillance (WHS) for fire fighters in a Dutch pilot-implementation project. In three fire departments, between November 2007 and February 2009, feasibility was tested with respect to i) worker intent to change health and behavior; ii) the quality of instructions for testing teams; iii) the planned procedure in the field; and iv) future WHS organisation. Acceptability involved i) satisfaction with WHS and ii) verification of the job-specificity of the content of two physical tests of WHS. Fire fighters were surveyed after completing WHS, three testing teams were interviewed, and the content of the two tests was studied by experts. Feasibility: nearly all of the 275 fire fighters intended to improve their health when recommended by the occupational physician. The testing teams found the instructions to be clear, and they were mostly positive about the organisation of WHS. Acceptability: the fire fighters rated WHS at eight points (out of a maximum of ten). The experts also reached a consensus about the optimal job-specific content of the future functional physical tests. Overall, it is feasible and acceptable to implement WHS in a definitive form in the Dutch fire-fighting sector.

  8. Optimization of Excitation in FDTD Method and Corresponding Source Modeling

    Directory of Open Access Journals (Sweden)

    B. Dimitrijevic

    2015-04-01

    Full Text Available Source and excitation modeling in the FDTD formulation has a significant impact on the method's performance and the required simulation time. Since an abrupt source introduction yields intensive numerical variations in the whole computational domain, a generally accepted solution is to introduce the source slowly, using appropriate shaping functions in time. The main goal of the optimization presented in this paper is to find a balance between two opposing demands: minimal required computation time and acceptable degradation of simulation performance. Reducing the time necessary for source activation and deactivation is an important issue, especially in the design of microwave structures, where the simulation is repeated intensively in the process of device parameter optimization. The optimized source models proposed here are realized and tested within an in-house FDTD simulation environment.
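
    The abstract does not specify the particular shaping functions, so the following is only a generic illustration of the idea: a sinusoidal hard source multiplied by a raised-cosine ramp so that it is switched on and off smoothly over a configurable number of time steps rather than introduced abruptly.

    ```python
    # Illustrative FDTD source shaping (not the paper's exact model): a sinusoidal
    # source multiplied by a raised-cosine ramp so that it turns on and off
    # smoothly over `n_ramp` time steps instead of being introduced abruptly.
    import numpy as np

    def shaped_source(n_steps, dt, freq, n_ramp):
        t = np.arange(n_steps) * dt
        carrier = np.sin(2.0 * np.pi * freq * t)
        ramp = np.ones(n_steps)
        k = np.arange(n_ramp)
        ramp[:n_ramp] = 0.5 * (1.0 - np.cos(np.pi * k / n_ramp))    # smooth turn-on
        ramp[-n_ramp:] = 0.5 * (1.0 + np.cos(np.pi * k / n_ramp))   # smooth turn-off
        return carrier * ramp

    # Example: 10 GHz source, 1 ps time step, 4000 steps, 400-step activation ramp.
    src = shaped_source(4000, 1e-12, 10e9, 400)
    ```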

  9. Bit Error Rate Performance Analysis of a Threshold-Based Generalized Selection Combining Scheme in Nakagami Fading Channels

    Directory of Open Access Journals (Sweden)

    Kousa Maan

    2005-01-01

    Full Text Available The severity of fading on mobile communication channels calls for the combining of multiple diversity sources to achieve acceptable error rate performance. Traditional approaches perform the combining of the different diversity sources using either the conventional selective diversity combining (CSC), equal-gain combining (EGC), or maximal-ratio combining (MRC) schemes. CSC and MRC are the two extremes of compromise between performance quality and complexity. Some researchers have proposed a generalized selection combining scheme (GSC) that combines the best subset of branches out of the available diversity resources. In this paper, we analyze a generalized selection combining scheme based on a threshold criterion rather than a fixed-size subset of the best channels. In this scheme, only those diversity branches whose energy levels are above a specified threshold are combined. Closed-form analytical solutions for the BER performance of this scheme over Nakagami fading channels are derived. We also discuss the merits of this scheme over GSC.
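
    A Monte Carlo sketch of the threshold-based combining rule described above is given below: per-branch instantaneous SNRs are drawn from a gamma distribution (Nakagami-m fading), only branches above the threshold are combined, and coherent BPSK is assumed. The fall-back to the single best branch when no branch qualifies, and all parameter values, are assumptions made here for illustration rather than details taken from the paper.

    ```python
    # Monte Carlo sketch of threshold-based generalized selection combining over
    # i.i.d. Nakagami-m fading with coherent BPSK. The "use the best branch when
    # no branch exceeds the threshold" fallback is an assumption for illustration.
    import numpy as np
    from scipy.special import erfc

    rng = np.random.default_rng(1)

    def tgsc_ber(m=2.0, avg_snr_db=10.0, L=4, thr_db=5.0, n_trials=200_000):
        avg_snr = 10 ** (avg_snr_db / 10.0)
        thr = 10 ** (thr_db / 10.0)
        # Nakagami-m fading => per-branch instantaneous SNR is Gamma(m, avg_snr/m).
        snr = rng.gamma(shape=m, scale=avg_snr / m, size=(n_trials, L))
        above = snr >= thr
        combined = np.where(above, snr, 0.0).sum(axis=1)       # MRC of qualifying branches
        none_above = ~above.any(axis=1)
        combined[none_above] = snr[none_above].max(axis=1)     # fallback: best branch
        ber = 0.5 * erfc(np.sqrt(combined))                    # BPSK: Q(sqrt(2*SNR))
        return ber.mean()

    print(tgsc_ber())
    ```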

  10. Compositional threshold for Nuclear Waste Glass Durability

    International Nuclear Information System (INIS)

    Kruger, Albert A.; Farooqi, Rahmatullah; Hrma, Pavel R.

    2013-01-01

    Within the composition space of glasses, a distinct threshold appears to exist that separates 'good' glasses, i.e., those which are sufficiently durable, from 'bad' glasses of a low durability. The objective of our research is to clarify the origin of this threshold by exploring the relationship between glass composition, glass structure and chemical durability around the threshold region

  11. Identifying Threshold Concepts for Information Literacy: A Delphi Study

    Directory of Open Access Journals (Sweden)

    Lori Townsend

    2016-06-01

    Full Text Available This study used the Delphi method to engage expert practitioners on the topic of threshold concepts for information literacy. A panel of experts considered two questions. First, is the threshold concept approach useful for information literacy instruction? The panel unanimously agreed that the threshold concept approach holds potential for information literacy instruction. Second, what are the threshold concepts for information literacy instruction? The panel proposed and discussed over fifty potential threshold concepts, finally settling on six information literacy threshold concepts.

  12. Multiuser switched diversity scheduling systems with per-user threshold

    KAUST Repository

    Nam, Haewoon

    2010-05-01

    A multiuser switched diversity scheduling scheme with per-user feedback threshold is proposed and analyzed in this paper. The conventional multiuser switched diversity scheduling scheme uses a single feedback threshold for every user, where the threshold is a function of the average signal-to-noise ratios (SNRs) of the users as well as the number of users involved in the scheduling process. The proposed scheme, however, constructs a sequence of feedback thresholds instead of a single feedback threshold such that each user compares its channel quality with the corresponding feedback threshold in the sequence. Numerical and simulation results show that thanks to the flexibility of threshold selection, where a potentially different threshold can be used for each user, the proposed scheme provides a higher system capacity than that for the conventional scheme. © 2006 IEEE.
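
    The scheduling rule can be illustrated with a short simulation: users are probed in sequence, and the first user whose SNR exceeds the corresponding feedback threshold is served; if none qualifies, the last probed user is served. The Rayleigh-fading model, the fall-back rule and the example threshold sequence are assumptions for illustration; the paper derives the threshold sequence analytically.

    ```python
    # Sketch of multiuser switched diversity scheduling with a per-user threshold
    # sequence. Thresholds and the "serve the last probed user" fallback are
    # illustrative assumptions, not the paper's optimized values.
    import numpy as np

    rng = np.random.default_rng(0)

    def switched_capacity(thresholds, avg_snr=10.0, n_users=8, n_slots=100_000):
        # Rayleigh fading: per-user SNR ~ Exponential(mean = avg_snr).
        snr = rng.exponential(avg_snr, size=(n_slots, n_users))
        chosen = snr[:, -1].copy()                 # default: last probed user
        decided = np.zeros(n_slots, dtype=bool)
        for k in range(n_users):
            hit = (~decided) & (snr[:, k] >= thresholds[k])
            chosen[hit] = snr[hit, k]
            decided |= hit
        return np.log2(1.0 + chosen).mean()        # ergodic capacity, bit/s/Hz

    single = switched_capacity([8.0] * 8)                       # one common threshold
    per_user = switched_capacity(np.linspace(12.0, 4.0, 8))     # per-user sequence
    print(single, per_user)
    ```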

  13. Threshold for the destabilisation of the ion-temperature-gradient mode in magnetically confined toroidal plasmas

    Science.gov (United States)

    Zocco, A.; Xanthopoulos, P.; Doerk, H.; Connor, J. W.; Helander, P.

    2018-02-01

    The threshold for the resonant destabilisation of ion-temperature-gradient (ITG) driven instabilities that render the modes ubiquitous in both tokamaks and stellarators is investigated. We discover remarkably similar results for both confinement concepts if care is taken in the analysis of the effect of the global shear. We revisit, analytically and by means of gyrokinetic simulations, accepted tokamak results and discover inadequacies in some aspects of their theoretical interpretation. In particular, for standard tokamak configurations, we find that global shear effects on the critical gradient cannot be attributed to the wave-particle resonance destabilising mechanism of Hahm & Tang (Phys. Plasmas, vol. 1, 1989, pp. 1185-1192), but are consistent with a stabilising contribution predicted by Biglari et al. (Phys. Plasmas, vol. 1, 1989, pp. 109-118). Extensive analytical and numerical investigations show that virtually no previous tokamak theoretical predictions capture the temperature dependence of the mode frequency at marginality, thus leading to incorrect instability thresholds. In an asymptotic limit involving the rotational transform, in which such a threshold should be solely determined by the resonant toroidal branch of the ITG mode, we discover a family of unstable solutions below the previously known threshold of instability. This is true for a tokamak case described by a local equilibrium, and for the stellarator Wendelstein 7-X, where these unstable solutions are present even for configurations with a small trapped-particle population. We conjecture they are of the Floquet type and derive their properties from the Fourier analysis of toroidal drift modes of Connor & Taylor (Phys. Fluids, vol. 30, 1987, pp. 3180-3185) and from Hill's theory of the motion of the lunar perigee (Acta Math., vol. 8, 1886, pp. 1-36). The temperature dependence of the newly determined threshold is given for both confinement concepts. In the first case, the new temperature

  14. Optimization of lens layout for THz signal free-space delivery

    Science.gov (United States)

    Yu, Jimmy; Zhou, Wen

    2018-03-01

    We investigate how to extend the free-space transmission distance of a terahertz (THz) signal by using an optimized lens layout. After delivery over 129.6 cm of free space with this optimized lens layout, we achieve a BER below 1×10^-4 for a 10 Gb/s QPSK signal at 450 GHz. If only two lenses are employed, the BER is higher than the forward error correction (FEC) threshold at an input power of 15 dBm into the photodiode.

  15. Double-adjustment in propensity score matching analysis: choosing a threshold for considering residual imbalance.

    Science.gov (United States)

    Nguyen, Tri-Long; Collins, Gary S; Spence, Jessica; Daurès, Jean-Pierre; Devereaux, P J; Landais, Paul; Le Manach, Yannick

    2017-04-28

    Double-adjustment can be used to remove confounding if imbalance exists after propensity score (PS) matching. However, it is not always possible to include all covariates in adjustment. We aimed to find the optimal imbalance threshold for entering covariates into regression. We conducted a series of Monte Carlo simulations on virtual populations of 5,000 subjects. We performed PS 1:1 nearest-neighbor matching on each sample. We calculated standardized mean differences across groups to detect any remaining imbalance in the matched samples. We examined 25 thresholds (from 0.01 to 0.25, stepwise 0.01) for considering residual imbalance. The treatment effect was estimated using logistic regression that contained only those covariates considered to be unbalanced by these thresholds. We showed that regression adjustment could dramatically remove residual confounding bias when it included all of the covariates with a standardized difference greater than 0.10. The additional benefit was negligible when we also adjusted for covariates with less imbalance. We found that the mean squared error of the estimates was minimized under the same conditions. If covariate balance is not achieved, we recommend reiterating PS modeling until standardized differences below 0.10 are achieved on most covariates. In case of remaining imbalance, a double adjustment might be worth considering.
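
    The double-adjustment rule studied in the abstract can be sketched as follows: after 1:1 propensity score matching, compute the absolute standardized mean difference (SMD) of each covariate across the matched groups and include only covariates with SMD > 0.10 in the outcome regression. Column names ('treat', 'outcome') are placeholders, and the logistic model is a minimal stand-in for the authors' analysis.

    ```python
    # Sketch of double adjustment after 1:1 PS matching: regress the outcome on
    # treatment plus only those covariates whose absolute standardized mean
    # difference exceeds 0.10 in the matched sample. Column names are placeholders.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    def standardized_mean_difference(x_treated, x_control):
        pooled_sd = np.sqrt((x_treated.var(ddof=1) + x_control.var(ddof=1)) / 2.0)
        return np.abs(x_treated.mean() - x_control.mean()) / pooled_sd

    def double_adjust(matched: pd.DataFrame, covariates, threshold=0.10):
        t, c = matched[matched.treat == 1], matched[matched.treat == 0]
        unbalanced = [v for v in covariates
                      if standardized_mean_difference(t[v], c[v]) > threshold]
        X = sm.add_constant(matched[["treat"] + unbalanced])
        fit = sm.Logit(matched["outcome"], X).fit(disp=0)
        return fit.params["treat"], unbalanced      # adjusted log-odds ratio, covariates used
    ```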

  16. Threshold-Based Random Charging Scheme for Decentralized PEV Charging Operation in a Smart Grid.

    Science.gov (United States)

    Kwon, Ojin; Kim, Pilkee; Yoon, Yong-Jin

    2016-12-26

    Smart grids have been introduced to replace conventional power distribution systems, which lack real-time monitoring, in order to accommodate the future market penetration of plug-in electric vehicles (PEVs). When a large number of PEVs require simultaneous battery charging, charging coordination becomes one of the most critical factors in optimizing PEV charging performance and the conventional distribution system. In this case, considerable computational complexity at the central controller and real-time information exchange among PEVs may be required. To alleviate these problems, a novel threshold-based random charging (TBRC) operation for a decentralized charging system is proposed. Using PEV charging thresholds and random access rates, the PEVs themselves decide whether to transmit charging requests. As PEVs with a high battery state do not transmit charging requests to the central controller, the complexity of the central controller decreases owing to the reduced number of charging requests. In addition, both the charging threshold and the random access rate are calculated statistically from the average supply power of the PEV charging system and therefore do not require real-time updates. By using the proposed TBRC with a tolerable PEV charging degradation, a 51% reduction of the PEV charging requests is achieved.
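
    The per-vehicle decision rule described above reduces to a two-step test, sketched below with placeholder parameter values: a PEV stays silent if its state of charge is at or above the charging threshold, and otherwise transmits a request only with the pre-computed random access probability.

    ```python
    # Sketch of the threshold-based random charging (TBRC) decision: a PEV sends
    # a charging request only if its state of charge (SoC) is below the charging
    # threshold, and then only with the random access probability. Both parameter
    # values below are placeholders, not the paper's optimized values.
    import random

    def request_charging(soc, charging_threshold=0.6, access_rate=0.5):
        """Return True if this PEV should send a charging request in this slot."""
        if soc >= charging_threshold:       # high battery state: stay silent
            return False
        return random.random() < access_rate

    # Example: a fleet of PEVs with random states of charge.
    fleet = [random.random() for _ in range(1000)]
    requests = sum(request_charging(soc) for soc in fleet)
    print(f"{requests} of {len(fleet)} PEVs request charging")
    ```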

  17. OPTIMAL NETWORK TOPOLOGY DESIGN

    Science.gov (United States)

    Yuen, J. H.

    1994-01-01

    This program was developed as part of a research study on the topology design and performance analysis for the Space Station Information System (SSIS) network. It uses an efficient algorithm to generate candidate network designs (consisting of subsets of the set of all network components) in increasing order of their total costs, and checks each design to see if it forms an acceptable network. This technique gives the true cost-optimal network, and is particularly useful when the network has many constraints and not too many components. It is intended that this new design technique consider all important performance measures explicitly and take into account the constraints due to various technical feasibilities. In the current program, technical constraints are taken care of by the user properly forming the starting set of candidate components (e.g. nonfeasible links are not included). As subsets are generated, they are tested to see if they form an acceptable network by checking that all requirements are satisfied. Thus the first acceptable subset encountered gives the cost-optimal topology satisfying all given constraints. The user must sort the set of "feasible" link elements in increasing order of their costs. The program prompts the user for the following information for each link: 1) cost, 2) connectivity (number of stations connected by the link), and 3) the stations connected by that link. Unless instructed to stop, the program generates all possible acceptable networks in increasing order of their total costs. The program is written only to generate topologies that are simply connected. Tests on reliability, delay, and other performance measures are discussed in the documentation, but have not been incorporated into the program. This program is written in PASCAL for interactive execution and has been implemented on an IBM PC series computer operating under PC DOS. The disk contains source code only. This program was developed in 1985.
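
    The search strategy described above can be sketched in a few lines: candidate link subsets are generated in nondecreasing total cost (here via a best-first min-heap, which is valid because link costs are nonnegative), and the first subset passing the acceptability test is the cost-optimal topology. In this sketch, connectivity of all stations stands in for the program's full set of acceptability checks, the link data are placeholders, and Python is used although the original program is written in PASCAL.

    ```python
    # Sketch of a cost-ordered topology search: subsets of links are examined in
    # nondecreasing total cost, and the first subset that connects every station
    # is returned as the cost-optimal topology.
    import heapq

    def optimal_topology(links, n_stations):
        """links: list of (cost, station_a, station_b), sorted by increasing cost."""

        def connected(subset):
            parent = list(range(n_stations))
            def find(x):
                while parent[x] != x:
                    parent[x] = parent[parent[x]]
                    x = parent[x]
                return x
            for idx in subset:
                _, a, b = links[idx]
                parent[find(a)] = find(b)
            return len({find(s) for s in range(n_stations)}) == 1

        # Heap entries: (total_cost, highest_link_index_used, subset_of_link_indices)
        heap = [(0.0, -1, ())]
        while heap:
            cost, last, subset = heapq.heappop(heap)
            if subset and connected(subset):
                return cost, [links[i] for i in subset]   # first acceptable = optimal
            for i in range(last + 1, len(links)):
                heapq.heappush(heap, (cost + links[i][0], i, subset + (i,)))
        return None

    links = [(1.0, 0, 1), (2.0, 1, 2), (2.5, 0, 2), (4.0, 2, 3), (5.0, 1, 3)]
    print(optimal_topology(links, 4))   # cheapest link set connecting all 4 stations
    ```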

  18. Energy-Aware Routing Optimization in Dynamic GMPLS Controlled Optical Networks

    DEFF Research Database (Denmark)

    Wang, Jiayuan; Ricciardi, Sergio; Fagertun, Anna Manolova

    2012-01-01

    In this paper, routing optimizations based on energy sources are proposed in dynamic GMPLS controlled optical networks. The influences of re-routing and load balancing factors on the algorithm are evaluated, with a focus on different re-routing thresholds. Results from dynamic network simulations...

  19. Admission Control Threshold in Cellular Relay Networks with Power Adjustment

    Directory of Open Access Journals (Sweden)

    Lee Ki-Dong

    2009-01-01

    Full Text Available Abstract In the cellular network with relays, the mobile station can benefit from both coverage extension and capacity enhancement. However, the operation complexity increases as the number of relays grows. Furthermore, in the cellular network with cooperative relays, it is even more complex because of an increased dimension of signal-to-noise ratios (SNRs) formed in the cooperative wireless transmission links. In this paper, we propose a new method for admission capacity planning in a cellular network using a cooperative relaying mechanism called decode-and-forward. We mathematically formulate the dropping ratio using the randomness of "channel gain." With this, we formulate an admission threshold planning problem as a simple optimization problem, where we maximize the accommodation capacity (in number of connections) subject to two types of constraints: (1) a constraint that the sum of the transmit powers of the source node and relay node is upper-bounded, where both nodes can jointly adjust the transmit power, and (2) a constraint that the dropping ratio is upper-bounded by a certain threshold value. The simplicity of the problem formulation facilitates its solution in real-time. We believe that the proposed planning method can provide an attractive guideline for dimensioning a cellular relay network with cooperative relays.

  20. An Improved Real-Coded Population-Based Extremal Optimization Method for Continuous Unconstrained Optimization Problems

    Directory of Open Access Journals (Sweden)

    Guo-Qiang Zeng

    2014-01-01

    Full Text Available As a novel evolutionary optimization method, extremal optimization (EO) has been successfully applied to a variety of combinatorial optimization problems. However, the applications of EO in continuous optimization problems are relatively rare. This paper proposes an improved real-coded population-based EO method (IRPEO) for continuous unconstrained optimization problems. The key operations of IRPEO include generation of a real-coded random initial population, evaluation of individual and population fitness, selection of bad elements according to a power-law probability distribution, generation of a new population based on uniform random mutation, and updating the population by accepting the new population unconditionally. The experimental results on 10 benchmark test functions with dimension N=30 have shown that IRPEO is competitive with or even better than various recently reported genetic algorithm (GA) versions with different mutation operations in terms of simplicity, effectiveness, and efficiency. Furthermore, the superiority of IRPEO to other evolutionary algorithms such as the original population-based EO, particle swarm optimization (PSO), and the hybrid PSO-EO is also demonstrated by the experimental results on some benchmark functions.
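
    A rough sketch of the loop outlined in the abstract follows: individuals are ranked by fitness, poorly ranked individuals are selected for replacement with a power-law probability over rank, the selected individuals are regenerated by uniform random mutation, and the new population is accepted unconditionally while the best-so-far solution is tracked. The mutation range and power-law exponent below are placeholders; the exact IRPEO operators may differ from this illustration.

    ```python
    # Rough sketch of a population-based EO loop with power-law selection of bad
    # elements, uniform random mutation and unconditional acceptance.
    import numpy as np

    rng = np.random.default_rng(0)

    def sphere(x):                                   # benchmark objective (minimize)
        return float(np.sum(x * x))

    def irpeo_like(f, dim=30, pop_size=40, bounds=(-5.0, 5.0), tau=1.4, iters=2000):
        lo, hi = bounds
        pop = rng.uniform(lo, hi, size=(pop_size, dim))
        best, best_f = None, np.inf
        for _ in range(iters):
            fitness = np.array([f(x) for x in pop])
            order = np.argsort(fitness)              # indices, best first
            if fitness[order[0]] < best_f:
                best, best_f = pop[order[0]].copy(), fitness[order[0]]
            # Power-law probability over ranks: worse-ranked individuals are more
            # likely to be selected for replacement.
            p = np.arange(1, pop_size + 1, dtype=float) ** (-tau)
            p /= p.sum()
            rank = np.argsort(order)                 # rank[i] = 0 for the best individual
            replace = rng.random(pop_size) < p[::-1][rank]
            n_rep = int(replace.sum())
            if n_rep:
                # Uniform random mutation; the new population is accepted
                # unconditionally (no greedy comparison with the parents).
                pop[replace] = np.clip(
                    pop[replace] + rng.uniform(-0.5, 0.5, size=(n_rep, dim)), lo, hi)
        return best, best_f

    print(irpeo_like(sphere)[1])
    ```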

  1. An improved experimental scheme for simultaneous measurement of high-resolution zero electron kinetic energy (ZEKE) photoelectron and threshold photoion (MATI) spectra

    Science.gov (United States)

    Michels, François; Mazzoni, Federico; Becucci, Maurizio; Müller-Dethlefs, Klaus

    2017-10-01

    An improved detection scheme is presented for threshold ionization spectroscopy with simultaneous recording of the Zero Electron Kinetic Energy (ZEKE) and Mass Analysed Threshold Ionisation (MATI) signals. The objective is to obtain accurate dissociation energies for larger molecular clusters by simultaneously detecting the fragment and parent ion MATI signals with identical transmission. The scheme preserves an optimal ZEKE spectral resolution together with excellent separation of the spontaneous ion and MATI signals in the time-of-flight mass spectrum. The resulting improvement in sensitivity will allow for the determination of dissociation energies in clusters with substantial mass difference between parent and daughter ions.

  2. Pain thresholds, supra-threshold pain and lidocaine sensitivity in patients with erythromelalgia, including the I848T mutation in NaV1.7.

    Science.gov (United States)

    Helås, T; Sagafos, D; Kleggetveit, I P; Quiding, H; Jönsson, B; Segerdahl, M; Zhang, Z; Salter, H; Schmelz, M; Jørum, E

    2017-09-01

    Nociceptive thresholds and supra-threshold pain ratings, as well as their reduction upon local injection with lidocaine, were compared between healthy subjects and patients with erythromelalgia (EM). Lidocaine (0.25, 0.50, 1.0 or 10 mg/mL) or placebo (saline) was injected intradermally in non-painful areas of the lower arm, in a randomized, double-blind manner, to test the effect on dynamic and static mechanical sensitivity, mechanical pain sensitivity, thermal thresholds and supra-threshold heat pain sensitivity. Heat pain thresholds and pain ratings to supra-threshold heat stimulation did not differ between EM patients (n = 27) and controls (n = 25), and neither did the dose-response curves for lidocaine. Only the subgroup of EM patients with mutations in the sodium channel subunits NaV1.7, 1.8 or 1.9 (n = 8) had increased lidocaine sensitivity for supra-threshold heat stimuli, contrasting with lower sensitivity to strong mechanical stimuli. This pattern was particularly clear in the two patients carrying the NaV1.7 I848T mutation, in whom lidocaine's hyperalgesic effect on mechanical pain sensitivity contrasted with more effective heat analgesia. Heat pain thresholds are not sensitized in EM patients, even in those with gain-of-function mutations in NaV1.7. Differential lidocaine sensitivity was overt only for noxious stimuli in the supra-threshold range, suggesting that sensitized supra-threshold encoding is important for the clinical pain phenotype in EM, in addition to a lower activation threshold. Intracutaneous lidocaine dose-dependently blocked nociceptive sensations, but we did not identify EM patients with particularly high lidocaine sensitivity that could have provided valuable therapeutic guidance. Acute pain thresholds and supra-threshold heat pain in controls and patients with erythromelalgia do not differ and have the same lidocaine sensitivity. Acute heat pain thresholds even in EM patients with the NaV1.7 I848T mutation are normal and only nociceptor

  3. Low-threshold stimulated emission at 249 nm and 256 nm from AlGaN-based multiple-quantum-well lasers grown on sapphire substrates

    Energy Technology Data Exchange (ETDEWEB)

    Li, Xiao-Hang; Detchprohm, Theeradetch; Kao, Tsung-Ting; Satter, Md. Mahbub; Shen, Shyh-Chiang; Douglas Yoder, P.; Dupuis, Russell D., E-mail: dupuis@gatech.edu [Center for Compound Semiconductors and School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332-0250 (United States); Wang, Shuo; Wei, Yong O.; Xie, Hongen; Fischer, Alec M.; Ponce, Fernando A. [Department of Physics, Arizona State University, Tempe, Arizona 85287-1504 (United States); Wernicke, Tim; Reich, Christoph; Martens, Martin; Kneissl, Michael [Technical University of Berlin, Institute for Solid State Physics, Berlin D-10623 (Germany)

    2014-10-06

    Optically pumped deep-ultraviolet (DUV) lasing with low threshold was demonstrated from AlGaN-based multiple-quantum-well (MQW) heterostructures grown on sapphire substrates. The epitaxial layers were grown pseudomorphically by metalorganic chemical vapor deposition on (0001) sapphire substrates. Stimulated emission was observed at wavelengths of 256 nm and 249 nm with thresholds of 61 kW/cm² and 95 kW/cm² at room temperature, respectively. The thresholds are comparable to the reported state-of-the-art AlGaN-based MQW DUV lasers grown on bulk AlN substrates emitting at 266 nm. These low thresholds are attributed to the optimization of active region and waveguide layer as well as the use of high-quality AlN/sapphire templates. The stimulated emission above threshold was dominated by transverse-electric polarization. This work demonstrates the potential candidacy of sapphire substrates for DUV diode lasers.

  4. Low-threshold stimulated emission at 249 nm and 256 nm from AlGaN-based multiple-quantum-well lasers grown on sapphire substrates

    International Nuclear Information System (INIS)

    Li, Xiao-Hang; Detchprohm, Theeradetch; Kao, Tsung-Ting; Satter, Md. Mahbub; Shen, Shyh-Chiang; Douglas Yoder, P.; Dupuis, Russell D.; Wang, Shuo; Wei, Yong O.; Xie, Hongen; Fischer, Alec M.; Ponce, Fernando A.; Wernicke, Tim; Reich, Christoph; Martens, Martin; Kneissl, Michael

    2014-01-01

    Optically pumped deep-ultraviolet (DUV) lasing with low threshold was demonstrated from AlGaN-based multiple-quantum-well (MQW) heterostructures grown on sapphire substrates. The epitaxial layers were grown pseudomorphically by metalorganic chemical vapor deposition on (0001) sapphire substrates. Stimulated emission was observed at wavelengths of 256 nm and 249 nm with thresholds of 61 kW/cm² and 95 kW/cm² at room temperature, respectively. The thresholds are comparable to the reported state-of-the-art AlGaN-based MQW DUV lasers grown on bulk AlN substrates emitting at 266 nm. These low thresholds are attributed to the optimization of active region and waveguide layer as well as the use of high-quality AlN/sapphire templates. The stimulated emission above threshold was dominated by transverse-electric polarization. This work demonstrates the potential candidacy of sapphire substrates for DUV diode lasers.

  5. Clinical Practice Guidelines From the AABB: Red Blood Cell Transfusion Thresholds and Storage.

    Science.gov (United States)

    Carson, Jeffrey L; Guyatt, Gordon; Heddle, Nancy M; Grossman, Brenda J; Cohn, Claudia S; Fung, Mark K; Gernsheimer, Terry; Holcomb, John B; Kaplan, Lewis J; Katz, Louis M; Peterson, Nikki; Ramsey, Glenn; Rao, Sunil V; Roback, John D; Shander, Aryeh; Tobian, Aaron A R

    2016-11-15

    More than 100 million units of blood are collected worldwide each year, yet the indication for red blood cell (RBC) transfusion and the optimal length of RBC storage prior to transfusion are uncertain. To provide recommendations for the target hemoglobin level for RBC transfusion among hospitalized adult patients who are hemodynamically stable and the length of time RBCs should be stored prior to transfusion. Reference librarians conducted a literature search for randomized clinical trials (RCTs) evaluating hemoglobin thresholds for RBC transfusion (1950-May 2016) and RBC storage duration (1948-May 2016) without language restrictions. The results were summarized using the Grading of Recommendations Assessment, Development and Evaluation method. For RBC transfusion thresholds, 31 RCTs included 12 587 participants and compared restrictive thresholds (transfusion not indicated until the hemoglobin level is 7-8 g/dL) with liberal thresholds (transfusion not indicated until the hemoglobin level is 9-10 g/dL). The summary estimates across trials demonstrated that restrictive RBC transfusion thresholds were not associated with higher rates of adverse clinical outcomes, including 30-day mortality, myocardial infarction, cerebrovascular accident, rebleeding, pneumonia, or thromboembolism. For RBC storage duration, 13 RCTs included 5515 participants randomly allocated to receive fresher blood or standard-issue blood. These RCTs demonstrated that fresher blood did not improve clinical outcomes. It is good practice to consider the hemoglobin level, the overall clinical context, patient preferences, and alternative therapies when making transfusion decisions regarding an individual patient. Recommendation 1: a restrictive RBC transfusion threshold in which the transfusion is not indicated until the hemoglobin level is 7 g/dL is recommended for hospitalized adult patients who are hemodynamically stable, including critically ill patients, rather than when the hemoglobin level

  6. When do price thresholds matter in retail categories?

    OpenAIRE

    Pauwels, Koen; Srinivasan, Shuba; Franses, Philip Hans

    2007-01-01

    Marketing literature has long recognized that brand price elasticity need not be monotonic and symmetric, but has yet to provide generalizable market-level insights on threshold-based price elasticity, asymmetric thresholds, and the sign and magnitude of elasticity transitions. This paper introduces smooth transition regression models to study threshold-based price elasticity of the top 4 brands across 20 fast-moving consumer good categories. Threshold-based price elasticity is fo...

  7. High-order above-threshold dissociation of molecules

    Science.gov (United States)

    Lu, Peifen; Wang, Junping; Li, Hui; Lin, Kang; Gong, Xiaochun; Song, Qiying; Ji, Qinying; Zhang, Wenbin; Ma, Junyang; Li, Hanxiao; Zeng, Heping; He, Feng; Wu, Jian

    2018-03-01

    Electrons bound to atoms or molecules can simultaneously absorb multiple photons via above-threshold ionization, featuring discrete peaks in the photoelectron spectrum on account of the quantized nature of the light energy. Analogously, the above-threshold dissociation of molecules has been proposed to address the multiple-photon energy deposition in the nuclei of molecules. In this case, nuclear energy spectra consisting of photon-energy spaced peaks exceeding the binding energy of the molecular bond are predicted. Although the observation of such phenomena is difficult, this scenario is nevertheless logical and is based on fundamental laws. Here, we report conclusive experimental observation of high-order above-threshold dissociation of H2 in strong laser fields, where the tunneling-ionized electron transfers the absorbed multiphoton energy, which is above the ionization threshold, to the nuclei via field-driven inelastic rescattering. Our results provide unambiguous evidence that the electron and nuclei of a molecule absorb multiple photons as a whole, and thus above-threshold ionization and above-threshold dissociation must appear simultaneously, which is a cornerstone of present-day strong-field molecular physics.

  8. Thresholds in Xeric Hydrology and Biogeochemistry

    Science.gov (United States)

    Meixner, T.; Brooks, P. D.; Simpson, S. C.; Soto, C. D.; Yuan, F.; Turner, D.; Richter, H.

    2011-12-01

    Due to water limitation, thresholds in hydrologic and biogeochemical processes are common in arid and semi-arid systems. Some of these thresholds, such as those governing rainfall-runoff relationships, have been well studied. However, to gain a full picture of the role that thresholds play in driving the hydrology and biogeochemistry of xeric systems, a view of the entire array of processes at work is needed. Here, a walk through the landscape of xeric systems is conducted, illustrating the powerful role of hydrologic thresholds in xeric system biogeochemistry. To understand xeric hydro-biogeochemistry, two key ideas need to be kept in focus. First, it is important to start from a framework of reaction and transport. Second, an understanding of the temporal and spatial components of the thresholds that have a large impact on hydrologic and biogeochemical fluxes is needed. In the uplands themselves, episodic rewetting and drying of soils permits accelerated biogeochemical processing but also more gradual drainage of water through the subsurface than expected in simple conceptions of biogeochemical processes. Hydrologic thresholds (water content above hygroscopic) result in a stop-start nutrient spiral of material across the landscape, since runoff connecting uplands to xeric perennial riparian zones is episodic and often only transports materials a short distance (100's of m). This episodic movement results in important and counter-intuitive nutrient inputs to riparian zones but also significant processing and uptake of nutrients. The floods that transport these biogeochemicals also result in significant input to riparian groundwater and may be key to sustaining these critical ecosystems. Importantly, the flood-driven recharge process is itself a threshold process dependent on flood characteristics (floods greater than 100 cubic meters per second) and antecedent conditions (losing to near-neutral gradients). Floods also appear to influence where arid and semi

  9. Determinants of Change in the Cost-effectiveness Threshold.

    Science.gov (United States)

    Paulden, Mike; O'Mahony, James; McCabe, Christopher

    2017-02-01

    The cost-effectiveness threshold in health care systems with a constrained budget should be determined by the cost-effectiveness of displacing health care services to fund new interventions. Using comparative statics, we review some potential determinants of the threshold, including the budget for health care, the demand for existing health care interventions, the technical efficiency of existing interventions, and the development of new health technologies. We consider the anticipated direction of impact that would affect the threshold following a change in each of these determinants. Where the health care system is technically efficient, an increase in the health care budget unambiguously raises the threshold, whereas an increase in the demand for existing, non-marginal health interventions unambiguously lowers the threshold. Improvements in the technical efficiency of existing interventions may raise or lower the threshold, depending on the cause of the improvement in efficiency, whether the intervention is already funded, and, if so, whether it is marginal. New technologies may also raise or lower the threshold, depending on whether the new technology is a substitute for an existing technology and, again, whether the existing technology is marginal. Our analysis permits health economists and decision makers to assess if and in what direction the threshold may change over time. This matters, as threshold changes impact the cost-effectiveness of interventions that require decisions now but have costs and effects that fall in future periods.

  10. Low heat pain thresholds in migraineurs between attacks.

    Science.gov (United States)

    Schwedt, Todd J; Zuniga, Leslie; Chong, Catherine D

    2015-06-01

    Between attacks, migraine is associated with hypersensitivities to sensory stimuli. The objective of this study was to investigate hypersensitivity to pain in migraineurs between attacks. Cutaneous heat pain thresholds were measured in 112 migraineurs, migraine free for ≥ 48 hours, and 75 healthy controls. Pain thresholds at the head and at the arm were compared between migraineurs and controls using two-tailed t-tests. Among migraineurs, correlations between heat pain thresholds and headache frequency, allodynia symptom severity, and time interval until next headache were calculated. Migraineurs had lower pain thresholds than controls at the head (43.9 ℃ ± 3.2 ℃ vs. 45.1 ℃ ± 3.0 ℃, p = 0.015) and arm (43.2 ℃ ± 3.4 ℃ vs. 44.8 ℃ ± 3.3 ℃). There were no significant correlations between pain thresholds and headache frequency or allodynia symptom severity. For the 41 migraineurs for whom time to next headache was known, there were positive correlations between time to next headache and pain thresholds at the head (r = 0.352, p = 0.024) and arm (r = 0.312, p = 0.047). This study provides evidence that migraineurs have low heat pain thresholds between migraine attacks. Mechanisms underlying these lower pain thresholds could also predispose migraineurs to their next migraine attack, a hypothesis supported by finding positive correlations between pain thresholds and time to next migraine attack. © International Headache Society 2014.

  11. Determining the Threshold for HbA1c as a Predictor for Adverse Outcomes After Total Joint Arthroplasty: A Multicenter, Retrospective Study.

    Science.gov (United States)

    Tarabichi, Majd; Shohat, Noam; Kheir, Michael M; Adelani, Muyibat; Brigati, David; Kearns, Sean M; Patel, Pankajkumar; Clohisy, John C; Higuera, Carlos A; Levine, Brett R; Schwarzkopf, Ran; Parvizi, Javad; Jiranek, William A

    2017-09-01

    Although HbA1c is commonly used for assessing glycemic control before surgery, there is no consensus regarding its role and the appropriate threshold in predicting adverse outcomes. This study was designed to evaluate the potential link between HbA1c and subsequent periprosthetic joint infection (PJI), with the intention of determining the optimal threshold for HbA1c. This is a multicenter retrospective study, which identified 1645 diabetic patients who underwent primary total joint arthroplasty (1004 knees and 641 hips) between 2001 and 2015. All patients had an HbA1c measured within 3 months of surgery. The primary outcome of interest was a PJI at 1 year based on the Musculoskeletal Infection Society criteria. Secondary outcomes included orthopedic (wound and mechanical complications) and nonorthopedic complications (sepsis, thromboembolism, genitourinary, and cardiovascular complications). A regression analysis was performed to determine the independent influence of HbA1c for predicting PJI. Overall, 22 cases of PJI occurred at 1 year (1.3%). An HbA1c threshold of 7.7% was identified for predicting PJI (area under the curve, 0.65; 95% confidence interval, 0.51-0.78). Using this threshold, PJI rates increased from 0.8% (11 of 1441) to 5.4% (11 of 204). In the stepwise logistic regression analysis, PJI remained the only variable associated with higher HbA1c (odds ratio, 1.5; confidence interval, 1.2-2.0; P = .0001). There was no association between high HbA1c levels and other complications assessed. High HbA1c levels are associated with an increased risk for PJI. A threshold of 7.7% seems to be more indicative of infection than the commonly used 7% and should perhaps be the goal in preoperative patient optimization. Copyright © 2017 Elsevier Inc. All rights reserved.
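
    The abstract reports an AUC and a 7.7% cutoff but does not state the exact criterion used to select it; a common choice is the point on the ROC curve maximizing Youden's J, sketched below with synthetic placeholder data.

    ```python
    # Illustrative derivation of a diagnostic cutoff from an ROC curve using
    # Youden's J statistic. This is a generic sketch, not the authors' analysis.
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    def optimal_cutoff(hba1c, pji):
        """hba1c: preoperative HbA1c values; pji: 1 if PJI occurred within 1 year."""
        fpr, tpr, thresholds = roc_curve(pji, hba1c)
        j = tpr - fpr                                   # Youden's J at each cutoff
        best = int(np.argmax(j))
        return thresholds[best], roc_auc_score(pji, hba1c)

    # Example with synthetic data (illustration only, not the study cohort).
    rng = np.random.default_rng(3)
    hba1c = np.r_[rng.normal(6.8, 1.0, 1600), rng.normal(7.9, 1.2, 22)]
    pji = np.r_[np.zeros(1600), np.ones(22)]
    print(optimal_cutoff(hba1c, pji))
    ```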

  12. Subthreshold SPICE Model Optimization

    Science.gov (United States)

    Lum, Gregory; Au, Henry; Neff, Joseph; Bozeman, Eric; Kamin, Nick; Shimabukuro, Randy

    2011-04-01

    The first step in integrated circuit design is the simulation of said design in software to verify proper functionality and design requirements. Properties of the process are provided by fabrication foundries in the form of SPICE models. These SPICE models contain the electrical data and physical properties of the basic circuit elements. A limitation of these models is that the data collected by the foundry only accurately model the saturation region. This is fine for most users, but when operating devices in the subthreshold region the models are inadequate for accurate simulation results. This is why optimizing the current SPICE models to characterize the subthreshold region is so important. In order to accurately simulate this region of operation, MOSFETs of varying widths and lengths are fabricated and the electrical test data is collected. From the data collected, the parameters of the model files are optimized through parameter extraction rather than curve fitting. With the optimized models, the circuit designer is able to accurately simulate circuit designs in the subthreshold region.

  13. Some considerations regarding the creep crack growth threshold

    International Nuclear Information System (INIS)

    Thouless, M.D.; Evans, A.G.

    1984-01-01

    The preceding analysis reveals that the existence of a threshold determined by the sintering stress does not influence the post-threshold crack velocity. Considerations of the sintering stress can thus be conveniently excluded from analysis of the post-threshold crack velocity. The presence of a crack growth threshold has been predicted, based on the existence of cavity nucleation controlled crack growth. A preliminary analysis of cavity nucleation rates within the damage zone reveals that this threshold is relatively abrupt, in accord with experimental observations. Consequently, at stress intensities below K_th, growth becomes nucleation limited and crack blunting occurs in preference to crack growth.

  14. Explicit optimization of plan quality measures in intensity-modulated radiation therapy treatment planning.

    Science.gov (United States)

    Engberg, Lovisa; Forsgren, Anders; Eriksson, Kjell; Hårdemark, Björn

    2017-06-01

    To formulate convex planning objectives of treatment plan multicriteria optimization with explicit relationships to the dose-volume histogram (DVH) statistics used in plan quality evaluation. Conventional planning objectives are designed to minimize the violation of DVH statistics thresholds using penalty functions. Although successful in guiding the DVH curve towards these thresholds, conventional planning objectives offer limited control of the individual points on the DVH curve (doses-at-volume) used to evaluate plan quality. In this study, we abandon the usual penalty-function framework and propose planning objectives that more closely relate to DVH statistics. The proposed planning objectives are based on mean-tail-dose, resulting in convex optimization. We also demonstrate how to adapt a standard optimization method to the proposed formulation in order to obtain a substantial reduction in computational cost. We investigated the potential of the proposed planning objectives as tools for optimizing DVH statistics through juxtaposition with the conventional planning objectives on two patient cases. Sets of treatment plans with differently balanced planning objectives were generated using either the proposed or the conventional approach. Dominance in the sense of better distributed doses-at-volume was observed in plans optimized within the proposed framework. The initial computational study indicates that the DVH statistics are better optimized and more efficiently balanced using the proposed planning objectives than using the conventional approach. © 2017 American Association of Physicists in Medicine.
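
    The mean-tail-dose statistic underlying the proposed objectives can be illustrated directly: the upper mean-tail-dose is the average dose of the hottest v% of a structure's voxels and the lower mean-tail-dose the average of the coldest v%, and these bound the corresponding dose-at-volume from above and below, which is what makes them usable as convex surrogates. The snippet below only computes the statistics for a synthetic dose vector; it is not the authors' optimization code.

    ```python
    # Mean-tail-dose (CVaR-style) statistics for a dose vector; these bound the
    # dose-at-volume DVH statistic, MTD_v^+ >= D_v >= MTD_(100-v)^-.
    import numpy as np

    def upper_mean_tail_dose(dose, v_percent):
        """Mean dose of the hottest v_percent of voxels."""
        n_tail = max(1, int(round(len(dose) * v_percent / 100.0)))
        return float(np.sort(dose)[-n_tail:].mean())

    def lower_mean_tail_dose(dose, v_percent):
        """Mean dose of the coldest v_percent of voxels."""
        n_tail = max(1, int(round(len(dose) * v_percent / 100.0)))
        return float(np.sort(dose)[:n_tail].mean())

    # Synthetic PTV doses (Gy), illustration only.
    dose = np.random.default_rng(0).normal(60.0, 2.0, 10_000)
    print(upper_mean_tail_dose(dose, 2), lower_mean_tail_dose(dose, 98))
    ```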

  15. When Do Price Thresholds Matter in Retail Categories?

    OpenAIRE

    Koen Pauwels; Shuba Srinivasan; Philip Hans Franses

    2007-01-01

    Marketing literature has long recognized that brand price elasticity need not be monotonic and symmetric, but has yet to provide generalizable market-level insights on threshold-based price elasticity, asymmetric thresholds, and the sign and magnitude of elasticity transitions. This paper introduces smooth transition regression models to study threshold-based price elasticity of the top 4 brands across 20 fast-moving consumer good categories. Threshold-based price elasticity is found for 76% ...

  16. Estimating the Threshold Level of Inflation for Thailand

    OpenAIRE

    Jiranyakul, Komain

    2017-01-01

    Abstract. This paper analyzes the relationship between inflation and economic growth in Thailand using an annual dataset covering 1990 to 2015. The threshold model is estimated for different candidate levels of the threshold inflation rate. The results suggest that the threshold level of inflation above which inflation significantly slows growth is estimated at 3 percent. The negative relationship between inflation and growth is apparent above this threshold level of inflation. In other words, the inflation rat...
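
    The abstract does not spell out the estimation procedure, but the standard approach for this kind of threshold model is a grid search: for each candidate threshold k, regress growth on inflation and on the excess of inflation above k, and keep the k that minimizes the residual sum of squares. The sketch below assumes annual data in a DataFrame with placeholder column names.

    ```python
    # Grid search for a threshold level of inflation in a piecewise-linear growth
    # regression. Column names and the candidate grid are placeholders.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    def estimate_threshold(df, candidates=np.arange(1.0, 10.5, 0.5)):
        """df needs columns 'growth' and 'inflation' (annual, in percent)."""
        best_k, best_ssr, best_fit = None, np.inf, None
        for k in candidates:
            excess = np.where(df["inflation"] > k, df["inflation"] - k, 0.0)
            X = sm.add_constant(pd.DataFrame({"inflation": df["inflation"],
                                              "excess_above_k": excess}))
            fit = sm.OLS(df["growth"], X).fit()
            if fit.ssr < best_ssr:
                best_k, best_ssr, best_fit = k, fit.ssr, fit
        return best_k, best_fit

    # The coefficient on 'excess_above_k' estimates the additional (typically
    # negative) growth effect of inflation above the threshold.
    ```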

  17. The effect of discounting, different mortality reduction schemes and predictive cohort life tables on risk acceptability criteria

    International Nuclear Information System (INIS)

    Rackwitz, Ruediger

    2006-01-01

    Technical facilities should be optimal with respect to benefits and cost. Optimization of technical facilities involving risks for human life and limb requires an acceptability criterion and suitable discount rates both for the public and the operator, depending on for whom the optimization is carried out. The life quality index is presented and embedded into modern socio-economic concepts. A general risk acceptability criterion is derived. The societal life saving cost to be used in optimization as life saving or compensation cost and the societal willingness-to-pay based on the societal value of a statistical life or on the societal life quality index are developed. Different mortality reduction schemes are studied. Also, predictive cohort life tables are derived and applied. Discount rates γ must be long-term averages in view of the time horizon of some 20 to more than 100 years for the facilities of interest and net of inflation and taxes. While the operator may use long-term averages from the financial market for his cost-benefit analysis, the assessment of interest rates for investments of the public into risk reduction is more difficult. The classical Ramsey model decomposes the real interest rate (=output growth rate) into the rate of time preference of consumption and the rate of economic growth multiplied by the elasticity of marginal utility of consumption. It is proposed to use a relatively small interest rate of 3%, implying a rate of time preference of consumption of about 1%. This appears intergenerationally acceptable from an ethical point of view. Risk-consequence curves are derived for an example

  18. Simultaneous optimization of beam orientations and beam weights in conformal radiotherapy

    International Nuclear Information System (INIS)

    Rowbottom, Carl Graham; Khoo, Vincent S.; Webb, Steve

    2001-01-01

    A methodology for the concurrent optimization of beam orientations and beam weights in conformal radiotherapy treatment planning has been developed and tested on a cohort of five patients. The algorithm is based on a beam-weight optimization scheme with a downhill simplex optimization engine. The use of random voxels in the dose calculation provides much of the required speed up in the optimization process, and allows the simultaneous optimization of beam orientations and beam weights in a reasonable time. In the implementation of the beam-weight optimization algorithm just 10% of the original patient voxels are used for the dose calculation and cost function evaluation. A fast simulated annealing algorithm controls the optimization of the beam arrangement. The optimization algorithm was able to produce clinically acceptable plans for the five patients in the cohort study. The algorithm equalized the dose to the optic nerves compared to the standard plans and reduced the mean dose to the brain stem by an average of 4.4% (±1.9, 1 SD), p value=0.007. The dose distribution to the PTV was not compromised by developing beam arrangements via the optimization algorithm. In conclusion, the simultaneous optimization of beam orientations and beam weights has been developed to be routinely used in a realistic time. The results of optimization in a small cohort study show that the optimization can reliably produce clinically acceptable dose distributions and may be able to improve dose distributions compared to those from a human planner

  19. Time-efficient multidimensional threshold tracking method

    DEFF Research Database (Denmark)

    Fereczkowski, Michal; Kowalewski, Borys; Dau, Torsten

    2015-01-01

    Traditionally, adaptive methods have been used to reduce the time it takes to estimate psychoacoustic thresholds. However, even with adaptive methods, there are many cases where the testing time is too long to be clinically feasible, particularly when estimating thresholds as a function of anothe...

  20. Threshold Games and Cooperation on Multiplayer Graphs.

    Directory of Open Access Journals (Sweden)

    Kaare B Mikkelsen

    Full Text Available The study investigates the effect on cooperation in multiplayer games when the population from which all individuals are drawn is structured, i.e. when a given individual is only competing with a small subset of the entire population. To optimize the focus on multiplayer effects, a class of games was chosen for which the payoff depends nonlinearly on the number of cooperators; this ensures that the game cannot be represented as a sum of pair-wise interactions, and increases the likelihood of observing behaviour different from that seen in two-player games. The chosen class of games is named "threshold games", and is defined by a threshold, M > 0, which describes the minimal number of cooperators in a given match required for all the participants to receive a benefit. The model was studied primarily through numerical simulations of large populations of individuals, each with interaction neighbourhoods described by various classes of networks. When comparing the level of cooperation in a structured population to the mean-field model, we find that most types of structure lead to a decrease in cooperation. This is both interesting and novel, simply due to the generality and breadth of relevance of the model; it is likely that any model with a similar payoff structure exhibits related behaviour. More importantly, we find that the details of the behaviour depend to a large extent on the size of the immediate neighbourhoods of the individuals, as dictated by the network structure. In effect, the players behave as if they are part of a much smaller, fully mixed population, for which we suggest an expression.
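
    The payoff structure of a threshold game as described above can be written in a few lines; the benefit and cost values below are placeholders, and the paper's exact normalization may differ.

    ```python
    # Minimal sketch of a threshold-game payoff: every participant in a match
    # receives a benefit b only if the number of cooperators reaches the
    # threshold M, while cooperators always pay a cost c.
    def threshold_game_payoff(is_cooperator, n_cooperators, M, b=1.0, c=0.3):
        benefit = b if n_cooperators >= M else 0.0
        return benefit - (c if is_cooperator else 0.0)

    # Example: matches with threshold M = 3.
    print(threshold_game_payoff(True, 3, 3))    # cooperator, threshold met  -> 0.7
    print(threshold_game_payoff(False, 2, 3))   # defector, threshold missed -> 0.0
    ```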

  1. A light-powered sub-threshold microprocessor

    Energy Technology Data Exchange (ETDEWEB)

    Liu Ming; Chen Hong; Zhang Chun; Li Changmeng; Wang Zhihua, E-mail: lium02@mails.tsinghua.edu.cn [Institute of Microelectronics, Tsinghua University, Beijing 100084 (China)

    2010-11-15

    This paper presents an 8-bit sub-threshold microprocessor which can be powered by an integrated photosensitive diode. With a custom designed sub-threshold standard cell library and a 1 kbit sub-threshold SRAM design, a leakage power of 58 nW, a dynamic power of 385 nW at 165 kHz, an EDP of 13 pJ/inst and an operating voltage of 350 mV are achieved. Under illumination of about 150 klux, the microprocessor can run at up to 500 kHz. The microprocessor can be used for wireless-sensor-network nodes.

  2. National turnaround time survey: professional consensus standards for optimal performance and thresholds considered to compromise efficient and effective clinical management.

    Science.gov (United States)

    McKillop, Derek J; Auld, Peter

    2017-01-01

    Background Turnaround time can be defined as the time from receipt of a sample by the laboratory to the validation of the result. The Royal College of Pathologists recommends that a number of performance indicators for turnaround time should be agreed with stakeholders. The difficulty is in arriving at a goal which has some evidence base to support it other than what may simply be currently achievable technically. This survey sought to establish a professional consensus on the goals and meaning of targets for laboratory turnaround time. Methods A questionnaire was circulated by the National Audit Committee to 173 lead consultants for biochemistry in the UK. The survey asked each participant to state their current target turnaround time for core investigations in a broad group of clinical settings. Each participant was also asked to provide a professional opinion on what turnaround time would pose an unacceptable risk to patient safety for each departmental category. A super majority (2/3) was selected as the threshold for consensus. Results The overall response rate was 58% ( n = 100) with a range of 49-72% across the individual Association for Clinical Biochemistry and Laboratory Medicine regions. The consensus optimal turnaround time for the emergency department was 2 h considered unacceptable. The times for general practice and outpatient department were 48 h and for Wards 12 h, respectively. Conclusions We consider that the figures provide a useful benchmark of current opinion, but clearly more empirical standards will have to develop alongside other aspects of healthcare delivery.

  3. Threshold concepts in finance: student perspectives

    Science.gov (United States)

    Hoadley, Susan; Kyng, Tim; Tickle, Leonie; Wood, Leigh N.

    2015-10-01

    Finance threshold concepts are the essential conceptual knowledge that underpin well-developed financial capabilities and are central to the mastery of finance. In this paper we investigate threshold concepts in finance from the point of view of students, by establishing the extent to which students are aware of threshold concepts identified by finance academics. In addition, we investigate the potential of a framework of different types of knowledge to differentiate the delivery of the finance curriculum and the role of modelling in finance. Our purpose is to identify ways to improve curriculum design and delivery, leading to better student outcomes. Whilst we find that there is significant overlap between what students identify as important in finance and the threshold concepts identified by academics, much of this overlap is expressed by indirect reference to the concepts. Further, whilst different types of knowledge are apparent in the student data, there is evidence that students do not necessarily distinguish conceptual from other types of knowledge. As well as investigating the finance curriculum, the research demonstrates the use of threshold concepts to compare and contrast student and academic perceptions of a discipline and, as such, is of interest to researchers in education and other disciplines.

  4. Configuration space analysis of common cost functions in radiotherapy beam-weight optimization algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Rowbottom, Carl Graham [Joint Department of Physics, Institute of Cancer Research and the Royal Marsden NHS Trust, Sutton, Surrey (United Kingdom); Webb, Steve [Joint Department of Physics, Institute of Cancer Research and the Royal Marsden NHS Trust, Sutton, Surrey (United Kingdom)

    2002-01-07

    The successful implementation of downhill search engines in radiotherapy optimization algorithms depends on the absence of local minima in the search space. Such techniques are much faster than stochastic optimization methods but may become trapped in local minima if they exist. A technique known as 'configuration space analysis' was applied to examine the search space of cost functions used in radiotherapy beam-weight optimization algorithms. A downhill-simplex beam-weight optimization algorithm was run repeatedly to produce a frequency distribution of final cost values. By plotting the frequency distribution as a function of final cost, the existence of local minima can be determined. Common cost functions such as the quadratic deviation of dose to the planning target volume (PTV), integral dose to organs-at-risk (OARs), dose-threshold and dose-volume constraints for OARs were studied. Combinations of the cost functions were also considered. The simple cost function terms such as the quadratic PTV dose and integral dose to OAR cost function terms are not susceptible to local minima. In contrast, dose-threshold and dose-volume OAR constraint cost function terms are able to produce local minima in the example case studied. (author)
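
    The technique can be sketched generically: restart a downhill simplex (Nelder-Mead) search from many random starting beam-weight vectors and histogram the final cost values; a single narrow peak suggests a search space free of local minima, whereas multiple clusters indicate trapping. The toy quadratic cost below merely stands in for a real beam-weight cost function.

    ```python
    # Generic sketch of configuration space analysis: repeat a downhill simplex
    # search from random starting points and inspect the distribution of final
    # cost values.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    def cost(weights, target=1.0):
        # Placeholder convex cost: quadratic deviation of mean "dose" from target,
        # with non-negativity of the beam weights enforced by a penalty.
        dose = weights.sum() / len(weights)
        return (dose - target) ** 2 + 10.0 * np.sum(np.minimum(weights, 0.0) ** 2)

    final_costs = []
    for _ in range(200):
        x0 = rng.uniform(0.0, 2.0, size=5)              # random starting beam weights
        res = minimize(cost, x0, method="Nelder-Mead")
        final_costs.append(res.fun)

    hist, edges = np.histogram(final_costs, bins=30)
    print(hist)   # a single populated bin near zero suggests no local minima
    ```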

  5. Bedding material affects mechanical thresholds, heat thresholds and texture preference

    Science.gov (United States)

    Moehring, Francie; O’Hara, Crystal L.; Stucky, Cheryl L.

    2015-01-01

    It has long been known that the bedding type animals are housed on can affect breeding behavior and cage environment. Yet little is known about its effects on evoked behavior responses or non-reflexive behaviors. C57BL/6 mice were housed for two weeks on one of five bedding types: Aspen Sani Chips® (standard bedding for our institute), ALPHA-Dri®, Cellu-Dri™, Pure-o’Cel™ or TEK-Fresh. Mice housed on Aspen exhibited the lowest (most sensitive) mechanical thresholds while those on TEK-Fresh exhibited 3-fold higher thresholds. While bedding type had no effect on responses to punctate or dynamic light touch stimuli, TEK-Fresh housed animals exhibited greater responsiveness in a noxious needle assay, than those housed on the other bedding types. Heat sensitivity was also affected by bedding as animals housed on Aspen exhibited the shortest (most sensitive) latencies to withdrawal whereas those housed on TEK-Fresh had the longest (least sensitive) latencies to response. Slight differences between bedding types were also seen in a moderate cold temperature preference assay. A modified tactile conditioned place preference chamber assay revealed that animals preferred TEK-Fresh to Aspen bedding. Bedding type had no effect in a non-reflexive wheel running assay. In both acute (two day) and chronic (5 week) inflammation induced by injection of Complete Freund’s Adjuvant in the hindpaw, mechanical thresholds were reduced in all groups regardless of bedding type, but TEK-Fresh and Pure-o’Cel™ groups exhibited a greater dynamic range between controls and inflamed cohorts than Aspen housed mice. PMID:26456764

  6. 40 CFR 68.115 - Threshold determination.

    Science.gov (United States)

    2010-07-01

    ... (CONTINUED) CHEMICAL ACCIDENT PREVENTION PROVISIONS Regulated Substances for Accidental Release Prevention... process exceeds the threshold. (b) For the purposes of determining whether more than a threshold quantity... portion of the process is less than 10 millimeters of mercury (mm Hg), the amount of the substance in the...

  7. Approach to DOE threshold guidance limits

    International Nuclear Information System (INIS)

    Shuman, R.D.; Wickham, L.E.

    1984-01-01

    The need for less restrictive criteria governing disposal of extremely low-level radioactive waste has long been recognized. The Low-Level Waste Management Program has been directed by the Department of Energy (DOE) to aid in the development of a threshold guidance limit for DOE low-level waste facilities. Project objectives are concerned with the definition of a threshold limit dose and pathway analysis of radionuclide transport within selected exposure scenarios at DOE sites. Results of the pathway analysis will be used to determine waste radionuclide concentration guidelines that meet the defined threshold limit dose. Methods of measurement and verification of concentration limits round out the project's goals. Work on defining a threshold limit dose is nearing completion. Pathway analysis of sanitary landfill operations at the Savannah River Plant and the Idaho National Engineering Laboratory is in progress using the DOSTOMAN computer code. Concentration limit calculations and determination of implementation procedures shall follow completion of the pathways work. 4 references

  8. Towards a unifying basis of auditory thresholds: binaural summation.

    Science.gov (United States)

    Heil, Peter

    2014-04-01

    Absolute auditory threshold decreases with increasing sound duration, a phenomenon explainable by the assumptions that the sound evokes neural events whose probabilities of occurrence are proportional to the sound's amplitude raised to an exponent of about 3 and that a constant number of events are required for threshold (Heil and Neubauer, Proc Natl Acad Sci USA 100:6151-6156, 2003). Based on this probabilistic model and on the assumption of perfect binaural summation, an equation is derived here that provides an explicit expression of the binaural threshold as a function of the two monaural thresholds, irrespective of whether they are equal or unequal, and of the exponent in the model. For exponents >0, the predicted binaural advantage is largest when the two monaural thresholds are equal and decreases towards zero as the monaural threshold difference increases. This equation is tested and the exponent derived by comparing binaural thresholds with those predicted on the basis of the two monaural thresholds for different values of the exponent. The thresholds, measured in a large sample of human subjects with equal and unequal monaural thresholds and for stimuli with different temporal envelopes, are compatible only with an exponent close to 3. An exponent of 3 predicts a binaural advantage of 2 dB when the two ears are equally sensitive. Thus, listening with two (equally sensitive) ears rather than one has the same effect on absolute threshold as doubling duration. The data suggest that perfect binaural summation occurs at threshold and that peripheral neural signals are governed by an exponent close to 3. They might also shed new light on mechanisms underlying binaural summation of loudness.
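    The probabilistic model summarized above (event rate proportional to amplitude raised to an exponent k, a fixed number of events required for threshold, perfect binaural summation) implies a simple rule for combining the two monaural thresholds. The sketch below is a reconstruction under those assumptions rather than a quotation of the paper's equation; it reproduces the quoted ~2 dB binaural advantage for equally sensitive ears with k = 3.

```python
# Reconstruction of the binaural prediction implied by the model described above
# (event rate proportional to amplitude^k, fixed event count at threshold, perfect
# binaural summation); this closed form is an assumption-based sketch, not a quote
# of the paper's equation.
import math

def binaural_threshold_db(left_db, right_db, k=3.0):
    """Predicted binaural threshold (dB) from the two monaural thresholds (dB)."""
    s = 10.0 ** (-k * left_db / 20.0) + 10.0 ** (-k * right_db / 20.0)
    return -20.0 / k * math.log10(s)

# Equal monaural thresholds: ~2 dB binaural advantage for an exponent of 3.
print(30.0 - binaural_threshold_db(30.0, 30.0))   # ~2.0 dB
# Very unequal monaural thresholds: the advantage nearly vanishes.
print(30.0 - binaural_threshold_db(30.0, 60.0))   # ~0.0 dB
```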

  9. Exercise increases pressure pain tolerance but not pressure and heat pain thresholds in healthy young men

    DEFF Research Database (Denmark)

    Vaegter, H. B.; Bement, M. Hoeger; Madsen, A. B.

    2017-01-01

    BACKGROUND: Exercise causes an acute decrease in the pain sensitivity known as exercise-induced hypoalgesia (EIH), but the specificity to certain pain modalities remains unknown. This study aimed to compare the effect of isometric exercise on the heat and pressure pain sensitivity. METHODS...... and counterbalanced order. Cuff pressure pain threshold (cPPT) and pain tolerance (cPTT) were assessed on the ipsilateral lower leg by computer-controlled cuff algometry. Heat pain threshold (HPT) was recorded on the ipsilateral foot by a computer-controlled thermal stimulator. RESULTS: Cuff pressure pain tolerance...... to the understanding of how isometric exercise influences pain perception, which is necessary to optimize the clinical utility of exercise in management of chronic pain. SIGNIFICANCE: The effect of isometric exercise on pain tolerance may be relevant for patients in chronic musculoskeletal pain as a pain...

  10. Experimental design and multicriteria decision making methods for the optimization of ice cream composition

    Directory of Open Access Journals (Sweden)

    Cristian Rojas

    2012-03-01

    Full Text Available The aim of the present work was to optimize the sensorial and technological features of ice cream. The experimental work was performed in two stages: (1) optimization of the enzymatic hydrolysis of lactose, and (2) optimization of the process and product. For the first stage a complete factorial design was developed and optimized using both response surface methodology and the steepest ascent method. In the second stage a mixture design was performed, combining the process variables. The product with the best sensorial acceptance, high yield and low cost was selected. The acceptance of the product was evaluated by an untrained tasters' panel. As a main result, the sensorial and technological features of the final product were improved, establishing the optimum parameters for its elaboration.

  11. Spike-threshold adaptation predicted by membrane potential dynamics in vivo.

    Directory of Open Access Journals (Sweden)

    Bertrand Fontaine

    2014-04-01

    Full Text Available Neurons encode information in sequences of spikes, which are triggered when their membrane potential crosses a threshold. In vivo, the spiking threshold displays large variability, suggesting that threshold dynamics have a profound influence on how the combined input of a neuron is encoded in the spiking. Threshold variability could be explained by adaptation to the membrane potential. However, it could also be the case that most threshold variability reflects noise and processes other than threshold adaptation. Here, we investigated threshold variation in auditory neuron responses recorded in vivo in barn owls. We found that spike threshold is quantitatively predicted by a model in which the threshold adapts, tracking the membrane potential at a short timescale. As a result, in these neurons, slow voltage fluctuations do not contribute to spiking because they are filtered by threshold adaptation. More importantly, these neurons can only respond to input spikes arriving together on a millisecond timescale. These results demonstrate that fast adaptation to the membrane potential captures spike threshold variability in vivo.

  12. Ultralow percolation threshold of single walled carbon nanotube-epoxy composites synthesized via an ionic liquid dispersant/initiator

    Science.gov (United States)

    Watters, Arianna L.; Palmese, Giuseppe R.

    2014-09-01

    Uniform dispersion of single walled carbon nanotubes (SWNTs) in an epoxy was achieved by a streamlined mechano-chemical processing method. SWNT-epoxy composites were synthesized using a room temperature ionic liquid (IL) with an imidazolium cation and dicyanamide anion. The novel approach of using ionic liquid that behaves as a dispersant for SWNTs and initiator for epoxy polymerization greatly simplifies nanocomposite synthesis. The material was processed using simple and scalable three roll milling. The SWNT dispersion of the resultant composite was evaluated by electron microscopy and electrical conductivity measurements in conjunction with percolation theory. Processing conditions were optimized to achieve the lowest possible percolation threshold, 4.29 × 10⁻⁵ volume fraction SWNTs. This percolation threshold is among the best reported in the literature, yet it was obtained using a streamlined method that greatly simplifies processing.
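    For readers unfamiliar with how percolation theory enters such conductivity measurements, the sketch below fits the classical power law sigma = sigma0·(phi − phi_c)^t to synthetic conductivity-versus-filler data to extract a percolation threshold phi_c; the data points and fitting choices are illustrative only and are not taken from this record.

```python
# Illustrative only: the classical percolation power law sigma = sigma0*(phi - phi_c)^t
# is commonly fitted to conductivity-vs-filler-fraction data to extract the
# percolation threshold phi_c; the data below are synthetic, not the authors' measurements.
import numpy as np
from scipy.optimize import curve_fit

def percolation_law(phi, sigma0, phi_c, t):
    return sigma0 * np.clip(phi - phi_c, 1e-12, None) ** t

phi = np.array([1e-4, 3e-4, 1e-3, 3e-3, 1e-2])          # SWNT volume fractions (synthetic)
noise = 1 + 0.05 * np.random.default_rng(1).standard_normal(phi.size)
sigma = 1e2 * (phi - 4.3e-5) ** 2.0 * noise             # synthetic conductivities

popt, _ = curve_fit(percolation_law, phi, sigma, p0=[1e2, 5e-5, 2.0], maxfev=20000)
print("fitted sigma0, phi_c, t:", popt)                 # phi_c should come out near 4.3e-5
```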

  13. Statistical Algorithm for the Adaptation of Detection Thresholds

    DEFF Research Database (Denmark)

    Stotsky, Alexander A.

    2008-01-01

    Many event detection mechanisms in spark ignition automotive engines are based on the comparison of the engine signals to the detection threshold values. Different signal qualities for new and aged engines necessitate the development of an adaptation algorithm for the detection thresholds...... remains constant regardless of engine age and changing detection threshold values. This, in turn, guarantees the same event detection performance for new and aged engines/sensors. Adaptation of the engine knock detection threshold is given as an example. Publication date: 2008...
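    The record does not reproduce the adaptation algorithm itself; as a generic, heavily simplified illustration of statistical threshold adaptation, the sketch below recursively tracks the background-signal statistics and keeps the detection threshold a fixed number of standard deviations above the estimated mean, so the false-alarm behaviour stays roughly constant as signal quality drifts with engine or sensor ageing.

```python
# Generic illustration only (the paper's actual algorithm is not reproduced here):
# recursively track the mean and variance of the background signal and place the
# detection threshold a fixed number of standard deviations above the mean, so the
# false-alarm rate stays roughly constant as engine/sensor characteristics drift.
def update_threshold(x, state, alpha=0.01, k=4.0):
    """One recursion step: x is the latest background sample, state = (mean, var)."""
    mean, var = state
    mean = (1 - alpha) * mean + alpha * x
    var = (1 - alpha) * var + alpha * (x - mean) ** 2
    threshold = mean + k * var ** 0.5
    return threshold, (mean, var)

state = (0.0, 1.0)
for x in [0.1, -0.2, 0.05, 0.3, -0.1]:        # placeholder knock-sensor background samples
    threshold, state = update_threshold(x, state)
print("current detection threshold:", threshold)
```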

  14. Perspective: Uses and misuses of thresholds in diagnostic decision making.

    Science.gov (United States)

    Warner, Jeremy L; Najarian, Robert M; Tierney, Lawrence M

    2010-03-01

    The concept of thresholds plays a vital role in decisions involving the initiation, continuation, and completion of diagnostic testing. Much research has focused on the development of explicit thresholds, in the form of practice guidelines and decision analyses. However, these tools are used infrequently; most medical decisions are made at the bedside, using implicit thresholds. Study of these thresholds can lead to a deeper understanding of clinical decision making. The authors examine some factors constituting individual clinicians' implicit thresholds. They propose a model for static thresholds using the concept of situational gravity to explain why some thresholds are high, and some low. Next, they consider the hypothetical effects of incorrect placement of thresholds (miscalibration) and changes to thresholds during diagnosis (manipulation). They demonstrate these concepts using common clinical scenarios. Through analysis of miscalibration of thresholds, the authors demonstrate some common maladaptive clinical behaviors, which are nevertheless internally consistent. They then explain how manipulation of thresholds gives rise to common cognitive heuristics including premature closure and anchoring. They also discuss the case where no threshold has been exceeded despite exhaustive collection of data, which commonly leads to application of the availability or representativeness heuristics. Awareness of implicit thresholds allows for a more effective understanding of the processes of medical decision making and, possibly, to the avoidance of detrimental heuristics and their associated medical errors. Research toward accurately defining these thresholds for individual physicians and toward determining their dynamic properties during the diagnostic process may yield valuable insights.

  15. Optimal sum rules inequalities for spin 1/2 Compton scattering

    International Nuclear Information System (INIS)

    Guiasu, I.; Radescu, E.E.; Razillier, I.

    1979-08-01

    A formalism appropriate for model independent dispersion theoretic investigations of the (not necessarily forward) Compton scattering off spin 1/2 hadronic targets, which fully exploits the analyticity properties of the amplitudes (to lowest order in electromagnetism) in ν² at fixed t (ν = (s−u)/4; s, t, u = Mandelstam variables), is developed. It relies on methods which are specific to boundary value problems for analytic matrix-valued functions. An analytic factorization of the positive definite hermitian matrix associated with the bilinear expression of the unpolarized differential cross section (u.d.c.s.) in terms of the Bardeen-Tung (B.T.) invariant amplitudes is explicitly obtained. For t in a specified portion of the physical region, six new amplitudes describing the process are thereby constructed which have the same good analyticity structure in ν² as the (crossing symmetrized) B.T. amplitudes, while their connection with the usual helicity amplitudes is given by a matrix which is unitary on the unitarity cut. A bound on a certain integral over the u.d.c.s. above the first inelastic threshold, established in terms of the target's charge and anomalous magnetic moment, improves a previous weaker result, being now optimal under the information accepted as known. (author)

  16. Promoting the acceptance of nuclear technology

    International Nuclear Information System (INIS)

    Rueckl, E.

    1998-01-01

    Restoring the public acceptance of nuclear technology requires optimized public relations work and an enhanced interaction among the nuclear industry and schools and universities. Thinking in contexts needs to be promoted, also in order to improve knowledge of mass flows. Specific terms often mean different things to experts and to the public. This can be corrected by careful use of language and precision in public relations work. The young generation is more openminded towards technology now than it was in the seventies and eighties. This is a point of departure in winning young people also for nuclear technology. For this to happen, science education in schools needs to be improved and the appropriate courses need to be introduced. (orig.) [de

  17. Is action potential threshold lowest in the axon?

    NARCIS (Netherlands)

    Kole, Maarten H. P.; Stuart, Greg J.

    2008-01-01

    Action potential threshold is thought to be lowest in the axon, but when measured using conventional techniques, we found that action potential voltage threshold of rat cortical pyramidal neurons was higher in the axon than at other neuronal locations. In contrast, both current threshold and voltage

  18. Bridging the Gap between Social Acceptance and Ethical Acceptability

    NARCIS (Netherlands)

    Taebi, B.

    2016-01-01

    New technology brings great benefits, but it can also create new and significant risks. When evaluating those risks in policymaking, there is a tendency to focus on social acceptance. By solely focusing on social acceptance, we could, however, overlook important ethical aspects of technological

  19. Applying Threshold Concepts to Finance Education

    Science.gov (United States)

    Hoadley, Susan; Wood, Leigh N.; Tickle, Leonie; Kyng, Tim

    2016-01-01

    Purpose: The purpose of this paper is to investigate and identify threshold concepts that are the essential conceptual content of finance programmes. Design/Methodology/Approach: Conducted in three stages with finance academics and students, the study uses threshold concepts as both a theoretical framework and a research methodology. Findings: The…

  20. Maximizing Total Profit in Two-agent Problem of Order Acceptance and Scheduling

    Directory of Open Access Journals (Sweden)

    Mohammad Reisi-Nafchi

    2017-03-01

    Full Text Available In competitive markets, attracting potential customers and keeping current customers is a survival condition for each company, so paying attention to customers' requests is vital. In this paper, the problem of order acceptance and scheduling is studied, in which two types of customers, or agents, compete in a single machine environment. The objective is to maximize the sum of the total profit of the first agent's accepted orders and the total revenue of the second agent. Only the first agent incurs a penalty, and its penalty function is lateness; the second agent's orders have a common due date, and this agent does not accept any tardy order. To solve the problem, a mathematical programming formulation, a heuristic algorithm and a pseudo-polynomial dynamic programming algorithm are proposed. Computational results confirm that the dynamic programming algorithm solves all problem instances with up to 70 orders optimally, as well as 93.12% of problem instances with up to 150 orders.

  1. Video Game Acceptance: A Meta-Analysis of the Extended Technology Acceptance Model.

    Science.gov (United States)

    Wang, Xiaohui; Goh, Dion Hoe-Lian

    2017-11-01

    The current study systematically reviews and summarizes the existing literature of game acceptance, identifies the core determinants, and evaluates the strength of the relationships in the extended technology acceptance model. Moreover, this study segments video games into two categories: hedonic and utilitarian and examines player acceptance of these two types separately. Through a meta-analysis of 50 articles, we find that perceived ease of use (PEOU), perceived usefulness (PU), and perceived enjoyment (PE) significantly associate with attitude and behavioral intention. PE is the dominant predictor of hedonic game acceptance, while PEOU and PU are the main determinants of utilitarian game acceptance. Furthermore, we find that respondent type and game platform are significant moderators. Findings of this study provide critical insights into the phenomenon of game acceptance and suggest directions for future research.

  2. The optimal structure-conductivity relation in epoxy-phthalocyanine nanocomposites.

    Science.gov (United States)

    Huijbregts, L J; Brom, H B; Brokken-Zijp, J C M; Kemerink, M; Chen, Z; Goeje, M P de; Yuan, M; Michels, M A J

    2006-11-23

    Phthalcon-11 (aquocyanophthalocyaninatocobalt (III)) forms semiconducting nanocrystals that can be dispersed in epoxy coatings to obtain a semiconducting material with a low percolation threshold. We investigated the structure-conductivity relation in this composite and the deviation from its optimal realization by combining two techniques. The real parts of the electrical conductivity of a Phthalcon-11/epoxy coating and of Phthalcon-11 powder were measured by dielectric spectroscopy as a function of frequency and temperature. Conducting atomic force microscopy (C-AFM) was applied to quantify the conductivity through the coating locally along the surface. This combination gives an excellent tool to visualize the particle network. We found that a large fraction of the crystals is organized in conducting channels of fractal building blocks. In this picture, a low percolation threshold automatically leads to a conductivity that is much lower than that of the filler. Since the structure-conductivity relation for the found network is almost optimal, a drastic increase in the conductivity of the coating cannot be achieved by changing the particle network, but only by using a filler with a higher conductivity level.

  3. The asymmetry of U.S. monetary policy: Evidence from a threshold Taylor rule with time-varying threshold values

    Science.gov (United States)

    Zhu, Yanli; Chen, Haiqiang

    2017-05-01

    In this paper, we revisit the issue whether U.S. monetary policy is asymmetric by estimating a forward-looking threshold Taylor rule with quarterly data from 1955 to 2015. In order to capture the potential heterogeneity for regime shift mechanism under different economic conditions, we modify the threshold model by assuming the threshold value as a latent variable following an autoregressive (AR) dynamic process. We use the unemployment rate as the threshold variable and separate the sample into two periods: expansion periods and recession periods. Our findings support that the U.S. monetary policy operations are asymmetric in these two regimes. More precisely, the monetary authority tends to implement an active Taylor rule with a weaker response to the inflation gap (the deviation of inflation from its target) and a stronger response to the output gap (the deviation of output from its potential level) in recession periods. The threshold value, interpreted as the targeted unemployment rate of monetary authorities, exhibits significant time-varying properties, confirming the conjecture that policy makers may adjust their reference point for the unemployment rate accordingly to reflect their attitude on the health of general economy.
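    A stylized, self-contained version of the kind of regime-dependent (threshold) Taylor rule discussed in this record is sketched below; the coefficients, the neutral rate and the unemployment threshold are illustrative choices, not the paper's estimates.

```python
# Stylized two-regime (threshold) Taylor rule of the kind discussed above; the
# coefficient values and the unemployment threshold are illustrative, not the
# paper's estimates.
def threshold_taylor_rate(inflation, inflation_target, output_gap,
                          unemployment, unemployment_threshold,
                          r_star=2.0):
    if unemployment <= unemployment_threshold:      # expansion regime
        alpha, beta = 0.5, 0.5                      # stronger weight on the inflation gap
    else:                                           # recession regime
        alpha, beta = 0.2, 1.0                      # weaker on inflation, stronger on the output gap
    return r_star + inflation + alpha * (inflation - inflation_target) + beta * output_gap

# Same macro conditions, different regimes, different prescribed policy rates:
print(threshold_taylor_rate(3.0, 2.0, -1.0, unemployment=4.5, unemployment_threshold=6.0))
print(threshold_taylor_rate(3.0, 2.0, -1.0, unemployment=8.0, unemployment_threshold=6.0))
```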

  4. Model Threshold untuk Pembelajaran Memproduksi Pantun Kelas XI

    Directory of Open Access Journals (Sweden)

    Fitri Nura Murti

    2017-03-01

    Full Text Available Abstract: Pantun learning in schools has so far given students little opportunity to develop their creativity in producing pantun. This is supported by observations of eleventh graders at SMAN 2 Bondowoso, which showed a tendency for students' work to be plagiarized. The general objective of this research and development was to develop the Threshold Pantun model for learning to produce pantun in grade XI. The product is presented as a guidance book for teachers entitled "Pembelajaran Memproduksi Pantun Menggunakan Model Threshold Pantun untuk Kelas XI". The study adapted the design of Borg and Gall's R&D procedure. Based on the validation results, the Threshold Pantun model is appropriate for implementation in learning to produce pantun. Key Words: Threshold Pantun model, producing pantun

  5. The H-mode power threshold in JET

    Energy Technology Data Exchange (ETDEWEB)

    Start, D F.H.; Bhatnagar, V P; Campbell, D J; Cordey, J G; Esch, H P.L. de; Gormezano, C; Hawkes, N; Horton, L; Jones, T T.C.; Lomas, P J; Lowry, C; Righi, E; Rimini, F G; Saibene, G; Sartori, R; Sips, G; Stork, D; Thomas, P; Thomsen, K; Tubbing, B J.D.; Von Hellermann, M; Ward, D J [Commission of the European Communities, Abingdon (United Kingdom). JET Joint Undertaking

    1994-07-01

    New H-mode threshold data over a range of toroidal field and density values have been obtained from the present campaign. The scaling with n_e B_t is almost identical with that of the 91/92 period for the same discharge conditions. The scaling with toroidal field alone gives somewhat higher thresholds than the older data. The 1991/2 database shows a scaling of P_th (power threshold) with n_e B_t which is approximately linear and agrees well with that observed on other tokamaks. For NBI and carbon target tiles the threshold power is a factor of two higher with the ion ∇B drift away from the target compared with the value found with the drift towards the target. The combination of ICRH and beryllium tiles appears to be beneficial for reducing P_th. The power threshold is largely insensitive to plasma current, X-point height and distance between the last closed flux surface and the limiter, at least for values greater than 2 cm. (authors). 3 refs., 6 figs.

  6. QRS Detection Based on Improved Adaptive Threshold

    Directory of Open Access Journals (Sweden)

    Xuanyu Lu

    2018-01-01

    Full Text Available Cardiovascular disease is the leading cause of death worldwide. Automatic electrocardiogram (ECG) analysis algorithms play an important role in achieving quick and accurate diagnosis, and their first step is QRS detection. The threshold algorithm for QRS complex detection is known for its high-speed computation and minimal memory storage. In the current mobile era, threshold algorithms can easily be ported to portable, wearable and wireless ECG systems. However, the detection rate of the threshold algorithm still calls for improvement. An improved adaptive threshold algorithm for QRS detection is reported in this paper. The main steps of this algorithm are preprocessing, peak finding and adaptive-threshold QRS detection. The detection rate is 99.41%, the sensitivity (Se) is 99.72% and the specificity (Sp) is 99.69% on the MIT-BIH Arrhythmia database. A comparison is also made with two other algorithms to demonstrate the improvement. The suspicious abnormal area is flagged at the end of the algorithm and an RR-Lorenz plot is drawn for doctors and cardiologists to use as an aid for diagnosis.
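    The three stages named in the abstract can be illustrated with a generic adaptive-threshold QRS detector of the Pan-Tompkins family; the filter settings, update constants and synthetic test signal below are placeholders and do not reproduce the cited algorithm or its reported performance.

```python
# Generic adaptive-threshold QRS detector in the spirit of the three stages named
# above (preprocessing, peak finding, adaptive threshold); filter settings, update
# constants and the synthetic test signal are placeholders, not the cited algorithm.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_qrs(ecg, fs=360.0):
    # 1) Preprocessing: band-pass to emphasize the QRS band, differentiate, square, average.
    b, a = butter(2, [5.0 / (fs / 2), 15.0 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    window = int(0.15 * fs)
    energy = np.convolve(np.diff(filtered) ** 2, np.ones(window) / window, "same")

    # 2) Peak finding with a ~200 ms refractory distance.
    peaks, _ = find_peaks(energy, distance=int(0.2 * fs))

    # 3) Adaptive threshold: running estimates of signal-peak and noise-peak levels.
    spk = float(np.max(energy[: int(2 * fs)]))
    npk = float(np.mean(energy[: int(2 * fs)]))
    detections = []
    for p in peaks:
        threshold = npk + 0.25 * (spk - npk)
        if energy[p] > threshold:
            spk = 0.125 * energy[p] + 0.875 * spk   # update signal level
            detections.append(p)
        else:
            npk = 0.125 * energy[p] + 0.875 * npk   # update noise level
    return detections

# Usage on a synthetic "ECG": a noisy train of 10 sharp pulses at 1 Hz.
fs = 360.0
ecg = 0.02 * np.random.default_rng(2).standard_normal(int(10 * fs))
ecg[np.arange(10) * int(fs)] += 1.0
print(len(detect_qrs(ecg, fs)), "beats detected")   # should be close to the 10 simulated beats
```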

  7. Determination of threshold values for operating transients via 3-D parametric analyses

    International Nuclear Information System (INIS)

    Raju, P.P.; Baylac, G.; Faidy, C.

    1983-01-01

    The main objective of the work reported herein was to determine the threshold values of operating parameters such as internal pressure and temperature fluctuations in order that the monitoring of these parameters could be optimized in an operating nuclear power plant on the basis that these fluctuations would not adversely affect the structural integrity and/or fatigue life of the systems and components involved. Accordingly, a parametric study was performed, using a typical and potentially critical lateral connection commonly used in the PWR system. The d/D and D/T ratios for the selected configuration were 0.36 and 10.6, respectively. A three dimensional finite element model was generated for the study using the latest modeling techniques. The stresses due to 1 MPa internal pressure were computed first. Then, a transient thermal analysis was performed for the specified fluid temperature fluctuation of 30 °C in 60 seconds. Subsequently, a thermal stress analysis was performed using the calculated thermal gradients through the wall. The results of the foregoing analyses are presented and discussed with the help of a threshold equation formulated to prevent fatigue failure. Stress intensification factors are also reported for critical areas

  8. Critical Power: An Important Fatigue Threshold in Exercise Physiology.

    Science.gov (United States)

    Poole, David C; Burnley, Mark; Vanhatalo, Anni; Rossiter, Harry B; Jones, Andrew M

    2016-11-01

    The hyperbolic form of the power-duration relationship is rigorous and highly conserved across species, forms of exercise, and individual muscles/muscle groups. For modalities such as cycling, the relationship resolves to two parameters, the asymptote for power (critical power [CP]) and the so-called W' (work doable above CP), which together predict the tolerable duration of exercise above CP. Crucially, the CP concept integrates sentinel physiological profiles-respiratory, metabolic, and contractile-within a coherent framework that has great scientific and practical utility. Rather than calibrating equivalent exercise intensities relative to metabolically distant parameters such as the lactate threshold or V˙O2max, setting the exercise intensity relative to CP unifies the profile of systemic and intramuscular responses and, if greater than CP, predicts the tolerable duration of exercise until W' is expended, V˙O2max is attained, and intolerance is manifested. CP may be regarded as a "fatigue threshold" in the sense that it separates exercise intensity domains within which the physiological responses to exercise can (below CP) or cannot (above CP) be stabilized. The CP concept therefore enables important insights into 1) the principal loci of fatigue development (central vs. peripheral) at different intensities of exercise and 2) mechanisms of cardiovascular and metabolic control and their modulation by factors such as O2 delivery. Practically, the CP concept has great potential application in optimizing athletic training programs and performance as well as improving the life quality for individuals enduring chronic disease.
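    The two-parameter hyperbolic model underlying the CP concept, t_lim = W'/(P − CP) for P > CP, can be turned into a minimal worked example: estimate CP and W' from two exhaustive constant-power trials and predict the tolerable duration at a third power output. The trial values below are invented for illustration.

```python
# The two-parameter hyperbolic power-duration model described above:
#   t_lim = W' / (P - CP)   for P > CP.
# Minimal sketch: estimate CP and W' from two exhaustive constant-power trials
# (illustrative numbers, not data from the paper), then predict the tolerable
# duration at a third power output.
def cp_wprime_from_two_trials(p1, t1, p2, t2):
    """Solve (P - CP) * t = W' for both trials; powers in W, times in s."""
    cp = (p2 * t2 - p1 * t1) / (t2 - t1)
    w_prime = (p1 - cp) * t1
    return cp, w_prime

def tolerable_duration(power, cp, w_prime):
    if power <= cp:
        return float("inf")        # below CP a metabolic steady state can be reached
    return w_prime / (power - cp)

cp, w_prime = cp_wprime_from_two_trials(300.0, 180.0, 250.0, 600.0)  # 300 W for 3 min, 250 W for 10 min
print(cp, w_prime)                                   # ~229 W and ~12.9 kJ
print(tolerable_duration(280.0, cp, w_prime))        # predicted time to exhaustion at 280 W (~250 s)
```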

  9. Where should I send it? Optimizing the submission decision process.

    Directory of Open Access Journals (Sweden)

    Santiago Salinas

    Full Text Available How do scientists decide where to submit manuscripts? Many factors influence this decision, including prestige, acceptance probability, turnaround time, target audience, fit, and impact factor. Here, we present a framework for evaluating where to submit a manuscript based on the theory of Markov decision processes. We derive two models, one in which an author is trying to optimally maximize citations and another in which that goal is balanced by either minimizing the number of resubmissions or the total time in review. We parameterize the models with data on acceptance probability, submission-to-decision times, and impact factors for 61 ecology journals. We find that submission sequences beginning with Ecology Letters, Ecological Monographs, or PLOS ONE could be optimal depending on the importance given to time to acceptance or number of resubmissions. This analysis provides some guidance on where to submit a manuscript given the individual-specific values assigned to these disparate objectives.
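    A much-simplified version of the submission-sequence calculation (not the authors' full Markov decision model) is sketched below: for a fixed ordering of journals it computes the expected impact factor of the eventual outlet, the expected days in review and the expected number of resubmissions. Journal names, acceptance probabilities, review times and impact factors are made up.

```python
# Simplified, illustrative version of the submission-sequence evaluation described
# above (not the authors' full Markov decision model): for a fixed ordering of
# journals, compute the expected impact factor of the eventual outlet, the expected
# total time in review, and the expected number of resubmissions. All numbers are made up.
def evaluate_sequence(journals):
    """journals: list of (name, acceptance_probability, review_time_days, impact_factor)."""
    p_reach = 1.0
    expected_if, expected_days, expected_resubmissions = 0.0, 0.0, 0.0
    for name, p_accept, review_days, impact in journals:
        expected_days += p_reach * review_days
        expected_if += p_reach * p_accept * impact
        p_reach *= (1.0 - p_accept)          # probability of being rejected and moving on
        expected_resubmissions += p_reach
    return expected_if, expected_days, expected_resubmissions

sequence = [("Journal A", 0.10, 60, 9.0),    # prestigious, slow, low acceptance
            ("Journal B", 0.35, 45, 4.0),
            ("Journal C", 0.70, 30, 2.5)]    # broad-scope, fast, high acceptance
print(evaluate_sequence(sequence))
```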

  10. Threshold enhancement of diphoton resonances

    Directory of Open Access Journals (Sweden)

    Aoife Bharucha

    2016-10-01

    Full Text Available We revisit a mechanism to enhance the decay width of (pseudo-)scalar resonances to photon pairs when the process is mediated by loops of charged fermions produced near threshold. Motivated by the recent LHC data, indicating the presence of an excess in the diphoton spectrum at approximately 750 GeV, we illustrate this threshold enhancement mechanism in the case of a 750 GeV pseudoscalar boson A with a two-photon decay mediated by a charged and uncolored fermion having a mass at the ½M_A threshold and a small decay width, <1 MeV. The implications of such a threshold enhancement are discussed in two explicit scenarios: (i) the Minimal Supersymmetric Standard Model, in which the A state is produced via the top quark mediated gluon fusion process and decays into photons predominantly through loops of charginos with masses close to ½M_A, and (ii) a two Higgs doublet model in which A is again produced by gluon fusion but decays into photons through loops of vector-like charged heavy leptons. In both these scenarios, while the mass of the charged fermion has to be adjusted to be extremely close to half of the A resonance mass, the small total widths are naturally obtained if only suppressed three-body decay channels occur. Finally, the implications of some of these scenarios for dark matter are discussed.

  11. ENTRIA workshop. Determine threshold values in radiation protection

    International Nuclear Information System (INIS)

    Diener, Lisa

    2015-01-01

    Threshold values affect our daily lives. Whether it concerns traffic or noise regulations, we all experience thresholds on a regular basis. But how are such values generated? The conference ''Determine Threshold Values in Radiation Protection'', taking place on January 27th 2015 in Braunschweig, focused on this question. The conference was undertaken in the context of the BMBF-funded interdisciplinary research project ''ENTRIA - Disposal Options for Radioactive Residues''. It aimed to stimulate a cross-disciplinary discussion. Speakers from different disciplinary backgrounds talked about topics like procedures of setting threshold values, standards for evaluating dosages, and public participation in the standardization of threshold values. Two major theses emerged: First, setting threshold values always requires considering contexts and protection targets. Second, existing uncertainties must be communicated in and with the public. Altogether, the conference offered lots of input and issues for discussion. In addition, it raised interesting and important questions for further and ongoing work in the research project ENTRIA.

  12. Thresholds of ion turbulence in tokamaks

    International Nuclear Information System (INIS)

    Garbet, X.; Laurent, L.; Mourgues, F.; Roubin, J.P.; Samain, A.; Zou, X.L.

    1991-01-01

    The linear thresholds of ionic turbulence are numerically calculated for the Tokamaks JET and TORE SUPRA. It is proved that the stability domain at η_i > 0 is determined by trapped ion modes and is characterized by η_i ≥ 1 and a threshold L_Ti/R of order (0.2-0.3)/(1 + T_i/T_e). The latter value is significantly smaller than what has been previously predicted. Experimental temperature profiles in heated discharges are usually marginal with respect to this criterion. It is also shown that the eigenmodes are low frequency, low wavenumber ballooned modes, which may produce a very large transport once the threshold ion temperature gradient is reached

  13. Effect of dissipation on dynamical fusion thresholds

    International Nuclear Information System (INIS)

    Sierk, A.J.

    1986-01-01

    The existence of dynamical thresholds to fusion in heavy nuclei (A greater than or equal to 200) due to the nature of the potential-energy surface is shown. These thresholds exist even in the absence of dissipative forces, due to the coupling between the various collective deformation degrees of freedom. Using a macroscopic model of nuclear shape dynamics, it is shown how three different suggested dissipation mechanisms increase by varying amounts the excitation energy over the one-dimensional barrier required to cause compound-nucleus formation. The recently introduced surface-plus-window dissipation may give a reasonable representation of experimental data on fusion thresholds, in addition to properly describing fission-fragment kinetic energies and isoscalar giant multipole widths. Scaling of threshold results to asymmetric systems is discussed. 48 refs., 10 figs.

  14. Reward rate optimization in two-alternative decision making: empirical tests of theoretical predictions.

    Science.gov (United States)

    Simen, Patrick; Contreras, David; Buck, Cara; Hu, Peter; Holmes, Philip; Cohen, Jonathan D

    2009-12-01

    The drift-diffusion model (DDM) implements an optimal decision procedure for stationary, 2-alternative forced-choice tasks. The height of a decision threshold applied to accumulating information on each trial determines a speed-accuracy tradeoff (SAT) for the DDM, thereby accounting for a ubiquitous feature of human performance in speeded response tasks. However, little is known about how participants settle on particular tradeoffs. One possibility is that they select SATs that maximize a subjective rate of reward earned for performance. For the DDM, there exist unique, reward-rate-maximizing values for its threshold and starting point parameters in free-response tasks that reward correct responses (R. Bogacz, E. Brown, J. Moehlis, P. Holmes, & J. D. Cohen, 2006). These optimal values vary as a function of response-stimulus interval, prior stimulus probability, and relative reward magnitude for correct responses. We tested the resulting quantitative predictions regarding response time, accuracy, and response bias under these task manipulations and found that grouped data conformed well to the predictions of an optimally parameterized DDM.
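    The standard closed-form error rate and mean decision time of the unbiased DDM make the threshold-dependent speed-accuracy tradeoff easy to reproduce. The sketch below uses those textbook expressions to scan thresholds for the one maximizing a simple reward rate (1 − ER)/(DT + D), where D lumps non-decision time and the response-stimulus interval; parameter values are illustrative and the full task manipulations of the study are not modelled.

```python
# Textbook closed forms for the unbiased DDM (drift a, noise c, symmetric threshold z,
# starting point at zero), used to scan thresholds for the one maximizing a simple
# reward rate (1 - ER) / (DT + D). Parameter values are illustrative.
import math

def ddm_error_rate(drift, noise, z):
    return 1.0 / (1.0 + math.exp(2.0 * drift * z / noise ** 2))

def ddm_decision_time(drift, noise, z):
    return (z / drift) * math.tanh(drift * z / noise ** 2)

def reward_rate(drift, noise, z, dead_time):
    er = ddm_error_rate(drift, noise, z)
    dt = ddm_decision_time(drift, noise, z)
    return (1.0 - er) / (dt + dead_time)

drift, noise, dead_time = 0.1, 0.1, 2.0
best_z = max((z / 1000.0 for z in range(1, 500)),
             key=lambda z: reward_rate(drift, noise, z, dead_time))
print("reward-rate-maximizing threshold:", best_z,
      "reward rate:", reward_rate(drift, noise, best_z, dead_time))
```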

  15. LPI Optimization Framework for Target Tracking in Radar Network Architectures Using Information-Theoretic Criteria

    Directory of Open Access Journals (Sweden)

    Chenguang Shi

    2014-01-01

    Full Text Available Widely distributed radar network architectures can provide significant performance improvement for target detection and localization. For a fixed radar network, the achievable target detection performance may exceed a predetermined threshold when full transmitted power is allocated, but operating at full power makes the network extremely vulnerable in modern electronic warfare. In this paper, we study the problem of low probability of intercept (LPI) design for radar networks and propose two novel LPI optimization schemes based on information-theoretic criteria. For a predefined threshold of target detection, the Schleher intercept factor is minimized by optimizing transmission power allocation among netted radars in the network. Due to the lack of an analytical closed-form expression for the receiver operating characteristic (ROC), we employ two information-theoretic criteria, namely the Bhattacharyya distance and the J-divergence, as the metrics for target detection performance. The resulting nonconvex and nonlinear LPI optimization problems associated with different information-theoretic criteria are cast under a unified framework, and a nonlinear programming based genetic algorithm (NPGA) is used to tackle the optimization problems in the framework. Numerical simulations demonstrate that our proposed LPI strategies are effective in enhancing the LPI performance for radar network.
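    The Bhattacharyya distance used above as a surrogate detection metric has a well-known closed form for multivariate Gaussian statistics; the function below implements that textbook formula (not code from the cited paper), with toy target-absent/target-present statistics.

```python
# The Bhattacharyya distance between two multivariate Gaussians, the standard
# closed form often used as a surrogate for detection performance when the ROC
# has no analytical expression. This is the textbook formula, not code from the
# cited paper; the inputs below are toy numbers.
import numpy as np

def bhattacharyya_gaussian(mu0, cov0, mu1, cov1):
    mu0, mu1 = np.asarray(mu0, float), np.asarray(mu1, float)
    cov = 0.5 * (np.asarray(cov0, float) + np.asarray(cov1, float))
    diff = mu1 - mu0
    term1 = 0.125 * diff @ np.linalg.solve(cov, diff)
    term2 = 0.5 * np.log(np.linalg.det(cov) /
                         np.sqrt(np.linalg.det(cov0) * np.linalg.det(cov1)))
    return term1 + term2

# Target-absent vs target-present received-signal statistics (toy numbers):
print(bhattacharyya_gaussian([0, 0], np.eye(2), [1, 1], 1.5 * np.eye(2)))
```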

  16. Smartphone threshold audiometry in underserved primary health-care contexts.

    Science.gov (United States)

    Sandström, Josefin; Swanepoel, De Wet; Carel Myburgh, Hermanus; Laurent, Claude

    2016-01-01

    To validate a calibrated smartphone-based hearing test in a sound booth environment and in primary health-care clinics. A repeated-measure within-subject study design was employed whereby air-conduction hearing thresholds determined by smartphone-based audiometry was compared to conventional audiometry in a sound booth and a primary health-care clinic environment. A total of 94 subjects (mean age 41 years ± 17.6 SD and range 18-88; 64% female) were assessed of whom 64 were tested in the sound booth and 30 within primary health-care clinics without a booth. In the sound booth 63.4% of conventional and smartphone thresholds indicated normal hearing (≤15 dBHL). Conventional thresholds exceeding 15 dB HL corresponded to smartphone thresholds within ≤10 dB in 80.6% of cases with an average threshold difference of -1.6 dB ± 9.9 SD. In primary health-care clinics 13.7% of conventional and smartphone thresholds indicated normal hearing (≤15 dBHL). Conventional thresholds exceeding 15 dBHL corresponded to smartphone thresholds within ≤10 dB in 92.9% of cases with an average threshold difference of -1.0 dB ± 7.1 SD. Accurate air-conduction audiometry can be conducted in a sound booth and without a sound booth in an underserved community health-care clinic using a smartphone.

  17. BP neural network optimized by genetic algorithm approach for titanium and iron content prediction in EDXRF

    International Nuclear Information System (INIS)

    Wang Jun; Liu Mingzhe; Li Zhe; Li Lei; Shi Rui; Tuo Xianguo

    2015-01-01

    Quantitative elemental content analysis with the energy dispersive X-ray fluorescence (EDXRF) technique is difficult because of the uniform effect, the particle effect, the element matrix effect, etc. In this paper, a hybrid approach combining a genetic algorithm (GA) and a back propagation (BP) neural network was proposed that does not require modelling the complex relationship between concentration and intensity. The aim of the GA-optimized BP was to obtain better initial network weights and thresholds. The basic idea was that the reciprocal of the mean square error of the initialized BP neural network was set as the fitness value of an individual in the GA, the initial weights and thresholds were encoded as individuals, and the optimal individual was then sought by selection, crossover and mutation operations; finally, a new BP neural network model was created with the optimal initial weights and thresholds. The calculation results of quantitative analysis of titanium and iron contents for five types of ore bodies in the Panzhihua Mine show that the results of classification prediction are far better than those of overall forecasting, and the relative errors of 76.7% of the samples are less than 2% compared with chemical analysis values, which demonstrates the effectiveness of the proposed method. (authors)
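    A compact sketch of the GA initialization step described in this record is given below: individuals encode the initial weights and biases ("thresholds") of a small one-hidden-layer network, fitness is the reciprocal of the network's mean square error, and selection, crossover and mutation search for a good initialization. The data, layer sizes and GA settings are placeholders, and the subsequent back-propagation training of the winning network is omitted.

```python
# Sketch of the GA step described above: individuals encode the initial weights and
# biases ("thresholds") of a small one-hidden-layer network, fitness is the reciprocal
# of the network's MSE, and selection/crossover/mutation search for a good initialization.
# Data, layer sizes and GA settings are placeholders; BP training of the winner is omitted.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 4))                            # e.g. normalized XRF peak intensities
y = (X @ np.array([0.4, -0.2, 0.3, 0.1]))[:, None]          # placeholder "content" targets
n_in, n_hid, n_out = 4, 6, 1
n_genes = n_in * n_hid + n_hid + n_hid * n_out + n_out      # weights + biases of both layers

def unpack(ind):
    i = 0
    w1 = ind[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = ind[i:i + n_hid]; i += n_hid
    w2 = ind[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    return w1, b1, w2, ind[i:]

def fitness(ind):
    w1, b1, w2, b2 = unpack(ind)
    pred = np.tanh(X @ w1 + b1) @ w2 + b2
    return 1.0 / np.mean((pred - y) ** 2)                   # reciprocal of MSE, as in the description

pop = rng.normal(0, 0.5, (40, n_genes))
for generation in range(100):
    fit = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(fit)][-20:]                    # truncation selection
    children = []
    for _ in range(20):
        a, b = parents[rng.integers(20)], parents[rng.integers(20)]
        mask = rng.random(n_genes) < 0.5                    # uniform crossover
        children.append(np.where(mask, a, b) + rng.normal(0, 0.05, n_genes))  # mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best initial-network MSE:", 1.0 / fitness(best))
```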

  18. Is the diagnostic threshold for bulimia nervosa clinically meaningful?

    Science.gov (United States)

    Chapa, Danielle A N; Bohrer, Brittany K; Forbush, Kelsie T

    2018-01-01

    The DSM-5 differentiates full- and sub-threshold bulimia nervosa (BN) according to average weekly frequencies of binge eating and inappropriate compensatory behaviors. This study was the first to evaluate the modified frequency criterion for BN published in the DSM-5. The purpose of this study was to test whether community-recruited adults (N=125; 83.2% women) with current full-threshold (n=77) or sub-threshold BN (n=48) differed in comorbid psychopathology and eating disorder (ED) illness duration, symptom severity, and clinical impairment. Participants completed the Clinical Impairment Assessment and participated in semi-structured clinical interviews of ED- and non-ED psychopathology. Differences between the sub- and full-threshold BN groups were assessed using MANOVA and Chi-square analyses. ED illness duration, age-of-onset, body mass index (BMI), alcohol and drug misuse, and the presence of current and lifetime mood or anxiety disorders did not differ between participants with sub- and full-threshold BN. Participants with full-threshold BN had higher levels of clinical impairment and weight concern than those with sub-threshold BN. However, minimal clinically important difference analyses suggested that statistically significant differences between participants with sub- and full-threshold BN on clinical impairment and weight concern were not clinically significant. In conclusion, sub-threshold BN did not differ from full-threshold BN in clinically meaningful ways. Future studies are needed to identify an improved frequency criterion for BN that better distinguishes individuals in ways that will more validly inform prognosis and effective treatment planning for BN. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. An Enhanced Artificial Bee Colony Algorithm with Solution Acceptance Rule and Probabilistic Multisearch.

    Science.gov (United States)

    Yurtkuran, Alkın; Emel, Erdal

    2016-01-01

    The artificial bee colony (ABC) algorithm is a popular swarm based technique, which is inspired from the intelligent foraging behavior of honeybee swarms. This paper proposes a new variant of ABC algorithm, namely, enhanced ABC with solution acceptance rule and probabilistic multisearch (ABC-SA) to address global optimization problems. A new solution acceptance rule is proposed where, instead of greedy selection between old solution and new candidate solution, worse candidate solutions have a probability to be accepted. Additionally, the acceptance probability of worse candidates is nonlinearly decreased throughout the search process adaptively. Moreover, in order to improve the performance of the ABC and balance the intensification and diversification, a probabilistic multisearch strategy is presented. Three different search equations with distinctive characters are employed using predetermined search probabilities. By implementing a new solution acceptance rule and a probabilistic multisearch approach, the intensification and diversification performance of the ABC algorithm is improved. The proposed algorithm has been tested on well-known benchmark functions of varying dimensions by comparing against novel ABC variants, as well as several recent state-of-the-art algorithms. Computational results show that the proposed ABC-SA outperforms other ABC variants and is superior to state-of-the-art algorithms proposed in the literature.
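    The abstract does not quote the exact acceptance expression, so the sketch below shows one plausible form of such a rule: a worse candidate is accepted with a probability that decreases nonlinearly as the search progresses, giving diversification early in the run and near-greedy intensification late.

```python
# One plausible form of the solution acceptance rule described above (the exact
# expression used by ABC-SA is not quoted in the abstract): a worse candidate is
# accepted with a probability that decreases nonlinearly as the search progresses.
import math, random

def accept(candidate_cost, current_cost, iteration, max_iterations, p0=0.3, decay=3.0):
    """Return True if the candidate replaces the current solution (minimization)."""
    if candidate_cost <= current_cost:
        return True                                    # better solutions always accepted
    progress = iteration / max_iterations
    p_accept_worse = p0 * math.exp(-decay * progress)  # nonlinearly decreasing probability
    return random.random() < p_accept_worse

# Early on, worse candidates are sometimes kept (diversification); late in the run
# the rule behaves almost greedily (intensification).
print(sum(accept(1.1, 1.0, it, 1000) for it in (10, 500, 990)))
```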

  20. An Enhanced Artificial Bee Colony Algorithm with Solution Acceptance Rule and Probabilistic Multisearch

    Directory of Open Access Journals (Sweden)

    Alkın Yurtkuran

    2016-01-01

    Full Text Available The artificial bee colony (ABC) algorithm is a popular swarm based technique, which is inspired from the intelligent foraging behavior of honeybee swarms. This paper proposes a new variant of ABC algorithm, namely, enhanced ABC with solution acceptance rule and probabilistic multisearch (ABC-SA) to address global optimization problems. A new solution acceptance rule is proposed where, instead of greedy selection between old solution and new candidate solution, worse candidate solutions have a probability to be accepted. Additionally, the acceptance probability of worse candidates is nonlinearly decreased throughout the search process adaptively. Moreover, in order to improve the performance of the ABC and balance the intensification and diversification, a probabilistic multisearch strategy is presented. Three different search equations with distinctive characters are employed using predetermined search probabilities. By implementing a new solution acceptance rule and a probabilistic multisearch approach, the intensification and diversification performance of the ABC algorithm is improved. The proposed algorithm has been tested on well-known benchmark functions of varying dimensions by comparing against novel ABC variants, as well as several recent state-of-the-art algorithms. Computational results show that the proposed ABC-SA outperforms other ABC variants and is superior to state-of-the-art algorithms proposed in the literature.

  1. A study of optimization techniques in HDR brachytherapy for the prostate

    Science.gov (United States)

    Pokharel, Ghana Shyam

    . Based on our study, DVH based objective function performed better than traditional variance based objective function in creating a clinically acceptable plan when executed under identical conditions. Thirdly, we studied the multiobjective optimization strategy using both DVH and variance based objective functions. The optimization strategy was to create several Pareto optimal solutions by scanning the clinically relevant part of the Pareto front. This strategy was adopted to decouple optimization from decision such that user could select final solution from the pool of alternative solutions based on his/her clinical goals. The overall quality of treatment plan improved using this approach compared to traditional class solution approach. In fact, the final optimized plan selected using decision engine with DVH based objective was comparable to typical clinical plan created by an experienced physicist. Next, we studied the hybrid technique comprising both stochastic and deterministic algorithm to optimize both dwell positions and dwell times. The simulated annealing algorithm was used to find optimal catheter distribution and the DVH based algorithm was used to optimize 3D dose distribution for given catheter distribution. This unique treatment planning and optimization tool was capable of producing clinically acceptable highly reproducible treatment plans in clinically reasonable time. As this algorithm was able to create clinically acceptable plans within clinically reasonable time automatically, it is really appealing for real time procedures. Next, we studied the feasibility of multiobjective optimization using evolutionary algorithm for real time HDR brachytherapy for the prostate. The algorithm with properly tuned algorithm specific parameters was able to create clinically acceptable plans within clinically reasonable time. However, the algorithm was let to run just for limited number of generations not considered optimal, in general, for such algorithms. This was

  2. A new FPGA-based time-over-threshold system for the time of flight detectors at the BGO-OD experiment

    Energy Technology Data Exchange (ETDEWEB)

    Freyermuth, Oliver [Physikalisches Institut, Nussallee 12, D-53115 Bonn (Germany); Collaboration: BGO-OD-Collaboration

    2015-07-01

    The BGO-OD experiment at the ELSA accelerator facility at Bonn is built for the systematic investigation of meson photoproduction in the GeV region. It features the unique combination of a central, highly segmented BGO crystal calorimeter covering almost 4π in acceptance and a forward magnetic spectrometer complemented by time of flight walls. The readout of the ToF scintillator bars was upgraded to an FPGA-based VME-board equipped with discriminator mezzanines including per-channel remotely adjustable thresholds. A firmware was developed combining a time-over-threshold (ToT) measurement by implementing a dual-edge TDC, a configurable meantimer trigger logic including a special cosmics trigger, adjustable input delays and gateable scalers, all inside a single electronics module. An experimentally obtained relation between ToT and slope of a PMT signal can be used for a time walk correction to achieve time resolutions comparable to a classical chain of CFD and standard TDC. Additionally, the time-over-threshold information can be exploited for gain matching and allows to monitor online the gain-stability and check for electronics problems such as pulse reflections or baseline jitter. The system is well-suited for a wide range of PMT-based fast detectors with many channels and further applications foreseen.

  3. NEUTRON SPECTRUM MEASUREMENTS USING MULTIPLE THRESHOLD DETECTORS

    Energy Technology Data Exchange (ETDEWEB)

    Gerken, William W.; Duffey, Dick

    1963-11-15

    From American Nuclear Society Meeting, New York, Nov. 1963. The use of threshold detectors, which simultaneously undergo reactions with thermal neutrons and two or more fast neutron threshold reactions, was applied to measurements of the neutron spectrum in a reactor. A number of different materials were irradiated to determine the most practical ones for use as multiple threshold detectors. These results, as well as counting techniques and corrections, are presented. Some materials used include aluminum, alloys of Al-Ni, aluminum-nickel oxides, and magnesium orthophosphates. (auth)

  4. Cost-effectiveness thresholds: pros and cons.

    Science.gov (United States)

    Bertram, Melanie Y; Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-12-01

    Cost-effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost-effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost-effectiveness thresholds allow cost-effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization's Commission on Macroeconomics in Health suggested cost-effectiveness thresholds based on multiples of a country's per-capita gross domestic product (GDP). In some contexts, in choosing which health interventions to fund and which not to fund, these thresholds have been used as decision rules. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity and this - in addition to uncertainty in the modelled cost-effectiveness ratios - can lead to the wrong decision on how to spend health-care resources. Cost-effectiveness information should be used alongside other considerations - e.g. budget impact and feasibility considerations - in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost-effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in, for example the involvement of civil society organizations and patient groups, and is transparent, consistent and fair.
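    The GDP-based decision rule that the article critiques reduces, in its simplest form, to comparing an intervention's incremental cost-effectiveness ratio (ICER) with one and three times GDP per capita; the sketch below encodes that rule with placeholder numbers purely to make the object of the critique concrete.

```python
# The GDP-based decision rule discussed (and critiqued) above, in its simplest form:
# compare an intervention's incremental cost-effectiveness ratio (ICER) with one and
# three times GDP per capita. Numbers are placeholders; the abstract argues this rule
# should not be used in isolation.
def classify_icer(delta_cost, delta_dalys_averted, gdp_per_capita):
    icer = delta_cost / delta_dalys_averted
    if icer < gdp_per_capita:
        label = "highly cost-effective (ICER < 1x GDP per capita)"
    elif icer < 3 * gdp_per_capita:
        label = "cost-effective (ICER < 3x GDP per capita)"
    else:
        label = "not cost-effective under the GDP-based rule"
    return icer, label

print(classify_icer(delta_cost=250_000.0, delta_dalys_averted=120.0, gdp_per_capita=1_800.0))
```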

  5. Cost–effectiveness thresholds: pros and cons

    Science.gov (United States)

    Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-01-01

    Abstract Cost–effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost–effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost–effectiveness thresholds allow cost–effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization’s Commission on Macroeconomics in Health suggested cost–effectiveness thresholds based on multiples of a country’s per-capita gross domestic product (GDP). In some contexts, in choosing which health interventions to fund and which not to fund, these thresholds have been used as decision rules. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity and this – in addition to uncertainty in the modelled cost–effectiveness ratios – can lead to the wrong decision on how to spend health-care resources. Cost–effectiveness information should be used alongside other considerations – e.g. budget impact and feasibility considerations – in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost–effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in, for example the involvement of civil society organizations and patient groups, and is transparent, consistent and fair. PMID:27994285

  6. Length scale and manufacturability in density-based topology optimization

    DEFF Research Database (Denmark)

    Lazarov, Boyan Stefanov; Wang, Fengwen; Sigmund, Ole

    2016-01-01

    Since its original introduction in structural design, density-based topology optimization has been applied to a number of other fields such as microelectromechanical systems, photonics, acoustics and fluid mechanics. The methodology has been well accepted in industrial design processes where it can...... provide competitive designs in terms of cost, materials and functionality under a wide set of constraints. However, the optimized topologies are often considered as conceptual due to loosely defined topologies and the need of postprocessing. Subsequent amendments can affect the optimized design...

  7. In acceptance we trust? Conceptualising acceptance as a viable approach to NGO security management.

    Science.gov (United States)

    Fast, Larissa A; Freeman, C Faith; O'Neill, Michael; Rowley, Elizabeth

    2013-04-01

    This paper documents current understanding of acceptance as a security management approach and explores issues and challenges non-governmental organisations (NGOs) confront when implementing an acceptance approach to security management. It argues that the failure of organisations to systematise and clearly articulate acceptance as a distinct security management approach and a lack of organisational policies and procedures concerning acceptance hinder its efficacy as a security management approach. The paper identifies key and cross-cutting components of acceptance that are critical to its effective implementation in order to advance a comprehensive and systematic concept of acceptance. The key components of acceptance illustrate how organisational and staff functions affect positively or negatively an organisation's acceptance, and include: an organisation's principles and mission, communications, negotiation, programming, relationships and networks, stakeholder and context analysis, staffing, and image. The paper contends that acceptance is linked not only to good programming, but also to overall organisational management and structures. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.

  8. Summary report of a workshop on establishing cumulative effects thresholds : a suggested approach for establishing cumulative effects thresholds in a Yukon context

    International Nuclear Information System (INIS)

    2003-01-01

    Increasingly, thresholds are being used as a land and cumulative effects assessment and management tool. To assist in the management of wildlife species such as woodland caribou, the Department of Indian and Northern Affairs (DIAND) Environment Directorate, Yukon sponsored a workshop to develop and use cumulative thresholds in the Yukon. The approximately 30 participants reviewed recent initiatives in the Yukon and other jurisdictions. The workshop is expected to help formulate a strategic vision for implementing cumulative effects thresholds in the Yukon. The key to success resides in building relationships with Umbrella Final Agreement (UFA) Boards, the Development Assessment Process (DAP), and the Yukon Environmental and Socio-Economic Assessment Act (YESAA). Broad support is required within an integrated resource management framework. The workshop featured discussions on current science and theory of cumulative effects thresholds. Potential data and implementation issues were also discussed. It was concluded that thresholds are useful and scientifically defensible. The threshold research results obtained in Alberta, British Columbia and the Northwest Territories are applicable to the Yukon. One of the best tools for establishing and tracking thresholds is habitat effectiveness. Effects must be monitored and tracked. Biologists must share their information with decision makers. Interagency coordination and assistance should be facilitated through the establishment of working groups. Regional land use plans should include thresholds. 7 refs.

  9. Ultralow percolation threshold of single walled carbon nanotube-epoxy composites synthesized via an ionic liquid dispersant/initiator

    International Nuclear Information System (INIS)

    Watters, Arianna L; Palmese, Giuseppe R

    2014-01-01

    Uniform dispersion of single walled carbon nanotubes (SWNTs) in an epoxy was achieved by a streamlined mechano-chemical processing method. SWNT-epoxy composites were synthesized using a room temperature ionic liquid (IL) with an imidazolium cation and dicyanamide anion. The novel approach of using an ionic liquid that behaves as a dispersant for SWNTs and an initiator for epoxy polymerization greatly simplifies nanocomposite synthesis. The material was processed using simple and scalable three roll milling. The SWNT dispersion of the resultant composite was evaluated by electron microscopy and electrical conductivity measurements in conjunction with percolation theory. Processing conditions were optimized to achieve the lowest possible percolation threshold, 4.29 × 10⁻⁵ volume fraction SWNTs. This percolation threshold is among the best reported in the literature, yet it was obtained using a streamlined method that greatly simplifies processing. (paper)
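    The record says the percolation threshold was extracted from conductivity measurements in conjunction with percolation theory; a common way to do this is to fit the standard power law sigma = sigma0*(phi - phi_c)^t above the threshold. The sketch below shows such a fit with made-up data; it illustrates the approach, not the authors' analysis.

```python
# Minimal sketch: estimate a percolation threshold phi_c by fitting the standard
# power law sigma = sigma0 * (phi - phi_c)**t to conductivity data above threshold.
# Data values are illustrative, not those of the SWNT-epoxy study.
import numpy as np
from scipy.optimize import curve_fit

phi = np.array([6e-5, 1e-4, 3e-4, 1e-3, 3e-3, 1e-2])      # SWNT volume fraction
sigma = np.array([1e-8, 5e-7, 2e-5, 3e-4, 2e-3, 1e-2])    # conductivity (S/cm)

def percolation_law(phi, sigma0, phi_c, t):
    return sigma0 * np.clip(phi - phi_c, 1e-12, None) ** t

def log_model(phi, sigma0, phi_c, t):
    # Fit in log space so the many decades of conductivity are weighted evenly.
    return np.log10(percolation_law(phi, sigma0, phi_c, t))

p0 = (1.0, 4e-5, 2.0)  # initial guesses: prefactor, threshold, critical exponent
params, _ = curve_fit(log_model, phi, np.log10(sigma), p0=p0, maxfev=10000)
sigma0, phi_c, t = params
print(f"estimated percolation threshold phi_c = {phi_c:.2e}, exponent t = {t:.2f}")
```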

  10. Identifying Threshold Concepts for Information Literacy: A Delphi Study

    OpenAIRE

    Lori Townsend; Amy R. Hofer; Silvia Lin Hanick; Korey Brunetti

    2016-01-01

    This study used the Delphi method to engage expert practitioners on the topic of threshold concepts for information literacy. A panel of experts considered two questions. First, is the threshold concept approach useful for information literacy instruction? The panel unanimously agreed that the threshold concept approach holds potential for information literacy instruction. Second, what are the threshold concepts for information literacy instruction? The panel proposed and discussed over fift...

  11. Melanin microcavitation threshold in the near infrared

    Science.gov (United States)

    Schmidt, Morgan S.; Kennedy, Paul K.; Vincelette, Rebecca L.; Schuster, Kurt J.; Noojin, Gary D.; Wharmby, Andrew W.; Thomas, Robert J.; Rockwell, Benjamin A.

    2014-02-01

    Thresholds for microcavitation of isolated bovine and porcine melanosomes were determined using single nanosecond (ns) laser pulses in the NIR (1000 - 1319 nm) wavelength regime. Average fluence thresholds for microcavitation increased non-linearly with increasing wavelength. Average fluence thresholds were also measured for 10-ns pulses at 532 nm, and found to be comparable to visible ns pulse values published in previous reports. Fluence thresholds were used to calculate melanosome absorption coefficients, which decreased with increasing wavelength. This trend was found to be comparable to the decrease in retinal pigmented epithelial (RPE) layer absorption coefficients reported over the same wavelength region. Estimated corneal total intraocular energy (TIE) values were determined and compared to the current and proposed maximum permissible exposure (MPE) safe exposure levels. Results from this study support the proposed changes to the MPE levels.

  12. Two-Dimensional IIR Filter Design Using Simulated Annealing Based Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Supriya Dhabal

    2014-01-01

    Full Text Available We present a novel hybrid algorithm based on particle swarm optimization (PSO) and simulated annealing (SA) for the design of two-dimensional recursive digital filters. The proposed method, known as SA-PSO, integrates the global search ability of PSO with the local search ability of SA, and each offsets the weaknesses of the other. The Metropolis acceptance criterion is included in the basic PSO algorithm to increase the swarm’s diversity by sometimes also accepting weaker solutions. The experimental results reveal that the performance of the optimal filter designed by the proposed SA-PSO method is improved. Further, the convergence behavior as well as the optimization accuracy of the proposed method has been improved significantly, and the computational time is also reduced. In addition, the proposed SA-PSO method produces the best optimal solution with lower mean and variance, which indicates that the algorithm can be used more efficiently in realizing two-dimensional digital filters.
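    For reference, the Metropolis acceptance criterion mentioned in the abstract has a simple form: a worse candidate is still accepted with probability exp(-delta/T). The sketch below is a minimal, generic version, not the authors' exact SA-PSO implementation.

```python
# Sketch of the Metropolis acceptance criterion used to let a swarm sometimes
# accept worse solutions, as described for the hybrid SA-PSO (illustrative only).
import math
import random

def metropolis_accept(current_cost: float, candidate_cost: float, temperature: float) -> bool:
    """Accept a candidate solution; worse candidates are accepted with
    probability exp(-(candidate_cost - current_cost) / temperature)."""
    if candidate_cost <= current_cost:
        return True
    return random.random() < math.exp(-(candidate_cost - current_cost) / temperature)

# Example: at high temperature a slightly worse filter design is often accepted,
# which keeps swarm diversity; as the temperature is annealed the rule tightens.
random.seed(0)
print(metropolis_accept(current_cost=1.00, candidate_cost=1.05, temperature=0.5))
print(metropolis_accept(current_cost=1.00, candidate_cost=1.05, temperature=0.01))
```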

  13. Resonances, cusp effects and a virtual state in e⁻-He scattering near the n = 3 thresholds. [Variational methods, resonance, threshold structures]

    Energy Technology Data Exchange (ETDEWEB)

    Nesbet, R K [International Business Machines Corp., San Jose, Calif. (USA). Research Lab.

    1978-01-14

    Variational calculations locate and identify resonances and new threshold structures in electron impact excitation of He metastable states, in the region of the 3³S and 3¹S excitation thresholds. A virtual state is found at the 3³S threshold.

  14. Particle Swarm Optimization with Various Inertia Weight Variants for Optimal Power Flow Solution

    Directory of Open Access Journals (Sweden)

    Prabha Umapathy

    2010-01-01

    Full Text Available This paper proposes an efficient method to solve the optimal power flow problem in power systems using Particle Swarm Optimization (PSO). The objective of the proposed method is to find the steady-state operating point which minimizes the fuel cost, while maintaining an acceptable system performance in terms of limits on generator power, line flow, and voltage. Three different inertia weights, a constant inertia weight (CIW), a time-varying inertia weight (TVIW), and a global-local best inertia weight (GLbestIW), are considered with the particle swarm optimization algorithm to analyze the impact of the inertia weight on the performance of the PSO algorithm. The PSO algorithm is simulated for each of the methods individually. It is observed that the PSO algorithm with the proposed inertia weight yields better results, both in terms of the optimal solution and faster convergence. The proposed method has been tested on the standard IEEE 30-bus test system to prove its efficacy. The algorithm is computationally faster, in terms of the number of load flows executed, and provides better results than other heuristic techniques.
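    Two of the inertia-weight variants named above have standard closed forms; a minimal sketch follows. The constant (CIW) and linearly decreasing time-varying (TVIW) weights are the usual textbook definitions with assumed bounds (0.9 to 0.4); the GLbestIW formula proposed in the paper is not reproduced here.

```python
# Sketch of two inertia-weight schedules used in PSO velocity updates.
# The constant (CIW) and time-varying (TVIW) forms are standard; the exact
# constants and the GLbestIW formula of the paper are not reproduced here.

def constant_inertia_weight(w: float = 0.7) -> float:
    """CIW: the same inertia weight at every iteration."""
    return w

def time_varying_inertia_weight(iteration: int, max_iterations: int,
                                w_max: float = 0.9, w_min: float = 0.4) -> float:
    """TVIW: linearly decrease the weight from w_max to w_min over the run,
    favouring exploration early and exploitation late."""
    return w_max - (w_max - w_min) * iteration / max_iterations

# The weight scales the previous velocity in the PSO update:
# v_new = w * v_old + c1*r1*(pbest - x) + c2*r2*(gbest - x)
for it in (0, 50, 100):
    print(it, round(time_varying_inertia_weight(it, 100), 3))
```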

  15. Modeling length of stay as an optimized two-class prediction problem

    NARCIS (Netherlands)

    Verduijn, M.; Peek, N.; Voorbraak, F.; de Jonge, E.; de Mol, B. A. J. M.

    2007-01-01

    Objectives. To develop a predictive model for the outcome length of stay at the Intensive Care Unit (ICU LOS), including the choice of an optimal dichotomization threshold for this outcome. Reduction of prediction problems of this type of outcome to a two-class problem is a common strategy to

  16. CTOD-based acceptance criteria for heat exchanger head staybolts

    International Nuclear Information System (INIS)

    Lam, P.S.; Sindelar, R.L.; Barnes, D.M.; Awadalla, N.G.

    1992-01-01

    The primary coolant piping system of the Savannah River Site (SRS) reactors contains twelve heat exchangers to remove the waste heat from the nuclear materials production. A large break at the inlet or outlet heads of the heat exchangers would occur if the restraint members of the heads become inactive. The heat exchanger head is attached to the tubesheet by 84 staybolts. The structural integrity of the heads is demonstrated by showing the redundant capacity of the staybolts to restrain the head at design conditions and under seismic loadings. The heat exchanger head is analyzed with a three-dimensional finite element model. The restraint provided by the staybolts is evaluated for several postulated cases of inactive or missing staybolts, that is, bolts that have a flaw exceeding the ultrasonic testing (UT) threshold depth of 25% of the bolt diameter. A limit of 6 inactive staybolts is reached with a fracture criterion based on the maximum allowable local displacement at the active staybolts which corresponds to the crack tip opening displacement (CTOD) of 0.032 inches. An acceptance criteria methodology has been developed to disposition flaws reported in the staybolt inspections while ensuring adequate restraint capacity of the staybolts to maintain integrity of the heat exchanger heads against collapse. The methodology includes an approach for the baseline and periodic inspections of the staybolts. A total of up to 6 staybolts, reported as containing flaws with depths at or exceeding 25%, would be acceptable in the heat exchanger

  17. Ambient high temperature and mortality in Jinan, China: A study of heat thresholds and vulnerable populations.

    Science.gov (United States)

    Li, Jing; Xu, Xin; Yang, Jun; Liu, Zhidong; Xu, Lei; Gao, Jinghong; Liu, Xiaobo; Wu, Haixia; Wang, Jun; Yu, Jieqiong; Jiang, Baofa; Liu, Qiyong

    2017-07-01

    Understanding the health consequences of continuously rising temperatures - as is projected for China - is important in terms of developing heat-health adaptation and intervention programs. This study aimed to examine the association between mortality and daily maximum (Tmax), mean (Tmean), and minimum (Tmin) temperatures in warmer months; to explore threshold temperatures; and to identify optimal heat indicators and vulnerable populations. Daily data on temperature and mortality were obtained for the period 2007-2013. Heat thresholds for condition-specific mortality were estimated using an observed/expected analysis. We used a generalised additive model with a quasi-Poisson distribution to examine the association between mortality and Tmax/Tmin/Tmean values higher than the threshold values, after adjustment for covariates. Tmax/Tmean/Tmin thresholds were 32/28/24°C for non-accidental deaths; 32/28/24°C for cardiovascular deaths; 35/31/26°C for respiratory deaths; and 34/31/28°C for diabetes-related deaths. For each 1°C increase in Tmax/Tmean/Tmin above the threshold, the mortality risk of non-accidental, cardiovascular, respiratory, and diabetes-related death increased by 2.8/5.3/4.8%, 4.1/7.2/6.6%, 6.6/25.3/14.7%, and 13.3/30.5/47.6%, respectively. Thresholds for mortality differed according to health condition when stratified by sex, age, and education level. For non-accidental deaths, effects were significant in individuals aged ≥65 years (relative risk=1.038, 95% confidence interval: 1.026-1.050), but not for those ≤64 years. For most outcomes, women and people ≥65 years were more vulnerable. High temperature significantly increases the risk of mortality in the population of Jinan, China. Climate change, with rising temperatures, may make the situation worse. Public health programs should be improved and implemented to prevent and reduce health risks during hot days, especially for the identified vulnerable groups. Copyright
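    A minimal sketch of the underlying modelling idea, regressing daily death counts on degrees above a heat threshold with a Poisson log-linear term, is shown below with synthetic data; the study's generalised additive model, quasi-Poisson variance and covariate adjustment are not reproduced.

```python
# Sketch: Poisson regression of daily deaths on heat above a threshold.
# Synthetic data; the study's GAM with quasi-Poisson errors and covariates
# (trend, humidity, day of week, etc.) is not reproduced here.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
tmax = rng.uniform(20, 40, size=500)                   # daily maximum temperature (deg C)
threshold = 32.0
true_rate = np.exp(3.0 + 0.03 * np.maximum(tmax - threshold, 0.0))
deaths = rng.poisson(true_rate)                        # observed daily death counts

def neg_log_likelihood(params):
    intercept, beta = params
    heat = np.maximum(tmax - threshold, 0.0)           # excess degrees above threshold
    mu = np.exp(intercept + beta * heat)
    return -np.sum(deaths * np.log(mu) - mu)           # Poisson log-likelihood (up to a constant)

fit = minimize(neg_log_likelihood, x0=[1.0, 0.0], method="Nelder-Mead")
intercept, beta = fit.x
print(f"each 1 deg C above {threshold} deg C multiplies mortality by {np.exp(beta):.3f}")
```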

  18. Regulatory acceptance and use of 3R models for pharmaceuticals and chemicals: expert opinions on the state of affairs and the way forward.

    Science.gov (United States)

    Schiffelers, Marie-Jeanne W A; Blaauboer, Bas J; Bakker, Wieger E; Beken, Sonja; Hendriksen, Coenraad F M; Koëter, Herman B W M; Krul, Cyrille

    2014-06-01

    Pharmaceuticals and chemicals are subjected to regulatory safety testing accounting for approximately 25% of laboratory animal use in Europe. This testing meets various objections and has led to the development of a range of 3R models to Replace, Reduce or Refine the animal models. However, these models must overcome many barriers before being accepted for regulatory risk management purposes. This paper describes the barriers and drivers and options to optimize this acceptance process as identified by two expert panels, one on pharmaceuticals and one on chemicals. To untangle the complex acceptance process, the multilevel perspective on technology transitions is applied. This perspective defines influences at the micro-, meso- and macro level which need alignment to induce regulatory acceptance of a 3R model. This paper displays that there are many similar mechanisms within both sectors that prevent 3R models from becoming accepted for regulatory risk assessment and management. Shared barriers include the uncertainty about the value of the new 3R models (micro level), the lack of harmonization of regulatory requirements and acceptance criteria (meso level) and the high levels of risk aversion (macro level). In optimizing the process commitment, communication, cooperation and coordination are identified as critical drivers. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. HOW TO DEAL WITH WASTE ACCEPTANCE UNCERTAINTY USING THE WASTE ACCEPTANCE CRITERIA FORECASTING AND ANALYSIS CAPABILITY SYSTEM (WACFACS)

    Energy Technology Data Exchange (ETDEWEB)

    Redus, K. S.; Hampshire, G. J.; Patterson, J. E.; Perkins, A. B.

    2002-02-25

    The Waste Acceptance Criteria Forecasting and Analysis Capability System (WACFACS) is used to plan for, evaluate, and control the supply of approximately 1.8 million yd³ of low-level radioactive, TSCA, and RCRA hazardous wastes from over 60 environmental restoration projects from FY02 through FY10 to the Oak Ridge Environmental Management Waste Management Facility (EMWMF). WACFACS is a validated decision support tool that propagates uncertainties inherent in site-related contaminant characterization data, disposition volumes during EMWMF operations, and project schedules to quantitatively determine the confidence that risk-based performance standards are met. Trade-offs in schedule, volumes of waste lots, and allowable concentrations of contaminants are performed to optimize project waste disposition, regulatory compliance, and disposal cell management.

  20. HOW TO DEAL WITH WASTE ACCEPTANCE UNCERTAINTY USING THE WASTE ACCEPTANCE CRITERIA FORECASTING AND ANALYSIS CAPABILITY SYSTEM (WACFACS)

    International Nuclear Information System (INIS)

    Redus, K. S.; Hampshire, G. J.; Patterson, J. E.; Perkins, A. B.

    2002-01-01

    The Waste Acceptance Criteria Forecasting and Analysis Capability System (WACFACS) is used to plan for, evaluate, and control the supply of approximately 1.8 million yd³ of low-level radioactive, TSCA, and RCRA hazardous wastes from over 60 environmental restoration projects from FY02 through FY10 to the Oak Ridge Environmental Management Waste Management Facility (EMWMF). WACFACS is a validated decision support tool that propagates uncertainties inherent in site-related contaminant characterization data, disposition volumes during EMWMF operations, and project schedules to quantitatively determine the confidence that risk-based performance standards are met. Trade-offs in schedule, volumes of waste lots, and allowable concentrations of contaminants are performed to optimize project waste disposition, regulatory compliance, and disposal cell management

  1. Effects of pulse duration on magnetostimulation thresholds

    Energy Technology Data Exchange (ETDEWEB)

    Saritas, Emine U., E-mail: saritas@ee.bilkent.edu.tr [Department of Bioengineering, University of California, Berkeley, Berkeley, California 94720-1762 (United States); Department of Electrical and Electronics Engineering, Bilkent University, Bilkent, Ankara 06800 (Turkey); National Magnetic Resonance Research Center (UMRAM), Bilkent University, Bilkent, Ankara 06800 (Turkey); Goodwill, Patrick W. [Department of Bioengineering, University of California, Berkeley, Berkeley, California 94720-1762 (United States); Conolly, Steven M. [Department of Bioengineering, University of California, Berkeley, Berkeley, California 94720-1762 (United States); Department of EECS, University of California, Berkeley, California 94720-1762 (United States)

    2015-06-15

    Purpose: Medical imaging techniques such as magnetic resonance imaging and magnetic particle imaging (MPI) utilize time-varying magnetic fields that are subject to magnetostimulation limits, which often limit the speed of the imaging process. Various human-subject experiments have studied the amplitude and frequency dependence of these thresholds for gradient or homogeneous magnetic fields. Another contributing factor was shown to be number of cycles in a magnetic pulse, where the thresholds decreased with longer pulses. The latter result was demonstrated on two subjects only, at a single frequency of 1.27 kHz. Hence, whether the observed effect was due to the number of cycles or due to the pulse duration was not specified. In addition, a gradient-type field was utilized; hence, whether the same phenomenon applies to homogeneous magnetic fields remained unknown. Here, the authors investigate the pulse duration dependence of magnetostimulation limits for a 20-fold range of frequencies using homogeneous magnetic fields, such as the ones used for the drive field in MPI. Methods: Magnetostimulation thresholds were measured in the arms of six healthy subjects (age: 27 ± 5 yr). Each experiment comprised testing the thresholds at eight different pulse durations between 2 and 125 ms at a single frequency, which took approximately 30–40 min/subject. A total of 34 experiments were performed at three different frequencies: 1.2, 5.7, and 25.5 kHz. A solenoid coil providing homogeneous magnetic field was used to induce stimulation, and the field amplitude was measured in real time. A pre-emphasis based pulse shaping method was employed to accurately control the pulse durations. Subjects reported stimulation via a mouse click whenever they felt a twitching/tingling sensation. A sigmoid function was fitted to the subject responses to find the threshold at a specific frequency and duration, and the whole procedure was repeated at all relevant frequencies and pulse durations

  2. Effects of pulse duration on magnetostimulation thresholds

    International Nuclear Information System (INIS)

    Saritas, Emine U.; Goodwill, Patrick W.; Conolly, Steven M.

    2015-01-01

    Purpose: Medical imaging techniques such as magnetic resonance imaging and magnetic particle imaging (MPI) utilize time-varying magnetic fields that are subject to magnetostimulation limits, which often limit the speed of the imaging process. Various human-subject experiments have studied the amplitude and frequency dependence of these thresholds for gradient or homogeneous magnetic fields. Another contributing factor was shown to be number of cycles in a magnetic pulse, where the thresholds decreased with longer pulses. The latter result was demonstrated on two subjects only, at a single frequency of 1.27 kHz. Hence, whether the observed effect was due to the number of cycles or due to the pulse duration was not specified. In addition, a gradient-type field was utilized; hence, whether the same phenomenon applies to homogeneous magnetic fields remained unknown. Here, the authors investigate the pulse duration dependence of magnetostimulation limits for a 20-fold range of frequencies using homogeneous magnetic fields, such as the ones used for the drive field in MPI. Methods: Magnetostimulation thresholds were measured in the arms of six healthy subjects (age: 27 ± 5 yr). Each experiment comprised testing the thresholds at eight different pulse durations between 2 and 125 ms at a single frequency, which took approximately 30–40 min/subject. A total of 34 experiments were performed at three different frequencies: 1.2, 5.7, and 25.5 kHz. A solenoid coil providing homogeneous magnetic field was used to induce stimulation, and the field amplitude was measured in real time. A pre-emphasis based pulse shaping method was employed to accurately control the pulse durations. Subjects reported stimulation via a mouse click whenever they felt a twitching/tingling sensation. A sigmoid function was fitted to the subject responses to find the threshold at a specific frequency and duration, and the whole procedure was repeated at all relevant frequencies and pulse durations

  3. Systematic and robust design of photonic crystal waveguides by topology optimization

    DEFF Research Database (Denmark)

    Wang, Fengwen; Jensen, Jakob Søndergaard; Sigmund, Ole

    2010-01-01

    A robust topology optimization method is presented to consider manufacturing uncertainties in tailoring dispersion properties of photonic crystal waveguides. The under-, normal and over-etching scenarios in the manufacturing process are represented by dilated, intermediate and eroded designs based on a threshold projection. The objective is formulated to minimize the maximum error between actual group indices and a prescribed group index among these three designs. A novel photonic crystal waveguide facilitating slow light with a group index of n(g) = 40 is achieved by the robust optimization approach. The numerical result illustrates that the robust topology optimization provides a systematic and robust design methodology for photonic crystal waveguide design.

  4. Infinite horizon optimal impulsive control with applications to Internet congestion control

    Science.gov (United States)

    Avrachenkov, Konstantin; Habachi, Oussama; Piunovskiy, Alexey; Zhang, Yi

    2015-04-01

    We investigate infinite-horizon deterministic optimal control problems with both gradual and impulsive controls, where any finitely many impulses are allowed simultaneously. Both discounted and long-run time-average criteria are considered. We establish very general and at the same time natural conditions, under which the dynamic programming approach results in an optimal feedback policy. The established theoretical results are applied to the Internet congestion control, and by solving analytically and nontrivially the underlying optimal control problems, we obtain a simple threshold-based active queue management scheme, which takes into account the main parameters of the transmission control protocols, and improves the fairness among the connections in a given network.
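    The record only states that the analysis yields a simple threshold-based active queue management scheme; the sketch below is a generic threshold AQM rule (mark or drop once the queue exceeds a level), given purely to illustrate the idea and not the policy derived in the paper.

```python
# Generic sketch of a threshold-based active queue management (AQM) rule:
# arriving packets are marked/dropped once the queue exceeds a threshold.
# This is an illustration of the idea only, not the policy derived in the paper.

def aqm_threshold_policy(queue_length: int, threshold: int) -> str:
    """Return the action applied to an arriving packet."""
    return "mark_or_drop" if queue_length >= threshold else "enqueue"

# Example: TCP senders back off when packets are marked, so the threshold
# indirectly controls the trade-off between delay and link utilisation.
for q in (10, 49, 50, 80):
    print(q, aqm_threshold_policy(q, threshold=50))
```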

  5. Integrated Optimization of Bus Line Fare and Operational Strategies Using Elastic Demand

    Directory of Open Access Journals (Sweden)

    Chunyan Tang

    2017-01-01

    Full Text Available An optimization approach for designing a transit service system is proposed. Its objective would be the maximization of total social welfare, by providing a profitable fare structure and tailoring operational strategies to passenger demand. These operational strategies include full route operation (FRO), limited stop, short turn, and a mix of the latter two strategies. The demand function is formulated to reflect the attributes of these strategies, in-vehicle crowding, and fare effects on demand variation. The fare is either a flat fare or a differential fare structure; the latter is based on trip distance and achieved service levels. This proposed methodology is applied to a case study of Dalian, China. The optimal results indicate that an optimal combination of operational strategies integrated with a differential fare structure results in the highest potential for increasing total social welfare, if the value of the parameter ε related to the additional service fee is low. When this value increases beyond a threshold, strategies with a flat fare show greater benefits. If it increases beyond yet another threshold, the use of skipped stop strategies is not recommended.

  6. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws need to be detected using these NDE methods. A reliably detectable crack size is required for the safe-life analysis of fracture-critical parts. The paper provides a discussion on optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and to achieve an acceptable value for the probability of false calls (POF), while keeping the flaw sizes in the set as small as possible.
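    The binomial arithmetic behind a zero-miss point-estimate demonstration is easy to check; the sketch below computes the probability of passing as a function of the true POD. The 29-flaw, zero-miss design demonstrating 90% POD at 95% confidence is the classic configuration; the exact parameters of the NASA procedure may differ.

```python
# Sketch of the binomial arithmetic behind a POD point-estimate demonstration:
# with n flaws and zero allowed misses, the passing probability is POD**n.
# The classic choice n = 29 demonstrates 90% POD at 95% confidence,
# since 0.90**29 < 0.05. Parameters of the actual NASA procedure may differ.
from math import comb

def probability_of_passing(pod: float, n_flaws: int = 29, allowed_misses: int = 0) -> float:
    """P(pass) for a demonstration allowing at most `allowed_misses` misses."""
    return sum(comb(n_flaws, k) * (1 - pod) ** k * pod ** (n_flaws - k)
               for k in range(allowed_misses + 1))

print(f"P(pass | POD=0.90) = {probability_of_passing(0.90):.3f}")   # ~0.047
print(f"P(pass | POD=0.98) = {probability_of_passing(0.98):.3f}")   # ~0.557
```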

  7. SU-E-T-110: An Investigation On Monitor Unit Threshold and Effects On IMPT Delivery in Proton Pencil Beam Planning System

    International Nuclear Information System (INIS)

    Syh, J; Ding, X; Syh, J; Patel, B; Rosen, L; Wu, H

    2015-01-01

    Purpose: An approved proton pencil beam scanning (PBS) treatment plan might not be deliverable if some beam spots carry extremely low monitor units (MU). A hybrid plan combining the efficiency of higher per-spot monitor units with the efficacy of fewer energy layers was searched for and optimized. The range of MU threshold settings was investigated, and plan quality was evaluated by target dose conformity. Methods: Certain limitations and requirements need to be checked and tested before a nominal proton PBS treatment plan can be delivered. The plan needs to meet the machine characterization and the specifications of the record-and-verify system used to deliver the beams. A minimal MU threshold per spot, e.g. 0.02, was set to filter out the low-MU spots and the plan was re-computed. Further MU threshold increments were tested in sequence without sacrificing plan quality. The number of energy layers was also altered by the elimination of low-MU layer(s). Results: The minimal MU/spot threshold, the spot spacing in each energy layer, the total number of energy layers, and the MU weighting of the beam spots of each beam were evaluated. Plan optimization traded off increases in spot MU (efficiency) against fewer energy layers to deliver (efficacy). A 5% weighting limit of total monitor units per beam was feasible. Sparse spreading of beam spots was not a concern as long as target dose conformity stayed within the 3% criterion. Conclusion: Each spot size is equivalent to the relative dose in the beam delivery system. The energy layer is associated with the depth of the targeted tumor. Our work is crucial to maintaining the best possible plan quality. Keeping the integrity of all intrinsic elements, such as spot size, spot number, layer number and the weighting carried by the spots in each layer, is important in this study.
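    A minimal sketch of the spot-filtering step described in the Methods is shown below: spots whose MU fall under the deliverable threshold are removed and energy layers left empty are pruned. The data structure is hypothetical and the subsequent re-optimization is not shown.

```python
# Sketch of the spot-filtering step described in the abstract: remove beam spots
# whose monitor units (MU) fall below a deliverable threshold and drop energy
# layers left empty. Data structure is hypothetical; re-optimization not shown.

def filter_low_mu_spots(layers, mu_threshold=0.02):
    """layers maps beam energy (MeV) -> list of per-spot MU values."""
    filtered = {}
    for energy, spot_mus in layers.items():
        kept = [mu for mu in spot_mus if mu >= mu_threshold]
        if kept:                      # prune layers with no deliverable spots
            filtered[energy] = kept
    return filtered

plan = {120.0: [0.01, 0.05, 0.30], 135.0: [0.015, 0.018], 150.0: [0.4, 0.6]}
filtered = filter_low_mu_spots(plan)
print(f"energy layers: {len(plan)} -> {len(filtered)}")
print(filtered)
```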

  8. The gradual nature of threshold switching

    International Nuclear Information System (INIS)

    Wimmer, M; Salinga, M

    2014-01-01

    The recent commercialization of electronic memories based on phase change materials proved the usability of this peculiar family of materials for application purposes. More advanced data storage and computing concepts, however, demand a deeper understanding especially of the electrical properties of the amorphous phase and the switching behaviour. In this work, we investigate the temporal evolution of the current through the amorphous state of the prototypical phase change material, Ge₂Sb₂Te₅, under constant voltage. A custom-made electrical tester allows the measurement of delay times over five orders of magnitude, as well as the transient states of electrical excitation prior to the actual threshold switching. We recognize a continuous current increase over time prior to the actual threshold-switching event to be a good measure for the electrical excitation. A clear correlation between a significant rise in pre-switching-current and the later occurrence of threshold switching can be observed. This way, we found experimental evidence for the existence of an absolute minimum for the threshold voltage (or electric field respectively) holding also for time scales far beyond the measurement range. (paper)

  9. Optimization in the nuclear fuel cycle II: Surface contamination

    International Nuclear Information System (INIS)

    Pereira, W.S.; Silva, A.X.; Lopes, J.M.; Carmo, A.S.; Fernandes, T.S.; Mello, C.R.; Kelecom, A.

    2017-01-01

    Optimization is one of the bases of radioprotection and aims to move doses away from the dose limit that is the borderline of acceptable radiological risk. This work aims to use the monitoring of surface contamination as a tool of the optimization process. 53 surface contamination points were analyzed at a nuclear fuel cycle facility. Three sampling points were identified with monthly mean values of contamination higher than 1 Bq·cm⁻², points 28, 42 and 47. These points were indicated for the beginning of the optimization process

  10. Identifying thresholds for ecosystem-based management.

    Directory of Open Access Journals (Sweden)

    Jameal F Samhouri

    Full Text Available BACKGROUND: One of the greatest obstacles to moving ecosystem-based management (EBM) from concept to practice is the lack of a systematic approach to defining ecosystem-level decision criteria, or reference points that trigger management action. METHODOLOGY/PRINCIPAL FINDINGS: To assist resource managers and policymakers in developing EBM decision criteria, we introduce a quantitative, transferable method for identifying utility thresholds. A utility threshold is the level of human-induced pressure (e.g., pollution) at which small changes produce substantial improvements toward the EBM goal of protecting an ecosystem's structural (e.g., diversity) and functional (e.g., resilience) attributes. The analytical approach is based on the detection of nonlinearities in relationships between ecosystem attributes and pressures. We illustrate the method with a hypothetical case study of (1) fishing and (2) nearshore habitat pressure using an empirically-validated marine ecosystem model for British Columbia, Canada, and derive numerical threshold values in terms of the density of two empirically-tractable indicator groups, sablefish and jellyfish. We also describe how to incorporate uncertainty into the estimation of utility thresholds and highlight their value in the context of understanding EBM trade-offs. CONCLUSIONS/SIGNIFICANCE: For any policy scenario, an understanding of utility thresholds provides insight into the amount and type of management intervention required to make significant progress toward improved ecosystem structure and function. The approach outlined in this paper can be applied in the context of single or multiple human-induced pressures, to any marine, freshwater, or terrestrial ecosystem, and should facilitate more effective management.
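    The method rests on detecting nonlinearities in pressure-response relationships; one simple way to locate such a utility threshold is a grid-searched piecewise-linear (broken-stick) fit, sketched below with synthetic data. This illustrates the idea only and is not the estimator used in the paper.

```python
# Sketch: locate a "utility threshold" as the breakpoint of a piecewise-linear
# (broken-stick) fit of an ecosystem attribute against a pressure.
# Synthetic data; illustrates the nonlinearity-detection idea, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)
pressure = np.linspace(0, 1, 200)
attribute = np.where(pressure < 0.4, 1.0, 1.0 - 2.0 * (pressure - 0.4))  # kink at 0.4
attribute = attribute + rng.normal(0, 0.05, pressure.size)

def broken_stick_sse(breakpoint: float) -> float:
    """Residual sum of squares of a two-segment linear fit with a given breakpoint."""
    hinge = np.maximum(pressure - breakpoint, 0.0)
    X = np.column_stack([np.ones_like(pressure), pressure, hinge])
    coef, residuals, *_ = np.linalg.lstsq(X, attribute, rcond=None)
    fitted = X @ coef
    return float(np.sum((attribute - fitted) ** 2))

candidates = np.linspace(0.05, 0.95, 91)
best = min(candidates, key=broken_stick_sse)
print(f"estimated utility threshold at pressure ~ {best:.2f}")
```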

  11. Single event upset threshold estimation based on local laser irradiation

    International Nuclear Information System (INIS)

    Chumakov, A.I.; Egorov, A.N.; Mavritsky, O.B.; Yanenko, A.V.

    1999-01-01

    An approach for estimation of the ion-induced SEU threshold based on local laser irradiation is presented. Comparative experiment and software simulation research were performed at various pulse durations and spot sizes. A correlation of the single event threshold LET to the upset threshold laser energy under local irradiation was found. The computer analysis of local laser irradiation of IC structures was developed for SEU threshold LET estimation. The correlation of local laser threshold energy with SEU threshold LET was shown. Two estimation techniques were suggested. The first one is based on the determination of the local laser threshold dose, taking into account the relation of the sensitive area to the locally irradiated area. The second technique uses the photocurrent peak value instead of this relation. The agreement between the predicted and experimental results demonstrates the applicability of this approach. (authors)

  12. MEASUREMENT FOR ACCEPTANCE OF SUPPLY CHAIN SIMULATOR APPLICATION USING TECHNOLOGY ACCEPTANCE MODEL

    Directory of Open Access Journals (Sweden)

    Mulyati E.

    2018-03-01

    Full Text Available The aim of this research was to measure the user acceptance of a simulator application built as a tool for students learning about supply chains, particularly the bullwhip effect problem. Acceptance of the supply chain simulator application was measured in this research using the Technology Acceptance Model on 162 samples, which were analyzed with Confirmatory Factor Analysis and Structural Equation Modelling. The results of this research indicated that user acceptance (shown by customer participation) of the supply chain simulator was directly influenced by the perceived usefulness of the supply chain simulator application used (positive and significant); user acceptance of the supply chain simulator was indirectly influenced by the perceived ease of use of the supply chain simulator application (positive but not significant); and user acceptance of the supply chain simulator was indirectly influenced by perceived enjoyment when the supply chain simulator application was used. The research would give students a better understanding of the bullwhip effect and a better experience, which would not be obtained through conventional learning when such tools are not used.

  13. Image thresholding in the high resolution target movement monitor

    Science.gov (United States)

    Moss, Randy H.; Watkins, Steve E.; Jones, Tristan H.; Apel, Derek B.; Bairineni, Deepti

    2009-03-01

    Image thresholding in the High Resolution Target Movement Monitor (HRTMM) is examined. The HRTMM was developed at the Missouri University of Science and Technology to detect and measure wall movements in underground mines to help reduce fatality and injury rates. The system detects the movement of a target with sub-millimeter accuracy based on the images of one or more laser dots projected on the target and viewed by a high-resolution camera. The relative position of the centroid of the laser dot (determined by software using thresholding concepts) in the images is the key factor in detecting the target movement. Prior versions of the HRTMM set the image threshold based on a manual, visual examination of the images. This work systematically examines the effect of varying threshold on the calculated centroid position and describes an algorithm for determining a threshold setting. First, the thresholding effects on the centroid position are determined for a stationary target. Plots of the centroid positions as a function of varying thresholds are obtained to identify clusters of thresholds for which the centroid position does not change for stationary targets. Second, the target is moved away from the camera in sub-millimeter increments and several images are obtained at each position and analyzed as a function of centroid position, target movement and varying threshold values. With this approach, the HRTMM can accommodate images in batch mode without the need for manual intervention. The capability for the HRTMM to provide automated, continuous monitoring of wall movement is enhanced.
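    The threshold-then-centroid computation described above can be sketched in a few lines of NumPy; this is an illustration only, not the HRTMM software, and the threshold value is assumed rather than derived by the paper's algorithm.

```python
# Sketch: threshold an image and compute the centroid of the bright laser dot,
# the quantity the HRTMM tracks for sub-millimetre movement detection.
# Illustrative only; not the HRTMM implementation.
import numpy as np

def dot_centroid(image, threshold):
    """Return (row, col) centroid of pixels at or above the threshold."""
    mask = image >= threshold
    if not mask.any():
        raise ValueError("no pixels above threshold")
    rows, cols = np.nonzero(mask)
    weights = image[mask].astype(float)          # intensity-weighted centroid
    return float(np.average(rows, weights=weights)), float(np.average(cols, weights=weights))

# A centroid that stays constant over a range of thresholds (a "cluster" in the
# paper's terms) indicates a usable threshold setting for a stationary target.
img = np.zeros((50, 50))
img[20:23, 30:33] = 200.0
print(dot_centroid(img, threshold=100.0))        # approximately (21.0, 31.0)
```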

  14. A Robust Threshold for Iterative Channel Estimation in OFDM Systems

    Directory of Open Access Journals (Sweden)

    A. Kalaycioglu

    2010-04-01

    Full Text Available A novel threshold computation method for pilot symbol assisted iterative channel estimation in OFDM systems is considered. As the bits are transmitted in packets, the proposed technique is based on calculating a particular threshold for each data packet in order to select the reliable decoder output symbols to improve the channel estimation performance. Iteratively, additional pilot symbols are established according to the threshold and the channel is re-estimated with the new pilots inserted to the known channel estimation pilot set. The proposed threshold calculation method for selecting additional pilots performs better than non-iterative channel estimation, no threshold and fixed threshold techniques in poor HF channel simulations.

  15. Threshold-adaptive canny operator based on cross-zero points

    Science.gov (United States)

    Liu, Boqi; Zhang, Xiuhua; Hong, Hanyu

    2018-03-01

    Canny edge detection [1] is a technique to extract useful structural information from different vision objects and dramatically reduce the amount of data to be processed. It has been widely applied in various computer vision systems. Two thresholds have to be set before the edges are segregated from the background. Usually, two static values, chosen from the developers' experience, are set as the thresholds [2]. In this paper, a novel automatic thresholding method is proposed. The relation between the thresholds and cross-zero points is analyzed, and an interpolation function is deduced to determine the thresholds. Comprehensive experimental results demonstrate the effectiveness of the proposed method and its advantage for stable edge detection under changing illumination.
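    The cross-zero-point interpolation proposed in the paper is not reproduced in this record; a widely used stand-in for automatic Canny thresholds is the median-based heuristic sketched below, which is explicitly not the authors' method.

```python
# Sketch of an automatic Canny thresholding heuristic: derive the two hysteresis
# thresholds from the image median. This is a common stand-in, NOT the
# cross-zero-point interpolation method proposed in the paper.
import cv2
import numpy as np

def auto_canny(image: np.ndarray, sigma: float = 0.33) -> np.ndarray:
    """Run Canny with lower/upper thresholds placed around the median intensity."""
    median = float(np.median(image))
    lower = int(max(0, (1.0 - sigma) * median))
    upper = int(min(255, (1.0 + sigma) * median))
    return cv2.Canny(image, lower, upper)

# Usage (assumes an 8-bit grayscale image on disk; the path is hypothetical):
# img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
# edges = auto_canny(img)
```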

  16. SART-Type Half-Threshold Filtering Approach for CT Reconstruction.

    Science.gov (United States)

    Yu, Hengyong; Wang, Ge

    2014-01-01

    The ℓ1 regularization problem has been widely used to solve sparsity-constrained problems. To enhance the sparsity constraint for better imaging performance, a promising direction is to use the ℓp norm (0 < p < 1) and solve the ℓp minimization problem. Very recently, Xu et al. developed an analytic solution for the ℓ1/2 regularization via an iterative thresholding operation, which is also referred to as half-threshold filtering. In this paper, we design a simultaneous algebraic reconstruction technique (SART)-type half-threshold filtering framework to solve the computed tomography (CT) reconstruction problem. In the medical imaging field, the discrete gradient transform (DGT) is widely used to define the sparsity. However, the DGT is noninvertible and it cannot be applied to half-threshold filtering for CT reconstruction. To demonstrate the utility of the proposed SART-type half-threshold filtering framework, an emphasis of this paper is to construct a pseudoinverse transform for the DGT. The proposed algorithms are evaluated with numerical and physical phantom data sets. Our results show that the SART-type half-threshold filtering algorithms have great potential to improve the reconstructed image quality from few and noisy projections. They are complementary to the counterparts of the state-of-the-art soft-threshold filtering and hard-threshold filtering.
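    For orientation, the soft- and hard-thresholding operators named at the end of the abstract have simple closed forms, sketched below; the half-thresholding operator itself (Xu et al.'s analytic solution for the ℓ1/2 penalty) involves a trigonometric expression and is not reproduced here.

```python
# Soft- and hard-thresholding operators, the counterparts referenced in the
# abstract. The half-thresholding operator of Xu et al. (analytic solution of
# the l1/2-regularized problem) has a trigonometric form and is omitted here.
import numpy as np

def soft_threshold(x: np.ndarray, lam: float) -> np.ndarray:
    """Shrink toward zero: the minimizer of 0.5*(z - x)**2 + lam*|z|."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def hard_threshold(x: np.ndarray, lam: float) -> np.ndarray:
    """Keep large entries, zero out the rest."""
    return np.where(np.abs(x) > lam, x, 0.0)

x = np.array([-2.0, -0.3, 0.1, 0.8, 3.0])
print(soft_threshold(x, 0.5))   # [-1.5  0.   0.   0.3  2.5]
print(hard_threshold(x, 0.5))   # [-2.   0.   0.   0.8  3. ]
```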

  17. Alternative method for determining anaerobic threshold in rowers

    Directory of Open Access Journals (Sweden)

    Giovani Dos Santos Cunha

    2008-01-01

    Full Text Available http://dx.doi.org/10.5007/1980-0037.2008v10n4p367 In rowing, the standard breathing that athletes are trained to use makes it difficult, or even impossible, to detect ventilatory limits, due to the coupling of the breath with the technical movement. For this reason, some authors have proposed determining the anaerobic threshold from the respiratory exchange ratio (RER), but there is not yet consensus on what value of RER should be used. The objective of this study was to test what value of RER corresponds to the anaerobic threshold and whether this value can be used as an independent parameter for determining the anaerobic threshold of rowers. The sample comprised 23 male rowers. They were submitted to a maximal cardiorespiratory test on a rowing ergometer with concurrent ergospirometry in order to determine VO2max and the physiological variables corresponding to their anaerobic threshold. The anaerobic threshold was determined using the Dmax (maximal distance) method. The physiological variables were classified into maximum values and anaerobic threshold values. At their maximal state the rowers reached VO2 (58.2±4.4 ml·kg⁻¹·min⁻¹), lactate (8.2±2.1 mmol·L⁻¹), power (384±54.3 W) and RER (1.26±0.1). At the anaerobic threshold they reached VO2 (46.9±7.5 ml·kg⁻¹·min⁻¹), lactate (4.6±1.3 mmol·L⁻¹), power (300±37.8 W) and RER (0.99±0.1). Conclusions - the RER can be used as an independent method for determining the anaerobic threshold of rowers, adopting a value of 0.99; however, the RER should exhibit a non-linear increase above this figure.
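    The Dmax method mentioned above takes the threshold at the point on the response curve farthest from the straight line joining its endpoints; a minimal NumPy sketch with made-up lactate data follows (illustrative only, not the authors' analysis).

```python
# Sketch of the Dmax method: the anaerobic threshold is taken at the point on
# the lactate-vs-power curve with the maximum perpendicular distance from the
# line connecting the first and last data points. Data are illustrative only.
import numpy as np

power   = np.array([100, 150, 200, 250, 300, 350, 400], dtype=float)  # W
lactate = np.array([1.0, 1.2, 1.5, 2.2, 3.5, 5.5, 8.2])               # mmol/L

def dmax_threshold(x, y):
    """Return the x value of the point farthest from the chord (x0,y0)-(xn,yn)."""
    p0 = np.array([x[0], y[0]])
    p1 = np.array([x[-1], y[-1]])
    chord = p1 - p0
    chord_len = np.hypot(*chord)
    # perpendicular distance of each point from the chord (cross-product formula)
    d = np.abs(chord[0] * (p0[1] - y) - (p0[0] - x) * chord[1]) / chord_len
    return float(x[np.argmax(d)])

print(f"Dmax anaerobic threshold at ~{dmax_threshold(power, lactate):.0f} W")
```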

  18. Comparative Pessimism or Optimism: Depressed Mood, Risk-Taking, Social Utility and Desirability.

    Science.gov (United States)

    Milhabet, Isabelle; Le Barbenchon, Emmanuelle; Cambon, Laurent; Molina, Guylaine

    2015-03-05

    Comparative optimism can be defined as a self-serving, asymmetric judgment of the future. It is often thought to be beneficial and socially accepted, whereas comparative pessimism is correlated with depression and socially rejected. Our goal was to examine the social acceptance of comparative optimism and the social rejection of comparative pessimism in two dimensions of social judgment, social desirability and social utility, considering the attributions of dysphoria and risk-taking potential (studies 2 and 3) on outlooks on the future. In three experiments, the participants assessed either one (study 1) or several (studies 2 and 3) fictional targets in two dimensions, social utility and social desirability. Targets exhibiting comparatively optimistic or pessimistic outlooks on the future were presented as non-depressed, depressed, or neither (control condition) (study 1); non-depressed or depressed (study 2); and non-depressed or in control condition (study 3). Two significant results were obtained: (1) social rejection of comparative pessimism in the social desirability dimension, which can be explained by its depressive feature; and (2) comparative optimism was socially accepted on the social utility dimension, which can be explained by the perception that comparatively optimistic individuals are potential risk-takers.

  19. Public acceptance of a hypothetical Ebola virus vaccine in Aceh, Indonesia: A hospital-based survey

    Directory of Open Access Journals (Sweden)

    Harapan Harapan

    2017-04-01

    Full Text Available Objective: To determine the acceptance towards a hypothetical Ebola virus vaccine (EVV and associated factors in a non-affected country, Indonesia. Methods: A hospital-based, cross-sectional study was conducted in four regencies of Aceh, Indonesia. A set of pre-tested questionnaires was used to obtain information on acceptance towards EVV and a range of explanatory variables. Associations between EVV acceptance and explanatory variables were tested using multi-steps logistic regression analysis and the Spearman's rank correlation. Results: Participants who had knowledge on Ebola virus disease (EVD were 45.3% (192/424 and none of the participants achieved 80% correct answers on the knowledge regarding to EVD. About 73% of participants expressed their willingness to receive the EVV. Education attainment, occupation, monthly income, have heard regarding to EVD previously, socioeconomic level, attitude towards vaccination practice and knowledge regarding to EVD were associated significantly with acceptance towards EVV in univariate analysis (P < 0.05. In the final multivariate model, socio-economic level, attitude towards vaccination practice and knowledge regarding to EVD were the independent explanatory variables for EVV acceptance. Conclusions: The knowledge of EVD was low, but this minimally affected the acceptance towards EVV. However, to facilitate optimal uptake of EVV, dissemination of vaccine-related information prior to its introduction is required.

  20. Fatigue threshold studies in Fe, Fe-Si, and HSLA steel: Part II. Thermally activated behavior of the effective stress intensity at threshold

    International Nuclear Information System (INIS)

    Yu, W.; Esaklul, K.; Gerberich, W.W.

    1984-01-01

    It is shown that closure mechanisms alone cannot fully explain increasing fatigue thresholds with decreasing test temperature. The implication is that fatigue crack propagation near threshold is a thermally activated process. The effective threshold stress intensity correlates with the thermal component of the flow stress. A fractographic study of the fatigue surface was performed. Water vapor in room air promotes the formation of oxide and intergranular crack growth. At lower temperatures, a brittle-type cyclic cleavage fatigue surface was observed, but the ductile process persisted even at 123 K. Arrest marks found on all three modes of fatigue crack growth suggest that fatigue crack growth is controlled by the subcell structure near threshold. The effective fatigue threshold may be related to the square root of (one plus the strain rate sensitivity).

  1. The Acceptance Strategy for Nuclear Power Plant In Indonesia

    International Nuclear Information System (INIS)

    Suhaemi, Tjipta; Syaukat, Achmad

    2010-01-01

    Indonesia has planned to build nuclear power plants, and several feasibility studies have been conducted intensively. However, the process of NPP introduction is still uncertain. The National Energy Plan in Indonesia, which has been prepared by several governmental agencies, has not yet had a positive impact on the government's decision to construct a nuclear power plant (NPP). This paper discusses the process of NPP introduction in Indonesia, which has been colored by stakeholder debate and a delayed decision to go nuclear. The technology paradigm is used to promote the NPP as an alternative, reliable energy resource. This paradigm should be complemented with an international politico-economic point of view, which shows that the structural powers, consisting of security, production, finance, and knowledge structures, within which the NPP is introduced, have dynamic characteristics. The process of NPP introduction in Indonesia includes some infrastructure development (R and D, legislation, regulation, energy planning, site studies, public acceptance efforts, etc.), but a more coherent NPP implementation program and NPP acceptance program are needed. The strategic patterns for NPP acceptance described in this paper are made by considering nuclear regulation development and the interest in basic domestic participation. A first NPP program in Indonesia with proven technology and basic domestic participation is an important milestone toward an optimal national energy mix.

  2. Acceptable levels of digital image compression in chest radiology

    International Nuclear Information System (INIS)

    Smith, I.

    2000-01-01

    The introduction of picture archival and communications systems (PACS) and teleradiology has prompted an examination of techniques that optimize the storage capacity and speed of digital storage and distribution networks. The general acceptance of the move to replace conventional screen-film capture with computed radiography (CR) is an indication that clinicians within the radiology community are willing to accept images that have been 'compressed'. The question to be answered, therefore, is what level of compression is acceptable. The purpose of the present study is to provide an assessment of the ability of a group of imaging professionals to determine whether an image has been compressed. To undertake this study a single mobile chest image, selected for the presence of some subtle pathology in the form of a number of septal lines in both costophrenic angles, was compressed to levels of 10:1, 20:1 and 30:1. These images were randomly ordered and shown to the observers for interpretation. Analysis of the responses indicates that in general it was not possible to distinguish the original image from its compressed counterparts. Furthermore, a preference appeared to be shown for images that have undergone low levels of compression. This preference can most likely be attributed to the 'de-noising' effect of the compression algorithm at low levels. Copyright (1999) Blackwell Science Pty. Ltd

  3. Taste perception with age: pleasantness and its relationships with threshold sensitivity and supra-threshold intensity of five taste qualities

    NARCIS (Netherlands)

    Mojet, J.; Christ-Hazelhof, E.; Heidema, J.

    2005-01-01

    The relationships between threshold sensitivity, supra-threshold intensity of NaCl, KCl, sucrose, aspartame, acetic acid, citric acid, caffeine, quinine HCl, monosodium glutamate (MSG) and inosine 5′-monophosphate (IMP), and the pleasantness of these stimuli in products, were studied in 21 young

  4. Schubert calculus and threshold polynomials of affine fusion

    International Nuclear Information System (INIS)

    Irvine, S.E.; Walton, M.A.

    2000-01-01

    We show how the threshold level of affine fusion, the fusion of Wess-Zumino-Witten (WZW) conformal field theories, fits into the Schubert calculus introduced by Gepner. The Pieri rule can be modified in a simple way to include the threshold level, so that calculations may be done for all (non-negative integer) levels at once. With the usual Giambelli formula, the modified Pieri formula deforms the tensor product coefficients (and the fusion coefficients) into what we call threshold polynomials. We compare them with the q-deformed tensor product coefficients and fusion coefficients that are related to q-deformed weight multiplicities. We also discuss the meaning of the threshold level in the context of paths on graphs

  5. Simulation of the Optimized Structure of a Laterally Coupled Distributed Feedback (LC-DFB Semiconductor Laser Above Threshold

    Directory of Open Access Journals (Sweden)

    M. Seifouri

    2013-10-01

    Full Text Available In this paper, the laterally coupled distributed feedback semiconductor laser is studied. In the simulations performed, variations of structural parameters such as the grating amplitude a, the ridge width W, the thickness of the active region d, and other structural properties are considered. It is concluded that for certain values of the structural parameters, the laser maintains the highest output power, the lowest distortion Bragg frequency δL and the smallest changes in the wavelength λ. Above threshold, an output power of more than 40 mW and SMSR values greater than 50 dB were achieved.

  6. Investigation of the Cause of Low Blister Threshold Temperatures in the RERTR-12 and AFIP-4 Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell K Meyer

    2012-06-01

    Blister-threshold testing of fuel plates is a standard method through which the safety margin for operation of plate-type fuel in research and test reactors is assessed. The blister-threshold temperature is indicative of the ability of fuel to operate at high temperatures for short periods of time (transient conditions) without failure. This method of testing was applied to the newly developed U-Mo monolithic fuel system. Blister annealing studies on the U-Mo monolithic fuel plates began in 2007, with the Reduced Enrichment for Research and Test Reactors (RERTR)-6 experiment, and they have continued as the U-Mo fuel system has evolved through the research and development process. Blister anneal threshold temperatures from early irradiation experiments (RERTR-6 through RERTR-10) ranged from 400 to 500°C. These temperatures were projected to be acceptable for NRC-licensed research reactors and the high-power Advanced Test Reactor (ATR) and the High Flux Isotope Reactor (HFIR) based on current safety-analysis reports (SARs). Initial blister testing results from the RERTR-12 experiment capsules X1 and X2 showed a decrease in the blister-threshold temperatures. Blister threshold temperatures from this experiment ranged from 300 to 400°C. Selected plates from the AFIP-4 experiment, which was fabricated using a process similar to that used to fabricate the RERTR-12 experiment, also underwent blister testing to determine whether results would be similar. The measured blister-threshold temperatures from the AFIP-4 plates fell within the same blister-threshold temperature range measured in the RERTR-12 plates. Investigation of the cause of this decrease in blister threshold temperature is being conducted under the guidance of Idaho National Laboratory PLN-4155, “Analysis of Low Blister Threshold Temperatures in the RERTR-12 and AFIP-4 Experiments,” and is driven by hypotheses. The main focus of the investigation is in the following areas: 1. Fabrication variables 2. Pre

  7. Determination of acceptable risk criteria for nuclear waste management

    International Nuclear Information System (INIS)

    Cohen, J.J.

    1977-01-01

    The initial phase of the work performed during FY 1977 consisted of performing a ''scoping'' study to define issues, determine an optimal methodology for their resolution, and compile a data base for acceptable risk criteria development. The issues, spanning technical, psychological, and ethical dimensions, were categorized in seven major areas: (1) unplanned or accidental events, (2) present vs future risks, (3) institutional controls and retrievability, (4) dose-response mechanism and uncertainty, (5) spatial distribution of exposed populations, (6) different types of nuclear wastes, and (7) public perception. The optimum methodology for developing ARC was determined to be multi-attribute decision analysis encompassing numerous specific techniques for choosing, from among several alternatives, the optimal course of action when the alternatives are constrained to meet specified attributes. The data base developed during the study comprises existing regulations and guidelines, maximum permissible dose, natural geologic hazards, nonradioactive hazardous waste practices, bioethical perspectives, and data from an opinion survey

  8. Determination of acceptable risk criteria for nuclear waste management

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J.J.

    1977-10-21

    The initial phase of the work performed during FY 1977 consisted of performing a ''scoping'' study to define issues, determine an optimal methodology for their resolution, and compile a data base for acceptable risk criteria development. The issues, spanning technical, psychological, and ethical dimensions, were categorized in seven major areas: (1) unplanned or accidental events, (2) present vs future risks, (3) institutional controls and retrievability, (4) dose-response mechanism and uncertainty, (5) spatial distribution of exposed populations, (6) different types of nuclear wastes, and (7) public perception. The optimum methodology for developing ARC was determined to be multi-attribute decision analysis encompassing numerous specific techniques for choosing, from among several alternatives, the optimal course of action when the alternatives are constrained to meet specified attributes. The data base developed during the study comprises existing regulations and guidelines, maximum permissible dose, natural geologic hazards, nonradioactive hazardous waste practices, bioethical perspectives, and data from an opinion survey.

  9. Accelerator based production of fissile nuclides, threshold uranium price and perspectives; Akceleratorska proizvodnja fisibilnih nuklida, granicna cijena urana i perspektive

    Energy Technology Data Exchange (ETDEWEB)

    Djordjevic, D [INIS-Inzenjering, Sarajevo (Yugoslavia)]; Knapp, V [Elektrotehnicki fakultet, Zagreb (Yugoslavia)]

    1988-07-01

    Accelerator breeder system characteristics are considered in this work. One such system producing fissile nuclides can supply several thermal reactors with fissile fuel, so the system becomes analogous to a uranium enrichment facility, with the difference that fissile nuclides are produced by conversion of U-238 rather than by separation from natural uranium. This concept offers another long-term perspective for fission technology, based on the development of only one simpler technology. The influence of basic system characteristics on the threshold uranium price is examined. Conditions for economically acceptable production are established. (author)

  10. An Overlay Architecture for Throughput Optimal Multipath Routing

    Science.gov (United States)

    2017-01-14

    maximum throughput. Finally, we propose a threshold-based policy (BP-T) and a heuristic policy (OBP), which dynamically control traffic bifurcations... network stability region is available. Second, given any subset of nodes that are controllable, we also wish to develop an optimal routing policy that... case when tunnels do not overlap. We also develop a heuristic overlay control policy for use on general topologies, and show through simulation that

  11. Liquid–Solid Dual-Gate Organic Transistors with Tunable Threshold Voltage for Cell Sensing

    KAUST Repository

    Zhang, Yu

    2017-10-17

    Liquid electrolyte-gated organic field effect transistors and organic electrochemical transistors have recently emerged as powerful technology platforms for sensing and simulation of living cells and organisms. For such applications, the transistors are operated at a gate voltage around or below 0.3 V because prolonged application of a higher voltage bias can lead to membrane rupturing and cell death. This constraint often prevents the operation of the transistors at their maximum transconductance or most sensitive regime. Here, we exploit a solid–liquid dual-gate organic transistor structure, where the threshold voltage of the liquid-gated conduction channel is controlled by an additional gate that is separated from the channel by a metal-oxide gate dielectric. With this design, the threshold voltage of the “sensing channel” can be linearly tuned in a voltage window exceeding 0.4 V. We have demonstrated that the dual-gate structure enables a much better sensor response to the detachment of human mesenchymal stem cells. In general, the capability of tuning the optimal sensing bias will not only improve the device performance but also broaden the material selection for cell-based organic bioelectronics.

  12. Breaking the current density threshold in spin-orbit-torque magnetic random access memory

    Science.gov (United States)

    Zhang, Yin; Yuan, H. Y.; Wang, X. S.; Wang, X. R.

    2018-04-01

    Spin-orbit-torque magnetic random access memory (SOT-MRAM) is a promising technology for the next generation of data storage devices. The main bottleneck of this technology is the high reversal current density threshold. This outstanding problem is now solved by a new strategy in which the magnitude of the driving current density is fixed while the current direction varies with time. The theoretical limit of the minimal reversal current density is only a fraction (the Gilbert damping coefficient) of the threshold current density of the conventional strategy. The Euler-Lagrange equation for the fastest magnetization reversal path and the optimal current pulse is derived for an arbitrary magnetic cell and arbitrary spin-orbit torque. For CoFeB/Ta SOT-MRAMs, the theoretical limit of the minimal reversal current density and the current density for a GHz switching rate under the new strategy are of the order of 10⁵ A/cm² and 10⁶ A/cm², respectively, far below the 10⁷ A/cm² and 10⁸ A/cm² of the conventional strategy. Furthermore, no external magnetic field is needed for a deterministic reversal in the new strategy.

  13. Liquid-Solid Dual-Gate Organic Transistors with Tunable Threshold Voltage for Cell Sensing.

    Science.gov (United States)

    Zhang, Yu; Li, Jun; Li, Rui; Sbircea, Dan-Tiberiu; Giovannitti, Alexander; Xu, Junling; Xu, Huihua; Zhou, Guodong; Bian, Liming; McCulloch, Iain; Zhao, Ni

    2017-11-08

    Liquid electrolyte-gated organic field effect transistors and organic electrochemical transistors have recently emerged as powerful technology platforms for sensing and simulation of living cells and organisms. For such applications, the transistors are operated at a gate voltage around or below 0.3 V because prolonged application of a higher voltage bias can lead to membrane rupturing and cell death. This constraint often prevents the operation of the transistors at their maximum transconductance or most sensitive regime. Here, we exploit a solid-liquid dual-gate organic transistor structure, where the threshold voltage of the liquid-gated conduction channel is controlled by an additional gate that is separated from the channel by a metal-oxide gate dielectric. With this design, the threshold voltage of the "sensing channel" can be linearly tuned in a voltage window exceeding 0.4 V. We have demonstrated that the dual-gate structure enables a much better sensor response to the detachment of human mesenchymal stem cells. In general, the capability of tuning the optimal sensing bias will not only improve the device performance but also broaden the material selection for cell-based organic bioelectronics.

  14. Sensory acceptance of mixed nectar of papaya, passion fruit and acerola

    Directory of Open Access Journals (Sweden)

    Matsuura Fernando César Akira Urbano

    2004-01-01

    Nectars are beverages formulated with the juice or pulp of one or more fruits, plus water and sugar, in concentrations resulting in a "ready-to-drink" product. Recently, the market for such products has greatly expanded. Fruit mixtures present a series of advantages, such as the combination of different aromas and flavors and the sum of their nutritional components. The objective of this work was to develop a nectar based on papaya pulp and passion fruit juice, enriched with the vitamin C present in acerola pulp, optimizing the formulation using sensory consumer tests and response surface statistical methodology. Eleven formulations were prepared using different concentrations of papaya pulp, passion fruit juice and sucrose, while maintaining the concentration of acerola pulp constant. The sensory tests were carried out with 22 non-trained panelists using a structured 9-point hedonic scale to evaluate overall acceptance. The acceptance means were submitted to regression analysis by first fitting a quadratic polynomial equation. A predictive model was adjusted considering only those parameters where P < 0.05, and a response surface was generated. The overall acceptance of the different formulations varied from 5 ("neither liked nor disliked") to more than 7 ("liked moderately"), showing that some products can be considered adequate for consumers, such as the nectar produced with 37.5% papaya pulp, 7.5% passion fruit juice, and 5.0% acerola pulp, with 15% added sucrose. A quadratic predictive overall-acceptance model with a regression coefficient of 0.97 was obtained. The sensory acceptance of the nectars was positively affected by increases in the concentrations of papaya pulp and of sucrose. Thus, some products presented good sensory acceptance, suggesting commercial potential.
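    A minimal sketch of the response-surface step described above: a full quadratic model of overall acceptance as a function of two formulation factors is fitted by ordinary least squares and its R² is computed. The design points, scores, and variable names are illustrative placeholders, not the study's data.

      # Hedged sketch: fit a quadratic response-surface model of overall acceptance
      # versus formulation factors. Data values are illustrative placeholders.
      import numpy as np

      # Illustrative design points: papaya pulp (%), sucrose (%), and mean hedonic
      # acceptance scores (9-point scale) for each formulation.
      papaya  = np.array([25.0, 25.0, 37.5, 37.5, 31.25, 31.25, 31.25, 40.0, 22.5, 31.25, 31.25])
      sucrose = np.array([10.0, 15.0, 10.0, 15.0, 12.5, 12.5, 12.5, 12.5, 12.5, 16.0, 9.0])
      accept  = np.array([5.2, 6.1, 6.0, 7.1, 6.4, 6.5, 6.3, 6.8, 5.4, 6.9, 5.1])

      # Full quadratic model: b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
      X = np.column_stack([np.ones_like(papaya), papaya, sucrose,
                           papaya**2, sucrose**2, papaya * sucrose])
      coef, *_ = np.linalg.lstsq(X, accept, rcond=None)

      # R^2 of the fit, analogous to the regression coefficient reported above.
      pred = X @ coef
      r2 = 1.0 - np.sum((accept - pred)**2) / np.sum((accept - accept.mean())**2)
      print("coefficients:", np.round(coef, 4))
      print("R^2:", round(r2, 3))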

  15. Effects of programming threshold and maplaw settings on acoustic thresholds and speech discrimination with the MED-EL COMBI 40+ cochlear implant.

    Science.gov (United States)

    Boyd, Paul J

    2006-12-01

    The principal task in the programming of a cochlear implant (CI) speech processor is the setting of the electrical dynamic range (output) for each electrode, to ensure that a comfortable loudness percept is obtained for a range of input levels. This typically involves separate psychophysical measurement of electrical threshold (θe) and upper tolerance levels using short current bursts generated by the fitting software. Anecdotal clinical experience and some experimental studies suggest that the measurement of θe is relatively unimportant and that the setting of upper tolerance limits is more critical for processor programming. The present study aims to test this hypothesis and examines in detail how acoustic thresholds and speech recognition are affected by setting of the lower limit of the output ("Programming threshold" or "PT") to understand better the influence of this parameter and how it interacts with certain other programming parameters. Test programs (maps) were generated with PT set to artificially high and low values and tested on users of the MED-EL COMBI 40+ CI system. Acoustic thresholds and speech recognition scores (sentence tests) were measured for each of the test maps. Acoustic thresholds were also measured using maps with a range of output compression functions ("maplaws"). In addition, subjective reports were recorded regarding the presence of "background threshold stimulation" which is occasionally reported by CI users if PT is set to relatively high values when using the CIS strategy. Manipulation of PT was found to have very little effect. Setting PT to minimum produced a mean 5 dB (S.D. = 6.25) increase in acoustic thresholds, relative to thresholds with PT set normally, and had no statistically significant effect on speech recognition scores on a sentence test. On the other hand, maplaw setting was found to have a significant effect on acoustic thresholds (raised as maplaw is made more linear), which provides some theoretical

  16. Flux threshold measurements of He-ion beam induced nanofuzz formation on hot tungsten surfaces

    International Nuclear Information System (INIS)

    Meyer, F W; Hijazi, H; Bannister, M E; Unocic, K A; Garrison, L M; Parish, C M

    2016-01-01

    We report measurements of the energy dependence of flux thresholds and incubation fluences for He-ion induced nano-fuzz formation on hot tungsten surfaces at UHV conditions over a wide energy range, using real-time sample imaging of tungsten target emissivity change to monitor the spatial extent of nano-fuzz growth, corroborated by ex situ SEM and FIB/SEM analysis, in conjunction with accurate ion-flux profile measurements. The measurements were carried out at the Multicharged Ion Research Facility (MIRF) at energies from 218 eV to 8.5 keV, using a high-flux deceleration module and beam flux monitor for optimizing the decel optics on the low energy MIRF beamline. The measurements suggest that nano-fuzz formation proceeds only if a critical rate of change of trapped He density in the W target is exceeded. To understand the energy dependence of the observed flux thresholds, the energy dependences of three contributing factors (ion reflection, ion range and target damage creation) were determined using the SRIM simulation code. The observed energy dependence can be well reproduced by the combined energy dependences of these three factors. The incubation fluences deduced from first visual appearance of surface emissivity change were (2–4) × 10²³ m⁻² at 218 eV, and roughly a factor of 10 less at the higher energies, which were all at or above the displacement energy threshold. The role of trapping at C impurity sites is discussed. (paper)

  17. Determination of prospective displacement-based gate threshold for respiratory-gated radiation delivery from retrospective phase-based gate threshold selected at 4D CT simulation

    International Nuclear Information System (INIS)

    Vedam, S.; Archambault, L.; Starkschall, G.; Mohan, R.; Beddar, S.

    2007-01-01

    Four-dimensional (4D) computed tomography (CT) imaging has found increasing importance in the localization of tumor and surrounding normal structures throughout the respiratory cycle. Based on such tumor motion information, it is possible to identify the appropriate phase interval for respiratory gated treatment planning and delivery. Such a gating phase interval is determined retrospectively based on tumor motion from internal tumor displacement. However, respiratory-gated treatment is delivered prospectively based on motion determined predominantly from an external monitor. Therefore, the simulation gate threshold determined from the retrospective phase interval selected for gating at 4D CT simulation may not correspond to the delivery gate threshold that is determined from the prospective external monitor displacement at treatment delivery. The purpose of the present work is to establish a relationship between the thresholds for respiratory gating determined at CT simulation and treatment delivery, respectively. One hundred fifty external respiratory motion traces, from 90 patients, with and without audio-visual biofeedback, are analyzed. Two respiratory phase intervals, 40%-60% and 30%-70%, are chosen for respiratory gating from the 4D CT-derived tumor motion trajectory. From residual tumor displacements within each such gating phase interval, a simulation gate threshold is defined based on (a) the average and (b) the maximum respiratory displacement within the phase interval. The duty cycle for prospective gated delivery is estimated from the proportion of external monitor displacement data points within both the selected phase interval and the simulation gate threshold. The delivery gate threshold is then determined iteratively to match the above determined duty cycle. The magnitude of the difference between such gate thresholds determined at simulation and treatment delivery is quantified in each case. Phantom motion tests yielded coincidence of simulation
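    A sketch of the threshold-matching step described above: the gating duty cycle is estimated from the fraction of external-monitor samples that lie both inside the retrospective phase interval and under the simulation gate threshold, and a delivery (displacement-only) threshold reproducing the same duty cycle is then found. The respiratory trace below is synthetic, a direct quantile stands in for the iterative matching, and all names are illustrative.

      # Hedged sketch: derive a delivery gate threshold that matches the duty cycle
      # implied by a phase-based simulation gate threshold. Synthetic trace only.
      import numpy as np

      t = np.linspace(0.0, 60.0, 6000)                       # 60 s of external-monitor samples
      displacement = 0.5 * (1 + np.cos(2 * np.pi * t / 4))   # ~4 s breathing cycle, trough mid-cycle
      phase = (t % 4.0) / 4.0                                # normalized respiratory phase, 0..1

      # Retrospective gating window chosen at 4D CT (e.g. the 40%-60% phase interval).
      in_phase = (phase >= 0.40) & (phase <= 0.60)

      # Simulation gate threshold, here the maximum residual displacement inside the window.
      sim_gate = displacement[in_phase].max()

      # Duty cycle: fraction of samples inside the phase window and under that threshold.
      duty = np.mean(in_phase & (displacement <= sim_gate))

      # Delivery gating uses displacement only; find the threshold giving the same duty
      # cycle. With a monotone threshold/duty-cycle relation a quantile gives it directly,
      # playing the role of the iterative matching described above.
      delivery_gate = np.quantile(displacement, duty)

      print(f"simulation gate threshold: {sim_gate:.3f}")
      print(f"target duty cycle:         {duty:.2%}")
      print(f"delivery gate threshold:   {delivery_gate:.3f}")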

  18. Comparing population and incident data for optimal air ambulance base locations in Norway.

    Science.gov (United States)

    Røislien, Jo; van den Berg, Pieter L; Lindner, Thomas; Zakariassen, Erik; Uleberg, Oddvar; Aardal, Karen; van Essen, J Theresia

    2018-05-24

    Helicopter emergency medical services are important in many health care systems. Norway has a nationwide physician manned air ambulance service servicing a country with large geographical variations in population density and incident frequencies. The aim of the study was to compare optimal air ambulance base locations using both population and incident data. We used municipality population and incident data for Norway from 2015. The 428 municipalities had a median (5-95 percentile) of 4675 (940-36,264) inhabitants and 10 (2-38) incidents. Optimal helicopter base locations were estimated using the Maximal Covering Location Problem (MCLP) optimization model, exploring the number and location of bases needed to cover various fractions of the population for time thresholds 30 and 45 min, in green field scenarios and conditioned on the existing base structure. The existing bases covered 96.90% of the population and 91.86% of the incidents for time threshold 45 min. Correlation between municipality population and incident frequencies was -0.0027, and optimal base locations varied markedly between the two data types, particularly when lowering the target time. The optimal solution using population density data put focus on the greater Oslo area, where one third of Norwegians live, while using incident data put focus on low population high incident areas, such as northern Norway and winter sport resorts. Using population density data as a proxy for incident frequency is not recommended, as the two data types lead to different optimal base locations. Lowering the target time increases the sensitivity to choice of data.
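    The covering model used above can be sketched compactly. The study solves the Maximal Covering Location Problem (MCLP) exactly; the greedy heuristic below only illustrates the model's ingredients: demand weights (population or incident counts), candidate base sites, and the set of municipalities each site reaches within the response-time threshold. All data are placeholders.

      # Hedged greedy sketch of the MCLP idea: choose p base sites to maximize the
      # demand reachable within a time threshold. The study uses an exact solver;
      # this heuristic and its toy data are for illustration only.
      from typing import Dict, List, Set

      def greedy_mclp(demand: Dict[str, float],
                      coverage: Dict[str, Set[str]],
                      p: int) -> List[str]:
          """Pick p sites greedily, each time adding the site that covers the most
          currently uncovered demand."""
          chosen: List[str] = []
          covered: Set[str] = set()
          for _ in range(p):
              best_site, best_gain = None, 0.0
              for site, reachable in coverage.items():
                  if site in chosen:
                      continue
                  gain = sum(demand[m] for m in reachable - covered)
                  if gain > best_gain:
                      best_site, best_gain = site, gain
              if best_site is None:
                  break
              chosen.append(best_site)
              covered |= coverage[best_site]
          return chosen

      # Toy instance: demand per municipality and which municipalities each
      # candidate base reaches within the 45-minute threshold.
      demand = {"m1": 36000, "m2": 4700, "m3": 940, "m4": 12000, "m5": 2500}
      coverage = {"baseA": {"m1", "m2"}, "baseB": {"m2", "m3", "m5"}, "baseC": {"m4", "m5"}}

      print(greedy_mclp(demand, coverage, p=2))   # e.g. ['baseA', 'baseC']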

  19. Ear surgery techniques results on hearing threshold improvement

    Directory of Open Access Journals (Sweden)

    Farhad Mokhtarinejad

    2013-01-01

    Background: Bone conduction (BC) threshold depression is not always due to sensorineural hearing loss; it is sometimes an artifact caused by middle ear pathologies and ossicular chain problems. In this research, the influence of ear surgery on bone conduction was evaluated. Materials and Methods: This study was conducted as a clinical trial. Ear surgery was performed on 83 patients, classified into four categories: stapedectomy; tympanomastoid surgery; and partial or total ossicular reconstruction, with a Partial Ossicular Replacement Prosthesis (PORP) or a Total Ossicular Replacement Prosthesis (TORP). Bone conduction thresholds were assessed at frequencies of 250, 500, 1000, 2000 and 4000 Hz before and after surgery. Results: In the stapedectomy group, the average BC threshold improved at all frequencies, by approximately 6 dB at 2000 Hz. In the tympanomastoid group, the BC threshold at 500, 1000 and 2000 Hz changed by 4 dB (P-value < 0.05). Moreover, in the PORP group, a 5 dB enhancement was seen at 1000 and 2000 Hz. In the TORP group, the results confirmed that the BC threshold improved at all frequencies, especially at 4000 Hz, by about 6.5 dB. Conclusion: According to the results of this study, a BC threshold shift was seen after several ear surgeries such as stapedectomy, tympanoplasty, PORP and TORP, with an average BC improvement of approximately 5 dB. It must be considered that BC depression might occur because of ossicular chain problems; therefore, by resolving middle ear pathologies, a better BC threshold is obtained and fewer hearing problems are encountered.

  20. Crossing Thresholds: Identifying conceptual transitions in postsecondary teaching

    Directory of Open Access Journals (Sweden)

    Susan Wilcox

    2013-12-01

    In this paper we report on research we conducted to begin the process of identifying threshold concepts in the field of postsecondary teaching. Meyer & Land (2006) propose that within all disciplinary fields there seem to be particular threshold concepts that serve as gateways, opening up new and previously inaccessible ways of thinking and practicing. We developed a series of questions focusing on the “troublesome” and “transformative” characteristics of threshold concepts and asked these questions of several constituent groups, including those who are new to practice and the body of knowledge in postsecondary teaching and those who are already knowledgeable and/or experienced in the field. Based on our interpretation of participants’ responses, we identified four recognized concepts in the field of postsecondary teaching as potential threshold concepts in this field: Assessment for/as learning; Learning-centred teaching; Accommodation for diversity; and Context-driven practice. Our findings suggest that threshold concepts are relevant to the field of postsecondary teaching. Through this work, we hope to help educational developers and faculty members consider what is involved in learning to teach and developing teaching expertise, and to encourage critical discussion about the teaching development “curriculum” in postsecondary settings. Threshold concepts arise as a field develops and are defined as practitioners and scholars in the field define their field. At this stage, we believe the real value of threshold concepts for postsecondary teaching lies in the discussion that arises in the process of identifying and naming the concepts.

  1. [Relationship between Occlusal Discomfort Syndrome and Occlusal Threshold].

    Science.gov (United States)

    Munakata, Motohiro; Ono, Yumie; Hayama, Rika; Kataoka, Kanako; Ikuta, Ryuhei; Tamaki, Katsushi

    2016-03-01

    Occlusal dysesthesia has been defined as persistent uncomfortable feelings of intercuspal position continuing for more than 6 months without evidence of physical occlusal discrepancy. The problem often occurs after occlusal intervention by dental care. Although various dental treatments (e.g., occlusal adjustment, orthodontic treatment and prosthetic reconstruction) are attempted to solve occlusal dysesthesia, they rarely reach a satisfactory result for either patients or dentists. In Japan, these symptoms are defined by the term "Occlusal discomfort syndrome" (ODS). The aim of this study was to investigate the characteristics of ODS with a simple occlusal sensory perceptive and discriminative test. Twenty-one female dental patients with ODS (mean age 55.8 ± 19.2 years) and 21 age- and gender-matched dental patients without ODS (mean age 53.1 ± 16.8 years) participated in the study. Upon grinding occlusal registration foils that were stacked to different thicknesses, participants reported the thicknesses at which they recognized the foils (recognition threshold) and felt discomfort (discomfort threshold). Although there was no significant difference in occlusal recognition thresholds between the two patient groups, the discomfort threshold was significantly smaller in the patients with ODS than in those without ODS. Moreover, the recognition threshold showed an age-dependent increase in patients without ODS, whereas it remained comparable between the younger and older patient subgroups with ODS. These results suggest that the occlusal discomfort threshold, rather than the recognition threshold, is an issue in ODS. The foil grinding procedure is a simple and useful method to evaluate occlusal perceptive and discriminative abilities in patients with ODS.

  2. Use of Videoconferencing for Lactation Consultation: An Online Cross-Sectional Survey of Mothers' Acceptance in the United States.

    Science.gov (United States)

    Habibi, Mona F; Springer, Cary M; Spence, Marsha L; Hansen-Petrik, Melissa B; Kavanagh, Katherine F

    2018-05-01

    Suboptimal breastfeeding duration and exclusivity rates are a public health concern. Therefore, there is a need for identifying effective tools for use in interventions targeting specific barriers to optimal breastfeeding outcomes. Research aim: This study aimed to assess the relationship between acceptance of remote lactation consultation using videoconferencing and (a) maternal demographic factors, (b) technology acceptance subscales, (c) maternal learning style preferences, and (d) other potentially explanatory maternal factors. This was a cross-sectional, online study. English-speaking mothers of at least 18 years of age, with an infant age 4 months or younger, and who reported initiating breastfeeding were eligible to participate. Mothers were recruited from 27 randomly selected states. One hundred one mothers completed the survey, resulting in a response rate of 71%. The main outcome was acceptance of videoconferencing use for lactation consultation. No significant differences were found in acceptance by maternal demographic factors or learning style preferences. Acceptance was significantly related to perceived ease of use ( r = .680, p acceptance of videoconferencing for lactation consultation ( r = .432, p model ( R 2 = .616, p acceptance, maternal age was inversely related. This sample of mothers indicated general acceptance of videoconferencing for lactation consultation, with younger mothers and those perceiving it to be more useful demonstrating greater acceptance.

  3. A critical experimental study of the classical tactile threshold theory

    Directory of Open Access Journals (Sweden)

    Medina Leonel E

    2010-06-01

    Background: The tactile sense is being used in a variety of applications involving tactile human-machine interfaces. In a significant number of publications the classical threshold concept plays a central role in modelling and explaining psychophysical experimental results such as stochastic resonance (SR) phenomena. In SR, noise enhances detection of sub-threshold stimuli, and the phenomenon is explained by stating that the amplitude required to exceed the sensory threshold barrier can be reached by adding noise to a sub-threshold stimulus. We designed an experiment to test the validity of the classical vibrotactile threshold. Using a second-choice experiment, we show that individuals can order sensorial events below the level known as the classical threshold. If the observer's sensorial system were not activated by stimuli below the threshold, then a second choice could not be above the chance level. Nevertheless, our experimental results are above that chance level, contradicting the definition of the classical tactile threshold. Results: We performed a three-alternative forced-choice detection experiment on 6 subjects, asking them for first and second choices. In each trial, only one of the intervals contained a stimulus and the others contained only noise. According to the classical threshold assumptions, a correct second-choice response corresponds to a guess attempt with a statistical frequency of 50%. Results show an average of 67.35% (STD = 1.41%) for the second-choice response, which is not explained by the classical threshold definition. Additionally, for low stimulus amplitudes, second-choice correct detection is above chance level for any detectability level. Conclusions: Using a second-choice experiment, we show that individuals can order sensorial events below the level known as a classical threshold. If the observer's sensorial system were not activated by stimuli below the threshold, then a second choice could not be above the chance
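    A small Monte Carlo sketch of the chance-level argument above: under a hard-threshold account, a missed stimulus leaves the observer guessing, so the second choice is correct about 50% of the time when the first choice is wrong, whereas a continuous-evidence (signal detection) observer stays above that level even for weak stimuli. The Gaussian noise model and stimulus amplitude below are assumptions, not the authors' experimental parameters.

      # Hedged simulation of second-choice performance in a 3-interval forced choice.
      import numpy as np

      rng = np.random.default_rng(0)
      n_trials, amplitude = 200_000, 0.5           # weak ("sub-threshold") stimulus

      # Internal responses: interval 0 contains the stimulus, intervals 1-2 noise only.
      responses = rng.normal(size=(n_trials, 3))
      responses[:, 0] += amplitude

      order = np.argsort(-responses, axis=1)        # intervals ranked by response strength
      first_correct = order[:, 0] == 0
      second_correct_given_first_wrong = (order[~first_correct, 1] == 0).mean()

      print(f"first choice correct:                {first_correct.mean():.3f}")
      print(f"second choice correct | first wrong: {second_correct_given_first_wrong:.3f}")
      # A classical (hard) threshold account predicts 0.5 for the conditional rate;
      # the continuous-evidence simulation gives a value clearly above 0.5.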

  4. The usage of filtration as fulfillment of acceptable indoor air quality and optimal energy management

    International Nuclear Information System (INIS)

    Burroughs, H.E.

    1993-01-01

    The role of filtration is a significant factor in the prevention and mitigation of indoor air quality problems. ASHRAE Standard 62-89, Ventilation for Acceptable Indoor Air Quality, makes broad and non-specific references to filtration. This paper provides guidelines for the usage of filtration as a means of fulfilling the Standard's requirements. The paper also references the specific authorities as iterated in the Standard. The discussion includes the usage of filtration in treating contaminated outside air, protection of equipment and systems, protection of occupants, reduction of ventilation air, and source control. The reduction of ventilation air through filtration has significant and positive energy management benefits. Other energy benefits accrue from clean heat-exchange surfaces

  5. Optimization of input parameters of supra-threshold stochastic resonance image processing algorithm for the detection of abdomino-pelvic tumors on PET/CT scan

    International Nuclear Information System (INIS)

    Pandey, Anil Kumar; Saroha, Kartik; Patel, C.D.; Bal, C.S.; Kumar, Rakesh

    2016-01-01

    Administration of diuretics increases urine output to clear radioactive urine from the kidneys and bladder. Hence, a post-diuretic pelvic PET/CT scan enhances the probability of detection of abdomino-pelvic tumors. However, it causes discomfort to patients and also has some side effects. Application of the supra-threshold stochastic resonance (SSR) image processing algorithm to a pre-diuretic PET/CT scan may also increase the probability of detection of these tumors. The amount of noise and the threshold are two variable parameters that affect the final image quality. This study was conducted to investigate the effect of these two parameters on the detection of abdomino-pelvic tumors
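    A generic sketch of supra-threshold stochastic resonance image processing follows: independent noise realizations are added to the input image, each noisy copy is binarized at a threshold, and the binary copies are averaged. The noise amplitude and the threshold are the two tunable parameters referred to above; the implementation and the toy image are assumptions, not necessarily the authors' exact algorithm.

      # Hedged SSR sketch: average of thresholded noisy copies of an image.
      import numpy as np

      def ssr_enhance(image: np.ndarray, threshold: float, sigma: float,
                      n_realizations: int = 64, seed: int = 0) -> np.ndarray:
          """Return the mean of thresholded noisy copies of `image` (values in [0, 1])."""
          rng = np.random.default_rng(seed)
          acc = np.zeros_like(image, dtype=float)
          for _ in range(n_realizations):
              noisy = image + rng.normal(scale=sigma, size=image.shape)
              acc += (noisy > threshold).astype(float)
          return acc / n_realizations

      # Toy example: a faint "lesion" on a noisy background.
      img = np.full((64, 64), 0.30)
      img[28:36, 28:36] = 0.38                     # low-contrast target
      img += np.random.default_rng(1).normal(scale=0.02, size=img.shape)

      out = ssr_enhance(img, threshold=0.35, sigma=0.05)
      print("background mean:", out[:10, :10].mean().round(3),
            " target mean:", out[28:36, 28:36].mean().round(3))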

  6. Factors associated with the acceptance of sugar and sugar substitutes by the public.

    Science.gov (United States)

    Mackay, D A

    1985-09-01

    Acceptance is described in both market and sensory research terminology and recent developments in the fields of applied psychology and physiology are examined for their pertinence to public acceptance of sucrose and its substitutes. Information on the function of sucrose in foods other than beverages is presented with emphasis on salivation as an acceptance factor and attention is drawn to its possible dental significance. Distinctions are made between the sweetening and bulking properties of sucrose and sugar substitutes. Factors having a bearing on the acceptance of sweet foods and the determination of their optimal sugar content are described in detail. While major decreases in sucrose intake in the US resulted from high-fructose corn-sweetener usage in soft drinks, no evidence is yet available to suggest that the use of sugar substitutes of the intense artificial sweetener type has caused any decrease in ordinary sugar consumption. Neither is the consumption of polyols (sorbitol, mannitol, xylitol) high enough in confectionery categories to cause any discernible decrease in sugar usage. The evidence suggests not so much that sugar substitutes may have stopped the growth in sucrose usage, but that new product categories such as diet foods and "sugarless" confections may have been created. These categories were never available to fermentable carbohydrate sweeteners and equivalence in acceptance to sucrose-sweetened products was not an important factor in their growth.(ABSTRACT TRUNCATED AT 250 WORDS)

  7. A threshold for dissipative fission

    International Nuclear Information System (INIS)

    Thoennessen, M.; Bertsch, G.F.

    1993-01-01

    The empirical domain of validity of statistical theory is examined as applied to data on pre-fission neutron, charged-particle, and γ-ray multiplicities. Systematics are found for the threshold excitation energy for the appearance of nonstatistical fission. From the data on systems with not too high fissility, the relevant phenomenological parameter is the ratio of the threshold temperature T_thresh to the (temperature-dependent) fission barrier height E_Bar(T). The statistical model reproduces the data below a critical value of T_thresh/E_Bar(T), with this ratio approximately independent of the mass and fissility of the systems

  8. Threshold current for fireball generation

    Science.gov (United States)

    Dijkhuis, Geert C.

    1982-05-01

    Fireball generation from a high-intensity circuit breaker arc is interpreted here as a quantum-mechanical phenomenon caused by severe cooling of electrode material evaporating from contact surfaces. According to the proposed mechanism, quantum effects appear in the arc plasma when the radius of one magnetic flux quantum inside solid electrode material has shrunk to one London penetration length. A formula derived for the threshold discharge current preceding fireball generation is found compatible with data reported by Silberg. This formula predicts linear scaling of the threshold current with the circuit breaker's electrode radius and concentration of conduction electrons.

  9. Regional Seismic Threshold Monitoring

    National Research Council Canada - National Science Library

    Kvaerna, Tormod

    2006-01-01

    ... model to be used for predicting the travel times of regional phases. We have applied these attenuation relations to develop and assess a regional threshold monitoring scheme for selected subregions of the European Arctic...

  10. Optimized Reputable Sensing Participants Extraction for Participatory Sensor Networks

    Directory of Open Access Journals (Sweden)

    Weiwei Yuan

    2014-01-01

    By collecting data via sensors embedded in personal smart devices, sensing participants play a key role in participatory sensor networks. Using information provided by reputable sensing participants ensures the reliability of participatory sensing data. A threshold is set for reputation, and those whose reputations exceed this value are regarded as reputable. The larger the threshold value, the more reliable the extracted reputable sensing participants. However, if the threshold value is too large, only very limited participatory sensing data can be involved, which may cause unexpected bias in information collection. Existing works did not consider the relationship between the reliability of the extracted reputable sensing participants and the ratio of usable participatory sensing data. In this work, we propose a criterion for optimized reputable sensing participant extraction in participatory sensor networks. This is achieved through a mathematical analysis of the ratio of available participatory sensing data and the reliability of the extracted reputable sensing participants. Our suggested threshold value for reputable sensing participant extraction is related only to the power of the sensing participants' reputation distribution, making it easy to apply in real applications. Simulation results tested on real application data further verified the effectiveness of our proposed method.
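    To illustrate the trade-off analyzed above, the sketch below sweeps candidate reputation thresholds over synthetic power-law reputations and reports the fraction of usable sensing data against a simple reliability proxy. The paper's criterion is derived analytically from the power of the reputation distribution; the sweep, the Pareto model, and the reliability proxy here are illustrative assumptions only.

      # Hedged illustration: usable-data ratio vs. reliability as the reputation
      # threshold grows, on synthetic power-law reputations.
      import numpy as np

      rng = np.random.default_rng(0)
      alpha = 2.5                                          # assumed power of the reputation distribution
      reputations = rng.pareto(alpha, size=10_000) + 1.0   # synthetic reputations >= 1

      for thr in (1.0, 1.5, 2.0, 3.0, 5.0):
          kept = reputations >= thr
          data_ratio = kept.mean()                  # fraction of sensing data still usable
          reliability = reputations[kept].mean()    # simple proxy: mean reputation of kept participants
          print(f"threshold {thr:>4.1f}: usable data {data_ratio:6.1%}, "
                f"mean reputation of kept participants {reliability:5.2f}")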

  11. Application of Improved Wavelet Thresholding Function in Image Denoising Processing

    Directory of Open Access Journals (Sweden)

    Hong Qi Zhang

    2014-07-01

    Wavelet analysis is a time-frequency analysis method that handles time-frequency localization well. This paper reviews the basic principles of the wavelet transform and the relationship between the Lipschitz exponent of signal singularities and the local maxima of the wavelet transform coefficient moduli, analyzes the principles of wavelet-transform image denoising, and studies the disadvantages of traditional wavelet thresholding functions. An improved threshold function is proposed that addresses the discontinuity of hard thresholding and the constant bias of soft thresholding, and images are denoised using the improved function.
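    The thresholding functions discussed above can be sketched as follows: hard thresholding is discontinuous at the threshold, soft thresholding biases all retained coefficients by a constant amount, and a compromise function is continuous while approaching the identity for large coefficients. The exponential compromise shown here is one commonly used improved function; the exact function proposed in the paper may differ.

      # Hedged sketch of hard, soft, and one "improved" wavelet thresholding function.
      import numpy as np

      def hard_threshold(w: np.ndarray, t: float) -> np.ndarray:
          return np.where(np.abs(w) > t, w, 0.0)

      def soft_threshold(w: np.ndarray, t: float) -> np.ndarray:
          return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

      def improved_threshold(w: np.ndarray, t: float, alpha: float = 3.0) -> np.ndarray:
          """Continuous compromise: the shrinkage decays exponentially above the
          threshold, so there is no jump at |w| = t (unlike hard thresholding) and
          the bias vanishes for large |w| (unlike soft thresholding)."""
          shrink = t * np.exp(-alpha * (np.abs(w) - t))
          return np.where(np.abs(w) > t, np.sign(w) * (np.abs(w) - shrink), 0.0)

      w = np.linspace(-4, 4, 9)
      t = 1.0
      print("hard    :", np.round(hard_threshold(w, t), 3))
      print("soft    :", np.round(soft_threshold(w, t), 3))
      print("improved:", np.round(improved_threshold(w, t), 3))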

  12. Threshold law for positron-atom impact ionisation

    International Nuclear Information System (INIS)

    Temkin, A.

    1982-01-01

    The threshold law for ionisation of atoms by positron impact is adduced in analogy with the author's approach to electron-atom ionisation. It is concluded that the Coulomb-dipole region of the potential gives the essential part of the interaction in both cases and leads to the same kind of result: a modulated linear law. An additional process which enters positron ionisation is positronium formation in the continuum, but that will not dominate the threshold yield. The result is in sharp contrast to the positron threshold law as recently derived by Klar (J. Phys. B.; 14:4165 (1981)) on the basis of a Wannier-type (Phys. Rev.; 90:817 (1953)) analysis. (author)

  13. Baryon-antibaryon threshold and ω-baryonium mixing

    International Nuclear Information System (INIS)

    Gavai, R.V.

    1981-01-01

    It is shown that in any dual-topological-unitarization model of ω-baryonium (B) mixing at the cylinder level, in which the production of baryon-antibaryon (bb-bar) pairs can take place only above a certain threshold energy, the phenomenologically relevant ω and B trajectories do not mix below the bb-bar threshold. However, their couplings to external particles do get modified. The ω-B mixing angle θ_ωB, which characterizes these coupling modification effects below the bb-bar threshold at t = 0, is estimated in some models. These estimates are found to agree reasonably well with the existing phenomenological bound on θ_ωB

  14. Application of Neural Network Optimized by Mind Evolutionary Computation in Building Energy Prediction

    Science.gov (United States)

    Song, Chen; Zhong-Cheng, Wu; Hong, Lv

    2018-03-01

    Building energy forecasting plays an important role in energy management and planning. Using the mind evolutionary algorithm to find optimal network weights and thresholds, and thereby optimize the BP neural network, can overcome the tendency of BP training to become trapped in a local minimum. The optimized network is used both for time-series prediction and for same-month forecasting, yielding two predicted values. These two predictions are then fed into a neural network to obtain the final forecast value. The effectiveness of the method was verified experimentally with energy data from three buildings in Hefei.
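    A minimal sketch of the overall idea follows: an evolutionary search over the flattened weight and bias vector of a small network finds a good starting point, which would then be refined by ordinary BP (gradient) training so that training is less likely to stall in a poor local minimum. A plain mutation-and-selection loop stands in here for the mind evolutionary computation algorithm, and the data and network size are illustrative.

      # Hedged sketch: evolutionary initialization of a tiny 3-4-1 MLP for a
      # one-step-ahead "monthly energy" task. Not the paper's MEC algorithm.
      import numpy as np

      rng = np.random.default_rng(0)

      # Illustrative monthly series mapped to a one-step-ahead regression task.
      series = np.sin(np.arange(60) * 2 * np.pi / 12) + 0.05 * rng.normal(size=60)
      X = np.stack([series[i:i + 3] for i in range(len(series) - 3)])   # 3 past months
      y = series[3:]

      def unpack(theta):                          # 3-4-1 MLP parameters from a flat vector
          W1 = theta[:12].reshape(3, 4); b1 = theta[12:16]
          W2 = theta[16:20].reshape(4, 1); b2 = theta[20]
          return W1, b1, W2, b2

      def mse(theta):
          W1, b1, W2, b2 = unpack(theta)
          h = np.tanh(X @ W1 + b1)
          return float(np.mean((h @ W2 + b2 - y[:, None]) ** 2))

      # Evolutionary phase: keep the best weight vector, mutate around it (elitism).
      pop = rng.normal(scale=0.5, size=(40, 21))
      for _ in range(200):
          best = pop[np.argmin([mse(p) for p in pop])]
          pop = best + rng.normal(scale=0.1, size=(40, 21))
          pop[0] = best

      print("MSE after evolutionary initialization:", round(mse(best), 4))
      # `best` would then serve as the initial weights for standard BP training.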

  15. Chinese Nurses' Acceptance of PDA: A Cross-Sectional Survey Using a Technology Acceptance Model.

    Science.gov (United States)

    Wang, Yanling; Xiao, Qian; Sun, Liu; Wu, Ying

    2016-01-01

    This study explores Chinese nurses' acceptance of PDAs, using a questionnaire based on the framework of the Technology Acceptance Model (TAM). 357 nurses were involved in the study. The results reveal that the nurses' PDA-acceptance scores had means of 3.18–3.36 across the four dimensions. Younger age, higher professional title, longer previous usage time, and more experience using PDAs were associated with greater acceptance of PDAs. Therefore, hospital administrators may adjust their strategies to enhance nurses' acceptance of PDAs and promote their wide application.

  16. Noise thresholds for optical quantum computers.

    Science.gov (United States)

    Dawson, Christopher M; Haselgrove, Henry L; Nielsen, Michael A

    2006-01-20

    In this Letter we numerically investigate the fault-tolerant threshold for optical cluster-state quantum computing. We allow both photon loss noise and depolarizing noise (as a general proxy for all local noise), and obtain a threshold region of allowed pairs of values for the two types of noise. Roughly speaking, our results show that scalable optical quantum computing is possible for photon loss probabilities < 3 × 10⁻³, and for depolarization probabilities < 10⁻⁴.

  17. Thresholds of Toxicological Concern - Setting a threshold for testing below which there is little concern.

    Science.gov (United States)

    Hartung, Thomas

    2017-01-01

    Low dose, low risk; very low dose, no real risk. Setting a pragmatic threshold below which concerns become negligible is the purpose of thresholds of toxicological concern (TTC). The idea is that such threshold values do not need to be established for each and every chemical based on experimental data, but that by analyzing the distribution of lowest or no-effect doses of many chemicals, a TTC can be defined - typically using the 5th percentile of this distribution and lowering it by an uncertainty factor of, e.g., 100. In doing so, TTC aims to compare exposure information (dose) with a threshold below which any hazard manifestation is very unlikely to occur. The history and current developments of this concept are reviewed and the application of TTC for different regulated products and their hazards is discussed. TTC lends itself as a pragmatic filter to deprioritize testing needs whenever real-life exposures are much lower than levels where hazard manifestation would be expected, a situation that is called "negligible exposure" in the REACH legislation, though the TTC concept has not been fully incorporated in its implementation (yet). Other areas and regulations - especially in the food sector and for pharmaceutical impurities - are more proactive. Large, curated databases on toxic effects of chemicals provide us with the opportunity to set TTC for many hazards and substance classes and thus offer a precautionary second tier for risk assessments if hazard cannot be excluded. This allows focusing testing efforts better on relevant exposures to chemicals.
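    A minimal numerical sketch of the derivation described above: take the 5th percentile of a distribution of no-effect doses across many chemicals and divide by an uncertainty factor of 100. The dose values below are randomly generated placeholders, not a curated toxicological database.

      # Hedged sketch of a TTC computation from a distribution of no-effect doses.
      import numpy as np

      noel_mg_per_kg_day = np.random.default_rng(0).lognormal(mean=1.0, sigma=1.5, size=500)

      percentile_5 = np.percentile(noel_mg_per_kg_day, 5)   # 5th percentile of the distribution
      uncertainty_factor = 100.0
      ttc = percentile_5 / uncertainty_factor

      print(f"5th percentile of no-effect doses: {percentile_5:.3f} mg/kg bw/day")
      print(f"TTC (UF = 100):                    {ttc:.5f} mg/kg bw/day")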

  18. Parental social coaching promotes adolescent peer acceptance across the middle school transition.

    Science.gov (United States)

    Gregson, Kim D; Tu, Kelly M; Erath, Stephen A; Pettit, Gregory S

    2017-09-01

    The present study investigated longitudinal associations between behavioral and cognitive dimensions of parental social coaching (i.e., advice about how to behave or think about peer challenges) and young adolescents' peer acceptance, and whether such associations are moderated by youths' social skills. Time 1 (T1) participants included 123 young adolescents (M age = 12.03 years; 50% boys; 58.5% European American). Parents gave open-ended reports about their social coaching to hypothetical peer stress scenarios, which were coded from low to high quality on behavioral and cognitive dimensions. Parents and teachers reported on adolescent prosocial behavior (i.e., social-behavioral skills), and adolescents reported on their social appraisals and social self-efficacy (i.e., social-cognitive skills). At T1 (before the first year of middle school) and Time 2 (approximately 10 months later, after the first year of middle school), parents and teachers rated adolescent peer acceptance. Analyses revealed that parents' prosocial behavioral advice and benign cognitive framing independently predicted adolescents' higher peer acceptance prospectively (controlling for earlier levels of peer acceptance). Furthermore, adolescent social skills moderated links between coaching and peer acceptance. Specifically, adolescents with higher, but not lower, social-cognitive skills became more accepted in the context of higher-quality coaching, supporting a "capitalization" pattern, such that these youth may be better able to utilize coaching suggestions. Results underscore the utility of parents' behavioral advice and cognitive framing for adolescent peer adjustment across the middle school transition and suggest that optimal social-coaching strategies may depend in part on adolescent social skill level. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  19. Contraction Options and Optimal Multiple-Stopping in Spectrally Negative Lévy Models

    Energy Technology Data Exchange (ETDEWEB)

    Yamazaki, Kazutoshi, E-mail: kyamazak@kansai-u.ac.jp [Kansai University, Department of Mathematics, Faculty of Engineering Science (Japan)]

    2015-08-15

    This paper studies the optimal multiple-stopping problem arising in the context of the timing option to withdraw from a project in stages. The profits are driven by a general spectrally negative Lévy process. This allows the model to incorporate sudden declines of the project values, generalizing greatly the classical geometric Brownian motion model. We solve the one-stage case as well as the extension to the multiple-stage case. The optimal stopping times are of threshold-type and the value function admits an expression in terms of the scale function. A series of numerical experiments are conducted to verify the optimality and to evaluate the efficiency of the algorithm.

  20. Contraction Options and Optimal Multiple-Stopping in Spectrally Negative Lévy Models

    International Nuclear Information System (INIS)

    Yamazaki, Kazutoshi

    2015-01-01

    This paper studies the optimal multiple-stopping problem arising in the context of the timing option to withdraw from a project in stages. The profits are driven by a general spectrally negative Lévy process. This allows the model to incorporate sudden declines of the project values, generalizing greatly the classical geometric Brownian motion model. We solve the one-stage case as well as the extension to the multiple-stage case. The optimal stopping times are of threshold-type and the value function admits an expression in terms of the scale function. A series of numerical experiments are conducted to verify the optimality and to evaluate the efficiency of the algorithm