WorldWideScience

Sample records for sequentially simulated outcomes

  1. A discrete event modelling framework for simulation of long-term outcomes of sequential treatment strategies for ankylosing spondylitis

    NARCIS (Netherlands)

    A. Tran-Duy (An); A. Boonen (Annelies); M.A.F.J. van de Laar (Mart); A. Franke (Andre); J.L. Severens (Hans)

    2011-01-01

    Objective: To develop a modelling framework which can simulate long-term quality of life, societal costs and cost-effectiveness as affected by sequential drug treatment strategies for ankylosing spondylitis (AS). Methods: The discrete event simulation paradigm was selected for model

  2. A discrete event modelling framework for simulation of long-term outcomes of sequential treatment strategies for ankylosing spondylitis

    NARCIS (Netherlands)

    Tran-Duy, A.; Boonen, A.; Laar, M.A.F.J.; Franke, A.C.; Severens, J.L.

    2011-01-01

    Objective To develop a modelling framework which can simulate long-term quality of life, societal costs and cost-effectiveness as affected by sequential drug treatment strategies for ankylosing spondylitis (AS). Methods Discrete event simulation paradigm was selected for model development. Drug

  3. A discrete event modelling framework for simulation of long-term outcomes of sequential treatment strategies for ankylosing spondylitis.

    Science.gov (United States)

    Tran-Duy, An; Boonen, Annelies; van de Laar, Mart A F J; Franke, Angelinus C; Severens, Johan L

    2011-12-01

    To develop a modelling framework which can simulate long-term quality of life, societal costs and cost-effectiveness as affected by sequential drug treatment strategies for ankylosing spondylitis (AS). The discrete event simulation paradigm was selected for model development. Drug efficacy was modelled as changes in disease activity (Bath Ankylosing Spondylitis Disease Activity Index (BASDAI)) and functional status (Bath Ankylosing Spondylitis Functional Index (BASFI)), which were linked to costs and health utility using statistical models fitted to an observational AS cohort. Published clinical data were used to estimate drug efficacy and time to events. Two strategies were compared: (1) five available non-steroidal anti-inflammatory drugs (strategy 1) and (2) the same as strategy 1 plus two tumour necrosis factor α inhibitors (strategy 2). 13,000 patients were followed up individually until death. For probabilistic sensitivity analysis, Monte Carlo simulations were performed with 1000 sets of parameters sampled from the appropriate probability distributions. The models successfully generated valid data on treatments, BASDAI, BASFI, utility, quality-adjusted life years (QALYs) and costs at time points with intervals of 1-3 months over the simulation length of 70 years. Incremental cost per QALY gained in strategy 2 compared with strategy 1 was €35,186. At a willingness-to-pay threshold of €80,000, it was 99.9% certain that strategy 2 was cost-effective. The modelling framework provides great flexibility to implement complex algorithms representing treatment selection, disease progression and changes in costs and utilities over time for patients with AS. Results obtained from the simulation are plausible.
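
    The framework described above is built on discrete event simulation. As a rough illustration of the general pattern only (an event queue driving treatment switches, with health utilities accumulated between events into QALYs), here is a minimal Python sketch; the drug list, response probabilities, utilities and time-to-event draws are invented placeholders, not values from the study.

    import heapq
    import random

    # Illustrative sequential strategy: (drug name, response probability, mean years on drug if responding).
    # These numbers are placeholders, not estimates from the study above.
    STRATEGY = [("NSAID_A", 0.35, 2.0), ("NSAID_B", 0.30, 2.0), ("TNFi_A", 0.55, 5.0)]
    UTIL_RESPONSE, UTIL_NO_RESPONSE = 0.75, 0.55   # assumed health-state utilities

    def simulate_patient(rng, horizon_years=70.0):
        """Discrete event simulation of one patient; events are (time, kind, drug_index)."""
        events = [(rng.expovariate(1.0 / 40.0), "death", None),   # crude overall survival draw
                  (0.0, "start", 0)]                              # start the first drug immediately
        heapq.heapify(events)
        qaly, last_t, utility = 0.0, 0.0, UTIL_NO_RESPONSE
        while events:
            t, kind, idx = heapq.heappop(events)
            t = min(t, horizon_years)
            qaly += utility * (t - last_t)        # accumulate QALYs since the previous event
            last_t = t
            if kind == "death" or t >= horizon_years:
                break
            if kind == "start":                   # assess response to drug idx
                _, p_resp, mean_dur = STRATEGY[idx]
                if rng.random() < p_resp:
                    utility = UTIL_RESPONSE
                    heapq.heappush(events, (t + rng.expovariate(1.0 / mean_dur), "fail", idx))
                else:
                    utility = UTIL_NO_RESPONSE
                    heapq.heappush(events, (t + 0.25, "fail", idx))   # switch after 3 months
            elif kind == "fail":                  # treatment failure: move to the next drug, if any
                utility = UTIL_NO_RESPONSE
                if idx + 1 < len(STRATEGY):
                    heapq.heappush(events, (t, "start", idx + 1))
        return qaly

    rng = random.Random(1)
    mean_qaly = sum(simulate_patient(rng) for _ in range(10_000)) / 10_000
    print(f"mean QALYs per patient under the illustrative strategy: {mean_qaly:.2f}")

    In the published model, events such as response assessment, disease progression, costs and death would instead be scheduled from the fitted statistical models and clinical data mentioned in the abstract.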

  4. Accelerating Sequential Gaussian Simulation with a constant path

    Science.gov (United States)

    Nussbaumer, Raphaël; Mariethoz, Grégoire; Gravey, Mathieu; Gloaguen, Erwan; Holliger, Klaus

    2018-03-01

    Sequential Gaussian Simulation (SGS) is a stochastic simulation technique commonly employed for generating realizations of Gaussian random fields. Arguably, the main limitation of this technique is the high computational cost associated with determining the kriging weights. This problem is compounded by the fact that often many realizations are required to allow for an adequate uncertainty assessment. A seemingly simple way to address this problem is to keep the same simulation path for all realizations. This results in identical neighbourhood configurations and hence the kriging weights only need to be determined once and can then be re-used in all subsequent realizations. This approach is generally not recommended because it is expected to result in correlation between the realizations. Here, we challenge this common preconception and make the case for the use of a constant path approach in SGS by systematically evaluating the associated benefits and limitations. We present a detailed implementation, particularly regarding parallelization and memory requirements. Extensive numerical tests demonstrate that using a constant path allows for substantial computational gains with very limited loss of simulation accuracy. This is especially the case for a constant multi-grid path. The computational savings can be used to increase the neighbourhood size, thus allowing for a better reproduction of the spatial statistics. The outcome of this study is a recommendation for an optimal implementation of SGS that maximizes accurate reproduction of the covariance structure as well as computational efficiency.
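
    As a minimal illustration of the constant-path idea, the sketch below assumes simple kriging with a unit-sill exponential covariance on a 1-D grid and a full previously-simulated neighbourhood: the kriging systems are solved once along a single fixed path and the stored weights are reused for every realization. Grid size, range and realization count are arbitrary.

    import numpy as np

    def cov(h, corr_range=10.0):
        """Exponential covariance with unit sill."""
        return np.exp(-np.abs(h) / corr_range)

    n, n_real = 60, 100
    x = np.arange(n, dtype=float)
    rng = np.random.default_rng(0)

    path = rng.permutation(n)        # ONE simulation path, shared by all realizations
    weights, cond_sd, neighbours = [], [], []
    visited = []                     # indices already simulated, in path order

    # Pass 1: solve each simple-kriging system once along the constant path.
    for node in path:
        if not visited:
            weights.append(np.array([]))
            cond_sd.append(1.0)
            neighbours.append(np.array([], dtype=int))
        else:
            nb = np.array(visited)
            K = cov(x[nb][:, None] - x[nb][None, :])   # covariances among conditioning points
            k = cov(x[nb] - x[node])                   # covariances to the target node
            lam = np.linalg.solve(K, k)
            weights.append(lam)
            cond_sd.append(np.sqrt(max(1.0 - lam @ k, 1e-12)))
            neighbours.append(nb)
        visited.append(node)

    # Pass 2: reuse the stored weights and variances for every realization.
    realizations = np.empty((n_real, n))
    for r in range(n_real):
        z = np.empty(n)
        for step, node in enumerate(path):
            nb = neighbours[step]
            mean = weights[step] @ z[nb] if nb.size else 0.0
            z[node] = mean + cond_sd[step] * rng.standard_normal()
        realizations[r] = z

    print(realizations.mean(), realizations.std())     # roughly 0 and 1 over many realizations

    With a full neighbourhood the weights depend only on the geometry of the fixed path, so their reuse is exact; the accuracy question examined in the paper arises once the neighbourhood is restricted, which is also where the recommended multi-grid path matters.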

  5. A path-level exact parallelization strategy for sequential simulation

    Science.gov (United States)

    Peredo, Oscar F.; Baeza, Daniel; Ortiz, Julián M.; Herrero, José R.

    2018-01-01

    Sequential Simulation is a well known method in geostatistical modelling. Following the Bayesian approach for simulation of conditionally dependent random events, Sequential Indicator Simulation (SIS) method draws simulated values for K categories (categorical case) or classes defined by K different thresholds (continuous case). Similarly, Sequential Gaussian Simulation (SGS) method draws simulated values from a multivariate Gaussian field. In this work, a path-level approach to parallelize SIS and SGS methods is presented. A first stage of re-arrangement of the simulation path is performed, followed by a second stage of parallel simulation for non-conflicting nodes. A key advantage of the proposed parallelization method is to generate identical realizations as with the original non-parallelized methods. Case studies are presented using two sequential simulation codes from GSLIB: SISIM and SGSIM. Execution time and speedup results are shown for large-scale domains, with many categories and maximum kriging neighbours in each case, achieving high speedup results in the best scenarios using 16 threads of execution in a single machine.
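
    The core of a path-level parallelization can be viewed as a batching problem: walking the path in order, a node may join the current batch only if no earlier member of that batch lies inside its search neighbourhood, since the sequential algorithm would otherwise have conditioned on that member's simulated value. The simplified Python sketch below (a plain distance-radius neighbourhood, not the authors' SISIM/SGSIM implementation) groups a random path into such conflict-free batches, which could then be simulated concurrently while reproducing the sequential result.

    import numpy as np

    def conflict_free_batches(coords, path, radius):
        """Greedily split a simulation path into batches of non-conflicting nodes.

        Node j conflicts with an earlier node i in the same batch if i falls inside
        j's search neighbourhood, because sequential simulation would have used
        i's simulated value when drawing a value at j.
        """
        batches, current = [], []
        for j in path:
            conflict = any(np.linalg.norm(coords[j] - coords[i]) <= radius for i in current)
            if conflict:
                batches.append(current)
                current = [j]
            else:
                current.append(j)
        if current:
            batches.append(current)
        return batches

    rng = np.random.default_rng(0)
    coords = rng.uniform(0, 100, size=(500, 2))      # 500 nodes in a 100 x 100 domain
    path = rng.permutation(len(coords))
    batches = conflict_free_batches(coords, path, radius=5.0)
    print(len(batches), "batches; mean batch size", len(coords) / len(batches))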

  6. Sequentially linear analysis for simulating brittle failure

    NARCIS (Netherlands)

    van de Graaf, A.V.

    2017-01-01

    The numerical simulation of brittle failure at structural level with nonlinear finite element analysis (NLFEA) remains a challenge due to robustness issues. We attribute these problems to the dimensions of real-world structures combined with softening behavior and negative tangent stiffness at

  7. Sequential Computerized Mastery Tests--Three Simulation Studies

    Science.gov (United States)

    Wiberg, Marie

    2006-01-01

    A simulation study of a sequential computerized mastery test is carried out with items modeled with the 3 parameter logistic item response theory model. The examinees' responses are either identically distributed, not identically distributed, or not identically distributed together with estimation errors in the item characteristics. The…

  8. The use of sequential indicator simulation to characterize geostatistical uncertainty

    International Nuclear Information System (INIS)

    Hansen, K.M.

    1992-10-01

    Sequential indicator simulation (SIS) is a geostatistical technique designed to aid in the characterization of uncertainty about the structure or behavior of natural systems. This report discusses a simulation experiment designed to study the quality of uncertainty bounds generated using SIS. The results indicate that, while SIS may produce reasonable uncertainty bounds in many situations, factors like the number and location of available sample data, the quality of variogram models produced by the user, and the characteristics of the geologic region to be modeled, can all have substantial effects on the accuracy and precision of estimated confidence limits. It is recommended that users of SIS conduct validation studies for the technique on their particular regions of interest before accepting the output uncertainty bounds

  9. Memory and decision making: Effects of sequential presentation of probabilities and outcomes in risky prospects.

    Science.gov (United States)

    Millroth, Philip; Guath, Mona; Juslin, Peter

    2018-06-07

    The rationality of decision making under risk is of central concern in psychology and other behavioral sciences. In real life, the information relevant to a decision often arrives sequentially or changes over time, implying nontrivial demands on memory. Yet, little is known about how this affects the ability to make rational decisions, and the default assumption is rather that information about outcomes and probabilities is simultaneously available at the time of the decision. In 4 experiments, we show that participants receiving probability and outcome information sequentially report substantially (29 to 83%) higher certainty equivalents than participants with simultaneous presentation. This also holds for monetarily incentivized participants with perfect recall of the information. Participants in the sequential conditions often violate stochastic dominance in the sense that they pay more for a lottery with a low probability of an outcome than participants in the simultaneous condition pay for a high probability of the same outcome. Computational modeling demonstrates that Cumulative Prospect Theory (Tversky & Kahneman, 1992) fails to account for the effects of sequential presentation, but a model assuming anchoring-and-adjustment constrained by memory can account for the data. By implication, established assumptions of rationality may need to be reconsidered to account for the effects of memory in many real-life tasks. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
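
    For reference, the Cumulative Prospect Theory benchmark cited in the abstract reduces, for a simple gain prospect (win x with probability p, otherwise nothing), to a certainty equivalent CE = (w(p) * x^alpha)^(1/alpha) under the Tversky-Kahneman (1992) value and probability-weighting functions. The sketch below uses the commonly cited 1992 median parameter estimates purely for illustration; it is not the computational model fitted in the study.

    def w(p, gamma=0.61):
        """Tversky-Kahneman (1992) probability weighting function for gains."""
        return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

    def cpt_certainty_equivalent(x, p, alpha=0.88, gamma=0.61):
        """Certainty equivalent of the prospect (x, p; 0) for x >= 0."""
        value = w(p, gamma) * x**alpha      # decision weight times the value of the gain
        return value ** (1 / alpha)         # invert v(z) = z**alpha

    # A 10% chance of 100 is valued at about 14.8, above its expected value of 10,
    # because small probabilities are over-weighted.
    print(cpt_certainty_equivalent(100.0, 0.10))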

  10. Reliability assessment of restructured power systems using reliability network equivalent and pseudo-sequential simulation techniques

    International Nuclear Information System (INIS)

    Ding, Yi; Wang, Peng; Goel, Lalit; Billinton, Roy; Karki, Rajesh

    2007-01-01

    This paper presents a technique to evaluate reliability of a restructured power system with a bilateral market. The proposed technique is based on the combination of the reliability network equivalent and pseudo-sequential simulation approaches. The reliability network equivalent techniques have been implemented in the Monte Carlo simulation procedure to reduce the computational burden of the analysis. Pseudo-sequential simulation has been used to increase the computational efficiency of the non-sequential simulation method and to model the chronological aspects of market trading and system operation. Multi-state Markov models for generation and transmission systems are proposed and implemented in the simulation. A new load shedding scheme is proposed during generation inadequacy and network congestion to minimize the load curtailment. The IEEE reliability test system (RTS) is used to illustrate the technique. (author)

  11. Simultaneous versus Sequential Bilateral Cataract Surgery for Infants with Congenital Cataracts: Visual Outcomes and Economic Costs

    Science.gov (United States)

    Dave, Hreem; Phoenix, Vidya; Becker, Edmund R.; Lambert, Scott R.

    2015-01-01

    OBJECTIVES To compare the incidence of adverse events, visual outcomes and economic costs of sequential versus simultaneous bilateral cataract surgery for infants with congenital cataracts. METHODS We retrospectively reviewed the incidence of adverse events, visual outcomes and medical payments associated with simultaneous versus sequential bilateral cataract surgery for infants with congenital cataracts who underwent cataract surgery when 6 months of age or younger at our institution. RESULTS Records were available for 10 children who underwent sequential surgery at a mean age of 49 days for the first eye and 17 children who underwent simultaneous surgery at a mean age of 68 days (p=.25). We found a similar incidence of adverse events between the two treatment groups. Intraoperative or postoperative complications occurred in 14 eyes. The most common postoperative complication was glaucoma. No eyes developed endophthalmitis. The mean absolute interocular difference in logMAR visual acuities between the two treatment groups was 0.47±0.76 for the sequential group and 0.44±0.40 for the simultaneous group (p=.92). Hospital, drugs, supplies and professional payments were on average 21.9% lower per patient in the simultaneous group. CONCLUSIONS Simultaneous bilateral cataract surgery for infants with congenital cataracts was associated with a 21.9% reduction in medical payments and no discernible difference in the incidence of adverse events or visual outcome. PMID:20697007

  12. Sequential use of simulation and optimization in analysis and planning

    Science.gov (United States)

    Hans R. Zuuring; Jimmie D. Chew; J. Greg Jones

    2000-01-01

    Management activities are analyzed at landscape scales employing both simulation and optimization. SIMPPLLE, a stochastic simulation modeling system, is initially applied to assess the risks associated with a specific natural process occurring on the current landscape without management treatments, but with fire suppression. These simulation results are input into...

  13. Simulation based sequential Monte Carlo methods for discretely observed Markov processes

    OpenAIRE

    Neal, Peter

    2014-01-01

    Parameter estimation for discretely observed Markov processes is a challenging problem. However, simulation of Markov processes is straightforward using the Gillespie algorithm. We exploit this ease of simulation to develop an effective sequential Monte Carlo (SMC) algorithm for obtaining samples from the posterior distribution of the parameters. In particular, we introduce two key innovations, coupled simulations, which allow us to study multiple parameter values on the basis of a single sim...
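
    The forward simulator referred to here, the Gillespie algorithm, draws exact trajectories of a Markov jump process by sampling an exponential waiting time from the total event rate and then choosing which event fires. A minimal sketch for an immigration-death process is given below (rates and horizon are arbitrary); an SMC scheme would call such a simulator repeatedly, once per particle.

    import random

    def gillespie_immigration_death(birth_rate, death_rate, x0, t_max, rng):
        """Exact simulation of X(t): X -> X+1 at rate birth_rate, X -> X-1 at rate death_rate * X."""
        t, x = 0.0, x0
        times, states = [0.0], [x0]
        while True:
            total = birth_rate + death_rate * x
            t += rng.expovariate(total)              # waiting time to the next event
            if t >= t_max:
                break
            if rng.random() < birth_rate / total:    # choose which reaction fires
                x += 1
            else:
                x -= 1
            times.append(t)
            states.append(x)
        return times, states

    rng = random.Random(42)
    times, states = gillespie_immigration_death(2.0, 0.1, 0, 50.0, rng)
    print(len(times), "events; final state X(50) =", states[-1], "(stationary mean is 20)")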

  14. Cystic periventricular leukomalacia in the neonate: analysis of sequential sonographic findings and neurologic outcomes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Seok; Yoo, Dong Soo [Dankook University College of Medicine, Cheonan (Korea, Republic of)]

    2003-07-01

    To analyse the sequential sonographic findings of cystic PVL and to evaluate the relationship between sonographic grading of PVL and patterns of neurologic outcomes. The authors retrospectively analysed the sequential sonographic findings of 36 cases of PVL in preterm neonates. Initial sonographic features obtained within 3 days of life were divided into three patterns: normal, localized, and diffuse hyperechogenic flare. Grading of PVL confirmed by follow-up studies was classified as involvement of one lobe (grade 1), two lobes (grade 2), and more than the extent of grade 2 (grade 3). The relationship between sonographic grading of leukomalacia and later neurologic outcomes was also analysed. Initial sonographic patterns according to grade of PVL were a normal pattern in seven of nine cases (77.8%) of grade 1, and diffuse hyperechogenic flares in five of eight cases of grade 2 and in 13 of 16 cases of grade 3. There was a significant difference between the grades in the frequency of the diffuse hyperechoic flare pattern (p=0.021). The average detection time of cystic PVL was 38.4±18.9 days in grade 1, 29.8±14 days in grade 2, and 19.1±5.6 days in grade 3, with a statistically significant difference between detection time and grade (p=0.037). Cerebral palsy occurred in 62.5% of grade 1 and in 100% of grade 2 and grade 3 cases (p=0.043). The frequency of spastic quadriplegia was higher in grade 3 (76.5%) than in grade 1 (25%) and grade 2 (12.5%) (p=0.001). Most grade 1 cystic PVL showed a normal pattern of white matter echogenicity on initial ultrasonography and required follow-up examination over a period of more than one month. Spastic quadriplegia occurred mainly in patients with grade 3 cystic PVL.

  15. Cystic periventricular leukomalacia in the neonate: analysis of sequential sonographic findings and neurologic outcomes

    International Nuclear Information System (INIS)

    Lee, Young Seok; Yoo, Dong Soo

    2003-01-01

    To analyse the sequential sonographic findings of cystic PVL and to evaluate the relationship between sonographic grading of PVL and patterns of neurologic outcomes. The authors retrospectively analysed the sequential sonographic findings of 36 cases of PVL in preterm neonates. Initial sonographic features obtained within 3 days of life were divided into three patterns: normal, localized, and diffuse hyperechogenic flare. Grading of PVL confirmed by follow-up studies was classified as involvement of one lobe (grade 1), two lobes (grade 2), and more than the extent of grade 2 (grade 3). The relationship between sonographic grading of leukomalacia and later neurologic outcomes was also analysed. Initial sonographic patterns according to grade of PVL were a normal pattern in seven of nine cases (77.8%) of grade 1, and diffuse hyperechogenic flares in five of eight cases of grade 2 and in 13 of 16 cases of grade 3. There was a significant difference between the grades in the frequency of the diffuse hyperechoic flare pattern (p=0.021). The average detection time of cystic PVL was 38.4±18.9 days in grade 1, 29.8±14 days in grade 2, and 19.1±5.6 days in grade 3, with a statistically significant difference between detection time and grade (p=0.037). Cerebral palsy occurred in 62.5% of grade 1 and in 100% of grade 2 and grade 3 cases (p=0.043). The frequency of spastic quadriplegia was higher in grade 3 (76.5%) than in grade 1 (25%) and grade 2 (12.5%) (p=0.001). Most grade 1 cystic PVL showed a normal pattern of white matter echogenicity on initial ultrasonography and required follow-up examination over a period of more than one month. Spastic quadriplegia occurred mainly in patients with grade 3 cystic PVL.

  16. [Using sequential indicator simulation method to define risk areas of soil heavy metals in farmland].

    Science.gov (United States)

    Yang, Hao; Song, Ying Qiang; Hu, Yue Ming; Chen, Fei Xiang; Zhang, Rui

    2018-05-01

    Heavy metals in soil have serious impacts on safety, the ecological environment and human health because of their toxicity and accumulation. Efficiently identifying risk areas of heavy metals in farmland soil is therefore of great significance for environmental protection, pollution warning and farmland risk control. We collected 204 samples and analyzed the contents of seven heavy metals (Cu, Zn, Pb, Cd, Cr, As, Hg) in Zengcheng District of Guangzhou, China. To overcome problems with the data, including outliers and skewed distributions, as well as the smoothing effect of traditional kriging methods, we used the sequential indicator simulation method (SISIM) to map the spatial distribution of heavy metals, and combined it with the Hakanson index method to identify potential ecological risk areas of heavy metals in farmland. The results showed that: (1) With similar spatial prediction accuracy for soil heavy metals, SISIM reproduced local detail better than ordinary kriging at the small scale of the study area. Compared with indicator kriging, SISIM had a lower error rate (4.9%-17.1%) in the uncertainty evaluation of heavy-metal risk identification. SISIM showed less smoothing and was better suited to simulating the spatial uncertainty of soil heavy metals and to risk identification. (2) There was no pollution in Zengcheng's farmland. Moderate potential ecological risk was found in the southern part of the study area due to industrial production, human activities, and river sediments. This study combined sequential indicator simulation with the Hakanson risk index method and effectively overcame the loss of outlier information and the smoothing effect of traditional kriging. It provides a new way to identify heavy-metal risk areas of farmland soil under uneven sampling.
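
    The Hakanson assessment mentioned above combines, for each metal, a contamination factor (measured concentration divided by a background reference value) with a toxic-response factor and sums them into a potential ecological risk index, RI = sum_i T_i * C_i / C_i_ref. The sketch below uses the toxic-response factors commonly quoted for Hakanson's method and placeholder concentrations; real background values would have to come from local reference data.

    # Commonly quoted Hakanson toxic-response factors (dimensionless).
    T_R = {"Hg": 40, "Cd": 30, "As": 10, "Pb": 5, "Cu": 5, "Cr": 2, "Zn": 1}

    def potential_ecological_risk(sample_mg_kg, background_mg_kg):
        """Return per-metal risk factors E_r and the overall risk index RI = sum(E_r)."""
        e_r = {m: T_R[m] * sample_mg_kg[m] / background_mg_kg[m] for m in sample_mg_kg}
        return e_r, sum(e_r.values())

    # Placeholder concentrations (mg/kg) for one simulated soil node and assumed local background values.
    sample = {"Cu": 35.0, "Zn": 90.0, "Pb": 40.0, "Cd": 0.20, "Cr": 60.0, "As": 9.0, "Hg": 0.10}
    background = {"Cu": 17.0, "Zn": 47.0, "Pb": 36.0, "Cd": 0.06, "Cr": 64.0, "As": 8.9, "Hg": 0.08}

    e_r, ri = potential_ecological_risk(sample, background)
    print(e_r)
    print(f"RI = {ri:.0f}")   # an RI in the 150-300 band is commonly read as moderate potential risk

    Applying the index to each SISIM realization, rather than to a single kriged map, would yield at every location a probability of falling into each risk class, which is presumably how the simulation and the index are combined here.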

  17. Simultaneous vs sequential bilateral cataract surgery for infants with congenital cataracts: Visual outcomes, adverse events, and economic costs.

    Science.gov (United States)

    Dave, Hreem; Phoenix, Vidya; Becker, Edmund R; Lambert, Scott R

    2010-08-01

    To compare the incidence of adverse events and visual outcomes and to compare the economic costs of sequential vs simultaneous bilateral cataract surgery for infants with congenital cataracts. Retrospective review of simultaneous vs sequential bilateral cataract surgery for infants with congenital cataracts who underwent cataract surgery when 6 months or younger at our institution. Records were available for 10 children who underwent sequential surgery at a mean age of 49 days for the first eye and 17 children who underwent simultaneous surgery at a mean age of 68 days (P = .25). We found a similar incidence of adverse events between the 2 treatment groups. Intraoperative or postoperative complications occurred in 14 eyes. The most common postoperative complication was glaucoma. No eyes developed endophthalmitis. The mean (SD) absolute interocular difference in logMAR visual acuities between the 2 treatment groups was 0.47 (0.76) for the sequential group and 0.44 (0.40) for the simultaneous group (P = .92). Payments for the hospital, drugs, supplies, and professional services were on average 21.9% lower per patient in the simultaneous group. Simultaneous bilateral cataract surgery for infants with congenital cataracts is associated with a 21.9% reduction in medical payments and no discernible difference in the incidence of adverse events or visual outcomes. However, our small sample size limits our ability to make meaningful comparisons of the relative risks and visual benefits of the 2 procedures.

  18. Improving multiple-point-based a priori models for inverse problems by combining Sequential Simulation with the Frequency Matching Method

    DEFF Research Database (Denmark)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Lange, Katrine

    In order to move beyond simplified covariance-based a priori models, which are typically used for inverse problems, more complex multiple-point-based a priori models have to be considered. By means of marginal probability distributions 'learned' from a training image, sequential simulation has proven to be an efficient way of obtaining multiple realizations that honor the same multiple-point statistics as the training image. The frequency matching method provides an alternative way of formulating multiple-point-based a priori models. In this strategy the pattern frequency distributions (i.e. marginals) of the training image and a subsurface model are matched in order to obtain a solution with the same multiple-point statistics as the training image. Sequential Gibbs sampling is a simulation strategy that provides an efficient way of applying sequential simulation based algorithms as a priori ...

  19. Simulation Study of Real Time 3-D Synthetic Aperture Sequential Beamforming for Ultrasound Imaging

    DEFF Research Database (Denmark)

    Hemmsen, Martin Christian; Rasmussen, Morten Fischer; Stuart, Matthias Bo

    2014-01-01

    This paper presents a new beamforming method for real-time three-dimensional (3-D) ultrasound imaging using a 2-D matrix transducer. To obtain images with sufficient resolution and contrast, several thousand elements are needed. The proposed method reduces the required channel count from ... in the main system. The real-time imaging capability is achieved using a synthetic aperture beamforming technique, utilizing the transmit events to generate a set of virtual elements that in combination can generate an image. The two core capabilities in combination are named Synthetic Aperture Sequential Beamforming (SASB). Simulations are performed to evaluate the image quality of the presented method in comparison to parallel beamforming utilizing 16 receive beamformers. As indicators of image quality, the detail resolution and cystic resolution are determined for a set of scatterers at a depth of 90 mm ...

  20. Sequential treatment with fluoxetine and relapse-prevention CBT to improve outcomes in pediatric depression.

    Science.gov (United States)

    Kennard, Betsy D; Emslie, Graham J; Mayes, Taryn L; Nakonezny, Paul A; Jones, Jessica M; Foxwell, Aleksandra A; King, Jessica

    2014-10-01

    The authors evaluated a sequential treatment strategy of fluoxetine and relapse-prevention cognitive-behavioral therapy (CBT) to determine effects on remission and relapse in youths with major depressive disorder. Youths 8-17 years of age with major depression were treated openly with fluoxetine for 6 weeks. Those with an adequate response (defined as a reduction of 50% or more on the Children's Depression Rating Scale-Revised [CDRS-R]) were randomly assigned to receive continued medication management alone or continued medication management plus CBT for an additional 6 months. The CBT was modified to address residual symptoms and was supplemented by well-being therapy. Primary outcome measures were time to remission (with remission defined as a CDRS-R score of 28 or less) and rate of relapse (with relapse defined as either a CDRS-R score of 40 or more with a history of 2 weeks of symptom worsening, or clinical deterioration). Of the 200 participants enrolled in acute-phase treatment, 144 were assigned to continuation treatment with medication management alone (N=69) or medication management plus CBT (N=75). During the 30-week continuation treatment period, time to remission did not differ significantly between treatment groups (hazard ratio=1.26, 95% CI=0.87, 1.82). However, the medication management plus CBT group had a significantly lower risk of relapse than the medication management only group (hazard ratio=0.31, 95% CI=0.13, 0.75). The estimated probability of relapse by week 30 was lower with medication management plus CBT than with medication management only (9% compared with 26.5%). Continuation-phase relapse-prevention CBT was effective in reducing the risk of relapse but not in accelerating time to remission in children and adolescents with major depressive disorder.

  1. A computationally efficient Bayesian sequential simulation approach for the assimilation of vast and diverse hydrogeophysical datasets

    Science.gov (United States)

    Nussbaumer, Raphaël; Gloaguen, Erwan; Mariéthoz, Grégoire; Holliger, Klaus

    2016-04-01

    Bayesian sequential simulation (BSS) is a powerful geostatistical technique, which notably has shown significant potential for the assimilation of datasets that are diverse with regard to the spatial resolution and their relationship. However, these types of applications of BSS require a large number of realizations to adequately explore the solution space and to assess the corresponding uncertainties. Moreover, such simulations generally need to be performed on very fine grids in order to adequately exploit the technique's potential for characterizing heterogeneous environments. Correspondingly, the computational cost of BSS algorithms in their classical form is very high, which so far has limited an effective application of this method to large models and/or vast datasets. In this context, it is also important to note that the inherent assumption regarding the independence of the considered datasets is generally regarded as being too strong in the context of sequential simulation. To alleviate these problems, we have revisited the classical implementation of BSS and incorporated two key features to increase the computational efficiency. The first feature is a combined quadrant spiral - superblock search, which targets run-time savings on large grids and adds flexibility with regard to the selection of neighboring points using equal directional sampling and treating hard data and previously simulated points separately. The second feature is a constant path of simulation, which enhances the efficiency for multiple realizations. We have also modified the aggregation operator to be more flexible with regard to the assumption of independence of the considered datasets. This is achieved through log-linear pooling, which essentially allows for attributing weights to the various data components. Finally, a multi-grid simulating path was created to enforce large-scale variance and to allow for adapting parameters, such as, for example, the log-linear weights or the type
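
    The aggregation step mentioned above, log-linear pooling, combines the conditional distributions implied by the different data sources as a weighted geometric mean, p(x) proportional to the product of p_i(x)^w_i, with the weights controlling how much each source contributes. A small sketch on discretized densities (the two input densities and the weights are purely illustrative):

    import numpy as np

    def log_linear_pool(pdfs, weights):
        """Weighted geometric mean of discretized densities, renormalized to sum to one."""
        pdfs = np.asarray(pdfs, dtype=float)
        log_pooled = np.tensordot(weights, np.log(pdfs + 1e-300), axes=1)
        pooled = np.exp(log_pooled - log_pooled.max())   # stabilize before normalizing
        return pooled / pooled.sum()

    x = np.linspace(-5, 5, 201)
    p_geophysics = np.exp(-0.5 * ((x - 0.5) / 1.5) ** 2)   # density implied by one data source
    p_hydrology = np.exp(-0.5 * ((x + 0.3) / 0.8) ** 2)    # density implied by another source
    p_geophysics /= p_geophysics.sum()
    p_hydrology /= p_hydrology.sum()

    pooled = log_linear_pool([p_geophysics, p_hydrology], weights=[0.4, 0.6])
    print(x[pooled.argmax()])   # mode of the pooled conditional distribution

    With both weights equal to one the pool reduces to a plain product, which is essentially the independence assumption the authors set out to relax; down-weighting one source softens its influence.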

  2. Sequential Gaussian co-simulation of rate decline parameters of longwall gob gas ventholes

    Science.gov (United States)

    Karacan, C. Özgen; Olea, Ricardo A.

    2013-01-01

    Gob gas ventholes (GGVs) are used to control methane inflows into a longwall mining operation by capturing the gas within the overlying fractured strata before it enters the work environment. Using geostatistical co-simulation techniques, this paper maps the parameters of their rate decline behaviors across the study area, a longwall mine in the Northern Appalachian basin. Geostatistical gas-in-place (GIP) simulations were performed, using data from 64 exploration boreholes, and GIP data were mapped within the fractured zone of the study area. In addition, methane flowrates monitored from 10 GGVs were analyzed using decline curve analyses (DCA) techniques to determine parameters of decline rates. Surface elevation showed the most influence on methane production from GGVs and thus was used to investigate its relation with DCA parameters using correlation techniques on normal-scored data. Geostatistical analysis was pursued using sequential Gaussian co-simulation with surface elevation as the secondary variable and with DCA parameters as the primary variables. The primary DCA variables were effective percentage decline rate, rate at production start, rate at the beginning of forecast period, and production end duration. Co-simulation results were presented to visualize decline parameters at an area-wide scale. Wells located at lower elevations, i.e., at the bottom of valleys, tend to perform better in terms of their rate declines compared to those at higher elevations. These results were used to calculate drainage radii of GGVs using GIP realizations. The calculated drainage radii are close to ones predicted by pressure transient tests.

  3. Adjuvant sequential chemo and radiotherapy improves the oncological outcome in high risk endometrial cancer

    Science.gov (United States)

    Signorelli, Mauro; Lissoni, Andrea Alberto; De Ponti, Elena; Grassi, Tommaso; Ponti, Serena

    2015-01-01

    Objective Evaluation of the impact of sequential chemoradiotherapy in high risk endometrial cancer (EC). Methods Two hundred fifty-four women with stage IB grade 3, II and III EC (2009 FIGO staging), were included in this retrospective study. Results Stage I, II, and III was 24%, 28.7%, and 47.3%, respectively. Grade 3 tumor was 53.2% and 71.3% had deep myometrial invasion. One hundred sixty-five women (65%) underwent pelvic (+/- aortic) lymphadenectomy and 58 (22.8%) had nodal metastases. Ninety-eight women (38.6%) underwent radiotherapy, 59 (23.2%) chemotherapy, 42 (16.5%) sequential chemoradiotherapy, and 55 (21.7%) were only observed. After a median follow-up of 101 months, 78 women (30.7%) relapsed and 91 women (35.8%) died. Sequential chemoradiotherapy improved survival rates in women who did not undergo nodal evaluation (disease-free survival [DFS], p=0.040; overall survival [OS], p=0.024) or pelvic (+/- aortic) lymphadenectomy (DFS, p=0.008; OS, p=0.021). Sequential chemoradiotherapy improved both DFS (p=0.015) and OS (p=0.014) in stage III, while only a trend was found for DFS (p=0.210) and OS (p=0.102) in stage I-II EC. In the multivariate analysis, only age (≤65 years) and sequential chemoradiotherapy were statistically related to the prognosis. Conclusion Sequential chemoradiotherapy improves survival rates in high risk EC compared with chemotherapy or radiotherapy alone, in particular in stage III. PMID:26197768

  4. Monte Carlo simulation of the sequential probability ratio test for radiation monitoring

    International Nuclear Information System (INIS)

    Coop, K.L.

    1984-01-01

    A computer program simulates the Sequential Probability Ratio Test (SPRT) using Monte Carlo techniques. The program, SEQTEST, performs random-number sampling of either a Poisson or normal distribution to simulate radiation monitoring data. The results are in terms of the detection probabilities and the average time required for a trial. The computed SPRT results can be compared with tabulated single interval test (SIT) values to determine the better statistical test for particular monitoring applications. Use of the SPRT in a hand-and-foot alpha monitor shows that the SPRT provides better detection probabilities while generally requiring less counting time. Calculations are also performed for a monitor where the SPRT is not permitted to take longer than the single interval test. Although the performance of the SPRT is degraded by this restriction, the detection probabilities are still similar to the SIT values, and average counting times are always less than 75% of the SIT time. Some optimal conditions for use of the SPRT are described. The SPRT should be the test of choice in many radiation monitoring situations. 6 references, 8 figures, 1 table
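
    In the same spirit as the program described above (this is not the SEQTEST code itself), the compact Monte Carlo sketch below runs the SPRT on unit-time Poisson counts: the Wald log-likelihood ratio is accumulated after each counting interval and compared with the usual bounds ln((1-beta)/alpha) and ln(beta/(1-alpha)), and the detection probability and average number of intervals are estimated over many trials. The rates, error targets and interval cap are illustrative.

    import math
    import random

    def poisson(lam, rng):
        """Knuth-style Poisson sampler (adequate for the small rates used here)."""
        threshold, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                return k
            k += 1

    def sprt_trial(rate_true, rate0, rate1, alpha=0.05, beta=0.05, max_intervals=1000, rng=random):
        """Run one SPRT on unit-time Poisson counts; return (decision, intervals used)."""
        upper = math.log((1 - beta) / alpha)     # crossing this accepts H1 (source present)
        lower = math.log(beta / (1 - alpha))     # crossing this accepts H0 (background only)
        llr, n = 0.0, 0
        while n < max_intervals:
            k = poisson(rate_true, rng)
            llr += k * math.log(rate1 / rate0) - (rate1 - rate0)   # Poisson log-likelihood ratio
            n += 1
            if llr >= upper:
                return "alarm", n
            if llr <= lower:
                return "clear", n
        return "undecided", n

    rng = random.Random(7)
    trials = [sprt_trial(rate_true=4.0, rate0=2.0, rate1=4.0, rng=rng) for _ in range(5000)]
    detect = sum(d == "alarm" for d, _ in trials) / len(trials)
    avg_n = sum(n for _, n in trials) / len(trials)
    print(f"detection probability = {detect:.3f}, average counting intervals = {avg_n:.1f}")

    Simulating with rate_true equal to rate0 instead estimates the false-alarm probability, and comparing avg_n with the fixed counting time of a single interval test reproduces the kind of comparison reported in the abstract.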

  5. Group-Sequential Strategies in Clinical Trials with Multiple Co-Primary Outcomes

    Science.gov (United States)

    Hamasaki, Toshimitsu; Asakura, Koko; Evans, Scott R; Sugimoto, Tomoyuki; Sozu, Takashi

    2015-01-01

    We discuss the decision-making frameworks for clinical trials with multiple co-primary endpoints in a group-sequential setting. The decision-making frameworks can account for flexibilities such as a varying number of analyses, equally or unequally spaced increments of information and fixed or adaptive Type I error allocation among endpoints. The frameworks can provide efficiency, i.e., potentially fewer trial participants, than the fixed sample size designs. We investigate the operating characteristics of the decision-making frameworks and provide guidance on constructing efficient group-sequential strategies in clinical trials with multiple co-primary endpoints. PMID:25844122

  6. Inhomogeneities detection in annual precipitation time series in Portugal using direct sequential simulation

    Science.gov (United States)

    Caineta, Júlio; Ribeiro, Sara; Costa, Ana Cristina; Henriques, Roberto; Soares, Amílcar

    2014-05-01

    Climate data homogenisation is of major importance in monitoring climate change, the validation of weather forecasting, general circulation and regional atmospheric models, modelling of erosion, drought monitoring, among other studies of hydrological and environmental impacts. This happens because non-climate factors can cause time series discontinuities which may hide the true climatic signal and patterns, thus potentially biasing the conclusions of those studies. In the last two decades, many methods have been developed to identify and remove these inhomogeneities. One of those is based on geostatistical simulation (DSS - direct sequential simulation), where local probability density functions (pdf) are calculated at candidate monitoring stations, using spatial and temporal neighbouring observations, and then are used for detection of inhomogeneities. This approach has been previously applied to detect inhomogeneities in four precipitation series (wet day count) from a network with 66 monitoring stations located in the southern region of Portugal (1980-2001). This study revealed promising results and the potential advantages of geostatistical techniques for inhomogeneities detection in climate time series. This work extends the case study presented before and investigates the application of the geostatistical stochastic approach to ten precipitation series that were previously classified as inhomogeneous by one of six absolute homogeneity tests (Mann-Kendall test, Wald-Wolfowitz runs test, Von Neumann ratio test, Standard normal homogeneity test (SNHT) for a single break, Pettit test, and Buishand range test). Moreover, a sensitivity analysis is implemented to investigate the number of simulated realisations that should be used to accurately infer the local pdfs. Accordingly, the number of simulations per iteration is increased from 50 to 500, which resulted in a more representative local pdf. A set of default and recommended settings is provided, which will help

  7. Assessment of groundwater level estimation uncertainty using sequential Gaussian simulation and Bayesian bootstrapping

    Science.gov (United States)

    Varouchakis, Emmanouil; Hristopulos, Dionissios

    2015-04-01

    Space-time geostatistical approaches can improve the reliability of dynamic groundwater level models in areas with limited spatial and temporal data. Space-time residual Kriging (STRK) is a reliable method for spatiotemporal interpolation that can incorporate auxiliary information. The method usually leads to an underestimation of the prediction uncertainty. The uncertainty of spatiotemporal models is usually estimated by determining the space-time Kriging variance or by means of cross validation analysis. For de-trended data the former is not usually applied when complex spatiotemporal trend functions are assigned. A Bayesian approach based on the bootstrap idea and sequential Gaussian simulation are employed to determine the uncertainty of the spatiotemporal model (trend and covariance) parameters. These stochastic modelling approaches produce multiple realizations, rank the prediction results on the basis of specified criteria and capture the range of the uncertainty. The correlation of the spatiotemporal residuals is modeled using a non-separable space-time variogram based on the Spartan covariance family (Hristopulos and Elogne 2007, Varouchakis and Hristopulos 2013). We apply these simulation methods to investigate the uncertainty of groundwater level variations. The available dataset consists of bi-annual (dry and wet hydrological period) groundwater level measurements in 15 monitoring locations for the time period 1981 to 2010. The space-time trend function is approximated using a physical law that governs the groundwater flow in the aquifer in the presence of pumping. The main objective of this research is to compare the performance of two simulation methods for prediction uncertainty estimation. In addition, we investigate the performance of the Spartan spatiotemporal covariance function for spatiotemporal geostatistical analysis. Hristopulos, D.T. and Elogne, S.N. 2007. Analytic properties and covariance functions for a new class of generalized Gibbs

  8. A comparison of an algorithm for automated sequential beam orientation selection (Cycle) with simulated annealing

    International Nuclear Information System (INIS)

    Woudstra, Evert; Heijmen, Ben J M; Storchi, Pascal R M

    2008-01-01

    Some time ago we developed and published a new deterministic algorithm (called Cycle) for automatic selection of beam orientations in radiotherapy. This algorithm is a plan generation process aiming at the prescribed PTV dose within hard dose and dose-volume constraints. The algorithm allows a large number of input orientations to be used and selects only the most efficient orientations, surviving the selection process. Efficiency is determined by a score function and is more or less equal to the extent of uninhibited access to the PTV for a specific beam during the selection process. In this paper we compare the capabilities of fast-simulated annealing (FSA) and Cycle for cases where local optima are supposed to be present. Five pancreas and five oesophagus cases previously treated in our institute were selected for this comparison. Plans were generated for FSA and Cycle, using the same hard dose and dose-volume constraints, and the largest possible achieved PTV doses as obtained from these algorithms were compared. The largest achieved PTV dose values were generally very similar for the two algorithms. In some cases FSA resulted in a slightly higher PTV dose than Cycle, at the cost of switching on substantially more beam orientations than Cycle. In other cases, when Cycle generated the solution with the highest PTV dose using only a limited number of non-zero weight beams, FSA seemed to have some difficulty in switching off the unfavourable directions. Cycle was faster than FSA, especially for large-dimensional feasible spaces. In conclusion, for the cases studied in this paper, we have found that despite the inherent drawback of sequential search as used by Cycle (where Cycle could probably get trapped in a local optimum), Cycle is nevertheless able to find comparable or sometimes slightly better treatment plans in comparison with FSA (which in theory finds the global optimum) especially in large-dimensional beam weight spaces

  9. Sequential cranial ultrasound and cerebellar diffusion weighted imaging contribute to the early prognosis of neurodevelopmental outcome in preterm infants.

    Directory of Open Access Journals (Sweden)

    Margaretha J Brouwer

    OBJECTIVE: To evaluate the contribution of sequential cranial ultrasound (cUS) and term-equivalent age magnetic resonance imaging (TEA-MRI) including diffusion weighted imaging (DWI) to the early prognosis of neurodevelopmental outcome in a cohort of very preterm infants (gestational age [GA] <31 weeks). STUDY DESIGN: In total, 93 preterm infants (median [range] GA in weeks: 28.3 [25.0-30.9]) were enrolled in this prospective cohort study and underwent early and term cUS as well as TEA-MRI including DWI. Early cUS abnormalities were classified as normal, mild, moderate or severe. Term cUS was evaluated for ex-vacuo ventriculomegaly (VM) and enlargement of the extracerebral cerebrospinal fluid (eCSF) space. Abnormalities on T1- and T2-weighted TEA-MRI were scored according to Kidokoro et al. Using DWI at TEA, apparent diffusion coefficients (ADCs) were measured in four white matter regions bilaterally and in both cerebellar hemispheres. Neurodevelopmental outcome was assessed at two years' corrected age (CA) using the Bayley Scales of Infant and Toddler Development, third edition. Linear regression analysis was conducted to explore the correlation between the different neuroimaging modalities and outcome. RESULTS: Moderate/severe abnormalities on early cUS, ex-vacuo VM and enlargement of the eCSF space on term cUS, and increased cerebellar ADC values on term DWI were independently associated with worse motor outcome (p<.05). Ex-vacuo VM on term cUS was also related to worse cognitive performance at two years' CA (p<.01). CONCLUSION: These data support the clinical value of sequential cUS and recommend repeating cUS at TEA. In particular, assessment of moderate/severe early cUS abnormalities and ex-vacuo VM on term cUS provides important prognostic information. Cerebellar ADC values may further aid in the prognostication of gross motor function.

  10. Using Zebra-speech to study sequential and simultaneous speech segregation in a cochlear-implant simulation.

    Science.gov (United States)

    Gaudrain, Etienne; Carlyon, Robert P

    2013-01-01

    Previous studies have suggested that cochlear implant users may have particular difficulties exploiting opportunities to glimpse clear segments of a target speech signal in the presence of a fluctuating masker. Although it has been proposed that this difficulty is associated with a deficit in linking the glimpsed segments across time, the details of this mechanism are yet to be explained. The present study introduces a method called Zebra-speech developed to investigate the relative contribution of simultaneous and sequential segregation mechanisms in concurrent speech perception, using a noise-band vocoder to simulate cochlear implants. One experiment showed that the saliency of the difference between the target and the masker is a key factor for Zebra-speech perception, as it is for sequential segregation. Furthermore, forward masking played little or no role, confirming that intelligibility was not limited by energetic masking but by across-time linkage abilities. In another experiment, a binaural cue was used to distinguish the target and the masker. It showed that the relative contribution of simultaneous and sequential segregation depended on the spectral resolution, with listeners relying more on sequential segregation when the spectral resolution was reduced. The potential of Zebra-speech as a segregation enhancement strategy for cochlear implants is discussed.

  11. Using sequential indicator simulation to assess the uncertainty of delineating heavy-metal contaminated soils

    International Nuclear Information System (INIS)

    Juang, Kai-Wei; Chen, Yue-Shin; Lee, Dar-Yuan

    2004-01-01

    Mapping the spatial distribution of soil pollutants is essential for delineating contaminated areas. Currently, geostatistical interpolation, kriging, is increasingly used to estimate pollutant concentrations in soils. The kriging-based approach, indicator kriging (IK), may be used to model the uncertainty of mapping. However, a smoothing effect is usually produced when using kriging in pollutant mapping. The detailed spatial patterns of pollutants could, therefore, be lost. The local uncertainty of mapping pollutants derived by the IK technique is referred to as the conditional cumulative distribution function (ccdf) for one specific location (i.e. single-location uncertainty). The local uncertainty information obtained by IK is not sufficient as the uncertainty of mapping at several locations simultaneously (i.e. multi-location uncertainty or spatial uncertainty) is required to assess the reliability of the delineation of contaminated areas. The simulation approach, sequential indicator simulation (SIS), which has the ability to model not only single, but also multi-location uncertainties, was used, in this study, to assess the uncertainty of the delineation of heavy metal contaminated soils. To illustrate this, a data set of Cu concentrations in soil from Taiwan was used. The results show that contour maps of Cu concentrations generated by the SIS realizations exhausted all the spatial patterns of Cu concentrations without the smoothing effect found when using the kriging method. Based on the SIS realizations, the local uncertainty of Cu concentrations at a specific location x' refers to the probability of the Cu concentration z(x') being higher than the defined threshold level of contamination (z_c). This can be written as Prob_SIS[z(x') > z_c], representing the probability of contamination. The probability map of Prob_SIS[z(x') > z_c] can then be used for delineating contaminated areas. In addition, the multi-location uncertainty of an area A
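
    Operationally, the probability map Prob_SIS[z(x') > z_c] described above is just the fraction of SIS realizations that exceed the threshold at each grid node, as in the short sketch below (the stack of realizations and the threshold are synthetic stand-ins for the Taiwan Cu data).

    import numpy as np

    rng = np.random.default_rng(0)
    # Stand-in for 500 SIS realizations of Cu (mg/kg) on a 100 x 100 grid.
    realizations = rng.lognormal(mean=3.0, sigma=0.5, size=(500, 100, 100))
    z_c = 35.0                                        # hypothetical contamination threshold

    prob_exceed = (realizations > z_c).mean(axis=0)   # Prob_SIS[z(x') > z_c] at every node
    contaminated = prob_exceed > 0.8                  # nodes with a high probability of contamination
    print(prob_exceed.mean(), int(contaminated.sum()), "nodes flagged")

    Multi-location (spatial) uncertainty is assessed from the same stack of realizations by evaluating each realization over a whole area jointly rather than node by node.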

  12. Hierarchical Bayesian analysis of outcome- and process-based social preferences and beliefs in Dictator Games and sequential Prisoner's Dilemmas.

    Science.gov (United States)

    Aksoy, Ozan; Weesie, Jeroen

    2014-05-01

    In this paper, using a within-subjects design, we estimate the utility weights that subjects attach to the outcome of their interaction partners in four decision situations: (1) binary Dictator Games (DG), second player's role in the sequential Prisoner's Dilemma (PD) after the first player (2) cooperated and (3) defected, and (4) first player's role in the sequential Prisoner's Dilemma game. We find that the average weights in these four decision situations have the following order: (1)>(2)>(4)>(3). Moreover, the average weight is positive in (1) but negative in (2), (3), and (4). Our findings indicate the existence of strong negative and small positive reciprocity for the average subject, but there is also high interpersonal variation in the weights in these four nodes. We conclude that the PD frame makes subjects more competitive than the DG frame. Using hierarchical Bayesian modeling, we simultaneously analyze beliefs of subjects about others' utility weights in the same four decision situations. We compare several alternative theoretical models on beliefs, e.g., rational beliefs (Bayesian-Nash equilibrium) and a consensus model. Our results on beliefs strongly support the consensus effect and refute rational beliefs: there is a strong relationship between own preferences and beliefs and this relationship is relatively stable across the four decision situations. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. Anemia During Sequential Induction Chemotherapy and Chemoradiation for Head and Neck Cancer: The Impact of Blood Transfusion on Treatment Outcome

    International Nuclear Information System (INIS)

    Bhide, Shreerang A.; Ahmed, Merina; Rengarajan, Vijayan; Powell, Ceri; Miah, Aisha; Newbold, Kate; Nutting, Christopher M.; Harrington, Kevin J.

    2009-01-01

    Purpose: Sequential treatment (chemotherapy followed by concomitant chemoradiation; CCRT) is increasingly being used for radical treatment of squamous cell cancer of the head and neck (SCCHN), which results in increased myelosuppression. In this study, we review the incidence of anemia and the effect of a policy of hemoglobin (Hb) maintenance by blood transfusion on disease outcomes in these patients. Methods and Materials: Retrospective review of the records of patients with SCCHN treated with sequential CCRT formed the basis of this study. The incidence of anemia and statistics on blood transfusion were documented. For the purpose of outcome analyses, patients were divided into four categories by (1) transfusion status, (2) nadir Hb concentration, (3) number of transfusion episodes, and (4) number of units of blood transfused (NOUT). Data on 3-year locoregional control (LRC), relapse-free survival (RFS), disease-specific survival (DSS), and overall survival (OS) were analyzed. Results: One hundred and sixty-nine patients were identified. The median follow-up was 23.6 months. The RFS (52% vs. 41%, p = 0.03), DSS (71% vs. 66%, p = 0.02), and OS (58% vs. 42% p = 0.005) were significantly better for patients who did not have a transfusion vs. those who did. The LRC, RFS, DSS, and OS were also significantly better for patients with nadir Hb level >12 vs. <12 g/dL and NOUT 1-4 vs. >4. Conclusion: Our study seems to suggest that blood transfusion during radical treatment for SCCHN might be detrimental. Further research should be undertaken into the complex interactions among tumor hypoxia, anemia, and the treatment of anemia before making treatment recommendations.

  14. Anemia During Sequential Induction Chemotherapy and Chemoradiation for Head and Neck Cancer: The Impact of Blood Transfusion on Treatment Outcome

    Energy Technology Data Exchange (ETDEWEB)

    Bhide, Shreerang A; Ahmed, Merina [Institute of Cancer Research, Royal Marsden National Health Service Foundation Trust Hospital, London (United Kingdom); Head and Neck Unit, Royal Marsden National Health Service Foundation Trust Hospital, London (United Kingdom)]; Rengarajan, Vijayan; Powell, Ceri; Miah, Aisha; Newbold, Kate [Head and Neck Unit, Royal Marsden National Health Service Foundation Trust Hospital, London (United Kingdom)]; Nutting, Christopher M [Institute of Cancer Research, Royal Marsden National Health Service Foundation Trust Hospital, London (United Kingdom); Head and Neck Unit, Royal Marsden National Health Service Foundation Trust Hospital, London (United Kingdom)]; Harrington, Kevin J [Institute of Cancer Research, Royal Marsden National Health Service Foundation Trust Hospital, London (United Kingdom); Head and Neck Unit, Royal Marsden National Health Service Foundation Trust Hospital, London (United Kingdom)], E-mail: kevinh@icr.ac.uk

    2009-02-01

    Purpose: Sequential treatment (chemotherapy followed by concomitant chemoradiation; CCRT) is increasingly being used for radical treatment of squamous cell cancer of the head and neck (SCCHN), which results in increased myelosuppression. In this study, we review the incidence of anemia and the effect of a policy of hemoglobin (Hb) maintenance by blood transfusion on disease outcomes in these patients. Methods and Materials: Retrospective review of the records of patients with SCCHN treated with sequential CCRT formed the basis of this study. The incidence of anemia and statistics on blood transfusion were documented. For the purpose of outcome analyses, patients were divided into four categories by (1) transfusion status, (2) nadir Hb concentration, (3) number of transfusion episodes, and (4) number of units of blood transfused (NOUT). Data on 3-year locoregional control (LRC), relapse-free survival (RFS), disease-specific survival (DSS), and overall survival (OS) were analyzed. Results: One hundred and sixty-nine patients were identified. The median follow-up was 23.6 months. The RFS (52% vs. 41%, p = 0.03), DSS (71% vs. 66%, p = 0.02), and OS (58% vs. 42% p = 0.005) were significantly better for patients who did not have a transfusion vs. those who did. The LRC, RFS, DSS, and OS were also significantly better for patients with nadir Hb level >12 vs. <12 g/dL and NOUT 1-4 vs. >4. Conclusion: Our study seems to suggest that blood transfusion during radical treatment for SCCHN might be detrimental. Further research should be undertaken into the complex interactions among tumor hypoxia, anemia, and the treatment of anemia before making treatment recommendations.

  15. Twenty-five year outcome of sequential abdominopelvic radiotherapy and alkylating agent chemotherapy for ovarian carcinoma

    International Nuclear Information System (INIS)

    Bellairs, Ellen E.; Twiggs, Leo B.; Potish, Roger A.

    1997-01-01

    Purpose: A prospective study of sequential surgery, abdominopelvic radiotherapy and single agent alkylating chemotherapy was conducted to evaluate survival and toxicity in the management of ovarian carcinoma. Methods: From 1970-1976, 95 women with stage I-III epithelial ovarian carcinoma were scheduled to receive postoperative radiotherapy consisting of 20.0 Gy to the whole abdomen (1.0 Gy/day), a 29.75 Gy pelvic boost (1.75 Gy/day) and 10 subsequent courses of Melphalan (1 mg/kg/course). Endpoints were overall survival, disease-free survival (DFS), and acute and chronic toxicity. Results: The evaluable 94 patients included 19 stage I, 25 stage II, and 50 stage III. Of the latter, 21 had no palpable disease postoperatively (IIIN) and 29 had postoperative palpable disease (IIIP). Overall survival at 5, 10, 15 and 20 years was 42%, 30%, 23% and 22%. DFS for the entire group was 54% at 5 years and remained 50% from 10 to 25 years. All but two recurrences were noted within the first 27 months. No recurrence or treatment-related deaths occurred after 8 years. After 10 years, the survival of the study group became parallel to the general population. Prognostic factors were only related to stage (p<.001) and the presence of postoperative palpable disease (p<.001). DFS at 25 years was 95% for stage I, 71% at 5 years and 66% from 10 to 25 years for stage II, and 17% at 5 years and 11% thereafter for stage III patients (p<.001). Although no stage IIIP patients were cured, 25% lived beyond 2 years. Five-year DFS was significantly better in IIIN (45%) vs. IIIP (0%) patients (p<.001). The 65 patients without postoperative palpable disease (stages I-IIIN) achieved DFS at 5 and 25 years of 69% and 61%, respectively. Of 31 patients undergoing a second-look surgery, 84% were found to be free of tumor. Two recurred at 3.5 and 7 years after surgery. Acute tolerance was acceptable. Chronic toxicity included an 11.7% rate of small bowel obstruction requiring surgery and a 3% rate of

  16. The effect of sequential coupling on radial displacement accuracy in electromagnetic inside-bead forming: simulation and experimental analysis using Maxwell and ABAQUS software

    Energy Technology Data Exchange (ETDEWEB)

    Chaharmiri, Rasoul; Arezoodar, Alireza Fallahi [Amirkabir University, Tehran (Iran, Islamic Republic of)]

    2016-05-15

    Electromagnetic forming (EMF) is a high strain rate forming technology which can effectively deform and shape highly electrically conductive materials at room temperature. In this study, the electromagnetic and mechanical parts of the process were simulated using Maxwell and ABAQUS software, respectively. To link the two software packages, two approaches, 'loose' and 'sequential' coupling, were applied. This paper investigates how sequential coupling affects radial displacement accuracy, as an indicator of the tube's final shape, at various discharge voltages. The results indicated good agreement for both approaches at lower discharge voltages, with more accurate results for sequential coupling; at high discharge voltages, however, the loose coupling produced a non-negligible overestimation of about 43%, which was reduced to a difference of only 8.2% by applying sequential coupling in the case studied. Therefore, to obtain more accurate predictions, applying sequential coupling, especially at higher discharge voltages, is strongly recommended.

  17. Simulating memory outcome before right selective amygdalohippocampectomy.

    Science.gov (United States)

    Patrikelis, Panayiotis; Lucci, Giuliana; Siatouni, Anna; Zalonis, Ioannis; Sakas, Damianos E; Gatzonis, Stylianos

    2013-01-01

    In this paper we present the case of a left-sided speech-dominant patient with right medial temporal sclerosis (RMTS) and pharmacoresistant epilepsy who showed improved verbal memory during the intracarotid amobarbital test (IAT) of his right hemisphere, as compared with his own performance before the drug injection (baseline), as well as after right selective amygdalohippocampectomy. We suggest that the defective verbal memory shown by this patient is due to abnormal activity of his right hippocampus that interfered with the function of his left hippocampus. This hypothesis was supported by the fact that disconnection of the two hippocampi, either by anesthetisation or by resection of the right hippocampus, disengaged the left hippocampus and, consequently, improved its function. The paper's main objective is twofold: first, to contribute to the field of neuropsychology of epilepsy surgery by emphasising postoperative memory outcomes in right medial temporal lobe epilepsy (RMTLE) patients, particularly those undergoing amygdalohippocampectomy, as the pattern of memory changes after resection of the right temporal lobe is less clear; second, to focus on memory performance asymmetries during the IAT and consider them alongside neuropsychological memory performance, because of their possible prognostic, outcome-simulating value.

  18. Sequential application of non-pharmacological interventions reduces the severity of labour pain, delays use of pharmacological analgesia, and improves some obstetric outcomes: a randomised trial

    Directory of Open Access Journals (Sweden)

    Rubneide Barreto Silva Gallo

    2018-01-01

    Trial registration: NCT01389128. [Gallo RBS, Santana LS, Marcolin AC, Duarte G, Quintana SM (2018) Sequential application of non-pharmacological interventions reduces the severity of labour pain, delays use of pharmacological analgesia, and improves some obstetric outcomes: a randomised trial. Journal of Physiotherapy 64: 33–40]

  19. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data

    Science.gov (United States)

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-01

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods for determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the STSIS realizations, we assessed various types of mapping uncertainty, including single-location uncertainties over one day and over multiple days, and multi-location uncertainties over one day and over multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.
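
    To make the core idea of indicator-based sequential simulation concrete, the sketch below implements a deliberately simplified sequential indicator simulation on a 2-D grid. It is not the spatiotemporal STSIS algorithm of the record above: the indicator-kriging weights are replaced by simple exponential-covariance weights, and the correlation range, neighbourhood size and conditioning data are illustrative assumptions.

```python
import numpy as np

def sequential_indicator_simulation(grid_xy, cond_xy, cond_ind, rng,
                                    n_neighbors=8, corr_range=2.0):
    """Simplified sequential indicator simulation.

    Nodes are visited along a random path; at each node the probability of
    exceeding the threshold is estimated from nearby conditioning and
    previously simulated indicators using exponential-covariance weights
    (a stand-in for full indicator kriging), and a 0/1 value is drawn.
    """
    known_xy = [tuple(p) for p in cond_xy]
    known_ind = list(cond_ind)
    simulated = {}
    for idx in rng.permutation(len(grid_xy)):          # random simulation path
        x = np.asarray(grid_xy[idx], dtype=float)
        pts = np.asarray(known_xy, dtype=float)
        dist = np.linalg.norm(pts - x, axis=1)
        near = np.argsort(dist)[:n_neighbors]
        w = np.exp(-dist[near] / corr_range)           # simple covariance weights
        p = float(np.clip(w @ np.asarray(known_ind, float)[near] / w.sum(), 0.0, 1.0))
        ind = int(rng.random() < p)                    # draw from the local ccdf
        simulated[idx] = ind
        known_xy.append(tuple(x))                      # condition later nodes on it
        known_ind.append(ind)
    return simulated

# Toy usage: a 10 x 10 grid with two conditioning points
# (1 = concentration above a threshold, 0 = below)
rng = np.random.default_rng(0)
grid = [(i, j) for i in range(10) for j in range(10)]
realization = sequential_indicator_simulation(grid, [(2, 2), (7, 7)], [1, 0], rng)
```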

  20. Sequential computerized tomography changes and related final outcome in severe head injury patients

    International Nuclear Information System (INIS)

    Lobato, R.D.; Gomez, P.A.; Alday, R.

    1997-01-01

    The authors analyzed the serial computerized tomography (CT) findings in a large series of severely head-injured patients in order to assess the variability in gross intracranial pathology through the acute posttraumatic period and determine the most common patterns of CT change. A second aim was to compare the prognostic significance of the different CT diagnostic categories used in the study (Traumatic Coma Data Bank CT pathological classification) when gleaned either from the initial (postadmission) or the control CT scans, and to determine the extent to which having a second CT scan provides more prognostic information than a single scan. 92 patients (13.3% of the total population) died soon after injury. Of the 587 who survived long enough to have at least one control CT scan, 23.6% developed new diffuse brain swelling, and 20.9% developed new focal mass lesions, most of which had to be evacuated. The relative risk of requiring a delayed operation, as related to the diagnostic category established from the initial CT scans, was, in decreasing order: diffuse injury IV (30.7%), diffuse injury III (30.5%), non-evacuated mass (20%), evacuated mass (20.2%), diffuse injury II (12.1%), and diffuse injury I (8.6%). Overall, 51.2% of the patients developed significant CT changes (for worse or better), occurring either spontaneously or following surgery, and their final outcomes were more closely related to the control than to the initial CT diagnoses. In fact, the final outcome was predicted more accurately from the control CT scans (81.2% of the cases) than from the initial CT scans (only 71.5% of the cases). Since the majority of relevant CT changes developed within 48 hours after injury, a pathological categorization made using an early control CT scan seems to be most useful for prognostic purposes. Prognosis associated with the CT pathological categories used in the study was similar independently of the moment of the acute posttraumatic period at which

  1. Sequential Organ Failure Assessment Score for Evaluating Organ Failure and Outcome of Severe Maternal Morbidity in Obstetric Intensive Care

    Directory of Open Access Journals (Sweden)

    Antonio Oliveira-Neto

    2012-01-01

    Objective. To evaluate the performance of the Sequential Organ Failure Assessment (SOFA) score in cases of severe maternal morbidity (SMM). Design. Retrospective study of diagnostic validation. Setting. An obstetric intensive care unit (ICU) in Brazil. Population. 673 women with SMM. Main Outcome Measures. Mortality and SOFA score. Methods. Organ failure was evaluated according to the maximum score for each of its six components. The total maximum SOFA score was calculated using the poorest result of each component, reflecting the maximum degree of alteration in systemic organ function. Results. The highest total maximum SOFA score was associated with mortality: 12.06 ± 5.47 for women who died and 1.87 ± 2.56 for survivors. There was also a significant correlation between the number of failing organs and maternal mortality, ranging from 0.2% (no failure) to 85.7% (≥3 organs). Analysis of the area under the receiver operating characteristic (ROC) curve (AUC) confirmed the excellent performance of the total maximum SOFA score for cases of SMM (AUC = 0.958). Conclusions. The total maximum SOFA score proved to be an effective tool for evaluating severity and estimating prognosis in cases of SMM. The maximum SOFA score may be used to conceptually define and stratify the degree of severity in cases of SMM.
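
    The scoring rule described above (the sum of the worst score reached by each of the six organ components) is easy to illustrate in code. The sketch below assumes that the 0-4 component scores have already been assigned from the standard SOFA tables, which are not reproduced here; the example values are hypothetical.

```python
def total_maximum_sofa(component_series):
    """component_series: dict mapping each of the six SOFA components
    (respiration, coagulation, liver, cardiovascular, CNS, renal)
    to the list of 0-4 scores recorded during the ICU stay.
    Returns the total maximum SOFA score: the sum of the worst
    (maximum) score reached by each component."""
    assert len(component_series) == 6, "SOFA has six organ components"
    return sum(max(scores) for scores in component_series.values())

# Hypothetical patient whose cardiovascular and renal scores peak at 3 and 2
example = {
    "respiration": [1, 2, 2], "coagulation": [0, 1, 0], "liver": [0, 0, 1],
    "cardiovascular": [1, 3, 2], "cns": [0, 0, 0], "renal": [1, 2, 2],
}
print(total_maximum_sofa(example))  # 2 + 1 + 1 + 3 + 0 + 2 = 9
```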

  2. Customized sequential designs for random simulation experiments: Kriging metamodeling and bootstrapping

    NARCIS (Netherlands)

    Beers, van W.C.M.; Kleijnen, J.P.C.

    2005-01-01

    This paper proposes a novel method to select an experimental design for interpolation in random simulation, especially discrete event simulation. (Though the paper focuses on Kriging, this design approach may also apply to other types of metamodels such as linear regression models.) Assuming that

  3. Sequential Monte Carlo simulation of collision risk in free flight air traffic

    NARCIS (Netherlands)

    Blom, H.A.P.; Bakker, G.; Krystul, J.; Everdij, M.H.C.; Klein Obbink, B.; Klompstra, M.B.

    2005-01-01

    Within HYBRIDGE a novel approach in speeding up Monte Carlo simulation of rare events has been developed. In the current report this method is extended for application to simulating collisions with a stochastic dynamical model of an air traffic operational concept. Subsequently this extended Monte

  4. Exploiting neurovascular coupling: a Bayesian sequential Monte Carlo approach applied to simulated EEG fNIRS data

    Science.gov (United States)

    Croce, Pierpaolo; Zappasodi, Filippo; Merla, Arcangelo; Chiarelli, Antonio Maria

    2017-08-01

    Objective. Electrical and hemodynamic brain activity are linked through the neurovascular coupling process and they can be simultaneously measured through integration of electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS). Thanks to the lack of electro-optical interference, the two procedures can be easily combined and, whereas EEG provides electrophysiological information, fNIRS can provide measurements of two hemodynamic variables, such as oxygenated and deoxygenated hemoglobin. A Bayesian sequential Monte Carlo approach (particle filter, PF) was applied to simulated recordings of electrical and neurovascular mediated hemodynamic activity, and the advantages of a unified framework were shown. Approach. Multiple neural activities and hemodynamic responses were simulated in the primary motor cortex of a subject brain. EEG and fNIRS recordings were obtained by means of forward models of volume conduction and light propagation through the head. A state space model of combined EEG and fNIRS data was built and its dynamic evolution was estimated through a Bayesian sequential Monte Carlo approach (PF). Main results. We showed the feasibility of the procedure and the improvements in both electrical and hemodynamic brain activity reconstruction when using the PF on combined EEG and fNIRS measurements. Significance. The investigated procedure allows one to combine the information provided by the two methodologies, and, by taking advantage of a physical model of the coupling between electrical and hemodynamic response, to obtain a better estimate of brain activity evolution. Despite the high computational demand, application of such an approach to in vivo recordings could fully exploit the advantages of this combined brain imaging technology.
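
    As an illustration of the sequential Monte Carlo machinery referred to above, the sketch below implements a generic bootstrap particle filter for a scalar nonlinear state-space model. The EEG and fNIRS forward models, the neurovascular-coupling dynamics and the dimensionality of the real problem are not reproduced; the AR(1) toy system, noise levels and particle count are illustrative assumptions.

```python
import numpy as np

def bootstrap_particle_filter(y, f, h, q_std, r_std, n_particles=500, seed=0):
    """Generic bootstrap particle filter (sequential Monte Carlo).

    y     : (T,) observations
    f(x)  : state-transition mean, h(x): observation mean
    q_std : process-noise std, r_std: observation-noise std
    Returns the filtered state mean at each time step.
    """
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)
    means = []
    for yt in y:
        # propagate particles through the state model
        particles = f(particles) + rng.normal(0.0, q_std, n_particles)
        # weight by the likelihood of the new observation
        w = np.exp(-0.5 * ((yt - h(particles)) / r_std) ** 2)
        w /= w.sum()
        means.append(float(np.dot(w, particles)))
        # multinomial resampling to avoid weight degeneracy
        particles = rng.choice(particles, size=n_particles, p=w)
    return np.array(means)

# Toy usage: a latent AR(1) "activity" observed through a squashing nonlinearity
rng = np.random.default_rng(1)
x, ys = 0.0, []
for _ in range(100):
    x = 0.9 * x + rng.normal(0, 0.3)
    ys.append(np.tanh(x) + rng.normal(0, 0.1))
est = bootstrap_particle_filter(np.array(ys), lambda s: 0.9 * s, np.tanh, 0.3, 0.1)
```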

  5. Interpersonal Compatibility: Effect on Simulation Game Outcomes.

    Science.gov (United States)

    Yantis, Betty; Nixon, John E.

    1982-01-01

    Investigates the impact of interpersonal relationships on decision-making success in small groups using a business simulation game as a research vehicle. The study concludes that group decision making may be unfavorably affected by personality conflicts. (Author/JJD)

  6. Participatory ergonomics simulation of hospital work systems: The influence of simulation media on simulation outcome

    DEFF Research Database (Denmark)

    Andersen, Simone Nyholm; Broberg, Ole

    2015-01-01

    Current application of work system simulation in participatory ergonomics (PE) design includes a variety of different simulation media. However, the actual influence of the media attributes on the simulation outcome has received less attention. This study investigates two simulation media: full-scale mock-ups and table-top models. The aim is to compare how the media attributes of fidelity and affordance influence the ergonomics identification and evaluation in PE design of hospital work systems. The results illustrate how the full-scale mock-ups' high fidelity of room layout and affordance of tool operation support ergonomics identification and evaluation related to the work system entities space and technologies & tools. The table-top models' high fidelity of function relations and affordance of a helicopter view support ergonomics identification and evaluation related to the entity organization.

  7. Outcomes of sequential treatment with sorafenib followed by regorafenib for HCC: additional analyses from the phase 3 RESORCE trial.

    Science.gov (United States)

    Finn, Richard S; Merle, Philippe; Granito, Alessandro; Huang, Yi-Hsiang; Bodoky, György; Pracht, Marc; Yokosuka, Osamu; Rosmorduc, Olivier; Gerolami, René; Caparello, Chiara; Cabrera, Roniel; Chang, Charissa; Sun, Weijing; LeBerre, Marie-Aude; Baumhauer, Annette; Meinhardt, Gerold; Bruix, Jordi

    2018-04-25

    The RESORCE trial showed that regorafenib improves overall survival (OS) in patients with hepatocellular carcinoma progressing during sorafenib treatment (hazard ratio [HR] 0.62, 95% confidence interval [CI] 0.50, 0.78; P<.0001). This exploratory analysis describes outcomes of sequential treatment with sorafenib followed by regorafenib. In RESORCE, 573 patients were randomized 2:1 to regorafenib 160 mg/day or placebo for 3 weeks on/1 week off. Efficacy and safety were evaluated by last sorafenib dose. The time from the start of sorafenib to death was assessed. Time to progression (TTP) in RESORCE was analyzed by TTP during prior sorafenib treatment. HRs (regorafenib/placebo) for OS by last sorafenib dose were similar (0.67 for 800 mg/day; 0.68 for <800 mg/day). Rates of grade 3, 4, and 5 adverse events with regorafenib by last sorafenib dose (800 mg/day versus <800 mg/day) were 52%, 11%, and 15% versus 60%, 10%, and 12%, respectively. Median times (95% CI) from the start of sorafenib to death were 26.0 months (22.6, 28.1) for regorafenib and 19.2 months (16.3, 22.8) for placebo. Median time from the start of sorafenib to progression on sorafenib was 7.2 months for the regorafenib arm and 7.1 months for the placebo arm. An analysis of TTP in RESORCE in subgroups defined by TTP during prior sorafenib in quartiles (Q) showed HRs (regorafenib/placebo; 95% CI) of 0.66 (0.45, 0.96; Q1); 0.26 (0.17, 0.40; Q2); 0.40 (0.27, 0.60; Q3); and 0.54 (0.36, 0.81; Q4). These exploratory analyses show that regorafenib conferred a clinical benefit regardless of the last sorafenib dose or TTP on prior sorafenib. Rates of adverse events were generally similar regardless of the last sorafenib dose. This analysis examined characteristics and outcomes of patients with HCC who were treated with regorafenib after they had disease progression during sorafenib treatment. Regorafenib provided clinical benefit to patients regardless of the pace of their disease progression during prior sorafenib

  8. Sequential UASB and dual media packed-bed reactors for domestic wastewater treatment - experiment and simulation.

    Science.gov (United States)

    Rodríguez-Gómez, Raúl; Renman, Gunno

    2016-01-01

    A wastewater treatment system composed of an upflow anaerobic sludge blanket (UASB) reactor followed by a packed-bed reactor (PBR) filled with Sorbulite(®) and Polonite(®) filter material was tested in a laboratory bench-scale experiment. The system was operated for 50 weeks and achieved very efficient total phosphorus (P) removal (99%), 7-day biochemical oxygen demand removal (99%) and pathogenic bacteria reduction (99%). However, total nitrogen was only moderately reduced in the system (40%). A model focusing on simulation of organic material, solids and size of granules was then implemented and validated for the UASB reactor. Good agreement between the simulated and measured results demonstrated the capacity of the model to predict the behaviour of solids and chemical oxygen demand, which is critical for successful P removal and recovery in the PBR.

  9. Simulation of skill acquisition in sequential learning of a computer game

    DEFF Research Database (Denmark)

    Hansen, John Paulin; Nielsen, Finn Ravnsbjerg; Rasmussen, Jens

    1995-01-01

    The paper presents some theoretical assumptions about the cognitive control mechanisms of subjects learning to play a computer game. A simulation model has been developed to investigate these assumptions. The model is an automaton, reacting to instruction-like cue-action rules. The prototypical performances of 23 experimental subjects at succeeding levels of training are compared to the performance of the model. The findings are interpreted in terms of a general taxonomy for cognitive task analysis.

  10. The influence of simultaneous or sequential test conditions in the properties of industrial polymers, submitted to PWR accident simulations

    International Nuclear Information System (INIS)

    Carlin, F.; Alba, C.; Chenion, J.; Gaussens, G.; Henry, J.Y.

    1986-10-01

    The effect of PWR plant normal and accident operating conditions on polymers forms the basis of the nuclear qualification of safety-related containment equipment. This study was carried out at the request of safety organizations. Its purpose was to check whether accident simulations carried out sequentially during equipment qualification tests would lead to the same deterioration as that caused by an accident involving simultaneous irradiation and thermodynamic effects. The IPSN, DAS and the United States NRC collaborated in preparing this study. The work carried out by the ORIS Company, as well as the results obtained from measurement of the mechanical properties of 8 industrial polymers, are described in this report. The results are given in the conclusion. They tend to show that, overall, the most suitable test cycle for simulating accident operating conditions would be one which included irradiation and a consecutive thermodynamic shock. The results of this study have been compared with those obtained in a previous study which included the same test cycles, except for more severe thermo-ageing. This comparison, which was made on three elastomers, shows that ageing after the accident has a different effect on each material.

  11. Evaluating medical student engagement during virtual patient simulations: a sequential, mixed methods study.

    Science.gov (United States)

    McCoy, Lise; Pettit, Robin K; Lewis, Joy H; Allgood, J Aaron; Bay, Curt; Schwartz, Frederic N

    2016-01-16

    Student engagement is an important domain for medical education; however, it is difficult to quantify. The goal of this study was to investigate the utility of virtual patient simulations (VPS) for increasing medical student engagement. Our aims were specifically to investigate how, and to what extent, the VPS foster student engagement. This study took place at A.T. Still University School of Osteopathic Medicine in Arizona (ATSU-SOMA), in the USA. First-year medical students (n = 108) worked in teams to complete a series of four in-class virtual patient case studies. Student engagement, defined as flow, interest, and relevance, was measured using four data collection instruments: researcher observations, classroom photographs, tutor feedback, and an electronic exit survey. Qualitative data were analyzed using a grounded theory approach. Triangulation of findings between the four data sources indicates that VPS foster engagement in three facets: 1) Flow. In general, students enjoyed the activities, and were absorbed in the task at hand. 2) Interest. Students demonstrated interest in the activities, as evidenced by enjoyment, active discussion, and humor. Students remarked upon elements that caused cognitive dissonance: excessive text and classroom noise generated by multi-media and peer conversations. 3) Relevance. VPS were relevant, in terms of situational clinical practice, exam preparation, and obtaining concrete feedback on clinical decisions. Researchers successfully introduced a new learning platform into the medical school curriculum. The data collected during this study were also used to improve new learning modules and techniques associated with implementing them in the classroom. Results of this study assert that virtual patient simulations foster engagement in terms of flow, relevance, and interest.

  12. Identification and Relative Quantification of Bioactive Peptides Sequentially Released during Simulated Gastrointestinal Digestion of Commercial Kefir.

    Science.gov (United States)

    Liu, Yufang; Pischetsrieder, Monika

    2017-03-08

    Health-promoting effects of kefir may be partially caused by bioactive peptides. To evaluate their formation or degradation during gastrointestinal digestion, we monitored changes of the peptide profile in a model of (1) oral, (2) gastric, and (3) small intestinal digestion of kefir. Matrix-assisted laser desorption/ionization time-of-flight mass spectroscopy analyses revealed clearly different profiles between digests 2/3 and kefir/digest 1. Subsequent ultraperformance liquid chromatography-electrospray ionization-tandem mass spectrometry identified 92 peptides in total (25, 25, 43, and 30, partly overlapping in kefir and digests 1, 2, and 3, respectively), including 16 peptides with ascribed bioactivity. Relative quantification in scheduled multiple reaction monitoring mode showed that many bioactive peptides were released by simulated digestion. Most prominently, the concentration of angiotensin-converting enzyme inhibitor β-casein 203-209 increased approximately 10 000-fold after combined oral, gastric, and intestinal digestion. Thus, physiological digestive processes may promote bioactive peptide formation from proteins and oligopeptides in kefir. Furthermore, bioactive peptides present in certain compartments of the gastrointestinal tract may exert local physiological effects.

  13. Potential uranium supply system based upon computer simulation of sequential exploration and decisions under risk

    International Nuclear Information System (INIS)

    Ortiz-Vertiz, S.R.

    1991-01-01

    A Monte Carlo simulation system was used to estimate potential supply of roll-type deposits. The system takes a given uranium-endowment probability distribution and aims at two major and interrelated objectives: (1) to design a system that estimates potential supply even when prices are much higher than previous or current prices; and (2) to account fully for the cost of discovering and mining the individual mineral deposits contained in given endowment. Achievement of these objectives constitutes the major contribution of this study. To accomplish them, the system considers: cost of risk, return on investment, cost of failures during the search process, discovery depletion, and effect of physical characteristics of the deposits on exploration and mining costs. It also considers that when economic conditions, such as product price, are outside historical experience, existing behavioral rules - exploration drilling density, stopping rules, minimum attractive deposit size and grade, and mining parameters - are irrelevant. The system architecture is general and can be used with an exploration model prepared specifically for other minerals

  14. Participatory ergonomics simulation of hospital work systems: The influence of simulation media on simulation outcome.

    Science.gov (United States)

    Andersen, Simone Nyholm; Broberg, Ole

    2015-11-01

    Current application of work system simulation in participatory ergonomics (PE) design includes a variety of different simulation media. However, the actual influence of the media attributes on the simulation outcome has received less attention. This study investigates two simulation media: full-scale mock-ups and table-top models. The aim is to compare how the media attributes of fidelity and affordance influence the ergonomics identification and evaluation in PE design of hospital work systems. The results illustrate how the full-scale mock-ups' high fidelity of room layout and affordance of tool operation support ergonomics identification and evaluation related to the work system entities space and technologies & tools. The table-top models' high fidelity of function relations and affordance of a helicopter view support ergonomics identification and evaluation related to the entity organization. Furthermore, the study addresses the form of the identified and evaluated conditions, being either identified challenges or tangible design criteria. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  15. Sequential interactions, in which one player plays first and another responds, promote cooperation in evolutionary-dynamical simulations of single-shot Prisoner's Dilemma and Snowdrift games.

    Science.gov (United States)

    Laird, Robert A

    2018-05-21

    Cooperation is a central topic in evolutionary biology because (a) it is difficult to reconcile why individuals would act in a way that benefits others if such action is costly to themselves, and (b) it underpins many of the 'major transitions of evolution', making it essential for explaining the origins of successively higher levels of biological organization. Within evolutionary game theory, the Prisoner's Dilemma and Snowdrift games are the main theoretical constructs used to study the evolution of cooperation in dyadic interactions. In single-shot versions of these games, wherein individuals play each other only once, players typically act simultaneously rather than sequentially. Allowing one player to respond to the actions of its co-player-in the absence of any possibility of the responder being rewarded for cooperation or punished for defection, as in simultaneous or sequential iterated games-may seem to invite more incentive for exploitation and retaliation in single-shot games, compared to when interactions occur simultaneously, thereby reducing the likelihood that cooperative strategies can thrive. To the contrary, I use lattice-based, evolutionary-dynamical simulation models of single-shot games to demonstrate that under many conditions, sequential interactions have the potential to enhance unilaterally or mutually cooperative outcomes and increase the average payoff of populations, relative to simultaneous interactions-benefits that are especially prevalent in a spatially explicit context. This surprising result is attributable to the presence of conditional strategies that emerge in sequential games that can't occur in the corresponding simultaneous versions. Copyright © 2018 Elsevier Ltd. All rights reserved.
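
    A minimal sketch of the kind of interaction studied above: a single-shot sequential Prisoner's Dilemma in which the responder can condition its move on what the first mover did, something that has no analogue in the simultaneous version. The payoff values and the two responder rules are illustrative assumptions; the lattice embedding and evolutionary update dynamics of the study are omitted.

```python
# Illustrative payoff ordering T > R > P > S; the numbers are assumed for the sketch
T, R, P, S = 5, 3, 1, 0

def sequential_game(first_move, responder_rule):
    """One single-shot sequential Prisoner's Dilemma interaction.
    first_move     : 'C' or 'D' played by the first mover
    responder_rule : maps the observed first move to 'C' or 'D'
                     (a conditional strategy only possible in the sequential game)
    Returns (payoff_first_mover, payoff_responder)."""
    second_move = responder_rule(first_move)
    payoffs = {('C', 'C'): (R, R), ('C', 'D'): (S, T),
               ('D', 'C'): (T, S), ('D', 'D'): (P, P)}
    return payoffs[(first_move, second_move)]

copy_cat = lambda move: move            # cooperate iff the first mover cooperated
always_d = lambda move: 'D'             # unconditional defector

print(sequential_game('C', copy_cat))   # (3, 3): mutual cooperation
print(sequential_game('D', copy_cat))   # (1, 1): defection answered in kind
print(sequential_game('C', always_d))   # (0, 5): cooperator exploited
```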

  16. Discrete event simulation: Modeling simultaneous complications and outcomes

    NARCIS (Netherlands)

    Quik, E.H.; Feenstra, T.L.; Krabbe, P.F.M.

    2012-01-01

    OBJECTIVES: To present an effective and elegant model approach to deal with specific characteristics of complex modeling. METHODS: A discrete event simulation (DES) model with multiple complications and multiple outcomes that each can occur simultaneously was developed. In this DES model parameters,

  17. Simulating and Communicating Outcomes in Disaster Management Situations

    Directory of Open Access Journals (Sweden)

    Michal Lichter

    2015-09-01

    Full Text Available An important, but overlooked component of disaster managment is raising the awareness and preparedness of potential stakeholders. We show how recent advances in agent-based modeling and geo-information analytics can be combined to this effect. Using a dynamic simulation model, we estimate the long run outcomes of two very different urban disasters with severe consequences: an earthquake and a missile attack. These differ in terms of duration, intensity, permanence, and focal points. These hypothetical shocks are simulated for the downtown area of Jerusalem. Outcomes are compared in terms of their potential for disaster mitigation. The spatial and temporal dynamics of the simulation yield rich outputs. Web-based mapping is used to visualize these results and communicate risk to policy makers, planners, and the informed public. The components and design of this application are described. Implications for participatory disaster management and planning are discussed.

  18. Using case study within a sequential explanatory design to evaluate the impact of specialist and advanced practice roles on clinical outcomes: the SCAPE study.

    Science.gov (United States)

    Lalor, Joan G; Casey, Dympna; Elliott, Naomi; Coyne, Imelda; Comiskey, Catherine; Higgins, Agnes; Murphy, Kathy; Devane, Declan; Begley, Cecily

    2013-04-08

    The role of the clinical nurse/midwife specialist and advanced nurse/midwife practitioner is complex not least because of the diversity in how the roles are operationalised across health settings and within multidisciplinary teams. This aim of this paper is to use The SCAPE Study: Specialist Clinical and Advanced Practitioner Evaluation in Ireland to illustrate how case study was used to strengthen a Sequential Explanatory Design. In Phase 1, clinicians identified indicators of specialist and advanced practice which were then used to guide the instrumental case study design which formed the second phase of the larger study. Phase 2 used matched case studies to evaluate the effectiveness of specialist and advanced practitioners on clinical outcomes for service users. Data were collected through observation, documentary analysis, and interviews. Observations were made of 23 Clinical Specialists or Advanced Practitioners, and 23 matched clinicians in similar matched non-postholding sites, while they delivered care. Forty-one service users, 41 clinicians, and 23 Directors of Nursing or Midwifery were interviewed, and 279 service users completed a survey based on the components of CS and AP practice identified in Phase 1. A coding framework, and the generation of cross tabulation matrices in NVivo, was used to make explicit how the outcome measures were confirmed and validated from multiple sources. This strengthened the potential to examine single cases that seemed 'different', and allowed for cases to be redefined. Phase 3 involved interviews with policy-makers to set the findings in context. Case study is a powerful research strategy to use within sequential explanatory mixed method designs, and adds completeness to the exploration of complex issues in clinical practice. The design is flexible, allowing the use of multiple data collection methods from both qualitative and quantitative paradigms. Multiple approaches to data collection are needed to evaluate the impact

  19. Sequential Dependencies in Driving

    Science.gov (United States)

    Doshi, Anup; Tran, Cuong; Wilder, Matthew H.; Mozer, Michael C.; Trivedi, Mohan M.

    2012-01-01

    The effect of recent experience on current behavior has been studied extensively in simple laboratory tasks. We explore the nature of sequential effects in the more naturalistic setting of automobile driving. Driving is a safety-critical task in which delayed response times may have severe consequences. Using a realistic driving simulator, we find…

  20. Simulation modeling analysis of sequential relations among therapeutic alliance, symptoms, and adherence to child-centered play therapy between a child with autism spectrum disorder and two therapists.

    Science.gov (United States)

    Goodman, Geoff; Chung, Hyewon; Fischel, Leah; Athey-Lloyd, Laura

    2017-07-01

    This study examined the sequential relations among three pertinent variables in child psychotherapy: therapeutic alliance (TA) (including ruptures and repairs), autism symptoms, and adherence to child-centered play therapy (CCPT) process. A 2-year CCPT of a 6-year-old Caucasian boy diagnosed with autism spectrum disorder was conducted weekly with two doctoral-student therapists, working consecutively for 1 year each, in a university-based community mental-health clinic. Sessions were video-recorded and coded using the Child Psychotherapy Process Q-Set (CPQ), a measure of the TA, and an autism symptom measure. Sequential relations among these variables were examined using simulation modeling analysis (SMA). In Therapist 1's treatment, unexpectedly, autism symptoms decreased three sessions after a rupture occurred in the therapeutic dyad. In Therapist 2's treatment, adherence to CCPT process increased 2 weeks after a repair occurred in the therapeutic dyad. The TA decreased 1 week after autism symptoms increased. Finally, adherence to CCPT process decreased 1 week after autism symptoms increased. The authors concluded that (1) sequential relations differ by therapist even though the child remains constant, (2) therapeutic ruptures can have an unexpected effect on autism symptoms, and (3) changes in autism symptoms can precede as well as follow changes in process variables.
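
    The record above examines lead-lag relations between session-level time series (alliance, symptoms, adherence). As a rough, purely illustrative stand-in for simulation modeling analysis, the sketch below computes lagged cross-correlations between two short series; SMA itself adds bootstrap-based significance testing that is not reproduced here, and the example data are hypothetical.

```python
import numpy as np

def lagged_cross_correlation(x, y, max_lag=3):
    """Correlation between x(t) and y(t + lag) for lag = 1..max_lag.
    A positive lag asks whether changes in x precede changes in y."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return {lag: float(np.corrcoef(x[:-lag], y[lag:])[0, 1])
            for lag in range(1, max_lag + 1)}

# Hypothetical weekly scores: therapeutic-alliance ratings and symptom counts
alliance = [3, 4, 4, 5, 3, 2, 4, 5, 5, 4, 3, 4]
symptoms = [9, 8, 8, 7, 8, 9, 8, 6, 6, 7, 8, 7]
print(lagged_cross_correlation(alliance, symptoms))
```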

  1. Sequential Oxygenation Index and Organ Dysfunction Assessment within the First 3 Days of Mechanical Ventilation Predict the Outcome of Adult Patients with Severe Acute Respiratory Failure

    Directory of Open Access Journals (Sweden)

    Hsu-Ching Kao

    2013-01-01

    Objective. To determine early predictors of outcome in adult patients with severe acute respiratory failure. Method. 100 consecutive adult patients with severe acute respiratory failure were evaluated in this retrospective study. Data including comorbidities, Sequential Organ Failure Assessment (SOFA) score, Acute Physiological Assessment and Chronic Health Evaluation II (APACHE II) score, PaO2, FiO2, PaO2/FiO2, PEEP, mean airway pressure (mPaw), and oxygenation index (OI) on the 1st and the 3rd day of mechanical ventilation, and the change in OI within 3 days, were recorded. The primary outcome was hospital mortality; the secondary outcome measure was ventilator weaning failure. Results. 38 of 100 patients (38%) died within the study period. 48 patients (48%) failed to wean from the ventilator. Multivariate analysis showed that day 3 OI and SOFA score were independent predictors of hospital mortality. Preexisting cerebrovascular accident (CVA) was the predictor of weaning failure. Results from the Kaplan-Meier method demonstrated that a higher day 3 OI was associated with shorter survival time (log-rank test). Conclusion. Early OI (within 3 days) and SOFA score were predictors of mortality in severe acute respiratory failure. In the future, prospective studies measuring serial OIs in a larger study cohort are required to further consolidate our findings.
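
    The oxygenation index itself is a simple ratio. A commonly used definition is OI = (FiO2 in percent × mean airway pressure) / PaO2; whether the study used exactly this form is not stated in the abstract, so treat the sketch below, including the example values, as illustrative.

```python
def oxygenation_index(fio2_fraction, mean_airway_pressure_cmH2O, pao2_mmHg):
    """Common definition: OI = (FiO2[%] x mPaw) / PaO2.
    Assumed form for illustration; the study's exact formula is not given."""
    return (fio2_fraction * 100.0) * mean_airway_pressure_cmH2O / pao2_mmHg

# Day-1 vs day-3 values for a hypothetical ventilated patient
oi_day1 = oxygenation_index(0.8, 18, 90)   # ~16.0
oi_day3 = oxygenation_index(0.6, 14, 110)  # ~7.6 -> improving oxygenation
print(round(oi_day1, 1), round(oi_day3, 1))
```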

  2. Operational reliability evaluation of restructured power systems with wind power penetration utilizing reliability network equivalent and time-sequential simulation approaches

    DEFF Research Database (Denmark)

    Ding, Yi; Cheng, Lin; Zhang, Yonghong

    2014-01-01

    In the last two decades, wind power generation has been rapidly and widely developed in many regions and countries for tackling the problems of environmental pollution and sustainability of energy supply. However, the high share of intermittent and fluctuating wind power production has also ... and reserve providers, fast reserve providers and transmission network in restructured power systems. A contingency management schema for real-time operation considering its coupling with the day-ahead market is proposed. The time-sequential Monte Carlo simulation is used to model the chronological

  3. Sequential Banking.

    OpenAIRE

    Bizer, David S; DeMarzo, Peter M

    1992-01-01

    The authors study environments in which agents may borrow sequentially from more than one lender. Although debt is prioritized, additional lending imposes an externality on prior debt because, with moral hazard, the probability of repayment of prior loans decreases. Equilibrium interest rates are higher than they would be if borrowers could commit to borrow from at most one bank. Even though the loan terms are less favorable than they would be under commitment, the indebtedness of borrowers i...

  4. Sequential Probability Ratio Tests: Conservative and Robust

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; Shi, Wen

    2017-01-01

    In practice, most computers generate simulation outputs sequentially, so it is attractive to analyze these outputs through sequential statistical methods such as sequential probability ratio tests (SPRTs). We investigate several SPRTs for choosing between two hypothesized values for the mean output
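
    For readers unfamiliar with the technique named in this record, the sketch below implements Wald's classical SPRT for choosing between two hypothesized means of a normally distributed simulation output with known variance. It is a textbook version under the usual Wald boundary approximations, not the conservative or robust variants investigated by the authors; the error rates and data are illustrative.

```python
import math
import random

def wald_sprt(observations, mu0, mu1, sigma, alpha=0.05, beta=0.10):
    """Sequential probability ratio test of H0: mean = mu0 vs H1: mean = mu1
    for i.i.d. normal output with known standard deviation sigma.
    Returns ('accept H0' | 'accept H1' | 'continue sampling', n_used)."""
    lower = math.log(beta / (1 - alpha))        # accept-H0 boundary
    upper = math.log((1 - beta) / alpha)        # accept-H1 boundary
    llr = 0.0
    for n, x in enumerate(observations, start=1):
        # log-likelihood-ratio increment contributed by one normal observation
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "continue sampling", len(observations)

# Toy usage on simulation output whose true mean is 0.5
random.seed(2)
outputs = [random.gauss(0.5, 1.0) for _ in range(200)]
print(wald_sprt(outputs, mu0=0.0, mu1=0.5, sigma=1.0))
```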

  5. Simultaneous sequential monitoring of efficacy and safety led to masking of effects.

    Science.gov (United States)

    van Eekelen, Rik; de Hoop, Esther; van der Tweel, Ingeborg

    2016-08-01

    Usually, sequential designs for clinical trials are applied to the primary (efficacy) outcome. In practice, other outcomes (e.g., safety) will also be monitored and influence the decision whether to stop a trial early. The implications of simultaneous monitoring for trial decision making are as yet unclear. This study examines what happens to the type I error, power, and required sample sizes when one efficacy outcome and one correlated safety outcome are monitored simultaneously using sequential designs. We conducted a simulation study in the framework of a two-arm parallel clinical trial. Interim analyses on the two outcomes were performed independently and simultaneously on the same data sets using four sequential monitoring designs, including O'Brien-Fleming and Triangular Test boundaries. Simulations differed in the values of the correlations and the true effect sizes. When an effect was present in both outcomes, competition was introduced, which decreased power (e.g., from 80% to 60%). Futility boundaries for the efficacy outcome reduced the overall type I error as well as the power for the safety outcome. Monitoring two correlated outcomes, given that both are essential for early trial termination, leads to masking of true effects. Scenarios must be carefully considered when designing sequential trials. Simulation results can help guide trial design. Copyright © 2016 Elsevier Inc. All rights reserved.
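
    The flavour of such a simulation can be conveyed with a much smaller sketch: two correlated standardized outcomes monitored at a handful of interim looks, with the trial stopped the first time either z-statistic crosses a constant boundary. The constant (Pocock-type) boundary, the correlation, the number of looks and the sample sizes are illustrative assumptions and differ from the O'Brien-Fleming and Triangular Test designs used in the study.

```python
import numpy as np

def early_stop_fraction(effect_eff=0.0, effect_saf=0.0, rho=0.5, n_per_look=50,
                        looks=4, z_bound=2.361, n_sim=2000, seed=3):
    """Monte Carlo sketch: a two-arm trial monitored sequentially on an
    efficacy outcome and a correlated safety outcome (both standardized).
    The trial stops at the first look where either |z| crosses a constant
    boundary (2.361 ~ Pocock value for 4 looks, two-sided alpha = 0.05).
    Returns the fraction of simulated trials stopping early."""
    rng = np.random.default_rng(seed)
    cov = [[1.0, rho], [rho, 1.0]]
    stops = 0
    for _ in range(n_sim):
        sum_t, sum_c, n = np.zeros(2), np.zeros(2), 0
        for _ in range(looks):
            t = rng.multivariate_normal([effect_eff, effect_saf], cov, n_per_look)
            c = rng.multivariate_normal([0.0, 0.0], cov, n_per_look)
            sum_t += t.sum(axis=0); sum_c += c.sum(axis=0); n += n_per_look
            z = (sum_t / n - sum_c / n) / np.sqrt(2.0 / n)   # unit-variance outcomes
            if np.any(np.abs(z) >= z_bound):
                stops += 1
                break
    return stops / n_sim

# Under the global null, stopping on *either* outcome happens more often than
# the nominal level for a single outcome; an efficacy effect raises it further.
print(early_stop_fraction())
print(early_stop_fraction(effect_eff=0.3))
```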

  6. Emotion, cognitive load and learning outcomes during simulation training.

    Science.gov (United States)

    Fraser, Kristin; Ma, Irene; Teteris, Elise; Baxter, Heather; Wright, Bruce; McLaughlin, Kevin

    2012-11-01

    Simulation training has emerged as an effective way to complement clinical training of medical students. Yet outcomes from simulation training must be considered suboptimal when 25-30% of students fail to recognise a cardiac murmur on which they were trained 1 hour previously. There are several possible explanations for failure to improve following simulation training, which include the impact of heightened emotions on learning and cognitive overload caused by interactivity with high-fidelity simulators. This study was conducted to assess emotion during simulation training and to explore the relationships between emotion and cognitive load, and diagnostic performance. We trained 84 Year 1 medical students on a scenario of chest pain caused by symptomatic aortic stenosis. After training, students were asked to rate their emotional state and cognitive load. We then provided training on a dyspnoea scenario before asking participants to diagnose the murmur in which they had been trained (aortic stenosis) and a novel murmur (mitral regurgitation). We used factor analysis to identify the principal components of emotion, and then studied the associations between these components of emotion and cognitive load and diagnostic performance. We identified two principal components of emotion, which we felt represented invigoration and tranquillity. Both of these were associated with cognitive load with adjusted regression coefficients of 0.63 (95% confidence interval [CI] 0.28-0.99; p = 0.001) and - 0.44 (95% CI - 0.77 to - 0.10; p = 0.009), respectively. We found a significant negative association between cognitive load and the odds of subsequently identifying the trained murmur (odds ratio 0.27, 95% CI 0.11-0.67; p = 0.004). We found that increased invigoration and reduced tranquillity during simulation training were associated with increased cognitive load, and that the likelihood of correctly identifying a trained murmur declined with increasing cognitive load. Further

  7. A Bayesian Theory of Sequential Causal Learning and Abstract Transfer.

    Science.gov (United States)

    Lu, Hongjing; Rojas, Randall R; Beckers, Tom; Yuille, Alan L

    2016-03-01

    Two key research issues in the field of causal learning are how people acquire causal knowledge when observing data that are presented sequentially, and the level of abstraction at which learning takes place. Does sequential causal learning solely involve the acquisition of specific cause-effect links, or do learners also acquire knowledge about abstract causal constraints? Recent empirical studies have revealed that experience with one set of causal cues can dramatically alter subsequent learning and performance with entirely different cues, suggesting that learning involves abstract transfer, and such transfer effects involve sequential presentation of distinct sets of causal cues. It has been demonstrated that pre-training (or even post-training) can modulate classic causal learning phenomena such as forward and backward blocking. To account for these effects, we propose a Bayesian theory of sequential causal learning. The theory assumes that humans are able to consider and use several alternative causal generative models, each instantiating a different causal integration rule. Model selection is used to decide which integration rule to use in a given learning environment in order to infer causal knowledge from sequential data. Detailed computer simulations demonstrate that humans rely on the abstract characteristics of outcome variables (e.g., binary vs. continuous) to select a causal integration rule, which in turn alters causal learning in a variety of blocking and overshadowing paradigms. When the nature of the outcome variable is ambiguous, humans select the model that yields the best fit with the recent environment, and then apply it to subsequent learning tasks. Based on sequential patterns of cue-outcome co-occurrence, the theory can account for a range of phenomena in sequential causal learning, including various blocking effects, primacy effects in some experimental conditions, and apparently abstract transfer of causal knowledge. Copyright © 2015

  8. Comparison of peripapillary retinal nerve fiber layer loss and visual outcome in fellow eyes following sequential bilateral non-arteritic anterior ischemic optic neuropathy.

    Science.gov (United States)

    Dotan, Gad; Kesler, Anat; Naftaliev, Elvira; Skarf, Barry

    2015-05-01

    To report on the correlation of structural damage to the axons of the optic nerve and visual outcome following bilateral non-arteritic anterior ischemic optic neuropathy. A retrospective review of the medical records of 25 patients with bilateral sequential non-arteritic anterior ischemic optic neuropathy was performed. Outcome measures were peripapillary retinal nerve fiber layer thickness measured with the Stratus optical coherence tomography scanner, visual acuity and visual field loss. Median peripapillary retinal nerve fiber layer (RNFL) thickness, mean deviation (MD) of visual field, and visual acuity of the initially involved NAION eyes (54.00 µm, -17.77 decibels (dB), 0.4, respectively) were comparable to the same parameters measured following development of a second NAION event in the other eye (53.70 µm, p = 0.740; -16.83 dB, p = 0.692; 0.4, p = 0.942, respectively). In patients with bilateral NAION, there was a significant correlation of peripapillary RNFL thickness (r = 0.583, p = 0.002) and MD of the visual field (r = 0.457, p = 0.042) for the pairs of affected eyes, whereas a poor correlation was found in visual acuity of these eyes (r = 0.279, p = 0.176). Peripapillary RNFL thickness following NAION was positively correlated with MD of the visual field (r = 0.312, p = 0.043) and negatively correlated with logMAR visual acuity (r = -0.365, p = 0.009). In patients who experience bilateral NAION, the magnitude of RNFL loss is similar in each eye. There is a greater similarity in visual field loss than in visual acuity between the two NAION-affected eyes of the same individual.

  9. A simulation to study the feasibility of improving the temporal resolution of LAGEOS geodynamic solutions by using a sequential process noise filter

    Science.gov (United States)

    Hartman, Brian Davis

    1995-01-01

    A key drawback to estimating geodetic and geodynamic parameters over time based on satellite laser ranging (SLR) observations is the inability to accurately model all the forces acting on the satellite. Errors associated with the observations and the measurement model can detract from the estimates as well. These 'model errors' corrupt the solutions obtained from the satellite orbit determination process. Dynamical models for satellite motion utilize known geophysical parameters to mathematically detail the forces acting on the satellite. However, these parameters, while estimated as constants, vary over time. These temporal variations must be accounted for in some fashion to maintain meaningful solutions. The primary goal of this study is to analyze the feasibility of using a sequential process noise filter for estimating geodynamic parameters over time from the Laser Geodynamics Satellite (LAGEOS) SLR data. This evaluation is achieved by first simulating a sequence of realistic LAGEOS laser ranging observations. These observations are generated using models with known temporal variations in several geodynamic parameters (along-track drag and the J2, J3, J4, and J5 geopotential coefficients). A standard (non-stochastic) filter and a stochastic process noise filter are then utilized to estimate the model parameters from the simulated observations. The standard non-stochastic filter estimates these parameters as constants over consecutive fixed time intervals. Thus, the resulting solutions contain constant estimates of parameters that vary in time, which limits the temporal resolution and accuracy of the solution. The stochastic process noise filter estimates these parameters as correlated process noise variables. As a result, the stochastic process noise filter has the potential to estimate the temporal variations more accurately since the constraint of estimating the parameters as constants is eliminated. A comparison of the temporal
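
    The contrast drawn above, between estimating a parameter as a constant and estimating it as a process-noise state, can be shown with a scalar sequential filter. The sketch below is a generic random-walk Kalman filter, not the LAGEOS orbit-determination filter: the drifting "parameter", the noise variances and the direct-observation model are illustrative assumptions.

```python
import numpy as np

def kalman_random_walk(y, q, r, x0=0.0, p0=1.0):
    """Scalar sequential filter estimating a slowly varying parameter that is
    observed directly with noise. q = process-noise variance (q = 0 recovers
    the 'estimate it as a constant' behaviour); r = measurement-noise variance."""
    x, p, estimates = x0, p0, []
    for yt in y:
        p = p + q                       # time update: random-walk state
        k = p / (p + r)                 # Kalman gain
        x = x + k * (yt - x)            # measurement update
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

# A parameter that drifts over time: the process-noise filter tracks the drift,
# while the q = 0 filter converges to a single constant estimate.
rng = np.random.default_rng(4)
truth = np.cumsum(rng.normal(0, 0.05, 300)) + 1.0
obs = truth + rng.normal(0, 0.5, 300)
tracked = kalman_random_walk(obs, q=0.0025, r=0.25)
constant = kalman_random_walk(obs, q=0.0, r=0.25)
```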

  10. Forced Sequence Sequential Decoding

    DEFF Research Database (Denmark)

    Jensen, Ole Riis

    In this thesis we describe a new concatenated decoding scheme based on iterations between an inner sequentially decoded convolutional code of rate R=1/4 and memory M=23, and block interleaved outer Reed-Solomon codes with non-uniform profile. With this scheme, decoding with good performance is possible as low as Eb/No=0.6 dB, which is about 1.7 dB below the signal-to-noise ratio that marks the cut-off rate for the convolutional code. This is possible since the iteration process provides the sequential decoders with side information that allows a smaller average load and minimizes the probability of computational overflow. Analytical results for the probability that the first Reed-Solomon word is decoded after C computations are presented. This is supported by simulation results that are also extended to other parameters.

  11. Modelling and sequential simulation of multi-tubular metallic membrane and techno-economics of a hydrogen production process employing thin-layer membrane reactor

    KAUST Repository

    Shafiee, Alireza

    2016-09-24

    A theoretical model for a multi-tubular palladium-based membrane is proposed in this paper and validated against experimental data for two different sized membrane modules that operate at high temperatures. The model is used in a sequential simulation format to describe and analyse pure hydrogen and hydrogen binary mixture separations, and then extended to simulate an industrial scale membrane unit. This model is used as a sub-routine within an ASPEN Plus model to simulate a membrane reactor in a steam reforming hydrogen production plant. A techno-economic analysis is then conducted using the validated model for a plant producing 300 TPD of hydrogen. The plant utilises a thin (2.5 μm), defect-free and selective layer (Pd75Ag25 alloy) membrane reactor. The economic sensitivity analysis proves useful in finding the optimum operating condition that achieves the minimum hydrogen production cost at the break-even point. A hydrogen production cost of 1.98 $/kg is estimated, while the cost of the thin-layer selective membrane is found to constitute 29% of the total process capital cost. These results indicate the competitiveness of this thin-layer membrane process against conventional methods of hydrogen production. © 2016 Hydrogen Energy Publications LLC

  12. Preliminary results of sequential monitoring of simulated clandestine graves in Colombia, South America, using ground penetrating radar and botany.

    Science.gov (United States)

    Molina, Carlos Martin; Pringle, Jamie K; Saumett, Miguel; Hernández, Orlando

    2015-03-01

    In most Latin American countries there are significant numbers of missing people and forced disappearances, 68,000 alone currently in Colombia. Successful detection of shallow buried human remains by forensic search teams is difficult in varying terrain and climates. This research created three simulated clandestine burial styles at two different depths commonly encountered in Latin America to gain knowledge of the optimum forensic geophysics detection techniques. Repeated monitoring of the graves post-burial was undertaken by ground penetrating radar. Radar survey 2D profile results show reasonable detection of half-clothed pig cadavers up to 19 weeks of burial, with decreasing confidence after this time. Simulated burials using skeletonized human remains could not be imaged after 19 weeks of burial, and beheaded and burnt human remains could not be detected throughout the survey period. Horizontal radar time slices showed good early results up to 19 weeks of burial, as more area was covered and bi-directional surveys were collected, but these decreased in amplitude over time. Deeper burials were all harder to image than shallower ones. Analysis of excavated soil found a soil moisture content almost double that reported from temperate climate studies. Vegetation variations over the simulated graves were also noted, which would provide promising indicators for grave detection. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  13. Dietary fibers from mushroom Sclerotia: 2. In vitro mineral binding capacity under sequential simulated physiological conditions of the human gastrointestinal tract.

    Science.gov (United States)

    Wong, Ka-Hing; Cheung, Peter C K

    2005-11-30

    The in vitro mineral binding capacity of three novel dietary fibers (DFs) prepared from mushroom sclerotia, namely Pleurotus tuber-regium, Polyporous rhinocerus, and Wolfiporia cocos, to Ca, Mg, Cu, Fe, and Zn under sequential simulated physiological conditions of the human stomach, small intestine, and colon was investigated and compared. Apart from releasing most of their endogenous Ca (ranging from 96.9 to 97.9% removal) and Mg (ranging from 95.9 to 96.7% removal), simulated physiological conditions of the stomach also attenuated the possible adverse binding effect of the three sclerotial DFs on the exogenous minerals by lowering their cation-exchange capacity (ranging from 20.8 to 32.3%) and removing a substantial amount of their potential mineral chelators, including protein (ranging from 16.2 to 37.8%) and phytate (ranging from 58.5 to 64.2%). The in vitro mineral binding capacity of the three sclerotial DFs under simulated physiological conditions of the small intestine was found to be low, especially for Ca (ranging from 4.79 to 5.91% binding) and Mg (ranging from 3.16 to 4.18% binding), and was highly correlated (r > 0.97) with their residual protein contents. Under simulated physiological conditions of the colon with slightly acidic pH (5.80), only bound Ca was readily released (ranging from 34.2 to 72.3% release) from the three sclerotial DFs, and their potential enhancing effect on passive Ca absorption in the human large intestine was also discussed.

  14. Reliability analysis with the simulator S.ESCAF of a very complex sequential system: the electrical power supply system of a nuclear reactor

    International Nuclear Information System (INIS)

    Blot, M.

    1987-06-01

    The reliability analysis of complex sequential systems, in which the order of arrival of the events must be taken into account, can be very difficult, because the use of the classical modelling technique of Markov diagrams leads to an important limitation on the number of components which can be handled. The desk-top apparatus S.ESCAF, which electronically simulates very closely the behaviour of the system being studied and is very easy to use, even by a non-specialist in electronics, allows one to avoid these drawbacks and to enlarge considerably the analysis possibilities. This paper shows the application of the S.ESCAF method to the electrical power supply system of a nuclear reactor. This system requires the simulation of more than forty components with about sixty events such as failure, repair and refusal to start. A comparison of the times necessary to perform the analysis by these means and by other methods is described, and the advantages of S.ESCAF are presented.

  15. Semiautomatic methods for segmentation of the proliferative tumour volume on sequential FLT PET/CT images in head and neck carcinomas and their relation to clinical outcome

    Energy Technology Data Exchange (ETDEWEB)

    Arens, Anne I.J.; Grootjans, Willem; Oyen, Wim J.G.; Visser, Eric P. [Radboud University Medical Center, Department of Nuclear Medicine, P.O. Box 9101, Nijmegen (Netherlands); Troost, Esther G.C. [Radboud University Medical Center, Department of Radiation Oncology, Nijmegen (Netherlands); Maastricht University Medical Centre, MAASTRO clinic, GROW School for Oncology and Developmental Biology, Maastricht (Netherlands); Hoeben, Bianca A.W.; Bussink, Johan; Kaanders, Johannes H.A.M. [Radboud University Medical Center, Department of Radiation Oncology, Nijmegen (Netherlands); Lee, John A.; Gregoire, Vincent [St-Luc University Hospital, Department of Radiation Oncology, Universite Catholique de Louvain, Brussels (Belgium); Hatt, Mathieu; Visvikis, Dimitris [Laboratoire de Traitement de l' Information Medicale (LaTIM), INSERM UMR1101, Brest (France)

    2014-05-15

    Radiotherapy of head and neck cancer induces changes in tumour cell proliferation during treatment, which can be depicted by the PET tracer ¹⁸F-fluorothymidine (FLT). In this study, three advanced semiautomatic PET segmentation methods for delineation of the proliferative tumour volume (PV) before and during (chemo)radiotherapy were compared and related to clinical outcome. The study group comprised 46 patients with 48 squamous cell carcinomas of the head and neck, treated with accelerated (chemo)radiotherapy, who underwent FLT PET/CT prior to treatment and in the 2nd and 4th week of therapy. Primary gross tumour volumes were visually delineated on CT images (GTV-CT). PVs were visually determined on all PET scans (PV-VIS). The following semiautomatic segmentation methods were applied to the sequential PET scans: background-subtracted relative-threshold level (PV-RTL), a gradient-based method using the watershed transform algorithm and hierarchical clustering analysis (PV-W&C), and a fuzzy locally adaptive Bayesian algorithm (PV-FLAB). Pretreatment PV-VIS correlated best with PV-FLAB and GTV-CT. Correlations with PV-RTL and PV-W&C were weaker although statistically significant. During treatment, PV-VIS, PV-W&C and PV-FLAB decreased significantly over time, with the steepest decline for PV-FLAB. Among these advanced segmentation methods, PV-FLAB was the most robust in segmenting volumes on the third scan (67% of tumours, as compared to 40% for PV-W&C and 27% for PV-RTL). A decrease in PV-FLAB above the median between the pretreatment scan and the scan obtained in the 4th week was associated with better disease-free survival (90% versus 53% at 4 years). In patients with head and neck cancer, FLAB proved to be the best performing method for segmentation of the PV on repeat FLT PET/CT scans during (chemo)radiotherapy. This may
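
    Of the three methods compared above, the background-subtracted relative-threshold idea is the simplest to illustrate. The sketch below thresholds an uptake array at background plus a fixed fraction of the peak-minus-background contrast; the 41% fraction, the background value and the 1-D toy profile are illustrative assumptions, and the watershed/clustering and FLAB algorithms are considerably more involved.

```python
import numpy as np

def relative_threshold_segmentation(uptake, background, fraction=0.41):
    """Background-subtracted relative-threshold segmentation (sketch).
    A voxel is flagged as tumour if its uptake exceeds
    background + fraction * (max_uptake - background).
    The 41% fraction is a commonly quoted illustrative value,
    not necessarily the one used in the study."""
    peak = float(uptake.max())
    threshold = background + fraction * (peak - background)
    return uptake >= threshold

# Toy 1-D "profile" through a tumour surrounded by uniform background uptake
profile = np.array([1.0, 1.1, 1.3, 4.0, 7.5, 8.0, 6.2, 2.0, 1.2, 1.0])
mask = relative_threshold_segmentation(profile, background=1.1)
print(mask.astype(int))   # [0 0 0 1 1 1 1 0 0 0]
```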

  16. How Well Can We Detect Lineage-Specific Diversification-Rate Shifts? A Simulation Study of Sequential AIC Methods.

    Science.gov (United States)

    May, Michael R; Moore, Brian R

    2016-11-01

    Evolutionary biologists have long been fascinated by the extreme differences in species numbers across branches of the Tree of Life. This has motivated the development of statistical methods for detecting shifts in the rate of lineage diversification across the branches of phylogenetic trees. One of the most frequently used methods, MEDUSA, explores a set of diversification-rate models, where each model assigns branches of the phylogeny to a set of diversification-rate categories. Each model is first fit to the data, and the Akaike information criterion (AIC) is then used to identify the optimal diversification model. Surprisingly, the statistical behavior of this popular method is uncharacterized, which is a concern in light of (1) the poor performance of the AIC as a means of choosing among models in other phylogenetic contexts, (2) the ad hoc algorithm used to visit diversification models, and (3) errors that we reveal in the likelihood function used to fit diversification models to the phylogenetic data. Here, we perform an extensive simulation study demonstrating that MEDUSA (1) has a high false-discovery rate (on average, spurious diversification-rate shifts are identified [Formula: see text] of the time), and (2) provides biased estimates of diversification-rate parameters. Understanding the statistical behavior of MEDUSA is critical both to empirical researchers, in order to clarify whether these methods can make reliable inferences from empirical datasets, and to theoretical biologists, in order to clarify the specific problems that need to be solved in order to develop more reliable approaches for detecting shifts in the rate of lineage diversification. [Akaike information criterion; extinction; lineage-specific diversification rates; phylogenetic model selection; speciation.] © The Author(s) 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
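
    As a minimal illustration of the AIC-based selection step described above (this is not MEDUSA itself), the sketch below compares a few candidate diversification-rate models by AIC; the log-likelihoods and parameter counts are invented placeholders, not outputs of any real phylogenetic analysis.

      # Hypothetical AIC comparison of diversification-rate models (illustrative values only).
      def aic(log_likelihood, n_params):
          # Akaike information criterion: lower values indicate a preferred model.
          return 2.0 * n_params - 2.0 * log_likelihood

      candidates = {
          "single rate":     {"logL": -152.4, "k": 2},  # one speciation/extinction pair
          "one rate shift":  {"logL": -147.9, "k": 5},  # two rate classes plus shift placement
          "two rate shifts": {"logL": -146.8, "k": 8},
      }

      scores = {name: aic(m["logL"], m["k"]) for name, m in candidates.items()}
      for name, score in sorted(scores.items(), key=lambda kv: kv[1]):
          print(f"{name:>15s}  AIC = {score:7.2f}")
      print("preferred model:", min(scores, key=scores.get))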

  17. Simultaneous vs. sequential treatment for smoking and weight management in tobacco quitlines: 6 and 12 month outcomes from a randomized trial.

    Science.gov (United States)

    Bush, Terry; Lovejoy, Jennifer; Javitz, Harold; Torres, Alula Jimenez; Wassum, Ken; Tan, Marcia M; Spring, Bonnie

    2018-05-31

    Smoking cessation often results in weight gain, which discourages many smokers from quitting and can increase health risks. Treatments to reduce cessation-related weight gain have been tested in highly controlled trials of in-person treatment, but have never been tested in a real-world setting, which has inhibited dissemination. The Best Quit Study (BQS) is a replication and "real world" translation, using telephone delivery, of a prior in-person efficacy trial: a randomized controlled trial in a quitline setting. Eligible smokers (n = 2540) were randomized to the standard 5-call quitline intervention or to quitline plus simultaneous or sequential weight management. Regression analyses tested the effectiveness of the treatments on self-reported smoking abstinence and weight change at 6 and 12 months. Study enrollees came from 10 commercial employer groups and three state quitlines. Participants were between ages 18 and 72; 65.8% were female, 68.2% white, 23.0% Medicaid-insured, and 76.3% overweight/obese. The follow-up response rate was lower in the simultaneous group than in the control group at 6 months (p = 0.01). While a completers analysis of 30-day point prevalence abstinence detected no differences among groups at 6 or 12 months, multiply imputed abstinence showed quit rate differences at 6 months: simultaneous (40.3%) vs. sequential (48.3%), p = 0.034, and simultaneous vs. control (44.9%), p = 0.043. At 12 months, multiply imputed abstinence was significantly lower for the simultaneous group (40.7%) vs. control (46.0%) and vs. sequential (46.3%). The sequential group completed fewer total calls (3.75) than control (4.16) and than the simultaneous group (3.83), p = 0.01, and fewer weight calls (0.94) than the simultaneous group (2.33). Simultaneous (vs. sequential) delivery of phone/web weight management with cessation treatment in the quitline setting may adversely affect quit rate. Neither a simultaneous nor a sequential approach to addressing weight produced any benefit on

  18. Synthetic Aperture Sequential Beamforming

    DEFF Research Database (Denmark)

    Kortbek, Jacob; Jensen, Jørgen Arendt; Gammelmark, Kim Løkke

    2008-01-01

    A synthetic aperture focusing (SAF) technique denoted Synthetic Aperture Sequential Beamforming (SASB), suitable for 2D and 3D imaging, is presented. The technique differs from prior SAF approaches in that SAF is performed on pre-beamformed data rather than on channel data. The objective is to improve and obtain a more range-independent lateral resolution compared with conventional dynamic receive focusing (DRF) without compromising frame rate. SASB is a two-stage procedure using two separate beamformers. First, a set of B-mode image lines is beamformed using a single focal point in both transmit and receive and stored. The second stage applies the focused image lines from the first stage as input data. The SASB method has been investigated using simulations in Field II and by off-line processing of data acquired with a commercial scanner. The performance of SASB with a static image object is compared with DRF.
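
    To make the two-stage idea concrete, the following sketch implements only a first-stage, fixed-focus delay-and-sum over placeholder channel data; the array geometry, sampling rate, and signals are assumed values for illustration, not those of the cited work.

      # First-stage fixed-focus delay-and-sum (single focal point), illustrative only.
      import numpy as np

      c = 1540.0          # speed of sound [m/s]
      fs = 40e6           # sampling rate [Hz]
      n_elem, n_samp = 64, 2048
      pitch = 0.3e-3      # element pitch [m]
      focus = np.array([0.0, 30e-3])                    # fixed focal point (x, z) [m]

      x_elem = (np.arange(n_elem) - (n_elem - 1) / 2) * pitch
      rng = np.random.default_rng(0)
      channel_data = rng.standard_normal((n_elem, n_samp))   # placeholder RF data

      # Fixed receive delay per element: path difference to the focal point,
      # referenced to the array centre, converted to whole samples.
      dist = np.hypot(x_elem - focus[0], focus[1])
      delays = np.round((dist - focus[1]) / c * fs).astype(int)

      # One first-stage image line: shift each channel by its fixed delay and sum.
      line = np.zeros(n_samp)
      for e in range(n_elem):
          d = delays[e]
          line[: n_samp - d] += channel_data[e, d:]
      print("first-stage line samples:", line[:5])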

  19. Changes in thermo-tolerance and survival under simulated gastrointestinal conditions of Salmonella Enteritidis PT4 and Salmonella Typhimurium PT4 in chicken breast meat after exposure to sequential stresses.

    Science.gov (United States)

    Melo, Adma Nadja Ferreira de; Souza, Geany Targino de; Schaffner, Donald; Oliveira, Tereza C Moreira de; Maciel, Janeeyre Ferreira; Souza, Evandro Leite de; Magnani, Marciane

    2017-06-19

    This study assessed changes in thermo-tolerance and the capability to survive simulated gastrointestinal conditions of Salmonella Enteritidis PT4 and Salmonella Typhimurium PT4 inoculated in chicken breast meat following exposure to stresses (cold, acid and osmotic) commonly imposed during food processing. The effects of the stress imposed by exposure to oregano (Origanum vulgare L.) essential oil (OVEO) on thermo-tolerance were also assessed. After exposure to cold stress (5°C for 5 h) in chicken breast meat, the test strains were sequentially exposed to the different stressing substances (lactic acid, NaCl or OVEO) at sub-lethal amounts, defined on the basis of previously determined minimum inhibitory concentrations, and finally to thermal treatment (55°C for 30 min). Resistant cells from the distinct sequential treatments were exposed to simulated gastrointestinal conditions. Exposure to cold stress did not result in increased tolerance to acid stress (lactic acid: 5 and 2.5 μL/g) for either strain. Cells of S. Typhimurium PT4 and S. Enteritidis PT4 previously exposed to acid stress showed higher thermo-tolerance. The cells that survived the sequential stress exposure (resistant) showed higher tolerance to the simulated gastrointestinal conditions. These findings indicate that sequential exposure to processing stresses can increase thermo-tolerance and enhance the survival under gastrointestinal conditions of S. Enteritidis PT4 and S. Typhimurium PT4. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Geostatistical modeling of the gas emission zone and its in-place gas content for Pittsburgh-seam mines using sequential Gaussian simulation

    Science.gov (United States)

    Karacan, C.O.; Olea, R.A.; Goodman, G.

    2012-01-01

    Determination of the size of the gas emission zone, the locations of gas sources within it, and especially the amount of gas retained in those zones is one of the most important steps for designing a successful methane control strategy and an efficient ventilation system in longwall coal mining. The formation of the gas emission zone and the potential amount of gas-in-place (GIP) that might be available for migration into a mine are functions of local geology and rock properties that usually show spatial variability in continuity and may also show geometric anisotropy. Geostatistical methods are used here for modeling and prediction of gas amounts and for assessing their associated uncertainty in gas emission zones of longwall mines for methane control. This study used core data obtained from 276 vertical exploration boreholes drilled from the surface to the bottom of the Pittsburgh coal seam in a mining district in the Northern Appalachian basin. After identifying important coal and non-coal layers for the gas emission zone, univariate statistical and semivariogram analyses were conducted for data from different formations to define the distribution and continuity of various attributes. Sequential Gaussian simulations were then performed for stochastic assessment of these attributes, such as gas content, strata thickness, and strata displacement. These analyses were followed by calculations of gas-in-place and their uncertainties in the Pittsburgh seam caved zone and fractured zone of longwall mines in this mining district. Grid blanking was used to isolate the volume over the actual panels from the entire modeled district and to calculate gas amounts that were directly related to the emissions in longwall mines. Results indicated that gas-in-place in the Pittsburgh seam, in the caved zone and in the fractured zone, as well as displacements in major rock units, showed spatial correlations that could be modeled and estimated using geostatistical methods. This study showed that GIP volumes may
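
    The core loop of sequential Gaussian simulation can be sketched in a few lines; the 1-D grid, exponential covariance, and parameters below are illustrative assumptions, not the variogram models fitted in the study.

      # Minimal 1-D sequential Gaussian simulation with simple kriging (illustrative only).
      import numpy as np

      rng = np.random.default_rng(42)
      n, corr_range, sill = 50, 10.0, 1.0

      def cov(h):
          # Exponential covariance with practical range corr_range.
          return sill * np.exp(-3.0 * np.abs(h) / corr_range)

      path = rng.permutation(n)           # random simulation path over grid nodes
      values = np.full(n, np.nan)

      for idx in path:
          known = np.where(~np.isnan(values))[0]
          if known.size == 0:
              mean, var = 0.0, sill       # first node: unconditional draw
          else:
              C = cov(known[:, None] - known[None, :])   # data-to-data covariance
              c0 = cov(known - idx)                      # data-to-target covariance
              w = np.linalg.solve(C, c0)                 # simple kriging weights
              mean = w @ values[known]
              var = max(sill - w @ c0, 1e-12)
          values[idx] = rng.normal(mean, np.sqrt(var))   # draw from the local ccdf

      print("simulated field (first 10 nodes):", np.round(values[:10], 2))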

  1. Learning curves and long-term outcome of simulation-based thoracentesis training for medical students

    Science.gov (United States)

    2011-01-01

    Background: Simulation-based medical education has been widely used in medical skills training; however, the effectiveness and long-term outcome of simulation-based training in thoracentesis require further investigation. The purpose of this study was to assess the learning curve of simulation-based thoracentesis training, and to study skills retention and transfer of knowledge to a clinical setting following a simulation-based education intervention in thoracentesis procedures. Methods: Fifty-two medical students were enrolled in this study. Each participant performed five supervised trials on the simulator. Participants' performance was assessed by performance score (PS), procedure time (PT), and participant confidence (PC). Learning curves for each variable were generated. Long-term outcome of the training was measured by retesting and clinical performance evaluation 6 months and 1 year, respectively, after initial training on the simulator. Results: Significant improvements in PS, PT, and PC were noted over the first 3 to 4 test trials. Clinical competency in thoracentesis was improved in participants who received simulation training relative to that of first-year medical residents without such experience. Simulation-based thoracentesis training can significantly improve an individual's performance. The saturation of learning from the simulator can be achieved after four practice sessions. Simulation-based training can assist in long-term retention of skills and can be partially transferred to clinical practice. PMID:21696584

  2. Methods, safety, and early clinical outcomes of dose escalation using simultaneous integrated and sequential boosts in patients with locally advanced gynecologic malignancies.

    Science.gov (United States)

    Boyle, John; Craciunescu, Oana; Steffey, Beverly; Cai, Jing; Chino, Junzo

    2014-11-01

    To evaluate the safety of dose-escalated radiotherapy using a simultaneous integrated boost technique in patients with locally advanced gynecological malignancies. Thirty-nine women with locally advanced gynecological malignancies were treated with intensity modulated radiation therapy utilizing a simultaneous integrated boost (SIB) technique for gross disease in the para-aortic and/or pelvic nodal basins, sidewall extension, or residual primary disease. Women were treated to 45 Gy in 1.8 Gy fractions to elective nodal regions. Gross disease was simultaneously treated to 55 Gy in 2.2 Gy fractions (n=44 sites). An additional sequential boost of 10 Gy in 2 Gy fractions was delivered if deemed appropriate (n=29 sites). Acute and late toxicity, local control in the treated volumes (LC), overall survival (OS), and distant metastases (DM) were assessed. All were treated with a SIB to a dose of 55 Gy. Twenty-four patients were subsequently treated with a sequential boost to a median dose of 65 Gy. Median follow-up was 18 months. Rates of acute > grade 2 gastrointestinal (GI), genitourinary (GU), and hematologic (heme) toxicities were 2.5%, 0%, and 30%, respectively. There were no grade 4 acute toxicities. At one year, rates of grade 1-2 late GI toxicities were 24.5%. There were no grade 3 or 4 late GI toxicities. Rates of grade 1-2 late GU toxicities were 12.7%. There were no grade 3 or 4 late GU toxicities. Dose-escalated radiotherapy using a SIB results in acceptable rates of acute toxicity. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Estimation After a Group Sequential Trial.

    Science.gov (United States)

    Milanzi, Elasma; Molenberghs, Geert; Alonso, Ariel; Kenward, Michael G; Tsiatis, Anastasios A; Davidian, Marie; Verbeke, Geert

    2015-10-01

    Group sequential trials are one important instance of studies for which the sample size is not fixed a priori but rather takes one of a finite set of pre-specified values, dependent on the observed data. Much work has been devoted to the inferential consequences of this design feature. Molenberghs et al (2012) and Milanzi et al (2012) reviewed and extended the existing literature, focusing on a collection of seemingly disparate, but related, settings, namely completely random sample sizes, group sequential studies with deterministic and random stopping rules, incomplete data, and random cluster sizes. They showed that the ordinary sample average is a viable option for estimation following a group sequential trial, for a wide class of stopping rules and for random outcomes with a distribution in the exponential family. Their results are somewhat surprising in the sense that the sample average is not optimal, and further, there does not exist an optimal, or even unbiased, linear estimator. However, the sample average is asymptotically unbiased, both conditionally upon the observed sample size as well as marginalized over it. By exploiting ignorability they showed that the sample average is the conventional maximum likelihood estimator. They also showed that a conditional maximum likelihood estimator is finite sample unbiased, but is less efficient than the sample average and has a larger mean squared error. Asymptotically, the sample average and the conditional maximum likelihood estimator are equivalent. This previous work is restricted, however, to the situation in which the random sample size can take only two values, N = n or N = 2n. In this paper, we consider the more practically useful setting of sample sizes in the finite set {n_1, n_2, …, n_L}. It is shown that the sample average is then a justifiable estimator, in the sense that it follows from joint likelihood estimation, and it is consistent and asymptotically unbiased. We also show why
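
    A small Monte Carlo sketch of the setting discussed above: a trial whose sample size takes one of two values depending on a deterministic stopping rule, with the ordinary sample average computed at the end. The effect size, threshold, and sample sizes are arbitrary choices for illustration only.

      # Group sequential trial with sample size in {n1, n2}; illustrative parameters.
      import numpy as np

      rng = np.random.default_rng(1)
      mu_true, n1, n2, threshold, reps = 0.3, 20, 40, 0.5, 100_000

      estimates, sizes = [], []
      for _ in range(reps):
          first = rng.normal(mu_true, 1.0, n1)
          if first.mean() > threshold:                 # stop early at the interim look
              estimates.append(first.mean())
              sizes.append(n1)
          else:                                        # otherwise enrol up to n2
              full = np.concatenate([first, rng.normal(mu_true, 1.0, n2 - n1)])
              estimates.append(full.mean())
              sizes.append(n2)

      # The marginal mean of the sample average shows a small finite-sample bias
      # relative to mu_true; it shrinks as n1 and n2 grow, in line with asymptotic
      # (rather than exact) unbiasedness.
      print("marginal mean of sample average:", round(float(np.mean(estimates)), 4))
      print("proportion stopping early:", round(sizes.count(n1) / reps, 3))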

  4. New media simulation stories in nursing education: a quasi-experimental study exploring learning outcomes.

    Science.gov (United States)

    Webb-Corbett, Robin; Schwartz, Melissa Renee; Green, Bob; Sessoms, Andrea; Swanson, Melvin

    2013-04-01

    New media simulation stories are short multimedia presentations that combine simulation, digital technology, and story branching to depict a variety of healthcare-related scenarios. The purpose of this study was to explore whether learning outcomes were enhanced if students viewed the results of both correct and incorrect nursing actions demonstrated through new media simulation stories. A convenience sample of 109 undergraduate nursing students in a family-centered maternity course participated in the study. Study findings suggest that students who viewed both correct and incorrect depictions of maternity nursing actions scored better on tests than did those students who viewed only correct nursing actions.

  5. Cognitive Transfer Outcomes for a Simulation-Based Introductory Statistics Curriculum

    Science.gov (United States)

    Backman, Matthew D.; Delmas, Robert C.; Garfield, Joan

    2017-01-01

    Cognitive transfer is the ability to apply learned skills and knowledge to new applications and contexts. This investigation evaluates cognitive transfer outcomes for a tertiary-level introductory statistics course using the CATALST curriculum, which exclusively used simulation-based methods to develop foundations of statistical inference. A…

  6. Evaluating Outcomes of High Fidelity Simulation Curriculum in a Community College Nursing Program

    Science.gov (United States)

    Denlea, Gregory Richard

    2017-01-01

    This study took place at Wake Technical Community College, a multi-campus institution in Raleigh, North Carolina. An evaluation of the return on investment in high-fidelity simulation used by an associate degree in nursing program was conducted with valid and reliable instruments. The study demonstrated that comparable student outcomes are…

  7. The Impact of Simulated Nature on Patient Outcomes: A Study of Photographic Sky Compositions.

    Science.gov (United States)

    Pati, Debajyoti; Freier, Patricia; O'Boyle, Michael; Amor, Cherif; Valipoor, Shabboo

    2016-01-01

    To examine whether incorporation of simulated nature, in the form of ceiling-mounted photographic sky compositions, influences patient outcomes. Previous studies have shown that most forms of nature exposure have a positive influence on patients. However, earlier studies have mostly focused on wall-hung nature representations. The emergence of simulated nature products has raised the question of the effects of this new product type on patient outcomes. A between-subject experimental design was adopted, in which outcomes from five inpatient rooms with a sky composition ceiling fixture were compared with corresponding outcomes in five identical rooms without the intervention. Data were collected from a total of 181 subjects on 11 outcomes. Independent sample tests were performed to identify differences in mean outcomes. Significant positive outcomes were observed in environmental satisfaction and diastolic blood pressure (BP). Environmental satisfaction in the experimental group was 12.4% higher than in the control group. Directions of association for diastolic BP, nausea/indigestion medication, acute stress, anxiety, pain, and environmental satisfaction were consistent with the a priori hypotheses. A post hoc exploratory assessment involving patients who did not self-request additional pain and sleep medication showed confirmatory directions for all outcomes except systolic BP, and statistically significant outcomes for acute stress and anxiety: acute stress and anxiety levels of the experimental group were 53.4% and 34.79% lower, respectively, than those of the control group. The salutogenic benefits of photographic sky compositions render them better than traditional ceiling tiles and offer an alternative to other nature interventions. © The Author(s) 2015.

  8. Perceived learning outcome: the relationship between experience, realism and situation awareness during simulator training.

    Science.gov (United States)

    Saus, Evelyn-Rose; Johnsen, Bjørn Helge; Eid, Jarle

    2010-01-01

    Navigation errors are a frequent cause of serious accidents and work-related injuries among seafarers. The present study investigated the effects of experience, perceived realism, and situation awareness (SA) on the perceived learning outcome of simulator-based navigation training. Thirty-two Norwegian Navy officer cadets were assigned to low and high mental workload conditions based on previous educational and navigational experience. In the low mental workload condition, experience (negatively associated), perceived realism, and subjective SA explained almost half of the total variance in perceived learning outcome. A hierarchical regression analysis showed that only subjective SA made a unique contribution to the learning outcome. In the high mental workload condition, perceived realism and subjective SA together explained almost half of the variance in perceived learning outcome. Furthermore, both perceived realism and subjective SA were shown to make an independent contribution to perceived learning outcomes. The results of this study show that in order to enhance the learning outcomes from simulator training it is necessary to design training procedures and scenarios that enable students to achieve functional fidelity and to generate and maintain SA during training. This can further improve safety and reduce the risk of maritime disasters.

  9. A critical review of simulation-based mastery learning with translational outcomes.

    Science.gov (United States)

    McGaghie, William C; Issenberg, Saul B; Barsuk, Jeffrey H; Wayne, Diane B

    2014-04-01

    This article has two objectives. Firstly, we critically review simulation-based mastery learning (SBML) research in medical education, evaluate its implementation and immediate results, and document measured downstream translational outcomes in terms of improved patient care practices, better patient outcomes and collateral effects. Secondly, we briefly address implementation science and its importance in the dissemination of innovations in medical education and health care. This is a qualitative synthesis of SBML with translational (T) science research reports spanning a period of 7 years (2006-2013). We use the 'critical review' approach proposed by Norman and Eva to synthesise findings from 23 medical education studies that employ the mastery learning model and measure downstream translational outcomes. Research in SBML in medical education has addressed a range of interpersonal and technical skills. Measured outcomes have been achieved in educational laboratories (T1), and as improved patient care practices (T2), patient outcomes (T3) and collateral effects (T4). Simulation-based mastery learning in medical education can produce downstream results. Such results derive from integrated education and health services research programmes that are thematic, sustained and cumulative. The new discipline of implementation science holds promise to explain why medical education innovations are adopted slowly and how to accelerate innovation dissemination. © 2014 John Wiley & Sons Ltd.

  10. Sequential charged particle reaction

    International Nuclear Information System (INIS)

    Hori, Jun-ichi; Ochiai, Kentaro; Sato, Satoshi; Yamauchi, Michinori; Nishitani, Takeo

    2004-01-01

    The effective cross sections for producing sequential reaction products in F82H, pure vanadium and LiF for 14.9-MeV neutrons were obtained and compared with estimated values. Since the sequential reactions depend on the behaviour of the secondary charged particles, the effective cross sections depend on the target nuclei and the material composition. The effective cross sections were also estimated using the EAF libraries and compared with the experimental ones; there were large discrepancies between estimated and experimental values. Additionally, we show the contribution of the sequential reactions to the induced activity and dose rate in the boundary region with water. From the present study, it has been clarified that the sequential reactions are of great importance for evaluating the dose rates around the surface of cooling pipes and the activated corrosion products. (author)

  11. Simulation Performance and National Council Licensure Examination for Registered Nurses Outcomes: Field Research Perspectives.

    Science.gov (United States)

    Brackney, Dana E; Lane, Susan Hayes; Dawson, Tyia; Koontz, Angie

    2017-11-01

    This descriptive field study examines processes used to evaluate simulation for senior-level Bachelor of Science in Nursing (BSN) students in a capstone course, discusses challenges related to simulation evaluation, and reports the relationship between faculty evaluation of student performance and National Council Licensure Examination for Registered Nurses (NCLEX-RN) first-time passing rates. Researchers applied seven terms used to rank BSN student performance (n = 41, female, ages 22-24 years) in a senior-level capstone simulation. Faculty evaluation was correlated with students' NCLEX-RN outcomes. Students evaluated as "lacking confidence" and "flawed" were less likely to pass the NCLEX-RN on the first attempt. Faculty evaluation of capstone simulation performance provided additional evidence of student preparedness for practice in the RN role, as evidenced by the relationship between the faculty assessment and NCLEX-RN success. Simulation has been broadly accepted as a powerful educational tool that may also contribute to verification of student achievement of program outcomes and readiness for the RN role.

  12. Simulation as a new tool to establish benchmark outcome measures in obstetrics.

    Directory of Open Access Journals (Sweden)

    Matt M Kurrek

    Full Text Available There are not enough clinical data from rare critical events to calculate statistics to decide if the management of actual events might be below what could reasonably be expected (i.e., was an outlier). In this project we used simulation to describe the distribution of management times as an approach to decide if the management of a simulated obstetrical crisis scenario could be considered an outlier. Twelve obstetrical teams managed 4 scenarios that were previously developed. Relevant outcome variables were defined by expert consensus. The distribution of the response times from the teams who performed the respective intervention was graphically displayed, and medians and quartiles were calculated using rank order statistics. Only 7 of the 12 teams performed chest compressions during the arrest following the 'cannot intubate/cannot ventilate' scenario. All other outcome measures were performed by at least 11 of the 12 teams. Calculation of medians and quartiles with 95% CI was possible for all outcomes. Confidence intervals, given the small sample size, were large. We demonstrated the use of simulation to calculate quantiles for management times of critical events. This approach could assist in deciding if a given performance could be considered normal and also point to aspects of care that seem to pose particular challenges, as evidenced by a large number of teams not performing the expected maneuver. However, sufficiently large sample sizes (i.e., from a national database) will be required to calculate acceptable confidence intervals and to establish actual tolerance limits.

  13. Patient outcomes in simulation-based medical education: a systematic review.

    Science.gov (United States)

    Zendejas, Benjamin; Brydges, Ryan; Wang, Amy T; Cook, David A

    2013-08-01

    Evaluating the patient impact of health professions education is a societal priority with many challenges. Researchers would benefit from a summary of topics studied and potential methodological problems. We sought to summarize key information on patient outcomes identified in a comprehensive systematic review of simulation-based instruction. Systematic search of MEDLINE, EMBASE, CINAHL, PsychINFO, Scopus, key journals, and bibliographies of previous reviews through May 2011. Original research in any language measuring the direct effects on patients of simulation-based instruction for health professionals, in comparison with no intervention or other instruction. Two reviewers independently abstracted information on learners, topics, study quality including unit of analysis, and validity evidence. We pooled outcomes using random effects. From 10,903 articles screened, we identified 50 studies reporting patient outcomes for at least 3,221 trainees and 16,742 patients. Clinical topics included airway management (14 studies), gastrointestinal endoscopy (12), and central venous catheter insertion (8). There were 31 studies involving postgraduate physicians and seven studies each involving practicing physicians, nurses, and emergency medicine technicians. Fourteen studies (28 %) used an appropriate unit of analysis. Measurement validity was supported in seven studies reporting content evidence, three reporting internal structure, and three reporting relations with other variables. The pooled Hedges' g effect size for 33 comparisons with no intervention was 0.47 (95 % confidence interval [CI], 0.31-0.63); and for nine comparisons with non-simulation instruction, it was 0.36 (95 % CI, -0.06 to 0.78). Focused field in education; high inconsistency (I(2) > 50 % in most analyses). Simulation-based education was associated with small-moderate patient benefits in comparison with no intervention and non-simulation instruction, although the latter did not reach statistical
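
    For readers unfamiliar with the pooling step, the sketch below applies DerSimonian-Laird random-effects pooling to a handful of invented standardized mean differences; these numbers are illustrative placeholders, not the data from the review.

      # DerSimonian-Laird random-effects pooling of SMDs (hypothetical inputs).
      import numpy as np

      g = np.array([0.30, 0.55, 0.42, 0.70, 0.25])   # per-study effect sizes (Hedges' g)
      v = np.array([0.04, 0.09, 0.05, 0.12, 0.03])   # per-study variances

      w_fixed = 1.0 / v
      fixed_mean = np.sum(w_fixed * g) / np.sum(w_fixed)
      q = np.sum(w_fixed * (g - fixed_mean) ** 2)            # Cochran's Q
      df = len(g) - 1
      c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
      tau2 = max(0.0, (q - df) / c)                          # between-study variance

      w = 1.0 / (v + tau2)                                   # random-effects weights
      pooled = np.sum(w * g) / np.sum(w)
      se = np.sqrt(1.0 / np.sum(w))
      i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0    # inconsistency statistic

      print(f"pooled g = {pooled:.2f} "
            f"(95% CI {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f}), I^2 = {i2:.0f}%")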

  14. Interprofessional teamwork skills as predictors of clinical outcomes in a simulated healthcare setting.

    Science.gov (United States)

    Shrader, Sarah; Kern, Donna; Zoller, James; Blue, Amy

    2013-01-01

    Teaching interprofessional (IP) teamwork skills is a goal of interprofessional education. The purpose of this study was to examine the relationship between IP teamwork skills, attitudes and clinical outcomes in a simulated clinical setting. One hundred twenty health professions students (medicine, pharmacy, physician assistant) worked in interprofessional teams to manage a "patient" in a health care simulation setting. Students completed the Interdisciplinary Education Perception Scale (IEPS) attitudinal survey instrument. Students' responses were averaged by team to create an IEPS attitudes score. Teamwork skills for each team were rated by trained observers using a checklist to calculate a teamwork score (TWS). Clinical outcome scores (COS) were determined by summation of completed clinical tasks performed by the team based on an expert-developed checklist. Regression analyses were conducted to determine the relationship of IEPS and TWS with COS. IEPS score was not a significant predictor of COS (p=0.054), but TWS was a significant predictor of COS, suggesting that students' interprofessional teamwork skills are significant predictors of positive clinical outcomes. Interprofessional curricular models that produce effective teamwork skills can improve student performance in clinical environments and likely improve teamwork practice to positively affect patient care outcomes.

  15. Immediate Sequential Bilateral Cataract Surgery

    DEFF Research Database (Denmark)

    Kessel, Line; Andresen, Jens; Erngaard, Ditte

    2015-01-01

    The aim of the present systematic review was to examine the benefits and harms associated with immediate sequential bilateral cataract surgery (ISBCS), with specific emphasis on the rate of complications, postoperative anisometropia, and subjective visual function, in order to formulate evidence-based national Danish guidelines for cataract surgery. A systematic literature review in PubMed, Embase, and Cochrane central databases identified three randomized controlled trials that compared outcomes in patients randomized to ISBCS or bilateral cataract surgery on two different dates. Meta-analyses were performed using the Cochrane Review Manager software. The quality of the evidence was assessed using the GRADE method (Grading of Recommendation, Assessment, Development, and Evaluation). We did not find any difference in the risk of complications or visual outcome in patients randomized to ISBCS or surgery on two different dates.

  16. Learning outcomes associated with patient simulation method in pharmacotherapy education: an integrative review.

    Science.gov (United States)

    Aura, Suvi M; Sormunen, Marjorita S T; Jordan, Sue E; Tossavainen, Kerttu A; Turunen, Hannele E

    2015-06-01

    The aims of this systematic integrative review were to identify evidence for the use of patient simulation teaching methods in pharmacotherapy education and to explore related learning outcomes. A systematic literature search was conducted using 6 databases as follows: CINAHL, PubMed, SCOPUS, ERIC, MEDIC, and the Cochrane Library, using the key words relating to patient simulation and pharmacotherapy. The methodological quality of each study was evaluated. Eighteen articles met the inclusion criteria. The earliest article was published in 2005. The selected research articles were subjected to qualitative content analysis. Patient simulation has been used in pharmacotherapy education for preregistration nursing, dental, medical, and pharmacy students and for the continuing education of nurses. Learning outcomes reported were summarized as follows: (1) commitment to pharmacotherapy learning, (2) development of pharmacotherapy evaluation skills, (3) improvement in pharmacotherapy application skills, and (4) knowledge and understanding of pharmacotherapy. To develop effective teaching methods and ensure health care professionals' competence in medication management, further research is needed to determine the educational and clinical effectiveness of simulation teaching methods.

  17. The Bacterial Sequential Markov Coalescent.

    Science.gov (United States)

    De Maio, Nicola; Wilson, Daniel J

    2017-05-01

    Bacteria can exchange and acquire new genetic material from other organisms directly and via the environment. This process, known as bacterial recombination, has a strong impact on the evolution of bacteria, for example, leading to the spread of antibiotic resistance across clades and species, and to the avoidance of clonal interference. Recombination hinders phylogenetic and transmission inference because it creates patterns of substitutions (homoplasies) inconsistent with the hypothesis of a single evolutionary tree. Bacterial recombination is typically modeled as statistically akin to gene conversion in eukaryotes, i.e., using the coalescent with gene conversion (CGC). However, this model can be very computationally demanding as it needs to account for the correlations of evolutionary histories of even distant loci. So, with the increasing popularity of whole genome sequencing, the need has emerged for a faster approach to model and simulate bacterial genome evolution. We present a new model that approximates the coalescent with gene conversion: the bacterial sequential Markov coalescent (BSMC). Our approach is based on a similar idea to the sequential Markov coalescent (SMC), an approximation of the coalescent with crossover recombination. However, bacterial recombination poses hurdles to a sequential Markov approximation, as it leads to strong correlations and linkage disequilibrium across very distant sites in the genome. Our BSMC overcomes these difficulties, and shows a considerable reduction in computational demand compared to the exact CGC, and very similar patterns in simulated data. We implemented our BSMC model within new simulation software FastSimBac. In addition to the decreased computational demand compared to previous bacterial genome evolution simulators, FastSimBac provides more general options for evolutionary scenarios, allowing population structure with migration, speciation, population size changes, and recombination hotspots. FastSimBac is

  18. Production of DagA and ethanol by sequential utilization of sugars in a mixed-sugar medium simulating microalgal hydrolysate.

    Science.gov (United States)

    Park, Juyi; Hong, Soon-Kwang; Chang, Yong Keun

    2015-09-01

    A novel two-step fermentation process using a mixed-sugar medium mimicking microalgal hydrolysate has been proposed to avoid glucose repression and thus maximize substrate utilization efficiency. When DagA, a β-agarase, was produced in one step in the mixed-sugar medium by using a recombinant Streptomyces lividans, glucose was found to have negative effects on the consumption of the other sugars and on DagA biosynthesis, causing low substrate utilization efficiency and low DagA productivity. To overcome these difficulties, a new strategy of sequential substrate utilization was developed. In the first step, glucose was consumed by Saccharomyces cerevisiae together with galactose and mannose, producing ethanol, after which DagA was produced from the remaining sugars of xylose, rhamnose and ribose. Fucose was not consumed. By adopting this two-step process, the overall substrate utilization efficiency was increased approximately 3-fold, with a nearly 2-fold improvement in DagA production, in addition to the benefit of ethanol production. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Sequential stochastic optimization

    CERN Document Server

    Cairoli, Renzo

    1996-01-01

    Sequential Stochastic Optimization provides mathematicians and applied researchers with a well-developed framework in which stochastic optimization problems can be formulated and solved. Offering much material that is either new or has never before appeared in book form, it lucidly presents a unified theory of optimal stopping and optimal sequential control of stochastic processes. This book has been carefully organized so that little prior knowledge of the subject is assumed; its only prerequisites are a standard graduate course in probability theory and some familiarity with discrete-paramet

  20. Does teaching non-technical skills to medical students improve those skills and simulated patient outcome?

    Science.gov (United States)

    Hagemann, Vera; Herbstreit, Frank; Kehren, Clemens; Chittamadathil, Jilson; Wolfertz, Sandra; Dirkmann, Daniel; Kluge, Annette; Peters, Jürgen

    2017-03-29

    The purpose of this study is to evaluate the effects of a tailor-made, non-technical skills seminar on medical students' behaviour, attitudes, and performance during simulated patient treatment. Seventy-seven students were randomized to either a non-technical skills seminar (NTS group, n=43) or a medical seminar (control group, n=34). Human patient simulation was used as an evaluation tool. Before the seminars, all students performed the same simulated emergency scenario to provide baseline measurements. After the seminars, all students were exposed to a second scenario, and behavioural markers for evaluating their non-technical skills were rated. Furthermore, teamwork-relevant attitudes were measured before and after the scenarios, and perceived stress was measured following each simulation. All simulations were also evaluated for various medical endpoints. Non-technical skills concerning situation awareness improved in the NTS group, demonstrating the potential of a tailored seminar on non-technical skills to improve students' non-technical skills. In a next step, to improve students' handling of emergencies and patient outcomes, non-technical skills seminars should be accompanied by exercises and more broadly embedded in the medical school curriculum.

  1. Planning Irreversible Electroporation in the Porcine Kidney: Are Numerical Simulations Reliable for Predicting Empiric Ablation Outcomes?

    International Nuclear Information System (INIS)

    Wimmer, Thomas; Srimathveeravalli, Govindarajan; Gutta, Narendra; Ezell, Paula C.; Monette, Sebastien; Maybody, Majid; Erinjery, Joseph P.; Durack, Jeremy C.; Coleman, Jonathan A.; Solomon, Stephen B.

    2015-01-01

    Purpose: Numerical simulations are used for treatment planning in clinical applications of irreversible electroporation (IRE) to determine ablation size and shape. To assess the reliability of simulations for treatment planning, we compared simulation results with empiric outcomes of renal IRE using computed tomography (CT) and histology in an animal model. Methods: The ablation size and shape for six different IRE parameter sets (70–90 pulses, 2,000–2,700 V, 70–100 µs) for monopolar and bipolar electrodes was simulated using a numerical model. Employing these treatment parameters, 35 CT-guided IRE ablations were created in both kidneys of six pigs and followed up with CT immediately and after 24 h. Histopathology was analyzed from postablation day 1. Results: Ablation zones on CT measured 81 ± 18 % (day 0, p ≤ 0.05) and 115 ± 18 % (day 1, p ≤ 0.09) of the simulated size for monopolar electrodes, and 190 ± 33 % (day 0, p ≤ 0.001) and 234 ± 12 % (day 1, p ≤ 0.0001) for bipolar electrodes. Histopathology indicated smaller ablation zones than simulated (71 ± 41 %, p ≤ 0.047) and measured on CT (47 ± 16 %, p ≤ 0.005), with complete ablation of kidney parenchyma within the central zone and incomplete ablation in the periphery. Conclusion: Both numerical simulations for planning renal IRE and CT measurements may overestimate the size of ablation compared to histology, and ablation effects may be incomplete in the periphery.

  2. Automated outcome scoring in a virtual reality simulator for endodontic surgery.

    Science.gov (United States)

    Yin, Myat Su; Haddawy, Peter; Suebnukarn, Siriwan; Rhienmora, Phattanapon

    2018-01-01

    We address the problem of automated outcome assessment in a virtual reality (VR) simulator for endodontic surgery. Outcome assessment is an essential component of any system that provides formative feedback, which requires assessing the outcome, relating it to the procedure, and communicating in a language natural to dental students. This study takes a first step toward automated generation of such comprehensive feedback. Virtual reference templates are computed based on tooth anatomy, and the outcome is assessed with a 3D score cube volume which consists of voxel-level non-linear weighted scores based on the templates. The detailed scores are transformed into the standard scoring language used by dental schools. The system was evaluated on fifteen outcome samples that contained optimal results and those with errors, including perforation of the walls, floor, and both, as well as various combinations of major and minor over- and under-drilling errors. Five endodontists who had professional training and varying levels of experience in root canal treatment participated as raters in the experiment. Results from evaluation of our system with expert endodontists show a high degree of agreement with expert scores (information-based measure of disagreement 0.04-0.21). At the same time they show some disagreement among human expert scores, reflecting the subjective nature of human outcome scoring. The discriminatory power of the AOS scores was analyzed across three grade tiers (A, B, C) using the area under the receiver operating characteristic curve (AUC). The AUC values are generally highest for the {AB: C} cutoff, which is the cutoff at the boundary between clinically acceptable (B) and clinically unacceptable (C) grades. The objective consistency of computed scores and the high degree of agreement with experts make the proposed system a promising addition to existing VR simulators. The translation of detailed level scores into terminology commonly used in dental surgery supports natural
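
    The discrimination analysis mentioned above can be illustrated with a tiny example: computing the area under the ROC curve for the {AB: C} cutoff from automated scores via the Mann-Whitney formulation. The scores and grades below are invented for illustration, not data from the study.

      # ROC AUC for separating acceptable (A/B) from unacceptable (C) outcomes.
      import numpy as np

      auto_score = np.array([0.91, 0.85, 0.78, 0.74, 0.66, 0.60, 0.52, 0.41])
      acceptable = np.array([1,    1,    1,    1,    0,    1,    0,    0])  # 1 = A/B, 0 = C

      pos = auto_score[acceptable == 1]
      neg = auto_score[acceptable == 0]
      # AUC = P(score_pos > score_neg), counting ties as one half.
      greater = (pos[:, None] > neg[None, :]).sum()
      ties = (pos[:, None] == neg[None, :]).sum()
      auc = (greater + 0.5 * ties) / (pos.size * neg.size)
      print(f"AUC at the AB|C cutoff: {auc:.2f}")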

  3. Sequential memory: Binding dynamics

    Science.gov (United States)

    Afraimovich, Valentin; Gong, Xue; Rabinovich, Mikhail

    2015-10-01

    Temporal order memories are critical for everyday animal and human functioning. Experiments and our own experience show that the binding or association of various features of an event together and the maintaining of multimodality events in sequential order are the key components of any sequential memories—episodic, semantic, working, etc. We study a robustness of binding sequential dynamics based on our previously introduced model in the form of generalized Lotka-Volterra equations. In the phase space of the model, there exists a multi-dimensional binding heteroclinic network consisting of saddle equilibrium points and heteroclinic trajectories joining them. We prove here the robustness of the binding sequential dynamics, i.e., the feasibility phenomenon for coupled heteroclinic networks: for each collection of successive heteroclinic trajectories inside the unified networks, there is an open set of initial points such that the trajectory going through each of them follows the prescribed collection staying in a small neighborhood of it. We show also that the symbolic complexity function of the system restricted to this neighborhood is a polynomial of degree L - 1, where L is the number of modalities.
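
    For reference, generalized Lotka-Volterra dynamics of the kind referred to above are commonly written as follows; this is only the generic textbook form, and the specific growth rates and coupling matrix of the cited model are not reproduced here.

      \dot{x}_i \;=\; x_i \Big( \sigma_i - \sum_{j=1}^{N} \rho_{ij}\, x_j \Big), \qquad i = 1, \dots, N,

    where x_i ≥ 0 is the activity of mode i, σ_i its growth rate, and the asymmetric coupling matrix ρ_ij gives rise to the saddle equilibria and connecting heteroclinic trajectories described in the abstract.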

  4. Mining compressing sequential problems

    NARCIS (Netherlands)

    Hoang, T.L.; Mörchen, F.; Fradkin, D.; Calders, T.G.K.

    2012-01-01

    Compression based pattern mining has been successfully applied to many data mining tasks. We propose an approach based on the minimum description length principle to extract sequential patterns that compress a database of sequences well. We show that mining compressing patterns is NP-Hard and

  5. Learning outcomes evaluation of a simulation-based introductory course to anaesthesia.

    Science.gov (United States)

    Rábago, J L; López-Doueil, M; Sancho, R; Hernández-Pinto, P; Neira, N; Capa, E; Larraz, E; Redondo-Figuero, C G; Maestre, J M

    2017-10-01

    An increased number of errors and reduced patient safety have been reported during the incorporation of residents, as this period involves learning new skills. The objectives were to evaluate the learning outcomes of an immersive simulation boot-camp for incoming residents before starting the clinical rotations. Airway assessment, airway control with direct laryngoscopy, and epidural catheterization competencies were evaluated. Twelve first-year anaesthesiology residents participated. A prospective study to evaluate transfer of endotracheal intubation skills learned at the simulation centre to clinical practice (primary outcome) was conducted. A checklist of 28 skills and behaviours was used to assess the first supervised intubation performed during anaesthesia induction in ASA I/II patients. Secondary outcome was self-efficacy to perform epidural catheterization. A satisfaction survey was also performed. Seventy-five percent of residents completed more than 21 out of 28 skills and behaviours to assess and control the airway during their first intubation in patients. Twelve items were performed by all residents and 5 by half of them. More than 83% of participants reported a high level of self-efficacy in placing an epidural catheter. All participants would recommend the course to their colleagues. A focused intensive simulation-based boot-camp addressing key competencies required to begin anaesthesia residency was well received, and led to transfer of airway management skills learned to clinical settings when performing for first time on patients, and to increased self-reported efficacy in performing epidural catheterization. Copyright © 2017 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Publicado por Elsevier España, S.L.U. All rights reserved.

  6. Comparison of real and computer-simulated outcomes of LASIK refractive surgery

    Science.gov (United States)

    Cano, Daniel; Barbero, Sergio; Marcos, Susana

    2004-06-01

    Computer simulations of alternative LASIK ablation patterns were performed for corneal elevation maps of 13 real myopic corneas (range of myopia, -2.0 to -11.5 D). The computationally simulated ablation patterns were designed with biconic surfaces (standard Munnerlyn pattern, parabolic pattern, and biconic pattern) or with aberrometry measurements (customized pattern). Simulated results were compared with real postoperative outcomes. Standard LASIK refractive surgery for myopia increased corneal asphericity and spherical aberration. Computations with the theoretical Munnerlyn ablation pattern did not increase the corneal asphericity and spherical aberration. The theoretical parabolic pattern induced a slight increase of asphericity and spherical aberration, explaining only 40% of the clinically found increase. The theoretical biconic pattern controlled corneal spherical aberration. Computations showed that the theoretical customized pattern can correct high-order asymmetric aberrations. Simulations of changes in efficiency due to reflection and nonnormal incidence of the laser light showed a further increase in corneal asphericity. Consideration of these effects with a parabolic pattern accounts for 70% of the clinical increase in asphericity.
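
    As background, a biconic surface of the kind used to design such ablation patterns is commonly described by the sag equation below; the paper's exact parameterization may differ.

      z(x, y) = \frac{c_x x^2 + c_y y^2}{1 + \sqrt{1 - (1 + k_x)\, c_x^2 x^2 - (1 + k_y)\, c_y^2 y^2}},

    where c_x and c_y are the principal curvatures and k_x and k_y the corresponding conic constants; setting c_x = c_y and k_x = k_y recovers an ordinary conic surface.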

  7. Future planning: default network activity couples with frontoparietal control network and reward-processing regions during process and outcome simulations.

    Science.gov (United States)

    Gerlach, Kathy D; Spreng, R Nathan; Madore, Kevin P; Schacter, Daniel L

    2014-12-01

    We spend much of our daily lives imagining how we can reach future goals and what will happen when we attain them. Despite the prevalence of such goal-directed simulations, neuroimaging studies on planning have mainly focused on executive processes in the frontal lobe. This experiment examined the neural basis of process simulations, during which participants imagined themselves going through steps toward attaining a goal, and outcome simulations, during which participants imagined events they associated with achieving a goal. In the scanner, participants engaged in these simulation tasks and an odd/even control task. We hypothesized that process simulations would recruit default and frontoparietal control network regions, and that outcome simulations, which allow us to anticipate the affective consequences of achieving goals, would recruit default and reward-processing regions. Our analysis of brain activity that covaried with process and outcome simulations confirmed these hypotheses. A functional connectivity analysis with posterior cingulate, dorsolateral prefrontal cortex and anterior inferior parietal lobule seeds showed that their activity was correlated during process simulations and associated with a distributed network of default and frontoparietal control network regions. During outcome simulations, medial prefrontal cortex and amygdala seeds covaried together and formed a functional network with default and reward-processing regions. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  8. Use of simulation-based education to improve outcomes of central venous catheterization: a systematic review and meta-analysis.

    Science.gov (United States)

    Ma, Irene W Y; Brindle, Mary E; Ronksley, Paul E; Lorenzetti, Diane L; Sauve, Reg S; Ghali, William A

    2011-09-01

    Central venous catheterization (CVC) is increasingly taught by simulation. The authors reviewed the literature on the effects of simulation training in CVC on learner and clinical outcomes. The authors searched computerized databases (1950 to May 2010), reference lists, and considered studies with a control group (without simulation education intervention). Two independent assessors reviewed the retrieved citations. Independent data abstraction was performed on study design, study quality score, learner characteristics, sample size, components of interventional curriculum, outcomes assessed, and method of assessment. Learner outcomes included performance measures on simulators, knowledge, and confidence. Patient outcomes included number of needle passes, arterial puncture, pneumothorax, and catheter-related infections. Twenty studies were identified. Simulation-based education was associated with significant improvements in learner outcomes: performance on simulators (standardized mean difference [SMD] 0.60 [95% CI 0.45 to 0.76]), knowledge (SMD 0.60 [95% CI 0.35 to 0.84]), and confidence (SMD 0.41 [95% CI 0.30 to 0.53] for studies with single-group pretest and posttest design; SMD 0.52 (95% CI 0.23 to 0.81) for studies with nonrandomized, two-group design). Furthermore, simulation-based education was associated with improved patient outcomes, including fewer needle passes (SMD -0.58 [95% CI -0.95 to -0.20]), and pneumothorax (relative risk 0.62 [95% CI 0.40 to 0.97]), for studies with nonrandomized, two-group design. However, simulation-based training was not associated with a significant reduction in risk of either arterial puncture or catheter-related infections. Despite some limitations in the literature reviewed, evidence suggests that simulation-based education for CVC provides benefits in learner and select clinical outcomes.

  9. Cost: the missing outcome in simulation-based medical education research: a systematic review.

    Science.gov (United States)

    Zendejas, Benjamin; Wang, Amy T; Brydges, Ryan; Hamstra, Stanley J; Cook, David A

    2013-02-01

    The costs involved with technology-enhanced simulation remain unknown. Appraising the value of simulation-based medical education (SBME) requires complete accounting and reporting of cost. We sought to summarize the quantity and quality of studies that contain an economic analysis of SBME for the training of health professions learners. We performed a systematic search of MEDLINE, EMBASE, CINAHL, ERIC, PsychINFO, Scopus, key journals, and previous review bibliographies through May 2011. Articles reporting original research in any language evaluating the cost of simulation, in comparison with non-simulation instruction or another simulation intervention, for training practicing and student physicians, nurses, and other health professionals were selected. Reviewers working in duplicate evaluated study quality and abstracted information on learners, instructional design, cost elements, and outcomes. From a pool of 10,903 articles we identified 967 comparative studies. Of these, 59 studies (6.1%) reported any cost elements and 15 (1.6%) provided information on cost compared with another instructional approach. We identified 11 cost components reported, most often the cost of the simulator (n = 42 studies; 71%) and training materials (n = 21; 36%). Ten potential cost components were never reported. The median number of cost components reported per study was 2 (range, 1-9). Only 12 studies (20%) reported cost in the Results section; most reported it in the Discussion (n = 34; 58%). Cost reporting in SBME research is infrequent and incomplete. We propose a comprehensive model for accounting and reporting costs in SBME. Copyright © 2013 Mosby, Inc. All rights reserved.

  10. Sequential Power-Dependence Theory

    NARCIS (Netherlands)

    Buskens, Vincent; Rijt, Arnout van de

    2008-01-01

    Existing methods for predicting resource divisions in laboratory exchange networks do not take into account the sequential nature of the experimental setting. We extend network exchange theory by considering sequential exchange. We prove that Sequential Power-Dependence Theory—unlike

  11. Modelling sequentially scored item responses

    NARCIS (Netherlands)

    Akkermans, W.

    2000-01-01

    The sequential model can be used to describe the variable resulting from a sequential scoring process. In this paper two more item response models are investigated with respect to their suitability for sequential scoring: the partial credit model and the graded response model. The investigation is
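
    As background for that comparison, the partial credit model for an item with categories 0, …, m is usually written as below, whereas the sequential (step) model instead factors the response into successive binary steps; this is the textbook form, not notation taken from the paper.

      P(X = x \mid \theta) = \frac{\exp \sum_{j=1}^{x} (\theta - \delta_j)}{\sum_{k=0}^{m} \exp \sum_{j=1}^{k} (\theta - \delta_j)}, \qquad x = 0, 1, \dots, m,

    with the empty sum for x = 0 (and k = 0) defined as zero, θ the person parameter, and δ_j the step difficulties.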

  12. The differential outcomes procedure enhances adherence to treatment: A simulated study with healthy adults

    Directory of Open Access Journals (Sweden)

    Michael eMolina

    2015-11-01

    Full Text Available Memory for medical recommendations is a prerequisite for good adherence to treatment, and therefore for ameliorating the negative effects of disease, a problem that mainly affects people with memory deficits. We conducted a simulated study to test the utility of a procedure (the differential outcomes procedure, DOP) that may improve adherence to treatment by increasing the patient's learning and retention of medical recommendations regarding medication. The DOP requires the structure of a conditional discriminative learning task in which correct choice responses to specific stimulus-stimulus associations are reinforced with a particular reinforcer or outcome. In two experiments, participants had to learn and retain in memory the pills that were associated with particular disorders. To assess whether the DOP improved long-term retention of the learned disorder/pill associations, participants were asked to perform two recognition memory tests, 1 hour and 1 week after completing the learning phase. The results showed that, compared with the standard non-differential outcomes procedure (NOP), the DOP produced better learning and long-term retention of the previously learned associations. These findings suggest that the DOP can be used as a useful complementary technique in intervention programs targeted at increasing adherence to clinical recommendations.

  13. Temperature-assisted solute focusing with sequential trap/release zones in isocratic and gradient capillary liquid chromatography: Simulation and experiment

    Science.gov (United States)

    Groskreutz, Stephen R.; Weber, Stephen G.

    2016-01-01

    In this work we characterize the development of a method to enhance temperature-assisted on-column solute focusing (TASF) called two-stage TASF. A new instrument was built to implement two-stage TASF consisting of a linear array of three independent, electronically controlled Peltier devices (thermoelectric coolers, TECs). Samples are loaded onto the chromatographic column with the first two TECs, TEC A and TEC B, cold. In the two-stage TASF approach TECs A and B are cooled during injection. TEC A is heated following sample loading. At some time following TEC A’s temperature rise, TEC B’s temperature is increased from the focusing temperature to a temperature matching that of TEC A. Injection bands are focused twice on-column, first on the initial TEC, i.e., single-stage TASF, then refocused on the second, cold TEC. Our goal is to understand the two-stage TASF approach in detail. We have developed a simple yet powerful digital simulation procedure to model the effect of changing temperature in the two focusing zones on retention, band shape and band spreading. The simulation can predict experimental chromatograms resulting from spatial and temporal temperature programs in combination with isocratic and solvent gradient elution. To assess the two-stage TASF method and the accuracy of the simulation, well-characterized solutes are needed. Thus, retention factors were measured at six temperatures (25–75 °C) at each of twelve mobile-phase compositions (0.05–0.60 acetonitrile/water) for homologs of n-alkyl hydroxybenzoate esters and n-alkyl p-hydroxyphenones. Simulations accurately reflect experimental results in showing that the two-stage approach improves separation quality. For example, two-stage TASF increased sensitivity for a low retention solute by a factor of 2.2 relative to single-stage TASF and 8.8 relative to isothermal conditions using isocratic elution. Gradient elution results for two-stage TASF were more encouraging. Application of two-stage TASF
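
    The trap/release behaviour described above lends itself to a simple numerical illustration. The sketch below is not the authors' simulator; it is a minimal, hypothetical 1-D random-walk model in which each molecule's drift velocity u/(1+k) depends on a retention factor k with an assumed van't Hoff-type temperature dependence, and the two cold zones are released one after the other. All parameter values are invented for illustration.

```python
import numpy as np

# Minimal, hypothetical 1-D sketch of two-stage thermal focusing (not the
# authors' simulator). Each solute molecule is a random walker whose drift
# velocity u/(1+k) depends on the retention factor k, assumed to follow a
# van't Hoff-type law ln k = A + B/T. TEC A (x in 0-1) is released (heated)
# before TEC B (x in 1-2), so the band is focused twice before eluting.
rng = np.random.default_rng(0)

u, D = 1.0, 1e-3                 # mobile-phase velocity, axial dispersion
A, B = -8.0, 2800.0              # assumed van't Hoff coefficients
T_hot, T_cold = 348.0, 278.0     # zone temperatures, K
t_release_A, t_release_B = 2.0, 6.0

def retention(T):
    return np.exp(A + B / T)

def local_temperature(x, t):
    T = np.full_like(x, T_hot)                            # downstream: always hot
    T[x < 2.0] = T_cold if t < t_release_B else T_hot     # TEC B zone
    T[x < 1.0] = T_cold if t < t_release_A else T_hot     # TEC A zone
    return T

x = rng.uniform(0.0, 0.3, size=2000)                      # injected band on TEC A
dt = 0.005
for step in range(int(12.0 / dt)):
    t = step * dt
    v = u / (1.0 + retention(local_temperature(x, t)))
    x = x + v * dt + np.sqrt(2 * D * dt) * rng.standard_normal(x.size)

print(f"band centre = {x.mean():.2f}, band standard deviation = {x.std():.3f}")
```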

  14. Simultaneous versus sequential penetrating keratoplasty and cataract surgery.

    Science.gov (United States)

    Hayashi, Ken; Hayashi, Hideyuki

    2006-10-01

    To compare the surgical outcomes of simultaneous penetrating keratoplasty and cataract surgery with those of sequential surgery. Thirty-nine eyes of 39 patients scheduled for simultaneous keratoplasty and cataract surgery and 23 eyes of 23 patients scheduled for sequential keratoplasty and secondary phacoemulsification surgery were recruited. Refractive error, regular and irregular corneal astigmatism determined by Fourier analysis, and endothelial cell loss were studied at 1 week and 3, 6, and 12 months after combined surgery in the simultaneous surgery group or after subsequent phacoemulsification surgery in the sequential surgery group. At 3 and more months after surgery, mean refractive error was significantly greater in the simultaneous surgery group than in the sequential surgery group, although no difference was seen at 1 week. The refractive error at 12 months was within 2 D of that targeted in 15 eyes (39%) in the simultaneous surgery group and within 2 D in 16 eyes (70%) in the sequential surgery group; the incidence was significantly greater in the sequential group (P = 0.0344). Regular and irregular astigmatism were not significantly different between the groups at 3 and more months after surgery. Nor was a significant difference found in the percentage of endothelial cell loss between the groups. Although corneal astigmatism and endothelial cell loss were not different, refractive error from target refraction was greater after simultaneous keratoplasty and cataract surgery than after sequential surgery, indicating a better outcome after sequential surgery than after simultaneous surgery.

  15. Forced Sequence Sequential Decoding

    DEFF Research Database (Denmark)

    Jensen, Ole Riis; Paaske, Erik

    1998-01-01

    We describe a new concatenated decoding scheme based on iterations between an inner sequentially decoded convolutional code of rate R=1/4 and memory M=23, and block interleaved outer Reed-Solomon (RS) codes with nonuniform profile. With this scheme decoding with good performance is possible as low as Eb/N0=0.6 dB, which is about 1.25 dB below the signal-to-noise ratio (SNR) that marks the cutoff rate for the full system. Accounting for about 0.45 dB due to the outer codes, sequential decoding takes place at about 1.7 dB below the SNR cutoff rate for the convolutional code. This is possible since the iteration process provides the sequential decoders with side information that allows a smaller average load and minimizes the probability of computational overflow. Analytical results for the probability that the first RS word is decoded after C computations are presented. These results are supported...

  16. Sequential use of the STICS crop model and of the MACRO pesticide fate model to simulate pesticides leaching in cropping systems.

    Science.gov (United States)

    Lammoglia, Sabine-Karen; Moeys, Julien; Barriuso, Enrique; Larsbo, Mats; Marín-Benito, Jesús-María; Justes, Eric; Alletto, Lionel; Ubertosi, Marjorie; Nicolardot, Bernard; Munier-Jolain, Nicolas; Mamy, Laure

    2017-03-01

    The current challenge in sustainable agriculture is to introduce new cropping systems that reduce pesticide use and thereby limit ground and surface water contamination. However, it is difficult to carry out in situ experiments to assess the environmental impacts of pesticide use for all possible combinations of climate, crop, and soils; therefore, in silico tools are necessary. The objective of this work was to assess pesticide leaching in cropping systems by coupling the capabilities of a crop model (STICS) and of a pesticide fate model (MACRO). STICS-MACRO has the advantage of being able to simulate pesticide fate in complex cropping systems and to consider some agricultural practices such as fertilization, mulch, or crop residue management, which cannot be accounted for with MACRO alone. The performance of STICS-MACRO was tested, without calibration, against measurements from two French experimental sites with contrasting soil and climate properties. The prediction of water percolation and pesticide concentrations with STICS-MACRO was satisfactory, but it varied with the pedoclimatic context. The performance of STICS-MACRO was shown to be similar to or better than that of MACRO. The improved simulation of crop growth gave better estimates of crop transpiration and therefore of the water balance. It also gave a better estimate of pesticide interception by the crop, which was found to be crucial for predicting pesticide concentrations in water. STICS-MACRO is a promising new tool to improve the assessment of the environmental risks of pesticides used in cropping systems.

  17. Simulation and comparison of progression-free survival among patients with non-squamous non-small-cell lung cancer receiving sequential therapy.

    Science.gov (United States)

    Walzer, Stefan; Chouaid, Christos; Lister, Johanna; Gultyaev, Dmitry; Vergnenegre, Alain; de Marinis, Filippo; Meng, Jie; de Castro Carpeno, Javier; Crott, Ralph; Kleman, Martin; Ngoh, Charles

    2015-01-01

    In recent years, the treatment landscape in advanced non-squamous non-small-cell lung cancer (nsNSCLC) has changed. New therapies (e.g., bevacizumab indicated in first line) have become available and other therapies (e.g., pemetrexed in first line and second line) moved into earlier lines in the treatment paradigm. While there has been an expansion of the available treatment options, which therapy sequence yields the best survival outcomes for patients with nsNSCLC remains a key research question. A therapy-sequencing disease model that approximates treatment outcomes in up to five lines of treatment was developed for patients with nsNSCLC. The primary source of data for progression-free survival (PFS) and time to death was published pivotal trial data. All patients were treatment-naïve, started in the PFS state, and received first-line treatment with either bevacizumab-based therapy or doublet chemotherapy (including the option of pemetrexed + cisplatin). Patients would then progress to a subsequent line of therapy, remain in PFS, or die. In case of progression, it was assumed that each survivor would receive a subsequent line of therapy, based on EMA licensed therapies. Weibull distribution curves were fitted to the data. All bevacizumab-based first-line therapy sequences analyzed achieved total PFS of around 15 months. Bevacizumab + carboplatin + paclitaxel (first line) → pemetrexed (second line) → erlotinib (third line) → docetaxel (fourth line) resulted in total mean PFS time of 15.7 months, for instance. Sequences with pemetrexed in combination with cisplatin in first line achieved total PFS times between 12.6 and 12.8 months with a slightly higher total PFS time achieved when assuming pemetrexed continuation therapy in maintenance after pemetrexed + cisplatin in first-line induction. Overall survival results followed the same trend as PFS. The model suggests that treatment-sequencing strategies starting with a bevacizumab-based combination in first line
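
    The core mechanic of such a sequencing model, summing per-line Weibull-distributed progression-free intervals, can be sketched in a few lines. The shape and scale parameters below are invented placeholders, not the published trial fits, and the sketch ignores death and treatment discontinuation between lines, which the full model accounts for.

```python
import numpy as np

# Illustrative Monte Carlo of a therapy-sequencing model: total PFS is the sum
# of per-line progression-free intervals, each drawn from a Weibull distribution.
# Parameters are invented for illustration; death and attrition between lines
# are ignored here, unlike in the full disease model.
rng = np.random.default_rng(42)

# (shape k, scale lambda in months) assumed for each line of one sequence
sequence = {
    "bevacizumab + carboplatin + paclitaxel": (1.3, 7.0),
    "pemetrexed":                             (1.1, 4.5),
    "erlotinib":                              (0.9, 3.0),
    "docetaxel":                              (1.0, 2.5),
}

n_patients = 100_000
total_pfs = np.zeros(n_patients)
for line, (k, lam) in sequence.items():
    total_pfs += lam * rng.weibull(k, size=n_patients)  # Weibull(k) scaled by lambda

print(f"mean total PFS   = {total_pfs.mean():5.1f} months")
print(f"median total PFS = {np.median(total_pfs):5.1f} months")
```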

  18. Comparison of two commercial embryo culture media (SAGE-1 step single medium vs. G1-PLUSTM/G2-PLUSTM sequential media): Influence on in vitro fertilization outcomes and human embryo quality.

    Science.gov (United States)

    López-Pelayo, Iratxe; Gutiérrez-Romero, Javier María; Armada, Ana Isabel Mangano; Calero-Ruiz, María Mercedes; Acevedo-Yagüe, Pablo Javier Moreno de

    2018-04-26

    To compare embryo quality, fertilization, implantation, miscarriage and clinical pregnancy rates for embryos cultured in two different commercial culture media until D-2 or D-3. In this retrospective study, we analyzed 189 cycles performed in 2016. Metaphase II oocytes were microinjected and allocated into single medium (SAGE 1-STEP, Origio) until transferred, frozen or discarded; or, if sequential media were used, the oocytes were cultured in G1-PLUS™ (Vitrolife) up to D-2 or D-3 and in G2-PLUS™ (Vitrolife) to transfer. On the following day, the oocytes were checked for normal fertilization and on D-2 and D-3 for morphological classification. Statistical analysis was performed using the chi-square and Mann-Whitney tests in PASW Statistics 18.0. The fertilization rates were 70.07% for single and 69.11% for sequential media (p=0.736). The mean number of embryos with high morphological quality (class A/B) was higher in the single medium than in the sequential media on D-2 (class A: 190 vs. 107) and D-3. More embryos cultured in single medium were frozen: 197 (21.00%) vs. sequential: 102 (11.00%). Embryo culture in single medium yields greater efficiency per cycle than in sequential media. Higher embryo quality and quantity were achieved, resulting in more frozen embryos. There were no differences in clinical pregnancy rates.
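
    The two statistical comparisons reported above (chi-square for the frozen-embryo proportions, Mann-Whitney for ordinal quality scores) can be reproduced in outline with scipy. The contingency-table denominators below are back-calculated approximations from the reported percentages, and the quality scores are random placeholders, not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency, mannwhitneyu

# Sketch of the abstract's two tests; the numbers are placeholders, not raw data.

# Frozen vs. not-frozen embryos by culture medium (2 x 2 contingency table).
# Denominators (938, 927) are rough back-calculations from 21% and 11%.
table = np.array([[197, 938 - 197],   # single medium
                  [102, 927 - 102]])  # sequential media
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.4f}")

# Ordinal embryo quality scores per cycle, compared between media (placeholders)
single_scores     = np.random.default_rng(1).integers(1, 5, size=100)
sequential_scores = np.random.default_rng(2).integers(1, 5, size=89)
u, p = mannwhitneyu(single_scores, sequential_scores, alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p:.4f}")
```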

  19. The effect of different training exercises on the performance outcome on the da Vinci Skills Simulator.

    Science.gov (United States)

    Walliczek-Dworschak, U; Schmitt, M; Dworschak, P; Diogo, I; Ecke, A; Mandapathil, M; Teymoortash, A; Güldner, C

    2017-06-01

    The increasing use of robotic surgery presents surgeons with the question of how to acquire the special skills required. This study aimed to analyze the effect of different training exercises on performance outcomes. This prospective study was conducted on the da Vinci Skills Simulator from December 2014 to August 2015. Sixty robotic novices were included and randomized to three groups of 20 participants each. Each group performed three different exercises with comparable difficulty levels. The exercises were performed three times in a row within two training sessions, with an interval of 1 week in between. On the final training day, two new exercises were added and a questionnaire was completed. Technical metrics of performance (overall score, time to complete, economy of motion, instrument collisions, excessive instrument force, instruments out of view, master work space range, drops, missed targets, misapplied energy time, blood loss and broken vessels) were recorded by the simulator software for further analysis. Training with different exercises led to comparable results in performance metrics for the final exercises among the three groups. A significant skills gain was recorded between the first and last exercises, with improved performance in overall score, time to complete and economy of motion for all exercises in all three groups. As training with different exercises led to comparable results in robotic training, the type of exercise seems to play a minor role in the outcome. For a robotic training curriculum, it might be important to choose exercises with comparable difficulty levels. In addition, it seems advantageous to limit the duration of training to maintain concentration throughout the session.

  20. Simulating the impact of improved cardiovascular risk interventions on clinical and economic outcomes in Russia.

    Directory of Open Access Journals (Sweden)

    Kenny Shum

    OBJECTIVES: Russia faces a high burden of cardiovascular disease. Prevalence of all cardiovascular risk factors, especially hypertension, is high. Elevated blood pressure is generally poorly controlled and medication usage is suboptimal. With a disease-model simulation, we forecast how various treatment programs aimed at increasing blood pressure control would affect cardiovascular outcomes. In addition, we investigated what additional benefit adding lipid control and smoking cessation to blood pressure control would generate in terms of reduced cardiovascular events. Finally, we estimated the direct health care costs saved by treating fewer cardiovascular events. METHODS: The Archimedes Model, a detailed computer model of human physiology, disease progression, and health care delivery was adapted to the Russian setting. Intervention scenarios of achieving systolic blood pressure control rates (defined as systolic blood pressure <140 mmHg) of 40% and 60% were simulated by modifying adherence rates of an antihypertensive medication combination and compared with current care (23.9% blood pressure control rate). Outcomes of major adverse cardiovascular events; cerebrovascular event (stroke), myocardial infarction, and cardiovascular death over a 10-year time horizon were reported. Direct health care costs of strokes and myocardial infarctions were derived from official Russian statistics and tariff lists. RESULTS: To achieve systolic blood pressure control rates of 40% and 60%, adherence rates to the antihypertensive treatment program were 29.4% and 65.9%. Cardiovascular death relative risk reductions were 13.2% and 29.6%, respectively. For the current estimated 43,855,000-person Russian hypertensive population, each control-rate scenario resulted in an absolute reduction of 1.0 million and 2.4 million cardiovascular deaths, and a reduction of 1.2 million and 2.7 million stroke/myocardial infarction diagnoses, respectively. Averted direct costs from

  1. Simulating the Impact of Improved Cardiovascular Risk Interventions on Clinical and Economic Outcomes in Russia

    Science.gov (United States)

    Shum, Kenny; Alperin, Peter; Shalnova, Svetlana; Boytsov, Sergey; Kontsevaya, Anna; Vigdorchik, Alexey; Guetz, Adam; Eriksson, Jennifer; Hughes, David

    2014-01-01

    Objectives Russia faces a high burden of cardiovascular disease. Prevalence of all cardiovascular risk factors, especially hypertension, is high. Elevated blood pressure is generally poorly controlled and medication usage is suboptimal. With a disease-model simulation, we forecast how various treatment programs aimed at increasing blood pressure control would affect cardiovascular outcomes. In addition, we investigated what additional benefit adding lipid control and smoking cessation to blood pressure control would generate in terms of reduced cardiovascular events. Finally, we estimated the direct health care costs saved by treating fewer cardiovascular events. Methods The Archimedes Model, a detailed computer model of human physiology, disease progression, and health care delivery was adapted to the Russian setting. Intervention scenarios of achieving systolic blood pressure control rates (defined as systolic blood pressure <140 mmHg) of 40% and 60% were simulated by modifying adherence rates of an antihypertensive medication combination and compared with current care (23.9% blood pressure control rate). Outcomes of major adverse cardiovascular events; cerebrovascular event (stroke), myocardial infarction, and cardiovascular death over a 10-year time horizon were reported. Direct health care costs of strokes and myocardial infarctions were derived from official Russian statistics and tariff lists. Results To achieve systolic blood pressure control rates of 40% and 60%, adherence rates to the antihypertensive treatment program were 29.4% and 65.9%. Cardiovascular death relative risk reductions were 13.2%, and 29.6%, respectively. For the current estimated 43,855,000-person Russian hypertensive population, each control-rate scenario resulted in an absolute reduction of 1.0 million and 2.4 million cardiovascular deaths, and a reduction of 1.2 million and 2.7 million stroke/myocardial infarction diagnoses, respectively. Averted direct costs from current care levels
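
    The link between a relative risk reduction and the absolute event counts quoted above is simple arithmetic over the hypertensive population. The sketch below is only a back-of-envelope check with an assumed baseline 10-year cardiovascular death risk; the Archimedes model itself is far more detailed than this.

```python
# Back-of-envelope arithmetic relating relative risk reductions to absolute
# cardiovascular deaths averted in the Russian hypertensive population.
# The baseline 10-year cardiovascular death risk is an assumed placeholder.
population = 43_855_000
baseline_10yr_cv_death_risk = 0.17          # assumed for illustration only

for control_rate, rrr in [(0.40, 0.132), (0.60, 0.296)]:
    deaths_current = population * baseline_10yr_cv_death_risk
    deaths_averted = deaths_current * rrr
    print(f"BP control {control_rate:.0%}: "
          f"~{deaths_averted / 1e6:.1f} million cardiovascular deaths averted")
```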

  2. Sequential decay of Reggeons

    International Nuclear Information System (INIS)

    Yoshida, Toshihiro

    1981-01-01

    Probabilities of meson production in the sequential decay of Reggeons, which are formed from the projectile and the target in the hadron-hadron to Reggeon-Reggeon processes, are investigated. It is assumed that pair creation of heavy quarks and simultaneous creation of two antiquark-quark pairs are negligible. The leading-order terms with respect to the ratio of the creation probabilities of anti-s s to anti-u u (anti-d d) pairs are calculated. The production cross sections in the target fragmentation region are given in terms of probabilities in the initial decay of the Reggeons and an effect of many-particle production. (author)

  3. Comparing Hospital Processes and Outcomes in California Medicare Beneficiaries: Simulation Prompts Reconsideration.

    Science.gov (United States)

    Escobar, Gabriel J; Baker, Jennifer M; Turk, Benjamin J; Draper, David; Liu, Vincent; Kipnis, Patricia

    2017-01-01

    This article is not a traditional research report. It describes how conducting a specific set of benchmarking analyses led us to broader reflections on hospital benchmarking. We reexamined an issue that has received far less attention from researchers than in the past: How variations in the hospital admission threshold might affect hospital rankings. Considering this threshold made us reconsider what benchmarking is and what future benchmarking studies might be like. Although we recognize that some of our assertions are speculative, they are based on our reading of the literature and previous and ongoing data analyses being conducted in our research unit. We describe the benchmarking analyses that led to these reflections. The Centers for Medicare and Medicaid Services' Hospital Compare Web site includes data on fee-for-service Medicare beneficiaries but does not control for severity of illness, which requires physiologic data now available in most electronic medical records. To address this limitation, we compared hospital processes and outcomes among Kaiser Permanente Northern California's (KPNC) Medicare Advantage beneficiaries and non-KPNC California Medicare beneficiaries between 2009 and 2010. We assigned a simulated severity of illness measure to each record and explored the effect of having the additional information on outcomes. We found that if the admission severity of illness in non-KPNC hospitals increased, KPNC hospitals' mortality performance would appear worse; conversely, if admission severity at non-KPNC hospitals decreased, KPNC hospitals' performance would appear better. Future hospital benchmarking should consider the impact of variation in admission thresholds.

  4. Sequential Monte Carlo with Highly Informative Observations

    OpenAIRE

    Del Moral, Pierre; Murray, Lawrence M.

    2014-01-01

    We propose sequential Monte Carlo (SMC) methods for sampling the posterior distribution of state-space models under highly informative observation regimes, a situation in which standard SMC methods can perform poorly. A special case is simulating bridges between given initial and final values. The basic idea is to introduce a schedule of intermediate weighting and resampling times between observation times, which guide particles towards the final state. This can always be done for continuous-...
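
    The bridging idea, introducing intermediate weighting and resampling steps that guide particles towards a highly informative terminal observation, can be illustrated with a toy Gaussian random walk. The sketch below is a generic auxiliary-SMC style construction using a Gaussian lookahead potential; it is not the authors' algorithm, and all parameter values are illustrative.

```python
import numpy as np

# Toy SMC bridge for a Gaussian random walk x_{t+1} = x_t + N(0, q), with one
# highly informative observation y ~ N(x_T, r) at the final time. Intermediate
# weighting/resampling uses a Gaussian lookahead potential
# G_t(x) = N(y; x, q*(T - t) + r), so particles are steered towards y.
rng = np.random.default_rng(0)

T, q, r = 20, 1.0, 0.01        # steps, process noise, (small) observation noise
y = 8.0                        # informative final observation
N = 2000                       # particles

def lookahead(x, t):
    var = q * (T - t) + r
    return np.exp(-0.5 * (y - x) ** 2 / var) / np.sqrt(2 * np.pi * var)

x = np.zeros(N)                # all particles start at 0
g_prev = lookahead(x, 0)
for t in range(1, T + 1):
    x = x + np.sqrt(q) * rng.standard_normal(N)       # propagate
    g = lookahead(x, t)
    w = g / g_prev                                    # incremental weights
    w /= w.sum()
    idx = rng.choice(N, size=N, p=w)                  # resample at every step
    x, g_prev = x[idx], g[idx]

print(f"bridge endpoint mean = {x.mean():.2f} (target y = {y})")
```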

  5. Sequential test procedures for inventory differences

    International Nuclear Information System (INIS)

    Goldman, A.S.; Kern, E.A.; Emeigh, C.W.

    1985-01-01

    By means of a simulation study, we investigated the appropriateness of Page's and power-one sequential tests on sequences of inventory differences obtained from an example materials control unit, a sub-area of a hypothetical UF6-to-U3O8 conversion process. The study examined detection probability and run length curves obtained from different loss scenarios. 12 refs., 10 figs., 2 tabs
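
    For readers unfamiliar with Page's test, the sketch below applies its one-sided CUSUM recursion to a simulated sequence of standardized inventory differences with a protracted loss. The reference value, decision threshold, and loss scenario are arbitrary illustrative choices, not the study's design parameters.

```python
import numpy as np

# Illustrative one-sided Page (CUSUM) test on simulated inventory differences.
rng = np.random.default_rng(7)

n, loss_start, loss_size = 60, 30, 0.8     # a protracted loss begins at period 30
ids = rng.standard_normal(n)
ids[loss_start:] += loss_size              # mean shift models a diversion

k, h = 0.5, 4.0                            # reference value and decision interval
s, alarm_at = 0.0, None
for i, x in enumerate(ids, start=1):
    s = max(0.0, s + x - k)                # Page's recursion
    if s > h:
        alarm_at = i
        break

print("alarm at period:", alarm_at)
```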

  6. Simulation

    CERN Document Server

    Ross, Sheldon

    2006-01-01

    Ross's Simulation, Fourth Edition introduces aspiring and practicing actuaries, engineers, computer scientists and others to the practical aspects of constructing computerized simulation studies to analyze and interpret real phenomena. Readers learn to apply results of these analyses to problems in a wide variety of fields to obtain effective, accurate solutions and make predictions about future outcomes. This text explains how a computer can be used to generate random numbers, and how to use these random numbers to generate the behavior of a stochastic model over time. It presents the statist
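
    The book's central theme, turning uniform random numbers into the behaviour of a stochastic model over time, fits in a few lines. The sketch below uses the inverse-transform method to generate exponential inter-arrival times and drive a Poisson arrival process; it is a generic textbook-style example, not taken from the book.

```python
import numpy as np

# Uniform random numbers -> exponential inter-arrival times (inverse transform)
# -> a Poisson arrival process simulated over a fixed horizon.
rng = np.random.default_rng(123)

rate, horizon = 2.0, 10.0                  # arrivals per unit time, simulation length
t, arrivals = 0.0, []
while True:
    u = rng.random()                       # U(0,1)
    t += -np.log(1.0 - u) / rate           # inverse CDF of Exponential(rate)
    if t > horizon:
        break
    arrivals.append(t)

print(f"{len(arrivals)} arrivals in [0, {horizon}]; expected about {rate * horizon:.0f}")
```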

  7. Pectus excavatum postsurgical outcome based on preoperative soft body dynamics simulation

    Science.gov (United States)

    Moreira, Antonio H. J.; Rodrigues, Pedro L.; Fonseca, Jaime; Pinho, A. C. M.; Rodrigues, Nuno F.; Correia-Pinto, Jorge; Vilaça, João L.

    2012-02-01

    Pectus excavatum is the most common congenital deformity of the anterior chest wall, in which an abnormal formation of the rib cage gives the chest a caved-in or sunken appearance. Today, the surgical correction of this deformity is carried out in children and adults through the Nuss technique, which consists of the placement of a prosthetic bar under the sternum and over the ribs. Although this technique has been shown to be safe and reliable, not all patients have achieved an adequate cosmetic outcome. This often leads to psychological problems and social stress, before and after the surgical correction. This paper targets this particular problem by presenting a method to predict the patient's surgical outcome based on pre-surgical imaging information and chest skin dynamic modulation. The proposed approach uses the patient's pre-surgical thoracic CT scan and anatomical-surgical references to perform a 3D segmentation of the left ribs, right ribs, sternum and skin. The technique encompasses three steps: a) approximation of the cartilages, between the ribs and the sternum, through b-spline interpolation; b) a volumetric mass spring model that connects two layers - an inner skin layer based on the outer pleura contour and the outer skin surface; and c) displacement of the sternum according to the prosthetic bar position. A dynamic model of the skin around the chest wall region was generated, capable of simulating the effect of the movement of the prosthetic bar along the sternum. The results were compared and validated against the patient's postsurgical skin surface acquired with the Polhemus FastSCAN system.

  8. Sequential sampling: a novel method in farm animal welfare assessment.

    Science.gov (United States)

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall
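
    The 'basic' two-stage scheme described above (as summarised in the sketch below) is straightforward to simulate: draw half the full sample, classify immediately when the first-stage estimate is clearly above or below the threshold, otherwise draw the second half and classify on the combined sample. Herd size, sample sizes, and thresholds in the sketch are placeholders rather than the Welfare Quality values.

```python
import numpy as np

# Sketch of a two-stage sequential sampling scheme for a pass/fail lameness
# assessment. All sizes and thresholds are illustrative placeholders.
rng = np.random.default_rng(3)

def classify_farm(true_prev, herd_size=120, full_n=60, fail_threshold=0.15):
    herd = rng.random(herd_size) < true_prev           # lameness status of each cow
    first = rng.choice(herd_size, size=full_n // 2, replace=False)
    p1 = herd[first].mean()
    # stop early only when the first-stage estimate is far from the threshold
    if p1 <= 0.5 * fail_threshold:
        return "pass", full_n // 2
    if p1 >= 1.5 * fail_threshold:
        return "fail", full_n // 2
    rest = np.setdiff1d(np.arange(herd_size), first)
    second = rng.choice(rest, size=full_n // 2, replace=False)
    p_all = herd[np.concatenate([first, second])].mean()
    return ("fail" if p_all >= fail_threshold else "pass"), full_n

results = [classify_farm(true_prev=0.25) for _ in range(10_000)]
avg_n = np.mean([n for _, n in results])
fail_rate = np.mean([decision == "fail" for decision, _ in results])
print(f"average sample size = {avg_n:.1f}, proportion classified 'fail' = {fail_rate:.2f}")
```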

  9. Implementation and outcome evaluation of high-fidelity simulation scenarios to integrate cognitive and psychomotor skills for Korean nursing students.

    Science.gov (United States)

    Ahn, Heejung; Kim, Hyun-Young

    2015-05-01

    This study involved designing high-fidelity simulations reflecting the Korean nursing education environment. In addition, it evaluated the simulations through nursing students' learning outcomes and perceptions of the simulation design features. A quantitative design was used in two separate phases. For the first phase, five nursing experts participated in verifying the appropriateness of two simulation scenarios that reflected the intended learning objectives. For the second phase, 69 nursing students in the third year of a bachelor's degree at a nursing school participated in evaluating the simulations and were randomized according to their previous course grades. The first phase verified the two simulation scenarios using a questionnaire. The second phase evaluated students' perceptions of the simulation design, self-confidence, and critical thinking skills using a quasi-experimental post-test design. ANCOVA was used to compare the experimental and control groups, and correlation coefficient analysis was used to determine the correlation among them. We created two simulation scenarios to integrate cognitive and psychomotor skills according to the learning objectives and clinical environment in Korea. The experimental group had significantly higher scores on self-confidence in the first scenario. The positive correlations between perceptions of the simulation design features, self-confidence, and critical thinking skill scores were statistically significant. Students with a more positive perception of the design features of the simulations had better learning outcomes. Based on this result, simulations need to be designed and implemented with more differentiation in order to be perceived more appropriately by students. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Transforming Healthcare Delivery: Integrating Dynamic Simulation Modelling and Big Data in Health Economics and Outcomes Research.

    Science.gov (United States)

    Marshall, Deborah A; Burgos-Liz, Lina; Pasupathy, Kalyan S; Padula, William V; IJzerman, Maarten J; Wong, Peter K; Higashi, Mitchell K; Engbers, Jordan; Wiebe, Samuel; Crown, William; Osgood, Nathaniel D

    2016-02-01

    In the era of the Information Age and personalized medicine, healthcare delivery systems need to be efficient and patient-centred. The health system must be responsive to individual patient choices and preferences about their care, while considering the system consequences. While dynamic simulation modelling (DSM) and big data share characteristics, they present distinct and complementary value in healthcare. Big data and DSM are synergistic-big data offer support to enhance the application of dynamic models, but DSM also can greatly enhance the value conferred by big data. Big data can inform patient-centred care with its high velocity, volume, and variety (the three Vs) over traditional data analytics; however, big data are not sufficient to extract meaningful insights to inform approaches to improve healthcare delivery. DSM can serve as a natural bridge between the wealth of evidence offered by big data and informed decision making as a means of faster, deeper, more consistent learning from that evidence. We discuss the synergies between big data and DSM, practical considerations and challenges, and how integrating big data and DSM can be useful to decision makers to address complex, systemic health economics and outcomes questions and to transform healthcare delivery.

  11. Simulation-based team training for multi-professional obstetric care teams to improve patient outcome : a multicentre, cluster randomised controlled trial

    NARCIS (Netherlands)

    Fransen, A F; van de Ven, J; Schuit, E; van Tetering, Aac; Mol, B W; Oei, S G

    OBJECTIVE: To investigate whether simulation-based obstetric team training in a simulation centre improves patient outcome. DESIGN: Multicentre, open, cluster randomised controlled trial. SETTING: Obstetric units in the Netherlands. POPULATION: Women with a singleton pregnancy beyond 24 weeks of

  12. Multiple imputation using linked proxy outcome data resulted in important bias reduction and efficiency gains: a simulation study.

    Science.gov (United States)

    Cornish, R P; Macleod, J; Carpenter, J R; Tilling, K

    2017-01-01

    When an outcome variable is missing not at random (MNAR: probability of missingness depends on outcome values), estimates of the effect of an exposure on this outcome are often biased. We investigated the extent of this bias and examined whether the bias can be reduced through incorporating proxy outcomes obtained through linkage to administrative data as auxiliary variables in multiple imputation (MI). Using data from the Avon Longitudinal Study of Parents and Children (ALSPAC) we estimated the association between breastfeeding and IQ (continuous outcome), incorporating linked attainment data (proxies for IQ) as auxiliary variables in MI models. Simulation studies explored the impact of varying the proportion of missing data (from 20 to 80%), the correlation between the outcome and its proxy (0.1-0.9), the strength of the missing data mechanism, and having a proxy variable that was incomplete. Incorporating a linked proxy for the missing outcome as an auxiliary variable reduced bias and increased efficiency in all scenarios, even when 80% of the outcome was missing. Using an incomplete proxy was similarly beneficial. High correlations (> 0.5) between the outcome and its proxy substantially reduced the missing information. Consistent with this, ALSPAC analysis showed inclusion of a proxy reduced bias and improved efficiency. Gains with additional proxies were modest. In longitudinal studies with loss to follow-up, incorporating proxies for this study outcome obtained via linkage to external sources of data as auxiliary variables in MI models can give practically important bias reduction and efficiency gains when the study outcome is MNAR.
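
    The core idea, that a linked proxy correlated with an MNAR outcome can reduce bias when used in the imputation model, can be illustrated with a compact simulation. The sketch below is not the ALSPAC analysis; for brevity a single regression imputation stands in for full multiple imputation, and all coefficients are invented.

```python
import numpy as np

# Compact simulation: the outcome is missing not at random (higher values more
# likely missing) and a linked proxy is correlated with it. Using the proxy in
# the imputation model typically reduces (but need not eliminate) the bias in
# the exposure effect relative to complete-case analysis.
rng = np.random.default_rng(2024)
n, true_beta = 20_000, 0.5

exposure = rng.standard_normal(n)
outcome = true_beta * exposure + rng.standard_normal(n)
proxy = 0.8 * outcome + 0.6 * rng.standard_normal(n)     # corr(outcome, proxy) ~ 0.8

# MNAR: probability of missingness increases with the outcome itself
p_miss = 1.0 / (1.0 + np.exp(-(outcome - 0.5)))
missing = rng.random(n) < p_miss
obs = ~missing

def slope(x, y):
    return np.polyfit(x, y, 1)[0]

# 1) complete-case analysis (biased under MNAR)
beta_cc = slope(exposure[obs], outcome[obs])

# 2) impute missing outcomes from exposure + proxy, fitted on observed rows
X = np.column_stack([np.ones(n), exposure, proxy])
coef, *_ = np.linalg.lstsq(X[obs], outcome[obs], rcond=None)
outcome_imp = outcome.copy()
outcome_imp[missing] = X[missing] @ coef
beta_imp = slope(exposure, outcome_imp)

print(f"true beta = {true_beta}, complete-case = {beta_cc:.3f}, "
      f"with proxy imputation = {beta_imp:.3f}")
```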

  13. Adaptive sequential controller

    Energy Technology Data Exchange (ETDEWEB)

    El-Sharkawi, Mohamed A. (Renton, WA); Xing, Jian (Seattle, WA); Butler, Nicholas G. (Newberg, OR); Rodriguez, Alonso (Pasadena, CA)

    1994-01-01

    An adaptive sequential controller (50/50') for controlling a circuit breaker (52) or other switching device to substantially eliminate transients on a distribution line caused by closing and opening the circuit breaker. The device adaptively compensates for changes in the response time of the circuit breaker due to aging and environmental effects. A potential transformer (70) provides a reference signal corresponding to the zero crossing of the voltage waveform, and a phase shift comparator circuit (96) compares the reference signal to the time at which any transient was produced when the circuit breaker closed, producing a signal indicative of the adaptive adjustment that should be made. Similarly, in controlling the opening of the circuit breaker, a current transformer (88) provides a reference signal that is compared against the time at which any transient is detected when the circuit breaker last opened. An adaptive adjustment circuit (102) produces a compensation time that is appropriately modified to account for changes in the circuit breaker response, including the effect of ambient conditions and aging. When next opened or closed, the circuit breaker is activated at an appropriately compensated time, so that it closes when the voltage crosses zero and opens when the current crosses zero, minimizing any transients on the distribution line. Phase angle can be used to control the opening of the circuit breaker relative to the reference signal provided by the potential transformer.

  14. Adaptive sequential controller

    Science.gov (United States)

    El-Sharkawi, Mohamed A.; Xing, Jian; Butler, Nicholas G.; Rodriguez, Alonso

    1994-01-01

    An adaptive sequential controller (50/50') for controlling a circuit breaker (52) or other switching device to substantially eliminate transients on a distribution line caused by closing and opening the circuit breaker. The device adaptively compensates for changes in the response time of the circuit breaker due to aging and environmental effects. A potential transformer (70) provides a reference signal corresponding to the zero crossing of the voltage waveform, and a phase shift comparator circuit (96) compares the reference signal to the time at which any transient was produced when the circuit breaker closed, producing a signal indicative of the adaptive adjustment that should be made. Similarly, in controlling the opening of the circuit breaker, a current transformer (88) provides a reference signal that is compared against the time at which any transient is detected when the circuit breaker last opened. An adaptive adjustment circuit (102) produces a compensation time that is appropriately modified to account for changes in the circuit breaker response, including the effect of ambient conditions and aging. When next opened or closed, the circuit breaker is activated at an appropriately compensated time, so that it closes when the voltage crosses zero and opens when the current crosses zero, minimizing any transients on the distribution line. Phase angle can be used to control the opening of the circuit breaker relative to the reference signal provided by the potential transformer.
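
    The adaptive timing idea in the controller described above can be reduced to a small numeric sketch: command the breaker ahead of the next voltage zero crossing by the current estimate of its response delay, then correct that estimate from the measured timing error of each operation. The values and the simple proportional update below are illustrative assumptions, not the patented circuit's behaviour.

```python
import math

# Numeric sketch of adaptive zero-crossing compensation (illustrative only).
f = 50.0                      # line frequency, Hz
half_period = 1.0 / (2 * f)   # voltage zero crossings occur every half period

true_delay = 0.0421           # actual (unknown, drifting) breaker response delay, s
delay_est = 0.0400            # controller's initial estimate, s
gain = 0.5                    # adaptation gain

for op in range(1, 6):
    next_zero = math.ceil((delay_est + 0.001) / half_period) * half_period
    command_time = next_zero - delay_est          # fire early by the estimated delay
    actual_contact = command_time + true_delay    # when the contacts really close
    error = actual_contact - next_zero            # offset from the zero crossing
    delay_est += gain * error                     # adaptive correction
    true_delay += 0.0002                          # slow drift (ageing, temperature)
    print(f"op {op}: timing error = {error * 1e6:7.1f} us, "
          f"delay estimate = {delay_est * 1e3:.3f} ms")
```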

  15. The impact of eyewitness identifications from simultaneous and sequential lineups.

    Science.gov (United States)

    Wright, Daniel B

    2007-10-01

    Recent guidelines in the US allow either simultaneous or sequential lineups to be used for eyewitness identification. This paper investigates how potential jurors weight the probative value of the different outcomes from both of these types of lineups. Participants (n=340) were given a description of a case that included some exonerating and some incriminating evidence. There was either a simultaneous or a sequential lineup. Depending on the condition, an eyewitness chose the suspect, chose a filler, or made no identification. The participant had to judge the guilt of the suspect and decide whether to render a guilty verdict. For both simultaneous and sequential lineups an identification had a large effect,increasing the probability of a guilty verdict. There were no reliable effects detected between making no identification and identifying a filler. The effect sizes were similar for simultaneous and sequential lineups. These findings are important for judges and other legal professionals to know for trials involving lineup identifications.

  16. Quantum Inequalities and Sequential Measurements

    International Nuclear Information System (INIS)

    Candelpergher, B.; Grandouz, T.; Rubinx, J.L.

    2011-01-01

    In this article, the peculiar context of sequential measurements is chosen in order to analyze the quantum specificity in the two most famous examples of the Heisenberg and Bell inequalities: results are found to be at some interesting variance with customary textbook materials, where the context of initial state re-initialization is described. A key point of the analysis is the possibility of defining Joint Probability Distributions for sequential random variables associated with quantum operators. Within the sequential context, it is shown that Joint Probability Distributions can be defined in situations where not all of the quantum operators (corresponding to random variables) commute two by two. (authors)
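
    The notion of a joint distribution over sequential measurement outcomes can be made concrete with the standard collapse rule, P(a, b) = Tr(Π_b Π_a ρ Π_a Π_b), for two projective spin-1/2 measurements along different axes. The sketch below is a generic textbook-style computation, not the specific constructions analysed in the article.

```python
import numpy as np

# Joint probabilities of two sequential projective measurements on a spin-1/2
# state: first spin along n1, then along n2, using P(a, b) = Tr(P2_b P1_a rho P1_a P2_b).
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def projectors(n):
    """Projectors onto the +1/-1 eigenspaces of n . sigma for a unit vector n."""
    s = n[0] * sx + n[1] * sy + n[2] * sz
    return {+1: (I2 + s) / 2, -1: (I2 - s) / 2}

rho = np.array([[1, 0], [0, 0]], dtype=complex)        # spin up along z
P1 = projectors(np.array([0.0, 0.0, 1.0]))             # first measurement: z axis
P2 = projectors(np.array([1.0, 0.0, 0.0]))             # second measurement: x axis

joint = {}
for a, Pa in P1.items():
    for b, Pb in P2.items():
        joint[(a, b)] = np.trace(Pb @ Pa @ rho @ Pa @ Pb).real

for (a, b), p in joint.items():
    print(f"P(first = {a:+d}, second = {b:+d}) = {p:.3f}")
print("total =", round(sum(joint.values()), 6))
```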

  17. A Combination of Outcome and Process Feedback Enhances Performance in Simulations of Child Sexual Abuse Interviews Using Avatars

    Directory of Open Access Journals (Sweden)

    Francesco Pompedda

    2017-09-01

    Simulated interviews in alleged child sexual abuse (CSA) cases with computer-generated avatars paired with feedback improve interview quality. In the current study, we aimed to understand better the effect of different types of feedback in this context. Feedback was divided into feedback regarding conclusions about what happened to the avatar (outcome feedback) and feedback regarding the appropriateness of question-types used by the interviewer (process feedback). Forty-eight participants each interviewed four different avatars. Participants were divided into four groups (no feedback, outcome feedback, process feedback, and a combination of both feedback types). Compared to the control group, interview quality was generally improved in all the feedback groups on all outcome variables included. Combined feedback produced the strongest effect on increasing recommended questions and correct conclusions. For relevant and neutral details elicited by the interviewers, no statistically significant differences were found between feedback types. For wrong details, the combination of feedback produced the strongest effect, but this did not differ from the other two feedback groups. Nevertheless, process feedback produced a better result compared to outcome feedback. The present study replicated previous findings regarding the effect of feedback in improving interview quality, and provided new knowledge on feedback characteristics that maximize training effects. A combination of process and outcome feedback showed the strongest effect in enhancing training in simulated CSA interviews. Further research is, however, needed.

  18. Simultaneous Versus Sequential Ptosis and Strabismus Surgery in Children.

    Science.gov (United States)

    Revere, Karen E; Binenbaum, Gil; Li, Jonathan; Mills, Monte D; Katowitz, William R; Katowitz, James A

    The authors sought to compare the clinical outcomes of simultaneous versus sequential ptosis and strabismus surgery in children. Retrospective, single-center cohort study of children requiring both ptosis and strabismus surgery on the same eye. Simultaneous surgeries were performed during a single anesthetic event; sequential surgeries were performed at least 7 weeks apart. Outcomes were ptosis surgery success (margin reflex distance 1 ≥ 2 mm, good eyelid contour, and good eyelid crease); strabismus surgery success (ocular alignment within 10 prism diopters of orthophoria and/or improved head position); surgical complications; and reoperations. Fifty-six children were studied, 38 had simultaneous surgery and 18 sequential. Strabismus surgery was performed first in 38/38 simultaneous and 6/18 sequential cases. Mean age at first surgery was 64 months, with mean follow up 27 months. A total of 75% of children had congenital ptosis; 64% had comitant strabismus. A majority of ptosis surgeries were frontalis sling (59%) or Fasanella-Servat (30%) procedures. There were no significant differences between simultaneous and sequential groups with regards to surgical success rates, complications, or reoperations (all p > 0.28). In the first comparative study of simultaneous versus sequential ptosis and strabismus surgery, no advantage for sequential surgery was seen. Despite a theoretical risk of postoperative eyelid malposition or complications when surgeries were performed in a combined manner, the rate of such outcomes was not increased with simultaneous surgeries. Performing ptosis and strabismus surgery together appears to be clinically effective and safe, and reduces anesthesia exposure during childhood.

  19. Framework for sequential approximate optimization

    NARCIS (Netherlands)

    Jacobs, J.H.; Etman, L.F.P.; Keulen, van F.; Rooda, J.E.

    2004-01-01

    An object-oriented framework for Sequential Approximate Optimization (SAO) is proposed. The framework aims to provide an open environment for the specification and implementation of SAO strategies. The framework is based on the Python programming language and contains a toolbox of Python
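
    The record above is truncated and the toolbox itself is not shown, so the following is only a generic SAO loop written in the framework's language: sample the expensive function around the current iterate, fit a quadratic response surface by least squares, minimise the surrogate inside a move limit, and repeat. The test function and all settings are illustrative assumptions, not part of the published framework.

```python
import numpy as np
from scipy.optimize import minimize

# Generic sequential approximate optimization (SAO) loop (illustrative sketch).
rng = np.random.default_rng(0)

def expensive(x):                       # stand-in for an expensive simulation
    return (x[0] - 1.0) ** 2 + 3.0 * (x[1] + 0.5) ** 2 + 0.1 * x[0] * x[1]

def quad_features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

x = np.array([3.0, 3.0])                # starting design
radius = 1.0                            # move limit (trust-region half-width)
for it in range(6):
    # design of experiments around the current point
    X = x + rng.uniform(-radius, radius, size=(15, 2))
    y = np.array([expensive(p) for p in X])
    coef, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)

    surrogate = lambda p: float(quad_features(p[None, :]) @ coef)
    bounds = [(xi - radius, xi + radius) for xi in x]
    res = minimize(surrogate, x, bounds=bounds)

    # accept the step only if the true function improved, otherwise shrink
    if expensive(res.x) < expensive(x):
        x = res.x
    else:
        radius *= 0.5
    print(f"iter {it}: x = {np.round(x, 3)}, f = {expensive(x):.4f}")
```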

  20. Simulation-based multiprofessional obstetric anaesthesia training conducted in situ versus off-site leads to similar individual and team outcomes

    DEFF Research Database (Denmark)

    Sørensen, Jette Led; van der Vleuten, Cees; Rosthøj, Susanne

    2015-01-01

    OBJECTIVE: To investigate the effect of in situ simulation (ISS) versus off-site simulation (OSS) on knowledge, patient safety attitude, stress, motivation, perceptions of simulation, team performance and organisational impact. DESIGN: Investigator-initiated single-centre randomised superiority educational trial. SETTING: Obstetrics and anaesthesiology departments, Rigshospitalet, University of Copenhagen, Denmark. PARTICIPANTS: 100 participants in teams of 10, comprising midwives, specialised midwives, auxiliary nurses, nurse anaesthetists, operating theatre nurses, and consultant doctors... ...choice question test. EXPLORATORY OUTCOMES: Individual outcomes: scores on the Safety Attitudes Questionnaire, stress measurements (State-Trait Anxiety Inventory, cognitive appraisal and salivary cortisol), Intrinsic Motivation Inventory and perceptions of simulations. Team outcome: video assessment...

  1. Sequentially pulsed traveling wave accelerator

    Science.gov (United States)

    Caporaso, George J [Livermore, CA; Nelson, Scott D [Patterson, CA; Poole, Brian R [Tracy, CA

    2009-08-18

    A sequentially pulsed traveling wave compact accelerator having two or more pulse forming lines each with a switch for producing a short acceleration pulse along a short length of a beam tube, and a trigger mechanism for sequentially triggering the switches so that a traveling axial electric field is produced along the beam tube in synchronism with an axially traversing pulsed beam of charged particles to serially impart energy to the particle beam.

  2. A Combination of Outcome and Process Feedback Enhances Performance in Simulations of Child Sexual Abuse Interviews Using Avatars

    OpenAIRE

    Francesco Pompedda; Jan Antfolk; Jan Antfolk; Angelo Zappalà; Angelo Zappalà; Pekka Santtila; Pekka Santtila

    2017-01-01

    Simulated interviews in alleged child sexual abuse (CSA) cases with computer-generated avatars paired with feedback improve interview quality. In the current study, we aimed to understand better the effect of different types of feedback in this context. Feedback was divided into feedback regarding conclusions about what happened to the avatar (outcome feedback) and feedback regarding the appropriateness of question-types used by the interviewer (process feedback). Forty-eight participants eac...

  3. A systematic review of the effectiveness of simulation-based education on satisfaction and learning outcomes in nurse practitioner programs.

    Science.gov (United States)

    Warren, Jessie N; Luctkar-Flude, Marian; Godfrey, Christina; Lukewich, Julia

    2016-11-01

    High-fidelity simulation (HFS) is becoming an integral component in healthcare education programs. There is considerable evidence demonstrating the effectiveness of HFS on satisfaction and learning outcomes within undergraduate nursing programs; however, there are few studies that have investigated its use and effectiveness within nurse practitioner (NP) programs. To synthesize the best available evidence about the effectiveness of HFS within NP education programs worldwide. The specific review question was: what is the effect of HFS on learner satisfaction, knowledge, attitudes, and skill performance in NP education? Joanna Briggs Institute systematic review methodology was utilized. The following databases were searched: MEDLINE, CINAHL, EMBASE, Epistemonikos, PROSPERO, HealthSTAR, AMED, Cochrane, Global Health and PsycINFO. Studies were included if they were quantitative in nature and reported on any aspect of HFS within an NP program. Ten studies were included in the review. All studies were conducted in the United States and published between 2007 and 2014. Outcomes explored included: knowledge, attitudes, skills and satisfaction. The majority of studies compared HFS to online learning or traditional classroom lecture. Most study scenarios featured high acuity, low frequency events within acute care settings; only two studies utilized scenarios simulated within primary care. There is limited evidence supporting the use of HFS within NP programs. In general, HFS increases students' knowledge and confidence, and students are more satisfied with simulation-based teaching in comparison to other methods. Future studies should explore the effectiveness of simulation training within NP programs in reducing the theory to practice gap, and evaluate knowledge retention, transferability to real patient situations, and impact of simulation on patient outcomes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Outcomes and challenges of global high-resolution non-hydrostatic atmospheric simulations using the K computer

    Science.gov (United States)

    Satoh, Masaki; Tomita, Hirofumi; Yashiro, Hisashi; Kajikawa, Yoshiyuki; Miyamoto, Yoshiaki; Yamaura, Tsuyoshi; Miyakawa, Tomoki; Nakano, Masuo; Kodama, Chihiro; Noda, Akira T.; Nasuno, Tomoe; Yamada, Yohei; Fukutomi, Yoshiki

    2017-12-01

    This article reviews the major outcomes of a 5-year (2011-2016) project using the K computer to perform global numerical atmospheric simulations based on the non-hydrostatic icosahedral atmospheric model (NICAM). The K computer was made available to the public in September 2012 and was used as a primary resource for Japan's Strategic Programs for Innovative Research (SPIRE), an initiative to investigate five strategic research areas; the NICAM project fell under the research area of climate and weather simulation sciences. Combining NICAM with high-performance computing has created new opportunities in three areas of research: (1) higher resolution global simulations that produce more realistic representations of convective systems, (2) multi-member ensemble simulations that are able to perform extended-range forecasts 10-30 days in advance, and (3) multi-decadal simulations for climatology and variability. Before the K computer era, NICAM was used to demonstrate realistic simulations of intra-seasonal oscillations including the Madden-Julian oscillation (MJO), though only as a case-study approach. Thanks to the big leap in computational performance of the K computer, we could greatly increase the number of MJO events covered by numerical simulations, in addition to extending the integration time and horizontal resolution. We conclude that the high-resolution global non-hydrostatic model, as used in this five-year project, improves the ability to forecast intra-seasonal oscillations and associated tropical cyclogenesis compared with that of the relatively coarser operational models currently in use. The impacts of the sub-kilometer resolution simulation and the multi-decadal simulations using NICAM are also reviewed.

  5. Incorporating Reflective Practice into Team Simulation Projects for Improved Learning Outcomes

    Science.gov (United States)

    Wills, Katherine V.; Clerkin, Thomas A.

    2009-01-01

    The use of simulation games in business courses is a popular method for providing undergraduate students with experiences similar to those they might encounter in the business world. As such, in 2003 the authors were pleased to find a classroom simulation tool that combined the decision-making and team experiences of a senior management group with…

  6. Sequential and simultaneous revascularization in adult orthotopic piggyback liver transplantation

    NARCIS (Netherlands)

    Polak, WG; Miyamoto, S; Nemes, BA; Peeters, PMJG; de Jong, KP; Porte, RJ; Slooff, MJH

    The aim of the study was to assess whether there is a difference in outcome after sequential or simultaneous revascularization during orthotopic liver transplantation (OLT) in terms of patient and graft survival, mortality, morbidity, and liver function. The study population consisted of 102 adult

  7. The Effect of Model Fidelity on Learning Outcomes of a Simulation-Based Education Program for Central Venous Catheter Insertion.

    Science.gov (United States)

    Diederich, Emily; Mahnken, Jonathan D; Rigler, Sally K; Williamson, Timothy L; Tarver, Stephen; Sharpe, Matthew R

    2015-12-01

    Simulation-based education for central venous catheter (CVC) insertion has been repeatedly documented to improve performance, but the impact of simulation model fidelity has not been described. The aim of this study was to examine the impact of the physical fidelity of the simulation model on learning outcomes for a simulation-based education program for CVC insertion. Forty consecutive residents rotating through the medical intensive care unit of an academic medical center completed a simulation-based education program for CVC insertion. The curriculum was designed in accordance with the principles of deliberate practice and mastery learning. Each resident underwent baseline skills testing and was then randomized to training on a commercially available CVC model with high physical fidelity (High-Fi group) or a simply constructed model with low physical fidelity (Low-Fi group) in a noninferiority trial. Upon completion of their medical intensive care unit rotation 4 weeks later, residents returned for repeat skills testing on the high-fidelity model using a 26-item checklist. The mean (SD) posttraining score on the 26-item checklist for the Low-Fi group was 23.8 (2.2) (91.5%) and was not inferior to the mean (SD) score for the High-Fi group of 22.5 (2.6) (86.5%). Simulation-based education using equipment with low physical fidelity can achieve learning outcomes comparable with those achieved with high-fidelity equipment, as long as other aspects of fidelity are maintained and robust educational principles are applied during the design of the curriculum.

  8. Replicable Interprofessional Competency Outcomes from High-Volume, Inter-Institutional, Interprofessional Simulation

    Directory of Open Access Journals (Sweden)

    Deborah Bambini

    2016-10-01

    There are significant limitations among the few prior studies that have examined the development and implementation of interprofessional education (IPE) experiences to accommodate a high volume of students from several disciplines and from different institutions. The present study addressed these gaps by seeking to determine the extent to which a single, large, inter-institutional, and IPE simulation event improves student perceptions of the importance and relevance of IPE and simulation as a learning modality, whether there is a difference in students’ perceptions among disciplines, and whether the results are reproducible. A total of 290 medical, nursing, pharmacy, and physical therapy students participated in one of two large, inter-institutional, IPE simulation events. Measurements included student perceptions about their simulation experience using the Attitude Towards Teamwork in Training Undergoing Designed Educational Simulation (ATTITUDES) Questionnaire and open-ended questions related to teamwork and communication. Results demonstrated a statistically significant improvement across all ATTITUDES subscales, while time management, role confusion, collaboration, and mutual support emerged as significant themes. Results of the present study indicate that a single IPE simulation event can reproducibly result in significant and educationally meaningful improvements in student perceptions towards teamwork, IPE, and simulation as a learning modality.

  9. Replicable Interprofessional Competency Outcomes from High-Volume, Inter-Institutional, Interprofessional Simulation.

    Science.gov (United States)

    Bambini, Deborah; Emery, Matthew; de Voest, Margaret; Meny, Lisa; Shoemaker, Michael J

    2016-10-25

    There are significant limitations among the few prior studies that have examined the development and implementation of interprofessional education (IPE) experiences to accommodate a high volume of students from several disciplines and from different institutions. The present study addressed these gaps by seeking to determine the extent to which a single, large, inter-institutional, and IPE simulation event improves student perceptions of the importance and relevance of IPE and simulation as a learning modality, whether there is a difference in students' perceptions among disciplines, and whether the results are reproducible. A total of 290 medical, nursing, pharmacy, and physical therapy students participated in one of two large, inter-institutional, IPE simulation events. Measurements included student perceptions about their simulation experience using the Attitude Towards Teamwork in Training Undergoing Designed Educational Simulation (ATTITUDES) Questionnaire and open-ended questions related to teamwork and communication. Results demonstrated a statistically significant improvement across all ATTITUDES subscales, while time management, role confusion, collaboration, and mutual support emerged as significant themes. Results of the present study indicate that a single IPE simulation event can reproducibly result in significant and educationally meaningful improvements in student perceptions towards teamwork, IPE, and simulation as a learning modality.

  10. Survey of outcomes in a faculty development program on simulation pedagogy.

    Science.gov (United States)

    Roh, Young Sook; Kim, Mi Kang; Tangkawanich, Thitiarpha

    2016-06-01

    Although many nursing programs use simulation as a teaching-learning modality, there are few systematic approaches to help nursing educators learn this pedagogy. This study evaluates the effects of a simulation pedagogy nursing faculty development program on participants' learning perceptions using a retrospective pre-course and post-course design. Sixteen Thai participants completed a two-day nursing faculty development program on simulation pedagogy. Thirteen questionnaires were used in the final analysis. The participants' self-perceived learning about simulation teaching showed significant post-course improvement. On a five-point Likert scale, the composite mean attitude, subjective norm, and perceived behavioral control scores, as well as intention to use a simulator, showed a significant post-course increase. A faculty development program on simulation pedagogy induced favorable learning and attitudes. Further studies must test how faculty performance affects the cognitive, emotional, and social dimensions of learning in a simulation-based learning domain. © 2015 Wiley Publishing Asia Pty Ltd.

  11. Comparison of Sequential and Variational Data Assimilation

    Science.gov (United States)

    Alvarado Montero, Rodolfo; Schwanenberg, Dirk; Weerts, Albrecht

    2017-04-01

    Data assimilation is a valuable tool to improve model state estimates by combining measured observations with model simulations. It has recently gained significant attention due to its potential for using remote sensing products to improve operational hydrological forecasts and for reanalysis purposes. This has been supported by the application of sequential techniques such as the Ensemble Kalman Filter, which requires no additional features within the modeling process, i.e. it can use arbitrary black-box models. Alternatively, variational techniques rely on optimization algorithms to minimize a pre-defined objective function. This function describes the trade-off between the amount of noise introduced into the system and the mismatch between simulated and observed variables. While sequential techniques have been commonly applied to hydrological processes, variational techniques are seldom used. In our view, this is mainly attributable to the required computation of first-order sensitivities by algorithmic differentiation techniques and related model enhancements, but also to a lack of comparison between the two techniques. We contribute to filling this gap and present the results from the assimilation of streamflow data in two basins located in Germany and Canada. The assimilation introduces noise to precipitation and temperature to produce better initial estimates of an HBV model. The results are computed for a hindcast period and assessed using lead time performance metrics. The study concludes with a discussion of the main features of each technique and their advantages/disadvantages in hydrological applications.
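
    As a concrete illustration of the sequential (ensemble-based) side of this comparison, the sketch below implements a single, generic ensemble Kalman filter analysis step with perturbed observations. It is a minimal sketch only: the state vector, observation operator and noise levels are placeholders, and the HBV model and streamflow assimilation set-up of the study above are not reproduced.

```python
# Minimal sketch of a sequential data-assimilation (EnKF) analysis step.
# All dimensions and values are illustrative; the HBV model and the
# precipitation/temperature perturbation of the study are not reproduced.
import numpy as np

def enkf_update(ensemble, obs, obs_err_var, H):
    """Update an ensemble of model states with one observation vector.

    ensemble    : (n_members, n_state) array of prior state realisations
    obs         : (n_obs,) observed values
    obs_err_var : scalar observation-error variance
    H           : (n_obs, n_state) linear observation operator
    """
    n_members, _ = ensemble.shape
    X = ensemble - ensemble.mean(axis=0)            # state anomalies
    Y = X @ H.T                                     # predicted-observation anomalies
    P_yy = Y.T @ Y / (n_members - 1) + obs_err_var * np.eye(len(obs))
    P_xy = X.T @ Y / (n_members - 1)
    K = P_xy @ np.linalg.inv(P_yy)                  # Kalman gain
    # Perturbed-observation variant: each member sees a noisy copy of obs.
    rng = np.random.default_rng(0)
    obs_pert = obs + rng.normal(0.0, np.sqrt(obs_err_var), size=(n_members, len(obs)))
    innovations = obs_pert - ensemble @ H.T
    return ensemble + innovations @ K.T

# Toy usage: a 50-member ensemble of a 3-variable state, observing variable 0.
rng = np.random.default_rng(1)
prior = rng.normal(0.0, 1.0, size=(50, 3))
H = np.array([[1.0, 0.0, 0.0]])
posterior = enkf_update(prior, obs=np.array([0.8]), obs_err_var=0.1, H=H)
print(posterior.mean(axis=0))
```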

  12. Time scale of random sequential adsorption.

    Science.gov (United States)

    Erban, Radek; Chapman, S Jonathan

    2007-04-01

    A simple multiscale approach to the diffusion-driven adsorption from a solution to a solid surface is presented. The model combines two important features of the adsorption process: (i) The kinetics of the chemical reaction between adsorbing molecules and the surface and (ii) geometrical constraints on the surface made by molecules which are already adsorbed. The process (i) is modeled in a diffusion-driven context, i.e., the conditional probability of adsorbing a molecule provided that the molecule hits the surface is related to the macroscopic surface reaction rate. The geometrical constraint (ii) is modeled using random sequential adsorption (RSA), which is the sequential addition of molecules at random positions on a surface; one attempt to attach a molecule is made per one RSA simulation time step. By coupling RSA with the diffusion of molecules in the solution above the surface, the RSA simulation time step is related to the real physical time. The method is illustrated on a model of chemisorption of reactive polymers to a virus surface.
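
    The basic RSA step described above, one random placement attempt per time step accepted only if it respects the geometric constraint, can be sketched in a few lines. The following toy example adsorbs hard disks on a unit square; the diffusion coupling and the surface reaction probability that relate the RSA step to physical time are deliberately omitted, and the radius and number of attempts are arbitrary.

```python
# Toy random sequential adsorption (RSA) of hard disks on a unit square:
# one placement attempt per step, accepted only if the new disk does not
# overlap any previously adsorbed disk.
import numpy as np

def rsa_disks(radius=0.03, attempts=5000, seed=0):
    rng = np.random.default_rng(seed)
    centres = []
    for _ in range(attempts):
        trial = rng.random(2)
        if all(np.hypot(*(trial - c)) >= 2 * radius for c in centres):
            centres.append(trial)
    return np.array(centres)

disks = rsa_disks()
coverage = len(disks) * np.pi * 0.03**2      # approximate; ignores edge effects
print(f"adsorbed disks: {len(disks)}, approx. surface coverage: {coverage:.3f}")
```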

  13. Remarks on sequential designs in risk assessment

    International Nuclear Information System (INIS)

    Seidenfeld, T.

    1982-01-01

    The special merits of sequential designs are reviewed in light of particular challenges that attend risk assessment for human populations. The kinds of ''statistical inference'' are distinguished, and the design problem pursued is the clash between the Neyman-Pearson and Bayesian programs of sequential design. The value of sequential designs is discussed, and Neyman-Pearson versus Bayesian sequential designs are examined in particular. Finally, caveats associated with sequential designs are considered, especially in relation to utilitarianism.

  14. Simulation research to enhance patient safety and outcomes: recommendations of the Simnovate Patient Safety Domain Group

    OpenAIRE

    Pucher, PH; Tamblyn, R; Boorman, D; Dixon-Woods, Mary Margaret; Donaldson, L; Draycott, T; Forster, A; Nadkarni, V; Power, C; Sevdalis, N; Aggarwal, R

    2017-01-01

    The use of simulation-based training has established itself in healthcare but its implementation has been varied and mostly limited to technical and non-technical skills training. This article discusses the possibilities of the use of simulation as part of an overarching approach to improving patient safety, and represents the views of the Simnovate Patient Safety Domain Group, an international multidisciplinary expert group dedicated to the improvement of patient safety. The application and ...

  15. Sequential versus simultaneous market delineation

    DEFF Research Database (Denmark)

    Haldrup, Niels; Møllgaard, Peter; Kastberg Nielsen, Claus

    2005-01-01

    Delineation of the relevant market forms a pivotal part of most antitrust cases. The standard approach is sequential. First the product market is delineated, then the geographical market is defined. Demand and supply substitution in both the product dimension and the geographical dimension … and geographical markets. Using a unique data set for prices of Norwegian and Scottish salmon, we propose a methodology for simultaneous market delineation and we demonstrate that compared to a sequential approach conclusions will be reversed. JEL: C3, K21, L41, Q22. Keywords: Relevant market, econometric delineation

  16. Sequential logic analysis and synthesis

    CERN Document Server

    Cavanagh, Joseph

    2007-01-01

    Until now, there was no single resource for actual digital system design. Using both basic and advanced concepts, Sequential Logic: Analysis and Synthesis offers a thorough exposition of the analysis and synthesis of both synchronous and asynchronous sequential machines. With 25 years of experience in designing computing equipment, the author stresses the practical design of state machines. He clearly delineates each step of the structured and rigorous design principles that can be applied to practical applications. The book begins by reviewing the analysis of combinatorial logic and Boolean algebra.

  17. Learning sequential control in a Neural Blackboard Architecture for in situ concept reasoning

    NARCIS (Netherlands)

    van der Velde, Frank; van der Velde, Frank; Besold, Tarek R.; Lamb, Luis; Serafini, Luciano; Tabor, Whitney

    2016-01-01

    Simulations are presented and discussed of learning sequential control in a Neural Blackboard Architecture (NBA) for in situ concept-based reasoning. Sequential control is learned in a reservoir network, consisting of columns with neural circuits. This allows the reservoir to control the dynamics of

  18. Virtual simulation of the postsurgical cosmetic outcome in patients with Pectus Excavatum

    Science.gov (United States)

    Vilaça, João L.; Moreira, António H. J.; L-Rodrigues, Pedro; Rodrigues, Nuno; Fonseca, Jaime C.; Pinho, A. C. M.; Correia-Pinto, Jorge

    2011-03-01

    Pectus excavatum is the most common congenital deformity of the anterior chest wall, in which several ribs and the sternum grow abnormally. Nowadays, surgical correction is carried out in children and adults using the Nuss technique, which has been shown to be safe, with cosmesis and the prevention of psychological problems and social stress as its major drivers. However, no application is known to predict the cosmetic outcome of pectus excavatum surgical correction. Such a tool could help the surgeon and the patient when deciding whether corrective surgery is needed. This work is a first step towards predicting the postsurgical outcome of pectus excavatum correction. To this end, a point cloud of the skin surface along the thoracic wall was first determined using computed tomography (before surgical correction) and the Polhemus FastSCAN (after surgical correction). A surface mesh was then reconstructed from the two point clouds using a radial basis function algorithm, followed by affine registration between the meshes. After registration, the surgical correction influence area (SCIA) of the thoracic wall was studied. This SCIA was used to train, test and validate artificial neural networks (ANNs) in order to predict the surgical outcome of pectus excavatum correction and to determine the degree of convergence of the SCIA in different patients. Often, the ANNs did not converge to a satisfactory solution (each patient had their own deformity characteristics), thus invalidating the creation of a mathematical model capable of estimating the postsurgical outcome with satisfactory results.
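
    The surface-reconstruction step mentioned above, fitting a smooth surface to a scattered point cloud with radial basis functions, can be illustrated with SciPy's RBFInterpolator (SciPy >= 1.7). This is a simplified example on synthetic data; it does not reproduce the authors' CT/FastSCAN pipeline, the registration, or the neural-network prediction.

```python
# Simplified illustration of radial-basis-function (RBF) surface fitting:
# reconstruct a smooth height field z(x, y) from a scattered point cloud.
# The synthetic "dent" below merely stands in for a chest-wall surface.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
points = rng.uniform(-1.0, 1.0, size=(200, 2))        # scattered (x, y) samples
depth = np.exp(-4.0 * (points**2).sum(axis=1))        # synthetic surface heights

surface = RBFInterpolator(points, depth, kernel="thin_plate_spline")

# Evaluate the reconstructed surface on a regular grid.
gx, gy = np.meshgrid(np.linspace(-1, 1, 50), np.linspace(-1, 1, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
z = surface(grid).reshape(gx.shape)
print(z.shape, float(z.max()))
```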

  19. Commentary on a participatory inquiry paradigm used to assess EOL simulation participant outcomes and design.

    Science.gov (United States)

    Gannon, Jane M

    2017-11-20

    Care at the end-of-life has attracted global attention, as health care workers struggle with balancing cure based care with end-of-life care, and knowing when to transition from the former to the latter. Simulation is gaining in popularity as an education strategy to facilitate health care provider decision-making by improving communication skills with patients and family members. This commentary focuses on the authors' simulation evaluation process. When data were assessed using a participatory inquiry paradigm, the evaluation revealed far more than a formative or summative evaluation of participant knowledge and skills in this area of care. Consequently, this assessment strategy has ramifications for best practices for simulation design and evaluation.

  20. Perceived Benefits of Pre-Clinical Simulation-based Training on Clinical Learning Outcomes among Omani Undergraduate Nursing Students

    Directory of Open Access Journals (Sweden)

    Girija Madhavanprabhakaran

    2015-01-01

    Full Text Available Objectives: This study aimed to explore the benefits perceived by Omani undergraduate maternity nursing students regarding the effect of pre-clinical simulation-based training (PSBT) on clinical learning outcomes. Methods: This non-experimental quantitative survey was conducted between August and December 2012 among third-year baccalaureate nursing students at Sultan Qaboos University in Muscat, Oman. Voluntary participants were exposed to faculty-guided PSBT sessions using low- and medium-fidelity manikins, standardised scenarios and skill checklists on antenatal, intranatal, postnatal and newborn care and assessment. Participants answered a purposely designed self-administered questionnaire on the benefits of PSBT in enhancing learning outcomes. Items were categorised into six subscales: knowledge, skills, patient safety, academic safety, confidence and satisfaction. Scores were rated on a four-point Likert scale. Results: Of the 57 participants, the majority (95.2%) agreed that PSBT enhanced their knowledge. Most students (94.3%) felt that their patient safety practices improved and 86.5% rated PSBT as beneficial for enhancing skill competencies. All male students and 97% of the female students agreed that PSBT enhanced their confidence in the safe holding of newborns. Moreover, 93% of participants were satisfied with PSBT. Conclusion: Omani undergraduate nursing students perceived that PSBT enhanced their knowledge, skills, patient safety practices and confidence levels in providing maternity care. These findings support the use of simulation training as a strategy to facilitate clinical learning outcomes in future nursing courses in Oman, although further research is needed to explore the objective impact of PSBT on learning outcomes.

  1. The pursuit of balance in sequential randomized trials

    Directory of Open Access Journals (Sweden)

    Raymond P. Guiteras

    2016-06-01

    Full Text Available In many randomized trials, subjects enter the sample sequentially. Because the covariates for all units are not known in advance, standard methods of stratification do not apply. We describe and assess the method of DA-optimal sequential allocation (Atkinson, 1982) for balancing stratification covariates across treatment arms. We provide simulation evidence that the method can provide substantial improvements in precision over commonly employed alternatives. We also describe our experience implementing the method in a field trial of a clean water and handwashing intervention in Dhaka, Bangladesh, the first time the method has been used. We provide advice and software for future researchers.
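
    To convey the flavour of sequential covariate balancing, the sketch below assigns each arriving subject with a biased coin that favours the arm currently under-represented within the subject's covariate level. This is a deliberately simplified illustration and is not Atkinson's (1982) DA-optimal rule assessed in the paper; the favoured-probability value is arbitrary.

```python
# Simplified sequential covariate-balancing allocation: a biased coin that
# favours whichever arm lags behind within the new subject's covariate level.
# Not Atkinson's DA-optimal rule; for illustration only.
import numpy as np

def allocate_sequentially(covariates, p_favoured=0.8, seed=0):
    rng = np.random.default_rng(seed)
    counts = np.zeros((2, 2))                # counts[arm, covariate_level]
    assignments = []
    for x in covariates:                     # subjects arrive one at a time
        imbalance = counts[0, x] - counts[1, x]
        if imbalance == 0:
            p_treat = 0.5
        elif imbalance > 0:                  # control ahead in this stratum -> favour treatment
            p_treat = p_favoured
        else:                                # treatment ahead -> favour control
            p_treat = 1.0 - p_favoured
        arm = int(rng.random() < p_treat)    # 1 = treatment, 0 = control
        counts[arm, x] += 1
        assignments.append(arm)
    return assignments, counts

covs = np.random.default_rng(1).integers(0, 2, size=200)   # e.g. a binary covariate per subject
arms, counts = allocate_sequentially(covs)
print(counts)    # near-equal arm sizes within each covariate level
```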

  2. Event-shape analysis: Sequential versus simultaneous multifragment emission

    International Nuclear Information System (INIS)

    Cebra, D.A.; Howden, S.; Karn, J.; Nadasen, A.; Ogilvie, C.A.; Vander Molen, A.; Westfall, G.D.; Wilson, W.K.; Winfield, J.S.; Norbeck, E.

    1990-01-01

    The Michigan State University 4π array has been used to select central-impact-parameter events from the reaction ⁴⁰Ar + ⁵¹V at incident energies from 35 to 85 MeV/nucleon. The event shape in momentum space is an observable which is shown to be sensitive to the dynamics of the fragmentation process. A comparison of the experimental event-shape distribution to sequential- and simultaneous-decay predictions suggests that a transition in the breakup process may have occurred. At 35 MeV/nucleon, a sequential-decay simulation reproduces the data. For the higher energies, the experimental distributions fall between the two contrasting predictions.

  3. Evaluation Using Sequential Trials Methods.

    Science.gov (United States)

    Cohen, Mark E.; Ralls, Stephen A.

    1986-01-01

    Although dental school faculty as well as practitioners are interested in evaluating products and procedures used in clinical practice, research design and statistical analysis can sometimes pose problems. Sequential trials methods provide an analytical structure that is both easy to use and statistically valid. (Author/MLW)

  4. Attack Trees with Sequential Conjunction

    NARCIS (Netherlands)

    Jhawar, Ravi; Kordy, Barbara; Mauw, Sjouke; Radomirović, Sasa; Trujillo-Rasua, Rolando

    2015-01-01

    We provide the first formal foundation of SAND attack trees which are a popular extension of the well-known attack trees. The SAND attack tree formalism increases the expressivity of attack trees by introducing the sequential conjunctive operator SAND. This operator enables the modeling of

  5. Simulations

    CERN Document Server

    Ngada, Narcisse

    2015-06-15

    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great understanding about how systems really operate. This paper helps the reader to gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The conclusion then summarizes the main points to keep in mind before opting for a simulation tool or before performing a simulation.

  6. Learning Outcomes from Business Simulation Exercises: Challenges for the Implementation of Learning Technologies

    Science.gov (United States)

    Clarke, Elizabeth

    2009-01-01

    Purpose: High order leadership, problem solving skills, and the capacity for innovation in new markets, and technologically complex and multidimensional contexts, are the new set of skills that are most valued by companies and employers alike. Business simulation exercises are one way of enhancing these skills. This article aims to examine the…

  7. Female rock sparrows (Petronia petronia), not males, respond differently to simulations of different courtship interaction outcomes

    DEFF Research Database (Denmark)

    Matessi, Giuliano; Peake, Tom M.; McGregor, Peter K.

    2007-01-01

    individuals of both sexes have access to a range of mating strategies. We tested whether rock sparrows (Petronia petronia) behave differently after hearing playbacks of vocal interactions simulating a successful courtship as opposed to playback of an unsuccessful courtship. We found no support for our...

  8. Assessing Critical Thinking Outcomes of Dental Hygiene Students Utilizing Virtual Patient Simulation: A Mixed Methods Study.

    Science.gov (United States)

    Allaire, Joanna L

    2015-09-01

    Dental hygiene educators must determine which educational practices best promote critical thinking, a quality necessary to translate knowledge into sound clinical decision making. The aim of this small pilot study was to determine whether virtual patient simulation had an effect on the critical thinking of dental hygiene students. A pretest-posttest design using the Health Science Reasoning Test was used to evaluate the critical thinking skills of senior dental hygiene students at The University of Texas School of Dentistry at Houston Dental Hygiene Program before and after their experience with computer-based patient simulation cases. Additional survey questions sought to identify the students' perceptions of whether the experience had helped develop their critical thinking skills and improved their ability to provide competent patient care. A convenience sample of 31 senior dental hygiene students completed both the pretest and posttest (81.5% of total students in that class); 30 senior dental hygiene students completed the survey on perceptions of the simulation (78.9% response rate). Although the results did not show a significant increase in mean scores, the students reported feeling that the use of virtual patients was an effective teaching method to promote critical thinking, problem-solving, and confidence in the clinical realm. The results of this pilot study may support the use of virtual patient simulations in dental hygiene education. Future research could include a larger controlled study to validate findings from this study.

  9. Transforming Healthcare Delivery: Integrating Dynamic Simulation Modelling and Big Data in Health Economics and Outcomes Research

    NARCIS (Netherlands)

    Marshall, Deborah A.; Burgos-Liz, Lina; Pasupathy, Kalyan S.; Padula, William V.; IJzerman, Maarten Joost; Wong, Peter K.; Higashi, Mitchell K.; Engbers, Jordan; Wiebe, Samuel; Crown, William; Osgood, Nathaniel D.

    2016-01-01

    In the era of the Information Age and personalized medicine, healthcare delivery systems need to be efficient and patient-centred. The health system must be responsive to individual patient choices and preferences about their care, while considering the system consequences. While dynamic simulation

  10. A Sequential Multiplicative Extended Kalman Filter for Attitude Estimation Using Vector Observations

    Science.gov (United States)

    Qin, Fangjun; Jiang, Sai; Zha, Feng

    2018-01-01

    In this paper, a sequential multiplicative extended Kalman filter (SMEKF) is proposed for attitude estimation using vector observations. In the proposed SMEKF, each of the vector observations is processed sequentially to update the attitude, which can make the measurement model linearization more accurate for the next vector observation. This is the main difference to Murrell’s variation of the MEKF, which does not update the attitude estimate during the sequential procedure. Meanwhile, the covariance is updated after all the vector observations have been processed, which is used to account for the special characteristics of the reset operation necessary for the attitude update. This is the main difference to the traditional sequential EKF, which updates the state covariance at each step of the sequential procedure. The numerical simulation study demonstrates that the proposed SMEKF has more consistent and accurate performance in a wide range of initial estimate errors compared to the MEKF and its traditional sequential forms. PMID:29751538

  11. A Sequential Multiplicative Extended Kalman Filter for Attitude Estimation Using Vector Observations

    Directory of Open Access Journals (Sweden)

    Fangjun Qin

    2018-05-01

    Full Text Available In this paper, a sequential multiplicative extended Kalman filter (SMEKF) is proposed for attitude estimation using vector observations. In the proposed SMEKF, each of the vector observations is processed sequentially to update the attitude, which can make the measurement model linearization more accurate for the next vector observation. This is the main difference to Murrell’s variation of the MEKF, which does not update the attitude estimate during the sequential procedure. Meanwhile, the covariance is updated after all the vector observations have been processed, which is used to account for the special characteristics of the reset operation necessary for the attitude update. This is the main difference to the traditional sequential EKF, which updates the state covariance at each step of the sequential procedure. The numerical simulation study demonstrates that the proposed SMEKF has more consistent and accurate performance in a wide range of initial estimate errors compared to the MEKF and its traditional sequential forms.
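
    The core idea of processing observations one at a time can be shown with an ordinary linear Kalman filter, as in the sketch below, where each scalar measurement update uses the state already refined by the previous one. Note this is the traditional sequential form; the multiplicative quaternion machinery and the deferred covariance update that distinguish the SMEKF are not reproduced, and all dimensions and values are illustrative.

```python
# Sketch of sequential measurement updates in a linear Kalman filter:
# scalar observations are folded in one at a time.
import numpy as np

def sequential_update(x, P, observations, H_rows, r_vars):
    """x: (n,) state, P: (n, n) covariance,
    observations: list of scalar measurements,
    H_rows: matching list of (n,) measurement rows,
    r_vars: matching list of scalar measurement variances."""
    for z, h, r in zip(observations, H_rows, r_vars):
        h = np.asarray(h, dtype=float)
        s = h @ P @ h + r                   # innovation variance (scalar)
        k = P @ h / s                       # Kalman gain, shape (n,)
        x = x + k * (z - h @ x)             # state update with the refined x
        P = P - np.outer(k, h) @ P          # covariance update (Joseph form omitted)
    return x, P

x0 = np.zeros(2)
P0 = np.eye(2)
x1, P1 = sequential_update(x0, P0,
                           observations=[1.0, 0.5],
                           H_rows=[[1.0, 0.0], [0.0, 1.0]],
                           r_vars=[0.1, 0.1])
print(x1, np.diag(P1))
```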

  12. Benefits of applying a proxy eligibility period when using electronic health records for outcomes research: a simulation study.

    Science.gov (United States)

    Yu, Tzy-Chyi; Zhou, Huanxue

    2015-06-09

    Electronic health records (EHRs) can provide valuable data for outcomes research. However, unlike administrative claims databases, EHRs lack eligibility tables or a standard way to define the benefit coverage period, which could lead to underreporting of healthcare utilization or outcomes, and could result in surveillance bias. We tested the effect of using a proxy eligibility period (eligibility proxy) when estimating a range of health resource utilization and outcomes parameters under varying degrees of missing encounter data. We applied an eligibility proxy to create a benchmark cohort of chronic obstructive pulmonary disease (COPD) patients with 12 months of follow-up, with the assumption of no missing encounter data. The benchmark cohort provided parameter estimates for comparison with 9,000 simulated datasets representing 10-90% of COPD patients (by 10th percentiles) with between 1 and 11 months of continuous missing data. Two analyses, one for datasets using an eligibility proxy and one for those without an eligibility proxy, were performed on the 9,000 datasets to assess estimator performance under increasing levels of missing data. Estimates for each study variable were compared with those from the benchmark dataset, and performance was evaluated using bias, percentage change, and root-mean-square error. The benchmark dataset contained 6,717 COPD patients, whereas the simulated datasets where the eligibility proxy was applied had between 671 and 6,045 patients depending on the percentage of missing data. Parameter estimates had better performance when an eligibility proxy based on the first and last month of observed activity was applied. This finding was consistent across a range of variables representing patient comorbidities, symptoms, outcomes, health resource utilization, and medications, regardless of the measures of performance used. Without the eligibility proxy, all evaluated parameters were consistently underestimated. In a large COPD patient
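
    The eligibility proxy itself, defined above as the span between a patient's first and last month of observed activity, is straightforward to derive from encounter-level data. The sketch below uses pandas with hypothetical column names ('patient_id', 'encounter_date'); it is not the study's actual schema or cohort logic.

```python
# Illustrative derivation of a proxy eligibility period from EHR encounters:
# the span between each patient's first and last observed activity.
import pandas as pd

encounters = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2],
    "encounter_date": pd.to_datetime(
        ["2013-01-15", "2013-04-02", "2013-12-20", "2013-03-05", "2013-06-30"]),
})

proxy = (encounters.groupby("patient_id")["encounter_date"]
         .agg(first_activity="min", last_activity="max"))
proxy["proxy_months"] = ((proxy["last_activity"] - proxy["first_activity"])
                         .dt.days / 30.44).round(1)
print(proxy)
```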

  13. Can Simulation Credibility Be Improved Using Sensitivity Analysis to Understand Input Data Effects on Model Outcome?

    Science.gov (United States)

    Myers, Jerry G.; Young, M.; Goodenow, Debra A.; Keenan, A.; Walton, M.; Boley, L.

    2015-01-01

    Model and simulation (MS) credibility is defined as, the quality to elicit belief or trust in MS results. NASA-STD-7009 [1] delineates eight components (Verification, Validation, Input Pedigree, Results Uncertainty, Results Robustness, Use History, MS Management, People Qualifications) that address quantifying model credibility, and provides guidance to the model developers, analysts, and end users for assessing the MS credibility. Of the eight characteristics, input pedigree, or the quality of the data used to develop model input parameters, governing functions, or initial conditions, can vary significantly. These data quality differences have varying consequences across the range of MS application. NASA-STD-7009 requires that the lowest input data quality be used to represent the entire set of input data when scoring the input pedigree credibility of the model. This requirement provides a conservative assessment of model inputs, and maximizes the communication of the potential level of risk of using model outputs. Unfortunately, in practice, this may result in overly pessimistic communication of the MS output, undermining the credibility of simulation predictions to decision makers. This presentation proposes an alternative assessment mechanism, utilizing results parameter robustness, also known as model input sensitivity, to improve the credibility scoring process for specific simulations.

  14. Prosody and alignment: a sequential perspective

    Science.gov (United States)

    Szczepek Reed, Beatrice

    2010-12-01

    In their analysis of a corpus of classroom interactions in an inner city high school, Roth and Tobin describe how teachers and students accomplish interactional alignment by prosodically matching each other's turns. Prosodic matching, and specific prosodic patterns are interpreted as signs of, and contributions to successful interactional outcomes and positive emotions. Lack of prosodic matching, and other specific prosodic patterns are interpreted as features of unsuccessful interactions, and negative emotions. This forum focuses on the article's analysis of the relation between interpersonal alignment, emotion and prosody. It argues that prosodic matching, and other prosodic linking practices, play a primarily sequential role, i.e. one that displays the way in which participants place and design their turns in relation to other participants' turns. Prosodic matching, rather than being a conversational action in itself, is argued to be an interactional practice (Schegloff 1997), which is not always employed for the accomplishment of `positive', or aligning actions.

  15. Evaluation of high-fidelity simulation training in radiation oncology using an outcomes logic model

    International Nuclear Information System (INIS)

    Giuliani, Meredith; Gillan, Caitlin; Wong, Olive; Harnett, Nicole; Milne, Emily; Moseley, Doug; Thompson, Robert; Catton, Pamela; Bissonnette, Jean-Pierre

    2014-01-01

    To evaluate the feasibility and educational value of high-fidelity, interprofessional team-based simulation in radiation oncology. The simulation event was conducted in a radiation oncology department during a non-clinical day. It involved 5 simulation scenarios that were run over three 105 minute timeslots in a single day. High-acuity, low-frequency clinical situations were selected and included HDR brachytherapy emergency, 4D CT artifact management, pediatric emergency clinical mark-up, electron scalp trial set-up and a cone beam CT misregistration incident. A purposive sample of a minimum of 20 trainees was required to assess recruitment feasibility. A faculty radiation oncologist (RO), medical physicist (MP) or radiation therapist (RTT), facilitated each case. Participants completed a pre event survey of demographic data and motivation for participation. A post event survey collected perceptions of familiarity with the clinical content, comfort with interprofessional practice, and event satisfaction, scored on a 1–10 scale in terms of clinical knowledge, clinical decision making, clinical skills, exposure to other trainees and interprofessional communication. Means and standard deviations were calculated. Twenty-one trainees participated including 6 ROs (29%), 6 MPs (29%), and 9 RTTs (43%). All 12 cases (100%) were completed within the allocated 105 minutes. Nine faculty facilitators, (3MP, 2 RO, 4 RTTs) were required for 405 minutes each. Additional costs associated with this event were 154 hours to build the high fidelity scenarios, 2 standardized patients (SPs) for a total of 15.5 hours, and consumables.The mean (±SD) educational value score reported by participants with respect to clinical knowledge was 8.9 (1.1), clinical decision making 8.9 (1.3), clinical skills 8.9 (1.1), exposure to other trainees 9.1 (2.3) and interprofessional communication 9.1 (1.0). Fifteen (71%) participants reported the cases were of an appropriate complexity. The importance

  16. Sequential method for the assessment of innovations in computer assisted industrial processes

    International Nuclear Information System (INIS)

    Suarez Antola R.

    1995-01-01

    A sequential method for the assessment of innovations in industrial processes is proposed, using suitable combinations of mathematical modelling and numerical simulation of dynamics. Some advantages and limitations of the proposed method are discussed. tabs

  17. Changes in self-efficacy, collective efficacy and patient outcome following interprofessional simulation training on postpartum haemorrhage.

    Science.gov (United States)

    Egenberg, Signe; Øian, Pål; Eggebø, Torbjørn Moe; Arsenovic, Mirjana Grujic; Bru, Lars Edvin

    2017-10-01

    To examine whether interprofessional simulation training on management of postpartum haemorrhage enhances self-efficacy and collective efficacy and reduces the blood transfusion rate after birth. Postpartum haemorrhage is a leading cause of maternal morbidity and mortality worldwide, although it is preventable in most cases. Interprofessional simulation training might help improve the competence of health professionals dealing with postpartum haemorrhage, and more information is needed to determine its potential. Multimethod, quasi-experimental, pre-post intervention design. Interprofessional simulation training on postpartum haemorrhage was implemented for midwives, obstetricians and auxiliary nurses in a university hospital. Training included realistic scenarios and debriefing, and a measurement scale for perceived postpartum haemorrhage-specific self-efficacy, and collective efficacy was developed and implemented. Red blood cell transfusion was used as the dependent variable for improved patient outcome pre-post intervention. Self-efficacy and collective efficacy levels were significantly increased after training. The overall red blood cell transfusion rate did not change, but there was a significant reduction in the use of ≥5 units of blood products related to severe bleeding after birth. The study contributes to new knowledge on how simulation training through mastery and vicarious experiences, verbal persuasion and psychophysiological state might enhance postpartum haemorrhage-specific self-efficacy and collective efficacy levels and thereby predict team performance. The significant reduction in severe postpartum haemorrhage after training, indicated by reduction in ≥5 units of blood transfusions, corresponds well with the improvement in collective efficacy, and might reflect the emphasis on collective efforts to counteract severe cases of postpartum haemorrhage. Interprofessional simulation training in teams may contribute to enhanced prevention and

  18. Multi-agent sequential hypothesis testing

    KAUST Repository

    Kim, Kwang-Ki K.; Shamma, Jeff S.

    2014-01-01

    incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well

  19. Robustness of the Sequential Lineup Advantage

    Science.gov (United States)

    Gronlund, Scott D.; Carlson, Curt A.; Dailey, Sarah B.; Goodsell, Charles A.

    2009-01-01

    A growing movement in the United States and around the world involves promoting the advantages of conducting an eyewitness lineup in a sequential manner. We conducted a large study (N = 2,529) that included 24 comparisons of sequential versus simultaneous lineups. A liberal statistical criterion revealed only 2 significant sequential lineup…

  20. Random sequential adsorption of cubes

    Science.gov (United States)

    Cieśla, Michał; Kubala, Piotr

    2018-01-01

    Random packings built of cubes are studied numerically using a random sequential adsorption algorithm. To compare the obtained results with previous reports, three different models of cube orientation sampling were used. Also, three different cube-cube intersection algorithms were tested to find the most efficient one. The study focuses on the mean saturated packing fraction as well as kinetics of packing growth. Microstructural properties of packings were analyzed using density autocorrelation function.

  1. Correlations between contouring similarity metrics and simulated treatment outcome for prostate radiotherapy

    Science.gov (United States)

    Roach, D.; Jameson, M. G.; Dowling, J. A.; Ebert, M. A.; Greer, P. B.; Kennedy, A. M.; Watt, S.; Holloway, L. C.

    2018-02-01

    Many similarity metrics exist for inter-observer contouring variation studies, however no correlation between metric choice and prostate cancer radiotherapy dosimetry has been explored. These correlations were investigated in this study. Two separate trials were undertaken, the first a thirty-five patient cohort with three observers, the second a five patient dataset with ten observers. Clinical and planning target volumes (CTV and PTV), rectum, and bladder were independently contoured by all observers in each trial. Structures were contoured on T2-weighted MRI and transferred onto CT following rigid registration for treatment planning in the first trial. Structures were contoured directly on CT in the second trial. STAPLE and majority voting volumes were generated as reference gold standard volumes for each structure for the two trials respectively. VMAT treatment plans (78 Gy to PTV) were simulated for observer and gold standard volumes, and dosimetry assessed using multiple radiobiological metrics. Correlations between contouring similarity metrics and dosimetry were calculated using Spearman’s rank correlation coefficient. No correlations were observed between contouring similarity metrics and dosimetry for CTV within either trial. Volume similarity correlated most strongly with radiobiological metrics for PTV in both trials, including TCP_Poisson (ρ = 0.57, 0.65), TCP_Logit (ρ = 0.39, 0.62), and EUD (ρ = 0.43, 0.61) for each respective trial. Rectum and bladder metric correlations displayed no consistency for the two trials. PTV volume similarity was found to significantly correlate with rectum normal tissue complication probability (ρ = 0.33, 0.48). Minimal to no correlations with dosimetry were observed for overlap or boundary contouring metrics. Future inter-observer contouring variation studies for prostate cancer should incorporate volume similarity to provide additional insights into dosimetry during analysis.
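
    The correlation analysis itself reduces to Spearman's rank correlation between a contouring similarity metric and a dosimetric or radiobiological metric, as in the sketch below. The arrays are synthetic placeholders standing in for per-observer values; they are not the trial data.

```python
# Minimal sketch of the correlation step: Spearman's rank correlation between
# a contour-similarity metric (volume similarity) and a radiobiological metric
# (a TCP value). Synthetic data only.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
volume_similarity = rng.uniform(0.7, 1.0, size=35)               # one value per observer contour
tcp = 0.6 + 0.3 * volume_similarity + rng.normal(0, 0.05, size=35)

rho, p_value = spearmanr(volume_similarity, tcp)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```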

  2. Sequential Triangle Strip Generator based on Hopfield Networks

    Czech Academy of Sciences Publication Activity Database

    Šíma, Jiří; Lněnička, Radim

    2009-01-01

    Roč. 21, č. 2 (2009), s. 583-617 ISSN 0899-7667 R&D Projects: GA MŠk(CZ) 1M0545; GA AV ČR 1ET100300517; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10300504; CEZ:AV0Z10750506 Keywords : sequential triangle strip * combinatorial optimization * Hopfield network * minimum energy * simulated annealing Subject RIV: IN - Informatics, Computer Science Impact factor: 2.175, year: 2009

  3. Sequential series for nuclear reactions

    International Nuclear Information System (INIS)

    Izumo, Ko

    1975-01-01

    A new time-dependent treatment of nuclear reactions is given, in which the wave function of compound nucleus is expanded by a sequential series of the reaction processes. The wave functions of the sequential series form another complete set of compound nucleus at the limit Δt→0. It is pointed out that the wave function is characterized by the quantities: the number of degrees of freedom of motion n, the period of the motion (Poincare cycle) t_n, the delay time t_nμ and the relaxation time τ_n to the equilibrium of compound nucleus, instead of the usual quantum number λ, the energy eigenvalue E_λ and the total width Γ_λ of resonance levels, respectively. The transition matrix elements and the yields of nuclear reactions also become the functions of time given by the Fourier transform of the usual ones. The Poincare cycles of compound nuclei are compared with the observed correlations among resonance levels, which are about 10⁻¹⁷–10⁻¹⁶ sec for medium and heavy nuclei and about 10⁻²⁰ sec for the intermediate resonances. (auth.)

  4. Simulation-based multiprofessional obstetric anaesthesia training conducted in situ versus off-site leads to similar individual and team outcomes: a randomised educational trial

    Science.gov (United States)

    Sørensen, Jette Led; van der Vleuten, Cees; Rosthøj, Susanne; Østergaard, Doris; LeBlanc, Vicki; Johansen, Marianne; Ekelund, Kim; Starkopf, Liis; Lindschou, Jane; Gluud, Christian; Weikop, Pia; Ottesen, Bent

    2015-01-01

    Objective To investigate the effect of in situ simulation (ISS) versus off-site simulation (OSS) on knowledge, patient safety attitude, stress, motivation, perceptions of simulation, team performance and organisational impact. Design Investigator-initiated single-centre randomised superiority educational trial. Setting Obstetrics and anaesthesiology departments, Rigshospitalet, University of Copenhagen, Denmark. Participants 100 participants in teams of 10, comprising midwives, specialised midwives, auxiliary nurses, nurse anaesthetists, operating theatre nurses, and consultant doctors and trainees in obstetrics and anaesthesiology. Interventions Two multiprofessional simulations (clinical management of an emergency caesarean section and a postpartum haemorrhage scenario) were conducted in teams of 10 in the ISS versus the OSS setting. Primary outcome Knowledge assessed by a multiple choice question test. Exploratory outcomes Individual outcomes: scores on the Safety Attitudes Questionnaire, stress measurements (State-Trait Anxiety Inventory, cognitive appraisal and salivary cortisol), Intrinsic Motivation Inventory and perceptions of simulations. Team outcome: video assessment of team performance. Organisational impact: suggestions for organisational changes. Results The trial was conducted from April to June 2013. No differences between the two groups were found for the multiple choice question test, patient safety attitude, stress measurements, motivation or the evaluation of the simulations. The participants in the ISS group scored the authenticity of the simulation significantly higher than did the participants in the OSS group. Expert video assessment of team performance showed no differences between the ISS versus the OSS group. The ISS group provided more ideas and suggestions for changes at the organisational level. Conclusions In this randomised trial, no significant differences were found regarding knowledge, patient safety attitude, motivation or stress

  5. On the origin of reproducible sequential activity in neural circuits

    Science.gov (United States)

    Afraimovich, V. S.; Zhigulin, V. P.; Rabinovich, M. I.

    2004-12-01

    Robustness and reproducibility of sequential spatio-temporal responses are an essential feature of many neural circuits in sensory and motor systems of animals. The most common mathematical images of dynamical regimes in neural systems are fixed points, limit cycles, chaotic attractors, and continuous attractors (attractive manifolds of neutrally stable fixed points). These are not suitable for the description of reproducible transient sequential neural dynamics. In this paper we present the concept of a stable heteroclinic sequence (SHS), which is not an attractor. SHS opens the way for understanding and modeling of transient sequential activity in neural circuits. We show that this new mathematical object can be used to describe robust and reproducible sequential neural dynamics. Using the framework of a generalized high-dimensional Lotka-Volterra model, that describes the dynamics of firing rates in an inhibitory network, we present analytical results on the existence of the SHS in the phase space of the network. With the help of numerical simulations we confirm its robustness in the presence of noise in spite of the transient nature of the corresponding trajectories. Finally, by referring to several recent neurobiological experiments, we discuss possible applications of this new concept to several problems in neuroscience.
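
    A minimal numerical sketch of the winnerless-competition dynamics behind such a sequence is given below: a three-unit generalized Lotka-Volterra rate model with asymmetric, cyclic inhibition, in which the momentarily dominant unit switches in a fixed order. The parameter values are illustrative choices that produce an attracting heteroclinic cycle; they are not taken from the paper.

```python
# Three-unit generalized Lotka-Volterra rate model with asymmetric inhibition.
# Activity passes sequentially from unit 0 to 1 to 2 and back to 0.
import numpy as np
from scipy.integrate import solve_ivp

rho = np.array([[1.0, 1.5, 0.7],      # cyclic, asymmetric inhibition matrix
                [0.7, 1.0, 1.5],      # (contraction 0.5 > expansion 0.3, so the
                [1.5, 0.7, 1.0]])     #  heteroclinic cycle is attracting)

def glv(t, a, sigma=1.0):
    return a * (sigma - rho @ a)

sol = solve_ivp(glv, t_span=(0.0, 300.0), y0=[0.6, 0.05, 0.02], max_step=0.1)

# Index of the momentarily dominant unit cycles 0 -> 1 -> 2 -> 0 -> ...
dominant = np.argmax(sol.y, axis=0)
switches = dominant[np.flatnonzero(np.diff(dominant)) + 1]
print("order of dominant units:", switches)
```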

  6. Exploring the sequential lineup advantage using WITNESS.

    Science.gov (United States)

    Goodsell, Charles A; Gronlund, Scott D; Carlson, Curt A

    2010-12-01

    Advocates claim that the sequential lineup is an improvement over simultaneous lineup procedures, but no formal (quantitatively specified) explanation exists for why it is better. The computational model WITNESS (Clark, Appl Cogn Psychol 17:629-654, 2003) was used to develop theoretical explanations for the sequential lineup advantage. In its current form, WITNESS produced a sequential advantage only by pairing conservative sequential choosing with liberal simultaneous choosing. However, this combination failed to approximate four extant experiments that exhibited large sequential advantages. Two of these experiments became the focus of our efforts because the data were uncontaminated by likely suspect position effects. Decision-based and memory-based modifications to WITNESS approximated the data and produced a sequential advantage. The next step is to evaluate the proposed explanations and modify public policy recommendations accordingly.

  7. Sequential lineup presentation: Patterns and policy

    OpenAIRE

    Lindsay, R C L; Mansour, Jamal K; Beaudry, J L; Leach, A-M; Bertrand, M I

    2009-01-01

    Sequential lineups were offered as an alternative to the traditional simultaneous lineup. Sequential lineups reduce incorrect lineup selections; however, the accompanying loss of correct identifications has resulted in controversy regarding adoption of the technique. We discuss the procedure and research relevant to (1) the pattern of results found using sequential versus simultaneous lineups; (2) reasons (theory) for differences in witness responses; (3) two methodological issues; and (4) im...

  8. Assessment and implication of prognostic imbalance in randomized controlled trials with a binary outcome--a simulation study.

    Directory of Open Access Journals (Sweden)

    Rong Chu

    Full Text Available Chance imbalance in baseline prognosis of a randomized controlled trial can lead to over or underestimation of treatment effects, particularly in trials with small sample sizes. Our study aimed to (1) evaluate the probability of imbalance in a binary prognostic factor (PF) between two treatment arms, (2) investigate the impact of prognostic imbalance on the estimation of a treatment effect, and (3) examine the effect of sample size (n) in relation to the first two objectives. We simulated data from parallel-group trials evaluating a binary outcome by varying the risk of the outcome, effect of the treatment, power and prevalence of the PF, and n. Logistic regression models with and without adjustment for the PF were compared in terms of bias, standard error, coverage of confidence interval and statistical power. For a PF with a prevalence of 0.5, the probability of a difference in the frequency of the PF≥5% reaches 0.42 with 125/arm. Ignoring a strong PF (relative risk = 5) leads to underestimating the strength of a moderate treatment effect, and the underestimate is independent of n when n is >50/arm. Adjusting for such PF increases statistical power. If the PF is weak (RR = 2), adjustment makes little difference in statistical inference. Conditional on a 5% imbalance of a powerful PF, adjustment reduces the likelihood of large bias. If an absolute measure of imbalance ≥5% is deemed important, including 1000 patients/arm provides sufficient protection against such an imbalance. Two thousand patients/arm may provide an adequate control against large random deviations in treatment effect estimation in the presence of a powerful PF. The probability of prognostic imbalance in small trials can be substantial. Covariate adjustment improves estimation accuracy and statistical power, and hence should be performed when strong PFs are observed.

  9. Assessment and Implication of Prognostic Imbalance in Randomized Controlled Trials with a Binary Outcome – A Simulation Study

    Science.gov (United States)

    Chu, Rong; Walter, Stephen D.; Guyatt, Gordon; Devereaux, P. J.; Walsh, Michael; Thorlund, Kristian; Thabane, Lehana

    2012-01-01

    Background Chance imbalance in baseline prognosis of a randomized controlled trial can lead to over or underestimation of treatment effects, particularly in trials with small sample sizes. Our study aimed to (1) evaluate the probability of imbalance in a binary prognostic factor (PF) between two treatment arms, (2) investigate the impact of prognostic imbalance on the estimation of a treatment effect, and (3) examine the effect of sample size (n) in relation to the first two objectives. Methods We simulated data from parallel-group trials evaluating a binary outcome by varying the risk of the outcome, effect of the treatment, power and prevalence of the PF, and n. Logistic regression models with and without adjustment for the PF were compared in terms of bias, standard error, coverage of confidence interval and statistical power. Results For a PF with a prevalence of 0.5, the probability of a difference in the frequency of the PF≥5% reaches 0.42 with 125/arm. Ignoring a strong PF (relative risk = 5) leads to underestimating the strength of a moderate treatment effect, and the underestimate is independent of n when n is >50/arm. Adjusting for such PF increases statistical power. If the PF is weak (RR = 2), adjustment makes little difference in statistical inference. Conditional on a 5% imbalance of a powerful PF, adjustment reduces the likelihood of large bias. If an absolute measure of imbalance ≥5% is deemed important, including 1000 patients/arm provides sufficient protection against such an imbalance. Two thousand patients/arm may provide an adequate control against large random deviations in treatment effect estimation in the presence of a powerful PF. Conclusions The probability of prognostic imbalance in small trials can be substantial. Covariate adjustment improves estimation accuracy and statistical power, and hence should be performed when strong PFs are observed. PMID:22629322
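
    One iteration of such a simulation can be sketched as follows: generate a parallel-group trial with a binary outcome and a binary prognostic factor, then fit unadjusted and PF-adjusted logistic regressions and compare the treatment-effect estimates. The effect sizes and sample size below are illustrative, and a full study would repeat this over many simulated datasets.

```python
# Toy re-creation of one simulated trial: binary outcome, binary prognostic
# factor (PF), and a comparison of unadjusted vs PF-adjusted logistic
# regression estimates of the treatment effect. Illustrative parameters only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_per_arm = 125
treatment = np.repeat([0, 1], n_per_arm)
pf = rng.binomial(1, 0.5, size=2 * n_per_arm)            # PF prevalence 0.5

# True data-generating model on the log-odds scale: moderate treatment
# effect, strong prognostic factor.
logit = -1.0 + 0.7 * treatment + 1.6 * pf
outcome = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

unadjusted = sm.Logit(outcome, sm.add_constant(treatment)).fit(disp=0)
adjusted = sm.Logit(outcome, sm.add_constant(np.column_stack([treatment, pf]))).fit(disp=0)

print("unadjusted treatment log-OR:", round(unadjusted.params[1], 2))
print("adjusted treatment log-OR:  ", round(adjusted.params[1], 2))
```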

  10. Biased lineups: sequential presentation reduces the problem.

    Science.gov (United States)

    Lindsay, R C; Lea, J A; Nosworthy, G J; Fulford, J A; Hector, J; LeVan, V; Seabrook, C

    1991-12-01

    Biased lineups have been shown to increase significantly false, but not correct, identification rates (Lindsay, Wallbridge, & Drennan, 1987; Lindsay & Wells, 1980; Malpass & Devine, 1981). Lindsay and Wells (1985) found that sequential lineup presentation reduced false identification rates, presumably by reducing reliance on relative judgment processes. Five staged-crime experiments were conducted to examine the effect of lineup biases and sequential presentation on eyewitness recognition accuracy. Sequential lineup presentation significantly reduced false identification rates from fair lineups as well as from lineups biased with regard to foil similarity, instructions, or witness attire, and from lineups biased in all of these ways. The results support recommendations that police present lineups sequentially.

  11. Immediately sequential bilateral cataract surgery: advantages and disadvantages.

    Science.gov (United States)

    Singh, Ranjodh; Dohlman, Thomas H; Sun, Grace

    2017-01-01

    The number of cataract surgeries performed globally will continue to rise to meet the needs of an aging population. This increased demand will require healthcare systems and providers to find new surgical efficiencies while maintaining excellent surgical outcomes. Immediately sequential bilateral cataract surgery (ISBCS) has been proposed as a solution and is increasingly being performed worldwide. The purpose of this review is to discuss the advantages and disadvantages of ISBCS. When appropriate patient selection occurs and guidelines are followed, ISBCS is comparable with delayed sequential bilateral cataract surgery in long-term patient satisfaction, visual acuity and complication rates. In addition, the risk of bilateral postoperative endophthalmitis and concerns of poorer refractive outcomes have not been supported by the literature. ISBCS is cost-effective for the patient, healthcare payors and society, but current reimbursement models in many countries create significant financial barriers for facilities and surgeons. As demand for cataract surgery rises worldwide, ISBCS will become increasingly important as an alternative to delayed sequential bilateral cataract surgery. Advantages include potentially decreased wait times for surgery, patient convenience and cost savings for healthcare payors. Although they are comparable in visual acuity and complication rates, hurdles that prevent wide adoption include liability concerns as ISBCS is not an established standard of care, economic constraints for facilities and surgeons and inability to fine-tune intraocular lens selection in the second eye. Given these considerations, an open discussion regarding the advantages and disadvantages of ISBCS is important for appropriate patient selection.

  12. Random and cooperative sequential adsorption

    Science.gov (United States)

    Evans, J. W.

    1993-10-01

    Irreversible random sequential adsorption (RSA) on lattices, and continuum "car parking" analogues, have long received attention as models for reactions on polymer chains, chemisorption on single-crystal surfaces, adsorption in colloidal systems, and solid state transformations. Cooperative generalizations of these models (CSA) are sometimes more appropriate, and can exhibit richer kinetics and spatial structure, e.g., autocatalysis and clustering. The distribution of filled or transformed sites in RSA and CSA is not described by an equilibrium Gibbs measure. This is the case even for the saturation "jammed" state of models where the lattice or space cannot fill completely. However exact analysis is often possible in one dimension, and a variety of powerful analytic methods have been developed for higher dimensional models. Here we review the detailed understanding of asymptotic kinetics, spatial correlations, percolative structure, etc., which is emerging for these far-from-equilibrium processes.

  13. Sequential and simultaneous SLAR block adjustment. [spline function analysis for mapping

    Science.gov (United States)

    Leberl, F.

    1975-01-01

    Two sequential methods of planimetric SLAR (Side Looking Airborne Radar) block adjustment, with and without splines, and three simultaneous methods based on the principles of least squares are evaluated. A limited experiment with simulated SLAR images indicates that sequential block formation with splines followed by external interpolative adjustment is superior to the simultaneous methods such as planimetric block adjustment with similarity transformations. The use of the sequential block formation is recommended, since it represents an inexpensive tool for satisfactory point determination from SLAR images.

  14. Sequential transformation of the structural and thermodynamic parameters of the complex particles, combining covalent conjugate (sodium caseinate + maltodextrin) with polyunsaturated lipids stabilized by a plant antioxidant, in the simulated gastro-intestinal conditions in vitro.

    Science.gov (United States)

    Antipova, Anna S; Zelikina, Darya V; Shumilina, Elena A; Semenova, Maria G

    2016-10-01

    The present work is focused on the structural transformation of the complexes, formed between covalent conjugate (sodium caseinate + maltodextrin) and an equimass mixture of the polyunsaturated lipids (PULs): (soy phosphatidylcholine + triglycerides of flaxseed oil) stabilized by a plant antioxidant (an essential oil of clove buds), in the simulated conditions of the gastrointestinal tract. The conjugate was used here as a food-grade delivery vehicle for the PULs. The release of these PULs at each stage of the simulated digestion was estimated. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Simulation

    DEFF Research Database (Denmark)

    Gould, Derek A; Chalmers, Nicholas; Johnson, Sheena J

    2012-01-01

    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used.

  16. Research on parallel algorithm for sequential pattern mining

    Science.gov (United States)

    Zhou, Lijuan; Qin, Bai; Wang, Yu; Hao, Zhongxiao

    2008-03-01

    Sequential pattern mining is the mining of frequent sequences related to time or other orders from a sequence database. Its initial motivation was to discover regularities in customer purchasing over a period of time by finding the frequent sequences. In recent years, sequential pattern mining has become an important direction of data mining, and its application field is no longer confined to business databases, extending to new data sources such as the Web and to scientific fields such as DNA analysis. The data involved in sequential pattern mining are characterized by massive volume and distributed storage, and most existing sequential pattern mining algorithms do not consider these characteristics together. Addressing these traits and drawing on parallel-computing theory, this paper puts forward a new distributed parallel algorithm, SPP (Sequential Pattern Parallel). The algorithm follows the principle of pattern reduction and utilizes a divide-and-conquer strategy for parallelization. The first parallel task is to construct frequent item sets by applying the frequent-set concept and search-space partition theory, and the second task is to build frequent sequences using a depth-first search at each processor. The algorithm only needs to access the database twice and does not generate candidate sequences, which reduces access time and improves mining efficiency. Using a random data generation procedure and different information structures, this paper simulated the SPP algorithm in a concrete parallel environment and implemented the AprioriAll algorithm for comparison. The experiments demonstrate that, compared with AprioriAll, the SPP algorithm achieves excellent speedup and efficiency.
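
    At its core, any sequential pattern miner repeatedly counts the support of candidate patterns, i.e. how many sequences in the database contain the pattern as an order-preserving subsequence. The sketch below shows only this support-counting step on a toy database; the distributed partitioning and depth-first construction of the SPP algorithm itself are not reproduced.

```python
# Support counting for sequential pattern mining: how many sequences in the
# database contain the candidate pattern as an order-preserving subsequence.
def contains_subsequence(sequence, pattern):
    it = iter(sequence)
    return all(item in it for item in pattern)   # consumes the iterator, preserving order

def support(database, pattern):
    return sum(contains_subsequence(seq, pattern) for seq in database)

database = [
    ["bread", "milk", "beer"],
    ["bread", "beer"],
    ["milk", "bread", "beer"],
    ["milk", "beer"],
]
print(support(database, ["bread", "beer"]))   # 3 sequences contain bread ... beer in order
```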

  17. Fully vs. Sequentially Coupled Loads Analysis of Offshore Wind Turbines

    Energy Technology Data Exchange (ETDEWEB)

    Damiani, Rick; Wendt, Fabian; Musial, Walter; Finucane, Z.; Hulliger, L.; Chilka, S.; Dolan, D.; Cushing, J.; O' Connell, D.; Falk, S.

    2017-06-19

    The design and analysis methods for offshore wind turbines must consider the aerodynamic and hydrodynamic loads and response of the entire system (turbine, tower, substructure, and foundation) coupled to the turbine control system dynamics. Whereas a fully coupled (turbine and support structure) modeling approach is more rigorous, intellectual property concerns can preclude this approach. In fact, turbine control system algorithms and turbine properties are strictly guarded and often not shared. In many cases, a partially coupled analysis using separate tools and an exchange of reduced sets of data via sequential coupling may be necessary. In the sequentially coupled approach, the turbine and substructure designers will independently determine and exchange an abridged model of their respective subsystems to be used in their partners' dynamic simulations. Although the ability to achieve design optimization is sacrificed to some degree with a sequentially coupled analysis method, the central question here is whether this approach can deliver the required safety and how the differences in the results from the fully coupled method could affect the design. This work summarizes the scope and preliminary results of a study conducted for the Bureau of Safety and Environmental Enforcement aimed at quantifying differences between these approaches through aero-hydro-servo-elastic simulations of two offshore wind turbines on a monopile and jacket substructure.

  18. Trial Sequential Methods for Meta-Analysis

    Science.gov (United States)

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…

  19. Sequential lineup laps and eyewitness accuracy.

    Science.gov (United States)

    Steblay, Nancy K; Dietrich, Hannah L; Ryan, Shannon L; Raczynski, Jeanette L; James, Kali A

    2011-08-01

    Police practice of double-blind sequential lineups prompts a question about the efficacy of repeated viewings (laps) of the sequential lineup. Two laboratory experiments confirmed the presence of a sequential lap effect: an increase in witness lineup picks from the first to the second lap when the culprit was a stranger. The second lap produced more errors than correct identifications. In Experiment 2, lineup diagnosticity was significantly higher for sequential lineup procedures that employed a single lap rather than double laps. Witnesses who elected to view a second lap made significantly more errors than witnesses who chose to stop after one lap or those who were required to view two laps. Witnesses with prior exposure to the culprit did not exhibit a sequential lap effect.

  20. Multi-agent sequential hypothesis testing

    KAUST Repository

    Kim, Kwang-Ki K.

    2014-12-15

    This paper considers multi-agent sequential hypothesis testing and presents a framework for strategic learning in sequential games with explicit consideration of both temporal and spatial coordination. The associated Bayes risk functions explicitly incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well-defined value functions with respect to (a) the belief states for the case of conditional independent private noisy measurements that are also assumed to be independent identically distributed over time, and (b) the information states for the case of correlated private noisy measurements. A sequential investment game of strategic coordination and delay is also discussed as an application of the proposed strategic learning rules.

  1. Sequential Product of Quantum Effects: An Overview

    Science.gov (United States)

    Gudder, Stan

    2010-12-01

    This article presents an overview for the theory of sequential products of quantum effects. We first summarize some of the highlights of this relatively recent field of investigation and then provide some new results. We begin by discussing sequential effect algebras which are effect algebras endowed with a sequential product satisfying certain basic conditions. We then consider sequential products of (discrete) quantum measurements. We next treat transition effect matrices (TEMs) and their associated sequential product. A TEM is a matrix whose entries are effects and whose rows form quantum measurements. We show that TEMs can be employed for the study of quantum Markov chains. Finally, we prove some new results concerning TEMs and vector densities.

  2. Sequential boundaries approach in clinical trials with unequal allocation ratios

    Directory of Open Access Journals (Sweden)

    Ayatollahi Seyyed

    2006-01-01

    Abstract Background: In clinical trials, both unequal randomization designs and sequential analyses have ethical and economic advantages. In the single-stage design (SSD), however, if the sample size is not adjusted based on unequal randomization, the power of the trial will decrease, whereas with sequential analysis the power will always remain constant. Our aim was to compare the sequential boundaries approach with the SSD when the allocation ratio (R) was not equal. Methods: We evaluated the influence of R, the ratio of patients in the experimental group to the standard group, on the statistical properties of two-sided tests, including the two-sided single triangular test (TT), the double triangular test (DTT) and the SSD, by multiple simulations. The average sample size numbers (ASNs) and power (1 − β) were evaluated for all tests. Results: Our simulation study showed that choosing R = 2 instead of R = 1 increases the sample size of the SSD by 12% and the ASN of the TT and DTT by the same proportion. Moreover, when R = 2, compared to the adjusted SSD, using the TT or DTT recovers the well-known reductions of ASN observed when R = 1 compared to the SSD. In addition, when R = 2, compared to the SSD, using the TT and DTT yields smaller reductions of ASN than when R = 1, but maintains the power of the test at its planned value. Conclusion: This study indicates that when the allocation ratio is not equal among the treatment groups, sequential analysis could indeed serve as a compromise between ethicists, economists and statisticians.

  3. Causal mediation analysis with a binary outcome and multiple continuous or ordinal mediators: Simulations and application to an alcohol intervention

    OpenAIRE

    Nguyen, Trang Quynh; Webb-Vargas, Yenny; Koning, Ina M.; Stuart, Elizabeth A.

    2016-01-01

    We investigate a method to estimate the combined effect of multiple continuous/ordinal mediators on a binary outcome: 1) fit a structural equation model with probit link for the outcome and identity/probit link for continuous/ordinal mediators, 2) predict potential outcome probabilities, and 3) compute natural direct and indirect effects. Step 2 involves rescaling the latent continuous variable underlying the outcome to address residual mediator variance/covariance. We evaluate the estimation...

  4. Survival outcomes after radiation therapy for stage III non-small-cell lung cancer after adoption of computed tomography-based simulation.

    Science.gov (United States)

    Chen, Aileen B; Neville, Bridget A; Sher, David J; Chen, Kun; Schrag, Deborah

    2011-06-10

    Technical studies suggest that computed tomography (CT)-based simulation improves the therapeutic ratio for thoracic radiation therapy (TRT), although few studies have evaluated its use or impact on outcomes. We used the Surveillance, Epidemiology and End Results (SEER)-Medicare linked data to identify CT-based simulation for TRT among Medicare beneficiaries diagnosed with stage III non-small-cell lung cancer (NSCLC) between 2000 and 2005. Demographic and clinical factors associated with use of CT simulation were identified, and the impact of CT simulation on survival was analyzed by using Cox models and propensity score analysis. The proportion of patients treated with TRT who had CT simulation increased from 2.4% in 1994 to 34.0% in 2000 to 77.6% in 2005. Of the 5,540 patients treated with TRT from 2000 to 2005, 60.1% had CT simulation. Geographic variation was seen in rates of CT simulation, with lower rates in rural areas and in the South and West compared with those in the Northeast and Midwest. Patients treated with chemotherapy were more likely to have CT simulation (65.2% v 51.2%; adjusted odds ratio, 1.67; 95% CI, 1.48 to 1.88; P < .001) … CT simulation. Controlling for demographic and clinical characteristics, CT simulation was associated with lower risk of death (adjusted hazard ratio, 0.77; 95% CI, 0.73 to 0.82; P < .001) … CT simulation. CT-based simulation has been widely, although not uniformly, adopted for the treatment of stage III NSCLC and is associated with higher survival among patients receiving TRT.

  5. Boot cAMP: educational outcomes after 4 successive years of preparatory simulation-based training at onset of internship.

    Science.gov (United States)

    Fernandez, Gladys L; Page, David W; Coe, Nicholas P; Lee, Patrick C; Patterson, Lisa A; Skylizard, Loki; St Louis, Myron; Amaral, Marisa H; Wait, Richard B; Seymour, Neal E

    2012-01-01

    Preparatory training for new trainees beginning residency has been used by a variety of programs across the country. To improve the clinical orientation process for our new postgraduate year (PGY)-1 residents, we developed an intensive preparatory training curriculum inclusive of cognitive and procedural skills, training activities considered essential for early PGY-1 clinical management. We define our surgical PGY-1 Boot Camp as preparatory simulation-based training implemented at the onset of internship for introduction of skills necessary for basic surgical patient problem assessment and management. This orientation process includes exposure to simulated patient care encounters and technical skills training essential to new resident education. We report educational results of 4 successive years of Boot Camp training. Results were analyzed to determine if performance evidenced at onset of training was predictive of later educational outcomes. Learners were PGY-1 residents, in both categorical and preliminary positions, at our medium-sized surgical residency program. Over a 4-year period, from July 2007 to July 2010, all 30 PGY-1 residents starting surgical residency at our institution underwent specific preparatory didactic and skills training over a 9-week period. This consisted of mandatory weekly 1-hour and 3-hour sessions in the Simulation Center, representing a 4-fold increase in time in simulation laboratory training compared with the remainder of the year. Training occurred in 8 procedural skills areas (instrument use, knot-tying, suturing, laparoscopic skills, airway management, cardiopulmonary resuscitation, central venous catheter, and chest tube insertion) and in simulated patient care (shock, surgical emergencies, and respiratory, cardiac, and trauma management) using a variety of high- and low-tech simulation platforms. Faculty and senior residents served as instructors. All educational activities were structured to include preparatory materials

  6. The subtyping of primary aldosteronism by adrenal vein sampling: sequential blood sampling causes factitious lateralization.

    Science.gov (United States)

    Rossitto, Giacomo; Battistel, Michele; Barbiero, Giulio; Bisogni, Valeria; Maiolino, Giuseppe; Diego, Miotto; Seccia, Teresa M; Rossi, Gian Paolo

    2018-02-01

    The pulsatile secretion of adrenocortical hormones and a stress reaction occurring when starting adrenal vein sampling (AVS) can affect the selectivity and also the assessment of lateralization when sequential blood sampling is used. We therefore tested the hypothesis that a simulated sequential blood sampling could decrease the diagnostic accuracy of the lateralization index for identification of aldosterone-producing adenoma (APA), as compared with bilaterally simultaneous AVS. In 138 consecutive patients who underwent subtyping of primary aldosteronism, we compared the results obtained simultaneously bilaterally when starting AVS (t-15) and 15 min after (t0), with those gained with a simulated sequential right-to-left AVS technique (R ⇒ L) created by combining hormonal values obtained at t-15 and at t0. The concordance between simultaneously obtained values at t-15 and t0, and between simultaneously obtained values and values gained with the sequential R ⇒ L technique, was also assessed. We found a marked interindividual variability of lateralization index values in the patients with bilaterally selective AVS at both time points. However, overall the lateralization index simultaneously determined at t0 provided a more accurate identification of APA than the simulated sequential lateralization index (R ⇒ L) (P = 0.001). Moreover, regardless of which side was sampled first, the sequential AVS technique induced a sequence-dependent overestimation of the lateralization index. While in APA patients the concordance between simultaneous AVS at t0 and t-15 and between the simultaneous t0 and sequential techniques was moderate-to-good (K = 0.55 and 0.66, respectively), in non-APA patients it was poor (K = 0.12 and 0.13, respectively). Sequential AVS generates factitious between-sides gradients, which lower its diagnostic accuracy, likely because of the stress reaction arising upon starting AVS.

  7. Multilevel sequential Monte Carlo samplers

    KAUST Repository

    Beskos, Alexandros; Jasra, Ajay; Law, Kody; Tempone, Raul; Zhou, Yan

    2016-01-01

    In this article we consider the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretization levels ∞ > h_0 > h_1 > ⋯ > h_L. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context, that is, relative to exact sampling and Monte Carlo for the distribution at the finest level h_L. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.
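    The multilevel construction mentioned here rests on the standard MLMC telescoping identity; writing η_l for the distribution at discretization level h_l and g for the quantity of interest (notation assumed for illustration), it reads

    \[
    \mathbb{E}_{\eta_L}[g] \;=\; \mathbb{E}_{\eta_0}[g] \;+\; \sum_{l=1}^{L}\Big(\mathbb{E}_{\eta_l}[g] - \mathbb{E}_{\eta_{l-1}}[g]\Big),
    \]

    where each increment is estimated from samples coupled across adjacent levels; the SMC variant replaces exact (i.i.d.) sampling of each η_l with particle approximations.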

  8. Multilevel sequential Monte Carlo samplers

    KAUST Repository

    Beskos, Alexandros

    2016-08-29

    In this article we consider the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretization levels ∞ > h_0 > h_1 > ⋯ > h_L. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context, that is, relative to exact sampling and Monte Carlo for the distribution at the finest level h_L. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.

  9. Sequential Scintigraphy in Renal Transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Winkel, K. zum; Harbst, H.; Schenck, P.; Franz, H. E.; Ritz, E.; Roehl, L.; Ziegler, M.; Ammann, W.; Maier-Borst, W. [Institut Fuer Nuklearmedizin, Deutsches Krebsforschungszentrum, Heidelberg, Federal Republic of Germany (Germany)

    1969-05-15

    Based on experience gained from more than 1600 patients with proved or suspected kidney diseases and on results of extended studies with dogs, sequential scintigraphy was performed after renal transplantation in dogs. After intravenous injection of 500 μCi of ¹³¹I-Hippuran, scintiphotos were taken during the first minute with an exposure time of 15 sec each and thereafter with an exposure of 2 min up to at least 16 min. Several examinations were evaluated digitally. 26 examinations were performed on 11 dogs with homotransplanted kidneys. Immediately after transplantation the renal function was almost normal and the bladder was filled in due time. At the beginning of rejection the initial uptake of radioactive Hippuran was reduced. The intrarenal transport became delayed; probably the renal extraction rate decreased. Corresponding to the development of an oedema in the transplant, the uptake area increased in size. In cases of thrombosis of the main artery there was no evidence of any uptake of radioactivity in the transplant. Similar results were obtained in 41 examinations on 15 persons. Patients with postoperative anuria due to acute tubular necrosis still showed some uptake of radioactivity, contrary to those with thrombosis of the renal artery, where no uptake was found. In cases of rejection the most frequent signs were a reduced initial uptake and a delayed intrarenal transport of radioactive Hippuran. Infarction could be detected by a reduced uptake in distinct areas of the transplant. (author)

  10. Sequential provisional implant prosthodontics therapy.

    Science.gov (United States)

    Zinner, Ira D; Markovits, Stanley; Jansen, Curtis E; Reid, Patrick E; Schnader, Yale E; Shapiro, Herbert J

    2012-01-01

    The fabrication and long-term use of first- and second-stage provisional implant prostheses is critical to create a favorable prognosis for function and esthetics of a fixed-implant supported prosthesis. The fixed metal and acrylic resin cemented first-stage prosthesis, as reviewed in Part I, is needed for prevention of adjacent and opposing tooth movement, pressure on the implant site as well as protection to avoid micromovement of the freshly placed implant body. The second-stage prosthesis, reviewed in Part II, should be used following implant uncovering and abutment installation. The patient wears this provisional prosthesis until maturation of the bone and healing of soft tissues. The second-stage provisional prosthesis is also a fail-safe mechanism for possible early implant failures and also can be used with late failures and/or for the necessity to repair the definitive prosthesis. In addition, the screw-retained provisional prosthesis is used if and when an implant requires removal or other implants are to be placed as in a sequential approach. The creation and use of both first- and second-stage provisional prostheses involve a restorative dentist, dental technician, surgeon, and patient to work as a team. If the dentist alone cannot do diagnosis and treatment planning, surgery, and laboratory techniques, he or she needs help by employing the expertise of a surgeon and a laboratory technician. This team approach is essential for optimum results.

  11. Compound and Sequential Digital Exclusion : Internet Skills, Uses, and Outcomes

    NARCIS (Netherlands)

    van Deursen, Alexander Johannes Aloysius Maria; Helsper, Ellen; Eynon, Rebecca; van Dijk, Johannes A.G.M.

    2016-01-01

    Through a survey with a representative sample of Dutch Internet users, this paper examines (1) compound digital exclusion, that is, whether a person who lacks a particular type of digital skill also lacks another kind of skill; whether a person who does not engage in a particular way with the

  12. Tradable permit allocations and sequential choice

    Energy Technology Data Exchange (ETDEWEB)

    MacKenzie, Ian A. [Centre for Economic Research, ETH Zuerich, Zurichbergstrasse 18, 8092 Zuerich (Switzerland)

    2011-01-15

    This paper investigates initial allocation choices in an international tradable pollution permit market. For two sovereign governments, we compare allocation choices that are either simultaneously or sequentially announced. We show sequential allocation announcements result in higher (lower) aggregate emissions when announcements are strategic substitutes (complements). Whether allocation announcements are strategic substitutes or complements depends on the relationship between the follower's damage function and governments' abatement costs. When the marginal damage function is relatively steep (flat), allocation announcements are strategic substitutes (complements). For quadratic abatement costs and damages, sequential announcements provide a higher level of aggregate emissions. (author)

  13. Sequential Generalized Transforms on Function Space

    Directory of Open Access Journals (Sweden)

    Jae Gil Choi

    2013-01-01

    We define two sequential transforms on a function space C_{a,b}[0,T] induced by a generalized Brownian motion process. We then establish the existence of the sequential transforms for functionals in a Banach algebra of functionals on C_{a,b}[0,T]. We also establish that either of these transforms acts as an inverse of the other. Finally, we give some remarks about certain relations between our sequential transforms and other well-known transforms on C_{a,b}[0,T].

  14. A Bayesian sequential design using alpha spending function to control type I error.

    Science.gov (United States)

    Zhu, Han; Yu, Qingzhao

    2017-10-01

    We propose in this article a Bayesian sequential design using alpha spending functions to control the overall type I error in phase III clinical trials. We provide algorithms to calculate critical values, power, and sample sizes for the proposed design. Sensitivity analysis is implemented to check the effects of different prior distributions, and conservative priors are recommended. We compare the power and actual sample sizes of the proposed Bayesian sequential design with different alpha spending functions through simulations. We also compare the power of the proposed method with that of a frequentist sequential design using the same alpha spending function. Simulations show that, at the same sample size, the proposed method provides larger power than the corresponding frequentist sequential design. It also has larger power than a traditional Bayesian sequential design that sets equal critical values for all interim analyses. Compared with other alpha spending functions, the O'Brien-Fleming alpha spending function has the largest power and is the most conservative in the sense that, at the same sample size, the null hypothesis is least likely to be rejected at the early stages of the trial. Finally, we show that adding a stopping rule for futility to the Bayesian sequential design can reduce the overall type I error and the actual sample sizes.
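    For reference, the alpha spending idea that the design adapts can be sketched as follows; this computes cumulative and incremental type I error allowances at interim looks under the O'Brien-Fleming-type (Lan-DeMets) spending function, not the authors' Bayesian critical values:

        from scipy.stats import norm

        def obrien_fleming_spending(information_fractions, alpha=0.05):
            """Cumulative and incremental alpha spent at each interim analysis.

            Uses the O'Brien-Fleming-type spending function
            alpha*(t) = 2 * (1 - Phi(z_{alpha/2} / sqrt(t))) for 0 < t <= 1.
            """
            z = norm.ppf(1 - alpha / 2)
            cumulative = [2 * (1 - norm.cdf(z / t ** 0.5)) for t in information_fractions]
            incremental = [cumulative[0]] + [
                cumulative[i] - cumulative[i - 1] for i in range(1, len(cumulative))
            ]
            return cumulative, incremental

        # Four equally spaced looks: almost no alpha is spent early, nearly all at the final analysis.
        print(obrien_fleming_spending([0.25, 0.5, 0.75, 1.0]))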

  15. Virtual Learning Simulations in High School: Effects on Cognitive and Non-cognitive Outcomes and Implications on the Development of STEM Academic and Career Choice

    Directory of Open Access Journals (Sweden)

    Malene Thisgaard

    2017-05-01

    The present study compared the value of a virtual learning simulation with that of traditional lessons on the topic of evolution, and investigated whether the virtual learning simulation could serve as a catalyst for STEM academic and career development, based on social cognitive career theory. The investigation was conducted using a crossover repeated-measures design based on a sample of 128 high school biology/biotech students. The results showed that the virtual learning simulation increased knowledge of evolution significantly compared to the traditional lesson. No significant differences between the simulation and the lesson were found in their ability to increase the non-cognitive measures. Both interventions increased self-efficacy significantly, and neither had a significant effect on motivation. In addition, the results showed that the simulation increased interest in biology-related tasks, but not outcome expectations. The findings suggest that virtual learning simulations are at least as efficient in enhancing learning and self-efficacy as traditional lessons, and high schools can thus use them as supplementary educational methods. In addition, the findings indicate that virtual learning simulations may be a useful tool in enhancing students’ interest in and goals toward STEM-related careers.

  16. Virtual Learning Simulations in High School: Effects on Cognitive and Non-cognitive Outcomes and Implications on the Development of STEM Academic and Career Choice.

    Science.gov (United States)

    Thisgaard, Malene; Makransky, Guido

    2017-01-01

    The present study compared the value of a virtual learning simulation with that of traditional lessons on the topic of evolution, and investigated whether the virtual learning simulation could serve as a catalyst for STEM academic and career development, based on social cognitive career theory. The investigation was conducted using a crossover repeated-measures design based on a sample of 128 high school biology/biotech students. The results showed that the virtual learning simulation increased knowledge of evolution significantly compared to the traditional lesson. No significant differences between the simulation and the lesson were found in their ability to increase the non-cognitive measures. Both interventions increased self-efficacy significantly, and neither had a significant effect on motivation. In addition, the results showed that the simulation increased interest in biology-related tasks, but not outcome expectations. The findings suggest that virtual learning simulations are at least as efficient in enhancing learning and self-efficacy as traditional lessons, and high schools can thus use them as supplementary educational methods. In addition, the findings indicate that virtual learning simulations may be a useful tool in enhancing students' interest in and goals toward STEM-related careers.

  17. Is it worth it to consider videogames in accounting education? A comparison of a simulation and a videogame in attributes, motivation and learning outcomes

    Directory of Open Access Journals (Sweden)

    Jordi Carenys

    2017-07-01

    The objective of this study is to assess the effectiveness of videogames in comparison to simulations in a higher education environment with regard to their attributes, motivation, and learning outcomes, three of the main dimensions that play a role in the effectiveness of digital game-based learning. Results show significant differences between the two tools in the attributes and motivation dimensions, while no significant differences were found for learning outcomes. This implies that although both instructional tools lead students to the desired level of knowledge acquisition, the motivation generated, together with the set of features provided by the games, complement each other, leading to a superior learning experience. These results support the inclusion of videogames as a complement to simulations in higher education accounting and business environments and allow us to propose a blended approach that provides the learner with the ‘best of both worlds’.

  18. Efficacy of premixed versus sequential administration of ...

    African Journals Online (AJOL)

    sequential administration in separate syringes on block characteristics, haemodynamic parameters, side effect profile and postoperative analgesic requirement. Trial design: This was a prospective, randomised clinical study. Method: Sixty orthopaedic patients scheduled for elective lower limb surgery under spinal ...

  19. Structural Consistency, Consistency, and Sequential Rationality.

    OpenAIRE

    Kreps, David M; Ramey, Garey

    1987-01-01

    Sequential equilibria comprise consistent beliefs and a sequentially rational strategy profile. Consistent beliefs are limits of Bayes rational beliefs for sequences of strategies that approach the equilibrium strategy. Beliefs are structurally consistent if they are rationalized by some single conjecture concerning opponents' strategies. Consistent beliefs are not necessarily structurally consistent, notwithstanding a claim by Kreps and Robert Wilson (1982). Moreover, the spirit of stru...

  20. The Sequential Probability Ratio Test: An efficient alternative to exact binomial testing for Clean Water Act 303(d) evaluation.

    Science.gov (United States)

    Chen, Connie; Gribble, Matthew O; Bartroff, Jay; Bay, Steven M; Goldstein, Larry

    2017-05-01

    The United States's Clean Water Act stipulates in section 303(d) that states must identify impaired water bodies for which total maximum daily loads (TMDLs) of pollution inputs into water bodies are developed. Decision-making procedures about how to list, or delist, water bodies as impaired, or not, per Clean Water Act 303(d) differ across states. In states such as California, whether or not a particular monitoring sample suggests that water quality is impaired can be regarded as a binary outcome variable, and California's current regulatory framework invokes a version of the exact binomial test to consolidate evidence across samples and assess whether the overall water body complies with the Clean Water Act. Here, we contrast the performance of California's exact binomial test with one potential alternative, the Sequential Probability Ratio Test (SPRT). The SPRT uses a sequential testing framework, testing samples as they become available and evaluating evidence as it emerges, rather than measuring all the samples and calculating a test statistic at the end of the data collection process. Through simulations and theoretical derivations, we demonstrate that the SPRT on average requires fewer samples to be measured to achieve Type I and Type II error rates comparable to those of the current fixed-sample binomial test. Policymakers might consider efficient alternatives such as the SPRT to the current procedure. Copyright © 2017 Elsevier Ltd. All rights reserved.
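    A minimal sketch of Wald's binomial SPRT illustrates the sequential decision rule; the exceedance probabilities and error rates below are illustrative placeholders, not California's regulatory parameters:

        import math

        def binomial_sprt(samples, p0, p1, alpha=0.05, beta=0.05):
            """Wald's sequential probability ratio test for a binary exceedance indicator.

            samples: iterable of 0/1 outcomes (1 = the sample exceeds the water-quality threshold).
            Tests H0: exceedance probability = p0 against H1: exceedance probability = p1 (p1 > p0).
            Returns ('accept H0' | 'accept H1' | 'continue', number of samples used).
            """
            upper = math.log((1 - beta) / alpha)   # crossing above -> accept H1 (impaired)
            lower = math.log(beta / (1 - alpha))   # crossing below -> accept H0 (not impaired)
            llr, n = 0.0, 0
            for n, x in enumerate(samples, start=1):
                llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
                if llr >= upper:
                    return "accept H1", n
                if llr <= lower:
                    return "accept H0", n
            return "continue", n

        # Example: 12 monitoring samples, 5 of which exceed the threshold.
        print(binomial_sprt([1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0], p0=0.10, p1=0.30))

    The test can stop as soon as a boundary is crossed, which is the source of the average sample size savings relative to the fixed-sample binomial test.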

  1. Heuristic and optimal policy computations in the human brain during sequential decision-making.

    Science.gov (United States)

    Korn, Christoph W; Bach, Dominik R

    2018-01-23

    Optimal decisions across extended time horizons require value calculations over multiple probabilistic future states. Humans may circumvent such complex computations by resorting to easy-to-compute heuristics that approximate optimal solutions. To probe the potential interplay between heuristic and optimal computations, we develop a novel sequential decision-making task, framed as virtual foraging in which participants have to avoid virtual starvation. Rewards depend only on final outcomes over five-trial blocks, necessitating planning over five sequential decisions and probabilistic outcomes. Here, we report model comparisons demonstrating that participants primarily rely on the best available heuristic but also use the normatively optimal policy. FMRI signals in medial prefrontal cortex (MPFC) relate to heuristic and optimal policies and associated choice uncertainties. Crucially, reaction times and dorsal MPFC activity scale with discrepancies between heuristic and optimal policies. Thus, sequential decision-making in humans may emerge from integration between heuristic and optimal policies, implemented by controllers in MPFC.

  2. Outcomes of a virtual-reality simulator-training programme on basic surgical skills in robot-assisted laparoscopic surgery.

    Science.gov (United States)

    Phé, Véronique; Cattarino, Susanna; Parra, Jérôme; Bitker, Marc-Olivier; Ambrogi, Vanina; Vaessen, Christophe; Rouprêt, Morgan

    2017-06-01

    The utility of the virtual-reality robotic simulator in training programmes has not been clearly evaluated. Our aim was to evaluate the impact of a virtual-reality robotic simulator-training programme on basic surgical skills. A simulator-training programme in robotic surgery, using the da Vinci Skills Simulator, was evaluated in a population including junior and seasoned surgeons, and non-physicians. Their performances on robotic dots and suturing-skin pod platforms before and after virtual-simulation training were rated anonymously by surgeons experienced in robotics. 39 participants were enrolled: 14 medical students and residents in surgery, 14 seasoned surgeons, 11 non-physicians. Junior and seasoned surgeons' performances on platforms were not significantly improved after virtual-reality robotic simulation in any of the skill domains, in contrast to non-physicians. The benefits of virtual-reality simulator training on several tasks to basic skills in robotic surgery were not obvious among surgeons in our initial and early experience with the simulator. Copyright © 2016 John Wiley & Sons, Ltd.

  3. A study of two sequential culture media - impact on embryo quality ...

    African Journals Online (AJOL)

    Abstract. Objective. A comparative study of embryo quality and pregnancy outcome between Sydney IVF medium and. Quinn's Advantage sequential culture media. Design. A prospective randomised controlled trial and a retrospective study. Setting. In vitro fertilisation clinic in an academic research environment. Patients.

  4. Effect of battery longevity on costs and health outcomes associated with cardiac implantable electronic devices: a Markov model-based Monte Carlo simulation.

    Science.gov (United States)

    Schmier, Jordana K; Lau, Edmund C; Patel, Jasmine D; Klenk, Juergen A; Greenspon, Arnold J

    2017-11-01

    The effects of device and patient characteristics on health and economic outcomes in patients with cardiac implantable electronic devices (CIEDs) are unclear. Modeling can estimate costs and outcomes for patients with CIEDs under a variety of scenarios, varying battery longevity, comorbidities, and care settings. The objective of this analysis was to compare changes in patient outcomes and payer costs attributable to increases in battery life of implantable cardiac defibrillators (ICDs) and cardiac resynchronization therapy defibrillators (CRT-D). We developed a Monte Carlo Markov model simulation to follow patients through primary implant, postoperative maintenance, generator replacement, and revision states. Patients were simulated in 3-month increments for 15 years or until death. Key variables included Charlson Comorbidity Index, CIED type, legacy versus extended battery longevity, mortality rates (procedure and all-cause), infection and non-infectious complication rates, and care settings. Costs included procedure-related (facility and professional), maintenance, and infections and non-infectious complications, all derived from Medicare data (2004-2014, 5% sample). Outcomes included counts of battery replacements, revisions, infections and non-infectious complications, and discounted (3%) costs and life years. An increase in battery longevity in ICDs yielded reductions in numbers of revisions (by 23%), battery changes (by 44%), infections (by 23%), non-infectious complications (by 10%), and total costs per patient (by 9%). Analogous reductions for CRT-Ds were 23% (revisions), 32% (battery changes), 22% (infections), 8% (complications), and 10% (costs). Based on modeling results, as battery longevity increases, patients experience fewer adverse outcomes and healthcare costs are reduced. Understanding the magnitude of the cost benefit of extended battery life can inform budgeting and planning decisions by healthcare providers and insurers.
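    A stripped-down version of this kind of patient-level Monte Carlo Markov simulation (quarterly cycles; the transition probabilities, costs, and battery lives below are hypothetical placeholders, not the authors' Medicare-derived inputs, and only discounted costs are tracked) might look like:

        import random

        # Hypothetical per-3-month probabilities and costs (illustrative placeholders only).
        P_DEATH = 0.01                 # probability of death per cycle
        P_COMPLICATION = 0.005         # probability of an infection/complication per cycle
        COST_MAINTENANCE = 150.0       # routine follow-up cost per cycle
        COST_COMPLICATION = 8000.0     # cost of treating a complication
        COST_REPLACEMENT = 25000.0     # cost of a generator replacement procedure

        def simulate_patient(battery_life_years, horizon_years=15, discount=0.03):
            """Discounted cost for one simulated patient followed in 3-month cycles."""
            cost, cycles_per_year = 0.0, 4
            replace_every = int(battery_life_years * cycles_per_year)
            for cycle in range(int(horizon_years * cycles_per_year)):
                d = (1 + discount) ** (cycle / cycles_per_year)   # discount factor for this cycle
                if random.random() < P_DEATH:
                    break                                         # patient dies; follow-up ends
                cost += COST_MAINTENANCE / d
                if random.random() < P_COMPLICATION:
                    cost += COST_COMPLICATION / d
                if cycle > 0 and cycle % replace_every == 0:
                    cost += COST_REPLACEMENT / d                  # battery depleted: replace generator
            return cost

        # Compare mean discounted costs for a legacy (6-year) vs. extended (9-year) battery.
        random.seed(1)
        legacy = sum(simulate_patient(6) for _ in range(10000)) / 10000
        extended = sum(simulate_patient(9) for _ in range(10000)) / 10000
        print(round(legacy), round(extended))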

  5. Deciphering Intrinsic Inter-subunit Couplings that Lead to Sequential Hydrolysis of F 1 -ATPase Ring

    Science.gov (United States)

    Dai, Liqiang; Flechsig, Holger; Yu, Jin

    2017-10-01

    The rotary sequential hydrolysis of the metabolic machine F1-ATPase is a prominent feature revealing high coordination among the multiple chemical sites on the stator F1 ring, which also contributes to tight coupling between the chemical reaction and rotation of the central γ-shaft. High-speed AFM experiments discovered that the sequential hydrolysis was maintained on the F1 ring even in the absence of the γ rotor. To explore how this intrinsic sequential performance arises, we computationally investigated essential inter-subunit couplings on the hexameric ring of mitochondrial and bacterial F1. We first reproduced the sequential hydrolysis schemes as experimentally detected, by simulating tri-site ATP hydrolysis cycles on the F1 ring upon kinetically imposing inter-subunit couplings to substantially promote the release of the hydrolysis products. We found that it is key for certain ATP binding and hydrolysis events to facilitate the neighbor-site ADP and Pi release to support the sequential hydrolysis. The kinetically feasible couplings were then scrutinized through atomistic molecular dynamics simulations as well as coarse-grained simulations, in which we enforced targeted conformational changes for ATP binding or hydrolysis. Notably, we detected the asymmetrical neighbor-site opening that would facilitate ADP release upon the enforced ATP binding, and computationally captured the complete Pi release through charge hopping upon the enforced neighbor-site ATP hydrolysis. The ATP-hydrolysis-triggered Pi release revealed in the current TMD simulations confirms a recent prediction, made from statistical analyses of single-molecule experimental data, regarding the role ATP hydrolysis plays. Our studies therefore elucidate both the concerted chemical kinetics and the underlying structural dynamics of the inter-subunit couplings that lead to the rotary sequential hydrolysis of the F1 ring.

  6. Sequential optimization and reliability assessment method for metal forming processes

    International Nuclear Information System (INIS)

    Sahai, Atul; Schramm, Uwe; Buranathiti, Thaweepat; Chen Wei; Cao Jian; Xia, Cedric Z.

    2004-01-01

    Uncertainty is inevitable in any design process. The uncertainty could be due to variations in the geometry of the part, in material properties, or to a lack of knowledge about the phenomena being modeled. Deterministic design optimization does not take uncertainty into account, and worst-case assumptions lead to vastly overconservative designs. Probabilistic design, such as reliability-based design and robust design, offers tools for making robust and reliable decisions in the presence of uncertainty in the design process. Probabilistic design optimization often involves a double-loop procedure of optimization and iterative probabilistic assessment, which results in high computational demand. This demand can be reduced by replacing computationally intensive simulation models with less costly surrogate models and by employing the sequential optimization and reliability assessment (SORA) method. The SORA method uses a single-loop strategy with a series of cycles of deterministic optimization and reliability assessment; the deterministic optimization and the reliability assessment are decoupled in each cycle. This leads to rapid improvement of the design from one cycle to the next and increased computational efficiency. This paper demonstrates the effectiveness of the SORA method when applied to the design of a sheet metal flanging process. Surrogate models are used as less costly approximations to the computationally expensive finite element simulations.

  7. Causal mediation analysis with a binary outcome and multiple continuous or ordinal mediators: Simulations and application to an alcohol intervention.

    Science.gov (United States)

    Nguyen, Trang Quynh; Webb-Vargas, Yenny; Koning, Ina M; Stuart, Elizabeth A

    We investigate a method to estimate the combined effect of multiple continuous/ordinal mediators on a binary outcome: 1) fit a structural equation model with probit link for the outcome and identity/probit link for continuous/ordinal mediators, 2) predict potential outcome probabilities, and 3) compute natural direct and indirect effects. Step 2 involves rescaling the latent continuous variable underlying the outcome to address residual mediator variance/covariance. We evaluate the estimation of risk-difference- and risk-ratio-based effects (RDs, RRs) using the ML, WLSMV and Bayes estimators in Mplus. Across most variations in path-coefficient and mediator-residual-correlation signs and strengths, and confounding situations investigated, the method performs well with all estimators, but favors ML/WLSMV for RDs with continuous mediators, and Bayes for RRs with ordinal mediators. Bayes outperforms WLSMV/ML regardless of mediator type when estimating RRs with small potential outcome probabilities and in two other special cases. An adolescent alcohol prevention study is used for illustration.
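    On the risk-difference scale, the natural direct and indirect effects computed in step 3 are the usual potential-outcome contrasts (generic notation for a binary treatment A, mediator vector M, and binary outcome Y; the notation is assumed for illustration, not taken from the paper):

    \[
    \mathrm{NDE} = P\{Y(1, M(0)) = 1\} - P\{Y(0, M(0)) = 1\},\qquad
    \mathrm{NIE} = P\{Y(1, M(1)) = 1\} - P\{Y(1, M(0)) = 1\},
    \]

    with the corresponding risk ratios obtained by taking ratios of the same potential outcome probabilities instead of differences.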

  8. Tinnitus after Simultaneous and Sequential Bilateral Cochlear Implantation.

    Science.gov (United States)

    Ramakers, Geerte G J; Kraaijenga, Véronique J C; Smulders, Yvette E; van Zon, Alice; Stegeman, Inge; Stokroos, Robert J; Free, Rolien H; Frijns, Johan H M; Huinck, Wendy J; Van Zanten, Gijsbert A; Grolman, Wilko

    2017-01-01

    There is an ongoing global discussion on whether or not bilateral cochlear implantation should be standard care for bilateral deafness. Contrary to unilateral cochlear implantation, however, little is known about the effect of bilateral cochlear implantation on tinnitus. To investigate tinnitus outcomes 1 year after bilateral cochlear implantation. Secondarily, to compare tinnitus outcomes between simultaneous and sequential bilateral cochlear implantation and to investigate long-term follow-up (3 years). This study is a secondary analysis as part of a multicenter randomized controlled trial. Thirty-eight postlingually deafened adults were included in the original trial, in which the presence of tinnitus was not an inclusion criterion. All participants received cochlear implants (CIs) because of profound hearing loss. Nineteen participants received bilateral CIs simultaneously and 19 participants received bilateral CIs sequentially with an inter-implant interval of 2 years. The prevalence and severity of tinnitus before and after simultaneous and sequential bilateral cochlear implantation were measured preoperatively and each year after implantation with the Tinnitus Handicap Inventory (THI) and Tinnitus Questionnaire (TQ). The prevalence of preoperative tinnitus was 42% (16/38). One year after bilateral implantation, there was a median difference of -8 (inter-quartile range (IQR): -28 to 4) in THI score and -9 (IQR: -17 to -9) in TQ score in the participants with preoperative tinnitus. Induction of tinnitus occurred in five participants, all in the simultaneous group, in the year after bilateral implantation. Although the preoperative and also the postoperative median THI and TQ scores were higher in the simultaneous group, the median difference scores were equal in both groups. In the simultaneous group, tinnitus scores fluctuated in the 3 years after implantation. In the sequential group, four patients had an additional benefit of the second CI: a total

  9. Tinnitus after Simultaneous and Sequential Bilateral Cochlear Implantation

    Directory of Open Access Journals (Sweden)

    Geerte G. J. Ramakers

    2017-11-01

    Importance: There is an ongoing global discussion on whether or not bilateral cochlear implantation should be standard care for bilateral deafness. Contrary to unilateral cochlear implantation, however, little is known about the effect of bilateral cochlear implantation on tinnitus. Objective: To investigate tinnitus outcomes 1 year after bilateral cochlear implantation; secondarily, to compare tinnitus outcomes between simultaneous and sequential bilateral cochlear implantation and to investigate long-term follow-up (3 years). Study design: This study is a secondary analysis as part of a multicenter randomized controlled trial. Methods: Thirty-eight postlingually deafened adults were included in the original trial, in which the presence of tinnitus was not an inclusion criterion. All participants received cochlear implants (CIs) because of profound hearing loss. Nineteen participants received bilateral CIs simultaneously and 19 participants received bilateral CIs sequentially with an inter-implant interval of 2 years. The prevalence and severity of tinnitus before and after simultaneous and sequential bilateral cochlear implantation were measured preoperatively and each year after implantation with the Tinnitus Handicap Inventory (THI) and Tinnitus Questionnaire (TQ). Results: The prevalence of preoperative tinnitus was 42% (16/38). One year after bilateral implantation, there was a median difference of −8 (inter-quartile range (IQR): −28 to 4) in THI score and −9 (IQR: −17 to −9) in TQ score in the participants with preoperative tinnitus. Induction of tinnitus occurred in five participants, all in the simultaneous group, in the year after bilateral implantation. Although the preoperative and also the postoperative median THI and TQ scores were higher in the simultaneous group, the median difference scores were equal in both groups. In the simultaneous group, tinnitus scores fluctuated in the 3 years after implantation. In the sequential group

  10. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    Science.gov (United States)

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.
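    A bare-bones illustration of a randomization-weighted regression of a patient-level outcome on cluster-level treatment indicators is sketched below for a prototypical SMART in which only non-responding clusters are re-randomized with probability 1/2 (inverse-probability weights of 2 for responders and 4 for non-responders); this is only a schematic of the weighting idea, not the authors' estimator, variance formulas, or sample size calculators:

        import numpy as np

        def weighted_ols(X, y, w):
            """Closed-form weighted least squares: solves (X' W X) b = X' W y."""
            W = np.diag(w)
            return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

        rng = np.random.default_rng(0)
        n_clusters, cluster_size = 40, 10

        # Cluster-level treatment path: first-stage A1, response status, second-stage A2.
        a1_c = rng.choice([-1, 1], n_clusters)
        resp_c = rng.random(n_clusters) < 0.5
        a2_c = np.where(resp_c, 0, rng.choice([-1, 1], n_clusters))   # only non-responders re-randomized
        w_c = np.where(resp_c, 2.0, 4.0)                              # inverse randomization probabilities

        # Expand to the patient level and simulate a continuous patient-level outcome.
        a1 = np.repeat(a1_c, cluster_size)
        a2 = np.repeat(a2_c, cluster_size)
        w = np.repeat(w_c, cluster_size)
        cluster_effect = np.repeat(rng.normal(scale=0.5, size=n_clusters), cluster_size)
        y = 1.0 + 0.5 * a1 + 0.3 * a2 + cluster_effect + rng.normal(size=n_clusters * cluster_size)

        X = np.column_stack([np.ones_like(y), a1, a2])
        print(weighted_ols(X, y, w))   # intercept and treatment coefficients (standard errors omitted)

    In practice the standard errors would need to account for clustering, and baseline covariates would enter as additional columns of X.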

  11. Sequential dependencies in magnitude scaling of loudness

    DEFF Research Database (Denmark)

    Joshi, Suyash Narendra; Jesteadt, Walt

    2013-01-01

    Ten normally hearing listeners used a programmable sone-potentiometer knob to adjust the level of a 1000-Hz sinusoid to match the loudness of numbers presented to them in a magnitude production task. Three different power-law exponents (0.15, 0.30, and 0.60) and a log-law with equal steps in dB were used to program the sone-potentiometer. The knob settings systematically influenced the form of the loudness function. Time series analysis was used to assess the sequential dependencies in the data, which increased with increasing exponent and were greatest for the log-law. It would be possible, therefore, to choose knob properties that minimized these dependencies. When the sequential dependencies were removed from the data, the slope of the loudness functions did not change, but the variability decreased. Sequential dependencies were only present when the level of the tone on the previous trial

  12. Smoking cessation treatment and outcomes patterns simulation: a new framework for evaluating the potential health and economic impact of smoking cessation interventions.

    Science.gov (United States)

    Getsios, Denis; Marton, Jenő P; Revankar, Nikhil; Ward, Alexandra J; Willke, Richard J; Rublee, Dale; Ishak, K Jack; Xenakis, James G

    2013-09-01

    Most existing models of smoking cessation treatments have considered a single quit attempt when modelling long-term outcomes. To develop a model to simulate smokers over their lifetimes accounting for multiple quit attempts and relapses which will allow for prediction of the long-term health and economic impact of smoking cessation strategies. A discrete event simulation (DES) that models individuals' life course of smoking behaviours, attempts to quit, and the cumulative impact on health and economic outcomes was developed. Each individual is assigned one of the available strategies used to support each quit attempt; the outcome of each attempt, time to relapses if abstinence is achieved, and time between quit attempts is tracked. Based on each individual's smoking or abstinence patterns, the risk of developing diseases associated with smoking (chronic obstructive pulmonary disease, lung cancer, myocardial infarction and stroke) is determined and the corresponding costs, changes to mortality, and quality of life assigned. Direct costs are assessed from the perspective of a comprehensive US healthcare payer ($US, 2012 values). Quit attempt strategies that can be evaluated in the current simulation include unassisted quit attempts, brief counselling, behavioural modification therapy, nicotine replacement therapy, bupropion, and varenicline, with the selection of strategies and time between quit attempts based on equations derived from survey data. Equations predicting the success of quit attempts as well as the short-term probability of relapse were derived from five varenicline clinical trials. Concordance between the five trials and predictions from the simulation on abstinence at 12 months was high, indicating that the equations predicting success and relapse in the first year following a quit attempt were reliable. Predictions allowing for only a single quit attempt versus unrestricted attempts demonstrate important differences, with the single quit attempt
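    The core bookkeeping of such a discrete event simulation (waiting times between quit attempts, attempt success, and relapse) can be sketched with hypothetical rates; the parameters below are illustrative placeholders, not the published model's equations:

        import random

        # Hypothetical parameters (illustrative placeholders only).
        P_QUIT_SUCCESS = 0.25        # probability that a quit attempt achieves initial abstinence
        MEAN_RELAPSE_YEARS = 2.0     # mean time to relapse after a successful attempt
        MEAN_GAP_YEARS = 1.5         # mean time between quit attempts while smoking

        def simulate_smoker(horizon_years=40.0):
            """Total abstinent years over the horizon for one simulated individual."""
            t, abstinent_years = 0.0, 0.0
            while t < horizon_years:
                t += random.expovariate(1.0 / MEAN_GAP_YEARS)      # wait until the next quit attempt
                if t >= horizon_years:
                    break
                if random.random() < P_QUIT_SUCCESS:
                    spell = random.expovariate(1.0 / MEAN_RELAPSE_YEARS)
                    spell = min(spell, horizon_years - t)          # censor the spell at the horizon
                    abstinent_years += spell
                    t += spell                                     # relapse (or horizon) ends the spell
            return abstinent_years

        random.seed(2)
        mean_abstinence = sum(simulate_smoker() for _ in range(5000)) / 5000
        print(round(mean_abstinence, 2), "abstinent years per smoker on average")

    A full model of this kind would additionally attach a treatment strategy to each attempt, track smoking-related disease risks, costs, and quality of life, and allow the success and relapse rates to depend on patient characteristics.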

  13. The finite sample performance of estimators for mediation analysis under sequential conditional independence

    DEFF Research Database (Denmark)

    Huber, Martin; Lechner, Michael; Mellace, Giovanni

    Using a comprehensive simulation study based on empirical data, this paper investigates the finite sample properties of different classes of parametric and semi-parametric estimators of (natural) direct and indirect causal effects used in mediation analysis under sequential conditional independence...

  14. The finite sample performance of estimators for mediation analysis under sequential conditional independence

    DEFF Research Database (Denmark)

    Huber, Martin; Lechner, Michael; Mellace, Giovanni

    2016-01-01

    Using a comprehensive simulation study based on empirical data, this paper investigates the finite sample properties of different classes of parametric and semi-parametric estimators of (natural) direct and indirect causal effects used in mediation analysis under sequential conditional independence. … The performance of the methods often (but not always) varies with the features of the data generating process.

  15. A Comparison of Ultimate Loads from Fully and Sequentially Coupled Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wendt, Fabian F [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Damiani, Rick R [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-11-14

    This poster summarizes the scope and preliminary results of a study conducted for the Bureau of Safety and Environmental Enforcement aimed at quantifying differences between two modeling approaches (fully coupled and sequentially coupled) through aero-hydro-servo-elastic simulations of two offshore wind turbines on a monopile and jacket substructure.

  16. Dihydroazulene photoswitch operating in sequential tunneling regime

    DEFF Research Database (Denmark)

    Broman, Søren Lindbæk; Lara-Avila, Samuel; Thisted, Christine Lindbjerg

    2012-01-01

    to electrodes so that the electron transport goes by sequential tunneling. To assure weak coupling, the DHA switching kernel is modified by incorporating p-MeSC6H4 end-groups. Molecules are prepared by Suzuki cross-couplings on suitable halogenated derivatives of DHA. The synthesis presents an expansion of our …, incorporating a p-MeSC6H4 anchoring group in one end, has been placed in a silver nanogap. Conductance measurements justify that transport through both DHA (high resistivity) and VHF (low resistivity) forms goes by sequential tunneling. The switching is fairly reversible and reenterable; after more than 20 ON...

  17. Asynchronous Operators of Sequential Logic Venjunction & Sequention

    CERN Document Server

    Vasyukevich, Vadim

    2011-01-01

    This book is dedicated to new mathematical instruments assigned for logical modeling of the memory of digital devices. The case in point is logic-dynamical operation named venjunction and venjunctive function as well as sequention and sequentional function. Venjunction and sequention operate within the framework of sequential logic. In a form of the corresponding equations, they organically fit analytical expressions of Boolean algebra. Thus, a sort of symbiosis is formed using elements of asynchronous sequential logic on the one hand and combinational logic on the other hand. So, asynchronous

  18. Ratio of geometric means to analyze continuous outcomes in meta-analysis: comparison to mean differences and ratio of arithmetic means using empiric data and simulation.

    Science.gov (United States)

    Friedrich, Jan O; Adhikari, Neill K J; Beyene, Joseph

    2012-07-30

    Meta-analyses pooling continuous outcomes can use mean differences (MD), standardized MD (MD in pooled standard deviation units, SMD), or the ratio of arithmetic means (RoM). Recently, the ratio of geometric means, with either ad hoc (RoGM(ad hoc)) or Taylor series (RoGM(Taylor)) methods for estimating variances, has been proposed as an alternative effect measure for skewed continuous data. Skewness is suggested for summary measures of clinical parameters that are restricted to positive values and have large coefficients of variation (CV). Our objective was to compare the performance characteristics of RoGM(ad hoc) and RoGM(Taylor) to those of MD, SMD, and RoM. We used empiric data from systematic reviews reporting continuous outcomes and selected from each the meta-analysis with the most trials (and at least 5) (Cochrane Database [2008, Issue 1]). We supplemented this with simulations conducted with representative parameters. Pooled results were calculated using each effect measure. Of the reviews, 232/5053 met the inclusion criteria. Empiric data and simulation showed that RoGM(ad hoc) exhibits more extreme treatment effects and greater heterogeneity than all other effect measures. Compared with MD, SMD, and RoM, RoGM(Taylor) exhibits similar treatment effects, more heterogeneity when CV ≤ 0.7, and less heterogeneity when CV > 0.7. In conclusion, RoGM(Taylor) may be considered for pooling continuous outcomes in meta-analysis when data are skewed, but RoGM(ad hoc) should not be used. However, clinicians' lack of familiarity with geometric means, combined with the acceptable performance characteristics of RoM in most situations, suggests that RoM may be the preferable ratio method for pooling continuous outcomes in meta-analysis. Copyright © 2012 John Wiley & Sons, Ltd.
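    When individual participant data are available, the log of the ratio of geometric means is simply a difference of mean logs with a straightforward variance; the ad hoc and Taylor-series variants compared here approximate these quantities from the reported arithmetic means and standard deviations (generic notation, not the paper's):

    \[
    \ln \mathrm{RoGM} \;=\; \frac{1}{n_T}\sum_{i=1}^{n_T}\ln x_{Ti} \;-\; \frac{1}{n_C}\sum_{j=1}^{n_C}\ln x_{Cj},
    \qquad
    \widehat{\operatorname{Var}}\big(\ln \mathrm{RoGM}\big) \;=\; \frac{s_{\ln T}^{2}}{n_T} + \frac{s_{\ln C}^{2}}{n_C},
    \]

    where s²_{ln T} and s²_{ln C} denote the sample variances of the log-transformed data in the treatment and control arms.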

  19. Integrating hinge axis approximation and the virtual facial simulation of prosthetic outcomes for treatment with CAD-CAM immediate dentures: A clinical report of a patient with microstomia.

    Science.gov (United States)

    Kuric, Katelyn M; Harris, Bryan T; Morton, Dean; Azevedo, Bruno; Lin, Wei-Shao

    2017-09-29

    This clinical report describes a digital workflow using extraoral digital photographs and volumetric datasets from cone beam computed tomography (CBCT) imaging to create a 3-dimensional (3D), virtual patient with photorealistic appearance. In a patient with microstomia, hinge axis approximation, diagnostic casts simulating postextraction alveolar ridge profile, and facial simulation of prosthetic treatment outcome were completed in a 3D, virtual environment. The approach facilitated the diagnosis, communication, and patient acceptance of the treatment of maxillary and mandibular computer-aided design and computer-aided manufacturing (CAD-CAM) of immediate dentures at increased occlusal vertical dimension. Copyright © 2017 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  20. Comparing brain white matter on sequential cranial ultrasound and MRI in very preterm infants

    Energy Technology Data Exchange (ETDEWEB)

    Leijser, Lara M.; Veen, Sylvia; Boer, Inge P. de; Walther, Frans J.; Wezel-Meijler, Gerda van [Leiden University Medical Center, Department of Pediatrics, Division of Neonatology, Albinusdreef 2, P.O. Box 9600, Leiden (Netherlands); Liauw, Lishya [Leiden University Medical Center, Department of Radiology, Division of Neuroradiology, Albinusdreef 2, P.O. Box 9600, Leiden (Netherlands)

    2008-09-15

    Periventricular white matter (WM) echodensities, frequently seen in preterm infants, can be associated with suboptimal neurodevelopment. Major WM injury is well detected on cranial ultrasound (cUS). cUS seems less sensitive for diffuse or more subtle WM injury. Our aim was to assess the value of cUS and magnetic resonance imaging (MRI) for evaluating WM changes and the predictive value of cUS and/or MRI findings for neurodevelopmental outcome in very preterm infants with normal to severely abnormal WM on sequential high-quality cUS. Very preterm infants (<32 weeks) who had sequential cUS and one MRI within the first three postnatal months were included. Periventricular WM on cUS and MRI was compared and correlated with neurodevelopmental outcome at 2 years corrected age. Forty preterm infants were studied; outcome data were available in 32. WM changes on sequential cUS were predictive of WM changes on MRI. Severely abnormal WM on cUS/MRI was predictive of adverse outcome, and normal-mildly abnormal WM of favorable outcome. Moderately abnormal WM on cUS/MRI was associated with variable outcome. Additional MRI slightly increased the predictive value of cUS in severe WM changes. Sequential cUS in preterm infants is reliable for detecting WM changes and predicting favorable and severely abnormal outcome. Conventional and diffusion-weighted MRI sequences before term equivalent age in very preterm infants, suggested on cUS to have mild to moderately abnormal WM, do not seem to be warranted. (orig.)

  1. Comparing brain white matter on sequential cranial ultrasound and MRI in very preterm infants

    International Nuclear Information System (INIS)

    Leijser, Lara M.; Veen, Sylvia; Boer, Inge P. de; Walther, Frans J.; Wezel-Meijler, Gerda van; Liauw, Lishya

    2008-01-01

    Periventricular white matter (WM) echodensities, frequently seen in preterm infants, can be associated with suboptimal neurodevelopment. Major WM injury is well detected on cranial ultrasound (cUS). cUS seems less sensitive for diffuse or more subtle WM injury. Our aim was to assess the value of cUS and magnetic resonance imaging (MRI) for evaluating WM changes and the predictive value of cUS and/or MRI findings for neurodevelopmental outcome in very preterm infants with normal to severely abnormal WM on sequential high-quality cUS. Very preterm infants (<32 weeks) who had sequential cUS and one MRI within the first three postnatal months were included. Periventricular WM on cUS and MRI was compared and correlated with neurodevelopmental outcome at 2 years corrected age. Forty preterm infants were studied; outcome data were available in 32. WM changes on sequential cUS were predictive of WM changes on MRI. Severely abnormal WM on cUS/MRI was predictive of adverse outcome, and normal-mildly abnormal WM of favorable outcome. Moderately abnormal WM on cUS/MRI was associated with variable outcome. Additional MRI slightly increased the predictive value of cUS in severe WM changes. Sequential cUS in preterm infants is reliable for detecting WM changes and predicting favorable and severely abnormal outcome. Conventional and diffusion-weighted MRI sequences before term equivalent age in very preterm infants, suggested on cUS to have mild to moderately abnormal WM, do not seem to be warranted. (orig.)

  2. Sodium bicarbonate supplementation delays neuromuscular fatigue without changes in performance outcomes during a basketball match simulation protocol.

    Science.gov (United States)

    Ansdell, Paul; Dekerle, Jeanne

    2017-10-10

    To investigate the development of neuromuscular fatigue during a basketball game simulation and ascertain whether sodium bicarbonate (NaHCO3) supplementation attenuates any neuromuscular fatigue that persists. Ten participants ingested 0.2 g/kg of NaHCO3 (or an equimolar placebo dosage of sodium chloride [NaCl]) 90 and 60 minutes prior to commencing a basketball game simulation (ALK-T vs PLA-T). Isometric maximal voluntary contractions of the knee extensors (MVIC) and potentiated high (100 Hz) and low (10 Hz) frequency doublet twitches were recorded before and after each match quarter for both trials. In addition, 15 m sprint times and layup completion (%) were recorded during each quarter. MVIC, 100 and 10 Hz twitch forces declined progressively in both trials (P < 0.05). A basketball simulation protocol induces a substantial amount of neuromuscular (reduction in knee extensor MVICs) and peripheral fatigue with a concomitant increase in 15 m sprint time over the protocol. NaHCO3 supplementation attenuated the rate of fatigue development by protecting contractile elements of the muscle fibres. This study provides coaches with information about the magnitude of fatigue induced by a simulated basketball game, and provides evidence of the efficacy of NaHCO3 in attenuating fatigue.

  3. Three-dimensional classical-ensemble modeling of non-sequential double ionization

    International Nuclear Information System (INIS)

    Haan, S.L.; Breen, L.; Tannor, D.; Panfili, R.; Ho, Phay J.; Eberly, J.H.

    2005-01-01

    Full text: We have been using 1d ensembles of classical two-electron atoms to simulate helium atoms that are exposed to pulses of intense laser radiation. In this talk we discuss the challenges in setting up a 3d classical ensemble that can mimic the quantum ground state of helium. We then report studies in which each one of 500,000 two-electron trajectories is followed in 3d through a ten-cycle (25 fs) 780 nm laser pulse. We examine double-ionization yield for various intensities, finding the familiar knee structure. We consider the momentum spread of outgoing electrons in directions both parallel and perpendicular to the direction of laser polarization, and find results that are consistent with experiment. We examine individual trajectories and recollision processes that lead to double ionization, considering the best phases of the laser cycle for recollision events and looking at the possible time delay between recollision and emergence. We also consider the number of recollision events, and find that multiple recollisions are common in the classical ensemble. We investigate which collisional processes lead to various final electron momenta. We conclude with comments regarding the ability of classical mechanics to describe non-sequential double ionization, and a quick summary of similarities and differences between 1d and 3d classical double ionization using energy-trajectory comparisons. Refs. 3 (author)

  4. The Effects of Exercising in Different Natural Environments on Psycho-Physiological Outcomes in Post-Menopausal Women: A Simulation Study

    Directory of Open Access Journals (Sweden)

    Mathew P. White

    2015-09-01

    Full Text Available The current study examined potential psycho-physiological benefits from exercising in simulated natural environments among a sample of post-menopausal women using a laboratory based protocol. Participants cycled on a stationary exercise bike for 15 min while facing either a blank wall (Control) or while watching one of three videos: Urban (Grey), Countryside (Green), Coast (Blue). Blood pressure, heart rate and affective responses were measured pre-post. Heart rate, affect, perceived exertion and time perception were also measured at 5, 10 and 15 min during exercise. Experience evaluation was measured at the end. Replicating most earlier findings, affective, but not physiological, outcomes were more positive for exercise in the simulated Green and, for the first time, Blue environment, compared to Control. Moreover, only the simulated Blue environment was associated with shorter perceived exercise duration than Control and participants were most willing to repeat exercise in the Blue setting. The current research extended earlier work by exploring the effects of “blue exercise” and by using a demographic with relatively low average levels of physical activity. That this sample of postmenopausal women were most willing to repeat a bout of exercise in a simulated Blue environment may be important for physical activity promotion in this cohort.

  5. The Effects of Exercising in Different Natural Environments on Psycho-Physiological Outcomes in Post-Menopausal Women: A Simulation Study

    Science.gov (United States)

    White, Mathew P.; Pahl, Sabine; Ashbullby, Katherine J.; Burton, Francesca; Depledge, Michael H.

    2015-01-01

    The current study examined potential psycho-physiological benefits from exercising in simulated natural environments among a sample of post-menopausal women using a laboratory based protocol. Participants cycled on a stationary exercise bike for 15 min while facing either a blank wall (Control) or while watching one of three videos: Urban (Grey), Countryside (Green), Coast (Blue). Blood pressure, heart rate and affective responses were measured pre-post. Heart rate, affect, perceived exertion and time perception were also measured at 5, 10 and 15 min during exercise. Experience evaluation was measured at the end. Replicating most earlier findings, affective, but not physiological, outcomes were more positive for exercise in the simulated Green and, for the first time, Blue environment, compared to Control. Moreover, only the simulated Blue environment was associated with shorter perceived exercise duration than Control and participants were most willing to repeat exercise in the Blue setting. The current research extended earlier work by exploring the effects of “blue exercise” and by using a demographic with relatively low average levels of physical activity. That this sample of postmenopausal women were most willing to repeat a bout of exercise in a simulated Blue environment may be important for physical activity promotion in this cohort. PMID:26404351

  6. Eyewitness confidence in simultaneous and sequential lineups: a criterion shift account for sequential mistaken identification overconfidence.

    Science.gov (United States)

    Dobolyi, David G; Dodson, Chad S

    2013-12-01

    Confidence judgments for eyewitness identifications play an integral role in determining guilt during legal proceedings. Past research has shown that confidence in positive identifications is strongly associated with accuracy. Using a standard lineup recognition paradigm, we investigated accuracy using signal detection and ROC analyses, along with the tendency to choose a face with both simultaneous and sequential lineups. We replicated past findings of reduced rates of choosing with sequential as compared to simultaneous lineups, but notably found an accuracy advantage in favor of simultaneous lineups. Moreover, our analysis of the confidence-accuracy relationship revealed two key findings. First, we observed a sequential mistaken identification overconfidence effect: despite an overall reduction in false alarms, confidence for false alarms that did occur was higher with sequential lineups than with simultaneous lineups, with no differences in confidence for correct identifications. This sequential mistaken identification overconfidence effect is an expected byproduct of the use of a more conservative identification criterion with sequential than with simultaneous lineups. Second, we found a steady drop in confidence for mistaken identifications (i.e., foil identifications and false alarms) from the first to the last face in sequential lineups, whereas confidence in and accuracy of correct identifications remained relatively stable. Overall, we observed that sequential lineups are both less accurate and produce higher confidence false identifications than do simultaneous lineups. Given the increasing prominence of sequential lineups in our legal system, our data argue for increased scrutiny and possibly a wholesale reevaluation of this lineup format. PsycINFO Database Record (c) 2013 APA, all rights reserved.
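
    As background for the ROC analyses mentioned above, the sketch below builds an empirical, confidence-based ROC for lineup identifications by sweeping a confidence criterion and accumulating hit and false-alarm rates, then integrates a partial area under the curve. The confidence scale, function name, and trapezoidal pAUC are assumptions for illustration, not the authors' exact analysis.

    ```python
    import numpy as np

    def lineup_roc(hit_conf, fa_conf, n_target_present, n_target_absent, levels=range(1, 8)):
        """Empirical confidence-based ROC for lineup data.

        hit_conf: confidence ratings for suspect IDs in target-present lineups
        fa_conf:  confidence ratings for innocent-suspect IDs in target-absent lineups
        Rates are cumulative ("an ID was made at or above this confidence level").
        """
        hit_conf, fa_conf = np.asarray(hit_conf), np.asarray(fa_conf)
        crit = sorted(levels, reverse=True)                   # strictest criterion first
        hr = np.r_[0.0, [(hit_conf >= c).sum() / n_target_present for c in crit]]
        far = np.r_[0.0, [(fa_conf >= c).sum() / n_target_absent for c in crit]]
        pauc = np.trapz(hr, far)                              # partial area under the ROC
        return far, hr, pauc
    ```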

  7. Comparative efficacy of simultaneous versus sequential multiple health behavior change interventions among adults: A systematic review of randomised trials.

    Science.gov (United States)

    James, Erica; Freund, Megan; Booth, Angela; Duncan, Mitch J; Johnson, Natalie; Short, Camille E; Wolfenden, Luke; Stacey, Fiona G; Kay-Lambkin, Frances; Vandelanotte, Corneel

    2016-08-01

    Growing evidence points to the benefits of addressing multiple health behaviors rather than single behaviors. This review evaluates the relative effectiveness of simultaneous and sequentially delivered multiple health behavior change (MHBC) interventions. Secondary aims were to identify: a) the most effective spacing of sequentially delivered components; b) differences in efficacy of MHBC interventions for adoption/cessation behaviors and lifestyle/addictive behaviors, and; c) differences in trial retention between simultaneously and sequentially delivered interventions. MHBC intervention trials published up to October 2015 were identified through a systematic search. Eligible trials were randomised controlled trials that directly compared simultaneous and sequential delivery of a MHBC intervention. A narrative synthesis was undertaken. Six trials met the inclusion criteria and across these trials the behaviors targeted were smoking, diet, physical activity, and alcohol consumption. Three trials reported a difference in intervention effect between a sequential and simultaneous approach in at least one behavioral outcome. Of these, two trials favoured a sequential approach on smoking. One trial favoured a simultaneous approach on fat intake. There was no difference in retention between sequential and simultaneous approaches. There is limited evidence regarding the relative effectiveness of sequential and simultaneous approaches. Given only three of the six trials observed a difference in intervention effectiveness for one health behavior outcome, and the relatively consistent finding that the sequential and simultaneous approaches were more effective than a usual/minimal care control condition, it appears that both approaches should be considered equally efficacious. PROSPERO registration number: CRD42015027876. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Impact of sequential disorder on the scaling behavior of airplane boarding time

    Science.gov (United States)

    Baek, Yongjoo; Ha, Meesoon; Jeong, Hawoong

    2013-05-01

    The airplane boarding process is an example where disorder properties of the system are relevant to the emergence of universality classes. Based on a simple model, we present a systematic analysis of finite-size effects in boarding time, and propose a comprehensive view of the role of sequential disorder in the scaling behavior of boarding time against the plane size. Using numerical simulations and mathematical arguments, we find how the scaling behavior depends on the number of seat columns and the range of sequential disorder. Our results show that new scaling exponents can arise as disorder is localized to varying extents.
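
    The following toy model illustrates the kind of simulation such studies rely on: a discrete-time, single-aisle boarding process in which each passenger blocks the aisle while stowing luggage. It is a minimal sketch with made-up parameters (one seat per row, a fixed stow time), not the specific model analyzed in the paper.

    ```python
    import random

    def boarding_time(seat_rows, stow_time=3):
        """Toy single-aisle boarding model. seat_rows[i] is the row of the i-th passenger
        in the queue. Each passenger advances one row per step when the aisle cell ahead
        is free, then blocks the aisle for `stow_time` steps at their row before sitting."""
        n = len(seat_rows)
        pos = [-(i + 1) for i in range(n)]          # queue positions outside the plane
        stow = [stow_time] * n
        seated = [False] * n
        t = 0
        while not all(seated):
            occupied = {pos[i] for i in range(n) if not seated[i]}
            for i in range(n):
                if seated[i]:
                    continue
                if pos[i] == seat_rows[i]:          # at own row: stow, then sit down
                    stow[i] -= 1
                    if stow[i] <= 0:
                        seated[i] = True
                        occupied.discard(pos[i])
                elif pos[i] + 1 not in occupied:    # aisle cell ahead is free: step forward
                    occupied.discard(pos[i])
                    pos[i] += 1
                    occupied.add(pos[i])
            t += 1
        return t

    # ordered (back-to-front) versus fully disordered boarding for a 30-row toy plane
    rows = list(range(30))
    print(boarding_time(sorted(rows, reverse=True)))
    print(boarding_time(random.sample(rows, len(rows))))
    ```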

  9. Bootstrap Sequential Determination of the Co-integration Rank in VAR Models

    DEFF Research Database (Denmark)

    Cavaliere, Giuseppe; Rahbek, Anders; Taylor, A.M. Robert

    with empirical rejection frequencies often very much in excess of the nominal level. As a consequence, bootstrap versions of these tests have been developed. To be useful, however, sequential procedures for determining the co-integrating rank based on these bootstrap tests need to be consistent, in the sense...... in the literature by proposing a bootstrap sequential algorithm which we demonstrate delivers consistent cointegration rank estimation for general I(1) processes. Finite sample Monte Carlo simulations show the proposed procedure performs well in practice....

  10. Sequential analysis of materials balances. Application to a prospective reprocessing facility

    International Nuclear Information System (INIS)

    Picard, R.

    1986-01-01

    This paper discusses near-real-time accounting in the context of the prospective DWK reprocessing plant. Sensitivity of a standard sequential testing procedure, applied to unfalsified operator data only, is examined with respect to a variety of loss scenarios. It is seen that large inventories preclude high-probability detection of certain protracted losses of material. In Sec. 2, a rough error propagation for the MBA of interest is outlined. Mathematical development for the analysis is given in Sec. 3, and generic aspects of sequential testing are reviewed in Sec. 4. In Sec. 5, results from a simulation to quantify performance of the accounting system are presented

  11. Interpretability degrees of finitely axiomatized sequential theories

    NARCIS (Netherlands)

    Visser, Albert

    In this paper we show that the degrees of interpretability of finitely axiomatized extensions-in-the-same-language of a finitely axiomatized sequential theory-like Elementary Arithmetic EA, IΣ1, or the Gödel-Bernays theory of sets and classes GB-have suprema. This partially answers a question posed

  12. Interpretability Degrees of Finitely Axiomatized Sequential Theories

    NARCIS (Netherlands)

    Visser, Albert

    2012-01-01

    In this paper we show that the degrees of interpretability of finitely axiomatized extensions-in-the-same-language of a finitely axiomatized sequential theory —like Elementary Arithmetic EA, IΣ1, or the Gödel-Bernays theory of sets and classes GB— have suprema. This partially answers a question

  13. S.M.P. SEQUENTIAL MATHEMATICS PROGRAM.

    Science.gov (United States)

    CICIARELLI, V; LEONARD, JOSEPH

    A SEQUENTIAL MATHEMATICS PROGRAM BEGINNING WITH THE BASIC FUNDAMENTALS ON THE FOURTH GRADE LEVEL IS PRESENTED. INCLUDED ARE AN UNDERSTANDING OF OUR NUMBER SYSTEM, AND THE BASIC OPERATIONS OF WORKING WITH WHOLE NUMBERS--ADDITION, SUBTRACTION, MULTIPLICATION, AND DIVISION. COMMON FRACTIONS ARE TAUGHT IN THE FIFTH, SIXTH, AND SEVENTH GRADES. A…

  14. Sequential and Simultaneous Logit: A Nested Model.

    NARCIS (Netherlands)

    van Ophem, J.C.M.; Schram, A.J.H.C.

    1997-01-01

    A nested model is presented which has both the sequential and the multinomial logit model as special cases. This model provides a simple test to investigate the validity of these specifications. Some theoretical properties of the model are discussed. In the analysis a distribution function is

  15. Sensitivity Analysis in Sequential Decision Models.

    Science.gov (United States)

    Chen, Qiushi; Ayer, Turgay; Chhatwal, Jagpreet

    2017-02-01

    Sequential decision problems are frequently encountered in medical decision making, which are commonly solved using Markov decision processes (MDPs). Modeling guidelines recommend conducting sensitivity analyses in decision-analytic models to assess the robustness of the model results against the uncertainty in model parameters. However, standard methods of conducting sensitivity analyses cannot be directly applied to sequential decision problems because this would require evaluating all possible decision sequences, typically in the order of trillions, which is not practically feasible. As a result, most MDP-based modeling studies do not examine confidence in their recommended policies. In this study, we provide an approach to estimate uncertainty and confidence in the results of sequential decision models. First, we provide a probabilistic univariate method to identify the most sensitive parameters in MDPs. Second, we present a probabilistic multivariate approach to estimate the overall confidence in the recommended optimal policy considering joint uncertainty in the model parameters. We provide a graphical representation, which we call a policy acceptability curve, to summarize the confidence in the optimal policy by incorporating stakeholders' willingness to accept the base case policy. For a cost-effectiveness analysis, we provide an approach to construct a cost-effectiveness acceptability frontier, which shows the most cost-effective policy as well as the confidence in that for a given willingness to pay threshold. We demonstrate our approach using a simple MDP case study. We developed a method to conduct sensitivity analysis in sequential decision models, which could increase the credibility of these models among stakeholders.
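
    To make the idea of a policy acceptability analysis concrete, here is a minimal sketch of probabilistic multivariate sensitivity analysis for a toy MDP: draw parameters from their priors, re-solve the MDP by value iteration, and record how often the base-case policy stays optimal. The 2-state, 2-action model, the Beta prior, and all numbers are hypothetical illustrations, not the case study from the paper.

    ```python
    import numpy as np

    def value_iteration(P, R, gamma=0.95, tol=1e-8):
        """P[a] is an (S x S) transition matrix, R[a] an (S,) reward vector for action a."""
        n_a, n_s = len(P), P[0].shape[0]
        V = np.zeros(n_s)
        while True:
            Q = np.array([R[a] + gamma * P[a] @ V for a in range(n_a)])
            V_new = Q.max(axis=0)
            if np.max(np.abs(V_new - V)) < tol:
                return V_new, Q.argmax(axis=0)
            V = V_new

    def sample_params(rng):
        # toy 2-state (well/ill), 2-action (wait/treat) MDP with one uncertain parameter
        p_success = rng.beta(8, 2)                                  # prior on treatment success
        P = [np.array([[0.9, 0.1], [0.2, 0.8]]),                    # action 0: wait
             np.array([[0.95, 0.05], [p_success, 1 - p_success]])]  # action 1: treat
        R = [np.array([1.0, 0.0]), np.array([0.9, -0.1])]
        return P, R

    def policy_acceptability(base_policy, n_draws=1000, seed=0):
        """Fraction of prior draws under which the base-case policy remains optimal."""
        rng = np.random.default_rng(seed)
        agree = sum(np.array_equal(value_iteration(*sample_params(rng))[1], base_policy)
                    for _ in range(n_draws))
        return agree / n_draws

    # base case: solve the MDP at the prior mean of the uncertain parameter (0.8)
    base_P = [np.array([[0.9, 0.1], [0.2, 0.8]]), np.array([[0.95, 0.05], [0.8, 0.2]])]
    base_R = [np.array([1.0, 0.0]), np.array([0.9, -0.1])]
    _, base_policy = value_iteration(base_P, base_R)
    print(policy_acceptability(base_policy))
    ```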

  16. Sequential models for coarsening and missingness

    NARCIS (Netherlands)

    Gill, R.D.; Robins, J.M.

    1997-01-01

    In a companion paper we described what intuitively would seem to be the most general possible way to generate Coarsening at Random mechanisms a sequential procedure called randomized monotone coarsening Counterexamples showed that CAR mechanisms exist which cannot be represented in this way Here we

  17. Sequential motor skill: cognition, perception and action

    NARCIS (Netherlands)

    Ruitenberg, M.F.L.

    2013-01-01

    Discrete movement sequences are assumed to be the building blocks of more complex sequential actions that are present in our everyday behavior. The studies presented in this dissertation address the (neuro)cognitive underpinnings of such movement sequences, in particular in relationship to the role

  18. Sequential decoders for large MIMO systems

    KAUST Repository

    Ali, Konpal S.; Abediseid, Walid; Alouini, Mohamed-Slim

    2014-01-01

    the Sequential Decoder using the Fano Algorithm for large MIMO systems. A parameter called the bias is varied to attain different performance-complexity trade-offs. Low values of the bias result in excellent performance but at the expense of high complexity

  19. A framework for sequential multiblock component methods

    NARCIS (Netherlands)

    Smilde, A.K.; Westerhuis, J.A.; Jong, S.de

    2003-01-01

    Multiblock or multiset methods are starting to be used in chemistry and biology to study complex data sets. In chemometrics, sequential multiblock methods are popular; that is, methods that calculate one component at a time and use deflation for finding the next component. In this paper a framework

  20. Classical and sequential limit analysis revisited

    Science.gov (United States)

    Leblond, Jean-Baptiste; Kondo, Djimédo; Morin, Léo; Remmal, Almahdi

    2018-04-01

    Classical limit analysis applies to ideal plastic materials, and within a linearized geometrical framework implying small displacements and strains. Sequential limit analysis was proposed as a heuristic extension to materials exhibiting strain hardening, and within a fully general geometrical framework involving large displacements and strains. The purpose of this paper is to study and clearly state the precise conditions permitting such an extension. This is done by comparing the evolution equations of the full elastic-plastic problem, the equations of classical limit analysis, and those of sequential limit analysis. The main conclusion is that, whereas classical limit analysis applies to materials exhibiting elasticity - in the absence of hardening and within a linearized geometrical framework -, sequential limit analysis, to be applicable, strictly prohibits the presence of elasticity - although it tolerates strain hardening and large displacements and strains. For a given mechanical situation, the relevance of sequential limit analysis therefore essentially depends upon the importance of the elastic-plastic coupling in the specific case considered.

  1. Sequential spatial processes for image analysis

    NARCIS (Netherlands)

    M.N.M. van Lieshout (Marie-Colette); V. Capasso

    2009-01-01

    htmlabstractWe give a brief introduction to sequential spatial processes. We discuss their definition, formulate a Markov property, and indicate why such processes are natural tools in tackling high level vision problems. We focus on the problem of tracking a variable number of moving objects

  2. Sequential spatial processes for image analysis

    NARCIS (Netherlands)

    Lieshout, van M.N.M.; Capasso, V.

    2009-01-01

    We give a brief introduction to sequential spatial processes. We discuss their definition, formulate a Markov property, and indicate why such processes are natural tools in tackling high level vision problems. We focus on the problem of tracking a variable number of moving objects through a video

  3. Sequential Analysis: Hypothesis Testing and Changepoint Detection

    Science.gov (United States)

    2014-07-11

    maintains the flexibility of deciding sooner than the fixed sample size procedure at the price of some lower power [13, 514]. The sequential probability... markets, detection of signals with unknown arrival time in seismology, navigation, radar and sonar signal processing, speech segmentation, and the... skimming cruise missile can yield a significant increase in the probability of raid annihilation. Furthermore, usually detection systems are

  4. STABILIZED SEQUENTIAL QUADRATIC PROGRAMMING: A SURVEY

    Directory of Open Access Journals (Sweden)

    Damián Fernández

    2014-12-01

    Full Text Available We review the motivation for, the current state-of-the-art in convergence results, and some open questions concerning the stabilized version of the sequential quadratic programming algorithm for constrained optimization. We also discuss the tools required for its local convergence analysis, globalization challenges, and extensions of the method to more general variational problems.

  5. Truly costly sequential search and oligopolistic pricing

    NARCIS (Netherlands)

    Janssen, Maarten C W; Moraga-González, José Luis; Wildenbeest, Matthijs R.

    We modify the paper of Stahl (1989) [Stahl, D.O., 1989. Oligopolistic pricing with sequential consumer search. American Economic Review 79, 700-12] by relaxing the assumption that consumers obtain the first price quotation for free. When all price quotations are costly to obtain, the unique

  6. Zips : mining compressing sequential patterns in streams

    NARCIS (Netherlands)

    Hoang, T.L.; Calders, T.G.K.; Yang, J.; Mörchen, F.; Fradkin, D.; Chau, D.H.; Vreeken, J.; Leeuwen, van M.; Faloutsos, C.

    2013-01-01

    We propose a streaming algorithm, based on the minimal description length (MDL) principle, for extracting non-redundant sequential patterns. For static databases, the MDL-based approach that selects patterns based on their capacity to compress data rather than their frequency, was shown to be

  7. How to Read the Tractatus Sequentially

    Directory of Open Access Journals (Sweden)

    Tim Kraft

    2016-11-01

    Full Text Available One of the unconventional features of Wittgenstein’s Tractatus Logico-Philosophicus is its use of an elaborated and detailed numbering system. Recently, Bazzocchi, Hacker and Kuusela have argued that the numbering system means that the Tractatus must be read and interpreted not as a sequentially ordered book, but as a text with a two-dimensional, tree-like structure. Apart from being able to explain how the Tractatus was composed, the tree reading allegedly solves exegetical issues both on the local level (e.g. how 4.02 fits into the series of remarks surrounding it) and the global level (e.g. the relation between ontology and picture theory, solipsism and the eye analogy, resolute and irresolute readings). This paper defends the sequential reading against the tree reading. After presenting the challenges generated by the numbering system and the two accounts as attempts to solve them, it is argued that Wittgenstein’s own explanation of the numbering system, anaphoric references within the Tractatus and the exegetical issues mentioned above do not favour the tree reading, but a version of the sequential reading. This reading maintains that the remarks of the Tractatus form a sequential chain: The role of the numbers is to indicate how remarks on different levels are interconnected to form a concise, surveyable and unified whole.

  8. Adult Word Recognition and Visual Sequential Memory

    Science.gov (United States)

    Holmes, V. M.

    2012-01-01

    Two experiments were conducted investigating the role of visual sequential memory skill in the word recognition efficiency of undergraduate university students. Word recognition was assessed in a lexical decision task using regularly and strangely spelt words, and nonwords that were either standard orthographically legal strings or items made from…

  9. Terminating Sequential Delphi Survey Data Collection

    Science.gov (United States)

    Kalaian, Sema A.; Kasim, Rafa M.

    2012-01-01

    The Delphi survey technique is an iterative mail or electronic (e-mail or web-based) survey method used to obtain agreement or consensus among a group of experts in a specific field on a particular issue through a well-designed and systematic multiple sequential rounds of survey administrations. Each of the multiple rounds of the Delphi survey…

  10. Simultaneous Versus Sequential Side-by-Side Bilateral Metal Stent Placement for Malignant Hilar Biliary Obstructions.

    Science.gov (United States)

    Inoue, Tadahisa; Ishii, Norimitsu; Kobayashi, Yuji; Kitano, Rena; Sakamoto, Kazumasa; Ohashi, Tomohiko; Nakade, Yukiomi; Sumida, Yoshio; Ito, Kiyoaki; Nakao, Haruhisa; Yoneda, Masashi

    2017-09-01

    Endoscopic bilateral self-expandable metallic stent (SEMS) placement for malignant hilar biliary obstructions (MHBOs) is technically demanding, and a second SEMS insertion is particularly challenging. A simultaneous side-by-side (SBS) placement technique using a thinner delivery system may mitigate these issues. We aimed to examine the feasibility and efficacy of simultaneous SBS SEMS placement for treating MHBOs using a novel SEMS that has a 5.7-Fr ultra-thin delivery system. Thirty-four patients with MHBOs underwent SBS SEMS placement between 2010 and 2016. We divided the patient cohort into those who underwent sequential (conventional) SBS placement between 2010 and 2014 (sequential group) and those who underwent simultaneous SBS placement between 2015 and 2016 (simultaneous group), and compared the groups with respect to the clinical outcomes. The technical success rates were 71% (12/17) and 100% (17/17) in the sequential and simultaneous groups, respectively, a difference that was significant (P = .045). The median procedure time was significantly shorter in the simultaneous group (22 min) than in the sequential group (52 min) (P = .017). There were no significant group differences in the time to recurrent biliary obstruction (sequential group: 113 days; simultaneous group: 140 days) or other adverse event rates (sequential group: 12%; simultaneous group: 12%). Simultaneous SBS placement using the novel 5.7-Fr SEMS delivery system may be more straightforward and have a higher success rate compared to that with sequential SBS placement. This new method may be useful for bilateral stenting to treat MHBOs.

  11. Decision-making in research tasks with sequential testing.

    Directory of Open Access Journals (Sweden)

    Thomas Pfeiffer

    Full Text Available BACKGROUND: In a recent controversial essay, published by JPA Ioannidis in PLoS Medicine, it has been argued that in some research fields, most of the published findings are false. Based on theoretical reasoning it can be shown that small effect sizes, error-prone tests, low priors of the tested hypotheses and biases in the evaluation and publication of research findings increase the fraction of false positives. These findings raise concerns about the reliability of research. However, they are based on a very simple scenario of scientific research, where single tests are used to evaluate independent hypotheses. METHODOLOGY/PRINCIPAL FINDINGS: In this study, we present computer simulations and experimental approaches for analyzing more realistic scenarios. In these scenarios, research tasks are solved sequentially, i.e. subsequent tests can be chosen depending on previous results. We investigate simple sequential testing and scenarios where only a selected subset of results can be published and used for future rounds of test choice. Results from computer simulations indicate that for the tasks analyzed in this study, the fraction of false among the positive findings declines over several rounds of testing if the most informative tests are performed. Our experiments show that human subjects frequently perform the most informative tests, leading to a decline of false positives as expected from the simulations. CONCLUSIONS/SIGNIFICANCE: For the research tasks studied here, findings tend to become more reliable over time. We also find that the performance in those experimental settings where not all performed tests could be published turned out to be surprisingly inefficient. Our results may help optimize existing procedures used in the practice of scientific research and provide guidance for the development of novel forms of scholarly communication.

  12. Simultaneous Versus Sequential Presentation in Testing Recognition Memory for Faces.

    Science.gov (United States)

    Finley, Jason R; Roediger, Henry L; Hughes, Andrea D; Wahlheim, Christopher N; Jacoby, Larry L

    2015-01-01

    Three experiments examined the issue of whether faces could be better recognized in a simultaneous test format (2-alternative forced choice [2AFC]) or a sequential test format (yes-no). All experiments showed that when target faces were present in the test, the simultaneous procedure led to superior performance (area under the ROC curve), whether lures were high or low in similarity to the targets. However, when a target-absent condition was used in which no lures resembled the targets but the lures were similar to each other, the simultaneous procedure yielded higher false alarm rates (Experiments 2 and 3) and worse overall performance (Experiment 3). This pattern persisted even when we excluded responses that participants opted to withhold rather than volunteer. We conclude that for the basic recognition procedures used in these experiments, simultaneous presentation of alternatives (2AFC) generally leads to better discriminability than does sequential presentation (yes-no) when a target is among the alternatives. However, our results also show that the opposite can occur when there is no target among the alternatives. An important future step is to see whether these patterns extend to more realistic eyewitness lineup procedures. The pictures used in the experiment are available online at http://www.press.uillinois.edu/journals/ajp/media/testing_recognition/.

  13. Optical design of a novel instrument that uses the Hartmann-Shack sensor and Zernike polynomials to measure and simulate customized refraction correction surgery outcomes and patient satisfaction

    Science.gov (United States)

    Yasuoka, Fatima M. M.; Matos, Luciana; Cremasco, Antonio; Numajiri, Mirian; Marcato, Rafael; Oliveira, Otavio G.; Sabino, Luis G.; Castro N., Jarbas C.; Bagnato, Vanderlei S.; Carvalho, Luis A. V.

    2016-03-01

    An optical system that conjugates the patient's pupil to the plane of a Hartmann-Shack (HS) wavefront sensor has been simulated using optical design software, and an optical bench prototype has been assembled using a mechanical eye device, beam splitter, illumination system, lenses, mirrors, mirrored prism, movable mirror, wavefront sensor and CCD camera. The mechanical eye device is used to simulate aberrations of the eye. From this device the rays are emitted and travel via the beam splitter to the optical system. Some rays fall on the CCD camera and others pass through the optical system and finally reach the sensor. The eye models based on typical in vivo eye aberrations are constructed using the optical design software Zemax. The computer-aided outcomes of the HS images for each case are acquired, and these images are processed using customized techniques. The simulated and real images for low-order aberrations are compared using centroid coordinates to assure that the optical system is constructed precisely enough to match the simulated system. Afterwards a simulated version of the retinal images is constructed to show how these typical eyes would perceive an optotype positioned 20 ft away. Certain personalized corrections can be specified by eye doctors based on different Zernike polynomial values, and the optical images are rendered to the new parameters. Optical images of how that eye would see with or without correction of certain aberrations are generated in order to show which aberrations can be corrected and to what degree. The patient can then "personalize" the correction to their own satisfaction. This new approach to wavefront sensing is a promising change in paradigm towards the betterment of the patient-physician relationship.
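
    As a small illustration of the Zernike-based wavefront description mentioned above, the sketch below evaluates a wavefront over a unit pupil from three low-order coefficients (defocus and the two astigmatism terms) using the usual sqrt(3)/sqrt(6) normalization. It only covers the reconstruction step; the slope fitting from Hartmann-Shack spot displacements and the retinal-image rendering are not shown, and the function name and grid size are arbitrary.

    ```python
    import numpy as np

    def low_order_wavefront(c_defocus, c_astig0, c_astig45, grid=128):
        """Wavefront (same units as the coefficients) over a unit pupil from three
        low-order Zernike terms; a sketch of the reconstruction step only."""
        y, x = np.mgrid[-1:1:grid * 1j, -1:1:grid * 1j]
        rho, theta = np.hypot(x, y), np.arctan2(y, x)
        W = (c_defocus * np.sqrt(3) * (2 * rho**2 - 1)
             + c_astig0 * np.sqrt(6) * rho**2 * np.cos(2 * theta)
             + c_astig45 * np.sqrt(6) * rho**2 * np.sin(2 * theta))
        W[rho > 1] = np.nan                           # mask points outside the pupil
        return W
    ```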

  14. Optimal Energy Management of Multi-Microgrids with Sequentially Coordinated Operations

    Directory of Open Access Journals (Sweden)

    Nah-Oak Song

    2015-08-01

    Full Text Available We propose an optimal electric energy management of a cooperative multi-microgrid community with sequentially coordinated operations. The sequentially coordinated operations are suggested to distribute computational burden and yet to make the optimal 24-hour energy management of multi-microgrids possible. The sequential operations are mathematically modeled to find the optimal operation conditions and illustrated with physical interpretation of how to achieve optimal energy management in the cooperative multi-microgrid community. This global electric energy optimization of the cooperative community is realized by the ancillary internal trading between the microgrids in the cooperative community which reduces the extra cost from unnecessary external trading by adjusting the electric energy production amounts of combined heat and power (CHP) generators and amounts of both internal and external electric energy trading of the cooperative community. A simulation study is also conducted to validate the proposed mathematical energy management models.

  15. Sequential decoding of intramuscular EMG signals via estimation of a Markov model.

    Science.gov (United States)

    Monsifrot, Jonathan; Le Carpentier, Eric; Aoustin, Yannick; Farina, Dario

    2014-09-01

    This paper addresses the sequential decoding of intramuscular single-channel electromyographic (EMG) signals to extract the activity of individual motor neurons. A hidden Markov model is derived from the physiological generation of the EMG signal. The EMG signal is described as a sum of several action potential (wavelet) trains, embedded in noise. For each train, the time interval between wavelets is modeled by a process whose parameters are linked to the muscular activity. The parameters of this process are estimated sequentially by a Bayes filter, along with the firing instants. The method was tested on some simulated signals and an experimental one, from which the rates of detection and classification of action potentials were above 95% with respect to the reference decomposition. The method works sequentially in time, and is the first to address the problem of intramuscular EMG decomposition online. It has potential applications for man-machine interfacing based on motor neuron activities.
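
    The recursion at the core of such an online decomposition is a forward (filtering) pass over a hidden-state model. The sketch below shows a generic discrete-state Bayes filter of that kind; the state space, transition matrix, and observation model for real intramuscular EMG (wavelet shapes, inter-spike interval statistics) would have to be supplied and are not those of the cited method.

    ```python
    import numpy as np

    def forward_filter(obs_loglik, trans, init):
        """Sequential Bayes filter for a discrete-state hidden Markov model.

        obs_loglik[t, s]: log-likelihood of the observation at time t under state s
        trans[s, s']:     transition probability from state s to s'
        init[s]:          prior over the initial state
        Returns the filtered posteriors p(state_t | observations up to t).
        """
        T, S = obs_loglik.shape
        post = np.zeros((T, S))
        pred = np.asarray(init, dtype=float)
        for t in range(T):
            w = pred * np.exp(obs_loglik[t] - obs_loglik[t].max())   # rescale to avoid underflow
            post[t] = w / w.sum()
            pred = post[t] @ trans                                   # one-step prediction
        return post
    ```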

  16. Comparison of robustness to outliers between robust poisson models and log-binomial models when estimating relative risks for common binary outcomes: a simulation study.

    Science.gov (United States)

    Chen, Wansu; Shi, Jiaxiao; Qian, Lei; Azen, Stanley P

    2014-06-26

    To estimate relative risks or risk ratios for common binary outcomes, the most popular model-based methods are the robust (also known as modified) Poisson and the log-binomial regression. Of the two methods, it is believed that the log-binomial regression yields more efficient estimators because it is maximum likelihood based, while the robust Poisson model may be less affected by outliers. Evidence to support the robustness of robust Poisson models in comparison with log-binomial models is very limited. In this study a simulation was conducted to evaluate the performance of the two methods in several scenarios where outliers existed. The findings indicate that for data coming from a population where the relationship between the outcome and the covariate was in a simple form (e.g. log-linear), the two models yielded comparable biases and mean square errors. However, if the true relationship contained a higher order term, the robust Poisson models consistently outperformed the log-binomial models even when the level of contamination is low. The robust Poisson models are more robust (or less sensitive) to outliers compared to the log-binomial models when estimating relative risks or risk ratios for common binary outcomes. Users should be aware of the limitations when choosing appropriate models to estimate relative risks or risk ratios.
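
    For illustration, the sketch below fits both models to simulated binary data with statsmodels: a Poisson GLM with a sandwich (robust) covariance and a binomial GLM with a log link. Class and argument names follow recent statsmodels releases and may differ across versions; the simulated data and coefficients are arbitrary, and log-binomial fits can fail to converge when predicted risks approach 1.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 2000
    x = rng.normal(size=n)
    p = np.minimum(np.exp(-1.2 + 0.4 * x), 0.99)   # roughly log-linear true risk model
    y = rng.binomial(1, p)
    X = sm.add_constant(x)

    # "robust" (modified) Poisson: Poisson family with a sandwich (HC) covariance
    robust_poisson = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC1")

    # log-binomial: binomial family with a log link
    log_binomial = sm.GLM(y, X, family=sm.families.Binomial(link=sm.families.links.Log())).fit()

    # both exponentiated slopes estimate the same risk ratio for a one-unit change in x
    print(np.exp(robust_poisson.params[1]), np.exp(log_binomial.params[1]))
    ```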

  17. Sequential processing deficits in schizophrenia: relationship to neuropsychology and genetics.

    Science.gov (United States)

    Hill, S Kristian; Bjorkquist, Olivia; Carrathers, Tarra; Roseberry, Jarett E; Hochberger, William C; Bishop, Jeffrey R

    2013-12-01

    Utilizing a combination of neuropsychological and cognitive neuroscience approaches may be essential for characterizing cognitive deficits in schizophrenia and eventually assessing cognitive outcomes. This study was designed to compare the stability of select exemplars for these approaches and their correlations in schizophrenia patients with stable treatment and clinical profiles. Reliability estimates for serial order processing were comparable to neuropsychological measures and indicate that experimental serial order processing measures may be less susceptible to practice effects than traditional neuropsychological measures. Correlations were moderate and consistent with a global cognitive factor. Exploratory analyses indicated a potentially critical role of the Met allele of the Catechol-O-methyltransferase (COMT) Val158Met polymorphism in externally paced sequential recall. Experimental measures of serial order processing may reflect frontostriatal dysfunction and be a useful supplement to large neuropsychological batteries. © 2013.

  18. Sequential method for the assessment of innovations in computer assisted industrial processes; Metodo secuencial para evaluacion de innovaciones en procesos industriales asistido por computadora

    Energy Technology Data Exchange (ETDEWEB)

    Suarez Antola, R [Universidad Catolica del Uruguay, Montevideo (Uruguay); Artucio, G [Ministerio de Industria Energia y Mineria. Direccion Nacional de Tecnologia Nuclear, Montevideo (Uruguay)

    1995-08-01

    A sequential method for the assessment of innovations in industrial processes is proposed, using suitable combinations of mathematical modelling and numerical simulation of dynamics. Some advantages and limitations of the proposed method are discussed. tabs.

  19. Impact of Diagrams on Recalling Sequential Elements in Expository Texts.

    Science.gov (United States)

    Guri-Rozenblit, Sarah

    1988-01-01

    Examines the instructional effectiveness of abstract diagrams on recall of sequential relations in social science textbooks. Concludes that diagrams assist significantly the recall of sequential relations in a text and decrease significantly the rate of order mistakes. (RS)

  20. The Effects of the Previous Outcome on Probabilistic Choice in Rats

    Science.gov (United States)

    Marshall, Andrew T.; Kirkpatrick, Kimberly

    2014-01-01

    This study examined the effects of previous outcomes on subsequent choices in a probabilistic-choice task. Twenty-four rats were trained to choose between a certain outcome (1 or 3 pellets) versus an uncertain outcome (3 or 9 pellets), delivered with a probability of .1, .33, .67, and .9 in different phases. Uncertain outcome choices increased with the probability of uncertain food. Additionally, uncertain choices increased with the probability of uncertain food following both certain-choice outcomes and unrewarded uncertain choices. However, following uncertain-choice food outcomes, there was a tendency to choose the uncertain outcome in all cases, indicating that the rats continued to “gamble” after successful uncertain choices, regardless of the overall probability or magnitude of food. A subsequent manipulation, in which the probability of uncertain food varied within each session as a function of the previous uncertain outcome, examined how the previous outcome and probability of uncertain food affected choice in a dynamic environment. Uncertain-choice behavior increased with the probability of uncertain food. The rats exhibited increased sensitivity to probability changes and a greater degree of win–stay/lose–shift behavior than in the static phase. Simulations of two sequential choice models were performed to explore the possible mechanisms of reward value computations. The simulation results supported an exponentially decaying value function that updated as a function of trial (rather than time). These results emphasize the importance of analyzing global and local factors in choice behavior and suggest avenues for the future development of sequential-choice models. PMID:23205915
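
    A minimal version of the trial-updated, exponentially decaying value model favored by those simulations might look like the sketch below: a delta-rule update (an exponential filter over past outcomes) applied once per trial, combined with a softmax choice rule. The learning rate, softmax temperature, and payoff settings are illustrative guesses, not fitted values from the study.

    ```python
    import numpy as np

    def simulate_choices(p_uncertain=0.33, certain=3, uncertain=9, n_trials=500,
                         alpha=0.1, beta=0.5, rng=np.random.default_rng(7)):
        """Trial-by-trial value learning with an exponentially decaying memory of outcomes."""
        V = np.array([float(certain), float(certain)])   # values of [certain, uncertain] options
        choices = np.zeros(n_trials, dtype=int)
        for t in range(n_trials):
            p_choose_uncertain = 1.0 / (1.0 + np.exp(-beta * (V[1] - V[0])))  # softmax (2 options)
            c = int(rng.random() < p_choose_uncertain)
            reward = certain if c == 0 else (uncertain if rng.random() < p_uncertain else 0)
            V[c] += alpha * (reward - V[c])              # update only the chosen option, per trial
            choices[t] = c
        return choices.mean()                            # proportion of uncertain choices
    ```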

  1. Assessment of Differential Item Functioning in Health-Related Outcomes: A Simulation and Empirical Analysis with Hierarchical Polytomous Data

    Directory of Open Access Journals (Sweden)

    Zahra Sharafi

    2017-01-01

    Full Text Available Background. The purpose of this study was to evaluate the effectiveness of two methods of detecting differential item functioning (DIF in the presence of multilevel data and polytomously scored items. The assessment of DIF with multilevel data (e.g., patients nested within hospitals, hospitals nested within districts from large-scale assessment programs has received considerable attention but very few studies evaluated the effect of hierarchical structure of data on DIF detection for polytomously scored items. Methods. The ordinal logistic regression (OLR and hierarchical ordinal logistic regression (HOLR were utilized to assess DIF in simulated and real multilevel polytomous data. Six factors (DIF magnitude, grouping variable, intraclass correlation coefficient, number of clusters, number of participants per cluster, and item discrimination parameter with a fully crossed design were considered in the simulation study. Furthermore, data of Pediatric Quality of Life Inventory™ (PedsQL™ 4.0 collected from 576 healthy school children were analyzed. Results. Overall, results indicate that both methods performed equivalently in terms of controlling Type I error and detection power rates. Conclusions. The current study showed negligible difference between OLR and HOLR in detecting DIF with polytomously scored items in a hierarchical structure. Implications and considerations while analyzing real data were also discussed.

  2. Transaction costs and sequential bargaining in transferable discharge permit markets.

    Science.gov (United States)

    Netusil, N R; Braden, J B

    2001-03-01

    Market-type mechanisms have been introduced and are being explored for various environmental programs. Several existing programs, however, have not attained the cost savings that were initially projected. Modeling that acknowledges the role of transactions costs and the discrete, bilateral, and sequential manner in which trades are executed should provide a more realistic basis for calculating potential cost savings. This paper presents empirical evidence on potential cost savings by examining a market for the abatement of sediment from farmland. Empirical results based on a market simulation model find no statistically significant change in mean abatement costs under several transaction cost levels when contracts are randomly executed. An alternative method of contract execution, gain-ranked, yields similar results. At the highest transaction cost level studied, trading reduces the total cost of compliance relative to a uniform standard that reflects current regulations.

  3. A one-sided sequential test

    Energy Technology Data Exchange (ETDEWEB)

    Racz, A.; Lux, I. [Hungarian Academy of Sciences, Budapest (Hungary). Atomic Energy Research Inst.

    1996-04-16

    The applicability of the classical sequential probability ratio testing (SPRT) for early failure detection problems is limited by the fact that there is an extra time delay between the occurrence of the failure and its first recognition. Chien and Adams developed a method to minimize this time for the case when the problem can be formulated as testing the mean value of a Gaussian signal. In our paper we propose a procedure that can be applied for both mean and variance testing and that minimizes the time delay. The method is based on a special parametrization of the classical SPRT. The one-sided sequential tests (OSST) can reproduce the results of the Chien-Adams test when applied for mean values. (author).
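
    For context, the classical SPRT that the proposed one-sided test reparametrizes can be sketched in a few lines for the Gaussian-mean case. The thresholds use Wald's approximations; this is the standard two-threshold test, not the authors' one-sided (OSST) variant, and the variance-testing case is omitted.

    ```python
    import numpy as np

    def sprt_gaussian_mean(xs, mu0, mu1, sigma, alpha=0.05, beta=0.05):
        """Wald SPRT for H0: mean = mu0 vs H1: mean = mu1, with known sigma.

        Returns ('H0' | 'H1' | 'continue', number of observations used).
        """
        a = np.log(beta / (1 - alpha))        # lower threshold: accept H0
        b = np.log((1 - beta) / alpha)        # upper threshold: accept H1
        llr, n = 0.0, 0
        for x in xs:
            n += 1
            llr += ((mu1 - mu0) / sigma**2) * (x - (mu0 + mu1) / 2)   # log-likelihood ratio increment
            if llr <= a:
                return "H0", n
            if llr >= b:
                return "H1", n
        return "continue", n
    ```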

  4. Documentscape: Intertextuality, Sequentiality & Autonomy at Work

    DEFF Research Database (Denmark)

    Christensen, Lars Rune; Bjørn, Pernille

    2014-01-01

    On the basis of an ethnographic field study, this article introduces the concept of documentscape to the analysis of document-centric work practices. The concept of documentscape refers to the entire ensemble of documents in their mutual intertextual interlocking. Providing empirical data from...... a global software development case, we show how hierarchical structures and sequentiality across the interlocked documents are critical to how actors make sense of the work of others and what to do next in a geographically distributed setting. Furthermore, we found that while each document is created...... as part of a quasi-sequential order, this characteristic does not make the document, as a single entity, into a stable object. Instead, we found that the documents were malleable and dynamic while suspended in intertextual structures. Our concept of documentscape points to how the hierarchical structure...

  5. A minimax procedure in the context of sequential mastery testing

    NARCIS (Netherlands)

    Vos, Hendrik J.

    1999-01-01

    The purpose of this paper is to derive optimal rules for sequential mastery tests. In a sequential mastery test, the decision is to classify a subject as a master or a nonmaster, or to continue sampling and administering another random test item. The framework of minimax sequential decision theory

  6. Applying the minimax principle to sequential mastery testing

    NARCIS (Netherlands)

    Vos, Hendrik J.

    2002-01-01

    The purpose of this paper is to derive optimal rules for sequential mastery tests. In a sequential mastery test, the decision is to classify a subject as a master, a nonmaster, or to continue sampling and administering another random item. The framework of minimax sequential decision theory (minimum

  7. Optimal Sequential Rules for Computer-Based Instruction.

    Science.gov (United States)

    Vos, Hans J.

    1998-01-01

    Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…

  8. On Locally Most Powerful Sequential Rank Tests

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2017-01-01

    Roč. 36, č. 1 (2017), s. 111-125 ISSN 0747-4946 R&D Projects: GA ČR GA17-07384S Grant - others: Nadační fond na podporu vědy (CZ) Neuron Institutional support: RVO:67985807 Keywords: nonparametric tests * sequential ranks * stopping variable Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 0.339, year: 2016

  9. Sequential pattern recognition by maximum conditional informativity

    Czech Academy of Sciences Publication Activity Database

    Grim, Jiří

    2014-01-01

    Roč. 45, č. 1 (2014), s. 39-45 ISSN 0167-8655 R&D Projects: GA ČR(CZ) GA14-02652S; GA ČR(CZ) GA14-10911S Keywords: Multivariate statistics * Statistical pattern recognition * Sequential decision making * Product mixtures * EM algorithm * Shannon information Subject RIV: IN - Informatics, Computer Science Impact factor: 1.551, year: 2014 http://library.utia.cas.cz/separaty/2014/RO/grim-0428565.pdf

  10. Comparing two Poisson populations sequentially: an application

    International Nuclear Information System (INIS)

    Halteman, E.J.

    1986-01-01

    Rocky Flats Plant in Golden, Colorado monitors each of its employees for radiation exposure. Excess exposure is detected by comparing the means of two Poisson populations. A sequential probability ratio test (SPRT) is proposed as a replacement for the fixed sample normal approximation test. A uniformly most efficient SPRT exists; however, logistics suggest using a truncated SPRT. The truncated SPRT is evaluated in detail and shown to possess large potential savings in average time spent by employees in the monitoring process.
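
    One textbook way to set up such a test, sketched below under assumptions of my own (a fixed rate ratio under the alternative and paired observation periods), uses the fact that, conditional on the running total, the counts from one population are binomial; the SPRT then runs on that binomial likelihood ratio. The plant's actual truncated procedure and its parameters are not reproduced here.

    ```python
    import numpy as np

    def poisson_ratio_sprt(counts_a, counts_b, rho1=2.0, alpha=0.05, beta=0.1):
        """Sequential comparison of two Poisson rates via a conditional-binomial SPRT.

        H0: lambda_a = lambda_b  vs  H1: lambda_a = rho1 * lambda_b.
        Conditional on a + b, the count a is Binomial with p0 = 0.5 under H0
        and p1 = rho1 / (1 + rho1) under H1.
        """
        p0, p1 = 0.5, rho1 / (1.0 + rho1)
        low = np.log(beta / (1 - alpha))
        high = np.log((1 - beta) / alpha)
        llr, k = 0.0, 0
        for a, b in zip(counts_a, counts_b):
            k += 1
            llr += a * np.log(p1 / p0) + b * np.log((1 - p1) / (1 - p0))
            if llr <= low:
                return "no excess (H0)", k
            if llr >= high:
                return "excess exposure (H1)", k
        return "continue monitoring", k
    ```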

  11. Heat accumulation during sequential cortical bone drilling.

    Science.gov (United States)

    Palmisano, Andrew C; Tai, Bruce L; Belmont, Barry; Irwin, Todd A; Shih, Albert; Holmes, James R

    2016-03-01

    Significant research exists regarding heat production during single-hole bone drilling. No published data exist regarding repetitive sequential drilling. This study elucidates the phenomenon of heat accumulation for sequential drilling with both Kirschner wires (K wires) and standard two-flute twist drills. It was hypothesized that cumulative heat would result in a higher temperature with each subsequent drill pass. Nine holes in a 3 × 3 array were drilled sequentially on moistened cadaveric tibia bone kept at body temperature (about 37 °C). Four thermocouples were placed at the center of four adjacent holes and 2 mm below the surface. A battery-driven hand drill guided by a servo-controlled motion system was used. Six samples were drilled with each tool (2.0 mm K wire and 2.0 and 2.5 mm standard drills). K wire drilling increased temperature from 5 °C at the first hole to 20 °C at holes 6 through 9. A similar trend was found in standard drills with less significant increments. The maximum temperatures of both tools increased from the first to the last hole, while the difference between drill sizes was found to be insignificant (P > 0.05). In conclusion, heat accumulated during sequential drilling, with size difference being insignificant. K wire produced more heat than its twist-drill counterparts. This study has demonstrated the heat accumulation phenomenon and its significant effect on temperature. Maximizing the drilling field and reducing the number of drill passes may decrease bone injury. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.

  12. Sequential neural models with stochastic layers

    DEFF Research Database (Denmark)

    Fraccaro, Marco; Sønderby, Søren Kaae; Paquet, Ulrich

    2016-01-01

    How can we efficiently propagate uncertainty in a latent state representation with recurrent neural networks? This paper introduces stochastic recurrent neural networks which glue a deterministic recurrent neural network and a state space model together to form a stochastic and sequential neural...... generative model. The clear separation of deterministic and stochastic layers allows a structured variational inference network to track the factorization of the model's posterior distribution. By retaining both the nonlinear recursive structure of a recurrent neural network and averaging over...

  13. Simultaneous versus Sequential Intratympanic Steroid Treatment for Severe-to-Profound Sudden Sensorineural Hearing Loss.

    Science.gov (United States)

    Yoo, Myung Hoon; Lim, Won Sub; Park, Joo Hyun; Kwon, Joong Keun; Lee, Tae-Hoon; An, Yong-Hwi; Kim, Young-Jin; Kim, Jong Yang; Lim, Hyun Woo; Park, Hong Ju

    2016-01-01

    Severe-to-profound sudden sensorineural hearing loss (SSNHL) has a poor prognosis. We aimed to compare the efficacy of simultaneous and sequential oral and intratympanic steroids for this condition. Fifty patients with severe-to-profound SSNHL (>70 dB HL) were included from 7 centers. The simultaneous group (27 patients) received oral and intratympanic steroid injections for 2 weeks. The sequential group (23 patients) was treated with oral steroids for 2 weeks and intratympanic steroids for the subsequent 2 weeks. Pure-tone averages (PTA) and word discrimination scores (WDS) were compared before treatment and 2 weeks and 1 and 2 months after treatment. Treatment outcomes according to the modified American Academy of Otolaryngology-Head and Neck Surgery (AAO-HNS) criteria were also analyzed. The improvement in PTA and WDS at the 2-week follow-up was 23 ± 21 dB HL and 20 ± 39% in the simultaneous group and 31 ± 29 dB HL and 37 ± 42% in the sequential group; this was not statistically significant. Complete or partial recovery at the 2-week follow-up was observed in 26% of the simultaneous group and 30% of the sequential group; this was also not significant. The improvement in PTA and WDS at the 2-month follow-up was 40 ± 20 dB HL and 37 ± 35% in the simultaneous group and 41 ± 25 dB HL and 48 ± 41% in the sequential group; this was not statistically significant. Complete or partial recovery at the 2-month follow-up was observed in 33% of the simultaneous group and 35% of the sequential group; this was also not significant. Seven patients in the sequential group did not need intratympanic steroid injections for sufficient improvement after oral steroids alone. Simultaneous oral/intratympanic steroid treatment yielded a recovery similar to that produced by sequential treatment. Because the addition of intratympanic steroids can be decided upon based on the improvement after an oral steroid, the sequential regimen can be recommended to avoid unnecessary

  14. A Bayesian sequential design with adaptive randomization for 2-sided hypothesis test.

    Science.gov (United States)

    Yu, Qingzhao; Zhu, Lin; Zhu, Han

    2017-11-01

    Bayesian sequential and adaptive randomization designs are gaining popularity in clinical trials thanks to their potential to reduce the number of required participants and save resources. We propose a Bayesian sequential design with adaptive randomization rates so as to more efficiently assign newly recruited patients to different treatment arms. In this paper, we consider 2-arm clinical trials. Patients are allocated to the 2 arms with a randomization rate chosen to achieve minimum variance for the test statistic. Algorithms are presented to calculate the optimal randomization rate, critical values, and power for the proposed design. Sensitivity analysis is implemented to check the influence of changing the prior distributions on the design. Simulation studies are applied to compare the proposed method and traditional methods in terms of power and actual sample sizes. Simulations show that, when total sample size is fixed, the proposed design can obtain greater power and/or a smaller actual sample size than the traditional Bayesian sequential design. Finally, we apply the proposed method to a real data set and compare the results with the Bayesian sequential design without adaptive randomization in terms of sample sizes. The proposed method can further reduce the required sample size. Copyright © 2017 John Wiley & Sons, Ltd.
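    The paper's own algorithm for the optimal randomization rate is not reproduced in the record, so the following sketch assumes a binary endpoint and uses Neyman allocation (allocation proportional to each arm's outcome standard deviation, which minimizes the variance of the difference-in-proportions statistic), with the unknown rates replaced by Beta-posterior means updated after each patient. The response rates in true_p are invented for illustration.

    ```python
    import numpy as np

    def neyman_allocation(p1, p2):
        """Allocation probability to arm 1 that minimises the variance of the
        difference-in-proportions test statistic (Neyman allocation)."""
        s1, s2 = np.sqrt(p1 * (1 - p1)), np.sqrt(p2 * (1 - p2))
        return 0.5 if s1 + s2 == 0 else s1 / (s1 + s2)

    rng = np.random.default_rng(1)
    true_p = [0.35, 0.55]                 # hypothetical response rates
    succ, fail = np.ones(2), np.ones(2)   # Beta(1, 1) priors for both arms
    assignments = []
    for _ in range(200):                  # recruit 200 patients one at a time
        post_mean = succ / (succ + fail)  # current posterior means
        pi1 = neyman_allocation(*post_mean)
        arm = 0 if rng.random() < pi1 else 1
        y = rng.random() < true_p[arm]    # observe the binary outcome
        succ[arm] += y
        fail[arm] += 1 - y
        assignments.append(arm)

    print("final allocation probability to arm 1:",
          round(neyman_allocation(*(succ / (succ + fail))), 3))
    print("patients per arm:", np.bincount(assignments, minlength=2))
    ```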

  15. Algorithm development and simulation outcomes for hypoxic head and neck cancer radiotherapy using a Monte Carlo cell division model

    International Nuclear Information System (INIS)

    Harriss, W.M.; Bezak, E.; Yeoh, E.

    2010-01-01

    Full text: A temporal Monte Carlo tumour model, 'Hyp-RT', simulating hypoxic head and neck cancer has been updated and extended to model radiotherapy. The aim is to provide a convenient radiobiological tool for clinicians to evaluate radiotherapy treatment schedules based on many individual tumour properties including oxygenation. FORTRAN95 and JAVA have been utilised to develop the efficient algorithm, which can propagate 10^8 cells. Epithelial cell kill is affected by dose, oxygenation and proliferative status. Accelerated repopulation (AR) has been modelled by increasing the symmetrical stem cell division probability, and reoxygenation (ROx) has been modelled using random incremental boosts of oxygen to the cell population throughout therapy. Results: The stem cell percentage and the degree of hypoxia dominate tumour growth rate. For conventional radiotherapy, 15-25% more dose was required for hypoxic versus oxic tumours, depending on the time of AR onset (0-3 weeks after the start of treatment). ROx of hypoxic tumours resulted in tumour sensitisation and therefore a dose reduction of up to 35%, varying with the time of onset. Fig. 1 shows results for all combinations of AR and ROx onset times for the moderate hypoxia case. Conclusions: In hypoxic tumours, accelerated repopulation and reoxygenation affect cell kill in the same manner as when the effects are modelled individually; however, the degree of the effect is altered and therefore the combined result is difficult to predict, providing evidence for the usefulness of computer models. Simulations have quantitatively

  16. Knee Joint Distraction Compared to Total Knee Arthroplasty for Treatment of End Stage Osteoarthritis: Simulating Long-Term Outcomes and Cost-Effectiveness.

    Science.gov (United States)

    van der Woude, J A D; Nair, S C; Custers, R J H; van Laar, J M; Kuchuck, N O; Lafeber, F P J G; Welsing, P M J

    2016-01-01

    In end-stage knee osteoarthritis the treatment of choice is total knee arthroplasty (TKA). An alternative treatment is knee joint distraction (KJD), suggested to postpone TKA. Several studies reported significant and prolonged clinical improvement after KJD. To make an appropriate decision regarding the position of this treatment, a cost-effectiveness and cost-utility analysis from a healthcare perspective for different age and gender categories was performed. A treatment strategy starting with TKA and a strategy starting with KJD for patients of different age and gender was simulated. To extrapolate outcomes to long-term health and economic outcomes a Markov (health state) model was used. The number of surgeries, QALYs, and treatment costs per strategy were calculated. Cost-effectiveness is expressed using the cost-effectiveness plane and cost-effectiveness acceptability curves. Starting with KJD, the number of knee-replacing procedures, especially revision surgery, could be reduced, most clearly in the younger age categories. This resulted in the KJD strategy being dominant (more effective with cost-savings) in about 80% of simulations (with only inferiority in about 1%) in these age categories when compared to TKA. At a willingness-to-pay threshold of €20,000 per QALY gained, the probability that starting with KJD is cost-effective compared with starting with TKA was already found to be over 75% for all age categories and over 90-95% for the younger age categories. A treatment strategy starting with knee joint distraction for knee osteoarthritis has a large potential for being a cost-effective intervention, especially for the relatively young patient.
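    The Markov model itself is not given in the record, so the sketch below is a deliberately simplified three-state cohort model (doing well after the index treatment, living after revision, dead) with discounting; every transition probability, cost and utility, and the two strategy parameterizations, are illustrative placeholders rather than the study's estimates. It only shows how strategy-level costs and QALYs fall out of such a model.

    ```python
    import numpy as np

    # States: 0 = doing well after index treatment, 1 = after revision surgery,
    # 2 = dead. All numbers below are invented placeholders.
    def run_strategy(p_next_surgery, cost_index, n_years=40, disc=0.035):
        P = np.array([[1.0 - p_next_surgery - 0.02, p_next_surgery, 0.02],
                      [0.00, 0.97, 0.03],
                      [0.00, 0.00, 1.00]])
        utility = np.array([0.80, 0.70, 0.00])     # QALYs accrued per year in state
        run_cost = np.array([300.0, 1200.0, 0.0])  # yearly follow-up cost (EUR)
        state = np.array([1.0, 0.0, 0.0])          # whole cohort starts in state 0
        cost, qaly = cost_index, 0.0
        for t in range(n_years):
            d = 1.0 / (1.0 + disc) ** t            # discount factor for year t
            qaly += d * state @ utility
            cost += d * state @ run_cost
            state = state @ P                      # advance the cohort one cycle
        return cost, qaly

    cost_tka, qaly_tka = run_strategy(p_next_surgery=0.015, cost_index=10_000.0)
    cost_kjd, qaly_kjd = run_strategy(p_next_surgery=0.010, cost_index=8_000.0)
    if cost_kjd <= cost_tka and qaly_kjd >= qaly_tka:
        print("KJD-first dominates (cheaper and more effective) in this toy example")
    else:
        print("ICER:", (cost_kjd - cost_tka) / (qaly_kjd - qaly_tka), "EUR per QALY")
    print(f"TKA-first: {cost_tka:9.0f} EUR, {qaly_tka:5.2f} QALYs")
    print(f"KJD-first: {cost_kjd:9.0f} EUR, {qaly_kjd:5.2f} QALYs")
    ```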

  17. State-and-transition simulation modeling to compare outcomes of alternative management scenarios under two natural disturbance regimes in a forested landscape in northeastern Wisconsin, USA

    Directory of Open Access Journals (Sweden)

    Amanda Swearingen

    2015-07-01

    Full Text Available Comparisons of the potential outcomes of multiple land management strategies and an understanding of the influence of potential increases in climate-related disturbances on these outcomes are essential for long-term land management and conservation planning. To provide these insights, we developed an approach that uses collaborative scenario development and state-and-transition simulation modeling to provide land managers and conservation practitioners with a comparison of potential landscapes resulting from alternative management scenarios and climate conditions, and we have applied this approach in the Wild Rivers Legacy Forest (WRLF) area in northeastern Wisconsin. Three management scenarios were developed with input from local land managers, scientists, and conservation practitioners: (1) continuation of current management, (2) expanded working forest conservation easements, and (3) cooperative ecological forestry. Scenarios were modeled under current climate with contemporary probabilities of natural disturbance and under increased probability of windthrow and wildfire that may result from climate change in this region. All scenarios were modeled for 100 years using the VDDT/TELSA modeling suite. Results showed that landscape composition and configuration were relatively similar among scenarios, and that management had a stronger effect than increased probability of windthrow and wildfire. These findings suggest that the scale of the landscape analysis used here and the lack of differences in predominant management strategies between ownerships in this region play significant roles in scenario outcomes. The approach used here does not rely on complex mechanistic modeling of uncertain dynamics and can therefore be used as a starting point for planning and further analysis.
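    A state-and-transition simulation couples a deterministic successional pathway with stochastic disturbance probabilities. The toy model below, with invented state classes, residence times and disturbance rates (not the WRLF or VDDT/TELSA parameters), shows how landscape composition after 100 years can be compared between a contemporary and an elevated disturbance regime.

    ```python
    import numpy as np

    # Illustrative state-and-transition model: integer-coded forest classes with
    # a deterministic successional pathway plus stochastic stand-replacing
    # disturbance. All values are invented for the sketch.
    STATES = ["regen", "young", "mature", "old_growth"]
    NEXT_STATE = np.array([1, 2, 3, 3])             # successional pathway
    YEARS_IN_STATE = np.array([15, 40, 60, 10**6])  # residence time before succession

    def simulate(n_cells=20000, years=100, p_windthrow=0.002, p_fire=0.001, seed=0):
        rng = np.random.default_rng(seed)
        state = np.full(n_cells, 2)                          # start as mature forest
        age = rng.integers(0, 60, n_cells)
        for _ in range(years):
            age = age + 1
            disturbed = rng.random(n_cells) < (p_windthrow + p_fire)
            state[disturbed], age[disturbed] = 0, 0          # stand-replacing event
            ready = age >= YEARS_IN_STATE[state]
            state[ready], age[ready] = NEXT_STATE[state[ready]], 0
        share = np.bincount(state, minlength=4) / n_cells
        return dict(zip(STATES, np.round(share, 3)))

    print("contemporary disturbance:", simulate())
    print("doubled disturbance     :", simulate(p_windthrow=0.004, p_fire=0.002))
    ```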

  18. More than 10 years survival with sequential therapy in a patient with advanced renal cell carcinoma: a case report

    Energy Technology Data Exchange (ETDEWEB)

    Yuan, J.L.; Wang, F.L.; Yi, X.M.; Qin, W.J.; Wu, G.J. [Department of Urology, Xijing Hospital, Fourth Military Medical University, Xi'an, Shaanxi (China)]; Huan, Y. [Department of Radiology, Xijing Hospital, Fourth Military Medical University, Xi'an, Shaanxi (China)]; Yang, L.J.; Zhang, G.; Yu, L.; Zhang, Y.T.; Qin, R.L.; Tian, C.J. [Department of Urology, Xijing Hospital, Fourth Military Medical University, Xi'an, Shaanxi (China)]

    2014-10-31

    Although radical nephrectomy alone is widely accepted as the standard of care in localized treatment for renal cell carcinoma (RCC), it is not sufficient for the treatment of metastatic RCC (mRCC), which invariably leads to an unfavorable outcome despite the use of multiple therapies. Currently, sequential targeted agents are recommended for the management of mRCC, but the optimal drug sequence is still debated. This case was a 57-year-old man with clear-cell mRCC who received multiple therapies following his first operation in 2003 and has survived for over 10 years with a satisfactory quality of life. The treatments given included several surgeries, immunotherapy, and sequentially administered sorafenib, sunitinib, and everolimus regimens. In the course of mRCC treatment, well-planned surgeries, effective sequential targeted therapies and close follow-up are all of great importance for optimal management and a satisfactory outcome.

  19. Hemodynamic analysis of sequential graft from right coronary system to left coronary system.

    Science.gov (United States)

    Wang, Wenxin; Mao, Boyan; Wang, Haoran; Geng, Xueying; Zhao, Xi; Zhang, Huixia; Xie, Jinsheng; Zhao, Zhou; Lian, Bo; Liu, Youjun

    2016-12-28

    Sequential and single grafting are two surgical procedures of coronary artery bypass grafting. However, it remains unclear if the sequential graft can be used between the right and left coronary artery systems. The purpose of this paper is to clarify the possibility of anastomosing the right coronary artery system to the left coronary system. A patient-specific 3D model was first reconstructed based on coronary computed tomography angiography (CCTA) images. Two different grafts, the normal multi-graft (Model 1) and the novel multi-graft (Model 2), were then implemented on this patient-specific model using virtual surgery techniques. In Model 1, the single graft was anastomosed to the right coronary artery (RCA) and the sequential graft was adopted to anastomose the left anterior descending (LAD) and left circumflex artery (LCX). In Model 2, the single graft was anastomosed to the LAD and the sequential graft was adopted to anastomose the RCA and LCX. A zero-dimensional/three-dimensional (0D/3D) coupling method was used to realize the multi-scale simulation of both the pre-operative and the two post-operative models. Flow rates in the coronary artery and grafts were obtained. The hemodynamic parameters were also evaluated, including wall shear stress (WSS) and oscillatory shear index (OSI). The area of low WSS and OSI in Model 1 was much less than that in Model 2. Model 1 shows optimistic hemodynamic modifications which may enhance the long-term patency of grafts. The anterior segments of a sequential graft have better long-term patency than the posterior segments. With a rational spatial position of the heart vessels, the last anastomosis of a sequential graft should be connected to the main branch.

  20. On Locally Most Powerful Sequential Rank Tests

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2017-01-01

    Roč. 36, č. 1 (2017), s. 111-125 ISSN 0747-4946 R&D Projects: GA ČR GA17-07384S Grant - others:Nadační fond na podporu vědy(CZ) Neuron Institutional support: RVO:67985556 Keywords : nonparametric tests * sequential ranks * stopping variable Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 0.339, year: 2016 http://library.utia.cas.cz/separaty/2017/SI/kalina-0474065.pdf

  1. Decoding restricted participation in sequential electricity markets

    Energy Technology Data Exchange (ETDEWEB)

    Knaut, Andreas; Paschmann, Martin

    2017-06-15

    Restricted participation in sequential markets may cause high price volatility and welfare losses. In this paper we therefore analyze the drivers of restricted participation in the German intraday auction, which is a short-term electricity market with quarter-hourly products. Applying a fundamental electricity market model with 15-minute temporal resolution, we identify the lack of sub-hourly market coupling as the most relevant driver of restricted participation. We derive a proxy for price volatility and find that full market coupling may cause quarter-hourly price volatility to decrease by a factor close to four.

  2. THE DEVELOPMENT OF SPECIAL SEQUENTIALLY-TIMED

    Directory of Open Access Journals (Sweden)

    Stanislav LICHOROBIEC

    2016-06-01

    Full Text Available This article documents the development of the noninvasive use of explosives during the destruction of ice mass in river flows. The system of special sequentially-timed charges utilizes the increase in efficiency of cutting charges by covering them with bags filled with water, while simultaneously increasing the effect of the entire system of timed charges. Timing, spatial combinations during placement, and the linking of these charges results in the loosening of ice barriers on a frozen waterway, while at the same time regulating the size of the ice fragments. The developed charges will increase the operability and safety of IRS units.

  3. Pass-transistor asynchronous sequential circuits

    Science.gov (United States)

    Whitaker, Sterling R.; Maki, Gary K.

    1989-01-01

    Design methods for asynchronous sequential pass-transistor circuits, which result in circuits that are hazard- and critical-race-free and which have added degrees of freedom for the input signals, are discussed. The design procedures are straightforward and easy to implement. Two single-transition-time state assignment methods are presented, and hardware bounds for each are established. A surprising result is that the hardware realizations for each next state variable and output variable is identical for a given flow table. Thus, a state machine with N states and M outputs can be constructed using a single layout replicated N + M times.

  4. A sequential/parallel track selector

    CERN Document Server

    Bertolino, F; Bressani, Tullio; Chiavassa, E; Costa, S; Dellacasa, G; Gallio, M; Musso, A

    1980-01-01

    A medium speed (approximately 1 μs) hardware pre-analyzer for the selection of events detected in four planes of drift chambers in the magnetic field of the Omicron Spectrometer at the CERN SC is described. Specific geometrical criteria determine patterns of hits in the four planes of vertical wires that have to be recognized and that are stored as patterns of '1's in random access memories. Pairs of good hits are found sequentially, then the RAMs are used as look-up tables. (6 refs).

  5. Boundary conditions in random sequential adsorption

    Science.gov (United States)

    Cieśla, Michał; Ziff, Robert M.

    2018-04-01

    The influence of different boundary conditions on the density of random packings of disks is studied. Packings are generated using the random sequential adsorption algorithm with three different types of boundary conditions: periodic, open, and wall. It is found that the finite size effects are smallest for periodic boundary conditions, as expected. On the other hand, in the case of open and wall boundaries it is possible to introduce an effective packing size and a constant correction term to significantly improve the packing densities.
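    As a concrete illustration of the algorithm being studied, the sketch below runs random sequential adsorption of equal disks in a square box with periodic boundaries via the minimum-image convention. Stopping after a fixed number of consecutive rejected attempts is only a practical stand-in for the jamming limit, and the box size and cutoff are arbitrary choices, so the reported density sits somewhat below the known limit.

    ```python
    import numpy as np

    def rsa_disks_periodic(box=20.0, radius=0.5, max_failures=5000, seed=0):
        """Random sequential adsorption of equal disks with periodic boundaries.
        Stops after `max_failures` consecutive rejected insertion attempts."""
        rng = np.random.default_rng(seed)
        centers = np.empty((0, 2))
        failures = 0
        while failures < max_failures:
            trial = rng.uniform(0.0, box, size=2)
            if centers.shape[0]:
                d = np.abs(centers - trial)
                d = np.minimum(d, box - d)          # minimum-image convention
                if np.min(np.hypot(d[:, 0], d[:, 1])) < 2 * radius:
                    failures += 1
                    continue
            centers = np.vstack([centers, trial])
            failures = 0
        density = centers.shape[0] * np.pi * radius**2 / box**2
        return centers, density

    _, rho = rsa_disks_periodic()
    print(f"packing fraction ~ {rho:.3f} (the RSA jamming limit for disks is about 0.547)")
    ```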

  6. Automatic synthesis of sequential control schemes

    International Nuclear Information System (INIS)

    Klein, I.

    1993-01-01

    Of all hardware and software developed for industrial control purposes, the majority is devoted to sequential, or binary valued, control and only a minor part to classical linear control. Typically, the sequential parts of the controller are invoked during startup and shut-down to bring the system into its normal operating region and into some safe standby region, respectively. Despite its importance, fairly little theoretical research has been devoted to this area, and sequential control programs are therefore still created manually without much theoretical support to obtain a systematic approach. We propose a method to create sequential control programs automatically. The main idea is to spend some effort off-line modelling the plant, and from this model generate the control strategy, that is, the plan. The plant is modelled using action structures, thereby concentrating on the actions instead of the states of the plant. In general the planning problem shows exponential complexity in the number of state variables. However, by focusing on the actions, we can identify problem classes as well as algorithms such that the planning complexity is reduced to polynomial complexity. We prove that these algorithms are sound, i.e., the generated solution will solve the stated problem, and complete, i.e., if the algorithms fail, then no solution exists. The algorithms generate a plan as a set of actions and a partial order on this set specifying the execution order. The generated plan is proven to be minimal and maximally parallel. For a larger class of problems we propose a method to split the original problem into a number of simple problems that can each be solved using one of the presented algorithms. It is also shown how a plan can be translated into a GRAFCET chart, and to illustrate these ideas we have implemented a planning tool, i.e., a system that is able to automatically create control schemes. Such a tool can of course also be used on-line if it is fast enough. This

  7. From sequential to parallel programming with patterns

    CERN Document Server

    CERN. Geneva

    2018-01-01

    To increase both performance and efficiency, our programming models need to adapt to better exploit modern processors. The classic idioms and patterns for programming, such as loops, branches or recursion, are the pillars of almost all code and are well known among all programmers. These patterns all have in common that they are sequential in nature. Embracing parallel programming patterns, which allow us to program for multi- and many-core hardware in a natural way, greatly simplifies the task of designing a program that scales and performs on modern hardware, independently of the used programming language, and in a generic way.

  8. Sequential extraction of uranium metal contamination

    International Nuclear Information System (INIS)

    Murry, M.M.; Spitz, H.B.; Connick, W.B.

    2016-01-01

    Samples of uranium-contaminated dirt collected from the dirt floor of an abandoned metal rolling mill were analyzed for uranium using a sequential extraction protocol involving a series of five increasingly aggressive solvents. The quantity of uranium extracted from the contaminated dirt by each reagent can aid in predicting the fate and transport of the uranium contamination in the environment. Uranium was separated from each fraction using anion exchange and electrodeposition, and analyzed by alpha spectroscopy. Results demonstrate that approximately 77 % of the uranium was extracted using NH4Ac in 25 % acetic acid. (author)

  9. Simultaneous optimization of sequential IMRT plans

    International Nuclear Information System (INIS)

    Popple, Richard A.; Prellop, Perri B.; Spencer, Sharon A.; Santos, Jennifer F. de los; Duan, Jun; Fiveash, John B.; Brezovich, Ivan A.

    2005-01-01

    Radiotherapy often comprises two phases, in which irradiation of a volume at risk for microscopic disease is followed by a sequential dose escalation to a smaller volume either at a higher risk for microscopic disease or containing only gross disease. This technique is difficult to implement with intensity modulated radiotherapy, as the tolerance doses of critical structures must be respected over the sum of the two plans. Techniques that include an integrated boost have been proposed to address this problem. However, clinical experience with such techniques is limited, and many clinicians are uncomfortable prescribing nonconventional fractionation schemes. To solve this problem, we developed an optimization technique that simultaneously generates sequential initial and boost IMRT plans. We have developed an optimization tool that uses a commercial treatment planning system (TPS) and a high level programming language for technical computing. The tool uses the TPS to calculate the dose deposition coefficients (DDCs) for optimization. The DDCs were imported into external software and the treatment ports duplicated to create the boost plan. The initial, boost, and tolerance doses were specified and used to construct cost functions. The initial and boost plans were optimized simultaneously using a gradient search technique. Following optimization, the fluence maps were exported to the TPS for dose calculation. Seven patients treated using sequential techniques were selected from our clinical database. The initial and boost plans used to treat these patients were developed independently of each other by dividing the tolerance doses proportionally between the initial and boost plans and then iteratively optimizing the plans until a summation that met the treatment goals was obtained. We used the simultaneous optimization technique to generate plans that met the original planning goals. The coverage of the initial and boost target volumes in the simultaneously optimized

  10. Melioration as rational choice: sequential decision making in uncertain environments.

    Science.gov (United States)

    Sims, Chris R; Neth, Hansjörg; Jacobs, Robert A; Gray, Wayne D

    2013-01-01

    Melioration, defined as choosing a lesser, local gain over a greater longer-term gain, is a behavioral tendency that people and pigeons share. As such, the empirical occurrence of meliorating behavior has frequently been interpreted as evidence that the mechanisms of human choice violate the norms of economic rationality. In some environments, the relationship between actions and outcomes is known. In this case, the rationality of choice behavior can be evaluated in terms of how successfully it maximizes utility given knowledge of the environmental contingencies. In most complex environments, however, the relationship between actions and future outcomes is uncertain and must be learned from experience. When the difficulty of this learning challenge is taken into account, it is not evident that melioration represents suboptimal choice behavior. In the present article, we examine human performance in a sequential decision-making experiment that is known to induce meliorating behavior. In keeping with previous results using this paradigm, we find that the majority of participants in the experiment fail to adopt the optimal decision strategy and instead demonstrate a significant bias toward melioration. To explore the origins of this behavior, we develop a rational analysis (Anderson, 1990) of the learning problem facing individuals in uncertain decision environments. Our analysis demonstrates that an unbiased learner would adopt melioration as the optimal response strategy for maximizing long-term gain. We suggest that many documented cases of melioration can be reinterpreted not as irrational choice but rather as globally optimal choice under uncertainty.
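    The experiment's actual payoff schedule is not given here, so the sketch below uses an invented Herrnstein-style schedule in which the payoff of both options declines as the 'short-term' option is chosen more often; the short-term option always pays more locally, so a purely meliorating chooser drifts to the lower global rate while a maximizer does not.

    ```python
    import numpy as np
    from collections import deque

    def run(policy, n_trials=500, window=10, seed=0):
        """Two-option task whose payoffs depend on the recent allocation of
        choices (illustrative schedule, not the one used in the experiment)."""
        rng = np.random.default_rng(seed)
        recent = deque([0] * window, maxlen=window)   # 1 = chose short-term option
        total = 0.0
        for _ in range(n_trials):
            h = sum(recent) / window                  # recent allocation to 'short'
            pay_short, pay_long = 6.0 - 4.0 * h, 4.0 - 4.0 * h
            choice = policy(pay_short, pay_long, rng)
            total += pay_short if choice == 1 else pay_long
            recent.append(choice)
        return total / n_trials

    meliorate = lambda ps, pl, rng: 1 if ps > pl else 0   # chase the local gain
    maximize = lambda ps, pl, rng: 0                      # global optimum here
    print("meliorating agent:", run(meliorate))   # drifts toward ~2 per trial
    print("maximizing agent :", run(maximize))    # stays near 4 per trial
    ```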

  11. Fatigue reduction during aggregated and distributed sequential stimulation.

    Science.gov (United States)

    Bergquist, Austin J; Babbar, Vishvek; Ali, Saima; Popovic, Milos R; Masani, Kei

    2017-08-01

    Transcutaneous neuromuscular electrical stimulation (NMES) can generate muscle contractions for rehabilitation and exercise. However, NMES-evoked contractions are limited by fatigue when they are delivered "conventionally" (CONV) using a single active electrode. Researchers have developed "sequential" (SEQ) stimulation, involving rotation of pulses between multiple "aggregated" (AGGR-SEQ) or "distributed" (DISTR-SEQ) active electrodes, to reduce fatigue (torque-decline) by reducing motor unit discharge rates. The primary objective was to compare fatigue-related outcomes, "potentiation," "variability," and "efficiency" between CONV, AGGR-SEQ, and DISTR-SEQ stimulation of knee extensors in healthy participants. Torque and current were recorded during testing with fatiguing trains using each NMES type under isometric and isokinetic (180°/s) conditions. Compared with CONV stimulation, SEQ techniques reduced fatigue-related outcomes, increased potentiation, did not affect variability, and reduced efficiency. SEQ techniques hold promise for reducing fatigue during NMES-based rehabilitation and exercise; however, optimization is required to improve efficiency. Muscle Nerve 56: 271-281, 2017. © 2016 Wiley Periodicals, Inc.

  12. Modeling eye gaze patterns in clinician-patient interaction with lag sequential analysis.

    Science.gov (United States)

    Montague, Enid; Xu, Jie; Chen, Ping-Yu; Asan, Onur; Barrett, Bruce P; Chewning, Betty

    2011-10-01

    The aim of this study was to examine whether lag sequential analysis could be used to describe eye gaze orientation between clinicians and patients in the medical encounter. This topic is particularly important as new technologies are implemented into multiuser health care settings in which trust is critical and nonverbal cues are integral to achieving trust. This analysis method could lead to design guidelines for technologies and more effective assessments of interventions. Nonverbal communication patterns are important aspects of clinician-patient interactions and may affect patient outcomes. The eye gaze behaviors of clinicians and patients in 110 videotaped medical encounters were analyzed using the lag sequential method to identify significant behavior sequences. Lag sequential analysis included both event-based lag and time-based lag. Results from event-based lag analysis showed that the patient's gaze followed that of the clinician, whereas the clinician's gaze did not follow the patient's. Time-based sequential analysis showed that responses from the patient usually occurred within 2 s after the initial behavior of the clinician. Our data suggest that the clinician's gaze significantly affects the medical encounter but that the converse is not true. Findings from this research have implications for the design of clinical work systems and modeling interactions. Similar research methods could be used to identify different behavior patterns in clinical settings (physical layout, technology, etc.) to facilitate and evaluate clinical work system designs.
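    The record describes event-based lag analysis but not its computation, so here is a minimal lag-1 version: tabulate given-to-target transition counts and convert them to adjusted residuals (z-scores) using the standard adjusted-residual formula for sequential analysis. The gaze codes and the toy sequence are hypothetical.

    ```python
    import numpy as np

    def lag1_adjusted_residuals(seq, codes):
        """Lag-1 sequential analysis: observed transition counts and
        adjusted residuals (z-scores) for each given->target pair."""
        k = len(codes)
        idx = {c: i for i, c in enumerate(codes)}
        obs = np.zeros((k, k))
        for a, b in zip(seq[:-1], seq[1:]):
            obs[idx[a], idx[b]] += 1
        n = obs.sum()
        row, col = obs.sum(1, keepdims=True), obs.sum(0, keepdims=True)
        expected = row * col / n
        adj = (obs - expected) / np.sqrt(expected * (1 - row / n) * (1 - col / n))
        return obs, adj

    # hypothetical codes: C = clinician gazes at patient, P = patient gazes at
    # clinician, R = clinician gazes at record, O = other
    seq = list("CPCPRROCPCPCPRRCPOCPCPRRCP")
    obs, adj = lag1_adjusted_residuals(seq, codes=["C", "P", "R", "O"])
    print(obs)
    print(np.round(adj, 2))   # |z| > 1.96 flags transitions above/below chance
    ```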

  13. Bursts and heavy tails in temporal and sequential dynamics of foraging decisions.

    Directory of Open Access Journals (Sweden)

    Kanghoon Jung

    2014-08-01

    Full Text Available A fundamental understanding of behavior requires predicting when and what an individual will choose. However, the actual temporal and sequential dynamics of successive choices made among multiple alternatives remain unclear. In the current study, we tested the hypothesis that there is a general bursting property in both the timing and sequential patterns of foraging decisions. We conducted a foraging experiment in which rats chose among four different foods over a continuous two-week time period. Regarding when choices were made, we found bursts of rapidly occurring actions, separated by time-varying inactive periods, partially based on a circadian rhythm. Regarding what was chosen, we found sequential dynamics in affective choices characterized by two key features: (a) a highly biased choice distribution; and (b) preferential attachment, in which the animals were more likely to choose what they had previously chosen. To capture the temporal dynamics, we propose a dual-state model consisting of active and inactive states. We also introduce a satiation-attainment process for bursty activity, and a non-homogeneous Poisson process for longer inactivity between bursts. For the sequential dynamics, we propose a dual-control model consisting of goal-directed and habit systems, based on outcome valuation and choice history, respectively. This study provides insights into how the bursty nature of behavior emerges from the interaction of different underlying systems, leading to heavy tails in the distribution of behavior over time and choices.

  14. Sequential designs for sensitivity analysis of functional inputs in computer experiments

    International Nuclear Information System (INIS)

    Fruth, J.; Roustant, O.; Kuhnt, S.

    2015-01-01

    Computer experiments are nowadays commonly used to analyze industrial processes aiming at achieving a wanted outcome. Sensitivity analysis plays an important role in exploring the actual impact of adjustable parameters on the response variable. In this work we focus on sensitivity analysis of a scalar-valued output of a time-consuming computer code depending on scalar and functional input parameters. We investigate a sequential methodology, based on piecewise constant functions and sequential bifurcation, which is both economical and fully interpretable. The new approach is applied to a sheet metal forming problem in three sequential steps, resulting in new insights into the behavior of the forming process over time. - Highlights: • Sensitivity analysis method for functional and scalar inputs is presented. • We focus on the discovery of most influential parts of the functional domain. • We investigate economical sequential methodology based on piecewise constant functions. • Normalized sensitivity indices are introduced and investigated theoretically. • Successful application to sheet metal forming on two functional inputs

  15. Bursts and Heavy Tails in Temporal and Sequential Dynamics of Foraging Decisions

    Science.gov (United States)

    Jung, Kanghoon; Jang, Hyeran; Kralik, Jerald D.; Jeong, Jaeseung

    2014-01-01

    A fundamental understanding of behavior requires predicting when and what an individual will choose. However, the actual temporal and sequential dynamics of successive choices made among multiple alternatives remain unclear. In the current study, we tested the hypothesis that there is a general bursting property in both the timing and sequential patterns of foraging decisions. We conducted a foraging experiment in which rats chose among four different foods over a continuous two-week time period. Regarding when choices were made, we found bursts of rapidly occurring actions, separated by time-varying inactive periods, partially based on a circadian rhythm. Regarding what was chosen, we found sequential dynamics in affective choices characterized by two key features: (a) a highly biased choice distribution; and (b) preferential attachment, in which the animals were more likely to choose what they had previously chosen. To capture the temporal dynamics, we propose a dual-state model consisting of active and inactive states. We also introduce a satiation-attainment process for bursty activity, and a non-homogeneous Poisson process for longer inactivity between bursts. For the sequential dynamics, we propose a dual-control model consisting of goal-directed and habit systems, based on outcome valuation and choice history, respectively. This study provides insights into how the bursty nature of behavior emerges from the interaction of different underlying systems, leading to heavy tails in the distribution of behavior over time and choices. PMID:25122498

  16. A cohort study of maternal and neonatal morbidity in relation to use of sequential instruments at operative vaginal delivery.

    LENUS (Irish Health Repository)

    Murphy, Deirdre J

    2012-02-01

    OBJECTIVE: To evaluate the risk factors and maternal and neonatal morbidity associated with sequential use of instruments (vacuum and forceps) at operative vaginal delivery. STUDY DESIGN: A cohort study of 1360 nulliparous women delivered by a single instrument (vacuum or forceps) or by both instruments, within two university teaching hospitals in Scotland and England. Outcomes were compared for use of sequential instruments versus use of any single instrument. A sub-group analysis compared sequential instruments versus forceps alone. Outcomes of interest included anal sphincter tears, postpartum haemorrhage, urinary retention, urinary incontinence, prolonged hospital admission, neonatal trauma, low Apgar scores, abnormal cord bloods and admission to the neonatal intensive care unit (NICU). RESULTS: Use of sequential instruments at operative vaginal delivery was associated with fetal malpositions, Odds Ratio (OR) 1.8 (95% Confidence Interval (CI) 1.3-2.6), and large neonatal head circumference (>37 cm) (OR 5.0, 95% CI 2.6-9.7) but not with maternal obesity or grade of operator. Sequential use of instruments was associated with greater maternal and neonatal morbidity than single instrument use (anal sphincter tear 17.4% versus 8.4%, adjusted OR 2.1, 95% CI 1.2-3.3; umbilical artery pH <7.10, 13.8% versus 5.0%, adjusted OR 3.3, 95% CI 1.7-6.2). Sequential instrument use had greater morbidity than single instrument use with forceps alone (anal sphincter tear OR 1.8, 95% CI 1.1-2.9; umbilical artery pH <7.10 OR 3.0, 95% CI 1.7-5.5). CONCLUSIONS: The use of sequential instruments significantly increases maternal and neonatal morbidity. Obstetricians need training in the appropriate selection and use of instruments with the aim of completing delivery safely with one instrument.

  17. Hybrid Computerized Adaptive Testing: From Group Sequential Design to Fully Sequential Design

    Science.gov (United States)

    Wang, Shiyu; Lin, Haiyan; Chang, Hua-Hua; Douglas, Jeff

    2016-01-01

    Computerized adaptive testing (CAT) and multistage testing (MST) have become two of the most popular modes in large-scale computer-based sequential testing. Though most designs of CAT and MST exhibit strengths and weaknesses in recent large-scale implementations, there is no simple answer to the question of which design is better because different…

  18. Sequential and simultaneous choices: testing the diet selection and sequential choice models.

    Science.gov (United States)

    Freidin, Esteban; Aw, Justine; Kacelnik, Alex

    2009-03-01

    We investigate simultaneous and sequential choices in starlings, using Charnov's Diet Choice Model (DCM) and Shapiro, Siller and Kacelnik's Sequential Choice Model (SCM) to integrate function and mechanism. During a training phase, starlings encountered one food-related option per trial (A, B or R) in random sequence and with equal probability. A and B delivered food rewards after programmed delays (shorter for A), while R ('rejection') moved directly to the next trial without reward. In this phase we measured latencies to respond. In a later, choice, phase, birds encountered the pairs A-B, A-R and B-R, the first implementing a simultaneous choice and the second and third sequential choices. The DCM predicts when R should be chosen to maximize intake rate, and SCM uses latencies of the training phase to predict choices between any pair of options in the choice phase. The predictions of both models coincided, and both successfully predicted the birds' preferences. The DCM does not deal with partial preferences, while the SCM does, and experimental results were strongly correlated to this model's predictions. We believe that the SCM may expose a very general mechanism of animal choice, and that its wider domain of success reflects the greater ecological significance of sequential over simultaneous choices.

  19. Costs of achieving live birth from assisted reproductive technology: a comparison of sequential single and double embryo transfer approaches.

    Science.gov (United States)

    Crawford, Sara; Boulet, Sheree L; Mneimneh, Allison S; Perkins, Kiran M; Jamieson, Denise J; Zhang, Yujia; Kissin, Dmitry M

    2016-02-01

    To assess treatment and pregnancy/infant-associated medical costs and birth outcomes for assisted reproductive technology (ART) cycles in a subset of patients using elective double embryo transfer (ET) and to project the difference in costs and outcomes had the cycles instead been sequential single ETs (fresh followed by frozen if the fresh ET did not result in live birth). Retrospective cohort study using 2012 and 2013 data from the National ART Surveillance System. Infertility treatment centers. Fresh, autologous double ETs performed in 2012 among ART patients younger than 35 years of age with no prior ART use who cryopreserved at least one embryo. Sequential single and double ETs. Actual live birth rates and estimated ART treatment and pregnancy/infant-associated medical costs for double ET cycles started in 2012 and projected ART treatment and pregnancy/infant-associated medical costs if the double ET cycles had been performed as sequential single ETs. The estimated total ART treatment and pregnancy/infant-associated medical costs were $580.9 million for 10,001 double ETs started in 2012. If performed as sequential single ETs, estimated costs would have decreased by $195.0 million, to $386.0 million, and live birth rates would have increased from 57.7% to 68.0%. Sequential single ETs, when clinically appropriate, can reduce total ART treatment and pregnancy/infant-associated medical costs by reducing multiple births without lowering live birth rates. Published by Elsevier Inc.

  20. Sequential compression biomechanical device in patients with critical limb ischemia and nonreconstructible peripheral vascular disease.

    LENUS (Irish Health Repository)

    Sultan, Sherif

    2011-08-01

    Critical limb ischemia (CLI) patients who are unsuitable for intervention face the dire prospect of primary amputation. Sequential compression biomechanical device (SCBD) therapy provides a limb salvage option for these patients. This study assessed the outcome of SCBD in severe CLI patients who otherwise would face an amputation. Primary end points were limb salvage and 30-day mortality. Secondary end points were hemodynamic outcomes (increase in popliteal artery flow and toe pressure), ulcer healing, quality-adjusted time without symptoms of disease or toxicity of treatment (Q-TwiST), and cost-effectiveness.

  1. Characterization of a sequential pipeline approach to automatic tissue segmentation from brain MR Images

    International Nuclear Information System (INIS)

    Hou, Zujun; Huang, Su

    2008-01-01

    Quantitative analysis of gray matter and white matter in brain magnetic resonance imaging (MRI) is valuable for neuroradiology and clinical practice. Submission of large collections of MRI scans to pipeline processing is increasingly important. We characterized this process and suggest several improvements. To investigate tissue segmentation from brain MR images through a sequential approach, a pipeline that consecutively executes denoising, skull/scalp removal, intensity inhomogeneity correction and intensity-based classification was developed. The denoising phase employs a 3D-extension of the Bayes-Shrink method. The inhomogeneity is corrected by an improvement of Dawant et al.'s method with automatic generation of reference points. The N3 method has also been evaluated. Subsequently the brain tissue is segmented into cerebrospinal fluid, gray matter and white matter by a generalized Otsu thresholding technique. Intensive comparisons with other sequential or iterative methods have been carried out using simulated and real images. The sequential approach with judicious algorithm selection in each stage is not only advantageous in speed, but also can attain at least as accurate segmentation as iterative methods under a variety of noise or inhomogeneity levels. A sequential approach to tissue segmentation, which consecutively executes the wavelet shrinkage denoising, scalp/skull removal, inhomogeneity correction and intensity-based classification was developed to automatically segment the brain tissue into CSF, GM and WM from brain MR images. This approach is advantageous in several common applications, compared with other pipeline methods. (orig.)
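    The pipeline's classification stage uses a generalized Otsu threshold; as an illustration only, the sketch below implements a brute-force two-threshold Otsu on a synthetic 1-D intensity sample standing in for CSF/GM/WM voxel intensities after denoising and inhomogeneity correction. The class means and proportions are invented.

    ```python
    import numpy as np

    def two_threshold_otsu(values, n_bins=128):
        """Generalized (two-threshold) Otsu: exhaustively search the pair of
        thresholds maximising between-class variance of a 1-D sample."""
        hist, edges = np.histogram(values, bins=n_bins)
        p = hist / hist.sum()
        centers = 0.5 * (edges[:-1] + edges[1:])
        mu_global = (p * centers).sum()
        best, best_t = -1.0, (None, None)
        for i in range(1, n_bins - 1):
            for j in range(i + 1, n_bins):
                var_b = 0.0
                for lo, hi in ((0, i), (i, j), (j, n_bins)):
                    w = p[lo:hi].sum()
                    if w > 0:
                        mu = (p[lo:hi] * centers[lo:hi]).sum() / w
                        var_b += w * (mu - mu_global) ** 2
                if var_b > best:
                    best, best_t = var_b, (centers[i], centers[j])
        return best_t

    # synthetic intensities standing in for CSF / GM / WM
    rng = np.random.default_rng(0)
    intensities = np.concatenate([rng.normal(30, 5, 2000),    # CSF
                                  rng.normal(80, 7, 5000),    # GM
                                  rng.normal(120, 6, 5000)])  # WM
    t1, t2 = two_threshold_otsu(intensities)
    labels = np.digitize(intensities, [t1, t2])   # 0 = CSF, 1 = GM, 2 = WM
    print("thresholds:", round(t1, 1), round(t2, 1),
          "class sizes:", np.bincount(labels))
    ```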

  2. Optimal Sequential Diagnostic Strategy Generation Considering Test Placement Cost for Multimode Systems

    Directory of Open Access Journals (Sweden)

    Shigang Zhang

    2015-10-01

    Full Text Available Sequential fault diagnosis is an approach that realizes fault isolation by executing the optimal test step by step. The strategy used, i.e., the sequential diagnostic strategy, has great influence on diagnostic accuracy and cost. Optimal sequential diagnostic strategy generation is an important step in the process of diagnosis system construction, which has been studied extensively in the literature. However, previous algorithms either are designed for single mode systems or do not consider test placement cost. They are not suitable to solve the sequential diagnostic strategy generation problem considering test placement cost for multimode systems. Therefore, this problem is studied in this paper. A formulation is presented. Two algorithms are proposed, one of which is realized by system transformation and the other is newly designed. Extensive simulations are carried out to test the effectiveness of the algorithms. A real-world system is also presented. All the results show that both of them have the ability to solve the diagnostic strategy generation problem, and they have different characteristics.

  3. Optimal Sequential Diagnostic Strategy Generation Considering Test Placement Cost for Multimode Systems

    Science.gov (United States)

    Zhang, Shigang; Song, Lijun; Zhang, Wei; Hu, Zheng; Yang, Yongmin

    2015-01-01

    Sequential fault diagnosis is an approach that realizes fault isolation by executing the optimal test step by step. The strategy used, i.e., the sequential diagnostic strategy, has great influence on diagnostic accuracy and cost. Optimal sequential diagnostic strategy generation is an important step in the process of diagnosis system construction, which has been studied extensively in the literature. However, previous algorithms either are designed for single mode systems or do not consider test placement cost. They are not suitable to solve the sequential diagnostic strategy generation problem considering test placement cost for multimode systems. Therefore, this problem is studied in this paper. A formulation is presented. Two algorithms are proposed, one of which is realized by system transformation and the other is newly designed. Extensive simulations are carried out to test the effectiveness of the algorithms. A real-world system is also presented. All the results show that both of them have the ability to solve the diagnostic strategy generation problem, and they have different characteristics. PMID:26457709
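    Neither of the paper's two algorithms is reproduced in the record, so the sketch below uses a simpler greedy heuristic for the same kind of problem: at each node pick the binary test with the largest entropy reduction per unit cost, where a test's cost includes a one-time placement cost charged the first time it enters the strategy. The fault priors and test table are invented, single-mode placeholders.

    ```python
    import math

    # Hypothetical example: 4 candidate faults with priors, and binary tests;
    # each test has (faults it detects, execution cost, placement cost).
    FAULTS = {"f1": 0.4, "f2": 0.3, "f3": 0.2, "f4": 0.1}
    TESTS = {
        "t1": ({"f1", "f2"}, 1.0, 2.0),
        "t2": ({"f1", "f3"}, 1.0, 5.0),
        "t3": ({"f4"}, 0.5, 1.0),
    }

    def entropy(probs):
        s = sum(probs)
        return -sum(p / s * math.log2(p / s) for p in probs if p > 0)

    def build(candidates, placed):
        """Greedy strategy: best entropy reduction per unit cost, then recurse
        on the test's pass/fail outcome sets."""
        if len(candidates) <= 1:
            return next(iter(candidates))
        best = None
        for t, (detects, c_exec, c_place) in TESTS.items():
            pos = {f for f in candidates if f in detects}
            neg = candidates - pos
            if not pos or not neg:
                continue                      # test does not split this set
            h0 = entropy([FAULTS[f] for f in candidates])
            p_pos = sum(FAULTS[f] for f in pos) / sum(FAULTS[f] for f in candidates)
            h1 = p_pos * entropy([FAULTS[f] for f in pos]) \
                 + (1 - p_pos) * entropy([FAULTS[f] for f in neg])
            cost = c_exec + (0.0 if t in placed else c_place)
            gain = (h0 - h1) / cost
            if best is None or gain > best[0]:
                best = (gain, t, pos, neg)
        if best is None:                      # no test separates the remaining faults
            return sorted(candidates)
        _, t, pos, neg = best
        return {t: {"fail": build(pos, placed | {t}),
                    "pass": build(neg, placed | {t})}}

    print(build(set(FAULTS), placed=set()))
    ```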

  4. Distortionary effects of a production-sharing fiscal system in a sequential modular offshore petroleum project

    Science.gov (United States)

    Neves de Campos, Thiago

    This research examines the distortionary effects of a discovered and undeveloped sequential modular offshore project under five different designs for a production-sharing agreement (PSA). The model differs from previous research by looking at the effect of taxation from the perspective of a host government, where the objective is to maximize government utility over government revenue generated by the project and the non-pecuniary benefits to society. This research uses Modern Asset Pricing (MAP) theory, which is able to provide a good measure of the asset value accruing to various stakeholders in the project combined with the optimal decision rule for the development of the investment opportunity. Monte Carlo simulation was also applied to incorporate into the model the most important sources of risk associated with the project and to account for non-linearity in the cash flows. For a complete evaluation of how the fiscal system affects the project development, an investor's behavioral model was constructed, incorporating three operational decisions: investment timing, capacity size and early abandonment. The model considers four sources of uncertainty that affect the project value and the firm's optimal decision: the long-run oil price and short-run deviations from that price, cost escalation and the reservoir recovery rate. The optimization outcomes show that all fiscal systems evaluated distort the companies' optimal decisions, and companies adjust their choices to avoid taxation in different ways according to the fiscal system characteristics. Moreover, it is revealed that fiscal systems with tax provisions that try to capture additional project profits based on production profitability measures lead to stronger distortions in the project investment and output profile. It is also shown that a model based on a fixed percentage rate is the system that creates the least distortion. This is because companies will be subjected to the same

  5. A framework of knowledge creation processes in participatory simulation of hospital work systems

    DEFF Research Database (Denmark)

    Andersen, Simone Nyholm; Broberg, Ole

    2017-01-01

    Participatory simulation (PS) is a method to involve workers in simulating and designing their own future work system. Existing PS studies have focused on analysing the outcome, and minimal attention has been devoted to the process of creating this outcome. In order to study this process, we suggest applying a knowledge creation perspective. The aim of this study was to develop a framework describing the process of how ergonomics knowledge is created in PS. Video recordings from three projects applying PS of hospital work systems constituted the foundation of process mining analysis. The analysis resulted in a framework revealing the sources of ergonomics knowledge creation as sequential relationships between the activities of simulation participants sharing work experiences; experimenting with scenarios; and reflecting on ergonomics consequences. We argue that this framework reveals

  6. Sequential probability ratio controllers for safeguards radiation monitors

    International Nuclear Information System (INIS)

    Fehlau, P.E.; Coop, K.L.; Nixon, K.V.

    1984-01-01

    Sequential hypothesis tests applied to nuclear safeguards accounting methods make the methods more sensitive to detecting diversion. The sequential tests also improve transient signal detection in safeguards radiation monitors. This paper describes three microprocessor control units with sequential probability-ratio tests for detecting transient increases in radiation intensity. The control units are designed for three specific applications: low-intensity monitoring with Poisson probability ratios, higher intensity gamma-ray monitoring where fixed counting intervals are shortened by sequential testing, and monitoring moving traffic where the sequential technique responds to variable-duration signals. The fixed-interval controller shortens a customary 50-s monitoring time to an average of 18 s, making the monitoring delay less bothersome. The controller for monitoring moving vehicles benefits from the sequential technique by maintaining more than half its sensitivity when the normal passage speed doubles
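    To illustrate why sequential testing shortens the customary fixed counting time, the sketch below runs a Poisson SPRT in short sub-intervals and truncates at a 50-second limit, reporting the average time to a decision; the background and alarm count rates and the error rates are hypothetical, not the monitors' actual settings.

    ```python
    import numpy as np

    def sprt_decision_time(rate_bg, rate_alarm, true_rate, dt=1.0, t_max=50.0,
                           alpha=0.01, beta=0.05, seed=0, n_runs=2000):
        """Average time for a Poisson SPRT monitor to reach a decision,
        counting in dt-second sub-intervals and truncating at t_max seconds
        (the customary fixed counting time)."""
        rng = np.random.default_rng(seed)
        up, lo = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
        times = []
        for _ in range(n_runs):
            llr, t = 0.0, 0.0
            while t < t_max:
                c = rng.poisson(true_rate * dt)
                llr += c * np.log(rate_alarm / rate_bg) - (rate_alarm - rate_bg) * dt
                t += dt
                if llr >= up or llr <= lo:
                    break
            times.append(t)
        return np.mean(times)

    # hypothetical rates: 5 counts/s background, 8 counts/s alarm level
    print("avg time, background only:", sprt_decision_time(5.0, 8.0, true_rate=5.0))
    print("avg time, source present :", sprt_decision_time(5.0, 8.0, true_rate=8.0))
    ```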

  7. Small median tumor diameter at cure threshold (lung cancers in male smokers) predicts both chest X-ray and CT screening outcomes in a novel simulation framework.

    Science.gov (United States)

    Goldwasser, Deborah L; Kimmel, Marek

    2013-01-01

    The effectiveness of population-wide lung cancer screening strategies depends on the underlying natural course of lung cancer. We evaluate the expected stage distribution in the Mayo CT screening study under an existing simulation model of non-small cell lung cancer (NSCLC) progression calibrated to the Mayo lung project (MLP). Within a likelihood framework, we evaluate whether the probability of 5-year NSCLC survival conditional on tumor diameter at detection depends significantly on screening detection modality, namely chest X-ray and computed tomography. We describe a novel simulation framework in which tumor progression depends on cellular proliferation and mutation within a stem cell compartment of the tumor. We fit this model to randomized trial data from the MLP and produce estimates of the median radiologic size at the cure threshold. We examine the goodness of model fit with respect to radiologic tumor size and 5-year NSCLC survival among incident cancers in both the MLP and Mayo CT studies. An existing model of NSCLC progression under-predicts the number of advanced-stage incident NSCLCs among males in the Mayo CT study (p-value = 0.004). The probability of 5-year NSCLC survival conditional on tumor diameter depends significantly on detection modality (p-value = 0.0312). In our new model, selected solution sets having a median tumor diameter of 16.2-22.1 mm at cure threshold among aggressive NSCLCs predict both MLP and Mayo CT outcomes. We conclude that the median lung tumor diameter at cure threshold among aggressive NSCLCs in male smokers may be small (<20 mm). Copyright © 2012 UICC.

  8. Comparison of population-averaged and cluster-specific models for the analysis of cluster randomized trials with missing binary outcomes: a simulation study

    Directory of Open Access Journals (Sweden)

    Ma Jinhui

    2013-01-01

    Full Text Available Abstract Background The objective of this simulation study is to compare the accuracy and efficiency of population-averaged (i.e. generalized estimating equations (GEE)) and cluster-specific (i.e. random-effects logistic regression (RELR)) models for analyzing data from cluster randomized trials (CRTs) with missing binary responses. Methods In this simulation study, clustered responses were generated from a beta-binomial distribution. The number of clusters per trial arm, the number of subjects per cluster, intra-cluster correlation coefficient, and the percentage of missing data were allowed to vary. Under the assumption of covariate dependent missingness, missing outcomes were handled by complete case analysis, standard multiple imputation (MI) and within-cluster MI strategies. Data were analyzed using GEE and RELR. Performance of the methods was assessed using standardized bias, empirical standard error, root mean squared error (RMSE), and coverage probability. Results GEE performs well on all four measures — provided the downward bias of the standard error (when the number of clusters per arm is small) is adjusted appropriately — under the following scenarios: complete case analysis for CRTs with a small amount of missing data; standard MI for CRTs with variance inflation factor (VIF 50. RELR performs well only when a small amount of data was missing, and complete case analysis was applied. Conclusion GEE performs well as long as appropriate missing data strategies are adopted based on the design of CRTs and the percentage of missing data. In contrast, RELR does not perform well when either standard or within-cluster MI strategy is applied prior to the analysis.
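    The data-generating step of such a simulation can be sketched as follows: cluster-level response probabilities are drawn from a Beta distribution parameterized so the intra-cluster correlation equals a chosen ICC (for Beta(a, b), ICC = 1/(a + b + 1)), and missingness is then made covariate (arm) dependent. All parameter values are illustrative, and the generated rows would subsequently be analyzed with GEE and random-effects logistic models as in the study.

    ```python
    import numpy as np

    def simulate_crt(n_clusters=20, cluster_size=30, p0=0.3, p1=0.4,
                     icc=0.05, p_missing=0.2, seed=0):
        """One cluster-randomised trial with binary outcomes whose within-cluster
        correlation follows a beta-binomial model, plus arm-dependent missingness."""
        rng = np.random.default_rng(seed)
        rows = []
        for c in range(n_clusters):
            arm = c % 2                                   # alternate treatment arms
            p = p1 if arm else p0
            # Beta parameters chosen so the intra-cluster correlation equals icc
            a, b = p * (1 - icc) / icc, (1 - p) * (1 - icc) / icc
            p_cluster = rng.beta(a, b)
            y = rng.random(cluster_size) < p_cluster
            # covariate-dependent missingness: more dropout in the treated arm
            miss = rng.random(cluster_size) < (p_missing + 0.1 * arm)
            for yi, mi in zip(y, miss):
                rows.append((c, arm, np.nan if mi else float(yi)))
        return np.array(rows)

    data = simulate_crt()
    observed = data[~np.isnan(data[:, 2])]
    for arm in (0, 1):
        sub = observed[observed[:, 1] == arm]
        print(f"arm {arm}: observed n = {len(sub)}, response rate = {sub[:, 2].mean():.3f}")
    ```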

  9. Equivalence between quantum simultaneous games and quantum sequential games

    OpenAIRE

    Kobayashi, Naoki

    2007-01-01

    A framework for discussing relationships between different types of games is proposed. Within the framework, quantum simultaneous games, finite quantum simultaneous games, quantum sequential games, and finite quantum sequential games are defined. In addition, a notion of equivalence between two games is defined. Finally, the following three theorems are shown: (1) For any quantum simultaneous game G, there exists a quantum sequential game equivalent to G. (2) For any finite quantum simultaneo...

  10. Discrimination between sequential and simultaneous virtual channels with electrical hearing

    OpenAIRE

    Landsberger, David; Galvin, John J.

    2011-01-01

    In cochlear implants (CIs), simultaneous or sequential stimulation of adjacent electrodes can produce intermediate pitch percepts between those of the component electrodes. However, it is unclear whether simultaneous and sequential virtual channels (VCs) can be discriminated. In this study, CI users were asked to discriminate simultaneous and sequential VCs; discrimination was measured for monopolar (MP) and bipolar + 1 stimulation (BP + 1), i.e., relatively broad and focused stimulation mode...

  11. C-quence: a tool for analyzing qualitative sequential data.

    Science.gov (United States)

    Duncan, Starkey; Collier, Nicholson T

    2002-02-01

    C-quence is a software application that matches sequential patterns of qualitative data specified by the user and calculates the rate of occurrence of these patterns in a data set. Although it was designed to facilitate analyses of face-to-face interaction, it is applicable to any data set involving categorical data and sequential information. C-quence queries are constructed using a graphical user interface. The program does not limit the complexity of the sequential patterns specified by the user.
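    C-quence's own query language is not documented in the record; the sketch below only shows the underlying idea of scoring a user-specified sequential pattern of categorical codes as a rate over all possible starting positions. The interaction codes and example sequence are hypothetical.

    ```python
    from typing import List, Tuple

    def match_rate(events: List[str], pattern: Tuple[str, ...]) -> float:
        """Rate of occurrence of a sequential pattern of categorical codes,
        counted over all possible starting positions (overlaps allowed)."""
        k = len(pattern)
        if len(events) < k:
            return 0.0
        hits = sum(tuple(events[i:i + k]) == pattern
                   for i in range(len(events) - k + 1))
        return hits / (len(events) - k + 1)

    # hypothetical codes: S = speaker gazes at listener, L = listener gazes at
    # speaker, A = averted gaze
    events = list("SLASLLSLASLSLAASL")
    print(match_rate(events, ("S", "L")))        # speaker gaze followed by listener gaze
    print(match_rate(events, ("S", "L", "A")))   # three-event pattern
    ```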

  12. Simulation-based multiprofessional obstetric anaesthesia training conducted in situ versus off-site leads to similar individual and team outcomes: a randomised educational trial

    NARCIS (Netherlands)

    Sorensen, J.L.; Vleuten, C. van der; Rosthoj, S.; Ostergaard, D.; Leblanc, V.; Johansen, M.; Ekelund, K.; Starkopf, L.; Lindschou, J.; Gluud, C.; Weikop, P.; Ottesen, B.

    2015-01-01

    OBJECTIVE: To investigate the effect of in situ simulation (ISS) versus off-site simulation (OSS) on knowledge, patient safety attitude, stress, motivation, perceptions of simulation, team performance and organisational impact. DESIGN: Investigator-initiated single-centre randomised superiority

  13. Discrimination between sequential and simultaneous virtual channels with electrical hearing.

    Science.gov (United States)

    Landsberger, David; Galvin, John J

    2011-09-01

    In cochlear implants (CIs), simultaneous or sequential stimulation of adjacent electrodes can produce intermediate pitch percepts between those of the component electrodes. However, it is unclear whether simultaneous and sequential virtual channels (VCs) can be discriminated. In this study, CI users were asked to discriminate simultaneous and sequential VCs; discrimination was measured for monopolar (MP) and bipolar + 1 stimulation (BP + 1), i.e., relatively broad and focused stimulation modes. For sequential VCs, the interpulse interval (IPI) varied between 0.0 and 1.8 ms. All stimuli were presented at comfortably loud, loudness-balanced levels at a 250 pulse per second per electrode (ppse) stimulation rate. On average, CI subjects were able to reliably discriminate between sequential and simultaneous VCs. While there was no significant effect of IPI or stimulation mode on VC discrimination, some subjects exhibited better VC discrimination with BP + 1 stimulation. Subjects' discrimination between sequential and simultaneous VCs was correlated with electrode discrimination, suggesting that spatial selectivity may influence perception of sequential VCs. To maintain equal loudness, sequential VC amplitudes were nearly double those of simultaneous VCs, presumably resulting in a broader spread of excitation. These results suggest that perceptual differences between simultaneous and sequential VCs might be explained by differences in the spread of excitation. © 2011 Acoustical Society of America

  14. Lineup composition, suspect position, and the sequential lineup advantage.

    Science.gov (United States)

    Carlson, Curt A; Gronlund, Scott D; Clark, Steven E

    2008-06-01

    N. M. Steblay, J. Dysart, S. Fulero, and R. C. L. Lindsay (2001) argued that sequential lineups reduce the likelihood of mistaken eyewitness identification. Experiment 1 replicated the design of R. C. L. Lindsay and G. L. Wells (1985), the first study to show the sequential lineup advantage. However, the innocent suspect was chosen at a lower rate in the simultaneous lineup, and no sequential lineup advantage was found. This led the authors to hypothesize that protection from a sequential lineup might emerge only when an innocent suspect stands out from the other lineup members. In Experiment 2, participants viewed a simultaneous or sequential lineup with either the guilty suspect or 1 of 3 innocent suspects. Lineup fairness was varied to influence the degree to which a suspect stood out. A sequential lineup advantage was found only for the unfair lineups. Additional analyses of suspect position in the sequential lineups showed an increase in the diagnosticity of suspect identifications as the suspect was placed later in the sequential lineup. These results suggest that the sequential lineup advantage is dependent on lineup composition and suspect position. (c) 2008 APA, all rights reserved

  15. Group-sequential analysis may allow for early trial termination

    DEFF Research Database (Denmark)

    Gerke, Oke; Vilstrup, Mie H; Halekoh, Ulrich

    2017-01-01

    BACKGROUND: Group-sequential testing is widely used in pivotal therapeutic, but rarely in diagnostic, research, although it may save studies, time, and costs. The purpose of this paper was to demonstrate a group-sequential analysis strategy in an intra-observer study on quantitative FDG-PET/CT measurements...
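
    The truncated abstract does not give the specific design, but the basic idea of group-sequential testing can be illustrated with a short simulation. The sketch below is only a generic Pocock-style illustration, not the analysis strategy of the cited study: it estimates by Monte Carlo the constant critical value that keeps the overall two-sided type-I error near 0.05 across K interim looks.

```python
# Generic sketch of a Pocock-style group-sequential boundary (illustrative only;
# the number of looks, spacing and endpoint of the cited study are assumptions).
import numpy as np

rng = np.random.default_rng(0)
K = 4            # number of equally spaced analyses (assumed)
n_sim = 200_000  # simulated trials under the null hypothesis

# Cumulative z-statistics at each look: Z_k = S_k / sqrt(k), S_k a sum of iid N(0,1).
increments = rng.standard_normal((n_sim, K))
z = np.cumsum(increments, axis=1) / np.sqrt(np.arange(1, K + 1))
max_abs_z = np.abs(z).max(axis=1)

# Find the constant boundary c whose overall two-sided type-I error is closest to 5%.
grid = np.linspace(1.9, 2.8, 91)
rejection = [(c, (max_abs_z > c).mean()) for c in grid]
c_star, alpha = min(rejection, key=lambda t: abs(t[1] - 0.05))
print(f"constant boundary ~ {c_star:.2f}, simulated overall alpha = {alpha:.3f}")
# A single fixed-sample test would use 1.96; the larger boundary is the price
# paid for being allowed to stop the study early at an interim look.
```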

  16. Sequential infiltration synthesis for advanced lithography

    Energy Technology Data Exchange (ETDEWEB)

    Darling, Seth B.; Elam, Jeffrey W.; Tseng, Yu-Chih; Peng, Qing

    2017-10-10

    A plasma etch resist material modified by an inorganic protective component via sequential infiltration synthesis (SIS) and methods of preparing the modified resist material. The modified resist material is characterized by an improved resistance to a plasma etching or related process relative to the unmodified resist material, thereby allowing formation of patterned features into a substrate material, which may be high-aspect ratio features. The SIS process forms the protective component within the bulk resist material through a plurality of alternating exposures to gas phase precursors which infiltrate the resist material. The plasma etch resist material may be initially patterned using photolithography, electron-beam lithography or a block copolymer self-assembly process.

  17. Clinical evaluation of synthetic aperture sequential beamforming

    DEFF Research Database (Denmark)

    Hansen, Peter Møller; Hemmsen, Martin Christian; Lange, Theis

    2012-01-01

    In this study, clinically relevant ultrasound images generated with synthetic aperture sequential beamforming (SASB) are compared to images generated with a conventional technique. The advantage of SASB is the ability to produce high-resolution ultrasound images with a high frame rate while massively reducing the amount of generated data. SASB was implemented in a system consisting of a conventional ultrasound scanner connected to a PC via a research interface. This setup enables simultaneous recording with both SASB and the conventional technique. Eighteen volunteers were ultrasound scanned abdominally, and 84 sequence pairs were recorded. Each sequence pair consists of two simultaneous recordings of the same anatomical location with SASB and conventional B-mode imaging. The images were evaluated in terms of spatial resolution, contrast, unwanted artifacts, and penetration depth...

  18. Sequential cooling insert for turbine stator vane

    Science.gov (United States)

    Jones, Russel B

    2017-04-04

    A sequential flow cooling insert for a turbine stator vane of a small gas turbine engine, where the impingement cooling insert is formed as a single piece from a metal additive manufacturing process such as 3D metal printing, and where the insert includes a plurality of rows of radial extending impingement cooling air holes alternating with rows of radial extending return air holes on a pressure side wall, and where the insert includes a plurality of rows of chordwise extending second impingement cooling air holes on a suction side wall. The insert includes alternating rows of radial extending cooling air supply channels and return air channels that form a series of impingement cooling on the pressure side followed by the suction side of the insert.

  19. Gleason-Busch theorem for sequential measurements

    Science.gov (United States)

    Flatt, Kieran; Barnett, Stephen M.; Croke, Sarah

    2017-12-01

    Gleason's theorem is a statement that, given some reasonable assumptions, the Born rule used to calculate probabilities in quantum mechanics is essentially unique [A. M. Gleason, Indiana Univ. Math. J. 6, 885 (1957), 10.1512/iumj.1957.6.56050]. We show that Gleason's theorem contains within it also the structure of sequential measurements, and along with this the state update rule. We give a small set of axioms, which are physically motivated and analogous to those in Busch's proof of Gleason's theorem [P. Busch, Phys. Rev. Lett. 91, 120403 (2003), 10.1103/PhysRevLett.91.120403], from which the familiar Kraus operator form follows. An axiomatic approach has practical relevance as well as fundamental interest, in making clear those assumptions which underlie the security of quantum communication protocols. Interestingly, the two-time formalism is seen to arise naturally in this approach.
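
    The "familiar Kraus operator form" referred to above is not written out in the abstract. For orientation, the standard textbook statement of the sequential-measurement rule is sketched below; this is the conventional form, not the paper's axiomatic derivation.

```latex
% Standard Kraus-form rules for a sequential measurement (textbook form; the
% paper derives these from Gleason/Busch-type axioms rather than assuming them).
% K_a are Kraus operators with \sum_a K_a^\dagger K_a = I, and E_b is the POVM
% element of the second measurement.
\[
  p(a) = \operatorname{Tr}\left[K_a \rho K_a^\dagger\right], \qquad
  \rho_a = \frac{K_a \rho K_a^\dagger}{\operatorname{Tr}\left[K_a \rho K_a^\dagger\right]},
\]
\[
  p(a,b) = \operatorname{Tr}\left[E_b\, K_a \rho K_a^\dagger\right]
         = p(a)\,\operatorname{Tr}\left[E_b\, \rho_a\right].
\]
```

    The first line is the state-update rule whose uniqueness the paper addresses; the second gives the joint probability of outcome a followed by outcome b.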

  20. Multilevel sequential Monte-Carlo samplers

    KAUST Repository

    Jasra, Ajay

    2016-01-01

    Multilevel Monte-Carlo methods provide a powerful computational technique for reducing the computational cost of estimating expectations for a given computational effort. They are particularly relevant for computational problems when approximate distributions are determined via a resolution parameter h, with h=0 giving the theoretical exact distribution (e.g. SDEs or inverse problems with PDEs). The method provides a benefit by coupling samples from successive resolutions, and estimating differences of successive expectations. We develop a methodology that brings Sequential Monte-Carlo (SMC) algorithms within the framework of the Multilevel idea, as SMC provides a natural set-up for coupling samples over different resolutions. We prove that the new algorithm indeed preserves the benefits of the multilevel principle, even if samples at all resolutions are now correlated.
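
    As a concrete illustration of the multilevel idea, the sketch below is a plain multilevel Monte Carlo estimator (not the SMC sampler of the abstract) for a hypothetical geometric Brownian motion discretised by Euler-Maruyama, coupling fine and coarse paths through shared Brownian increments; all parameter values are assumptions chosen for demonstration.

```python
# Minimal multilevel Monte Carlo sketch: estimate E[X_T] for a geometric Brownian
# motion, coupling successive resolutions via shared Brownian increments.
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, x0, T = 0.05, 0.2, 1.0, 1.0   # assumed toy SDE parameters

def level_estimator(level, n_paths):
    """Return the sample mean of P_l - P_{l-1} (with P_{-1} := 0) for one level."""
    n_fine = 2 ** level
    h_fine = T / n_fine
    dW = rng.standard_normal((n_paths, n_fine)) * np.sqrt(h_fine)

    x_fine = np.full(n_paths, x0)
    for k in range(n_fine):
        x_fine = x_fine + mu * x_fine * h_fine + sigma * x_fine * dW[:, k]

    if level == 0:
        return x_fine.mean()

    # Coarse path uses the *same* Brownian motion, with pairs of increments summed.
    h_coarse = 2 * h_fine
    dW_coarse = dW[:, 0::2] + dW[:, 1::2]
    x_coarse = np.full(n_paths, x0)
    for k in range(n_fine // 2):
        x_coarse = x_coarse + mu * x_coarse * h_coarse + sigma * x_coarse * dW_coarse[:, k]

    return (x_fine - x_coarse).mean()

L, n_paths = 5, 20_000
estimate = sum(level_estimator(l, n_paths) for l in range(L + 1))
print(f"MLMC estimate of E[X_T]: {estimate:.4f} (exact: {x0 * np.exp(mu * T):.4f})")
```

    The telescoping sum over levels reproduces the fine-level estimator while spending most samples on the cheap coarse levels; the paper's contribution is to carry this coupling over to SMC, where samples at different resolutions become correlated.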

  1. Sequential Stereotype Priming: A Meta-Analysis.

    Science.gov (United States)

    Kidder, Ciara K; White, Katherine R; Hinojos, Michelle R; Sandoval, Mayra; Crites, Stephen L

    2017-08-01

    Psychological interest in stereotype measurement has spanned nearly a century, with researchers adopting implicit measures in the 1980s to complement explicit measures. One of the most frequently used implicit measures of stereotypes is the sequential priming paradigm. The current meta-analysis examines stereotype priming, focusing specifically on this paradigm. To contribute to ongoing discussions regarding methodological rigor in social psychology, one primary goal was to identify methodological moderators of the stereotype priming effect-whether priming is due to a relation between the prime and target stimuli, the prime and target response, participant task, stereotype dimension, stimulus onset asynchrony (SOA), and stimuli type. Data from 39 studies yielded 87 individual effect sizes from 5,497 participants. Analyses revealed that stereotype priming is significantly moderated by the presence of prime-response relations, participant task, stereotype dimension, target stimulus type, SOA, and prime repetition. These results carry both practical and theoretical implications for future research on stereotype priming.

  2. Sequential Acral Lentiginous Melanomas of the Foot

    Directory of Open Access Journals (Sweden)

    Jiro Uehara

    2010-12-01

    A 64-year-old Japanese woman had a lightly brown-blackish pigmented macule (1.2 cm in diameter) on the left sole of her foot. She received surgical excision following a diagnosis of acral lentiginous melanoma (ALM), which was confirmed histopathologically. One month after the operation, a second melanoma lesion was noticed adjacent to the grafted site. Histopathologically, the two lesions had no continuity, but HMB-45 and cyclin D1 double-positive cells were detected not only on aggregates of atypical melanocytes but also on single cells near the cutting edge of the first lesion. The unique occurrence of a sequential lesion of a primary melanoma might be caused by stimulated subclinical field cells during the wound healing process following the initial operation. This case warrants further investigation to establish the appropriate surgical margin of ALM lesions.

  3. Dancing Twins: Stellar Hierarchies That Formed Sequentially?

    Science.gov (United States)

    Tokovinin, Andrei

    2018-04-01

    This paper draws attention to the class of resolved triple stars with moderate ratios of inner and outer periods (possibly in a mean motion resonance) and nearly circular, mutually aligned orbits. Moreover, stars in the inner pair are twins with almost identical masses, while the mass sum of the inner pair is comparable to the mass of the outer component. Such systems could be formed either sequentially (inside-out) by disk fragmentation with subsequent accretion and migration, or by a cascade hierarchical fragmentation of a rotating cloud. Orbits of the outer and inner subsystems are computed or updated in four such hierarchies: LHS 1070 (GJ 2005, periods 77.6 and 17.25 years), HIP 9497 (80 and 14.4 years), HIP 25240 (1200 and 47.0 years), and HIP 78842 (131 and 10.5 years).

  4. Multilevel sequential Monte-Carlo samplers

    KAUST Repository

    Jasra, Ajay

    2016-01-05

    Multilevel Monte-Carlo methods provide a powerful computational technique for reducing the computational cost of estimating expectations for a given computational effort. They are particularly relevant for computational problems when approximate distributions are determined via a resolution parameter h, with h=0 giving the theoretical exact distribution (e.g. SDEs or inverse problems with PDEs). The method provides a benefit by coupling samples from successive resolutions, and estimating differences of successive expectations. We develop a methodology that brings Sequential Monte-Carlo (SMC) algorithms within the framework of the Multilevel idea, as SMC provides a natural set-up for coupling samples over different resolutions. We prove that the new algorithm indeed preserves the benefits of the multilevel principle, even if samples at all resolutions are now correlated.

  5. Sequential Therapy in Metastatic Renal Cell Carcinoma

    Directory of Open Access Journals (Sweden)

    Bradford R Hirsch

    2016-04-01

    The treatment of metastatic renal cell carcinoma (mRCC) has changed dramatically in the past decade. As the number of available agents, and the related volume of research, has grown, it is increasingly complex to know how to optimally treat patients. The authors are practicing medical oncologists at the US Oncology Network, the largest community-based network of oncology providers in the country, and represent the leadership of the Network's Genitourinary Research Committee. We outline our thought process in approaching sequential therapy of mRCC and the use of real-world data to inform our approach. We also highlight the evolving literature that will impact practicing oncologists in the near future.

  6. Microstructure history effect during sequential thermomechanical processing

    International Nuclear Information System (INIS)

    Yassar, Reza S.; Murphy, John; Burton, Christina; Horstemeyer, Mark F.; El kadiri, Haitham; Shokuhfar, Tolou

    2008-01-01

    The key to modeling material processing behavior is linking the microstructure evolution to its processing history. This paper quantifies various microstructural features of an aluminum automotive alloy that undergoes sequential thermomechanical processing comprising hot rolling of a 150-mm billet to a 75-mm billet, rolling to 3 mm, annealing, and then cold rolling to a 0.8-mm-thick sheet. The microstructural content was characterized by means of electron backscatter diffraction, scanning electron microscopy, and transmission electron microscopy. The results clearly demonstrate the evolution of precipitate morphologies, dislocation structures, and grain orientation distributions. These data can be used to improve material models that claim to capture the history effects of the processed materials.

  7. Monitoring sequential electron transfer with EPR

    International Nuclear Information System (INIS)

    Thurnauer, M.C.; Feezel, L.L.; Snyder, S.W.; Tang, J.; Norris, J.R.; Morris, A.L.; Rustandi, R.R.

    1989-01-01

    A completely general model which treats electron spin polarization (ESP) found in a system in which radical pairs with different magnetic interactions are formed sequentially has been described. This treatment has been applied specifically to the ESP found in the bacterial reaction center. Test cases show clearly how parameters such as structure, lifetime, and magnetic interactions within the successive radical pairs affect the ESP, and demonstrate that previous treatments of this problem have been incomplete. The photosynthetic bacterial reaction center protein is an ideal system for testing the general model of ESP. The radical pair which exhibits ESP, P870+ Q- (P870+ is the oxidized, primary electron donor, a bacteriochlorophyll special pair, and Q- is the reduced, primary quinone acceptor), is formed via sequential electron transport through the intermediary radical pair P870+ I- (I- is the reduced, intermediary electron acceptor, a bacteriopheophytin). In addition, it is possible to experimentally vary most of the important parameters, such as the lifetime of the intermediary radical pair and the magnetic interactions in each pair. It has been shown how selective isotopic substitution (1H or 2H) on P870, I and Q affects the ESP of the EPR spectrum of P870+ Q-, observed at two different microwave frequencies, in Fe2+-depleted bacterial reaction centers of Rhodobacter sphaeroides R26. Thus, the relative magnitudes of the magnetic properties (nuclear hyperfine and g-factor differences) which influence ESP development were varied. The results support the general model of ESP in that they suggest that the P870+ Q- radical pair interactions are the dominant source of ESP production in 2H bacterial reaction centers.

  8. Social Cognitive Antecedents of Fruit and Vegetable Consumption in Truck Drivers: A Sequential Mediation Analysis.

    Science.gov (United States)

    Hamilton, Kyra; Vayro, Caitlin; Schwarzer, Ralf

    2015-01-01

    To examine a mechanism by which social cognitive factors may predict fruit and vegetable consumption in long-haul truck drivers. Dietary self-efficacy, positive outcome expectancies, and intentions were assessed in 148 Australian truck drivers, and 1 week later they reported their fruit and vegetable consumption. A theory-guided sequential mediation model was specified that postulated self-efficacy and intention as mediators between outcome expectancies and behavior. The hypothesized model was confirmed. A direct effect of outcome expectancies was no longer present when mediators were included, and all indirect effects were significant, including the 2-mediator chain (β = .15). The roles of outcome expectancies and self-efficacy are important to consider for understanding and predicting healthy eating intentions in truck drivers. Copyright © 2015 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  9. Simulation reframed.

    Science.gov (United States)

    Kneebone, Roger L

    2016-01-01

    Simulation is firmly established as a mainstay of clinical education, and extensive research has demonstrated its value. Current practice uses inanimate simulators (with a range of complexity, sophistication and cost) to address the patient 'as body' and trained actors or lay people (Simulated Patients) to address the patient 'as person'. These approaches are often separate. Healthcare simulation to date has been largely for the training and assessment of clinical 'insiders', simulating current practices. A close coupling with the clinical world restricts access to the facilities and practices of simulation, often excluding patients, families and publics. Yet such perspectives are an essential component of clinical practice. This paper argues that simulation offers opportunities to move outside a clinical 'insider' frame and create connections with other individuals and groups. Simulation becomes a bridge between experts whose worlds do not usually intersect, inviting an exchange of insights around embodied practices, the 'doing' of medicine, without jeopardising the safety of actual patients. Healthcare practice and education take place within a clinical frame that often conceals parallels with other domains of expert practice. Valuable insights emerge by viewing clinical practice not only as the application of medical science but also as performance and craftsmanship. Such connections require a redefinition of simulation. Its essence is not expensive elaborate facilities. Developments such as hybrid, distributed and sequential simulation offer examples of how simulation can combine 'patient as body' with 'patient as person' at relatively low cost, democratising simulation and exerting traction beyond the clinical sphere. The essence of simulation is a purposeful design, based on an active process of selection from an originary world, abstraction of what is criterial and re-presentation in another setting for a particular purpose or audience. This may be done within

  10. A Semi-Potential for Finite and Infinite Sequential Games (Extended Abstract

    Directory of Open Access Journals (Sweden)

    Stéphane Le Roux

    2016-09-01

    We consider a dynamical approach to sequential games. By restricting the convertibility relation over strategy profiles, we obtain a semi-potential (in the sense of Kukushkin), and we show that in finite games the corresponding restriction of better-response dynamics will converge to a Nash equilibrium in quadratic time. Convergence happens on a per-player basis, and even in the presence of players with cyclic preferences, the players with acyclic preferences will stabilize. Thus, we obtain a candidate notion for rationality in the presence of irrational agents. Moreover, the restriction of convertibility can be justified by a conservative updating of beliefs about the other players' strategies. For infinite sequential games we can retain convergence to a Nash equilibrium (in some sense), if the preferences are given by continuous payoff functions; or obtain a transfinite convergence if the outcome sets of the game are Delta^0_2 sets.

  11. DYNAMIC ANALYSIS OF THE BULK TRITIUM SHIPPING PACKAGE SUBJECTED TO CLOSURE TORQUES AND SEQUENTIAL IMPACTS

    International Nuclear Information System (INIS)

    Wu, T; Paul Blanton, P; Kurt Eberl, K

    2007-01-01

    This paper presents a finite-element technique to simulate the structural responses and to evaluate the cumulative damage of a radioactive material packaging requiring bolt closure-tightening torque and subjected to the scenarios of the Hypothetical Accident Conditions (HAC) defined in the Code of Federal Regulations Title 10 part 71 (10CFR71). Existing finite-element methods for modeling closure stresses from bolt pre-load are not readily adaptable to dynamic analyses. The HAC events are required to occur sequentially per 10CFR71 and thus the evaluation of the cumulative damage is desirable. Generally, each HAC event is analyzed separately and the cumulative damage is partially addressed by superposition. This results in relying on additional physical testing to comply with 10CFR71 requirements for assessment of cumulative damage. The proposed technique utilizes the combination of kinematic constraints, rigid-body motions and structural deformations to overcome some of the difficulties encountered in modeling the effect of cumulative damage. This methodology provides improved numerical solutions in compliance with the 10CFR71 requirements for sequential HAC tests. Analyses were performed for the Bulk Tritium Shipping Package (BTSP) designed by Savannah River National Laboratory to demonstrate the applications of the technique. The methodology proposed simulates the closure bolt torque preload followed by the sequential HAC events, the 30-foot drop and the 30-foot dynamic crush. The analytical results will be compared to the package test data

  12. A novel method for the sequential removal and separation of multiple heavy metals from wastewater.

    Science.gov (United States)

    Fang, Li; Li, Liang; Qu, Zan; Xu, Haomiao; Xu, Jianfang; Yan, Naiqiang

    2018-01-15

    A novel method was developed and applied for the treatment of simulated wastewater containing multiple heavy metals. A sorbent of ZnS nanocrystals (NCs) was synthesized and showed extraordinary performance for the removal of Hg2+, Cu2+, Pb2+ and Cd2+. The removal efficiencies of Hg2+, Cu2+, Pb2+ and Cd2+ were 99.9%, 99.9%, 90.8% and 66.3%, respectively. Meanwhile, it was determined that the solubility product (Ksp) of heavy metal sulfides was closely related to the adsorption selectivity of the various heavy metals on the sorbent. The removal efficiency of Hg2+ was higher than that of Cd2+, while the Ksp of HgS was lower than that of CdS. This indicated that preferential adsorption of heavy metals occurred when the Ksp of the heavy metal sulfide was lower. In addition, the differences in the Ksp of heavy metal sulfides allowed for the exchange of heavy metals, indicating the potential application for the sequential removal and separation of heavy metals from wastewater. According to the cumulative adsorption experimental results, multiple heavy metals were sequentially adsorbed and separated from the simulated wastewater in the order of the Ksp of their sulfides. This method holds the promise of sequentially removing and separating multiple heavy metals from wastewater. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Sequential decision reliability concept and failure rate assessment

    International Nuclear Information System (INIS)

    Ciftcioglu, O.

    1990-11-01

    Conventionally, a reliability concept is considered both for each basic unit and for their integration in a complicated large-scale system such as a nuclear power plant (NPP). Basically, as the plant's operational status is determined by the information obtained from various sensors, the plant's reliability and the risk assessment are closely related to the reliability of the sensory information and hence the sensor components. However, considering the relevant information-processing systems, e.g. fault detection processors, there exists a further question about the reliability of such systems, specifically the reliability of the systems' decision-based outcomes by means of which further actions are performed. To this end, a general sequential decision reliability concept and a failure rate assessment methodology are introduced. The implications of the methodology are investigated, and the importance of the decision reliability concept in system operation is demonstrated by means of sensory signals acquired in real time from the Borssele NPP in the Netherlands. (author). 21 refs.; 8 figs

  14. The role of sequential chemoradiation for local advanced oropharyngeal carcinoma

    International Nuclear Information System (INIS)

    Masterson, Liam; Tanweer, Faiz

    2013-01-01

    This study aims to assess survival, prognostic indicators, and pattern of failure for advanced oropharyngeal cancer treated by induction chemotherapy followed by concomitant chemoradiation (sequential CRT). A retrospective review of 80 consecutive patients who underwent chemoradiation [doublet cisplatin and 5-fluorouracil (PF)] for local advanced oropharyngeal carcinoma at a tertiary center from March 2003 to July 2008 is reported. Seven studies utilizing a similar protocol were reviewed, and all outcomes are collated. At a median follow-up of 32 months, the 3-year overall survival was 75%. Tumor size (p<0.001), age at presentation (p<0.002), and failure to complete the full course of induction chemotherapy (p<0.01) were all found to be significant factors affecting survival. Induction chemotherapy followed by concomitant chemoradiation utilizing doublet PF is an effective treatment for local advanced oropharyngeal carcinoma. At present, the addition of a taxane to the PF regimen cannot be assumed to provide benefit until further evidence emerges from a representative controlled trial. (author)

  15. Decentralized enforcement, sequential bargaining, and the clean development mechanism

    Energy Technology Data Exchange (ETDEWEB)

    Hovi, Jon

    2001-07-01

    While there is a vast literature both on international bargaining and on how international agreements can be enforced, very little work has been done on how bargaining and enforcement interact. An important exception is Fearon (1998), who models international cooperation as a two-stage process in which the bargaining process is constrained by a need for decentralized enforcement (meaning that the agreement must be enforced by the parties themselves rather than a third party, such as a court). Using the Clean Development Mechanism as an example, the present paper proposes a different model of this kind of interaction. The model follows Fearon's in so far as we both use the infinitely repeated Prisoners' Dilemma to capture the enforcement phase of the game. However, while Fearon depicts the bargaining stage as a War of Attrition, the present model sees that stage as a sequential bargaining game of the Staahl-Rubinstein type. The implications of the present model are compared both to those of the Staahl-Rubinstein model and to those of the Fearon model. A surprising conclusion is that a need for decentralized enforcement tends to make the bargaining outcome more symmetrical than otherwise. Thus, the impact of bargaining power is actually smaller when the resulting agreement must be enforced by the parties themselves than it is if enforcement is taken care of by a third party. (author)

  16. Simultaneous and sequential implantation of intacs and verisyse phakic intraocular lens for refractive improvement in keratectasia.

    Science.gov (United States)

    Moshirfar, Majid; Fenzl, Carlton R; Meyer, Jay J; Neuffer, Marcus C; Espandar, Ladan; Mifflin, Mark D

    2011-02-01

    To evaluate the safety, efficacy, and visual outcomes of simultaneous and sequential implantation of Intacs (Addition Technology, Inc, Sunnyvale, CA) and Verisyse phakic intraocular lens (AMO, Santa Ana, CA) in selected cases of ectatic corneal disease. John A. Moran Eye Center, University of Utah, UT. Prospective data were collected from 19 eyes of 12 patients (5 eyes, post-laser in situ keratomileusis ectasia and 14 eyes, keratoconus). Intacs segments were implanted followed by insertion of a phakic Verisyse lens at the same session (12 eyes) in the simultaneous group or several months later (7 eyes) in the sequential group. The uncorrected visual acuity, best spectacle-corrected visual acuity (BSCVA), and manifest refraction were recorded at each visit. No intraoperative or postoperative complications were observed. At the last follow-up (19 ± 6 months), in the simultaneous group, mean spherical error was -0.79 ± 1.0 diopter (D) (range, -2.0 to +1.50 D) and cylindrical error +2.06 ± 1.21 D (range, +0.5 to +3.75 D). In the sequential group, at the last follow-up, at 36 ± 21 months, the mean spherical error was -1.64 ± 1.31 D (range, -3.25 to +1.0 D) and cylindrical error +2.07 ± 1.03 D (range, +0.75 to +3.25 D). There were no significant differences in mean uncorrected visual acuity or BSCVA between the 2 groups preoperatively or postoperatively. No eye lost lines of preoperative BSCVA. Combined insertion of Intacs and Verisyse was safe and effective in all cases. The outcomes of the simultaneous implantation of the Intacs and Verisyse lens in 1 surgery were similar to the results achieved with sequential implantation using 2 surgeries.

  17. Campbell and moment measures for finite sequential spatial processes

    NARCIS (Netherlands)

    M.N.M. van Lieshout (Marie-Colette)

    2006-01-01

    We define moment and Campbell measures for sequential spatial processes, prove a Campbell-Mecke theorem, and relate the results to their counterparts in the theory of point processes. In particular, we show that any finite sequential spatial process model can be derived as the vector

  18. Reading Remediation Based on Sequential and Simultaneous Processing.

    Science.gov (United States)

    Gunnison, Judy; And Others

    1982-01-01

    The theory postulating a dichotomy between sequential and simultaneous processing is reviewed, along with its implications for remediating reading problems. Research is cited on sequential-simultaneous processing for early and advanced reading. A list of remedial strategies based on the processing dichotomy addresses decoding and lexical…

  19. Induction of simultaneous and sequential malolactic fermentation in durian wine.

    Science.gov (United States)

    Taniasuri, Fransisca; Lee, Pin-Rou; Liu, Shao-Quan

    2016-08-02

    This study represents the first report of the impact of malolactic fermentation (MLF) induced by Oenococcus oeni and its inoculation strategies (simultaneous vs. sequential) on the fermentation performance as well as the aroma compound profile of durian wine. There was no negative impact of simultaneous inoculation of O. oeni and Saccharomyces cerevisiae on the growth and fermentation kinetics of S. cerevisiae as compared to sequential fermentation. Simultaneous MLF did not lead to an excessive increase in volatile acidity as compared to sequential MLF. The kinetic changes of organic acids (i.e. malic, lactic, succinic, acetic and α-ketoglutaric acids) varied with simultaneous and sequential MLF relative to yeast alone. MLF, regardless of inoculation mode, resulted in higher production of fermentation-derived volatiles as compared to the control (alcoholic fermentation only), including esters, volatile fatty acids, and terpenes, except for higher alcohols. Most indigenous volatile sulphur compounds in durian were decreased to trace levels, with little difference among the control, simultaneous and sequential MLF. Among the different wines, the wine with simultaneous MLF had higher concentrations of terpenes and acetate esters, while sequential MLF gave increased concentrations of medium- and long-chain ethyl esters. Relative to alcoholic fermentation only, both simultaneous and sequential MLF reduced acetaldehyde substantially, with sequential MLF being more effective. These findings illustrate that MLF is an effective and novel way of modulating the volatile and aroma compound profile of durian wine. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. A Survey of Multi-Objective Sequential Decision-Making

    NARCIS (Netherlands)

    Roijers, D.M.; Vamplew, P.; Whiteson, S.; Dazeley, R.

    2013-01-01

    Sequential decision-making problems with multiple objectives arise naturally in practice and pose unique challenges for research in decision-theoretic planning and learning, which has largely focused on single-objective settings. This article surveys algorithms designed for sequential

  1. Dynamics-based sequential memory: Winnerless competition of patterns

    International Nuclear Information System (INIS)

    Seliger, Philip; Tsimring, Lev S.; Rabinovich, Mikhail I.

    2003-01-01

    We introduce a biologically motivated dynamical principle of sequential memory which is based on winnerless competition (WLC) of event images. This mechanism is implemented in a two-layer neural model of sequential spatial memory. We present the learning dynamics which leads to the formation of a WLC network. After learning, the system is capable of associative retrieval of prerecorded sequences of patterns
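
    As an illustration of winnerless competition (and not the two-layer spatial-memory network described in the abstract), the sketch below integrates a three-unit generalized Lotka-Volterra system of May-Leonard type, whose asymmetric inhibition produces the characteristic sequential switching between transiently winning units; all parameter values are assumptions chosen for demonstration.

```python
# Winnerless competition illustrated with a May-Leonard-type generalized
# Lotka-Volterra system (illustrative parameterization, not the paper's model).
import numpy as np

rng = np.random.default_rng(2)
sigma = np.ones(3)                      # intrinsic growth rates
rho = np.array([[1.0, 0.5, 2.0],        # asymmetric inhibition: each unit
                [2.0, 1.0, 0.5],        # suppresses one neighbour strongly
                [0.5, 2.0, 1.0]])       # and the other weakly

dt, n_steps = 0.01, 60_000
a = np.array([0.6, 0.3, 0.1])           # initial activities
trace = np.empty((n_steps, 3))

for t in range(n_steps):
    # da_i/dt = a_i * (sigma_i - sum_j rho_ij a_j), plus tiny noise so the
    # trajectory does not stall at the saddle points of the heteroclinic cycle.
    da = a * (sigma - rho @ a)
    a = np.clip(a + dt * da + 1e-6 * rng.standard_normal(3), 1e-12, None)
    trace[t] = a

# Report which unit is "winning" over time: the winner changes sequentially.
winners = trace.argmax(axis=1)
switches = np.flatnonzero(np.diff(winners)) * dt
print("winner switch times (first few):", np.round(switches[:6], 1))
```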

  2. Sequential, progressive, equal-power, reflective beam-splitter arrays

    Science.gov (United States)

    Manhart, Paul K.

    2017-11-01

    The equations to calculate equal-power reflectivity for a sequential series of beam splitters are presented. Non-sequential optical design examples are offered for uniform illumination using diode lasers. Objects created using Boolean operators and swept surfaces can reflect light into predefined elevation and azimuth angles. Analysis of the illumination patterns for the array is also presented.
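
    The abstract does not reproduce the equations themselves, but the standard result for a lossless sequential splitter chain is easy to verify numerically: to draw N equal-power outputs, the k-th splitter must reflect 1/(N-k+1) of the power reaching it. The sketch below checks this rule; it illustrates the general principle, not the paper's specific design examples.

```python
# Equal-power reflectivities for a sequential chain of lossless beam splitters:
# the k-th splitter (k = 1..N) reflects 1/(N - k + 1) of the power incident on it,
# so every tapped beam carries 1/N of the input power. Illustrative check only.
N = 8
reflectivities = [1.0 / (N - k + 1) for k in range(1, N + 1)]

power = 1.0          # power entering the chain
outputs = []
for r in reflectivities:
    outputs.append(power * r)   # reflected (tapped) beam
    power *= (1.0 - r)          # transmitted power continues to the next splitter

print("reflectivities:", [round(r, 3) for r in reflectivities])
print("output powers :", [round(p, 3) for p in outputs])   # all equal to 1/N
print("residual power:", round(power, 6))                  # last splitter reflects everything
```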

  3. The sequential price of anarchy for atomic congestion games

    NARCIS (Netherlands)

    de Jong, Jasper; Uetz, Marc Jochen; Liu, Tie-Yan; Qi, Qi; Ye, Yinyu

    2014-01-01

    In situations without central coordination, the price of anarchy relates the quality of any Nash equilibrium to the quality of a global optimum. Instead of assuming that all players choose their actions simultaneously, we consider games where players choose their actions sequentially. The sequential

  4. Quantum Probability Zero-One Law for Sequential Terminal Events

    Science.gov (United States)

    Rehder, Wulf

    1980-07-01

    On the basis of the Jauch-Piron quantum probability calculus a zero-one law for sequential terminal events is proven, and the significance of certain crucial axioms in the quantum probability calculus is discussed. The result shows that the Jauch-Piron set of axioms is appropriate for the non-Boolean algebra of sequential events.

  5. Lineup Composition, Suspect Position, and the Sequential Lineup Advantage

    Science.gov (United States)

    Carlson, Curt A.; Gronlund, Scott D.; Clark, Steven E.

    2008-01-01

    N. M. Steblay, J. Dysart, S. Fulero, and R. C. L. Lindsay (2001) argued that sequential lineups reduce the likelihood of mistaken eyewitness identification. Experiment 1 replicated the design of R. C. L. Lindsay and G. L. Wells (1985), the first study to show the sequential lineup advantage. However, the innocent suspect was chosen at a lower rate…

  6. Accounting for Heterogeneous Returns in Sequential Schooling Decisions

    NARCIS (Netherlands)

    Zamarro, G.

    2006-01-01

    This paper presents a method for estimating returns to schooling that takes into account that returns may be heterogeneous among agents and that educational decisions are made sequentially. A sequential decision model is interesting because it explicitly considers that the level of education of each

  7. Sequential blind identification of underdetermined mixtures using a novel deflation scheme.

    Science.gov (United States)

    Zhang, Mingjian; Yu, Simin; Wei, Gang

    2013-09-01

    In this brief, we consider the problem of blind identification in underdetermined instantaneous mixture cases, where there are more sources than sensors. A new blind identification algorithm, which estimates the mixing matrix in a sequential fashion, is proposed. By using the rank-1 detecting device, blind identification is reformulated as a constrained optimization problem. The identification of one column of the mixing matrix hence reduces to an optimization task for which an efficient iterative algorithm is proposed. The identification of the other columns of the mixing matrix is then carried out by a generalized eigenvalue decomposition-based deflation method. The key merit of the proposed deflation method is that it does not suffer from error accumulation. The proposed sequential blind identification algorithm provides more flexibility and better robustness than its simultaneous counterpart. Comparative simulation results demonstrate the superior performance of the proposed algorithm over the simultaneous blind identification algorithm.

  8. Development and sensitivity analysis of a fully kinetic model of sequential reductive dechlorination in subsurface

    DEFF Research Database (Denmark)

    Malaguerra, Flavio; Chambon, Julie Claire Claudia; Albrechtsen, Hans-Jørgen

    2010-01-01

    Chlorinated hydrocarbons originating from point sources are amongst the most prevalent contaminants of ground water and often represent a serious threat to groundwater-based drinking water resources. Natural attenuation of contaminant plumes can play a major role in contaminated site management, and natural degradation of chlorinated solvents frequently occurs in the subsurface through sequential reductive dechlorination. However, the occurrence and the performance of natural sequential reductive dechlorination strongly depend on environmental factors such as redox conditions, presence of fermenting organic matter / electron donors, presence of specific biomass, etc. Here we develop a new fully-kinetic biogeochemical reactive model able to simulate chlorinated solvent degradation as well as production and consumption of molecular hydrogen. The model is validated using batch experiment data...

  9. PARTICLE FILTERING WITH SEQUENTIAL PARAMETER LEARNING FOR NONLINEAR BOLD fMRI SIGNALS.

    Science.gov (United States)

    Xia, Jing; Wang, Michelle Yongmei

    Analyzing the blood oxygenation level dependent (BOLD) effect in the functional magnetic resonance imaging (fMRI) is typically based on recent ground-breaking time series analysis techniques. This work represents a significant improvement over existing approaches to system identification using nonlinear hemodynamic models. It is important for three reasons. First, instead of using linearized approximations of the dynamics, we present a nonlinear filtering based on the sequential Monte Carlo method to capture the inherent nonlinearities in the physiological system. Second, we simultaneously estimate the hidden physiological states and the system parameters through particle filtering with sequential parameter learning to fully take advantage of the dynamic information of the BOLD signals. Third, during the unknown static parameter learning, we employ the low-dimensional sufficient statistics for efficiency and avoiding potential degeneration of the parameters. The performance of the proposed method is validated using both the simulated data and real BOLD fMRI data.
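
    The approach described above (particle filtering with sequential parameter learning via low-dimensional sufficient statistics for a hemodynamic model) is more involved than can be shown here. The sketch below is only a generic bootstrap particle filter on a standard nonlinear benchmark model, with a static parameter crudely appended to the state and jittered, to convey the joint state-and-parameter filtering idea; it is not the BOLD model or the learning scheme of the paper.

```python
# Generic bootstrap particle filter with a jittered static parameter (illustration
# of joint state/parameter filtering; NOT the hemodynamic model or the
# sufficient-statistics learning scheme used in the paper).
import numpy as np

rng = np.random.default_rng(3)
T, n_particles = 100, 2000
phi_true, q_std, r_std = 0.5, 1.0, 1.0   # assumed toy model parameters

# Simulate data from x_t = phi*x_{t-1} + 25*x_{t-1}/(1+x_{t-1}^2) + 8*cos(1.2t) + v,
# y_t = x_t^2/20 + w (a classic nonlinear filtering benchmark).
x, ys = 0.0, []
for t in range(1, T + 1):
    x = phi_true * x + 25 * x / (1 + x**2) + 8 * np.cos(1.2 * t) + q_std * rng.standard_normal()
    ys.append(x**2 / 20 + r_std * rng.standard_normal())

# Particles carry the hidden state and the unknown parameter phi.
px = np.zeros(n_particles)
pphi = rng.uniform(0.0, 1.0, n_particles)
phi_est = []
for t, y in enumerate(ys, start=1):
    pphi = pphi + 0.01 * rng.standard_normal(n_particles)          # artificial jitter
    px = (pphi * px + 25 * px / (1 + px**2) + 8 * np.cos(1.2 * t)
          + q_std * rng.standard_normal(n_particles))               # propagate states
    w = np.exp(-0.5 * ((y - px**2 / 20) / r_std) ** 2)              # likelihood weights
    w /= w.sum()
    idx = rng.choice(n_particles, n_particles, p=w)                 # resample
    px, pphi = px[idx], pphi[idx]
    phi_est.append(pphi.mean())

print(f"true phi = {phi_true}, filtered estimate at T = {phi_est[-1]:.2f}")
```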

  10. Sequential double photodetachment of He- in elliptically polarized laser fields

    Science.gov (United States)

    Génévriez, Matthieu; Dunseath, Kevin M.; Terao-Dunseath, Mariko; Urbain, Xavier

    2018-02-01

    Four-photon double detachment of the helium negative ion is investigated experimentally and theoretically for photon energies where the transient helium atom is in the 1s2s 3S or 1s2p 3Po states, which subsequently ionize by absorption of three photons. Ionization is enhanced by intermediate resonances, giving rise to series of peaks in the He+ spectrum, which we study in detail. The He+ yield is measured in the wavelength ranges from 530 to 560 nm and from 685 to 730 nm and for various polarizations of the laser light. Double detachment is treated theoretically as a sequential process, within the framework of R-matrix theory for the first step and effective Hamiltonian theory for the second step. Experimental conditions are accurately modeled, and the measured and simulated yields are in good qualitative and, in some cases, quantitative agreement. Resonances in the double detachment spectra can be attributed to well-defined Rydberg states of the transient atom. The double detachment yield exhibits a strong dependence on the laser polarization, which can be related to the magnetic quantum number of the intermediate atomic state. We also investigate the possibility of nonsequential double detachment with a two-color experiment but observe no evidence for it.

  11. How large are the consequences of covariate imbalance in cluster randomized trials: a simulation study with a continuous outcome and a binary covariate at the cluster level.

    Science.gov (United States)

    Moerbeek, Mirjam; van Schie, Sander

    2016-07-11

    The number of clusters in a cluster randomized trial is often low. It is therefore likely that random assignment of clusters to treatment conditions results in covariate imbalance. There are no studies that quantify the consequences of covariate imbalance in cluster randomized trials on parameter and standard error bias and on power to detect treatment effects. The consequences of covariate imbalance in unadjusted and adjusted linear mixed models are investigated by means of a simulation study. The factors in this study are the degree of imbalance, the covariate effect size, the cluster size and the intraclass correlation coefficient. The covariate is binary and measured at the cluster level; the outcome is continuous and measured at the individual level. The results show that covariate imbalance results in negligible parameter bias and small standard error bias in adjusted linear mixed models. Ignoring the possibility of covariate imbalance while calculating the sample size at the cluster level may result in a loss in power of at most 25 % in the adjusted linear mixed model. The results are more severe for the unadjusted linear mixed model: parameter biases up to 100 % and standard error biases up to 200 % may be observed. Power levels based on the unadjusted linear mixed model are often too low. The consequences are most severe for large clusters and/or small intraclass correlation coefficients since then the required number of clusters to achieve a desired power level is smallest. The possibility of covariate imbalance should be taken into account while calculating the sample size of a cluster randomized trial. Otherwise more sophisticated methods to randomize clusters to treatments should be used, such as stratification or balance algorithms. All relevant covariates should be carefully identified, be actually measured and included in the statistical model to avoid severe levels of parameter and standard error bias and insufficient power levels.
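
    To make the mechanism concrete, the sketch below simulates a cluster randomized trial with a binary cluster-level covariate that (by construction) is more prevalent in one arm, mimicking chance imbalance, and contrasts unadjusted and covariate-adjusted estimates of the treatment effect. It analyses cluster means with ordinary least squares rather than fitting the full linear mixed model of the paper, and all parameter values are assumptions.

```python
# Illustration of covariate imbalance in a cluster randomized trial using a
# cluster-means analysis (a simplification; the paper fits linear mixed models).
import numpy as np

rng = np.random.default_rng(4)
n_clusters, m, icc = 20, 50, 0.05        # clusters per trial, cluster size, ICC
delta, gamma = 0.3, 0.5                  # true treatment and covariate effects
n_trials = 2000

def one_trial():
    treat = np.repeat([0, 1], n_clusters // 2)
    # Imbalanced binary cluster-level covariate: more prevalent in the treatment arm.
    covariate = rng.binomial(1, np.where(treat == 1, 0.7, 0.3))
    cluster_effect = rng.normal(0, np.sqrt(icc), n_clusters)
    # Cluster means of the individual-level outcome (within-cluster noise averaged).
    y_bar = (delta * treat + gamma * covariate + cluster_effect
             + rng.normal(0, np.sqrt((1 - icc) / m), n_clusters))
    # Unadjusted: regress cluster means on treatment only.
    X_u = np.column_stack([np.ones(n_clusters), treat])
    b_u = np.linalg.lstsq(X_u, y_bar, rcond=None)[0]
    # Adjusted: also include the cluster-level covariate.
    X_a = np.column_stack([np.ones(n_clusters), treat, covariate])
    b_a = np.linalg.lstsq(X_a, y_bar, rcond=None)[0]
    return b_u[1], b_a[1]

estimates = np.array([one_trial() for _ in range(n_trials)])
print("true treatment effect:", delta)
print("mean unadjusted estimate:", round(estimates[:, 0].mean(), 3))  # biased upward
print("mean adjusted estimate:  ", round(estimates[:, 1].mean(), 3))  # approximately unbiased
```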

  12. Prostate Cancer Predictive Simulation Modelling, Assessing the Risk Technique (PCP-SMART): Introduction and Initial Clinical Efficacy Evaluation Data Presentation of a Simple Novel Mathematical Simulation Modelling Method, Devised to Predict the Outcome of Prostate Biopsy on an Individual Basis.

    Science.gov (United States)

    Spyropoulos, Evangelos; Kotsiris, Dimitrios; Spyropoulos, Katherine; Panagopoulos, Aggelos; Galanakis, Ioannis; Mavrikos, Stamatios

    2017-02-01

    We developed a mathematical "prostate cancer (PCa) conditions simulating" predictive model (PCP-SMART), from which we derived a novel PCa predictor (prostate cancer risk determinator [PCRD] index) and a PCa risk equation. We used these to estimate the probability of finding PCa on prostate biopsy, on an individual basis. A total of 371 men who had undergone transrectal ultrasound-guided prostate biopsy were enrolled in the present study. Given that PCa risk relates to the total prostate-specific antigen (tPSA) level, age, prostate volume, free PSA (fPSA), fPSA/tPSA ratio, and PSA density, and that tPSA ≥ 50 ng/mL has a 98.5% positive predictive value for a PCa diagnosis, we hypothesized that correlating 2 variables composed of 3 ratios (1, tPSA/age; 2, tPSA/prostate volume; and 3, fPSA/tPSA; 1 variable including the patient's tPSA and the other, a tPSA value of 50 ng/mL) could operate as a PCa conditions imitating/simulating model. Linear regression analysis was used to derive the coefficient of determination (R²), termed the PCRD index. To estimate the PCRD index's predictive validity, we used the χ² test, multiple logistic regression analysis with PCa risk equation formation, calculation of test performance characteristics, and area under the receiver operating characteristic curve analysis using SPSS, version 22. Multiple logistic regression revealed the PCRD index as an independent PCa predictor, and the formulated risk equation was 91% accurate in predicting the probability of finding PCa. On the receiver operating characteristic analysis, the PCRD index (area under the curve, 0.926) significantly (P < .001) outperformed other, established PCa predictors. The PCRD index effectively predicted the prostate biopsy outcome, correctly identifying 9 of 10 men who were eventually diagnosed with PCa and correctly ruling out PCa for 9 of 10 men who did not have PCa. Its predictive power significantly outperformed established PCa predictors, and the formulated risk equation

  13. Constrained treatment planning using sequential beam selection

    International Nuclear Information System (INIS)

    Woudstra, E.; Storchi, P.R.M.

    2000-01-01

    In this paper an algorithm is described for automated treatment plan generation. The algorithm aims at delivery of the prescribed dose to the target volume without violation of constraints for target, organs at risk and the surrounding normal tissue. Pre-calculated dose distributions for all candidate orientations are used as input. Treatment beams are selected in a sequential way. A score function designed for beam selection is used for the simultaneous selection of beam orientations and weights. In order to determine the optimum choice for the orientation and the corresponding weight of each new beam, the score function is first redefined to account for the dose distribution of the previously selected beams. Addition of more beams to the plan is stopped when the target dose is reached or when no additional dose can be delivered without violating a constraint. In the latter case the score function is modified by importance factor changes to enforce better sparing of the organ with the limiting constraint and the algorithm is run again. (author)
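
    A greedy loop of this kind is easy to express in code. The sketch below is a simplified, hypothetical version of sequential beam selection with a toy score function, random dose data and hard dose constraints; the actual score function, dose grids, constraint handling and importance-factor adjustment of the paper are not reproduced.

```python
# Simplified greedy sequential beam selection (hypothetical score and data;
# the paper's score function and importance-factor adjustment are not shown).
import numpy as np

rng = np.random.default_rng(5)
n_voxels, n_beams = 500, 36
target = rng.random(n_voxels) < 0.2                  # voxels belonging to the target
dose_per_beam = (rng.random((n_beams, n_voxels)) * target
                 + 0.2 * rng.random((n_beams, n_voxels)) * ~target)  # pre-calculated doses

prescribed, normal_limit = 10.0, 6.0                 # assumed prescription / tissue constraint
total_dose = np.zeros(n_voxels)
plan = []                                            # selected (beam, weight) pairs

while total_dose[target].mean() < prescribed:
    best = None
    for b in range(n_beams):
        # Weight that would close the gap to the prescribed mean target dose, capped.
        gap = prescribed - total_dose[target].mean()
        w = min(gap / dose_per_beam[b, target].mean(), 2.0)
        candidate = total_dose + w * dose_per_beam[b]
        if candidate[~target].max() > normal_limit:  # would violate the normal-tissue limit
            continue
        score = candidate[target].mean() - 0.5 * candidate[~target].mean()
        if best is None or score > best[0]:
            best = (score, b, w)
    if best is None:                                  # no beam can be added without violation
        break
    _, b, w = best
    total_dose += w * dose_per_beam[b]
    plan.append((b, round(w, 2)))

print("selected beams (index, weight):", plan)
print("mean target dose:", round(total_dose[target].mean(), 2),
      "| max normal-tissue dose:", round(total_dose[~target].max(), 2))
```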

  14. Phenomenology of the next sequential lepton

    International Nuclear Information System (INIS)

    Rizzo, T.G.

    1980-01-01

    We consider the phenomenology of a sequential, charged lepton in the mass range 6-13 GeV. We find the semileptonic branching ratio of such a lepton to be approximately 13%; the dominant two-body modes are found to include the decay L → ν_L F*, with a branching ratio of approximately 6%. In this analysis we assume that the mass of the lepton under consideration is lighter than the t quark, such that decays such as L → ν_L t̄q, where q = (d, s, or b), are kinematically forbidden. We also find that decays such as L → ν_L B* (c̄b) can also be as large as approximately 6%, depending on the mixing angles; the lifetime of such a lepton is found to be approximately 2.6 x 10^-12 M_L^-5 sec, where M_L is in GeV

  15. The Origin of Sequential Chromospheric Brightenings

    Science.gov (United States)

    Kirk, M. S.; Balasubramaniam, K. S.; Jackiewicz, J.; Gilbert, H. R.

    2017-06-01

    Sequential chromospheric brightenings (SCBs) are often observed in the immediate vicinity of erupting flares and are associated with coronal mass ejections. Since their initial discovery in 2005, there have been several subsequent investigations of SCBs. These studies have used differing detection and analysis techniques, making it difficult to compare results between studies. This work employs the automated detection algorithm of Kirk et al. (Solar Phys. 283, 97, 2013) to extract the physical characteristics of SCBs in 11 flares of varying size and intensity. We demonstrate that the magnetic substructure within the SCB appears to have a significantly smaller area than the corresponding Hα emission. We conclude that SCBs originate in the lower corona, around 0.1 R⊙ above the photosphere, propagate away from the flare center at speeds of 35-85 km/s, and have peak photospheric magnetic intensities of 148 ± 2.9 G.

  16. Sequential decoders for large MIMO systems

    KAUST Repository

    Ali, Konpal S.

    2014-05-01

    Due to their ability to provide high data rates, multiple-input multiple-output (MIMO) systems have become increasingly popular. Decoding these systems with acceptable error performance is computationally very demanding. In this paper, we employ the Sequential Decoder using the Fano Algorithm for large MIMO systems. A parameter called the bias is varied to attain different performance-complexity trade-offs. Low values of the bias result in excellent performance but at the expense of high complexity, and vice versa for higher bias values. Numerical results show that moderate bias values yield a decent performance-complexity trade-off. We also attempt to bound the error by bounding the bias, using the minimum distance of a lattice. The variations in complexity with SNR exhibit an interesting trend that shows room for considerable improvement. Our work is compared against linear decoders (LDs) aided with Element-based Lattice Reduction (ELR) and Complex Lenstra-Lenstra-Lovasz (CLLL) reduction. © 2014 IFIP.
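
    To give a flavour of bias-controlled sequential decoding, the sketch below implements a toy best-first (stack) tree search for a small real-valued MIMO detection problem, where a bias term rewards deeper partial paths so that larger biases make the search greedier and cheaper. It is a generic illustration with assumed parameters, not the Fano algorithm variant or the large-MIMO system model evaluated in the paper.

```python
# Toy best-first sequential detector for y = Hx + n with BPSK symbols, using a
# Fano-style bias that favours deeper partial paths (illustration only).
import heapq
import numpy as np

rng = np.random.default_rng(6)
n_tx = 6
H = rng.standard_normal((n_tx, n_tx))
x_true = rng.choice([-1.0, 1.0], n_tx)
y = H @ x_true + 0.1 * rng.standard_normal(n_tx)

# QR-decompose so the metric can be accumulated one symbol at a time (last row up).
Q, R = np.linalg.qr(H)
z = Q.T @ y

def sequential_detect(bias):
    # Heap entries: (metric - bias*depth, depth, partial tuple).
    # partial[k] holds the symbol decided for row n_tx - 1 - k.
    heap = [(0.0, 0, ())]
    visited = 0
    while heap:
        cost, depth, partial = heapq.heappop(heap)
        visited += 1
        if depth == n_tx:
            x_hat = np.array([partial[n_tx - 1 - j] for j in range(n_tx)])
            return x_hat, visited
        row = n_tx - 1 - depth
        metric_so_far = cost + bias * depth      # undo the bias to recover the true metric
        for s in (-1.0, 1.0):
            new_partial = partial + (s,)
            x_upper = np.array([new_partial[n_tx - 1 - j] for j in range(row, n_tx)])
            residual = z[row] - R[row, row:] @ x_upper
            new_metric = metric_so_far + residual**2
            heapq.heappush(heap, (new_metric - bias * (depth + 1), depth + 1, new_partial))

for bias in (0.0, 0.5, 2.0):
    x_hat, visited = sequential_detect(bias)
    errors = int(np.sum(x_hat != x_true))
    print(f"bias={bias:3.1f}: nodes visited={visited:4d}, symbol errors={errors}")
```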

  17. Social Influences in Sequential Decision Making.

    Directory of Open Access Journals (Sweden)

    Markus Schöbel

    People often make decisions in a social environment. The present work examines social influence on people's decisions in a sequential decision-making situation. In the first experimental study, we implemented an information cascade paradigm, illustrating that people infer information from decisions of others and use this information to make their own decisions. We followed a cognitive modeling approach to elicit the weight people give to social as compared to private individual information. The proposed social influence model shows that participants overweight their own private information relative to social information, contrary to the normative Bayesian account. In our second study, we embedded the abstract decision problem of Study 1 in a medical decision-making problem. We examined whether in a medical situation people also take others' authority into account in addition to the information that their decisions convey. The social influence model illustrates that people weight social information differentially according to the authority of other decision makers. The influence of authority was strongest when an authority's decision contrasted with private information. Both studies illustrate how the social environment provides sources of information that people integrate differently for their decisions.
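
    The information-cascade setup lends itself to a compact simulation. The sketch below implements a generic Bayesian cascade in which each agent combines a private binary signal with the decisions of predecessors, with a weight w on the social log-odds (w < 1 mimics the overweighting of private information reported above). It is an illustrative reconstruction, not the authors' cognitive model or fitted parameter values.

```python
# Generic information-cascade simulation with a social-information weight w
# (w = 1 is the normative Bayesian agent; w < 1 overweights private information).
# Illustrative only -- not the authors' social influence model or fitted values.
import numpy as np

rng = np.random.default_rng(7)
q = 0.7                       # probability that a private signal is correct
n_agents, n_runs = 10, 5000
log_lr = np.log(q / (1 - q))  # log-likelihood ratio carried by one signal

def run_cascade(w):
    correct_last = 0
    for _ in range(n_runs):
        true_state = 1
        social_evidence = 0.0          # accumulated log-odds implied by earlier decisions
        for _ in range(n_agents):
            signal = 1 if rng.random() < q else -1          # private signal (+1 favours truth)
            posterior_logodds = signal * log_lr + w * social_evidence
            decision = 1 if posterior_logodds > 0 else -1
            # Later agents treat each observed decision as one signal's worth of evidence.
            social_evidence += decision * log_lr
        correct_last += (decision == true_state)
    return correct_last / n_runs

for w in (1.0, 0.5):
    print(f"w = {w}: last agent correct in {run_cascade(w):.2%} of runs")
```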

  18. Social Influences in Sequential Decision Making

    Science.gov (United States)

    Schöbel, Markus; Rieskamp, Jörg; Huber, Rafael

    2016-01-01

    People often make decisions in a social environment. The present work examines social influence on people’s decisions in a sequential decision-making situation. In the first experimental study, we implemented an information cascade paradigm, illustrating that people infer information from decisions of others and use this information to make their own decisions. We followed a cognitive modeling approach to elicit the weight people give to social as compared to private individual information. The proposed social influence model shows that participants overweight their own private information relative to social information, contrary to the normative Bayesian account. In our second study, we embedded the abstract decision problem of Study 1 in a medical decision-making problem. We examined whether in a medical situation people also take others’ authority into account in addition to the information that their decisions convey. The social influence model illustrates that people weight social information differentially according to the authority of other decision makers. The influence of authority was strongest when an authority's decision contrasted with private information. Both studies illustrate how the social environment provides sources of information that people integrate differently for their decisions. PMID:26784448

  19. Sequential acquisition of mutations in myelodysplastic syndromes.

    Science.gov (United States)

    Makishima, Hideki

    2017-01-01

    Recent progress in next-generation sequencing technologies allows us to discover frequent mutations throughout the coding regions of myelodysplastic syndromes (MDS), potentially providing us with virtually a complete spectrum of driver mutations in this disease. As many study groups have recently shown, such driver mutations are acquired in a gene-specific fashion. For instance, DDX41 mutations are observed in germline cells long before MDS presentation. In blood samples from healthy elderly individuals, somatic DNMT3A and TET2 mutations are detected as age-related clonal hematopoiesis and are believed to be a risk factor for hematological neoplasms. In MDS, mutations of genes such as NRAS and FLT3, designated as Type-1 genes, may be significantly associated with leukemic evolution. Another type (Type-2) of genes, including RUNX1 and GATA2, is related to progression from low-risk to high-risk MDS. Overall, various driver mutations are sequentially acquired in MDS, at a specific time, in either germline cells, normal hematopoietic cells, or clonal MDS cells.

  20. Building a Lego wall: Sequential action selection.

    Science.gov (United States)

    Arnold, Amy; Wing, Alan M; Rotshtein, Pia

    2017-05-01

    The present study draws together two distinct lines of enquiry into the selection and control of sequential action: motor sequence production and action selection in everyday tasks. Participants were asked to build 2 different Lego walls. The walls were designed to have hierarchical structures with shared and dissociated colors and spatial components. Participants built 1 wall at a time, under low and high load cognitive states. Selection times for correctly completed trials were measured using 3-dimensional motion tracking. The paradigm enabled precise measurement of the timing of actions, while using real objects to create an end product. The experiment demonstrated that action selection was slowed at decision boundary points, relative to boundaries where no between-wall decision was required. Decision points also affected selection time prior to the actual selection window. Dual-task conditions increased selection errors. Errors mostly occurred at boundaries between chunks and especially when these required decisions. The data support hierarchical control of sequenced behavior. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  1. Sequential experimental design based generalised ANOVA

    Energy Technology Data Exchange (ETDEWEB)

    Chakraborty, Souvik, E-mail: csouvik41@gmail.com; Chowdhury, Rajib, E-mail: rajibfce@iitr.ac.in

    2016-07-15

    Over the last decade, surrogate modelling techniques have gained wide popularity in the fields of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and on regression/interpolation to generate the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution-adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component functions using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been applied to predicting the probability of failure of three structural mechanics problems. It is observed that the proposed approach yields accurate and computationally efficient estimates of the failure probability.

  2. Social Influences in Sequential Decision Making.

    Science.gov (United States)

    Schöbel, Markus; Rieskamp, Jörg; Huber, Rafael

    2016-01-01

    People often make decisions in a social environment. The present work examines social influence on people's decisions in a sequential decision-making situation. In the first experimental study, we implemented an information cascade paradigm, illustrating that people infer information from decisions of others and use this information to make their own decisions. We followed a cognitive modeling approach to elicit the weight people give to social as compared to private individual information. The proposed social influence model shows that participants overweight their own private information relative to social information, contrary to the normative Bayesian account. In our second study, we embedded the abstract decision problem of Study 1 in a medical decision-making problem. We examined whether in a medical situation people also take others' authority into account in addition to the information that their decisions convey. The social influence model illustrates that people weight social information differentially according to the authority of other decision makers. The influence of authority was strongest when an authority's decision contrasted with private information. Both studies illustrate how the social environment provides sources of information that people integrate differently for their decisions.
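    The weighting scheme discussed above can be illustrated with a toy simulation. The sketch below is not the authors' model; it assumes a standard log-odds information-cascade setup in which each agent's private signal matches the true state with probability q, and a hypothetical parameter w_private > 1 stands in for the reported overweighting of private information.

```python
import numpy as np

def simulate_cascade(n_agents=10, q=0.7, w_private=1.5, rng=None):
    """Simulate a simple sequential (information-cascade) decision chain.

    Each agent receives a private binary signal that matches the true state
    with probability q, observes all previous choices, and combines both
    sources of evidence in log-odds form. w_private > 1 overweights the
    private signal relative to the normative Bayesian weight of 1.
    """
    rng = np.random.default_rng(rng)
    true_state = 1                      # fixed without loss of generality
    llr_signal = np.log(q / (1 - q))    # log-likelihood ratio of one signal
    choices = []
    for _ in range(n_agents):
        signal = true_state if rng.random() < q else 1 - true_state
        private_evidence = llr_signal if signal == 1 else -llr_signal
        # treat each earlier choice as if it revealed that agent's private signal
        social_evidence = sum(llr_signal if c == 1 else -llr_signal for c in choices)
        posterior_log_odds = w_private * private_evidence + social_evidence
        choices.append(1 if posterior_log_odds > 0 else 0)
    return choices

print(simulate_cascade(rng=0))
```

    With w_private = 1 the rule reduces to the normative Bayesian account; larger values reproduce, qualitatively, the overweighting of private information reported above.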

  3. Bilateral Giant Retinal Tear and Sequential Vitrectomy.

    Science.gov (United States)

    Mustapha, Mushawiahti; Roufail Franzco, Edward

    2017-01-01

    To describe the excellent outcome of surgery for bilateral giant retinal tears (GRTs) with better options of endotamponade. This is a case report of a 62-year-old man who presented with bilateral GRTs and associated retinal detachment. The tear in the right eye was supero-temporal and silicone oil was used as an endotamponade. The tear in the left eye was infero-temporal and perfluorocarbon liquid was used as an endotamponade. The outcome at 6 months after surgery was excellent with visual acuities of 6/6 in both eyes. Improved availability of endotamponade agents allows repair of bilateral GRTs to be done at the same time, with good surgical outcomes.

  4. Comparison of ablation centration after bilateral sequential versus simultaneous LASIK.

    Science.gov (United States)

    Lin, Jane-Ming; Tsai, Yi-Yu

    2005-01-01

    To compare ablation centration after bilateral sequential and simultaneous myopic LASIK. A retrospective randomized case series was performed of 670 eyes of 335 consecutive patients who had undergone either bilateral sequential (group 1) or simultaneous (group 2) myopic LASIK between July 2000 and July 2001 at the China Medical University Hospital, Taichung, Taiwan. The ablation centrations of the first and second eyes in the two groups were compared 3 months postoperatively. Of 670 eyes, 274 eyes (137 patients) comprised the sequential group and 396 eyes (198 patients) comprised the simultaneous group. Three months post-operatively, 220 eyes of 110 patients (80%) in the sequential group and 236 eyes of 118 patients (60%) in the simultaneous group provided topographic data for centration analysis. For the first eyes, mean decentration was 0.39 +/- 0.26 mm in the sequential group and 0.41 +/- 0.19 mm in the simultaneous group (P = .30). For the second eyes, mean decentration was 0.28 +/- 0.23 mm in the sequential group and 0.30 +/- 0.21 mm in the simultaneous group (P = .36). Decentration in the second eyes significantly improved in both groups (group 1, P = .02; group 2, P sequential group and 0.32 +/- 0.18 mm in the simultaneous group (P = .33). The difference of ablation center angles between the first and second eyes was 43.2 sequential group and 45.1 +/- 50.8 degrees in the simultaneous group (P = .42). Simultaneous bilateral LASIK is comparable to sequential surgery in ablation centration.

  5. The sequential trauma score - a new instrument for the sequential mortality prediction in major trauma*

    Directory of Open Access Journals (Sweden)

    Huber-Wagner S

    2010-05-01

    Full Text Available Abstract Background There are several well-established scores for the assessment of the prognosis of major trauma patients, which all have in common that they can be calculated at the earliest during the intensive care unit stay. We intended to develop a sequential trauma score (STS) that allows prognosis at several early stages, based on the information available at a particular time. Study design In a retrospective, multicenter study using data derived from the Trauma Registry of the German Trauma Society (2002-2006), we identified the most relevant prognostic factors from the patients' basic data (P), prehospital phase (A), early (B1), and late (B2) trauma room phase. Univariate and logistic regression models as well as score quality criteria and the explanatory power have been calculated. Results A total of 2,354 patients with complete data were identified. From the patients' basic data (P), logistic regression showed that age was a significant predictor of survival (AUC model P, area under the curve = 0.63). Logistic regression of the prehospital data (A) showed that blood pressure, pulse rate, Glasgow coma scale (GCS), and anisocoria were significant predictors (AUC model A = 0.76; AUC model P + A = 0.82). Logistic regression of the early trauma room phase (B1) showed peripheral oxygen saturation, GCS, anisocoria, base excess, and thromboplastin time to be significant predictors of survival (AUC model B1 = 0.78; AUC model P + A + B1 = 0.85). Multivariate analysis of the late trauma room phase (B2) detected cardiac massage, abbreviated injury score (AIS) of the head ≥ 3, the maximum AIS, and the need for transfusion or massive blood transfusion to be the most important predictors (AUC model B2 = 0.84; AUC final model P + A + B1 + B2 = 0.90). The explanatory power - a tool for the assessment of the relative impact of each segment on mortality - is 25% for P, 7% for A, 17% for B1 and 51% for B2. A spreadsheet for the easy calculation of the sequential trauma score is provided.

  6. Improved Murine Blastocyst Quality and Development in a Single Culture Medium Compared to Sequential Culture Media.

    Science.gov (United States)

    Hennings, Justin M; Zimmer, Randall L; Nabli, Henda; Davis, J Wade; Sutovsky, Peter; Sutovsky, Miriam; Sharpe-Timms, Kathy L

    2016-03-01

    Validate single versus sequential culture media for murine embryo development. Prospective laboratory experiment. Assisted Reproduction Laboratory. Murine embryos. Thawed murine zygotes were cultured for 3 or 5 days (d3 or d5) in single or sequential embryo culture media developed for human in vitro fertilization. On d3, zygotes developing to the 8-cell (8C) stage or greater were quantified using 4',6-diamidino-2-phenylindole (DAPI), and quality was assessed by morphological analysis. On d5, the number of embryos reaching the blastocyst stage was counted. DAPI was used to quantify total nuclei and inner cell mass nuclei. Localization of ubiquitin C-terminal hydrolase L1 (UCHL1) and ubiquitin C-terminal hydrolase L3 (UCHL3) served as reference points for evaluating cell quality. Comparing outcomes in single versus sequential media, the odds of embryos developing to the 8C stage on d3 were 2.34 times greater (P = .06). On d5, more embryos reached the blastocyst stage in the single medium than in sequential culture. Human embryo studies are needed. © The Author(s) 2015.

  7. Estimating the Potential Impact of Tobacco Control Policies on Adverse Maternal and Child Health Outcomes in the United States Using the SimSmoke Tobacco Control Policy Simulation Model.

    Science.gov (United States)

    Levy, David; Mohlman, Mary Katherine; Zhang, Yian

    2016-05-01

    Numerous studies document the causal relationship between prenatal smoking and adverse maternal and child health (MCH) outcomes. Studies also reveal the impact that tobacco control policies have on prenatal smoking. The purpose of this study is to estimate the effect of tobacco control policies on prenatal smoking prevalence and adverse MCH outcomes. The US SimSmoke simulation model was extended to consider adverse MCH outcomes. The model estimates prenatal smoking prevalence and, applying standard attribution methods, uses estimates of MCH prevalence and relative smoking risks to estimate smoking-attributable MCH outcomes over time. The model then estimates the effect of tobacco control policies on adverse birth outcomes averted. Different tobacco control policies have varying impacts on the number of smoking-attributable adverse MCH birth outcomes. Higher cigarette taxes and comprehensive marketing bans individually have the biggest impact with a 5% to 10% reduction across all outcomes for the period from 2015 to 2065. The policies with the lowest impact (2%-3% decrease) during this period are cessation treatment, health warnings, and complete smoke-free laws. Combinations of all policies with each tax level lead to 23% to 28% decreases across all outcomes. Our findings demonstrate the substantial impact of strong tobacco control policies for preventing adverse MCH outcomes, including long-term health implications for children exposed to low birth weight and preterm birth. These benefits are often overlooked in discussions of tobacco control. © The Author 2015. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
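    As a rough illustration of the standard attribution arithmetic referred to above, the sketch below computes a population attributable fraction (PAF) and the corresponding number of smoking-attributable outcomes. The prevalence, relative risk, and case count are placeholder values, not SimSmoke estimates.

```python
def smoking_attributable_cases(prevalence, relative_risk, total_cases):
    """Standard attribution arithmetic: population attributable fraction (PAF)
    times the total number of adverse outcomes."""
    excess = prevalence * (relative_risk - 1.0)
    paf = excess / (excess + 1.0)
    return paf, paf * total_cases

# Illustrative numbers only (not SimSmoke estimates): 10% prenatal smoking
# prevalence, relative risk of 1.8 for low birth weight, 300,000 cases/year.
paf, attributable = smoking_attributable_cases(0.10, 1.8, 300_000)
print(f"PAF = {paf:.3f}, attributable cases = {attributable:,.0f}")
```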

  8. Radiotherapy with concurrent or sequential temozolomide in elderly patients with glioblastoma multiforme

    International Nuclear Information System (INIS)

    Hashem, Sameh A.; Salem, Ahmed; Al-Rashdan, Abdulla

    2012-01-01

    The objective of this article was to evaluate therapeutic outcomes of elderly patients with glioblastoma multiforme (GBM) treated with surgery followed by combined modality therapy, and to compare achievable outcomes with those of a younger population. Seventy-eight adult patients with histologically confirmed grade IV astrocytoma were treated at King Hussein Cancer Center (Amman, Jordan) between September 2004 and December 2008. Records were retrospectively reviewed and included 55 males and 23 females between 19 and 78 years of age (median age 50 years). This case series included 20 patients aged 60 years or older. All patients underwent craniotomy followed by radiotherapy and concurrent or sequential temozolomide. The follow-up ranged from 1 to 56 months (median 9.4 months). The median survival for the whole cohort was 13.8 months. The median survival was 14.3 months for patients less than 60 years and 12.3 months for patients 60 years or older (P = 0.19). Among elderly patients, radical surgical resection (P = 0.002), concurrent delivery of chemoradiation (P = 0.041) and a radiotherapy dose ≥5400 cGy (P = 0.0001) conferred statistically significant improvements in overall survival. Management of GBM in elderly patients should include maximal surgical resection followed by radiotherapy and temozolomide whenever medically feasible. Outcomes comparable to those obtained in younger age groups can be expected. Our results indicate that concurrent chemoradiation is superior to sequential chemoradiation in these patients.

  9. Fast sequential Monte Carlo methods for counting and optimization

    CERN Document Server

    Rubinstein, Reuven Y; Vaisman, Radislav

    2013-01-01

    A comprehensive account of the theory and application of Monte Carlo methods. Based on years of research in efficient Monte Carlo methods for estimation of rare-event probabilities, counting problems, and combinatorial optimization, Fast Sequential Monte Carlo Methods for Counting and Optimization is a complete illustration of fast sequential Monte Carlo techniques. The book provides an accessible overview of current work in the field of Monte Carlo methods, specifically sequential Monte Carlo techniques, for solving abstract counting and optimization problems. Written by authorities in the field.

  10. Program For Parallel Discrete-Event Simulation

    Science.gov (United States)

    Beckman, Brian C.; Blume, Leo R.; Geiselman, John S.; Presley, Matthew T.; Wedel, John J., Jr.; Bellenot, Steven F.; Diloreto, Michael; Hontalas, Philip J.; Reiher, Peter L.; Weiland, Frederick P.

    1991-01-01

    User does not have to add any special logic to aid in synchronization. Time Warp Operating System (TWOS) computer program is special-purpose operating system designed to support parallel discrete-event simulation. Complete implementation of Time Warp mechanism. Supports only simulations and other computations designed for virtual time. Time Warp Simulator (TWSIM) subdirectory contains sequential simulation engine interface-compatible with TWOS. TWOS and TWSIM written in, and support simulations in, C programming language.
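    TWOS and TWSIM themselves are written in C; the sketch below is only a generic illustration of the sequential discrete-event engine idea (a time-ordered event queue processed one event at a time), not code from either system.

```python
import heapq

def run_sequential_des(initial_events, handlers, until=float("inf")):
    """Minimal sequential discrete-event loop: pop the earliest event,
    call its handler, and schedule any events the handler returns."""
    queue = list(initial_events)        # items are (time, event_name, payload)
    heapq.heapify(queue)
    while queue:
        time, name, payload = heapq.heappop(queue)
        if time > until:
            break
        for new_event in handlers[name](time, payload) or []:
            heapq.heappush(queue, new_event)

# Toy arrival process: each arrival schedules the next one until five have occurred.
def arrival(t, k):
    print(f"t={t:.2f}: arrival {k}")
    return [(t + 1.3, "arrival", k + 1)] if k < 5 else []

run_sequential_des([(0.0, "arrival", 1)], {"arrival": arrival})
```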

  11. Sequential formation of subgroups in OB associations

    International Nuclear Information System (INIS)

    Elmegreen, B.G.; Lada, C.J.

    1977-01-01

    We reconsider the structure and formation of OB associations in view of recent radio and infrared observations of the adjacent molecular clouds. As a result of this reexamination, we propose that OB subgroups are formed in a step-by-step process which involves the propagation of ionization (I) and shock (S) fronts through a molecular cloud complex. OB stars formed at the edge of a molecular cloud drive these I-S fronts into the cloud. A layer of dense neutral material accumulates between the I and S fronts and eventually becomes gravitationally unstable. This process is analyzed in detail. Several arguments concerning the temperature and mass of this layer suggest that a new OB subgroup will form. After approximately one-half million years, these stars will emerge from and disrupt the star-forming layer. A new shock will then be driven into the remaining molecular cloud and will initiate another cycle of star formation. Several observed properties of OB associations are shown to follow from a sequential star-forming mechanism. These include the spatial separation and systematic age differences of OB subgroups in a given association, the regularity of subgroup masses, the alignment of subgroups along the galactic plane, and their physical expansion. Detailed observations of ionization fronts, masers, IR sources, and molecular clouds are also in agreement with this model. Finally, this mechanism provides a means of dissipating a molecular cloud and exposing less massive stars (e.g., T Tauri stars) which may have formed ahead of the shock as parts of the original cloud collapsed and fragmented.

  12. District heating in sequential energy supply

    International Nuclear Information System (INIS)

    Persson, Urban; Werner, Sven

    2012-01-01

    Highlights: European excess heat recovery and utilisation by district heat distribution. Heat recovery in district heating systems as a structural energy efficiency measure. Introduction of new theoretical concepts to express excess heat recovery. Fourfold potential for excess heat utilisation in EU27 compared to current levels. Large-scale excess heat recovery as a collaborative challenge for future Europe. Abstract: Increased recovery of excess heat from thermal power generation and industrial processes has great potential to reduce primary energy demands in EU27. In this study, current excess heat utilisation levels by means of district heat distribution are assessed and expressed by concepts such as recovery efficiency, heat recovery rate, and heat utilisation rate. For two chosen excess heat activities, current average EU27 heat recovery levels are compared to currently best Member State practices, whereby future potentials of European excess heat recovery and utilisation are estimated. The principle of sequential energy supply is elaborated to capture the conceptual idea of excess heat recovery in district heating systems as a structural and organisational energy efficiency measure. The general conditions discussed concerning expansion of heat recovery into district heating systems include infrastructure investments in district heating networks, collaboration agreements, maintained value chains, policy support, world market energy prices, allocation of synergy benefits, and local initiatives. The main conclusion from this study is that a future fourfold increase of current EU27 excess heat utilisation by means of district heat distribution to residential and service sectors is conceived as plausible if applying best Member State practice. This estimation is higher than the threefold increase with respect to direct feasible distribution costs estimated by the same authors in a previous study. Hence, no direct barriers appear with

  13. Large-scale sequential quadratic programming algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
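    For orientation, the sketch below shows the bare SQP iteration for an equality-constrained problem: at each iterate, solve the KKT system of the local quadratic model to obtain a step. It uses a dense solve and a user-supplied Hessian approximation, so it illustrates only the basic idea and none of the large-scale, reduced-Hessian machinery described above.

```python
import numpy as np

def sqp_equality(f_grad, f_hess, c, c_jac, x0, iters=20):
    """Bare-bones SQP for min f(x) s.t. c(x) = 0: at each iterate solve the
    KKT system of the local quadratic model for the step p (multipliers are
    computed but discarded here). f_hess should approximate the Hessian of
    the Lagrangian; with linear constraints it equals the Hessian of f."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g, H = f_grad(x), f_hess(x)
        cv, A = c(x), c_jac(x)
        m = len(cv)
        kkt = np.block([[H, A.T], [A, np.zeros((m, m))]])
        rhs = -np.concatenate([g, cv])
        sol = np.linalg.solve(kkt, rhs)
        x += sol[:len(x)]
    return x

# Example: minimize x^2 + y^2 subject to x + y = 1 (solution (0.5, 0.5)).
sol = sqp_equality(
    f_grad=lambda x: 2 * x,
    f_hess=lambda x: 2 * np.eye(2),
    c=lambda x: np.array([x[0] + x[1] - 1.0]),
    c_jac=lambda x: np.array([[1.0, 1.0]]),
    x0=[2.0, -3.0],
)
print(sol)
```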

  14. An Efficient System Based On Closed Sequential Patterns for Web Recommendations

    OpenAIRE

    Utpala Niranjan; R.B.V. Subramanyam; V-Khana

    2010-01-01

    Sequential pattern mining has, since its introduction, received considerable attention among researchers, with broad applications. Sequential pattern algorithms generally face problems when mining long sequential patterns or when using a very low support threshold. One possible solution to such problems is to mine closed sequential patterns, which are a condensed representation of sequential patterns. Recently, several researchers have utilized sequential pattern discovery for d...

  15. Sunk costs equal sunk boats? The effect of entry costs in a transboundary sequential fishery

    DEFF Research Database (Denmark)

    Punt, M. J.

    2017-01-01

    ... that for other fisheries substantial sunk investments are needed. In this paper I investigate the effect of such sunk entry costs in a sequential fishery. I model the uncertainty as a shock to the stock-dependent fishing costs in a two-player game where one of the players faces sunk entry costs. I find that, depending on parameters, sunk costs can (i) increase the competitive pressure on the fish stock compared to a game where entry is free, (ii) act as a deterrence mechanism, and (iii) act as a commitment device. I conclude that entry costs can play a crucial role because they can change the outcome of the game...

  16. Adrenal vein sampling in primary aldosteronism: concordance of simultaneous vs sequential sampling.

    Science.gov (United States)

    Almarzooqi, Mohamed-Karji; Chagnon, Miguel; Soulez, Gilles; Giroux, Marie-France; Gilbert, Patrick; Oliva, Vincent L; Perreault, Pierre; Bouchard, Louis; Bourdeau, Isabelle; Lacroix, André; Therasse, Eric

    2017-02-01

    Many investigators believe that basal adrenal venous sampling (AVS) should be done simultaneously, whereas others opt for sequential AVS for simplicity and reduced cost. This study aimed to evaluate the concordance of sequential and simultaneous AVS methods. Between 1989 and 2015, bilateral simultaneous sets of basal AVS were obtained twice within 5 min in 188 consecutive patients (59 women and 129 men; mean age: 53.4 years). Selectivity was defined by an adrenal-to-peripheral cortisol ratio ≥2, and lateralization was defined as an adrenal aldosterone-to-cortisol ratio ≥2 times that of the contralateral side. Sequential AVS was simulated using right sampling at -5 min (t = -5) and left sampling at 0 min (t = 0). There was no significant difference in mean selectivity ratio (P = 0.12 and P = 0.42 for the right and left sides respectively) or in mean lateralization ratio (P = 0.93) between t = -5 and t = 0. Kappa for selectivity between the 2 simultaneous AVS sets was 0.71 (95% CI: 0.60-0.82), whereas it was 0.84 (95% CI: 0.76-0.92) and 0.85 (95% CI: 0.77-0.93) between sequential and simultaneous AVS at -5 min and at 0 min, respectively. Kappa for lateralization between the 2 simultaneous AVS sets was 0.84 (95% CI: 0.75-0.93), whereas it was 0.86 (95% CI: 0.78-0.94) and 0.80 (95% CI: 0.71-0.90) between sequential AVS and simultaneous AVS at -5 min and at 0 min, respectively. Concordance between simultaneous and sequential AVS was not different from that between 2 repeated simultaneous AVS in the same patient. Therefore, better diagnostic performance is not a good argument for selecting either AVS method. © 2017 European Society of Endocrinology.

  17. Sequential versus "sandwich" sequencing of adjuvant chemoradiation for the treatment of stage III uterine endometrioid adenocarcinoma.

    Science.gov (United States)

    Lu, Sharon M; Chang-Halpenny, Christine; Hwang-Graziano, Julie

    2015-04-01

    To compare the efficacy and tolerance of adjuvant chemotherapy and radiotherapy delivered in sequential (chemotherapy followed by radiation) versus "sandwich" fashion (chemotherapy, interval radiation, and remaining chemotherapy) after surgery in patients with FIGO stage III uterine endometrioid adenocarcinoma. From 2004 to 2011, we identified 51 patients treated at our institution fitting the above criteria. All patients received surgical staging followed by adjuvant chemoradiation (external-beam radiation therapy (EBRT) with or without high-dose rate (HDR) vaginal brachytherapy (VB)). Of these, 73% and 27% of patients received their adjuvant therapy in sequential and sandwich fashion, respectively. There were no significant differences in clinical or pathologic factors between patients treated with either regimen. Thirty-nine (76%) patients had stage IIIC disease. The majority of patients received 6 cycles of paclitaxel with carboplatin or cisplatin. Median EBRT dose was 45 Gy and 54% of patients received HDR VB boost (median dose 21 Gy). There were no significant differences in the estimated 5-year overall survival, local progression-free survival, and distant metastasis-free survival between the sequential and sandwich groups: 87% vs. 77% (p=0.37), 89% vs. 100% (p=0.21), and 78% vs. 85% (p=0.79), respectively. No grade 3-4 genitourinary or gastrointestinal toxicities were reported in either group. There was a trend towards higher incidence of grade 3-4 hematologic toxicity in the sandwich group. Adjuvant chemoradiation for FIGO stage III endometrioid uterine cancer given in either sequential or sandwich fashion appears to offer equally excellent early clinical outcomes and acceptably low toxicity. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Sequential ensemble-based optimal design for parameter estimation

    Energy Technology Data Exchange (ETDEWEB)

    Man, Jun [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Zhang, Jiangjiang [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Li, Weixuan [Pacific Northwest National Laboratory, Richland Washington USA; Zeng, Lingzao [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Wu, Laosheng [Department of Environmental Sciences, University of California, Riverside California USA

    2016-10-01

    The ensemble Kalman filter (EnKF) has been widely used in parameter estimation for hydrological models. The focus of most previous studies was to develop more efficient analysis (estimation) algorithms. On the other hand, it is intuitively understandable that a well-designed sampling (data-collection) strategy should provide more informative measurements and subsequently improve the parameter estimation. In this work, a Sequential Ensemble-based Optimal Design (SEOD) method, coupled with EnKF, information theory and sequential optimal design, is proposed to improve the performance of parameter estimation. Based on the first-order and second-order statistics, different information metrics including the Shannon entropy difference (SD), degrees of freedom for signal (DFS) and relative entropy (RE) are used to design the optimal sampling strategy, respectively. The effectiveness of the proposed method is illustrated by synthetic one-dimensional and two-dimensional unsaturated flow case studies. It is shown that the designed sampling strategies can provide more accurate parameter estimation and state prediction compared with conventional sampling strategies. Optimal sampling designs based on various information metrics perform similarly in our cases. The effect of ensemble size on the optimal design is also investigated. Overall, larger ensemble size improves the parameter estimation and convergence of optimal sampling strategy. Although the proposed method is applied to unsaturated flow problems in this study, it can be equally applied in any other hydrological problems.
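    As background for readers unfamiliar with the filter, the sketch below implements a plain stochastic EnKF analysis step (sample covariances plus perturbed observations). It is not the SEOD design procedure itself, and the toy state, observation operator, and noise levels are illustrative assumptions.

```python
import numpy as np

def enkf_analysis(ensemble, obs, obs_operator, obs_cov, rng=None):
    """Stochastic EnKF analysis step: update each ensemble member with a
    perturbed observation using the sample Kalman gain.

    ensemble : (n_ens, n_state) array of prior (forecast) members
    obs      : (n_obs,) observation vector
    """
    rng = np.random.default_rng(rng)
    n_ens = ensemble.shape[0]
    hx = np.array([obs_operator(m) for m in ensemble])       # (n_ens, n_obs)
    x_mean, hx_mean = ensemble.mean(axis=0), hx.mean(axis=0)
    Xp, Yp = ensemble - x_mean, hx - hx_mean
    cov_xy = Xp.T @ Yp / (n_ens - 1)
    cov_yy = Yp.T @ Yp / (n_ens - 1) + obs_cov
    gain = cov_xy @ np.linalg.inv(cov_yy)
    perturbed = obs + rng.multivariate_normal(np.zeros(len(obs)), obs_cov, n_ens)
    return ensemble + (perturbed - hx) @ gain.T

# Toy example: two-parameter state observed directly with noise.
prior = np.random.default_rng(1).normal([1.0, 2.0], 0.5, size=(100, 2))
post = enkf_analysis(prior, obs=np.array([1.2, 1.8]),
                     obs_operator=lambda x: x, obs_cov=0.05 * np.eye(2))
print(post.mean(axis=0))
```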

  19. Virtual Learning Simulations in High School: Effects on Cognitive and Non-cognitive Outcomes and Implications on the Development of STEM Academic and Career Choice

    OpenAIRE

    Thisgaard, Malene; Makransky, Guido

    2017-01-01

    The present study compared the value of using a virtual learning simulation with that of traditional lessons on the topic of evolution, and investigated whether the virtual learning simulation could serve as a catalyst for STEM academic and career development, based on social cognitive career theory. The investigation was conducted using a crossover repeated measures design based on a sample of 128 high school biology/biotech students. The results showed that the virtual learning simulation increas...

  20. Further comments on the sequential probability ratio testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Kulacsy, K. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics

    1997-05-23

    The Bayesian method for belief updating proposed in Racz (1996) is examined. The interpretation of the belief function introduced therein is found, and the method is compared to the classical binary Sequential Probability Ratio Testing method (SPRT). (author).
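    For reference, the classical binary SPRT mentioned above can be sketched in a few lines: accumulate the log-likelihood ratio and stop when it crosses Wald's thresholds log((1-β)/α) or log(β/(1-α)). The Bernoulli hypotheses and error rates below are illustrative, and this is the classical test, not the Bayesian belief-updating method of Racz (1996).

```python
import math

def sprt(observations, p0=0.5, p1=0.7, alpha=0.05, beta=0.05):
    """Classical Wald SPRT for H0: p = p0 vs H1: p = p1 on Bernoulli data.
    Returns ('accept H0' | 'accept H1' | 'continue', number of samples used)."""
    upper = math.log((1 - beta) / alpha)     # accept H1 at or above this
    lower = math.log(beta / (1 - alpha))     # accept H0 at or below this
    llr = 0.0
    for n, x in enumerate(observations, start=1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "continue", len(observations)

print(sprt([1, 1, 0, 1, 1, 1, 1, 1, 0, 1, 1, 1], alpha=0.1, beta=0.1))
```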

  1. Sequential lineups: shift in criterion or decision strategy?

    Science.gov (United States)

    Gronlund, Scott D

    2004-04-01

    R. C. L. Lindsay and G. L. Wells (1985) argued that a sequential lineup enhanced discriminability because it elicited use of an absolute decision strategy. E. B. Ebbesen and H. D. Flowe (2002) argued that a sequential lineup led witnesses to adopt a more conservative response criterion, thereby affecting bias, not discriminability. Height was encoded as absolute (e.g., 6 ft [1.83 m] tall) or relative (e.g., taller than). If a sequential lineup elicited an absolute decision strategy, the principle of transfer-appropriate processing predicted that performance should be best when height was encoded absolutely. Conversely, if a simultaneous lineup elicited a relative decision strategy, performance should be best when height was encoded relatively. The predicted interaction was observed, providing direct evidence for the decision strategies explanation of what happens when witnesses view a sequential lineup.

  2. Relations between the simultaneous and sequential transfer of two nucleons

    International Nuclear Information System (INIS)

    Satchler, G.R.

    1982-01-01

    The author discusses the perturbative treatment of simultaneous and sequential two-nucleon transfer reactions with special regards to the DWBA. As examples the (t,p), (p,t), and (α,d) reactions are considered. (HSI)

  3. Retrieval of sea surface velocities using sequential Ocean Colour ...

    Indian Academy of Sciences (India)

    ... suspended sediment dispersion patterns, in sequential two time-lapsed images. ... surface advective velocities consists essentially of identifying the ... matrix is time consuming, a significant reduction ... Chauhan, P. 2002 Personal Communication.

  4. Process tomography via sequential measurements on a single quantum system

    CSIR Research Space (South Africa)

    Bassa, H

    2015-09-01

    Full Text Available The authors utilize a discrete (sequential) measurement protocol to investigate quantum process tomography of a single two-level quantum system, with an unknown initial state, undergoing Rabi oscillations. The ignorance of the dynamical parameters...

  5. Generalized infimum and sequential product of quantum effects

    International Nuclear Information System (INIS)

    Li Yuan; Sun Xiuhong; Chen Zhengli

    2007-01-01

    The quantum effects for a physical system can be described by the set E(H) of positive operators on a complex Hilbert space H that are bounded above by the identity operator I. For A, B ∈ E(H), the sequential product A∘B = A^{1/2}BA^{1/2} was proposed as a model for sequential quantum measurements. A nice investigation of properties of the sequential product has been carried out [Gudder, S. and Nagy, G., 'Sequential quantum measurements', J. Math. Phys. 42, 5212 (2001)]. In this note, we extend some results of this reference. In particular, a gap in the proof of Theorem 3.2 in this reference is overcome. In addition, some properties of the generalized infimum A ⊓ B are studied.
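    A small numerical illustration of the sequential product may help: for positive matrices A and B bounded above by I, compute A∘B = A^{1/2}BA^{1/2} and check that the result is again an effect. The 2×2 matrices below are arbitrary examples, and the operator square root is taken via an eigendecomposition.

```python
import numpy as np

def operator_sqrt(a):
    """Square root of a positive semidefinite Hermitian matrix via eigendecomposition."""
    w, v = np.linalg.eigh(a)
    return v @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ v.conj().T

def sequential_product(a, b):
    """Sequential product A o B = A^{1/2} B A^{1/2} of two quantum effects."""
    ra = operator_sqrt(a)
    return ra @ b @ ra

# Two effects on C^2 (positive operators bounded above by the identity).
A = np.array([[0.6, 0.2], [0.2, 0.3]])
B = np.array([[0.5, 0.0], [0.0, 0.9]])
AB = sequential_product(A, B)
print(AB)
print(np.all(np.linalg.eigvalsh(AB) >= -1e-12),               # AB is positive
      np.all(np.linalg.eigvalsh(np.eye(2) - AB) >= -1e-12))   # AB is bounded by I
```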

  6. Sequential Low Cost Interventions Double Hand Hygiene Rates ...

    African Journals Online (AJOL)

    Sequential Low Cost Interventions Double Hand Hygiene Rates Among Medical Teams in a Resource Limited Setting. Results of a Hand Hygiene Quality Improvement Project Conducted At University Teaching Hospital of Kigali (Chuk), Kigali, Rwanda.

  7. A sequential mixed methods research approach to investigating HIV ...

    African Journals Online (AJOL)

    2016-09-03

    Sequential mixed methods research is an effective approach for ... show the effectiveness of the research method. ... qualitative data before quantitative datasets ... whereby both types of data are collected simultaneously.

  8. Combined Orbital Fractures: Surgical Strategy of Sequential Repair

    Directory of Open Access Journals (Sweden)

    Su Won Hur

    2015-07-01

    Full Text Available Background: Reconstruction of combined orbital floor and medial wall fractures with a comminuted inferomedial strut (IMS) is challenging and requires careful practice. We present our surgical strategy and postoperative outcomes. Methods: We divided 74 patients who underwent reconstruction of the orbital floor and medial wall concomitantly into a comminuted IMS group (41 patients) and a non-comminuted IMS group (33 patients). In the comminuted IMS group, we first reconstructed the floor stably and then the medial wall by using separate implant pieces. In the non-comminuted IMS group, we reconstructed the floor and the medial wall with a single large implant. Results: In the follow-up of 6 to 65 months, most patients with diplopia improved in the first week except one, who eventually improved at 1 year. All patients with an EOM limitation improved during the first month of follow-up. Enophthalmos (displacement, 2 mm) was observed in two patients. The orbit volume measured on the CT scans was statistically significantly restored in both groups. No complications related to the surgery were observed. Conclusions: We recommend the reconstruction of orbit walls in the comminuted IMS group by using the following surgical strategy: usage of multiple pieces of rigid implants instead of one large implant, sequential repair first of the floor and then of the medial wall, and a focus on the reconstruction of key areas. Our strategy of step-by-step reconstruction has the benefits of easy repair, less surgical trauma, and minimal stress to the surgeon.

  9. "Time sequential high dose of Cytarabine in acute myelocytic leukemia "

    Directory of Open Access Journals (Sweden)

    Ghavamzadeh A

    2003-05-01

    Full Text Available Given preliminary evidence on timed, sequential chemotherapy with high-dose cytosine arabinoside, the current study was initiated to assess the side effects and efficacy of this regimen in patients with newly diagnosed acute myelocytic leukemia (AML). Nineteen adults who were referred to the Hematology-Oncology and Bone Marrow Transplantation (BMT) research center of Tehran University of Medical Sciences were enrolled in a trial from Aug 1999 to Nov 2000. All patients had a Karnofsky score above 60%. At that time, induction therapy consisted of daunorubicin or idarubicin, given at a dose of 60 mg/m² and 12 mg/m² IV respectively on days 1-3, and cytarabine (Ara-C) 100 mg/m² intravenously by continuous infusion on days 1-7, followed by Ara-C 1000 mg/m² given on days 8-10 every 12 hours by IV infusion. Consolidation therapy started after day 35. Of 19 fully evaluable patients, 10 achieved a complete remission, whereas 36.6% of patients died of regeneration failure. The overall survival rate from diagnosis was 55.5% (95% CI, 30.8-78.5) at 6 months for the entire cohort. Disease-free survival was 50% (95% CI, 26-74). Mean time from the start of the regimen to treatment-related death was 20 days (range 17-29). Presenting WBC counts, French-American-British (FAB) classification, sex and age were not useful prognostic variables. Fever, diarrhea, nausea and vomiting, and GI hemorrhage were seen in 19, 6, 4, and 7 patients respectively. The 3+7+3 regimen seems to be a promising approach for AML patients given its high complete remission rate, but more supportive care should be considered. Furthermore, any benefit in long-term outcome cannot be determined, regardless of the choice of post-remission therapy (e.g., G-CSF, appropriate antibiotics, etc.).

  10. Cluster designs to assess the prevalence of acute malnutrition by lot quality assurance sampling: a validation study by computer simulation.

    Science.gov (United States)

    Olives, Casey; Pagano, Marcello; Deitchler, Megan; Hedt, Bethany L; Egge, Kari; Valadez, Joseph J

    2009-04-01

    Traditional lot quality assurance sampling (LQAS) methods require simple random sampling to guarantee valid results. However, cluster sampling has been proposed to reduce the number of random starting points. This study uses simulations to examine the classification error of two such designs, a 67x3 (67 clusters of three observations) and a 33x6 (33 clusters of six observations) sampling scheme to assess the prevalence of global acute malnutrition (GAM). Further, we explore the use of a 67x3 sequential sampling scheme for LQAS classification of GAM prevalence. Results indicate that, for independent clusters with moderate intracluster correlation for the GAM outcome, the three sampling designs maintain approximate validity for LQAS analysis. Sequential sampling can substantially reduce the average sample size that is required for data collection. The presence of intercluster correlation can impact dramatically the classification error that is associated with LQAS analysis.
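    The kind of simulation study described above can be sketched as follows. The decision rule (accept "low prevalence" if the number of cases is at or below a threshold), the threshold itself, and the beta-binomial model used to induce intracluster correlation are illustrative assumptions, not the parameters used by the authors.

```python
import numpy as np

def lqas_accept_rate(true_prev, threshold_cases, n_clusters=67,
                     cluster_size=3, icc=0.1, n_sims=5000, rng=0):
    """Estimate how often a cluster LQAS sample 'accepts' (cases <= threshold)
    when the true GAM prevalence is true_prev. Intracluster correlation is
    induced with a beta-binomial model (ICC = 1 / (a + b + 1))."""
    rng = np.random.default_rng(rng)
    a_plus_b = (1.0 - icc) / icc                     # gives the requested ICC
    a, b = true_prev * a_plus_b, (1.0 - true_prev) * a_plus_b
    cluster_p = rng.beta(a, b, size=(n_sims, n_clusters))
    cases = rng.binomial(cluster_size, cluster_p).sum(axis=1)
    return np.mean(cases <= threshold_cases)

# Illustrative rule: accept "low prevalence" if <= 18 cases among 201 children.
print("accept rate at 5% prevalence :", lqas_accept_rate(0.05, 18))
print("accept rate at 15% prevalence:", lqas_accept_rate(0.15, 18))
```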

  11. Biaxially mechanical tuning of 2-D reversible and irreversible surface topologies through simultaneous and sequential wrinkling.

    Science.gov (United States)

    Yin, Jie; Yagüe, Jose Luis; Boyce, Mary C; Gleason, Karen K

    2014-02-26

    Controlled buckling is a facile means of structuring surfaces. The resulting ordered wrinkling topologies provide surface properties and features desired for multifunctional applications. Here, we study the biaxially dynamic tuning of two-dimensional wrinkled micropatterns under cyclic mechanical stretching/releasing/restretching simultaneously or sequentially. A biaxially prestretched PDMS substrate is coated with a stiff polymer deposited by initiated chemical vapor deposition (iCVD). Applying a mechanical release/restretch cycle in two directions loaded simultaneously or sequentially to the wrinkled system results in a variety of dynamic and tunable wrinkled geometries, the evolution of which is investigated using in situ optical profilometry, numerical simulations, and theoretical modeling. Results show that restretching ordered herringbone micropatterns, created through sequential release of biaxial prestrain, leads to reversible and repeatable surface topography. The initial flat surface and the same wrinkled herringbone pattern are obtained alternatively after cyclic release/restretch processes, owing to the highly ordered structure leaving no avenue for trapping irregular topological regions during cycling as further evidenced by the uniformity of strains distributions and negligible residual strain. Conversely, restretching disordered labyrinth micropatterns created through simultaneous release shows an irreversible surface topology whether after sequential or simultaneous restretching due to creation of irregular surface topologies with regions of highly concentrated strain upon formation of the labyrinth which then lead to residual strains and trapped topologies upon cycling; furthermore, these trapped topologies depend upon the subsequent strain histories as well as the cycle. The disordered labyrinth pattern varies after each cyclic release/restretch process, presenting residual shallow patterns instead of achieving a flat state. The ability to

  12. Temporal characteristics of radiologists’ and novices’ lesion detection in viewing medical images presented rapidly and sequentially

    Directory of Open Access Journals (Sweden)

    Ryoichi Nakashima

    2016-10-01

    Full Text Available Although viewing multiple stacks of medical images presented on a display is a relatively new but useful medical task, little is known about this task. Particularly, it is unclear how radiologists search for lesions in this type of image reading. When viewing cluttered and dynamic displays, continuous motion itself does not capture attention. Thus, it is effective for the target detection that observers’ attention is captured by the onset signal of a suddenly appearing target among the continuously moving distractors (i.e., a passive viewing strategy. This can be applied to stack viewing tasks, because lesions often show up as transient signals in medical images which are sequentially presented simulating a dynamic and smoothly transforming image progression of organs. However, it is unclear whether observers can detect a target when the target appears at the beginning of a sequential presentation where the global apparent motion onset signal (i.e., signal of the initiation of the apparent motion by sequential presentation occurs. We investigated the ability of radiologists to detect lesions during such tasks by comparing the performances of radiologists and novices. Results show that overall performance of radiologists is better than novices. Furthermore, the temporal locations of lesions in CT image sequences, i.e., when a lesion appears in an image sequence, does not affect the performance of radiologists, whereas it does affect the performance of novices. Results indicate that novices have greater difficulty in detecting a lesion appearing early than late in the image sequence. We suggest that radiologists have other mechanisms to detect lesions in medical images with little attention which novices do not have. This ability is critically important when viewing rapid sequential presentations of multiple CT images, such as stack viewing tasks.

  13. Concatenated coding system with iterated sequential inner decoding

    DEFF Research Database (Denmark)

    Jensen, Ole Riis; Paaske, Erik

    1995-01-01

    We describe a concatenated coding system with iterated sequential inner decoding. The system uses convolutional codes of very long constraint length and operates on iterations between an inner Fano decoder and an outer Reed-Solomon decoder.

  14. Multichannel, sequential or combined X-ray spectrometry

    International Nuclear Information System (INIS)

    Florestan, J.

    1979-01-01

    X-ray spectrometer qualities and defects are evaluated for the sequential and multichannel categories. The multichannel X-ray spectrometer has the advantage of time coherency, and its results can be more reproducible; on the other hand, some spatial incoherency limits low-percentage and trace applications, especially when backgrounds are highly variable. In this last case, the sequential X-ray spectrometer regains great usefulness [fr]

  15. A Survey of Multi-Objective Sequential Decision-Making

    OpenAIRE

    Roijers, D.M.; Vamplew, P.; Whiteson, S.; Dazeley, R.

    2013-01-01

    Sequential decision-making problems with multiple objectives arise naturally in practice and pose unique challenges for research in decision-theoretic planning and learning, which has largely focused on single-objective settings. This article surveys algorithms designed for sequential decision-making problems with multiple objectives. Though there is a growing body of literature on this subject, little of it makes explicit under what circumstances special methods are needed to solve multi-obj...

  16. Configural and component processing in simultaneous and sequential lineup procedures

    OpenAIRE

    Flowe, HD; Smith, HMJ; Karoğlu, N; Onwuegbusi, TO; Rai, L

    2015-01-01

    Configural processing supports accurate face recognition, yet it has never been examined within the context of criminal identification lineups. We tested, using the inversion paradigm, the role of configural processing in lineups. Recent research has found that face discrimination accuracy in lineups is better in a simultaneous compared to a sequential lineup procedure. Therefore, we compared configural processing in simultaneous and sequential lineups to examine whether there are differences...

  17. Sequential weak continuity of null Lagrangians at the boundary

    Czech Academy of Sciences Publication Activity Database

    Kalamajska, A.; Kraemer, S.; Kružík, Martin

    2014-01-01

    Roč. 49, 3/4 (2014), s. 1263-1278 ISSN 0944-2669 R&D Projects: GA ČR GAP201/10/0357 Institutional support: RVO:67985556 Keywords : null Lagrangians * nonhomogeneous nonlinear mappings * sequential weak/in measure continuity Subject RIV: BA - General Mathematics Impact factor: 1.518, year: 2014 http://library.utia.cas.cz/separaty/2013/MTR/kruzik-sequential weak continuity of null lagrangians at the boundary.pdf

  18. A Trust-region-based Sequential Quadratic Programming Algorithm

    DEFF Research Database (Denmark)

    Henriksen, Lars Christian; Poulsen, Niels Kjølstad

    This technical note documents the trust-region-based sequential quadratic programming algorithm used in other works by the authors. The algorithm seeks to minimize a convex nonlinear cost function subject to linear inequality constraints and nonlinear equality constraints.

  19. Sequential bayes estimation algorithm with cubic splines on uniform meshes

    International Nuclear Information System (INIS)

    Hossfeld, F.; Mika, K.; Plesser-Walk, E.

    1975-11-01

    After outlining the principles of some recent developments in parameter estimation, a sequential numerical algorithm for generalized curve-fitting applications is presented, combining results from statistical estimation concepts and spline analysis. Due to its recursive nature, the algorithm can be used most efficiently in online experimentation. Using computer-simulated and experimental data, the efficiency and flexibility of this sequential estimation procedure are extensively demonstrated. (orig.) [de]
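    A minimal sketch of a sequential (recursive) Bayesian curve-fitting update is given below. It uses a Gaussian prior over basis-function weights and processes observations one at a time; for brevity a cubic polynomial basis stands in for the cubic-spline basis on uniform meshes used in the report.

```python
import numpy as np

def sequential_bayes_update(mean, cov, phi, y, noise_var):
    """One recursive Bayesian (Gaussian) update of basis-function weights
    after observing y = phi @ w + noise."""
    phi = np.asarray(phi, dtype=float)
    s = phi @ cov @ phi + noise_var          # predictive variance (scalar)
    gain = cov @ phi / s
    mean = mean + gain * (y - phi @ mean)
    cov = cov - np.outer(gain, phi) @ cov
    return mean, cov

# Online curve fitting with a cubic polynomial basis (stand-in for splines).
rng = np.random.default_rng(0)
basis = lambda x: np.array([1.0, x, x**2, x**3])
mean, cov = np.zeros(4), 10.0 * np.eye(4)
for x in rng.uniform(0, 1, 200):
    y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1)
    mean, cov = sequential_bayes_update(mean, cov, basis(x), y, 0.01)
print(mean)
```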

  20. Sequential contrast-enhanced MR imaging of the penis.

    Science.gov (United States)

    Kaneko, K; De Mouy, E H; Lee, B E

    1994-04-01

    To determine the enhancement patterns of the penis at magnetic resonance (MR) imaging. Sequential contrast material-enhanced MR images of the penis in a flaccid state were obtained in 16 volunteers (12 with normal penile function and four with erectile dysfunction). Subjects with normal erectile function showed gradual and centrifugal enhancement of the corpora cavernosa, while those with erectile dysfunction showed poor enhancement with abnormal progression. Sequential contrast-enhanced MR imaging provides additional morphologic information for the evaluation of erectile dysfunction.

  1. Reliability Evaluation of Distribution System Considering Sequential Characteristics of Distributed Generation

    Directory of Open Access Journals (Sweden)

    Sheng Wanxing

    2016-01-01

    Full Text Available Given the randomness of the output power of distributed generation (DG), a reliability evaluation model based on sequential Monte Carlo simulation (SMCS) for distribution systems with DG is proposed. Operating states of the distribution system can be sampled by SMCS in chronological order, and the corresponding output power of DG can be generated. The proposed method has been tested on feeder F4 of the IEEE-RBTS Bus 6 system. The results show that reliability evaluation of a distribution system considering the uncertainty of DG output power can be effectively implemented by SMCS.
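    To indicate what sequential Monte Carlo simulation means in this context, the sketch below samples the up/down chronology of a single repairable component with exponential time-to-failure and time-to-repair and estimates its availability. A full study would extend this to feeder components, load points, and the DG output model, which are omitted here.

```python
import numpy as np

def smcs_availability(failure_rate, repair_rate, horizon_hours=1e6, rng=0):
    """Sequential Monte Carlo simulation of a single repairable component:
    sample up and down durations in chronological order and estimate availability."""
    rng = np.random.default_rng(rng)
    t, up_time = 0.0, 0.0
    while t < horizon_hours:
        ttf = rng.exponential(1.0 / failure_rate)   # time to failure (up duration)
        ttr = rng.exponential(1.0 / repair_rate)    # time to repair (down duration)
        up_time += min(ttf, horizon_hours - t)
        t += ttf + ttr
    return up_time / horizon_hours

# Roughly 0.5 failures/year and a 10 h mean repair time (rates per hour).
print(smcs_availability(failure_rate=0.5 / 8760, repair_rate=1.0 / 10))
```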

  2. A Sequential Optimization Sampling Method for Metamodels with Radial Basis Functions

    Science.gov (United States)

    Pan, Guang; Ye, Pengcheng; Yang, Zhidong

    2014-01-01

    Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is strongly affected by the sampling method. In this paper, a new sequential optimization sampling method is proposed. Based on the new sampling method, metamodels can be constructed repeatedly through the addition of sampling points, namely, extremum points of the metamodel and minimum points of a density function. More accurate metamodels are then obtained by repeating this procedure. The validity and effectiveness of the proposed sampling method are examined by studying typical numerical examples. PMID:25133206
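    The general flavour of such a sequential sampling loop is sketched below: fit a Gaussian RBF metamodel, pick a new point where the current metamodel takes its largest absolute predicted values while staying away from existing samples, evaluate the true model there, and refit. The infill criterion is a deliberately simplified stand-in for the extremum and density-function criteria proposed in the paper.

```python
import numpy as np

def fit_rbf(x, y, eps=2.0):
    """Fit a Gaussian radial-basis-function interpolant through 1-D samples (x, y)."""
    K = np.exp(-(eps * (x[:, None] - x[None, :])) ** 2)
    w = np.linalg.solve(K + 1e-10 * np.eye(len(x)), y)
    return lambda xq: np.exp(-(eps * (np.atleast_1d(xq)[:, None] - x[None, :])) ** 2) @ w

f = lambda x: np.sin(3 * x) + 0.5 * x          # stand-in for an expensive simulation
x = np.linspace(0.0, 3.0, 5)                   # initial experimental design
y = f(x)
grid = np.linspace(0.0, 3.0, 400)
for _ in range(6):                             # sequential infill loop
    surrogate = fit_rbf(x, y)
    pred = surrogate(grid)
    # candidate new points: largest-magnitude predictions, skipping candidates
    # that lie too close to existing samples
    order = np.argsort(np.abs(pred))[::-1]
    new_x = next(g for g in grid[order] if np.min(np.abs(g - x)) > 0.05)
    x, y = np.append(x, new_x), np.append(y, f(new_x))
surrogate = fit_rbf(x, y)
print("max abs error of final metamodel:", np.max(np.abs(surrogate(grid) - f(grid))))
```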

  3. Sequential unconstrained minimization algorithms for constrained optimization

    International Nuclear Information System (INIS)

    Byrne, Charles

    2008-01-01

    The problem of minimizing a function f: R^J → R, subject to constraints on the vector variable x, occurs frequently in inverse problems. Even without constraints, finding a minimizer of f(x) may require iterative methods. We consider here a general class of iterative algorithms that find a solution to the constrained minimization problem as the limit of a sequence of vectors, each solving an unconstrained minimization problem. Our sequential unconstrained minimization algorithm (SUMMA) is an iterative procedure for constrained minimization. At the kth step we minimize the function G_k(x) = f(x) + g_k(x) to obtain x^k. The auxiliary functions g_k: D ⊂ R^J → R_+ are nonnegative on the set D, each x^k is assumed to lie within D, and the objective is to minimize the continuous function f: R^J → R over x in the set C = cl(D), the closure of D. We assume that such minimizers exist and denote one such by x̂. We assume that the functions g_k(x) satisfy the inequalities 0 ≤ g_k(x) ≤ G_{k-1}(x) − G_{k-1}(x^{k-1}), for k = 2, 3, .... Using this assumption, we show that the sequence {f(x^k)} is decreasing and converges to f(x̂). If the restriction of f(x) to D has bounded level sets, which happens if x̂ is unique and f(x) is closed, proper and convex, then the sequence {x^k} is bounded, and f(x*) = f(x̂) for any cluster point x*. Therefore, if x̂ is unique, x* = x̂ and {x^k} → x̂. When x̂ is not unique, convergence can still be obtained in particular cases. The SUMMA includes, as particular cases, the well-known barrier- and penalty-function methods, the simultaneous multiplicative algebraic reconstruction technique (SMART), the proximal minimization algorithm of Censor and Zenios, the entropic proximal methods of Teboulle, as well as certain cases of gradient descent and the Newton–Raphson method. The proof techniques used for SUMMA can be extended to obtain related results
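    One of the particular cases listed above, the classical quadratic-penalty method, is easy to sketch: at step k minimize G_k(x) = f(x) + mu_k * ||c(x)||^2 with an increasing penalty weight mu_k. The inner minimizations below use scipy's BFGS, and the example problem is purely illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def penalty_method(f, constraint, x0, mu0=1.0, growth=10.0, outer_iters=6):
    """Classical quadratic-penalty method: at step k minimize
    G_k(x) = f(x) + mu_k * ||c(x)||^2 with an increasing penalty weight mu_k."""
    x, mu = np.asarray(x0, dtype=float), mu0
    for _ in range(outer_iters):
        G = lambda z, mu=mu: f(z) + mu * np.sum(constraint(z) ** 2)
        x = minimize(G, x, method="BFGS").x      # unconstrained subproblem
        mu *= growth
    return x

# Minimize x^2 + y^2 subject to x + y = 1 (constrained minimizer (0.5, 0.5)).
sol = penalty_method(lambda z: z[0] ** 2 + z[1] ** 2,
                     lambda z: np.array([z[0] + z[1] - 1.0]),
                     x0=[3.0, -1.0])
print(sol)
```

    Each outer step solves an unconstrained problem whose minimizers approach the constrained minimizer as the penalty weight grows, which is exactly the sequential-unconstrained pattern the abstract describes.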

  4. Group-sequential analysis may allow for early trial termination

    DEFF Research Database (Denmark)

    Gerke, Oke; Vilstrup, Mie H; Halekoh, Ulrich

    2017-01-01

    BACKGROUND: Group-sequential testing is widely used in pivotal therapeutic trials, but rarely in diagnostic research, although it may save studies, time, and costs. The purpose of this paper was to demonstrate a group-sequential analysis strategy in an intra-observer study on quantitative FDG-PET/CT measurements. Differences were assumed to be normally distributed, and sequential one-sided hypothesis tests on the population standard deviation of the differences against a hypothesised value of 1.5 were performed, employing an alpha spending function. The fixed-sample analysis (N = 45) was compared with group-sequential analysis strategies comprising one (at N = 23), two (at N = 15, 30), or three interim analyses (at N = 11, 23, 34), respectively, which were defined post hoc. RESULTS: When performing interim analyses with one third and two thirds of patients, sufficient agreement could be concluded after the first interim analysis

  5. Sequential reconstruction of driving-forces from nonlinear nonstationary dynamics

    Science.gov (United States)

    Güntürkün, Ulaş

    2010-07-01

    This paper describes a functional analysis-based method for the estimation of driving-forces from nonlinear dynamic systems. The driving-forces account for the perturbation inputs induced by the external environment or the secular variations in the internal variables of the system. The proposed algorithm is applicable to the problems for which there is too little or no prior knowledge to build a rigorous mathematical model of the unknown dynamics. We derive the estimator conditioned on the differentiability of the unknown system’s mapping, and smoothness of the driving-force. The proposed algorithm is an adaptive sequential realization of the blind prediction error method, where the basic idea is to predict the observables, and retrieve the driving-force from the prediction error. Our realization of this idea is embodied by predicting the observables one-step into the future using a bank of echo state networks (ESN) in an online fashion, and then extracting the raw estimates from the prediction error and smoothing these estimates in two adaptive filtering stages. The adaptive nature of the algorithm enables to retrieve both slowly and rapidly varying driving-forces accurately, which are illustrated by simulations. Logistic and Moran-Ricker maps are studied in controlled experiments, exemplifying chaotic state and stochastic measurement models. The algorithm is also applied to the estimation of a driving-force from another nonlinear dynamic system that is stochastic in both state and measurement equations. The results are judged by the posterior Cramer-Rao lower bounds. The method is finally put into test on a real-world application; extracting sun’s magnetic flux from the sunspot time series.

  6. On Modeling Large-Scale Multi-Agent Systems with Parallel, Sequential and Genuinely Asynchronous Cellular Automata

    International Nuclear Information System (INIS)

    Tosic, P.T.

    2011-01-01

    We study certain types of Cellular Automata (CA) viewed as an abstraction of large-scale Multi-Agent Systems (MAS). We argue that the classical CA model needs to be modified in several important respects, in order to become a relevant and sufficiently general model for the large-scale MAS, and so that thus generalized model can capture many important MAS properties at the level of agent ensembles and their long-term collective behavior patterns. We specifically focus on the issue of inter-agent communication in CA, and propose sequential cellular automata (SCA) as the first step, and genuinely Asynchronous Cellular Automata (ACA) as the ultimate deterministic CA-based abstract models for large-scale MAS made of simple reactive agents. We first formulate deterministic and nondeterministic versions of sequential CA, and then summarize some interesting configuration space properties (i.e., possible behaviors) of a restricted class of sequential CA. In particular, we compare and contrast those properties of sequential CA with the corresponding properties of the classical (that is, parallel and perfectly synchronous) CA with the same restricted class of update rules. We analytically demonstrate failure of the studied sequential CA models to simulate all possible behaviors of perfectly synchronous parallel CA, even for a very restricted class of non-linear totalistic node update rules. The lesson learned is that the interleaving semantics of concurrency, when applied to sequential CA, is not refined enough to adequately capture the perfect synchrony of parallel CA updates. Last but not least, we outline what would be an appropriate CA-like abstraction for large-scale distributed computing insofar as the inter-agent communication model is concerned, and in that context we propose genuinely asynchronous CA. (author)

  7. Local control rate and prognosis after sequential chemoradiation for small cell carcinoma of the bladder

    International Nuclear Information System (INIS)

    Meijer, Richard P.; Meinhardt, Wim; Poel, Henk G. van der; Rhijn, Bas W. van; Kerst, J. Martijn; Pos, Floris J.; Horenblas, Simon; Bex, Axel

    2013-01-01

    The objectives of this study were to assess the long-term outcome and the risk for local recurrence of patients with small cell carcinoma of the bladder (SCCB) treated with neoadjuvant chemotherapy followed by external beam radiotherapy (sequential chemoradiation). All consecutive patients with primary small cell carcinoma of the bladder (n=66), treated in our institution between 1993 and 2011 were retrospectively evaluated from an institutional database. Only patients with limited disease (Tx-4N0-1M0) small cell carcinoma of the bladder treated with sequential chemoradiation (n=27) were included in this study. Recurrence rates, overall survival and cancer-specific survival were analyzed using the Kaplan-Meier method. Median time to recurrence was 20 months, median overall survival 26 months, 5-year overall survival 22.2%, median cancer-specific survival 47 months and 5-year cancer-specific survival 39.6%. For complete responders after neoadjuvant chemotherapy (n=19), median cancer-specific survival was 52 months with a 5-year cancer-specific survival 45.9% versus a median cancer-specific survival of 22 months and 5-year cancer-specific survival 0.0% for incomplete responders (n=8; P=0.034). Eight patients (29.6%) underwent transurethral resections (TUR-BT) for local recurrences in the bladder. At the end of follow up, four patients had undergone cystectomy for recurrence of disease resulting in a bladder-preservation rate of 85.2%. Median time to local recurrence was 29 months and median time to distant recurrence was 10 months. Sequential chemoradiation for limited disease small cell carcinoma of the bladder results in a reasonable outcome with a high bladder preservation rate. Response to neoadjuvant chemotherapy represents a significant prognostic factor in this patient population. (author)

  8. MaxEnt queries and sequential sampling

    International Nuclear Information System (INIS)

    Riegler, Peter; Caticha, Nestor

    2001-01-01

    In this paper we pose the question: After gathering N data points, at what value of the control parameter should the next measurement be done? We propose an on-line algorithm which samples optimally by maximizing the gain in information on the parameters to be measured. We show analytically that the information gain is maximum for those potential measurements whose outcome is most unpredictable, i.e. for which the predictive distribution has maximum entropy. The resulting algorithm is applied to exponential analysis
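
    As a rough illustration of the criterion "measure where the prediction is most uncertain", the sketch below picks the next control-parameter value as the one whose predictive distribution has maximum entropy under a toy Bernoulli model. The candidate grid and the logistic predictive model are assumptions made for this example, not the exponential-analysis application of the paper.

    import numpy as np

    # Toy sketch of the MaxEnt query criterion: among candidate control
    # parameters x, measure next where the predicted outcome is most
    # unpredictable. The Bernoulli/logistic predictive model is an assumption
    # made for illustration only.

    def bernoulli_entropy(p):
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -(p * np.log(p) + (1 - p) * np.log(1 - p))

    def next_measurement(candidates, predictive_prob):
        """Return the candidate x whose predicted outcome has maximum entropy."""
        entropies = bernoulli_entropy(predictive_prob(candidates))
        return candidates[np.argmax(entropies)]

    candidates = np.linspace(-3.0, 3.0, 61)
    predictive = lambda x: 1.0 / (1.0 + np.exp(-2.0 * x))   # current model belief
    print(next_measurement(candidates, predictive))          # -> close to 0.0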

  9. Human visual system automatically encodes sequential regularities of discrete events.

    Science.gov (United States)

    Kimura, Motohiro; Schröger, Erich; Czigler, István; Ohira, Hideki

    2010-06-01

    For our adaptive behavior in a dynamically changing environment, an essential task of the brain is to automatically encode sequential regularities inherent in the environment into a memory representation. Recent studies in neuroscience have suggested that sequential regularities embedded in discrete sensory events are automatically encoded into a memory representation at the level of the sensory system. This notion is largely supported by evidence from investigations using auditory mismatch negativity (auditory MMN), an event-related brain potential (ERP) correlate of an automatic memory-mismatch process in the auditory sensory system. However, it is still largely unclear whether or not this notion can be generalized to other sensory modalities. The purpose of the present study was to investigate the contribution of the visual sensory system to the automatic encoding of sequential regularities using visual mismatch negativity (visual MMN), an ERP correlate of an automatic memory-mismatch process in the visual sensory system. To this end, we conducted a sequential analysis of visual MMN in an oddball sequence consisting of infrequent deviant and frequent standard stimuli, and tested whether the underlying memory representation of visual MMN generation contains only a sensory memory trace of standard stimuli (trace-mismatch hypothesis) or whether it also contains sequential regularities extracted from the repetitive standard sequence (regularity-violation hypothesis). The results showed that visual MMN was elicited by first deviant (deviant stimuli following at least one standard stimulus), second deviant (deviant stimuli immediately following first deviant), and first standard (standard stimuli immediately following first deviant), but not by second standard (standard stimuli immediately following first standard). These results are consistent with the regularity-violation hypothesis, suggesting that the visual sensory system automatically encodes sequential

  10. Sequential reduction of external networks for the security- and short circuit monitor in power system control centers

    Energy Technology Data Exchange (ETDEWEB)

    Dietze, P [Siemens A.G., Erlangen (Germany, F.R.). Abt. ESTE

    1978-01-01

    For the evaluation of the effects of switching operations or the simulation of line, transformer, and generator outages, the influence of interconnected neighboring networks is modelled by network equivalents in the process computer. The basic passive conductivity model is produced by sequential reduction and adapted to fit the active network behavior. The reduction routine uses the admittance matrix, sparse-matrix techniques and optimal ordering; it is suitable for process computer applications.
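
    The passive network equivalent described here amounts to eliminating the external nodes from the nodal admittance matrix. The snippet below is a generic Kron-reduction sketch that assumes a small dense symmetric admittance matrix for clarity; the sparse-matrix and optimal-ordering details of the original process-computer implementation are not reproduced.

    import numpy as np

    def kron_reduce(Y, keep):
        """Eliminate all nodes not listed in `keep` from admittance matrix Y.

        Y_red = Y_kk - Y_ke @ inv(Y_ee) @ Y_ek, the standard network
        equivalent seen from the retained (boundary) nodes.
        """
        keep = np.asarray(keep)
        elim = np.setdiff1d(np.arange(Y.shape[0]), keep)
        Ykk = Y[np.ix_(keep, keep)]
        Yke = Y[np.ix_(keep, elim)]
        Yek = Y[np.ix_(elim, keep)]
        Yee = Y[np.ix_(elim, elim)]
        return Ykk - Yke @ np.linalg.solve(Yee, Yek)

    # Example: reduce a 4-node network to its two boundary nodes 0 and 1.
    Y = np.array([[ 3., -1., -1., -1.],
                  [-1.,  3., -1., -1.],
                  [-1., -1.,  3., -1.],
                  [-1., -1., -1.,  3.]])
    print(kron_reduce(Y, keep=[0, 1]))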

  11. Using Visual Simulation Tools And Learning Outcomes-Based Curriculum To Help Transportation Engineering Students And Practitioners To Better Understand And Design Traffic Signal Control Systems

    Science.gov (United States)

    2012-06-01

    The use of visual simulation tools to convey complex concepts has become widespread in education as well as in research. This report describes a project that developed curriculum and visualization tools to train transportation engineering studen...

  12. Native Frames: Disentangling Sequential from Concerted Three-Body Fragmentation

    Science.gov (United States)

    Rajput, Jyoti; Severt, T.; Berry, Ben; Jochim, Bethany; Feizollah, Peyman; Kaderiya, Balram; Zohrabi, M.; Ablikim, U.; Ziaee, Farzaneh; Raju P., Kanaka; Rolles, D.; Rudenko, A.; Carnes, K. D.; Esry, B. D.; Ben-Itzhak, I.

    2018-03-01

    A key question concerning the three-body fragmentation of polyatomic molecules is the distinction of sequential and concerted mechanisms, i.e., the stepwise or simultaneous cleavage of bonds. Using laser-driven fragmentation of OCS into O+ + C+ + S+ and employing coincidence momentum imaging, we demonstrate a novel method that enables the clear separation of sequential and concerted breakup. The separation is accomplished by analyzing the three-body fragmentation in the native frame associated with each step and taking advantage of the rotation of the intermediate molecular fragment, CO^2+ or CS^2+, before its unimolecular dissociation. This native-frame method works for any projectile (electrons, ions, or photons), provides details on each step of the sequential breakup, and enables the retrieval of the relevant spectra for sequential and concerted breakup separately. Specifically, this allows the determination of the branching ratio of all these processes in OCS^3+ breakup. Moreover, we find that the first step of sequential breakup is tightly aligned along the laser polarization and identify the likely electronic states of the intermediate dication that undergo unimolecular dissociation in the second step. Finally, the separated concerted breakup spectra show clearly that the central carbon atom is preferentially ejected perpendicular to the laser field.

  13. Synthesizing genetic sequential logic circuit with clock pulse generator.

    Science.gov (United States)

    Chuang, Chia-Hua; Lin, Chun-Liang

    2014-05-28

    Rhythmic clocks occur widely in biological systems and control several aspects of cell physiology; different cell types are supplied with various rhythmic frequencies. How to synthesize a specific clock signal is a preliminary but necessary step toward the further development of a biological computer in the future. This paper presents a genetic sequential logic circuit with a clock pulse generator based on a synthesized genetic oscillator, which generates a consecutive clock signal whose frequency is an inverse integer multiple of that of the genetic oscillator. An analogous electronic waveform-shaping circuit is constructed from a series of genetic buffers to shape the logic high/low levels of an oscillation input in a basic sinusoidal cycle and generate a pulse-width-modulated (PWM) output with various duty cycles. By controlling the threshold level of the genetic buffer, a genetic clock pulse signal with a frequency consistent with the genetic oscillator is synthesized. A synchronous genetic counter circuit based on the topology of the digital sequential logic circuit is triggered by the clock pulse to synthesize the clock signal with an inverse multiple frequency to the genetic oscillator. The function acts like a frequency divider in electronic circuits, which plays a key role in sequential logic circuits with a specific operational frequency. A cascaded genetic logic circuit generating clock pulse signals is proposed. Based on the analogous implementation of digital sequential logic circuits, genetic sequential logic circuits can be constructed by the proposed approach to generate various clock signals from an oscillation signal.
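
    The signal-processing idea, thresholding a sinusoidal oscillation into a pulse train and then dividing its frequency with a counter that toggles every N rising edges, can be sketched numerically. The sketch below is a purely numerical illustration with assumed parameter values; it is not a model of the genetic circuit itself.

    import numpy as np

    # Numerical sketch of the waveform-shaping idea: threshold a sinusoidal
    # oscillation into a clock pulse train (PWM-like), then divide its
    # frequency with a counter that toggles every n rising edges.
    # Oscillator frequency, threshold and divide ratio are assumed values.

    t = np.linspace(0.0, 10.0, 10_000)
    oscillator = np.sin(2 * np.pi * 1.0 * t)          # 1 Hz oscillation
    threshold = 0.3                                    # buffer switching level
    clock = (oscillator > threshold).astype(int)       # shaped clock pulses

    def divide_frequency(clock, n):
        """Toggle an output every n rising edges -> frequency divided by 2*n."""
        out, state, edges = [], 0, 0
        for prev, curr in zip(clock[:-1], clock[1:]):
            if prev == 0 and curr == 1:                # rising edge
                edges += 1
                if edges == n:
                    state ^= 1
                    edges = 0
            out.append(state)
        return np.array(out)

    divided = divide_frequency(clock, n=2)             # ~0.25 Hz square wave
    print(clock.sum(), divided.sum())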

  14. Energy-separated sequential irradiation for ripple pattern tailoring on silicon surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Tanuj [Department of Physics, Central University of Haryana, Jant-Pali, Mahendergarh 1123029 (India); Inter University Accelerator Centre, Aruna Asaf Ali Marg, New Delhi 110067 (India); Kumar, Manish, E-mail: manishbharadwaj@gmail.com [Department of Physics, Central University of Rajasthan, Kishangarh 305801 (India); Panchal, Vandana [Department of Physics, National Institute of Technology, Kurukshetra 136119 (India); Sahoo, P.K. [School of Physical Sciences, National Institute of Science Education and Research, Bhubaneswar 751005 (India); Kanjilal, D. [Inter University Accelerator Centre, Aruna Asaf Ali Marg, New Delhi 110067 (India)

    2015-12-01

    Highlights: • A new process for controlling the near-surface amorphization of ripples on Si surfaces. • Ripple generation by 100 keV Ar+ and amorphization control by 60 keV Ar+ irradiation. • Advantage of energy-separated irradiation demonstrated by detailed RBS and AFM studies. • Relevant mechanism is presented on the basis of DAMAGE and SIMNRA simulations. • Key role of solid flow towards the amorphous/crystalline interface is demonstrated. - Abstract: Nanoscale ripples on semiconductor surfaces have potential applications in biosensing and optoelectronics, but suffer from uncontrolled surface amorphization when prepared by conventional ion-irradiation methods. A two-step, energy-separated sequential irradiation enables simultaneous control of surface amorphization and ripple dimensions on Si(100). The evolution of ripples using 100 keV Ar+ bombardment and further tuning of the patterns using a sequential irradiation by 60 keV Ar+ at different fluences are demonstrated. The advantage of this approach as opposed to increased fluence at the same energy is clarified by atomic force microscopy and Rutherford backscattering spectroscopy investigations. The explanation of our findings is presented through DAMAGE simulation.

  15. Modeling and Predicting AD Progression by Regression Analysis of Sequential Clinical Data

    KAUST Repository

    Xie, Qing

    2016-02-23

    Alzheimer's Disease (AD) is currently attracting much attention in elders' care. With the increasing availability of massive clinical diagnosis data, especially medical images of brain scans, it is highly significant to precisely identify and predict potential AD progression based on the knowledge in the diagnosis data. In this paper, we follow a novel sequential learning framework to model the disease progression for AD patients' care. Different from the conventional approaches using only initial or static diagnosis data to model the disease progression for different durations, we design a score-involved approach and make use of the sequential diagnosis information in different disease stages to jointly simulate the disease progression. The actual clinical scores are utilized along the progression to make the prediction more pertinent and reliable. We examined our approach by extensive experiments on the clinical data provided by the Alzheimer's Disease Neuroimaging Initiative (ADNI). The results indicate that the proposed approach is more effective in simulating and predicting the disease progression compared with existing methods.

  16. Modeling and Predicting AD Progression by Regression Analysis of Sequential Clinical Data

    KAUST Repository

    Xie, Qing; Wang, Su; Zhu, Jia; Zhang, Xiangliang

    2016-01-01

    Alzheimer's Disease (AD) is currently attracting much attention in elders' care. With the increasing availability of massive clinical diagnosis data, especially medical images of brain scans, it is highly significant to precisely identify and predict potential AD progression based on the knowledge in the diagnosis data. In this paper, we follow a novel sequential learning framework to model the disease progression for AD patients' care. Different from the conventional approaches using only initial or static diagnosis data to model the disease progression for different durations, we design a score-involved approach and make use of the sequential diagnosis information in different disease stages to jointly simulate the disease progression. The actual clinical scores are utilized along the progression to make the prediction more pertinent and reliable. We examined our approach by extensive experiments on the clinical data provided by the Alzheimer's Disease Neuroimaging Initiative (ADNI). The results indicate that the proposed approach is more effective in simulating and predicting the disease progression compared with existing methods.
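
    The general idea, regressing a later-stage score on the sequence of earlier scores rather than on baseline data alone, can be sketched very simply. The synthetic data, feature layout and ordinary least-squares fit below are assumptions for illustration; they are not the authors' actual model and do not use ADNI data.

    import numpy as np

    # Minimal sketch (not the authors' model): predict a later clinical score
    # from the sequence of earlier visit scores via ordinary least squares.
    # The simulated trajectories and feature layout are illustrative only.

    rng = np.random.default_rng(0)
    n_patients, n_visits = 200, 4

    # Simulated score trajectories: baseline plus a noisy per-visit decline.
    baseline = rng.normal(25.0, 3.0, size=n_patients)
    decline = rng.normal(1.0, 0.5, size=n_patients)
    visits = baseline[:, None] - decline[:, None] * np.arange(n_visits + 1)
    visits += rng.normal(0.0, 0.8, size=visits.shape)

    X = np.column_stack([np.ones(n_patients), visits[:, :n_visits]])  # past visits
    y = visits[:, n_visits]                                           # next visit

    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    pred = X @ coef
    print("RMSE:", np.sqrt(np.mean((pred - y) ** 2)))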

  17. Safety test No. S-6, launch pad abort sequential test Phase II: solid propellant fire

    International Nuclear Information System (INIS)

    Snow, E.C.

    1975-08-01

    In preparation for the Lincoln Laboratory's LES 8/9 space mission, a series of tests was performed to evaluate the nuclear safety capability of the Multi-Hundred Watt (MHW) Radioisotope Thermoelectric Generator (RTG) to be used to supply power for the satellite. One such safety test is Test No. S-6, Launch Pad Abort Sequential Test. The objective of this test was to subject the RTG and its components to the sequential environments characteristic of a catastrophic launch pad accident to evaluate their capability to contain the 238 PuO 2 fuel. This sequence of environments was to have consisted of the blast overpressure and fragments, followed by the fireball, low velocity impact on the launch pad, and solid propellant fire. The blast overpressure and fragments were subsequently eliminated from this sequence. The procedures and results of Phase II of Test S-6, Solid Propellant Fire are presented. In this phase of the test, a simulant Fuel Sphere Assembly (FSA) and a mockup of a damaged Heat Source Assembly (HSA) were subjected to single proximity solid propellant fires of approximately 10-min duration. Steel was introduced into both tests to simulate the effects of launch pad debris and the solid rocket motor (SRM) casing that might be present in the fire zone. (TFD)

  18. Comparison of sequential and single extraction in order to estimate environmental impact of metals from fly ash

    Directory of Open Access Journals (Sweden)

    Tasić Aleksandra M.

    2016-01-01

    Full Text Available The aim of this paper was to simulate the leaching of metals from fly ash under different environmental conditions using ultrasound- and microwave-assisted extraction techniques. Single-agent extraction and sequential extraction procedures were used to determine the levels of leaching of different metals. The concentrations of metals (Al, Fe, Mn, Cd, Co, Cr, Ni, Pb, Cu, As, Be) in fly ash extracts were measured by Inductively Coupled Plasma-Atomic Emission Spectrometry. Single-agent extractions of metals were conducted with sonication times of 10, 20, 30, 40 and 50 min. Single-agent extraction with deionized water was also undertaken by exposing samples to microwave radiation at a temperature of 50°C. The sequential extraction was undertaken according to the BCR procedure, which was modified and applied to study the partitioning of metals in coal fly ash. The microwave-assisted sequential extraction was performed at different extraction temperatures: 50, 100 and 150°C. The partitioning of metals between the individual fractions was investigated and discussed. The efficiency of the extraction process for each step was examined. In addition, the results of the microwave-assisted sequential extraction are compared to the results obtained by the standard ASTM method. The mobility of most elements contained in fly ash is markedly pH sensitive. [Project of the Ministry of Science of the Republic of Serbia, nos. 172030, 176006 and III43009]

  19. Lexical decoder for continuous speech recognition: sequential neural network approach

    International Nuclear Information System (INIS)

    Iooss, Christine

    1991-01-01

    The work presented in this dissertation concerns the study of a connectionist architecture for processing sequential inputs. In this context, the model proposed by J.L. Elman, a recurrent multilayer network, is used. Its abilities and its limits are evaluated. Modifications are made in order to process erroneous or noisy sequential inputs and to classify patterns. The application context of this study concerns the realisation of a lexical decoder for analytical multi-speaker continuous speech recognition. Lexical decoding is performed from lattices of phonemes which are obtained after an acoustic-phonetic decoding stage relying on a K Nearest Neighbors search technique. Tests are done on sentences formed from a lexicon of 20 words. The results obtained show the ability of the proposed connectionist model to take the sequentiality of the input into account, to memorize the context and to process noisy or erroneous inputs. (author) [fr

  20. Computing Sequential Equilibria for Two-Player Games

    DEFF Research Database (Denmark)

    Miltersen, Peter Bro; Sørensen, Troels Bjerre

    2006-01-01

    Koller, Megiddo and von Stengel showed how to efficiently compute minimax strategies for two-player extensive-form zero-sum games with imperfect information but perfect recall using linear programming and avoiding conversion to normal form. Koller and Pfeffer pointed out that the strategies obtained by the algorithm are not necessarily sequentially rational and that this deficiency is often problematic for the practical applications. We show how to remove this deficiency by modifying the linear programs constructed by Koller, Megiddo and von Stengel so that pairs of strategies forming a sequential equilibrium are computed. In particular, we show that a sequential equilibrium for a two-player zero-sum game with imperfect information but perfect recall can be found in polynomial time. In addition, the equilibrium we find is normal-form perfect. Our technique generalizes to general-sum games...

  1. Computing sequential equilibria for two-player games

    DEFF Research Database (Denmark)

    Miltersen, Peter Bro

    2006-01-01

    Koller, Megiddo and von Stengel showed how to efficiently compute minimax strategies for two-player extensive-form zero-sum games with imperfect information but perfect recall using linear programming and avoiding conversion to normal form. Their algorithm has been used by AI researchers for constructing prescriptive strategies for concrete, often fairly large games. Koller and Pfeffer pointed out that the strategies obtained by the algorithm are not necessarily sequentially rational and that this deficiency is often problematic for the practical applications. We show how to remove this deficiency by modifying the linear programs constructed by Koller, Megiddo and von Stengel so that pairs of strategies forming a sequential equilibrium are computed. In particular, we show that a sequential equilibrium for a two-player zero-sum game with imperfect information but perfect recall can be found in polynomial...
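
    For intuition about the linear-programming route to equilibria, the sketch below solves the normal-form special case: computing a minimax strategy of a zero-sum matrix game as a single LP with scipy. It is only that simplified case; it is neither the sequence-form construction of Koller, Megiddo and von Stengel nor the sequential-equilibrium refinement described in these abstracts.

    import numpy as np
    from scipy.optimize import linprog

    def minimax_strategy(A):
        """Row player's minimax mixed strategy for zero-sum payoff matrix A.

        Variables are (x_1..x_m, v); maximize v subject to A^T x >= v,
        sum(x) = 1, x >= 0. linprog minimizes, so we minimize -v.
        """
        m, n = A.shape
        c = np.concatenate([np.zeros(m), [-1.0]])
        A_ub = np.hstack([-A.T, np.ones((n, 1))])          # v - (A^T x)_j <= 0
        b_ub = np.zeros(n)
        A_eq = np.concatenate([np.ones(m), [0.0]])[None, :]
        b_eq = np.array([1.0])
        bounds = [(0, None)] * m + [(None, None)]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        return res.x[:m], res.x[m]

    # Matching pennies: value 0, uniform strategy.
    strategy, value = minimax_strategy(np.array([[1.0, -1.0], [-1.0, 1.0]]))
    print(strategy, value)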

  2. A Bayesian sequential processor approach to spectroscopic portal system decisions

    Energy Technology Data Exchange (ETDEWEB)

    Sale, K; Candy, J; Breitfeller, E; Guidry, B; Manatt, D; Gosnell, T; Chambers, D

    2007-07-31

    The development of faster, more reliable techniques to detect radioactive contraband in a portal-type scenario is an extremely important problem, especially in this era of constant terrorist threats. Towards this goal, the development of a model-based, Bayesian sequential data processor for the detection problem is discussed. In the sequential processor, each datum (detector energy deposit and pulse arrival time) is used to update the posterior probability distribution over the space of model parameters. The nature of the sequential processor approach is that a detection is produced as soon as it is statistically justified by the data, rather than waiting for a fixed counting interval before any analysis is performed. In this paper the Bayesian model-based approach, the physics and signal processing models, and the decision functions are discussed along with the first results of our research.
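
    To illustrate the "declare as soon as the data justify it" idea, the sketch below performs a sequential Bayesian (probability-ratio style) update over Poisson count data and stops the moment the posterior odds for "source present" cross a threshold. The background and source rates, prior and decision threshold are assumed values, not the physics or signal-processing models of the report.

    import math
    import numpy as np

    # Sketch of a sequential Bayesian detector: update the log-odds that a
    # source is present after every one-second count and stop as soon as the
    # evidence crosses a decision threshold. All rates, priors and thresholds
    # below are assumed illustration values.

    BACKGROUND_RATE = 5.0                 # expected counts/s, background only
    SOURCE_RATE = 8.0                     # expected counts/s with a source
    DECISION_LOG_ODDS = math.log(99.0)    # declare at posterior odds of 99:1

    def log_poisson(k, rate):
        return k * math.log(rate) - rate - math.lgamma(k + 1)

    def sequential_detector(counts, prior_present=0.5):
        log_odds = math.log(prior_present / (1.0 - prior_present))
        for second, k in enumerate(counts, start=1):
            log_odds += log_poisson(k, SOURCE_RATE) - log_poisson(k, BACKGROUND_RATE)
            if log_odds >= DECISION_LOG_ODDS:
                return "source detected", second
            if log_odds <= -DECISION_LOG_ODDS:
                return "background only", second
        return "undecided", len(counts)

    rng = np.random.default_rng(1)
    counts = rng.poisson(SOURCE_RATE, size=60)   # a source is actually present
    print(sequential_detector(counts))           # typically decides well before 60 s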

  3. Configural and component processing in simultaneous and sequential lineup procedures.

    Science.gov (United States)

    Flowe, Heather D; Smith, Harriet M J; Karoğlu, Nilda; Onwuegbusi, Tochukwu O; Rai, Lovedeep

    2016-01-01

    Configural processing supports accurate face recognition, yet it has never been examined within the context of criminal identification lineups. We tested, using the inversion paradigm, the role of configural processing in lineups. Recent research has found that face discrimination accuracy in lineups is better in a simultaneous compared to a sequential lineup procedure. Therefore, we compared configural processing in simultaneous and sequential lineups to examine whether there are differences. We had participants view a crime video, and then they attempted to identify the perpetrator from a simultaneous or sequential lineup. The test faces were presented either upright or inverted, as previous research has shown that inverting test faces disrupts configural processing. The size of the inversion effect for faces was the same across lineup procedures, indicating that configural processing underlies face recognition in both procedures. Discrimination accuracy was comparable across lineup procedures in both the upright and inversion condition. Theoretical implications of the results are discussed.

  4. Visual short-term memory for sequential arrays.

    Science.gov (United States)

    Kumar, Arjun; Jiang, Yuhong

    2005-04-01

    The capacity of visual short-term memory (VSTM) for a single visual display has been investigated in past research, but VSTM for multiple sequential arrays has been explored only recently. In this study, we investigate the capacity of VSTM across two sequential arrays separated by a variable stimulus onset asynchrony (SOA). VSTM for spatial locations (Experiment 1), colors (Experiments 2-4), orientations (Experiments 3 and 4), and conjunction of color and orientation (Experiment 4) were tested, with the SOA across the two sequential arrays varying from 100 to 1,500 msec. We find that VSTM for the trailing array is much better than VSTM for the leading array, but when averaged across the two arrays VSTM has a constant capacity independent of the SOA. We suggest that multiple displays compete for retention in VSTM and that separating information into two temporally discrete groups does not enhance the overall capacity of VSTM.

  5. Binomial outcomes in dataset with some clusters of size two: can the dependence of twins be accounted for? A simulation study comparing the reliability of statistical methods based on a dataset of preterm infants.

    Science.gov (United States)

    Sauzet, Odile; Peacock, Janet L

    2017-07-20

    The analysis of perinatal outcomes often involves datasets with some multiple births. These are datasets mostly formed of independent observations and a limited number of clusters of size two (twins) and maybe of size three or more. This non-independence needs to be accounted for in the statistical analysis. Using simulated data based on a dataset of preterm infants we have previously investigated the performance of several approaches to the analysis of continuous outcomes in the presence of some clusters of size two. Mixed models have been developed for binomial outcomes but very little is known about their reliability when only a limited number of small clusters are present. Using simulated data based on a dataset of preterm infants we investigated the performance of several approaches to the analysis of binomial outcomes in the presence of some clusters of size two. Logistic models, several methods of estimation for the logistic random intercept models and generalised estimating equations were compared. The presence of even a small percentage of twins means that a logistic regression model will underestimate all parameters but a logistic random intercept model fails to estimate the correlation between siblings if the percentage of twins is too small and will provide similar estimates to logistic regression. The method which seems to provide the best balance between estimation of the standard error and the parameter for any percentage of twins is the generalised estimating equations. This study has shown that the number of covariates or the level two variance do not necessarily affect the performance of the various methods used to analyse datasets containing twins but when the percentage of small clusters is too small, mixed models cannot capture the dependence between siblings.

  6. Binomial outcomes in dataset with some clusters of size two: can the dependence of twins be accounted for? A simulation study comparing the reliability of statistical methods based on a dataset of preterm infants

    Directory of Open Access Journals (Sweden)

    Odile Sauzet

    2017-07-01

    Full Text Available Abstract Background The analysis of perinatal outcomes often involves datasets with some multiple births. These are datasets mostly formed of independent observations and a limited number of clusters of size two (twins) and maybe of size three or more. This non-independence needs to be accounted for in the statistical analysis. Using simulated data based on a dataset of preterm infants we have previously investigated the performance of several approaches to the analysis of continuous outcomes in the presence of some clusters of size two. Mixed models have been developed for binomial outcomes but very little is known about their reliability when only a limited number of small clusters are present. Methods Using simulated data based on a dataset of preterm infants we investigated the performance of several approaches to the analysis of binomial outcomes in the presence of some clusters of size two. Logistic models, several methods of estimation for the logistic random intercept models and generalised estimating equations were compared. Results The presence of even a small percentage of twins means that a logistic regression model will underestimate all parameters but a logistic random intercept model fails to estimate the correlation between siblings if the percentage of twins is too small and will provide similar estimates to logistic regression. The method which seems to provide the best balance between estimation of the standard error and the parameter for any percentage of twins is the generalised estimating equations. Conclusions This study has shown that the number of covariates or the level two variance do not necessarily affect the performance of the various methods used to analyse datasets containing twins but when the percentage of small clusters is too small, mixed models cannot capture the dependence between siblings.
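
    A minimal sketch of the recommended analysis is given below, assuming the Python statsmodels package, a simulated data set and made-up variable names (outcome y, covariate x, cluster identifier mother); it is illustrative only and is not the authors' code.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Sketch of a GEE analysis for a binary outcome with some clusters of
    # size two (twins). Data are simulated; variable names are illustrative.

    rng = np.random.default_rng(42)
    n_mothers = 300
    twins = rng.random(n_mothers) < 0.1                 # ~10% twin pregnancies
    rows = []
    for mother in range(n_mothers):
        shared = rng.normal(0.0, 0.7)                   # shared (cluster) effect
        for _ in range(2 if twins[mother] else 1):
            x = rng.normal()
            p = 1.0 / (1.0 + np.exp(-(-0.5 + 0.8 * x + shared)))
            rows.append({"mother": mother, "x": x, "y": int(rng.random() < p)})
    df = pd.DataFrame(rows)

    model = sm.GEE.from_formula(
        "y ~ x", groups="mother", data=df,
        family=sm.families.Binomial(),
        cov_struct=sm.cov_struct.Exchangeable(),
    )
    print(model.fit().summary())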

  7. Sequential determination of important ecotoxic radionuclides in nuclear waste samples

    International Nuclear Information System (INIS)

    Bilohuscin, J.

    2016-01-01

    In the dissertation thesis we focused on the development and optimization of a sequential determination method for the radionuclides 93Zr, 94Nb, 99Tc and 126Sn, employing the extraction chromatography sorbents TEVA (R) Resin and Anion Exchange Resin, supplied by Eichrom Industries. Prior to the attestation of the sequential separation of these radionuclides from radioactive waste samples, a unique sequential procedure for the separation of 90Sr, 239Pu and 241Am from urine matrices was tested, using molecular recognition sorbents of the AnaLig (R) series and the extraction chromatography sorbent DGA (R) Resin. In these experiments, four different sorbents were used in sequence for the separation, including the PreFilter Resin sorbent, which removes interfering organic materials present in raw urine. After positive results were obtained with this sequential procedure, experiments followed on 126Sn separation using the TEVA (R) Resin and Anion Exchange Resin sorbents. Radiochemical recoveries obtained from samples of radioactive evaporator concentrates and sludge showed a high separation efficiency, while the activities of 126Sn were below the minimum detectable activities (MDA). The activity of 126Sn was determined after ingrowth of the daughter nuclide 126mSb on an HPGe gamma detector, with minimal contamination by gamma-interfering radionuclides and decontamination factors (Df) higher than 1400 for 60Co and 47000 for 137Cs. Based on these experiments and the results of the separation procedures, a complex method for the sequential separation of 93Zr, 94Nb, 99Tc and 126Sn was proposed, which included optimization steps similar to those used in previous parts of the dissertation work. Application of the sequential separation method with the TEVA (R) Resin and Anion Exchange Resin sorbents to real samples of radioactive wastes provided satisfactory results and an economical, time-sparing, efficient method. (author)

  8. Sequential analysis in neonatal research-systematic review.

    Science.gov (United States)

    Lava, Sebastiano A G; Elie, Valéry; Ha, Phuong Thi Viet; Jacqz-Aigrain, Evelyne

    2018-05-01

    As more new drugs are discovered, traditional designs reach their limits. Ten years after the adoption of the European Paediatric Regulation, we performed a systematic review on the US National Library of Medicine and Excerpta Medica database of sequential trials involving newborns. Out of 326 identified scientific reports, 21 trials were included. They enrolled 2832 patients, of whom 2099 were analyzed: the median number of neonates included per trial was 48 (IQR 22-87), median gestational age was 28.7 (IQR 27.9-30.9) weeks. Eighteen trials used sequential techniques to determine sample size, while 3 used continual reassessment methods for dose-finding. In 16 studies reporting sufficient data, the sequential design allowed to non-significantly reduce the number of enrolled neonates by a median of 24 (31%) patients (IQR - 4.75 to 136.5, p = 0.0674) with respect to a traditional trial. When the number of neonates finally included in the analysis was considered, the difference became significant: 35 (57%) patients (IQR 10 to 136.5, p = 0.0033). Sequential trial designs have not been frequently used in Neonatology. They might potentially be able to reduce the number of patients in drug trials, although this is not always the case. What is known: • In evaluating rare diseases in fragile populations, traditional designs reach their limits. About 20% of pediatric trials are discontinued, mainly because of recruitment problems. What is new: • Sequential trials involving newborns were infrequently used and only a few (n = 21) are available for analysis. • The sequential design allowed to non-significantly reduce the number of enrolled neonates by a median of 24 (31%) patients (IQR - 4.75 to 136.5, p = 0.0674).

  9. Periodontal treatment for preventing adverse pregnancy outcomes

    DEFF Research Database (Denmark)

    Schwendicke, Falk; Karimbux, Nadeem; Allareddy, Veerasathpurush

    2015-01-01

    OBJECTIVES: Periodontal treatment might reduce adverse pregnancy outcomes. The efficacy of periodontal treatment to prevent preterm birth, low birth weight, and perinatal mortality was evaluated using meta-analysis and trial sequential analysis. METHODS: An existing systematic review was updated ... risk of random errors. RESULTS: Thirteen randomized clinical trials evaluating 6283 pregnant women were meta-analyzed. Four and nine trials had low and high risk of bias, respectively. Overall, periodontal treatment had no significant effect on preterm birth (odds ratio [95% confidence interval] 0.79 [0.57-1.10]) or low birth weight (0.69 [0.43-1.13]). Trial sequential analysis demonstrated that futility was not reached for any of the outcomes. For populations with moderate occurrence ... periodontal treatment was not efficacious for any of the outcomes...

  10. Contribution of body surface mapping to clinical outcome after surgical ablation of postinfarction ventricular tachycardia

    NARCIS (Netherlands)

    van Dessel, Pascal F.; van Hemel, Norbert M.; Groenewegen, Arne Sippens; de Bakker, Jacques M.; Linnebank, André C.; Defauw, Jo J.

    2002-01-01

    This article investigates the influence of body surface mapping on outcome of ventricular antiarrhythmic surgery. Preoperative mapping is advocated to optimize map-guided antiarrhythmic surgery of postinfarction ventricular tachycardia. We sequentially analyzed the results of catheter activation

  11. TELEGRAPHS TO INCANDESCENT LAMPS: A SEQUENTIAL PROCESS OF INNOVATION

    Directory of Open Access Journals (Sweden)

    Laurence J. Malone

    2000-01-01

    Full Text Available This paper outlines a sequential process of technological innovation in the emergence of the electrical industry in the United States from 1830 to 1880. Successive inventions that realized the commercial possibilities of electricity provided the foundation for an industry where technical knowledge, invention and diffusion were ultimately consolidated within the managerial structure of new firms. The genesis of the industry is traced, sequentially, through the development of the telegraph, arc light and incandescent lamp. Exploring the origins of the telegraph and incandescent lamp reveals a process where a series of inventions and firms result from successful efforts to use scientific principles to create new commodities and markets.

  12. Properties of simultaneous and sequential two-nucleon transfer

    International Nuclear Information System (INIS)

    Pinkston, W.T.; Satchler, G.R.

    1982-01-01

    Approximate forms of the first- and second-order distorted-wave Born amplitudes are used to study the overall structure, particularly the selection rules, of the amplitudes for simultaneous and sequential transfer of two nucleons. The role of the spin-state assumed for the intermediate deuterons in sequential (t, p) reactions is stressed. The similarity of one-step and two-step amplitudes for (α, d) reactions is exhibited, and the consequent absence of any obvious J-dependence in their interference is noted. (orig.)

  13. Sequential approach to Colombeau's theory of generalized functions

    International Nuclear Information System (INIS)

    Todorov, T.D.

    1987-07-01

    J.F. Colombeau's generalized functions are constructed as equivalence classes of the elements of a specially chosen ultrapower of the class of the C∞-functions. The elements of this ultrapower are considered as sequences of C∞-functions, so in a sense, the sequential construction presented here refers to the original Colombeau theory just as, for example, the Mikusinski sequential approach to the distribution theory refers to the original Schwartz theory of distributions. The paper could be used as an elementary introduction to the Colombeau theory in which recently a solution was found to the problem of multiplication of Schwartz distributions. (author). Refs

  14. Objective and Subjective Measures of Simultaneous vs Sequential Bilateral Cochlear Implants in Adults: A Randomized Clinical Trial.

    Science.gov (United States)

    Kraaijenga, Véronique J C; Ramakers, Geerte G J; Smulders, Yvette E; van Zon, Alice; Stegeman, Inge; Smit, Adriana L; Stokroos, Robert J; Hendrice, Nadia; Free, Rolien H; Maat, Bert; Frijns, Johan H M; Briaire, Jeroen J; Mylanus, E A M; Huinck, Wendy J; Van Zanten, Gijsbert A; Grolman, Wilko

    2017-09-01

    To date, no randomized clinical trial on the comparison between simultaneous and sequential bilateral cochlear implants (BiCIs) has been performed. To investigate the hearing capabilities and the self-reported benefits of simultaneous BiCIs compared with those of sequential BiCIs. A multicenter randomized clinical trial was conducted between January 12, 2010, and September 2, 2012, at 5 tertiary referral centers among 40 participants eligible for BiCIs. Main inclusion criteria were postlingual severe to profound hearing loss, age 18 to 70 years, and a maximum duration of 10 years without hearing aid use in both ears. Data analysis was conducted from May 24 to June 12, 2016. The simultaneous BiCI group received 2 cochlear implants during 1 surgical procedure. The sequential BiCI group received 2 cochlear implants with an interval of 2 years between implants. First, the results 1 year after receiving simultaneous BiCIs were compared with the results 1 year after receiving sequential BiCIs. Second, the results of 3 years of follow-up for both groups were compared separately. The primary outcome measure was speech intelligibility in noise from straight ahead. Secondary outcome measures were speech intelligibility in noise from spatially separated sources, speech intelligibility in silence, localization capabilities, and self-reported benefits assessed with various hearing and quality of life questionnaires. Nineteen participants were randomized to receive simultaneous BiCIs (11 women and 8 men; median age, 52 years [interquartile range, 36-63 years]), and another 19 participants were randomized to undergo sequential BiCIs (8 women and 11 men; median age, 54 years [interquartile range, 43-64 years]). Three patients did not receive a second cochlear implant and were unavailable for follow-up. Comparable results were found 1 year after simultaneous or sequential BiCIs for speech intelligibility in noise from straight ahead (difference, 0.9 dB [95% CI, -3.1 to 4.4 dB]) and

  15. A framework for evaluating forest restoration alternatives and their outcomes, over time, to inform monitoring: Bioregional inventory originated simulation under management

    Science.gov (United States)

    Jeremy S. Fried; Theresa B. Jain; Sara Loreno; Robert F. Keefe; Conor K. Bell

    2017-01-01

    The BioSum modeling framework summarizes current and prospective future forest conditions under alternative management regimes along with their costs, revenues and product yields. BioSum translates Forest Inventory and Analysis (FIA) data for input to the Forest Vegetation Simulator (FVS), summarizes FVS outputs for input to the treatment operations cost model (OpCost...

  16. A comparison of sequential and information-based methods for determining the co-integration rank in heteroskedastic VAR MODELS

    DEFF Research Database (Denmark)

    Cavaliere, Giuseppe; Angelis, Luca De; Rahbek, Anders

    2015-01-01

    In this article, we investigate the behaviour of a number of methods for estimating the co-integration rank in VAR systems characterized by heteroskedastic innovation processes. In particular, we compare the efficacy of the most widely used information criteria, such as Akaike Information Criterion ... The relative finite-sample properties of the different methods are investigated by means of a Monte Carlo simulation study. For the simulation DGPs considered in the analysis, we find that the BIC-based procedure and the bootstrap sequential test procedure deliver the best overall performance in terms ...-based method to over-estimate the co-integration rank in relatively small sample sizes...
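
    For readers unfamiliar with the sequential approach, the sketch below runs the classical Johansen-style sequential trace test on simulated data using statsmodels. This plain asymptotic, homoskedastic version is only the textbook baseline, not the information-criteria or bootstrap procedures compared in the article.

    import numpy as np
    from statsmodels.tsa.vector_ar.vecm import coint_johansen

    # Sketch of sequential rank determination via the Johansen trace test.
    # Simulated 3-variable system with one co-integrating relation; the plain
    # asymptotic test shown here is a baseline, not the heteroskedasticity-
    # robust bootstrap procedures studied in the article.

    rng = np.random.default_rng(0)
    T = 400
    common_trend = np.cumsum(rng.normal(size=T))          # shared random walk
    data = np.column_stack([
        common_trend + rng.normal(scale=0.5, size=T),
        common_trend + rng.normal(scale=0.5, size=T),
        np.cumsum(rng.normal(size=T)),                    # independent trend
    ])

    res = coint_johansen(data, det_order=0, k_ar_diff=1)
    rank = 0
    for r in range(data.shape[1]):
        if res.lr1[r] > res.cvt[r, 1]:                    # 5% critical value
            rank = r + 1                                   # reject rank <= r
        else:
            break
    print("selected co-integration rank:", rank)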

  17. Early embryo development in a sequential versus single medium: a randomized study

    Directory of Open Access Journals (Sweden)

    D'Hooghe Thomas M

    2010-07-01

    Full Text Available Abstract Background The success of in vitro fertilization techniques is defined by multiple factors including embryo culture conditions, related to the composition of the culture medium. In view of the lack of solid scientific data and in view of the current general belief that sequential media are superior to single media, the aim of this randomized study was to compare the embryo quality in two types of culture media. Methods In this study, the embryo quality on day 3 was measured as primary outcome. In total, 147 patients younger than 36 years treated with IVF/ICSI during the first or second cycle were included in this study. Embryos were randomly cultured in a sequential (group A) or a single medium (group B) to compare the embryo quality on day 1, day 2 and day 3. The embryo quality was compared in both groups using a Chi-square test with a significance level of 0.05. Results At day 1, the percentage of embryos with a cytoplasmic halo was higher in group B (46%) than in group A (32%). At day 2, the number of blastomeres, degree of fragmentation and the percentage of unequally sized blastomeres were higher in group B than in group A. At day 3, a higher percentage of embryos had a higher number of blastomeres and unequally sized blastomeres in group B. The number of good quality embryos (GQE) was comparable in both groups. The embryo utilization rate was higher in group B (56%) compared to group A (49%). Conclusions Although no significant difference in the number of GQE was found in both media, the utilization rate was significantly higher when the embryos were cultured in the single medium compared to the sequential medium. The results of this study have a possible positive effect on the cumulative cryo-augmented pregnancy rate. Trial registration number NCT01094314

  18. Comparative study of lesions created by high-intensity focused ultrasound using sequential discrete and continuous scanning strategies.

    Science.gov (United States)

    Fan, Tingbo; Liu, Zhenbo; Zhang, Dong; Tang, Mengxing

    2013-03-01

    Lesion formation and temperature distribution induced by high-intensity focused ultrasound (HIFU) were investigated both numerically and experimentally for two energy-delivery strategies, i.e., sequential discrete and continuous scanning modes. Simulations were based on the combination of the Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation and the bioheat equation. Measurements were performed on tissue-mimicking phantoms sonicated by a 1.12-MHz single-element focused transducer working at an acoustic power of 75 W. Both the simulated and experimental results show that, in the sequential discrete mode, obvious saw-tooth-like contours can be observed in the peak temperature distribution and the lesion boundaries as the spacing between adjacent exposure points increases. In the continuous scanning mode, more uniform peak temperature distributions and lesion boundaries are produced, and the peak temperature values decrease significantly with increasing scanning speed. In addition, compared to the sequential discrete mode, the continuous scanning mode can achieve a higher treatment efficiency (lesion area generated per second) at a lower peak temperature. The present studies suggest that the peak temperature and tissue lesion resulting from HIFU exposure can be controlled by adjusting the transducer scanning speed, which is important for improving HIFU treatment efficiency.
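
    For orientation, the bioheat model referred to here is commonly taken to be the Pennes equation driven by the absorbed acoustic power. Written in generic notation (the authors' exact formulation and coefficients are not given in this abstract) it reads

        \rho C \frac{\partial T}{\partial t}
          = k \nabla^{2} T \;-\; w_b C_b \,(T - T_a) \;+\; Q ,
        \qquad Q \approx 2 \alpha I ,

    where \rho, C and k are the tissue density, specific heat and thermal conductivity, w_b and C_b the blood perfusion rate and blood specific heat, T_a the arterial temperature, \alpha the acoustic absorption coefficient, and I the acoustic intensity obtained from the KZK solution.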

  19. Construction of computational program of aging in insulating materials for searching reversed sequential test conditions to give damage equivalent to simultaneous exposure of heat and radiation

    International Nuclear Information System (INIS)

    Fuse, Norikazu; Homma, Hiroya; Okamoto, Tatsuki

    2013-01-01

    Two consecutive numerical calculations of the degradation of polymeric insulations under a thermal and radiation environment are carried out to simulate the so-called reversed sequential acceleration test. The aim of the calculation is to search for testing conditions which provide material damage equivalent to the case of simultaneous exposure to heat and radiation. At least the following four parameters need to be considered in the sequential method: dose rate and exposure time for the irradiation, as well as temperature and aging time for the heating. The present paper discusses the handling of these parameters and shows some trial calculation results. (author)

  20. Fast regularizing sequential subspace optimization in Banach spaces

    International Nuclear Information System (INIS)

    Schöpfer, F; Schuster, T

    2009-01-01

    We are concerned with fast computation of regularized solutions of linear operator equations in Banach spaces in the case where only noisy data are available. To this end we modify recently developed sequential subspace optimization methods in such a way that the Bregman projections onto hyperplanes employed therein are replaced by Bregman projections onto stripes whose width is of the order of the noise level.
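
    As a reminder of the underlying notions (a generic sketch of the standard definitions, not the authors' precise formulation), the Bregman distance associated with a convex functional f and the projection onto a stripe of half-width \xi around the hyperplane \langle u, x \rangle = \alpha are

        D_f(x, y) = f(x) - f(y) - \langle f'(y),\, x - y \rangle ,
        \qquad
        H_\xi(u, \alpha) = \{\, x : |\langle u, x \rangle - \alpha| \le \xi \,\} ,
        \qquad
        P_{H_\xi}(y) = \operatorname*{arg\,min}_{x \in H_\xi(u,\alpha)} D_f(x, y) ,

    with \xi chosen on the order of the noise level.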

  1. A sequential hypothesis test based on a generalized Azuma inequality

    NARCIS (Netherlands)

    Reijsbergen, D.P.; Scheinhardt, Willem R.W.; de Boer, Pieter-Tjerk

    We present a new power-one sequential hypothesis test based on a bound for the probability that a bounded zero-mean martingale ever crosses a curve of the form $a(n+k)^b$. The proof of the bound is of independent interest.
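
    A schematic version of such a test (with arbitrary boundary constants, not the ones derived from the paper's generalized Azuma bound) keeps a running sum of centred, bounded observations and rejects the null hypothesis the first time the sum crosses the curve $a(n+k)^b$:

    import random

    # Schematic power-one sequential test: under H0 the centred observations
    # form bounded zero-mean martingale increments; reject H0 the first time
    # the running sum crosses a*(n+k)**b. The constants a, k, b below are
    # arbitrary illustration values, not those derived in the paper.

    def sequential_test(observations, mean_h0, a=2.0, k=10.0, b=0.6):
        s = 0.0
        for n, x in enumerate(observations, start=1):
            s += x - mean_h0                      # zero-mean increment under H0
            if abs(s) > a * (n + k) ** b:
                return "reject H0", n
        return "no decision yet", len(observations)

    random.seed(0)
    data = [random.uniform(0.0, 1.0) for _ in range(5000)]   # true mean 0.5
    print(sequential_test(data, mean_h0=0.5))                # usually no rejection
    print(sequential_test(data, mean_h0=0.4))                # drift -> rejection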

  2. A generally applicable sequential alkaline phosphatase immunohistochemical double staining

    NARCIS (Netherlands)

    van der Loos, Chris M.; Teeling, Peter

    2008-01-01

    A universal type of sequential double alkaline phosphatase immunohistochemical staining is described that can be used for formalin-fixed, paraffin-embedded and cryostat tissue sections from human and mouse origin. It consists of two alkaline phosphatase detection systems including enzymatic

  3. Excessive pressure in multichambered cuffs used for sequential compression therapy

    NARCIS (Netherlands)

    Segers, P; Belgrado, JP; Leduc, A; Leduc, O; Verdonck, P

    2002-01-01

    Background and Purpose. Pneumatic compression devices, used as part of the therapeutic strategy for lymphatic drainage, often have cuffs with multiple chambers that are inflated sequentially. The purpose of this study was to investigate (1) the relationship between cuff chamber pressure

  4. Retrieval of sea surface velocities using sequential Ocean Colour

    Indian Academy of Sciences (India)

    The Indian remote sensing satellite, IRS-P4 (Oceansat-I) launched on May 26th, 1999 carried two sensors on board, i.e., the Ocean Colour Monitor (OCM) and the Multi-frequency Scanning Microwave Radiometer (MSMR) dedicated for oceanographic research. Sequential data of IRS-P4 OCM has been analysed over parts ...

  5. Sequential and Biomechanical Factors Constrain Timing and Motion in Tapping

    NARCIS (Netherlands)

    Loehr, J.D.; Palmer, C.

    2009-01-01

    The authors examined how timing accuracy in tapping sequences is influenced by sequential effects of preceding finger movements and biomechanical interdependencies among fingers. Skilled pianists tapped sequences at 3 rates; in each sequence, a finger whose motion was more or less independent of

  6. What determines the impact of context on sequential action?

    NARCIS (Netherlands)

    Ruitenberg, M.F.L.; Verwey, Willem B.; Abrahamse, E.L.

    2015-01-01

    In the current study we build on earlier observations that memory-based sequential action is better in the original learning context than in other contexts. We examined whether changes in the perceptual context have differential impact across distinct processing phases (preparation versus execution

  7. The Efficacy of Sequential Therapy in Eradication of Helicobacter ...

    African Journals Online (AJOL)

    2017-05-22

    May 22, 2017 ... pylori (H. pylori) eradication rates of standard triple, sequential and quadruple therapies including clarithromycin regimens in this study. Materials and Methods: A total of 160 patients with dyspeptic symptoms were enrolled in the study. The patients were randomized to four groups of treatment protocols.

  8. The efficacy of sequential therapy in eradication of Helicobacter ...

    African Journals Online (AJOL)

    ... the Helicobacter pylori (H. pylori) eradication rates of standard triple, sequential and quadruple therapies including clarithromycin regimens in this study. Materials and Methods: A total of 160 patients with dyspeptic symptoms were enrolled in the study. The patients were randomized to four groups of treatment protocols.

  9. In Vivo Evaluation of Synthetic Aperture Sequential Beamforming

    DEFF Research Database (Denmark)

    Hemmsen, Martin Christian; Hansen, Peter Møller; Lange, Theis

    2012-01-01

    Ultrasound in vivo imaging using synthetic aperture sequential beamformation (SASB) is compared with conventional imaging in a double blinded study using side-by-side comparisons. The objective is to evaluate if the image quality in terms of penetration depth, spatial resolution, contrast...

  10. Quantum chromodynamics as the sequential fragmenting with inactivation

    International Nuclear Information System (INIS)

    Botet, R.

    1996-01-01

    We investigate the relation between the modified leading log approximation of the perturbative QCD and the sequential binary fragmentation process. We will show that in the absence of inactivation, this process is equivalent to the QCD gluodynamics. The inactivation term yields a precise prescription of how to include the hadronization in the QCD equations. (authors)

  11. Quantum chromodynamics as the sequential fragmenting with inactivation

    Energy Technology Data Exchange (ETDEWEB)

    Botet, R. [Paris-11 Univ., 91 - Orsay (France). Lab. de Physique des Solides; Ploszajczak, M. [Grand Accelerateur National d'Ions Lourds (GANIL), 14 - Caen (France)

    1996-12-31

    We investigate the relation between the modified leading log approximation of the perturbative QCD and the sequential binary fragmentation process. We will show that in the absence of inactivation, this process is equivalent to the QCD gluodynamics. The inactivation term yields a precise prescription of how to include the hadronization in the QCD equations. (authors). 15 refs.

  12. The Motivating Language of Principals: A Sequential Transformative Strategy

    Science.gov (United States)

    Holmes, William Tobias

    2012-01-01

    This study implemented a Sequential Transformative Mixed Methods design with teachers (as recipients) and principals (to give voice) in the examination of principal talk in two different school accountability contexts (Continuously Improving and Continuously Zigzag) using the conceptual framework of Motivating Language Theory. In phase one,…

  13. The sequential price of anarchy for atomic congestion games

    NARCIS (Netherlands)

    de Jong, Jasper; Uetz, Marc Jochen

    2014-01-01

    In situations without central coordination, the price of anarchy relates the quality of any Nash equilibrium to the quality of a global optimum. Instead of assuming that all players choose their actions simultaneously, here we consider games where players choose their actions sequentially. The

  14. Sequential infiltration synthesis for enhancing multiple-patterning lithography

    Science.gov (United States)

    Darling, Seth B.; Elam, Jeffrey W.; Tseng, Yu-Chih

    2017-06-20

    Simplified methods of multiple-patterning photolithography using sequential infiltration synthesis to modify the photoresist such that it withstands plasma etching better than unmodified resist and replaces one or more hard masks and/or a freezing step in MPL processes including litho-etch-litho-etch photolithography or litho-freeze-litho-etch photolithography.

  15. The sequential structure of brain activation predicts skill.

    Science.gov (United States)

    Anderson, John R; Bothell, Daniel; Fincham, Jon M; Moon, Jungaa

    2016-01-29

    In an fMRI study, participants were trained to play a complex video game. They were scanned early and then again after substantial practice. While better players showed greater activation in one region (right dorsal striatum) their relative skill was better diagnosed by considering the sequential structure of whole brain activation. Using a cognitive model that played this game, we extracted a characterization of the mental states that are involved in playing a game and the statistical structure of the transitions among these states. There was a strong correspondence between this measure of sequential structure and the skill of different players. Using multi-voxel pattern analysis, it was possible to recognize, with relatively high accuracy, the cognitive states participants were in during particular scans. We used the sequential structure of these activation-recognized states to predict the skill of individual players. These findings indicate that important features about information-processing strategies can be identified from a model-based analysis of the sequential structure of brain activation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. A sequential mixed methods research approach to investigating HIV ...

    African Journals Online (AJOL)

    Sequential mixed methods research is an effective approach for investigating complex problems, but it has not been extensively used in construction management research. In South Africa, the HIV/AIDS pandemic has seen construction management taking on a vital responsibility since the government called upon the ...

  17. Algorithm for Non-proportional Loading in Sequentially Linear Analysis

    NARCIS (Netherlands)

    Yu, C.; Hoogenboom, P.C.J.; Rots, J.G.; Saouma, V.; Bolander, J.; Landis, E.

    2016-01-01

    Sequentially linear analysis (SLA) is an alternative to the Newton-Raphson method for analyzing the nonlinear behavior of reinforced concrete and masonry structures. In this paper SLA is extended to load cases that are applied one after the other, for example first dead load and then wind load. It

  18. Concurrent Learning of Control in Multi agent Sequential Decision Tasks

    Science.gov (United States)

    2018-04-17

    The overall objective of this project was to develop multi-agent reinforcement learning (MARL) approaches for intelligent agents to autonomously learn distributed control policies in decentralized partially observable ... learning of policies in Dec-POMDPs, established performance bounds, and evaluated these algorithms both theoretically and empirically.

  19. Sequential stenotic strictures of the small bowel leading to obstruction

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Small bowel obstructions (SBOs) are primarily caused by adhesions, hernias, neoplasms, or inflammatory strictures. Intraluminal strictures are an uncommon cause of SBO. This report describes our findings in a unique case of sequential, stenotic intraluminal strictures of the small intestine, discusses the differential diagnosis of intraluminal intestinal strictures, and reviews the literature regarding intraluminal pathology.

  20. Decomposition of Copper (II) Sulfate Pentahydrate: A Sequential Gravimetric Analysis.

    Science.gov (United States)

    Harris, Arlo D.; Kalbus, Lee H.

    1979-01-01

    Describes an improved experiment on the thermal dehydration of copper(II) sulfate pentahydrate. The improvements described here are control of the temperature environment and a quantitative study of the decomposition reaction to a thermally stable oxide. The data suffice to demonstrate a sequential gravimetric analysis. (Author/SA)
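
    The sequence of mass losses followed by the gravimetric analysis corresponds, in the commonly reported textbook scheme (exact temperatures and the final sulfate decomposition products depend on conditions; this sketch is not data from the article), to stepwise dehydration followed by decomposition to the stable oxide:

        \mathrm{CuSO_4 \cdot 5H_2O \;\xrightarrow{\;\Delta\;}\; CuSO_4 \cdot 3H_2O + 2\,H_2O}
        \mathrm{CuSO_4 \cdot 3H_2O \;\xrightarrow{\;\Delta\;}\; CuSO_4 \cdot H_2O + 2\,H_2O}
        \mathrm{CuSO_4 \cdot H_2O \;\xrightarrow{\;\Delta\;}\; CuSO_4 + H_2O}
        \mathrm{CuSO_4 \;\xrightarrow{\;\Delta\;}\; CuO + SO_3}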

  1. Standardized method for reproducing the sequential X-rays flap

    International Nuclear Information System (INIS)

    Brenes, Alejandra; Molina, Katherine; Gudino, Sylvia

    2009-01-01

    A method is validated to standardize the taking, developing and analysis of bite-wing radiographs taken sequentially, in order to compare and evaluate detectable changes in the evolution of interproximal lesions over time. A radiographic positioner called XCP® is modified by means of a rigid acrylic guide to achieve proper positioning of the X-ray equipment relative to the XCP® ring and its reorientation during the sequential radiographic process. Sixteen subjects aged 4 to 40 years were studied, for a total of 32 records. Two radiographs of the same block of teeth were taken sequentially for each subject, at least 30 minutes apart, before placement of the radiographic attachment. The images were digitized with a Super Cam® scanner and imported into software. Measurements along the X and Y axes were made on both radiographs for comparison. The intraclass correlation index (ICI) showed that the proposed method is statistically related to the measurements (mm) obtained along the X and Y axes for both sequential series of radiographs (p=0.01). The measures of central tendency and dispersion showed that the most frequent difference between the two measurements was zero (mode 0.000; S = 0.083 and 0.109) and that the probability of occurrence of different values is lower than expected. (author) [es

  2. Sequential Analysis of Metals in Municipal Dumpsite Composts of ...

    African Journals Online (AJOL)

    ... Ni) in municipal dumpsite compost were determined by the sequential extraction method. Chemical parameters such as pH, conductivity, and organic carbon content of the samples were also determined. Analysis of the extracts was carried out with an atomic absorption spectrophotometer (Buck Scientific VPG 210).

  3. Investigation of the sequential validity of quality improvement team ...

    African Journals Online (AJOL)

    Background: Self-assessment is widely used by quality improvement (QI) teams in health care improvement collaboratives to assess their own performance. There is mixed evidence on the validity of this approach. This study investigated the sequential validity of self-assessments in a QI HIV collaborative in Tanzania.

  4. The one-shot deviation principle for sequential rationality

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Whitta-Jacobsen, Hans Jørgen; Sloth, Birgitte

    1996-01-01

    We present a decentralization result which is useful for practical and theoretical work with sequential equilibrium, perfect Bayesian equilibrium, and related equilibrium concepts for extensive form games. A weak consistency condition is sufficient to obtain an analogy to the well-known One-Stage-Deviation Principle for subgame perfect equilibrium.

  5. A solution for automatic parallelization of sequential assembly code

    Directory of Open Access Journals (Sweden)

    Kovačević Đorđe

    2013-01-01

    Since modern multicore processors can execute existing sequential programs only on a single core, there is a strong need for automatic parallelization of program code. Relying on existing algorithms, this paper describes a new software tool for parallelization of sequential assembly code. The main goal of this paper is to develop a parallelizer which reads sequential assembler code and outputs parallelized code for a MIPS processor with multiple cores. The idea is the following: the parser translates the assembler input file to program objects suitable for further processing. After that, static single assignment is performed. Based on the data flow graph, the parallelization algorithm distributes instructions across different cores. Once the sequential code has been parallelized, registers are allocated with a linear allocation algorithm, and the final output is distributed assembler code for each of the cores. In the paper we evaluate the speedup of a matrix multiplication example processed by the assembly-code parallelizer. The result is an almost linear speedup of code execution, which increases with the number of cores: the speedup on two cores is 1.99, while on 16 cores it is 13.88.
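
    As an illustration of the dependence-graph step described above (a hedged sketch, not the authors' tool), the fragment below builds a def-use dependence graph over a toy instruction list and greedily assigns mutually independent instructions to cores. The instruction format, core count and scheduling policy are assumptions made for the example.

      # Toy illustration: partition independent assembly-style instructions across cores.
      # Each instruction is (dest, [sources]); a real tool would parse MIPS assembly
      # and run static single assignment first, as the paper describes.
      from collections import defaultdict

      instrs = [
          ("t0", ["a", "b"]),    # t0 = a op b
          ("t1", ["c", "d"]),    # t1 = c op d   (independent of t0)
          ("t2", ["t0", "t1"]),  # t2 depends on both
          ("t3", ["e", "f"]),    # independent
      ]

      # Build dependence edges: instruction j depends on i if j reads what i defines.
      deps = defaultdict(set)
      for j, (_, srcs) in enumerate(instrs):
          for i, (dest, _) in enumerate(instrs[:j]):
              if dest in srcs:
                  deps[j].add(i)

      # Greedy list scheduling onto N cores: an instruction is ready once all of its
      # dependencies have been scheduled in an earlier time step.
      N_CORES, schedule, done = 2, [], set()
      while len(done) < len(instrs):
          ready = [i for i in range(len(instrs)) if i not in done and deps[i] <= done]
          step = ready[:N_CORES]          # at most one instruction per core per step
          schedule.append(step)
          done |= set(step)

      for t, step in enumerate(schedule):
          print(f"step {t}: " + ", ".join(f"core{c}: {instrs[i][0]}" for c, i in enumerate(step)))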

  6. Making Career Decisions--A Sequential Elimination Approach.

    Science.gov (United States)

    Gati, Itamar

    1986-01-01

    Presents a model for career decision making based on the sequential elimination of occupational alternatives, an adaptation for career decisions of Tversky's (1972) elimination-by-aspects theory of choice. The expected utility approach is reviewed as a representative compensatory model for career decisions. Advantages, disadvantages, and…
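
    As a rough illustration of sequential elimination (a generic sketch of the elimination-by-aspects idea, not Gati's specific model; the occupations, aspects and thresholds below are invented), the fragment screens alternatives aspect by aspect in order of importance:

      # Hedged sketch of sequential elimination: screen alternatives aspect by aspect,
      # in order of importance, keeping only those that meet each requirement.
      occupations = {
          "nurse":      {"outdoor_work": 0, "income": 6, "job_security": 8},
          "surveyor":   {"outdoor_work": 9, "income": 5, "job_security": 5},
          "programmer": {"outdoor_work": 0, "income": 8, "job_security": 7},
      }
      # Aspects ranked by importance, each with a minimum acceptable level.
      requirements = [("job_security", 6), ("income", 6)]

      candidates = dict(occupations)
      for aspect, minimum in requirements:
          candidates = {name: attrs for name, attrs in candidates.items()
                        if attrs[aspect] >= minimum}
          print(f"after '{aspect}' >= {minimum}: {sorted(candidates)}")
          if len(candidates) <= 1:
              break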

  7. Biohydrogen production from beet molasses by sequential dark and photofermentation

    NARCIS (Netherlands)

    Özgür, E.; Mars, A.E.; Peksel, B.; Louwerse, A.; Yücel, M.; Gündüz, U.; Claassen, P.A.M.; Eroglu, I.

    2010-01-01

    Biological hydrogen production using renewable resources is a promising possibility to generate hydrogen in a sustainable way. In this study, a sequential dark and photofermentation has been employed for biohydrogen production using sugar beet molasses as a feedstock. An extreme thermophile

  8. Influence of synchronous and sequential stimulation on muscle fatigue

    NARCIS (Netherlands)

    Thomsen, M.; Thomsen, M.; Veltink, Petrus H.

    1997-01-01

    In acute experiments, the sciatic nerve of the rat is electrically stimulated to induce fatigue in the medial gastrocnemius muscle. Fatigue tests are carried out using intermittent stimulation of different compartments (sequential) or a single compartment (synchronous) of the sciatic nerve.

  9. A Relational Account of Call-by-Value Sequentiality

    DEFF Research Database (Denmark)

    Riecke, Jon Gary; Sandholm, Anders Bo

    2002-01-01

    We construct a model for FPC, a purely functional, sequential, call-by-value language. The model is built from partial continuous functions, in the style of Plotkin, further constrained to be uniform with respect to a class of logical relations. We prove that the model is fully abstract.

  10. Sequential kidney scintiscanning before and after vascular reconstruction

    International Nuclear Information System (INIS)

    Siems, H.H.; Allenberg, J.R.; Hupp, T.; Clorius, J.H.

    1985-01-01

    In this follow-up study, sequential scintigraphy was performed on 20 selected patients up to 3.4 years after operation; the results are compared with the pre-operative examinations and with the surgical effect on the increased blood pressure. (orig./MG)

  11. Regular in-situ simulation training of paediatric Medical Emergency Team leads to sustained improvements in hospital response to deteriorating patients, improved outcomes in intensive care and financial savings.

    Science.gov (United States)

    Theilen, Ulf; Fraser, Laura; Jones, Patricia; Leonard, Paul; Simpson, Dave

    2017-06-01

    The introduction of a paediatric Medical Emergency Team (pMET) was accompanied by weekly in-situ simulation team training. Key ward staff participated in team training, focusing on recognition of the deteriorating child, teamwork and early involvement of senior staff. Following an earlier study [1], this investigation aimed to evaluate the long-term impact of ongoing regular team training on hospital response to deteriorating ward patients, patient outcome and financial implications. Prospective cohort study of all deteriorating in-patients in a tertiary paediatric hospital requiring admission to paediatric intensive care (PICU) the year before, 1 year after and 3 years after the introduction of pMET and team training. Deteriorating patients were recognised more promptly (before/1 year after/3 years after pMET; median time 4/1.5/0.5 h, p ...). Introduction of pMET coincided with significantly reduced hospital mortality (p<0.001). These results indicate that lessons learnt by ward staff during team training led to sustained improvements in the hospital response to critically deteriorating in-patients, significantly improved patient outcomes and substantial savings. Integration of regular in-situ simulation training of medical emergency teams, including key ward staff, into routine clinical care has potential application in all acute specialties. Copyright © 2017. Published by Elsevier B.V.

  12. Trial Sequential Analysis in systematic reviews with meta-analysis

    Directory of Open Access Journals (Sweden)

    Jørn Wetterslev

    2017-03-01

    Abstract Background Most meta-analyses in systematic reviews, including Cochrane ones, do not have sufficient statistical power to detect or refute even large intervention effects. This is why a meta-analysis ought to be regarded as an interim analysis on its way towards a required information size. The results of the meta-analyses should relate the total number of randomised participants to the estimated required meta-analytic information size accounting for statistical diversity. When the number of participants and the corresponding number of trials in a meta-analysis are insufficient, the use of the traditional 95% confidence interval or the 5% statistical significance threshold will lead to too many false positive conclusions (type I errors) and too many false negative conclusions (type II errors). Methods We developed a methodology for interpreting meta-analysis results, using generally accepted, valid evidence on how to adjust thresholds for significance in randomised clinical trials when the required sample size has not been reached. Results The Lan-DeMets trial sequential monitoring boundaries in Trial Sequential Analysis offer adjusted confidence intervals and restricted thresholds for statistical significance when the diversity-adjusted required information size and the corresponding number of required trials for the meta-analysis have not been reached. Trial Sequential Analysis provides a frequentistic approach to control both type I and type II errors. We define the required information size and the corresponding number of required trials in a meta-analysis and the diversity (D2) measure of heterogeneity. We explain the reasons for using Trial Sequential Analysis of meta-analysis when the actual information size fails to reach the required information size. We present examples drawn from traditional meta-analyses using unadjusted naïve 95% confidence intervals and 5% thresholds for statistical significance. Spurious conclusions in
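
    To make the diversity adjustment concrete, the sketch below (an illustrative approximation, not the TSA software itself) computes a fixed-effect required information size for a binary outcome from an assumed control event proportion, relative risk reduction, alpha and power, and then inflates it by the heterogeneity correction factor 1/(1 - D2) described above. All numerical inputs are hypothetical, and scipy is assumed to be available.

      # Hedged sketch: diversity-adjusted required information size (RIS) for a
      # two-arm meta-analysis with a binary outcome. Inputs are illustrative only.
      from scipy.stats import norm

      alpha, power = 0.05, 0.90   # two-sided alpha and desired power
      p_control    = 0.30         # assumed control-group event proportion
      rrr          = 0.20         # assumed relative risk reduction
      D2           = 0.25         # diversity (heterogeneity) estimate from the meta-analysis

      p_exp = p_control * (1 - rrr)
      p_bar = (p_control + p_exp) / 2
      z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)

      # Standard two-sample sample-size formula, both arms combined:
      ris_fixed = 4 * (z_a + z_b) ** 2 * p_bar * (1 - p_bar) / (p_control - p_exp) ** 2
      ris_adjusted = ris_fixed / (1 - D2)   # diversity-adjusted RIS

      print(f"fixed-effect RIS      : {ris_fixed:.0f} participants")
      print(f"diversity-adjusted RIS: {ris_adjusted:.0f} participants")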

  13. Assessments in outcome evaluation in aphasia therapy

    DEFF Research Database (Denmark)

    Isaksen, Jytte; Brouwer, Catherine E.

    2015-01-01

    Abstract Outcomes of aphasia therapy in Denmark are documented in evaluation sessions in which both the person with aphasia and the speech-language therapist take part. The participants negotiate agreements on the results of therapy. By means of conversation analysis, we study how such agreements on therapy outcome are reached interactionally. The sequential analysis of 34 video recordings focuses on a recurrent method for reaching agreements in these outcome evaluation sessions. In and through a special sequence of conversational assessment it is claimed that the person with aphasia has certain...

  14. Antipyretic therapy in critically ill patients with established sepsis: a trial sequential analysis.

    Directory of Open Access Journals (Sweden)

    Zhongheng Zhang

    Antipyretic therapy for patients with sepsis has long been debated. The present study aimed to explore the beneficial effect of antipyretic therapy for ICU patients with sepsis. Systematic review and trial sequential analysis of randomized controlled trials. Pubmed, Scopus, EBSCO and EMBASE were searched from inception to August 5, 2014. Mortality was dichotomized as a binary outcome variable and the odds ratio (OR) was chosen as the summary statistic. The pooled OR was calculated using the DerSimonian and Laird method. Statistical heterogeneity was assessed using the I2 statistic. Trial sequential analysis was performed to account for the small number of trials and patients. A total of 6 randomized controlled trials including 819 patients were included in the final analysis. Overall, there was no beneficial effect of antipyretic therapy on mortality risk in patients with established sepsis (OR: 1.02, 95% CI: 0.50-2.05). The required information size (IS) was 2582 and our analysis has not yet reached half of the IS. The Z-curve did not cross the O'Brien-Fleming α-spending boundary or reach the futility boundary, indicating that the non-significant result was probably due to lack of statistical power. Our study fails to identify any beneficial effect of antipyretic therapy on ICU patients with an established diagnosis of sepsis. Due to the limited number of total participants, more studies are needed before a conclusive and reliable analysis can be made.
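
    The pooling step named above follows the standard DerSimonian-Laird random-effects recipe. The sketch below is illustrative only, with made-up 2x2 counts rather than the six trials in the review; it computes per-trial log odds ratios, the Q statistic, I2, the between-trial variance tau2, and the pooled OR.

      # Hedged sketch of a DerSimonian-Laird random-effects pooled odds ratio.
      # The 2x2 counts (events/total in treatment and control arms) are invented.
      import numpy as np

      trials = [  # (events_t, n_t, events_c, n_c)
          (12, 100, 15, 100),
          ( 8,  80, 10,  85),
          (20, 150, 18, 145),
      ]

      y, v = [], []   # log odds ratios and their variances
      for a, n1, c, n2 in trials:
          b, d = n1 - a, n2 - c
          y.append(np.log((a * d) / (b * c)))
          v.append(1 / a + 1 / b + 1 / c + 1 / d)
      y, v = np.array(y), np.array(v)

      w = 1 / v                                   # fixed-effect (inverse-variance) weights
      y_fe = np.sum(w * y) / np.sum(w)
      Q = np.sum(w * (y - y_fe) ** 2)
      df = len(y) - 1
      I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0
      tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

      w_re = 1 / (v + tau2)                       # random-effects weights
      y_re = np.sum(w_re * y) / np.sum(w_re)
      se_re = np.sqrt(1 / np.sum(w_re))

      print(f"I2 = {I2:.1f}%, tau2 = {tau2:.3f}")
      print(f"pooled OR = {np.exp(y_re):.2f} "
            f"(95% CI {np.exp(y_re - 1.96 * se_re):.2f}-{np.exp(y_re + 1.96 * se_re):.2f})")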

  15. Sequential vs simultaneous revascularization in patients undergoing liver transplantation: A meta-analysis.

    Science.gov (United States)

    Wang, Jia-Zhong; Liu, Yang; Wang, Jin-Long; Lu, Le; Zhang, Ya-Fei; Lu, Hong-Wei; Li, Yi-Ming

    2015-06-14

    We undertook this meta-analysis to investigate the relationship between revascularization and outcomes after liver transplantation. A literature search was performed using MeSH and key words. The quality of the included studies was assessed using the Jadad Score and the Newcastle-Ottawa Scale. Heterogeneity was evaluated by the χ2 and I2 tests. The risk of publication bias was assessed using a funnel plot and Egger's test, and the risk of bias was assessed using a domain-based assessment tool. A sensitivity analysis was conducted by reanalyzing the data using different statistical approaches. Six studies with a total of 467 patients were included. Ischemic-type biliary lesions were significantly reduced in the simultaneous revascularization group compared with the sequential revascularization group (OR = 4.97, 95%CI: 2.45-10.07; P ...) ... in the simultaneous revascularization group. Although warm ischemia time was prolonged in the simultaneous revascularization group (MD = -25.84, 95%CI: -29.28 to -22.40; P ...) ... between the sequential and simultaneous revascularization groups. Assessment of the risk of bias showed that the methods of random sequence generation and blinding might have been a source of bias. The sensitivity analysis strengthened the reliability of the results of this meta-analysis. The results of this study indicate that simultaneous revascularization in liver transplantation may reduce the incidence of ischemic-type biliary lesions and the length of stay of patients in the ICU.
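
    Egger's test, mentioned in the methods above, is a simple weighted regression: the standardized effect of each study is regressed on its precision, and an intercept that departs from zero signals funnel-plot asymmetry. The sketch below is illustrative only; the effect sizes and standard errors are invented and do not come from the six included studies.

      # Hedged sketch of Egger's regression test for funnel-plot asymmetry.
      import numpy as np
      from scipy import stats

      y  = np.array([0.40, 0.10, 0.55, -0.05, 0.30, 0.20])   # per-study log odds ratios (invented)
      se = np.array([0.30, 0.15, 0.35,  0.12, 0.25, 0.18])   # their standard errors (invented)

      snd  = y / se          # standardized effects
      prec = 1 / se          # precisions

      # OLS of standardized effect on precision; the intercept measures asymmetry.
      X = np.column_stack([np.ones_like(prec), prec])
      beta, *_ = np.linalg.lstsq(X, snd, rcond=None)
      resid = snd - X @ beta
      n, k = X.shape
      sigma2 = resid @ resid / (n - k)
      cov = sigma2 * np.linalg.inv(X.T @ X)
      t_int = beta[0] / np.sqrt(cov[0, 0])
      p_int = 2 * stats.t.sf(abs(t_int), df=n - k)

      print(f"Egger intercept = {beta[0]:.3f}, t = {t_int:.2f}, p = {p_int:.3f}")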

  16. Managerial adjustment and its limits: sequential fault in comparative perspective

    Directory of Open Access Journals (Sweden)

    Flávio da Cunha Rezende

    2008-01-01

    This article focuses on explanations for sequential faults in administrative reform. It deals with the limits of managerial adjustment in an approach that attempts to connect theory and empirical data, articulating three levels of analysis. The first level presents comparative evidence of sequential fault within reforms in national governments through a set of indicators geared toward understanding changes in the role of the state. In light of analyses of a representative set of comparative studies on reform implementation, the second analytical level proceeds to identify four typical mechanisms that are present in explanations of managerial adjustment faults. In this way, we seek to configure an explanatory matrix for theories on sequential fault. Next we discuss the experience of management reform in the Brazilian context, conferring special attention on one of the mechanisms that creates fault: the control dilemma. The major hypotheses that guide our article are that reforms lead to sequential fault and that there are at least four causal mechanisms at play: a) transaction costs involved in producing reforms; b) performance legacy; c) predominance of fiscal adjustment; and d) the control dilemma. These mechanisms act separately or in concert to decrease the chances of a transformation of State managerial patterns. The major evidence analyzed in this article lends consistency to the general argument that reforms have failed in their attempts to reduce public expenses, alter patterns of resource allocation, reduce the labor force and change the role of the State. Our major conclusion is that reforms fail sequentially and managerial adjustment displays considerable limitations, particularly those of a political nature.

  17. Accuracy of respiratory motion measurement of 4D-MRI: A comparison between cine and sequential acquisition.

    Science.gov (United States)

    Liu, Yilin; Yin, Fang-Fang; Rhee, DongJoo; Cai, Jing

    2016-01-01

    The authors have recently developed a cine-mode T2*/T1-weighted 4D-MRI technique and a sequential-mode T2-weighted 4D-MRI technique for imaging respiratory motion. This study aims at investigating which 4D-MRI image acquisition mode, cine or sequential, provides more accurate measurement of organ motion during respiration. A 4D digital extended cardiac-torso (XCAT) human phantom with a hypothesized tumor was used to simulate the image acquisition and the 4D-MRI reconstruction. The respiratory motion was controlled by the given breathing signal profiles. The tumor was manipulated to move continuously with the surrounding tissue. The motion trajectories were measured from both sequential- and cine-mode 4D-MRI images. The measured trajectories were compared with the average trajectory calculated from the input profiles, which was used as the reference. The error in the 4D-MRI tumor motion trajectory (E) was determined. In addition, the corresponding respiratory motion amplitudes of all the selected 2D images for 4D reconstruction were recorded. Each amplitude was compared with the amplitude of its associated bin on the average breathing curve. The mean differences from the average breathing curve across all slice positions (D) were calculated. A total of 500 simulated respiratory profiles with a wide range of irregularity (Ir) were used to investigate the relationship between D and Ir. Furthermore, statistical analysis of E and D using XCAT controlled by 20 cancer patients' breathing profiles was conducted. The Wilcoxon signed-rank test was conducted to compare the two modes. D increased faster for cine-mode (D = 1.17 × Ir + 0.23) than for sequential-mode (D = 0.47 × Ir + 0.23) as irregularity increased. For the XCAT study using 20 cancer patients' breathing profiles, the median E values were significantly different: 0.12 and 0.10 cm for cine- and sequential-modes, respectively, with a p-value of 0.02. The median D values were significantly different: 0.47 and 0.24 cm for cine

  18. SCRLH-TL Based Sequential Rotation Feed Network for Broadband Circularly Polarized Antenna Array

    Directory of Open Access Journals (Sweden)

    B. F. Zong

    2016-04-01

    In this paper, a broadband circularly polarized (CP) microstrip antenna array using a composite right/left-handed transmission line (SCRLH-TL) based sequential rotation (SR) feed network is presented. The characteristics of a SCRLH-TL are initially investigated. Then, a broadband, low-insertion-loss 45° phase shifter is designed using the SCRLH-TL, and the phase shifter is employed in constructing an SR feed network for a CP antenna array. To validate the design method of the SR feed network, a 2×2 antenna array comprising sequentially rotated, coupled stacked CP antenna elements is designed, fabricated and measured. Both the simulated and measured results indicate that the performance of the antenna elements is further enhanced when the SR network is used. The antenna array exhibits a VSWR of less than 1.8 from 4 GHz to 7 GHz and a 3 dB axial ratio (AR) bandwidth from 4.4 GHz to 6.8 GHz. Also, a high peak gain of 13.7 dBic is obtained. Besides, the normalized radiation patterns at the operating frequencies are symmetrical and the side lobe levels are low at φ=0° and φ=90°.
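
    The polarization benefit of sequential rotation can be checked with a few lines of arithmetic: if each of the four elements is physically rotated by n·90° and fed with an extra n·90° phase delay, the on-axis fields of even imperfectly polarized elements add up to a nearly circular total field. The sketch below is a simplified boresight model, not the authors' full-wave simulation; the element field E0 is an assumed, slightly elliptical example.

      # Hedged sketch: on-axis field of a 2x2 sequentially rotated array.
      # Element n is rotated by n*90 degrees and fed with an extra n*90-degree phase delay.
      import numpy as np

      E0 = np.array([1.0, 0.15j])   # imperfect element: mostly x-polarized (assumed)

      def rotate(E, angle):
          c, s = np.cos(angle), np.sin(angle)
          return np.array([[c, -s], [s, c]]) @ E

      total = np.zeros(2, dtype=complex)
      for n in range(4):
          phase = np.exp(1j * n * np.pi / 2)          # sequential feed phase
          total += phase * rotate(E0, n * np.pi / 2)  # sequential physical rotation

      Ex, Ey = total
      print(f"|Ex| = {abs(Ex):.3f}, |Ey| = {abs(Ey):.3f}")
      print(f"phase(Ey) - phase(Ex) = {np.degrees(np.angle(Ey) - np.angle(Ex)):.1f} deg")
      # Equal magnitudes with a 90-degree phase difference indicate circular polarization.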

  19. Removal of toluene by sequential adsorption-plasma oxidation: Mixed support and catalyst deactivation.

    Science.gov (United States)

    Qin, Caihong; Huang, Xuemin; Zhao, Junjie; Huang, Jiayu; Kang, Zhongli; Dang, Xiaoqing

    2017-07-15

    A sequential adsorption-plasma oxidation system was used to remove toluene from simulated dry air using γ-Al2O3, HZSM-5, a mixture of the two materials or their supported Mn-Ag catalyst as adsorbents under atmospheric pressure and room temperature. After 120 min of plasma oxidation, γ-Al2O3 had a better carbon balance (∼75%) than HZSM-5, but the CO2 yield of γ-Al2O3 was only ∼50%; and there was some desorption of toluene when γ-Al2O3 was used. When a mixture of HZSM-5 and γ-Al2O3 with a mass ratio of 1/2 was used, the carbon balance was up to 90% and 82% of this was CO2. The adsorption performance and electric discharge characteristics of the mixed supports were tested in order to rationalize this high COx yield. After seven cycles of sequential adsorption-plasma oxidation, support and Mn-Ag catalyst deactivation occurred. The support and catalyst were characterized before and after deactivation by SEM, a BET method, XRD, XPS and GC-MS in order to probe the mechanism of their deactivation. 97.6% of the deactivated supports and 76% of the deactivated catalysts could be recovered by O2 temperature-programmed oxidation. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Markov decision processes: a tool for sequential decision making under uncertainty.

    Science.gov (United States)

    Alagoz, Oguzhan; Hsu, Heather; Schaefer, Andrew J; Roberts, Mark S

    2010-01-01

    We provide a tutorial on the construction and evaluation of Markov decision processes (MDPs), which are powerful analytical tools used for sequential decision making under uncertainty that have been widely used in many industrial and manufacturing applications but are underutilized in medical decision making (MDM). We demonstrate the use of an MDP to solve a sequential clinical treatment problem under uncertainty. Markov decision processes generalize standard Markov models in that a decision process is embedded in the model and multiple decisions are made over time. Furthermore, they have significant advantages over standard decision analysis. We compare MDPs to standard Markov-based simulation models by solving the problem of the optimal timing of living-donor liver transplantation using both methods. Both models result in the same optimal transplantation policy and the same total life expectancies for the same patient and living donor. The computation time for solving the MDP model is significantly smaller than that for solving the Markov model. We briefly describe the growing literature of MDPs applied to medical decisions.
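
    As a concrete illustration of the kind of model the tutorial describes, the sketch below runs value iteration on a tiny, made-up two-state treatment MDP (the states, actions, transition probabilities and rewards are all invented and carry no clinical meaning) and returns the optimal action for each state.

      # Hedged sketch: value iteration for a toy infinite-horizon discounted MDP.
      import numpy as np

      states, actions = ["stable", "sick"], ["wait", "treat"]
      # P[a][s] = distribution over next states; R[a][s] = immediate reward.
      P = {
          "wait":  np.array([[0.9, 0.1],     # from "stable"
                             [0.2, 0.8]]),   # from "sick"
          "treat": np.array([[0.95, 0.05],
                             [0.6,  0.4]]),
      }
      R = {
          "wait":  np.array([1.0, 0.2]),
          "treat": np.array([0.8, 0.5]),     # treatment costs something but helps when sick
      }
      gamma = 0.95

      V = np.zeros(len(states))
      for _ in range(1000):
          Q = np.array([R[a] + gamma * P[a] @ V for a in actions])  # action-value table
          V_new = Q.max(axis=0)
          if np.max(np.abs(V_new - V)) < 1e-8:
              break
          V = V_new

      policy = {s: actions[i] for s, i in zip(states, Q.argmax(axis=0))}
      print("optimal values :", dict(zip(states, np.round(V, 2))))
      print("optimal policy :", policy)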