WorldWideScience

Sample records for parallel design clinical

  1. Patterns for Parallel Software Design

    CERN Document Server

    Ortega-Arjona, Jorge Luis

    2010-01-01

    Essential reading to understand patterns for parallel programming. Software patterns have revolutionized the way we think about how software is designed, built, and documented, and the design of parallel software requires you to consider other particular design aspects and special skills. From clusters to supercomputers, success heavily depends on the design skills of software developers. Patterns for Parallel Software Design presents a pattern-oriented software architecture approach to parallel software design. This approach is not a design method in the classic sense, but a new way of managin

  2. SOFTWARE FOR DESIGNING PARALLEL APPLICATIONS

    Directory of Open Access Journals (Sweden)

    M. K. Bouza

    2017-01-01

    Full Text Available The object of this research is the tools that support the development of parallel programs in C/C++. Methods and software that automate the process of designing parallel applications are proposed.

  3. Critical appraisal of arguments for the delayed-start design proposed as alternative to the parallel-group randomized clinical trial design in the field of rare disease.

    Science.gov (United States)

    Spineli, Loukia M; Jenz, Eva; Großhennig, Anika; Koch, Armin

    2017-08-17

    A number of papers have proposed or evaluated the delayed-start design as an alternative to the standard two-arm parallel-group randomized clinical trial (RCT) design in the field of rare disease. However, the discussion is felt to lack sufficient consideration of the true virtues of the delayed-start design and its implications in terms of required sample size, overall information, or interpretation of the estimate in the context of small populations. To evaluate whether there are real advantages of the delayed-start design, particularly in terms of overall efficacy and sample size requirements, as a proposed alternative to the standard parallel-group RCT in the field of rare disease. We used a real-life example to compare the delayed-start design with the standard RCT in terms of sample size requirements. Then, based on three scenarios regarding the development of the treatment effect over time, the advantages, limitations, and potential costs of the delayed-start design are discussed. We clarify that the delayed-start design is not suitable for drugs that establish an immediate treatment effect, but rather for drugs whose effects develop over time. In addition, the sample size will always increase as a consequence of the reduced time on placebo, which results in a decreased estimated treatment effect. A number of papers have repeated well-known arguments to justify the delayed-start design as an appropriate alternative to the standard parallel-group RCT in the field of rare disease and do not discuss the specific needs of research methodology in this field. The main point is that a limited time on placebo will result in an underestimated treatment effect and, in consequence, in larger sample size requirements compared with those expected under a standard parallel-group design. This also impacts benefit-risk assessment.
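    The sample-size penalty described above can be made concrete with the standard two-arm sample-size formula for a normally distributed endpoint. This is a generic sketch, not taken from the paper: the normal quantiles are hard-coded, and the effect sizes (0.5σ shrinking to 0.4σ under limited placebo exposure) are purely illustrative.

    ```python
    from math import ceil

    def two_arm_sample_size(delta, sigma, alpha=0.05, power=0.80):
        """Per-arm sample size for a two-sided two-sample z-test:
        n = 2 * ((z_{1-alpha/2} + z_{power}) * sigma / delta)^2
        """
        z_alpha = {0.05: 1.959964, 0.01: 2.575829}[alpha]  # z_{1-alpha/2}
        z_beta = {0.80: 0.841621, 0.90: 1.281552}[power]   # z_{power}
        return ceil(2.0 * ((z_alpha + z_beta) * sigma / delta) ** 2)

    # If the reduced time on placebo shrinks the estimable effect from
    # 0.5*sigma to 0.4*sigma, the per-arm size grows by (0.5/0.4)^2 ~ 1.56:
    n_full = two_arm_sample_size(delta=0.5, sigma=1.0)    # 63 per arm
    n_shrunk = two_arm_sample_size(delta=0.4, sigma=1.0)  # 99 per arm
    ```

    Because the required n scales with 1/δ², even a modest underestimation of the treatment effect inflates the trial substantially — the core of the authors' argument.
    
    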

  4. Xyce parallel electronic simulator design.

    Energy Technology Data Exchange (ETDEWEB)

    Thornquist, Heidi K.; Rankin, Eric Lamont; Mei, Ting; Schiek, Richard Louis; Keiter, Eric Richard; Russo, Thomas V.

    2010-09-01

    This document is the Xyce Circuit Simulator developer guide. Xyce has been designed from the ground up to be a SPICE-compatible, distributed-memory parallel circuit simulator. While it is in many respects a research code, Xyce is intended to be a production simulator. As such, having software quality engineering (SQE) procedures in place to ensure a high level of code quality and robustness is essential. Version control, issue tracking, customer support, C++ style guidelines, and the Xyce release process are all described. The Xyce Parallel Electronic Simulator has been under development at Sandia since 1999. Historically, Xyce has mostly been funded by ASC, so the original focus of Xyce development has primarily been circuits for nuclear weapons. However, this has not been the only focus, and it is expected that the project will diversify. Like many ASC projects, Xyce is a group development effort involving a number of researchers, engineers, scientists, mathematicians, and computer scientists. In addition to this diversity of background, a certain amount of staff turnover is to be expected on long-term projects as people move on to other work. As a result, it is very important that the project maintain high software quality standards. The point of this document is to formally document, in one place, a number of the software quality practices followed by the Xyce team. It is also hoped that this document will be a good source of information for new developers.

  5. Parallel kinematics type, kinematics, and optimal design

    CERN Document Server

    Liu, Xin-Jun

    2014-01-01

    Parallel Kinematics: Type, Kinematics, and Optimal Design presents the results of 15 years' research on parallel mechanisms and parallel kinematics machines. This book covers the systematic classification of parallel mechanisms (PMs) and provides a large number of mechanical architectures of PMs available for use in practical applications. It focuses on the kinematic design of parallel robots. One successful application of parallel mechanisms in the field of machine tools, also called parallel kinematics machines, has been the emerging trend in advanced machine tools. The book describes not only the main aspects and important topics in parallel kinematics, but also novel concepts and approaches, such as type synthesis based on evolution, performance evaluation and optimization based on screw theory, and a singularity model taking into account motion and force transmissibility, among others.   This book is intended for researchers, scientists, engineers and postgraduates or above with interes...

  6. Design considerations for parallel graphics libraries

    Science.gov (United States)

    Crockett, Thomas W.

    1994-01-01

    Applications which run on parallel supercomputers are often characterized by massive datasets. Converting these vast collections of numbers to visual form has proven to be a powerful aid to comprehension. For a variety of reasons, it may be desirable to provide this visual feedback at runtime. One way to accomplish this is to exploit the available parallelism to perform graphics operations in place. In order to do this, we need appropriate parallel rendering algorithms and library interfaces. This paper provides a tutorial introduction to some of the issues which arise in designing parallel graphics libraries and their underlying rendering algorithms. The focus is on polygon rendering for distributed memory message-passing systems. We illustrate our discussion with examples from PGL, a parallel graphics library which has been developed on the Intel family of parallel systems.
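    One recurring building block in distributed-memory polygon rendering is image compositing: each processor renders its share of the polygons into a partial image, and the partials are merged. The sketch below shows sort-last style depth compositing as a serial stand-in for the message-passing version. This is a generic illustration, not PGL's actual algorithm or interface, and the two-"processor", 4-pixel data are invented.

    ```python
    def composite(partials):
        """Merge partial (color, depth) images: the nearest sample
        (smallest depth) wins at every pixel. Images are flat lists."""
        n = len(partials[0][0])
        out_color = [0] * n
        out_depth = [float("inf")] * n
        for color, depth in partials:
            for i in range(n):
                if depth[i] < out_depth[i]:
                    out_depth[i] = depth[i]
                    out_color[i] = color[i]
        return out_color

    # Two "processors" with full-size partial images: processor 0's
    # fragments are nearer at pixels 0-1, processor 1's at pixels 2-3.
    p0 = ([1, 1, 1, 1], [0.2, 0.3, 0.9, 0.9])
    p1 = ([2, 2, 2, 2], [0.8, 0.8, 0.1, 0.4])
    merged = composite([p0, p1])  # [1, 1, 2, 2]
    ```

    In a real message-passing implementation, this per-pixel merge is what processors exchange and apply pairwise (e.g. in a binary-swap tree) instead of in a single loop.
    
    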

  7. Design paper: The CapOpus trial: a randomized, parallel-group, observer-blinded clinical trial of specialized addiction treatment versus treatment as usual for young patients with cannabis abuse and psychosis

    DEFF Research Database (Denmark)

    Hjorthøj, Carsten; Fohlmann, Allan; Larsen, Anne-Mette

    2008-01-01

    : The major objective for the CapOpus trial is to evaluate the additional effect on cannabis abuse of a specialized addiction treatment program adding group treatment and motivational interviewing to treatment as usual. DESIGN: The trial is designed as a randomized, parallel-group, observer-blinded clinical...

  8. A Topological Model for Parallel Algorithm Design

    Science.gov (United States)

    1991-09-01

    effort should be directed to planning, requirements analysis, specification and design, with 20% invested into the actual coding, and then the final 40... be one more language to learn. And by investing the effort into improving the utility of an existing language instead of creating a new one, this... 193) it abandons the notion of a process as a fundamental concept of parallel program design and that it facilitates program derivation by rigorously

  9. Pre-operative use of dexamethasone does not reduce incidence or intensity of bleaching-induced tooth sensitivity. A triple-blind, parallel-design, randomized clinical trial.

    Science.gov (United States)

    da Costa Poubel, Luiz Augusto; de Gouvea, Cresus Vinicius Deppes; Calazans, Fernanda Signorelli; Dip, Etyene Castro; Alves, Wesley Veltri; Marins, Stella Soares; Barcelos, Roberta; Barceleiro, Marcos Oliveira

    2018-04-25

    This study evaluated the effect of the administration of pre-operative dexamethasone on tooth sensitivity stemming from in-office bleaching. A triple-blind, parallel-design, randomized clinical trial was conducted on 70 volunteers who received dexamethasone or placebo capsules. The drug was administered in a protocol of three daily 8-mg doses, starting 48 h before the in-office bleaching treatment. Two bleaching sessions with 37.5% hydrogen peroxide gel were performed with a 1-week interval. Tooth sensitivity (TS) was recorded on visual analog scales (VAS) and numeric rating scales (NRS) at different periods up to 48 h after bleaching. Color evaluations were also performed. The absolute risk of TS and its intensity were evaluated by using Fisher's exact test. Comparisons of the TS intensity (NRS and VAS data) were performed by using the Mann-Whitney U test and a two-way repeated-measures ANOVA with Tukey's test, respectively. In both groups, a high risk of TS was detected (dexamethasone 80% vs. placebo 94%). No significant difference was observed in terms of TS intensity. A whitening of approximately 3 shade guide units of the VITA Classical scale was detected in both groups, which were statistically similar. It was concluded that the pre-operative administration of dexamethasone, in the proposed protocol, does not reduce the incidence or intensity of bleaching-induced tooth sensitivity. The use of dexamethasone before in-office bleaching treatment does not reduce the incidence or intensity of tooth sensitivity. NCT02956070.
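    For readers who want to reproduce the risk comparison, Fisher's exact test on a 2×2 table can be computed directly from hypergeometric probabilities. The counts used below (28/35 with sensitivity under dexamethasone, 33/35 under placebo) are back-calculated from the reported 80% and 94% under the assumption of 35 volunteers per arm, which the abstract does not state explicitly.

    ```python
    from math import comb

    def fisher_exact_one_sided(a, b, c, d):
        """One-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]],
        testing whether the first row has the lower event proportion.
        Sums hypergeometric probabilities over all tables with the same
        margins whose first cell is <= a."""
        row1, row2, col1 = a + b, c + d, a + c
        denom = comb(row1 + row2, col1)
        return sum(comb(row1, x) * comb(row2, col1 - x)
                   for x in range(max(0, col1 - row2), a + 1)) / denom

    # Assumed counts: dexamethasone 28/35 with TS, placebo 33/35 with TS.
    p = fisher_exact_one_sided(28, 7, 33, 2)
    ```

    A two-sided version (as typically reported) would also sum the equally extreme tables in the other tail; library implementations such as `scipy.stats.fisher_exact` handle that bookkeeping.
    
    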

  10. The clinical efficacy of reminiscence therapy in patients with mild-to-moderate Alzheimer disease: Study protocol for a randomized parallel-design controlled trial.

    Science.gov (United States)

    Li, Mo; Lyu, Ji-Hui; Zhang, Yi; Gao, Mao-Long; Li, Wen-Jie; Ma, Xin

    2017-12-01

    Alzheimer disease (AD) is one of the most common diseases among older adults. Currently, various nonpharmacological interventions are used for the treatment of AD; reminiscence therapy, for example, is widely used in Western countries. However, it is often applied only empirically in China, and the evidence-based efficacy of reminiscence therapy in AD patients remains to be determined. Therefore, the aim of this research is to assess the effectiveness of reminiscence therapy for elderly Chinese patients. This is a randomized parallel-design controlled trial. Patients with mild or moderate AD at Beijing Geriatric Hospital, China, will be randomized into control and intervention groups (n = 45 for each group). In the intervention group, along with conventional drug therapy, participants will receive reminiscence therapy sessions of 35 to 45 minutes, 2 times/wk, for 12 consecutive weeks. Patients in the control group will undergo conventional drug therapy only. The primary outcome measure will be the difference in Alzheimer disease Assessment Scale-Cognitive section scores. The secondary outcome measures will be the differences in the Cornell Scale for Depression in Dementia, Neuropsychiatric Inventory score, and Barthel Index scores at baseline, at 4 and 12 weeks of treatment, and 12 weeks after treatment. The protocols have been approved by the ethics committee of Beijing Geriatric Hospital of China (approval no. 2015-010). Findings will be disseminated through presentations at scientific conferences and in academic journals. Chinese Clinical Trial Registry identifier: ChiCTR-INR-16009505. Copyright © 2017 The Authors. Published by Wolters Kluwer Health, Inc. All rights reserved.

  11. Conceptual design of multiple parallel switching controller

    International Nuclear Information System (INIS)

    Ugolini, D.; Yoshikawa, S.; Ozawa, K.

    1996-01-01

    This paper discusses the conceptual design and the development of a preliminary model of a multiple parallel switching (MPS) controller. The introduction of several advanced controllers has widened and improved the control capability of nonlinear dynamical systems. However, it is not possible to uniquely define a controller that always outperforms the others, and, in many situations, the controller providing the best control action depends on the operating conditions and on the intrinsic properties and behavior of the controlled dynamical system. The desire to combine the control action of several controllers with the purpose to continuously attain the best control action has motivated the development of the MPS controller. The MPS controller consists of a number of single controllers acting in parallel and of an artificial intelligence (AI) based selecting mechanism. The AI selecting mechanism analyzes the output of each controller and implements the one providing the best control performance. An inherent property of the MPS controller is the possibility to discard unreliable controllers while still being able to perform the control action. To demonstrate the feasibility and the capability of the MPS controller the simulation of the on-line operation control of a fast breeder reactor (FBR) evaporator is presented. (author)
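    The selection idea can be sketched in a few lines: run every healthy candidate controller on the current error, score each proposed action with a performance predictor, and apply the best one. Everything below (the two toy controllers, the one-step plant model, the gains) is invented for illustration and is not from the paper; the actual MPS selecting mechanism is AI-based.

    ```python
    def p_controller(error):
        """Toy proportional controller."""
        return 2.0 * error

    def bang_bang(error):
        """Toy relay (bang-bang) controller."""
        return 1.0 if error > 0 else -1.0

    def select_action(error, controllers, predict_cost):
        """Return (name, action) of the healthy controller whose proposed
        action has the lowest predicted cost; unreliable controllers
        (healthy == False) are simply discarded, as in the MPS concept."""
        best = None
        for name, ctrl, healthy in controllers:
            if not healthy:
                continue
            action = ctrl(error)
            cost = predict_cost(error, action)
            if best is None or cost < best[0]:
                best = (cost, name, action)
        return best[1], best[2]

    # One-step toy plant model: next error = error - 0.4 * action.
    predict = lambda e, u: abs(e - 0.4 * u)

    name, u = select_action(0.2, [("P", p_controller, True),
                                  ("relay", bang_bang, True)], predict)
    ```

    With error 0.2 the proportional action (0.4) leaves a smaller predicted error than the relay action (1.0), so the selector picks "P"; marking "P" unhealthy makes the relay take over without stopping control.
    
    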

  12. A Phase III, Multicenter, Parallel-Design Clinical Trial to Compare the Efficacy and Safety of 5% Minoxidil Foam Versus Vehicle in Women With Female Pattern Hair Loss.

    Science.gov (United States)

    Bergfeld, Wilma; Washenik, Ken; Callender, Valerie; Zhang, Paul; Quiza, Carlos; Doshi, Uday; Blume-Peytavi, Ulrike

    2016-07-01

    BACKGROUND: Female pattern hair loss (FPHL) is a common hair disorder that affects millions of women. A new 5% minoxidil topical foam (MTF) formulation, which does not contain propylene glycol, has been developed. To compare the efficacy and safety of once-daily 5% MTF with vehicle foam for the treatment of FPHL. This was a Phase III, randomized, double-blind, vehicle-controlled, parallel-group, international multicenter trial (17 sites) in women aged at least 18 years with FPHL (grade D3 to D6 on the Savin Density Scale), treated once daily with 5% MTF or vehicle foam for 24 weeks. The co-primary efficacy endpoints were the change from baseline at week 24 in target area hair count (TAHC) and subject assessment of scalp coverage. Also evaluated were TAHC at week 12, expert panel review of hair regrowth at week 24, and change from baseline in total unit area density (TUAD, sum of hair diameters/cm2) at weeks 12 and 24. A total of 404 women were enrolled. At 12 and 24 weeks, 5% MTF treatment resulted in regrowth of 10.9 hairs/cm2 and 9.1 hairs/cm2 more than vehicle foam, respectively (both P<.0001). Improved scalp coverage at week 24 was observed by both subject self-assessment (0.69-point improvement over vehicle foam; P<.0001) and expert panel review (0.36-point improvement over the vehicle foam; P<.0001). TUAD increased by 658 μm/cm2 and 644 μm/cm2 more with 5% MTF than with vehicle foam at weeks 12 and 24, respectively (both P<.0001). MTF was well tolerated. A low incidence of scalp irritation and facial hypertrichosis was observed, with no clinically significant differences between groups. Five percent MTF once daily for 24 weeks was well tolerated and promoted hair regrowth in women with FPHL, resulting in improved scalp coverage and increased hair density compared with vehicle foam. ClinicalTrials.gov identifier: NCT01226459. J Drugs Dermatol. 2016;15(7):874-881.

  13. A National Quality Improvement Collaborative for the clinical use of outcome measurement in specialised mental healthcare: results from a parallel group design and a nested cluster randomised controlled trial.

    Science.gov (United States)

    Metz, Margot J; Veerbeek, Marjolein A; Franx, Gerdien C; van der Feltz-Cornelis, Christina M; de Beurs, Edwin; Beekman, Aartjan T F

    2017-05-01

    Although the importance and advantages of measurement-based care in mental healthcare are well established, implementation in daily practice is complex and far from optimal. To accelerate the implementation of outcome measurement in routine clinical practice, a government-sponsored National Quality Improvement Collaborative was initiated in Dutch specialised mental healthcare. To investigate the effects of this initiative, we combined a matched-pair parallel group design (21 teams) with a cluster randomised controlled trial (RCT) (6 teams). At the beginning and end, the primary outcome, 'actual use and perceived clinical utility of outcome measurement', was assessed. In both designs, intervention teams demonstrated a significantly higher level of implementation of outcome measurement than control teams. Overall effects were large (parallel group d = 0.99; RCT d = 1.25). The National Collaborative successfully improved the use of outcome measurement in routine clinical practice. None. © The Royal College of Psychiatrists 2017. This is an open access article distributed under the terms of the Creative Commons Non-Commercial, No Derivatives (CC BY-NC-ND) license.
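    The reported effect sizes (d = 0.99 and d = 1.25) are standardized mean differences. For reference, Cohen's d with a pooled standard deviation is one common way (an assumption here, since the abstract does not give the formula) to obtain such values; the numbers in the usage line are made up, not the trial's data.

    ```python
    from math import sqrt

    def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
        """Standardized mean difference between two groups using the
        pooled standard deviation."""
        pooled_sd = sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                         / (n1 + n2 - 2))
        return (mean1 - mean2) / pooled_sd

    # Illustrative (made-up) numbers: with equal SDs the pooled SD equals
    # that common SD, so d = (6 - 4) / 2 = 1.0.
    d = cohens_d(6.0, 2.0, 30, 4.0, 2.0, 30)
    ```
    
    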

  14. Design strategies for irregularly adapting parallel applications

    International Nuclear Information System (INIS)

    Oliker, Leonid; Biswas, Rupak; Shan, Hongzhang; Singh, Jaswinder Pal

    2000-01-01

    Achieving scalable performance for dynamic irregular applications is extremely challenging. Traditional message-passing approaches have been making steady progress towards this goal; however, they suffer from complex implementation requirements. The use of a global address space greatly simplifies the programming task, but can degrade the performance of dynamically adapting computations. In this work, we examine two major classes of adaptive applications under five competing programming methodologies and four leading parallel architectures. Results indicate that it is possible to achieve message-passing performance using shared-memory programming techniques by carefully following the same high-level strategies. Adaptive applications have computational workloads and communication patterns which change unpredictably at runtime, requiring dynamic load balancing to achieve scalable performance on parallel machines. Efficient parallel implementation of such adaptive applications is therefore a challenging task. This work examines the implementation of two typical adaptive applications, Dynamic Remeshing and N-Body, across various programming paradigms and architectural platforms. We compare several critical factors of the parallel code development, including performance, programmability, scalability, algorithmic development, and portability.
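    The dynamic load balancing that such applications require can be illustrated with the simplest mechanism: a shared work queue from which idle workers pull tasks, so unpredictable task costs spread themselves across workers automatically. This is a generic Python-threads sketch, not the authors' message-passing or global-address-space code.

    ```python
    import queue
    import threading

    def balance(tasks, n_workers=4):
        """Distribute tasks dynamically: each worker repeatedly pulls the
        next task from a shared queue, so workers that finish early simply
        take more work -- no static partitioning is needed."""
        work = queue.Queue()
        for t in tasks:
            work.put(t)
        done = {wid: [] for wid in range(n_workers)}

        def worker(wid):
            while True:
                try:
                    task = work.get_nowait()
                except queue.Empty:
                    return
                done[wid].append(task)  # stand-in for the real computation

        threads = [threading.Thread(target=worker, args=(wid,))
                   for wid in range(n_workers)]
        for th in threads:
            th.start()
        for th in threads:
            th.join()
        return done

    assignment = balance(list(range(20)))
    ```

    On distributed-memory machines the same idea appears as task pools or work stealing; the queue is then implemented with messages rather than shared memory.
    
    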

  15. Integrated Task And Data Parallel Programming: Language Design

    Science.gov (United States)

    Grimshaw, Andrew S.; West, Emily A.

    1998-01-01

    This research investigates the combination of task and data parallel language constructs within a single programming language. There are a number of applications that exhibit properties which would be well served by such an integrated language. Examples include global climate models, aircraft design problems, and multidisciplinary design optimization problems. Our approach incorporates data parallel language constructs into an existing, object-oriented, task parallel language. The language will support creation and manipulation of parallel classes and objects of both types (task parallel and data parallel). Ultimately, the language will allow data parallel and task parallel classes to be used either as building blocks or managers of parallel objects of either type, thus allowing the development of single- and multi-paradigm parallel applications. 1995 Research Accomplishments: In February I presented a paper at Frontiers '95 describing the design of the data parallel language subset. During the spring I wrote and defended my dissertation proposal. Since that time I have developed a runtime model for the language subset. I have begun implementing the model and hand-coding simple examples which demonstrate the language subset. I have identified an astrophysical fluid flow application which will validate the data parallel language subset. 1996 Research Agenda: Milestones for the coming year include implementing a significant portion of the data parallel language subset over the Legion system. Using simple hand-coded methods, I plan to demonstrate (1) concurrent task and data parallel objects and (2) task parallel objects managing both task and data parallel objects. My next steps will focus on constructing a compiler and implementing the fluid flow application with the language. Concurrently, I will conduct a search for a real-world application exhibiting both task and data parallelism within the same program.
Additional 1995 Activities: During the fall I collaborated

  16. Design of a novel parallel reconfigurable machine tool

    CSIR Research Space (South Africa)

    Modungwa, D

    2008-06-01

    Full Text Available ...of meeting the demands for high mechanical dexterity adaptation as well as the high stiffness necessary for mould and die re-conditioning. This paper presents the design of a parallel reconfigurable machine tool (PRMT) based on both application...

  17. Techniques applied in design optimization of parallel manipulators

    CSIR Research Space (South Africa)

    Modungwa, D

    2011-11-01

    Full Text Available the desired dexterous workspace " Robot.Comput.Integrated Manuf., vol. 23, pp. 38 - 46, 2007. [12] A.P. Murray, F. Pierrot, P. Dauchez and J.M. McCarthy, "A planar quaternion approach to the kinematic synthesis of a parallel manipulator " Robotica, vol... design of a three translational DoFs parallel manipulator " Robotica, vol. 24, pp. 239, 2005. [15] J. Angeles, "The robust design of parallel manipulators," in 1st Int. Colloquium, Collaborative Research Centre 562, 2002. [16] S. Bhattacharya, H...

  18. Design paper: The CapOpus trial: A randomized, parallel-group, observer-blinded clinical trial of specialized addiction treatment versus treatment as usual for young patients with cannabis abuse and psychosis

    Directory of Open Access Journals (Sweden)

    Gluud Christian

    2008-07-01

    Full Text Available Abstract Background A number of studies indicate a link between cannabis-use and psychosis as well as more severe psychosis in those with existing psychotic disorders. There is currently insufficient evidence to decide the optimal way to treat cannabis abuse among patients with psychosis. Objectives The major objective for the CapOpus trial is to evaluate the additional effect on cannabis abuse of a specialized addiction treatment program adding group treatment and motivational interviewing to treatment as usual. Design The trial is designed as a randomized, parallel-group, observer-blinded clinical trial. Patients are primarily recruited through early-psychosis detection teams, community mental health centers, and assertive community treatment teams. Patients are randomized to one of two treatment arms, both lasting six months: (1) specialized addiction treatment plus treatment as usual or (2) treatment as usual. The specialized addiction treatment is manualized and consists of both individual and group-based motivational interviewing and cognitive behavioral therapy, and incorporates both the family and the case manager of the patient. The primary outcome measure will be changes in amount of cannabis consumption over time. Other outcome measures will be psychosis symptoms, cognitive functioning, quality of life, social functioning, and cost-benefit analyses. Trial registration ClinicalTrials.gov NCT00484302.

  19. ADL: a graphical design language for real time parallel applications

    NARCIS (Netherlands)

    M.R. van Steen; T. Vogel; A. ten Dam

    1993-01-01

    textabstractDesigning parallel applications is generally experienced as a tedious and difficult task, especially when hard real-time performance requirements have to be met. This paper discusses on-going work concerning the construction of a Design Entry System which supports the design phase of

  20. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    Science.gov (United States)

    Lohn, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris; Norvig, Peter (Technical Monitor)

    2000-01-01

    We describe a parallel genetic algorithm (GA) that automatically generates circuit designs using evolutionary search. A circuit-construction programming language is introduced and we show how evolution can generate practical analog circuit designs. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. We present experimental results as applied to analog filter and amplifier design tasks.
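    As a minimal illustration of the evolutionary search loop (selection, crossover, mutation), here is a tiny generational GA. The OneMax bit-counting fitness stands in for the authors' circuit-construction language and circuit simulator, which are not reproduced here; population size, rates, and seed are arbitrary.

    ```python
    import random

    def evolve(fitness, length=20, pop_size=40, generations=80, seed=1):
        """Tiny generational GA: size-2 tournament selection, one-point
        crossover, and per-child single-bit-flip mutation."""
        rng = random.Random(seed)
        pop = [[rng.randint(0, 1) for _ in range(length)]
               for _ in range(pop_size)]
        for _ in range(generations):
            def pick():
                a, b = rng.sample(pop, 2)
                return a if fitness(a) >= fitness(b) else b
            nxt = []
            while len(nxt) < pop_size:
                p1, p2 = pick(), pick()
                cut = rng.randrange(1, length)       # one-point crossover
                child = p1[:cut] + p2[cut:]
                if rng.random() < 0.2:               # mutation
                    i = rng.randrange(length)
                    child[i] ^= 1
                nxt.append(child)
            pop = nxt
        return max(pop, key=fitness)

    # OneMax: maximize the number of 1-bits (a stand-in for circuit fitness).
    best = evolve(fitness=sum)
    ```

    A parallel version would evaluate the (expensive) fitness function for the whole population concurrently, which is where circuit simulation dominates the runtime.
    
    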

  1. Design Patterns: establishing a discipline of parallel software engineering

    CERN Multimedia

    CERN. Geneva

    2010-01-01

    Many-core processors present us with a software challenge. We must turn our serial code into parallel code. To accomplish this wholesale transformation of our software ecosystem, we must define what established practice is in parallel programming and then develop tools to support that practice. This leads to design patterns supported by frameworks optimized at runtime with advanced autotuning compilers. In this talk I provide an update of my ongoing research with the ParLab at UC Berkeley to realize this vision. In particular, I will describe our draft parallel pattern language, our early experiments with software frameworks, and the associated runtime optimization tools. About the speaker: Tim Mattson is a parallel programmer (Ph.D. Chemistry, UCSC, 1985). He does linear algebra, finds oil, shakes molecules, solves differential equations, and models electrons in simple atomic systems. He has spent his career working with computer scientists to make sure the needs of parallel applications programmers are met. Tim has ...

  2. Design and Transmission Analysis of an Asymmetrical Spherical Parallel Manipulator

    DEFF Research Database (Denmark)

    Wu, Guanglei; Caro, Stéphane; Wang, Jiawei

    2015-01-01

    This paper presents an asymmetrical spherical parallel manipulator and its transmissibility analysis. This manipulator contains a center shaft to both generate a decoupled unlimited-torsion motion and support the mobile platform for high positioning accuracy. This work addresses the transmission analysis and optimal design of the proposed manipulator based on its kinematic analysis. The input and output transmission indices of the manipulator are defined for its optimum design based on the virtual coefficient between the transmission wrenches and twist screws. The sets of optimal parameters are identified and the distribution of the transmission index is visualized. Moreover, a comparative study of the performances of the symmetrical spherical parallel manipulators is conducted, and the comparison shows the advantages of the proposed manipulator with respect to its spherical parallel...

  3. Basic design of parallel computational program for probabilistic structural analysis

    International Nuclear Information System (INIS)

    Kaji, Yoshiyuki; Arai, Taketoshi; Gu, Wenwei; Nakamura, Hitoshi

    1999-06-01

    In our laboratory, as part of 'development of damage evaluation methods for structural brittle materials by microscopic fracture mechanics and probabilistic theory' (nuclear computational science cross-over research), we examine computational methods for a super-parallel computation system coupled with a material strength theory, based on microscopic fracture mechanics for latent cracks, and a continuum structural model, in order to develop new structural reliability evaluation methods for ceramic structures. This technical report reviews probabilistic structural mechanics theory, the basic terms and formulas, and the parallel computation programming methods related to the principal elements in the basic design of the computational mechanics program. (author)

  4. Basic design of parallel computational program for probabilistic structural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kaji, Yoshiyuki; Arai, Taketoshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Gu, Wenwei; Nakamura, Hitoshi

    1999-06-01

    In our laboratory, as part of 'development of damage evaluation methods for structural brittle materials by microscopic fracture mechanics and probabilistic theory' (nuclear computational science cross-over research), we examine computational methods for a super-parallel computation system coupled with a material strength theory, based on microscopic fracture mechanics for latent cracks, and a continuum structural model, in order to develop new structural reliability evaluation methods for ceramic structures. This technical report reviews probabilistic structural mechanics theory, the basic terms and formulas, and the parallel computation programming methods related to the principal elements in the basic design of the computational mechanics program. (author)

  5. A structured representation for parallel algorithm design on multicomputers

    International Nuclear Information System (INIS)

    Sun, Xian-He; Ni, L.M.

    1991-01-01

    Traditionally, parallel algorithms have been designed by brute-force methods and fine-tuned on each architecture to achieve high performance. Rather than studying the design case by case, a systematic approach is proposed. A notation is first developed. Using this notation, most of the frequently used scientific and engineering applications can be presented by simple formulas. The formulas constitute the structured representation of the corresponding applications. The structured representation is simple, adequate, and easy to understand, and it contains sufficient information about uneven allocation and communication latency degradations. With the structured representation, applications can be compared, classified and partitioned. Some of the basic building blocks, called computation models, of frequently used applications are identified and studied. Most applications are combinations of some computation models. The structured representation relates general applications to computation models. Studying computation models leads to a guideline for efficient parallel algorithm design for general applications. 6 refs., 7 figs

  6. Design of high-performance parallelized gene predictors in MATLAB.

    Science.gov (United States)

    Rivard, Sylvain Robert; Mailloux, Jean-Gabriel; Beguenane, Rachid; Bui, Hung Tien

    2012-04-10

    This paper proposes a method of implementing parallel gene prediction algorithms in MATLAB. The proposed designs are based on either Goertzel's algorithm or on FFTs and have been implemented using varying amounts of parallelism on a central processing unit (CPU) and on a graphics processing unit (GPU). Results show that an implementation using a straightforward approach can require over 4.5 h to process 15 million base pairs (bps) whereas a properly designed one could perform the same task in less than five minutes. In the best case, a GPU implementation can yield these results in 57 s. The present work shows how parallelism can be used in MATLAB for gene prediction in very large DNA sequences to produce results that are over 270 times faster than a conventional approach. This is significant as MATLAB is typically overlooked due to its apparent slow processing time even though it offers a convenient environment for bioinformatics. From a practical standpoint, this work proposes two strategies for accelerating genome data processing which rely on different parallelization mechanisms. Using a CPU, the work shows that direct access to the MEX function increases execution speed and that the PARFOR construct should be used in order to take full advantage of the parallelizable Goertzel implementation. When the target is a GPU, the work shows that data needs to be segmented into manageable sizes within the GFOR construct before processing in order to minimize execution time.
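    Goertzel's algorithm, one of the two approaches the paper parallelizes, evaluates a single DFT bin with one multiply-add recurrence per sample, which is why it suits gene prediction: coding DNA exhibits a characteristic period-3 component at bin k = N/3. Below is a plain-Python sketch; the 18-base sequence is invented, and the MATLAB/MEX/GPU machinery from the paper is not reproduced.

    ```python
    from math import cos, pi

    def goertzel_power(samples, k):
        """Squared magnitude of DFT bin k of `samples` via Goertzel's
        second-order recurrence -- cheaper than a full FFT when only one
        frequency bin is needed."""
        n = len(samples)
        coeff = 2.0 * cos(2.0 * pi * k / n)
        s_prev = s_prev2 = 0.0
        for x in samples:
            s = x + coeff * s_prev - s_prev2
            s_prev2, s_prev = s_prev, s
        return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

    # Binary indicator sequence for base 'G' in a toy 18-base fragment:
    # G occurs at (almost) every third position, so the period-3 bin
    # k = N/3 carries far more power than a non-harmonic bin.
    seq = "GAAGATGACGTTGCAGGA"
    indicator = [1.0 if base == "G" else 0.0 for base in seq]
    power_period3 = goertzel_power(indicator, k=len(seq) / 3)  # ~31.0
    power_other = goertzel_power(indicator, k=1)               # ~1.0
    ```

    In practice one slides this computation over windows of the four base-indicator sequences; since every window and every base is independent, the loop parallelizes naturally (the paper's PARFOR/GFOR constructs).
    
    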

  7. Design of a planar 3-DOF parallel micromanipulator

    International Nuclear Information System (INIS)

    Lee, Jeong Jae; Dong, Yanlu; Jeon, Yong Ho; Lee, Moon Gu

    2013-01-01

    A planar three degree-of-freedom (DOF) parallel manipulator is proposed for alignment during the assembly of microcomponents. It adopts a PRR (prismatic-revolute-revolute) mechanism to meet the requirements of high precision for assembly and robustness against disturbance. The mechanism was designed to have a large workspace and good dexterity, because parallel mechanisms usually have a narrow range of motion and singularities compared to serial mechanisms. Inverse kinematics and a simple closed-loop control algorithm for the parallel manipulator are presented. Experimental tests have been carried out with high-resolution capacitance sensors to verify the performance of the mechanism. The results show that the manipulator has a large workspace of ±1.0 mm, ±1.0 mm, and ±10 mrad in the X-, Y-, and θ-directions, respectively. This is a large workspace considering that the device adopts a parallel mechanism and has a small size of 100 × 100 × 100 mm³. It also achieves a good precision of 2 μm, 3 μm, and 0.2 mrad in the X-, Y-, and θ-axes, respectively. These are high resolutions considering that the manipulator adopts conventional joints. The manipulator is expected to have good dexterity.

  8. Design and test of a parallel kinematic solar tracker

    Directory of Open Access Journals (Sweden)

    Stefano Mauro

    2015-12-01

    Full Text Available This article proposes a parallel kinematic solar tracker designed for driving high-concentration photovoltaic modules. Modules of this kind produce energy only if they are oriented with misalignment errors lower than 0.4°. Generally, a parallel kinematic structure provides high stiffness and precision in positioning, so these features make this mechanism fit for the purpose. This article describes the work carried out to design a suitable parallel machine: an already existing architecture was chosen, and the geometrical parameters of the system were defined in order to obtain a workspace consistent with the requirements for sun tracking. Besides, an analysis of the singularities of the system was carried out. The method used for the singularity analysis revealed the existence of singularities which had not been previously identified for this kind of mechanism. The analysis of the mechanism showed very low nominal energy consumption and elevated stiffness. A small-scale prototype of the system was constructed for the first time. A control algorithm was also developed, implemented, and tested. Finally, experimental tests were carried out to verify the capability of the system to ensure precise pointing. The tests were considered passed, as the system showed an orientation error lower than 0.4° during sun tracking.
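    The 0.4° alignment budget quoted above is easy to check numerically: the pointing error is simply the angle between the module normal and the sun direction vector. A hedged Python sketch (not from the article; the vectors below are illustrative):

```python
import math

def pointing_error_deg(module_normal, sun_dir):
    """Angle in degrees between the module normal and the sun direction vector."""
    dot = sum(a * b for a, b in zip(module_normal, sun_dir))
    na = math.sqrt(sum(a * a for a in module_normal))
    nb = math.sqrt(sum(b * b for b in sun_dir))
    # Clamp to guard against rounding just outside [-1, 1]
    c = max(-1.0, min(1.0, dot / (na * nb)))
    return math.degrees(math.acos(c))

# A 0.3 degree tilt about one axis stays inside the 0.4 degree budget:
err = pointing_error_deg((math.sin(math.radians(0.3)), 0.0, math.cos(math.radians(0.3))),
                         (0.0, 0.0, 1.0))
print(err < 0.4)  # True
```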

  9. Fast ℓ1-SPIRiT Compressed Sensing Parallel Imaging MRI: Scalable Parallel Implementation and Clinically Feasible Runtime

    Science.gov (United States)

    Murphy, Mark; Alley, Marcus; Demmel, James; Keutzer, Kurt; Vasanawala, Shreyas; Lustig, Michael

    2012-01-01

    We present ℓ1-SPIRiT, a simple algorithm for autocalibrating parallel imaging (acPI) and compressed sensing (CS) that permits an efficient implementation with clinically-feasible runtimes. We propose a CS objective function that minimizes cross-channel joint sparsity in the wavelet domain. Our reconstruction minimizes this objective via iterative soft-thresholding, and integrates naturally with iterative Self-Consistent Parallel Imaging (SPIRiT). Like many iterative MRI reconstructions, ℓ1-SPIRiT’s image quality comes at a high computational cost. Excessively long runtimes are a barrier to the clinical use of any reconstruction approach, and thus we discuss our approach to efficiently parallelizing ℓ1-SPIRiT and to achieving clinically-feasible runtimes. We present parallelizations of ℓ1-SPIRiT for both multi-GPU systems and multi-core CPUs, and discuss the software optimization and parallelization decisions made in our implementation. The performance of these alternatives depends on the processor architecture, the size of the image matrix, and the number of parallel imaging channels. Fundamentally, achieving fast runtime requires the correct trade-off between cache usage and parallelization overheads. We demonstrate image quality via a case from our clinical experimentation, using a custom 3DFT Spoiled Gradient Echo (SPGR) sequence with up to 8× acceleration via Poisson-disc undersampling in the two phase-encoded directions. PMID:22345529
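    The joint-sparsity objective mentioned above is enforced by a group soft-thresholding step: at each coefficient position, the vector of values across all receive channels is shrunk according to its joint magnitude, so all channels share a common support. A simplified Python sketch of just that operator (the actual method applies it to wavelet coefficients inside the SPIRiT iteration; the names here are illustrative):

```python
import math

def joint_soft_threshold(coeffs, lam):
    """Group soft-thresholding across channels.

    coeffs: list of per-channel coefficient lists, all the same length.
    Each position i is scaled by max(1 - lam / ||g_i||, 0), where g_i is
    the vector of channel values at position i (joint sparsity).
    """
    n = len(coeffs[0])
    out = [[0.0] * n for _ in coeffs]
    for i in range(n):
        norm = math.sqrt(sum(ch[i] ** 2 for ch in coeffs))
        scale = max(1.0 - lam / norm, 0.0) if norm > 0 else 0.0
        for c, ch in enumerate(coeffs):
            out[c][i] = scale * ch[i]
    return out

# Two channels, one position with joint magnitude 5 -> scaled by 1 - 2.5/5 = 0.5
print(joint_soft_threshold([[3.0], [4.0]], 2.5))  # [[1.5], [2.0]]
```

    Because every coefficient position is processed independently, this step parallelizes trivially, which is one reason the reconstruction maps well to both GPUs and multi-core CPUs.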

  10. Analysis and Design of Embedded Controlled Parallel Resonant Converter

    Directory of Open Access Journals (Sweden)

    P. CHANDRASEKHAR

    2009-07-01

    Full Text Available Microcontroller based constant frequency controlled full bridge LC parallel resonant converter is presented in this paper for electrolyser application. An electrolyser is a part of renewable energy system which generates hydrogen from water electrolysis. The DC power required by the electrolyser system is supplied by the DC-DC converter. Owing to operation under constant frequency, the filter designs are simplified and utilization of magnetic components is improved. This converter has advantages like high power density, low EMI and reduced switching stresses. DC-DC converter system is simulated using MATLAB, Simulink. Detailed simulation results are presented. The simulation results are compared with the experimental results.

  11. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    Science.gov (United States)

    Long, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris

    2000-01-01

    Parallelized versions of genetic algorithms (GAs) are popular primarily for three reasons: the GA is an inherently parallel algorithm, typical GA applications are very compute intensive, and powerful computing platforms, especially Beowulf-style computing clusters, are becoming more affordable and easier to implement. In addition, the low communication bandwidth required allows the use of inexpensive networking hardware such as standard office Ethernet. In this paper we describe a parallel GA and its use in automated high-level circuit design. Genetic algorithms are a type of trial-and-error search technique that are guided by principles of Darwinian evolution. Just as the genetic material of two living organisms can intermix to produce offspring that are better adapted to their environment, GAs expose genetic material, frequently strings of 1s and 0s, to the forces of artificial evolution: selection, mutation, recombination, etc. GAs start with a pool of randomly-generated candidate solutions which are then tested and scored with respect to their utility. Solutions are then bred by probabilistically selecting high quality parents and recombining their genetic representations to produce offspring solutions. Offspring are typically subjected to a small amount of random mutation. After a pool of offspring is produced, this process iterates until a satisfactory solution is found or an iteration limit is reached. Genetic algorithms have been applied to a wide variety of problems in many fields, including chemistry, biology, and many engineering disciplines. There are many styles of parallelism used in implementing parallel GAs. One such method is called the master-slave or processor farm approach. In this technique, slave nodes are used solely to compute fitness evaluations (the most time consuming part). The master processor collects fitness scores from the nodes and performs the genetic operators (selection, reproduction, variation, etc.). Because of dependency
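    The master-slave structure described above — slaves score candidates, the master performs selection and variation — can be sketched in a few lines of Python, here with a thread pool standing in for the slave nodes and a toy bit-counting fitness in place of a circuit evaluation (all details are illustrative, not the paper's implementation):

```python
import random
from concurrent.futures import ThreadPoolExecutor

def fitness(bits):
    """Stand-in for an expensive evaluation, e.g. simulating a candidate circuit."""
    return sum(bits)

def evolve(pop_size=20, genome_len=32, generations=40, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    # Threads stand in for slave nodes; a cluster deployment would farm
    # the evaluations out over MPI or sockets instead.
    with ThreadPoolExecutor(max_workers=4) as slaves:
        for _ in range(generations):
            scores = list(slaves.map(fitness, pop))   # slaves: score candidates
            best = pop[scores.index(max(scores))]
            def pick():                               # master: tournament selection
                i, j = rng.randrange(pop_size), rng.randrange(pop_size)
                return pop[i] if scores[i] >= scores[j] else pop[j]
            nxt = [best[:]]                           # elitism
            while len(nxt) < pop_size:
                mum, dad = pick(), pick()
                cut = rng.randrange(1, genome_len)    # one-point crossover
                child = mum[:cut] + dad[cut:]
                if rng.random() < 0.2:                # occasional point mutation
                    child[rng.randrange(genome_len)] ^= 1
                nxt.append(child)
            pop = nxt
    return max(pop, key=fitness)

print(fitness(evolve()), "of", 32)
```

    Note that Python threads do not speed up CPU-bound fitness functions; the pool only illustrates the processor-farm data flow in which the master's genetic operators stay serial while evaluations are distributed.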

  12. Massively parallel de novo protein design for targeted therapeutics

    KAUST Repository

    Chevalier, Aaron; Silva, Daniel-Adriano; Rocklin, Gabriel J.; Hicks, Derrick R.; Vergara, Renan; Murapa, Patience; Bernard, Steffen M.; Zhang, Lu; Lam, Kwok-Ho; Yao, Guorui; Bahl, Christopher D.; Miyashita, Shin-Ichiro; Goreshnik, Inna; Fuller, James T.; Koday, Merika T.; Jenkins, Cody M.; Colvin, Tom; Carter, Lauren; Bohn, Alan; Bryan, Cassie M.; Ferná ndez-Velasco, D. Alejandro; Stewart, Lance; Dong, Min; Huang, Xuhui; Jin, Rongsheng; Wilson, Ian A.; Fuller, Deborah H.; Baker, David

    2017-01-01

    De novo protein design holds promise for creating small stable proteins with shapes customized to bind therapeutic targets. We describe a massively parallel approach for designing, manufacturing and screening mini-protein binders, integrating large-scale computational design, oligonucleotide synthesis, yeast display screening and next-generation sequencing. We designed and tested 22,660 mini-proteins of 37-43 residues that target influenza haemagglutinin and botulinum neurotoxin B, along with 6,286 control sequences to probe contributions to folding and binding, and identified 2,618 high-affinity binders. Comparison of the binding and non-binding design sets, which are two orders of magnitude larger than any previously investigated, enabled the evaluation and improvement of the computational model. Biophysical characterization of a subset of the binder designs showed that they are extremely stable and, unlike antibodies, do not lose activity after exposure to high temperatures. The designs elicit little or no immune response and provide potent prophylactic and therapeutic protection against influenza, even after extensive repeated dosing.

  13. Massively parallel de novo protein design for targeted therapeutics

    KAUST Repository

    Chevalier, Aaron

    2017-09-26

    De novo protein design holds promise for creating small stable proteins with shapes customized to bind therapeutic targets. We describe a massively parallel approach for designing, manufacturing and screening mini-protein binders, integrating large-scale computational design, oligonucleotide synthesis, yeast display screening and next-generation sequencing. We designed and tested 22,660 mini-proteins of 37-43 residues that target influenza haemagglutinin and botulinum neurotoxin B, along with 6,286 control sequences to probe contributions to folding and binding, and identified 2,618 high-affinity binders. Comparison of the binding and non-binding design sets, which are two orders of magnitude larger than any previously investigated, enabled the evaluation and improvement of the computational model. Biophysical characterization of a subset of the binder designs showed that they are extremely stable and, unlike antibodies, do not lose activity after exposure to high temperatures. The designs elicit little or no immune response and provide potent prophylactic and therapeutic protection against influenza, even after extensive repeated dosing.

  14. Massively parallel de novo protein design for targeted therapeutics

    Science.gov (United States)

    Chevalier, Aaron; Silva, Daniel-Adriano; Rocklin, Gabriel J.; Hicks, Derrick R.; Vergara, Renan; Murapa, Patience; Bernard, Steffen M.; Zhang, Lu; Lam, Kwok-Ho; Yao, Guorui; Bahl, Christopher D.; Miyashita, Shin-Ichiro; Goreshnik, Inna; Fuller, James T.; Koday, Merika T.; Jenkins, Cody M.; Colvin, Tom; Carter, Lauren; Bohn, Alan; Bryan, Cassie M.; Fernández-Velasco, D. Alejandro; Stewart, Lance; Dong, Min; Huang, Xuhui; Jin, Rongsheng; Wilson, Ian A.; Fuller, Deborah H.; Baker, David

    2018-01-01

    De novo protein design holds promise for creating small stable proteins with shapes customized to bind therapeutic targets. We describe a massively parallel approach for designing, manufacturing and screening mini-protein binders, integrating large-scale computational design, oligonucleotide synthesis, yeast display screening and next-generation sequencing. We designed and tested 22,660 mini-proteins of 37–43 residues that target influenza haemagglutinin and botulinum neurotoxin B, along with 6,286 control sequences to probe contributions to folding and binding, and identified 2,618 high-affinity binders. Comparison of the binding and non-binding design sets, which are two orders of magnitude larger than any previously investigated, enabled the evaluation and improvement of the computational model. Biophysical characterization of a subset of the binder designs showed that they are extremely stable and, unlike antibodies, do not lose activity after exposure to high temperatures. The designs elicit little or no immune response and provide potent prophylactic and therapeutic protection against influenza, even after extensive repeated dosing. PMID:28953867

  15. New design of an RSFQ parallel multiply-accumulate unit

    International Nuclear Information System (INIS)

    Kataeva, Irina; Engseth, Henrik; Kidiyarova-Shevchenko, Anna

    2006-01-01

    The multiply-accumulate unit (MAC) is a central component of a successive interference canceller, an advanced receiver for W-CDMA base stations. A 4 × 4 two's complement fixed point RSFQ MAC with rounding to 5 bits has been simulated using VHDL, and maximum performance is equal to 24 GMACS (giga-multiply-accumulates per second). The clock distribution network has been re-designed from a linear ripple to a binary tree network in order to eliminate the data dependence of the clock propagation speed and reduce the number of Josephson junctions in clock lines. The 4 × 4 bit MAC has been designed for the HYPRES 4.5 kA cm⁻² process and its components have been experimentally tested at low frequency: the 5-bit combiner, using an exhaustive test pattern, had margins on DC bias voltage of ±18%, and the 4 × 4 parallel multiplier had margins equal to ±2%
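    A behavioral model helps pin down the arithmetic such a unit must implement: 4-bit two's complement operands, a wide accumulator, and a narrowing/rounding step. The Python sketch below is a plain software model, not an RSFQ circuit model, and the rounding convention (round-half-up on the dropped LSBs) is an assumption, since the abstract does not specify it:

```python
def to_signed(value, bits):
    """Interpret a bits-wide two's complement word as a Python int."""
    mask = (1 << bits) - 1
    value &= mask
    return value - (1 << bits) if value & (1 << (bits - 1)) else value

def mac(acc, a, b, in_bits=4):
    """One multiply-accumulate step on two's complement in_bits operands."""
    return acc + to_signed(a, in_bits) * to_signed(b, in_bits)

def round_to_bits(value, drop_bits):
    """Drop drop_bits LSBs with round-half-up, as when narrowing the result."""
    return (value + (1 << (drop_bits - 1))) >> drop_bits

acc = 0
for a, b in [(0b0111, 0b0111), (0b1001, 0b0011)]:   # 7*7 + (-7)*3 = 28
    acc = mac(acc, a, b)
print(acc)  # 28
```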

  16. A novel magnetorheological damper based parallel planar manipulator design

    International Nuclear Information System (INIS)

    Hoyle, A; Arzanpour, S; Shen, Y

    2010-01-01

    This paper presents a novel parallel planar robot design which is low cost and simple in structure. The design addresses some of the problems, such as concentration of excessive load on the links and joints, due to wrong commanding signals being given by the controller. In this application two of the conventional actuators are replaced by magnetorheological (MR) dampers, and only one actuator is used to generate motion. The design paradigm is based on the concept that a moving object 'intuitively' follows the path with minimum resistance to its motion. This implies that virtual adoptable constraints can be used effectively to define motion trajectories. In fact, motion generation and adaptive constraints are two elements essential to implementing this strategy. In this paper, MR dampers are used to provide adjustable constraints and to guide the platform that is moved by the linear motor. The model of the MR dampers is derived using the Bouc–Wen model. This model is then used for manipulator simulation and controller design. Two controllers are developed for this manipulator: (1) a closed loop on/off one and (2) a proportional–derivative controller. Also, three different trajectories are defined and used for both the simulations and experiments. The results indicate a good agreement between the simulations and experiments. The experimental results also demonstrate the capability of the manipulator for following sophisticated trajectories

  17. GPU-based Parallel Application Design for Emerging Mobile Devices

    Science.gov (United States)

    Gupta, Kshitij

    A revolution is underway in the computing world that is causing a fundamental paradigm shift in device capabilities and form-factor, with a move from well-established legacy desktop/laptop computers to mobile devices in varying sizes and shapes. Amongst all the tasks these devices must support, graphics has emerged as the 'killer app' for providing a fluid user interface and high-fidelity game rendering, effectively making the graphics processor (GPU) one of the key components in (present and future) mobile systems. By utilizing the GPU as a general-purpose parallel processor, this dissertation explores the GPU computing design space from an applications standpoint, in the mobile context, by focusing on key challenges presented by these devices---limited compute, memory bandwidth, and stringent power consumption requirements---while improving the overall application efficiency of the increasingly important speech recognition workload for mobile user interaction. We broadly partition trends in GPU computing into four major categories. We analyze hardware and programming model limitations in current-generation GPUs and detail an alternate programming style called Persistent Threads, identify four use case patterns, and propose minimal modifications that would be required for extending native support. We show how by manually extracting data locality and altering the speech recognition pipeline, we are able to achieve significant savings in memory bandwidth while simultaneously reducing the compute burden on GPU-like parallel processors. As we foresee GPU computing evolving from its current 'co-processor' model into an independent 'applications processor' capable of executing complex work independently, we create an alternate application framework that enables the GPU to handle all control-flow dependencies autonomously at run-time while minimizing host involvement to just issuing commands, which facilitates an efficient application implementation.
Finally, as

  18. Analysis and Design of High-Order Parallel Resonant Converters

    Science.gov (United States)

    Batarseh, Issa Eid

    1990-01-01

    In this thesis, a special state variable transformation technique has been derived for the analysis of high order dc-to-dc resonant converters. Converters comprised of high order resonant tanks have the advantage of utilizing the parasitic elements by making them part of the resonant tank. A new set of state variables is defined in order to make use of two-dimensional state-plane diagrams in the analysis of high order converters. Such a method has been successfully used for the analysis of the conventional Parallel Resonant Converter (PRC). Consequently, two-dimensional state-plane diagrams are used to analyze the steady state response of third and fourth order PRCs when these converters are operated in the continuous conduction mode. Based on this analysis, a set of control characteristic curves for the LCC-, LLC- and LLCC-type PRC are presented from which various converter design parameters are obtained. Various design curves for component value selection and device ratings are given. This analysis of high order resonant converters shows that the addition of reactive components to the resonant tank results in converters with better performance characteristics when compared with the conventional second order PRC. A complete design procedure along with design examples for 2nd, 3rd and 4th order converters is presented. Practical power supply units, normally used for computer applications, were built and tested using the LCC-, LLC- and LLCC-type commutation schemes. In addition, computer simulation results are presented for these converters in order to verify the theoretical results.
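    Whatever the tank order, two quantities anchor the design of any resonant stage: the undamped resonant frequency f0 = 1/(2π√(LC)) and the characteristic impedance Z0 = √(L/C). A small Python helper illustrates the arithmetic (the component values below are arbitrary examples, not taken from the thesis):

```python
import math

def resonant_frequency(l_henry, c_farad):
    """Undamped resonant frequency of an LC tank, in Hz."""
    return 1.0 / (2.0 * math.pi * math.sqrt(l_henry * c_farad))

def characteristic_impedance(l_henry, c_farad):
    """Characteristic impedance sqrt(L/C) of the tank, in ohms."""
    return math.sqrt(l_henry / c_farad)

# Example: 100 uH with 1 uF resonates near 15.9 kHz with Z0 of about 10 ohms
f0 = resonant_frequency(100e-6, 1e-6)
z0 = characteristic_impedance(100e-6, 1e-6)
print(round(f0), round(z0))  # 15915 10
```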

  19. Successful design and application of SNCR parallel to combustion modification

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Dongxian; Tang, Leping; Shao, Xiaozhen; Meng, Derun; Li, Hongjian [Tongfang Environment CO., LTD., Beijing (China); Zhou, Wei; Xu, Guang [GE Energy, Anaheim, CA (United States)

    2013-07-01

    Various De-NOx methods have recently been adopted in China to control NOx emissions, including Selective Non-Catalytic Reduction (SNCR) technology. Usually, the design of an SNCR system is carried out after combustion modification technologies, such as low NOx burners (LNB) and over fire air (OFA), have already been installed and are in operation. This article discusses how to design the SNCR system in parallel with the combustion modification. The SNCR process design consists of three steps: (1) boiler baseline testing, (2) computational fluid dynamics (CFD) facilitated design, and (3) SNCR system performance prediction and optimization. The first step is to conduct a boiler baseline test to characterize the boiler operating conditions over a load range. The test data can also be used to calibrate the CFD model. The second step is to develop a three-dimensional boiler coal combustion CFD model to simulate the operation of the boilers at both baseline and post combustion modification conditions. The simulation reveals velocity, temperature, and combustible distributions in the furnace. The last step is to determine the position and number of the injectors for the SNCR reagent. Final field tests upon project completion have shown that the average SNCR De-NOx efficiency reached 35.1%, with a maximum removal efficiency of 45% at full load. The project also couples the SNCR and SCR (Selective Catalytic Reduction) technologies. The combined removal efficiency of combustion modifications, SNCR, and SCR is higher than 82%. This paper shows a successful example of retrofitting aged power-generating units with limited space.

  20. Design of a 3-DOF Parallel Hand-Controller

    Directory of Open Access Journals (Sweden)

    Chengcheng Zhu

    2017-01-01

    Full Text Available Hand-controllers, as human-machine-interface (HMI) devices, can transfer the position information of the operator’s hands into a virtual environment to control target objects or a real robot directly. At the same time, haptic information from the virtual environment or from sensors on the real robot can be displayed to the operator. Feedback force helps the operator perceive haptic information more faithfully. A parallel hand-controller is designed in this paper. It is simplified from the traditional delta haptic device: the swing arms of conventional delta devices are replaced with slider rail modules, and the base consists of two hexagons and several links. Because linear sliding modules are used instead of swing arms, arc movement is replaced by linear movement, which reduces the computational load of the forward position solution and the inverse force solution. The kinematics, statics, and dynamics are analyzed in this paper. In addition, two demonstration applications are developed to verify the performance of the designed hand-controller.

  1. Refining SCJ Mission Specifications into Parallel Handler Designs

    Directory of Open Access Journals (Sweden)

    Frank Zeyda

    2013-05-01

    Full Text Available Safety-Critical Java (SCJ is a recent technology that restricts the execution and memory model of Java in such a way that applications can be statically analysed and certified for their real-time properties and safe use of memory. Our interest is in the development of comprehensive and sound techniques for the formal specification, refinement, design, and implementation of SCJ programs, using a correct-by-construction approach. As part of this work, we present here an account of laws and patterns that are of general use for the refinement of SCJ mission specifications into designs of parallel handlers used in the SCJ programming paradigm. Our notation is a combination of languages from the Circus family, supporting state-rich reactive models with the addition of class objects and real-time properties. Our work is a first step to elicit laws of programming for SCJ and fits into a refinement strategy that we have developed previously to derive SCJ programs.

  2. A general approach for optimal kinematic design of 6-DOF parallel ...

    Indian Academy of Sciences (India)

    Optimal kinematic design of parallel manipulators is a challenging problem. In this work, an attempt has been made to present a generalized approach of kinematic design for a 6-legged parallel manipulator, by considering only the minimally required design parameters. The same approach has been used to design a ...

  3. Kinematics and design of a class of parallel manipulators

    Science.gov (United States)

    Hertz, Roger Barry

    1998-12-01

    This dissertation is concerned with the kinematic analysis and design of a class of three degree-of-freedom, spatial parallel manipulators. The class of manipulators is characterized by two platforms, between which are three legs, each possessing a succession of revolute, spherical, and revolute joints. The class is termed the "revolute-spherical-revolute" class of parallel manipulators. Two members of this class are examined. The first mechanism is a double-octahedral variable-geometry truss, and the second is termed a double tripod. The history of the mechanisms is explored---the variable-geometry truss dates back to 1984, while predecessors of the double tripod mechanism date back to 1869. This work centers on the displacement analysis of these three-degree-of-freedom mechanisms. Two types of problem are solved: the forward displacement analysis (forward kinematics) and the inverse displacement analysis (inverse kinematics). The kinematic model of the class of mechanism is general in nature. A classification scheme for the revolute-spherical-revolute class of mechanism is introduced, which uses dominant geometric features to group designs into 8 different sub-classes. The forward kinematics problem is discussed: given a set of independently controllable input variables, solve for the relative position and orientation between the two platforms. For the variable-geometry truss, the controllable input variables are assumed to be the linear (prismatic) joints. For the double tripod, the controllable input variables are the three revolute joints adjacent to the base (proximal) platform. Multiple solutions are presented to the forward kinematics problem, indicating that there are many different positions (assemblies) that the manipulator can assume with equivalent inputs. For the double tripod these solutions can be expressed as a 16th degree polynomial in one unknown, while for the variable-geometry truss there exist two 16th degree polynomials, giving rise to 256

  4. Graph Transformation and Designing Parallel Sparse Matrix Algorithms beyond Data Dependence Analysis

    Directory of Open Access Journals (Sweden)

    H.X. Lin

    2004-01-01

    Full Text Available Algorithms are often parallelized based on data dependence analysis, either manually or by means of parallel compilers. Some vector/matrix computations, such as matrix-vector products with simple data dependence structures (data parallelism), can be easily parallelized. For problems with more complicated data dependence structures, parallelization is less straightforward. The data dependence graph is a powerful means for designing and analyzing parallel algorithms. However, for sparse matrix computations, parallelization based solely on exploiting the existing parallelism in an algorithm does not always give satisfactory results. For example, the conventional Gaussian elimination algorithm for the solution of a tri-diagonal system is inherently sequential, so algorithms designed specifically for parallel computation are needed. After briefly reviewing different parallelization approaches, a powerful graph formalism for designing parallel algorithms is introduced. This formalism is discussed using a tri-diagonal system as an example. Its application to general matrix computations is also discussed. Its power to design parallel algorithms beyond the ability of data dependence analysis is shown by means of a new algorithm called the ACER (Alternating Cyclic Elimination and Reduction) algorithm.
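    The tri-diagonal example above is the classic motivation for cyclic reduction: unlike the sequential Thomas/Gaussian sweep, every equation touched at a given reduction level can be updated independently, so each level parallelizes. A compact serial Python sketch of classical cyclic reduction for n = 2^m − 1 unknowns (illustrative only; ACER itself alternates elimination and reduction differently):

```python
def cyclic_reduction(a, b, c, d):
    """Solve a tridiagonal system (sub a, diag b, super c, rhs d), n = 2**m - 1.

    Each inner loop below touches independent equations, so every level
    could run in parallel; only the levels themselves are sequential.
    """
    a, b, c, d = list(a), list(b), list(c), list(d)
    n = len(b)
    m = (n + 1).bit_length() - 1
    assert n == 2 ** m - 1, "size must be 2**m - 1"
    for lev in range(m - 1):                     # forward reduction
        h, step = 2 ** lev, 2 ** (lev + 1)
        for i in range(step - 1, n, step):       # independent -> parallelizable
            am = a[i] / b[i - h]
            cp = c[i] / b[i + h] if i + h < n else 0.0
            b[i] -= am * c[i - h] + (cp * a[i + h] if i + h < n else 0.0)
            d[i] -= am * d[i - h] + (cp * d[i + h] if i + h < n else 0.0)
            a[i] = -am * a[i - h]
            c[i] = -cp * c[i + h] if i + h < n else 0.0
    x = [0.0] * n
    x[n // 2] = d[n // 2] / b[n // 2]            # single remaining unknown
    for lev in range(m - 2, -1, -1):             # back substitution
        h, step = 2 ** lev, 2 ** (lev + 1)
        for i in range(h - 1, n, step):          # independent -> parallelizable
            xl = x[i - h] if i - h >= 0 else 0.0
            xr = x[i + h] if i + h < n else 0.0
            x[i] = (d[i] - a[i] * xl - c[i] * xr) / b[i]
    return x

# 1D Laplacian with true solution 1..7: only the last rhs entry is nonzero.
x = cyclic_reduction([0] + [-1] * 6, [2] * 7, [-1] * 7, [0] * 6 + [8])
print([round(v, 6) for v in x])  # [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]
```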

  5. Evaluation of the Efficacy and Safety of the Lercanidipine/Valsartan Combination in Korean Patients With Essential Hypertension Not Adequately Controlled With Lercanidipine Monotherapy: A Randomized, Multicenter, Parallel Design, Phase III Clinical Trial.

    Science.gov (United States)

    Na, Sang-Hoon; Lee, Hae-Young; Hong Baek, Sang; Jeon, Hui-Kyung; Kang, Jin-Ho; Kim, Yoon-Nyun; Park, Chang-Gyu; Ryu, Jae-Kean; Rhee, Moo-Yong; Kim, Moo-Hyun; Hong, Taek-Jong; Choi, Dong-Ju; Cho, Seong-Wook; Cha, Dong-Hun; Jeon, Eun-Seok; Kim, Jae-Joong; Shin, Joon-Han; Park, Sung-Ha; Lee, Seung-Hwan; John, Sung-Hee; Shin, Eun-Seok; Kim, Nam-Ho; Lee, Sung-Yun; Kwan, Jun; Jeong, Myung-Ho; Kim, Sang-Wook; Jeong, Jin-Ok; Kim, Dong-Woon; Lee, Nam-Ho; Park, Woo-Jung; Ahn, Jeong-Cheon; Won, Kyung-Heon; Uk Lee, Seung; Cho, Jang-Hyun; Kim, Soon-Kil; Ahn, Taehoon; Hong, Sukkeun; Yoo, Sang-Yong; Kim, Song-Yi; Kim, Byung-Soo; Juhn, Jae-Hyeon; Kim, Sun-Young; Lee, Yu-Jeong; Oh, Byung-Hee

    2015-08-01

    The objective of this study was to evaluate the efficacy and safety of the lercanidipine/valsartan combination compared with lercanidipine monotherapy in patients with hypertension. Part 1 of this study was a randomized, multicenter, double-blind, parallel-group, Phase III, 8-week clinical trial comparing the superiority of lercanidipine 10 mg/valsartan 80 mg (L10/V80) and lercanidipine 10 mg/valsartan 160 mg (L10/V160) combinations with lercanidipine 10 mg (L10) monotherapy. At screening, hypertensive patients whose diastolic blood pressure (DBP) was >90 mm Hg after 4 weeks on L10 were randomized to 3 groups: L10, L10/V80, and L10/V160. The primary end point was the change in mean sitting DBP from baseline (week 0) after 8 weeks of therapy. Patients who were randomly assigned to L10/V160 and whose mean DBP was still ≥ 90 mm Hg in part 1 were enrolled in the up-titration extension study with lercanidipine 20 mg/valsartan 160 mg (L20/V160) (part 2). Of 772 patients screened, 497 were randomized to the 3 groups (166 in the L10 group, 168 in the L10/V80 group, and 163 in the L10/V160 group). Mean (SD) age was 55 (9.9) years, and male patients comprised 69%. The mean (SD) baseline systolic blood pressure (SBP)/DBP was 148.4 (15.1)/94.3 (9.5) mm Hg. No significant differences were found between groups in baseline characteristics except the percentages with a previous history of antihypertensive medication. For the primary end point, the changes in mean (SD) DBP at week 8 from baseline were -2.0 (8.8) mm Hg in the L10 group, -6.7 (8.5) mm Hg in the L10/V80 group, and -8.1 (8.4) mm Hg in the L10/V160 group. The adjusted mean difference between the combination groups and the L10 monotherapy group was -4.6 mm Hg (95% CI, -6.5 to -2.6; P < 0.001) for L10/V80 and -5.9 mm Hg (95% CI, -7.9 to -4.0; P < 0.001) for L10/V160, showing significantly greater efficacy in BP lowering. A total of 74 patients were enrolled in the part 2 extension study. Changes of mean
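    As a rough cross-check of figures like these, the unadjusted difference in mean DBP change between two arms, with a normal-approximation 95% CI, can be computed directly from the reported means, SDs, and group sizes. This is a hedged sketch only: the trial reported covariate-adjusted differences, so the numbers differ slightly.

```python
import math

def diff_ci_95(mean1, sd1, n1, mean2, sd2, n2):
    """Unadjusted difference in means (group1 - group2) with a normal-approx 95% CI."""
    diff = mean1 - mean2
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return diff, (diff - 1.96 * se, diff + 1.96 * se)

# Week-8 DBP change, L10/V160 combination vs L10 monotherapy, from the abstract:
diff, (lo, hi) = diff_ci_95(-8.1, 8.4, 163, -2.0, 8.8, 166)
print(round(diff, 1), round(lo, 1), round(hi, 1))  # -6.1 -8.0 -4.2
```

    The unadjusted estimate (-6.1 mm Hg) is close to the reported adjusted difference (-5.9 mm Hg, 95% CI -7.9 to -4.0), as expected when baseline covariates are nearly balanced.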

  6. Discussion about the design for mesh data structure within the parallel framework

    International Nuclear Information System (INIS)

    Shi Guangmei; Wu Ruian; Wang Keying; Ji Xiaoyu; Hao Zhiming; Mo Jun; He Yingbo

    2010-01-01

    The mesh data structure is one of the fundamental data structures within a parallel framework, and the quality of its design and implementation affects the parallel capability of the framework. By comparatively analyzing the architecture and fundamental data structures of some typical parallel frameworks, such as JASMIN, SIERRA, and ITAPS, the design philosophy of parallel frameworks is discussed. Borrowing ideas from the layered set of services in the SIERRA framework, and combining them with the near-term objectives of the PANDA framework, this paper presents a rudimentary layered set of services for PANDA. On this foundation, a detailed introduction is given to the definition and management of the mesh data structure located in the bottom layer of the PANDA framework. The design and implementation of the parallel distributed mesh data structure of PANDA are discussed in particular. Extension of the PANDA framework and application development based on it are grounded in these efforts.

  7. Design, analysis and control of cable-suspended parallel robots and its applications

    CERN Document Server

    Zi, Bin

    2017-01-01

    This book provides an essential overview of the authors’ work in the field of cable-suspended parallel robots, focusing on innovative design, mechanics, control, development and applications. It presents and analyzes several typical mechanical architectures of cable-suspended parallel robots in practical applications, including the feed cable-suspended structure for super antennae, hybrid-driven-based cable-suspended parallel robots, and cooperative cable parallel manipulators for multiple mobile cranes. It also addresses the fundamental mechanics of cable-suspended parallel robots on the basis of their typical applications, including the kinematics, dynamics and trajectory tracking control of the feed cable-suspended structure for super antennae. In addition it proposes a novel hybrid-driven-based cable-suspended parallel robot that uses integrated mechanism design methods to improve the performance of traditional cable-suspended parallel robots. A comparative study on error and performance indices of hybr...
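
For cable-suspended parallel robots of the kind surveyed above, the inverse kinematics is unusually direct: each cable length is the distance from its fixed anchor to its attachment point on the platform. A minimal sketch for a point-mass platform (the anchor coordinates are made-up values, not from the book):

```python
import math

def cable_lengths(anchors, platform_pos):
    """Inverse kinematics of a point-mass cable-suspended robot:
    each cable length is the anchor-to-platform distance."""
    return [math.dist(a, platform_pos) for a in anchors]

# Hypothetical geometry: three anchors on a ceiling 2 m above the floor.
anchors = [(0.0, 0.0, 2.0), (2.0, 0.0, 2.0), (1.0, 2.0, 2.0)]
lengths = cable_lengths(anchors, (1.0, 1.0, 1.0))
```

For a rigid platform the attachment points are offset from the platform pose by its rotation, but each cable length is still a point-to-point distance.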

  8. Out-of-order parallel discrete event simulation for electronic system-level design

    CERN Document Server

    Chen, Weiwei

    2014-01-01

    This book offers readers a set of new approaches, tools, and techniques for facing the challenges of parallelization in the design of embedded systems. It provides an advanced parallel simulation infrastructure for efficient and effective system-level model validation and development, so as to build better products in less time. Since parallel discrete event simulation (PDES) has the potential to exploit the underlying parallel computational capability in today's multi-core simulation hosts, the author begins by reviewing the parallelization of discrete event simulation, identifying...
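
Out-of-order PDES itself is beyond a few lines, but the sequential discrete event simulation kernel that PDES parallelizes is compact: pop the earliest pending event from a priority queue, run its handler, and push any events the handler schedules. A minimal sketch (illustrative, not the book's simulator):

```python
import heapq

def run_des(events, horizon):
    """Minimal discrete-event simulation kernel. Events are
    (time, seq, handler) tuples; seq breaks ties so handlers
    are never compared. Handlers return newly scheduled events."""
    queue = list(events)
    heapq.heapify(queue)
    log = []
    while queue:
        t, seq, handler = heapq.heappop(queue)
        if t > horizon:
            break
        log.append(t)
        for new_event in handler(t):
            heapq.heappush(queue, new_event)
    return log

# A self-rescheduling periodic event, e.g. a clock tick every 1.0 units.
def tick(t, period=1.0):
    return [(t + period, 0, tick)]

times = run_des([(0.0, 0, tick)], horizon=3.0)   # [0.0, 1.0, 2.0, 3.0]
```

PDES partitions this queue across processors; the hard part, which the book addresses, is deciding when an event may safely execute out of global timestamp order.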

  9. Massive parallel electromagnetic field simulation program JEMS-FDTD design and implementation on jasmin

    International Nuclear Information System (INIS)

    Li Hanyu; Zhou Haijing; Dong Zhiwei; Liao Cheng; Chang Lei; Cao Xiaolin; Xiao Li

    2010-01-01

    A large-scale parallel electromagnetic field simulation program JEMS-FDTD (J Electromagnetic Solver-Finite Difference Time Domain) is designed and implemented on JASMIN (J parallel Adaptive Structured Mesh applications INfrastructure). This program can simulate the propagation, radiation, and coupling of electromagnetic fields by solving the Maxwell equations explicitly on structured meshes with the FDTD method. JEMS-FDTD is able to simulate billion-mesh-scale problems on thousands of processors. In this article, the program is verified by simulating the radiation of an electric dipole. A beam waveguide is simulated to demonstrate the capability of large-scale parallel computation. A parallel performance test indicates that a high parallel efficiency is obtained. (authors)
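
The FDTD method used by JEMS-FDTD can be illustrated in one dimension: the E and H fields leapfrog in time, each updated from the spatial difference of the other. A minimal normalized-units sketch (not JEMS-FDTD code; the grid size, source position, and Courant number are arbitrary choices):

```python
import math

def fdtd_1d(n_cells=200, n_steps=300, courant=0.5):
    """1-D FDTD leapfrog update (normalized units): E is updated from
    the curl of H, a soft Gaussian source is injected, then H is
    updated from the curl of E. courant <= 1 keeps the scheme stable."""
    ez = [0.0] * n_cells
    hy = [0.0] * n_cells
    for n in range(n_steps):
        for k in range(1, n_cells):
            ez[k] += courant * (hy[k - 1] - hy[k])
        ez[100] += math.exp(-((n - 30) ** 2) / 100.0)   # soft source
        for k in range(n_cells - 1):
            hy[k] += courant * (ez[k] - ez[k + 1])
    return ez

ez = fdtd_1d()
```

A 3-D production code applies the same staggered update to all six field components and partitions the mesh across processors with ghost-cell exchanges at the block boundaries.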

  10. Adaptive designs in clinical trials

    Directory of Open Access Journals (Sweden)

    Suresh Bowalekar

    2011-01-01

    Full Text Available In addition to the expensive and lengthy process of developing a new medicine, the attrition rate in clinical research has been on the rise, resulting in stagnation in the development of new compounds. As a consequence, the US Food and Drug Administration released a critical path initiative document in 2004, highlighting the need to develop innovative trial designs. One of the suggested innovations was the use of adaptive designs for clinical trials. Thus, since the critical path initiative, there has been growing interest in using adaptive designs for the development of pharmaceutical products. Adaptive designs have great potential to reduce the number of patients and the duration of a trial, and to limit exposure to the new drug. Adaptive designs are not new, in the sense that interim analysis (IA)/review of the accumulated data, as used in adaptive designs, existed in the past too. However, such reviews/analyses of accumulated data were not necessarily planned at the trial-planning stage, and the methods used were not necessarily compliant with the clinical trial process. The Bayesian approach commonly used in adaptive designs was developed by Thomas Bayes in the 18th century, about a hundred years before the development of modern statistical methods by the father of modern statistics, Sir Ronald A. Fisher; the complexity of the Bayesian approach long prevented its use in practice. Advances in computer and information technology over the last three to four decades have changed this, and Bayesian techniques are now used in adaptive designs alongside the other sequential methods used in IA. This paper attempts to describe the various adaptive designs in clinical trials and the views of stakeholders about the feasibility of using them, without going into mathematical complexities.

  11. Adaptive designs in clinical trials.

    Science.gov (United States)

    Bowalekar, Suresh

    2011-01-01

    In addition to the expensive and lengthy process of developing a new medicine, the attrition rate in clinical research has been on the rise, resulting in stagnation in the development of new compounds. As a consequence, the US Food and Drug Administration released a critical path initiative document in 2004, highlighting the need to develop innovative trial designs. One of the suggested innovations was the use of adaptive designs for clinical trials. Thus, since the critical path initiative, there has been growing interest in using adaptive designs for the development of pharmaceutical products. Adaptive designs have great potential to reduce the number of patients and the duration of a trial, and to limit exposure to the new drug. Adaptive designs are not new, in the sense that interim analysis (IA)/review of the accumulated data, as used in adaptive designs, existed in the past too. However, such reviews/analyses of accumulated data were not necessarily planned at the trial-planning stage, and the methods used were not necessarily compliant with the clinical trial process. The Bayesian approach commonly used in adaptive designs was developed by Thomas Bayes in the 18th century, about a hundred years before the development of modern statistical methods by the father of modern statistics, Sir Ronald A. Fisher; the complexity of the Bayesian approach long prevented its use in practice. Advances in computer and information technology over the last three to four decades have changed this, and Bayesian techniques are now used in adaptive designs alongside the other sequential methods used in IA. This paper attempts to describe the various adaptive designs in clinical trials and the views of stakeholders about the feasibility of using them, without going into mathematical complexities.
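
In the simplest binary-outcome setting, the Bayesian machinery referred to above reduces to conjugate Beta-Binomial updating at each interim analysis: a Beta(a, b) prior on the response rate becomes Beta(a + successes, b + failures) after the data are observed. A minimal illustrative sketch (the numbers are invented, not from any trial):

```python
def beta_update(prior_a, prior_b, successes, failures):
    """Conjugate Beta-Binomial update of a response-rate belief."""
    return prior_a + successes, prior_b + failures

def posterior_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Interim analysis: uniform Beta(1, 1) prior, then 12 responders out of 20.
a, b = beta_update(1, 1, 12, 8)
mean = posterior_mean(a, b)
```

An adaptive design would then use such posteriors to, for example, shift allocation toward the better-performing arm or stop early for futility; those decision rules are trial-specific.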

  12. Error Modeling and Design Optimization of Parallel Manipulators

    DEFF Research Database (Denmark)

    Wu, Guanglei

    /backlash, manufacturing and assembly errors and joint clearances. From the error prediction model, the distributions of the pose errors due to joint clearances are mapped within its constant-orientation workspace and the correctness of the developed model is validated experimentally. Additionally, using the screw......, dynamic modeling etc. Next, the first-order differential equation of the kinematic closure equation of planar parallel manipulator is obtained to develop its error model both in Polar and Cartesian coordinate systems. The established error model contains the error sources of actuation error...

  13. Optimization Algorithms for Calculation of the Joint Design Point in Parallel Systems

    DEFF Research Database (Denmark)

    Enevoldsen, I.; Sørensen, John Dalsgaard

    1992-01-01

    In large structures it is often necessary to estimate the reliability of the system by use of parallel systems. Optimality criteria-based algorithms for calculation of the joint design point in a parallel system are described and efficient active set strategies are developed. Three possible...

  14. Multiobjective Optimum Design of a 3-RRR Spherical Parallel Manipulator with Kinematic and Dynamic Dexterities

    DEFF Research Database (Denmark)

    Wu, Guanglei

    2012-01-01

    parameters of the spherical parallel manipulator. The proposed approach is illustrated with the optimum design of a special spherical parallel manipulator with unlimited rolling motion. The corresponding optimization problem aims to maximize the kinematic and dynamic dexterities over its regular shaped...

  15. Design and Implementation of Papyrus: Parallel Aggregate Persistent Storage

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jungwon [ORNL; Sajjapongse, Kittisak [ORNL; Lee, Seyong [ORNL; Vetter, Jeffrey S [ORNL

    2017-01-01

    A surprising development in recently announced HPC platforms is the addition of, sometimes massive amounts of, persistent (nonvolatile) memory (NVM) in order to increase memory capacity and compensate for plateauing I/O capabilities. However, there are no portable and scalable programming interfaces using aggregate NVM effectively. This paper introduces Papyrus: a new software system built to exploit emerging capability of NVM in HPC architectures. Papyrus (or Parallel Aggregate Persistent -YRU- Storage) is a novel programming system that provides features for scalable, aggregate, persistent memory in an extreme-scale system for typical HPC usage scenarios. Papyrus mainly consists of Papyrus Virtual File System (VFS) and Papyrus Template Container Library (TCL). Papyrus VFS provides a uniform aggregate NVM storage image across diverse NVM architectures. It enables Papyrus TCL to provide a portable and scalable high-level container programming interface whose data elements are distributed across multiple NVM nodes without requiring the user to handle complex communication, synchronization, replication, and consistency model. We evaluate Papyrus on two HPC systems, including UTK Beacon and NERSC Cori, using real NVM storage devices.

  16. Design of a Load-Balancing Architecture For Parallel Firewalls

    National Research Council Canada - National Science Library

    Joyner, William

    1999-01-01

    .... This thesis proposes a load-balancing firewall architecture to meet the Navy's needs. It first conducts an architectural analysis of the problem and then presents a high-level system design as a solution...

  17. Automatic evaluation of task-focused parallel jaw gripper design

    DEFF Research Database (Denmark)

    Wolniakowski, Adam; Miatliuk, Konstantsin; Krüger, Norbert

    2014-01-01

    In this paper, we suggest gripper quality metrics that indicate the performance of a gripper given an object CAD model and a task description. Those, we argue, can be used in the design and selection of an appropriate gripper when the task is known. We present three different gripper metrics that...

  18. Parallel algorithms for placement and routing in VLSI design. Ph.D. Thesis

    Science.gov (United States)

    Brouwer, Randall Jay

    1991-01-01

    The computational requirements for high quality synthesis, analysis, and verification of very large scale integration (VLSI) designs have rapidly increased with the fast growing complexity of these designs. Research in the past has focused on the development of heuristic algorithms, special purpose hardware accelerators, or parallel algorithms for the numerous design tasks to decrease the time required for solution. Two new parallel algorithms are proposed for two VLSI synthesis tasks, standard cell placement and global routing. The first algorithm, a parallel algorithm for global routing, uses hierarchical techniques to decompose the routing problem into independent routing subproblems that are solved in parallel. Results are then presented which compare the routing quality to the results of other published global routers and which evaluate the speedups attained. The second algorithm, a parallel algorithm for cell placement and global routing, hierarchically integrates a quadrisection placement algorithm, a bisection placement algorithm, and the previous global routing algorithm. Unique partitioning techniques are used to decompose the various stages of the algorithm into independent tasks which can be evaluated in parallel. Finally, results are presented which evaluate the various algorithm alternatives and compare the algorithm performance to other placement programs. Measurements are presented on the parallel speedups available.

  19. Design of an Input-Parallel Output-Parallel LLC Resonant DC-DC Converter System for DC Microgrids

    Science.gov (United States)

    Juan, Y. L.; Chen, T. R.; Chang, H. M.; Wei, S. E.

    2017-11-01

    Compared with a centralized power system, a distributed modular power system is composed of several power modules of lower power capacity that together provide sufficient capacity for the load demand. The current stress of the power components in each module can then be reduced, and the flexibility of system setup is enhanced. However, the parallel-connected power modules in a conventional system are usually controlled to share the power flow equally, which results in lower efficiency at light load. In this study, a modular power conversion system for a DC microgrid is developed with 48 V DC low-voltage input and 380 V DC high-voltage output. In the developed control strategy, the number of power modules enabled to share the power flow is decided according to the output power at low load demand. Finally, three 350 W power modules are constructed and parallel-connected to set up a modular power conversion system. The experimental results show that, compared with the conventional system, the efficiency of the developed power system at light load is greatly improved. The modular design also decreases the ratio of power loss to system capacity.
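
The paper does not spell out its exact enabling rule, but the idea of enabling only as many modules as the load requires can be sketched as a ceiling division of the load by the module rating (the 350 W rating and the 3-module count are taken from the record above; the rule itself is an illustrative assumption):

```python
import math

def modules_enabled(load_w, module_rating_w=350.0, n_modules=3):
    """Enable only as many modules as the load requires, instead of
    always sharing the load equally among all modules."""
    if load_w <= 0:
        return 0
    return min(n_modules, math.ceil(load_w / module_rating_w))
```

At 100 W only one module runs near its efficient operating point, rather than three modules each at a lightly loaded, inefficient 33 W.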

  20. Parallel imports of hospital pharmaceuticals: An empirical analysis of price effects from parallel imports and the design of procurement procedures in the Danish hospital sector

    OpenAIRE

    Hostenkamp, Gisela; Kronborg, Christian; Arendt, Jacob Nielsen

    2012-01-01

    We analyse pharmaceutical imports in the Danish hospital sector. In this market medicines are publicly tendered using first-price sealed-bid procurement auctions. We analyse whether parallel imports have an effect on pharmaceutical prices and whether the way tenders were organised matters for the competitive effect of parallel imports on prices. Our theoretical analysis shows that the design of the procurement rules affects both market structure and pharmaceutical prices. Parallel imports may...

  1. Optimal Design of Passive Power Filters Based on Pseudo-parallel Genetic Algorithm

    Science.gov (United States)

    Li, Pei; Li, Hongbo; Gao, Nannan; Niu, Lin; Guo, Liangfeng; Pei, Ying; Zhang, Yanyan; Xu, Minmin; Chen, Kerui

    2017-05-01

    The economic cost and the filtering efficiency are taken as the targets for optimizing the passive filter parameters. The method combines a pseudo-parallel genetic algorithm with an adaptive genetic algorithm: in the early stages, the pseudo-parallel genetic algorithm is introduced to increase population diversity, and in the late stages the adaptive genetic algorithm is used to reduce the workload. At the same time, the migration rate of the pseudo-parallel genetic algorithm is improved so that it changes adaptively with population diversity. Simulation results show that a filter designed by the proposed method has a better filtering effect at lower economic cost, and can be used in engineering.
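
A pseudo-parallel genetic algorithm evolves several subpopulations independently and periodically migrates good individuals between them. The following minimal sketch shows that structure on a toy one-dimensional minimization; for brevity it migrates at a fixed interval rather than adapting the migration rate to population diversity as the paper proposes:

```python
import random

def evolve(pop, fitness, mut=0.1):
    """One generation: binary tournament selection + Gaussian mutation
    (minimization: the parent with lower fitness wins)."""
    new = []
    for _ in pop:
        a, b = random.sample(pop, 2)
        parent = a if fitness(a) < fitness(b) else b
        new.append(parent + random.gauss(0, mut))
    return new

def pseudo_parallel_ga(fitness, n_sub=2, size=20, gens=60, seed=1):
    """Evolve n_sub subpopulations; every 5 generations each one
    receives its neighbour's best individual, replacing its own worst."""
    random.seed(seed)
    subs = [[random.uniform(-10, 10) for _ in range(size)]
            for _ in range(n_sub)]
    for g in range(gens):
        subs = [evolve(p, fitness) for p in subs]
        if g % 5 == 0:                                    # migration step
            for i, p in enumerate(subs):
                best = min(subs[(i + 1) % n_sub], key=fitness)
                p[p.index(max(p, key=fitness))] = best    # replace worst
    return min((x for p in subs for x in p), key=fitness)

# Toy objective with a known minimum at x = 3.
best = pseudo_parallel_ga(lambda x: (x - 3.0) ** 2)
```

The filter-design version replaces the toy objective with the weighted cost/efficiency objective and a vector of component values.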

  2. Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications

    Science.gov (United States)

    OKeefe, Matthew (Editor); Kerr, Christopher L. (Editor)

    1998-01-01

    This report contains the abstracts and technical papers from the Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications, held June 15-18, 1998, in Scottsdale, Arizona. The purpose of the workshop is to bring together software developers in meteorology and oceanography to discuss software engineering and code design issues for parallel architectures, including Massively Parallel Processors (MPP's), Parallel Vector Processors (PVP's), Symmetric Multi-Processors (SMP's), Distributed Shared Memory (DSM) multi-processors, and clusters. Issues to be discussed include: (1) code architectures for current parallel models, including basic data structures, storage allocation, variable naming conventions, coding rules and styles, i/o and pre/post-processing of data; (2) designing modular code; (3) load balancing and domain decomposition; (4) techniques that exploit parallelism efficiently yet hide the machine-related details from the programmer; (5) tools for making the programmer more productive; and (6) the proliferation of programming models (F--, OpenMP, MPI, and HPF).

  3. Optimization Design by Genetic Algorithm Controller for Trajectory Control of a 3-RRR Parallel Robot

    Directory of Open Access Journals (Sweden)

    Lianchao Sheng

    2018-01-01

    Full Text Available In order to improve the control precision and robustness of the existing proportion-integration-differentiation (PID) controller of a 3-Revolute-Revolute-Revolute (3-RRR) parallel robot, a variable-parameter PID controller optimized by a genetic algorithm is proposed in this paper. Firstly, the inverse kinematics model of the 3-RRR parallel robot was established according to the vector method, and the motor conversion matrix was deduced. Then, the integral of squared error was chosen as the fitness function, and the genetic algorithm controller was designed. Finally, the control precision of the new controller was verified through a simulation model of the 3-RRR planar parallel robot (built in SimMechanics), and its robustness was verified by adding disturbances. The results show that, compared with the traditional PID controller, the new controller designed in this paper has better control precision and robustness, which provides a basis for practical application.
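
The fitness function used above, the integral of squared error (ISE), can be illustrated on a toy first-order plant under PID control: simulate the closed loop and accumulate err² over time. A minimal sketch (the plant, gains, and step sizes are invented for illustration, not the paper's robot model):

```python
def ise(gains, setpoint=1.0, dt=0.01, steps=500):
    """Integral of squared error for a PID loop around the toy plant
    dy/dt = -y + u, integrated with the explicit Euler method."""
    kp, ki, kd = gains
    y, integ, prev_err, cost = 0.0, 0.0, setpoint, 0.0
    for _ in range(steps):
        err = setpoint - y
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv
        y += (-y + u) * dt
        prev_err = err
        cost += err * err * dt
    return cost

# A better-tuned gain set should yield a lower ISE than a sluggish one,
# which is exactly what a GA exploits when ranking candidate gains.
good = ise((8.0, 2.0, 0.1))
weak = ise((0.5, 0.1, 0.0))
```

In the paper's setting, the GA evaluates this kind of cost over the robot's trajectory-tracking simulation instead of a scalar plant.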

  4. Particularities of fully-parallel manipulators in 6-DOFs robots design: a review of critical aspects

    Directory of Open Access Journals (Sweden)

    Milica Lucian

    2017-01-01

    Full Text Available A whole range of industrial applications requires parallel mechanisms with six degrees of freedom (6-DOF), which have been developed over the last fifteen years; one of the reasons they remain a current topic is that present-day computers are capable of computing, in real time, motion laws of great complexity associated with these types of parallel mechanisms. The present work underlines the particularities of parallel manipulators and their importance in the design of 6-DOF robots. The paper reviews the progress made in the last twenty years in the development of 6-DOF parallel manipulators, which increasingly find a wide scope of applications in different industrial areas such as robotics, manufacturing, and assisted medicine. It also emphasizes the need to determine singular configurations and the effect of kinematic redundancy, which can increase the workspace of a manipulator by adding active joints in one or more of its branches. Throughout the work, three types of singularities encountered in the modelling of different types of parallel manipulators are outlined, along with three types of redundancy. Furthermore, the size of the workspace is analysed for a series of parallel manipulators, highlighting a number of factors that influence it.

  5. Parallel calculation of sensitivity derivatives for aircraft design using automatic differentiation

    Energy Technology Data Exchange (ETDEWEB)

    Bischof, C.H.; Knauff, T.L. Jr. [Argonne National Lab., IL (United States); Green, L.L.; Haigler, K.J. [National Aeronautics and Space Administration, Hampton, VA (United States). Langley Research Center

    1994-01-01

    Realistic multidisciplinary design optimization (MDO) of advanced aircraft using state-of-the-art computers is an extremely challenging problem from both the physical modelling and computer science points of view. In order to produce an efficient aircraft design, many trade-offs must be made among the various physical design variables. Similarly, in order to produce an efficient design scheme, many trade-offs must be made among the various MDO implementation options. In this paper, we examine the effects of vectorization and coarse-grained parallelization on the sensitivity derivative (SD) calculation, using a representative example taken from a transonic transport design problem.
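
Automatic differentiation as used in the record above can be illustrated in its simplest forward mode: propagate (value, derivative) pairs, i.e. dual numbers, through the arithmetic, so derivatives come out exact rather than from finite differences. A minimal sketch supporting addition and multiplication:

```python
class Dual:
    """Forward-mode automatic differentiation: carry a value and its
    derivative together, applying the sum and product rules."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def derivative(f, x):
    """Seed the input with dot = 1 and read the derivative off the output."""
    return f(Dual(x, 1.0)).dot

# d/dx (x*x + 3x) at x = 2 is 2*2 + 3 = 7.
d = derivative(lambda x: x * x + 3 * x, 2.0)
```

Tools like the ADIFOR system used in this line of work apply the same rules mechanically to entire Fortran codes.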

  6. Line filter design of parallel interleaved VSCs for high power wind energy conversion systems

    DEFF Research Database (Denmark)

    Gohil, Ghanshyamsinh Vijaysinh; Bede, Lorand; Teodorescu, Remus

    2015-01-01

    The Voltage Source Converters (VSCs) are often connected in parallel in a Wind Energy Conversion System (WECS) to match the high power rating of the modern wind turbines. The effect of the interleaved carriers on the harmonic performance of the parallel connected VSCs is analyzed in this paper...... limit. In order to achieve the desired filter performance with optimal values of the filter parameters, the use of a LC trap branch with the conventional LCL filter is proposed. The expressions for the resonant frequencies of the proposed line filter are derived and used in the design to selectively...
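
The trap branch mentioned above is tuned through the standard series-LC resonance f = 1/(2π√(LC)); the paper's derived expressions for the full LCL-plus-trap network are more involved, but the basic building block can be sketched as follows (the component values are arbitrary examples, not the paper's design):

```python
import math

def resonant_freq(l_henry, c_farad):
    """Series-LC resonant frequency: f = 1 / (2 * pi * sqrt(L * C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(l_henry * c_farad))

# Hypothetical trap values: 1 mH with 10 uF resonates near 1.59 kHz.
f = resonant_freq(1e-3, 10e-6)
```

In a trap filter the LC branch is tuned to a dominant switching harmonic so that branch presents a near-short at that frequency, letting the remaining LCL components be smaller.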

  7. On the impact of communication complexity in the design of parallel numerical algorithms

    Science.gov (United States)

    Gannon, D.; Vanrosendale, J.

    1984-01-01

    This paper describes two models of the cost of data movement in parallel numerical algorithms. One model is a generalization of an approach due to Hockney, and is suitable for shared memory multiprocessors where each processor has vector capabilities. The other model is applicable to highly parallel nonshared memory MIMD systems. In the second model, algorithm performance is characterized in terms of the communication network design. Techniques used in VLSI complexity theory are also brought in, and algorithm independent upper bounds on system performance are derived for several problems that are important to scientific computation.
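
The Hockney-style model referred to above charges a fixed startup latency plus a bandwidth term per message, T(n) = t0 + n/r; the message size n_half = t0 * r at which half the asymptotic bandwidth is achieved falls out directly, since n/T(n) = r/2 when n = t0 * r. A small sketch (the latency and bandwidth numbers are arbitrary):

```python
def transfer_time(n_bytes, latency_s, bandwidth_bps):
    """Hockney-style linear cost model: T(n) = latency + n / bandwidth."""
    return latency_s + n_bytes / bandwidth_bps

def half_performance_length(latency_s, bandwidth_bps):
    """n_1/2: the message size at which half the asymptotic
    bandwidth is achieved (achieved rate n/T(n) equals bandwidth/2)."""
    return latency_s * bandwidth_bps

t = transfer_time(1_000_000, 1e-6, 1e9)        # 1 MB at 1 us latency, 1 GB/s
n_half = half_performance_length(1e-6, 1e9)    # 1000 bytes
```

Algorithms whose typical message size sits well below n_half are latency-bound, which is exactly the regime where communication cost dominates algorithm design.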

  8. Multi-objective Design Optimization of a Parallel Schönflies-motion Robot

    DEFF Research Database (Denmark)

    Wu, Guanglei; Bai, Shaoping; Hjørnet, Preben

    2016-01-01

    The dynamic performance concerns mainly the capability of force transmission in the parallel kinematic chain, for which transmission indices are defined. The Pareto-front is obtained to investigate the influence of the design variables to the robot performance. Dynamic characteristics for three Pareto......This paper introduces a parallel Schoenflies-motion robot with rectangular workspace, which is suitable for pick-and-place operations. A multi-objective optimization problem is formulated to optimize the robot's geometric parameters with consideration of kinematic and dynamic performances...
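
The Pareto front mentioned above is simply the set of designs not dominated by any other, i.e. no other design is at least as good in every objective. A minimal extraction sketch for minimization (the sample points are invented; it assumes the points are distinct):

```python
def pareto_front(points):
    """Return the non-dominated points, minimizing all objectives.
    A point p is dominated if some other point q is <= p in every
    objective (with distinct points, q != p suffices as the check)."""
    front = []
    for p in points:
        dominated = any(
            q != p and all(q[i] <= p[i] for i in range(len(p)))
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Two toy objectives, e.g. (mass, cycle time): lower is better for both.
pts = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
front = pareto_front(pts)
```

The designer then picks one trade-off from the front, which is how the three Pareto designs in the record above would be selected.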

  9. Design and fabrication of a micro parallel mechanism system using MEMS technologies

    Science.gov (United States)

    Chin, Chi-Te

    A parallel mechanism is seen as an attractive method of fabricating a multi-degree-of-freedom micro-stage on a chip. The research team at Arizona State University has experience with several potential parallel mechanisms that could be scaled down to micron dimensions and fabricated using a silicon process. The researcher developed a micro parallel mechanism that allows planar motion with two translational motions and one rotational motion (e.g., x, y, theta). The mask design shown in Appendix B is an example of a planar parallel mechanism; however, this design would have only a few discrete positions, given that the electrostatic motor is either fully extended or fully retracted. The researcher proposes using a rotary motor (a comb-drive actuator with a gear-chain system) coupled to a rack and pinion for finer increments of linear motion. The rotary motor can behave as a stepper motor by counting drive pulses, which is the basis for a simple open-loop control system. This system was manufactured at the Central Regional MEMS Research Center (CMEMS), National Tsing-Hua University, and supported by the National Science Council, Taiwan. After the microstructures had been generated, the devices were released and an experimental study was performed to demonstrate the feasibility of the proposed micro-stage devices. In this dissertation, the micro-electromechanical systems (MEMS) fabrication technologies are introduced. The development of this parallel mechanism system initially focuses on a planar micro-stage. The design of the micro-stage builds on parallel mechanism technology developed for manufacturing, assembly, and flight simulator applications. A parallel mechanism gives the maximum operating envelope with a minimum number of silicon levels. Ideally, the proposed mechanism should comprise a user interface, a micro-stage, and a non-silicon tool, which is difficult to accomplish with current MEMS technology.

  10. Design and Nonlinear Control of a 2-DOF Flexible Parallel Humanoid Arm Joint Robot

    Directory of Open Access Journals (Sweden)

    Leijie Jiang

    2017-01-01

    Full Text Available The paper focuses on the design and nonlinear control of a humanoid wrist/shoulder joint based on a cable-driven parallel mechanism that can realize roll and pitch movements. In view of the flexible parts in the mechanism, vibration control of the flexible wrist/shoulder joint must be addressed. In this paper, a cable-driven parallel robot platform is developed for experimental study of the humanoid wrist/shoulder joint, and the dynamic model of the mechanism is formulated using the coupling theory of a flexible body's large global motion and small flexible deformation. Based on the derived dynamics, antivibration control of the joint robot is studied with a nonlinear control method. Finally, simulations and experiments were performed to validate the feasibility of the developed parallel robot prototype and the proposed control scheme.

  11. Vdebug: debugging tool for parallel scientific programs. Design report on vdebug

    International Nuclear Information System (INIS)

    Matsuda, Katsuyuki; Takemiya, Hiroshi

    2000-02-01

    We report on a debugging tool called vdebug which supports debugging of parallel scientific simulation programs. It is difficult to debug scientific programs with existing debuggers because the volume of data generated by the programs is too large for users to check in textual form; existing debuggers usually show data values as characters. To alleviate this, we have developed vdebug, which makes it possible to check the validity of large amounts of data by displaying them visually. Although vdebug had previously been restricted to sequential programs, we have made it applicable to parallel programs by realizing a function that merges and visualizes data distributed across the programs on each compute node. vdebug now works on seven kinds of parallel computers. In this report, we describe the design of vdebug. (author)

  12. Design, Dynamics, and Workspace of a Hybrid-Driven-Based Cable Parallel Manipulator

    Directory of Open Access Journals (Sweden)

    Bin Zi

    2013-01-01

    Full Text Available The design, dynamics, and workspace of a hybrid-driven-based cable parallel manipulator (HDCPM) are presented. The HDCPM is able to perform high-efficiency, heavy-load, and high-performance motion due to the advantages of both the cable parallel manipulator and the hybrid-driven planar five-bar mechanism. The design is performed according to theories of mechanism structure synthesis for cable parallel manipulators. The dynamic formulation of the HDCPM is established on the basis of the Newton-Euler method, and the workspace of the manipulator is analyzed in addition. As an example, a completely restrained HDCPM with 3 degrees of freedom is studied in simulation in order to verify the validity of the proposed design, workspace, and dynamic analysis. The simulation results, compared with the theoretical analysis and the case study previously performed, show that the manipulator design is reasonable and the mathematical models are correct, which provides the theoretical basis for the future physical prototype and control system design.

  13. A design concept of parallel elasticity extracted from biological muscles for engineered actuators.

    Science.gov (United States)

    Chen, Jie; Jin, Hongzhe; Iida, Fumiya; Zhao, Jie

    2016-08-23

    Series elastic actuation that takes inspiration from biological muscle-tendon units has been extensively studied and used to address the challenges (e.g. energy efficiency, robustness) existing in purely stiff robots. However, there also exists another form of passive property in biological actuation, parallel elasticity within muscles themselves, and our knowledge of it is limited: for example, there is still no general design strategy for the elasticity profile. When we look at nature, on the other hand, there seems a universal agreement in biological systems: experimental evidence has suggested that a concave-upward elasticity behaviour is exhibited within the muscles of animals. Seeking to draw possible design clues for elasticity in parallel with actuators, we use a simplified joint model to investigate the mechanisms behind this biologically universal preference of muscles. Actuation of the model is identified from general biological joints and further reduced with a specific focus on muscle elasticity aspects, for the sake of easy implementation. By examining various elasticity scenarios, one without elasticity and three with elasticity of different profiles, we find that parallel elasticity generally exerts contradictory influences on energy efficiency and disturbance rejection, due to the mechanical impedance shift thus caused. The trade-off analysis between them also reveals that concave parallel elasticity is able to achieve a more advantageous balance than linear and convex ones. It is expected that the results could contribute to our further understanding of muscle elasticity and provide a theoretical guideline on how to properly design parallel elasticity behaviours for engineering systems such as artificial actuators and robotic joints.

  14. Design and analysis of all-dielectric broadband nonpolarizing parallel-plate beam splitters.

    Science.gov (United States)

    Wang, Wenliang; Xiong, Shengming; Zhang, Yundong

    2007-06-01

    Past research on the all-dielectric nonpolarizing beam splitter is reviewed. With the aid of the needle thin-film synthesis method and the conjugate gradient refinement method, three nonpolarizing parallel-plate beam splitters with different split ratios are designed over a 200 nm spectral range centered at 550 nm at an incidence angle of 45 degrees. The choice of materials and the initial stack are based on the Costich and Thelen theories. The results of design and analysis show that the designs maintain a very low polarization ratio over the working range of the spectrum and have a reasonable angular field.

  15. Optical design of a reaction chamber for weakly absorbed light. II. Parallel mirrors, multitravel

    International Nuclear Information System (INIS)

    Devaney, J.J.; Finch, F.T.

    1975-06-01

    This report outlines the possibilities to be found using one or more diffraction-limited high-quality light beams to activate a weakly absorbing gas in a regime where the diffraction spread can be controlled by converging optical devices to within a ratio of √2 of the minimum at the beam waist (corresponding lengths between converging elements are within twice the Rayleigh range). Our designs use plane or cylindrical parallel mirrors down which a light beam is repeatedly reflected. In the first design variation, the beam is re-reflected up the parallel mirrors to the entrance aperture where it can be returned repeatedly for a number of multiply reflecting ''travels'' up and down the parallel mirror reaction chamber. In the second variation, the return of the beam after each multiply reflecting ''travel'' down the chamber is external to the chamber and is achieved by two mirror reflections. For diffraction control the return mirrors can be made converging. For multiple laser excitation, any of the external return mirrors can be replaced by a laser. The advantage of these designs is a high degree of uniformity of chamber illumination with a reasonably high number of passes. Drawbacks of the designs are the large space needed for beam return (many tens of meters for some parameters) and (common to all high optical quality chambers) the figuring and reflectivity demands on the mirrors. (U.S.)

  16. A conceptual design of multidisciplinary-integrated C.F.D. simulation on parallel computers

    International Nuclear Information System (INIS)

    Onishi, Ryoichi; Ohta, Takashi; Kimura, Toshiya.

    1996-11-01

    A parallel aeroelastic code for integrated aircraft simulation is designed. The method for integrating aerodynamics and structural dynamics software on parallel computers is devised using the Euler/Navier-Stokes equations coupled with wing-box finite element structures. The synthesis of a modern aircraft requires the optimization of aerodynamics, structures, controls, operability, and other design disciplines, and R and D efforts to implement Multidisciplinary Design Optimization environments using high performance computers are under way, especially among the U.S. aerospace industries. This report describes a Multiple Program Multiple Data (MPMD) parallelization of aerodynamics and structural dynamics codes with a dynamic deformation grid. A three-dimensional computation of a flowfield with dynamic deformation caused by a structural deformation is performed, and the calculated pressure data are used to compute the structural deformation, which is fed back into the fluid dynamics code. This process is repeated, exchanging the computed pressures and deformations between the flowfield grids and the structural elements. It makes it possible to simulate structural movements that take into account the interaction of fluid and structure. The conceptual design for achieving the aforementioned functions is reported. Future extensions to incorporate control systems, which would enable the simulation of a realistic aircraft configuration and make the code a major tool for Aircraft Integrated Simulation, are also investigated. (author)

  17. Design and Analysis of Cooperative Cable Parallel Manipulators for Multiple Mobile Cranes

    Directory of Open Access Journals (Sweden)

    Bin Zi

    2012-11-01

    Full Text Available This paper presents the design, dynamic modelling, and workspace analysis of cooperative cable parallel manipulators for multiple mobile cranes (CPMMCs). The CPMMCs can handle complex tasks that are more difficult or even impossible for a single mobile crane. The kinematics and dynamics of the CPMMCs are studied on the basis of geometric methodology and d'Alembert's principle, and a mathematical model of the CPMMCs is developed and presented with dynamic simulation. A constant-orientation workspace analysis of the CPMMCs is also carried out. As an example, a cooperative cable parallel manipulator for triple mobile cranes with 6 degrees of freedom is investigated on the basis of the above design objectives.

  18. Parallel power electronics filters in three-phase four-wire systems principle, control and design

    CERN Document Server

    Wong, Man-Chung; Lam, Chi-Seng

    2016-01-01

    This book describes parallel power electronic filters for 3-phase 4-wire systems, focusing on their control, design and system operation. It presents the basics of power-electronics techniques applied in power systems as well as advanced techniques for controlling, implementing and designing parallel power electronics converters. Power-quality compensation has been achieved using active filters and hybrid filters, and the circuit models, control principles and practical operation problems have been verified by theoretical study, simulation and experimental results. The state-of-the-art research findings were mainly developed by a team at the University of Macau. Offering background information and related novel techniques, this book is a valuable resource for electrical engineers and researchers wanting to work on energy saving using power-quality compensators or renewable-energy power electronics systems.

  19. Prospective Elementary School Teachers’ Views about Socioscientific Issues: A Concurrent Parallel Design Study

    OpenAIRE

    Muhammet ÖZDEN

    2015-01-01

    The purpose of this research is to examine the prospective elementary school teachers’ perceptions on socioscientific issues. The research was conducted on prospective elementary school teachers studying at a university located in western Turkey. The researcher first taught the subjects of global warming and nuclear power plants from a perspective of socioscientific issues in the science and technology education course and then conducted the research. Concurrent parallel design, one of the mi...

  20. Parallel Hybrid Gas-Electric Geared Turbofan Engine Conceptual Design and Benefits Analysis

    Science.gov (United States)

    Lents, Charles; Hardin, Larry; Rheaume, Jonathan; Kohlman, Lee

    2016-01-01

    The conceptual design of a parallel gas-electric hybrid propulsion system for a conventional single aisle twin engine tube and wing vehicle has been developed. The study baseline vehicle and engine technology are discussed, followed by results of the hybrid propulsion system sizing and performance analysis. The weights analysis for the electric energy storage & conversion system and thermal management system is described. Finally, the potential system benefits are assessed.

  1. Design of multiple sequence alignment algorithms on parallel, distributed memory supercomputers.

    Science.gov (United States)

    Church, Philip C; Goscinski, Andrzej; Holt, Kathryn; Inouye, Michael; Ghoting, Amol; Makarychev, Konstantin; Reumann, Matthias

    2011-01-01

    The challenge of comparing two or more genomes that have undergone recombination and substantial amounts of segmental loss and gain has recently been addressed for small numbers of genomes. However, datasets of hundreds of genomes are now common and their sizes will only increase in the future. Multiple sequence alignment of hundreds of genomes remains an intractable problem due to quadratic increases in compute time and memory footprint. To date, most alignment algorithms are designed for commodity clusters without parallelism. Hence, we propose the design of a multiple sequence alignment algorithm on massively parallel, distributed memory supercomputers to enable research into comparative genomics on large data sets. Following the methodology of the sequential progressiveMauve algorithm, we design data structures including sequences and sorted k-mer lists on the IBM Blue Gene/P supercomputer (BG/P). Preliminary results show that we can reduce the memory footprint so that we can potentially align over 250 bacterial genomes on a single BG/P compute node. We verify our results on a dataset of E. coli, Shigella, and S. pneumoniae genomes. Our implementation returns results matching those of the original algorithm, but in half the time and with a quarter of the memory footprint for scaffold building. In this study, we have laid the basis for multiple sequence alignment of large-scale datasets on a massively parallel, distributed memory supercomputer, thus enabling comparison of hundreds instead of a few genome sequences within reasonable time.
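As a rough sketch of the sorted k-mer list idea used for anchor finding (a hypothetical toy implementation, not the BG/P code):

```python
# Toy sketch of sorted k-mer lists (hypothetical, not the paper's data
# structures): each genome is indexed by its k-mers, and k-mers shared
# between genomes suggest candidate anchors for progressive alignment.
def sorted_kmer_list(seq, k=4):
    """Lexicographically sorted (k-mer, position) pairs for one sequence."""
    return sorted((seq[i:i + k], i) for i in range(len(seq) - k + 1))

def shared_kmers(seq_a, seq_b, k=4):
    """K-mers present in both sequences (candidate alignment anchors)."""
    a = {kmer for kmer, _ in sorted_kmer_list(seq_a, k)}
    b = {kmer for kmer, _ in sorted_kmer_list(seq_b, k)}
    return a & b

g1 = "ACGTACGGTAC"
g2 = "TTACGTACCA"
print(sorted(shared_kmers(g1, g2)))  # ['ACGT', 'CGTA', 'GTAC', 'TACG']
```

On a distributed-memory machine each node would hold the sorted lists for a subset of the genomes and exchange only the candidate anchors, which is what keeps the per-node memory footprint small.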

  2. A Novel Design of 4-Class BCI Using Two Binary Classifiers and Parallel Mental Tasks

    Directory of Open Access Journals (Sweden)

    Tao Geng

    2008-01-01

    Full Text Available A novel 4-class single-trial brain computer interface (BCI) based on two (rather than four or more) binary linear discriminant analysis (LDA) classifiers is proposed, which is called a “parallel BCI.” Unlike other BCIs where mental tasks are executed and classified in a serial way one after another, the parallel BCI uses properly designed parallel mental tasks that are executed on both sides of the subject's body simultaneously, which is the main novelty of the BCI paradigm used in our experiments. Each of the two binary classifiers only classifies the mental tasks executed on one side of the subject's body, and the results of the two binary classifiers are combined to give the result of the 4-class BCI. Data were recorded in experiments with both real movement and motor imagery in 3 able-bodied subjects. Artifacts were not detected or removed. Offline analysis has shown that, in some subjects, the parallel BCI can generate a higher accuracy than a conventional 4-class BCI, although both of them used the same feature selection and classification algorithms.
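The combination step described above reduces to a simple mapping; a minimal sketch (the class encoding below is an illustrative assumption, not the study's actual task assignment):

```python
# Two binary decisions, one from each side's LDA classifier, combine into a
# single 4-class output. The encoding below is an assumption for illustration.
def combine(left_decision, right_decision):
    """Each argument is 0 or 1; returns a class index in {0, 1, 2, 3}."""
    return 2 * left_decision + right_decision

labels = {(l, r): combine(l, r) for l in (0, 1) for r in (0, 1)}
print(labels)  # four distinct classes from two binary outputs
```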

  3. Coarse-grained parallel genetic algorithm applied to a nuclear reactor core design optimization problem

    International Nuclear Information System (INIS)

    Pereira, Claudio M.N.A.; Lapa, Celso M.F.

    2003-01-01

    This work extends the research related to genetic algorithms (GA) in core design optimization problems, whose basic investigations were presented in previous work. Here we explore the use of the Island Genetic Algorithm (IGA), a coarse-grained parallel GA model, comparing its performance to that obtained by the application of a traditional non-parallel GA. The optimization problem consists of adjusting several reactor cell parameters, such as dimensions, enrichment and materials, in order to minimize the average peak factor in a 3-enrichment-zone reactor, considering restrictions on the average thermal flux, criticality and sub-moderation. Our IGA implementation runs as a distributed application on a conventional local area network (LAN), avoiding the use of expensive parallel computers or architectures. After exhaustive experiments, taking more than 1500 h on 550 MHz personal computers, we observed that the IGA provided gains not only in terms of computational time, but also in the optimization outcome. Besides, we also realized that, for this kind of problem, whose fitness evaluation is itself time consuming, the time overhead in the IGA due to communication on LANs is practically imperceptible, leading to the conclusion that the use of expensive parallel computers or architectures can be avoided.
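An island genetic algorithm of the kind described can be sketched in a few lines. This is an illustrative toy (one-dimensional objective, ring migration), not the reactor-design code:

```python
import random

random.seed(1)

def fitness(x):
    """Toy objective standing in for the (expensive) core evaluation."""
    return -(x - 3.0) ** 2          # maximum at x = 3

def evolve(pop, n_gen=20):
    """Truncation selection plus Gaussian mutation on one island."""
    for _ in range(n_gen):
        pop = sorted(pop, key=fitness, reverse=True)[:len(pop) // 2]
        pop = pop + [p + random.gauss(0.0, 0.1) for p in pop]
    return pop

# Four islands evolve independently; every epoch each island's best migrates
# to the next island in a ring, replacing that island's worst individual.
islands = [[random.uniform(-10.0, 10.0) for _ in range(20)] for _ in range(4)]
for _ in range(5):
    islands = [evolve(pop) for pop in islands]
    best = [max(pop, key=fitness) for pop in islands]
    for i, pop in enumerate(islands):
        pop[pop.index(min(pop, key=fitness))] = best[i - 1]

best_overall = max((max(pop, key=fitness) for pop in islands), key=fitness)
print(best_overall)  # should converge toward the optimum at 3.0
```

In a LAN deployment each island would run as a separate process and only the migrants would cross the network, which is why the communication overhead stays negligible when fitness evaluation dominates.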

  4. A fast pulse design for parallel excitation with gridding conjugate gradient.

    Science.gov (United States)

    Feng, Shuo; Ji, Jim

    2013-01-01

    Parallel excitation (pTx) is recognized as a crucial technique in high-field MRI for addressing the transmit field inhomogeneity problem. However, designing pTx pulses can be undesirably time consuming. In this work, we propose a pulse design method using gridding and conjugate gradient (CG), based on the small-tip-angle approximation. The two major time-consuming matrix-vector multiplications are replaced by two operators that involve only FFT and gridding. Simulation results show that the proposed method is 3 times faster than the conventional method while the memory cost is reduced by a factor of 1000.

  5. Optimizing trial design in pharmacogenetics research: comparing a fixed parallel group, group sequential, and adaptive selection design on sample size requirements.

    Science.gov (United States)

    Boessen, Ruud; van der Baan, Frederieke; Groenwold, Rolf; Egberts, Antoine; Klungel, Olaf; Grobbee, Diederick; Knol, Mirjam; Roes, Kit

    2013-01-01

    Two-stage clinical trial designs may be efficient in pharmacogenetics research when there is some but inconclusive evidence of effect modification by a genomic marker. Two-stage designs allow early stopping for efficacy or futility and can offer the additional opportunity to enrich the study population to a specific patient subgroup after an interim analysis. This study compared sample size requirements for fixed parallel group, group sequential, and adaptive selection designs with equal overall power and control of the family-wise type I error rate. The designs were evaluated across scenarios that defined the effect sizes in the marker positive and marker negative subgroups and the prevalence of marker positive patients in the overall study population. Effect sizes were chosen to reflect realistic planning scenarios, where at least some effect is present in the marker negative subgroup. In addition, scenarios were considered in which the assumed 'true' subgroup effects (i.e., the postulated effects) differed from those hypothesized at the planning stage. As expected, both two-stage designs generally required fewer patients than a fixed parallel group design, and the advantage increased as the difference between subgroups increased. The adaptive selection design added little further reduction in sample size, as compared with the group sequential design, when the postulated effect sizes were equal to those hypothesized at the planning stage. However, when the postulated effects deviated strongly in favor of enrichment, the comparative advantage of the adaptive selection design increased, which precisely reflects the adaptive nature of the design. Copyright © 2013 John Wiley & Sons, Ltd.
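For reference, the fixed parallel-group baseline that such two-stage designs are compared against has a standard closed-form sample size for a continuous endpoint; a small sketch of that generic formula (not the paper's exact planning scenarios):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Per-group n for a two-arm parallel design with a continuous endpoint:
    n = 2 * (z_{1-alpha/2} + z_{power})^2 * (sigma / delta)^2."""
    z = NormalDist().inv_cdf
    return ceil(2.0 * (z(1.0 - alpha / 2.0) + z(power)) ** 2
                * (sigma / delta) ** 2)

print(n_per_group(delta=0.5, sigma=1.0))  # 63 per group at 80% power
```

Two-stage designs undercut this fixed n on average because interim stopping and enrichment let part of the planned sample be spent only when it is still needed.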

  6. Fully Decoupled Compliant Parallel Mechanism: a New Solution for the Design of Multidimensional Accelerometer

    Directory of Open Access Journals (Sweden)

    Zhen GAO

    2010-08-01

    Full Text Available In this paper, a novel multidimensional accelerometer is proposed based on a fully decoupled compliant parallel mechanism. Three separate chains, which serve as the elastic body, are perpendicular to each other, sensing kinetic information in different directions without a decoupling process. As the crucial part of the whole sensor structure, the revolute and prismatic joints in the three pairwise-orthogonal branches of the parallel mechanism are manufactured from aluminium alloy as flexure-hinge-based compliant joints. The structure development is first introduced, followed by a comprehensive finite-element analysis including the strain of the sensitive legs, modal analysis of the total deformation at different frequencies, and the harmonic response performance. Then, shape optimization is conducted to remove unnecessary material. Compliance optimization with a particle swarm algorithm is implemented to redesign the dimensions of the sensitive legs. The research supplies a new viewpoint for the mechanical design of physical sensors, especially acceleration sensors.

  7. Optimum design of 6-DOF parallel manipulator with translational/rotational workspaces for haptic device application

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Jung Won; Hwang, Yoon Kwon [Gyeongsang National University, Jinju (Korea, Republic of); Ryu, Je Ha [Gwangju Institute of Science and Technology, Gwangju (Korea, Republic of)

    2010-05-15

    This paper proposes an optimum design method that satisfies the desired orientation workspace at the boundary of the translation workspace while maximizing the mechanism isotropy for parallel manipulators. A simple genetic algorithm is used to obtain the optimal linkage parameters of a six-degree-of-freedom parallel manipulator that can be used as a haptic device. The objective function is composed of a desired spherical shape translation workspace and a desired orientation workspace located on the boundaries of the desired translation workspace, along with a global conditioning index based on a homogeneous Jacobian matrix. The objective function was optimized to satisfy the desired orientation workspace at the boundary positions as translated from a neutral position of the increased entropy mechanism. An optimization result with desired translation and orientation workspaces for a haptic device was obtained to show the effectiveness of the suggested scheme, and the kinematic performances of the proposed model were compared with those of a preexisting base model

  8. Optimum design of 6-DOF parallel manipulator with translational/rotational workspaces for haptic device application

    International Nuclear Information System (INIS)

    Yoon, Jung Won; Hwang, Yoon Kwon; Ryu, Je Ha

    2010-01-01

    This paper proposes an optimum design method that satisfies the desired orientation workspace at the boundary of the translation workspace while maximizing the mechanism isotropy for parallel manipulators. A simple genetic algorithm is used to obtain the optimal linkage parameters of a six-degree-of-freedom parallel manipulator that can be used as a haptic device. The objective function is composed of a desired spherical shape translation workspace and a desired orientation workspace located on the boundaries of the desired translation workspace, along with a global conditioning index based on a homogeneous Jacobian matrix. The objective function was optimized to satisfy the desired orientation workspace at the boundary positions as translated from a neutral position of the increased entropy mechanism. An optimization result with desired translation and orientation workspaces for a haptic device was obtained to show the effectiveness of the suggested scheme, and the kinematic performances of the proposed model were compared with those of a preexisting base model

  9. Acceleration of cardiovascular MRI using parallel imaging: basic principles, practical considerations, clinical applications and future directions

    International Nuclear Information System (INIS)

    Niendorf, T.; Sodickson, D.

    2006-01-01

    Cardiovascular Magnetic Resonance (CVMR) imaging has proven to be of clinical value for non-invasive diagnostic imaging of cardiovascular diseases. CVMR requires rapid imaging; however, the speed of conventional MRI is fundamentally limited due to its sequential approach to image acquisition, in which data points are collected one after the other in the presence of sequentially-applied magnetic field gradients. Parallel imaging techniques instead use arrays of radiofrequency coils to acquire multiple data points simultaneously, and thereby increase imaging speed and efficiency beyond the limits of purely gradient-based approaches. The resulting improvements in imaging speed can be used in various ways, including shortening long examinations, improving spatial resolution and anatomic coverage, improving temporal resolution, enhancing image quality, overcoming physiological constraints, detecting and correcting for physiologic motion, and streamlining work flow. Examples of these strategies will be provided in this review, after some of the fundamentals of parallel imaging methods now in use for cardiovascular MRI are outlined. The emphasis will rest upon basic principles and clinical state-of-the-art cardiovascular MRI applications. In addition, practical aspects such as signal-to-noise ratio considerations, tailored parallel imaging protocols and potential artifacts will be discussed, and current trends and future directions will be explored. (orig.)
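One practical aspect mentioned above, the signal-to-noise cost of acceleration, follows a well-known rule of thumb: SNR falls by the square root of the acceleration factor R, further divided by the coil geometry factor g. A minimal sketch with assumed values:

```python
from math import sqrt

def parallel_snr(snr_full, R, g):
    """SNR after parallel-imaging acceleration: SNR_full / (g * sqrt(R))."""
    return snr_full / (g * sqrt(R))

# Assumed values: fully sampled SNR of 100, acceleration R = 4, g-factor 1.2.
print(parallel_snr(100.0, R=4, g=1.2))  # about 41.7
```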

  10. Design of a real-time wind turbine simulator using a custom parallel architecture

    Science.gov (United States)

    Hoffman, John A.; Gluck, R.; Sridhar, S.

    1995-01-01

    The design of a new parallel-processing digital simulator is described. The new simulator has been developed specifically for real-time analysis of wind energy systems. The new processor has been named the Wind Energy System Time-domain simulator, version 3 (WEST-3). Like previous WEST versions, WEST-3 performs many computations in parallel; the modules in WEST-3 are pure digital processors, however. These digital processors can be programmed individually and operated in concert to achieve real-time simulation of wind turbine systems. Because of this programmability, WEST-3 is much more flexible and general than its two predecessors. The design features of WEST-3 are described to show how the system produces high-speed solutions of nonlinear time-domain equations. WEST-3 has two very fast Computational Units (CUs) that use minicomputer technology plus special architectural features that make them many times faster than a microcomputer. These CUs are needed to perform, in real time, the complex computations associated with the wind turbine rotor system. The parallel architecture of the CU allows several tasks to be done in each cycle, including an I/O operation and a combined multiply, add, and store. The WEST-3 simulator can be expanded at any time for additional computational power. This is possible because the CUs are interfaced to each other and to other portions of the simulator using special serial buses. These buses can be 'patched' together in essentially any configuration (in a manner very similar to the programming methods used in analog computation) to balance the input/output requirements. CUs can be added in any number to share a given computational load. This flexible bus feature is very different from that of many other parallel processors, which usually have a throughput limit because of a rigid bus architecture.

  11. Molecular diagnosis of glycogen storage disease and disorders with overlapping clinical symptoms by massive parallel sequencing.

    Science.gov (United States)

    Vega, Ana I; Medrano, Celia; Navarrete, Rosa; Desviat, Lourdes R; Merinero, Begoña; Rodríguez-Pombo, Pilar; Vitoria, Isidro; Ugarte, Magdalena; Pérez-Cerdá, Celia; Pérez, Belen

    2016-10-01

    Glycogen storage disease (GSD) is an umbrella term for a group of genetic disorders that involve the abnormal metabolism of glycogen; to date, 23 types of GSD have been identified. The nonspecific clinical presentation of GSD and the lack of specific biomarkers mean that Sanger sequencing is now widely relied on for making a diagnosis. However, this gene-by-gene sequencing technique is both laborious and costly, a consequence of the number of genes to be sequenced and the large size of some of them. This work reports the use of massively parallel sequencing to diagnose patients at our laboratory in Spain using either a customized gene panel (targeted exome sequencing) or the Illumina Clinical-Exome TruSight One Gene Panel (clinical exome sequencing, CES). Sequence variants were matched against biochemical and clinical hallmarks. Pathogenic mutations were detected in 23 patients. Twenty-two mutations were recognized (mostly loss-of-function mutations), including 11 that were novel in GSD-associated genes. In addition, CES detected five patients with mutations in ALDOB, LIPA, NKX2-5, CPT2, or ANO5. Although these genes are not involved in GSD, they are associated with overlapping phenotypic characteristics such as hepatic, muscular, and cardiac dysfunction. These results show that next-generation sequencing, in combination with the detection of biochemical and clinical hallmarks, provides an accurate, high-throughput means of making genetic diagnoses of GSD and related diseases. Genet Med 18(10), 1037-1043.

  12. Classical and adaptive clinical trial designs using ExpDesign Studio

    National Research Council Canada - National Science Library

    Chang, Mark

    2008-01-01

    ... Relationship; 2.2.9 Parallel Design; 2.2.10 Crossover Design; 2.2.11 Factorial Design; Selection of a Trial Design; 2.3.1 Balanced Versus Unbalanced Designs; 2.3.2 Crossover Versus Parallel ...

  13. Design of parallel dual-energy X-ray beam and its performance for security radiography

    International Nuclear Information System (INIS)

    Kim, Kwang Hyun; Myoung, Sung Min; Chung, Yong Hyun

    2011-01-01

    A new concept of dual-energy X-ray beam generation and acquisition of dual-energy security radiography is proposed. Erbium (Er) and rhodium (Rh) with a copper filter were positioned in front of the X-ray tube to generate low- and high-energy X-ray spectra. The low- and high-energy X-rays were guided to enter two parallel detectors separately. The Monte Carlo code MCNPX was used to derive the optimum thickness of each filter for improved dual X-ray image quality. The goal was to provide the ability to separate organic and inorganic matter under the 140 kVp/0.8 mA conditions used in security applications. The acquired dual-energy X-ray beams were evaluated by a dual-energy Z-map, yielding enhanced performance compared with a commercial dual-energy detector. A collimator for the parallel dual-energy X-ray beam was designed to minimize beam interference between the low- and high-energy parallel beams for a 500 mm source-to-detector distance.
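The organic/inorganic separation works because low- and high-energy attenuation scale differently with effective atomic number. A simplified single-material sketch (the attenuation coefficients are assumed values for illustration, not the paper's data):

```python
import math

def de_ratio(mu_low, mu_high, thickness):
    """Ratio of low- to high-energy attenuation, ln(I0/I_low) / ln(I0/I_high).
    In this simple Beer-Lambert model the thickness cancels out."""
    i_low = math.exp(-mu_low * thickness)    # transmitted low-energy intensity
    i_high = math.exp(-mu_high * thickness)  # transmitted high-energy intensity
    return math.log(1.0 / i_low) / math.log(1.0 / i_high)

organic = de_ratio(mu_low=0.25, mu_high=0.20, thickness=2.0)    # low-Z material
inorganic = de_ratio(mu_low=0.90, mu_high=0.40, thickness=2.0)  # high-Z material
print(organic, inorganic)  # the high-Z material shows the larger ratio
```

A Z-map is essentially this ratio computed per pixel from the two parallel detector images, then mapped to an effective atomic number.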

  14. Field Programmable Gate Array Based Parallel Strapdown Algorithm Design for Strapdown Inertial Navigation Systems

    Directory of Open Access Journals (Sweden)

    Long-Hua Ma

    2011-08-01

    Full Text Available A new generalized optimum strapdown algorithm with coning and sculling compensation is presented, in which the position, velocity and attitude updating operations are carried out based on a single-speed structure in which all computations are executed at a single updating rate that is sufficiently high to accurately account for high-frequency angular rate and acceleration rectification effects. Unlike existing algorithms, the updating rates of the coning and sculling compensations are independent of the number of gyro incremental-angle samples and accelerometer incremental-velocity samples. When the output sampling rate of the inertial sensors remains constant, this algorithm allows the updating rate of the coning and sculling compensation to be increased, using more gyro incremental-angle and accelerometer incremental-velocity samples, in order to improve system accuracy. Then, in order to implement the new strapdown algorithm in a single FPGA chip, a parallelization of the algorithm is designed and its computational complexity is analyzed. The performance of the proposed parallel strapdown algorithm is tested on the Xilinx ISE 12.3 software platform and the FPGA device XC6VLX550T hardware platform using fighter flight data. It is shown that, relative to an existing DSP implementation, the parallel strapdown algorithm on the FPGA platform greatly decreases the execution time, meeting the real-time and high-precision requirements of the system in highly dynamic environments.

  15. Design of a family of integrated parallel co-processors for images processing

    International Nuclear Information System (INIS)

    Court, Thierry

    1991-01-01

    The design of parallel image processing systems that join sophisticated microprocessors and specialised operators in the same architecture is a difficult task because of the various problems to be taken into account. The current study identifies a way of realizing such dedicated operators and interfacing them to a microprocessor-based central unit. The two guidelines of this work are the search for polyvalent, reconfigurable specialized operators and their connection to a system bus rather than to specialized video buses. This work proposes an architecture of circuits dedicated to image processing and two proposals for realizing them, one of which was implemented in this study using silicon compiler tools. This work belongs to a larger project whose aim is the development of a high-performance, modular industrial image processing system based on the parallelization, in MIMD structures, of an elementary autonomous image processing unit integrating a microprocessor equipped with a parallel coprocessor suited to image processing. (author) [fr

  16. Design and Control of Parallel Three Phase Voltage Source Inverters in Low Voltage AC Microgrid

    Directory of Open Access Journals (Sweden)

    El Hassane Margoum

    2017-01-01

    Full Text Available The design and hierarchical control of parallel three-phase voltage source inverters (VSIs) are developed in this paper. The control scheme is based on the synchronous reference frame and consists of primary and secondary control levels. The primary control consists of the droop control and the virtual output impedance loops. This control level is designed to share the active and reactive power correctly between the connected VSIs, in order to avoid undesired circulating currents and overload of the connected VSIs. The secondary control is designed to eliminate the magnitude and frequency deviations caused by the primary control. The control structure is validated through dynamic simulations. The obtained results demonstrate the effectiveness of the control structure.
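The power-sharing behaviour of the primary droop loop can be illustrated with a steady-state sketch (the droop gains and load below are assumed values, not from the paper):

```python
# Frequency droop: f_i = f_nom - m_i * P_i. In steady state all parallel
# inverters settle at the same frequency, so active power is shared in
# inverse proportion to the droop gains. All values below are assumptions.
f_nom = 50.0            # Hz, nominal frequency
m1, m2 = 0.001, 0.002   # Hz/W droop gains of the two inverters
P_load = 3000.0         # W, total active load

# Solve f_nom - m1*P1 == f_nom - m2*P2 together with P1 + P2 == P_load:
P1 = P_load * m2 / (m1 + m2)
P2 = P_load * m1 / (m1 + m2)
f = f_nom - m1 * P1
print(P1, P2, f)  # the inverter with the smaller gain carries more load
```

The frequency offset this leaves behind (f below f_nom) is exactly what the secondary control level is there to remove.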

  17. Efficient method to design RF pulses for parallel excitation MRI using gridding and conjugate gradient.

    Science.gov (United States)

    Feng, Shuo; Ji, Jim

    2014-04-01

    Parallel excitation (pTx) techniques with multiple transmit channels have been widely used in high-field MRI to shorten the RF pulse duration and/or reduce the specific absorption rate (SAR). However, the efficiency of pulse design still needs substantial improvement for practical real-time applications. In this paper, we present a detailed description of a fast pulse design method based on Fourier-domain gridding and a conjugate gradient method. Simulation results show that the proposed method can design pTx pulses with an efficiency 10 times higher than that of the conventional conjugate-gradient-based method, without reducing the accuracy of the desired excitation patterns.
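The speed-up comes from never forming the system matrix: conjugate gradient only needs the operator, which FFT-based gridding applies in O(n log n). A generic sketch of that pattern (a circulant stand-in operator is assumed here, not an actual pTx design problem):

```python
import numpy as np

# Illustrative matrix-free CG: the system matrix is A = C^H C for a circulant
# C, so applying A is a pointwise multiply in the Fourier domain.
rng = np.random.default_rng(0)
n = 64
c = np.exp(-np.arange(n) / 4.0)        # assumed convolution kernel
spectrum = np.abs(np.fft.fft(c)) ** 2  # eigenvalues of A (all positive)

def apply_A(x):
    """Symmetric positive definite operator applied via FFT, O(n log n)."""
    return np.real(np.fft.ifft(spectrum * np.fft.fft(x)))

def cg(apply_op, b, tol=1e-10, max_iter=500):
    """Conjugate gradient using only the operator, never the matrix."""
    x = np.zeros_like(b)
    r = b - apply_op(x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = apply_op(p)
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

b = rng.standard_normal(n)
x = cg(apply_A, b)
print(np.linalg.norm(apply_A(x) - b))  # residual should be tiny
```

In the pTx setting the dense design matrix would be far too large to store, so both the memory saving and the speed-up reported above follow from this operator substitution.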

  18. Design and Control System of a Modular Parallel Robot for Medical Applications

    Directory of Open Access Journals (Sweden)

    Florin Covaciu

    2015-06-01

    Full Text Available Brachytherapy (BT), a cancer treatment method, is a type of internal radiation therapy in which radiation doses (seeds) are placed inside the tumor, aiming to destroy only the cancerous cells without affecting the surrounding healthy tissue. For a successful brachytherapy procedure, accurate placement of the radiation seeds is an important issue, which is why a robotic system has been built for this task. The paper presents the design of a parallel robotic system for brachytherapy procedures, together with the architecture of its control system and its implementation.

  19. Parallel LC circuit model for multi-band absorption and preliminary design of radiative cooling.

    Science.gov (United States)

    Feng, Rui; Qiu, Jun; Liu, Linhua; Ding, Weiqiang; Chen, Lixue

    2014-12-15

    We perform a comprehensive analysis of multi-band absorption by exciting magnetic polaritons in the infrared region. Based on the independent properties of the magnetic polaritons, we propose a parallel inductance-capacitance (PLC) circuit model to explain and predict the multi-band resonant absorption peaks, which is fully validated using a multi-sized structure with an identical dielectric spacing layer and a multilayer structure with the same strip width. More importantly, we present an application of the PLC circuit model to the preliminary design of a radiative cooling structure, realized by merging several close peaks together. This omnidirectional and polarization-insensitive structure is a good candidate for radiative cooling applications.
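The mapping the PLC model relies on is the elementary LC resonance, one branch per absorption peak; a sketch with purely illustrative component values:

```python
from math import pi, sqrt

def resonance_hz(L, C):
    """Resonant frequency of one LC branch: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * pi * sqrt(L * C))

# Each absorption peak corresponds to one branch; values are illustrative
# only and do not come from the paper.
branches = [(1e-12, 1e-15), (2e-12, 1e-15), (4e-12, 1e-15)]  # (L in H, C in F)
peaks = [resonance_hz(L, C) for L, C in branches]
print(["%.3e" % p for p in peaks])  # larger L -> lower resonance frequency
```

Because the branches are in parallel and independent, tuning one geometric parameter shifts one peak without disturbing the others, which is what makes merging several close peaks for radiative cooling tractable.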

  20. Design of a highly parallel board-level-interconnection with 320 Gbps capacity

    Science.gov (United States)

    Lohmann, U.; Jahns, J.; Limmer, S.; Fey, D.; Bauer, H.

    2012-01-01

    A parallel board-level interconnection design is presented consisting of 32 channels, each operating at 10 Gbps. The hardware uses available optoelectronic components (VCSELs, TIAs, pin diodes) and a combination of planar-integrated free-space optics, fiber bundles and available MEMS components, like the DMD™ from Texas Instruments. As a specific feature, we present a new modular inter-board interconnect, realized by 3D fiber-matrix connectors. The performance of the interconnect is evaluated with regard to optical properties and power consumption. Finally, we discuss the application of the interconnect for strongly distributed system architectures, as, for example, in high-performance embedded computing systems and data centers.

  1. Modeling, analysis, and design of stationary reference frame droop controlled parallel three-phase voltage source inverters

    DEFF Research Database (Denmark)

    Vasquez, Juan Carlos; Guerrero, Josep M.; Savaghebi, Mehdi

    2013-01-01

    Power electronics based MicroGrids consist of a number of voltage source inverters (VSIs) operating in parallel. In this paper, the modeling, control design, and stability analysis of parallel connected three-phase VSIs are derived. The proposed voltage and current inner control loops and the mat... ... control restores the frequency and amplitude deviations produced by the primary control. Also, a synchronization algorithm is presented in order to connect the MicroGrid to the grid. Experimental results are provided to validate the performance and robustness of the parallel VSI system control...
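
    The primary control mentioned above is the conventional droop law, which lets parallel VSIs share load without communication by trading frequency against active power and voltage amplitude against reactive power. A hedged sketch with illustrative 50 Hz setpoints and gains (not the paper's tuned values):

```python
import math

def droop(P_w, Q_var, w0=2 * math.pi * 50, E0=311.0, m=1e-4, n=1e-3):
    """P-omega / Q-E droop: w = w0 - m*P, E = E0 - n*Q.
    Secondary control would later restore w and E to their setpoints;
    gains m, n and setpoints w0, E0 here are illustrative only."""
    return w0 - m * P_w, E0 - n * Q_var
```

    Two inverters with equal droop gains settle at equal active-power share, because a common steady-state frequency forces equal m*P terms.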

  2. DATA TRANSFER IN THE AUTOMATED SYSTEM OF PARALLEL DESIGN AND CONSTRUCTION

    Directory of Open Access Journals (Sweden)

    Volkov Andrey Anatol'evich

    2012-12-01

    Full Text Available This article covers data transfer processes in the automated system of parallel design and construction. The authors consider the structure of reports used by contractors and clients when large-scale projects are implemented. All necessary items of information are grouped into three levels, and each level is described by certain attributes. The authors draw particular attention to the integrated operational schedule, as it is the main tool of project management. Some recommendations concerning the forms and the content of reports are presented. Integrated automation of all operations is a necessary condition for the successful implementation of the new concept. The technical aspect of the notion of parallel design and construction also includes the client-server infrastructure that brings together all processes implemented by the parties involved in the projects. This approach should be taken into consideration in the course of review of existing codes and standards to eliminate any inconsistency between the construction legislation and the practical experience of the engineers involved in the process.

  3. Teaching ethics to engineers: ethical decision making parallels the engineering design process.

    Science.gov (United States)

    Bero, Bridget; Kuhlman, Alana

    2011-09-01

    In order to fulfill ABET requirements, Northern Arizona University's Civil and Environmental Engineering programs incorporate professional ethics in several of their engineering courses. This paper discusses an ethics module in a third-year engineering design course that focuses on the design process and technical writing. Engineering students early in their academic careers generally possess good black/white critical thinking skills on technical issues. Engineering design is the first time students are exposed to "grey" technical problems, which admit multiple possible solutions. To identify and solve these problems, the engineering design process is used. Ethical problems are also "grey" problems and present similar challenges to students, who need a practical tool for solving them. The step-wise engineering design process was used as a model to demonstrate a similar process for ethical situations. The ethical decision-making process of Martin and Schinzinger was adapted to parallel the design process and presented to students as a step-wise technique for identifying the pertinent ethical issues, relevant moral theories, possible outcomes and a final decision. Students had the greatest difficulty identifying the broader, global issues presented in an ethical situation, but by the end of the module were better able not only to identify the broader issues, but also to more comprehensively assess specific issues, generate solutions and a desired response to the issue.

  4. Parameters Design for a Parallel Hybrid Electric Bus Using Regenerative Brake Model

    Directory of Open Access Journals (Sweden)

    Zilin Ma

    2014-01-01

    Full Text Available A design methodology that uses a regenerative brake model is introduced to determine the major system parameters of a parallel hybrid electric bus drive train. The hybrid system parameters mainly include the power rating of the internal combustion engine (ICE), the gear ratios of the transmission, the power rating and maximum torque of the motor, and the power and capacity of the battery. The regenerative model is built into the vehicle model to estimate the regenerative energy under real road conditions. The design target is to ensure that the vehicle meets the specified performance, such as speed and acceleration, while operating the ICE within the expected speed range. Several pairs of parameters are selected from the result analysis, and the fuel saving result in the road test shows that a 25% reduction in fuel consumption is achieved.
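
    The regenerative-energy estimate at the heart of the methodology can be approximated per braking interval as a capped fraction of the kinetic-energy drop. The sketch below is a simplified stand-in for the paper's regenerative brake model; every number in it is illustrative:

```python
def regen_energy_j(speeds_mps, mass_kg, dt_s=1.0, eta=0.6, p_max_w=60e3):
    """Rough estimate of the energy recoverable by regenerative braking over
    a sampled speed trace: a fraction eta of each deceleration's kinetic
    energy drop, capped by the motor/battery power limit. The efficiency,
    power limit and time step are placeholder values, not the paper's."""
    recovered = 0.0
    for v0, v1 in zip(speeds_mps, speeds_mps[1:]):
        if v1 < v0:                                     # braking interval
            de = 0.5 * mass_kg * (v0 ** 2 - v1 ** 2)    # kinetic energy released
            recovered += min(eta * de, p_max_w * dt_s)  # power-limited recovery
    return recovered
```

    Feeding a full drive cycle through such an estimate is what lets the motor and battery ratings be sized against the recoverable braking energy.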

  5. Design and Implementation of a New DELTA Parallel Robot in Robotics Competitions

    Directory of Open Access Journals (Sweden)

    Jonqlan Lin

    2015-10-01

    Full Text Available This investigation concerns the design and implementation of the DELTA parallel robot, covering the entire mechatronic process, involving kinematics, control design and optimization methods. To accelerate the construction of the robot, 3D printing is used to fabricate end-effector parts. The parts are modular, low-cost, reconfigurable and can be assembled in less time than is required for conventionally fabricated parts. The controller, including the control algorithm and human-machine interface (HMI, is coded using the Borland C++ Builder 6 Personal software environment. The integration of the motion controller with image recognition into an opto-mechatronics system is presented. The robot system has been entered into robotic competitions in Taiwan. The experimental results reveal that the proposed DELTA robot completed the tasks in those competitions successfully.

  6. Mechatronic Design of a New Humanoid Robot with Hybrid Parallel Actuation

    Directory of Open Access Journals (Sweden)

    Vítor Santos

    2012-10-01

    Full Text Available Humanoid robotics is unquestionably a challenging and long-term field of research. Of the numerous and most urgent challenges to tackle, autonomous and efficient locomotion may possibly be the most underdeveloped at present in the research community. Therefore, to pursue studies in relation to autonomy with efficient locomotion, the authors have been developing a new teen-sized humanoid platform with hybrid characteristics. The hybrid nature is clear in the mixed actuation based on common electrical motors and passive actuators attached in parallel to the motors. This paper presents the mechatronic design of the humanoid platform, focusing mainly on the mechanical structure, the design and simulation of the hybrid joints, and the different subsystems implemented. Trying to keep the appropriate human proportions and main degrees of freedom, the developed platform utilizes a distributed control architecture and a rich set of sensing capabilities, both ripe for future development and research.

  7. Optimal design of a spherical parallel manipulator based on kinetostatic performance using evolutionary techniques

    Energy Technology Data Exchange (ETDEWEB)

    Daneshmand, Morteza [University of Tartu, Tartu (Estonia); Saadatzi, Mohammad Hossein [Colorado School of Mines, Golden (United States); Kaloorazi, Mohammad Hadi [École de Technologie Supérieure, Montréal (Canada); Masouleh, Mehdi Tale [University of Tehran, Tehran (Iran, Islamic Republic of); Anbarjafari, Gholamreza [Hasan Kalyoncu University, Gaziantep (Turkey)

    2016-03-15

    This study aims to provide an optimal design for a spherical parallel manipulator (SPM), namely, the Agile Eye. This aim is approached by investigating kinetostatic performance and workspace and searching for the most promising design. Previously recommended designs are examined to determine whether they provide acceptable kinetostatic performance and workspace. Optimal designs are provided according to different kinetostatic performance indices, especially kinematic sensitivity. The optimization process is launched based on the concept of the genetic algorithm. A single-objective process is implemented in accordance with the guidelines of an evolutionary algorithm called differential evolution. A multi-objective procedure is then provided following the reasoning of the nondominated sorting genetic algorithm-II. This process results in several sets of Pareto points for reconciliation between kinetostatic performance indices and workspace. The concept of numerous kinetostatic performance indices and the results of the optimization algorithms are elaborated. The conclusions provide hints on the provided set of designs and their credibility to provide a well-conditioned workspace and acceptable kinetostatic performance for the SPM under study, which can be well extended to other types of SPMs.

  8. A design procedure for the phase-controlled parallel-loaded resonant inverter

    Science.gov (United States)

    King, Roger J.

    1989-01-01

    High-frequency-link power conversion and distribution based on a resonant inverter (RI) has recently been proposed. The design of several topologies is reviewed, and a simple approximate design procedure is developed for the phase-controlled parallel-loaded RI. This design procedure seeks to ensure the benefits of resonant conversion and is verified by data from a laboratory 2.5-kVA, 20-kHz converter. A simple phasor analysis is introduced as a useful approximation for design purposes; the load is considered to be a linear impedance (or an AC current sink). Predictable worst-case ratings are also obtained for each component of the resonant tank circuit and the inverter switches. For a given load VA requirement, below-resonance operation is found to result in a significantly lower tank VA requirement. Under transient conditions such as a load short-circuit, a reversal of the expected commutation sequence is possible.
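
    The phasor approximation described above treats the tank as a series inductor feeding a capacitor in parallel with a linear load resistance. A sketch of that gain calculation (component values chosen near the paper's 20-kHz operating point, but otherwise arbitrary):

```python
import math

def tank_gain(f_hz, L, C, R_load):
    """Phasor voltage gain of a parallel-loaded resonant tank: a series
    inductor L feeding capacitor C in parallel with load resistance R_load
    (linear-load approximation). Component values are the caller's choice."""
    w = 2 * math.pi * f_hz
    z_l = 1j * w * L                       # series inductor impedance
    z_p = 1 / (1j * w * C + 1 / R_load)    # C parallel with R_load
    return z_p / (z_l + z_p)               # complex voltage-divider gain
```

    Near resonance and with a light load the gain magnitude rises well above unity (roughly the loaded Q), while far above resonance the tank attenuates, which is the behavior the approximate design procedure exploits.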

  9. Design of mechanical coxa joints based on three-degree-of-freedom spherical parallel manipulators

    International Nuclear Information System (INIS)

    Li, Yanbiao; Ji, Shiming; Wang, Zhongfei; Jin, Mingsheng; Liu, Yi; Jin, Zhenlin

    2013-01-01

    We address the design of mechanical coxa joints based on three-degree-of-freedom spherical parallel manipulators, using a parameter-statistics optimization method based on index atlases. The coxa joints have the advantages of high payload, high accuracy, and good technological efficiency. The design and prototyping first develop the direct and inverse displacement equations from the layout of the mechanical coxa joints. Then, the shapes of the constant-orientation workspace of the joints are described, and the effects of the design parameters on the workspace volume are studied quantitatively. The next step deals with the graphical representation of the atlases that illustrate the relationship between the performance evaluation indices and the design parameters, based on the kinematics and torque analysis of the joints. Finally, the geometric parameters of the coxa joints are obtained by the parameter-statistics optimization method based on the index atlases. Considering assembly conditions, a design scheme for the mechanical coxa joints is developed, which provides a theoretical basis for the application of the mechanical coxa joints.

  10. Design, fabrication and characterization of a micro-fluxgate intended for parallel robot application

    Science.gov (United States)

    Kirchhoff, M. R.; Bogdanski, G.; Büttgenbach, S.

    2009-05-01

    This paper presents a micro-magnetometer based on the fluxgate principle. Fluxgates detect the magnitude and direction of DC and low-frequency AC magnetic fields. The detectable flux density typically ranges from a few tens of nanotesla to about 1 mT. The introduced fluxgate sensor is fabricated using MEMS technologies, basically UV depth lithography and electroplating for manufacturing high-aspect-ratio structures. It consists of helical copper coils around a soft magnetic nickel-iron (NiFe) core. The core is designed in a so-called racetrack geometry, whereby the directional sensitivity of the sensor is considerably higher compared to common ring-core fluxgates. The electrical operation is based on analyzing the 2nd harmonic of the AC output signal. Configuration, manufacturing and selected characteristics of the fluxgate magnetometer are discussed in this work. The fluxgate forms the basis of an innovative angular sensor system for a parallel robot with HEXA structure. Integrated into the passive joints of the parallel robot, the fluxgates are combined with permanent magnets rotating on the joint shafts. The magnet transmits the angular information via its magnetic orientation. In this way, the angles between the kinematic elements are measured, which allows self-calibration of the robot and a fast analytical solution of the direct kinematics for advanced workspace monitoring.
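
    Second-harmonic readout, as used above, amounts to measuring the output component at twice the excitation frequency. A single-bin DFT sketch (our illustration, not the sensor's actual electronics):

```python
import math

def second_harmonic(samples, f_drive_hz, fs_hz):
    """Amplitude of the component at 2*f_drive in a sampled output signal,
    via a single-bin DFT. Assumes the record spans an integer number of
    drive periods so the bins are orthogonal."""
    n = len(samples)
    k = 2.0 * f_drive_hz * n / fs_hz       # bin index of the 2nd harmonic
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    im = -sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    return 2.0 * math.hypot(re, im) / n    # single-sided amplitude
```

    In a fluxgate, this 2nd-harmonic amplitude is (over the linear range) proportional to the external field along the core axis.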

  11. Development of a parallel genetic algorithm using MPI and its application in a nuclear reactor core. Design optimization

    International Nuclear Information System (INIS)

    Waintraub, Marcel; Pereira, Claudio M.N.A.; Baptista, Rafael P.

    2005-01-01

    This work presents the development of a distributed parallel genetic algorithm applied to a nuclear reactor core design optimization. The parallelism is implemented with the Message Passing Interface (MPI) library, the standard for parallel computation on distributed-memory platforms; another important characteristic of MPI is its portability across architectures. The main objectives of this paper are: validation of the results obtained by the application of this algorithm to a nuclear reactor core optimization problem, through comparisons with previous results presented by Pereira et al.; and a performance test of the Brazilian Nuclear Engineering Institute (IEN) cluster on reactor physics optimization problems. The experiments demonstrated that the developed parallel genetic algorithm using the MPI library yielded significant gains in the obtained results and a marked reduction of the processing time. Such results support the use of parallel genetic algorithms for the solution of nuclear reactor core optimization problems. (author)
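
    A distributed GA of this kind is typically organized as islands that evolve independently and periodically exchange (migrate) their best individuals, with MPI point-to-point calls carrying the migrants. The stdlib-only sketch below mimics that structure serially; the operators and parameters are illustrative, not the authors':

```python
import random

def evolve_island(pop, fitness, gens=30, pm=0.2):
    """One island of a distributed GA (minimization): elitism, binary
    tournament selection and gaussian mutation. In the MPI version each
    island would run in its own process."""
    for _ in range(gens):
        nxt = [min(pop, key=fitness)[:]]          # elitism: keep island best
        while len(nxt) < len(pop):
            a, b = random.sample(pop, 2)
            child = min(a, b, key=fitness)[:]     # tournament of two
            if random.random() < pm:
                i = random.randrange(len(child))
                child[i] += random.gauss(0.0, 0.1)
            nxt.append(child)
        pop = nxt
    return pop

def migrate(islands, fitness):
    """Ring migration: each island's worst individual is replaced by the
    previous island's best (stands in for an MPI send/receive exchange)."""
    bests = [min(p, key=fitness)[:] for p in islands]
    for i, p in enumerate(islands):
        worst = max(range(len(p)), key=lambda j: fitness(p[j]))
        p[worst] = bests[i - 1]
    return islands
```

    With elitism inside each island and migration replacing only the worst individual, the best fitness found so far can never regress between rounds.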

  12. Clinic exam room design: present and future.

    Science.gov (United States)

    Freihoefer, Kara; Nyberg, Gary; Vickery, Christine

    2013-01-01

    This article aims to deconstruct various design qualities and strategies of clinic exam rooms, and discuss how they influence users' interaction and behavior in the space. Relevant literature supports the advantages and disadvantages of different design strategies. Annotated exam room prototypes illustrate the design qualities and strategies discussed. Advancements in technology and medicine, along with new legislative policies, are influencing the way care providers deliver care and ultimately clinic exam room designs. The patient-centered medical home model has encouraged primary care providers to make patients more active leaders of their health plan which will influence the overall functionality and configuration of clinic exam rooms. Specific design qualities discussed include overall size, location of doors and privacy curtains, positioning of exam tables, influence of technology in the consultation area, types of seating, and placement of sink and hand sanitizing dispensers. In addition, future trends of exam room prototypes are presented. There is a general lack of published evidence to support design professionals' design solutions for outpatient exam rooms. Future research should investigate such topics as the location of exam tables and privacy curtains as they relate to patient privacy; typical size and location of consultation table as it relates to patient connection and communication; and placement of sinks and sanitization dispensers as they relate to frequency and patterns of usage. Keywords: literature review, outpatient, technology, visual privacy.

  13. A Novel Technique for Design of Ultra High Tunable Electrostatic Parallel Plate RF MEMS Variable Capacitor

    Science.gov (United States)

    Baghelani, Masoud; Ghavifekr, Habib Badri

    2017-12-01

    This paper introduces a novel method for the design of low-actuation-voltage, high-tuning-ratio electrostatic parallel plate RF MEMS variable capacitors. With the proposed method it is feasible to achieve ultra-high tuning ratios, well beyond the 1.5:1 barrier imposed by the pull-in effect. The method is based on spring strengthening of the structure just before the unstable region. Spring strengthening is realized by embedding dimples of precise height on the spring arms; these dimples shorten the effective spring length when they reach the substrate. With the proposed method, tuning ratios as high as 7.5:1 are attainable considering only four dimple sets. The required actuation voltage for this high tuning ratio is 14.33 V, which is readily achievable on-chip by charge pump circuits. The Brownian noise effect is also discussed, and the mechanical natural frequency of the structure is calculated.
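
    The 1.5:1 barrier cited above follows from the classic parallel-plate pull-in relations: stable travel stops at one third of the gap, so C(g/3)/C(0) = 1.5. A textbook sketch (standard formulas, not the paper's strengthened-spring model):

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def pull_in_voltage(k, g, A):
    """Classic pull-in voltage of a parallel-plate electrostatic actuator:
    V_pi = sqrt(8*k*g^3 / (27*eps0*A)), with spring constant k (N/m),
    gap g (m) and plate area A (m^2)."""
    return math.sqrt(8 * k * g ** 3 / (27 * EPS0 * A))

def capacitance(g, A, x=0.0):
    """Plate capacitance after the movable plate has travelled x into
    the gap: C = eps0 * A / (g - x)."""
    return EPS0 * A / (g - x)
```

    Stiffening the spring near x = g/3, as the paper's dimples do, pushes the instability point deeper into the gap and therefore past the 1.5:1 ratio. The parameter values in the test below are illustrative.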

  14. The design and performance of the parallel multiprocessor nuclear physics data acquisition system, DAPHNE

    International Nuclear Information System (INIS)

    Welch, L.C.; Moog, T.H.; Daly, R.T.; Videbaek, F.

    1987-05-01

    The ever increasing complexity of nuclear physics experiments places severe demands on computerized data acquisition systems. A natural evolution of these systems, taking advantage of the independent nature of ''events,'' is to use identical parallel microcomputers in a front end to simultaneously analyze separate events. Such a system has been developed at Argonne to serve the needs of the experimental program of ATLAS, a new superconducting heavy-ion accelerator, and other on-going research. Using microcomputers based on the National Semiconductor 32016 microprocessor housed in a Multibus I cage, CPU power equivalent to several VAXs is obtained at a fraction of the cost of one VAX. The front end interfaces to a VAX 11/750 on which an extensive user-friendly command language based on DCL resides. The whole system, known as DAPHNE, also provides the means to replay data using the same command language. Design concepts, data structures, performance, and experience to date are discussed

  15. The design, creation, and performance of the parallel multiprocessor nuclear physics data acquisition system, DAPHNE

    International Nuclear Information System (INIS)

    Welch, L.C.; Moog, T.H.; Daly, R.T.; Videbaek, F.

    1986-01-01

    The ever increasing complexity of nuclear physics experiments places severe demands on computerized data acquisition systems. A natural evolution of these systems, taking advantage of the independent nature of ''events,'' is to use identical parallel microcomputers in a front end to simultaneously analyze separate events. Such a system has been developed at Argonne to serve the needs of the experimental program of ATLAS, a new superconducting heavy-ion accelerator, and other on-going research. Using microcomputers based on the National Semiconductor 32016 microprocessor housed in a Multibus I cage, multi-VAX CPU power is obtained at a fraction of the cost of one VAX. The front end interfaces to a VAX 750 on which an extensive user-friendly command language based on DCL resides. The whole system, known as DAPHNE, also provides the means to replay data using the same command language. Design concepts, data structures, performance, and experience to date are discussed. 5 refs., 2 figs

  16. From experiment to design -- Fault characterization and detection in parallel computer systems using computational accelerators

    Science.gov (United States)

    Yim, Keun Soo

    This dissertation summarizes experimental validation and co-design studies conducted to optimize the fault detection capabilities and overheads in hybrid computer systems (e.g., using CPUs and Graphics Processing Units, or GPUs), and consequently to improve the scalability of parallel computer systems using computational accelerators. The experimental validation studies were conducted to help us understand the failure characteristics of CPU-GPU hybrid computer systems under various types of hardware faults. The main characterization targets were faults that are difficult to detect and/or recover from, e.g., faults that cause long latency failures (Ch. 3), faults in dynamically allocated resources (Ch. 4), faults in GPUs (Ch. 5), faults in MPI programs (Ch. 6), and microarchitecture-level faults with specific timing features (Ch. 7). The co-design studies were based on the characterization results. One of the co-designed systems has a set of source-to-source translators that customize and strategically place error detectors in the source code of target GPU programs (Ch. 5). Another co-designed system uses an extension card to learn the normal behavioral and semantic execution patterns of message-passing processes executing on CPUs, and to detect abnormal behaviors of those parallel processes (Ch. 6). The third co-designed system is a co-processor that has a set of new instructions in order to support software-implemented fault detection techniques (Ch. 7). The work described in this dissertation gains more importance because heterogeneous processors have become an essential component of state-of-the-art supercomputers. GPUs were used in three of the five fastest supercomputers that were operating in 2011. Our work included comprehensive fault characterization studies in CPU-GPU hybrid computers. 
In CPUs, we monitored the target systems for a long period of time after injecting faults (a temporally comprehensive experiment), and injected faults into various types of

  17. Layout design and energetic analysis of a complex diesel parallel hybrid electric vehicle

    International Nuclear Information System (INIS)

    Finesso, Roberto; Spessa, Ezio; Venditti, Mattia

    2014-01-01

    Highlights: • Layout design, energetic and cost analysis of complex parallel hybrid vehicles. • Development of global and real-time optimizers for control strategy identification. • Rule-based control strategies to minimize fuel consumption and NOx. • Energy share across each working mode for battery and thermal engine. - Abstract: The present paper is focused on the design, optimization and analysis of a complex parallel hybrid electric vehicle, equipped with two electric machines on both the front and rear axles, and on the evaluation of its potential to reduce fuel consumption and NOx emissions over several driving missions. The vehicle has been compared with two conventional parallel hybrid vehicles, equipped with a single electric machine on the front axle or on the rear axle, as well as with a conventional vehicle. All the vehicles have been equipped with compression ignition engines. The optimal layout of each vehicle was identified on the basis of the minimization of the overall powertrain costs during the whole vehicle life. These costs include the initial investment due to the production of the components as well as the operating costs related to fuel consumption and to battery depletion. Identification of the optimal powertrain control strategy, in terms of the management of the power flows of the engine and electric machines, and of gear selection, is necessary in order to be able to fully exploit the potential of the hybrid architecture. To this end, two global optimizers, one of a deterministic nature and another of a stochastic type, and two real-time optimizers have been developed, applied and compared. A new mathematical technique has been developed and applied to the vehicle simulation model in order to decrease the computational time of the optimizers. First, the vehicle model equations were written in order to allow a coarse time grid to be used, then, the control variables (i.e., power flow and gear number) were discretized, and the

  18. Parallel rendering

    Science.gov (United States)

    Crockett, Thomas W.

    1995-01-01

    This article provides a broad introduction to the subject of parallel rendering, encompassing both hardware and software systems. The focus is on the underlying concepts and the issues which arise in the design of parallel rendering algorithms and systems. We examine the different types of parallelism and how they can be applied in rendering applications. Concepts from parallel computing, such as data decomposition, task granularity, scalability, and load balancing, are considered in relation to the rendering problem. We also explore concepts from computer graphics, such as coherence and projection, which have a significant impact on the structure of parallel rendering algorithms. Our survey covers a number of practical considerations as well, including the choice of architectural platform, communication and memory requirements, and the problem of image assembly and display. We illustrate the discussion with numerous examples from the parallel rendering literature, representing most of the principal rendering methods currently used in computer graphics.
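
    As a concrete instance of the image-space (sort-first) data decomposition discussed above, the screen can be statically split into near-equal horizontal bands, one per renderer. A sketch (our illustration, not any particular system's scheme):

```python
def tile_decompose(width, height, n_workers):
    """Sort-first decomposition: split the screen into horizontal bands,
    one per worker, with heights differing by at most one row (a static
    load-balancing scheme; dynamic schemes would resize bands per frame)."""
    rows = [height // n_workers + (1 if i < height % n_workers else 0)
            for i in range(n_workers)]
    tiles, y = [], 0
    for r in rows:
        tiles.append((0, y, width, r))  # (x, y, width, height) of each band
        y += r
    return tiles
```

    Task granularity and load balance trade off here: more, smaller tiles balance uneven scene complexity better but raise the cost of image assembly.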

  19. Software Design Challenges in Time Series Prediction Systems Using Parallel Implementation of Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Narayanan Manikandan

    2016-01-01

    Full Text Available The software development life cycle has been characterized by destructive disconnects between activities like planning, analysis, design, and programming. Software developed to produce prediction-based results is a particular challenge for designers. Time series forecasting, as in currency exchange, stock prices, and weather reports, is an area where extensive research has been going on for the last three decades. In the early days, problems in financial analysis and prediction were solved by statistical models and methods. Over the last two decades, a large number of Artificial Neural Network based learning models have been proposed to solve the problems of financial data and obtain accurate results in the prediction of future trends and prices. This paper addresses architectural design issues for performance improvement by combining the strengths of multivariate econometric time series models and Artificial Neural Networks. It provides an adaptive, hybrid methodology for predicting exchange rates. The framework is tested for the accuracy and performance of the parallel algorithms used.
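
    The hybrid idea, a linear time-series model plus a second learner fitted to its residuals, can be sketched in miniature. Below, stage two is a simple least-squares residual model standing in for the paper's Artificial Neural Network; all parameter choices are illustrative:

```python
def fit_lag(y, x):
    """Least-squares slope through the origin: argmin_b sum (y_i - b*x_i)^2."""
    return sum(a * b for a, b in zip(y, x)) / sum(b * b for b in x)

def hybrid_forecast(series):
    """Two-stage hybrid sketch: fit AR(1) to the series, then fit the
    AR(1) residuals on the lag-2 value. In the paper's framework the
    residual learner would be an ANN rather than this linear model."""
    y = series[2:]
    x1 = series[1:-1]                               # lag-1 values
    x2 = series[:-2]                                # lag-2 values
    phi = fit_lag(y, x1)                            # stage 1: AR(1)
    resid = [a - phi * b for a, b in zip(y, x1)]
    theta = fit_lag(resid, x2)                      # stage 2: residual model
    preds = [phi * a + theta * b for a, b in zip(x1, x2)]
    return preds, phi, theta
```

    Because stage two is a least-squares fit that includes theta = 0 as a candidate, the in-sample error of the hybrid can never exceed that of the stage-one model alone.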

  20. Overview of development and design of MPACT: Michigan parallel characteristics transport code

    Energy Technology Data Exchange (ETDEWEB)

    Kochunas, B.; Collins, B.; Jabaay, D.; Downar, T. J.; Martin, W. R. [Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2200 Bonisteel, Ann Arbor, MI 48109 (United States)

    2013-07-01

    MPACT (Michigan Parallel Characteristics Transport Code) is a new reactor analysis tool. It is being developed by students and research staff at the University of Michigan to be used for an advanced pin-resolved transport capability within VERA (Virtual Environment for Reactor Analysis). VERA is the end-user reactor simulation tool being produced by the Consortium for the Advanced Simulation of Light Water Reactors (CASL). The MPACT development project is itself unique for the way it is changing how students do research to achieve the instructional and research goals of an academic institution, while providing immediate value to industry. The MPACT code makes use of modern lean/agile software processes and extensive testing to maintain a level of productivity and quality required by CASL. MPACT's design relies heavily on object-oriented programming concepts and design patterns and is programmed in Fortran 2003. These designs are explained and illustrated as to how they can be readily extended to incorporate new capabilities and research ideas in support of academic research objectives. The transport methods currently implemented in MPACT include the 2-D and 3-D method of characteristics (MOC) and 2-D and 3-D method of collision direction probabilities (CDP). For the cross section resonance treatment, presently the subgroup method and the new embedded self-shielding method (ESSM) are implemented within MPACT. (authors)

  1. Spine device clinical trials: design and sponsorship.

    Science.gov (United States)

    Cher, Daniel J; Capobianco, Robyn A

    2015-05-01

    Multicenter prospective randomized clinical trials represent the best evidence to support the safety and effectiveness of medical devices. Industry sponsorship of multicenter clinical trials is purported to lead to bias. To determine what proportion of spine device-related trials are industry-sponsored and the effect of industry sponsorship on trial design. Analysis of data from a publicly available clinical trials database. Clinical trials of spine devices registered on ClinicalTrials.gov, a publicly accessible trial database, were evaluated in terms of design, number and location of study centers, and sample size. The relationship between trial design characteristics and study sponsorship was evaluated using logistic regression and general linear models. One thousand six hundred thirty-eight studies were retrieved from ClinicalTrials.gov using the search term "spine." Of the 367 trials that focused on spine surgery, 200 (54.5%) specifically studied devices for spine surgery and 167 (45.5%) focused on other issues related to spine surgery. Compared with nondevice trials, device trials were far more likely to be sponsored by the industry (74% vs. 22.2%, odds ratio (OR) 9.9 [95% confidence interval 6.1-16.3]). Industry-sponsored device trials were more likely multicenter (80% vs. 29%, OR 9.8 [4.8-21.1]) and had approximately four times as many participating study centers than devices not sponsored by the industry. Most device-related spine research is industry-sponsored. Multicenter trials are more likely to be industry-sponsored. These findings suggest that previously published studies showing larger effect sizes in industry-sponsored vs. nonindustry-sponsored studies may be biased as a result of failure to take into account the marked differences in design and purpose. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. A New XYZ Compliant Parallel Mechanism for Micro-/Nano-Manipulation: Design and Analysis

    Directory of Open Access Journals (Sweden)

    Haiyang Li

    2016-02-01

    Full Text Available Based on the constraint and position identification (CPI) approach for synthesizing XYZ compliant parallel mechanisms (CPMs) and configuration modifications, this paper proposes a new fully-symmetrical XYZ CPM with desired motion characteristics such as reduced cross-axis coupling, minimized lost motion, and relatively small parasitic motion. The good motion characteristics arise not only from its symmetric configuration, but also from the rigid linkages between non-adjacent rigid stages. Comprehensive kinematic analysis is carried out based on a series of finite element simulations over a motion range per axis of less than ±5% of the beam length, which reveals that the maximum cross-axis coupling rate is less than 0.86%, the maximum lost motion rate is less than 1.20%, the parasitic rotations of the motion stage (MS) are on the order of 10−5 rad, and the parasitic translations of the three actuated stages (ASs) are on the order of 10−4 of the beam length (less than 0.3% of the motion range), where the beam slenderness ratio is larger than 20. Furthermore, the nonlinear analytical models of the primary translations of the XYZ CPM, including the primary translations of the MS and the ASs, are derived and validated to enable quick design synthesis. Moreover, two practical design schemes of the proposed XYZ CPM are discussed with consideration of manufacturability. The practical designs enable the XYZ CPM to be employed in many applications such as micro-/nano-positioning, micro-/nano-manufacturing and micro-/nano-assembly. Finally, a spatial high-precision translational system is presented based on the practical design schemes, taking actuator and sensor integration into account.

  3. Online optimal experimental re-design in robotic parallel fed-batch cultivation facilities.

    Science.gov (United States)

    Cruz Bournazou, M N; Barz, T; Nickel, D B; Lopez Cárdenas, D C; Glauche, F; Knepper, A; Neubauer, P

    2017-03-01

    We present an integrated framework for the online optimal experimental re-design applied to parallel nonlinear dynamic processes that aims to precisely estimate the parameter set of macro kinetic growth models with minimal experimental effort. This provides a systematic solution for rapid validation of a specific model to new strains, mutants, or products. In biosciences, this is especially important as model identification is a long and laborious process which is continuing to limit the use of mathematical modeling in this field. The strength of this approach is demonstrated by fitting a macro-kinetic differential equation model for Escherichia coli fed-batch processes after 6 h of cultivation. The system includes two fully-automated liquid handling robots; one containing eight mini-bioreactors and another used for automated at-line analyses, which allows for the immediate use of the available data in the modeling environment. As a result, the experiment can be continually re-designed while the cultivations are running using the information generated by periodical parameter estimations. The advantages of an online re-computation of the optimal experiment are proven by a 50-fold lower average coefficient of variation on the parameter estimates compared to the sequential method (4.83% instead of 235.86%). The success obtained in such a complex system is a further step towards a more efficient computer aided bioprocess development. Biotechnol. Bioeng. 2017;114: 610-619. © 2016 Wiley Periodicals, Inc.

  4. The Business of Research in Art and Design: Parallels Between Research Centres and Small Businesses

    Directory of Open Access Journals (Sweden)

    Seymour Roworth-Stokes

    2013-01-01

    Full Text Available This article provides a cross-case analysis of four art and design research centres operating within UK universities. Findings from autobiographical and semi-structured interviews with researchers, research managers, and research leaders indicate that they encounter similar issues in trying to establish internal legitimacy within the university alongside the need to gain external support and recognition. In dealing with these challenges, art and design research centres tend to pass through four broadly identifiable phases: (i) Origination (utilising credentials and leadership capacity), (ii) Establishment (securing resources and embedding dedicated systems and processes), (iii) Development (furthering profile, diversifying, and retaining autonomy), and (iv) Sustainability (enhancing research culture, networks, and influence). Many interesting parallels are evident with the way small businesses strive to establish themselves within competitive market environments. Lessons for research managers and directors are explored to consider such similarities in key areas of responsibility that cover leadership, managing people and processes, developing organisational capacity, and building external networks. The research suggests that research centre directors must demonstrate many intrapreneurial qualities to overcome obstacles in the development of a successful research team, and that university departments can make substantial organisational interventions to help them succeed.

  5. Design and control of a decoupled two degree of freedom translational parallel micro-positioning stage.

    Science.gov (United States)

    Lai, Lei-Jie; Gu, Guo-Ying; Zhu, Li-Min

    2012-04-01

    This paper presents a novel decoupled two degrees of freedom (2-DOF) translational parallel micro-positioning stage. The stage consists of a monolithic compliant mechanism driven by two piezoelectric actuators. The end-effector of the stage is connected to the base by four independent kinematic limbs. Two types of compound flexure module are serially connected to provide 2-DOF for each limb. The compound flexure modules and the mirror-symmetric distribution of the four limbs significantly reduce the input and output cross couplings and the parasitic motions. Based on the stiffness matrix method, static and dynamic models are constructed and optimal design is performed under certain constraints. The finite element analysis results are then given to validate the design model, and a prototype of the XY stage is fabricated for performance tests. Open-loop tests show that maximum static and dynamic cross couplings between the two linear motions are below 0.5% and -45 dB, which are low enough to utilize single-input-single-output control strategies. Finally, according to the identified dynamic model, an inversion-based feedforward controller in conjunction with a proportional-integral-derivative controller is applied to compensate for the nonlinearities and uncertainties. The experimental results show that good positioning and tracking performances are achieved, which verifies the effectiveness of the proposed mechanism and controller design. The resonant frequencies of the loaded stage at 2 kg and 5 kg are 105 Hz and 68 Hz, respectively. Therefore, the performance of the stage is reasonably good in terms of the 200 N load capacity. © 2012 American Institute of Physics.
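
The control structure described above, model-inversion feedforward combined with a PID feedback loop, can be sketched as follows. Everything below is a hedged illustration: the first-order plant x' = -a*x + b*u, the gains, and the time step are assumed values for demonstration, not the authors' identified stage model, and the derivative gain is set to zero for simplicity.

```python
# Hedged sketch: inversion-based feedforward plus a PID feedback loop on an
# assumed first-order plant x' = -a*x + b*u (illustrative values throughout).

def simulate(setpoint=1.0, steps=200, dt=5e-4):
    a, b = 50.0, 2000.0            # assumed plant parameters
    kp, ki, kd = 0.2, 100.0, 0.0   # assumed PID gains (derivative off)
    x, integ, prev_err = 0.0, 0.0, 0.0
    for _ in range(steps):
        err = setpoint - x
        integ += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        u_ff = a * setpoint / b                        # plant inversion at DC
        u = u_ff + kp * err + ki * integ + kd * deriv  # feedforward + PID
        x += (-a * x + b * u) * dt                     # forward-Euler step
    return x

print(simulate())  # approaches the 1.0 setpoint
```

The feedforward term supplies the steady-state input directly, so the integrator only has to correct model error, which is the usual motivation for combining inversion-based feedforward with feedback.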

  6. Prospective Elementary School Teachers’ Views about Socioscientific Issues: A Concurrent Parallel Design Study

    Directory of Open Access Journals (Sweden)

    Muhammet ÖZDEN

    2015-06-01

    Full Text Available The purpose of this research is to examine prospective elementary school teachers’ perceptions of socioscientific issues. The research was conducted on prospective elementary school teachers studying at a university located in western Turkey. The researcher first taught the subjects of global warming and nuclear power plants from a perspective of socioscientific issues in the science and technology education course and then conducted the research. Concurrent parallel design, one of the mixed-method research approaches, was used to conduct the research. In this context, semi-structured interviews were conducted with eight teachers in the qualitative strand of the study to explore the phenomenon. The data obtained from the interviews were analyzed using thematic analysis. During the quantitative strand of the research, 113 prospective teachers were administered a questionnaire form. The results of the study revealed that none of the participating prospective teachers mentioned the religious and cultural characteristics of socioscientific issues, and that they need training in how to use socioscientific issues in teaching.

  8. SEJITS: embedded specializers to turn patterns-based designs into optimized parallel code

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    All software should be parallel software. This is a natural result of the transition to a many-core world. For a small fraction of the world's programmers (efficiency programmers), this is not a problem. They enjoy mapping algorithms onto the details of a particular system and are well served by low-level languages and OpenMP, MPI, or OpenCL. Most programmers, however, are "domain specialists" who write code. They are too busy working in their domain of choice (such as physics) to master the intricacies of each computer they use. How do we make these programmers productive without giving up performance? We have been working with a team at UC Berkeley's ParLab to address this problem. The key is a clear software architecture expressed in terms of design patterns that exposes the concurrency in a problem. The resulting code is written using a patterns-based framework within a high-level, productivity language (such as Python). Then a separate system is used by a small group o...

  9. An improved design of virtual output impedance loop for droop-controlled parallel three-phase Voltage Source Inverters

    DEFF Research Database (Denmark)

    Wang, Xiongfei; Blaabjerg, Frede; Chen, Zhe

    2012-01-01

    The virtual output impedance loop is known as an effective way to enhance the load sharing stability and quality of droop-controlled parallel inverters. This paper proposes an improved design of virtual output impedance loop for parallel three-phase voltage source inverters. In the approach, ...-sequence virtual resistance even in the case of feeding a balanced three-phase load. Furthermore, to adapt to the variety of unbalanced loads, a dynamically-tuned negative-sequence resistance loop is designed, such that a good compromise between the quality of inverter output voltage and the performance of load sharing can be obtained. Finally, laboratory test results of two parallel three-phase voltage source inverters are shown to confirm the validity of the proposed method.

  10. Fisher information and Cramér-Rao lower bound for experimental design in parallel imaging.

    Science.gov (United States)

    Bouhrara, Mustapha; Spencer, Richard G

    2018-06-01

    The Cramér-Rao lower bound (CRLB) is widely used in the design of magnetic resonance (MR) experiments for parameter estimation. Previous work has considered only Gaussian or Rician noise distributions in this calculation. However, the noise distribution for multi-coil acquisitions, such as in parallel imaging, obeys the noncentral χ-distribution under many circumstances. The purpose of this paper is to present the CRLB calculation for parameter estimation from multi-coil acquisitions. We perform explicit calculations of Fisher matrix elements and the associated CRLB for noise distributions following the noncentral χ-distribution. The special case of diffusion kurtosis is examined as an important example. For comparison with analytic results, Monte Carlo (MC) simulations were conducted to evaluate experimental minimum standard deviations (SDs) in the estimation of diffusion kurtosis model parameters. Results were obtained for a range of signal-to-noise ratios (SNRs), and for both the conventional case of Gaussian noise distribution and noncentral χ-distribution with different numbers of coils, m. At low-to-moderate SNR, the noncentral χ-distribution deviates substantially from the Gaussian distribution. Our results indicate that this departure is more pronounced for larger values of m. As expected, the minimum SDs (i.e., CRLB) in derived diffusion kurtosis model parameters assuming a noncentral χ-distribution provided a closer match to the MC simulations as compared to the Gaussian results. Estimates of minimum variance for parameter estimation and experimental design provided by the CRLB must account for the noncentral χ-distribution of noise in multi-coil acquisitions, especially in the low-to-moderate SNR regime. Magn Reson Med 79:3249-3255, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
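
The basic CRLB computation that this record generalizes can be sketched for the simple Gaussian-noise case. The sketch below uses a toy mono-exponential signal model S(b) = S0*exp(-b*D) with assumed S0, D, noise level, and b-values (not the paper's diffusion-kurtosis model or noncentral χ noise): it builds the 2×2 Fisher matrix from the model's parameter derivatives and inverts it to get the minimum attainable standard deviations.

```python
import math

# Hedged sketch: CRLB under the *Gaussian* noise assumption for a toy
# mono-exponential model S(b) = S0*exp(-b*D). All values are illustrative.

def crlb(S0=1000.0, D=1.0e-3, sigma=20.0, bvals=(0, 500, 1000, 2000)):
    # Fisher matrix: I_ij = (1/sigma^2) * sum_k dS/dtheta_i * dS/dtheta_j
    I00 = I01 = I11 = 0.0
    for b in bvals:
        s = math.exp(-b * D)
        dS_dS0 = s              # partial derivative w.r.t. S0
        dS_dD = -b * S0 * s     # partial derivative w.r.t. D
        I00 += dS_dS0 * dS_dS0
        I01 += dS_dS0 * dS_dD
        I11 += dS_dD * dS_dD
    I00 /= sigma**2; I01 /= sigma**2; I11 /= sigma**2
    # CRLB = diagonal of the inverse Fisher matrix (2x2 inverse by hand)
    det = I00 * I11 - I01 * I01
    return math.sqrt(I11 / det), math.sqrt(I00 / det)

sd_S0, sd_D = crlb()
print(sd_S0, sd_D)  # minimum attainable SDs for S0 and D
```

The record's contribution is replacing the Gaussian likelihood in this calculation with the noncentral χ likelihood appropriate to multi-coil data, which changes the derivative terms but not the overall Fisher-matrix-inversion structure.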

  11. Incorporating alternative design clinical trials in network meta-analyses

    Directory of Open Access Journals (Sweden)

    Thorlund K

    2014-12-01

    Full Text Available Kristian Thorlund,1–3 Eric Druyts,1,4 Kabirraaj Toor,1,5 Jeroen P Jansen,1,6 Edward J Mills1,3 1Redwood Outcomes, Vancouver, BC, 2Department of Clinical Epidemiology and Biostatistics, McMaster University, Hamilton, ON, Canada; 3Stanford Prevention Research Center, Stanford University, Stanford, CA, USA; 4Department of Medicine, Faculty of Medicine, 5School of Population and Public Health, Faculty of Medicine, University of British Columbia, Vancouver, BC, Canada; 6Department of Public Health and Community Medicine, Tufts University, Boston, MA, USA Introduction: Network meta-analysis (NMA is an extension of conventional pairwise meta-analysis that allows for simultaneous comparison of multiple interventions. Well-established drug class efficacies have become commonplace in many disease areas. Thus, for reasons of ethics and equipoise, it is not practical to randomize patients to placebo or older drug classes. Unique randomized clinical trial designs are an attempt to navigate these obstacles. These alternative designs, however, pose challenges when attempting to incorporate data into NMAs. Using ulcerative colitis as an example, we illustrate an example of a method where data provided by these trials are used to populate treatment networks. Methods: We present the methods used to convert data from the PURSUIT trial into a typical parallel design for inclusion in our NMA. Data were required for three arms: golimumab 100 mg; golimumab 50 mg; and placebo. Golimumab 100 mg induction data were available; however, data regarding those individuals who were nonresponders at induction and those who were responders at maintenance were not reported, and as such, had to be imputed using data from the rerandomization phase. Golimumab 50 mg data regarding responses at week 6 were not available. Existing relationships between the available components were used to impute the expected proportions in this missing subpopulation. Data for placebo maintenance
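
The conversion described, populating a parallel-design arm from a re-randomized responder trial, amounts to combining conditional proportions. The sketch below uses hypothetical rates (not the PURSUIT data): the overall end-of-maintenance response is the induction response rate times the maintenance response among responders, plus the complement times the response among non-responders.

```python
# Illustrative sketch with assumed numbers (not PURSUIT data): marginalizing
# conditional response proportions to form an "as if parallel" arm.

def parallel_response(p_induction, p_maint_given_resp, p_maint_given_nonresp):
    # Law of total probability over induction response status.
    return (p_induction * p_maint_given_resp
            + (1 - p_induction) * p_maint_given_nonresp)

p = parallel_response(0.51, 0.50, 0.10)   # hypothetical example rates
print(p)
```

In practice, as the record notes, some of these conditional proportions are unreported and must themselves be imputed from the re-randomization phase before this marginalization can be applied.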

  12. Parallel algorithms

    CERN Document Server

    Casanova, Henri; Robert, Yves

    2008-01-01

    ""…The authors of the present book, who have extensive credentials in both research and instruction in the area of parallelism, present a sound, principled treatment of parallel algorithms. … This book is very well written and extremely well designed from an instructional point of view. … The authors have created an instructive and fascinating text. The book will serve researchers as well as instructors who need a solid, readable text for a course on parallelism in computing. Indeed, for anyone who wants an understandable text from which to acquire a current, rigorous, and broad vi

  13. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment.

    Science.gov (United States)

    Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che

    2014-01-16

    To improve the tedious task of reconstructing gene networks through testing experimentally the possible interactions between genes, it becomes a trend to adopt the automated reverse engineering procedure instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks by the evolutionary algorithm, it is necessary to address two important issues: premature convergence and high computational cost. To tackle the former problem and to enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel model evolutionary algorithms. To overcome the latter and to speed up the computation, it is advocated to adopt the mechanism of cloud computing as a promising solution: most popular is the method of MapReduce programming model, a fault-tolerant framework to implement parallel algorithms for inferring large gene networks. This work presents a practical framework to infer large gene networks, by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use a well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results have been analyzed. They show that our parallel approach can be successfully used to infer networks with desired behaviors and the computation time can be largely reduced. Parallel population-based algorithms can effectively determine network parameters and they perform better than the widely-used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel model population-based optimization method and the parallel computational framework, high...
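
The map/reduce split in such a population-based optimizer can be sketched in a few lines: the "map" phase scores candidates in parallel and the "reduce" phase selects survivors. The sketch below is an assumption-laden toy, a mutation-only GA on the sphere function with a thread pool standing in for the Hadoop map phase; the record's actual system is a hybrid GA-PSO on gene-network models.

```python
from concurrent.futures import ThreadPoolExecutor
import random

# Toy illustration of the map/reduce structure in a parallel evolutionary
# algorithm. Fitness function, mutation scheme, and pool are assumptions.

def fitness(candidate):
    # Lower is better: squared distance of the parameter vector from 0.
    return sum(x * x for x in candidate)

def evolve(pop_size=20, dims=5, generations=30, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dims)] for _ in range(pop_size)]
    with ThreadPoolExecutor(max_workers=4) as pool:
        for _ in range(generations):
            scores = list(pool.map(fitness, pop))             # "map" phase
            ranked = [c for _, c in sorted(zip(scores, pop))]  # "reduce" phase
            parents = ranked[: pop_size // 2]
            # Elitist, mutation-only reproduction refills the population.
            pop = parents + [[x + rng.gauss(0, 0.3) for x in p]
                             for p in parents]
    return min(fitness(c) for c in pop)

print(evolve())  # best (lowest) fitness found
```

Because fitness evaluations are independent, they parallelize trivially; in a real deployment each map task would evaluate a candidate gene-network parameter set against the expression profiles.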

  14. Chocolate flavanols and skin photoprotection: a parallel, double-blind, randomized clinical trial.

    Science.gov (United States)

    Mogollon, Jaime Andres; Boivin, Catherine; Lemieux, Simone; Blanchet, Claudine; Claveau, Joël; Dodin, Sylvie

    2014-06-27

    Solar ultraviolet (UV) radiation has deleterious effects on the skin, including sunburn, photoaging and cancer. Chocolate flavanols are naturally-occurring antioxidant and anti-inflammatory molecules that could play a role in preventing cutaneous UV damage. We investigated the influence of 12-week high-flavanol chocolate (HFC) consumption on skin sensitivity to UV radiation, measured by minimal erythema dose (MED). We also evaluated skin elasticity and hydration. In this 2-group, parallel, double-blind, randomized controlled trial, 74 women aged 20-65 years and Fitzpatrick skin phototypes I or II were recruited from the general community in Quebec City, for randomization to either HFC (n = 33) or low-flavanol chocolate (LFC) (n = 41). A blocked randomisation (4), considering date of entry, skin type and age as factors, generated a sequentially-numbered allocation list. Study participants and research assistants were blinded. Totally, 30 g of chocolate were consumed daily for 12 weeks, followed by a 3-week washout period. MED was assessed at baseline and at 6, 9, 12 and 15 weeks. Main outcome was changes in MED at week 12. 33 participants in the HFC group and 41 in the LFC group were analyzed with 15 weeks of follow-up. Both groups showed similarly-increased MED at 12 weeks (HFC: 0.0252 ± 0.1099 J/cm2 [mean ± standard deviation (SD)]; LFC: 0.0151 ± 0.1118; mean difference (MD): 0.0100 J/cm2; 95% confidence interval (CI): -0.0417 to 0.0618). However, after 3-week washout, the HFC group presented decreased MED (-0.0248 ± 0.1145) whereas no effect was seen in the LFC group (0.0168 ± 0.1698) (MD: -0.0417; 95% CI: -0.1106 to 0.0272). Net temple elasticity increased slightly but significantly by 0.09 ± 0.12 mm in the HFC group at 12 weeks compared to 0.02 ± 0.12 mm in the LFC group (MD: 0.06; 95% CI: 0.01 to 0.12 ). No significant adverse events were reported. Our study failed to demonstrate a statistically
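
The reported between-group comparison can be checked from the summary statistics in the abstract. The sketch below uses a normal-approximation (z = 1.96) interval on a Welch standard error; the authors' exact method is not stated and was likely t-based, so the published CI (-0.0417 to 0.0618) is slightly wider than what this reproduces, and the mean difference differs from the reported 0.0100 only by rounding of the group means.

```python
import math

# Recomputing the week-12 MED mean difference and an approximate 95% CI from
# the abstract's summary statistics (normal approximation; illustrative only).

def mean_difference_ci(m1, sd1, n1, m2, sd2, n2, z=1.96):
    md = m1 - m2
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)   # Welch standard error
    return md, (md - z * se, md + z * se)

# HFC (n=33) vs. LFC (n=41) change in MED at week 12, in J/cm^2.
md, (lo, hi) = mean_difference_ci(0.0252, 0.1099, 33, 0.0151, 0.1118, 41)
print(round(md, 4), round(lo, 4), round(hi, 4))
```

The interval straddling zero is what drives the trial's negative primary conclusion.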

  15. Chocolate flavanols and skin photoprotection: a parallel, double-blind, randomized clinical trial

    Science.gov (United States)

    2014-01-01

    Background Solar ultraviolet (UV) radiation has deleterious effects on the skin, including sunburn, photoaging and cancer. Chocolate flavanols are naturally-occurring antioxidant and anti-inflammatory molecules that could play a role in preventing cutaneous UV damage. We investigated the influence of 12-week high-flavanol chocolate (HFC) consumption on skin sensitivity to UV radiation, measured by minimal erythema dose (MED). We also evaluated skin elasticity and hydration. Methods In this 2-group, parallel, double-blind, randomized controlled trial, 74 women aged 20–65 years and Fitzpatrick skin phototypes I or II were recruited from the general community in Quebec City, for randomization to either HFC (n = 33) or low-flavanol chocolate (LFC) (n = 41). A blocked randomisation (4), considering date of entry, skin type and age as factors, generated a sequentially-numbered allocation list. Study participants and research assistants were blinded. Totally, 30 g of chocolate were consumed daily for 12 weeks, followed by a 3-week washout period. MED was assessed at baseline and at 6, 9, 12 and 15 weeks. Main outcome was changes in MED at week 12. Results 33 participants in the HFC group and 41 in the LFC group were analyzed with 15 weeks of follow-up. Both groups showed similarly-increased MED at 12 weeks (HFC: 0.0252 ± 0.1099 J/cm2 [mean ± standard deviation (SD)]; LFC: 0.0151 ± 0.1118; mean difference (MD): 0.0100 J/cm2; 95% confidence interval (CI): -0.0417 to 0.0618). However, after 3-week washout, the HFC group presented decreased MED (-0.0248 ± 0.1145) whereas no effect was seen in the LFC group (0.0168 ± 0.1698) (MD: -0.0417; 95% CI: -0.1106 to 0.0272). Net temple elasticity increased slightly but significantly by 0.09 ± 0.12 mm in the HFC group at 12 weeks compared to 0.02 ± 0.12 mm in the LFC group (MD: 0.06; 95% CI: 0.01 to 0.12 ). No significant adverse events were reported. Conclusion Our study failed to

  16. Design of a Quasi-Passive Parallel Leg Exoskeleton to Augment Load Carrying for Walking

    National Research Council Canada - National Science Library

    Valiente, Andrew

    2005-01-01

    .... The exoskeleton structure runs parallel to the legs, transferring payload forces to the ground. In an attempt to make the exoskeleton more efficient, passive hip and ankle springs are employed to store and release energy throughout the gait cycle...

  17. Design, construction, and testing of a hysteresis controlled inverter for paralleling

    OpenAIRE

    Fillmore, Paul F.

    2003-01-01

    The U. S. Navy is pursuing an all electric ship that will require enormous amounts of power for applications such as electric propulsion. Reliability and redundancy in the electronics are imperative, since failure of a critical system could leave a ship stranded and vulnerable. A parallel inverter drive topology has been proposed to provide reliability and redundancy through load sharing. The parallel architecture enables some functionality in the event that one of the inverters fails. This t...

  18. Design of a chemical batch plant : a study of dedicated parallel lines with intermediate storage and the plant performance

    OpenAIRE

    Verbiest, Floor; Cornelissens, Trijntje; Springael, Johan

    2016-01-01

    Abstract: Production plants worldwide face huge challenges in satisfying high service levels and outperforming competition. These challenges require appropriate strategic decisions on plant design and production strategies. In this paper, we focus on multiproduct chemical batch plants, which are typically equipped with multiple production lines and intermediate storage tanks. First we extend the existing MI(N)LP design models with the concept of parallel production lines, and optimise the as...

  19. Design of Parallel Air-Cooled Battery Thermal Management System through Numerical Study

    Directory of Open Access Journals (Sweden)

    Kai Chen

    2017-10-01

    Full Text Available In electric vehicles, the battery pack is one of the most important components that strongly influence the system performance. The battery thermal management system (BTMS) is critical to remove the heat generated by the battery pack, which guarantees the appropriate working temperature for the battery pack. Air cooling is one of the most commonly-used solutions among various battery thermal management technologies. In this paper, the cooling performance of the parallel air-cooled BTMS is improved through choosing appropriate system parameters. The flow field and the temperature field of the system are calculated using the computational fluid dynamics method. Typical numerical cases are introduced to study the influences of the operation parameters and the structure parameters on the performance of the BTMS. The operation parameters include the discharge rate of the battery pack, the inlet air temperature and the inlet airflow rate. The structure parameters include the cell spacing and the angles of the divergence plenum and the convergence plenum. The results show that the temperature rise and the temperature difference of the battery pack are not affected by the inlet air temperature and are increased as the discharge rate increases. Increasing the inlet airflow rate can reduce the maximum temperature, but meanwhile significantly increases the power consumption for driving the airflow. Adopting smaller cell spacing can reduce the temperature and the temperature difference of the battery pack, but it consumes much more power. Designing the angles of the divergence plenum and the convergence plenum is an effective way to improve the performance of the BTMS without occupying more system volume. An optimization strategy is used to obtain the optimal values of the plenum angles. For the numerical cases with fixed power consumption, the maximum temperature and the maximum temperature difference at the end of the five-current discharge process for

  20. Effective Five Directional Partial Derivatives-Based Image Smoothing and a Parallel Structure Design.

    Science.gov (United States)

    Choongsang Cho; Sangkeun Lee

    2016-04-01

    Image smoothing has been used for image segmentation, image reconstruction, object classification, and 3D content generation. Several smoothing approaches have been used at the pre-processing step to retain the critical edge, while removing noise and small details. However, they have limited performance, especially in removing small details and smoothing discrete regions. Therefore, to provide fast and accurate smoothing, we propose an effective scheme that uses a weighted combination of the gradient, Laplacian, and diagonal derivatives of a smoothed image. In addition, to reduce computational complexity, we designed and implemented a parallel processing structure for the proposed scheme on a graphics processing unit (GPU). For an objective evaluation of the smoothing performance, the images were linearly quantized into several layers to generate experimental images, and the quantized images were smoothed using several methods for reconstructing the smoothly changed shape and intensity of the original image. Experimental results showed that the proposed scheme has higher objective scores and better successful smoothing performance than similar schemes, while preserving and removing critical and trivial details, respectively. For computational complexity, the proposed smoothing scheme running on a GPU provided 18 and 16 times lower complexity than the proposed smoothing scheme running on a CPU and the L0-based smoothing scheme, respectively. In addition, a simple noise reduction test was conducted to show the characteristics of the proposed approach; it reported that the presented algorithm outperforms the state-of-the-art algorithms by more than 5.4 dB. Therefore, we believe that the proposed scheme can be a useful tool for efficient image smoothing.
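
A derivative-weighted smoothing pass of this general family can be sketched on a small grid. The update rule and weights below are illustrative assumptions, a diffusion step mixing the axial (Laplacian) and diagonal second-derivative terms, not the paper's exact five-directional operator.

```python
# Hedged sketch: one smoothing pass combining axial (Laplacian) and diagonal
# second-derivative terms with chosen weights. Weights are illustrative.

def smooth_step(img, w_axial=0.15, w_diag=0.05):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]       # borders left unchanged
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            c = img[i][j]
            lap = (img[i-1][j] + img[i+1][j]
                   + img[i][j-1] + img[i][j+1] - 4 * c)
            diag = (img[i-1][j-1] + img[i-1][j+1]
                    + img[i+1][j-1] + img[i+1][j+1] - 4 * c)
            out[i][j] = c + w_axial * lap + w_diag * diag
    return out

noisy = [[0, 0, 0, 0, 0],
         [0, 0, 0, 0, 0],
         [0, 0, 9, 0, 0],   # an isolated "small detail" to be flattened
         [0, 0, 0, 0, 0],
         [0, 0, 0, 0, 0]]
once = smooth_step(noisy)
print(once[2][2])  # the spike is pulled toward its neighbors
```

Because each output pixel depends only on its input neighborhood, the double loop is embarrassingly parallel, which is what makes the GPU mapping described in the abstract straightforward.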

  1. Modeling and design of a multivariable control system for multi-paralleled grid-connected inverters with LCL filter

    DEFF Research Database (Denmark)

    Akhavan, Ali; Mohammadi, Hamid Reza; Guerrero, Josep M.

    2018-01-01

    The quality of injected current in multi-paralleled grid-connected inverters is a matter of concern. The current controlled grid-connected inverters with LCL filter are widely used in the distributed generation (DG) systems due to their fast dynamic response and better power features. However, ... with resonances in the system, damping methods such as passive or active damping are necessary. Secondly and perhaps more importantly, paralleled grid-connected inverters in a microgrid are coupled due to grid impedance. Generally, the coupling effect is not taken into account when designing the control systems...

  2. Analysis of clinical complication data for radiation hepatitis using a parallel architecture model

    International Nuclear Information System (INIS)

    Jackson, A.; Haken, R.K. ten; Robertson, J.M.; Kessler, M.L.; Kutcher, G.J.; Lawrence, T.S.

    1995-01-01

    Purpose: The detailed knowledge of dose volume distributions available from the three-dimensional (3D) conformal radiation treatment of tumors in the liver (reported elsewhere) offers new opportunities to quantify the effect of volume on the probability of producing radiation hepatitis. We aim to test a new parallel architecture model of normal tissue complication probability (NTCP) with these data. Methods and Materials: Complication data and dose volume histograms from a total of 93 patients with normal liver function, treated on a prospective protocol with 3D conformal radiation therapy and intraarterial hepatic fluorodeoxyuridine, were analyzed with a new parallel architecture model. Patient treatment fell into six categories differing in doses delivered and volumes irradiated. By modeling the radiosensitivity of liver subunits, we are able to use dose volume histograms to calculate the fraction of the liver damaged in each patient. A complication results if this fraction exceeds the patient's functional reserve. To determine the patient distribution of functional reserves and the subunit radiosensitivity, the maximum likelihood method was used to fit the observed complication data. Results: The parallel model fit the complication data well, although uncertainties on the functional reserve distribution and subunit radiosensitivity are highly correlated. Conclusion: The observed radiation hepatitis complications show a threshold effect that can be described well with a parallel architecture model. However, additional independent studies are required to better determine the parameters defining the functional reserve distribution and subunit radiosensitivity

  3. Parametric Optimal Design of a Parallel Schönflies-Motion Robot under Pick-And-Place Trajectory Constraints

    DEFF Research Database (Denmark)

    Wu, Guanglei; Bai, Shaoping; Hjørnet, Preben

    2015-01-01

    This paper deals with the parametric optimum design of a parallel Schoenflies-motion robot, named "Ragnar", designed for fast and flexible pick-and-place applications. The robot architecture admits a rectangular workspace, which can utilize the shop-floor space efficiently. In this work, the parametric models of the transmission quality, elasto-statics and dynamics are established. Taking into consideration the design requirements and the pick-and-place trajectory, a comprehensive multi-objective optimization problem is formulated to optimize both kinematic and dynamic performances. The Pareto-front is obtained, which provides optimal solutions to the robot design. Robot prototyping work based on the optimal results is described...

  4. Hybrid antibiotics - clinical progress and novel designs.

    Science.gov (United States)

    Parkes, Alastair L; Yule, Ian A

    2016-07-01

    There is a growing need for new antibacterial agents, but success in development of antibiotics in recent years has been limited. This has led researchers to investigate novel approaches to finding compounds that are effective against multi-drug resistant bacteria, and that delay onset of resistance. One such strategy has been to link antibiotics to produce hybrids designed to overcome resistance mechanisms. The concept of dual-acting hybrid antibiotics was introduced and reviewed in this journal in 2010. In the present review the authors sought to discover how clinical candidates described had progressed, and to examine how the field has developed. In three sections the authors cover the clinical progress of hybrid antibiotics, novel agents produced from hybridisation of two or more small-molecule antibiotics, and novel agents produced from hybridisation of antibiotics with small-molecules that have complementary activity. Many key questions regarding dual-acting hybrid antibiotics remain to be answered, and the proposed benefits of this approach are yet to be demonstrated. While Cadazolid in particular continues to progress in the clinic, suggesting that there is promise in hybridisation through covalent linkage, it may be that properties other than antibacterial activity are key when choosing a partner molecule.

  5. Design of parallel intersector weld/cut robot for machining processes in ITER vacuum vessel

    International Nuclear Information System (INIS)

    Wu Huapeng; Handroos, Heikki; Kovanen, Janne; Rouvinen, Asko; Hannukainen, Petri; Saira, Tanja; Jones, Lawrence

    2003-01-01

    This paper presents a new parallel robot, Penta-WH, which has five degrees of freedom driven by hydraulic cylinders. The manipulator has a large, singularity-free workspace and high stiffness, and it acts as a transport device for welding, machining and inspection end-effectors inside the ITER vacuum vessel. The presented kinematic structure of a parallel robot is particularly suitable for the ITER environment. Analyses of the machining process for ITER, such as the machining methods and forces, are given, and kinematic analyses, such as workspace and force capacity, are discussed.

  6. Effectiveness of a mobile cooperation intervention during the clinical practicum of nursing students: a parallel group randomized controlled trial protocol.

    Science.gov (United States)

    Strandell-Laine, Camilla; Saarikoski, Mikko; Löyttyniemi, Eliisa; Salminen, Leena; Suomi, Reima; Leino-Kilpi, Helena

    2017-06-01

    The aim of this study was to describe a protocol for a study evaluating the effectiveness of a mobile cooperation intervention to improve students' competence level, self-efficacy in clinical performance and satisfaction with the clinical learning environment. Nursing student-nurse teacher cooperation during the clinical practicum has a vital role in promoting the learning of students. Despite an increasing interest in using mobile technologies to improve the clinical practicum of students, there is limited robust evidence regarding their effectiveness. A multicentre, parallel group, randomized, controlled, pragmatic, superiority trial. Second-year pre-registration nursing students who are beginning a clinical practicum will be recruited from one university of applied sciences. Eligible students will be randomly allocated to either a control group (engaging in standard cooperation) or an intervention group (engaging in mobile cooperation) for the 5-week clinical practicum. The complex mobile cooperation intervention comprises mobile application-assisted nursing student-nurse teacher cooperation and training in the functions of the mobile application. The primary outcome is competence. The secondary outcomes include self-efficacy in clinical performance and satisfaction with the clinical learning environment. Moreover, a process evaluation will be undertaken. The ethical approval for this study was obtained in December 2014 and the study received funding in 2015. The results of this study will provide robust evidence on mobile cooperation during the clinical practicum, a research topic that has not been consistently studied to date. © 2016 John Wiley & Sons Ltd.

  7. Stepped-wedge cluster randomised controlled trials: a generic framework including parallel and multiple-level designs.

    Science.gov (United States)

    Hemming, Karla; Lilford, Richard; Girling, Alan J

    2015-01-30

    Stepped-wedge cluster randomised trials (SW-CRTs) are being used with increasing frequency in health service evaluation. Conventionally, these studies are cross-sectional in design with equally spaced steps, with an equal number of clusters randomised at each step and data collected at each and every step. Here we introduce several variations on this design and consider implications for power. One modification we consider is the incomplete cross-sectional SW-CRT, where the number of clusters varies at each step or where at some steps, for example implementation or transition periods, data are not collected. We show that the parallel CRT with staggered but balanced randomisation can be considered a special case of the incomplete SW-CRT, as can the parallel CRT with baseline measures. We extend these designs to allow for multiple layers of clustering, for example, wards within a hospital. Building on results for complete designs, power and detectable difference are derived using a Wald test and obtaining the variance-covariance matrix of the treatment effect assuming a generalised linear mixed model. These variations are illustrated by several real examples. We recommend that whilst the impact of transition periods on power is likely to be small, where they are a feature of the design they should be incorporated. We also show examples in which the power of a SW-CRT increases as the intra-cluster correlation (ICC) increases and demonstrate that the impact of the ICC is likely to be smaller in a SW-CRT compared with a parallel CRT, especially where there are multiple levels of clustering. Finally, through this unified framework, the efficiency of the SW-CRT and the parallel CRT can be compared. © 2014 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
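    The Wald-test power calculation sketched in the abstract can be reproduced numerically for the complete cross-sectional case: build the cluster-period design and covariance matrices, accumulate the information matrix, and invert it to get the variance of the treatment effect. The helper below assumes a cluster random intercept only (the basic Hussey-Hughes setup) and a two-sided 5% test, so it does not cover the incomplete or multi-level variations the paper introduces; all function and parameter names are illustrative.

```python
import numpy as np
from math import erf, sqrt

def swcrt_power(steps, clusters_per_step, m, theta, sigma2, tau2):
    """Two-sided 5% Wald-test power for a complete cross-sectional SW-CRT,
    from the GLS variance of the treatment effect under a linear mixed
    model with a cluster random intercept (variance tau2) and residual
    variance sigma2, with m subjects per cluster-period."""
    T = steps + 1                        # measurement periods
    n_clusters = steps * clusters_per_step
    p = T + 1                            # period effects + treatment effect
    info = np.zeros((p, p))
    for i in range(n_clusters):
        step_of_i = i // clusters_per_step
        Xi = np.zeros((T, p))
        Xi[:, :T] = np.eye(T)            # one fixed effect per period
        Xi[step_of_i + 1:, T] = 1.0      # exposed from its step onwards
        # Covariance of the T cluster-period means for this cluster.
        Vi = (sigma2 / m) * np.eye(T) + tau2 * np.ones((T, T))
        info += Xi.T @ np.linalg.inv(Vi) @ Xi
    var_theta = np.linalg.inv(info)[T, T]
    z = abs(theta) / sqrt(var_theta) - 1.959963984540054  # z_{0.975}
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))
```

Because the variance is obtained from the full information matrix rather than a closed form, the same loop extends naturally to unequal cluster counts per step by changing how the treatment indicator is filled in.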

  8. Design heuristic for parallel many server systems under FCFS-ALIS

    NARCIS (Netherlands)

    Adan, I.J.B.F.; Boon, M.; Weiss, G.

    2016-01-01

    We study a parallel service queueing system with servers of types $s_1,\ldots,s_J$, customers of types $c_1,\ldots,c_I$, a bipartite compatibility graph $\mathcal{G}$, where arc $(c_i, s_j)$ indicates that server type $s_j$ can serve customer type $c_i$, and a service policy of first come first served
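    A minimal sketch of the two FCFS-ALIS matching rules on a bipartite compatibility graph: an arriving customer goes to the longest-idle compatible server (ALIS), and a freed server takes the longest-waiting compatible customer (FCFS). The toy `compat` map, server names and helper functions below are illustrative, not the paper's heuristic.

```python
from collections import deque

# compat[s] lists the customer types that server type s can serve
# (a toy two-server-type, two-customer-type compatibility graph).
compat = {"s1": ["c1"], "s2": ["c1", "c2"]}

def assign_arriving_customer(ctype, idle_servers):
    """ALIS: give the arriving customer to the compatible server that has
    been idle longest. idle_servers is a deque of (name, server_type),
    ordered from longest-idle to most recently freed."""
    for i, (name, stype) in enumerate(idle_servers):
        if ctype in compat[stype]:
            del idle_servers[i]
            return name
    return None  # no compatible idle server: the customer queues

def assign_freed_server(stype, fcfs_queue):
    """FCFS: a freed server takes the longest-waiting compatible customer
    from the single first-come-first-served queue."""
    for i, ctype in enumerate(fcfs_queue):
        if ctype in compat[stype]:
            del fcfs_queue[i]
            return ctype
    return None  # no compatible customer waiting: the server idles
```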

  9. Analysis, design, and experimental evaluation of power calculation in digital droop-controlled parallel microgrid inverters

    DEFF Research Database (Denmark)

    Gao, Ming-zhi; Chen, Min; Jin, Cheng

    2013-01-01

    Parallel operation of distributed generation is an important topic for microgrids, which can provide a highly reliable electric supply service and good power quality to end customers when the utility is unavailable. However, there is a well-known limitation: the power sharing accuracy between...

  10. Design of the Trap Filter for the High Power Converters with Parallel Interleaved VSCs

    DEFF Research Database (Denmark)

    Gohil, Ghanshyamsinh Vijaysinh; Bede, Lorand; Teodorescu, Remus

    2014-01-01

    The power handling capability of the state-of-the-art semiconductor devices is limited. Therefore, Voltage Source Converters (VSCs) are often connected in parallel to realize a high power converter. The switching frequency of the semiconductor devices used in high power VSCs is also limited...

  11. Design and implementation of parallel video encoding strategies using divisible load analysis

    NARCIS (Netherlands)

    Li, Ping; Veeravalli, Bharadwaj; Kassim, A.A.

    2005-01-01

    The processing time needed for motion estimation usually accounts for a significant part of the overall processing time of the video encoder. To improve the video encoding speed, reducing the execution time for motion estimation process is essential. Parallel implementation of video encoding systems

  12. MaMiCo: Software design for parallel molecular-continuum flow simulations

    KAUST Repository

    Neumann, Philipp; Flohr, Hanno; Arora, Rahul; Jarmatz, Piet; Tchipev, Nikola; Bungartz, Hans-Joachim

    2015-01-01

    The macro-micro-coupling tool (MaMiCo) was developed to ease the development of and modularize molecular-continuum simulations, retaining sequential and parallel performance. We demonstrate the functionality and performance of MaMiCo by coupling

  13. Power Efficient Design of Parallel/Serial FIR Filters in RNS

    DEFF Research Database (Denmark)

    Petricca, Massimo; Albicocco, Pietro; Cardarilli, Gian Carlo

    2012-01-01

    It is well known that the Residue Number System (RNS) provides an efficient implementation of parallel FIR filters especially when the filter order and the dynamic range are high. The two main drawbacks of RNS, need of converters and coding overhead, make a serialized implementation of the FIR...

  14. Design and performance characterization of electronic structure calculations on massively parallel supercomputers

    DEFF Research Database (Denmark)

    Romero, N. A.; Glinsvad, Christian; Larsen, Ask Hjorth

    2013-01-01

    Density functional theory (DFT) is the most widely employed electronic structure method because of its favorable scaling with system size and accuracy for a broad range of molecular and condensed-phase systems. The advent of massively parallel supercomputers has enhanced the scientific community...

  15. Clinical and serologic parallels to APS-I in patients with thymomas, and autoantigen transcripts in their tumors

    Science.gov (United States)

    Wolff, Anette S. B.; Kärner, Jaanika; Owe, Jone F.; Oftedal, Bergithe E.V.; Gilhus, Nils Erik; Erichsen, Martina M.; Kämpe, Olle; Meager, Anthony; Peterson, Pärt; Kisand, Kai; Willcox, Nick; Husebye, Eystein S.

    2014-01-01

    Patients with the autoimmune polyendocrine syndrome type I (APS-I), caused by mutations in the autoimmune regulator (AIRE) gene, and myasthenia gravis (MG) with thymoma, show intriguing but unexplained parallels. They include uncommon manifestations like autoimmune adrenal insufficiency (AI), hypoparathyroidism (HP), and chronic mucocutaneous candidiasis (CMC) plus autoantibodies neutralizing IL-17, IL-22 and type I interferons. Thymopoiesis in the absence of AIRE is implicated in both syndromes. To test whether these parallels extend further, we screened 247 patients with MG and/or thymoma for clinical features and organ-specific autoantibodies characteristic of APS-I patients, and assayed 26 thymoma samples for transcripts for AIRE and 16 peripheral tissue-specific autoantigens (TSAgs) by quantitative PCR. We found APS-I-typical autoantibodies and clinical manifestations, including CMC, AI and asplenia, respectively in 49/121 (40%) and 10/121 (8%) thymoma patients, but clinical features seldom co-occurred with the corresponding autoantibodies. Both were rare in other MG subgroups (N=126). In 38 APS-I patients, by contrast, we observed neither autoantibodies against muscle antigens nor any neuromuscular disorders. Whereas relative transcript levels for AIRE and 7 of 16 TSAgs showed the expected under-expression in thymomas, levels were increased for 4 of the 5 TSAgs most frequently targeted by these patients’ autoAbs. Hence the clinical and serologic parallels to APS-I in patients with thymomas are not explained purely by deficient TSAg transcription in these aberrant AIRE-deficient tumors. We therefore propose additional explanations for the unusual autoimmune biases they provoke. Thymoma patients should be monitored for potentially life-threatening APS-I manifestations such as AI and HP. PMID:25230752

  16. Clinical and serologic parallels to APS-I in patients with thymomas and autoantigen transcripts in their tumors.

    Science.gov (United States)

    Wolff, Anette S B; Kärner, Jaanika; Owe, Jone F; Oftedal, Bergithe E V; Gilhus, Nils Erik; Erichsen, Martina M; Kämpe, Olle; Meager, Anthony; Peterson, Pärt; Kisand, Kai; Willcox, Nick; Husebye, Eystein S

    2014-10-15

    Patients with the autoimmune polyendocrine syndrome type I (APS-I), caused by mutations in the autoimmune regulator (AIRE) gene, and myasthenia gravis (MG) with thymoma, show intriguing but unexplained parallels. They include uncommon manifestations like autoimmune adrenal insufficiency (AI), hypoparathyroidism, and chronic mucocutaneous candidiasis plus autoantibodies neutralizing IL-17, IL-22, and type I IFNs. Thymopoiesis in the absence of AIRE is implicated in both syndromes. To test whether these parallels extend further, we screened 247 patients with MG, thymoma, or both for clinical features and organ-specific autoantibodies characteristic of APS-I patients, and we assayed 26 thymoma samples for transcripts for AIRE and 16 peripheral tissue-specific autoantigens (TSAgs) by quantitative PCR. We found APS-I-typical autoantibodies and clinical manifestations, including chronic mucocutaneous candidiasis, AI, and asplenia, respectively, in 49 of 121 (40%) and 10 of 121 (8%) thymoma patients, but clinical features seldom occurred together with the corresponding autoantibodies. Both were rare in other MG subgroups (n = 126). In 38 patients with APS-I, by contrast, we observed neither autoantibodies against muscle Ags nor any neuromuscular disorders. Whereas relative transcript levels for AIRE and 7 of 16 TSAgs showed the expected underexpression in thymomas, levels were increased for four of the five TSAgs most frequently targeted by these patients' autoantibodies. Therefore, the clinical and serologic parallels to APS-I in patients with thymomas are not explained purely by deficient TSAg transcription in these aberrant AIRE-deficient tumors. We therefore propose additional explanations for the unusual autoimmune biases they provoke. Thymoma patients should be monitored for potentially life-threatening APS-I manifestations such as AI and hypoparathyroidism. Copyright © 2014 by The American Association of Immunologists, Inc.
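    Relative transcript levels of the kind reported here are commonly derived from quantitative PCR Ct values. The sketch below uses the standard 2^-ddCt (Livak) relative quantification method, which the abstract does not name explicitly; the function signature and Ct inputs are illustrative.

```python
def relative_expression(ct_target, ct_ref, ct_target_cal, ct_ref_cal):
    """2^-ddCt relative quantification: Ct values for the target and a
    reference gene in the sample (e.g. a thymoma) and in a calibrator
    tissue (e.g. normal thymus). Returns the fold change of the target
    transcript relative to the calibrator."""
    dct_sample = ct_target - ct_ref          # normalize sample to reference
    dct_cal = ct_target_cal - ct_ref_cal     # normalize calibrator likewise
    return 2.0 ** -(dct_sample - dct_cal)
```

A value below 1 indicates under-expression in the sample, matching the pattern the study reports for AIRE and most TSAgs in thymomas; values above 1 indicate the over-expression seen for a minority of TSAgs.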

  17. The design of multi-core DSP parallel model based on message passing and multi-level pipeline

    Science.gov (United States)

    Niu, Jingyu; Hu, Jian; He, Wenjing; Meng, Fanrong; Li, Chuanrong

    2017-10-01

    Currently, the design of embedded signal processing systems is often tied to a specific application, an approach that is not conducive to the rapid development of signal processing technology. In this paper, a parallel processing model architecture based on a multi-core DSP platform is designed, mainly suited to complex algorithms composed of different modules. This model combines the ideas of multi-level pipeline parallelism and message passing, and combines the advantages of the mainstream multi-core DSP models (the Master-Slave model and the Data Flow model), so that it achieves better performance. A three-dimensional image generation algorithm is used to validate the efficiency of the proposed model by comparison with the Master-Slave and Data Flow models.
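    The combination of message passing with a multi-level pipeline can be illustrated in miniature: each stage runs concurrently and forwards messages to the next level through a queue, so different messages occupy different pipeline levels at the same time. This is a generic thread-based sketch, not the paper's multi-core DSP implementation; stage names and workers are illustrative.

```python
import threading, queue

def stage(worker, q_in, q_out):
    # Generic pipeline stage: consume messages until the sentinel arrives,
    # apply the worker function, forward the result to the next stage.
    while True:
        item = q_in.get()
        if item is None:          # sentinel: propagate downstream and stop
            q_out.put(None)
            return
        q_out.put(worker(item))

def run_pipeline(items, workers):
    # One queue per link between stages; each stage is its own thread, so
    # the pipeline levels process different messages concurrently.
    qs = [queue.Queue() for _ in range(len(workers) + 1)]
    threads = [threading.Thread(target=stage, args=(w, qs[i], qs[i + 1]))
               for i, w in enumerate(workers)]
    for t in threads:
        t.start()
    for item in items:
        qs[0].put(item)
    qs[0].put(None)               # end-of-stream sentinel
    out = []
    while (r := qs[-1].get()) is not None:
        out.append(r)
    for t in threads:
        t.join()
    return out
```

Because each link is a FIFO queue served by a single stage thread, message order is preserved end to end, which mirrors how a frame-processing pipeline keeps frames in sequence.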

  18. Application of Massively Parallel Sequencing in the Clinical Diagnostic Testing of Inherited Cardiac Conditions

    Directory of Open Access Journals (Sweden)

    Ivone U. S. Leong

    2014-06-01

    Sudden cardiac death in people between the ages of 1–40 years is a devastating event and is frequently caused by several heritable cardiac disorders. These disorders include cardiac ion channelopathies, such as long QT syndrome, catecholaminergic polymorphic ventricular tachycardia and Brugada syndrome, and cardiomyopathies, such as hypertrophic cardiomyopathy and arrhythmogenic right ventricular cardiomyopathy. Through careful molecular genetic evaluation of DNA from sudden death victims, the causative gene mutation can be uncovered, and the rest of the family can be screened and preventative measures implemented in at-risk individuals. The current screening approach in most diagnostic laboratories uses Sanger-based sequencing; however, this method is time consuming and labour intensive. The development of massively parallel sequencing has made it possible to produce millions of sequence reads simultaneously and is potentially an ideal approach to screen for mutations in genes that are associated with sudden cardiac death. This approach offers mutation screening at reduced cost and turnaround time. Here, we will review the current commercially available enrichment kits, massively parallel sequencing (MPS) platforms, downstream data analysis and its application to sudden cardiac death in a diagnostic environment.

  19. Designing of superconducting magnet for clinical MRI

    International Nuclear Information System (INIS)

    Kar, Soumen; Choudhury, A.; Sharma, R.G.; Datta, T.S.

    2015-01-01

    Superconducting technology for Magnetic Resonance Imaging (MRI) scanners is a closely guarded technology, as it has huge commercial application in clinical diagnostics. This is a rapidly evolving technology which requires innovative design of the magnetic and cryogenic systems. A project on the indigenous development of a 1.5 T (B_0) MRI scanner has been initiated by SAMEER, Mumbai, funded by DeitY, Govt. of India. IUAC is the collaborating institute for designing and developing the superconducting magnets and the cryostat for the 1.5 T MRI scanner. The superconducting magnet is the heart of the present-day MRI system, and its performance has the highest impact on the overall image quality of the scanner. The stringent requirements on spatial homogeneity (a few parts per million within a 50 cm diametrical spherical volume) and temporal stability (0.1 ppm/hr) of the superconducting magnet, and the safety standard (5 G in a 5 m x 3 m ellipsoidal space), make the design of the superconducting magnet complex. The MRI magnet consists of a set of main coils and shielding coils. The large ratio between the diameter and the winding length of each coil makes the B_peak/B_0 ratio much higher, which complicates selecting the load line of the magnet. The superconducting magnets will be made of NbTi wire-in-channel (WIC) conductor with a high copper-to-superconductor (NbTi) ratio. A multi-coil configuration on a multi-bobbin architecture, though cost-effective, poses complexity in the mechanical integration needed to achieve the desired homogeneity. Some of the major sources of inhomogeneity in a multi-bobbin configuration are imperfect axial positioning and angular shift. We have simulated several factors which cause inhomogeneity in a six (main) coil configuration for a 1.5 T MRI magnet. Differential thermal shrinkage between the bobbin and the superconducting winding is also a major source of inhomogeneity in an MRI magnet. This paper briefly presents the different design aspects of the
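    The homogeneity figure quoted above (a few ppm over the 50 cm diametrical spherical volume) is a peak-to-peak measure that can be computed directly from field-map samples. A sketch, with illustrative probe readings rather than measured data:

```python
def homogeneity_ppm(field_samples_tesla, b0=1.5):
    # Peak-to-peak field variation over the sampled points of the imaging
    # volume, expressed in parts per million of the nominal field B0.
    return (max(field_samples_tesla) - min(field_samples_tesla)) / b0 * 1e6

# Illustrative field-probe readings (tesla) over the 50 cm DSV.
dsv_samples = [1.5000021, 1.4999985, 1.5000004]
```

A magnet meeting the stated specification would show a value of only a few ppm from such a scan; axial misplacement or angular shift of a bobbin shows up directly as growth in this number.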

  20. Scalable High-Performance Parallel Design for Network Intrusion Detection Systems on Many-Core Processors

    OpenAIRE

    Jiang, Hayang; Xie, Gaogang; Salamatian, Kavé; Mathy, Laurent

    2013-01-01

    Network Intrusion Detection Systems (NIDSes) face significant challenges coming from the relentless network link speed growth and increasing complexity of threats. Both hardware accelerated and parallel software-based NIDS solutions, based on commodity multi-core and GPU processors, have been proposed to overcome these challenges. ...

  1. A modified parallel paradigm for clinical evaluation of auditory echoic memory.

    Science.gov (United States)

    Karino, Shotaro; Yumoto, Masato; Itoh, Kenji; Yamakawa, Keiko; Mizuochi, Tomomi; Kaga, Kimitaka

    2005-05-12

    We established a new parallel paradigm for mismatch negativity by presenting repetitive trains of three consonant-vowel syllables and of three sinusoidal tones alternately. Magnetoencephalography was performed to test the new method, and mismatch negativities in six study participants with normal hearing were compared with the results of the conventional oddball paradigm. Peak amplitudes and latencies of mismatch negativity showed no significant difference between the methods. The maximum amplitude at a short memory-probe interval of 1.0 s was significantly larger than at a long memory-probe interval of 3.0 s, demonstrating decay of auditory echoic memory caused by a prolonged memory-probe interval. The new method facilitated simultaneous evaluation of mismatch negativity with various stimuli in a shorter period.

  2. Multi-GPU parallel algorithm design and analysis for improved inversion of probability tomography with gravity gradiometry data

    Science.gov (United States)

    Hou, Zhenlong; Huang, Danian

    2017-09-01

    In this paper, we first study the inversion of probability tomography (IPT) with gravity gradiometry data. The spatial resolution of the results is improved by multi-tensor joint inversion, a depth weighting matrix and other methods. To address the problems posed by large data volumes in exploration, we present a parallel algorithm and a performance analysis combining Compute Unified Device Architecture (CUDA) with Open Multi-Processing (OpenMP), based on Graphics Processing Unit (GPU) acceleration. In tests on a synthetic model and on real data from Vinton Dome, we obtain improved results, and the improved inversion algorithm is shown to be effective and feasible. The performance of the parallel algorithm we designed is better than that of other CUDA implementations; the maximum speedup can be more than 200. In the performance analysis, multi-GPU speedup and multi-GPU efficiency are applied to analyze the scalability of the multi-GPU programs. The designed parallel algorithm is demonstrated to be able to process larger-scale data, and the new analysis method is practical.
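    The multi-GPU speedup and efficiency metrics used in the scalability analysis reduce to two one-line formulas: speedup of the n-GPU run relative to a single GPU, and efficiency as that speedup divided by the GPU count. A sketch with illustrative timings:

```python
def multi_gpu_speedup(t_one_gpu, t_n_gpu):
    # Speedup of the n-GPU run relative to the single-GPU run.
    return t_one_gpu / t_n_gpu

def multi_gpu_efficiency(t_one_gpu, t_n_gpu, n_gpus):
    # Efficiency = speedup / number of GPUs; 1.0 means ideal linear scaling.
    return multi_gpu_speedup(t_one_gpu, t_n_gpu) / n_gpus
```

An efficiency well below 1.0 as GPUs are added signals communication or load-balance overhead, which is exactly what such a scalability analysis is meant to expose.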

  3. Design of a Simple and Modular 2-DOF Ankle Physiotherapy Device Relying on a Hybrid Serial-Parallel Robotic Architecture

    Directory of Open Access Journals (Sweden)

    Christos E. Syrseloudis

    2011-01-01

    The aim of this work is to propose a new 2-DOF robotic platform with a hybrid parallel-serial structure and to undertake its parametric design so that it can follow the whole range of ankle-related foot movements. This robot can serve as a human ankle rehabilitation device. The existing ankle rehabilitation devices typically present one or more of the following shortcomings: redundancy, large size, or high cost; hence the need for a device that offers simplicity, modularity, and low cost of construction and maintenance. In addition, the targeted device must be safe during operation, disallow undesirable movements of the foot, and be adaptable to any human foot. Our detailed study of foot kinematics has led us to a new hybrid architecture, which strikes a balance among all the aforementioned goals. It consists of a passive serial kinematic chain with two adjustable screws so that the axes of the chain match the two main ankle axes of typical feet. An active parallel chain, which consists of two prismatic actuators, provides the movement of the platform. Thus, the platform can follow the foot movements, thanks to the passive chain, and also possesses the advantages of parallel robots, including rigidity, high stiffness and force capabilities. The lack of redundancy yields a simpler device with lower size and cost. The paper describes the kinematic modelling of the platform and analyses the force and velocity transmission. The parametric design of the platform is carried out; our simulations confirm the platform's suitability for ankle rehabilitation.

  4. Clinical trials in neurology: design, conduct, analysis

    National Research Council Canada - National Science Library

    Ravina, Bernard

    2012-01-01

    ... Clinical Trials in Neurology aims to improve the efficiency of clinical trials and the development of interventions in order to enhance the development of new treatments for neurologic diseases...

  5. Conceptual design and kinematic analysis of a novel parallel robot for high-speed pick-and-place operations

    Science.gov (United States)

    Meng, Qizhi; Xie, Fugui; Liu, Xin-Jun

    2018-06-01

    This paper deals with the conceptual design, kinematic analysis and workspace identification of a novel four degrees-of-freedom (DOFs) high-speed spatial parallel robot for pick-and-place operations. The proposed spatial parallel robot consists of a base, four arms and a 1½ mobile platform. The mobile platform is a major innovation that avoids output singularity and offers the advantages of both single and double platforms. To investigate the characteristics of the robot's DOFs, a line graph method based on Grassmann line geometry is adopted in mobility analysis. In addition, the inverse kinematics is derived, and the constraint conditions to identify the correct solution are also provided. On the basis of the proposed concept, the workspace of the robot is identified using a set of presupposed parameters by taking input and output transmission index as the performance evaluation criteria.

  6. Design Issues and Application of Cable-Based Parallel Manipulators for Rehabilitation Therapy

    Directory of Open Access Journals (Sweden)

    E. Ottaviano

    2008-01-01

    In this study, cable-based manipulators are proposed for application in rehabilitation therapies. Cable-based manipulators show good features that are very useful when the system has to interact with humans. In particular, they can be used to aid motion or as monitoring/training systems in rehabilitation therapies. Modelling and simulation of both active and passive cable-based parallel manipulators are presented for an application to help older people, patients or disabled people in the sit-to-stand transfer and as a monitoring/training system. Experimental results are presented by using built prototypes.

  7. Designing for Anxiety Therapy, Bridging Clinical and Non-Clinical

    DEFF Research Database (Denmark)

    Bertelsen, Olav Wedege; Kramp, Gunnar

    2012-01-01

    In this position paper we discuss, in terms of the concept of boundary objects, how a mobile application, the MIKAT.app, bridges between clinical intervention in anxiety therapy, and life and coping strategies outside the clinic, across phases of being a person suffering from, or having suffered from, anxiety. Thereby, we hope to provide a counterpoint in the discussion on illness trajectories...

  8. HVAC design manual for hospitals and clinics

    National Research Council Canada - National Science Library

    2013-01-01

    "Provides in-depth design recommendations and proven, cost effective, and reliable solutions for health care HVAC design that provide low maintenance cost and high reliability based on best practices...

  9. An adaptive optics imaging system designed for clinical use

    Science.gov (United States)

    Zhang, Jie; Yang, Qiang; Saito, Kenichi; Nozato, Koji; Williams, David R.; Rossi, Ethan A.

    2015-01-01

    Here we demonstrate a new imaging system that addresses several major problems limiting the clinical utility of conventional adaptive optics scanning light ophthalmoscopy (AOSLO), including its small field of view (FOV), reliance on patient fixation for targeting imaging, and substantial post-processing time. We previously showed an efficient image-based eye tracking method for real-time optical stabilization and image registration in AOSLO. However, in patients with poor fixation, eye motion causes the FOV to drift substantially, causing this approach to fail. We solve that problem here by tracking eye motion at multiple spatial scales simultaneously, by optically and electronically integrating a wide FOV SLO (WFSLO) with an AOSLO. This multi-scale approach, implemented with fast tip/tilt mirrors, has a large stabilization range of ± 5.6°. Our method consists of three stages implemented in parallel: 1) coarse optical stabilization driven by a WFSLO image, 2) fine optical stabilization driven by an AOSLO image, and 3) sub-pixel digital registration of the AOSLO image. We evaluated system performance in normal eyes and diseased eyes with poor fixation. Residual image motion with incremental compensation after each stage was: 1) ~2–3 arc minutes (arcmin), 2) ~0.5–0.8 arcmin, and 3) ~0.05–0.07 arcmin, for normal eyes. Performance in eyes with poor fixation was: 1) ~3–5 arcmin, 2) ~0.7–1.1 arcmin, and 3) ~0.07–0.14 arcmin. We demonstrate that this system is capable of reducing image motion by a factor of ~400, on average. This new optical design provides additional benefits for clinical imaging, including a steering subsystem for AOSLO that can be guided by the WFSLO to target specific regions of interest such as retinal pathology, and real-time averaging of registered images to eliminate image post-processing. PMID:26114033

  10. Operating system design of parallel computer for on-line management of nuclear pressurised water reactor cores

    International Nuclear Information System (INIS)

    Gougam, F.

    1991-04-01

    This study is part of the PHAETON project, which aims at increasing the knowledge of safety parameters of the PWR core and reducing operating margins during the reactor cycle. The on-line system associates a simulator process, which computes the three-dimensional flux distribution, with an acquisition process for reactor core parameters from the central instrumentation. The 3D flux calculation is the most time consuming. So, for cost and safety reasons, the PHAETON project proposes an approach which is to parallelize the 3D diffusion calculation and to use a computer based on a parallel processor architecture. This paper presents the design of the operating system on which the application is executed. The proposed routine interface includes the main operations necessary for programming a real-time parallel application. The primitives include: task management, data transfer, synchronisation by event signalling, and rendezvous mechanisms. The proposed primitives use standard software such as a real-time kernel and the UNIX operating system.
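    Of the primitives listed, the rendezvous is the least familiar: two tasks each deposit data, block until both have arrived, and then read each other's value. A minimal thread-based sketch of such a symmetric exchange is shown below; the class and task names are illustrative and are not the PHAETON interface.

```python
import threading

class Rendezvous:
    """Symmetric rendezvous between exactly two tasks: each deposits a
    value, both block at the barrier until the other arrives, then each
    reads its partner's value."""
    def __init__(self):
        self._barrier = threading.Barrier(2)
        self._slots = {}

    def exchange(self, who, value):
        self._slots[who] = value
        self._barrier.wait()          # block until both parties arrive
        other = next(k for k in self._slots if k != who)
        return self._slots[other]
```

In the PHAETON setting the two parties would be the acquisition process and the simulator process, exchanging measured core parameters against a computed flux map at each cycle.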

  11. Performance of Polycrystalline Photovoltaic and Thermal Collector (PVT on Serpentine-Parallel Absorbers Design

    Directory of Open Access Journals (Sweden)

    Mustofa Mustofa

    2017-03-01

    Full Text Available This paper presents the performance of an unglazed polycrystalline photovoltaic-thermal (PVT) collector at a mass flow rate of 0.045 kg/s. PVT collectors combine photovoltaic modules and solar thermal collectors, forming a single device that receives solar radiation and produces heat and electricity simultaneously. The collector features serpentine-parallel tubes that can prolong the fluid's heat absorption from morning till afternoon. During testing, the PV cell, inlet, and outlet fluid temperatures were recorded by LM35 digital thermocouples on an Arduino Mega 2560. Panel voltage and electric current were also logged to a connected computer, which recorded data every second; however, only data at certain significant times are shown here, because the electric current was read with an ordinary multimeter rather than a digital logger. Based on these test data, the average cell efficiency was about 19%, while the thermal efficiency was above 50% with a corresponding cell efficiency of 11%.
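
    The reported thermal efficiency of a collector like this is typically computed from the quantities the abstract mentions: mass flow rate and inlet/outlet fluid temperatures. A minimal sketch follows; only the 0.045 kg/s mass flow rate comes from the abstract, and the temperatures, irradiance, and aperture area below are illustrative assumptions, not the paper's data.

```python
m_dot = 0.045              # kg/s, mass flow rate from the abstract
cp = 4186.0                # J/(kg K), specific heat of water
t_in, t_out = 30.0, 33.0   # deg C, assumed inlet/outlet fluid temperatures
G = 900.0                  # W/m^2, assumed solar irradiance
area = 1.25                # m^2, assumed collector aperture area

q_useful = m_dot * cp * (t_out - t_in)   # W, heat gained by the fluid
eta_thermal = q_useful / (G * area)      # thermal efficiency

print(round(q_useful, 1), round(eta_thermal, 3))
```

    With these assumed numbers the thermal efficiency comes out near 0.50, in the same range as the "above 50%" figure reported above.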

  12. Design Sliding Mode Controller of with Parallel Fuzzy Inference System Compensator to Control of Robot Manipulator

    Directory of Open Access Journals (Sweden)

    Farzin Piltan

    2013-06-01

    Full Text Available Sliding mode control (SMC) is a significant nonlinear control method under conditions of partly uncertain dynamic system parameters. It is used to control highly nonlinear systems, especially robot manipulators, because it is robust and stable. However, although the pure sliding mode controller is used in many applications, it has two important drawbacks: the chattering phenomenon, and the nonlinear equivalent dynamic formulation under uncertain dynamic parameters. Both the equivalent dynamic formulation problem and the chattering phenomenon in uncertain systems can be addressed using artificial intelligence techniques. A fuzzy logic controller can control complicated nonlinear dynamic systems, but it cannot guarantee stability and robustness. In this research, parallel fuzzy logic theory is used to compensate for the system dynamic uncertainty.
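
    The chattering drawback mentioned above comes from the discontinuous switching term in pure SMC. A standard remedy (distinct from the paper's parallel-fuzzy compensator, which it does not reproduce) is a boundary-layer saturation. The sketch below compares control sign reversals with and without the boundary layer on an assumed double-integrator plant; all gains are illustrative.

```python
import math

def sat(s, phi):
    """Saturation: linear inside |s| < phi, +/-1 outside (reduces chattering)."""
    return max(-1.0, min(1.0, s / phi))

def simulate(use_boundary_layer, steps=4000, dt=0.001):
    x, v = 1.0, 0.0                # position error and its derivative
    lam, k, phi = 5.0, 10.0, 0.05  # surface slope, gain, boundary-layer width
    switches, prev_sign = 0, 0
    for _ in range(steps):
        s = v + lam * x            # sliding surface
        u = -k * (sat(s, phi) if use_boundary_layer else math.copysign(1.0, s))
        sgn = 1 if u > 0 else -1
        if prev_sign and sgn != prev_sign:
            switches += 1          # count control sign reversals (chattering)
        prev_sign = sgn
        v += u * dt                # double-integrator dynamics
        x += v * dt
    return abs(x), switches

err_sat, sw_sat = simulate(True)
err_sgn, sw_sgn = simulate(False)
print(sw_sat < sw_sgn, err_sat < 0.05)
```

    The pure sign() controller flips sign almost every step once the surface is reached, while the saturated version behaves linearly inside the boundary layer and converges without rapid switching.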

  13. Xyce parallel electronic simulator design : mathematical formulation, version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Hoekstra, Robert John; Waters, Lon J.; Hutchinson, Scott Alan; Keiter, Eric Richard; Russo, Thomas V.

    2004-06-01

    This document is intended to contain a detailed description of the mathematical formulation of Xyce, a massively parallel SPICE-style circuit simulator developed at Sandia National Laboratories. The target audience of this document is people in the role of 'service provider'. An example of such a person would be a linear solver expert who is spending a small fraction of his time developing solver algorithms for Xyce. Such a person probably is not an expert in circuit simulation, and would benefit from a description of the equations solved by Xyce. In this document, modified nodal analysis (MNA) is described in detail, with a number of examples. Issues that are unique to circuit simulation, such as voltage limiting, are also described in detail.
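
    A tiny worked instance of MNA, the formulation the abstract names, helps make the idea concrete: the unknowns are the node voltages plus one branch current per voltage source. The circuit and all values below are illustrative, not taken from the Xyce document.

```python
# Circuit: V1 = 10 V from node 1 to ground, R1 = 1k between nodes 1 and 2,
# R2 = 1k from node 2 to ground. Unknown vector: [v1, v2, i_V1].
g1 = g2 = 1.0 / 1000.0
A = [
    [g1,      -g1, 1.0],   # KCL at node 1 (includes the source branch current)
    [-g1, g1 + g2, 0.0],   # KCL at node 2
    [1.0,     0.0, 0.0],   # branch equation of the voltage source: v1 = 10
]
b = [0.0, 0.0, 10.0]

# Tiny Gaussian elimination with partial pivoting for this 3x3 system.
n = 3
for k in range(n):
    p = max(range(k, n), key=lambda r: abs(A[r][k]))
    A[k], A[p] = A[p], A[k]
    b[k], b[p] = b[p], b[k]
    for r in range(k + 1, n):
        f = A[r][k] / A[k][k]
        for c in range(k, n):
            A[r][c] -= f * A[k][c]
        b[r] -= f * b[k]
x = [0.0] * n
for k in range(n - 1, -1, -1):
    x[k] = (b[k] - sum(A[k][c] * x[c] for c in range(k + 1, n))) / A[k][k]

v1, v2, i_src = x
print(v1, v2, i_src)  # expect 10 V, 5 V, -5 mA (current flows out of the source)
```

    The voltage-source row is what makes the analysis "modified": a plain nodal formulation cannot represent an ideal voltage source, so its branch current is promoted to an unknown.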

  14. Design and simulation of parallel and distributed architectures for images processing

    International Nuclear Information System (INIS)

    Pirson, Alain

    1990-01-01

    The exploitation of visual information requires special computers. The diversity of operations and the computing power involved bring about structures founded on the concepts of concurrency and distributed processing. This work identifies a vision computer with an association of dedicated intelligent entities, exchanging messages according to the model of parallelism introduced by the language Occam. It puts forward an architecture of the 'enriched processor network' type. It consists of a classical multiprocessor structure where each node is provided with specific devices. These devices perform processing tasks as well as inter-node dialogues. Such an architecture benefits from the homogeneity of multiprocessor networks and the power of dedicated resources. Its implementation corresponds to that of a distributed structure, tasks being allocated to each computing element. This approach culminates in an original architecture called ATILA. This modular structure is based on a transputer network supplied with vision-dedicated co-processors and powerful communication devices. (author) [fr

  15. Optimal Design and Tuning of PID-Type Interval Type-2 Fuzzy Logic Controllers for Delta Parallel Robots

    Directory of Open Access Journals (Sweden)

    Xingguo Lu

    2016-05-01

    Full Text Available In this work, we propose a new method for the optimal design and tuning of a Proportional-Integral-Derivative type (PID-type interval type-2 fuzzy logic controller (IT2 FLC for Delta parallel robot trajectory tracking control. The presented methodology starts with an optimal design problem of IT2 FLC. A group of IT2 FLCs are obtained by blurring the membership functions using a variable called blurring degree. By comparing the performance of the controllers, the optimal structure of IT2 FLC is obtained. Then, a multi-objective optimization problem is formulated to tune the scaling factors of the PID-type IT2 FLC. The Non-dominated Sorting Genetic Algorithm (NSGA-II is adopted to solve the constrained nonlinear multi-objective optimization problem. Simulation results of the optimized controller are presented and discussed regarding application in the Delta parallel robot. The proposed method provides an effective way to design and tune the PID-type IT2 FLC with a desired control performance.
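
    The multi-objective tuning step above rests on Pareto dominance, the comparison at the heart of NSGA-II. A minimal sketch of that test follows; the objective pairs (tracking error, control effort) and their values are invented for illustration and are not the paper's data.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_front(points):
    """First front: points not dominated by any other point."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# (tracking_error, control_effort) for four hypothetical scaling-factor settings
candidates = [(0.9, 1.0), (0.5, 2.0), (0.4, 2.5), (0.5, 2.2)]
front = nondominated_front(candidates)
print(sorted(front))  # (0.5, 2.2) is dominated by (0.5, 2.0) and drops out
```

    NSGA-II layers this test with fast nondominated sorting and crowding distance; the sketch shows only the dominance relation that defines the trade-off front between the two objectives.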

  16. Modeling, analysis, and design of stationary reference frame droop controlled parallel three-phase voltage source inverters

    DEFF Research Database (Denmark)

    Vasquez, Juan Carlos; Guerrero, Josep M.; Savaghebi, Mehdi

    2011-01-01

    Power electronics based microgrids consist of a number of voltage source inverters (VSIs) operating in parallel. In this paper, the modeling, control design, and stability analysis of three-phase VSIs are derived. The proposed voltage and current inner control loops and the mathematical models ... and discussed. Experimental results are provided to validate the performance and robustness of the VSIs' functionality during islanded and grid-connected operations, allowing a seamless transition between these modes through control hierarchies by regulating frequency and voltage, main-grid interactivity...

  17. Preliminary design of an advanced programmable digital filter network for large passive acoustic ASW systems. [Parallel processor

    Energy Technology Data Exchange (ETDEWEB)

    McWilliams, T.; Widdoes, Jr., L. C.; Wood, L.

    1976-09-30

    The design of an extremely high performance programmable digital filter of novel architecture, the LLL Programmable Digital Filter, is described. The digital filter is a high-performance multiprocessor having general-purpose applicability and high programmability; it is extremely cost-effective in either a uniprocessor or a multiprocessor configuration. The architecture and instruction set of the individual processor were optimized with regard to the multiple-processor configuration. The optimal structure of a parallel processing system was determined for addressing the specific Navy application centering on the advanced digital filtering of passive acoustic ASW data of the type obtained from the SOSUS net. 148 figures. (RWR)

  18. Optimal control design of turbo spin-echo sequences with applications to parallel-transmit systems

    NARCIS (Netherlands)

    Sbrizzi, Alessandro; Hoogduin, Hans; Hajnal, Joseph V; van den Berg, CAT; Luijten, Peter R; Malik, Shaihan J

    PURPOSE: The design of turbo spin-echo sequences is modeled as a dynamic optimization problem which includes the case of inhomogeneous transmit radiofrequency fields. This problem is efficiently solved by optimal control techniques making it possible to design patient-specific sequences online.

  19. Design of Rate-Compatible Parallel Concatenated Punctured Polar Codes for IR-HARQ Transmission Schemes

    Directory of Open Access Journals (Sweden)

    Jian Jiao

    2017-11-01

    Full Text Available In this paper, we propose rate-compatible (RC) parallel concatenated punctured polar (PCPP) codes for incremental redundancy hybrid automatic repeat request (IR-HARQ) transmission schemes, which can transmit multiple data blocks over a time-varying channel. The PCPP coding scheme can provide RC polar coding blocks in order to adapt to channel variations. First, we investigate an improved random puncturing (IRP) pattern for the PCPP coding scheme, motivated by the code-rate and block-length limitations of conventional polar codes. The proposed IRP algorithm selects puncturing bits only from the frozen-bit set and keeps the information bits unchanged during puncturing, which improves decoding performance by 0.2–1 dB over the existing random puncturing (RP) algorithm. Then, we develop an RC IR-HARQ transmission scheme based on PCPP codes. By analyzing the overhead of the previously successfully decoded PCPP coding block in our IR-HARQ scheme, the optimal initial code rate can be determined for each new PCPP coding block over time-varying channels. Simulation results show that the average number of transmissions is about 1.8 per PCPP coding block in our RC IR-HARQ scheme with a 2-level PCPP encoding construction, which halves the average number of transmissions compared with existing RC polar coding schemes.
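
    The key constraint of the IRP idea described above, puncture only frozen positions so information bits survive, can be sketched in a few lines. The code length, the frozen-bit set, and the number of punctures below are illustrative assumptions, not the paper's construction.

```python
import random

def punctured_pattern(frozen, num_punctures, seed=0):
    """Return sorted puncture positions drawn only from the frozen-bit set."""
    if num_punctures > len(frozen):
        raise ValueError("cannot puncture more bits than are frozen")
    rng = random.Random(seed)            # deterministic for reproducibility
    return sorted(rng.sample(sorted(frozen), num_punctures))

n = 16
frozen = {0, 1, 2, 3, 4, 5, 8, 9}        # illustrative frozen-bit indices
info = set(range(n)) - frozen            # information bits must stay untouched

pattern = punctured_pattern(frozen, num_punctures=4)
print(pattern, info.isdisjoint(pattern))  # the punctures never hit info bits -> True
```

    In a real polar-code construction the frozen set comes from channel-reliability ordering; here it is a fixed stand-in so the constraint itself is visible.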

  20. FPGAs and parallel architectures for aerospace applications soft errors and fault-tolerant design

    CERN Document Server

    Rech, Paolo

    2016-01-01

    This book introduces the concepts of soft errors in FPGAs, as well as the motivation for using commercial, off-the-shelf (COTS) FPGAs in mission-critical and remote applications, such as aerospace. The authors describe the effects of radiation in FPGAs, present a large set of soft-error mitigation techniques that can be applied in these circuits, as well as methods for qualifying these circuits under radiation. Coverage includes radiation effects in FPGAs, fault-tolerant techniques for FPGAs, use of COTS FPGAs in aerospace applications, experimental data of FPGAs under radiation, FPGA embedded processors under radiation, and fault injection in FPGAs. Since dedicated parallel processing architectures such as GPUs have become more desirable in aerospace applications due to high computational power, GPU analysis under radiation is also discussed. The book discusses features and drawbacks of reconfigurability methods for FPGAs, focused on aerospace applications, and explains how radia...

  1. SYSTEMIC LUPUS ERYTHEMATOSUS AND INFECTIVE ENDOCARDITIS: CLINICAL AND DIAGNOSTIC PARALLELS AND IMAGINARY MIMICRY

    Directory of Open Access Journals (Sweden)

    S. P. Filonenko

    2016-01-01

    Full Text Available Aim of the study: to draw attention to the differential diagnosis of systemic lupus erythematosus (SLE) and infective endocarditis. Materials and methods. Patient A., 44 years old, was admitted to the cardiologic department of Ryazan Regional Clinical Cardiology Clinic diagnosed with probable subacute infective endocarditis and glomerulonephritis, with complaints of weakness, fatigue, an increase in body temperature up to 37.7 °C (predominantly in the evening), dry cough, shortness of breath on mild exertion, and swelling of the legs and feet. In early October 2015, the patient's body temperature increased up to 37.8 °C and a dry cough appeared. The patient was treated on an outpatient basis for acute respiratory viral infection with antibiotics, and body temperature decreased. Acute deterioration of the condition was observed in mid-October: severe shortness of breath even on mild physical exertion, increased heart rate, lower limb edema, and blood pressure (BP) increased up to 240/140 mmHg. The patient was hospitalized in the therapeutic department. Against the background of treatment (antibiotics, antihypertensive agents, diuretics, digoxin), the patient's condition improved: shortness of breath decreased, as did the heart rate and limb edema, and blood pressure fell to 180/110–190/120 mmHg. However, there was persistent proteinuria (0.33–1.65–3.3 g/L) and low-grade fever persisting in the evening. On admission to the cardiological department of Ryazan Regional Clinical Cardiology Clinic the patient underwent the following survey: assessment of lab parameters in dynamics, electrocardiography, heart echocardiography, and computed tomography (CT) of the lungs. Results. We revealed left ventricular hypertrophy on heart ultrasonography; an increase in the volume of the left atrium, right ventricle, and right atrium; mitral, aortic, and tricuspid valve insufficiency (grade II regurgitation); pulmonary hypertension; on lung CT, a picture of hydrothorax on the right side and hydropericardium

  2. Convective boiling in a parallel microchannel heat sink with a diverging cross-section design and artificial nucleation sites

    International Nuclear Information System (INIS)

    Lu, Chun Ting; Pan, Chin

    2009-01-01

    To develop a highly stable boiling heat transfer microchannel heat sink, three types of diverging microchannels, namely Type-1, Type-2 and Type-3, were designed to explore experimentally the effect of different distributions of artificial nucleation sites on enhancing boiling heat transfer in 10 parallel diverging microchannels with a mean hydraulic diameter of 120 μm. The Type-1 system has no cavities, Type-2 has cavities distributed uniformly along the downstream half of the channel, while Type-3 has cavities distributed uniformly along the whole channel. The artificial nucleation sites are laser-etched pits on the channel bottom wall with a mouth diameter of about 20-22 μm, sized based on heterogeneous nucleation theory. The results of the present study reveal that the presence of artificial nucleation sites for flow boiling in parallel diverging microchannels significantly reduces the wall superheat and enhances the boiling heat transfer performance. Additionally, the Type-3 design demonstrates the best boiling heat transfer performance. (author)

  3. Design and Programming for Cable-Driven Parallel Robots in the German Pavilion at the EXPO 2015

    Directory of Open Access Journals (Sweden)

    Philipp Tempel

    2015-08-01

    Full Text Available In the German Pavilion at the EXPO 2015, two large cable-driven parallel robots are flying over the heads of the visitors representing two bees flying over Germany and displaying everyday life in Germany. Each robot consists of a mobile platform and eight cables suspended by winches and follows a desired trajectory, which needs to be computed in advance taking technical limitations, safety considerations and visual aspects into account. In this paper, a path planning software is presented, which includes the design process from developing a robot design and workspace estimation via planning complex trajectories considering technical limitations through to exporting a complete show. For a test trajectory, simulation results are given, which display the relevant trajectories and cable force distributions.
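
    The "cable force distributions" computed by the path-planning software above reduce, in the simplest case, to a static wrench balance. The sketch below solves it for a planar point-mass platform on two cables; anchor positions and platform mass are invented for illustration, and the real robots balance a full 6-DOF wrench over eight cables.

```python
import math

ax1, ay1 = -2.0, 3.0   # left winch anchor (m), assumed
ax2, ay2 = 2.0, 3.0    # right winch anchor (m), assumed
px, py = 0.0, 0.0      # platform position (m)
m, g = 10.0, 9.81      # platform mass (kg), gravity (m/s^2)

def unit(dx, dy):
    """Unit vector along (dx, dy)."""
    L = math.hypot(dx, dy)
    return dx / L, dy / L

u1 = unit(ax1 - px, ay1 - py)   # direction from platform toward left anchor
u2 = unit(ax2 - px, ay2 - py)   # direction from platform toward right anchor

# Equilibrium: t1*u1 + t2*u2 = (0, m*g). Solve the 2x2 system by Cramer's rule.
det = u1[0] * u2[1] - u1[1] * u2[0]
t1 = (0.0 * u2[1] - m * g * u2[0]) / det
t2 = (u1[0] * m * g - u1[1] * 0.0) / det

print(round(t1, 2), round(t2, 2))  # equal, positive tensions by symmetry
```

    Both tensions must come out positive, since cables can only pull; in the redundantly actuated eight-cable case that positivity constraint is what makes force distribution a nontrivial optimization problem.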

  4. Systematic Design Method and Experimental Validation of a 2-DOF Compliant Parallel Mechanism with Excellent Input and Output Decoupling Performances

    Directory of Open Access Journals (Sweden)

    Yao Jiang

    2017-06-01

    Full Text Available The output and input coupling characteristics of a compliant parallel mechanism (CPM) complicate motion control and compromise its performance and operational safety. This paper presents a systematic design method for a 2-degree-of-freedom (DOF) CPM with excellent decoupling performance. A symmetric kinematic structure can guarantee a CPM with a complete output decoupling characteristic; input coupling is reduced by resorting to a flexure-based decoupler. This work discusses the stiffness design requirement of the decoupler and proposes a compound flexure hinge as its basic structure. Analytical methods have been derived to assess the mechanical performances of the CPM in terms of input and output stiffness, motion stroke, input coupling degree, and natural frequency. The CPM's geometric parameters were optimized to minimize the input coupling while ensuring the key performance indicators at the same time. The optimized CPM's performances were then evaluated by using a finite element analysis. Finally, a prototype was constructed and experimental validations were carried out to test the performance of the CPM and verify the effectiveness of the design method. The design procedure proposed in this paper is systematic and can be extended to design CPMs with other types of motion.

  5. Design and optimization of multi-class series-parallel linear electromagnetic array artificial muscle.

    Science.gov (United States)

    Li, Jing; Ji, Zhenyu; Shi, Xuetao; You, Fusheng; Fu, Feng; Liu, Ruigang; Xia, Junying; Wang, Nan; Bai, Jing; Wang, Zhanxi; Qin, Xiansheng; Dong, Xiuzhen

    2014-01-01

    Skeletal muscle, which exhibits complex structure and excellent precision, has evolved over millions of years, and it offers better performance and a simpler structure than existing actuation modes. Artificial muscle may be designed by analyzing and imitating the properties and structure of skeletal muscle, a bionic approach that has attracted much research attention; a structural mode of linear electromagnetic array artificial muscle is designed in this paper. The half sarcomere is the minimum unit of the artificial muscle, and an electromagnetic model of it has been built. The structural parameters of the artificial half-sarcomere actuator were optimized to achieve better movement performance. Experimental results show that the artificial half-sarcomere actuator possesses great motion performance, such as high response speed, large acceleration, small weight and size, and robustness, which presents a promising application prospect.

  6. Design, simulation, and prototype production of a through the road parallel hybrid electric motorcycle

    International Nuclear Information System (INIS)

    Asaei, Behzad; Habibidoost, Mahdi

    2013-01-01

    Highlights: • Design, simulation, and manufacturing of a hybrid electric motorcycle are explained. • The electric machine is mounted in the front wheel hub of an ordinary motorcycle. • Two different energy control strategies are implemented. • The simulation results show that the motorcycle performance is improved. • The acceleration is improved and the fuel consumption and pollution are decreased. - Abstract: In this paper, the design, simulation, and conversion of a normal motorcycle to a Hybrid Electric Motorcycle (HEM) is described. First, a simple model was designed and simulated using ADVISOR2002. Then, the controller schematic and its optimized control strategy are described. A 125 cc ICE motorcycle was selected and converted into a HEM. A brushless DC (BLDC) motor assembled in the front wheel and a normal internal combustion engine in the rear wheel propel the motorcycle. The nominal powers are 6.6 kW and 500 W for the ICE and BLDC, respectively. The original motorcycle has a Continuously Variable Transmission (CVT), which is the best choice for a HEM power transmission because it can operate in automatic handling mode and has high efficiency. Moreover, by using the CVT, the ICE can be started while the motorcycle is running. Finally, the three operating modes of the HEM, the two implemented energy control strategies, and the HEM engine control system with servomotors and LCD display are explained

  7. Topics in Analysis and Design of Primary Parallel Isolated Boost Converter

    DEFF Research Database (Denmark)

    Sen, Gokhan

    Efficient power processing through power electronics circuits has been a popular academic field for the past decades, especially after “green” vehicle applications based on fuel cells, batteries and super-capacitors emerged as alternative sources of propulsion for transportation. Dc-to-dc converters ... for such applications due to its simplicity and ability to handle higher currents. Design of magnetic structures like transformers and inductors is part of power electronics engineering, in which trade-offs exist between size, price and losses. At higher currents this becomes even more challenging, since the conventional wire-wound design approach has some shortcomings. It is possible to use copper foil in windings; however, this will increase the amount of handcraft during manufacturing, so academic attention was needed for this part of low-voltage, high-current switching converter design. Planar magnetics...

  8. More ethical and more efficient clinical research: multiplex trial design.

    Science.gov (United States)

    Keus, Frederik; van der Horst, Iwan C C; Nijsten, Maarten W

    2014-08-14

    Today's clinical research faces challenges such as a lack of clinical equipoise between treatment arms, reluctance to randomize for multiple treatments simultaneously, inability to address interactions, and increasingly restricted resources. Furthermore, many trials are biased by extensive exclusion criteria, relatively small sample sizes and less appropriate outcome measures. We propose a 'Multiplex' trial design that preserves clinical equipoise with a continuous and factorial trial design that will also result in more efficient use of resources. This multiplex design accommodates subtrials with an appropriate choice of treatment arms within each subtrial. Clinical equipoise should increase consent rates, while the factorial design is the best way to identify interactions. The multiplex design may evolve naturally from today's research limitations and challenges, while principal objections seem absent. However, this new design poses important infrastructural, organisational and psychological challenges that need in-depth consideration.

  9. MaMiCo: Software design for parallel molecular-continuum flow simulations

    KAUST Repository

    Neumann, Philipp

    2015-11-19

    The macro-micro-coupling tool (MaMiCo) was developed to ease the development of and modularize molecular-continuum simulations, retaining sequential and parallel performance. We demonstrate the functionality and performance of MaMiCo by coupling the spatially adaptive Lattice Boltzmann framework waLBerla with four molecular dynamics (MD) codes: the light-weight Lennard-Jones-based implementation SimpleMD, the node-level optimized software ls1 mardyn, and the community codes ESPResSo and LAMMPS. We detail interface implementations to connect each solver with MaMiCo. The coupling for each waLBerla-MD setup is validated in three-dimensional channel flow simulations which are solved by means of a state-based coupling method. We provide sequential and strong scaling measurements for the four molecular-continuum simulations. The overhead of MaMiCo is found to come at 10%-20% of the total (MD) runtime. The measurements further show that scalability of the hybrid simulations is reached on up to 500 Intel SandyBridge, and more than 1000 AMD Bulldozer compute cores. Program summary: Program title: MaMiCo. Catalogue identifier: AEYW_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEYW_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: BSD License. No. of lines in distributed program, including test data, etc.: 67905. No. of bytes in distributed program, including test data, etc.: 1757334. Distribution format: tar.gz. Programming language: C, C++. Computer: Standard PCs, compute clusters. Operating system: Unix/Linux. RAM: Test cases consume ca. 30-50 MB. Classification: 7.7. External routines: Scons (http://www.scons.org), ESPResSo, LAMMPS, ls1 mardyn, waLBerla. Nature of problem: Coupled molecular-continuum simulation for multi-resolution fluid dynamics: parts of the domain are resolved by molecular dynamics whereas large parts are covered by a CFD solver, e.g. a lattice Boltzmann automaton

  10. Minimally invasive strabismus surgery versus paralimbal approach: A randomized, parallel design study is minimally invasive strabismus surgery worth the effort?

    Directory of Open Access Journals (Sweden)

    Richa Sharma

    2014-01-01

    Full Text Available Introduction: Minimal access surgery is common in all fields of medicine. We compared a new minimally invasive strabismus surgery (MISS) approach with a standard paralimbal strabismus surgery (SPSS) approach in terms of post-operative course. Materials and Methods: This parallel design study was done on 28 eyes of 14 patients, in which one eye was randomized to MISS and the other to SPSS. MISS was performed by giving two conjunctival incisions parallel to the horizontal rectus muscles and performing recession or resection below the conjunctival strip so obtained. We compared post-operative redness, congestion, chemosis, foreign body sensation (FBS), and drop intolerance (DI) on a graded scale of 0 to 3 on post-operative day 1, at 2-3 weeks, and at 6 weeks. In addition, all scores were added to obtain a total inflammatory score (TIS). Statistical Analysis: Inflammatory scores were analyzed using Wilcoxon's signed rank test. Results: On the first post-operative day, only FBS (P = 0.01) and TIS (P = 0.04) showed a significant difference favoring MISS. At 2-3 weeks, redness (P = 0.04), congestion (P = 0.04), FBS (P = 0.02), and TIS (P = 0.04) were significantly less in the MISS eye. At 6 weeks, only redness (P = 0.04) and TIS (P = 0.05) were significantly less. Conclusion: MISS is more comfortable in the immediate post-operative period and provides better cosmesis in the intermediate period.
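
    The paired-eye design above is exactly the setting for the Wilcoxon signed-rank test the authors cite. A bare-bones sketch of the statistic follows; the score values are invented for the example, zero differences are dropped, and tied ranks are averaged in the simple textbook way.

```python
def wilcoxon_w(x, y):
    """Wilcoxon signed-rank statistic W = min(W+, W-) for paired samples."""
    diffs = [a - b for a, b in zip(x, y) if a != b]   # drop zero differences
    ranked = sorted(diffs, key=abs)
    ranks, i = {}, 0
    while i < len(ranked):                            # average tied |diff| ranks
        j = i
        while j < len(ranked) and abs(ranked[j]) == abs(ranked[i]):
            j += 1
        avg = (i + 1 + j) / 2.0                       # mean of ranks i+1 .. j
        for k in range(i, j):
            ranks[k] = avg
        i = j
    w_plus = sum(ranks[k] for k, d in enumerate(ranked) if d > 0)
    w_minus = sum(ranks[k] for k, d in enumerate(ranked) if d < 0)
    return min(w_plus, w_minus)

# Hypothetical total inflammatory scores for 8 patients (MISS eye, SPSS eye)
miss = [2, 3, 1, 4, 2, 3, 2, 1]
spss = [4, 5, 2, 6, 2, 5, 4, 3]
print(wilcoxon_w(miss, spss))  # 0: every nonzero difference favors MISS
```

    In practice one would use a library routine (for example scipy.stats.wilcoxon) to obtain the P value as well; the sketch only shows how the rank-sum statistic behind it is formed.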

  11. Parallel computing works!

    CERN Document Server

    Fox, Geoffrey C; Messina, Guiseppe C

    2014-01-01

    A clear illustration of how parallel computers can be successfully applied to large-scale scientific computations. This book demonstrates how a variety of applications in physics, biology, mathematics and other sciences were implemented on real parallel computers to produce new scientific results. It investigates issues of fine-grained parallelism relevant for future supercomputers with particular emphasis on hypercube architecture. The authors describe how they used an experimental approach to configure different massively parallel machines, design and implement basic system software, and develop

  12. Global optimization of discrete truss topology design problems using a parallel cut-and-branch method

    DEFF Research Database (Denmark)

    Rasmussen, Marie-Louise Højlund; Stolpe, Mathias

    2008-01-01

    The subject of this article is solving discrete truss topology optimization problems with local stress and displacement constraints to global optimum. We consider a formulation based on the Simultaneous ANalysis and Design (SAND) approach. This intrinsically non-convex problem is reformulated ... the physics, and the cuts (Combinatorial Benders’ and projected Chvátal–Gomory) come from an understanding of the particular mathematical structure of the reformulation. The impact of a stronger representation is investigated on several truss topology optimization problems in two and three dimensions.

  13. Multichannel microformulators for massively parallel machine learning and automated design of biological experiments

    Science.gov (United States)

    Wikswo, John; Kolli, Aditya; Shankaran, Harish; Wagoner, Matthew; Mettetal, Jerome; Reiserer, Ronald; Gerken, Gregory; Britt, Clayton; Schaffer, David

    Genetic, proteomic, and metabolic networks describing biological signaling can have 10² to 10³ nodes. Transcriptomics and mass spectrometry can quantify 10⁴ different dynamical experimental variables recorded from in vitro experiments with a time resolution approaching 1 s. It is difficult to infer metabolic and signaling models from such massive data sets, and it is unlikely that causality can be determined simply from observed temporal correlations. There is a need to design and apply specific system perturbations, which will be difficult to perform manually with 10 to 10² externally controlled variables. Machine learning and optimal experimental design can select an experiment that best discriminates between multiple conflicting models, but a remaining problem is to control in real time multiple variables in the form of concentrations of growth factors, toxins, nutrients and other signaling molecules. With time-division multiplexing, a microfluidic MicroFormulator (μF) can create in real time complex mixtures of reagents in volumes suitable for biological experiments. Initial 96-channel μF implementations control the exposure profile of cells in a 96-well plate to different temporal profiles of drugs; future experiments will include challenge compounds. Funded in part by AstraZeneca, NIH/NCATS HHSN271201600009C and UH3TR000491, and VIIBRE.
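
    The time-division multiplexing idea above can be reduced to a simple budget: each reagent's valve is open for a fraction of the pump cycle, and that duty fraction times the stock concentration sets its level in the delivered mixture. The reagent names, stock concentrations, and duty fractions below are invented for illustration.

```python
stocks = {"growth_factor": 100.0, "toxin": 50.0, "nutrient": 200.0}  # uM stocks
duty = {"growth_factor": 0.10, "toxin": 0.02, "nutrient": 0.30}      # open fraction

# The remainder of each cycle delivers plain medium, diluting the mixture.
diluent_fraction = 1.0 - sum(duty.values())
assert diluent_fraction >= 0.0, "duty fractions must fit within one cycle"

# Delivered concentration of each reagent = stock concentration * duty fraction
mixture = {name: stocks[name] * duty[name] for name in stocks}
print(mixture, round(diluent_fraction, 2))
```

    Sweeping the duty fractions over time is what turns a single shared pump line into an arbitrary temporal exposure profile per well.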

  14. Parallel computation

    International Nuclear Information System (INIS)

    Jejcic, A.; Maillard, J.; Maurel, G.; Silva, J.; Wolff-Bacha, F.

    1997-01-01

    Work in the field of parallel processing has developed through research activities using several numerical Monte Carlo simulations related to current basic and applied problems of nuclear and particle physics. For applications using the GEANT code, development and improvement work was done on the parts simulating low-energy physical phenomena like radiation, transport and interaction. The problem of actinide burning by means of accelerators was approached using a simulation with the GEANT code. A program for neutron tracking in the range of low energies up to the thermal region has been developed. It is coupled to the GEANT code and permits, in a single pass, the simulation of a hybrid reactor core receiving a proton burst. Other works in this field refer to simulations for nuclear medicine applications, for instance, the development of biological probes, the evaluation and characterization of gamma cameras (collimators, crystal thickness), as well as methods for dosimetric calculations. In particular, these calculations are suited to a geometrical parallelization approach especially adapted to parallel machines of the TN310 type. Other works in the same field refer to the simulation of electron channelling in crystals and the simulation of the beam-beam interaction effect in colliders. The GEANT code was also used to simulate the operation of germanium detectors designed for natural and artificial radioactivity monitoring of the environment

  15. Implementing Clinical Research Using Factorial Designs: A Primer.

    Science.gov (United States)

    Baker, Timothy B; Smith, Stevens S; Bolt, Daniel M; Loh, Wei-Yin; Mermelstein, Robin; Fiore, Michael C; Piper, Megan E; Collins, Linda M

    2017-07-01

    Factorial experiments have rarely been used in the development or evaluation of clinical interventions. However, factorial designs offer advantages over randomized controlled trial designs, the latter being much more frequently used in such research. Factorial designs are highly efficient (permitting evaluation of multiple intervention components with good statistical power) and present the opportunity to detect interactions among intervention components. Such advantages have led methodologists to advocate for the greater use of factorial designs in research on clinical interventions (Collins, Dziak, & Li, 2009). However, researchers considering the use of such designs in clinical research face a series of choices that have consequential implications for the interpretability and value of the experimental results. These choices include: whether to use a factorial design, selection of the number and type of factors to include, how to address the compatibility of the different factors included, whether and how to avoid confounds between the type and number of interventions a participant receives, and how to interpret interactions. The use of factorial designs in clinical intervention research poses choices that differ from those typically considered in randomized clinical trial designs. However, the greater information yield of the former encourages clinical researchers to execute such designs more often, and with care. Copyright © 2017. Published by Elsevier Ltd.
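
    A small worked example makes the efficiency claim above concrete: one 2x2 factorial dataset yields both main effects and the interaction. The cell means below are invented for illustration.

```python
# Mean outcome by (factor A on?, factor B on?) for a hypothetical 2x2 trial
cell_mean = {(0, 0): 10.0, (1, 0): 14.0, (0, 1): 13.0, (1, 1): 20.0}

# Main effect of A: average over B levels of (A on) minus (A off)
main_a = (cell_mean[1, 0] + cell_mean[1, 1]) / 2 \
       - (cell_mean[0, 0] + cell_mean[0, 1]) / 2
# Main effect of B, averaged over A levels
main_b = (cell_mean[0, 1] + cell_mean[1, 1]) / 2 \
       - (cell_mean[0, 0] + cell_mean[1, 0]) / 2
# Interaction: does A's effect change when B is on?
interaction = (cell_mean[1, 1] - cell_mean[0, 1]) \
            - (cell_mean[1, 0] - cell_mean[0, 0])

print(main_a, main_b, interaction)  # 5.5 4.5 3.0
```

    Every participant contributes to both main-effect contrasts, which is why a factorial trial evaluates two components with roughly the sample size a two-arm trial spends on one.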

  16. Intranasal Midazolam versus Rectal Diazepam for the Management of Canine Status Epilepticus: A Multicenter Randomized Parallel-Group Clinical Trial.

    Science.gov (United States)

    Charalambous, M; Bhatti, S F M; Van Ham, L; Platt, S; Jeffery, N D; Tipold, A; Siedenburg, J; Volk, H A; Hasegawa, D; Gallucci, A; Gandini, G; Musteata, M; Ives, E; Vanhaesebrouck, A E

    2017-07-01

    Intranasal administration of benzodiazepines has shown superiority over rectal administration for terminating emergency epileptic seizures in human trials. No such clinical trials have been performed in dogs. To evaluate the clinical efficacy of intranasal midazolam (IN-MDZ), via a mucosal atomization device, as a first-line management option for canine status epilepticus and compare it to rectal administration of diazepam (R-DZP) for controlling status epilepticus before intravenous access is available. Client-owned dogs with idiopathic or structural epilepsy manifesting status epilepticus within a hospital environment were used. In a randomized parallel-group clinical trial, dogs were randomly allocated to treatment with IN-MDZ (n = 20) or R-DZP (n = 15). Seizure cessation time and adverse effects were recorded. For each dog, treatment was considered successful if the seizure ceased within 5 minutes and did not recur within 10 minutes after administration. The 95% confidence interval was used to estimate the true proportion of dogs successfully treated. Fisher's 2-tailed exact test was used to compare the 2 groups, and the results were considered statistically significant if P < .05. IN-MDZ and R-DZP terminated status epilepticus in 70% (14/20) and 20% (3/15) of cases, respectively (P = .0059). All dogs showed sedation and ataxia. IN-MDZ is a quick, safe and effective first-line medication for controlling status epilepticus in dogs and appears superior to R-DZP. IN-MDZ might be a valuable treatment option when intravenous access is not available and for treatment of status epilepticus in dogs at home. Copyright © 2017 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
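    The reported comparison can be reproduced directly from the counts given in the abstract. A minimal sketch using SciPy's implementation of Fisher's exact test:

```python
from scipy.stats import fisher_exact

# 2x2 table from the reported results:
# IN-MDZ: 14 successes / 6 failures; R-DZP: 3 successes / 12 failures.
table = [[14, 6], [3, 12]]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(round(p_value, 4))  # → 0.0059, matching the reported P value
```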

  17. Parallel experimental design and multivariate analysis provides efficient screening of cell culture media supplements to improve biosimilar product quality.

    Science.gov (United States)

    Brühlmann, David; Sokolov, Michael; Butté, Alessandro; Sauer, Markus; Hemberger, Jürgen; Souquet, Jonathan; Broly, Hervé; Jordan, Martin

    2017-07-01

    Rational and high-throughput optimization of mammalian cell culture media has a great potential to modulate recombinant protein product quality. We present a process design method based on parallel design-of-experiment (DoE) of CHO fed-batch cultures in 96-deepwell plates to modulate monoclonal antibody (mAb) glycosylation using medium supplements. To reduce the risk of losing valuable information in an intricate joint screening, 17 compounds were separated into five different groups, considering their mode of biological action. The concentration ranges of the medium supplements were defined according to information encountered in the literature and in-house experience. The screening experiments produced wide glycosylation pattern ranges. Multivariate analysis including principal component analysis and decision trees was used to select the best performing glycosylation modulators. Subsequent D-optimal quadratic design with four factors (three promising compounds and temperature shift) in shake tubes confirmed the outcome of the selection process and provided a solid basis for sequential process development at a larger scale. The glycosylation profile with respect to the specifications for biosimilarity was greatly improved in shake tube experiments: 75% of the conditions were equally close or closer to the specifications for biosimilarity than the best 25% in 96-deepwell plates. Biotechnol. Bioeng. 2017;114: 1448-1458. © 2017 Wiley Periodicals, Inc.

  18. Adaptive design methods in clinical trials – a review

    Directory of Open Access Journals (Sweden)

    Chang Mark

    2008-05-01

    Full Text Available In recent years, the use of adaptive design methods in clinical research and development based on accrued data has become very popular due to its flexibility and efficiency. Based on the adaptations applied, adaptive designs can be classified into three categories: prospective, concurrent (ad hoc), and retrospective adaptive designs. An adaptive design allows modifications to be made to the trial and/or statistical procedures of ongoing clinical trials. However, there is a concern that the actual patient population after the adaptations could deviate from the originally targeted patient population, and consequently the overall type I error rate (the probability of erroneously claiming efficacy for an ineffective drug) may not be controlled. In addition, major adaptations of the trial and/or statistical procedures of ongoing trials may result in a totally different trial that is unable to address the scientific/medical questions the trial intended to answer. In this article, several commonly considered adaptive designs in clinical trials are reviewed. Impacts of ad hoc adaptations (protocol amendments), challenges of by-design (prospective) adaptations, and obstacles of retrospective adaptations are described. Strategies for the use of adaptive design in the clinical development of rare diseases are discussed. Some examples concerning the development of Velcade, intended for multiple myeloma and non-Hodgkin's lymphoma, are given. Practical issues that are commonly encountered when implementing adaptive design methods in clinical trials are also discussed.
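    The type I error concern mentioned above is easy to demonstrate for the simplest adaptation, an unadjusted interim look. In the sketch below (a generic illustration, not an analysis from the article), the interim and final z-statistics under the null hypothesis are correlated through the shared first half of the data; testing both at the nominal 1.96 threshold inflates the overall error rate to roughly 0.08:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sim, z_crit = 200_000, 1.96

# Under H0, the interim and final z-statistics are standard normal with
# correlation sqrt(information fraction) = sqrt(0.5) for a halfway look.
z1 = rng.standard_normal(n_sim)                                      # interim z
z2 = np.sqrt(0.5) * z1 + np.sqrt(0.5) * rng.standard_normal(n_sim)   # final z

# Naive rule: declare significance if either look exceeds the nominal bound.
naive = np.mean((np.abs(z1) > z_crit) | (np.abs(z2) > z_crit))
print(round(naive, 3))  # roughly 0.08: well above the nominal 0.05
```

    Group sequential methods (e.g., O'Brien-Fleming or Pocock boundaries) exist precisely to restore overall type I error control in such designs.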

  19. Implications of Clinical Trial Design on Sample Size Requirements

    OpenAIRE

    Leon, Andrew C.

    2008-01-01

    The primary goal in designing a randomized controlled clinical trial (RCT) is to minimize bias in the estimate of treatment effect. Randomized group assignment, double-blinded assessments, and control or comparison groups reduce the risk of bias. The design must also provide sufficient statistical power to detect a clinically meaningful treatment effect and maintain a nominal level of type I error. An attempt to integrate neurocognitive science into an RCT poses additional challenges. Two par...
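    The power requirement described above translates into a standard sample-size formula. A minimal sketch using the normal approximation for a two-sided, two-sample comparison (the effect size of 0.5 is an illustrative choice, not from the article):

```python
from math import ceil
from scipy.stats import norm

def n_per_arm(d, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm for a two-sided,
    two-sample comparison with standardized effect size d."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return ceil(2 * ((z_a + z_b) / d) ** 2)

print(n_per_arm(0.5))  # → 63 per arm for a medium standardized effect
```

    The quadratic dependence on 1/d is why small anticipated effects drive sample sizes up so quickly: halving the effect size quadruples the required n.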

  20. Can emergency medicine research benefit from adaptive design clinical trials?

    Science.gov (United States)

    Flight, Laura; Julious, Steven A; Goodacre, Steve

    2017-04-01

    Adaptive design clinical trials use preplanned interim analyses to determine whether studies should be stopped or modified before recruitment is complete. Emergency medicine trials are well suited to these designs as many have a short time to primary outcome relative to the length of recruitment. We hypothesised that the majority of published emergency medicine trials have the potential to use a simple adaptive trial design. We reviewed clinical trials published in three emergency medicine journals between January 2003 and December 2013. We determined the proportion that used an adaptive design as well as the proportion that could have used a simple adaptive design based on the time to primary outcome and length of recruitment. Only 19 of 188 trials included in the review were considered to have used an adaptive trial design. A total of 154/165 trials that were fixed in design had the potential to use an adaptive design. Currently, there seems to be limited uptake in the use of adaptive trial designs in emergency medicine despite their potential benefits to save time and resources. Failing to take advantage of adaptive designs could be costly to patients and research. It is recommended that where practical and logistical considerations allow, adaptive designs should be used for all emergency medicine clinical trials. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  1. The rare and undiagnosed diseases diagnostic service - application of massively parallel sequencing in a state-wide clinical service.

    Science.gov (United States)

    Baynam, Gareth; Pachter, Nicholas; McKenzie, Fiona; Townshend, Sharon; Slee, Jennie; Kiraly-Borri, Cathy; Vasudevan, Anand; Hawkins, Anne; Broley, Stephanie; Schofield, Lyn; Verhoef, Hedwig; Walker, Caroline E; Molster, Caron; Blackwell, Jenefer M; Jamieson, Sarra; Tang, Dave; Lassmann, Timo; Mina, Kym; Beilby, John; Davis, Mark; Laing, Nigel; Murphy, Lesley; Weeramanthri, Tarun; Dawkins, Hugh; Goldblatt, Jack

    2016-06-11

    The Rare and Undiagnosed Diseases Diagnostic Service (RUDDS) refers to a genomic diagnostic platform operating within the Western Australian Government clinical services delivered through Genetic Services of Western Australia (GSWA). GSWA has provided a state-wide service for clinical genetic care for 28 years and serves a population of 2.5 million people across a geographical area of 2.5 million km². Within this context, GSWA has established a clinically integrated genomic diagnostic platform in partnership with other public health system managers and service providers, including but not limited to the Office of Population Health Genomics and Diagnostic Genomics (PathWest Laboratories), and with executive-level support from the Department of Health. Herein we describe the components of this service that are most relevant to the heterogeneity of paediatric clinical genetic care. Briefly, the platform: i) offers multiple options, including non-genetic testing; monogenic and genomic (targeted in silico filtered and whole exome) analysis; and matchmaking; ii) is delivered in a patient-centric manner that is resonant with the patient journey, with multiple points for entry, exit and re-entry to allow people access to information they can use, when they want to receive it; iii) is synchronous with precision phenotyping methods; iv) captures new knowledge, including multiple expert review; v) is integrated with current translational genomic research activities and best practice; and vi) is designed for flexibility for interactive generation of, and integration with, clinical research for diagnostics, community engagement, policy and models of care. The RUDDS has been established as part of routine clinical genetic services and is thus sustainable, equitably managed and seeks to translate new knowledge into efficient diagnostics and improved health for the whole community.

  2. Design of clinical trials Phase I and II with radiopharmaceuticals

    International Nuclear Information System (INIS)

    Giannone, C.A.; Soroa, V.E.

    2015-01-01

    We present some common designs for Phase I and Phase II clinical studies. For Phase I we consider the classic 3 + 3 design, accelerated-titration designs, and dose-escalation schemes with overdose control (EWOC). For Phase II, designs with efficacy outcomes are presented. The design proposed by Fleming is discussed, as well as two-stage designs with inclusion of patients in two stages: Gehan's design and Simon's optimal two-stage design. We also discuss the Bryant and Day design with combined efficacy and safety endpoints, with an application example of therapeutic Lu-177. Finally, some proposals for Phase II trials with a control group are considered. (authors) [es
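    The operating characteristics of Simon's optimal two-stage design can be checked directly from binomial probabilities. The sketch below evaluates the published design for p0 = 0.2, p1 = 0.4, α = 0.05, β = 0.2 (an illustrative parameter choice, not an example from the article):

```python
from scipy.stats import binom

# Simon's (1989) optimal two-stage design for p0=0.2, p1=0.4, alpha=0.05,
# beta=0.2: stop for futility if <= 3/13 responses at stage 1; reject H0
# only if > 12/43 responses overall (so stage 2 adds n2 = 30 patients).
p0, p1 = 0.2, 0.4
n1, r1, n, r = 13, 3, 43, 12
n2 = n - n1

def reject_prob(p):
    # Continue past stage 1 with x1 > r1 responses, then need more than
    # r - x1 responses among the n2 stage-2 patients.
    return sum(binom.pmf(x1, n1, p) * binom.sf(r - x1, n2, p)
               for x1 in range(r1 + 1, n1 + 1))

pet0 = binom.cdf(r1, n1, p0)        # early-termination probability under H0
en0 = n1 + n2 * (1 - pet0)          # expected sample size under H0
print(round(reject_prob(p0), 3), round(reject_prob(p1), 3), round(en0, 1))
```

    The "optimal" label refers to minimizing the expected sample size under H0 (here about 20.6 patients) subject to the error-rate constraints, which is what makes the design attractive when the drug is likely inactive.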

  3. System design and energetic characterization of a four-wheel-driven series–parallel hybrid electric powertrain for heavy-duty applications

    International Nuclear Information System (INIS)

    Wang, Enhua; Guo, Di; Yang, Fuyuan

    2015-01-01

    Highlights: • A novel four-wheel-driven series–parallel hybrid powertrain is proposed. • A system model and a rule-based control strategy are designed. • Energetic performance is compared to a rear-wheel-driven hybrid powertrain. • Less torsional oscillation and more robust regenerative braking are achieved. - Abstract: Powertrain topology design is vital for system performance of a hybrid electric vehicle. In this paper, a novel four-wheel-driven series–parallel hybrid electric powertrain is proposed. A motor is connected to the differential of the rear axle. An auxiliary power unit is linked to the differential of the front axle via a clutch. First, a mathematical model was established to evaluate the fuel-saving potential. A rule-based energy management algorithm was subsequently designed, and its working parameters were optimized. The hybrid powertrain system was applied to a transit bus, and the system characteristics were analyzed. Compared to an existing coaxial power-split hybrid powertrain, the fuel economy of the four-wheel-driven series–parallel hybrid powertrain can be at the same level under normal road conditions. However, the proposed four-wheel-driven series–parallel hybrid powertrain can recover braking energy more efficiently under road conditions with a low adhesive coefficient and can alleviate the torsional oscillation occurring at the existing coaxial power-split hybrid powertrain. Therefore, the four-wheel-driven series–parallel hybrid powertrain is a good solution for transit buses toward more robust performance.

  4. Design and implementation of a novel modal space active force control concept for spatial multi-DOF parallel robotic manipulators actuated by electrical actuators.

    Science.gov (United States)

    Yang, Chifu; Zhao, Jinsong; Li, Liyi; Agrawal, Sunil K

    2018-01-01

    A robotic spine brace based on a parallel-actuated robotic system is a new device for the treatment and sensing of scoliosis; however, the strong dynamic coupling and anisotropy of parallel manipulators cause accuracy loss in rehabilitation force control, including large errors in both the direction and magnitude of the force. A novel active force control strategy named modal space force control is proposed to solve these problems. Considering the electrically driven system and the contact environment, a mathematical model of the spatial parallel manipulator is built. The strong dynamic coupling in the force field is characterized experimentally, as is the anisotropy of the workspace of parallel manipulators. The effects of dynamic coupling on control design and performance are discussed, and the influence of anisotropy on accuracy is also addressed. With the mass/inertia matrix and stiffness matrix of the parallel manipulator, a modal matrix can be calculated by eigenvalue decomposition. Exploiting the orthogonality of the modal matrix with respect to the mass matrix, the strongly coupled dynamic equations expressed in the work space or joint space of the parallel manipulator can be transformed into decoupled equations formulated in modal space. By this property, each force control channel is independent of the others in the modal space; thus we propose the modal space force control concept, in which the force controller is designed in modal space. A modal space active force control is designed and implemented with only a simple PID controller as an example control method to show the differences, uniqueness, and benefits of modal space force control. Simulation and experimental results show that the proposed modal space force control concept can effectively overcome the effects of the strong dynamic coupling and anisotropy in the physical space; modal space force control is thus a very useful control framework, which is better than the current joint
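    The decoupling property at the heart of modal space control can be sketched numerically: solving the generalized eigenproblem for a mass/stiffness pair yields a modal matrix that diagonalizes both matrices at once, so each modal channel can be controlled independently. The 3-DOF matrices below are hypothetical placeholders, not the robot's actual model:

```python
import numpy as np
from scipy.linalg import eigh

# Toy symmetric positive-definite mass and stiffness matrices with
# off-diagonal coupling terms (hypothetical values).
M = np.array([[2.0, 0.3, 0.1],
              [0.3, 1.5, 0.2],
              [0.1, 0.2, 1.0]])
K = np.array([[50.0, -10.0, -5.0],
              [-10.0, 40.0, -8.0],
              [-5.0, -8.0, 30.0]])

# Generalized eigenproblem K v = w M v; eigh normalizes the eigenvectors
# so that Phi.T @ M @ Phi = I (mass-orthonormal modal matrix).
w, Phi = eigh(K, M)

# In modal coordinates both matrices are diagonal: the coupled equations
# of motion split into independent single-DOF channels.
M_modal = Phi.T @ M @ Phi
K_modal = Phi.T @ K @ Phi
print(np.round(np.diag(K_modal), 2))  # the modal stiffnesses (eigenvalues w)
```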

  5. Prediabetes in pregnancy, can early intervention improve outcomes? A feasibility study for a parallel randomised clinical trial.

    Science.gov (United States)

    Hughes, Ruth C E; Rowan, Janet; Williman, Jonathan

    2018-03-03

    Measurement of glycated haemoglobin (HbA1c) in early pregnancy is routine in New Zealand to identify women with diabetes and prediabetes. However, the benefit of early intervention in women with prediabetes is inconclusive. Our aim was to test the feasibility of a two-arm parallel randomised controlled trial of standard care versus early intervention in pregnancies complicated by prediabetes. Two tertiary referral centres in New Zealand. Women without pre-existing diabetes, identified by an HbA1c measured at booking. Randomisation was done by remote web-based allocation into one of two groups. Women in the early intervention group attended an antenatal diabetes clinic, commenced daily home blood glucose monitoring, and medication was prescribed if lifestyle measures failed to maintain target blood glucose levels. Controls received lifestyle education, continued standard care with their midwife and/or obstetrician, and were asked to perform a 75 g oral glucose tolerance test at 24 weeks' gestation with a referral to clinic if this test was positive. Both groups received lifestyle questionnaires at recruitment and in late pregnancy. Outcomes were recruitment rate, adherence to protocol and validation of potential primary outcomes. Recruitment rates were lower than expected, especially in Māori and Pacific women. Non-adherence to the allocated treatment protocol was significant: 42% (95% CI 24% to 61%) in the early intervention group and 30% (95% CI 16% to 51%) in controls. Caesarean section and pre-eclampsia were signalled as potential primary outcomes, due to both the high observed incidence in the control group and ease of measurement. For a future definitive trial, extending the gestation of eligibility and stepped-wedge cluster randomisation may overcome the identified feasibility issues. 
Consistent with published observational data, pre-eclampsia and emergency caesarean section could be included as primary outcome measures, both of which have a significant impact on maternal and neonatal morbidity and

  6. Novel, Highly-Parallel Software for the Online Storage System of the ATLAS Experiment at CERN: Design and Performances

    CERN Document Server

    Colombo, T; The ATLAS collaboration

    2012-01-01

    The ATLAS experiment observes proton-proton collisions delivered by the LHC accelerator at CERN. The ATLAS Trigger and Data Acquisition (TDAQ) system selects interesting events on-line in a three-level trigger system in order to store them at a budgeted rate of several hundred Hz, for an average event size of ~1.5 MB. This paper focuses on the TDAQ data-logging system and in particular on the implementation and performance of a novel SW design, reporting on the effort of exploiting the full power of recently installed multi-core hardware. In this respect, the main challenge presented by the data-logging workload is the conflict between the largely parallel nature of the event processing, including the recently introduced on-line event-compression, and the constraint of sequential file writing and checksum evaluation. This is further complicated by the necessity of operating in a fully data-driven mode, to cope with continuously evolving trigger and detector configurations. In this paper we report on the desig...

  7. Novel, Highly-Parallel Software for the Online Storage System of the ATLAS Experiment at CERN: Design and Performances

    CERN Document Server

    Colombo, T; The ATLAS collaboration

    2012-01-01

    The ATLAS experiment observes proton-proton collisions delivered by the LHC accelerator at CERN. The ATLAS Trigger and Data Acquisition (TDAQ) system selects interesting events on-line in a three-level trigger system in order to store them at a budgeted rate of several hundred Hz, for an average event size of ~1.5 MB. This paper focuses on the TDAQ data-logging system and in particular on the implementation and performance of a novel software design, reporting on the effort of exploiting the full power of multi-core hardware. In this respect, the main challenge presented by the data-logging workload is the conflict between the largely parallel nature of the event processing, including the recently introduced on-line event-compression, and the constraint of sequential file writing and checksum evaluation. This is further complicated by the necessity of operating in a fully data-driven mode, to cope with continuously evolving trigger and detector configurations. In this paper we will briefly discuss...

  8. Map-Based Power-Split Strategy Design with Predictive Performance Optimization for Parallel Hybrid Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Jixiang Fan

    2015-09-01

    Full Text Available In this paper, a map-based optimal energy management strategy is proposed to improve the energy consumption economy of a plug-in parallel hybrid electric vehicle. In the design of the maps, which provide both the torque split between engine and motor and the gear shift, not only the current vehicle speed and power demand but also the optimality based on the predicted trajectory of the vehicle dynamics are considered. To seek this optimality, the equivalent consumption, which trades off the fuel and electricity usage, is chosen as the cost function. Moreover, in order to decrease model errors in the optimization conducted in the discrete time domain, a variational integrator is employed to calculate the evolution of the vehicle dynamics. To evaluate the proposed energy management strategy, simulation results performed on a professional GT-Suit simulator are demonstrated, and a comparison to a real-time optimization method is also given to show the advantage of the proposed off-line optimization approach.

  9. A Design of a New Column-Parallel Analog-to-Digital Converter Flash for Monolithic Active Pixel Sensor.

    Science.gov (United States)

    Chakir, Mostafa; Akhamal, Hicham; Qjidaa, Hassan

    2017-01-01

    The CMOS Monolithic Active Pixel Sensor (MAPS) for the International Linear Collider (ILC) vertex detector (VXD) places stringent requirements on its analog readout electronics, specifically on the analog-to-digital converter (ADC). This paper concerns designing and optimizing a new architecture of a low-power, high-speed, and small-area 4-bit column-parallel flash ADC. Later in this study, we propose to interpose an S/H block in the converter. This integration of the S/H block increases the sensitivity of the converter to very small amplitudes of the input signal from the sensor and provides sufficient time for the converter to code the input signal. This ADC is developed in a 0.18 μm CMOS process with a pixel pitch of 35 μm. The proposed ADC meets the power dissipation, size, and speed constraints of the MAPS, composed of a matrix of 64 rows and 48 columns, where each column ADC covers a small area of 35 × 336.76 μm². The proposed ADC consumes low power at a 1.8 V supply and 100 MS/s sampling rate with a dynamic range of 125 mV. Its DNL and INL are 0.0812/-0.0787 LSB and 0.0811/-0.0787 LSB, respectively. Furthermore, this ADC achieves a high speed of more than 5 GHz.
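    The conversion principle of a flash ADC is simple enough to model behaviorally: the input is compared against all 2^N − 1 ladder thresholds in parallel, and the output code is the number of comparators that trip (a thermometer code converted to binary). A sketch in which the 125 mV full scale follows the abstract's dynamic range and everything else is an assumption:

```python
import numpy as np

def flash_adc_4bit(v_in, v_ref=0.125):
    """Behavioral model of a 4-bit flash ADC with an assumed 125 mV
    full-scale range. The 15 resistor-ladder thresholds are compared
    against the input simultaneously; the output code is the count of
    comparators whose threshold the input meets or exceeds."""
    thresholds = (np.arange(1, 16) / 16) * v_ref
    return int(np.sum(v_in >= thresholds))

# Quantize a few input voltages (volts): zero, mid-scale, near full scale.
print([flash_adc_4bit(v) for v in (0.0, 0.0625, 0.124)])  # → [0, 8, 15]
```

    Because all comparisons happen in one step, a flash converter's latency is a single comparator delay plus encoding, which is what buys the speed at the cost of 2^N − 1 comparators of area and power.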

  10. Characterization and Simulation of a New Design Parallel-Plate Ionization Chamber for CT Dosimetry at Calibration Laboratories

    Science.gov (United States)

    Perini, Ana P.; Neves, Lucio P.; Maia, Ana F.; Caldas, Linda V. E.

    2013-12-01

    In this work, a new extended-length parallel-plate ionization chamber was tested in the standard radiation qualities for computed tomography established according to the half-value layers defined at the IEC 61267 standard, at the Calibration Laboratory of the Instituto de Pesquisas Energéticas e Nucleares (IPEN). The experimental characterization was made following the IEC 61674 standard recommendations. The experimental results obtained with the ionization chamber studied in this work were compared to those obtained with a commercial pencil ionization chamber, showing a good agreement. With the use of the PENELOPE Monte Carlo code, simulations were undertaken to evaluate the influence of the cables, insulator, PMMA body, collecting electrode, guard ring, screws, as well as different materials and geometrical arrangements, on the energy deposited on the ionization chamber sensitive volume. The maximum influence observed was 13.3% for the collecting electrode, and regarding the use of different materials and design, the substitutions showed that the original project presented the most suitable configuration. The experimental and simulated results obtained in this work show that this ionization chamber has appropriate characteristics to be used at calibration laboratories, for dosimetry in standard computed tomography and diagnostic radiology quality beams.

  11. Design and Validation of Real-Time Optimal Control with ECMS to Minimize Energy Consumption for Parallel Hybrid Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Aiyun Gao

    2017-01-01

    Full Text Available A real-time optimal control of parallel hybrid electric vehicles (PHEVs) with the equivalent consumption minimization strategy (ECMS) is presented in this paper, whose purpose is to minimize the total equivalent fuel consumption while maintaining the battery state of charge (SOC) within its operating range at all times. Vehicle and assembly models of PHEVs are established, which provide the foundation for the following calculations. The ECMS is described in detail, and an instantaneous cost function including the fuel energy and the electrical energy is proposed, with emphasis on the computation of the equivalence factor. The real-time optimal control strategy is designed by taking the minimum of the total equivalent fuel consumption as the control objective and the torque split factor as the control variable. The proposed control strategy is validated both in the MATLAB/Simulink/Advisor environment and under actual transportation conditions by comparing the fuel economy, the charge sustainability, and component performance with three other control strategies under different driving cycles, including standard, actual, and real-time road conditions. Through numerical simulations and real vehicle tests, the accuracy of the approach used for the evaluation of the equivalence factor is confirmed, and the potential of the proposed control strategy in terms of fuel economy and keeping the deviations of SOC at a low level is illustrated.
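    The core of ECMS is an instantaneous minimization: at each time step, choose the torque split that minimizes fuel power plus the equivalence factor times battery power. A minimal sketch with made-up engine and motor models (the efficiencies, the quadratic fuel term, and the equivalence factor are all assumptions for illustration, not the paper's models):

```python
import numpy as np

def ecms_split(T_dem, omega, s=3.0, n=1001):
    """Pick the engine fraction u of the demanded power that minimizes the
    equivalent consumption. T_dem [Nm], omega [rad/s]; s is the assumed
    equivalence factor pricing electrical energy in fuel-energy units."""
    P_dem = T_dem * omega
    u = np.linspace(0.0, 1.0, n)          # engine share of demanded power
    P_eng = u * P_dem
    P_mot = (1.0 - u) * P_dem
    # Willans-style fuel model: linear term plus a quadratic high-load penalty.
    fuel_power = P_eng / 0.35 + 2e-5 * P_eng**2
    batt_power = P_mot / 0.85             # electrical path, assumed 85% efficient
    cost = fuel_power + s * batt_power    # instantaneous equivalent consumption
    return u[np.argmin(cost)]

split = ecms_split(100.0, 200.0)
print(round(split, 2))  # interior optimum: engine supplies most, motor assists
```

    In a full ECMS, the equivalence factor s is adapted on-line to keep the SOC within its operating range, which is how charge sustainability is enforced.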

  12. 8051 microcontroller to FPGA and ADC interface design for high speed parallel processing systems – Application in ultrasound scanners

    Directory of Open Access Journals (Sweden)

    J. Jean Rossario Raj

    2016-09-01

    Full Text Available Microcontrollers perform the hardware control in many instruments. Instruments requiring high data throughput and parallel computing use FPGAs for data processing. The microcontroller in turn configures the application hardware devices such as FPGAs, ADCs, and Ethernet chips. These devices are interfaced through an address/data bus, a serial interface, or the serial peripheral interface (SPI). The choice of interface depends upon the input/output pins available on the different devices, programming ease, and proprietary interfaces supported by devices such as ADCs. The novelty of this paper is to describe the programming logic used for the various types of interface scenarios from a microcontroller to different programmable devices. The study describes the methods and logic flowcharts for the different interfaces. The interface logic was implemented in prototype hardware for an ultrasound scanner. The internal devices were controlled from a graphical user interface on a laptop, and scan results were acquired. It is seen that an optimal hardware design can be achieved by using a common serial interface to all the devices.
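    The bit-level logic of one such interface, a bit-banged SPI write, can be sketched and tested off-target. In the sketch below, a fake bus object stands in for the port pins; on a real microcontroller the two methods would toggle GPIO lines (the class and method names are illustrative, not from the paper):

```python
class FakeBus:
    """Stand-in for MOSI/SCLK port pins, recording what a peripheral
    (e.g., an FPGA) would sample on each rising clock edge."""
    def __init__(self):
        self.mosi = 0
        self.captured = []
    def set_mosi(self, bit):
        self.mosi = bit
    def pulse_sclk(self):
        # Peripheral samples MOSI on the rising edge (SPI mode 0 behavior).
        self.captured.append(self.mosi)

def spi_write_byte(bus, byte):
    """Shift one byte out MSB first, per the usual SPI convention."""
    for i in range(7, -1, -1):
        bus.set_mosi((byte >> i) & 1)
        bus.pulse_sclk()

bus = FakeBus()
spi_write_byte(bus, 0xA5)
print(bus.captured)  # → [1, 0, 1, 0, 0, 1, 0, 1], the bits of 0xA5 MSB first
```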

  13. Applying Probabilistic Decision Models to Clinical Trial Design

    Science.gov (United States)

    Smith, Wade P; Phillips, Mark H

    2018-01-01

    Clinical trial design most often focuses on a single or several related outcomes with corresponding calculations of statistical power. We consider a clinical trial to be a decision problem, often with competing outcomes. Using a current controversy in the treatment of HPV-positive head and neck cancer, we apply several different probabilistic methods to help define the range of outcomes given different possible trial designs. Our model incorporates the uncertainties in the disease process and treatment response and the inhomogeneities in the patient population. Instead of expected utility, we have used a Markov model to calculate quality adjusted life expectancy as a maximization objective. Monte Carlo simulations over realistic ranges of parameters are used to explore different trial scenarios given the possible ranges of parameters. This modeling approach can be used to better inform the initial trial design so that it will more likely achieve clinical relevance.
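    The modeling approach described, a Markov model evaluated by Monte Carlo with quality-adjusted life expectancy as the objective, can be sketched in a few lines. All states, transition probabilities, and utilities below are hypothetical placeholders, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy 3-state yearly Markov model: well -> progressed -> dead, with
# quality-of-life utilities accrued per year spent in each state.
P = np.array([[0.85, 0.10, 0.05],
              [0.00, 0.80, 0.20],
              [0.00, 0.00, 1.00]])
utility = np.array([0.9, 0.6, 0.0])

def simulate_qale(n_patients=5_000, horizon=30):
    """Monte Carlo estimate of quality-adjusted life expectancy (QALE)."""
    total = 0.0
    for _ in range(n_patients):
        state = 0
        for _ in range(horizon):
            if state == 2:          # absorbing death state
                break
            total += utility[state]
            state = rng.choice(3, p=P[state])
    return total / n_patients

q = simulate_qale()
print(round(q, 1))  # mean QALYs per patient over a 30-year horizon
```

    In a trial-design application, the transition probabilities and utilities would themselves be drawn from distributions reflecting parameter uncertainty and patient heterogeneity, and competing trial arms would be compared on the resulting QALE distributions.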

  14. Big Data in Designing Clinical Trials: Opportunities and Challenges.

    Science.gov (United States)

    Mayo, Charles S; Matuszak, Martha M; Schipper, Matthew J; Jolly, Shruti; Hayman, James A; Ten Haken, Randall K

    2017-01-01

    Emergence of big data analytics resource systems (BDARSs) as a part of routine practice in Radiation Oncology is on the horizon. Gradually, individual researchers, vendors, and professional societies are leading initiatives to create and demonstrate use of automated systems. What are the implications for the design of clinical trials as these systems emerge? Gold-standard randomized controlled trials (RCTs) have high internal validity for the patients and settings fitting the constraints of the trial, but also have limitations, including: reproducibility, generalizability to routine practice, infrequent external validation, selection bias, characterization of confounding factors, ethics, and use for rare events. BDARSs present opportunities to augment and extend RCTs. Preliminary modeling using single- and multi-institutional BDARSs may lead to better design and less cost. Standardizations in data elements, clinical processes, and nomenclatures, used to decrease variability and increase the veracity needed for automation and multi-institutional data pooling in BDARSs, also support the ability to add clinical validation phases to clinical trial design and increase participation. However, volume and variety in BDARSs present other technical, policy, and conceptual challenges, including applicable statistical concepts and cloud-based technologies. In this summary, we will examine both the opportunities and the challenges for the use of big data in the design of clinical trials.

  15. Big Data in Designing Clinical Trials: Opportunities and Challenges

    Directory of Open Access Journals (Sweden)

    Charles S. Mayo

    2017-08-01

    Full Text Available Emergence of big data analytics resource systems (BDARSs) as a part of routine practice in Radiation Oncology is on the horizon. Gradually, individual researchers, vendors, and professional societies are leading initiatives to create and demonstrate use of automated systems. What are the implications for the design of clinical trials as these systems emerge? Gold-standard randomized controlled trials (RCTs) have high internal validity for the patients and settings fitting the constraints of the trial, but also have limitations, including: reproducibility, generalizability to routine practice, infrequent external validation, selection bias, characterization of confounding factors, ethics, and use for rare events. BDARSs present opportunities to augment and extend RCTs. Preliminary modeling using single- and multi-institutional BDARSs may lead to better design and less cost. Standardizations in data elements, clinical processes, and nomenclatures, used to decrease variability and increase the veracity needed for automation and multi-institutional data pooling in BDARSs, also support the ability to add clinical validation phases to clinical trial design and increase participation. However, volume and variety in BDARSs present other technical, policy, and conceptual challenges, including applicable statistical concepts and cloud-based technologies. In this summary, we will examine both the opportunities and the challenges for the use of big data in the design of clinical trials.

  16. Parallel reservoir simulator computations

    International Nuclear Information System (INIS)

    Hemanth-Kumar, K.; Young, L.C.

    1995-01-01

    The adaptation of a reservoir simulator for parallel computations is described. The simulator was originally designed for vector processors. It performs approximately 99% of its calculations in vector/parallel mode and, relative to scalar calculations, it achieves speedups of 65 and 81 for black oil and EOS simulations, respectively, on the CRAY C-90.
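
The quoted 99% vector/parallel fraction is what bounds speedups of this kind; Amdahl's law makes the relationship concrete. A small sketch (illustrative arithmetic only, not taken from the paper):

```python
def amdahl_speedup(parallel_fraction, n_workers):
    """Amdahl's law: overall speedup when a given fraction of the
    work runs in parallel on n_workers and the rest stays serial."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_workers)

# With 99% of the work parallelized (the fraction reported above),
# the attainable speedup is capped at 100x no matter how many
# processors or vector lanes are used.
print(round(amdahl_speedup(0.99, 64), 1))       # modest machine
print(round(amdahl_speedup(0.99, 10**9), 1))    # asymptotic cap
```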

  17. A Design of a New Column-Parallel Analog-to-Digital Converter Flash for Monolithic Active Pixel Sensor

    Directory of Open Access Journals (Sweden)

    Mostafa Chakir

    2017-01-01

    Full Text Available The CMOS Monolithic Active Pixel Sensor (MAPS) for the International Linear Collider (ILC) vertex detector (VXD) imposes stringent requirements on its analog readout electronics, specifically on the analog-to-digital converter (ADC). This paper concerns designing and optimizing a new architecture of a low-power, high-speed, and small-area 4-bit column-parallel flash ADC. Later in this study, we propose to interpose an S/H block in the converter. This integration of the S/H block increases the sensitivity of the converter to the very small amplitude of the input signal from the sensor and provides sufficient time for the converter to code the input signal. This ADC is developed in a 0.18 μm CMOS process with a pixel pitch of 35 μm. The proposed ADC meets the constraints on power dissipation, size, and speed for a MAPS composed of a matrix of 64 rows and 48 columns, where each column ADC covers a small area of 35 × 336.76 μm². The proposed ADC consumes low power at a 1.8 V supply and a 100 MS/s sampling rate with a dynamic range of 125 mV. Its DNL and INL are 0.0812/−0.0787 LSB and 0.0811/−0.0787 LSB, respectively. Furthermore, this ADC achieves a speed of more than 5 GHz.
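
DNL and INL figures like those quoted above are derived from the converter's measured code transition levels. A minimal endpoint-fit computation, as a hedged sketch (the authors' actual test procedure is not described in the abstract):

```python
import numpy as np

def dnl_inl(transitions):
    """DNL and INL in LSB from measured code transition levels.
    transitions[i] is the input level at which the output code
    switches from i to i+1; the ideal LSB is fit to the endpoints."""
    t = np.asarray(transitions, dtype=float)
    widths = np.diff(t)                     # measured width of each code
    lsb = (t[-1] - t[0]) / len(widths)      # endpoint-fit ideal LSB
    dnl = widths / lsb - 1.0                # per-code step deviation
    inl = np.cumsum(dnl)                    # accumulated deviation
    return dnl, inl

# A perfectly linear ramp of transition levels gives zero DNL and INL.
dnl, inl = dnl_inl(np.linspace(0.0, 0.125, 16))
print(float(np.abs(dnl).max()) < 1e-9, float(np.abs(inl).max()) < 1e-9)  # → True True
```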

  18. Parallelism in matrix computations

    CERN Document Server

    Gallopoulos, Efstratios; Sameh, Ahmed H

    2016-01-01

    This book is primarily intended as a research monograph that could also be used in graduate courses for the design of parallel algorithms in matrix computations. It assumes general but not extensive knowledge of numerical linear algebra, parallel architectures, and parallel programming paradigms. The book consists of four parts: (I) Basics; (II) Dense and Special Matrix Computations; (III) Sparse Matrix Computations; and (IV) Matrix functions and characteristics. Part I deals with parallel programming paradigms and fundamental kernels, including reordering schemes for sparse matrices. Part II is devoted to dense matrix computations such as parallel algorithms for solving linear systems, linear least squares, the symmetric algebraic eigenvalue problem, and the singular-value decomposition. It also deals with the development of parallel algorithms for special linear systems such as banded, Vandermonde, Toeplitz, and block Toeplitz systems. Part III addresses sparse matrix computations: (a) the development of pa...

  19. Design of Dimensional Model for Clinical Data Storage and Analysis

    Directory of Open Access Journals (Sweden)

    Dipankar SENGUPTA

    2013-06-01

    Full Text Available Current research in the field of Life and Medical Sciences is generating chunks of data on a daily basis. It has thus become necessary to find solutions for efficient storage of this data, trying to correlate it and extract knowledge from it. Clinical data generated in hospitals, clinics and diagnostic centers falls under a similar paradigm. Patients' records in various hospitals are increasing at an exponential rate, adding to the problem of data management and storage. The major problem with storage is the varied dimensionality of the data, ranging from images to numerical form. Therefore there is a need to develop an efficient data model which can handle this multi-dimensionality issue and store the data with its historical aspect. For this problem at the forefront of clinical informatics, we propose a clinical dimensional model design which can be used for the development of a clinical data mart. The model has been designed keeping in consideration the temporal storage of patients' data with respect to all possible clinical parameters, which can include both textual and image-based data. The availability of this data for each patient can then be used for the application of data mining techniques to find correlations of all the parameters at the level of the individual and the population.
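
A dimensional model of this kind is usually realized as a star schema: dimension tables for patient and time, and a fact table whose rows carry either numerical values or references to image data. The sketch below uses SQLite; the table and column names are invented for illustration, not taken from the proposed design:

```python
import sqlite3

# Minimal star schema for a clinical data mart: one fact table of
# observations keyed to patient and date dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_patient (patient_id INTEGER PRIMARY KEY, sex TEXT, birth_year INTEGER);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_observation (
    patient_id INTEGER REFERENCES dim_patient,
    date_id    INTEGER REFERENCES dim_date,
    parameter  TEXT,   -- e.g. 'systolic_bp', 'mri_scan'
    value_num  REAL,   -- numerical results
    value_ref  TEXT    -- pointer to image/text blobs stored elsewhere
);
""")
conn.execute("INSERT INTO dim_patient VALUES (1, 'F', 1970)")
conn.execute("INSERT INTO dim_date VALUES (10, '2013-06-01')")
conn.execute("INSERT INTO fact_observation VALUES (1, 10, 'systolic_bp', 128.0, NULL)")

# A typical mart query joins the fact table back to its dimensions.
row = conn.execute("""
    SELECT p.sex, d.iso_date, f.value_num
    FROM fact_observation f
    JOIN dim_patient p USING (patient_id)
    JOIN dim_date d USING (date_id)
""").fetchone()
print(row)  # → ('F', '2013-06-01', 128.0)
```

The temporal aspect described in the abstract comes from the date dimension: each new measurement is a new fact row, so history is kept rather than overwritten.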

  20. Design of clinical trials for therapeutic cancer vaccines development.

    Science.gov (United States)

    Mackiewicz, Jacek; Mackiewicz, Andrzej

    2009-12-25

    Advances in molecular and cellular biology as well as biotechnology have led to the definition of a group of drugs referred to as medicinal products of advanced technologies. It includes gene therapy products, somatic cell therapeutics and tissue engineering. Therapeutic cancer vaccines, including whole tumor cell vaccines or gene-modified whole cells, belong to the somatic therapeutics and/or gene therapy products category. Drug development is a multistep complex process. It comprises two phases: preclinical and clinical. Guidelines on preclinical testing of cell-based immunotherapy medicinal products have been defined by regulatory agencies and are available. However, clinical testing of therapeutic cancer vaccines is still under debate. This presents a serious problem since recently the clinical efficacy of a number of cancer vaccines has been demonstrated, which has attracted considerable public attention. In general, clinical testing in its current form is very expensive, time consuming and poorly designed, which may lead to overlooking products that are clinically beneficial for patients. Accordingly, regulatory authorities and researchers, including the Cancer Vaccine Clinical Trial Working Group, have proposed three regulatory solutions to facilitate clinical development of cancer vaccines: a cost-recovery program, conditional marketing authorization, and a new development paradigm. The paradigm includes a model in which cancer vaccines are investigated in two types of clinical trials: proof-of-principle and efficacy. The proof-of-principle trial objectives are: safety; dose selection and schedule of vaccination; and demonstration of proof-of-principle. Efficacy trials are randomized clinical trials with the objective of demonstrating clinical benefit either directly or through a surrogate. The clinical end points are still under debate.

  1. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed. Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn
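
Single tridiagonal systems, mentioned above, are a standard kernel in this setting. As a baseline for the vectorized and parallel variants (e.g. cyclic reduction), here is the serial Thomas algorithm; a generic sketch, not code from the book:

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system with sub-, main- and super-diagonals
    a, b, c (a[0] and c[-1] are unused) and right-hand side d.
    This serial O(n) recurrence is what parallel methods such as
    cyclic reduction restructure to expose concurrency."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Solves [[2, -1], [-1, 2]] x = [1, 1]; the exact solution is x = (1, 1).
print(thomas([0, -1], [2, 2], [-1, 0], [1, 1]))
```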

  2. Implications of clinical trial design on sample size requirements.

    Science.gov (United States)

    Leon, Andrew C

    2008-07-01

    The primary goal in designing a randomized controlled clinical trial (RCT) is to minimize bias in the estimate of treatment effect. Randomized group assignment, double-blinded assessments, and control or comparison groups reduce the risk of bias. The design must also provide sufficient statistical power to detect a clinically meaningful treatment effect and maintain a nominal level of type I error. An attempt to integrate neurocognitive science into an RCT poses additional challenges. Two particularly relevant aspects of such a design often receive insufficient attention in an RCT. Multiple outcomes inflate type I error, and an unreliable assessment process introduces bias and reduces statistical power. Here we describe how both unreliability and multiple outcomes can increase the study costs and duration and reduce the feasibility of the study. The objective of this article is to consider strategies that overcome the problems of unreliability and multiplicity.
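
The multiplicity problem described here is easy to quantify: with k independent outcomes each tested at level alpha, the chance of at least one false positive grows quickly. A short illustration (standard arithmetic, not taken from the article):

```python
def familywise_error(alpha, k):
    """Probability of at least one false positive across k independent
    outcome tests, each performed at per-test level alpha."""
    return 1.0 - (1.0 - alpha) ** k

# Five independent outcomes at alpha = 0.05 inflate the nominal
# 5% type I error rate to about 23%.
print(round(familywise_error(0.05, 5), 3))        # → 0.226
# A Bonferroni correction (alpha / k per test) restores roughly
# the nominal familywise level.
print(round(familywise_error(0.05 / 5, 5), 3))    # → 0.049
```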

  3. Placebo effect in clinical trial design for irritable bowel syndrome.

    Science.gov (United States)

    Shah, Eric; Pimentel, Mark

    2014-04-30

    Ongoing efforts to improve clinical trial design in irritable bowel syndrome have been hindered by high placebo response rates and ineffective outcome measures. We assessed established strategies to minimize placebo effect as well as the various approaches to placebo effect which can affect trial design. These include genetic markers such as catechol-O-methyltransferase, opioidergic and dopaminergic neurobiologic theory, pre-cebo effect centered on expectancy theory, and side effect unblinding grounded on conditioning theory. We reviewed endpoints used in the study of IBS over the past decade including adequate relief and subjective global relief, emphasizing their weaknesses in fully evaluating the IBS condition, specifically their motility effects based on functional net value and relative benefit-harm based on dropouts due to adverse events. The focus of this review is to highlight ongoing efforts to improve clinical trial design which can lead to better outcomes in a real-world setting.

  4. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  5. Designing healthcare information technology to catalyse change in clinical care

    Directory of Open Access Journals (Sweden)

    William Lester

    2008-05-01

    Full Text Available The gap between best practice and actual patient care continues to be a pervasive problem in our healthcare system. Efforts to improve on this knowledge-performance gap have included computerised disease management programs designed to improve guideline adherence. However, current computerised reminder and decision support interventions directed at changing physician behaviour have had only a limited and variable effect on clinical outcomes. Further, immediate pay-for-performance financial pressures on institutions have created an environment where disease management systems are often created under duress, appended to existing clinical systems and poorly integrated into the existing workflow, potentially limiting their real-world effectiveness. The authors present a review of disease management as well as a conceptual framework to guide the development of more effective health information technology (HIT) tools for translating clinical information into clinical action.

  6. Parallel MR imaging.

    Science.gov (United States)

    Deshmane, Anagha; Gulani, Vikas; Griswold, Mark A; Seiberlich, Nicole

    2012-07-01

    Parallel imaging is a robust method for accelerating the acquisition of magnetic resonance imaging (MRI) data, and has made possible many new applications of MR imaging. Parallel imaging works by acquiring a reduced amount of k-space data with an array of receiver coils. These undersampled data can be acquired more quickly, but the undersampling leads to aliased images. One of several parallel imaging algorithms can then be used to reconstruct artifact-free images from either the aliased images (SENSE-type reconstruction) or from the undersampled data (GRAPPA-type reconstruction). The advantages of parallel imaging in a clinical setting include faster image acquisition, which can be used, for instance, to shorten breath-hold times resulting in fewer motion-corrupted examinations. In this article the basic concepts behind parallel imaging are introduced. The relationship between undersampling and aliasing is discussed and two commonly used parallel imaging methods, SENSE and GRAPPA, are explained in detail. Examples of artifacts arising from parallel imaging are shown and ways to detect and mitigate these artifacts are described. Finally, several current applications of parallel imaging are presented and recent advancements and promising research in parallel imaging are briefly reviewed. Copyright © 2012 Wiley Periodicals, Inc.
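
The undersampling/aliasing relationship described above can be seen in a one-dimensional toy example: dropping every other k-space line folds the object onto a copy of itself shifted by half the field of view, which is exactly the wrap that SENSE unwraps using coil sensitivities. A sketch (illustrative only, not a full parallel-imaging reconstruction):

```python
import numpy as np

# 1D illustration: skipping alternate k-space lines (acceleration
# factor R = 2) aliases the object with its FOV/2-shifted copy.
n = 256
obj = np.zeros(n)
obj[60:80] = 1.0                      # a simple 1D "object"

kspace = np.fft.fft(obj)
undersampled = kspace.copy()
undersampled[1::2] = 0.0              # acquire only even k-space lines

aliased = np.fft.ifft(undersampled).real
# The naive reconstruction is the average of the object and its
# half-FOV circular shift — the classic fold-over artifact.
expected = 0.5 * (obj + np.roll(obj, n // 2))
print(np.allclose(aliased, expected))  # → True
```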

  7. Improving clinical trial design for hepatocellular carcinoma treatments

    Directory of Open Access Journals (Sweden)

    Robert G. Gish

    2011-12-01

    Full Text Available Despite its place as the third leading cause of cancer deaths worldwide, there are currently no approved chemotherapeutic agents, devices or techniques to treat hepatocellular carcinoma. Importantly, there have been no phase III studies demonstrating survival benefit, nor any randomized studies of treatment except for transarterial chemoembolization and most recently sorafenib. The importance of well-designed clinical trials of agents to treat HCC has never been greater. However, general clinical study design issues, combined with HCC-specific issues, pose significant challenges in structuring such studies. HCC-related challenges include the heterogeneity of this cancer and the fact that it is frequently accompanied by significant comorbidities at diagnosis, such as active hepatitis B or C virus replication, substantial past or on-going alcohol use, and cirrhosis, itself often a fatal disease. The recently published comparison of a newer treatment, nolatrexed, with doxorubicin, and comments about this study’s initial HCC diagnostic criteria, staging system, comparator therapy and choice of endpoints have provided a platform to discuss the challenges unique to the design of HCC clinical trials. The difficulty in accurately framing study results obtained from the constantly changing HCC clinical landscape and approaches to meet these challenges will be reviewed.

  8. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    Energy Technology Data Exchange (ETDEWEB)

    Kostin, Mikhail [Michigan State Univ., East Lansing, MI (United States); Mokhov, Nikolai [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Niita, Koji [Research Organization for Information Science and Technology, Ibaraki-ken (Japan)

    2013-09-25

    A parallel computing framework has been developed to use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran77, Fortran 90 or C. The module is significantly independent of radiation transport codes it can be used with, and is connected to the codes by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It is possible to use it with other codes such as PHITS, FLUKA and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations with a saved checkpoint file. The checkpoint facility can be used in single process calculations as well as in the parallel regime. The framework corrects some of the known problems with the scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and networks of workstations, where the interference from the other users is possible.
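
The checkpoint facility described here works by periodically serializing the calculation state so that a killed job resumes rather than restarts. The pattern, reduced to a toy accumulation loop (Python with invented state, standing in for the framework's C++/MPI implementation):

```python
import os
import pickle
import tempfile

def run(total_steps, checkpoint_path):
    """Resumable loop: state is saved after every step, and a rerun
    picks up from the last checkpoint instead of from scratch."""
    state = {"step": 0, "tally": 0}
    if os.path.exists(checkpoint_path):
        with open(checkpoint_path, "rb") as f:
            state = pickle.load(f)            # resume saved state
    while state["step"] < total_steps:
        state["tally"] += state["step"]       # stand-in for real physics
        state["step"] += 1
        with open(checkpoint_path, "wb") as f:
            pickle.dump(state, f)             # checkpoint after each step
    return state["tally"]

path = os.path.join(tempfile.mkdtemp(), "ckpt.pkl")
print(run(5, path))    # → 10  (fresh run: 0+1+2+3+4)
print(run(10, path))   # → 45  (resumes at step 5 instead of recomputing)
```

In a parallel run the same idea applies per rank, with the scheduler redistributing remaining work after a restart.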

  9. Innovative approaches to clinical development and trial design

    Directory of Open Access Journals (Sweden)

    John J Orloff

    2011-01-01

    Full Text Available Pharmaceutical innovation is increasingly risky, costly and at times inefficient, which has led to a decline in industry productivity. Despite the increased investment in R&D by the industry, the number of new molecular entities achieving marketing authorization is not increasing. Novel approaches to clinical development and trial design could have a key role in overcoming some of these challenges by improving efficiency and reducing attrition rates. The effectiveness of clinical development can be improved by adopting a more integrated model that increases flexibility and maximizes the use of accumulated knowledge. Central to this model of drug development are novel tools, including modelling and simulation, Bayesian methodologies, and adaptive designs, such as seamless adaptive designs and sample-size re-estimation methods. Applications of these methodologies to early- and late-stage drug development are described with some specific examples, along with advantages, challenges, and barriers to implementation. Because they are so flexible, these new trial designs require significant statistical analyses, simulations and logistical considerations to verify their operating characteristics, and therefore tend to require more time for the planning and protocol development phase. Greater awareness of the distinct advantages of innovative designs by regulators and sponsors is crucial to increasing the adoption of these modern tools.

  10. FY1995 study of low power LSI design automation software with parallel processing; 1995 nendo heiretsu shori wo katsuyoshita shodenryoku LSI muke sekkei jidoka software no kenkyu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The need for low-power LSIs has rapidly increased recently. For low-power LSI development, not only new circuit technologies but also new design automation tools supporting those technologies are indispensable. The purpose of this project is to develop new design automation software which is able to design digital LSIs with much lower power than conventional CMOS LSIs. New design automation software for very low power LSIs has been developed targeting the pass-transistor logic SPL, a dedicated low-power circuit technology. The software includes a logic synthesis function for pass-transistor-based macrocells and a macrocell placement function. Several new algorithms have been developed for the software, e.g. BDD construction. Some of them are designed and implemented for parallel processing in order to reduce the processing time. The logic synthesis function was tested on a set of benchmarks and finally applied to a low-power CPU design. The designed 8-bit CPU was fully compatible with the Zilog Z-80. The power dissipation of the CPU was compared with that of a commercial CMOS Z-80: the new CPU reduced power by up to 82%. In addition, parallel processing speed-up was measured on the macrocell placement function, and a 34-fold speed-up was realized. (NEDO)

  11. [Thinking on designation of sham acupuncture in clinical research].

    Science.gov (United States)

    Pan, Li-Jia; Chen, Bo; Zhao, Xue; Guo, Yi

    2014-01-01

    Randomized controlled trials (RCTs) are the source of the raw data of evidence-based medicine, and blinding is adopted in most high-quality RCTs. Sham acupuncture is the main form of blinding in acupuncture clinical trials. In order to improve the quality of acupuncture clinical trials, and based on the necessity of sham acupuncture in clinical research, the current situation and the existing problems of sham acupuncture are discussed, and suggestions are put forward regarding new approaches and new design methods that can be adopted as reference, as well as factors that have to be considered during implementation. The various subjective and objective factors involved in the trial process should be considered, current international standards should be used, quantification should be pursued, and strict quality monitoring should be carried out.

  12. Design and study of parallel computing environment of Monte Carlo simulation for particle therapy planning using a public cloud-computing infrastructure

    International Nuclear Information System (INIS)

    Yokohama, Noriya

    2013-01-01

    This report was aimed at structuring the design of architectures and studying the performance of a parallel computing environment for Monte Carlo simulation for particle therapy planning, using a high performance computing (HPC) instance within a public cloud-computing infrastructure. Performance measurements showed an approximately 28 times faster speed than seen with a single-thread architecture, combined with improved stability. A study of methods of optimizing the system operations also indicated lower cost. (author)

  13. A novel conceptual design of parallel nitrogen expansion liquefaction process for small-scale LNG (liquefied natural gas) plant in skid-mount packages

    International Nuclear Information System (INIS)

    He, Tianbiao; Ju, Yonglin

    2014-01-01

    The utilization of unconventional natural gas is still a great challenge for China due to its distribution locations and small reserves. Thus, liquefying the unconventional natural gas by using small-scale LNG plant in skid-mount packages is a good choice with great economic benefits. A novel conceptual design of parallel nitrogen expansion liquefaction process for small-scale plant in skid-mount packages has been proposed. It first designs a process configuration. Then, thermodynamic analysis of the process is conducted. Next, an optimization model with genetic algorithm method is developed to optimize the process. Finally, the flexibilities of the process are tested by two different feed gases. In conclusion, the proposed parallel nitrogen expansion liquefaction process can be used in small-scale LNG plant in skid-mount packages with high exergy efficiency and great economic benefits. - Highlights: • A novel design of parallel nitrogen expansion liquefaction process is proposed. • Genetic algorithm is applied to optimize the novel process. • The unit energy consumption of optimized process is 0.5163 kWh/Nm³. • The exergy efficiency of the optimized case is 0.3683. • The novel process has a good flexibility for different feed gas conditions

  14. Designing a placebo device: involving service users in clinical trial design.

    Science.gov (United States)

    Gooberman-Hill, Rachael; Jinks, Clare; Bouças, Sofia Barbosa; Hislop, Kelly; Dziedzic, Krysia S; Rhodes, Carol; Burston, Amanda; Adams, Jo

    2013-12-01

    Service users are increasingly involved in the design of clinical trials and in product and device development. Service user involvement in placebo development is crucial to a credible and acceptable placebo for clinical trials, but such involvement has not yet been reported. To enhance the design of a future clinical trial of hand splints for thumb-base osteoarthritis (OA), service users were involved in splint selection and design of a placebo splint. This article describes and reflects on this process. Two fora of service users were convened in 2011. Service users who had been prescribed a thumb splint for thumb-base OA were approached about involvement by Occupational Therapy (OT) practitioners. A total of eight service users took part in the fora. Service users discussed their experience of OA and their own splints and then tried a variety of alternative splints. Through this they identified the active features of splints alongside acceptable and unacceptable design features. Service users focused on wearability and support with or without immobilization. Fora discussed whether a placebo group ('arm') was an acceptable feature of a future trial, and service users developed a potential design for a placebo splint. This is the first project to involve service users in placebo design. Service users are increasingly involved in product and device design and are ideally placed to identify features to make a placebo credible yet lacking key active ingredients. The future trial will include research into its acceptability. © 2013 John Wiley & Sons Ltd.

  15. Design principles for simulation games for learning clinical reasoning: A design-based research approach.

    Science.gov (United States)

    Koivisto, J-M; Haavisto, E; Niemi, H; Haho, P; Nylund, S; Multisilta, J

    2018-01-01

    Nurses sometimes lack the competence needed for recognising deterioration in patient conditions and this is often due to poor clinical reasoning. There is a need to develop new possibilities for learning this crucial competence area. In addition, educators need to be future oriented; they need to be able to design and adopt new pedagogical innovations. The purpose of the study is to describe the development process and to generate principles for the design of nursing simulation games. A design-based research methodology is applied in this study. Iterative cycles of analysis, design, development, testing and refinement were conducted via collaboration among researchers, educators, students, and game designers. The study facilitated the generation of reusable design principles for simulation games to guide future designers when designing and developing simulation games for learning clinical reasoning. This study makes a major contribution to research on simulation game development in the field of nursing education. The results of this study provide important insights into the significance of involving nurse educators in the design and development process of educational simulation games for the purpose of nursing education. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Massively parallel mathematical sieves

    Energy Technology Data Exchange (ETDEWEB)

    Montry, G.R.

    1989-01-01

    The Sieve of Eratosthenes is a well-known algorithm for finding all prime numbers in a given subset of integers. A parallel version of the Sieve is described that produces computational speedups over 800 on a hypercube with 1,024 processing elements for problems of fixed size. Computational speedups as high as 980 are achieved when the problem size per processor is fixed. The method of parallelization generalizes to other sieves and will be efficient on any ensemble architecture. We investigate two highly parallel sieves using scattered decomposition and compare their performance on a hypercube multiprocessor. A comparison of different parallelization techniques for the sieve illustrates the trade-offs necessary in the design and implementation of massively parallel algorithms for large ensemble computers.
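
The decomposition behind such a parallel sieve can be sketched serially: the range is split into independent segments, each sieved against a shared set of base primes, so each segment maps naturally onto one processing element. A simplified block-decomposition sketch (not the hypercube implementation from the paper):

```python
import math

def sieve_segment(lo, hi, base_primes):
    """Mark composites in [lo, hi) using the primes up to sqrt(hi).
    Segments share no state, which is what makes distributing them
    across processors straightforward."""
    flags = [True] * (hi - lo)
    for p in base_primes:
        start = max(p * p, ((lo + p - 1) // p) * p)  # first multiple in range
        for m in range(start, hi, p):
            flags[m - lo] = False
    return [lo + i for i, f in enumerate(flags) if f and lo + i >= 2]

def parallel_style_sieve(n, n_segments=4):
    """Primes up to n, computed segment by segment; each loop
    iteration below corresponds to the work of one processor."""
    limit = math.isqrt(n) + 1
    base = [p for p in range(2, limit + 1)
            if all(p % q for q in range(2, math.isqrt(p) + 1))]
    size = (n + n_segments - 1) // n_segments
    primes = []
    for s in range(n_segments):
        primes += sieve_segment(s * size, min((s + 1) * size, n + 1), base)
    return primes

print(parallel_style_sieve(50))   # primes up to 50
```

In the hypercube setting the per-segment results would be gathered across nodes instead of concatenated in a loop; the per-segment work is identical.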

  17. Multiscale Modeling in the Clinic: Drug Design and Development

    Energy Technology Data Exchange (ETDEWEB)

    Clancy, Colleen E.; An, Gary; Cannon, William R.; Liu, Yaling; May, Elebeoba E.; Ortoleva, Peter; Popel, Aleksander S.; Sluka, James P.; Su, Jing; Vicini, Paolo; Zhou, Xiaobo; Eckmann, David M.

    2016-02-17

    A wide range of length and time scales are relevant to pharmacology, especially in drug development, drug design and drug delivery. Therefore, multi-scale computational modeling and simulation methods and paradigms that advance the linkage of phenomena occurring at these multiple scales have become increasingly important. Multi-scale approaches present in silico opportunities to advance laboratory research to bedside clinical applications in pharmaceuticals research. This is achievable through the capability of modeling to reveal phenomena occurring across multiple spatial and temporal scales, which are not otherwise readily accessible to experimentation. The resultant models, when validated, are capable of making testable predictions to guide drug design and delivery. In this review we describe the goals, methods, and opportunities of multi-scale modeling in drug design and development. We demonstrate the impact of multiple scales of modeling in this field. We indicate the common mathematical techniques employed for multi-scale modeling approaches used in pharmacology and present several examples illustrating the current state-of-the-art regarding drug development for: Excitable Systems (Heart); Cancer (Metastasis and Differentiation); Cancer (Angiogenesis and Drug Targeting); Metabolic Disorders; and Inflammation and Sepsis. We conclude with a focus on barriers to successful clinical translation of drug development, drug design and drug delivery multi-scale models.

  18. Development of design technology on thermal-hydraulic performance in tight-lattice rod bundle. 4. Large paralleled simulation by the advanced two-fluid model code

    International Nuclear Information System (INIS)

    Misawa, Takeharu; Yoshida, Hiroyuki; Akimoto, Hajime

    2008-01-01

    In the Japan Atomic Energy Agency (JAEA), the Innovative Water Reactor for Flexible Fuel Cycle (FLWR) has been developed. For the thermal design of the FLWR, it is necessary to develop an analytical method to predict boiling transition in the FLWR. JAEA has been developing the three-dimensional two-fluid model analysis code ACE-3D, which adopts a boundary-fitted coordinate system to simulate flow in complex-shaped channels. In this paper, as part of the development of ACE-3D for application to rod bundle analysis, the introduction of parallelization into ACE-3D and assessments of ACE-3D are described. In the analysis of a large-scale domain such as a rod bundle, even the two-fluid model requires a computational cost that exceeds the memory limit of a single CPU. Therefore, parallelization was introduced into ACE-3D to divide the data for large-scale analysis among a large number of CPUs, and it was confirmed that the analysis of a large-scale domain such as a rod bundle can be performed by parallel computation while maintaining parallel performance even when using a large number of CPUs. ACE-3D adopts two-phase flow models, some of which depend upon channel geometry. Therefore, analyses of domains simulating an individual subchannel and a 37-rod bundle were performed and compared with experiments. It is confirmed that the results of both analyses using ACE-3D agree qualitatively with past experimental results. (author)

  19. Submitted for your consideration: potential advantages of a novel clinical trial design and initial patient reaction

    Directory of Open Access Journals (Sweden)

    Matthew Shane Loop

    2012-08-01

    Full Text Available In many circumstances, individuals do not respond identically to the same treatment. This phenomenon, which is called treatment response heterogeneity (TRH, appears to be present in treatments for many conditions, including obesity. Estimating the total amount of TRH, predicting an individual’s response, and identifying the mediators of TRH are of interest to biomedical researchers. Clinical investigators and physicians commonly postulate that some of these mediators could be genetic. Current designs can estimate TRH as a function of specific, measurable observed factors; however, they cannot estimate the total amount of TRH, nor provide reliable estimates of individual persons’ responses. We propose a new repeated randomizations design (RRD, which can be conceived as a generalization of the Balaam design, that would allow estimates of that variability and facilitate estimation of the total amount of TRH, prediction of an individual’s response, and identification of the mediators of TRH. In a pilot study, we asked 118 subjects entering a weight loss trial for their opinion of the RRD, and they stated a preference for the RRD over the conventional 2-arm parallel groups design. Research is needed as to how the RRD will work in practice and its relative statistical properties, and we invite dialogue about it.
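
The statistical leverage of the RRD comes from giving the same person the same treatment in more than one period: the period-to-period scatter isolates measurement noise, and what remains of the between-subject spread is TRH. A small simulation of that variance decomposition (the model and numbers are invented for illustration, not taken from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, sd_trh, sd_noise = 200, 2.0, 1.0

# Each subject has a latent personal treatment effect (the TRH part)
# plus independent measurement noise in each treatment period.
personal = rng.normal(0.0, sd_trh, n_subjects)
period1 = personal + rng.normal(0.0, sd_noise, n_subjects)
period2 = personal + rng.normal(0.0, sd_noise, n_subjects)

# Repeating treatment within subject separates the two variance
# sources: the within-subject difference contains only noise.
var_noise = np.var(period1 - period2, ddof=1) / 2.0      # ≈ sd_noise**2
var_total = np.var((period1 + period2) / 2.0, ddof=1)    # TRH + noise/2
var_trh = var_total - var_noise / 2.0                    # ≈ sd_trh**2 = 4
print(round(var_trh, 2))
```

A conventional parallel-group design observes each subject under one condition only, so these two variance components are confounded; the quantity `var_trh` above is what the repeated randomizations make estimable.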

  20. Challenges and opportunities in designing clinical trials for neuromyelitis optica

    Science.gov (United States)

    Barron, Gerard; Behne, Jacinta M.; Bennett, Jeffery L.; Chin, Peter S.; Cree, Bruce A.C.; de Seze, Jerome; Flor, Armando; Fujihara, Kazuo; Greenberg, Benjamin; Higashi, Sayumi; Holt, William; Khan, Omar; Knappertz, Volker; Levy, Michael; Melia, Angela T.; Palace, Jacqueline; Smith, Terry J.; Sormani, Maria Pia; Van Herle, Katja; VanMeter, Susan; Villoslada, Pablo; Walton, Marc K.; Wasiewski, Warren; Wingerchuk, Dean M.; Yeaman, Michael R.

    2015-01-01

    Current management of neuromyelitis optica (NMO) is noncurative and only partially effective. Immunosuppressive or immunomodulatory agents are the mainstays of maintenance treatment. Safer, better-tolerated, and proven effective treatments are needed. The perceived rarity of NMO has impeded clinical trials for this disease. However, a diagnostic biomarker and recognition of a wider spectrum of NMO presentations has expanded the patient population from which study candidates might be recruited. Emerging insights into the pathogenesis of NMO have provided rationale for exploring new therapeutic targets. Academic, pharmaceutical, and regulatory communities are increasingly interested in meeting the unmet needs of patients with NMO. Clinical trials powered to yield unambiguous outcomes and designed to facilitate rapid evaluation of an expanding pipeline of experimental agents are needed. NMO-related disability occurs incrementally as a result of attacks; thus, limiting attack frequency and severity are critical treatment goals. Yet, the severity of NMO and perception that currently available agents are effective pose challenges to study design. We propose strategies for NMO clinical trials to evaluate agents targeting recovery from acute attacks and prevention of relapses, the 2 primary goals of NMO treatment. Aligning the interests of all stakeholders is an essential step to this end. PMID:25841026

  1. Algorithms for parallel computers

    International Nuclear Information System (INIS)

    Churchhouse, R.F.

    1985-01-01

    Until relatively recently almost all the algorithms for use on computers had been designed on the (usually unstated) assumption that they were to be run on single-processor, serial machines. With the introduction of vector processors, array processors and interconnected systems of mainframes, minis and micros, however, various forms of parallelism have become available. The advantage of parallelism is that it offers increased overall processing speed, but it also raises some fundamental questions, including: (i) Which, if any, of the existing 'serial' algorithms can be adapted for use in the parallel mode? (ii) How close to optimal can such adapted algorithms be and, where relevant, what are the convergence criteria? (iii) How can we design new algorithms specifically for parallel systems? (iv) For multi-processor systems, how can we handle the software aspects of the interprocessor communications? Aspects of these questions, illustrated by examples, are considered in these lectures. (orig.)
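    Question (i) — adapting a serial algorithm for the parallel mode — can be illustrated with the classic example of reduction: a serial sum rewritten as a tree of pairwise sums whose levels could each run concurrently. This is an illustrative sketch, not an example taken from the lectures.

```python
def tree_sum(xs):
    """Sum a sequence by pairwise (tree) reduction. Every sum within one
    level is independent of the others, so a parallel machine could
    evaluate each level concurrently, finishing in O(log n) steps
    instead of the serial O(n)."""
    level = list(xs)
    if not level:
        return 0
    while len(level) > 1:
        nxt = [level[i] + level[i + 1] for i in range(0, len(level) - 1, 2)]
        if len(level) % 2:            # odd element carries up unchanged
            nxt.append(level[-1])
        level = nxt
    return level[0]

# Same answer as the serial algorithm for exact arithmetic; floating-point
# results can differ slightly because the summation order changes.
assert tree_sum(range(100)) == sum(range(100))
```

That last caveat is exactly question (ii) in miniature: the adapted algorithm is mathematically equivalent but not bit-for-bit identical in finite precision.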

  2. Methodology Series Module 8: Designing Questionnaires and Clinical Record Forms.

    Science.gov (United States)

    Setia, Maninder Singh

    2017-01-01

    As researchers, we often collect data on a clinical record form or a questionnaire. It is an important part of study design. If the questionnaire is not well designed, the data collected will not be useful. In this section of the module, we have discussed some practical aspects of designing a questionnaire. It is useful to make a list of all the variables that will be assessed in the study before preparing the questionnaire. The researcher should review all the existing questionnaires. It may be efficient to use an existing standardized questionnaire or scale. Many of these scales are freely available and may be used with an appropriate reference. However, some may be under copyright protection and permissions may be required to use the same questionnaire. While designing their own questionnaire, researchers may use open- or close-ended questions. It is important to design the responses appropriately as the format of responses will influence the analysis. Sometimes, one can collect the same information in multiple ways - continuous or categorical response. Besides these, the researcher can also use visual analog scales or Likert's scale in the questionnaire. Some practical take-home points are: (1) Use specific language while framing the questions; (2) write detailed instructions in the questionnaire; (3) use mutually exclusive response categories; (4) use skip patterns; (5) avoid double-barreled questions; and (6) anchor the time period if required.

  3. Lactic-fermented egg white reduced serum cholesterol concentrations in mildly hypercholesterolemic Japanese men: a double-blind, parallel-arm design.

    Science.gov (United States)

    Matsuoka, Ryosuke; Usuda, Mika; Masuda, Yasunobu; Kunou, Masaaki; Utsunomiya, Kazunori

    2017-05-30

    Lactic-fermented egg white (LE), produced by lactic acid fermentation of egg white, is an easy-to-consume form of egg white. Here we assessed the effect of daily consumption of LE for 8 weeks on serum total cholesterol (TC) levels. The study followed a double-blind, parallel-arm design and included 88 adult men with mild hypercholesterolemia (mean ± standard error serum TC level, 229 ± 1.6 mg/dL; range, 204-259 mg/dL). The subjects were randomly divided into three groups, which consumed LE containing 4, 6, or 8 g of protein daily for 8 weeks. Blood samples were collected before starting LE consumption (baseline) and at 4 and 8 weeks to measure serum TC and low-density lipoprotein cholesterol (LDL-C) levels. After 8 weeks of consumption, serum TC levels in the 8 g group decreased by 11.0 ± 3.7 mg/dL, a significant decrease compared to baseline (p < 0.05) and a significantly greater decrease than for the 4 g group (3.1 ± 3.4 mg/dL; p < 0.05). Serum LDL-C levels in the 8 g group decreased by 13.7 ± 3.1 mg/dL, again a significant decrease compared with baseline (p < 0.05) and a significantly greater decrease than that for the 4 g group (2.1 ± 2.9 mg/dL; p < 0.05). Consumption of LE for 8 weeks at a daily dose of 8 g of protein reduced serum TC and LDL-C levels in men with mild hypercholesterolemia, suggesting it may help prevent arteriosclerotic diseases. This clinical trial was retrospectively registered with the Japan Medical Association Center for Clinical Trials (JMA-IIA00279; registered on 13/03/2017; https://dbcentre3.jmacct.med.or.jp/JMACTR/App/JMACTRE02_04/JMACTRE02_04.aspx?kbn=3&seqno=6530 ).

  4. Comparison between AAPM TG-51 and IAEA TRS-398 for plane parallel ionization chambers irradiated by clinical electron beams

    International Nuclear Information System (INIS)

    Mahmoud, M.A.

    2005-01-01

    We compared the results of absorbed dose determined at reference conditions according to AAPM TG-51 and IAEA TRS-398 using plane parallel ionization chambers. The study showed agreement between the two protocols for the Holt, Exradin P11, NACP, Attix RMI 449 and Roos ionization chambers. For the Markus ionization chamber, the absorbed dose calculated using AAPM TG-51 is higher than that calculated using IAEA TRS-398 by 1.8% for R50 = 2 cm, decreasing with increasing R50 to reach 1.2% for R50 = 20 cm. For the Capintec PS-033 ionization chamber, the absorbed dose calculated using AAPM TG-51 is constantly higher than that calculated by IAEA TRS-398, by 1.5%. A theoretical explanation is offered for these results.
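    The Markus-chamber numbers reported here (a 1.8% discrepancy at R50 = 2 cm falling to 1.2% at R50 = 20 cm) suggest a simple interpolation. Assuming, purely for illustration, that the trend is linear in R50:

```python
def markus_discrepancy_pct(r50_cm):
    """Linear interpolation between the two endpoints reported in the
    abstract. The linearity is an assumption for illustration; the true
    dependence on R50 may well be nonlinear."""
    r1, d1 = 2.0, 1.8    # % discrepancy (TG-51 above TRS-398) at R50 = 2 cm
    r2, d2 = 20.0, 1.2   # % discrepancy at R50 = 20 cm
    return d1 + (d2 - d1) * (r50_cm - r1) / (r2 - r1)

assert abs(markus_discrepancy_pct(11.0) - 1.5) < 1e-9   # midpoint
```

For the Capintec PS-033 chamber no interpolation is needed, since the reported discrepancy is a constant 1.5% regardless of R50.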

  5. Basics of case report form designing in clinical research

    Directory of Open Access Journals (Sweden)

    Shantala Bellary

    2014-01-01

    Full Text Available Case report form (CRF) is a specialized document in clinical research. It should be study protocol driven, robust in content and have material to collect the study-specific data. Though paper CRFs are still largely used, electronic CRFs (eCRFs) are gaining popularity due to the advantages they offer, such as improved data quality, online discrepancy management and faster database lock. The main objectives behind CRF development are preserving and maintaining the quality and integrity of data. CRF design should be standardized to address the needs of all users, such as the investigator, site coordinator, study monitor, data entry personnel, medical coder and statistician. Data should be organized in a format that facilitates and simplifies data analysis. Collecting a large amount of data results in wasted resources in collecting and processing it, and in many circumstances it will not be utilized for analysis. Apart from that, standard guidelines should be followed while designing the CRF. A CRF completion manual should be provided to the site personnel to promote accurate data entry. These measures will result in reduced query generation and improved data integrity. It is recommended to establish and maintain a library of templates of standard CRF modules, as they are time saving and cost-effective. This article is an attempt to describe the methods of CRF design in clinical research and discusses the challenges encountered in this process.

  6. Basics of case report form designing in clinical research.

    Science.gov (United States)

    Bellary, Shantala; Krishnankutty, Binny; Latha, M S

    2014-10-01

    Case report form (CRF) is a specialized document in clinical research. It should be study protocol driven, robust in content and have material to collect the study-specific data. Though paper CRFs are still largely used, electronic CRFs (eCRFs) are gaining popularity due to the advantages they offer, such as improved data quality, online discrepancy management and faster database lock. The main objectives behind CRF development are preserving and maintaining the quality and integrity of data. CRF design should be standardized to address the needs of all users, such as the investigator, site coordinator, study monitor, data entry personnel, medical coder and statistician. Data should be organized in a format that facilitates and simplifies data analysis. Collecting a large amount of data results in wasted resources in collecting and processing it, and in many circumstances it will not be utilized for analysis. Apart from that, standard guidelines should be followed while designing the CRF. A CRF completion manual should be provided to the site personnel to promote accurate data entry. These measures will result in reduced query generation and improved data integrity. It is recommended to establish and maintain a library of templates of standard CRF modules, as they are time saving and cost-effective. This article is an attempt to describe the methods of CRF design in clinical research and discusses the challenges encountered in this process.

  7. Mandibular advancement appliance for obstructive sleep apnoea: results of a randomised placebo controlled trial using parallel group design

    DEFF Research Database (Denmark)

    Petri, N.; Svanholt, P.; Solow, B.

    2008-01-01

    The aim of this trial was to evaluate the efficacy of a mandibular advancement appliance (MAA) for obstructive sleep apnoea (OSA). Ninety-three patients with OSA and a mean apnoea-hypopnoea index (AHI) of 34.7 were centrally randomised into three, parallel groups: (a) MAA; (b) mandibular non......). Eighty-one patients (87%) completed the trial. The MAA group achieved mean AHI and Epworth scores significantly lower (P group and the no-intervention group. No significant differences were found between the MNA group and the no-intervention group. The MAA group had...

  8. Design Analysis and Dynamic Modeling of a High-Speed 3T1R Pick-and-Place Parallel Robot

    DEFF Research Database (Denmark)

    Wu, Guanglei; Bai, Shaoping; Hjørnet, Preben

    2015-01-01

    This paper introduces a four degree-of-freedom parallel robot producing three translation and one rotation (Schönflies motion). This robot can generate a rectangular workspace that is close to the applicable work envelope and suitable for pick-and-place operations. The kinematics of the robot...... is studied to analyze the workspace and the isocontours of the local dexterity over the representative regular workspace are visualized. The simplified dynamics is modeled and compared with Adams model to show its effectiveness....

  9. An Optimization-Based Reconfigurable Design for a 6-Bit 11-MHz Parallel Pipeline ADC with Double-Sampling S&H

    Directory of Open Access Journals (Sweden)

    Wilmar Carvajal

    2012-01-01

    Full Text Available This paper presents a 6 bit, 11 MS/s time-interleaved pipeline A/D converter design. The specification process, from block level to elementary circuits, is gradually covered to draw a design methodology. Both the power consumption and the mismatch between the parallel chain elements are intended to be reduced by using techniques such as double and bottom-plate sampling, fully differential circuits, RSD digital correction, and geometric programming (GP) optimization of the design of the elementary analog circuits (OTAs and comparators). Prelayout simulations of the complete ADC are presented to characterize the designed converter, which consumes 12 mW while sampling a 500 kHz input signal. Moreover, the block inside the ADC with the most stringent requirements in power, speed, and precision was sent to fabrication in a CMOS 0.35 μm AMS technology, and some postlayout results are shown.
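    Two ideas in this abstract — ideal 6-bit quantization and time-interleaving of parallel channels — can be modeled behaviorally in a few lines. This is an illustrative sketch (invented gains and codes, not the fabricated converter's behavior); the channel gain mismatch parameter is exactly the kind of parallel-chain mismatch the design tries to minimize.

```python
# Behavioral sketch (illustrative only) of a 6-bit quantizer and a
# two-channel time-interleaved front end.

def quantize6(x, vref=1.0):
    """Ideal 6-bit quantizer for x in [-vref, vref): returns code 0..63."""
    code = int((x + vref) / (2 * vref) * 64)
    return max(0, min(63, code))

def interleave(samples, gain_a=1.0, gain_b=1.0):
    """Even-indexed samples pass through channel A, odd through channel B;
    a gain mismatch (gain_a != gain_b) models inter-channel error."""
    return [quantize6(s * (gain_a if i % 2 == 0 else gain_b))
            for i, s in enumerate(samples)]

codes = interleave([-1.0, -0.5, 0.0, 0.5])
assert codes == [0, 16, 32, 48]
```

With a deliberate mismatch, e.g. `gain_b=1.05`, the odd samples map to shifted codes, which in the frequency domain shows up as interleaving spurs — the artifact the matched design avoids.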

  10. Opportunities and challenges for the integration of massively parallel genomic sequencing into clinical practice: lessons from the ClinSeq project.

    Science.gov (United States)

    Biesecker, Leslie G

    2012-04-01

    The debate surrounding the return of results from high-throughput genomic interrogation encompasses many important issues including ethics, law, economics, and social policy. The debate is also informed by the molecular, genetic, and clinical foundations of the emerging field of clinical genomics, which is based on this new technology. This article outlines the main biomedical considerations of sequencing technologies and demonstrates some of the early clinical experiences with the technology to enable the debate to stay focused on real-world practicalities. These experiences are based on early data from the ClinSeq project, which is a project to pilot the use of massively parallel sequencing in a clinical research context with a major aim to develop modes of returning results to individual subjects. The study has enrolled >900 subjects and generated exome sequence data on 572 subjects. These data are beginning to be interpreted and returned to the subjects, which provides examples of the potential usefulness and pitfalls of clinical genomics. There are numerous genetic results that can be readily derived from a genome, including rare, high-penetrance traits and carrier states. However, much work needs to be done to develop the tools and resources for genomic interpretation. The main lesson learned is that a genome sequence may be better considered as a health-care resource, rather than a test, one that can be interpreted and used over the lifetime of the patient.

  11. Behavioral Design Teams: The Next Frontier in Clinical Delivery Innovation?

    Science.gov (United States)

    Robertson, Ted; Darling, Matthew; Leifer, Jennifer; Footer, Owen; Gordski, Dani

    2017-11-01

    A deep understanding of human behavior is critical to designing effective health care delivery models, tools, and processes. Currently, however, few mechanisms exist to systematically apply insights about human behavior to improve health outcomes. Behavioral design teams (BDTs) are a successful model for applying behavioral insights within an organization. Already operational within government, this model can be adapted to function in a health care setting. To explore how BDTs could be applied to clinical care delivery and review models for integrating these teams within health care organizations. Interviews with experts in clinical delivery innovation and applied behavioral science, as well as leaders of existing government BDTs. BDTs are most effective when they enjoy top-level executive support, are co-led by a domain expert and behavioral scientist, collaborate closely with key staff and departments, have access to data and IT support, and operate a portfolio of projects. BDTs could be embedded in health care organizations in multiple ways, including in or just below the CEO’s office, within a quality improvement unit, or within an internal innovation center. When running a portfolio, BDTs achieve a greater number and diversity of insights at lower costs. They also become a platform for strategic learning and scaling.

  12. Does Clinical Staging and Histological Grading Show Parallelism In Oral Submucous Fibrosis? A Retrospective Study from an Indian City

    Directory of Open Access Journals (Sweden)

    Manish Narayan

    2014-06-01

    Conclusions: There was no correlation between clinical staging and histopathological grading of oral submucous fibrosis. The test results were statistically not significant (p=0.635). This may be due to differences in the severity and extent of fibrosis in different parts of the oral mucosa. [J Interdiscipl Histopathol 2014; 2(3): 145-149]

  13. Design, manufacture and evaluation of a new flexible constant velocity mechanism for transmission of power between parallel shafts

    Energy Technology Data Exchange (ETDEWEB)

    Yaghoubi, Majid [University of Tehran, Tehran (Iran, Islamic Republic of); Sanaeifar, Alireza [Shiraz University, Shiraz (Iran, Islamic Republic of)

    2015-08-15

    This paper presents a new mechanism (coupling) for power transmission between parallel shafts over a wider range of misalignments. The mechanism consists of one drive shaft, one driven shaft, 3 S-shape transmitter links and 8 connecting links. The advantage of this mechanism is that the velocity ratio between the input and output shafts remains constant throughout the motion, and its capacity to offset misalignments is greater than that of other couplings. This research also includes a kinematic analysis and simulations using Visual NASTRAN, Autodesk Inventor dynamic simulation and COSMOS Motion to prove that the mechanism exhibits a constant velocity. Finally, the mechanism was fabricated and evaluated; results showed that it can practically transmit a constant velocity ratio.

  14. Design, manufacture and evaluation of a new flexible constant velocity mechanism for transmission of power between parallel shafts

    International Nuclear Information System (INIS)

    Yaghoubi, Majid; Sanaeifar, Alireza

    2015-01-01

    This paper presents a new mechanism (coupling) for power transmission between parallel shafts over a wider range of misalignments. The mechanism consists of one drive shaft, one driven shaft, 3 S-shape transmitter links and 8 connecting links. The advantage of this mechanism is that the velocity ratio between the input and output shafts remains constant throughout the motion, and its capacity to offset misalignments is greater than that of other couplings. This research also includes a kinematic analysis and simulations using Visual NASTRAN, Autodesk Inventor dynamic simulation and COSMOS Motion to prove that the mechanism exhibits a constant velocity. Finally, the mechanism was fabricated and evaluated; results showed that it can practically transmit a constant velocity ratio.

  15. Parallel magnetic resonance imaging

    International Nuclear Information System (INIS)

    Larkman, David J; Nunes, Rita G

    2007-01-01

    Parallel imaging has been the single biggest innovation in magnetic resonance imaging in the last decade. The use of multiple receiver coils to augment the time consuming Fourier encoding has reduced acquisition times significantly. This increase in speed comes at a time when other approaches to acquisition time reduction were reaching engineering and human limits. A brief summary of spatial encoding in MRI is followed by an introduction to the problem parallel imaging is designed to solve. There are a large number of parallel reconstruction algorithms; this article reviews a cross-section, SENSE, SMASH, g-SMASH and GRAPPA, selected to demonstrate the different approaches. Theoretical (the g-factor) and practical (coil design) limits to acquisition speed are reviewed. The practical implementation of parallel imaging is also discussed, in particular coil calibration. How to recognize potential failure modes and their associated artefacts are shown. Well-established applications including angiography, cardiac imaging and applications using echo planar imaging are reviewed and we discuss what makes a good application for parallel imaging. Finally, active research areas where parallel imaging is being used to improve data quality by repairing artefacted images are also reviewed. (invited topical review)
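    The unfolding step that SENSE performs can be shown on the smallest possible case: acceleration factor R = 2 with two coils, where each coil's aliased pixel is a sensitivity-weighted sum of the two true pixels that fold onto each other. The numbers below are invented for illustration; a real reconstruction works with complex-valued data and measured sensitivity maps.

```python
# Minimal SENSE-style unfolding sketch for R = 2 with two coils
# (illustrative numbers, not a real reconstruction).

def sense_unfold(aliased, sens):
    """aliased: [a1, a2], the two coils' measurements of one folded pixel.
    sens: 2x2 matrix with sens[c][p] = coil c sensitivity at true pixel p.
    Solves the 2x2 linear system by Cramer's rule."""
    (s11, s12), (s21, s22) = sens
    a1, a2 = aliased
    det = s11 * s22 - s12 * s21   # noise amplification (g-factor) worsens as det -> 0
    rho1 = (a1 * s22 - a2 * s12) / det
    rho2 = (s11 * a2 - s21 * a1) / det
    return rho1, rho2

# True pixel values 3 and 5, seen through distinct coil sensitivities;
# unfolding recovers them (up to floating-point rounding).
sens = [[1.0, 0.2], [0.3, 0.9]]
aliased = [1.0 * 3 + 0.2 * 5, 0.3 * 3 + 0.9 * 5]
rho1, rho2 = sense_unfold(aliased, sens)
```

The comment on `det` is the article's g-factor limit in miniature: when the two coils' sensitivity profiles are nearly proportional, the system becomes ill-conditioned and noise in the unfolded pixels is amplified.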

  16. Differential mode EMI filter design for ultra high efficiency partial parallel isolated full-bridge boost converter

    DEFF Research Database (Denmark)

    Makda, Ishtiyaq Ahmed; Nymand, M.

    2013-01-01

    for such an application, it calls for a carefully optimized EMI filter, which is designed and implemented in this work. Moreover, since the negative input impedance of the regulated converter is extremely low, a well-designed filter damping branch is also included. Differential mode noise is analyzed analytically for a 3KW/400V...

  17. The Managed Ventricular pacing versus VVI 40 Pacing (MVP) Trial: clinical background, rationale, design, and implementation.

    Science.gov (United States)

    Sweeney, Michael O; Ellenbogen, Kenneth A; Miller, Elaine Hogan; Sherfesee, Lou; Sheldon, Todd; Whellan, David

    2006-12-01

    Implantable cardioverter defibrillators (ICDs) reduce mortality among appropriately selected patients who have had or are at risk for life-threatening ventricular arrhythmia. Right ventricular apical (RVA) pacing has been implicated in worsening heart failure and death. The optimal pacemaker mode for bradycardia support while minimizing unnecessary and potentially harmful RVA pacing has not been determined. The Managed Ventricular pacing vs. VVI 40 Pacing Trial (MVP) is a prospective, multicenter, randomized, single-blind, parallel, controlled clinical trial designed to establish whether atrial-based dual-chamber managed ventricular pacing mode (MVP) is equivalent or superior to back-up only ventricular pacing (VVI 40) among patients with standard indications for ICD therapy and no indication for bradycardia pacing. The MVP Trial is designed with 80% power to detect a 10% reduction in the primary endpoint of new or worsening heart failure or all-cause mortality in the MVP-treated group. Approximately 1,000 patients at 80 centers in the United States, Canada, Western Europe, and Israel will be randomized to MVP or VVI 40 pacing after successful implantation of a dual-chamber ICD. Heart failure therapies will be optimized in accordance with evidence-based guidelines. Prespecified secondary endpoints will include ventricular arrhythmias, atrial fibrillation, new indication for bradycardia pacing, health-related quality of life, and cost effectiveness. Enrollment began in October 2004 and concluded in April 2006. The study will be terminated upon recommendation of the Data Monitoring Committee or when the last patient enrolled and surviving has reached a minimum 2 years of follow-up. The MVP Trial will meet the clinical need for carefully designed prospective studies to define the benefits of atrial-based dual-chamber minimal ventricular pacing versus single-chamber ventricular pacing in conventional ICD patients.
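    The statement that the MVP Trial has "80% power to detect a 10% reduction in the primary endpoint" implies a standard sample-size calculation for comparing two proportions. The sketch below uses the usual normal-approximation formula; the 40% control event rate (and reading the reduction as 10 absolute percentage points) are invented assumptions for illustration, since the trial's actual planning assumptions are not given in the abstract.

```python
import math

def n_per_arm(p_control, p_treat, alpha=0.05, power=0.80):
    """Sample size per arm for a two-sided two-proportion test
    (normal approximation, pooled variance under the null)."""
    z_a = 1.959964    # two-sided 5% critical value
    z_b = 0.841621    # z for 80% power
    p_bar = (p_control + p_treat) / 2
    num = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_b * math.sqrt(p_control * (1 - p_control)
                             + p_treat * (1 - p_treat))) ** 2
    return math.ceil(num / (p_control - p_treat) ** 2)

# e.g. 40% control event rate vs 30% under treatment (10-point reduction)
print(n_per_arm(0.40, 0.30))
```

With these invented rates the formula gives a few hundred patients per arm, which is consistent in order of magnitude with the roughly 1,000 patients the abstract says will be randomized.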

  18. Evaluating clinical and public health interventions: a practical guide to study design and statistics

    National Research Council Canada - National Science Library

    Katz, Mitchell H

    2010-01-01

    .... Because the choice of research design depends on the nature of the intervention, the book covers randomized and nonrandomized designs, prospective and retrospective studies, planned clinical trials...

  19. The next generation of sepsis clinical trial designs: what is next after the demise of recombinant human activated protein C?*.

    Science.gov (United States)

    Opal, Steven M; Dellinger, R Phillip; Vincent, Jean-Louis; Masur, Henry; Angus, Derek C

    2014-07-01

    The developmental pipeline for novel therapeutics to treat sepsis has diminished to a trickle compared to previous years of sepsis research. While enormous strides have been made in understanding the basic molecular mechanisms that underlie the pathophysiology of sepsis, a long list of novel agents has now been tested in clinical trials without a single immunomodulating therapy showing consistent benefit. The only antisepsis agent to successfully complete a phase III clinical trial was human recombinant activated protein C. This drug was taken off the market after a follow-up placebo-controlled trial (human recombinant activated Protein C Worldwide Evaluation of Severe Sepsis and septic Shock [PROWESS SHOCK]) failed to replicate the favorable results of the initial registration trial performed ten years earlier. We must critically reevaluate our basic approach to the preclinical and clinical evaluation of new sepsis therapies. We selected the major clinical studies that investigated interventional trials with novel therapies to treat sepsis over the last 30 years: phase II and phase III trials investigating new treatments for sepsis, and editorials and critiques of these studies. Selected manuscripts and clinical study reports from sepsis trials were analyzed. Specific shortcomings and potential pitfalls in preclinical evaluation and clinical study design and analysis were reviewed and synthesized. After review and discussion, a series of 12 recommendations was generated with suggestions to guide future studies with new treatments for sepsis. We need to improve our ability to define appropriate molecular targets for preclinical development and develop better methods to determine the clinical value of novel sepsis agents. Clinical trials must have realistic sample sizes and meaningful endpoints. Biomarker-driven studies should be considered to categorize specific "at risk" populations most likely to benefit from a new treatment. Innovations in clinical trial design

  20. OARSI Clinical Trials Recommendations: Design and conduct of clinical trials for hand osteoarthritis.

    Science.gov (United States)

    Kloppenburg, M; Maheu, E; Kraus, V B; Cicuttini, F; Doherty, M; Dreiser, R-L; Henrotin, Y; Jiang, G-L; Mandl, L; Martel-Pelletier, J; Nelson, A E; Neogi, T; Pelletier, J-P; Punzi, L; Ramonda, R; Simon, L S; Wang, S

    2015-05-01

    Hand osteoarthritis (OA) is a very frequent disease, yet understudied. However, a great deal of work has been published in the past 10 years, and much has been done to better understand its clinical course and structural progression. Despite this new knowledge, few therapeutic trials have been conducted in hand OA. The last OARSI recommendations for the conduct of clinical trials in hand OA date back to 2006. The present recommendations aim at updating the previous ones by incorporating new data. The purpose of this expert-opinion, consensus-driven exercise is to provide guidance on the design, execution and analysis of clinical trials in hand OA, based on published evidence where available and supplemented by expert opinion where evidence is lacking, for both symptom- and structure-modification trials. The recommendations indicate core outcome measurement sets for studies in hand OA, and list the methods and instruments that should be used to measure symptoms or structure. For both symptom- and structure-modification, at least pain, physical function, patient global assessment, HR-QoL, joint activity and hand strength should be assessed. In addition, for structure-modification trials, structural progression should be measured by radiographic changes. We also provide a research agenda listing many unsolved issues that seem to most urgently need to be addressed from the perspective of performing "good" clinical trials in hand OA. These updated OARSI recommendations should allow better standardization of the conduct of clinical trials in hand OA in the near future. Copyright © 2015 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  1. Personalized medicine enrichment design for DHA supplementation clinical trial

    Directory of Open Access Journals (Sweden)

    Yang Lei

    2017-03-01

    Full Text Available Personalized medicine aims to match patient subpopulation to the most beneficial treatment. The purpose of this study is to design a prospective clinical trial in which we hope to achieve the highest level of confirmation in identifying and making treatment recommendations for subgroups, when the risk levels in the control arm can be ordered. This study was motivated by our goal to identify subgroups in a DHA (docosahexaenoic acid) supplementation trial to reduce preterm birth (gestational age <37 weeks) rate. We performed a meta-analysis to obtain informative prior distributions and simulated operating characteristics to ensure that overall Type I error rate was close to 0.05 in designs with three different models: independent, hierarchical, and dynamic linear models. We performed simulations and sensitivity analysis to examine the subgroup power of models and compared results to a chi-square test. We performed simulations under two hypotheses: a large overall treatment effect and a small overall treatment effect. Within each hypothesis, we designed three different subgroup effects scenarios where resulting subgroup rates are linear, flat, or nonlinear. When the resulting subgroup rates are linear or flat, the dynamic linear model appeared to be the most powerful method to identify the subgroups with a treatment effect. It also outperformed other methods when resulting subgroup rates are nonlinear and the overall treatment effect is large. When the resulting subgroup rates are nonlinear and the overall treatment effect is small, the hierarchical model and chi-square test did better. Compared to independent and hierarchical models, the dynamic linear model tends to be relatively robust and powerful when the control arm has ordinal risk subgroups.

  2. Personalized Medicine Enrichment Design for DHA Supplementation Clinical Trial.

    Science.gov (United States)

    Lei, Yang; Mayo, Matthew S; Carlson, Susan E; Gajewski, Byron J

    2017-03-01

    Personalized medicine aims to match patient subpopulation to the most beneficial treatment. The purpose of this study is to design a prospective clinical trial in which we hope to achieve the highest level of confirmation in identifying and making treatment recommendations for subgroups, when the risk levels in the control arm can be ordered. This study was motivated by our goal to identify subgroups in a DHA (docosahexaenoic acid) supplementation trial to reduce preterm birth (gestational age <37 weeks) rate. We performed a meta-analysis to obtain informative prior distributions and simulated operating characteristics to ensure that overall Type I error rate was close to 0.05 in designs with three different models: independent, hierarchical, and dynamic linear models. We performed simulations and sensitivity analysis to examine the subgroup power of models and compared results to a chi-square test. We performed simulations under two hypotheses: a large overall treatment effect and a small overall treatment effect. Within each hypothesis, we designed three different subgroup effects scenarios where resulting subgroup rates are linear, flat, or nonlinear. When the resulting subgroup rates are linear or flat, the dynamic linear model appeared to be the most powerful method to identify the subgroups with a treatment effect. It also outperformed other methods when resulting subgroup rates are nonlinear and the overall treatment effect is large. When the resulting subgroup rates are nonlinear and the overall treatment effect is small, the hierarchical model and chi-square test did better. Compared to independent and hierarchical models, the dynamic linear model tends to be relatively robust and powerful when the control arm has ordinal risk subgroups.
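    The chi-square comparator and the simulation of operating characteristics such as overall Type I error, both mentioned in this abstract, can be sketched with a toy simulation. The event rates and sample sizes below are hypothetical, not the DHA trial's: under the null (no treatment effect), a 2x2 chi-square test at the 3.841 critical value should reject in roughly 5% of simulated trials.

```python
import random

random.seed(7)

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]
    (no continuity correction)."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den if den else 0.0

def simulate_type1(n_trials=2000, n_per_arm=150, p=0.2):
    """Fraction of null trials (equal event rate p in both arms) where the
    chi-square test rejects at the two-sided 5% level (critical value 3.841)."""
    rejections = 0
    for _ in range(n_trials):
        e1 = sum(random.random() < p for _ in range(n_per_arm))
        e2 = sum(random.random() < p for _ in range(n_per_arm))
        rejections += chi2_2x2(e1, n_per_arm - e1, e2, n_per_arm - e2) > 3.841
    return rejections / n_trials

print(simulate_type1())   # close to the nominal 0.05
```

The abstract's Bayesian designs (independent, hierarchical, dynamic linear models) require this same kind of simulation to calibrate their decision thresholds so that the overall Type I error stays near 0.05.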

  3. Understanding complex clinical reasoning in infectious diseases for improving clinical decision support design.

    Science.gov (United States)

    Islam, Roosan; Weir, Charlene R; Jones, Makoto; Del Fiol, Guilherme; Samore, Matthew H

    2015-11-30

    Clinical experts' cognitive mechanisms for managing complexity have implications for the design of future innovative healthcare systems. The purpose of the study is to examine the constituents of decision complexity and explore the cognitive strategies clinicians use to control and adapt to their information environment. We used Cognitive Task Analysis (CTA) methods to interview 10 Infectious Disease (ID) experts at the University of Utah and Salt Lake City Veterans Administration Medical Center. Participants were asked to recall a complex, critical and vivid antibiotic-prescribing incident using the Critical Decision Method (CDM), a type of Cognitive Task Analysis (CTA). Using the four iterations of the Critical Decision Method, questions were posed to fully explore the incident, focusing in depth on the clinical components underlying the complexity. Probes were included to assess cognitive and decision strategies used by participants. The following three themes emerged as the constituents of decision complexity experienced by the Infectious Diseases experts: 1) the overall clinical picture does not match the pattern, 2) a lack of comprehension of the situation and 3) dealing with social and emotional pressures such as fear and anxiety. All these factors contribute to decision complexity. These factors almost always occurred together, creating unexpected events and uncertainty in clinical reasoning. Five themes emerged in the analyses of how experts deal with the complexity. Expert clinicians frequently used 1) watchful waiting instead of over-prescribing antibiotics, engaged in 2) theory of mind to project and simulate other practitioners' perspectives, reduced very complex cases into simple 3) heuristics, employed 4) anticipatory thinking to plan and re-plan events and consulted with peers to share knowledge, solicit opinions and 5) seek help on patient cases. The cognitive strategies to deal with decision complexity found in this study have important

  4. Clinical Digital Libraries Project: design approach and exploratory assessment of timely use in clinical environments*

    Science.gov (United States)

    MacCall, Steven L.

    2006-01-01

    Objective: The paper describes and evaluates the use of Clinical Digital Libraries Project (CDLP) digital library collections in terms of their facilitation of timely clinical information seeking. Design: A convenience sample of CDLP Web server log activity over a twelve-month period (7/2002 to 6/2003) was analyzed for evidence of timely information seeking after users were referred to digital library clinical topic pages from Web search engines. Sample searches were limited to those originating from medical schools (26% North American and 19% non-North American) and from hospitals or clinics (51% North American and 4% non-North American). Measurement: Timeliness was determined based on a calculation of the difference between the timestamps of the first and last Web server log “hit” during each search in the sample. The calculated differences were mapped into one of three ranges: less than one minute, one to three minutes, and three to five minutes. Results: Of the 864 searches analyzed, 48% were less than 1 minute, 41% were 1 to 3 minutes, and 11% were 3 to 5 minutes. These results were further analyzed by environment (medical schools versus hospitals or clinics) and by geographic location (North America versus non-North American). Searches reflected a consistent pattern of less than 1 minute in these environments. Though the results were not consistent on a month-by-month basis over the entire time period, data for 8 of 12 months showed that searches shorter than 1 minute predominated and data for 1 month showed an equal number of less than 1 minute and 1 to 3 minute searches. Conclusions: The CDLP digital library collections provided timely access to high-quality Web clinical resources when used for information seeking in medical education and hospital or clinic environments from North American and non–North American locations and consistently provided access to the sought information within the documented two-minute standard. The limitations of the use of
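The timeliness measurement described in this record (the difference between the first and last server-log timestamps of a search, mapped into three ranges) can be sketched as follows. The boundary handling is an assumption, since the abstract does not state whether the cutoffs are inclusive:

```python
from datetime import datetime

def timeliness_bin(first_hit, last_hit):
    """Map a search's first/last web-server-log hits to the study's ranges."""
    seconds = (last_hit - first_hit).total_seconds()
    if seconds < 60:
        return "less than 1 minute"
    if seconds < 180:
        return "1 to 3 minutes"
    if seconds < 300:
        return "3 to 5 minutes"
    return "over 5 minutes"  # outside the sample's five-minute window

t0 = datetime(2003, 6, 1, 10, 0, 0)
bin_label = timeliness_bin(t0, datetime(2003, 6, 1, 10, 0, 45))
```

Applied over all searches in a month, counting the bins reproduces the kind of distribution the study reports (48% under one minute, and so on).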

  5. Clinical and pathological outcomes after irreversible electroporation of the pancreas using two parallel plate electrodes: a porcine model.

    Science.gov (United States)

    Rombouts, Steffi J E; van Dijck, Willemijn P M; Nijkamp, Maarten W; Derksen, Tyche C; Brosens, Lodewijk A A; Hoogwater, Frederik J H; van Leeuwen, Maarten S; Borel Rinkes, Inne H M; van Hillegersberg, Richard; Wittkampf, Fred H; Molenaar, Izaak Q

    2017-12-01

    Irreversible electroporation (IRE) by inserting needles around the tumor as treatment for locally advanced pancreatic cancer entails several disadvantages, such as incomplete ablation due to field inhomogeneity, technical difficulties in needle placement and a risk of pancreatic fistula development. This experimental study evaluates outcomes of IRE using paddles in a porcine model. Six healthy pigs underwent laparotomy and were treated with 2 separate ablations (in head and tail of the pancreas). Follow-up consisted of clinical and laboratory parameters and contrast-enhanced computed tomography (ceCT) imaging. After 2 weeks, pancreatoduodenectomy was performed for histology and the pigs were terminated. All animals survived 14 days. None of the animals developed signs of infection or significant abdominal distention. Serum amylase and lipase peaked at day 1 postoperatively in all pigs, but normalized without signs of pancreatitis. On ceCT-imaging the ablation zone was visible as an ill-defined, hypodense lesion. No abscesses, cysts or ascites were seen. Histology showed a homogenous fibrotic lesion in all pigs. IRE ablation of healthy porcine pancreatic tissue using two plate electrodes is feasible and safe and creates a homogeneous fibrotic lesion. IRE-paddles should be tested on pancreatic adenocarcinoma to determine the effect in cancer tissue. Copyright © 2017. Published by Elsevier Ltd.

  6. Scale-up for model verification; design, modeling and control of an elastic parallel kinematic 6-DOFs manipulator

    NARCIS (Netherlands)

    Huijts, Martijn; Brouwer, Dannis Michel; van Dijk, Johannes

    2009-01-01

    Manipulators with guidance constructions based on elastic mechanisms are of interest for the increasing number of precision vacuum applications. Previously, the design of an elastic MEMS-based 6-DOFs manipulator was presented. Characterization in six degrees of freedom (DOFs) of a manipulator the

  7. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Vigil, Dena M.; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Lefantzi, Sophia (Sandia National Laboratories, Livermore, CA); Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Eddy, John P.

    2011-12-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.

  8. Series vs parallel connected organic tandem solar cells : cell performance and impact on the design and operation of functional modules

    NARCIS (Netherlands)

    Etxebarriaa, I.; Furlan, A.; Ajuria, J.; Fecher, F.W.; Voigt, de M.J.A.; Brabecd, C.J.; Wienk, M.M.; Slooff, L.H.; Veenstra, S.; Gilot, J.; Pacios, R.

    2014-01-01

    Tandem solar cells are the best approach to maximize the light harvesting and adjust the overall absorption of the cell to the solar irradiance spectrum. Usually, the front and back subcells are connected in series in two-terminal device (2T) designs which require a current matching between both

  9. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis :

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Brian M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ebeida, Mohamed Salah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldred, Michael S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jakeman, John Davis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stephens, John Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vigil, Dena M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wildey, Timothy Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bohnhoff, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hu, Kenneth T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dalbey, Keith R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bauman, Lara E [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hough, Patricia Diane [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-05-01

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  10. Parallel R

    CERN Document Server

    McCallum, Ethan

    2011-01-01

    It's tough to argue with R as a high-quality, cross-platform, open source statistical software product-unless you're in the business of crunching Big Data. This concise book introduces you to several strategies for using R to analyze large datasets. You'll learn the basics of Snow, Multicore, Parallel, and some Hadoop-related tools, including how to find them, how to use them, when they work well, and when they don't. With these packages, you can overcome R's single-threaded nature by spreading work across multiple CPUs, or offloading work to multiple machines to address R's memory barrier.
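The book targets R, but its core idea, spreading independent work across CPUs to sidestep a single-threaded runtime, can be illustrated in Python with the standard multiprocessing module. The per-task work below is a toy stand-in for a real simulation:

```python
from multiprocessing import Pool

def simulate(seed):
    """Toy CPU-bound task: iterate a small linear-congruential recurrence."""
    x = seed
    for _ in range(10_000):
        x = (1103515245 * x + 12345) % 2**31
    return x

if __name__ == "__main__":
    # Spread the 8 independent tasks across 4 worker processes,
    # analogous to Snow/Multicore/Parallel spreading R jobs over CPUs.
    with Pool(processes=4) as pool:
        results = pool.map(simulate, range(8))
    print(len(results))
```

Because each task is independent, the master process only pays the cost of distributing inputs and collecting outputs, the same pattern the R packages in the book implement.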

  11. Clinic Design as Placebo—Using Design to Promote Healing and Support Treatments

    Directory of Open Access Journals (Sweden)

    Jonas Rehn

    2017-11-01

    Full Text Available Analogously to the medical placebo effect, people seem to anticipate the quality of treatments based on external stimuli. In order to gain insights on the effect the built environment can have on a person’s judgments and behavior with a particular focus on health related issues, a quantitative survey (N = 851) with four groups before and after the renovation of a rehabilitation clinic has been conducted. In line with an overall modernization of the clinic, the entrance, the lobby, and some patient rooms have been changed. In the lobby, a service counter and coffee bar have been added as well as light colors and new flooring material to achieve a more modern and clean atmosphere in the sense of aesthetical appearance of the space. The outcome revealed that patients rate the intention to change their health behavior as well as the quality of food or significantly higher in a modernized clinic. These differences cannot be directly attributed solely to the changes in the building. Analogously to the medical placebo, an effect referred to as design placebo effect is, therefore, proposed to explain improved ratings of aspects that have not directly been changed due to the intervention. Other significant effects are attributable to winter and summer climate. During summer time, ratings for waiting area, atmosphere, patient rooms, as well as for staff were significantly higher. It is, therefore, assumed that aesthetic attributes, such as architectural design, or friendliness of the weather, exert their effects as perceptual placebos that directly influence judgment outcomes and behavioral intentions. Further research is needed to match certain design and general environmental features to their effects on patients and investigate their effect strength.

  12. Clinic Design as Placebo-Using Design to Promote Healing and Support Treatments.

    Science.gov (United States)

    Rehn, Jonas; Schuster, Kai

    2017-11-09

    Analogously to the medical placebo effect, people seem to anticipate the quality of treatments based on external stimuli. In order to gain insights on the effect the built environment can have on a person's judgments and behavior with a particular focus on health related issues, a quantitative survey ( N = 851) with four groups before and after the renovation of a rehabilitation clinic has been conducted. In line with an overall modernization of the clinic, the entrance, the lobby, and some patient rooms have been changed. In the lobby, a service counter and coffee bar have been added as well as light colors and new flooring material to achieve a more modern and clean atmosphere in the sense of aesthetical appearance of the space. The outcome revealed that patients rate the intention to change their health behavior as well as the quality of food or significantly higher in a modernized clinic. These differences cannot be directly attributed solely to the changes in the building. Analogously to the medical placebo, an effect referred to as design placebo effect is, therefore, proposed to explain improved ratings of aspects that have not directly been changed due to the intervention. Other significant effects are attributable to winter and summer climate. During summer time, ratings for waiting area, atmosphere, patient rooms, as well as for staff were significantly higher. It is, therefore, assumed that aesthetic attributes, such as architectural design, or friendliness of the weather, exert their effects as perceptual placebos that directly influence judgment outcomes and behavioral intentions. Further research is needed to match certain design and general environmental features to their effects on patients and investigate their effect strength.

  13. A magneto-motive ultrasound platform designed for pre-clinical and clinical applications

    Directory of Open Access Journals (Sweden)

    Diego Ronaldo Thomaz Sampaio

    Full Text Available Abstract Introduction Magneto-motive ultrasound (MMUS) combines magnetism and ultrasound (US) to detect magnetic nanoparticles in soft tissues. One type of MMUS called shear-wave dispersion magneto-motive ultrasound (SDMMUS) analyzes magnetically induced shear waves (SW) to quantify the elasticity and viscosity of the medium. The lack of established presets or protocols for pre-clinical and clinical studies currently limits the use of MMUS techniques in the clinical setting. Methods This paper proposes a platform to acquire, process, and analyze MMUS and SDMMUS data integrated with clinical ultrasound equipment. For this purpose, we developed an easy-to-use graphical user interface, written in C++/Qt4, to create an MMUS pulse sequence and collect the ultrasonic data. We designed a graphic interface written in MATLAB to process, display, and analyze the MMUS images. To exemplify how useful the platform is, we conducted two experiments, namely (i) MMUS imaging to detect magnetic particles in the stomach of a rat, and (ii) SDMMUS to estimate the viscoelasticity of a tissue-mimicking phantom containing a spherical target of ferrite. Results The developed software proved to be an easy-to-use platform to automate the acquisition of MMUS/SDMMUS data and image processing. In an in vivo experiment, the MMUS technique detected an area of 6.32 ± 1.32 mm2 where magnetic particles were heterogeneously distributed in the stomach of the rat. The SDMMUS method gave elasticity and viscosity values of 5.05 ± 0.18 kPa and 2.01 ± 0.09 Pa.s, respectively, for a tissue-mimicking phantom. Conclusion Implementation of an MMUS platform with such presets and protocols provides a step toward the clinical implementation of MMUS imaging equipment. This platform may help to localize magnetic particles and quantify the elasticity and viscosity of soft tissues, paving a way for its use in pre-clinical and clinical studies.

  14. Parallel steady state studies on a milliliter scale accelerate fed-batch bioprocess design for recombinant protein production with Escherichia coli.

    Science.gov (United States)

    Schmideder, Andreas; Cremer, Johannes H; Weuster-Botz, Dirk

    2016-11-01

    In general, fed-batch processes are applied for recombinant protein production with Escherichia coli (E. coli). However, state of the art methods for identifying suitable reaction conditions suffer from severe drawbacks, i.e. direct transfer of process information from parallel batch studies is often defective and sequential fed-batch studies are time-consuming and cost-intensive. In this study, continuously operated stirred-tank reactors on a milliliter scale were applied to identify suitable reaction conditions for fed-batch processes. Isopropyl β-d-1-thiogalactopyranoside (IPTG) induction strategies were varied in parallel-operated stirred-tank bioreactors to study the effects on the continuous production of the recombinant protein photoactivatable mCherry (PAmCherry) with E. coli. Best-performing induction strategies were transferred from the continuous processes on a milliliter scale to liter scale fed-batch processes. Inducing recombinant protein expression by dynamically increasing the IPTG concentration to 100 µM led to an increase in the product concentration of 21% (8.4 g L-1) compared to an implemented high-performance production process with the most frequently applied induction strategy by a single addition of 1000 µM IPTG. Thus, identifying feasible reaction conditions for fed-batch processes in parallel continuous studies on a milliliter scale was shown to be a powerful, novel method to accelerate bioprocess design in a cost-reducing manner. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1426-1435, 2016. © 2016 American Institute of Chemical Engineers.

  15. Design and development of repetitive capacitor charging power supply based on series-parallel resonant converter topology.

    Science.gov (United States)

    Patel, Ankur; Nagesh, K V; Kolge, Tanmay; Chakravarthy, D P

    2011-04-01

    LCL resonant converter based repetitive capacitor charging power supply (CCPS) is designed and developed in the division. The LCL converter acts as a constant current source when switching frequency is equal to the resonant frequency. When both resonant inductors' values of LCL converter are same, it results in inherent zero current switching (ZCS) in switches. In this paper, ac analysis with fundamental frequency approximation of LCL resonant tank circuit, frequency dependent of current gain converter followed by design, development, simulation, and practical result is described. Effect of change in switching frequency and resonant frequency and change in resonant inductors ratio on CCPS will be discussed. An efficient CCPS of average output power of 1.2 kJ/s, output voltage 3 kV, and 300 Hz repetition rate is developed in the division. The performance of this CCPS has been evaluated in the laboratory by charging several values of load capacitance at various repetition rates. These results indicate that this design is very feasible for use in capacitor-charging applications. © 2011 American Institute of Physics
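The constant-current condition described in this abstract (switching at the resonant frequency of the tank) rests on two textbook relations that can be checked numerically. The component values below are hypothetical and not taken from the paper:

```python
import math

def resonant_frequency(L, C):
    """Series-resonant frequency f0 = 1 / (2*pi*sqrt(L*C)) of an L-C branch."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

def constant_charge_current(v_dc, L, C):
    # Idealized LCL behaviour when driven exactly at f0: the load current
    # is fixed by V / Z0, with characteristic impedance Z0 = sqrt(L / C).
    return v_dc / math.sqrt(L / C)

# Hypothetical tank values for illustration only
f0 = resonant_frequency(100e-6, 0.47e-6)
i_out = constant_charge_current(600.0, 100e-6, 0.47e-6)
```

The load-independence of i_out at f0 is what makes such a converter attractive for capacitor charging: the capacitor voltage ramps linearly regardless of its state of charge.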

  16. The effect of a corticosteroid cream and a barrier-strengthening moisturizer in hand eczema. A double-blind, randomized, prospective, parallel group clinical trial.

    Science.gov (United States)

    Lodén, M; Wirén, K; Smerud, K T; Meland, N; Hønnås, H; Mørk, G; Lützow-Holm, C; Funk, J; Meding, B

    2012-05-01

    Hand eczema is a common and persistent disease with a relapsing course. Clinical data suggest that once daily treatment with corticosteroids is just as effective as twice daily treatment. The aim of this study was to compare once and twice daily applications of a strong corticosteroid cream in addition to maintenance therapy with a moisturizer in patients with a recent relapse of hand eczema. The study was a parallel, double-blind, randomized, clinical trial on 44 patients. Twice daily application of a strong corticosteroid cream (betamethasone valerate 0.1%) was compared with once daily application, where a urea-containing moisturizer was substituted for the corticosteroid cream in the morning. The investigator scored the presence of eczema and the patients judged the health-related quality of life (HRQoL) using the Dermatology Life Quality Index (DLQI), which measures how much the patient's skin problem has affected his/her life over the past week. The patients also judged the severity of their eczema daily on a visual analogue scale. Both groups improved in terms of eczema and DLQI. However, the clinical scoring demonstrated that once daily application of corticosteroid was superior to twice daily application in diminishing eczema, especially in the group of patients with lower eczema scores at inclusion. Twice daily use of corticosteroids was not superior to once daily use in treating eczema. On the contrary, the clinical assessment showed a larger benefit from once daily treatment compared with twice daily, especially in the group of patients with a moderate eczema at inclusion. © 2011 The Authors. Journal of the European Academy of Dermatology and Venereology © 2011 European Academy of Dermatology and Venereology.

  17. MOEA based design of decentralized controllers for LFC of interconnected power systems with nonlinearities, AC-DC parallel tie-lines and SMES units

    International Nuclear Information System (INIS)

    Ganapathy, S.; Velusami, S.

    2010-01-01

    A new design of Multi-Objective Evolutionary Algorithm based decentralized controllers for load-frequency control of interconnected power systems with Governor Dead Band and Generation Rate Constraint nonlinearities, AC-DC parallel tie-lines and Superconducting Magnetic Energy Storage (SMES) units, is proposed in this paper. The HVDC link is used as system interconnection in parallel with AC tie-line to effectively damp the frequency oscillations of AC system while the SMES unit provides bulk energy storage and release, thereby achieving combined benefits. The proposed controller satisfies two main objectives, namely, minimum Integral Squared Error of the system output and maximum closed-loop stability of the system. Simulation studies are conducted on a two area interconnected power system with nonlinearities, AC-DC tie-lines and SMES units. Results indicate that the proposed controller improves the transient responses and guarantees the closed-loop stability of the overall system even in the presence of system nonlinearities and with parameter changes.
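The first design objective named in this abstract, minimum Integral Squared Error of the system output, has a direct discrete approximation. The decaying trace below is a toy stand-in for a frequency-deviation response, not data from the paper:

```python
def integral_squared_error(errors, dt):
    """Rectangular-rule approximation of ISE = integral of e(t)^2 dt."""
    return sum(e * e for e in errors) * dt

# Toy frequency-deviation trace decaying geometrically toward zero
trace = [0.5 * (0.8 ** k) for k in range(50)]
ise = integral_squared_error(trace, dt=0.1)
```

In an evolutionary search like the one described, each candidate controller would be simulated, scored with a figure such as this, and ranked against the stability objective.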

  18. Efficacy and safety of sacubitril/valsartan (LCZ696) in Japanese patients with chronic heart failure and reduced ejection fraction: Rationale for and design of the randomized, double-blind PARALLEL-HF study.

    Science.gov (United States)

    Tsutsui, Hiroyuki; Momomura, Shinichi; Saito, Yoshihiko; Ito, Hiroshi; Yamamoto, Kazuhiro; Ohishi, Tomomi; Okino, Naoko; Guo, Weinong

    2017-09-01

    The prognosis of heart failure patients with reduced ejection fraction (HFrEF) in Japan remains poor, although there is growing evidence for increasing use of evidence-based pharmacotherapies in Japanese real-world HF registries. Sacubitril/valsartan (LCZ696) is a first-in-class angiotensin receptor neprilysin inhibitor shown to reduce mortality and morbidity in the recently completed largest outcome trial in patients with HFrEF (PARADIGM-HF trial). The prospectively designed phase III PARALLEL-HF (Prospective comparison of ARNI with ACE inhibitor to determine the noveL beneficiaL trEatment vaLue in Japanese Heart Failure patients) study aims to assess the clinical efficacy and safety of LCZ696 in Japanese HFrEF patients, and show similar improvements in clinical outcomes as the PARADIGM-HF study enabling the registration of LCZ696 in Japan. This is a multicenter, randomized, double-blind, parallel-group, active controlled study of 220 Japanese HFrEF patients. Eligibility criteria include a diagnosis of chronic HF (New York Heart Association Class II-IV) and reduced ejection fraction (left ventricular ejection fraction ≤35%) and increased plasma concentrations of natriuretic peptides [N-terminal pro B-type natriuretic peptide (NT-proBNP) ≥600pg/mL, or NT-proBNP ≥400pg/mL for those who had a hospitalization for HF within the last 12 months] at the screening visit. The study consists of three phases: (i) screening, (ii) single-blind active LCZ696 run-in, and (iii) double-blind randomized treatment. Patients tolerating LCZ696 50mg bid during the treatment run-in are randomized (1:1) to receive LCZ696 100mg bid or enalapril 5mg bid for 4 weeks followed by up-titration to target doses of LCZ696 200mg bid or enalapril 10mg bid in a double-blind manner. The primary outcome is the composite of cardiovascular death or HF hospitalization and the study is an event-driven trial. The design of the PARALLEL-HF study is aligned with the PARADIGM-HF study and aims to assess

  19. Parallel Lines

    Directory of Open Access Journals (Sweden)

    James G. Worner

    2017-05-01

    Full Text Available James Worner is an Australian-based writer and scholar currently pursuing a PhD at the University of Technology Sydney. His research seeks to expose masculinities lost in the shadow of Australia’s Anzac hegemony while exploring new opportunities for contemporary historiography. He is the recipient of the Doctoral Scholarship in Historical Consciousness at the university’s Australian Centre of Public History and will be hosted by the University of Bologna during 2017 on a doctoral research writing scholarship.   ‘Parallel Lines’ is one of a collection of stories, The Shapes of Us, exploring liminal spaces of modern life: class, gender, sexuality, race, religion and education. It looks at lives, like lines, that do not meet but which travel in proximity, simultaneously attracted and repelled. James’ short stories have been published in various journals and anthologies.

  20. PC6 acupoint stimulation for the prevention of postcardiac surgery nausea and vomiting: a protocol for a two-group, parallel, superiority randomised clinical trial.

    Science.gov (United States)

    Cooke, Marie; Rickard, Claire; Rapchuk, Ivan; Shekar, Kiran; Marshall, Andrea P; Comans, Tracy; Doi, Suhail; McDonald, John; Spooner, Amy

    2014-11-13

    Postoperative nausea and vomiting (PONV) are frequent but unwanted complications for patients following anaesthesia and cardiac surgery, affecting at least a third of patients, despite pharmacological treatment. The primary aim of the proposed research is to test the efficacy of PC6 acupoint stimulation versus placebo for reducing PONV in cardiac surgery patients. In conjunction with this we aim to develop an understanding of intervention fidelity and factors that support, or impede, the use of PC6 acupoint stimulation, a knowledge translation approach. 712 postcardiac surgery participants will be recruited to take part in a two-group, parallel, superiority, randomised controlled trial. Participants will be randomised to receive a wrist band on each wrist providing acupressure to PC6 using acupoint stimulation or a placebo. Randomisation will be computer generated, use randomly varied block sizes, and be concealed prior to the enrolment of each patient. The wristbands will remain in place for 36 h. PONV will be evaluated by the assessment of both nausea and vomiting, use of rescue antiemetics, quality of recovery and cost. Patient satisfaction with PONV care will be measured and clinical staff interviewed about the clinical use, feasibility, acceptability and challenges of using acupressure wristbands for PONV. Ethics approval will be sought from appropriate Human Research Ethics Committee/s before start of the study. A systematic review of the use of wrist acupressure for PC6 acupoint stimulation reported minor side effects only. Study progress will be reviewed by a Data Safety Monitoring Committee (DSMC) for nausea and vomiting outcomes at n=350. Dissemination of results will include conference presentations at national and international scientific meetings and publications in peer-reviewed journals. Study participants will receive a one-page lay-summary of results. Australian New Zealand Clinical Trials Registry--ACTRN12614000589684. Published by the BMJ

  1. Continuous quality improvement interventions to improve long-term outcomes of antiretroviral therapy in women who initiated therapy during pregnancy or breastfeeding in the Democratic Republic of Congo: design of an open-label, parallel, group randomized trial.

    Science.gov (United States)

    Yotebieng, Marcel; Behets, Frieda; Kawende, Bienvenu; Ravelomanana, Noro Lantoniaina Rosa; Tabala, Martine; Okitolonda, Emile W

    2017-04-26

    Despite the rapid adoption of the World Health Organization's 2013 guidelines, children continue to be infected with HIV perinatally because of sub-optimal adherence to the continuum of HIV care in maternal and child health (MCH) clinics. To achieve the UNAIDS goal of eliminating mother-to-child HIV transmission, multiple, adaptive interventions need to be implemented to improve adherence to the HIV continuum. The aim of this open label, parallel, group randomized trial is to evaluate the effectiveness of Continuous Quality Improvement (CQI) interventions implemented at facility and health district levels to improve retention in care and virological suppression through 24 months postpartum among pregnant and breastfeeding women receiving ART in MCH clinics in Kinshasa, Democratic Republic of Congo. Prior to randomization, the current monitoring and evaluation system will be strengthened to enable collection of high quality individual patient-level data necessary for timely indicators production and program outcomes monitoring to inform CQI interventions. Following randomization, in health districts randomized to CQI, quality improvement (QI) teams will be established at the district level and at MCH clinics level. For 18 months, QI teams will be brought together quarterly to identify key bottlenecks in the care delivery system using data from the monitoring system, develop an action plan to address those bottlenecks, and implement the action plan at the level of their district or clinics. If proven to be effective, CQI as designed here, could be scaled up rapidly in resource-scarce settings to accelerate progress towards the goal of an AIDS free generation. The protocol was retrospectively registered on February 7, 2017. ClinicalTrials.gov Identifier: NCT03048669 .

  2. Clinical simulation as a boundary object in design of health IT-systems

    DEFF Research Database (Denmark)

    Rasmussen, Stine Loft; Jensen, Sanne; Lyng, Karen Marie

    2013-01-01

    Clinical simulation provides the opportunity to evaluate the design and the usage of clinical IT-systems without endangering the patients and interrupting clinical work. In this paper we present how clinical simulation additionally holds the potential to function as a boundary object in the design process. The case...... points out that clinical simulation provides an opportunity for discussions and mutual learning among the various stakeholders involved in design of standardized electronic clinical documentation templates. The paper presents and discusses the use of clinical simulation in the translation, transfer...... and transformation of knowledge between various stakeholders in a large healthcare organization...

  3. Parallel Framework for Cooperative Processes

    Directory of Open Access Journals (Sweden)

    Mitică Craus

    2005-01-01

    This paper describes the work of an object-oriented framework designed to be used in the parallelization of a set of related algorithms. The idea behind the system we are describing is to have a re-usable framework for running several sequential algorithms in a parallel environment. The algorithms that the framework can be used with have several things in common: they have to run in cycles and it should be possible to split the work between several "processing units". The parallel framework uses the message-passing communication paradigm and is organized as a master-slave system. Two applications are presented: an Ant Colony Optimization (ACO) parallel algorithm for the Travelling Salesman Problem (TSP) and an Image Processing (IP) parallel algorithm for the Symmetrical Neighborhood Filter (SNF). The implementations of these applications by means of the parallel framework prove to have good performance: approximately linear speedup and low communication cost.
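The master-slave, message-passing organization the abstract describes can be sketched with Python's standard library. Threads and queues stand in here for a real message-passing layer such as MPI, and all names are illustrative, not the framework's actual API:

```python
import threading
import queue

def run_master_slave(tasks, worker_fn, n_workers=4):
    """Distribute independent tasks to slaves and collect their results.

    Models the master-slave pattern: the master enqueues work items as
    messages, slaves process them in parallel and send results back.
    """
    work_q, result_q = queue.Queue(), queue.Queue()

    def slave():
        while True:
            item = work_q.get()
            if item is None:          # poison pill: master signals shutdown
                break
            result_q.put(worker_fn(item))

    workers = [threading.Thread(target=slave) for _ in range(n_workers)]
    for w in workers:
        w.start()
    for t in tasks:                   # master distributes the work
        work_q.put(t)
    for _ in workers:                 # one shutdown message per slave
        work_q.put(None)
    for w in workers:
        w.join()
    return [result_q.get() for _ in tasks]

# Example: square a list of numbers in parallel (order not guaranteed).
results = sorted(run_master_slave(range(10), lambda x: x * x))
```

The cyclic algorithms the paper targets would call `run_master_slave` once per iteration, with the master merging results between cycles.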

  4. Osteoarthritis subpopulations and implications for clinical trial design

    NARCIS (Netherlands)

    S.M. Bierma-Zeinstra (Sita); A.P. Verhagen (Arianne)

    2011-01-01

    Treatment guidelines for osteoarthritis have stressed the need for research on clinical predictors of response to different treatments. However, identifying such clinical predictors of response is less easy than it seems, and there is no given classification of osteoarthritis

  5. Evaluation of pulsing magnetic field effects on paresthesia in multiple sclerosis patients, a randomized, double-blind, parallel-group clinical trial.

    Science.gov (United States)

    Afshari, Daryoush; Moradian, Nasrin; Khalili, Majid; Razazian, Nazanin; Bostani, Arash; Hoseini, Jamal; Moradian, Mohamad; Ghiasian, Masoud

    2016-10-01

    Evidence is mounting that magnet therapy could alleviate the symptoms of multiple sclerosis (MS). This study was performed to test the effects of pulsing magnetic fields on paresthesia in MS patients. It was conducted as a randomized, double-blind, parallel-group clinical trial from April 2012 to October 2013. The subjects were selected from patients referred to the MS clinic of Imam Reza Hospital, affiliated to Kermanshah University of Medical Sciences, Iran. Sixty-three patients with MS were included in the study and randomly divided into two groups: 35 patients were exposed to a pulsing magnetic field (4 mT intensity, 15 Hz sinusoidal wave) for 20 min per session, 2 times per week, over a period of 2 months (16 sessions), and 28 patients were exposed to a magnetically inactive field (placebo) on the same schedule. The severity of paresthesia was measured by the numerical rating scale (NRS) at 30 and 60 days. The primary end point was the NRS change between baseline and 60 days; the secondary outcome was the NRS change between baseline and 30 days. Patients exposed to the magnetic field showed significant paresthesia improvement compared with patients exposed to placebo. According to our results, pulsed magnetic therapy could alleviate paresthesia in MS patients, but trials with more patients and longer duration are needed to describe long-term effects. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Intensive insulin treatment improves forearm blood flow in critically ill patients: a randomized parallel design clinical trial.

    Science.gov (United States)

    Žuran, Ivan; Poredos, Pavel; Skale, Rafael; Voga, Gorazd; Gabrscek, Lucija; Pareznik, Roman

    2009-01-01

    Intensive insulin treatment of critically ill patients was seen as a promising method of treatment, though recent studies showed that reducing the blood glucose level below 6 mmol/l had a detrimental outcome. The mechanisms of the effects of insulin in the critically ill are not completely understood. The purpose of the study was to test the hypothesis that intensive insulin treatment may influence forearm blood flow independently of global hemodynamic indicators. The study encompassed 29 patients of both sexes who were admitted to the intensive care unit due to sepsis and required artificial ventilation as the result of acute respiratory failure. 14 patients were randomly selected for intensive insulin treatment (Group 1; blood glucose concentration 4.4-6.1 mmol/l), and 15 were selected for conventional insulin treatment (Group 2; blood glucose level 7.0-11.0 mmol/l). At the start of the study (t0, beginning up to 48 hours after admittance and the commencement of artificial ventilation), at 2 hours (t1), 24 hours (t2), and 72 hours (t3), flow in the forearm was measured for 60 minutes using the strain-gauge plethysmography method. Student's t-test for independent samples was used for comparisons between the two groups, and the Mann-Whitney U-test where appropriate. Linear regression analysis and the Pearson correlation coefficient were used to determine the levels of correlation. The difference in 60-minute forearm flow at the start of the study (t0) was not statistically significant between groups, while at t2 and t3 significantly higher values were recorded in Group 1 (t2; Group 1: 420.6 +/- 188.8 ml/100 ml tissue; Group 2: 266.1 +/- 122.2 ml/100 ml tissue (95% CI 30.9-278.0, P = 0.02); t3; Group 1: 369.9 +/- 150.3 ml/100 ml tissue; Group 2: 272.6 +/- 85.7 ml/100 ml tissue (95% CI 5.4-190.0, P = 0.04)). At t1 a trend towards significantly higher values in Group 1 was noted (P = 0.05). The level of forearm flow was related to the amount of insulin infusion (r = 0.40). Compared to standard treatment, intensive insulin treatment of critically ill patients increases forearm flow. The flow increase was weakly related to the insulin dose, though not to blood glucose concentration. ISRCTN39026810.

  7. Data Warehouse Design from HL7 Clinical Document Architecture Schema.

    Science.gov (United States)

    Pecoraro, Fabrizio; Luzi, Daniela; Ricci, Fabrizio L

    2015-01-01

    This paper proposes a semi-automatic approach to extract clinical information structured in an HL7 Clinical Document Architecture (CDA) document and transform it into a data warehouse dimensional model schema. It is based on a conceptual framework published in a previous work that maps the dimensional model primitives to CDA elements. Its feasibility is demonstrated through a case study based on the analysis of vital signs gathered during laboratory tests.
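As a toy illustration of the kind of mapping the paper proposes, the snippet below extracts observation elements from a CDA-like XML fragment into fact-table rows for a vital-signs star schema. The XML layout and field names are simplified assumptions for illustration, not the actual HL7 CDA schema:

```python
import xml.etree.ElementTree as ET

# Simplified, hypothetical CDA-like fragment (not the real HL7 CDA schema).
cda = """
<document>
  <patient id="P1"/>
  <observation code="8480-6" name="Systolic BP" value="120" unit="mmHg"/>
  <observation code="8462-4" name="Diastolic BP" value="80" unit="mmHg"/>
</document>
"""

def to_fact_rows(xml_text):
    """Map observation elements to rows of a 'vital signs' fact table,
    keyed by the patient dimension."""
    root = ET.fromstring(xml_text)
    patient = root.find("patient").get("id")
    return [
        {"patient_id": patient,
         "measure_code": obs.get("code"),
         "measure_name": obs.get("name"),
         "value": float(obs.get("value")),
         "unit": obs.get("unit")}
        for obs in root.findall("observation")
    ]

rows = to_fact_rows(cda)
```

A real implementation would resolve coded CDA entries against terminology dimensions rather than copying attributes verbatim.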

  8. Group sequential and confirmatory adaptive designs in clinical trials

    CERN Document Server

    Wassmer, Gernot

    2016-01-01

    This book provides an up-to-date review of the general principles of and techniques for confirmatory adaptive designs. Confirmatory adaptive designs are a generalization of group sequential designs. With these designs, interim analyses are performed in order to stop the trial prematurely under control of the Type I error rate. In adaptive designs, it is also permissible to perform a data-driven change of relevant aspects of the study design at interim stages. This includes, for example, a sample-size reassessment, a treatment-arm selection or a selection of a pre-specified sub-population. Essentially, this adaptive methodology was introduced in the 1990s. Since then, it has become popular and the object of intense discussion and still represents a rapidly growing field of statistical research. This book describes adaptive design methodology at an elementary level, while also considering designing and planning issues as well as methods for analyzing an adaptively planned trial. This includes estimation methods...
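The core problem that group sequential designs address, namely Type I error inflation from repeated interim looks, can be illustrated with a small simulation. This is a sketch under simplified assumptions: one interim analysis at half the data, normally distributed test statistics, and a nominal two-sided alpha of 0.05 at each look:

```python
import random
import math

random.seed(12345)

def simulate_two_look_alpha(n_trials=20000, crit=1.959964):
    """Under the null hypothesis, test once at an interim look (half the
    data) and once at the end, both at the nominal 5% level, and count
    how often at least one look rejects. The two z-statistics are
    correlated (corr = sqrt(1/2)) because the final statistic contains
    the interim data."""
    rejections = 0
    for _ in range(n_trials):
        g1, g2 = random.gauss(0, 1), random.gauss(0, 1)
        z_interim = g1
        z_final = (g1 + g2) / math.sqrt(2)
        if abs(z_interim) > crit or abs(z_final) > crit:
            rejections += 1
    return rejections / n_trials

alpha_naive = simulate_two_look_alpha()
# The true two-look error rate is about 0.083, well above the nominal
# 0.05; group sequential boundaries (e.g. Pocock, O'Brien-Fleming)
# widen the critical values to bring it back to 0.05.
```

This is why the book's designs stop a trial early only "under control of the Type I error rate" rather than with repeated naive tests.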

  9. Expressing Parallelism with ROOT

    Energy Technology Data Exchange (ETDEWEB)

    Piparo, D. [CERN; Tejedor, E. [CERN; Guiraud, E. [CERN; Ganis, G. [CERN; Mato, P. [CERN; Moneta, L. [CERN; Valls Pla, X. [CERN; Canal, P. [Fermilab

    2017-11-22

    The need for processing the ever-increasing amount of data generated by the LHC experiments in a more efficient way has motivated ROOT to further develop its support for parallelism. Such support is being tackled both for shared-memory and distributed-memory environments. The incarnations of the aforementioned parallelism are multi-threading, multi-processing and cluster-wide executions. In the area of multi-threading, we discuss the new implicit parallelism and related interfaces, as well as the new building blocks to safely operate with ROOT objects in a multi-threaded environment. Regarding multi-processing, we review the new MultiProc framework, comparing it with similar tools (e.g. multiprocessing module in Python). Finally, as an alternative to PROOF for cluster-wide executions, we introduce the efforts on integrating ROOT with state-of-the-art distributed data processing technologies like Spark, both in terms of programming model and runtime design (with EOS as one of the main components). For all the levels of parallelism, we discuss, based on real-life examples and measurements, how our proposals can increase the productivity of scientists.
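The map-based paradigm the record compares across tools (ROOT's MultiProc framework and Python's multiprocessing module) looks roughly as follows. A thread-backed `multiprocessing.dummy.Pool` is used purely to keep the sketch self-contained, since it exposes the same `map` interface as the process-backed `multiprocessing.Pool`; the event-processing function is a made-up stand-in for real per-event analysis work:

```python
# Thread-backed stand-in with the same Pool API as multiprocessing.Pool;
# ROOT's MultiProc offers an analogous map-based interface over processes.
from multiprocessing.dummy import Pool

def process_event(event):
    """Stand-in for per-event analysis work (the kind of kernel that
    would run on each chunk of data in a real analysis)."""
    return event * 2 + 1

with Pool(4) as pool:
    results = pool.map(process_event, range(8))
```

The appeal of this model, in ROOT as in Python, is that the parallel version of a loop over events stays a one-line `map` call.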

  10. Expressing Parallelism with ROOT

    Science.gov (United States)

    Piparo, D.; Tejedor, E.; Guiraud, E.; Ganis, G.; Mato, P.; Moneta, L.; Valls Pla, X.; Canal, P.

    2017-10-01

    The need for processing the ever-increasing amount of data generated by the LHC experiments in a more efficient way has motivated ROOT to further develop its support for parallelism. Such support is being tackled both for shared-memory and distributed-memory environments. The incarnations of the aforementioned parallelism are multi-threading, multi-processing and cluster-wide executions. In the area of multi-threading, we discuss the new implicit parallelism and related interfaces, as well as the new building blocks to safely operate with ROOT objects in a multi-threaded environment. Regarding multi-processing, we review the new MultiProc framework, comparing it with similar tools (e.g. multiprocessing module in Python). Finally, as an alternative to PROOF for cluster-wide executions, we introduce the efforts on integrating ROOT with state-of-the-art distributed data processing technologies like Spark, both in terms of programming model and runtime design (with EOS as one of the main components). For all the levels of parallelism, we discuss, based on real-life examples and measurements, how our proposals can increase the productivity of scientists.

  11. Designing an automated clinical decision support system to match clinical practice guidelines for opioid therapy for chronic pain

    Directory of Open Access Journals (Sweden)

    Clark Michael E

    2010-04-01

    Background: Opioid prescribing for chronic pain is common and controversial, but recommended clinical practices are followed inconsistently in many clinical settings. Strategies for increasing adherence to clinical practice guideline recommendations are needed to increase effectiveness and reduce negative consequences of opioid prescribing in chronic pain patients. Methods: Here we describe the process and outcomes of a project to operationalize the 2003 VA/DoD Clinical Practice Guideline for Opioid Therapy for Chronic Non-Cancer Pain into a computerized decision support system (DSS) to encourage good opioid prescribing practices during primary care visits. We based the DSS on the existing ATHENA-DSS. We used an iterative process of design, testing, and revision of the DSS by a diverse team including guideline authors, medical informatics experts, clinical content experts, and end-users to convert the written clinical practice guideline into a computable algorithm to generate patient-specific recommendations for care based upon existing information in the electronic medical record (EMR), and a set of clinical tools. Results: The iterative revision process identified numerous and varied problems with the initially designed system despite diverse expert participation in the design process. The process of operationalizing the guideline identified areas in which the guideline was vague, left decisions to clinical judgment, or required clarification of detail to ensure safe clinical implementation. The revisions led to workable solutions to problems, defined the limits of the DSS and its utility in clinical practice, improved integration into clinical workflow, and improved the clarity and accuracy of system recommendations and tools. Conclusions: Use of this iterative process led to development of a multifunctional DSS that met the approval of the clinical practice guideline authors, content experts, and clinicians involved in testing.

  12. Identification of a botanical inhibitor of intestinal diacylglyceride acyltransferase 1 activity via in vitro screening and a parallel, randomized, blinded, placebo-controlled clinical trial.

    Science.gov (United States)

    Velliquette, Rodney A; Grann, Kerry; Missler, Stephen R; Patterson, Jennifer; Hu, Chun; Gellenbeck, Kevin W; Scholten, Jeffrey D; Randolph, R Keith

    2015-01-01

    Diacylglyceride acyltransferase 1 (DGAT1) is the enzyme that adds the final fatty acid on to a diacylglyceride during triglyceride (TG) synthesis. DGAT1 plays a key role in the repackaging of dietary TG into circulating TG-rich chylomicrons. A growing amount of research has indicated that an exaggerated postprandial circulating TG level is a risk indicator for cardiovascular and metabolic disorders. The aim of this research was to identify a botanical extract that inhibits intestinal DGAT1 activity and attenuates postprandial hypertriglyceridemia in overweight and obese humans. Twenty individual phytochemicals and an internal proprietary botanical extract library were screened with a primary cell-free DGAT1 enzyme assay that contained dioleoyl glycerol and palmitoleoyl Coenzyme A as substrates plus human intestinal microsomes as the DGAT1 enzyme source. Botanical extracts with sufficiently low IC50 values were then evaluated in a parallel, double-blind, placebo-controlled clinical trial. Ninety healthy, overweight and obese participants were randomized to receive 2 g daily of placebo or individual botanical extracts (the investigational product) for seven days. Serum TG levels were measured before and after consuming a high fat meal (HFM) challenge (0.354 L drink/shake; 77 g fat, 25 g carbohydrate and 9 g protein) as a marker of intestinal DGAT1 enzyme activity. Phenolic acids (i.e., gallic acid) and polyphenols (i.e., cyanidin) abundantly found in nature appeared to inhibit DGAT1 enzyme activity in vitro. Four polyphenolic rich botanical extracts were identified from in vitro evaluation in both cell-free and cellular model systems: apple peel extract (APE), grape extract (GE), red raspberry leaf extract (RLE) and apricot/nectarine extract (ANE) (IC50 = 1.4, 5.6, 10.4, and 3.4 μg/mL, respectively). In the seven day clinical trial, compared to placebo, only GE significantly reduced the baseline subtracted change in serum TG AUC following

  13. Ensuring suitable quality of clinical measurements through design.

    Science.gov (United States)

    Orzechowski, Anthony; Petrides, Victoria; Scopp, Richard

    2018-04-17

    To design and deliver high quality, safe and effective products, manufacturers of in vitro diagnostic (IVD) products follow a structured, traceable process for controlling the uncertainty of results reported from their measurement systems. This process and its results, however, are not often shared in detail with those outside of the manufacturing company. The objective of this paper is to facilitate discussion by describing some of the best practices used during the IVD design and development process, highlighting some design challenges manufacturers face, and offering ideas for how IVD manufacturers and laboratories could work together to drive further improvements to public health. Copyright © 2018. Published by Elsevier Inc.

  14. Design of universal parallel-transmit refocusing kT-point pulses and application to 3D T2-weighted imaging at 7T.

    Science.gov (United States)

    Gras, Vincent; Mauconduit, Franck; Vignaud, Alexandre; Amadon, Alexis; Le Bihan, Denis; Stöcker, Tony; Boulant, Nicolas

    2018-07-01

    T2-weighted sequences are particularly sensitive to the radiofrequency (RF) field inhomogeneity problem at ultra-high field because of the errors accumulated by the imperfections of the train of refocusing pulses. As parallel transmission (pTx) has proved particularly useful to counteract RF heterogeneities, universal pulses were recently demonstrated to save precious time and computational effort by skipping B1 calibration and online RF pulse tailoring. Here, we report a universal RF pulse design for non-selective refocusing pulses to mitigate the RF inhomogeneity problem at 7T in turbo spin-echo sequences with variable flip angles. Average Hamiltonian theory was used to synthesize a single non-selective refocusing pulse with pTx while optimizing its scaling properties in the presence of static field offsets. The design was performed under explicit power and specific absorption rate constraints on a database of 10 subjects using an 8Tx-32Rx commercial coil at 7T. To validate the proposed design, the RF pulses were tested in simulation and applied in vivo on 5 additional test subjects. The root-mean-square rotation angle error (RA-NRMSE) evaluation and experimental data demonstrated great improvement with the proposed universal pulses (RA-NRMSE ∼8%) compared to the standard circularly polarized mode of excitation (RA-NRMSE ∼26%). This work further completes the spectrum of 3D universal pulses to mitigate RF field inhomogeneity throughout all 3D MRI sequences without any pTx calibration. The approach returns a single pulse that can be scaled to match the desired flip angle train, thereby increasing the modularity of the proposed plug-and-play approach. Magn Reson Med 80:53-65, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  15. Churchill: an ultra-fast, deterministic, highly scalable and balanced parallelization strategy for the discovery of human genetic variation in clinical and population-scale genomics.

    Science.gov (United States)

    Kelly, Benjamin J; Fitch, James R; Hu, Yangqiu; Corsmeier, Donald J; Zhong, Huachun; Wetzel, Amy N; Nordquist, Russell D; Newsom, David L; White, Peter

    2015-01-20

    While advances in genome sequencing technology make population-scale genomics a possibility, current approaches for analysis of these data rely upon parallelization strategies that have limited scalability, complex implementation and lack reproducibility. Churchill, a balanced regional parallelization strategy, overcomes these challenges, fully automating the multiple steps required to go from raw sequencing reads to variant discovery. Through implementation of novel deterministic parallelization techniques, Churchill allows computationally efficient analysis of a high-depth whole genome sample in less than two hours. The method is highly scalable, enabling full analysis of the 1000 Genomes raw sequence dataset in a week using cloud resources. http://churchill.nchri.org/.
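A balanced regional parallelization of the kind Churchill implements can be sketched as follows. The chromosome names, sizes, window rule, and round-robin balancing are illustrative assumptions, not Churchill's actual algorithm:

```python
def make_regions(chrom_sizes, n_workers):
    """Split chromosomes into fixed-size windows, then deal the windows
    round-robin across workers so total bases per worker stay balanced."""
    window = max(1, sum(chrom_sizes.values()) // (n_workers * 4))
    regions = []
    for chrom, size in chrom_sizes.items():
        for start in range(0, size, window):
            regions.append((chrom, start, min(start + window, size)))
    buckets = [[] for _ in range(n_workers)]
    for i, region in enumerate(regions):  # round-robin assignment
        buckets[i % n_workers].append(region)
    return buckets

# Illustrative (not real) chromosome lengths.
buckets = make_regions({"chr1": 1000, "chr2": 800}, n_workers=3)
```

A production pipeline must additionally handle variants spanning region boundaries, which is one of the determinism challenges Churchill addresses.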

  16. Design of an optimization algorithm for clinical use

    International Nuclear Information System (INIS)

    Gustafsson, Anders

    1995-01-01

    Radiation therapy optimization has received much attention in the past few years. In combination with biological objective functions, the different optimization schemes have shown the potential to considerably improve treatment outcome. With improved radiobiological models and increased computer capacity, radiation therapy optimization has now reached a stage where implementation in a clinical treatment planning system is realistic. A radiation therapy optimization method has been investigated with respect to its feasibility as a tool in a clinical 3D treatment planning system. The optimization algorithm is a constrained iterative gradient method. Photon dose calculation is performed using the clinically validated pencil-beam based algorithm of the clinical treatment planning system. Dose calculation within the optimization scheme is very time consuming and measures are required to decrease the calculation time. Different methods for more effective dose calculation within the optimization scheme have been investigated. The optimization results for adaptive sampling of calculation points, and for secondary effect approximations in the dose calculation algorithm, are compared with the optimization result for accurate dose calculation in all voxels of interest
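A constrained iterative gradient method of the general kind described above can be sketched as projected gradient descent: take a gradient step, then project back onto the feasible set. The objective and constraint here are toy stand-ins, not a radiobiological objective function:

```python
def projected_gradient(grad, project, x0, step=0.1, iters=200):
    """Constrained iterative gradient descent: after each gradient step,
    project the iterate back onto the feasible set."""
    x = x0
    for _ in range(iters):
        x = project(x - step * grad(x))
    return x

# Toy problem: minimize (x - 3)^2 subject to 0 <= x <= 2.
grad = lambda x: 2 * (x - 3)                  # derivative of the objective
project = lambda x: min(max(x, 0.0), 2.0)     # clamp onto the feasible box
x_opt = projected_gradient(grad, project, x0=0.0)
# Converges to the constrained optimum x = 2.0.
```

In treatment planning the iterate would be a beam-weight vector and the projection would enforce clinical constraints such as non-negative beam weights and dose limits.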

  17. Rigorous Clinical Trial Design in Public Health Emergencies Is Essential

    DEFF Research Database (Denmark)

    Ellenberg, Susan S; Keusch, Gerald T; Babiker, Abdel G

    2018-01-01

    Randomized clinical trials are the most reliable approaches to evaluating the effects of new treatments and vaccines. During the 2014-15 West African Ebola epidemic, many argued that such trials were neither ethical nor feasible in an environment of limited health infrastructure and severe disease...

  18. Design and Evaluation of a Bacterial Clinical Infectious Diseases Ontology

    Science.gov (United States)

    Gordon, Claire L.; Pouch, Stephanie; Cowell, Lindsay G.; Boland, Mary Regina; Platt, Heather L.; Goldfain, Albert; Weng, Chunhua

    2013-01-01

    With antimicrobial resistance increasing worldwide, there is a great need to use automated antimicrobial decision support systems (ADSSs) to lower antimicrobial resistance rates by promoting appropriate antimicrobial use. However, they are infrequently used, mostly because of their poor interoperability with different health information technologies. Ontologies can augment portable ADSSs by providing an explicit knowledge representation for biomedical entities and their relationships, helping to standardize and integrate heterogeneous data resources. We developed a bacterial clinical infectious diseases ontology (BCIDO) using Protégé-OWL. BCIDO defines a controlled terminology for clinical infectious diseases along with domain knowledge commonly used in hospital settings for clinical infectious disease treatment decision-making. BCIDO has 599 classes and 2355 object properties. Terms were imported from or mapped to Systematized Nomenclature of Medicine, Unified Medical Language System, RxNorm and National Center for Biotechnology Information Organismal Classification where possible. Domain expert evaluation using the “laddering” technique, ontology visualization, and clinical notes and scenarios, confirmed the correctness and potential usefulness of BCIDO. PMID:24551353

  19. Pharmacokinetics and bioavailability of plant lignan 7-hydroxymatairesinol and effects on serum enterolactone and clinical symptoms in postmenopausal women: a single-blinded, parallel, dose-comparison study.

    Science.gov (United States)

    Udani, Jay K; Brown, Donald J; Tan, Maria Olivia C; Hardy, Mary

    2013-01-01

    7-Hydroxymatairesinol (7-HMR) is a naturally occurring plant lignan found in whole grains and the Norway spruce (Picea abies). The purpose of this study was to evaluate the bioavailability of a proprietary 7-HMR product (HMRlignan, Linnea SA, Locarno, Switzerland) through measurement of lignan metabolites and metabolic precursors. A single-blind, parallel, pharmacokinetic and dose-comparison study was conducted on 22 postmenopausal females not receiving hormone replacement therapy. Subjects were enrolled in either a 36 mg/d (low-dose) or 72 mg/d (high-dose) regimen for 8 weeks. Primary measured outcomes included plasma levels of 7-HMR and enterolactone (ENL), and single-dose pharmacokinetic analysis was performed on a subset of subjects in the low-dose group. Safety data and adverse event reports were collected as well as data on hot flash frequency and severity. Pharmacokinetic studies demonstrated 7-HMR Cmax = 757.08 ng/ml at 1 hour and ENL Cmax = 4.8 ng/ml at 24 hours. From baseline to week 8, plasma 7-HMR levels increased by 191% in the low-dose group (p < 0.01) and by 1238% in the high-dose group (p < 0.05). Plasma ENL levels consistently increased as much as 157% from baseline in the low-dose group and 137% in the high-dose group. Additionally, the mean number of weekly hot flashes decreased by 50%, from 28.0/week to 14.3/week (p < 0.05) in the high-dose group. No significant safety issues were identified in this study. The results demonstrate that HMRlignan is quickly absorbed into the plasma and is metabolized to ENL in healthy postmenopausal women. Clinically, the data demonstrate a statistically significant improvement in hot flash frequency. Doses up to 72 mg/d HMRlignan for 8 weeks were safe and well tolerated in this population.

  20. Analysis and design of a parallel-connected single active bridge DC-DC converter for high-power wind farm applications

    DEFF Research Database (Denmark)

    Park, Kiwoo; Chen, Zhe

    2013-01-01

    This paper presents a parallel-connected Single Active Bridge (SAB) dc-dc converter for high-power applications. Paralleling lower-power converters can lower the current rating of each modular converter and interleaving the outputs can significantly reduce the magnitudes of input and output curre...

  1. Reinventing clinical trials: a review of innovative biomarker trial designs in cancer therapies.

    Science.gov (United States)

    Lin, Ja-An; He, Pei

    2015-06-01

    Recently, new clinical trial designs involving biomarkers have been studied and proposed in cancer clinical research, in the hope of incorporating rapidly growing basic research into clinical practice. Sources of data include journal articles on various biomarkers and their role in cancer clinical trials, articles and books about statistical issues in trial design, and regulatory websites, documents, and guidance for submission of targeted cancer therapies. The drug development process involves four phases, and the confirmatory Phase III is essential for regulatory approval of a specific treatment. Regulatory agencies place restrictions on confirmatory trials using adaptive designs. There is no rule of thumb for picking the most appropriate design for biomarker-related trials, statistical issues in the new designs remain to be solved, and regulatory acceptance of the newly proposed trial designs is still open. What is needed are biomarker-related trial designs that can resolve these statistical issues and satisfy the regulatory requirements. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  2. Design and implementation of an integrated architecture for massive parallel data treatment of analogue signals supplied by silicon detectors of very high spatial resolution

    International Nuclear Information System (INIS)

    Michel, J.

    1993-02-01

    This doctoral thesis studies an integrated architecture designed for the massively parallel treatment of analogue signals supplied by silicon detectors of very high spatial resolution. The first chapter is an introduction presenting the general outline and the triggering conditions of the spectrometer. Chapter two describes the operational structure of a microvertex detector made of Si micro-plates associated with the measuring chains. Information preconditioning is related to the pre-amplification stage, to pile-up effects and to the reduction in the time characteristic due to the high counting rates. Chapter three describes the architecture of the analogue delay buffer, analyses the intrinsic noise and presents the operational testing and input/output control operations. The fourth chapter is devoted to the description of the analogue pulse shape processor and also gives the testing and the corresponding measurements on the circuit. Finally, chapter five deals with the simplest modeling of the entire conditioning chain; the testing and measuring procedures are also discussed here. In conclusion the author presents some prospects for improving the signal-to-noise ratio by summation of the de-convoluted micro-paths. 78 refs., 78 figs., 1 annex

  3. Principles of human joint replacement design and clinical application

    CERN Document Server

    Buechel, Frederick F

    2015-01-01

    This book is written for the users and designers of joint replacements. In its second extended edition it conveys to the reader the knowledge accumulated by the authors during their forty-year effort on the development of replacement devices for the lower limb for the purpose of aiding the reader in their design and evaluation of joint replacement devices. The early chapters describe the engineering, scientific and medical principles needed for replacement joint evaluation. One must understand the nature and performance of the materials involved and their characteristics in vivo, i.e. the response of the body to implant materials. It is also essential to understand the response of the implants to applied loading and motion, particularly in the hostile physiological environment. A chapter describes the design methodology now required for joint replacement in the USA and EU countries. The remaining chapters provide a history of joint replacement, an evaluation of earlier and current devices and sample case hist...

  4. Principles of Human Joint Replacement Design and Clinical Application

    CERN Document Server

    Buechel, Frederick F

    2012-01-01

    Drs. Buechel, an orthopaedic surgeon, and Pappas, a professor of Mechanical Engineering, are the designers of several successful joint replacement systems. The most well-known of these is the pioneering LCS knee replacement. They have written this book for the users and designers of joint replacements. It is an attempt to convey to the reader the knowledge accumulated by the authors during their thirty-five-year effort on the development of replacement devices for the lower limb for the purpose of aiding the reader in their design and evaluation of joint replacement devices. The early chapters describe the engineering, scientific and medical principles needed for replacement joint evaluation. One must understand the nature and performance of the materials involved and their characteristics in vivo, i.e. the response of the body to implant materials. It is also essential to understand the response of the implants to applied loading and motion, particularly in the hostile physiological environment. A chapter de...

  5. The Galley Parallel File System

    Science.gov (United States)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    Most current multiprocessor file systems are designed to use multiple disks in parallel, using the high aggregate bandwidth to meet the growing I/O requirements of parallel scientific applications. Many multiprocessor file systems provide applications with a conventional Unix-like interface, allowing the application to access multiple disks transparently. This interface conceals the parallelism within the file system, increasing the ease of programmability, but making it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. In addition to providing an insufficient interface, most current multiprocessor file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic scientific multiprocessor workloads. We discuss Galley's file structure and application interface, as well as the performance advantages offered by that interface.

  6. Immediate interruption of sedation compared with usual sedation care in critically ill postoperative patients (SOS-Ventilation): a randomised, parallel-group clinical trial.

    Science.gov (United States)

    Chanques, Gerald; Conseil, Matthieu; Roger, Claire; Constantin, Jean-Michel; Prades, Albert; Carr, Julie; Muller, Laurent; Jung, Boris; Belafia, Fouad; Cissé, Moussa; Delay, Jean-Marc; de Jong, Audrey; Lefrant, Jean-Yves; Futier, Emmanuel; Mercier, Grégoire; Molinari, Nicolas; Jaber, Samir

    2017-10-01

    Avoidance of excessive sedation and subsequent prolonged mechanical ventilation in intensive care units (ICUs) is recommended, but no data are available for critically ill postoperative patients. We hypothesised that in such patients stopping sedation immediately after admission to the ICU could reduce unnecessary sedation and improve patient outcomes. We did a randomised, parallel-group, clinical trial at three ICUs in France. Stratified randomisation with minimisation (1:1 via a restricted web platform) was used to assign eligible patients (aged ≥18 years, admitted to an ICU after abdominal surgery, and expected to require at least 12 h of mechanical ventilation because of a critical illness defined by a Sequential Organ Failure Assessment score >1 for any organ, but without severe acute respiratory distress syndrome or brain injury) to usual sedation care provided according to recommended practices (control group) or to immediate interruption of sedation (intervention group). The primary outcome was the time to successful extubation (defined as the time from randomisation to the time of extubation [or tracheotomy mask] for at least 48 h). All patients who underwent randomisation (except for those who were excluded after randomisation) were included in the intention-to-treat analysis. This study is registered with ClinicalTrials.gov, number NCT01486121. Between Dec 2, 2011, and Feb 27, 2014, 137 patients were randomly assigned to the control (n=68) or intervention groups (n=69). In the intention-to-treat analysis, time to successful extubation was significantly lower in the intervention group than in the control group (median 8 h [IQR 4-36] vs 50 h [29-93], group difference -33·6 h [95% CI -44·9 to -22·4]; p<0·0001). The adjusted hazard ratio was 5·2 (95% CI 3·1-8·8, p<0·0001). Immediate interruption of sedation in critically ill postoperative patients with organ dysfunction who were admitted to the ICU after abdominal surgery improved outcomes compared
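The allocation scheme the trial reports (1:1 stratified randomisation with minimisation via a restricted web platform) can be sketched as a Pocock-Simon-style procedure: for each arm, compute the marginal imbalance that would result from assigning the new patient there, then follow the imbalance-minimising choice with a biased coin. The factor names, the 0.8 coin probability, and the function itself are assumptions for illustration, not the trial's actual algorithm.

```python
import random

ARMS = ("control", "intervention")

def minimization_assign(new_cov, history, factors, p_best=0.8, rng=None):
    """Pocock-Simon-style minimisation (sketch).

    new_cov: dict of the new patient's covariate levels, e.g. {"site": "A"}.
    history: list of (arm, covariates) for patients already randomised.
    With probability p_best the arm minimising total marginal imbalance is
    chosen; otherwise the other arm is, keeping allocation unpredictable.
    """
    rng = rng or random.Random()
    totals = {}
    for candidate in ARMS:
        total = 0
        for f in factors:
            # Count patients per arm sharing this patient's level of factor f.
            counts = {arm: sum(1 for a, cov in history
                               if a == arm and cov[f] == new_cov[f])
                      for arm in ARMS}
            counts[candidate] += 1  # hypothetically add the new patient
            total += abs(counts["control"] - counts["intervention"])
        totals[candidate] = total
    if totals["control"] == totals["intervention"]:
        return rng.choice(ARMS)  # tie: pure randomisation
    best = min(ARMS, key=totals.get)
    if rng.random() < p_best:
        return best
    return "intervention" if best == "control" else "control"
```

With `p_best=1.0` the procedure is deterministic given the history, which is useful for checking the imbalance bookkeeping but would make allocation predictable in a real trial.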

  7. Concordance: Design Ideal for Facilitating Situated Negotiations in Out-of-clinic Healthcare

    DEFF Research Database (Denmark)

    Bagalkot, Naveen L.; Gronvall, Erik; Sokoler, Tomas

    2014-01-01

    Healthcare HCI research has explored various designs that encourage people to follow prescribed treatments, mostly adopting compliance and adherence as design ideals. However, within the medical sciences the notion of concordance also exists. Concordance promotes negotiation between the patient and healthcare professional for forging a therapeutic alliance. However, the HCI community has still not adopted concordance as a design ideal. This paper revisits four old design-cases to explore the role of concordance in out-of-clinic healthcare. We argue that concordance, as a design ideal, can guide new designs that promote a more active patient-role both at the clinic and beyond.

  8. 77 FR 30016 - Clinical Study Design and Performance of Hospital Glucose Sensors

    Science.gov (United States)

    2012-05-21

    ...] Clinical Study Design and Performance of Hospital Glucose Sensors AGENCY: Food and Drug Administration, HHS... Sensors.'' The purpose of this public meeting is to discuss clinical study design considerations and performance metrics for innovative glucose sensors intended to be used in hospital point of care settings...

  9. Parallel integer sorting with medium and fine-scale parallelism

    Science.gov (United States)

    Dagum, Leonardo

    1993-01-01

    Two new parallel integer sorting algorithms, queue-sort and barrel-sort, are presented and analyzed in detail. These algorithms do not have optimal parallel complexity, yet they show very good performance in practice. Queue-sort is designed for fine-scale parallel architectures which allow the queueing of multiple messages to the same destination. Barrel-sort is designed for medium-scale parallel architectures with a high message passing overhead. The performance results from the implementation of queue-sort on a Connection Machine CM-2 and barrel-sort on a 128 processor iPSC/860 are given. The two implementations are found to be comparable in performance but not as good as a fully vectorized bucket sort on the Cray YMP.
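Both algorithms are distribution sorts: keys are first scattered toward their destinations (message queues for queue-sort, processors for barrel-sort) and then ordered locally. A serial sketch of that distribution idea, using a plain bucket sort rather than either paper algorithm:

```python
def bucket_sort(keys, n_buckets=16):
    """Distribution sort for non-negative integers: scatter keys into buckets
    by value range (as barrel-sort scatters them across processors), then
    sort each bucket locally and concatenate the buckets in order."""
    if not keys:
        return []
    width = max(keys) // n_buckets + 1  # value range covered by each bucket
    buckets = [[] for _ in range(n_buckets)]
    for k in keys:
        buckets[k // width].append(k)   # the 'communication' step
    out = []
    for b in buckets:
        out.extend(sorted(b))           # each bucket could run on one processor
    return out
```

In the parallel versions, the scatter step is where the architectures differ: fine-scale machines tolerate many small messages to one destination, while medium-scale machines with high message-passing overhead favour batching keys into a few large "barrels" per processor.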

  10. Parallel Randomized Controlled Clinical Trial in Patients with Temporomandibular Disorders Treated with a CAD/CAM Versus a Conventional Stabilization Splint.

    Science.gov (United States)

    Pho Duc, Jean Marc; Hüning, Sandra Vargas; Grossi, Márcio Lima

    2016-01-01

    This parallel randomized controlled trial (RCT) compared the efficacy of a computer-aided design/computer-assisted manufacture (CAD/CAM) splint versus a conventional stabilization splint in patients with temporomandibular disorders (TMD). A sample of 48 age-matched TMD patients from the Ludwig Maximilian University Prosthodontic Department in Munich, Germany, were randomly allocated into groups 1 (CAD/CAM splint) and 2 (conventional splint). The Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) was used for TMD Axis I (groups I, II, and III) and Axis II (chronic pain grade [CPG]) diagnoses. Numeric scales (TMD/NS, 10 cm) were used to measure headaches, face pain, jaw joint pain, jaw joint noises, mastication pain, neck pain, face tension, limitation of mouth opening, complaints during mastication, and teeth sensitivity at baseline and then monthly for 9 months (T₁ to T₁₀). Optical axiography was used to measure right and left condyle movements (mm) at baseline, 3 months, and 6 months (T₁, T₄, and T₇). A total of 32 patients (drop-out rate = 33%; 68.75% women; 28.51 ± 7.13 years old), 16 per group, completed the study. RDC/TMD Axis I showed the following diagnoses: 93.75% muscle disorders, 37.75% disc displacement with reduction, 3.12% disc displacement without reduction, and 56.25% arthralgia. There was a significant reduction in 10 out of 13 items of the TMD/NS in the CAD/CAM splint versus 8 out of 13 in the conventional splint. However, no significant improvement in mandibular movements (ie, increase in range of motion and reduction in asymmetry between right and left condyles) was observed. Both treatments were equally efficacious and no difference was found between them.

  11. A police education programme to integrate occupational safety and HIV prevention: protocol for a modified stepped-wedge study design with parallel prospective cohorts to assess behavioural outcomes

    Science.gov (United States)

    Strathdee, Steffanie A; Arredondo, Jaime; Rocha, Teresita; Abramovitz, Daniela; Rolon, Maria Luisa; Patiño Mandujano, Efrain; Rangel, Maria Gudelia; Olivarria, Horcasitas Omar; Gaines, Tommi; Patterson, Thomas L; Beletsky, Leo

    2015-01-01

    Introduction Policing practices are key drivers of HIV among people who inject drugs (PWID). This paper describes the protocol for the first study to prospectively examine the impact of a police education programme (PEP) to align law enforcement and HIV prevention. PEPs incorporating HIV prevention (including harm reduction programmes like syringe exchange) have been successfully piloted in several countries but were limited to brief pre–post assessments; the impact of PEPs on policing behaviours and occupational safety is unknown. Objectives Proyecto ESCUDO (SHIELD) aims to evaluate the efficacy of the PEP on uptake of occupational safety procedures, as assessed through the incidence of needle stick injuries (NSIs) (primary outcome) and changes in knowledge of transmission, prevention and treatment of HIV and viral hepatitis; attitudes towards PWID, adverse behaviours that interfere with HIV prevention and protective behaviours (secondary outcomes). Methods/analysis ESCUDO is a hybrid type I design that simultaneously tests an intervention and an implementation strategy. Using a modified stepped-wedge design involving all active duty street-level police officers in Tijuana (N=∼1200), we will administer one 3 h PEP course to groups of 20–50 officers until the entire force is trained. NSI incidence and geocoded arrest data will be assessed from department-wide de-identified data. Of the consenting police officers, a subcohort (N=500) will be randomly sampled from each class to undergo pre-PEP and post-PEP surveys with a semiannual follow-up for 2 years to assess self-reported NSIs, attitudes and behaviour changes. The impact on PWIDs will be externally validated through a parallel cohort of Tijuana PWIDs. Ethics/dissemination Research ethics approval was obtained from the USA and Mexico. Findings will be disseminated through open access to protocol materials through the Law Enforcement and HIV Network. Trial registration number NCT02444403. PMID:26260350
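The rollout logic described above — train successive classes of 20-50 officers until the whole force of roughly 1200 is covered, each class crossing from "untrained" to "trained" at its step — can be sketched as a simple partitioning step. `schedule_training` is an invented helper for illustration, not the study's implementation:

```python
def schedule_training(officer_ids, min_size=20, max_size=50):
    """Partition the force into sequential PEP classes of min_size..max_size
    officers. In a stepped-wedge rollout, each class is one step of the
    wedge: it crosses over to 'trained' and stays trained thereafter."""
    ids = list(officer_ids)
    classes = []
    i = 0
    while i < len(ids):
        remaining = len(ids) - i
        if remaining <= max_size:
            size = remaining          # final class takes everyone left
        elif remaining - max_size < min_size:
            size = remaining // 2     # split the tail into two legal classes
        else:
            size = max_size
        classes.append(ids[i:i + size])
        i += size
    return classes
```

The subcohort of 500 for the pre/post surveys would then be drawn by random sampling within each class, which keeps the survey sample spread across the wedge steps.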

  12. Research design considerations for chronic pain prevention clinical trials

    DEFF Research Database (Denmark)

    Gewandter, Jennifer S; Dworkin, Robert H; Turk, Dennis C

    2015-01-01

    Although certain risk factors can identify individuals who are most likely to develop chronic pain, few interventions to prevent chronic pain have been identified. To facilitate the identification of preventive interventions, an IMMPACT meeting was convened to discuss research design considerations...

  13. The humanization of catheter room design: its clinical practice

    International Nuclear Information System (INIS)

    Lin Hanying; Shi Fengxia; Guo Huiying

    2011-01-01

    The American scholar Engel proposed the biological-psychological-social model of medicine, which has been widely accepted by society; it reflects the return of humanism to the medical arena and has had a profound influence on the development of nursing. The idea that 'the human is a whole' has gradually become the mainstream of the nursing service concept, and the environment has increasingly become a beneficial part of diagnosis and treatment during hospitalization. Improving and humanizing the design of the diagnosis and treatment environment has become an important link in the overall nursing work. From the beginning of the fitting-out design for the Catheter Lab of the Interventional Radiology Department of the General Hospital of the PLA, the authors adopted the idea of 'the environment as experienced by the patient' and paid particular attention to humanization in the construction of the diagnosis and treatment environment. The functional compartments are clearly separated, and the colour scheme, the background music, and the video displays are coordinated with each other to create a relaxing atmosphere. Practice over the past three years indicates that humanized environment design can markedly reduce patient tension and anxiety in the perioperative period and can significantly promote recovery. This article describes this user-friendly construction of the diagnosis and treatment environment in the Catheter Lab of the Interventional Radiology Department of the General Hospital of the PLA. (authors)

  14. Emollient bath additives for the treatment of childhood eczema (BATHE): multicentre pragmatic parallel group randomised controlled trial of clinical and cost effectiveness.

    Science.gov (United States)

    Santer, Miriam; Ridd, Matthew J; Francis, Nick A; Stuart, Beth; Rumsby, Kate; Chorozoglou, Maria; Becque, Taeko; Roberts, Amanda; Liddiard, Lyn; Nollett, Claire; Hooper, Julie; Prude, Martina; Wood, Wendy; Thomas, Kim S; Thomas-Jones, Emma; Williams, Hywel C; Little, Paul

    2018-05-03

    To determine the clinical effectiveness and cost effectiveness of including emollient bath additives in the management of eczema in children. Pragmatic randomised open label superiority trial with two parallel groups. 96 general practices in Wales and western and southern England. 483 children aged 1 to 11 years, fulfilling UK diagnostic criteria for atopic dermatitis. Children with very mild eczema and children who bathed less than once weekly were excluded. Participants in the intervention group were prescribed emollient bath additives by their usual clinical team to be used regularly for 12 months. The control group were asked to use no bath additives for 12 months. Both groups continued with standard eczema management, including leave-on emollients, and caregivers were given standardised advice on how to wash participants. The primary outcome was eczema control measured by the patient oriented eczema measure (POEM, scores 0-7 mild, 8-16 moderate, 17-28 severe) weekly for 16 weeks. Secondary outcomes were eczema severity over one year (monthly POEM score from baseline to 52 weeks), number of eczema exacerbations resulting in primary healthcare consultation, disease specific quality of life (dermatitis family impact), generic quality of life (child health utility-9D), utilisation of resources, and type and quantity of topical corticosteroids or topical calcineurin inhibitors prescribed. 483 children were randomised and one child was withdrawn, leaving 482 children in the trial: 51% were girls (244/482), 84% were of white ethnicity (447/470), and the mean age was 5 years. 96% (461/482) of participants completed at least one post-baseline POEM and so were included in the analysis, and 77% (370/482) completed questionnaires for more than 80% of the time points for the primary outcome (12/16 weekly questionnaires to 16 weeks). The mean baseline POEM score was 9.5 (SD 5.7) in the bath additives group and 10.1 (SD 5.8) in the no bath additives group. The mean POEM score
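The POEM severity bands quoted above (scores 0-7 mild, 8-16 moderate, 17-28 severe) map directly onto a small helper; `poem_band` is an illustrative name, not part of the instrument itself:

```python
def poem_band(score: int) -> str:
    """Map a Patient Oriented Eczema Measure score to its severity band
    (0-7 mild, 8-16 moderate, 17-28 severe), as used in the trial."""
    if not 0 <= score <= 28:
        raise ValueError("POEM scores range from 0 to 28")
    if score <= 7:
        return "mild"
    if score <= 16:
        return "moderate"
    return "severe"
```

By this banding, both groups' mean baseline scores (9.5 and 10.1) sit in the moderate band.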

  15. An Introduction to Parallel Computation R

    Indian Academy of Sciences (India)

    How are they programmed? This article provides an introduction. A parallel computer is a network of processors built for ... and have been used to solve problems much faster than a single ... in parallel computer design is to select an organization which ... The most ambitious approach to parallel computing is to develop.

  16. Oxytocin: parallel processing in the social brain?

    Science.gov (United States)

    Dölen, Gül

    2015-06-01

    Early studies attempting to disentangle the network complexity of the brain exploited the accessibility of sensory receptive fields to reveal circuits made up of synapses connected both in series and in parallel. More recently, extension of this organisational principle beyond the sensory systems has been made possible by the advent of modern molecular, viral and optogenetic approaches. Here, evidence supporting parallel processing of social behaviours mediated by oxytocin is reviewed. Understanding oxytocinergic signalling from this perspective has significant implications for the design of oxytocin-based therapeutic interventions aimed at disorders such as autism, where disrupted social function is a core clinical feature. Moreover, identification of opportunities for novel technology development will require a better appreciation of the complexity of the circuit-level organisation of the social brain. © 2015 The Authors. Journal of Neuroendocrinology published by John Wiley & Sons Ltd on behalf of British Society for Neuroendocrinology.

  17. A parallel robot to assist vitreoretinal surgery

    Energy Technology Data Exchange (ETDEWEB)

    Nakano, Taiga; Sugita, Naohiko; Mitsuishi, Mamoru [University of Tokyo, School of Engineering, Tokyo (Japan); Ueta, Takashi; Tamaki, Yasuhiro [University of Tokyo, Graduate School of Medicine, Tokyo (Japan)

    2009-11-15

    This paper describes the development and evaluation of a prototype parallel robot for vitreoretinal surgery, where physiological hand tremor limits performance. The manipulator was specifically designed to meet requirements such as size, precision, and sterilization; it has a six-degree-of-freedom parallel architecture and provides positioning accuracy with micrometer resolution within the eye. The manipulator is controlled by an operator with a ''master manipulator'' consisting of multiple joints. Results of the in vitro experiments revealed that, compared to the manual procedure, a higher stability and accuracy of tool positioning could be achieved using the prototype robot. This microsurgical system has superior operability compared to the traditional manual procedure and has sufficient potential to be used clinically for vitreoretinal surgery. (orig.)

  18. Aspects of computation on asynchronous parallel processors

    International Nuclear Information System (INIS)

    Wright, M.

    1989-01-01

    The increasing availability of asynchronous parallel processors has provided opportunities for original and useful work in scientific computing. However, the field of parallel computing is still in a highly volatile state, and researchers display a wide range of opinion about many fundamental questions such as models of parallelism, approaches for detecting and analyzing parallelism of algorithms, and tools that allow software developers and users to make effective use of diverse forms of complex hardware. This volume collects the work of researchers specializing in different aspects of parallel computing, who met to discuss the framework and the mechanics of numerical computing. The far-reaching impact of high-performance asynchronous systems is reflected in the wide variety of topics, which include scientific applications (e.g. linear algebra, lattice gauge simulation, ordinary and partial differential equations), models of parallelism, parallel language features, task scheduling, automatic parallelization techniques, tools for algorithm development in parallel environments, and system design issues

  19. Comparing oncology clinical programs by use of innovative designs and expected net present value optimization: Which adaptive approach leads to the best result?

    Science.gov (United States)

    Parke, Tom; Marchenko, Olga; Anisimov, Vladimir; Ivanova, Anastasia; Jennison, Christopher; Perevozskaya, Inna; Song, Guochen

    2017-01-01

    Designing an oncology clinical program is more challenging than designing a single study. The standard approaches have proven not very successful during the last decade; the failure rate of Phase 2 and Phase 3 trials in oncology remains high. Improving a development strategy by applying innovative statistical methods is one of the major objectives of a drug development process. The oncology sub-team on Adaptive Program under the Drug Information Association Adaptive Design Scientific Working Group (DIA ADSWG) evaluated hypothetical oncology programs with two competing treatments and published the work in the Therapeutic Innovation and Regulatory Science journal in January 2014. Five oncology development programs based on different Phase 2 designs, including adaptive designs, and a standard two parallel arm Phase 3 design were simulated and compared in terms of the probability of clinical program success and expected net present value (eNPV). In this article, we consider eight Phase 2/Phase 3 development programs based on selected combinations of five Phase 2 study designs and three Phase 3 study designs. We again used the probability of program success and eNPV to compare the simulated programs. Across the development strategies considered, the eNPV showed robust improvement with each successive strategy, the highest being for a three-arm response-adaptive randomization design in Phase 2 and a group sequential design with five analyses in Phase 3.
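In its simplest form, the eNPV criterion used to rank the programs reduces to a probability-weighted payoff net of program cost. The sketch below uses entirely hypothetical figures; the article's actual inputs and simulations (accrual, timelines, discounting) are far richer:

```python
def enpv(p_success: float, npv_success: float, cost: float) -> float:
    """Expected net present value of a development program (simplified):
    probability-weighted payoff on success minus the cost of running the
    program. All figures below are hypothetical illustrations, in $M."""
    return p_success * npv_success - cost

# Two hypothetical strategies, echoing the kind of comparison described above:
standard = enpv(p_success=0.25, npv_success=1200.0, cost=150.0)  # fixed two-arm Phase 3
adaptive = enpv(p_success=0.32, npv_success=1200.0, cost=170.0)  # adaptive Phase 2 + GSD Phase 3
```

Even with a higher program cost, a design that raises the probability of program success can dominate on eNPV, which is the trade-off the simulated comparisons quantify.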

  20. Innovative designs for radiation oncology research in clinical trials

    International Nuclear Information System (INIS)

    Rubin, P.; Keys, H.; Salazar, O.

    1987-01-01

    The goals of the research mission are clear: (1) to explore in a logical, systematic, organized, and coordinated fashion those potential therapeutic advances that show promise; (2) to identify, in the laboratory, those models that predict successful combinations of treatment modes; (3) to search for ultimate treatment programs that improve the therapeutic ratio. The modalities, innovations in treatment, and logic exist; it now rests with meticulous science to carefully chart the directions that clinical research should follow in pursuit of the ultimate objective of improving cancer cure rates with less toxicity

  1. A Clinical Pilot Study Comparing Sweet Bee Venom parallel treatment with only Acupuncture Treatment in patient diagnosed with lumbar spine sprain

    Directory of Open Access Journals (Sweden)

    Shin Yong-jeen

    2011-06-01

    Objectives: This study was carried out to compare Sweet Bee Venom (hereafter Sweet BV) acupuncture parallel treatment with acupuncture-only treatment in patients diagnosed with lumbar spine sprain, and to identify the better treatment. Methods: The subjects were patients diagnosed with lumbar spine sprain and hospitalized at Suncheon oriental medical hospital. They were randomly divided into a Sweet BV parallel treatment group and an acupuncture-only group, with all other treatment conditions kept the same. A VAS (Visual Analogue Scale) was then used to compare the treatment period between the two groups from VAS 10 to VAS 0, from VAS 10 to VAS 5, and from VAS 5 to VAS 0. Results & Conclusion: Comparing the respective treatment periods, the period from VAS 10 to VAS 5 was significantly shorter in the Sweet BV parallel treatment group than in the acupuncture-only group, but the period from VAS 5 to VAS 0 did not show a significant difference. Therefore, Sweet BV parallel treatment is effective in shortening the treatment period and controlling early pain compared to acupuncture-only treatment.

  2. User-centered design to improve clinical decision support in primary care.

    Science.gov (United States)

    Brunner, Julian; Chuang, Emmeline; Goldzweig, Caroline; Cain, Cindy L; Sugar, Catherine; Yano, Elizabeth M

    2017-08-01

    A growing literature has demonstrated the ability of user-centered design to make clinical decision support systems more effective and easier to use. However, studies of user-centered design have rarely examined more than a handful of sites at a time, and have frequently neglected the implementation climate and organizational resources that influence clinical decision support. The inclusion of such factors was identified by a systematic review as "the most important improvement that can be made in health IT evaluations." Objectives: (1) Identify the prevalence of four user-centered design practices at United States Veterans Affairs (VA) primary care clinics and assess the perceived utility of clinical decision support at those clinics; (2) Evaluate the association between those user-centered design practices and the perceived utility of clinical decision support. We analyzed clinic-level survey data collected in 2006-2007 from 170 VA primary care clinics. We examined four user-centered design practices: 1) pilot testing, 2) provider satisfaction assessment, 3) formal usability assessment, and 4) analysis of impact on performance improvement. We used a regression model to evaluate the association between user-centered design practices and the perceived utility of clinical decision support, while accounting for other important factors at those clinics, including implementation climate, available resources, and structural characteristics. We also examined associations separately at community-based clinics and at hospital-based clinics. User-centered design practices for clinical decision support varied across clinics: 74% conducted pilot testing, 62% conducted provider satisfaction assessment, 36% conducted a formal usability assessment, and 79% conducted an analysis of impact on performance improvement. Overall perceived utility of clinical decision support was high, with a mean rating of 4.17 (±.67) out of 5 on a composite measure. "Analysis of impact on performance

  3. Clinical Immersion and Biomedical Engineering Design Education: "Engineering Grand Rounds".

    Science.gov (United States)

    Walker, Matthew; Churchwell, André L

    2016-03-01

    Grand Rounds is a ritual of medical education and inpatient care comprising the presentation of a patient's medical problems and treatment to an audience of physicians, residents, and medical students. Traditionally, the patient would be in attendance for the presentation and would answer questions. Grand Rounds has evolved considerably over the years, with most sessions now being didactic and rarely having a patient present (although, in some instances, an actor will portray the patient). Other members of the team, such as nurses, nurse practitioners, and biomedical engineers, are not traditionally involved in the formal teaching process. In this study we examine rapid ideation in a clinical setting as a means of forging a system of cross-talk between engineers and physicians at the intersection of ideation and implementation.

  4. Design considerations of a real-time clinical confocal microscope

    Science.gov (United States)

    Masters, Barry R.

    1991-06-01

    A real-time clinical confocal light microscope provides the ophthalmologist with a new tool for the observation of the cornea and the ocular lens. In addition, the ciliary body, the iris, and the sclera can be observed. The real-time light microscopic images have high contrast and resolution. The transverse resolution is about one half micron and the range resolution is one micron. The following observations were made with visible light: corneal epithelial cells, wing cells, basal cells, Bowman's membrane, nerve fibers, basal lamina, fibroblast nuclei, Descemet's membrane, endothelial cells. Observation of the in situ ocular lens showed lens capsule, lens epithelium, lens fibrils, the interior of lens fibrils. The applications of the confocal microscope include: eye banking, laser refractive surgery, observation of wound healing, observation of the iris, the sclera, the ciliary body, the ocular lens, and the intraocular lens. Digital image processing can produce three-dimensional reconstructions of the cornea and the ocular lens.

  5. Trial protocol: a parallel group, individually randomized clinical trial to evaluate the effect of a mobile phone application to improve sexual health among youth in Stockholm County.

    Science.gov (United States)

    Nielsen, Anna; De Costa, Ayesha; Bågenholm, Aspasia; Danielsson, Kristina Gemzell; Marrone, Gaetano; Boman, Jens; Salazar, Mariano; Diwan, Vinod

    2018-02-05

    Genital Chlamydia trachomatis infection is a major public health problem worldwide affecting mostly youth. Sweden introduced an opportunistic screening approach in 1982 accompanied by treatment, partner notification and case reporting. After an initial decline in infection rate till the mid-90s, the number of reported cases has increased over the last two decades and has now stabilized at a high level of 37,000 reported cases in Sweden per year (85% of cases in youth). Sexual risk-taking among youth is also reported to have significantly increased over the last 20 years. Mobile health (mHealth) interventions could be particularly suitable for youth and sexual health promotion as the intervention is delivered in a familiar and discrete way to a tech savvy at-risk population. This paper presents a protocol for a randomized trial to study the effect of an interactive mHealth application (app) on condom use among the youth of Stockholm. 446 youth resident in Stockholm, will be recruited in this two arm parallel group individually randomized trial. Recruitment will be from Youth Health Clinics or via the trial website. Participants will be randomized to receive either the intervention (which comprises an interactive app on safe sexual health that will be installed on their smart phones) or a control group (standard of care). Youth will be followed up for 6 months, with questionnaire responses submitted periodically via the app. Self-reported condom use over 6 months will be the primary outcome. Secondary outcomes will include presence of an infection, Chlamydia tests during the study period and proxy markers of safe sex. Analysis is by intention to treat. This trial exploits the high mobile phone usage among youth to provide a phone app intervention in the area of sexual health. If successful, the results will have implications for health service delivery and health promotion among the youth. From a methodological perspective, this trial is expected to provide

  6. OARSI Clinical Trials Recommendations: Design and conduct of clinical trials of rehabilitation interventions for osteoarthritis.

    Science.gov (United States)

    Fitzgerald, G K; Hinman, R S; Zeni, J; Risberg, M A; Snyder-Mackler, L; Bennell, K L

    2015-05-01

    A Task Force of the Osteoarthritis Research Society International (OARSI) has previously published a set of guidelines for the conduct of clinical trials in osteoarthritis (OA) of the hip and knee. Limited material available on clinical trials of rehabilitation in people with OA has prompted OARSI to establish a separate Task Force to elaborate guidelines encompassing special issues relating to rehabilitation of OA. The Task Force identified three main categories of rehabilitation clinical trials. The categories included non-operative rehabilitation trials, post-operative rehabilitation trials, and trials examining the effectiveness of devices (e.g., assistive devices, bracing, physical agents, electrical stimulation, etc.) that are used in rehabilitation of people with OA. In addition, the Task Force identified two main categories of outcomes in rehabilitation clinical trials, which include outcomes related to symptoms and function, and outcomes related to disease modification. The guidelines for rehabilitation clinical trials provided in this report encompass these main categories. The report provides guidelines for conducting and reporting on randomized clinical trials. The topics include considerations for entering patients into trials, issues related to conducting trials, considerations for selecting outcome measures, and recommendations for statistical analyses and reporting of results. The focus of the report is on rehabilitation trials for hip, knee and hand OA, however, we believe the content is broad enough that it could be applied to rehabilitation trials for other regions as well. Copyright © 2015 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  7. Contribution to the optimal design of an hybrid parallel power-train: choice of a battery model; Contribution a la conception optimale d'une motorisation hybride parallele. Choix d'un modele d'accumulateur

    Energy Technology Data Exchange (ETDEWEB)

    Kuhn, E.

    2004-09-15

    This work deals with the dynamic and energetic modelling of a 42 V NiMH battery, whose model is taken into account in a control law for a hybrid electric vehicle. From an inventory of the electrochemical phenomena, an equivalent electrical scheme was established. In this model, diffusion phenomena are represented using non-integer (fractional) derivatives. This tool gives a very good approximation of diffusion phenomena, but such a purely mathematical approach does not represent the energy losses inside the battery. Consequently, a second model, made of a series of electric circuits, was proposed to represent energy transfers. This second model was used in the determination of a control law that guarantees autonomous management of the electrical energy embedded in a parallel hybrid electric vehicle and prevents deep discharge of the battery. (author)
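The second, energy-oriented model the abstract describes is a chain of electric circuits; a single OCV source with a series resistance and one parallel RC branch already shows the behaviour such models capture. The sketch below uses forward-Euler integration, and all parameter values are invented for illustration, not the thesis's measured NiMH data:

```python
def simulate_rc_battery(i_load, dt=1.0, ocv=42.0, r0=0.01, r1=0.02, c1=500.0):
    """Discrete-time simulation of a first-order equivalent circuit
    (OCV source + series resistance R0 + one parallel R1C1 branch):
        v_terminal = OCV - R0*i - v1
        dv1/dt     = i/C1 - v1/(R1*C1)
    i_load is a sequence of discharge currents (A), one per time step dt (s).
    Parameter values are illustrative, not measured battery data."""
    v1 = 0.0
    terminal = []
    for i in i_load:
        v1 += dt * (i / c1 - v1 / (r1 * c1))  # forward-Euler update of v1
        terminal.append(ocv - r0 * i - v1)
    return terminal

# A constant 50 A discharge: the terminal voltage drops instantly through R0,
# then sags further as the RC branch charges (a diffusion-like transient).
volts = simulate_rc_battery([50.0] * 60)
```

The fractional-derivative model mentioned first reproduces this diffusion transient more faithfully than a single RC branch, but the RC chain is what lets the energy losses in each resistance be accounted for by the control law.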

  8. Clinical evaluation of a newly designed orthodontic tooth brush - A clinical study

    Directory of Open Access Journals (Sweden)

    C S Saimbi

    2009-01-01

    In this study, a newly designed orthodontic toothbrush is compared with an ordinary toothbrush. The results show that the newly designed orthodontic toothbrush is superior in cleaning efficiency to the ordinary toothbrush, with a plaque-removal capacity of nearly 95-99%.

  9. Nitinol Esophageal Stents: New Designs and Clinical Indications

    International Nuclear Information System (INIS)

    Strecker, Ernst-Peter; Boos, Irene; Vetter, Sylvia; Strohm, Michael; Domschke, Sigurd

    1996-01-01

    Purpose: To evaluate the clinical use of covered and noncovered, knitted nitinol stents in patients presenting new stent indications. Methods: Self-expandable, knitted nitinol stents were implanted in four patients for treatment of dysphagia. In two patients who had malignant strictures and had esophago-respiratory fistulae and in one patient with an esophagocutaneous fistula, polytetrafluoroethylene (PTFE)-covered stents were implanted. One patient received a noncovered stent, but a retrograde approach through a percutaneous endoscopic gastrostomy (PEG) fistula had to be chosen for recanalization of an esophageal occlusion. Two patients received stents for treatment of benign strictures. Results: Recanalization of the stricture and stent implantation were performed under fluoroscopic control without any procedure-related morbidity or mortality. Dysphagia improved in all patients and the esophageal fistulae could be sealed off by covered stents. During a maximum follow-up of 18 months, there was no stent migration or esophageal perforation. Complications observed were stent stenosis due to food impaction (1/4) and benign stent stenosis (2/2). Most complications could be treated by the interventional radiologist. Conclusion: Self-expandable, covered Nitinol stents provide an option for the treatment of dysphagia combined with esophageal fistulae. In combination with interventional radiology techniques, even complex strictures are accessible. For benign strictures, the value of stent treatment has not yet been proven

  10. Current Therapeutic Cannabis Controversies and Clinical Trial Design Issues.

    Directory of Open Access Journals (Sweden)

    Ethan Budd Russo

    2016-09-01

    Full Text Available This overview covers a wide range of cannabis topics, initially examining issues in dispensaries and self-administration, plus regulatory requirements for production of cannabis-based medicines, particularly the Food and Drug Administration Botanical Guidance. The remainder pertains to various cannabis controversies that certainly require closer examination if the scientific, consumer and governmental stakeholders are ever to reach consensus on safety issues, specifically: whether botanical cannabis displays herbal synergy of its components, pharmacokinetics of cannabis and dose titration, whether cannabis medicines produce cyclo-oxygenase inhibition, cannabis-drug interactions and cytochrome P450 issues, whether cannabis randomized clinical trials are properly blinded, combatting the placebo effect in those trials via new approaches, the drug abuse liability of cannabis-based medicines and their regulatory scheduling, their effects on cognitive function and psychiatric sequelae, immunological effects, cannabis and driving safety, youth usage, issues related to cannabis smoking and vaporization, cannabis concentrates and vape-pens, and laboratory analysis for contamination with bacteria and heavy metals. Finally, the issue of pesticide usage on cannabis crops is addressed. New and disturbing data on pesticide residues in legal cannabis products in Washington State are presented, with the observation of an 84.6% contamination rate including potentially neurotoxic and carcinogenic agents. With ongoing developments in legalization of cannabis in medical and recreational settings, numerous scientific, safety and public health issues remain.

  11. Current Therapeutic Cannabis Controversies and Clinical Trial Design Issues

    Science.gov (United States)

    Russo, Ethan B.

    2016-01-01

    This overview covers a wide range of cannabis topics, initially examining issues in dispensaries and self-administration, plus regulatory requirements for production of cannabis-based medicines, particularly the Food and Drug Administration “Botanical Guidance.” The remainder pertains to various cannabis controversies that certainly require closer examination if the scientific, consumer, and governmental stakeholders are ever to reach consensus on safety issues, specifically: whether botanical cannabis displays herbal synergy of its components, pharmacokinetics of cannabis and dose titration, whether cannabis medicines produce cyclo-oxygenase inhibition, cannabis-drug interactions, and cytochrome P450 issues, whether cannabis randomized clinical trials are properly blinded, combatting the placebo effect in those trials via new approaches, the drug abuse liability (DAL) of cannabis-based medicines and their regulatory scheduling, their effects on cognitive function and psychiatric sequelae, immunological effects, cannabis and driving safety, youth usage, issues related to cannabis smoking and vaporization, cannabis concentrates and vape-pens, and laboratory analysis for contamination with bacteria and heavy metals. Finally, the issue of pesticide usage on cannabis crops is addressed. New and disturbing data on pesticide residues in legal cannabis products in Washington State are presented with the observation of an 84.6% contamination rate including potentially neurotoxic and carcinogenic agents. With ongoing developments in legalization of cannabis in medical and recreational settings, numerous scientific, safety, and public health issues remain. PMID:27683558

  12. Design of a visible-light spectroscopy clinical tissue oximeter.

    Science.gov (United States)

    Benaron, David A; Parachikov, Ilian H; Cheong, Wai-Fung; Friedland, Shai; Rubinsky, Boris E; Otten, David M; Liu, Frank W H; Levinson, Carl J; Murphy, Aileen L; Price, John W; Talmi, Yair; Weersing, James P; Duckworth, Joshua L; Hörchner, Uwe B; Kermit, Eben L

    2005-01-01

    We develop a clinical visible-light spectroscopy (VLS) tissue oximeter. Unlike currently approved near-infrared spectroscopy (NIRS) or pulse oximetry (SpO2%), VLS relies on locally absorbed, shallow-penetrating visible light (475 to 625 nm) for the monitoring of microvascular hemoglobin oxygen saturation (StO2%), allowing incorporation into therapeutic catheters and probes. A range of probes is developed, including noncontact wands, invasive catheters, and penetrating needles with injection ports. Data are collected from: 1. probes, standards, and reference solutions to optimize each component; 2. ex vivo hemoglobin solutions analyzed for StO2% and pO2 during deoxygenation; and 3. human subject skin and mucosal tissue surfaces. Results show that differential VLS allows extraction of features and minimization of scattering effects, in vitro VLS oximetry reproduces the expected sigmoid hemoglobin binding curve, and in vivo VLS spectroscopy of human tissue allows for real-time monitoring (e.g., gastrointestinal mucosal saturation 69+/-4%, n=804; gastrointestinal tumor saturation 45+/-23%, n=14; and p<0.0001), with reproducible values and small standard deviations (SDs) in normal tissues. FDA approved VLS systems began shipping earlier this year. We conclude that VLS is suitable for the real-time collection of spectroscopic and oximetric data from human tissues, and that a VLS oximeter has application to the monitoring of localized subsurface hemoglobin oxygen saturation in the microvascular tissue spaces of human subjects.
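The "expected sigmoid hemoglobin binding curve" that the in vitro VLS oximetry reproduces is commonly described by the Hill equation. A minimal sketch, using textbook constants (p50 ≈ 26.6 mmHg, Hill coefficient n ≈ 2.7) rather than values from this study:

```python
def hill_saturation(po2, p50=26.6, n=2.7):
    """Fractional hemoglobin O2 saturation as a function of pO2 (mmHg),
    via the Hill equation: S = pO2^n / (p50^n + pO2^n)."""
    return po2 ** n / (p50 ** n + po2 ** n)

# half-saturation at p50; near-full saturation at arterial pO2 (~100 mmHg)
hill_saturation(26.6)   # 0.5
hill_saturation(100.0)  # ~0.97
```

Plotting saturation against pO2 over 0-100 mmHg yields the characteristic sigmoid shape the abstract refers to.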

  13. Periodontal Regeneration Of 1-, 2-, and 3-Walled Intrabony Defects Using Accell Connexus (registered trademark) Versus Demineralized Freeze-Dried Bone Allograft: A Randomized Parallel Arm Clinical Control Trial

    Science.gov (United States)

    2015-06-01

    …RS, Granet MA, Kircos LT, Chambers OW, Robertson PB. Radiographic detection of dental calculus. Journal of Periodontology 1987; 58(11): 747-751. …RANDOMIZED PARALLEL ARM CLINICAL CONTROL TRIAL by Teresita LaRonce Alston, Lieutenant Commander, Dental Corps, United States Navy. A thesis submitted to the Faculty of the Periodontics Graduate Program, Naval Postgraduate Dental School, Uniformed Services University of the Health Sciences, in partial

  14. Clinical Digital Libraries Project: design approach and exploratory assessment of timely use in clinical environments.

    Science.gov (United States)

    Maccall, Steven L

    2006-04-01

    The paper describes and evaluates the use of Clinical Digital Libraries Project (CDLP) digital library collections in terms of their facilitation of timely clinical information seeking. A convenience sample of CDLP Web server log activity over a twelve-month period (7/2002 to 6/2003) was analyzed for evidence of timely information seeking after users were referred to digital library clinical topic pages from Web search engines. Sample searches were limited to those originating from medical schools (26% North American and 19% non-North American) and from hospitals or clinics (51% North American and 4% non-North American). Timeliness was determined based on a calculation of the difference between the timestamps of the first and last Web server log "hit" during each search in the sample. The calculated differences were mapped into one of three ranges: less than 1 minute, 1 to 3 minutes, and 3 to 5 minutes. Of the 864 searches analyzed, 48% were less than 1 minute, 41% were 1 to 3 minutes, and 11% were 3 to 5 minutes. These results were further analyzed by environment (medical schools versus hospitals or clinics) and by geographic location (North American versus non-North American). Searches reflected a consistent pattern of less than 1 minute in these environments. Though the results were not consistent on a month-by-month basis over the entire time period, data for 8 of 12 months showed that searches shorter than 1 minute predominated, and data for 1 month showed an equal number of less-than-1-minute and 1-to-3-minute searches. The CDLP digital library collections provided timely access to high-quality Web clinical resources when used for information seeking in medical education and hospital or clinic environments from North American and non-North American locations and consistently provided access to the sought information within the documented two-minute standard. The limitations of the use of Web server data warrant an exploratory assessment.

  15. [DESIGN AND CLINICAL APPLICATION OF LESSER TROCHANTERIC REDUCTION FIXATION SYSTEM].

    Science.gov (United States)

    Guo, Xiaoze; Zhang, Ying; Xiao, Jin; Xie, Huibin; Yu, Jiefeng

    2015-02-01

    To design and produce a lesser trochanteric reduction fixation system and verify its value and effectiveness. A lesser trochanteric reduction fixation system was designed and produced according to the anatomical features of lesser trochanteric fractures. Sixty-six patients with intertrochanteric fractures of Evans type III were included between January 2010 and July 2012. Of the 66 patients, 32 were treated with dynamic hip screw (DHS) assisted by the lesser trochanteric reduction fixation system (study group), and 34 were treated with DHS only (control group). The 2 groups were comparable, with no significant difference in gender, age, cause of injury, or fracture type (P > 0.05). The operation time, intraoperative blood loss, neck-shaft angle, bone healing time, ratio of successful fixations, and postoperative functional evaluation of the hip joint were compared between the 2 groups. The study group had shorter operation time [(58.4 ± 5.3) minutes] and less intraoperative blood loss [(186.3 ± 6.6) mL] than the control group [(78.5 ± 6.2) minutes and (246.2 ± 8.7) mL], showing significant differences (t = -14.040, P = 0.000; t = -31.145, P = 0.000). There was no significant difference in neck-shaft angle between the study group [(138.6 ± 3.0) degrees] and the control group [(139.4 ± 2.9) degrees] (t = -1.044, P = 0.301). The wounds healed by first intention in both groups. Thirty patients in the study group were followed up for 12 to 24 months (mean, 15 months), and 31 patients in the control group for 13 to 25 months (mean, 16 months). All fractures healed well in both groups. The study group had significantly shorter healing time [(8.8 ± 2.0) weeks] than the control group [(10.7 ± 3.4) weeks] (t = -2.871, P = 0.006). At 12 months after operation, coxa vara occurred in 2 cases of the study group, with a successful fixation ratio of 93.3%, and in 10 cases of the control group, with a successful fixation ratio of 67.7%, showing significant difference (Χ2 = 6
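The group comparisons reported above can be reproduced approximately from the summary statistics alone. The sketch below computes a Welch two-sample t statistic, which may differ slightly from the (likely pooled-variance) test the authors used:

```python
import math

def two_sample_t(m1, s1, n1, m2, s2, n2):
    """Welch two-sample t statistic from group means, SDs, and sizes."""
    se = math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)
    return (m1 - m2) / se

# operation time: study 58.4 +/- 5.3 min (n=32) vs control 78.5 +/- 6.2 min (n=34)
t = two_sample_t(58.4, 5.3, 32, 78.5, 6.2, 34)  # ~ -14, close to the reported t = -14.040
```

The small discrepancy from the published value is expected, since the Welch and pooled-variance formulas weight the group variances differently.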

  16. The language parallel Pascal and other aspects of the massively parallel processor

    Science.gov (United States)

    Reeves, A. P.; Bruner, J. D.

    1982-01-01

    A high level language for the Massively Parallel Processor (MPP) was designed. This language, called Parallel Pascal, is described in detail. A description of the language design, a description of the intermediate language, Parallel P-Code, and details for the MPP implementation are included. Formal descriptions of Parallel Pascal and Parallel P-Code are given. A compiler was developed which converts programs in Parallel Pascal into the intermediate Parallel P-Code language. The code generator to complete the compiler for the MPP is being developed independently. A Parallel Pascal to Pascal translator was also developed. The architecture design for a VLSI version of the MPP was completed with a description of fault tolerant interconnection networks. The memory arrangement aspects of the MPP are discussed and a survey of other high level languages is given.

  17. Overview of the Force Scientific Parallel Language

    Directory of Open Access Journals (Sweden)

    Gita Alaghband

    1994-01-01

    Full Text Available The Force parallel programming language designed for large-scale shared-memory multiprocessors is presented. The language provides a number of parallel constructs as extensions to the ordinary Fortran language and is implemented as a two-level macro preprocessor to support portability across shared-memory multiprocessors. The global parallelism model on which the Force is based provides a powerful parallel language. The parallel constructs, generic synchronization, and freedom from process management supported by the Force have resulted in structured parallel programs that have been ported to the many multiprocessors on which the Force is implemented. Two new parallel constructs for looping and functional decomposition are discussed. Several programming examples to illustrate some parallel programming approaches using the Force are also presented.

  18. Exploiting Symmetry on Parallel Architectures.

    Science.gov (United States)

    Stiller, Lewis Benjamin

    1995-01-01

    This thesis describes techniques for the design of parallel programs that solve well-structured problems with inherent symmetry. Part I demonstrates the reduction of such problems to generalized matrix multiplication by a group-equivariant matrix. Fast techniques for this multiplication are described, including factorization, orbit decomposition, and Fourier transforms over finite groups. Our algorithms entail interaction between two symmetry groups: one arising at the software level from the problem's symmetry and the other arising at the hardware level from the processors' communication network. Part II illustrates the applicability of our symmetry-exploitation techniques by presenting a series of case studies of the design and implementation of parallel programs. First, a parallel program that solves chess endgames by factorization of an associated dihedral group-equivariant matrix is described. This code runs faster than previous serial programs and discovered a number of new results. Second, parallel algorithms for Fourier transforms for finite groups are developed, and preliminary parallel implementations for group transforms of dihedral and of symmetric groups are described. Applications in learning, vision, pattern recognition, and statistics are proposed. Third, parallel implementations solving several computational science problems are described, including the direct n-body problem, convolutions arising from molecular biology, and some communication primitives such as broadcast and reduce. Some of our implementations ran orders of magnitude faster than previous techniques, and were used in the investigation of various physical phenomena.
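One concrete instance of group-equivariant matrix multiplication: a matrix equivariant under the cyclic group Z_n is circulant, and the Fourier transform over that group (the ordinary DFT) diagonalizes it, reducing a matrix-vector product to O(n log n). A NumPy sketch of this standard idea (illustrative, not the thesis code):

```python
import numpy as np

def circulant_matvec(c, x):
    """Multiply the circulant matrix with first column c by the vector x.

    A circulant matrix is equivariant under the cyclic group Z_n, so the
    DFT diagonalizes it: C @ x = IFFT(FFT(c) * FFT(x)), in O(n log n)
    instead of the O(n^2) dense product.
    """
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

# check against the dense circulant matrix C[i, j] = c[(i - j) % n]
c = np.array([1.0, 2.0, 0.0, 0.0])
x = np.array([1.0, 0.0, 0.0, 1.0])
n = len(c)
C = np.array([[c[(i - j) % n] for j in range(n)] for i in range(n)])
dense = C @ x                    # array([3., 2., 0., 1.])
fast = circulant_matvec(c, x)    # same result via the FFT
```

The dihedral-group factorizations used for chess endgames generalize this: the transform over the group block-diagonalizes the equivariant matrix rather than fully diagonalizing it.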

  19. Parallel Programming with Intel Parallel Studio XE

    CERN Document Server

    Blair-Chappell , Stephen

    2012-01-01

    Optimize code for multi-core processors with Intel's Parallel Studio Parallel programming is rapidly becoming a "must-know" skill for developers. Yet, where to start? This teach-yourself tutorial is an ideal starting point for developers who already know Windows C and C++ and are eager to add parallelism to their code. With a focus on applying tools, techniques, and language extensions to implement parallelism, this essential resource teaches you how to write programs for multicore and leverage the power of multicore in your programs. Sharing hands-on case studies and real-world examples, the

  20. The STAPL Parallel Graph Library

    KAUST Repository

    Harshvardhan,

    2013-01-01

    This paper describes the stapl Parallel Graph Library, a high-level framework that abstracts the user from data-distribution and parallelism details and allows them to concentrate on parallel graph algorithm development. It includes a customizable distributed graph container and a collection of commonly used parallel graph algorithms. The library introduces pGraph pViews that separate algorithm design from the container implementation. It supports three graph processing algorithmic paradigms, level-synchronous, asynchronous and coarse-grained, and provides common graph algorithms based on them. Experimental results demonstrate improved scalability in performance and data size over existing graph libraries on more than 16,000 cores and on internet-scale graphs containing over 16 billion vertices and 250 billion edges. © Springer-Verlag Berlin Heidelberg 2013.
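The level-synchronous paradigm mentioned above can be sketched in a few lines: the frontier is expanded one full level at a time, and all vertex visits within a level are independent, which is what a runtime such as stapl parallelizes. A sequential Python sketch of the paradigm (illustrative only; it does not use the stapl API):

```python
def level_sync_bfs(adj, source):
    """Level-synchronous BFS: expand the frontier one full level at a time.

    Visits within a level are independent, so a parallel runtime can split
    the frontier loop across workers (with an atomic first-discovery check).
    """
    level = {source: 0}
    frontier = [source]
    depth = 0
    while frontier:
        depth += 1
        next_frontier = []
        for u in frontier:                 # a parallel runtime splits this loop
            for v in adj.get(u, ()):
                if v not in level:         # first discovery wins
                    level[v] = depth
                    next_frontier.append(v)
        frontier = next_frontier
    return level

adj = {0: [1, 2], 1: [3], 2: [3], 3: []}
levels = level_sync_bfs(adj, 0)  # {0: 0, 1: 1, 2: 1, 3: 2}
```

The asynchronous paradigm drops the per-level barrier and lets discoveries propagate immediately; the coarse-grained paradigm processes whole subgraphs per task.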

  1. Design and Effectiveness of a Required Pre-Clinical Simulation-based Curriculum for Fundamental Clinical Skills and Procedures

    Directory of Open Access Journals (Sweden)

    Daryl P. Lofaso

    2011-12-01

    Full Text Available For more than 20 years, medical literature has increasingly documented the need for students to learn, practice and demonstrate competence in basic clinical knowledge and skills. In 2001, the Louisiana State University Health Sciences Center (LSUHSC) School of Medicine – New Orleans replaced its traditional Introduction to Clinical Medicine (ICM) course with the Science and Practice of Medicine (SPM) course. The main component within the SPM course is the Clinical Skills Lab (CSL). The CSL teaches more than 30 skills to all pre-clinical medical students (Years 1 and 2). Since 2002, an annual longitudinal evaluation questionnaire was distributed to all medical students targeting the skills taught in the CSL. Students were asked to rate their self-confidence (Dreyfus and Likert-type scales) and estimate the number of times each clinical skill was performed (clinically/non-clinically). Of the 30-plus skills taught, 8 were selected for further analysis to determine the effectiveness of the CSL. All students who participated in the CSL reported a significant improvement in self-confidence and in the number of skills performed clinically/non-clinically when compared to students who did not experience the CSL. For example, without CSL training, the percentage of students at the end of their second year reporting their self-perceived expertise as “novice” ranged from 21.4% (CPR) to 84.7% (GU catheterization). Of the students who completed the two-year CSL, only 7.8% rated their self-perceived expertise at the end of the second year as “novice” for CPR, and 18.8% for GU catheterization. The CSL is not designed to replace real clinical patient experiences; it provides early exposure, medical knowledge, professionalism, and the opportunity to practice skills in a patient-free environment.

  2. Assessment of clinical reasoning: A Script Concordance test designed for pre-clinical medical students.

    Science.gov (United States)

    Humbert, Aloysius J; Johnson, Mary T; Miech, Edward; Friedberg, Fred; Grackin, Janice A; Seidman, Peggy A

    2011-01-01

    The Script Concordance test (SCT) measures clinical reasoning in the context of uncertainty by comparing the responses of examinees and expert clinicians. It uses the level of agreement with a panel of experts to assign credit for the examinee's answers. This study describes the development and validation of an SCT for pre-clinical medical students. Faculty from two US medical schools developed SCT items in the domains of anatomy, biochemistry, physiology, and histology. Scoring procedures utilized data from a panel of 30 expert physicians. Validation focused on internal reliability and the ability of the SCT to distinguish between different cohorts. The SCT was administered to an aggregate of 411 second-year and 70 fourth-year students from both schools. Internal consistency for the 75 test items was satisfactory (Cronbach's alpha = 0.73). The SCT successfully differentiated second- from fourth-year students and both student groups from the expert panel in a one-way analysis of variance (F(2,508) = 120.4). Students from the two schools were not significantly different (p = 0.20). This SCT successfully differentiated pre-clinical medical students from fourth-year medical students and both cohorts of medical students from expert clinicians across different institutions and geographic areas. The SCT shows promise as an easy-to-administer measure of "problem-solving" performance in competency evaluation even in the beginning years of medical education.
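The "level of agreement with a panel of experts" is usually converted into partial credit by aggregate scoring: the examinee earns the number of panelists who chose the same option, normalized by the count of the modal (most popular) option. A sketch of that standard scheme, with a hypothetical five-expert panel rather than data from this study:

```python
def sct_score(expert_answers, examinee_answer):
    """Aggregate score for one Script Concordance item.

    Credit = (number of panelists choosing the examinee's option)
             / (count of the modal option), so agreeing with the
    majority earns 1.0 and minority answers earn partial credit.
    """
    counts = {}
    for a in expert_answers:
        counts[a] = counts.get(a, 0) + 1
    modal = max(counts.values())
    return counts.get(examinee_answer, 0) / modal

panel = [1, 1, 1, 0, -1]   # five experts on a -2..+2 Likert item (hypothetical)
sct_score(panel, 1)   # 1.0  (modal answer)
sct_score(panel, 0)   # ~0.333 (one of three modal votes)
```

An examinee's total score is the sum of item scores, typically rescaled to a percentage.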

  3. Guidelines for the Design and Conduct of Clinical Studies in Knee Articular Cartilage Repair

    Science.gov (United States)

    Mithoefer, Kai; Saris, Daniel B.F.; Farr, Jack; Kon, Elizaveta; Zaslav, Kenneth; Cole, Brian J.; Ranstam, Jonas; Yao, Jian; Shive, Matthew; Levine, David; Dalemans, Wilfried; Brittberg, Mats

    2011-01-01

    Objective: To summarize current clinical research practice and develop methodological standards for objective scientific evaluation of knee cartilage repair procedures and products. Design: A comprehensive literature review was performed of high-level original studies providing information relevant for the design of clinical studies on articular cartilage repair in the knee. Analysis of cartilage repair publications and synopses of ongoing trials were used to identify important criteria for the design, reporting, and interpretation of studies in this field. Results: Current literature reflects the methodological limitations of the scientific evidence available for articular cartilage repair. However, clinical trial databases of ongoing trials document a trend suggesting improved study designs and clinical evaluation methodology. Based on the current scientific information and standards of clinical care, detailed methodological recommendations were developed for the statistical study design, patient recruitment, control group considerations, study endpoint definition, documentation of results, use of validated patient-reported outcome instruments, and inclusion and exclusion criteria for the design and conduct of scientifically sound cartilage repair study protocols. A consensus statement among the International Cartilage Repair Society (ICRS) and contributing authors experienced in clinical trial design and implementation was achieved. Conclusions: High-quality clinical research methodology is critical for the optimal evaluation of current and new cartilage repair technologies. In addition to generally applicable principles for orthopedic study design, specific criteria and considerations apply to cartilage repair studies. Systematic application of these criteria and considerations can facilitate study designs that are scientifically rigorous, ethical, practical, and appropriate for the question(s) being addressed in any given cartilage repair research project

  4. CLINIC-LABORATORY DESIGN BASED ON FUNCTION AND PHILOSOPHY AT PURDUE UNIVERSITY.

    Science.gov (United States)

    HANLEY, T.D.; STEER, M.D.

    This report describes the design of a new clinic and laboratory for speech and hearing to accommodate three basic programs: (1) clinical training of undergraduate and graduate student majors, (2) services made available to the speech and hearing handicapped, and (3) research in speech pathology, audiology, psycho-acoustics, and…

  5. Interpreting clinical trial results by deductive reasoning: In search of improved trial design.

    Science.gov (United States)

    Kurbel, Sven; Mihaljević, Slobodan

    2017-10-01

    Clinical trial results are often interpreted by inductive reasoning, in a trial design-limited manner, directed toward modifications of the current clinical practice. Deductive reasoning is an alternative in which results of relevant trials are combined in indisputable premises that lead to a conclusion easily testable in future trials. © 2017 WILEY Periodicals, Inc.

  6. Effect of a combined education and eHealth programme on the control of oral anticoagulation patients (PORTALS study): a parallel cohort design in Dutch primary care.

    Science.gov (United States)

    Talboom-Kamp, Esther P W A; Verdijk, Noortje A; Kasteleyn, Marise J; Harmans, Lara M; Talboom, Irvin J S H; Numans, Mattijs E; Chavannes, Niels H

    2017-09-27

    To analyse the effect on therapeutic control and self-management skills of the implementation of self-management programmes, including eHealth by e-learning versus group training. Primary Care Thrombosis Service Center. Of the 247 oral anticoagulation therapy (OAT) patients, 63 started self-management by e-learning, 74 started self-management by group training, and 110 received usual care. Parallel cohort design with two randomised self-management groups (e-learning and group training) and a group receiving usual care. The effect of implementation of self-management on time in therapeutic range (TTR) was analysed with multilevel linear regression modelling. Usage of a supporting eHealth platform and the impact of self-efficacy (Generalised Self-Efficacy Scale (GSES)) and education level were analysed with linear regression analysis. After the intervention, TTR was measured in three time periods of 6 months. Outcome measures: (1) TTR and severe complications; (2) usage of an eHealth platform; (3) GSES and education level. Analysis showed no significant differences in TTR between the three time periods (p=0.520), the three groups (p=0.460) or the groups over time (p=0.263). Comparison of e-learning and group training showed no significant differences in TTR between the time periods (p=0.614), the groups (p=0.460) or the groups over time (p=0.263). No association was found between GSES and TTR (p=0.717) or education level and TTR (p=0.107). No significant difference was found between the self-management groups in usage of the platform (0-6 months p=0.571; 6-12 months p=0.866; 12-18 months p=0.260). The percentage of complications was low in all groups (3.2%; 1.4%; 0%). No differences were found between OAT patients trained by e-learning or by a group course regarding therapeutic control (TTR) and usage of a supporting eHealth platform. The TTR was similar in self-management and regular care patients. With adequate e-learning or group training, self-management seems safe and reliable for a selected
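TTR in anticoagulation studies such as this one is usually computed with the Rosendaal linear-interpolation method: INR is assumed to change linearly between measurements, and the fraction of interpolated days falling inside the target range is reported. A sketch under that assumption (the abstract does not state the exact TTR algorithm used; days and INR values below are hypothetical):

```python
def rosendaal_ttr(days, inrs, low=2.0, high=3.0):
    """Time in therapeutic range by Rosendaal linear interpolation.

    `days` are integer measurement days, `inrs` the INR at each visit.
    INR is interpolated linearly between consecutive measurements, and
    the fraction of interpolated days inside [low, high] is returned.
    """
    in_range = total = 0
    for (d0, v0), (d1, v1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        for step in range(d1 - d0):
            frac = step / (d1 - d0)
            inr = v0 + (v1 - v0) * frac   # interpolated INR on this day
            total += 1
            if low <= inr <= high:
                in_range += 1
    return in_range / total

# INR rising linearly from 1.5 (day 0) to 3.5 (day 10):
# in the 2.0-3.0 range for 5 of the 10 interpolated days
ttr = rosendaal_ttr([0, 10], [1.5, 3.5])  # 0.5
```

Because interpolation fills the gaps between visits, TTR is sensitive to measurement spacing, which is one reason trials report it over fixed 6-month windows as above.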

  7. Implementation and performance of parallelized elegant

    International Nuclear Information System (INIS)

    Wang, Y.; Borland, M.

    2008-01-01

    The program elegant is widely used for design and modeling of linacs for free-electron lasers and energy recovery linacs, as well as storage rings and other applications. As part of a multi-year effort, we have parallelized many aspects of the code, including single-particle dynamics, wakefields, and coherent synchrotron radiation. We report on the approach used for gradual parallelization, which proved very beneficial in getting parallel features into the hands of users quickly. We also report details of parallelization of collective effects. Finally, we discuss performance of the parallelized code in various applications.

  8. OARSI Clinical Trials Recommendations: Design and conduct of clinical trials of lifestyle diet and exercise interventions for osteoarthritis.

    Science.gov (United States)

    Messier, S P; Callahan, L F; Golightly, Y M; Keefe, F J

    2015-05-01

    The objective was to develop a set of "best practices" for use as a primer for those interested in entering the clinical trials field for lifestyle diet and/or exercise interventions in osteoarthritis (OA), and as a set of recommendations for experienced clinical trials investigators. A subcommittee of the non-pharmacologic therapies committee of the OARSI Clinical Trials Working Group was selected by the Steering Committee to develop a set of recommended principles for non-pharmacologic diet/exercise OA randomized clinical trials. Topics were identified for inclusion by co-authors and reviewed by the subcommittee. Resources included authors' expert opinions, traditional search methods including MEDLINE (via PubMed), and previously published guidelines. Suggested steps and considerations for study methods (e.g., recruitment and enrollment of participants, study design, intervention and assessment methods) were recommended. The recommendations set forth in this paper provide a guide from which a research group can design a lifestyle diet/exercise randomized clinical trial in patients with OA. Copyright © 2015 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  9. From clinical practice guidelines, to clinical guidance in practice - implications for design of computerized guidance

    DEFF Research Database (Denmark)

    Lyng, Karen Marie

    2010-01-01

    an extensive application of what we have named second order guiding artifacts. The deployed protocols underwent a local adaptation and transformation process when initiated. The protocols were adapted to match the local resources and transformed into several activity specific second order guiding artifacts....... The transformation from protocols was executed according to a standard operating procedure. Each activity type had a standardized template ensuring uniformity across second order guiding artifacts within a clinic. The guiding artifacts were multi-functional and a wide variety of standardized graphical attributes...

  10. Design control for clinical translation of 3D printed modular scaffolds.

    Science.gov (United States)

    Hollister, Scott J; Flanagan, Colleen L; Zopf, David A; Morrison, Robert J; Nasser, Hassan; Patel, Janki J; Ebramzadeh, Edward; Sangiorgio, Sophia N; Wheeler, Matthew B; Green, Glenn E

    2015-03-01

    The primary thrust of tissue engineering is the clinical translation of scaffolds and/or biologics to reconstruct tissue defects. Despite this thrust, clinical translation of tissue engineering therapies from academic research has been minimal in the 27 year history of tissue engineering. Academic research by its nature focuses on, and rewards, initial discovery of new phenomena and technologies in the basic research model, with a view towards generality. Translation, however, by its nature must be directed at specific clinical targets, also denoted as indications, with associated regulatory requirements. These regulatory requirements, especially design control, require that the clinical indication be precisely defined a priori, unlike most academic basic tissue engineering research where the research target is typically open-ended, and furthermore requires that the tissue engineering therapy be constructed according to design inputs that ensure it treats or mitigates the clinical indication. Finally, regulatory approval dictates that the constructed system be verified, i.e., proven that it meets the design inputs, and validated, i.e., that by meeting the design inputs the therapy will address the clinical indication. Satisfying design control requires (1) a system of integrated technologies (scaffolds, materials, biologics), ideally based on a fundamental platform, as compared to focus on a single technology, (2) testing of design hypotheses to validate system performance as opposed to mechanistic hypotheses of natural phenomena, and (3) sequential testing using in vitro, in vivo, large preclinical and eventually clinical tests against competing therapies, as compared to single experiments to test new technologies or test mechanistic hypotheses. Our goal in this paper is to illustrate how design control may be implemented in academic translation of scaffold based tissue engineering therapies. 
Specifically, we propose to (1) demonstrate a modular platform approach

  11. Potential of adaptive clinical trial designs in pharmacogenetic research: a simulation based on the IPASS trial

    NARCIS (Netherlands)

    Van Der Baan, Frederieke H.; Knol, Mirjam J.|info:eu-repo/dai/nl/304820350; Klungel, Olaf H.|info:eu-repo/dai/nl/181447649; Egberts, Toine C.G.|info:eu-repo/dai/nl/162850050; Grobbee, Diederick E.; Roes, Kit C.B.

    2011-01-01

    Background: An adaptive clinical trial design that allows population enrichment after interim analysis can be advantageous in pharmacogenetic research if previous evidence is not strong enough to exclude part of the patient population beforehand. With this design, underpowered studies or unnecessary
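    The enrichment decision sketched in this record can be illustrated with a toy interim-analysis simulation. All effect sizes, thresholds, and function names below are hypothetical, not taken from the IPASS-based study:

```python
import random
import statistics

def enrich_at_interim(effect_pos, effect_neg, n_per_group=100, threshold=0.0, seed=7):
    """Toy sketch of a population-enrichment decision at an interim analysis.

    Simulate treatment-minus-control outcome differences for the
    biomarker-positive and biomarker-negative subgroups; if the observed
    effect in the biomarker-negative subgroup falls below `threshold`,
    restrict the second stage of the trial to biomarker-positive patients.
    """
    rng = random.Random(seed)
    obs_pos = statistics.mean(rng.gauss(effect_pos, 1.0) for _ in range(n_per_group))
    obs_neg = statistics.mean(rng.gauss(effect_neg, 1.0) for _ in range(n_per_group))
    return {"obs_pos": obs_pos, "obs_neg": obs_neg, "enrich": obs_neg < threshold}
```

With a clearly harmful (or null) effect in the biomarker-negative subgroup, the sketch drops that subgroup after the interim look, which is the situation where an adaptive design avoids an underpowered overall trial.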

  12. 78 FR 11207 - Clinical Study Designs for Surgical Ablation Devices for Treatment of Atrial Fibrillation...

    Science.gov (United States)

    2013-02-15

    ...] Clinical Study Designs for Surgical Ablation Devices for Treatment of Atrial Fibrillation; Guidance for... devices intended for the treatment of atrial fibrillation. DATES: Submit either electronic or written... Study Designs for Surgical Ablation Devices for Treatment of Atrial Fibrillation'' to the Division of...

  13. Designing and Implementing a Mentoring Program to Support Clinically-Based Teacher Education

    Science.gov (United States)

    Henning, John E.; Gut, Dianne; Beam, Pamela

    2015-01-01

    This article describes one teacher preparation program's approach to designing and implementing a mentoring program to support clinically-based teacher education. The design for the program is based on an interview study that compared the mentoring experiences of 18 teachers across three different contexts: student teaching, early field…

  14. [Clinical skills and outcomes of chair-side computer aided design and computer aided manufacture system].

    Science.gov (United States)

    Yu, Q

    2018-04-09

    Computer aided design and computer aided manufacture (CAD/CAM) technology is a kind of oral digital system which is applied to clinical diagnosis and treatment. It overturns the traditional pattern, and provides a solution to restore defect tooth quickly and efficiently. In this paper we mainly discuss the clinical skills of chair-side CAD/CAM system, including tooth preparation, digital impression, the three-dimensional design of prosthesis, numerical control machining, clinical bonding and so on, and review the outcomes of several common kinds of materials at the same time.

  15. Practical parallel computing

    CERN Document Server

    Morse, H Stephen

    1994-01-01

    Practical Parallel Computing provides information pertinent to the fundamental aspects of high-performance parallel processing. This book discusses the development of parallel applications on a variety of equipment.Organized into three parts encompassing 12 chapters, this book begins with an overview of the technology trends that converge to favor massively parallel hardware over traditional mainframes and vector machines. This text then gives a tutorial introduction to parallel hardware architectures. Other chapters provide worked-out examples of programs using several parallel languages. Thi

  16. Parallel sorting algorithms

    CERN Document Server

    Akl, Selim G

    1985-01-01

    Parallel Sorting Algorithms explains how to use parallel algorithms to sort a sequence of items on a variety of parallel computers. The book reviews the sorting problem, the parallel models of computation, parallel algorithms, and the lower bounds on the parallel sorting problems. The text also presents twenty different algorithms, such as linear arrays, mesh-connected computers, cube-connected computers. Another example where algorithm can be applied is on the shared-memory SIMD (single instruction stream multiple data stream) computers in which the whole sequence to be sorted can fit in the
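    The linear-array model mentioned in this record is classically illustrated by odd-even transposition sort, which sorts n items on an n-processor linear array in n phases. A serial Python simulation of the parallel compare-exchange phases (an illustration, not code from the book):

```python
def odd_even_transposition_sort(a):
    """Simulate odd-even transposition sort for a linear processor array.

    Each phase compares-and-swaps disjoint neighbour pairs (even phases
    use pairs starting at index 0, odd phases pairs starting at index 1).
    On a real linear array every pair operates in parallel, so n items
    are sorted after n phases.
    """
    a = list(a)
    n = len(a)
    for phase in range(n):
        start = 0 if phase % 2 == 0 else 1
        for i in range(start, n - 1, 2):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
    return a
```

Because the pairs touched in one phase are disjoint, each inner loop maps directly onto simultaneous compare-exchange operations between neighbouring processors.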

  17. Parallel algorithms for continuum dynamics

    International Nuclear Information System (INIS)

    Hicks, D.L.; Liebrock, L.M.

    1987-01-01

    Simply porting existing parallel programs to a new parallel processor may not achieve the full speedup possible; to achieve the maximum efficiency may require redesigning the parallel algorithms for the specific architecture. The authors discuss here parallel algorithms that were developed first for the HEP processor and then ported to the CRAY X-MP/4, the ELXSI/10, and the Intel iPSC/32. Focus is mainly on the most recent parallel processing results produced, i.e., those on the Intel Hypercube. The applications are simulations of continuum dynamics in which the momentum and stress gradients are important. Examples of these are inertial confinement fusion experiments, severe breaks in the coolant system of a reactor, weapons physics, shock-wave physics. Speedup efficiencies on the Intel iPSC Hypercube are very sensitive to the ratio of communication to computation. Great care must be taken in designing algorithms for this machine to avoid global communication. This is much more critical on the iPSC than it was on the three previous parallel processors
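    The sensitivity to the communication-to-computation ratio described above can be sketched with a toy performance model for a 1-D domain decomposition. The constants are illustrative, not measured values from the iPSC:

```python
def parallel_efficiency(n_cells, n_procs, t_comp=1.0, t_comm=50.0, halo=1):
    """Toy performance model for a 1-D domain decomposition.

    Each processor updates n_cells / n_procs cells (t_comp per cell) and
    exchanges `halo` boundary cells with each of two neighbours (t_comm
    per cell).  Efficiency degrades as the communication-to-computation
    ratio grows, mirroring the hypercube behaviour described above.
    """
    work = (n_cells / n_procs) * t_comp      # local computation
    comm = 2 * halo * t_comm                 # nearest-neighbour exchange
    t_serial = n_cells * t_comp
    speedup = t_serial / (work + comm)
    return speedup / n_procs                 # parallel efficiency in (0, 1]
```

For a fixed problem size, adding processors shrinks the per-processor work while the neighbour-exchange cost stays constant, so efficiency falls; global communication would add a term growing with the processor count, which is why the authors avoid it.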

  18. Research design considerations for single-dose analgesic clinical trials in acute pain

    DEFF Research Database (Denmark)

    Cooper, Stephen A; Desjardins, Paul J; Turk, Dennis C

    2016-01-01

    This article summarizes the results of a meeting convened by the Initiative on Methods, Measurement, and Pain Assessment in Clinical Trials (IMMPACT) on key considerations and best practices governing the design of acute pain clinical trials. We discuss the role of early phase clinical trials......, including pharmacokinetic-pharmacodynamic (PK-PD) trials, and the value of including both placebo and active standards of comparison in acute pain trials. This article focuses on single-dose and short-duration trials with emphasis on the perioperative and study design factors that influence assay...... sensitivity. Recommendations are presented on assessment measures, study designs, and operational factors. Although most of the methodological advances have come from studies of postoperative pain after dental impaction, bunionectomy, and other surgeries, the design considerations discussed are applicable...

  19. Architectural design of a data warehouse to support operational and analytical queries across disparate clinical databases.

    Science.gov (United States)

    Chelico, John D; Wilcox, Adam; Wajngurt, David

    2007-10-11

    As the clinical data warehouse of the New York Presbyterian Hospital has evolved, innovative methods of integrating new data sources and providing more effective and efficient data reporting and analysis need to be explored. We designed and implemented a new clinical data warehouse architecture to handle the integration of disparate clinical databases in the institution. By examining the way downstream systems are populated and streamlining the way data is stored, we create a virtual clinical data warehouse that is adaptable to the future needs of the organization.

  20. Design of electromagnetic field FDTD multi-core parallel program based on OpenMP

    Institute of Scientific and Technical Information of China (English)

    吕忠亭; 张玉强; 崔巍

    2013-01-01

    The method of electromagnetic field FDTD multi-core parallel program design based on OpenMP is discussed, with the aim of achieving ideal performance gains when the method is applied to more sophisticated algorithms. For a one-dimensional electromagnetic FDTD problem, the calculation method and process are described briefly. In the Fortran language environment, parallelism is achieved with OpenMP using a fine-grained approach, i.e., parallel computation is performed only over the loop portion. The parallel method was then verified in a three-dimensional transient electromagnetic field FDTD program for dipole radiation. The parallel algorithm achieved faster speedup and higher efficiency than other parallel FDTD algorithms. The results indicate that the electromagnetic field FDTD parallel algorithm based on OpenMP has very good speedup and efficiency.
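    The fine-grained, loop-level parallelization described in this record targets the spatial update loops of the FDTD scheme. As a hedged illustration, here is a serial Python sketch of a 1-D Yee update; in the cited Fortran code, each k-loop would be wrapped in an OpenMP parallel-do directive. Grid size, step count, and source parameters are illustrative:

```python
import math

def fdtd_1d(n_cells=200, n_steps=100, src_pos=100):
    """Serial sketch of a 1-D FDTD (Yee) update loop in free-space units.

    Each of the two k-loops below is independent across k, which is what
    makes loop-level (fine-grained) OpenMP parallelization applicable:
    in Fortran they would each carry a `!$omp parallel do` directive.
    A Courant number of 0.5 keeps the scheme stable.
    """
    ez = [0.0] * n_cells  # electric field
    hy = [0.0] * n_cells  # magnetic field
    c = 0.5               # Courant number
    for t in range(n_steps):
        # magnetic-field update (parallelizable over k)
        for k in range(n_cells - 1):
            hy[k] += c * (ez[k + 1] - ez[k])
        # electric-field update (parallelizable over k)
        for k in range(1, n_cells):
            ez[k] += c * (hy[k] - hy[k - 1])
        # soft Gaussian source at the grid centre
        ez[src_pos] += math.exp(-((t - 30.0) / 10.0) ** 2)
    return ez
```

Only the loop bodies are parallelized; the time-stepping loop remains serial because each step depends on the previous one, which matches the fine-grained strategy of the paper.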

  1. Clinical Trial Design for HIV Prevention Research: Determining Standards of Prevention.

    Science.gov (United States)

    Dawson, Liza; Zwerski, Sheryl

    2015-06-01

    This article seeks to advance ethical dialogue on choosing standards of prevention in clinical trials testing improved biomedical prevention methods for HIV. The stakes in this area of research are high, given the continued high rates of infection in many countries and the budget limitations that have constrained efforts to expand treatment for all who are currently HIV-infected. New prevention methods are still needed; at the same time, some existing prevention and treatment interventions have been proven effective but are not yet widely available in the countries where they are most urgently needed. The ethical tensions in this field of clinical research are well known and have been the subject of extensive debate. There is no single clinical trial design that can optimize all the ethically important goals and commitments involved in research. Several recent articles have described the current ethical difficulties in designing HIV prevention trials, especially in resource-limited settings; however, there is no consensus on how to handle clinical trial design decisions, and existing international ethical guidelines offer conflicting advice. This article acknowledges these deep ethical dilemmas and moves beyond a simple descriptive approach to advance an organized method for considering what clinical trial designs will be ethically acceptable for HIV prevention trials, balancing the relevant criteria and providing justification for specific design decisions. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.

  2. Introduction to parallel programming

    CERN Document Server

    Brawer, Steven

    1989-01-01

    Introduction to Parallel Programming focuses on the techniques, processes, methodologies, and approaches involved in parallel programming. The book first offers information on Fortran, hardware and operating system models, and processes, shared memory, and simple parallel programs. Discussions focus on processes and processors, joining processes, shared memory, time-sharing with multiple processors, hardware, loops, passing arguments in function/subroutine calls, program structure, and arithmetic expressions. The text then elaborates on basic parallel programming techniques, barriers and race
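    The shared-memory primitives the book covers (spawning workers, protecting shared data from races, joining processes) can be sketched with a generic illustration; this is not code from the book:

```python
import threading

def parallel_sum(values, n_workers=4):
    """Shared-memory parallel sum with worker threads.

    Each worker sums a private chunk (no lock needed), then adds its
    partial result to a shared total inside a critical section guarded
    by a lock, avoiding a race on the shared variable.  The main thread
    joins all workers before reading the result.
    """
    total = 0
    lock = threading.Lock()
    chunk = (len(values) + n_workers - 1) // n_workers

    def worker(lo):
        nonlocal total
        partial = sum(values[lo:lo + chunk])  # private work
        with lock:                            # critical section
            total += partial

    threads = [threading.Thread(target=worker, args=(i * chunk,))
               for i in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return total
```

Without the lock, two workers could read the same value of `total` and lose one of the updates, which is exactly the race condition the book's chapters on shared memory address.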

  3. Chiropractic and self-care for back-related leg pain: design of a randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Schulz Craig A

    2011-03-01

    Full Text Available Abstract Background Back-related leg pain (BRLP) is a common variation of low back pain (LBP), with lifetime prevalence estimates as high as 40%. Often disabling, BRLP accounts for greater work loss, recurrences, and higher costs than uncomplicated LBP and more often leads to surgery, with a lifetime incidence of 10% for those with severe BRLP, compared to 1-2% for those with LBP. In the US, half of those with back-related conditions seek CAM treatments, the most common of which is chiropractic care. While there is preliminary evidence suggesting chiropractic spinal manipulative therapy is beneficial for patients with BRLP, there is insufficient evidence currently available to assess the effectiveness of this care. Methods/Design This study is a two-site, prospective, parallel group, observer-blinded randomized clinical trial (RCT). A total of 192 study patients will be recruited from the Twin Cities, MN (n = 122) and the Quad Cities area in Iowa and Illinois (n = 70) to the research clinics at WHCCS and PCCR, respectively. It compares two interventions: chiropractic spinal manipulative therapy (SMT) plus home exercise program (HEP) to HEP alone (minimal intervention comparison) for patients with subacute or chronic back-related leg pain. Discussion Back-related leg pain (BRLP) is a costly and often disabling variation of the ubiquitous back pain conditions. As health care costs continue to climb, the search for effective treatments with few side-effects is critical. While SMT is the most commonly sought CAM treatment for LBP sufferers, there is only a small, albeit promising, body of research to support its use for patients with BRLP. This study seeks to fill a critical gap in the LBP literature by performing the first full scale RCT assessing chiropractic SMT for patients with sub-acute or chronic BRLP using important patient-oriented and objective biomechanical outcome measures. Trial Registration ClinicalTrials.gov NCT00494065

  4. Using Co-Design with Nursing Students to Create Educational Apps for Clinical Training.

    Science.gov (United States)

    O'Connor, Siobhan; Andrews, Tom

    2016-01-01

    Mobile technology is being trialed in nursing education to support students in clinical practice, as it can provide instant access to high quality educational material at the point of care. However, most educational mobile apps are generic, off-the-shelf applications that do not take into consideration the unique needs of nursing students, who can require personalised software solutions. This study adapted a socio-cognitive engineering approach and through a series of focus groups with final year nursing students explored the co-design process and gained their input on the design and functionality of a clinical skills based educational app. Results showed students required an uncluttered interface that was fast to navigate and easy to use in busy clinical environments. They also requested simple visual descriptions of key clinical skills and equipment to enable them to quickly refresh their memory so they could perform the skill in practice.

  5. Clinical implementation of x-ray phase-contrast imaging: Theoretical foundations and design considerations

    International Nuclear Information System (INIS)

    Wu Xizeng; Liu Hong

    2003-01-01

    The theoretical foundation and design considerations of a clinically feasible x-ray phase-contrast imaging technique are presented in this paper. Different from the analysis of imaging phase objects with weak absorption in the literature, we propose a new formalism for in-line phase-contrast imaging to analyze the effects of four clinically important factors on the phase contrast. These are attenuation by body parts, the spatial coherence of spherical waves from a finite-size focal spot, polychromatic x-rays, and radiation doses to patients in clinical applications. The theory presented in this paper can be applied widely in diagnostic x-ray imaging procedures. As an example, computer simulations were conducted and optimal design parameters were derived for clinical mammography. The results of phantom experiments are also presented, which validate the theoretical analysis and computer simulations.

  6. Design of a Clinical Information Management System to Support DNA Analysis Laboratory Operation

    Science.gov (United States)

    Dubay, Christopher J.; Zimmerman, David; Popovich, Bradley

    1995-01-01

    The LabDirector system has been developed at the Oregon Health Sciences University to support the operation of our clinical DNA analysis laboratory. Through an iterative design process which has spanned two years, we have produced a system that is both highly tailored to a clinical genetics production laboratory and flexible in its implementation, to support the rapid growth and change of protocols and methodologies in use in the field. The administrative aspects of the system are integrated with an enterprise schedule management system. The laboratory side of the system is driven by a protocol modeling and execution system. The close integration between these two aspects of the clinical laboratory facilitates smooth operations, and allows management to accurately measure costs and performance. The entire application has been designed and documented to provide utility to a wide range of clinical laboratory environments.

  7. A survey of parallel multigrid algorithms

    Science.gov (United States)

    Chan, Tony F.; Tuminaro, Ray S.

    1987-01-01

    A typical multigrid algorithm applied to well-behaved linear-elliptic partial-differential equations (PDEs) is described. Criteria for designing and evaluating parallel algorithms are presented. Before evaluating the performance of some parallel multigrid algorithms, consideration is given to some theoretical complexity results for solving PDEs in parallel and for executing the multigrid algorithm. The effect of mapping and load imbalance on the partial efficiency of the algorithm is studied.
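    As a concrete illustration of the multigrid idea summarized above, the following sketch implements one two-grid V-cycle for the 1-D Poisson equation -u'' = f with zero Dirichlet boundaries: weighted-Jacobi smoothing, full-weighting restriction, an exact coarse solve, and linear-interpolation prolongation. It is a minimal serial sketch, not one of the parallel algorithms surveyed in the paper:

```python
import math

def two_grid_poisson(n=63, sweeps=3):
    """One two-grid V-cycle for -u'' = f on (0, 1), n odd interior points.

    Returns the max-norm of the residual before and after the cycle, so
    the convergence of the cycle can be checked directly.
    """
    h = 1.0 / (n + 1)
    f = [math.sin(math.pi * (i + 1) * h) for i in range(n)]
    u = [0.0] * n

    def residual(u):
        r = []
        for i in range(n):
            left = u[i - 1] if i > 0 else 0.0
            right = u[i + 1] if i < n - 1 else 0.0
            r.append(f[i] - (2 * u[i] - left - right) / (h * h))
        return r

    def smooth(u, rhs, h, sweeps, omega=2.0 / 3.0):
        """Weighted Jacobi for tridiag(-1, 2, -1)/h^2."""
        m = len(u)
        for _ in range(sweeps):
            new = list(u)
            for i in range(m):
                left = u[i - 1] if i > 0 else 0.0
                right = u[i + 1] if i < m - 1 else 0.0
                new[i] = (1 - omega) * u[i] + omega * 0.5 * (left + right + h * h * rhs[i])
            u = new
        return u

    def solve_tridiag(rhs, h):
        """Exact Thomas solve of tridiag(-1, 2, -1)/h^2 x = rhs."""
        m = len(rhs)
        c, d = [0.0] * m, [0.0] * m
        b = [h * h * v for v in rhs]
        c[0], d[0] = -0.5, b[0] / 2.0
        for i in range(1, m):
            denom = 2.0 + c[i - 1]
            c[i] = -1.0 / denom
            d[i] = (b[i] + d[i - 1]) / denom
        x = [0.0] * m
        x[-1] = d[-1]
        for i in range(m - 2, -1, -1):
            x[i] = d[i] - c[i] * x[i + 1]
        return x

    nrm0 = max(abs(v) for v in residual(u))
    u = smooth(u, f, h, sweeps)                      # pre-smoothing
    r = residual(u)
    nc = (n - 1) // 2
    rc = [0.25 * r[2 * j] + 0.5 * r[2 * j + 1] + 0.25 * r[2 * j + 2]
          for j in range(nc)]                        # full-weighting restriction
    ec = solve_tridiag(rc, 2 * h)                    # exact coarse solve
    for j in range(nc):                              # prolong and correct
        u[2 * j + 1] += ec[j]
    for j in range(nc + 1):
        left = ec[j - 1] if j > 0 else 0.0
        right = ec[j] if j < nc else 0.0
        u[2 * j] += 0.5 * (left + right)
    u = smooth(u, f, h, sweeps)                      # post-smoothing
    nrm1 = max(abs(v) for v in residual(u))
    return nrm0, nrm1
```

The smoother damps oscillatory error components while the coarse grid removes the smooth ones; parallelizing the smoother and the grid transfers, and handling the small coarse grids efficiently, are exactly the design issues the survey discusses.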

  8. Design of a Clinical Information Management System to Support DNA Analysis Laboratory Operation

    OpenAIRE

    Dubay, Christopher J.; Zimmerman, David; Popovich, Bradley

    1995-01-01

    The LabDirector system has been developed at the Oregon Health Sciences University to support the operation of our clinical DNA analysis laboratory. Through an iterative design process which has spanned two years, we have produced a system that is both highly tailored to a clinical genetics production laboratory and flexible in its implementation, to support the rapid growth and change of protocols and methodologies in use in the field. The administrative aspects of the system are integrated ...

  9. Influence of Abutment Design on Clinical Status of Peri-Implant Tissues

    OpenAIRE

    Taiyeb-Ali, T. B.; Toh, C. G.; Siar, C. H.; Seiz, D.; Ong, S. T.

    2017-01-01

    Objective: To compare the clinical soft tissue responses around implant tooth-supported 3-unit bridges using tapered abutments with those using butt-joint abutments. Methods: In a split-mouth design study, 8 mm Ankylos (Dentsply Friadent, Germany) implants were placed in the second mandibular molar region of 8 adult Macaca fascicularis monkeys about I month after extraction of all mandibular molars. After 3 months of submerged healing, 3-unit metal bridges were constructed. Clinical data was ...

  10. Healthy School, Happy School: Design and Protocol for a Randomized Clinical Trial Designed to Prevent Weight Gain in Children

    Directory of Open Access Journals (Sweden)

    Daniela Schneid Schuh

    Full Text Available Abstract Background: Schools have become a key setting for the promotion of health and obesity interventions, bringing the development of critical awareness to the construction and promotion of a healthy diet, physical activity, and the monitoring of nutritional status in childhood and adolescence. Objectives: To describe a study protocol to evaluate the effectiveness of an intervention designed to improve knowledge of food choices in the school environment. Methods: This is a cluster-randomized, parallel, two-arm study conducted in public elementary and middle schools in Brazil. Participants will be children and adolescents between the ages of 5 and 15 years, of both genders. The interventions will focus on changes in lifestyle, physical activity, and nutritional education. Intervention activities will occur monthly in the school’s multimedia room or sports court. The control arm will receive the usual recommendations from the school. The primary outcome variables will be anthropometric measures, such as body mass index percentiles, and levels of physical activity measured by the International Physical Activity Questionnaire. Results: We expect that after the study children will increase their intake of fresh food, reduce excessive consumption of sugary and processed foods, and reduce the hours of sedentary activities. Conclusion: The purpose of starting the dietary intervention at this stage of life is to develop knowledge that enables healthy choices, providing opportunities for a better future for this population.

  11. Sources of variation in primary care clinical workflow: implications for the design of cognitive support.

    Science.gov (United States)

    Militello, Laura G; Arbuckle, Nicole B; Saleem, Jason J; Patterson, Emily; Flanagan, Mindy; Haggstrom, David; Doebbeling, Bradley N

    2014-03-01

    This article identifies sources of variation in clinical workflow and implications for the design and implementation of electronic clinical decision support. Sources of variation in workflow were identified via rapid ethnographic observation, focus groups, and interviews across a total of eight medical centers in both the Veterans Health Administration and academic medical centers nationally regarded as leaders in developing and using clinical decision support. Data were reviewed for types of variability within the social and technical subsystems and the external environment as described in the sociotechnical systems theory. Two researchers independently identified examples of variation and their sources, and then met with each other to discuss them until consensus was reached. Sources of variation were categorized as environmental (clinic staffing and clinic pace), social (perception of health information technology and real-time use with patients), or technical (computer access and information access). Examples of sources of variation within each of the categories are described and discussed in terms of impact on clinical workflow. As technologies are implemented, barriers to use become visible over time as users struggle to adapt workflow and work practices to accommodate new technologies. Each source of variability identified has implications for the effective design and implementation of useful health information technology. Accommodating moderate variability in workflow is anticipated to avoid brittle and inflexible workflow designs, while also avoiding unnecessary complexity for implementers and users.

  12. Parallel Atomistic Simulations

    Energy Technology Data Exchange (ETDEWEB)

    HEFFELFINGER,GRANT S.

    2000-01-18

    Algorithms developed to enable the use of atomistic molecular simulation methods with parallel computers are reviewed. Methods appropriate for bonded as well as non-bonded (and charged) interactions are included. While strategies for obtaining parallel molecular simulations have been developed for the full variety of atomistic simulation methods, molecular dynamics and Monte Carlo have received the most attention. Three main types of parallel molecular dynamics simulations have been developed: the replicated data decomposition, the spatial decomposition, and the force decomposition. For Monte Carlo simulations, parallel algorithms have been developed which can be divided into two categories: those which require a modified Markov chain and those which do not. Parallel algorithms developed for other simulation methods such as Gibbs ensemble Monte Carlo, grand canonical molecular dynamics, and Monte Carlo methods for protein structure determination are also reviewed, and issues such as how to measure parallel efficiency, especially in the case of parallel Monte Carlo algorithms with modified Markov chains, are discussed.
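    The spatial decomposition mentioned above assigns particles to geometric cells so that short-range forces only require data from a cell's own and neighbouring regions. A minimal serial sketch of the underlying cell-list bookkeeping (an illustrative 2-D periodic box, not code from the reviewed packages):

```python
from collections import defaultdict

def build_cell_lists(positions, box, cell_size):
    """Bin particles of a 2-D periodic box into square cells.

    In a spatial decomposition, each processor owns one or more cells
    and only communicates with the owners of neighbouring cells.
    """
    ncell = max(1, int(box / cell_size))
    cells = defaultdict(list)
    for idx, (x, y) in enumerate(positions):
        cx = int(x / box * ncell) % ncell
        cy = int(y / box * ncell) % ncell
        cells[(cx, cy)].append(idx)
    return dict(cells), ncell

def neighbour_pairs(cells, ncell, positions, cutoff):
    """Candidate short-range pairs, searching neighbouring cells only.

    Valid when cutoff <= cell size; the plain distance check (no
    minimum-image wrap) assumes particles are away from the box edges.
    """
    pairs = set()
    for (cx, cy), members in cells.items():
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                other = cells.get(((cx + dx) % ncell, (cy + dy) % ncell), [])
                for i in members:
                    for j in other:
                        if i < j:
                            xi, yi = positions[i]
                            xj, yj = positions[j]
                            if (xi - xj) ** 2 + (yi - yj) ** 2 < cutoff ** 2:
                                pairs.add((i, j))
    return pairs
```

Because each cell interacts only with its eight neighbours, the communication pattern is local, which is what gives the spatial decomposition its favourable scaling for short-range forces.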

  13. A clinical trial design using the concept of proportional time using the generalized gamma ratio distribution.

    Science.gov (United States)

    Phadnis, Milind A; Wetmore, James B; Mayo, Matthew S

    2017-11-20

    Traditional methods of sample size and power calculations in clinical trials with a time-to-event end point are based on the logrank test (and its variations), Cox proportional hazards (PH) assumption, or comparison of means of 2 exponential distributions. Of these, sample size calculation based on PH assumption is likely the most common and allows adjusting for the effect of one or more covariates. However, when designing a trial, there are situations when the assumption of PH may not be appropriate. Additionally, when it is known that there is a rapid decline in the survival curve for a control group, such as from previously conducted observational studies, a design based on the PH assumption may confer only a minor statistical improvement for the treatment group that is neither clinically nor practically meaningful. For such scenarios, a clinical trial design that focuses on improvement in patient longevity is proposed, based on the concept of proportional time using the generalized gamma ratio distribution. Simulations are conducted to evaluate the performance of the proportional time method and to identify the situations in which such a design will be beneficial as compared to the standard design using a PH assumption, piecewise exponential hazards assumption, and specific cases of a cure rate model. A practical example in which hemorrhagic stroke patients are randomized to 1 of 2 arms in a putative clinical trial demonstrates the usefulness of this approach by drastically reducing the number of patients needed for study enrollment. Copyright © 2017 John Wiley & Sons, Ltd.
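    For reference, the standard PH-based design that this proposal is contrasted against sizes the trial by the number of events from Schoenfeld's formula, d = (z_{1-a/2} + z_{1-b})^2 / (p(1-p) (log HR)^2). A minimal stdlib-only sketch (equal allocation assumed; the normal quantile is computed by bisection on the error function):

```python
import math

def schoenfeld_events(hazard_ratio, alpha=0.05, power=0.80, alloc=0.5):
    """Required number of events under the proportional hazards assumption
    (Schoenfeld's formula), for a two-arm logrank comparison.
    """
    def z(p):
        # inverse standard normal CDF via bisection on math.erf
        lo, hi = -10.0, 10.0
        for _ in range(200):
            mid = (lo + hi) / 2
            if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < p:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    za, zb = z(1 - alpha / 2), z(power)
    return (za + zb) ** 2 / (alloc * (1 - alloc) * math.log(hazard_ratio) ** 2)
```

For a hazard ratio of 0.5 at 5% two-sided alpha and 80% power this gives roughly 66 events; the proportional-time design in the record aims to reduce such requirements when the PH assumption is inappropriate.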

  14. Co-Designing Mobile Apps to Assist in Clinical Nursing Education: A Study Protocol.

    Science.gov (United States)

    O'Connor, Siobhan; Andrews, Tom

    2016-01-01

    Mobile applications (apps) to train health professionals is gaining momentum as the benefits of mobile learning (mLearning) are becoming apparent in complex clinical environments. However, most educational apps are generic, off-the-shelf pieces of software that do not take into consideration the unique needs of nursing students. The proposed study will apply a user-centred design process to create a tailored mobile app for nursing students to learn and apply clinical skills in practice. The app will be piloted and evaluated to understand how nursing students use mobile technology in clinical settings to support their learning and educational needs.

  15. Parallel imaging microfluidic cytometer.

    Science.gov (United States)

    Ehrlich, Daniel J; McKenna, Brian K; Evans, James G; Belkina, Anna C; Denis, Gerald V; Sherr, David H; Cheung, Man Ching

    2011-01-01

    By adding an additional degree of freedom from multichannel flow, the parallel microfluidic cytometer (PMC) combines some of the best features of fluorescence-activated flow cytometry (FCM) and microscope-based high-content screening (HCS). The PMC (i) lends itself to fast processing of large numbers of samples, (ii) adds a 1D imaging capability for intracellular localization assays (HCS), (iii) has a high rare-cell sensitivity, and (iv) has an unusual capability for time-synchronized sampling. An inability to practically handle large sample numbers has restricted applications of conventional flow cytometers and microscopes in combinatorial cell assays, network biology, and drug discovery. The PMC promises to relieve a bottleneck in these previously constrained applications. The PMC may also be a powerful tool for finding rare primary cells in the clinic. The multichannel architecture of current PMC prototypes allows 384 unique samples for a cell-based screen to be read out in ∼6-10 min, about 30 times the speed of most current FCM systems. In 1D intracellular imaging, the PMC can obtain protein localization using HCS marker strategies at many times the sample throughput of charge-coupled device (CCD)-based microscopes or CCD-based single-channel flow cytometers. The PMC also permits the signal integration time to be varied over a larger range than is practical in conventional flow cytometers. The signal-to-noise advantages are useful, for example, in counting rare positive cells in the most difficult early stages of genome-wide screening. We review the status of parallel microfluidic cytometry and discuss some of the directions the new technology may take. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. How to design and write a clinical research protocol in Cosmetic Dermatology*

    Science.gov (United States)

    Bagatin, Ediléia; Miot, Helio A.

    2013-01-01

    Cosmetic Dermatology is a growing subspecialty. High-quality basic science studies have been published; however, few double-blind, randomized controlled clinical trials, which are the major instrument for evidence-based medicine, have been conducted in this area. Clinical research is essential for the discovery of new knowledge, improvement of the scientific basis, resolution of challenges, and good clinical practice. Some basic principles for a successful researcher include interest, availability, persistence, and honesty. It is essential to learn how to write a research protocol and to know the international and national regulatory rules. A complete clinical trial protocol should include question, background, objectives, methodology (design, variable description, sample size, randomization, inclusion and exclusion criteria, intervention, efficacy and safety measures, and statistical analysis), consent form, clinical research form, and references. Institutional ethical review board approval and financial support disclosure are necessary. Publication of positive or negative results should be an authors' commitment. PMID:23539006

  17. The DEMO trial: a randomized, parallel-group, observer-blinded clinical trial of strength versus aerobic versus relaxation training for patients with mild to moderate depression

    DEFF Research Database (Denmark)

    Krogh, Jesper; Saltin, Bengt; Gluud, Christian

    2009-01-01

    OBJECTIVE: To assess the benefit and harm of exercise training in adults with clinical depression. METHOD: The DEMO trial is a randomized pragmatic trial for patients with unipolar depression conducted from January 2005 through July 2007. Patients were referred from general practitioners or psych......: Our findings do not support a biologically mediated effect of exercise on symptom severity in depressed patients, but they do support a beneficial effect of strength training on work capacity. TRIAL REGISTRATION: (ClinicalTrials.gov) Identifier: NCT00103415....

  18. Parallel Computing in SCALE

    International Nuclear Information System (INIS)

    DeHart, Mark D.; Williams, Mark L.; Bowman, Stephen M.

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  19. Solid-State-NMR-Structure-Based Inhibitor Design to Achieve Selective Inhibition of the Parallel-in-Register β-Sheet versus Antiparallel Iowa Mutant β-Amyloid Fibrils.

    Science.gov (United States)

    Cheng, Qinghui; Qiang, Wei

    2017-06-08

    Solid-state nuclear magnetic resonance (ssNMR) spectroscopy has been widely applied to characterize the high-resolution structures of β-amyloid (Aβ) fibrils. While these structures provide crucial molecular insights into the deposition of amyloid plaques in Alzheimer's disease (AD), ssNMR structures have so far rarely been used as the basis for designing inhibitors. It remains a challenge because the ssNMR-based Aβ fibril structures were usually obtained with sparsely isotope-labeled peptides under limited experimental constraints, so the structural models, especially the side-chain coordinates, show restricted precision. However, these structural models often possess higher accuracy within the hydrophobic core regions, where more well-defined experimental data are available, and these regions provide potential targets for molecular design. This work presents an ssNMR-based molecular design to achieve selective inhibition of a particular type of Aβ fibrillar structure, formed by the Iowa mutant of Aβ with a parallel-in-register β-sheet hydrophobic core. The results show that short peptides that mimic the C-terminal β-strands of the fibril may bind preferentially to the parallel Aβ fibrils rather than the antiparallel fibrils, mainly because of differences in the high-resolution structures at the fibril elongation interfaces. The Iowa mutant Aβ fibrils are used in this work mainly as a model to demonstrate the feasibility of the strategy, because it is relatively straightforward to distinguish the parallel and antiparallel fibril structures using ssNMR. Our results suggest that it is potentially feasible to design structure-selective inhibitors and/or diagnostic agents for Aβ fibrils using ssNMR-based structural models.

  20. Building a parallel file system simulator

    International Nuclear Information System (INIS)

    Molina-Estolano, E; Maltzahn, C; Brandt, S A; Bent, J

    2009-01-01

    Parallel file systems are gaining in popularity in high-end computing centers as well as commercial data centers. High-end computing systems are expected to scale exponentially and to pose new challenges to their storage scalability in terms of cost and power. To address these challenges, scientists and file system designers will need a thorough understanding of the design space of parallel file systems. Yet there exist few systematic studies of parallel file system behavior at petabyte and exabyte scale. An important reason is the significant cost of getting access to large-scale hardware on which to test parallel file systems. To contribute to this understanding, we are building a parallel file system simulator that can simulate parallel file systems at very large scale. Our goal is to simulate petabyte-scale parallel file systems on a small cluster, or even a single machine, in reasonable time and with reasonable fidelity. With this simulator, file system experts will be able to tune existing file systems for specific workloads, scientists and file system deployment engineers will be able to better communicate workload requirements, file system designers and researchers will be able to try out design alternatives and innovations at scale, and instructors will be able to study very large-scale parallel file system behavior in the classroom. In this paper we describe our approach and provide preliminary results that are encouraging both in terms of fidelity and simulation scalability.
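
    As a rough illustration of the kind of question such a simulator answers, here is a minimal, hypothetical Python model of a file striped round-robin across storage servers. All names and parameters are ours, and the real simulator models far more (caching, networks, request interleaving):

```python
def simulate_striped_read(file_size, stripe_size, n_servers, service_time):
    """Toy model: a file is striped round-robin across servers; each
    server serves its queue of stripe requests sequentially, and all
    servers work in parallel. Returns the completion time of the read."""
    n_stripes = -(-file_size // stripe_size)  # ceiling division
    # Queue depth per server under round-robin striping.
    per_server = [n_stripes // n_servers + (1 if i < n_stripes % n_servers else 0)
                  for i in range(n_servers)]
    # The slowest (deepest) queue dominates the completion time.
    return max(per_server) * service_time

# In this idealized model, doubling the server count halves the read time:
t4 = simulate_striped_read(file_size=1024, stripe_size=4, n_servers=4, service_time=1.0)
t8 = simulate_striped_read(file_size=1024, stripe_size=4, n_servers=8, service_time=1.0)
```

    Even a toy like this shows why simulation is useful: the scaling story changes as soon as queues, contention, or uneven stripe distributions are added.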

  1. Productive Parallel Programming: The PCN Approach

    Directory of Open Access Journals (Sweden)

    Ian Foster

    1992-01-01

    We describe the PCN programming system, focusing on those features designed to improve the productivity of scientists and engineers using parallel supercomputers. These features include a simple notation for the concise specification of concurrent algorithms, the ability to incorporate existing Fortran and C code into parallel applications, facilities for reusing parallel program components, a portable toolkit that allows applications to be developed on a workstation or small parallel computer and run unchanged on supercomputers, and integrated debugging and performance analysis tools. We survey representative scientific applications and identify problem classes for which PCN has proved particularly useful.

  2. Parallel auto-correlative statistics with VTK.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2013-08-01

    This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the auto-correlative statistics engine.
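
    For reference, the lag-k autocorrelation that such an engine computes can be sketched serially in a few lines of Python (the function and variable names are ours, not VTK's, and the engine's parallel aggregation across data partitions is not shown):

```python
import math

def autocorrelation(x, lag):
    """Lag-k sample autocorrelation of a sequence x, using the common
    1/n normalization of the lagged covariance."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag)) / n
    return cov / var

# A sine wave is strongly anti-correlated at half its period and strongly
# correlated at a full period:
signal = [math.sin(2 * math.pi * i / 8) for i in range(64)]
r4 = autocorrelation(signal, 4)   # lag = half period  -> near -1
r8 = autocorrelation(signal, 8)   # lag = full period  -> near +1
```

    The parallel engine's job is essentially to compute the sums above as distributed partial aggregates and merge them, which is why autocorrelation parallelizes well.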

  3. Structured Parallel Programming Patterns for Efficient Computation

    CERN Document Server

    McCool, Michael; Robison, Arch

    2012-01-01

    Programming is now parallel programming. Much as structured programming revolutionized traditional serial programming decades ago, a new kind of structured programming, based on patterns, is relevant to parallel programming today. Parallel computing experts and industry insiders Michael McCool, Arch Robison, and James Reinders describe how to design and implement maintainable and efficient parallel algorithms using a pattern-based approach. They present both theory and practice, and give detailed concrete examples using multiple programming models. Examples are primarily given using two of th
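
    As a Python stand-in for the book's pattern-based approach (its own examples use C++ programming models), two of the fundamental patterns it builds on, map and reduce, compose like this:

```python
from concurrent.futures import ThreadPoolExecutor
from functools import reduce
import operator

def parallel_sum_of_squares(data, workers=4):
    """Map pattern (square the elements of each chunk concurrently)
    followed by a reduction (combine the partial sums)."""
    data = list(data)
    chunk = -(-len(data) // workers)  # ceiling division
    chunks = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(lambda c: sum(x * x for x in c), chunks))
    return reduce(operator.add, partials, 0)

result = parallel_sum_of_squares(range(10))  # 0^2 + 1^2 + ... + 9^2 = 285
```

    With Python threads the GIL limits actual speed-up for pure-Python work; the point here is the shape of the pattern (decompose, map, reduce), which carries over directly to the compiled models the book describes.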

  4. Designing cyclic appointment schedules for outpatient clinics with scheduled and unscheduled patient arrivals

    NARCIS (Netherlands)

    Kortbeek, Nikky; Zonderland, Maartje E.; Braaksma, Aleida; Vliegen, Ingrid M. H.; Boucherie, Richard J.; Litvak, Nelly; Hans, Erwin W.

    2014-01-01

    We present a methodology to design appointment systems for outpatient clinics and diagnostic facilities that offer both walk-in and scheduled service. The developed blueprint for the appointment schedule prescribes the number of appointments to plan per day and the moment on the day to schedule the

  5. Designing cyclic appointment schedules for outpatient clinics with scheduled and unscheduled patient arrivals

    NARCIS (Netherlands)

    Kortbeek, Nikky; Zonderland, Maartje Elisabeth; Boucherie, Richardus J.; Litvak, Nelli; Hans, Elias W.

    2011-01-01

    We present a methodology to design appointment systems for outpatient clinics and diagnostic facilities that offer both walk-in and scheduled service. The developed blueprint for the appointment schedule prescribes the number of appointments to plan per day and the moment on the day to schedule the

  6. Anterior single implants with different neck designs : 5 Year results of a randomized clinical trial

    NARCIS (Netherlands)

    den Hartog, Laurens; Meijer, Henny J A; Vissink, Arjan; Raghoebar, Gerry M

    BACKGROUND: The design of the implant neck might be significant for preservation of marginal bone. PURPOSE: To compare the 5-year radiographic and clinical outcome of single anterior implants provided with a smooth neck, a rough neck or a scalloped rough neck. MATERIALS AND METHODS: 93 Patients with

  7. Efficient design of clinical trials and epidemiological research: is it possible?

    Science.gov (United States)

    Lauer, Michael S; Gordon, David; Wei, Gina; Pearson, Gail

    2017-08-01

    Randomized clinical trials and large-scale cohort studies continue to have a critical role in generating evidence in cardiovascular medicine; however, there is increasing concern that ballooning costs threaten the clinical trial enterprise. In this Perspectives article, we discuss the changing landscape of clinical research, and of clinical trials in particular, focusing on the reasons for increasing costs and inefficiencies. These reasons include excessively complex design, overly restrictive inclusion and exclusion criteria, burdensome regulations, excessive source-data verification, and concerns about the effect of clinical research conduct on workflow. Thought leaders have called on the clinical research community to consider alternative, transformative business models, including models that focus on simplicity and the leveraging of digital resources. We present some examples of innovative approaches by which investigators have successfully conducted large-scale clinical trials at relatively low cost. These examples include randomized registry trials, cluster-randomized trials, adaptive trials, and trials that are fully embedded within digital clinical care or administrative platforms.

  8. A Blended Learning Course Design in Clinical Pharmacology for Post-graduate Dental Students

    Science.gov (United States)

    Rosenbaum, Paul-Erik Lillholm; Mikalsen, Øyvind; Lygre, Henning; Solheim, Einar; Schjøtt, Jan

    2012-01-01

    Postgraduate courses in clinical pharmacology are important for dentists to stay updated on drug therapy and information related to their clinical practice, as well as on relevant adverse effects and interactions. A traditional approach with classroom delivery as the only method of teaching and learning has shortcomings regarding flexibility, individual learning preferences, and problem-based learning (PBL) activities compared to online environments. This study examines a five-week postgraduate course in clinical pharmacology with 15 hours of lectures and online learning activities, i.e. a blended course design. Six postgraduate dental students participated, and at the end of the course they were interviewed. Our findings emphasize that a blended learning course design can be successfully used in postgraduate dental education. Key matters for discussion were time flexibility and location convenience, change in the teacher's role, reinforced learning strategies towards professional needs, scarcity of online communication, and proposed future utilization of e-learning components. PMID:23248716

  9. Parallelization in Modern C++

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    The traditionally used and well established parallel programming models OpenMP and MPI are both targeting lower level parallelism and are meant to be as language agnostic as possible. For a long time, those models were the only widely available portable options for developing parallel C++ applications beyond using plain threads. This has strongly limited the optimization capabilities of compilers, has inhibited extensibility and genericity, and has restricted the use of those models together with other, modern higher level abstractions introduced by the C++11 and C++14 standards. The recent revival of interest in the industry and wider community for the C++ language has also spurred a remarkable amount of standardization proposals and technical specifications being developed. Those efforts however have so far failed to build a vision on how to seamlessly integrate various types of parallelism, such as iterative parallel execution, task-based parallelism, asynchronous many-task execution flows, continuation s...

  10. Statistical controversies in clinical research: requiem for the 3 + 3 design for phase I trials.

    Science.gov (United States)

    Paoletti, X; Ezzalfani, M; Le Tourneau, C

    2015-09-01

    More than 95% of published phase I trials have used the 3 + 3 design to identify the dose to be recommended for phase II trials. However, the statistical community agrees on the limitations of the 3 + 3 design compared with model-based approaches. Moreover, the mechanisms of action of targeted agents strongly challenge the hypothesis that the maximum tolerated dose constitutes the optimal dose, and additional outcomes, including clinical and biological activity, increasingly need to be taken into account to identify the optimal dose. We review key elements from clinical publications and from the statistical literature to show that the 3 + 3 design lacks the necessary flexibility to address the challenges of targeted agents. The design issues raised by expansion cohorts, new definitions of dose-limiting toxicity and trials of combinations are not easily addressed by the 3 + 3 design or its extensions. Alternative statistical proposals have been developed to make better use of the complex data generated by phase I trials. Their application requires a close collaboration between all actors of early-phase clinical trials. © The Author 2015. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.
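
    The 3 + 3 rule itself is simple enough to state as code, which is part of why it persists. The sketch below is a deliberately simplified, hypothetical rendering (it ignores de-escalation, re-expansion of lower doses, and other practical variants the abstract alludes to):

```python
def three_plus_three(toxicities_per_dose):
    """Simplified 3 + 3 escalation. toxicities_per_dose[d] is a pair:
    (DLTs in the first cohort of 3, DLTs in a second cohort of 3) that
    would be observed at dose level d. Returns the index of the highest
    tolerated dose reached, or -1 if even the first dose is too toxic."""
    mtd = -1
    for d, (first, second) in enumerate(toxicities_per_dose):
        if first == 0:
            mtd = d                      # 0/3 DLTs: escalate
            continue
        if first == 1 and first + second <= 1:
            mtd = d                      # 1/6 DLTs after expansion: escalate
            continue
        break                            # >= 2 DLTs: stop below this dose
    return mtd

# Hypothetical DLT outcomes at four increasing dose levels:
mtd = three_plus_three([(0, 0), (1, 0), (1, 1), (2, 0)])  # -> 1 (second level)
```

    The rigidity the authors criticize is visible here: the decision uses only the DLT counts 0, 1, or 2+ at the current dose, discarding dose-response information that model-based designs exploit.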

  11. Design of clinical trials involving multiple hypothesis tests with a common control.

    Science.gov (United States)

    Schou, I Manjula; Marschner, Ian C

    2017-07-01

    Randomized clinical trials comparing several treatments to a common control are often reported in the medical literature. For example, multiple experimental treatments may be compared with placebo, or in combination therapy trials, a combination therapy may be compared with each of its constituent monotherapies. Such trials are typically designed using a balanced approach in which equal numbers of individuals are randomized to each arm; however, this can result in an inefficient use of resources. We provide a unified framework and new theoretical results for the optimal design of such single-control multiple-comparator studies. We consider variance optimal designs based on D-, A-, and E-optimality criteria, using a general model that allows for heteroscedasticity and a range of effect measures that include both continuous and binary outcomes. We demonstrate the sensitivity of these designs to the type of optimality criterion by showing that the optimal allocation ratios are systematically ordered according to the optimality criterion. Given this sensitivity to the optimality criterion, we argue that power optimality is a more suitable approach when designing clinical trials where testing is the objective. Weighted variance optimal designs are also discussed, which, like power optimal designs, allow the treatment difference to play a major role in determining allocation ratios. We illustrate our methods using two real clinical trial examples taken from the medical literature. Some recommendations on the use of optimal designs in single-control multiple-comparator trials are also provided. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
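
    The inefficiency of balanced allocation can be checked numerically. Under equal variances, minimizing the summed variance of the K treatment-versus-control contrasts leads to the classical square-root rule (control allocation proportional to √K). The brute-force check below is our own illustration of that rule, not the authors' method, which covers more general criteria and heteroscedastic models:

```python
def sum_contrast_variances(n_control, n_treat, k, sigma2=1.0):
    """Sum of Var(mean_treatment_i - mean_control) for k treatment arms
    sharing one control arm, assuming equal variance sigma2 per arm."""
    return k * (sigma2 / n_treat + sigma2 / n_control)

def best_control_size(total_n, k):
    """Grid-search the control arm size that minimizes the summed
    contrast variance, splitting the rest equally among treatments."""
    best = None
    for n0 in range(1, total_n - k + 1):
        nt = (total_n - n0) / k
        v = sum_contrast_variances(n0, nt, k)
        if best is None or v < best[1]:
            best = (n0, v)
    return best[0]

# With k = 4 treatments, sqrt(k) = 2, so the control should get a 2:1
# share: 120 * 2 / (2 + 4) = 40, versus 24 per arm under balance.
n0 = best_control_size(total_n=120, k=4)
```

    With N = 120 and k = 4, the balanced design (24 per arm) gives a summed contrast variance of 0.333, while the √K allocation (40 control, 20 per treatment) gives 0.300, a real saving from reallocation alone.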

  12. A cross-cultural convergent parallel mixed methods study of what makes a cancer-related symptom or functional health problem clinically important.

    Science.gov (United States)

    Giesinger, Johannes M; Aaronson, Neil K; Arraras, Juan I; Efficace, Fabio; Groenvold, Mogens; Kieffer, Jacobien M; Loth, Fanny L; Petersen, Morten Aa; Ramage, John; Tomaszewski, Krzysztof A; Young, Teresa; Holzner, Bernhard

    2018-02-01

    In this study, we investigated what makes a symptom or functional impairment clinically important, that is, relevant for a patient to discuss with a health care professional (HCP). This is the first part of a European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group project focusing on the development of thresholds for clinical importance for the EORTC QLQ-C30 questionnaire and its corresponding computer-adaptive version. We conducted interviews with cancer patients and HCPs in 6 European countries. Participants were asked to name aspects of a symptom or problem that make it clinically important and to provide importance ratings for a predefined set of aspects (eg, need for help and limitations of daily functioning). We conducted interviews with 83 cancer patients (mean age, 60.3 y; 50.6% men) and 67 HCPs. Participants related clinical importance to limitations of everyday life (patients, 65.1%; HCPs, 77.6%), the emotional impact of a symptom/problem (patients, 53.0%; HCPs, 64.2%), and duration/frequency (patients, 51.8%; HCPs, 49.3%). In the patient sample, importance ratings were highest for worries by partner or family, limitations in everyday life, and need for help from the medical staff. Health care professionals rated limitations in everyday life and need for help from the medical staff to be most important. Limitations in everyday life, need for (medical) help, and emotional impact on the patient or family/partner were found to be relevant aspects of clinical importance. Based on these findings, we will define anchor items for the development of thresholds for clinical importance for the EORTC measures in a Europe-wide field study. Copyright © 2017 John Wiley & Sons, Ltd.

  13. Designing a clinical skills training laboratory with focus on video for better learning

    DEFF Research Database (Denmark)

    Lauridsen, Henrik Hein; Toftgård, Rie Castella; Nørgaard, Cita

    Objective: The principles of apprenticeship in clinical skills training are increasingly being challenged. First, most students are proficient in learning from visual multimedia and will expect this to be part of a modern university education. Second, students will often find visual teaching resources of varying quality on the internet if these are not made available during teaching. The objective of this project was to design a new clinical skills laboratory with IT and video facilities to support learning processes. Methods: Teaching principles were described before decisions were made on the design, based on a priori described teaching and learning designs related to active learning principles. This was a complex process involving teachers, IT experts, e-learning specialists and a variety of university employees.

  14. A parallel buffer tree

    DEFF Research Database (Denmark)

    Sitchinava, Nodar; Zeh, Norbert

    2012-01-01

    We present the parallel buffer tree, a parallel external memory (PEM) data structure for batched search problems. This data structure is a non-trivial extension of Arge's sequential buffer tree to a private-cache multiprocessor environment and reduces the number of I/O operations by a factor equal to the number of available processors, resulting in the optimal O(psort(N) + K/PB) parallel I/O complexity, where K is the size of the output reported in the process and psort(N) is the parallel I/O complexity of sorting N elements using P processors.

  15. Parallel Algorithms and Patterns

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-16

    This is a PowerPoint presentation on parallel algorithms and patterns. A parallel algorithm is a well-defined, step-by-step computational procedure that emphasizes concurrency to solve a problem. Examples of such problems include sorting, searching, optimization, and matrix operations. A parallel pattern is a computational step in a sequence of independent, potentially concurrent operations that occurs in diverse scenarios with some frequency. Examples are reductions, prefix scans, and ghost cell updates. We only touch on parallel patterns in this presentation; the topic really deserves its own detailed discussion, which Gabe Rockefeller would like to develop.
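
    One of the patterns named above, the prefix scan, can be written so that each sweep is itself a parallelizable map. A serial Python rendering of the Hillis-Steele variant (our illustration; the presentation's own examples are not reproduced here):

```python
def hillis_steele_scan(x):
    """Inclusive prefix sum via the Hillis-Steele pattern: about log2(n)
    sweeps, each of which is a data-parallel map over the whole array."""
    x = list(x)
    step = 1
    while step < len(x):
        # Each element reads its neighbor 'step' positions back; on a
        # parallel machine every i in this comprehension runs concurrently.
        x = [x[i] + (x[i - step] if i >= step else 0) for i in range(len(x))]
        step *= 2
    return x

scan = hillis_steele_scan([3, 1, 7, 0, 4, 1, 6, 3])
# -> [3, 4, 11, 11, 15, 16, 22, 25]
```

    The same doubling-stride structure underlies reductions and ghost-cell-style neighbor exchanges, which is why these patterns recur across so many parallel codes.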

  16. Application Portable Parallel Library

    Science.gov (United States)

    Cole, Gary L.; Blech, Richard A.; Quealy, Angela; Townsend, Scott

    1995-01-01

    Application Portable Parallel Library (APPL) computer program is subroutine-based message-passing software library intended to provide consistent interface to variety of multiprocessor computers on market today. Minimizes effort needed to move application program from one computer to another. User develops application program once and then easily moves application program from parallel computer on which created to another parallel computer. ("Parallel computer" here also includes heterogeneous collection of networked computers.) Written in C language with one FORTRAN 77 subroutine for UNIX-based computers and callable from application programs written in C language or FORTRAN 77.
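
    The send/receive style of message passing that APPL standardizes can be sketched with Python threads standing in for networked processes (APPL's actual C/FORTRAN 77 subroutine interface is not shown; the names below are ours):

```python
import threading
import queue

def worker(inbox, outbox):
    """Receive a message, compute, send a reply - the send/receive pair
    that a portable message-passing layer makes uniform across machines."""
    data = inbox.get()
    outbox.put(sum(data))

inbox, outbox = queue.Queue(), queue.Queue()
t = threading.Thread(target=worker, args=(inbox, outbox))
t.start()

inbox.put([1, 2, 3, 4])   # "send" a task to the worker
result = outbox.get()     # "receive" its reply
t.join()
```

    Because the application only ever calls send and receive, the same program structure runs whether the "worker" is a thread, a local process, or a node on another machine, which is precisely the portability argument behind APPL.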

  17. A cross-cultural convergent parallel mixed methods study of what makes a cancer-related symptom or functional health problem clinically important

    NARCIS (Netherlands)

    Giesinger, J.M.; Aaronson, N.K.; Arraras, J.I.; Efficace, F.; Groenvold, M.; Kieffer, J.M.; Loth, F.L.; Petersen, M.A.; Ramage, J.; Tomaszewski, K.A.; Young, T.; Holzner, B.

    2018-01-01

    Objective: In this study, we investigated what makes a symptom or functional impairment clinically important, that is, relevant for a patient to discuss with a health care professional (HCP). This is the first part of a European Organisation for Research and Treatment of Cancer (EORTC) Quality of

  18. Fringe Capacitance of a Parallel-Plate Capacitor.

    Science.gov (United States)

    Hale, D. P.

    1978-01-01

    Describes an experiment designed to measure the forces between charged parallel plates, and determines the relationship among the effective electrode area, the measured capacitance values, and the electrode spacing of a parallel plate capacitor. (GA)
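
    The baseline for such an experiment is the ideal relation C = ε0A/d, which ignores fringing. Kirchhoff's classical approximation for circular plates (quoted here from standard references, not from the paper) adds the leading fringe-field correction and shows how large the effect is at laboratory geometries:

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def ideal_plate_capacitance(radius, gap):
    """Ideal parallel-plate formula C = eps0 * A / d (no fringing)."""
    return EPS0 * math.pi * radius ** 2 / gap

def kirchhoff_capacitance(radius, gap):
    """Kirchhoff's approximation for a circular plate capacitor:
    the ideal term plus eps0 * R * (ln(16*pi*R/d) - 1) for fringing."""
    return ideal_plate_capacitance(radius, gap) + \
        EPS0 * radius * (math.log(16 * math.pi * radius / gap) - 1)

# 5 cm radius plates, 1 mm apart: fringing adds a few percent.
c_ideal = ideal_plate_capacitance(0.05, 0.001)    # ~69.5 pF
c_fringe = kirchhoff_capacitance(0.05, 0.001)
excess = c_fringe / c_ideal - 1
```

    At this geometry the fringe term is roughly 4% of the ideal value, comfortably within the resolution of a careful student measurement, which is what makes the experiment described above feasible.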

  19. Differences Between Distributed and Parallel Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brightwell, R.; Maccabe, A.B.; Rissen, R.

    1998-10-01

    Distributed systems have been studied for twenty years and are now coming into wider use as fast networks and powerful workstations become more readily available. In many respects a massively parallel computer resembles a network of workstations and it is tempting to port a distributed operating system to such a machine. However, there are significant differences between these two environments and a parallel operating system is needed to get the best performance out of a massively parallel system. This report characterizes the differences between distributed systems, networks of workstations, and massively parallel systems and analyzes the impact of these differences on operating system design. In the second part of the report, we introduce Puma, an operating system specifically developed for massively parallel systems. We describe Puma portals, the basic building blocks for message passing paradigms implemented on top of Puma, and show how the differences observed in the first part of the report have influenced the design and implementation of Puma.

  20. Interface, information, interaction: a narrative review of design and functional requirements for clinical decision support.

    Science.gov (United States)

    Miller, Kristen; Mosby, Danielle; Capan, Muge; Kowalski, Rebecca; Ratwani, Raj; Noaiseh, Yaman; Kraft, Rachel; Schwartz, Sanford; Weintraub, William S; Arnold, Ryan

    2018-05-01

    Provider acceptance and associated patient outcomes are widely discussed in the evaluation of clinical decision support systems (CDSSs), but critical design criteria for these tools have generally been overlooked. The objective of this work is to inform electronic health record alert optimization and clinical practice workflow by identifying, compiling, and reporting design recommendations for CDSSs to support the efficient, effective, and timely delivery of high-quality care. A narrative review was conducted from 2000 to 2016 in PubMed and The Journal of Human Factors and Ergonomics Society to identify papers that discussed or recommended design features of CDSSs associated with the success of these systems. Fourteen papers met the criteria and were found to contain a total of 42 unique recommendations; 11 were classified as interface features, 10 as information features, and 21 as interaction features. Features are defined and described, providing actionable guidance that can be applied to CDSS development and policy. To our knowledge, no reviews have been completed that discuss or recommend design features of CDSSs at this scale, and thus this review fills an important gap in the literature. The recommendations identified in this narrative review will help to optimize the design, organization, management, presentation, and utilization of information through presentation, content, and function. The designation of 3 categories (interface, information, and interaction) should be further evaluated to determine their relative importance. Future work will determine how to prioritize them with limited resources for designers and developers in order to maximize the clinical utility of CDSSs. This review will expand the field of knowledge and provide a novel organizational structure for identifying key CDSS recommendations.

  1. Optimising the design and operation of semi-continuous affinity chromatography for clinical and commercial manufacture.

    Science.gov (United States)

    Pollock, James; Bolton, Glen; Coffman, Jon; Ho, Sa V; Bracewell, Daniel G; Farid, Suzanne S

    2013-04-05

    This paper presents an integrated experimental and modelling approach to evaluate the potential of semi-continuous chromatography for the capture of monoclonal antibodies (mAb) in clinical and commercial manufacture. Small-scale single-column experimental breakthrough studies were used to derive design equations for the semi-continuous affinity chromatography system. Verification runs with the semi-continuous 3-column and 4-column periodic countercurrent (PCC) chromatography system indicated the robustness of the design approach. The product quality profiles and step yields (after wash step optimisation) achieved were comparable to the standard batch process. The experimentally derived design equations were incorporated into a decisional tool comprising dynamic simulation, process economics and sizing optimisation. The decisional tool was used to evaluate the economic and operational feasibility of whole mAb bioprocesses employing PCC affinity capture chromatography versus standard batch chromatography across a product's lifecycle from clinical to commercial manufacture. The tool predicted that PCC capture chromatography would offer more significant savings in direct costs for early-stage clinical manufacture (proof-of-concept) (∼30%) than for late-stage clinical (∼10-15%) or commercial (∼5%) manufacture. The evaluation also highlighted the potential facility fit issues that could arise with a capture resin (MabSelect) that experiences losses in binding capacity when operated in continuous mode over lengthy commercial campaigns. Consequently, the analysis explored the scenario of adopting the PCC system for clinical manufacture and switching to the standard batch process following product launch. The tool determined the PCC system design required to operate at commercial scale without facility fit issues and with similar costs to the standard batch process whilst pursuing a process change application. A retrofitting analysis established that the direct cost

  2. Research design considerations for chronic pain prevention clinical trials: IMMPACT recommendations.

    Science.gov (United States)

    Gewandter, Jennifer S; Dworkin, Robert H; Turk, Dennis C; Farrar, John T; Fillingim, Roger B; Gilron, Ian; Markman, John D; Oaklander, Anne Louise; Polydefkis, Michael J; Raja, Srinivasa N; Robinson, James P; Woolf, Clifford J; Ziegler, Dan; Ashburn, Michael A; Burke, Laurie B; Cowan, Penney; George, Steven Z; Goli, Veeraindar; Graff, Ole X; Iyengar, Smriti; Jay, Gary W; Katz, Joel; Kehlet, Henrik; Kitt, Rachel A; Kopecky, Ernest A; Malamut, Richard; McDermott, Michael P; Palmer, Pamela; Rappaport, Bob A; Rauschkolb, Christine; Steigerwald, Ilona; Tobias, Jeffrey; Walco, Gary A

    2015-07-01

    Although certain risk factors can identify individuals who are most likely to develop chronic pain, few interventions to prevent chronic pain have been identified. To facilitate the identification of preventive interventions, an IMMPACT meeting was convened to discuss research design considerations for clinical trials investigating the prevention of chronic pain. We present general design considerations for prevention trials in populations that are at relatively high risk for developing chronic pain. Specific design considerations included subject identification, timing and duration of treatment, outcomes, timing of assessment, and adjusting for risk factors in the analyses. We provide a detailed examination of 4 models of chronic pain prevention (ie, chronic postsurgical pain, postherpetic neuralgia, chronic low back pain, and painful chemotherapy-induced peripheral neuropathy). The issues discussed can, in many instances, be extrapolated to other chronic pain conditions. These examples were selected because they are representative models of primary and secondary prevention, reflect persistent pain resulting from multiple insults (ie, surgery, viral infection, injury, and toxic or noxious element exposure), and are chronically painful conditions that are treated with a range of interventions. Improvements in the design of chronic pain prevention trials could improve assay sensitivity and thus accelerate the identification of efficacious interventions. Such interventions would have the potential to reduce the prevalence of chronic pain in the population. Additionally, standardization of outcomes in prevention clinical trials will facilitate meta-analyses and systematic reviews and improve detection of preventive strategies emerging from clinical trials.

  3. Innovating cystic fibrosis clinical trial designs in an era of successful standard of care therapies.

    Science.gov (United States)

    VanDevanter, Donald R; Mayer-Hamblett, Nicole

    2017-11-01

    Evolving cystic fibrosis 'standards of care' have influenced recent cystic fibrosis clinical trial designs for new therapies; additions and improvements to care will require innovative trial designs to maximize feasibility and efficacy detection. Three cystic fibrosis therapeutic areas (pulmonary exacerbations, Pseudomonas aeruginosa airway infections, and reduced cystic fibrosis transmembrane conductance regulator [CFTR] protein function) differ with respect to the duration for which recognized 'standards of care' have been available. However, developers of new therapies in all three areas face similar challenges: standards of care have become so strongly entrenched that traditional placebo-controlled studies in cystic fibrosis populations likely to benefit from newer therapies have become less and less feasible. Today, patients and clinicians are more likely to entertain participation in active-comparator trial designs, which have substantial challenges of their own. Foremost among these are the selection of 'valid' active comparator(s), estimation of a comparator's current clinical efficacy (required for testing noninferiority hypotheses), and effective blinding of commercially available comparators. Recent and future cystic fibrosis clinical trial designs will have to creatively address this collateral result of the successful past development of effective cystic fibrosis therapies: patients and clinicians are much less likely to accept simple, placebo-controlled studies to evaluate future therapies.

  4. The DEMO trial: a randomized, parallel-group, observer-blinded clinical trial of strength versus aerobic versus relaxation training for patients with mild to moderate depression

    DEFF Research Database (Denmark)

    Krogh, Jesper; Saltin, Bengt; Gluud, Christian

    2009-01-01

    OBJECTIVE: To assess the benefit and harm of exercise training in adults with clinical depression. METHOD: The DEMO trial is a randomized pragmatic trial for patients with unipolar depression conducted from January 2005 through July 2007. Patients were referred from general practitioners or psychiatrists and were eligible if they fulfilled the International Classification of Diseases, Tenth Revision, criteria for unipolar depression and were aged between 18 and 55 years. Patients (N = 165) were allocated to supervised strength, aerobic, or relaxation training during a 4-month period. The primary … Our findings do not support a biologically mediated effect of exercise on symptom severity in depressed patients, but they do support a beneficial effect of strength training on work capacity. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT00103415.

  5. Efficacy of topical resin lacquer, amorolfine and oral terbinafine for treating toenail onychomycosis: a prospective, randomized, controlled, investigator-blinded, parallel-group clinical trial.

    Science.gov (United States)

    Auvinen, T; Tiihonen, R; Soini, M; Wangel, M; Sipponen, A; Jokinen, J J

    2015-10-01

    Norway spruce (Picea abies) produces resin to protect against decomposition by microbial pathogens. In vitro tests have shown that spruce resin has antifungal properties against the dermatophytes known to cause nearly 90% of onychomycosis in humans. To confirm previous in vivo observations that a topical resin lacquer provides mycological and clinical efficacy, and to compare this lacquer with topical amorolfine hydrochloride lacquer and systemic terbinafine for treating dermatophyte toenail onychomycosis. In this prospective, randomized, controlled, investigator-blinded study, 73 patients with onychomycosis were randomized to receive topical 30% resin lacquer once daily for 9 months, topical 5% amorolfine lacquer once weekly for 9 months, or 250 mg oral terbinafine once daily for 3 months. The primary outcome measure was complete mycological cure at 10 months. Secondary outcomes were clinical efficacy, cost-effectiveness and patient compliance. At 10 months, complete mycological cure rates with the resin, amorolfine and terbinafine treatments were 13% [95% confidence interval (CI) 0-28], 8% (95% CI 0-19) and 56% (95% CI 35-77), respectively (P ≤ 0·002). At 10 months, clinical responses were complete in four patients (16%) treated with terbinafine, and partial in seven (30%), seven (28%) and nine (36%) patients treated with resin, amorolfine and terbinafine, respectively (P < …). The resin, amorolfine and terbinafine treatments cost €41·6, €56·3 and €52·1, respectively, per patient (P < …). Terbinafine was significantly more effective in terms of mycological cure and clinical outcome than either topical therapy at the 10-month follow-up. © 2015 British Association of Dermatologists.

  6. Invited review: study design considerations for clinical research in veterinary radiology and radiation oncology.

    Science.gov (United States)

    Scrivani, Peter V; Erb, Hollis N

    2013-01-01

    High quality clinical research is essential for advancing knowledge in the areas of veterinary radiology and radiation oncology. Types of clinical research studies may include experimental studies, method-comparison studies, and patient-based studies. Experimental studies explore issues relative to pathophysiology, patient safety, and treatment efficacy. Method-comparison studies evaluate agreement between techniques or between observers. Patient-based studies investigate naturally acquired disease and focus on questions asked in clinical practice that relate to individuals or populations (e.g., risk, accuracy, or prognosis). Careful preplanning and study design are essential in order to achieve valid results. A key point to planning studies is ensuring that the design is tailored to the study objectives. Good design includes a comprehensive literature review, asking suitable questions, selecting the proper sample population, collecting the appropriate data, performing the correct statistical analyses, and drawing conclusions supported by the available evidence. Most study designs are classified by whether they are experimental or observational, longitudinal or cross-sectional, and prospective or retrospective. Additional features (e.g., controlled, randomized, or blinded) may be described that address bias. Two related challenging aspects of study design are defining an important research question and selecting an appropriate sample population. The sample population should represent the target population as much as possible. Furthermore, when comparing groups, it is important that the groups are as alike to each other as possible except for the variables of interest. Medical images are well suited for clinical research because imaging signs are categorical or numerical variables that might be predictors or outcomes of diseases or treatments. © 2013 Veterinary Radiology & Ultrasound.

  7. Participatory design methods for the development of a clinical telehealth service for neonatal homecare.

    Science.gov (United States)

    Garne Holm, Kristina; Brødsgaard, Anne; Zachariassen, Gitte; Smith, Anthony C; Clemensen, Jane

    2017-01-01

    Neonatal homecare delivered during home visits by neonatal nurses is a common method for supporting families of preterm infants following discharge. Telehealth has been introduced for the provision of neonatal homecare, resulting in positive feedback from parents of preterm infants. While the benefits are beginning to be realised, widespread uptake of telehealth has been limited due to a range of logistical challenges. Understanding user requirements is important when planning and developing a clinical telehealth service. We therefore used participatory design to develop a clinical telehealth service for neonatal homecare. The study adopted a participatory design approach to engage users in the development and design of a new telehealth service. Participatory design embraces qualitative research methods. Creative and technical workshops were conducted as part of the study. Tests of the telehealth service were conducted in the neonatal unit. Participants in this study were former and current parents of preterm infants eligible for neonatal homecare, and clinical staff (medical and nursing) from the neonatal unit. Preterm infants accompanied their parents. Based on the results obtained during the workshops and subsequent testing, we developed an application (app), which was integrated into the medical record at the neonatal unit. The app was used to initiate videoconferences and chat messages between the family at home and the neonatal unit, and to share information regarding infant growth and well-being. Results obtained from the workshops and testing demonstrated the importance of involving users when developing new telehealth applications. The workshops helped identify the challenges associated with delivery of the service, and helped instruct the design of a new telehealth service for neonatal homecare based on the needs of parents and clinical staff.

  8. FILMPAR: A parallel algorithm designed for the efficient and accurate computation of thin film flow on functional surfaces containing micro-structure

    Science.gov (United States)

    Lee, Y. C.; Thompson, H. M.; Gaskell, P. H.

    2009-12-01

    FILMPAR is a highly efficient and portable parallel multigrid algorithm for solving a discretised form of the lubrication approximation to three-dimensional, gravity-driven, continuous thin film free-surface flow over substrates containing micro-scale topography. While generally applicable to problems involving heterogeneous and distributed features, for illustrative purposes the algorithm is benchmarked on a distributed memory IBM BlueGene/P computing platform for the case of flow over a single trench topography, enabling direct comparison with complementary experimental data and existing serial multigrid solutions. Parallel performance is assessed as a function of the number of processors employed and shown to lead to super-linear behaviour for the production of mesh-independent solutions. In addition, the approach is used to solve for the case of flow over a complex inter-connected topographical feature and a description provided of how FILMPAR could be adapted relatively simply to solve for a wider class of related thin film flow problems.
    Program summary:
    Program title: FILMPAR
    Catalogue identifier: AEEL_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEL_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 530 421
    No. of bytes in distributed program, including test data, etc.: 1 960 313
    Distribution format: tar.gz
    Programming language: C++ and MPI
    Computer: Desktop, server
    Operating system: Unix/Linux, Mac OS X
    Has the code been vectorised or parallelised?: Yes. Tested with up to 128 processors
    RAM: 512 MBytes
    Classification: 12
    External routines: GNU C/C++, MPI
    Nature of problem: Thin film flows over functional substrates containing well-defined single and complex topographical features are of enormous significance, having a wide variety of engineering

  9. The DEMO trial: a randomized, parallel-group, observer-blinded clinical trial of strength versus aerobic versus relaxation training for patients with mild to moderate depression

    DEFF Research Database (Denmark)

    Krogh, Jesper; Saltin, Bengt; Gluud, Christian

    2009-01-01

    OBJECTIVE: To assess the benefit and harm of exercise training in adults with clinical depression. METHOD: The DEMO trial is a randomized pragmatic trial for patients with unipolar depression conducted from January 2005 through July 2007. Patients were referred from general practitioners or psychiatrists and were eligible if they fulfilled the International Classification of Diseases, Tenth Revision, criteria for unipolar depression and were aged between 18 and 55 years. Patients (N = 165) were allocated to supervised strength, aerobic, or relaxation training during a 4-month period. The primary … Our findings do not support a biologically mediated effect of exercise on symptom severity in depressed patients, but they do support a beneficial effect of strength training on work capacity. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT00103415.

  10. High-speed parallel counter

    International Nuclear Information System (INIS)

    Gus'kov, B.N.; Kalinnikov, V.A.; Krastev, V.R.; Maksimov, A.N.; Nikityuk, N.M.

    1985-01-01

    This paper describes a high-speed parallel counter that contains 31 inputs and 15 outputs and is implemented by integrated circuits of series 500. The counter is designed for fast sampling of events according to the number of particles that pass simultaneously through the hodoscopic plane of the detector. The minimum delay of the output signals relative to the input is 43 nsec. The duration of the output signals can be varied from 75 to 120 nsec
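    The counting function itself is simple to state: the output is the population count of the 31 input bits, formed by a tree of adders so that delay grows with the depth of the tree rather than the number of inputs. A minimal software sketch of that reduction (purely illustrative; the paper's circuit is built from series-500 integrated logic, not code):

```python
# Illustrative sketch (not the series-500 circuit): a parallel "population
# count" reduces 31 one-bit inputs to their sum with layers of pairwise
# additions, mirroring the adder tree of a hardware parallel counter.
def popcount_tree(bits):
    """Sum a list of 0/1 inputs by pairwise addition, as adder layers would."""
    assert all(b in (0, 1) for b in bits)
    layer = list(bits)
    while len(layer) > 1:
        # Combine neighbouring partial sums, carrying an odd element forward.
        layer = [layer[i] + layer[i + 1] for i in range(0, len(layer) - 1, 2)] \
                + ([layer[-1]] if len(layer) % 2 else [])
    return layer[0]

hits = [1, 0, 1, 1, 0] * 6 + [1]    # 31 hypothetical hodoscope channels
print(popcount_tree(hits))          # same result as sum(hits)
```

    For 31 inputs the tree has five layers, which is why the delay of such a counter is fixed (tens of nanoseconds here) regardless of how many particles fire simultaneously.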

  11. Combinatorics of spreads and parallelisms

    CERN Document Server

    Johnson, Norman

    2010-01-01

    Partitions of Vector Spaces; Quasi-Subgeometry Partitions; Finite Focal-Spreads; Generalizing André Spreads; The Going Up Construction for Focal-Spreads; Subgeometry Partitions; Subgeometry and Quasi-Subgeometry Partitions; Subgeometries from Focal-Spreads; Extended André Subgeometries; Kantor's Flag-Transitive Designs; Maximal Additive Partial Spreads; Subplane Covered Nets and Baer Groups; Partial Desarguesian t-Parallelisms; Direct Products of Affine Planes; Jha-Johnson SL(2,

  12. Parallel discrete event simulation

    NARCIS (Netherlands)

    Overeinder, B.J.; Hertzberger, L.O.; Sloot, P.M.A.; Withagen, W.J.

    1991-01-01

    In simulating applications for execution on specific computing systems, the simulation performance figures must be known in a short period of time. One basic approach to the problem of reducing the required simulation time is the exploitation of parallelism. However, in parallelizing the simulation
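    The computational core being parallelized in such work is the event loop of a sequential discrete event simulator, which always processes the pending event with the smallest timestamp. A minimal sketch of that sequential core (illustrative only; the event names and tuples are invented here):

```python
# Minimal sequential discrete-event core: the loop that parallel discrete
# event simulation distributes across processors. Events are kept in a
# priority queue and processed in timestamp order.
import heapq

def simulate(events, horizon):
    """events: list of (time, name) tuples. Returns events processed up to horizon."""
    queue = list(events)
    heapq.heapify(queue)
    log = []
    while queue:
        t, name = heapq.heappop(queue)
        if t > horizon:
            break
        log.append((t, name))   # a real model would also schedule new events here
    return log

print(simulate([(3.0, "arrive"), (1.0, "start"), (7.0, "depart")], 5.0))
```

    Parallel variants must preserve exactly this timestamp ordering across processors, which is the source of the synchronization overhead the abstract alludes to.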

  13. Language constructs for modular parallel programs

    Energy Technology Data Exchange (ETDEWEB)

    Foster, I.

    1996-03-01

    We describe programming language constructs that facilitate the application of modular design techniques in parallel programming. These constructs allow us to isolate resource management and processor scheduling decisions from the specification of individual modules, which can themselves encapsulate design decisions concerned with concurrency, communication, process mapping, and data distribution. This approach permits development of libraries of reusable parallel program components and the reuse of these components in different contexts. In particular, alternative mapping strategies can be explored without modifying other aspects of program logic. We describe how these constructs are incorporated in two practical parallel programming languages, PCN and Fortran M. Compilers have been developed for both languages, allowing experimentation in substantial applications.
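    The separation described here, module logic on one side and mapping/resource decisions on the other, can be sketched in modern Python rather than PCN or Fortran M. In this hypothetical example the `convolve` module is mapping-agnostic, and the executor passed to `run` carries the scheduling decision:

```python
# Hedged illustration (not PCN or Fortran M): the module encapsulates its
# computation, while the "mapping strategy" -- which executor runs it -- is
# chosen separately and can change without touching the module.
from concurrent.futures import Executor, ThreadPoolExecutor, ProcessPoolExecutor

def convolve(chunk):                # a reusable component, mapping-agnostic
    return [x * 2 for x in chunk]

def run(module, chunks, executor: Executor):
    with executor:                  # resource management isolated here
        return list(executor.map(module, chunks))

chunks = [[1, 2], [3, 4]]
print(run(convolve, chunks, ThreadPoolExecutor(max_workers=2)))  # [[2, 4], [6, 8]]
# Swapping in ProcessPoolExecutor() changes the mapping, not the module.
```

    The point mirrors the abstract: the same `convolve` component is reusable under any mapping strategy.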

  14. Totally parallel multilevel algorithms

    Science.gov (United States)

    Frederickson, Paul O.

    1988-01-01

    Four totally parallel algorithms for the solution of a sparse linear system have common characteristics which become quite apparent when they are implemented on a highly parallel hypercube such as the CM2. These four algorithms are Parallel Superconvergent Multigrid (PSMG) of Frederickson and McBryan, Robust Multigrid (RMG) of Hackbusch, the FFT-based Spectral Algorithm, and Parallel Cyclic Reduction. In fact, all four can be formulated as particular cases of the same totally parallel multilevel algorithm, referred to as TPMA. In certain cases the spectral radius of TPMA is zero, and it is recognized to be a direct algorithm. In many other cases the spectral radius, although not zero, is small enough that a single iteration per timestep keeps the local error within the required tolerance.
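    Of the four methods named, the FFT-based spectral algorithm is the easiest to sketch: for a 1-D periodic Poisson problem every Fourier mode is solved independently, which is what makes the method totally parallel and, with spectral radius zero, direct. A small illustrative sketch (not taken from the paper):

```python
# Sketch of an FFT-based spectral solve for the 1-D periodic Poisson
# problem -u'' = f: each Fourier mode is handled independently
# (u_hat = f_hat / k^2), so all modes can be processed in parallel.
import numpy as np

def poisson_fft(f, length=2 * np.pi):
    n = f.size
    k = np.fft.fftfreq(n, d=length / n) * 2 * np.pi   # angular wavenumbers
    f_hat = np.fft.fft(f)
    u_hat = np.zeros_like(f_hat)
    nz = k != 0
    u_hat[nz] = f_hat[nz] / k[nz] ** 2                # independent mode-wise solve
    return np.fft.ifft(u_hat).real                    # zero-mean solution

x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
u = poisson_fft(np.sin(x))              # exact solution of -u'' = sin is sin
print(np.max(np.abs(u - np.sin(x))))    # machine-precision agreement
```

    Because no mode depends on any other, the solve is a direct algorithm, matching the "spectral radius zero" case described above.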

  15. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, user's reference manual.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications.
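    As a flavour of the command specification such a manual documents, a DAKOTA input deck is organized into method, variables, interface, and responses blocks. The sketch below is hypothetical and written from memory of the input format, not copied from the manual; keyword spellings and defaults should be checked against the reference itself:

```text
# Hypothetical DAKOTA study sketch: Latin hypercube sampling of an
# external simulation. Verify all keywords against the reference manual.
method
  sampling
    sample_type lhs
    samples = 100
variables
  uniform_uncertain = 2
    descriptors   'x1' 'x2'
    lower_bounds  0.0  0.0
    upper_bounds  1.0  1.0
interface
  fork
    analysis_drivers = 'run_simulation.sh'
responses
  response_functions = 1
  no_gradients
  no_hessians
```

    The block structure is the point: swapping the method block (say, sampling for an optimizer) reuses the same variables, interface, and responses unchanged, which is the "flexible and extensible interface" the abstract describes.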

  16. DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis:version 4.0 reference manual

    Energy Technology Data Exchange (ETDEWEB)

    Griffin, Joshua D. (Sandia National Labs, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L. (Sandia National Labs, Livermore, CA); Watson, Jean-Paul; Kolda, Tamara Gibson (Sandia National Labs, Livermore, CA); Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J. (Sandia National Labs, Livermore, CA); Hough, Patricia Diane (Sandia National Labs, Livermore, CA); Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Giunta, Anthony A.; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications.

  17. Clinical and cost effectiveness of mechanical support for severe ankle sprains: design of a randomised controlled trial in the emergency department [ISRCTN 37807450

    Directory of Open Access Journals (Sweden)

    Hutton JL

    2005-01-01

    Background: The optimal management for severe sprains (Grades II and III) of the lateral ligament complex of the ankle is unclear. The aims of this randomised controlled trial are to estimate (1) the clinical effectiveness of three methods of providing mechanical support to the ankle (below-knee cast, Aircast® brace and Bledsoe® boot) in comparison to Tubigrip®, and (2) to compare the cost of each strategy, including subsequent health care costs. Methods/design: Six hundred and fifty people with a diagnosis of severe sprain are being identified through emergency departments. The study has been designed to complement routine practice in the emergency setting. Outcomes are recovery of mobility (primary outcome) and usual activity, residual symptoms and need for further medical, rehabilitation or surgical treatment. Parallel economic and qualitative studies are being conducted to aid interpretation of the results and to evaluate the cost-effectiveness of the interventions. Discussion: This paper highlights the design, methods and operational aspects of a clinical trial of acute injury management in the emergency department.

  18. Participatory design methods for the development of a clinical telehealth service for neonatal homecare

    DEFF Research Database (Denmark)

    Garne Holm, Kristina; Brødsgaard, Anne; Zachariassen, Gitte

    2017-01-01

    Neonatal homecare delivered during home visits by neonatal nurses is a common method for supporting families of preterm infants following discharge. Telehealth has been introduced for the provision of neonatal homecare, resulting in positive feedback from parents of preterm infants. While the benefits are beginning to be realised, widespread uptake of telehealth has been limited due to a range of logistical challenges. Understanding user requirements is important when planning and developing a clinical telehealth service. We therefore used participatory design to develop a clinical telehealth service for neonatal homecare. Based on the results obtained during the workshops and subsequent testing, we developed an application (app), which was integrated into the medical record at the neonatal unit. The app was used to initiate videoconferences and chat messages between the family at home and the neonatal unit, and to share information regarding infant growth and well-being. CONCLUSION: Results obtained from the workshops and testing demonstrated the importance of involving users when developing new telehealth applications. The workshops helped identify the challenges associated with delivery of the service, and helped instruct the design of a new telehealth service for neonatal homecare based on the needs of parents and clinical staff.

  19. Parallel Architectures and Parallel Algorithms for Integrated Vision Systems. Ph.D. Thesis

    Science.gov (United States)

    Choudhary, Alok Nidhi

    1989-01-01

    Computer vision is regarded as one of the most complex and computationally intensive problems. An integrated vision system (IVS) is a system that uses vision algorithms from all levels of processing for a high-level application (e.g., object recognition). An IVS normally involves algorithms from low-level, intermediate-level, and high-level vision. Designing parallel architectures for vision systems is of tremendous interest to researchers. Several issues in parallel architectures and parallel algorithms for integrated vision systems are addressed.

  20. Designing a Clinical Data Warehouse Architecture to Support Quality Improvement Initiatives.

    Science.gov (United States)

    Chelico, John D; Wilcox, Adam B; Vawdrey, David K; Kuperman, Gilad J

    2016-01-01

    Clinical data warehouses, initially directed towards clinical research or financial analyses, are evolving to support quality improvement efforts, and must now address the quality improvement life cycle. In addition, data that are needed for quality improvement often do not reside in a single database, requiring easier methods to query data across multiple disparate sources. We created a virtual data warehouse at NewYork-Presbyterian Hospital that allowed us to bring together data from several source systems throughout the organization. We also created a framework to match the maturity of a data request in the quality improvement life cycle to the proper tools needed for each request. As projects progress through the Define, Measure, Analyze, Improve, Control stages of quality improvement, resources are matched to the data needs at each step. We describe the analysis and design that created a robust model for applying clinical data warehousing to quality improvement.
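    The virtual-warehouse idea, querying across disparate sources without physically consolidating them, can be sketched with two in-memory SQLite databases standing in for separate clinical systems (all table and column names here are invented for illustration):

```python
# Hedged sketch of a "virtual warehouse": leave data in separate source
# databases and join across them at query time. Two in-memory SQLite
# databases stand in for disparate clinical source systems.
import sqlite3

conn = sqlite3.connect(":memory:")                  # primary source: admissions
conn.execute("CREATE TABLE admissions (patient_id INTEGER, unit TEXT)")
conn.execute("INSERT INTO admissions VALUES (1, 'ICU'), (2, 'ED')")

conn.execute("ATTACH DATABASE ':memory:' AS labs")  # second source: lab results
conn.execute("CREATE TABLE labs.results (patient_id INTEGER, lactate REAL)")
conn.execute("INSERT INTO labs.results VALUES (1, 4.2), (2, 1.1)")

# One query spans both sources; no physical consolidation is required.
rows = conn.execute(
    "SELECT a.patient_id, a.unit, r.lactate "
    "FROM admissions a JOIN labs.results r USING (patient_id) "
    "WHERE r.lactate > 2.0"
).fetchall()
print(rows)   # [(1, 'ICU', 4.2)]
```

    Real implementations federate heterogeneous systems rather than attached SQLite files, but the query-time join is the same design choice.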

  1. A Clinical Reasoning Tool for Virtual Patients: Design-Based Research Study.

    Science.gov (United States)

    Hege, Inga; Kononowicz, Andrzej A; Adler, Martin

    2017-11-02

    Clinical reasoning is a fundamental process medical students have to learn during and after medical school. Virtual patients (VP) are a technology-enhanced learning method to teach clinical reasoning. However, VP systems do not exploit their full potential concerning the clinical reasoning process; for example, most systems focus on the outcome and less on the process of clinical reasoning. Grounding our concept in an earlier qualitative study, we aimed to design and implement a tool to enhance VPs with activities and feedback which specifically foster the acquisition of clinical reasoning skills. We designed the tool by translating elements of a conceptual clinical reasoning learning framework into software requirements. The resulting clinical reasoning tool enables learners to build their patient's illness script as a concept map while they are working on a VP scenario. The student's map is compared with the experts' reasoning at each stage of the VP, which is technically enabled by using Medical Subject Headings, a comprehensive controlled vocabulary published by the US National Library of Medicine. The tool is implemented using Web technologies, has an open architecture that enables its integration into various systems through an open application program interface, and is available under a Massachusetts Institute of Technology license. We conducted usability tests following a think-aloud protocol and a pilot field study with maps created by 64 medical students. The results show that learners interact with the tool but create fewer nodes and connections in the concept map than an expert. Further research and usability tests are required to analyze the reasons. The presented tool is a versatile, systematically developed software component that specifically supports clinical reasoning skills acquisition. It can be plugged into VP systems or used as stand-alone software in other teaching scenarios. The modular design allows an extension with new
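    The stage-by-stage comparison of a learner's map with the expert's reduces, at its simplest, to set operations over nodes and edges. A hedged sketch of that idea (not the published tool's code; the clinical terms are invented examples):

```python
# Hedged sketch of concept-map comparison: edges link findings to
# hypotheses (as MeSH-style terms); the overlap with the expert map
# drives the feedback shown to the student.
def compare_maps(learner_edges, expert_edges):
    learner, expert = set(learner_edges), set(expert_edges)
    return {
        "matched": sorted(learner & expert),
        "missing": sorted(expert - learner),   # prompt the student about these
        "extra":   sorted(learner - expert),   # possibly irrelevant links
    }

expert = [("dyspnea", "heart failure"), ("edema", "heart failure")]
learner = [("dyspnea", "heart failure"), ("dyspnea", "asthma")]
print(compare_maps(learner, expert))
```

    Using a controlled vocabulary such as MeSH is what makes this set comparison possible at all: free-text terms from different students would rarely match exactly.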

  2. Ozone exposure and pulmonary effects in panel and human clinical studies: Considerations for design and interpretation.

    Science.gov (United States)

    Rohr, Annette C

    2018-04-01

    A wealth of literature exists regarding the pulmonary effects of ozone, a photochemical pollutant produced by the reaction of nitrogen oxides and volatile organic precursors in the presence of sunlight. This paper focuses on epidemiological panel studies and human clinical studies of ozone exposure, and discusses issues specific to this pollutant that may influence study design and interpretation as well as other, broader considerations relevant to ozone-health research. The issues are discussed using examples drawn from the wider literature. The recent panel and clinical literature is also reviewed. Health outcomes considered include lung function, symptoms, and pulmonary inflammation. Issues discussed include adversity, reversibility, adaptation, variability in the ozone exposure metric used and the health outcomes evaluated, co-pollutants in panel studies, the influence of temperature in panel studies, and multiple comparisons. Improvements in and standardization of panel study approaches are recommended to facilitate comparisons between studies as well as meta-analyses. Additional clinical studies at or near the current National Ambient Air Quality Standard (NAAQS) of 70 ppb are recommended, as are clinical studies in sensitive subpopulations such as asthmatics. The pulmonary health impacts of ozone exposure have been well documented using both epidemiological and chamber study designs. However, there are a number of specific methodological and related issues that should be considered when interpreting the results of these studies and planning additional research, including the standardization of exposure and health metrics to facilitate comparisons among studies.
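    The multiple-comparisons issue raised above can be made concrete: with many independent tests of lung-function outcomes at alpha = 0.05, the family-wise error rate grows quickly, and a Bonferroni correction trades power for control of that rate. A small worked example:

```python
# Why multiple comparisons matter in panel studies: with many independent
# tests at alpha = 0.05, the chance of at least one false positive grows
# quickly; Bonferroni controls it by shrinking the per-test threshold.
alpha, n_tests = 0.05, 20           # e.g. many lung-function metrics per study

family_wise_error = 1 - (1 - alpha) ** n_tests
print(round(family_wise_error, 3))  # about 0.642

bonferroni_alpha = alpha / n_tests
print(bonferroni_alpha)             # test each outcome at 0.0025
```

    Less conservative procedures (Holm, false discovery rate control) are common alternatives; the point here is only the size of the uncorrected error rate.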

  3. Discovery of urinary biomarkers to discriminate between exogenous and semi-endogenous thiouracil in cattle: A parallel-like randomized design.

    Science.gov (United States)

    Van Meulebroek, Lieven; Wauters, Jella; Pomian, Beata; Vanden Bussche, Julie; Delahaut, Philippe; Fichant, Eric; Vanhaecke, Lynn

    2018-01-01

    In the European Union, the use of thyreostats for animal fattening purposes has been banned and monitoring plans have been established to detect potential abuse. However, this is not always straightforward, as thyreostats such as thiouracil may also have a semi-endogenous origin. Therefore, this study aimed at defining urinary metabolites which may aid in determining the origin of detected thiouracil. Hereto, a parallel-like randomized in vivo study was conducted in which calves (n = 8) and cows (n = 8) were subjected to either a control treatment, a rapeseed-enriched diet to induce semi-endogenous formation, or thiouracil treatment. Urine samples (n = 330) were assessed through metabolic fingerprinting, employing liquid chromatography and Q-Exactive™ Orbitrap mass spectrometry. Urinary fingerprints comprised up to 40,000 features, whereby multivariate discriminant analysis was able to point out significant metabolome differences between treatments (Q2(Y) ≥ 0.873). Using the validated models, a total of twelve metabolites (including thiouracil) were assigned marker potential. Combining these markers into age-dependent biomarker panels rendered a tool by which sample classification could be improved in comparison with thiouracil-based thresholds, during on-going thiouracil treatment (specificities ≥ 95.2% and sensitivities ≥ 85.7%), post-treatment (sensitivities ≥ 80% for ≥ 24 h after last administration), and simulated low-dose thiouracil treatment (exogenous thiouracil below 30 ng μL⁻¹). Moreover, the metabolic relevance of the revealed markers was supported by the suggested identities, for which a structural link with thiouracil could be determined in most cases. The proposed biomarker panels may contribute to more justified decision-making in monitoring thiouracil abuse.
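    The specificity and sensitivity figures quoted for the biomarker panels are standard confusion-matrix quantities. A sketch with hypothetical counts (not the study's data) that happen to reproduce the quoted thresholds:

```python
# How classification figures like those quoted above are computed.
# The counts below are invented for illustration, not taken from the study.
def sensitivity_specificity(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # treated animals correctly flagged
    specificity = tn / (tn + fp)   # untreated animals correctly cleared
    return sensitivity, specificity

# Hypothetical: 18 of 21 treated samples flagged, 20 of 21 controls cleared.
sens, spec = sensitivity_specificity(tp=18, fn=3, tn=20, fp=1)
print(f"sensitivity={sens:.1%} specificity={spec:.1%}")   # 85.7% / 95.2%
```

    In a monitoring context, specificity is usually weighted heavily, since a false positive accuses a compliant farm of thyreostat abuse.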

  4. Diabetes mellitus and abnormal glucose tolerance development after gestational diabetes: A three-year, prospective, randomized, clinical-based, Mediterranean lifestyle interventional study with parallel groups.

    Science.gov (United States)

    Pérez-Ferre, Natalia; Del Valle, Laura; Torrejón, Maria José; Barca, Idoya; Calvo, María Isabel; Matía, Pilar; Rubio, Miguel A; Calle-Pascual, Alfonso L

    2015-08-01

    Women with prior gestational diabetes mellitus (GDM) have a high risk of developing type 2 diabetes mellitus (DM2) in later life. The study aim was to evaluate the efficacy of a lifestyle intervention for the prevention of glucose disorders (impaired fasting glucose, impaired glucose tolerance or DM2) in women with prior GDM. A total of 260 women with prior GDM who presented with normal fasting plasma glucose at six to twelve weeks postpartum were randomized into two groups: a Mediterranean lifestyle intervention group (n = 130), who underwent an educational program on nutrition and a monitored physical activity program, and a control group (n = 130) with a conventional follow-up. A total of 237 women completed the three-year follow-up (126 in the intervention group and 111 in the control group). Their rates of glucose disorders, clinical and metabolic changes, and rates of adherence to the Mediterranean lifestyle were analyzed. Fewer women in the intervention group (42.8%) developed glucose disorders at the end of the three-year follow-up period compared with the control group (56.75%) (p < …). Lifestyle intervention was effective for the prevention of glucose disorders in women with prior GDM. Body weight gain and an unhealthy fat intake pattern were found to be the most predictive factors for the development of glucose disorders. Current Controlled Trials: ISRCTN24165302. http://www.controlled-trials.com/isrctn/pf/24165302. Copyright © 2014 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

  5. Standards for Clinical Trials in Male and Female Sexual Dysfunction: I. Phase I to Phase IV Clinical Trial Design.

    Science.gov (United States)

    Fisher, William A; Gruenwald, Ilan; Jannini, Emmanuele A; Lev-Sagie, Ahinoam; Lowenstein, Lior; Pyke, Robert E; Reisman, Yakov; Revicki, Dennis A; Rubio-Aurioles, Eusebio

    2016-12-01

    This series of articles outlines standards for clinical trials of treatments for male and female sexual dysfunctions, with a focus on research design and patient-reported outcome assessment. These articles consist of revision, updating, and integration of articles on standards for clinical trials in male and female sexual dysfunction from the 2010 International Consultation on Sexual Medicine developed by the authors as part of the 2015 International Consultation on Sexual Medicine. We are guided in this effort by several principles. In contrast to previous versions of these guidelines, we merge discussion of standards for clinical trials in male and female sexual dysfunction in an integrated approach that emphasizes the common foundational practices that underlie clinical trials in the two settings. We present a common expected standard for clinical trial design in male and female sexual dysfunction, a common rationale for the design of phase I to IV clinical trials, and common considerations for selection of study population and study duration in male and female sexual dysfunction. We present a focused discussion of fundamental principles in patient- (and partner-) reported outcome assessment and complete this series of articles with specific discussions of selected aspects of clinical trials that are unique to male and to female sexual dysfunction. Our consideration of standards for clinical trials in male and female sexual dysfunction attempts to embody sensitivity to existing and new regulatory guidance and to address implications of the evolution of the diagnosis of sexual dysfunction that have been brought forward in the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition. The first article in this series focuses on phase I to phase IV clinical trial design considerations. Subsequent articles in this series focus on the measurement of patient-reported outcomes, unique aspects of clinical trial design for men, and unique aspects of clinical

  6. Complexities and potential pitfalls of clinical study design and data analysis in assisted reproduction.

    Science.gov (United States)

    Patounakis, George; Hill, Micah J

    2018-06-01

    The purpose of the current review is to describe the common pitfalls in design and statistical analysis of reproductive medicine studies. It serves to guide both authors and reviewers toward reducing the incidence of spurious statistical results and erroneous conclusions. The large amount of data gathered in IVF cycles leads to problems with multiplicity, multicollinearity, and overfitting of regression models. Furthermore, the use of the word 'trend' to describe nonsignificant results has increased in recent years. Finally, methods to accurately account for female age in infertility research models are becoming more common and necessary. The pitfalls of study design and analysis reviewed provide a framework for authors and reviewers to approach clinical research in the field of reproductive medicine. By providing a more rigorous approach to study design and analysis, the literature in reproductive medicine will have more reliable conclusions that can stand the test of time.
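
The multiplicity problem this review warns about can be made concrete: when many independent hypothesis tests are each run at α = 0.05, the chance of at least one spurious "significant" finding grows rapidly, which is why corrections such as Bonferroni's are applied. A minimal numeric sketch (the numbers are generic, not taken from the review):

```python
# Family-wise error rate (FWER) for m independent tests at per-test alpha.
alpha, m = 0.05, 20

# P(at least one false positive) = 1 - P(no false positives in m tests)
family_wise = 1 - (1 - alpha) ** m      # roughly 0.64 for 20 tests

# Bonferroni correction: test each hypothesis at alpha / m instead
bonferroni_threshold = alpha / m        # 0.0025

print(f"FWER with {m} tests at alpha={alpha}: {family_wise:.3f}")
print(f"Bonferroni per-test threshold: {bonferroni_threshold}")
```

With 20 endpoints, an uncorrected analysis has nearly a two-in-three chance of at least one false positive, even when no true effect exists.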

  7. Towards Clinically Optimized MRI-guided Surgical Manipulator for Minimally Invasive Prostate Percutaneous Interventions: Constructive Design*

    Science.gov (United States)

    Eslami, Sohrab; Fischer, Gregory S.; Song, Sang-Eun; Tokuda, Junichi; Hata, Nobuhiko; Tempany, Clare M.; Iordachita, Iulian

    2013-01-01

    This paper undertakes the modular design and development of a minimally invasive surgical manipulator for MRI-guided transperineal prostate interventions. The design procedure is shaped by severe constraints: the manipulator must be MRI-compatible, introducing minimal artifact into the image, and must fit within the dimensional limits of the scanner bore. For the constructive design, the manipulator kinematics is optimized, the effective analytical needle workspace is derived, and a workflow for manual needle insertion is proposed. A finite element analysis is used to reinforce weak points of the mechanism against unavoidable external forces and to ensure minimal structural deformation. The procedure for attaching a sterile plastic drape to the robot manipulator is discussed. The robotic manipulator introduced here is aimed at clinical prostate biopsy and brachytherapy applications. PMID:24683502

  8. Design, development and deployment of a Diabetes Research Registry to facilitate recruitment in clinical research.

    Science.gov (United States)

    Tan, Meng H; Bernstein, Steven J; Gendler, Stephen; Hanauer, David; Herman, William H

    2016-03-01

    A major challenge in conducting clinical trials/studies is the timely recruitment of eligible subjects. Our aim is to develop a Diabetes Research Registry (DRR) to facilitate recruitment by matching potential subjects interested in research with approved clinical studies using study entry criteria abstracted from their electronic health records (EHR). A committee with expertise in diabetes, quality improvement, information technology, and informatics designed and developed the DRR. Using a hybrid approach, we identified and consented patients interested in research, abstracted their EHRs to assess common eligibility criteria, and contacted them about their interest in participating in specific studies. Investigators submit their requests with study entry criteria to the DRR which then provides a list of potential subjects who may be directly contacted for their study. The DRR meets all local, regional and federal regulatory requirements. After 5 years, the DRR has over 5000 registrants. About 30% have type 1 diabetes and 70% have type 2 diabetes. There are almost equal proportions of men and women. During this period, 31 unique clinical studies from 19 unique investigators requested lists of potential subjects for their studies. Eleven grant applications from 10 unique investigators used aggregated counts of potentially eligible subjects in their applications. The DRR matches potential subjects interested in research with approved clinical studies using study entry criteria abstracted from their EHR. By providing large lists of potentially eligible study subjects quickly, the DRR facilitated recruitment in 31 clinical studies. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis:version 4.0 developers manual.

    Energy Technology Data Exchange (ETDEWEB)

    Griffin, Joshua D. (Sandia National Laboratories, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L. (Sandia National Laboratories, Livermore, CA); Watson, Jean-Paul; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA); Giunta, Anthony Andrew; Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J. (Sandia National Laboratories, Livermore, CA); Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.
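
The DAKOTA records above all describe the same pattern: an iterator (optimizer, sampler, or parameter study) driving a black-box simulation through a generic interface. As a rough illustration of that pattern only (this is plain Python, not DAKOTA's actual API; `simulation` and `grid_parameter_study` are invented names):

```python
import itertools

def simulation(x1, x2):
    # Stand-in for an external simulation code evaluated at a design point.
    return (x1 - 0.3) ** 2 + (x2 - 0.7) ** 2

def grid_parameter_study(bounds, n):
    # Evaluate the black box on an n-point grid per variable and report
    # the best design point found -- the simplest "iterator" imaginable.
    axes = [[lo + i * (hi - lo) / (n - 1) for i in range(n)]
            for lo, hi in bounds]
    results = [((x1, x2), simulation(x1, x2))
               for x1, x2 in itertools.product(*axes)]
    return min(results, key=lambda r: r[1])

best_point, best_value = grid_parameter_study([(0, 1), (0, 1)], 11)
print(best_point, best_value)
```

Toolkits like DAKOTA generalize exactly this loop: the iterator and the simulation are decoupled behind an interface, so the same simulation can be driven by optimization, sampling-based uncertainty quantification, or sensitivity analysis without modification.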

  10. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Brian M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ebeida, Mohamed Salah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldred, Michael S [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jakeman, John Davis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stephens, John Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vigil, Dena M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wildey, Timothy Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bohnhoff, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hu, Kenneth T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dalbey, Keith R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bauman, Lara E [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hough, Patricia Diane [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-05-01

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.

  11. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, developers manual.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

  12. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, user's manual.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the DAKOTA software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  13. Non-Cartesian parallel imaging reconstruction.

    Science.gov (United States)

    Wright, Katherine L; Hamilton, Jesse I; Griswold, Mark A; Gulani, Vikas; Seiberlich, Nicole

    2014-11-01

    Non-Cartesian parallel imaging has played an important role in reducing data acquisition time in MRI. The use of non-Cartesian trajectories can enable more efficient coverage of k-space, which can be leveraged to reduce scan times. These trajectories can be undersampled to achieve even faster scan times, but the resulting images may contain aliasing artifacts. Just as Cartesian parallel imaging can be used to reconstruct images from undersampled Cartesian data, non-Cartesian parallel imaging methods can mitigate aliasing artifacts by using additional spatial encoding information in the form of the nonhomogeneous sensitivities of multi-coil phased arrays. This review will begin with an overview of non-Cartesian k-space trajectories and their sampling properties, followed by an in-depth discussion of several selected non-Cartesian parallel imaging algorithms. Three representative non-Cartesian parallel imaging methods will be described, including Conjugate Gradient SENSE (CG SENSE), non-Cartesian generalized autocalibrating partially parallel acquisition (GRAPPA), and Iterative Self-Consistent Parallel Imaging Reconstruction (SPIRiT). After a discussion of these three techniques, several potential promising clinical applications of non-Cartesian parallel imaging will be covered. © 2014 Wiley Periodicals, Inc.
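
The core of CG SENSE, mentioned in the abstract above, is solving the normal equations A^H A x = A^H b by conjugate gradients, where A applies coil sensitivities, a Fourier transform, and the sampling pattern. The sketch below illustrates that structure on a toy Cartesian-undersampled problem (a stand-in for simplicity; a real non-Cartesian implementation would replace the FFT with a NUFFT operator, and the coil maps, phantom, and sampling mask here are all invented):

```python
import numpy as np

N, C = 64, 4                               # image size N x N, C coils
yy, xx = np.mgrid[0:N, 0:N] / N

# Hypothetical smooth coil sensitivity maps (Gaussians at the corners)
sens = np.stack([np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2))
                 for cx, cy in [(0, 0), (0, 1), (1, 0), (1, 1)]])

img_true = np.zeros((N, N))
img_true[16:48, 16:48] = 1.0               # simple square phantom

mask = np.zeros((N, N), bool)
mask[::2, :] = True                        # R = 2 row undersampling
mask[N // 2 - 4:N // 2 + 4, :] = True      # fully sampled center band

def A(img):     # forward model: coil-weighted FFT, then sampling mask
    return np.fft.fft2(sens * img) * mask

def AH(ksp):    # adjoint: inverse FFT, combine with conjugate coil maps
    return np.sum(np.conj(sens) * np.fft.ifft2(ksp * mask), axis=0)

data = A(img_true)                         # simulated undersampled k-space

# Conjugate gradient on the normal equations A^H A x = A^H b
b = AH(data)
xk = np.zeros_like(b)
r = b.copy()
p = r.copy()
rs = np.vdot(r, r).real
for _ in range(50):
    Ap = AH(A(p))
    alpha = rs / np.vdot(p, Ap).real
    xk += alpha * p
    r -= alpha * Ap
    rs_new = np.vdot(r, r).real
    if rs_new < 1e-12:
        break
    p = r + (rs_new / rs) * p
    rs = rs_new

err = np.linalg.norm(xk.real - img_true) / np.linalg.norm(img_true)
```

The coil sensitivities are what make the undersampled system solvable: without them, the aliased pixel pairs would be indistinguishable and A^H A would be singular.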

  14. Is there a clinical benefit with a smooth compensator design compared with a plunged compensator design for passive scattered protons?

    Energy Technology Data Exchange (ETDEWEB)

    Tabibian, Art A., E-mail: art.tabibian@gmail.com [University of Texas School of Allied Health-Medical Dosimetry, Houston, TX (United States); Powers, Adam; Dolormente, Keith; Oommen, Sneha; Tiwari, Akhil [University of Texas School of Allied Health-Medical Dosimetry, Houston, TX (United States); Palmer, Matt [MD Anderson Department of Radiation Oncology, Houston, TX (United States); Zhu, Xiaorong R.; Li, Heng; Sahoo, Narayan; Wisdom, Paul [MD Anderson Department of Radiation Physics, Houston, TX (United States); Velasco, Kyle [Radiation Oncology Resources, Goshen, IN (United States); Erhart, Kevin; Stanley, Henry [Decimal, Inc, Sanford, FL (United States); Nguyen, Bao-Ngoc T. [MD Anderson Department of Radiation Oncology, Houston, TX (United States)

    2015-04-01

    In proton therapy, passive scattered proton plans use compensators to conform the dose to the distal surface of the planning volume. These devices are custom made from acrylic or wax for each treatment field using either a plunge-drilled or smooth-milled compensator design. The purpose of this study was to investigate whether there is a clinical benefit to generating passive scattered proton radiation treatment plans with the smooth compensator design. We generated 4 plans with different techniques using the smooth compensators. We chose 5 treatment sites and 5 patients per site to provide an adequate sample across the range of dosimetric effects. The plans were compared and evaluated using multicriteria analysis (MCA) plan quality metrics with the Quality Reports [EMR] technology by Canis Lupus LLC. The average absolute difference in dosimetric metrics from the plunged-depth plan ranged from −4.7 to +3.0, and the average absolute performance results ranged from −6.6% to +3%. The manually edited smooth compensator plan yielded the best dosimetric metric (+3.0) and performance (+3.0%) compared with the plunged-depth plan; it was also superior to the other smooth compensator plans. Our results indicate that there are multiple approaches to achieving plans with smooth compensators similar to the plunged-depth plans. The smooth compensators with manual compensator edits yielded equal or better target coverage and normal tissue (NT) doses compared with the other smooth compensator techniques. Further studies are under way to evaluate the robustness of the smooth compensator design.

  15. Parallelism and array processing

    International Nuclear Information System (INIS)

    Zacharov, V.

    1983-01-01

    Modern computing, as well as the historical development of computing, has been dominated by sequential monoprocessing. Yet there is the alternative of parallelism, where several processes may be in concurrent execution. This alternative is discussed in a series of lectures, in which the main developments involving parallelism are considered, both from the standpoint of computing systems and that of applications that can exploit such systems. The lectures seek to discuss parallelism in a historical context, and to identify all the main aspects of concurrency in computation right up to the present time. Included will be consideration of the important question as to what use parallelism might be in the field of data processing. (orig.)

  16. Mechanical design of a free-wheel clutch for the thermal engine of a parallel hybrid vehicle with thermal and electrical power-train; Conception mecanique d'un accouplement a roue libre pour le moteur thermique d'un vehicule hybride parallele thermique et electrique

    Energy Technology Data Exchange (ETDEWEB)

    Santin, J.J.

    2001-07-01

    This thesis deals with the design of a free-wheel clutch. This unit is intended to replace the automated dry single-plate clutch of a parallel hybrid car with thermal and electric power-train. Furthermore, the car is a single shaft zero emission vehicle fitted with a controlled gearbox. Chapter one focuses on the type of hybrid vehicle studied. It shows the need to isolate the engine from the rest of the drive train, depending on the driving conditions. Chapter two presents and compares the two alternatives: automated clutch and free-wheel. In order to develop the free-wheel option, the torsional vibrations in the automotive drive line had to be closely studied. It required the design of a specific modular tool, as presented in chapter three, with the help of MATLAB SIMULINK. Lastly, chapter four shows how this tool was used during the design stage and specifies the way to build it. The free-wheel is then to be fitted to a prototype hybrid vehicle, constructed by both the LAMIH and PSA. (author)

  17. Limited accessibility to designs and results of Japanese large-scale clinical trials for cardiovascular diseases.

    Science.gov (United States)

    Sawata, Hiroshi; Ueshima, Kenji; Tsutani, Kiichiro

    2011-04-14

    Clinical evidence is important for improving the treatment of patients by health care providers. In the study of cardiovascular diseases, large-scale clinical trials involving thousands of participants are required to evaluate the risks of cardiac events and/or death. The problems encountered in conducting the Japanese Acute Myocardial Infarction Prospective (JAMP) study highlighted the difficulties involved in obtaining the financial and infrastructural resources necessary for conducting large-scale clinical trials. The objectives of the current study were: 1) to clarify the current funding and infrastructural environment surrounding large-scale clinical trials in cardiovascular and metabolic diseases in Japan, and 2) to find ways to improve the environment surrounding clinical trials in Japan more generally. We examined clinical trials examining cardiovascular diseases that evaluated true endpoints and involved 300 or more participants using Pub-Med, Ichushi (by the Japan Medical Abstracts Society, a non-profit organization), websites of related medical societies, the University Hospital Medical Information Network (UMIN) Clinical Trials Registry, and clinicaltrials.gov at three points in time: 30 November, 2004, 25 February, 2007 and 25 July, 2009. We found a total of 152 trials that met our criteria for 'large-scale clinical trials' examining cardiovascular diseases in Japan. Of these, 72.4% were randomized controlled trials (RCTs). Of 152 trials, 9.2% of the trials examined more than 10,000 participants, and 42.8% examined between 1,000 and 10,000 participants. The number of large-scale clinical trials markedly increased from 2001 to 2004, but suddenly decreased in 2007, then began to increase again. Ischemic heart disease (39.5%) was the most common target disease. Most of the larger-scale trials were funded by private organizations such as pharmaceutical companies. The designs and results of 13 trials were not disclosed. To improve the quality of clinical

  18. Limited accessibility to designs and results of Japanese large-scale clinical trials for cardiovascular diseases

    Directory of Open Access Journals (Sweden)

    Tsutani Kiichiro

    2011-04-01

    Full Text Available Background: Clinical evidence is important for improving the treatment of patients by health care providers. In the study of cardiovascular diseases, large-scale clinical trials involving thousands of participants are required to evaluate the risks of cardiac events and/or death. The problems encountered in conducting the Japanese Acute Myocardial Infarction Prospective (JAMP) study highlighted the difficulties involved in obtaining the financial and infrastructural resources necessary for conducting large-scale clinical trials. The objectives of the current study were: 1) to clarify the current funding and infrastructural environment surrounding large-scale clinical trials in cardiovascular and metabolic diseases in Japan, and 2) to find ways to improve the environment surrounding clinical trials in Japan more generally. Methods: We examined clinical trials examining cardiovascular diseases that evaluated true endpoints and involved 300 or more participants using Pub-Med, Ichushi (by the Japan Medical Abstracts Society, a non-profit organization), websites of related medical societies, the University Hospital Medical Information Network (UMIN) Clinical Trials Registry, and clinicaltrials.gov at three points in time: 30 November, 2004, 25 February, 2007 and 25 July, 2009. Results: We found a total of 152 trials that met our criteria for 'large-scale clinical trials' examining cardiovascular diseases in Japan. Of these, 72.4% were randomized controlled trials (RCTs). Of the 152 trials, 9.2% examined more than 10,000 participants, and 42.8% examined between 1,000 and 10,000 participants. The number of large-scale clinical trials markedly increased from 2001 to 2004, but suddenly decreased in 2007, then began to increase again. Ischemic heart disease (39.5%) was the most common target disease. Most of the larger-scale trials were funded by private organizations such as pharmaceutical companies. The designs and results of 13 trials were not disclosed.

  19. The impact of oat (Avena sativa) consumption on biomarkers of renal function in patients with chronic kidney disease: A parallel randomized clinical trial.

    Science.gov (United States)

    Rouhani, Mohammad Hossein; Mortazavi Najafabadi, Mojgan; Surkan, Pamela J; Esmaillzadeh, Ahmad; Feizi, Awat; Azadbakht, Leila

    2018-02-01

    Animal studies report that oat (Avena sativa L) intake has favorable effects on kidney function. However, the effects of oat consumption have not been assessed in humans. The aim of this study was to examine the impact of oat intake on biomarkers of renal function in patients with chronic kidney disease (CKD). Fifty-two patients with CKD were randomly assigned to a control group (recommended to reduce intake of dietary protein, phosphorus, sodium and potassium) or an oat consumption group (given nutritional recommendations for controls +50 g/day oats). Blood urea nitrogen (BUN), serum creatinine (SCr), urine creatinine, serum albumin, serum potassium, parathyroid hormone (PTH), serum klotho and urine protein concentration were measured at baseline and after an eight-week intervention. Creatinine clearance was calculated using urine creatinine concentration. Within group analysis showed a significant increase in BUN (P = 0.02) and serum potassium (P = 0.01) and a marginally significant increment in SCr (P = 0.08) among controls. However, changes in the oat group were not significant. In a multivariate adjusted model, we observed a significant difference in change of serum potassium (-0.03 mEq/L for oat group and 0.13 mEq/L for control group; P = 0.01) and a marginally significant difference in change of serum albumin (0.01 g/dl for oat group and -0.08 for control group; P = 0.08) between the two groups. There was no change in PTH concentration. Intake of oats may have a beneficial effect on serum albumin and serum potassium in patients with CKD. Present study registered under IRCT.ir identifier no. IRCT2015050414551N2. Copyright © 2016 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

  20. Development of parallel/serial program analyzing tool

    International Nuclear Information System (INIS)

    Watanabe, Hiroshi; Nagao, Saichi; Takigawa, Yoshio; Kumakura, Toshimasa

    1999-03-01

    Japan Atomic Energy Research Institute has been developing 'KMtool', a parallel/serial program analyzing tool, in order to promote the parallelization of science and engineering computation programs. KMtool analyzes the performance of programs written in FORTRAN77 and MPI, and it reduces the effort required for parallelization. This paper describes the development purpose, design, utilization and evaluation of KMtool. (author)

  1. Sources of Safety Data and Statistical Strategies for Design and Analysis: Clinical Trials.

    Science.gov (United States)

    Zink, Richard C; Marchenko, Olga; Sanchez-Kam, Matilde; Ma, Haijun; Jiang, Qi

    2018-03-01

    There has been an increased emphasis on the proactive and comprehensive evaluation of safety endpoints to ensure patient well-being throughout the medical product life cycle. In fact, depending on the severity of the underlying disease, it is important to plan for a comprehensive safety evaluation at the start of any development program. Statisticians should be intimately involved in this process and contribute their expertise to study design, safety data collection, analysis, reporting (including data visualization), and interpretation. In this manuscript, we review the challenges associated with the analysis of safety endpoints and describe the safety data that are available to influence the design and analysis of premarket clinical trials. We share our recommendations for the statistical and graphical methodologies necessary to appropriately analyze, report, and interpret safety outcomes, and we discuss the advantages and disadvantages of safety data obtained from clinical trials compared to other sources. Clinical trials are an important source of safety data that contribute to the totality of safety information available to generate evidence for regulators, sponsors, payers, physicians, and patients. This work is a result of the efforts of the American Statistical Association Biopharmaceutical Section Safety Working Group.

  2. Feasibility of streamlining an interactive Bayesian-based diagnostic support tool designed for clinical practice

    Science.gov (United States)

    Chen, Po-Hao; Botzolakis, Emmanuel; Mohan, Suyash; Bryan, R. N.; Cook, Tessa

    2016-03-01

    In radiology, diagnostic errors occur either through the failure of detection or incorrect interpretation. Errors are estimated to occur in 30-35% of all exams and contribute to 40-54% of medical malpractice litigations. In this work, we focus on reducing incorrect interpretation of known imaging features. Existing literature categorizes the cognitive biases that can lead a radiologist to an incorrect diagnosis despite correct recognition of the abnormal imaging features: anchoring bias, framing effect, availability bias, and premature closure. Computational methods make a unique contribution, as they do not exhibit the same cognitive biases as a human. Bayesian networks formalize the diagnostic process. They modify pre-test diagnostic probabilities using clinical and imaging features, arriving at a post-test probability for each possible diagnosis. To translate Bayesian networks to clinical practice, we implemented an entirely web-based open-source software tool. In this tool, the radiologist first selects a network of choice (e.g. basal ganglia). Then, large, clearly labeled buttons for salient imaging features are displayed on the screen, serving both as a checklist and as input. As the radiologist inputs the value of an extracted imaging feature, the conditional probabilities of each possible diagnosis are updated. The software presents its level of diagnostic discrimination using a Pareto distribution chart, updated with each additional imaging feature. Active collaboration with the clinical radiologist is a feasible approach to software design and leads to design decisions that closely couple the complex mathematics of conditional probability in Bayesian networks with practice.
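
The pre-test-to-post-test update that such a tool performs can be sketched in a few lines. The example below uses a naive Bayes simplification (features assumed conditionally independent given the diagnosis); the diagnoses, features, and probability values are invented for illustration and are not from the tool described above:

```python
# Toy posterior update over candidate diagnoses given imaging findings.
# All priors and likelihoods below are illustrative numbers only.
priors = {"tumor": 0.2, "infarct": 0.5, "hemorrhage": 0.3}
likelihood = {  # P(feature present | diagnosis)
    "restricted_diffusion": {"tumor": 0.3, "infarct": 0.9, "hemorrhage": 0.2},
    "enhancement":          {"tumor": 0.8, "infarct": 0.2, "hemorrhage": 0.3},
}

def update(posterior, feature, present=True):
    # Multiply each diagnosis probability by the feature likelihood
    # (or its complement if the feature is absent), then renormalize.
    unnorm = {}
    for dx, p in posterior.items():
        l = likelihood[feature][dx]
        unnorm[dx] = p * (l if present else 1 - l)
    z = sum(unnorm.values())
    return {dx: v / z for dx, v in unnorm.items()}

post = update(priors, "restricted_diffusion", present=True)
post = update(post, "enhancement", present=False)
print(post)
```

Each checklist button click in the tool corresponds to one such `update` call, so the ranked differential refreshes after every imaging feature is entered.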

  3. Assessing Cognitive Function in Bipolar Disorder: Challenges and Recommendations for Clinical Trial Design

    Science.gov (United States)

    Burdick, Katherine E.; Ketter, Terence A.; Goldberg, Joseph F.; Calabrese, Joseph R.

    2015-01-01

    OBJECTIVE Neurocognitive impairment in schizophrenia has been recognized for more than a century. In contrast, only recently have significant neurocognitive deficits been recognized in bipolar disorder. Converging data suggest the importance of cognitive problems in relation to quality of life in bipolar disorder, highlighting the need for treatment and prevention efforts targeting cognition in bipolar patients. Future treatment trials targeting cognitive deficits will be met with methodological challenges due to the inherent complexity and heterogeneity of the disorder, including significant diagnostic comorbidities, the episodic nature of the illness, frequent use of polypharmacy, cognitive heterogeneity, and a lack of consensus regarding measurement of cognition and outcome in bipolar patients. Guidelines for use in designing future trials are needed. PARTICIPANTS The members of the consensus panel (each of the bylined authors) were selected based upon their expertise in bipolar disorder. Dr. Burdick is a neuropsychologist who has studied cognition in this illness for 15 years; Drs. Ketter, Calabrese, and Goldberg each bring considerable expertise in the treatment of bipolar disorder both within and outside of controlled clinical trials. This consensus statement was derived from work together at scientific meetings (e.g. symposium presentation at the 2014 Annual Meeting of the American Society of Clinical Psychopharmacology, among others) and ongoing discussions by conference call. With the exception of the public presentations on this topic, these meetings were closed to outside participants. EVIDENCE A literature review was undertaken by the authors to identify illness-specific challenges relevant to the design and conduct of treatment trials targeting neurocognition in bipolar disorder. Expert opinion from each of the authors guided the consensus recommendations. CONSENSUS PROCESS Consensus recommendations, reached by unanimous opinion of the authors, are

  4. Bayer image parallel decoding based on GPU

    Science.gov (United States)

    Hu, Rihui; Xu, Zhiyong; Wei, Yuxing; Sun, Shaohua

    2012-11-01

    In the photoelectrical tracking system, the Bayer image is decompressed by a traditional, CPU-based method. However, this is too slow when the images become large, for example, 2K×2K×16bit. In order to accelerate Bayer image decoding, this paper introduces a parallel speedup method for NVIDIA's Graphics Processing Unit (GPU), which supports the CUDA architecture. The decoding procedure can be divided into three parts: the first is a serial part, the second is a task-parallel part, and the last is a data-parallel part including inverse quantization, the inverse discrete wavelet transform (IDWT), and image post-processing. To reduce the execution time, the task-parallel part is optimized with OpenMP techniques. The data-parallel part gains efficiency by executing on the GPU as a CUDA parallel program. The optimization techniques include instruction optimization, shared memory access optimization, coalesced memory access optimization, and texture memory optimization. In particular, the IDWT can be significantly sped up by rewriting the 2D (two-dimensional) serial IDWT as a 1D parallel IDWT. In experiments with a 1K×1K×16bit Bayer image, the data-parallel part is more than 10 times faster than the CPU-based implementation. Finally, a CPU+GPU heterogeneous decompression system was designed. The experimental results show that it achieves a 3 to 5 times speed increase compared to the CPU serial method.
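    The row/column decomposition that makes the IDWT data-parallel can be illustrated with a minimal single-level Haar transform. This is an assumed simplification: the paper's actual wavelet and CUDA kernels are not given in the abstract. The point is that every row and every column is processed independently, which is exactly what maps onto GPU threads:

```python
import numpy as np

def haar_1d_forward(x):
    """Single-level 1D Haar DWT along the last axis: [approx | detail]."""
    even, odd = x[..., 0::2], x[..., 1::2]
    return np.concatenate([(even + odd) / 2, (even - odd) / 2], axis=-1)

def haar_1d_inverse(c):
    """Single-level 1D inverse Haar transform along the last axis."""
    n = c.shape[-1] // 2
    a, d = c[..., :n], c[..., n:]
    out = np.empty_like(c)
    out[..., 0::2] = a + d  # even samples
    out[..., 1::2] = a - d  # odd samples
    return out

def dwt2(img):
    """2D forward transform: a 1D pass over rows, then over columns."""
    return haar_1d_forward(haar_1d_forward(img).T).T

def idwt2(coeffs):
    """2D inverse transform: undo columns first, then rows. Each pass is
    a batch of independent 1D transforms, i.e. embarrassingly parallel."""
    return haar_1d_inverse(haar_1d_inverse(coeffs.T).T)
```

    In a CUDA implementation, each row (or column) of a pass would be assigned to its own thread block, so the serial 2D loop becomes a grid of independent 1D transforms.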

  5. Augmenting Predictive Modeling Tools with Clinical Insights for Care Coordination Program Design and Implementation.

    Science.gov (United States)

    Johnson, Tracy L; Brewer, Daniel; Estacio, Raymond; Vlasimsky, Tara; Durfee, Michael J; Thompson, Kathy R; Everhart, Rachel M; Rinehart, Deborath J; Batal, Holly

    2015-01-01

    The Center for Medicare and Medicaid Innovation (CMMI) awarded Denver Health's (DH) integrated, safety net health care system $19.8 million to implement a "population health" approach into the delivery of primary care. This major practice transformation builds on the Patient Centered Medical Home (PCMH) and Wagner's Chronic Care Model (CCM) to achieve the "Triple Aim": improved health for populations, care to individuals, and lower per capita costs. This paper presents a case study of how DH integrated published predictive models and front-line clinical judgment to implement a clinically actionable, risk stratification of patients. This population segmentation approach was used to deploy enhanced care team staff resources and to tailor care-management services to patient need, especially for patients at high risk of avoidable hospitalization. Developing, implementing, and gaining clinical acceptance of the Health Information Technology (HIT) solution for patient risk stratification was a major grant objective. In addition to describing the Information Technology (IT) solution itself, we focus on the leadership and organizational processes that facilitated its multidisciplinary development and ongoing iterative refinement, including the following: team composition, target population definition, algorithm rule development, performance assessment, and clinical-workflow optimization. We provide examples of how dynamic business intelligence tools facilitated clinical accessibility for program design decisions by enabling real-time data views from a population perspective down to patient-specific variables. We conclude that population segmentation approaches that integrate clinical perspectives with predictive modeling results can better identify high opportunity patients amenable to medical home-based, enhanced care team interventions.

  6. Design and analysis of a health care clinic for homeless people using simulations.

    Science.gov (United States)

    Reynolds, Jared; Zeng, Zhen; Li, Jingshan; Chiang, Shu-Yin

    2010-01-01

    Improving quality of care is important in health care management. For health care clinics, reducing patient waiting time and improving throughput with efficient utilization of the workforce are important issues in achieving better quality of care. This paper introduces a simulation study on the design and analysis of a health clinic for homeless patients in Lexington, Kentucky, USA. Using the simulation model, the patient flow of the clinic is simulated and quality of care is analyzed for different staffing levels. In addition, the dependence on the distributions of service times is investigated, and the impact of service time variability on quality of care (e.g. patient waiting time) is analyzed. The staffing levels and utilizations necessary to reduce patient waiting times and improve throughput to achieve better quality of care are obtained. It is also shown that the system performance depends primarily on the mean and coefficient of variation, rather than the complete distribution, of service times. Furthermore, a piece-wise linear approximation formula is proposed so that patient waiting time in the clinic can be estimated for any variability with only two simulations. The simulation method may need long model development time and long simulation execution time for complex systems. The quality of care delivery in a health care clinic can be evaluated using simulations. The results presented in the paper provide an easier approach for medical practitioners to evaluate different scenarios, examine needed resources, and carry out what-if analysis to predict the impact of any changes in the system, and to determine an optimal system configuration. The paper shows that such models provide a quantitative tool for clinic operations and management to achieve better care quality. Moreover, the model can easily be adapted to other health care facilities, such as hospitals, emergency rooms, operating rooms, and the supply chain in the health care industry.
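    The two-simulation estimate described above can be sketched as a linear interpolation in the squared coefficient of variation. The exact functional form used in the paper is not given in the abstract; linearity in CV² is an assumption echoing standard queueing heuristics (e.g. Kingman's formula), in which delay scales with squared variability:

```python
def estimate_wait(cv, cv_lo, wait_lo, cv_hi, wait_hi):
    """Piece-wise linear estimate of mean patient waiting time at an
    arbitrary service-time coefficient of variation `cv`, anchored on
    two simulation runs at (cv_lo, wait_lo) and (cv_hi, wait_hi).
    Interpolation is linear in cv**2 (assumed form, not the paper's)."""
    t = (cv**2 - cv_lo**2) / (cv_hi**2 - cv_lo**2)
    return wait_lo + t * (wait_hi - wait_lo)
```

    For instance, with simulated waits of 2 minutes at CV = 0 (deterministic service) and 10 minutes at CV = 1 (exponential service), the estimate at CV = 0.5 is 4 minutes; any other CV needs no further simulation runs.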

  7. Adaptive Clinical Trials: Advantages and Disadvantages of Various Adaptive Design Elements.

    Science.gov (United States)

    Korn, Edward L; Freidlin, Boris

    2017-06-01

    There is a wide range of adaptive elements of clinical trial design (some old and some new), with differing advantages and disadvantages. Classical interim monitoring, which adapts the design based on early evidence of superiority or futility of a treatment arm, has long been known to be extremely useful. A more recent application of interim monitoring is in the use of phase II/III designs, which can be very effective (especially in the setting of multiple experimental treatments and a reliable intermediate end point) but do have the cost of having to commit earlier to the phase III question than if separate phase II and phase III trials were performed. Outcome-adaptive randomization is an older technique that has recently regained attention; it increases trial complexity and duration without offering substantial benefits to the patients in the trial. The use of adaptive trials with biomarkers is new and has great potential for efficiently identifying patients who will be helped most by specific treatments. Master protocols in which trial arms and treatment questions are added to an ongoing trial can be especially efficient in the biomarker setting, where patients are screened for entry into different subtrials based on evolving knowledge about targeted therapies. A discussion of three recent adaptive clinical trials (BATTLE-2, I-SPY 2, and FOCUS4) highlights the issues. Published by Oxford University Press 2017. This work is written by US Government employees and is in the public domain in the US.

  8. Methodology Series Module 9: Designing Questionnaires and Clinical Record Forms - Part II.

    Science.gov (United States)

    Setia, Maninder Singh

    2017-01-01

    This article is a continuation of the previous module on designing questionnaires and clinical record form in which we have discussed some basic points about designing the questionnaire and clinical record forms. In this section, we will discuss the reliability and validity of questionnaires. The different types of validity are face validity, content validity, criterion validity, and construct validity. The different types of reliability are test-retest reliability, inter-rater reliability, and intra-rater reliability. Some of these parameters are assessed by subject area experts. However, statistical tests should be used for evaluation of other parameters. Once the questionnaire has been designed, the researcher should pilot test the questionnaire. The items in the questionnaire should be changed based on the feedback from the pilot study participants and the researcher's experience. After the basic structure of the questionnaire has been finalized, the researcher should assess the validity and reliability of the questionnaire or the scale. If an existing standard questionnaire is translated in the local language, the researcher should assess the reliability and validity of the translated questionnaire, and these values should be presented in the manuscript. The decision to use a self- or interviewer-administered, paper- or computer-based questionnaire depends on the nature of the questions, literacy levels of the target population, and resources.
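    As an illustration of the statistical evaluation the module calls for, inter-rater reliability for categorical questionnaire items is commonly quantified with Cohen's kappa. This is a standard choice rather than one the article prescribes:

```python
def cohen_kappa(rater1, rater2):
    """Cohen's kappa for two raters' categorical scores:
    observed agreement corrected for chance agreement."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    # observed proportion of agreement
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    # chance agreement expected from each rater's marginal distribution
    p_exp = sum((rater1.count(c) / n) * (rater2.count(c) / n)
                for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)
```

    Kappa is 1 for perfect agreement, 0 for agreement no better than chance, and negative for systematic disagreement; test-retest reliability of continuous scale scores would instead use a correlation coefficient.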

  9. Methodology Series Module 9: Designing Questionnaires and Clinical Record Forms – Part II

    Science.gov (United States)

    Setia, Maninder Singh

    2017-01-01

    This article is a continuation of the previous module on designing questionnaires and clinical record form in which we have discussed some basic points about designing the questionnaire and clinical record forms. In this section, we will discuss the reliability and validity of questionnaires. The different types of validity are face validity, content validity, criterion validity, and construct validity. The different types of reliability are test-retest reliability, inter-rater reliability, and intra-rater reliability. Some of these parameters are assessed by subject area experts. However, statistical tests should be used for evaluation of other parameters. Once the questionnaire has been designed, the researcher should pilot test the questionnaire. The items in the questionnaire should be changed based on the feedback from the pilot study participants and the researcher's experience. After the basic structure of the questionnaire has been finalized, the researcher should assess the validity and reliability of the questionnaire or the scale. If an existing standard questionnaire is translated in the local language, the researcher should assess the reliability and validity of the translated questionnaire, and these values should be presented in the manuscript. The decision to use a self- or interviewer-administered, paper- or computer-based questionnaire depends on the nature of the questions, literacy levels of the target population, and resources. PMID:28584367

  10. Parallel phase model : a programming model for high-end parallel machines with manycores.

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Junfeng (Syracuse University, Syracuse, NY); Wen, Zhaofang; Heroux, Michael Allen; Brightwell, Ronald Brian

    2009-04-01

    This paper presents a parallel programming model, Parallel Phase Model (PPM), for next-generation high-end parallel machines based on a distributed memory architecture consisting of a networked cluster of nodes with a large number of cores on each node. PPM has a unified high-level programming abstraction that facilitates the design and implementation of parallel algorithms to exploit both the parallelism of the many cores and the parallelism at the cluster level. The programming abstraction will be suitable for expressing both fine-grained and coarse-grained parallelism. It includes a few high-level parallel programming language constructs that can be added as an extension to an existing (sequential or parallel) programming language such as C; and the implementation of PPM also includes a light-weight runtime library that runs on top of an existing network communication software layer (e.g. MPI). Design philosophy of PPM and details of the programming abstraction are also presented. Several unstructured applications that inherently require high-volume random fine-grained data accesses have been implemented in PPM with very promising results.

  11. The kpx, a program analyzer for parallelization

    International Nuclear Information System (INIS)

    Matsuyama, Yuji; Orii, Shigeo; Ota, Toshiro; Kume, Etsuo; Aikawa, Hiroshi.

    1997-03-01

    The kpx is a program analyzer, developed as a common technological basis for promoting parallel processing. The kpx consists of three tools. The first is ktool, which shows how much execution time is spent in program segments. The second is ptool, which shows parallelization overhead on the Paragon system. The last is xtool, which shows parallelization overhead on the VPP system. The kpx, designed to work for any FORTRAN code on any UNIX computer, was confirmed to work well after testing on Paragon, SP2, SR2201, VPP500, VPP300, Monte-4, SX-4 and T90. (author)

  12. Multistage parallel-serial time averaging filters

    International Nuclear Information System (INIS)

    Theodosiou, G.E.

    1980-01-01

    Here, a new time averaging circuit design, the 'parallel filter', is presented, which can reduce the time jitter introduced in time measurements using counters of large dimensions. This parallel filter can be considered as a single-stage unit circuit which can be repeated an arbitrary number of times in series, thus providing a parallel-serial filter as a result. The main advantages of such a filter over a serial one are much lower electronic gate jitter and time delay for the same amount of total time uncertainty reduction. (orig.)

  13. Manufacturing the truth: From designing clinical trials to publishing trial data.

    Science.gov (United States)

    Whitstock, Margaret

    2018-01-01

    This paper expands on some of the points made by Deepak Natarajan on techniques used in designing clinical trials of new drugs to ensure favourable outcomes. It also considers the nexus between the manufacturers of new drugs and the publishers of medical journals in which edited versions of these favourable outcomes are presented to the medical fraternity. The argument will be illustrated by referring to the clinical trials of rofecoxib (Vioxx®) and etoricoxib (Arcoxia®). Both these drugs are COX-2 selective non-steroidal anti-inflammatory drugs (NSAIDs) manufactured by Merck and Co. Because of the unparalleled access to Merck's internal confidential documents, due to the subpoenaing of these documents by government and private individuals in civil and criminal actions, we are still learning about the company's unconscionable acts. What we learn can inform our judgement concerning published reports of both new and old drugs.

  14. Designing a clinical dashboard to fill information gaps in the emergency department.

    Science.gov (United States)

    Swartz, Jordan L; Cimino, James J; Fred, Matthew R; Green, Robert A; Vawdrey, David K

    2014-01-01

    Data fragmentation within electronic health records causes gaps in the information readily available to clinicians. We investigated the information needs of emergency medicine clinicians in order to design an electronic dashboard to fill information gaps in the emergency department. An online survey was distributed to all emergency medicine physicians at a large, urban academic medical center. The survey response rate was 48% (52/109). The clinical information items reported to be most helpful while caring for patients in the emergency department were vital signs, electrocardiogram (ECG) reports, previous discharge summaries, and previous lab results. Brief structured interviews were also conducted with 18 clinicians during their shifts in the emergency department. From the interviews, three themes emerged: 1) difficulty accessing vital signs, 2) difficulty accessing point-of-care tests, and 3) difficulty comparing the current ECG with the previous ECG. An emergency medicine clinical dashboard was developed to address these difficulties.

  15. Psychotherapy with traumatised refugees – the design of a randomised clinical trial

    DEFF Research Database (Denmark)

    Vindbjerg, Erik; Carlsson, Jessica Mariana; Klimpke, Christoph Axel

    2014-01-01

    There is little evidence as to which kind of psychotherapy is the most effective in the treatment of traumatised refugees. At the Competence Center for Transcultural Psychiatry, a series of clinical trials have been conducted since 2008. The first results are pending publication. The aim of this paper is to discuss some of the challenges in adapting Cognitive Behavioural Therapy (CBT) to the treatment of traumatised refugees, as well as describe a randomised clinical trial designed to test two such adaptations. In the described trial one group receives CBT with a focus on cognitive restructuring while the other group receives CBT focusing on Stress Management. A main goal of this setup is to test whether some, perhaps even most, of the traumatised refugees referred to treatment may benefit from a more direct focus on current stress, and its alleviation through simple, repetitive

  16. A Herbal Medicine, Gongjindan, in Subjects with Chronic Dizziness (GOODNESS Study): Study Protocol for a Prospective, Multicenter, Randomized, Double-Blind, Placebo-Controlled, Parallel-Group, Clinical Trial for Effectiveness, Safety, and Cost-Effectiveness

    Directory of Open Access Journals (Sweden)

    Seungwon Shin

    2017-01-01

    This study protocol aims to explore the effectiveness, safety, and cost-effectiveness of a herbal medication, Gongjindan (GJD), in patients with chronic dizziness. This will be a prospective, multicenter, randomized, double-blind, placebo-controlled, parallel-group, clinical trial. Seventy-eight patients diagnosed with Meniere’s disease, psychogenic dizziness, or dizziness of unknown cause will be randomized and allocated to either a GJD or a placebo group in a 1 : 1 ratio. Participants will be orally given 3.75 g GJD or placebo in pill form once a day for 56 days. The primary outcome measure will be the Dizziness Handicap Inventory score. Secondary outcome measures will be as follows: severity (mean vertigo scale and visual analogue scale) and frequency of dizziness, balance function (Berg Balance Scale), fatigue (Fatigue Severity Scale) and deficiency pattern/syndrome (qi blood yin yang-deficiency questionnaire) levels, and depression (Korean version of Beck’s Depression Inventory) and anxiety (State-Trait Anxiety Inventory) levels. To assess safety, adverse events, including laboratory test results, will be monitored. Further, the incremental cost-effectiveness ratio will be calculated based on quality-adjusted life years (from the EuroQoL five dimensions questionnaire) and medical expenses. Data will be statistically analyzed at a significance level of 0.05 (two-sided). This trial is registered with ClinicalTrials.gov NCT03219515, in July 2017.

  17. The STAPL Parallel Graph Library

    KAUST Repository

    Harshvardhan,; Fidel, Adam; Amato, Nancy M.; Rauchwerger, Lawrence

    2013-01-01

    This paper describes the stapl Parallel Graph Library, a high-level framework that abstracts the user from data-distribution and parallelism details and allows them to concentrate on parallel graph algorithm development. It includes a customizable

  18. Parallel image encryption algorithm based on discretized chaotic map

    International Nuclear Information System (INIS)

    Zhou Qing; Wong Kwokwo; Liao Xiaofeng; Xiang Tao; Hu Yue

    2008-01-01

    Recently, a variety of chaos-based algorithms were proposed for image encryption. Nevertheless, none of them works efficiently in parallel computing environment. In this paper, we propose a framework for parallel image encryption. Based on this framework, a new algorithm is designed using the discretized Kolmogorov flow map. It fulfills all the requirements for a parallel image encryption algorithm. Moreover, it is secure and fast. These properties make it a good choice for image encryption on parallel computing platforms
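    The kind of invertible, block-independent pixel permutation such a framework relies on can be sketched with the discretized Arnold cat map as a stand-in (the paper uses the discretized Kolmogorov flow map, whose details are not in the abstract). Because the permutation is a bijection and each block is permuted independently, blocks can be dispatched to separate workers or GPU threads:

```python
import numpy as np

def cat_map_permute(block, iterations=3):
    """Scatter pixels of an N x N block with the discretized Arnold cat
    map (x, y) -> (x + y, x + 2y) mod N, applied `iterations` times."""
    n = block.shape[0]
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    out = block
    for _ in range(iterations):
        nxt = np.empty_like(out)
        nxt[(x + y) % n, (x + 2 * y) % n] = out[x, y]  # scatter
        out = nxt
    return out

def cat_map_restore(block, iterations=3):
    """Invert cat_map_permute by gathering through the same map
    the same number of times."""
    n = block.shape[0]
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    out = block
    for _ in range(iterations):
        out = out[(x + y) % n, (x + 2 * y) % n]  # gather
    return out
```

    A real cipher would combine such a permutation stage with a diffusion stage; this sketch only shows the chaotic-map permutation and its exact inverse.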

  19. Managed ventricular pacing vs. conventional dual-chamber pacing for elective replacements: the PreFER MVP study: clinical background, rationale, and design.

    Science.gov (United States)

    Quesada, Aurelio; Botto, Gianluca; Erdogan, Ali; Kozak, Milan; Lercher, Peter; Nielsen, Jens Cosedis; Piot, Olivier; Ricci, Renato; Weiss, Christian; Becker, Daniel; Wetzels, Gwenn; De Roy, Luc

    2008-03-01

    Several clinical studies have shown that, in patients with intact atrioventricular (AV) conduction, unnecessary chronic right ventricular (RV) pacing can be detrimental. The managed ventricular pacing (MVP) algorithm is designed to give preference to spontaneous AV conduction, thus minimizing RV pacing. The clinical outcomes of MVP are being studied in several ongoing trials in patients undergoing a first device implantation, but it is unknown to what extent MVP is beneficial in patients with a history of ventricular pacing. The purpose of the Prefer for Elective Replacement MVP (PreFER MVP) study is to assess the superiority of the MVP algorithm to conventional pacemaker and implantable cardioverter-defibrillator programming in terms of freedom from hospitalization for cardiovascular causes in a population of patients exposed to long periods of ventricular pacing. PreFER MVP is a prospective, 1:1 parallel, randomized (MVP ON/MVP OFF), single-blinded multi-centre trial. The study population consists of patients with more than 40% ventricular pacing documented with their previous device. Approximately, 600 patients will be randomized and followed for at least 24 months. The primary endpoint comprises cardiovascular hospitalization. The PreFER MVP trial is the first large prospective randomized clinical trial evaluating the effect of MVP in patients with a history of RV pacing.

  20. Using clinical simulation centers to test design interventions: a pilot study of lighting and color modifications.

    Science.gov (United States)

    Gray, Whitney Austin; Kesten, Karen S; Hurst, Stephen; Day, Tama Duffy; Anderko, Laura

    2012-01-01

    The aim of this pilot study was to test design interventions such as lighting, color, and spatial color patterning on nurses' stress, alertness, and satisfaction, and to provide an example of how clinical simulation centers can be used to conduct research. The application of evidence-based design research in healthcare settings requires a transdisciplinary approach. Integrating approaches from multiple fields in real-life settings often proves time consuming and experimentally difficult. However, forums for collaboration such as clinical simulation centers may offer a solution. In these settings, identical operating and patient rooms are used to deliver simulated patient care scenarios using automated mannequins. Two identical rooms were modified in the clinical simulation center. Nurses spent 30 minutes in each room performing simulated cardiac resuscitation. Subjective measures of nurses' stress, alertness, and satisfaction were collected and compared between settings and across time using matched-pair t-test analysis. Nurses reported feeling less stressed after exposure to the experimental room than nurses who were exposed to the control room (2.22, p = .03). Scores post-session indicated a significant reduction in stress and an increase in alertness after exposure to the experimental room as compared to the control room, with significance levels below .10 (change in stress scores: 3.44, p = .069; change in alertness scores: 3.6, p = .071). This study reinforces the use of validated survey tools to measure stress, alertness, and satisfaction. Results support human-centered design approaches by evaluating the effect on nurses in an experimental setting.
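    The matched-pair analysis the study relies on can be reproduced with a hand-rolled paired t statistic. This is a generic sketch; the study's raw scores are not available, so the example data below are illustrative only:

```python
from math import sqrt

def paired_t(before, after):
    """t statistic for matched-pair (within-subject) differences,
    i.e. mean difference divided by its standard error."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / sqrt(var / n)  # standard error of the mean difference
```

    The resulting t value is compared against the t distribution with n - 1 degrees of freedom to obtain the p-values reported in studies like this one.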

  1. Study design in clinical radiology; Studiendesign in der klinisch radiologischen Forschung

    Energy Technology Data Exchange (ETDEWEB)

    Knopp, M.V.; Floemer, F.; Zuna, I.; Kaick, G. van [Deutsches Krebsforschungszentrum (DKFZ) Heidelberg (Germany). Forschungsschwerpunkt Radiologische Diagnostik und Therapie

    1998-04-01

    Purpose: To review important aspects of study design in clinical radiology and to introduce the reader to the requirements of Good Clinical Practice (GCP). Methods: The European guidelines for GCP, the Declaration of Helsinki, the differentiation into study phases and the authors' own experience in open and sponsored clinical trials are the basis of this analysis. Results: Guidelines such as GCP do not limit scientific freedom in research but define high standards for the well-being of patients and volunteers as well as guaranteeing scientific honesty. The benefits of defined data monitoring and the necessity of a prospective statistical concept are frequently underestimated. Conclusion: Correct study design has to be expected in radiology too. High standards guarantee the accuracy and honesty of scientific studies. Only this can warrant the value for the patient of radiological diagnostics and therapy. (orig.)

  2. Healthy School, Happy School: Design and Protocol for a Randomized Clinical Trial Designed to Prevent Weight Gain in Children.

    Science.gov (United States)

    Schuh, Daniela Schneid; Goulart, Maíra Ribas; Barbiero, Sandra Mari; Sica, Caroline D'Azevedo; Borges, Raphael; Moraes, David William; Pellanda, Lucia Campos

    2017-06-01

    Schools have become a key figure for the promotion of health and obesity interventions, bringing the development of critical awareness to the construction and promotion of a healthy diet, physical activity, and the monitoring of nutritional status in childhood and adolescence. To describe a study protocol to evaluate the effectiveness of an intervention designed to improve knowledge of food choices in the school environment. This is a cluster-randomized, parallel, two-arm study conducted in public elementary and middle schools in Brazil. Participants will be children and adolescents between the ages of 5 and 15 years, of both genders. The interventions will focus on changes in lifestyle, physical activity, and nutritional education. Intervention activities will occur monthly in the school's multimedia room or sports court. The control arm will receive the usual recommendations by the school. The primary outcome variables will be anthropometric measures, such as body mass index percentiles, and levels of physical activity as measured by the International Physical Activity Questionnaire. We expect that after the study children will increase their intake of fresh food, reduce excessive consumption of sugary and processed foods, and reduce hours of sedentary activities. The purpose of starting the dietary intervention at this stage of life is to develop knowledge that will enable healthy choices, providing opportunities for a better future for this population.

  3. Traditional and innovative experimental and clinical trial designs and their advantages and pitfalls.

    Science.gov (United States)

    Weimer, Katja; Enck, Paul

    2014-01-01

    Many study designs and design variants have been developed in the past to either overcome or enhance drug-placebo differences in clinical trials or to identify and characterize placebo responders in experimental studies. They share many commonalities as well as differences that are discussed here: the role of deception and ethical restrictions, habituation effects and the control of the natural course of disease, assay sensitivity testing and effective blinding, acceptability and motivation of patients and volunteers, and the development of individualized medicine. These are fostered by two opposite strategies: utilizing the beneficial aspects of the placebo response (and avoiding its negative counterpart, the nocebo effect) in medical routine for the benefit of patients, and minimizing, by controlling, the negative aspects of the placebo effect during drug development.

  4. Will dapivirine redeem the promises of anti-HIV microbicides? Overview of product design and clinical testing.

    Science.gov (United States)

    das Neves, José; Martins, João Pedro; Sarmento, Bruno

    2016-08-01

    Microbicides are being developed in order to prevent sexual transmission of HIV. Dapivirine, a non-nucleoside reverse transcriptase inhibitor, is one of the leading drug candidates in the field, currently being tested in various dosage forms, namely vaginal rings, gels, and films. In particular, a ring allowing sustained drug release for 1 month is in an advanced stage of clinical testing. Two parallel phase III clinical trials are underway in sub-Saharan Africa and results are expected to be released in early 2016. This article overviews the development of dapivirine and its multiple products as potential microbicides, with particular emphasis placed on clinical evaluation. Also, critical aspects regarding regulatory approval, manufacturing, distribution, and access are discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. An ontology-driven, case-based clinical decision support model for removable partial denture design

    Science.gov (United States)

    Chen, Qingxiao; Wu, Ji; Li, Shusen; Lyu, Peijun; Wang, Yong; Li, Miao

    2016-06-01

    We present initial work toward developing a clinical decision support model for the specific design of removable partial dentures (RPDs) in dentistry. We developed an ontological paradigm to represent knowledge of a patient’s oral conditions and denture component parts. During the case-based reasoning process, a cosine similarity algorithm was applied to calculate similarity values between input patients and standard ontology cases. A group of designs from the most similar cases were output as the final results. To evaluate this model, the output designs of RPDs for 104 randomly selected patients were compared with those selected by professionals. A receiver operating characteristic (ROC) curve was created by plotting true-positive rates against false-positive rates at various threshold settings, and the area under the curve (AUC-ROC) was computed. The precision at position 5 of the retrieved cases was 0.67 and at the top of the curve it was 0.96, both of which are very high. The mean average precision (MAP) was 0.61 and the normalized discounted cumulative gain (NDCG) was 0.74, both of which confirmed the efficient performance of our model. All the metrics demonstrated the efficiency of our model. This methodology merits further research and development to match clinical applications for designing RPDs. This paper is organized as follows. After the introduction and description of the basis for the paper, the evaluation and results are presented in Section 2. Section 3 provides a discussion of the methodology and results. Section 4 describes the details of the ontology, similarity algorithm, and application.
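    The retrieval step described above (cosine similarity between an input patient and stored ontology cases, returning designs from the most similar ones) can be sketched as follows; the feature encoding, case matrix, and design labels are assumptions for illustration, not taken from the paper:

```python
import numpy as np

def retrieve_designs(patient_vec, case_matrix, case_designs, k=5):
    """Rank stored cases by cosine similarity to the patient's feature
    vector and return the RPD designs of the top-k matches."""
    p = patient_vec / np.linalg.norm(patient_vec)
    c = case_matrix / np.linalg.norm(case_matrix, axis=1, keepdims=True)
    sims = c @ p                      # cosine similarity per case
    top = np.argsort(sims)[::-1][:k]  # indices of the k most similar cases
    return [(case_designs[i], float(sims[i])) for i in top]
```

    Metrics such as precision at position 5, MAP, and NDCG then score how often the professionally chosen design appears near the top of this ranked list.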

  6. Survival of various implant-supported prosthesis designs following 36 months of clinical function.

    Science.gov (United States)

    Rodriguez, A M; Orenstein, I H; Morris, H F; Ochi, S

    2000-12-01

    The use of endosseous dental implants to replace natural teeth lost to trauma, dental caries, or periodontal disease has become a predictable form of prosthetic treatment since gaining popularity in the early 1980s. While numerous clinical studies have focused on the survival of implants, few address the survival of different prosthesis designs. Beginning in 1991, 882 prostheses supported by more than 2,900 implants (687 patients) were placed by the Department of Veterans Affairs Dental Implant Clinical Research Group (DICRG). These prostheses were divided into five research strata based on arch location. The recommended design for each stratum was: bar-supported overdenture (maxillary completely edentulous); screw-retained hybrid denture (mandibular completely edentulous); screw-retained fixed partial denture (mandibular and maxillary posterior partially edentulous); and cemented single crown (maxillary anterior single tooth). Alternative overdenture designs were utilized in the edentulous arches when the recommended prosthesis could not be fabricated. Prosthesis success rates for the research strata were calculated for an observation time of up to 36 months following prosthesis placement. Success rates for the maxillary edentulous stratum ranged from 94.6% for the bar-retained overdenture supported by five to six fixtures to 81.8% for the cap-retained overdenture. The mandibular edentulous strata produced success rates of 98.1% for the fixed hybrid prosthesis to 91.7% for the cap-retained prosthesis. Success rates for maxillary and mandibular posterior fixed partial dentures were 94.3% and 92.6%, respectively, while the maxillary anterior single-tooth prosthesis yielded a success rate of 98.1% for the 36-month observation period. The recommended prosthesis designs investigated in this study proved to be reliable, with encouraging success rates for an observation period of 36 months following placement.

  7. The Usefulness of Systematic Reviews of Animal Experiments for the Design of Preclinical and Clinical Studies

    Science.gov (United States)

    de Vries, Rob B. M.; Wever, Kimberley E.; Avey, Marc T.; Stephens, Martin L.; Sena, Emily S.; Leenaars, Marlies

    2014-01-01

    The question of how animal studies should be designed, conducted, and analyzed remains underexposed in societal debates on animal experimentation. This is not only a scientific but also a moral question. After all, if animal experiments are not appropriately designed, conducted, and analyzed, the results produced are unlikely to be reliable and the animals have in effect been wasted. In this article, we focus on one particular method to address this moral question, namely systematic reviews of previously performed animal experiments. We discuss how the design, conduct, and analysis of future (animal and human) experiments may be optimized through such systematic reviews. In particular, we illustrate how these reviews can help improve the methodological quality of animal experiments, make the choice of an animal model and the translation of animal data to the clinic more evidence-based, and implement the 3Rs. Moreover, we discuss which measures are being taken and which need to be taken in the future to ensure that systematic reviews will actually contribute to optimizing experimental design and thereby to meeting a necessary condition for making the use of animals in these experiments justified. PMID:25541545

  8. Massively parallel multicanonical simulations

    Science.gov (United States)

    Gross, Jonathan; Zierenberg, Johannes; Weigel, Martin; Janke, Wolfhard

    2018-03-01

    Generalized-ensemble Monte Carlo simulations such as the multicanonical method and similar techniques are among the most efficient approaches for simulations of systems undergoing discontinuous phase transitions or with rugged free-energy landscapes. As Markov chain methods, however, they are inherently serial. It was demonstrated recently that a combination of independent simulations that communicate weight updates at variable intervals allows for the efficient utilization of parallel computational resources for multicanonical simulations. Implementing this approach for the many-thread architecture provided by current generations of graphics processing units (GPUs), we show how it can be efficiently employed with of the order of 10⁴ parallel walkers and beyond, thus constituting a versatile tool for Monte Carlo simulations in the era of massively parallel computing. We provide the fully documented source code for the approach, applied to the paradigmatic example of the two-dimensional Ising model, as a starting point and reference for practitioners in the field.
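
    The communication pattern described here, independent walkers that periodically pool their histograms for a joint weight update, can be illustrated with a deliberately simplified sketch. A one-dimensional toy state space stands in for the 2D Ising model, a sequential loop over walkers stands in for the GPU threads, and the function names are illustrative only:

```python
import random

def walker_pass(weights, n_states, steps, rng):
    # One independent walker: a Metropolis random walk over discrete
    # energy levels, accepting moves with ratio W(E_new) / W(E_old).
    hist = [0] * n_states
    e = rng.randrange(n_states)
    for _ in range(steps):
        e_new = (e + rng.choice((-1, 1))) % n_states
        if rng.random() < weights[e_new] / weights[e]:
            e = e_new
        hist[e] += 1
    return hist

def parallel_multicanonical(n_states=16, n_walkers=8, n_iters=10,
                            steps=1000, seed=1):
    rng = random.Random(seed)
    weights = [1.0] * n_states
    for _ in range(n_iters):
        # Walkers run independently (sequentially here), then communicate
        # by pooling their histograms for a joint weight update.
        pooled = [0] * n_states
        for _ in range(n_walkers):
            hist = walker_pass(weights, n_states, steps, rng)
            pooled = [p + h for p, h in zip(pooled, hist)]
        # W(E) <- W(E) / H(E): flatten the pooled histogram.
        weights = [w / max(h, 1) for w, h in zip(weights, pooled)]
        top = max(weights)
        weights = [w / top for w in weights]  # normalize for stability
    return weights

final_weights = parallel_multicanonical()
```

    In the real method each walker is a full simulation of the physical system and the weight update uses more careful estimators, but the walkers-then-merge structure is the same.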

  9. School-based mindfulness intervention for stress reduction in adolescents: Design and methodology of an open-label, parallel group, randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Jeanette M. Johnstone

    2016-12-01

    Full Text Available Adolescents are in a high-risk period developmentally, in terms of susceptibility to stress. A mindfulness intervention represents a potentially useful strategy for developing the cognitive and emotion regulation skills associated with successful stress coping. Mindfulness strategies have been used successfully for emotional coping in adults, but are not as well studied in youth. This article details a novel proposal for the design of an 8-week randomized study to evaluate a high school-based mindfulness curriculum delivered as part of a two-semester health class. A wellness education intervention is proposed as an active control, along with a waitlist control condition. All students enrolled in a sophomore (10th grade) health class at a private suburban high school will be invited to participate (n = 300). Pre-test assessments will be obtained by youth report, parent ratings, and on-site behavioral testing. The assessments will evaluate baseline stress, mood, emotional coping, controlled attention, and working memory. Participants, divided into 13 classrooms, will be randomized into one of three conditions, by classroom: a mindfulness intervention, an active control (wellness education), and a passive control (waitlist). Waitlisted participants will receive one of the interventions in the following term. Intervention groups will meet weekly for 8 weeks during regularly scheduled health classes. Immediate post-tests will be conducted, followed by a 60-day post-test. It is hypothesized that the mindfulness intervention will outperform the other conditions with regard to the adolescents' mood, attention and response to stress.

  10. SPINning parallel systems software

    International Nuclear Information System (INIS)

    Matlin, O.S.; Lusk, E.; McCune, W.

    2002-01-01

    We describe our experiences in using Spin to verify parts of the Multi Purpose Daemon (MPD) parallel process management system. MPD is a distributed collection of processes connected by Unix network sockets. MPD is dynamic: processes and connections among them are created and destroyed as MPD is initialized, runs user processes, recovers from faults, and terminates. This dynamic nature is easily expressible in the Spin/Promela framework but poses performance and scalability challenges. We present here the results of expressing some of the parallel algorithms of MPD and executing both simulation and verification runs with Spin.

  11. Parallel programming with Python

    CERN Document Server

    Palach, Jan

    2014-01-01

    A fast, easy-to-follow and clear tutorial to help you develop parallel computing systems using Python. Along with explaining the fundamentals, the book will also introduce you to slightly advanced concepts and will help you in implementing these techniques in the real world. If you are an experienced Python programmer and are willing to utilize the available computing resources by parallelizing applications in a simple way, then this book is for you. You are required to have a basic knowledge of Python development to get the most out of this book.
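
    As a flavor of the kind of material such a tutorial covers, here is a minimal stand-alone sketch (not taken from the book) that parallelizes independent tasks with the standard library. A thread pool is used because it is safe and portable for I/O-bound workloads; CPU-bound work in Python is typically parallelized with process pools instead:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_length(word):
    # Stand-in for an independent, I/O-bound task such as a web request.
    return len(word)

def parallel_map(func, items, workers=4):
    # Fan the tasks out across a pool of worker threads and collect
    # the results in input order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(func, items))

lengths = parallel_map(fetch_length, ["parallel", "python", "pool"])  # [8, 6, 4]
```

    Swapping `ThreadPoolExecutor` for `ProcessPoolExecutor` (with a `__main__` guard) gives true multi-core parallelism for CPU-bound functions.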

  12. Design and methods for a randomized clinical trial treating comorbid obesity and major depressive disorder.

    Science.gov (United States)

    Schneider, Kristin L; Bodenlos, Jamie S; Ma, Yunsheng; Olendzki, Barbara; Oleski, Jessica; Merriam, Philip; Crawford, Sybil; Ockene, Ira S; Pagoto, Sherry L

    2008-09-15

    Obesity is often comorbid with depression and individuals with this comorbidity fare worse in behavioral weight loss treatment. Treating depression directly prior to behavioral weight loss treatment might bolster weight loss outcomes in this population, but this has not yet been tested in a randomized clinical trial. This randomized clinical trial will examine whether behavior therapy for depression administered prior to standard weight loss treatment produces greater weight loss than standard weight loss treatment alone. Obese women with major depressive disorder (N = 174) will be recruited from primary care clinics and the community and randomly assigned to one of the two treatment conditions. Treatment will last 2 years, and will include a 6-month intensive treatment phase followed by an 18-month maintenance phase. Follow-up assessment will occur at 6-months and 1- and 2 years following randomization. The primary outcome is weight loss. The study was designed to provide 90% power for detecting a weight change difference between conditions of 3.1 kg (standard deviation of 5.5 kg) at 1-year assuming a 25% rate of loss to follow-up. Secondary outcomes include depression, physical activity, dietary intake, psychosocial variables and cardiovascular risk factors. Potential mediators (e.g., adherence, depression, physical activity and caloric intake) of the intervention effect on weight change will also be examined. Treating depression before administering intensive health behavior interventions could potentially boost the impact on both mental and physical health outcomes. NCT00572520.
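
    The stated power calculation can be reproduced to a close approximation with a standard two-sample normal-approximation formula, inflated for the anticipated 25% loss to follow-up. This is an illustrative sketch; the investigators' exact method, which yielded N = 174, may have differed slightly:

```python
import math

def n_per_group(delta, sd, power_z=1.281552, alpha_z=1.959964, attrition=0.25):
    # Two-sample normal-approximation sample size for detecting a mean
    # difference `delta` with standard deviation `sd`, inflated for the
    # anticipated rate of loss to follow-up. Default z-values correspond
    # to 90% power and a two-sided alpha of 0.05.
    d = delta / sd                          # standardized effect size
    n = 2 * ((alpha_z + power_z) / d) ** 2  # per-group n before attrition
    return math.ceil(math.ceil(n) / (1 - attrition))

# The trial's design parameters: 3.1 kg difference, SD 5.5 kg.
n = n_per_group(delta=3.1, sd=5.5)
```

    This approximation gives 90 per group (180 total), in the neighborhood of the trial's planned total of 174.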

  13. Design and implementation of a radiotherapy programme: Clinical, medical physics, radiation protection and safety aspects

    International Nuclear Information System (INIS)

    1998-09-01

    It is widely acknowledged that the clinical aspects (diagnosis, decision, indication for treatment, follow-up) as well as the procedures related to the physical and technical aspects of patient treatment must be subjected to careful control and planning in order to ensure safe, high quality radiotherapy. While it has long been recognized that the physical aspects of quality assurance in radiotherapy are vital to achieve an effective and safe treatment, it has only recently been increasingly acknowledged that a systematic approach is also absolutely necessary for all steps within the clinical and technical aspects of a radiotherapy programme. The need to establish general guidelines at the IAEA, taking into account clinical, medical physics, radiation protection and safety considerations, for designing and implementing radiotherapy programmes in Member States has been identified through the Member States' increased interest in the efficient and safe application of radiation in health care. Several consultants and advisory group meetings were convened to prepare a report providing a basis for establishing a programme in radiotherapy. The present TECDOC is addressed to all professionals and administrators involved in the development, implementation and management of a radiotherapy programme, in order to establish a common and consistent framework in which all steps and procedures in radiotherapy are taken into account.

  14. Quantitative imaging of the human upper airway: instrument design and clinical studies

    Science.gov (United States)

    Leigh, M. S.; Armstrong, J. J.; Paduch, A.; Sampson, D. D.; Walsh, J. H.; Hillman, D. R.; Eastwood, P. R.

    2006-08-01

    Imaging of the human upper airway is widely used in medicine, in both clinical practice and research. Common imaging modalities include video endoscopy, X-ray CT, and MRI. However, no current modality is both quantitative and safe to use for extended periods of time. Such a capability would be particularly valuable for sleep research, which is inherently reliant on long observation sessions. We have developed an instrument capable of quantitative imaging of the human upper airway, based on endoscopic optical coherence tomography. There are no dose limits for optical techniques, and the minimally invasive imaging probe is safe for use in overnight studies. We report on the design of the instrument and its use in preliminary clinical studies, and we present results from a range of initial experiments. The experiments show that the instrument is capable of imaging during sleep, and that it can record dynamic changes in airway size and shape. This information is useful for research into sleep disorders, and potentially for clinical diagnosis and therapies.

  15. Emerging concepts in dendrimer-based nanomedicine: from design principles to clinical applications.

    Science.gov (United States)

    Kannan, R M; Nance, E; Kannan, S; Tomalia, D A

    2014-12-01

    Dendrimers are discrete nanostructures/nanoparticles with 'onion skin-like' branched layers. Beginning with a core, these nanostructures grow in concentric layers to produce stepwise increases in size that are similar to the dimensions of many in vivo globular proteins. These branched tree-like concentric layers are referred to as 'generations'. The outer generation of each dendrimer presents a precise number of functional groups that may act as a monodispersed platform for engineering favourable nanoparticle-drug and nanoparticle-tissue interactions. These features have attracted significant attention in medicine as nanocarriers for traditional small drugs, proteins, DNA/RNA and in some instances as intrinsically active nanoscale drugs. Dendrimer-based drugs, as well as diagnostic and imaging agents, are emerging as promising candidates for many nanomedicine applications. First, we will provide a brief survey of recent nanomedicines that are either approved or in the clinical approval process. This will be followed by an introduction to a new 'nanoperiodic' concept which proposes nanoparticle structure control and the engineering of 'critical nanoscale design parameters' (CNDPs) as a strategy for optimizing pharmacokinetics, pharmacodynamics and site-specific targeting of disease. This paradigm has led to the emergence of CNDP-directed nanoperiodic property patterns relating nanoparticle behaviour to critical in vivo clinical translation issues such as cellular uptake, transport, elimination, biodistribution, accumulation and nanotoxicology. With a focus on dendrimers, these CNDP-directed nanoperiodic patterns are used as a strategy for designing and optimizing nanoparticles for a variety of drug delivery and imaging applications, including a recent dendrimer-based theranostic nanodevice for imaging and treating cancer. Several emerging preclinical dendrimer-based nanotherapy concepts related to inflammation, neuro-inflammatory disorders, oncology and infectious diseases are also discussed.

  16. Design and methods for a randomized clinical trial treating comorbid obesity and major depressive disorder

    Directory of Open Access Journals (Sweden)

    Crawford Sybil

    2008-09-01

    Full Text Available Abstract Background Obesity is often comorbid with depression and individuals with this comorbidity fare worse in behavioral weight loss treatment. Treating depression directly prior to behavioral weight loss treatment might bolster weight loss outcomes in this population, but this has not yet been tested in a randomized clinical trial. Methods and design This randomized clinical trial will examine whether behavior therapy for depression administered prior to standard weight loss treatment produces greater weight loss than standard weight loss treatment alone. Obese women with major depressive disorder (N = 174) will be recruited from primary care clinics and the community and randomly assigned to one of the two treatment conditions. Treatment will last 2 years, and will include a 6-month intensive treatment phase followed by an 18-month maintenance phase. Follow-up assessment will occur at 6-months and 1- and 2 years following randomization. The primary outcome is weight loss. The study was designed to provide 90% power for detecting a weight change difference between conditions of 3.1 kg (standard deviation of 5.5 kg) at 1-year assuming a 25% rate of loss to follow-up. Secondary outcomes include depression, physical activity, dietary intake, psychosocial variables and cardiovascular risk factors. Potential mediators (e.g., adherence, depression, physical activity and caloric intake) of the intervention effect on weight change will also be examined. Discussion Treating depression before administering intensive health behavior interventions could potentially boost the impact on both mental and physical health outcomes. Trial registration NCT00572520

  17. Pharmacokinetic comparison of acetaminophen elixir versus suppositories in vaccinated infants (aged 3 to 36 months): a single-dose, open-label, randomized, parallel-group design.

    Science.gov (United States)

    Walson, Philip D; Halvorsen, Mark; Edge, James; Casavant, Marcel J; Kelley, Michael T

    2013-02-01

    Because of practical problems and ethical concerns, few studies of the pharmacokinetics (PK) of acetaminophen (ACET) in infants have been published. The goal of this study was to compare the PK of an ACET rectal suppository with a commercially available ACET elixir to complete a regulatory obligation to market the suppository. This study was not submitted previously because of numerous obstacles related to both the investigators and the commercial entities associated with the tested product. Thirty infants (age 3-36 months) prescribed ACET for either fever, pain, or postimmunization prophylaxis of fever and discomfort were randomized to receive a single 10- to 15-mg/kg ACET dose either as the rectal suppository or oral elixir. Blood was collected at selected times for up to 8 hours after administration. ACET concentrations were measured by using a validated HPLC method, and PK behavior and bioavailability were compared for the 2 preparations. All 30 infants enrolled were prescribed ACET for postimmunization prophylaxis. PK samples were available in 27 of the 30 enrolled infants. Subject enrollment (completed in January 1995) was rapid (8.3 months) and drawn entirely from a vaccinated infant clinic population. There were no statistically significant differences between the subjects in the 2 dosing groups (elixir, n = 12; suppository, n = 15) in either mean (SD) age (10.0 [6.3] vs 12.4 [8.1] months), weight (8.6 [2.3] vs 9.4 [2.4] kg), sex (7 of 12 males vs 7 of 15 males), or racial distribution (5 white, 5 black, and 2 biracial vs 4 white and 11 black) (oral vs rectal, respectively). The oral and rectal preparations produced similar, rapid peak concentrations (T(max), 1.16 vs 1.17 hours; P = 0.98) and elimination t(½) (1.84 vs 2.10 hours; P = 0.14), respectively. No statistically significant differences were found between either C(max) (7.65 vs 5.68 μg/mL) or total drug exposure (AUC(0-∞), 23.36 vs 20.45 μg-h/mL) for the oral versus rectal preparations.

  18. Design of a multi-arm randomized clinical trial with no control arm.

    Science.gov (United States)

    Magaret, Amalia; Angus, Derek C; Adhikari, Neill K J; Banura, Patrick; Kissoon, Niranjan; Lawler, James V; Jacob, Shevin T

    2016-01-01

    Clinical trial designs that include multiple treatments are currently limited to those that perform pairwise comparisons of each investigational treatment to a single control. However, there are settings, such as the recent Ebola outbreak, in which no treatment has been demonstrated to be effective; and therefore, no standard of care exists which would serve as an appropriate control. For illustrative purposes, we focused on the care of patients presenting in austere settings with critically ill 'sepsis-like' syndromes. Our approach involves a novel algorithm for comparing mortality among arms without requiring a single fixed control. The algorithm allows poorly-performing arms to be dropped during interim analyses. Consequently, the study may be completed earlier than planned. We used simulation to determine operating characteristics for the trial and to estimate the required sample size. We present a potential study design targeting a minimal effect size of a 23% relative reduction in mortality between any pair of arms. Using estimated power and spurious significance rates from the simulated scenarios, we show that such a trial would require 2550 participants. Over a range of scenarios, our study has 80 to 99% power to select the optimal treatment. Using a fixed control design, if the control arm is least efficacious, 640 subjects would be enrolled into the least efficacious arm, while our algorithm would enroll between 170 and 430. This simulation method can be easily extended to other settings or other binary outcomes. Early dropping of arms is efficient and ethical when conducting clinical trials with multiple arms. Copyright © 2015 Elsevier Inc. All rights reserved.
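
    The general shape of such a design, multiple arms with interim analyses that drop arms performing significantly worse than the current best, can be illustrated with a simulation sketch of the kind used to estimate operating characteristics. The dropping rule, stage sizes, and mortality rates below are hypothetical and much simpler than the paper's actual algorithm:

```python
import math
import random

def two_prop_z(d1, n1, d2, n2):
    # Pooled two-proportion z statistic comparing mortality rates.
    p1, p2 = d1 / n1, d2 / n2
    p = (d1 + d2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se if se > 0 else 0.0

def simulate_trial(rates, per_stage, n_stages, z_drop=1.96, rng=random):
    # Enroll per_stage patients into each active arm per stage; at each
    # interim, drop arms significantly worse than the current best arm.
    active = list(range(len(rates)))
    deaths = [0] * len(rates)
    n = [0] * len(rates)
    for _ in range(n_stages):
        for a in active:
            for _ in range(per_stage):
                n[a] += 1
                deaths[a] += rng.random() < rates[a]
        best = min(active, key=lambda a: deaths[a] / n[a])
        active = [a for a in active
                  if a == best
                  or two_prop_z(deaths[a], n[a], deaths[best], n[best]) < z_drop]
    return min(active, key=lambda a: deaths[a] / n[a])  # selected arm

def selection_probability(rates, sims=300, seed=7):
    # Estimate the chance the truly best (lowest-mortality) arm is selected.
    rng = random.Random(seed)
    optimal = rates.index(min(rates))
    hits = sum(simulate_trial(rates, 100, 3, rng=rng) == optimal
               for _ in range(sims))
    return hits / sims

p_select = selection_probability([0.20, 0.35, 0.35])
```

    Varying the rates, stage sizes, and dropping threshold over many such runs is how power, spurious significance, and expected enrollment per arm are tabulated for a design like this.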

  19. Performance of the Galley Parallel File System

    Science.gov (United States)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    As the input/output (I/O) needs of parallel scientific applications increase, file systems for multiprocessors are being designed to provide applications with parallel access to multiple disks. Many parallel file systems present applications with a conventional Unix-like interface that allows the application to access multiple disks transparently. This interface conceals the parallelism within the file system, which increases the ease of programmability, but makes it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. Furthermore, most current parallel file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic parallel workloads. Initial experiments, reported in this paper, indicate that Galley is capable of providing high-performance I/O to applications that access data in patterns that have been observed to be common.

  20. RARtool: A MATLAB Software Package for Designing Response-Adaptive Randomized Clinical Trials with Time-to-Event Outcomes.

    Science.gov (United States)

    Ryeznik, Yevgen; Sverdlov, Oleksandr; Wong, Weng Kee

    2015-08-01

    Response-adaptive randomization designs are becoming increasingly popular in clinical trial practice. In this paper, we present RARtool, a user interface software package developed in MATLAB for designing response-adaptive randomized comparative clinical trials with censored time-to-event outcomes. The RARtool software can compute different types of optimal treatment allocation designs, and it can simulate response-adaptive randomization procedures targeting selected optimal allocations. Through simulations, an investigator can assess design characteristics under a variety of experimental scenarios and select the best procedure for practical implementation. We illustrate the utility of our RARtool software by redesigning a survival trial from the literature.
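
    RARtool itself is MATLAB software aimed at censored time-to-event outcomes, but the core idea of response-adaptive randomization, skewing allocation toward the better-performing arm as outcomes accrue, can be illustrated with a classic randomized play-the-winner urn for a binary outcome. This example is not from the paper:

```python
import random

def rpw_allocation(success_probs, n_patients, seed=3):
    # Randomized play-the-winner urn for two arms: each patient's arm is
    # drawn with probability proportional to that arm's balls in the urn;
    # a success adds a ball for the treating arm, a failure adds a ball
    # for the other arm, so allocation drifts toward the better arm.
    rng = random.Random(seed)
    urn = [1, 1]
    assigned = [0, 0]
    for _ in range(n_patients):
        arm = 0 if rng.random() < urn[0] / sum(urn) else 1
        assigned[arm] += 1
        success = rng.random() < success_probs[arm]
        urn[arm if success else 1 - arm] += 1
    return assigned

counts = rpw_allocation([0.8, 0.2], 400)  # most patients drift to arm 0
```

    Optimal-allocation designs of the kind RARtool computes replace this simple urn with allocation targets derived from the outcome model, but the adaptive feedback loop is the same.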

  1. The Bolton Treovance abdominal stent-graft: European clinical trial design.

    Science.gov (United States)

    Chiesa, R; Riambau, V; Coppi, G; Zipfel, B; Llagostera, S; Marone, E M; Kahlberg, A

    2012-10-01

    Endovascular aortic repair (EVAR) has emerged as a promising, less invasive alternative to conventional open surgery for the treatment of infrarenal abdominal aortic aneurysms (AAAs). In the last 20 years, the application rate of EVAR and its clinical results have significantly improved thanks to the evolution of stent-grafts and endovascular delivery systems. However, further development is still needed to reduce the incidence of complications and secondary re-interventions. The Treovance abdominal aortic stent-graft (Bolton Medical, Barcelona, Spain) is a new-generation endovascular device developed to increase flexibility, lower the delivery profile, and improve the deployment and sealing mechanisms. In particular, it is provided with several innovative features, such as a double layer of proximal barbs (suprarenal and infrarenal) for supplemental fixation, dull barbs between modules to avoid potential leg disconnections, a detachable outer sheath provided with a new-design hemostatic valve, and a double improved mechanism (slow motion and "pin and pull") for precise stent-graft deployment. A European prospective, non-randomized, multi-institutional, "first-in-human" trial (the ADVANCE trial) was conducted from March to December 2011 to assess the safety and performance of the Treovance stent-graft system before commercialization. Thirty patients with anatomically suitable non-ruptured AAAs were enrolled at five clinical sites in Italy, Spain, and Germany. EVAR was completed successfully in all patients. The stent-graft was delivered and deployed safely even in heavily angulated or calcified anatomies. No 30-day device-related complications or deaths were observed. Preliminary experience with the Treovance abdominal stent-graft within the ADVANCE trial was satisfactory with regard to technical success and perioperative clinical results. Follow-up data are needed to assess mid- and long-term clinical outcomes, along with the durability of this new-generation endovascular device.

  2. PI3K inhibitors as new cancer therapeutics: implications for clinical trial design

    Directory of Open Access Journals (Sweden)

    Massacesi C

    2016-01-01

    AKT–mTOR pathway, patient selection, biomarkers, PI3K inhibitors, clinical trial design

  3. Parallel Fast Legendre Transform

    NARCIS (Netherlands)

    Alves de Inda, M.; Bisseling, R.H.; Maslen, D.K.

    1998-01-01

    We discuss a parallel implementation of a fast algorithm for the discrete polynomial Legendre transform. We give an introduction to the Driscoll-Healy algorithm using polynomial arithmetic and present experimental results on the efficiency and accuracy of our implementation. The algorithms were

  4. Practical parallel programming

    CERN Document Server

    Bauer, Barr E

    2014-01-01

    This is the book that will teach programmers to write faster, more efficient code for parallel processors. The reader is introduced to a vast array of procedures and paradigms on which actual coding may be based. Examples and real-life simulations using these devices are presented in C and FORTRAN.

  5. Parallel hierarchical radiosity rendering

    Energy Technology Data Exchange (ETDEWEB)

    Carter, Michael [Iowa State Univ., Ames, IA (United States)

    1993-07-01

    In this dissertation, the step-by-step development of a scalable parallel hierarchical radiosity renderer is documented. First, a new look is taken at the traditional radiosity equation, and a new form is presented in which the matrix of linear system coefficients is transformed into a symmetric matrix, thereby simplifying the problem and enabling a new solution technique to be applied. Next, the state-of-the-art hierarchical radiosity methods are examined for their suitability to parallel implementation, and scalability. Significant enhancements are also discovered which both improve their theoretical foundations and improve the images they generate. The resultant hierarchical radiosity algorithm is then examined for sources of parallelism, and for an architectural mapping. Several architectural mappings are discussed. A few key algorithmic changes are suggested during the process of making the algorithm parallel. Next, the performance, efficiency, and scalability of the algorithm are analyzed. The dissertation closes with a discussion of several ideas which have the potential to further enhance the hierarchical radiosity method, or provide an entirely new forum for the application of hierarchical methods.

  6. Parallel universes beguile science

    CERN Multimedia

    2007-01-01

    A staple of mind-bending science fiction, the possibility of multiple universes has long intrigued hard-nosed physicists, mathematicians and cosmologists too. We may not be able -- at least not yet -- to prove they exist, many serious scientists say, but there are plenty of reasons to think that parallel dimensions are more than figments of eggheaded imagination.

  7. Parallel k-means++

    Energy Technology Data Exchange (ETDEWEB)

    2017-04-04

    A parallelization of the k-means++ seed selection algorithm on three distinct hardware platforms: GPU, multicore CPU, and multithreaded architecture. K-means++ was developed by David Arthur and Sergei Vassilvitskii in 2007 as an extension of the k-means data clustering technique. These algorithms allow people to cluster multidimensional data by attempting to minimize the mean distance of data points within a cluster. K-means++ improved upon traditional k-means by using a more intelligent approach to selecting the initial seeds for the clustering process. While k-means++ has become a popular alternative to traditional k-means clustering, little work has been done to parallelize this technique. We have developed original C++ code for parallelizing the algorithm on three unique hardware architectures: GPU using NVidia's CUDA/Thrust framework, multicore CPU using OpenMP, and the Cray XMT multithreaded architecture. By parallelizing the process for these platforms, we are able to perform k-means++ clustering much more quickly than it could be done before.
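
    The serial k-means++ seeding rule that these three implementations parallelize can be sketched as follows; the nearest-center distance computation (the `d2` list) is the step distributed across GPU, OpenMP, or XMT threads. This reference version is illustrative, not the record's C++ code:

```python
import random

def kmeans_pp_seeds(points, k, seed=0):
    # k-means++ seeding: the first center is uniform at random; each
    # subsequent center is drawn with probability proportional to its
    # squared distance from the nearest center chosen so far.
    rng = random.Random(seed)
    centers = [rng.choice(points)]
    while len(centers) < k:
        d2 = [min((px - cx) ** 2 + (py - cy) ** 2 for cx, cy in centers)
              for px, py in points]  # the parallelizable step
        r = rng.random() * sum(d2)
        acc = 0.0
        for point, weight in zip(points, d2):
            if weight == 0.0:
                continue  # already a center
            acc += weight
            if acc >= r:
                centers.append(point)
                break
    return centers
```

    Because far-away points carry large weights, seeds tend to land in distinct clusters, which is what speeds up the subsequent k-means iterations.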

  8. Parallel plate detectors

    International Nuclear Information System (INIS)

    Gardes, D.; Volkov, P.

    1981-01-01

    Two parallel plate avalanche counters (PPACs) are considered: a 5x3 cm² counter (timing only) and a 15x5 cm² counter (timing and position). The theory of operation and timing resolution is given. The measurement set-up and the curves of experimental results illustrate the possibilities of the two counters.

  9. Parallel hierarchical global illumination

    Energy Technology Data Exchange (ETDEWEB)

    Snell, Quinn O. [Iowa State Univ., Ames, IA (United States)

    1997-10-08

    Solving the global illumination problem is equivalent to determining the intensity of every wavelength of light in all directions at every point in a given scene. The complexity of the problem has led researchers to use approximation methods for solving the problem on serial computers. Rather than using an approximation method, such as backward ray tracing or radiosity, the authors have chosen to solve the Rendering Equation by direct simulation of light transport from the light sources. This paper presents an algorithm that solves the Rendering Equation to any desired accuracy, and can be run in parallel on distributed memory or shared memory computer systems with excellent scaling properties. It appears superior in both speed and physical correctness to recent published methods involving bidirectional ray tracing or hybrid treatments of diffuse and specular surfaces. Like progressive radiosity methods, it dynamically refines the geometry decomposition where required, but does so without the excessive storage requirements for ray histories. The algorithm, called Photon, produces a scene which converges to the global illumination solution. This amounts to a huge task for a 1997-vintage serial computer, but using the power of a parallel supercomputer significantly reduces the time required to generate a solution. Currently, Photon can be run on most parallel environments from a shared memory multiprocessor to a parallel supercomputer, as well as on clusters of heterogeneous workstations.

  10. Strategies to design clinical studies to identify predictive biomarkers in cancer research.

    Science.gov (United States)

    Perez-Gracia, Jose Luis; Sanmamed, Miguel F; Bosch, Ana; Patiño-Garcia, Ana; Schalper, Kurt A; Segura, Victor; Bellmunt, Joaquim; Tabernero, Josep; Sweeney, Christopher J; Choueiri, Toni K; Martín, Miguel; Fusco, Juan Pablo; Rodriguez-Ruiz, Maria Esperanza; Calvo, Alfonso; Prior, Celia; Paz-Ares, Luis; Pio, Ruben; Gonzalez-Billalabeitia, Enrique; Gonzalez Hernandez, Alvaro; Páez, David; Piulats, Jose María; Gurpide, Alfonso; Andueza, Mapi; de Velasco, Guillermo; Pazo, Roberto; Grande, Enrique; Nicolas, Pilar; Abad-Santos, Francisco; Garcia-Donas, Jesus; Castellano, Daniel; Pajares, María J; Suarez, Cristina; Colomer, Ramon; Montuenga, Luis M; Melero, Ignacio

    2017-02-01

    The discovery of reliable biomarkers to predict efficacy and toxicity of anticancer drugs remains one of the key challenges in cancer research. Despite its relevance, no efficient study designs to identify promising candidate biomarkers have been established. This has led to the proliferation of a myriad of exploratory studies using dissimilar strategies, most of which fail to identify any promising targets and are seldom validated. The lack of a proper methodology also means that many anticancer drugs are developed below their potential, owing to the failure to identify predictive biomarkers. While some drugs will be systematically administered to many patients who will not benefit from them, leading to unnecessary toxicities and costs, others will never reach registration due to our inability to identify the specific patient population in which they are active. Despite these drawbacks, a limited number of outstanding predictive biomarkers have been successfully identified and validated, and have changed the standard practice of oncology. In this manuscript, a multidisciplinary panel reviews how those key biomarkers were identified and, based on those experiences, proposes a methodological framework-the DESIGN guidelines-to standardize the clinical design of biomarker identification studies and to develop future research in this pivotal field. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. A novel design for randomized immuno-oncology clinical trials with potentially delayed treatment effects

    Directory of Open Access Journals (Sweden)

    Pei He

    2015-10-01

    Full Text Available The semi-parametric proportional hazards model is widely adopted in randomized clinical trials with time-to-event outcomes, and the log-rank test is frequently used to detect a potential treatment effect. Immuno-oncology therapies pose unique challenges to the design of a trial as the treatment effect may be delayed, which violates the proportional hazards assumption, and the log-rank test has been shown to markedly lose power under the non-proportional hazards setting. A novel design and analysis approach for immuno-oncology trials is proposed through a piecewise treatment effect function, which is capable of detecting a potentially delayed treatment effect. The number of events required for the trial will be determined to ensure sufficient power for both the overall log-rank test without a delayed effect and the test beyond the delayed period when such a delay exists. The existence of a treatment delay is determined by a likelihood ratio test with resampling. Numerical results show that the proposed design adequately controls the Type I error rate, has a minimal loss in power under the proportional hazards setting and is markedly more powerful than the log-rank test with a delayed treatment effect.
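    The effect of a treatment delay on event times can be sketched with a piecewise-exponential simulation. The hazard rate, hazard ratio, and delay below are invented for illustration and are not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_delayed_effect(n, lam=0.10, hr=0.5, delay=6.0):
    """Piecewise-exponential event times: hazard `lam` up to `delay`,
    then `lam * hr` afterwards (a delayed treatment effect)."""
    t = rng.exponential(1.0 / lam, size=n)  # times under the control hazard
    late = t > delay
    # beyond the delay the hazard drops by the factor `hr`, which stretches
    # the residual time by 1/hr (inverse cumulative-hazard transform)
    t[late] = delay + (t[late] - delay) / hr
    return t

control = rng.exponential(1.0 / 0.10, size=200_000)
treated = sample_delayed_effect(200_000)

# identical event rates before the 6-unit delay, divergence only afterwards
early_c = np.mean(control <= 6.0)
early_t = np.mean(treated <= 6.0)
print(early_c, early_t, np.median(control), np.median(treated))
```

    Analytically, the treated median is delay + (ln 2 − lam·delay)/(lam·hr) ≈ 7.9 versus ln 2 / lam ≈ 6.9 for control, yet the two arms are indistinguishable before the delay; an overall log-rank test averages over that flat early period, which is exactly the power dilution the proposed piecewise test is designed to recover.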

  12. Determination of variation parameters as a crucial step in designing TMT-based clinical proteomics experiments.

    Directory of Open Access Journals (Sweden)

    Evelyne Maes

    Full Text Available In quantitative shotgun proteomic analyses by liquid chromatography and mass spectrometry, a rigid study design is necessary in order to obtain statistically relevant results. Hypothesis testing, sample size calculation and power estimation are fundamental concepts that require consideration upon designing an experiment. For this reason, the reproducibility and variability of the proteomic platform needs to be assessed. In this study, we evaluate the technical (sample preparation), labeling (isobaric labels), and total (biological + technical + labeling + experimental) variability and reproducibility of a workflow that employs a shotgun LC-MS/MS approach in combination with TMT peptide labeling for the quantification of the peripheral blood mononuclear cell (PBMC) proteome. We illustrate that the variability induced by TMT labeling is small when compared to the technical variation. The latter is also responsible for a substantial part of the total variation. Prior knowledge about the experimental variability allows for a correct design, a prerequisite for the detection of biologically significant disease-specific differential proteins in clinical proteomics experiments.
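    Once the variance components are known, they plug directly into standard sample-size formulas. The sketch below uses invented variance values (not the paper's measurements) with the usual two-sample z-approximation:

```python
import math

# hypothetical variance components on the log2 scale, mirroring the
# partition in the study: technical + labeling + biological
var_technical = 0.04   # sample preparation
var_labeling = 0.01    # TMT isobaric labeling
var_biological = 0.20  # subject-to-subject variation

sd_total = math.sqrt(var_technical + var_labeling + var_biological)

def n_per_group(delta, sd, z_alpha=1.96, z_power=0.8416):
    """Two-sample z-approximation: subjects per group needed to detect a
    mean log2 difference `delta` at two-sided alpha = 0.05, power = 0.80."""
    return math.ceil(2 * ((z_alpha + z_power) * sd / delta) ** 2)

# detecting a 1.4-fold change (0.5 on the log2 scale) for one protein
print(n_per_group(0.5, sd_total))
```

    Shrinking the technical or labeling variance lowers sd_total and therefore the required n, which is why characterizing each component before the clinical experiment matters.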

  13. Fifth Ovarian Cancer Consensus Conference of the Gynecologic Cancer InterGroup (GCIG): clinical trial design for rare ovarian tumours

    NARCIS (Netherlands)

    Leary, A. F.; Quinn, M.; Fujiwara, K.; Coleman, R. L.; Kohn, E.; Sugiyama, T.; Glasspool, R.; Ray-Coquard, I.; Colombo, N.; Bacon, M.; Zeimet, A.; Westermann, A.; Gomez-Garcia, E.; Provencher, D.; Welch, S.; Small, W.; Millan, D.; Okamoto, A.; Stuart, G.; Ochiai, K.

    2017-01-01

    This manuscript reports the consensus statements on designing clinical trials in rare ovarian tumours reached at the fifth Ovarian Cancer Consensus Conference (OCCC) held in Tokyo, November 2015. Three important questions were identified concerning rare ovarian tumours (rare epithelial ovarian

  14. Design of Phase I Combination Trials: Recommendations of the Clinical Trial Design Task Force of the NCI Investigational Drug Steering Committee

    Science.gov (United States)

    Paller, Channing J.; Bradbury, Penelope A.; Ivy, S. Percy; Seymour, Lesley; LoRusso, Patricia M.; Baker, Laurence; Rubinstein, Larry; Huang, Erich; Collyar, Deborah; Groshen, Susan; Reeves, Steven; Ellis, Lee M.; Sargent, Daniel J.; Rosner, Gary L.; LeBlanc, Michael L.; Ratain, Mark J.

    2014-01-01

    Anticancer drugs are combined in an effort to treat a heterogeneous tumor or to maximize the pharmacodynamic effect. The development of combination regimens, while desirable, poses unique challenges. These include the selection of agents for combination therapy that may lead to improved efficacy while maintaining acceptable toxicity, the design of clinical trials that provide informative results for individual agents and combinations, and logistical and regulatory challenges. The phase 1 trial is often the initial step in the clinical evaluation of a combination regimen. In view of the importance of combination regimens and the challenges associated with developing them, the Clinical Trial Design (CTD) Task Force of the National Cancer Institute (NCI) Investigational Drug Steering Committee developed a set of recommendations for the phase 1 development of a combination regimen. The first two recommendations focus on the scientific rationale and development plans for the combination regimen; subsequent recommendations encompass clinical design aspects. The CTD Task Force recommends that selection of the proposed regimens be based on a biological or pharmacological rationale supported by clinical and/or robust and validated preclinical evidence, and accompanied by a plan for subsequent development of the combination. The design of the phase 1 clinical trial should take into consideration the potential pharmacokinetic and pharmacodynamic interactions as well as overlapping toxicity. Depending on the specific hypothesized interaction, the primary endpoint may be dose optimization, pharmacokinetics, and/or pharmacodynamic (i.e., biomarker). PMID:25125258

  15. A low-fat yoghurt supplemented with a rooster comb extract on muscle joint function in adults with mild knee pain: a randomized, double blind, parallel, placebo-controlled, clinical trial of efficacy.

    Science.gov (United States)

    Solà, Rosa; Valls, Rosa-Maria; Martorell, Isabel; Giralt, Montserrat; Pedret, Anna; Taltavull, Núria; Romeu, Marta; Rodríguez, Àurea; Moriña, David; Lopez de Frutos, Victor; Montero, Manuel; Casajuana, Maria-Carmen; Pérez, Laura; Faba, Jenny; Bernal, Gloria; Astilleros, Anna; González, Roser; Puiggrós, Francesc; Arola, Lluís; Chetrit, Carlos; Martinez-Puig, Daniel

    2015-11-01

    Preliminary results suggested that oral administration of rooster comb extract (RCE) rich in hyaluronic acid (HA) was associated with improved muscle strength. Following these promising results, the objective of the present study was to evaluate the effect of low-fat yoghurt supplemented with RCE rich in HA on muscle function in adults with mild knee pain, a symptom of early osteoarthritis. Participants (n = 40) received low-fat yoghurt (125 mL d⁻¹) supplemented with 80 mg d⁻¹ of RCE, and the placebo group (n = 40) consumed the same yoghurt without the RCE, in a randomized, controlled, double-blind, parallel trial over 12 weeks. Using an isokinetic dynamometer (Biodex System 4), RCE consumption, compared to control, increased the affected knee peak torque, total work and mean power at 180° s⁻¹ by at least 11% in men (p < 0.05), with no differences in women. No dietary differences were noted. These results suggest that long-term consumption of low-fat yoghurt supplemented with RCE could be a dietary tool to improve muscle strength in men, with possible clinical significance. However, further studies are needed to elucidate the reasons for the sex differences observed, which may provide further insight into muscle function.

  16. Parallel programming practical aspects, models and current limitations

    CERN Document Server

    Tarkov, Mikhail S

    2014-01-01

    Parallel programming is designed for the use of parallel computer systems for solving time-consuming problems that cannot be solved on a sequential computer in a reasonable time. These problems can be divided into two classes: (1) processing of large data arrays (including processing of images and signals in real time), and (2) simulation of complex physical processes and chemical reactions. For each of these classes, prospective methods are designed for solving problems. For data processing, one of the most promising technologies is the use of artificial neural networks. The particle-in-cell method and cellular automata are very useful for simulation. Problems of the scalability of parallel algorithms and of the transfer of existing parallel programs to future parallel computers are very acute now. An important task is to optimize the use of the equipment (including the CPU cache) of parallel computers. Along with parallelizing information processing, it is essential to ensure the processing reliability by the relevant organization ...
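    The first problem class, data-parallel processing of large arrays, follows a split-process-combine pattern that can be sketched with a thread pool; the chunk size and workload here are arbitrary illustrations standing in for a real parallel machine:

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(chunk):
    """Process one slice of the array (here: a sum of squares)."""
    return sum(x * x for x in chunk)

data = list(range(1_000_000))
CHUNK = 100_000
chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]

# map the chunks over a pool of workers, then combine the partial results
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(chunk_sum, chunks))

total = sum(partials)
print(total)
```

    For CPU-bound work in CPython a process pool would sidestep the GIL, and on a cluster the same pattern maps onto scatter/reduce; scalability then hinges on chunk granularity and communication cost, the very issues the abstract flags.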

  17. Parallel computation of nondeterministic algorithms in VLSI

    Energy Technology Data Exchange (ETDEWEB)

    Hortensius, P D

    1987-01-01

    This work examines parallel VLSI implementations of nondeterministic algorithms. It is demonstrated that conventional pseudorandom number generators are unsuitable for highly parallel applications. Efficient parallel pseudorandom sequence generation can be accomplished using certain classes of elementary one-dimensional c