WorldWideScience

Sample records for large-scale randomized controlled

  1. Optimizing Implementation of Obesity Prevention Programs: A Qualitative Investigation Within a Large-Scale Randomized Controlled Trial.

    Science.gov (United States)

    Kozica, Samantha L; Teede, Helena J; Harrison, Cheryce L; Klein, Ruth; Lombard, Catherine B

    2016-01-01

    The prevalence of obesity in rural and remote areas is elevated in comparison to urban populations, highlighting the need for interventions targeting obesity prevention in these settings. Implementing evidence-based obesity prevention programs is challenging. This study aimed to investigate factors influencing the implementation of obesity prevention programs, including adoption, program delivery, community uptake, and continuation, specifically within rural settings. Nested within a large-scale randomized controlled trial, a qualitative exploratory approach was adopted, using purposive sampling to recruit stakeholders from 41 small rural towns in Australia. In-depth semistructured interviews were conducted with clinical health professionals, health service managers, and local government employees. Open coding was completed independently by 2 investigators and thematic analysis undertaken. In-depth interviews revealed that obesity prevention programs were valued by the rural workforce. Program implementation is influenced by interrelated factors across two domains: (1) contextual factors and (2) organizational capacity. Key recommendations to manage the challenges of implementing evidence-based programs focused on reducing program delivery costs, aided by the provision of a suite of implementation and evaluation resources. To inform the scale-up of future prevention programs, stakeholders highlighted the need to build local rural capacity by developing supportive university partnerships, generating local program ownership, and promoting active feedback to all program partners. We demonstrate that the rural workforce places a high value on obesity prevention programs. Our results inform the future scale-up of obesity prevention programs, providing an improved understanding of strategies to optimize the implementation of evidence-based prevention programs. © 2015 National Rural Health Association.

  2. Large-scale inverse model analyses employing fast randomized data reduction

    Science.gov (United States)

    Lin, Youzuo; Le, Ellen B.; O'Malley, Daniel; Vesselinov, Velimir V.; Bui-Thanh, Tan

    2017-08-01

    When the number of observations is large, it is computationally challenging to apply classical inverse modeling techniques. We have developed a new computationally efficient technique for solving inverse problems with a large number of observations (e.g., on the order of 10^7 or greater). Our method, which we call the randomized geostatistical approach (RGA), is built upon the principal component geostatistical approach (PCGA). We employ a data reduction technique combined with the PCGA to improve the computational efficiency and reduce the memory usage. Specifically, we employ a randomized numerical linear algebra technique based on a so-called "sketching" matrix to effectively reduce the dimension of the observations without losing the information content needed for the inverse analysis. In this way, the computational and memory costs for RGA scale with the information content rather than the size of the calibration data. Our algorithm is coded in Julia and implemented in the MADS open-source high-performance computational framework (http://mads.lanl.gov). We apply our new inverse modeling method to invert for a synthetic transmissivity field. Compared to a standard geostatistical approach (GA), our method is more efficient when the number of observations is large. Most importantly, our method is capable of solving larger inverse problems than the standard GA and PCGA approaches. Therefore, our new model inversion method is a powerful tool for solving large-scale inverse problems. The method can be applied in any field and is not limited to hydrogeological applications such as the characterization of aquifer heterogeneity.

  3. Decision aid on breast cancer screening reduces attendance rate: results of a large-scale, randomized, controlled study by the DECIDEO group

    Science.gov (United States)

    Bourmaud, Aurelie; Soler-Michel, Patricia; Oriol, Mathieu; Regnier, Véronique; Tinquaut, Fabien; Nourissat, Alice; Bremond, Alain; Moumjid, Nora; Chauvin, Franck

    2016-01-01

    Controversies regarding the benefits of breast cancer screening programs have led to the promotion of new strategies taking individual preferences into account, such as decision aids. The aim of this study was to assess the impact of a decision aid leaflet on the participation of women invited to join a national breast cancer screening program. This was a randomized, multicentre, controlled trial. Women aged 50 to 74 years were randomly assigned to receive either a decision aid or the usual invitation letter. The primary outcome was the participation rate 12 months after the invitation. 16,000 women were randomized and 15,844 included in the modified intention-to-treat analysis. The participation rate in the intervention group was 40.25% (3174/7885 women) compared with 42.13% (3353/7959) in the control group (p = 0.02). Previous attendance for screening (RR = 6.24; 95% CI: 5.75-6.77; p < 0.0001) and medium household income (RR = 1.05; 95% CI: 1.01-1.09; p = 0.0074) were independently associated with attendance for screening. This large-scale study demonstrates that the decision aid reduced the participation rate. The decision aid activated the decision-making process of women toward non-attendance at screening. These results show the importance of promoting informed patient choices, especially when those choices cannot be anticipated. PMID:26883201

  4. Large-Scale Systems Control Design via LMI Optimization

    Czech Academy of Sciences Publication Activity Database

    Rehák, Branislav

    2015-01-01

    Vol. 44, No. 3 (2015), pp. 247-253. ISSN 1392-124X. Institutional support: RVO:67985556. Keywords: combinatorial linear matrix inequalities * large-scale system * decentralized control. Subject RIV: BC - Control Systems Theory. Impact factor: 0.633, year: 2015

  5. Localized Power Control for Multihop Large-Scale Internet of Things

    KAUST Repository

    Bader, Ahmed

    2015-07-07

    In this paper, we promote the use of multihop networking in the context of the large-scale Internet of Things (IoT). Recognizing concerns related to the scalability of classical multihop routing and medium access techniques, we advocate the use of blind cooperation in conjunction with multihop communications. However, we show that blind cooperation is actually inefficient unless power control is applied. Inefficiency here is measured in terms of the transport rate normalized to energy consumption. To that end, we propose an uncoordinated power control mechanism whereby each device in a blind cooperative cluster randomly adjusts its transmit power level. We derive an upper bound on the mean transmit power that must be observed at each device. We also devise a practical mechanism for each device to infer the size of its neighborhood, a requirement necessary for the operation of the power control scheme. Finally, we assess the performance of the developed power control mechanism and demonstrate how it consistently outperforms the point-to-point case.
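
    As a rough illustration of uncoordinated power control, the sketch below lets each device draw its transmit power independently, with a mean that shrinks as the neighborhood grows so that the cluster's aggregate power stays roughly constant. The uniform draw and the 1/n scaling are illustrative assumptions, not the bound derived in the paper:

```python
import random

def draw_transmit_power(n_neighbors, p_max=1.0, rng=random.Random(42)):
    """Uncoordinated power control sketch: each device independently draws
    a random transmit power whose mean shrinks as the cooperative cluster
    grows, keeping the cluster's aggregate power roughly constant.
    The 1/n scaling is an illustrative assumption, not the paper's bound."""
    mean_power = p_max / max(n_neighbors, 1)   # per-device mean power cap
    return rng.uniform(0.0, 2.0 * mean_power)  # uniform draw with that mean

# A cluster of 20 devices, each knowing only its own neighborhood estimate.
powers = [draw_transmit_power(20) for _ in range(20)]
aggregate = sum(powers)
print(f"aggregate cluster power: {aggregate:.2f} (target around 1.0)")
```

    Because no coordination is needed, each device requires only its local neighborhood-size estimate, which mirrors the paper's emphasis on a practical neighborhood inference mechanism.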

  7. Band gaps and localization of surface water waves over large-scale sand waves with random fluctuations

    Science.gov (United States)

    Zhang, Yu; Li, Yan; Shao, Hao; Zhong, Yaozhao; Zhang, Sai; Zhao, Zongxi

    2012-06-01

    Band structure and wave localization are investigated for sea surface water waves over large-scale sand wave topography. Sand wave height, sand wave width, water depth, and water width between adjacent sand waves have significant impact on band gaps. Random fluctuations of sand wave height, sand wave width, and water depth induce water wave localization. However, random water width produces a perfect transmission tunnel of water waves at a certain frequency so that localization does not occur no matter how large a disorder level is applied. Together with theoretical results, the field experimental observations in the Taiwan Bank suggest band gap and wave localization as the physical mechanism of sea surface water wave propagating over natural large-scale sand waves.

  8. Decentralised stabilising controllers for a class of large-scale linear ...

    Indian Academy of Sciences (India)

    subsystems resulting from a new aggregation-decomposition technique. The method has been illustrated through a numerical example of a large-scale linear system consisting of three subsystems each of the fourth order. Keywords. Decentralised stabilisation; large-scale linear systems; optimal feedback control; algebraic ...

  9. "Large"- vs small-scale friction control in turbulent channel flow

    Science.gov (United States)

    Canton, Jacopo; Örlü, Ramis; Chin, Cheng; Schlatter, Philipp

    2017-11-01

    We reconsider the "large-scale" control scheme proposed by Hussain and co-workers (Phys. Fluids 10, 1049-1051, 1998 and Phys. Rev. Fluids 2, 062601, 2017), using new direct numerical simulations (DNS). The DNS are performed in a turbulent channel at friction Reynolds numbers Reτ of up to 550 in order to eliminate low-Reynolds-number effects. The purpose of the present contribution is to re-assess this control method in the light of more modern developments in the field, in particular related to the discovery of (very) large-scale motions. The goals of the paper are as follows: First, we want to better characterise the physics of the control, and assess which external contributions (vortices, forcing, wall motion) are actually needed. Then, we investigate the optimal parameters and, finally, determine which aspects of this control technique actually scale in outer units and can therefore be of use in practical applications. In addition to discussing the mentioned drag-reduction effects, the present contribution will also address the potential effect of the naturally occurring large-scale motions on frictional drag, and give indications on the physical processes for potential drag reduction possible at all Reynolds numbers.

  10. Large-scale building energy efficiency retrofit: Concept, model and control

    International Nuclear Information System (INIS)

    Wu, Zhou; Wang, Bo; Xia, Xiaohua

    2016-01-01

    BEER (Building energy efficiency retrofit) projects are initiated in many nations and regions over the world. Existing studies of BEER focus on modeling and planning based on one building and one year period of retrofitting, which cannot be applied to certain large BEER projects with multiple buildings and multi-year retrofit. In this paper, the large-scale BEER problem is defined in a general TBT (time-building-technology) framework, which fits essential requirements of real-world projects. The large-scale BEER is newly studied in the control approach rather than the optimization approach commonly used before. Optimal control is proposed to design optimal retrofitting strategy in terms of maximal energy savings and maximal NPV (net present value). The designed strategy is dynamically changing on dimensions of time, building and technology. The TBT framework and the optimal control approach are verified in a large BEER project, and results indicate that promising performance of energy and cost savings can be achieved in the general TBT framework. - Highlights: • Energy efficiency retrofit of many buildings is studied. • A TBT (time-building-technology) framework is proposed. • The control system of the large-scale BEER is modeled. • The optimal retrofitting strategy is obtained.

  11. Study design of a cluster-randomized controlled trial to evaluate a large-scale distribution of cook stoves and water filters in Western Province, Rwanda.

    Science.gov (United States)

    Nagel, Corey L; Kirby, Miles A; Zambrano, Laura D; Rosa, Ghislane; Barstow, Christina K; Thomas, Evan A; Clasen, Thomas F

    2016-12-15

    In Rwanda, pneumonia and diarrhea are the first and second leading causes of death, respectively, among children under five. Household air pollution (HAP) resulting from cooking indoors with biomass fuels on traditional stoves is a significant risk factor for pneumonia, while consumption of contaminated drinking water is a primary cause of diarrheal disease. To date, there have been no large-scale effectiveness trials of programmatic efforts to provide either improved cookstoves or household water filters at scale in a low-income country. In this paper we describe the design of a cluster-randomized trial to evaluate the impact of a national-level program to distribute and promote the use of improved cookstoves and advanced water filters to the poorest quarter of households in Rwanda. We randomly allocated 72 sectors (administratively defined units) in Western Province to the intervention, with the remaining 24 sectors in the province serving as controls. In the intervention sectors, roughly 100,000 households received improved cookstoves and household water filters through a government-sponsored program targeting the poorest quarter of households nationally. The primary outcome measures are the incidence of acute respiratory infection (ARI) and diarrhea among children under five years of age. Over a one-year surveillance period, all cases of acute respiratory infection (ARI) and diarrhea identified by health workers in the study area will be extracted from records maintained at health facilities and by community health workers (CHW). In addition, we are conducting intensive, longitudinal data collection among a random sample of households in the study area for in-depth assessment of coverage, use, environmental exposures, and additional health measures. 
Although previous research has examined the impact of providing household water treatment and improved cookstoves on child health, there have been no studies of national-level programs to deliver these interventions.

  12. Automatic initialization and quality control of large-scale cardiac MRI segmentations.

    Science.gov (United States)

    Albà, Xènia; Lekadir, Karim; Pereañez, Marco; Medrano-Gracia, Pau; Young, Alistair A; Frangi, Alejandro F

    2018-01-01

    Continuous advances in imaging technologies enable ever more comprehensive phenotyping of human anatomy and physiology. Concomitant reduction of imaging costs has resulted in widespread use of imaging in large clinical trials and population imaging studies. Magnetic Resonance Imaging (MRI), in particular, offers one-stop-shop multidimensional biomarkers of cardiovascular physiology and pathology. A wide range of analysis methods offer sophisticated cardiac image assessment and quantification for clinical and research studies. However, most methods have only been evaluated on relatively small databases often not accessible for open and fair benchmarking. Consequently, published performance indices are not directly comparable across studies and their translation and scalability to large clinical trials or population imaging cohorts is uncertain. Most existing techniques still rely on considerable manual intervention for the initialization and quality control of the segmentation process, becoming prohibitive when dealing with thousands of images. The contributions of this paper are three-fold. First, we propose a fully automatic method for initializing cardiac MRI segmentation, by using image features and random forests regression to predict an initial position of the heart and key anatomical landmarks in an MRI volume. In processing a full imaging database, the technique predicts the optimal corrective displacements and positions in relation to the initial rough intersections of the long and short axis images. Second, we introduce for the first time a quality control measure capable of identifying incorrect cardiac segmentations with no visual assessment. The method uses statistical, pattern and fractal descriptors in a random forest classifier to detect failures to be corrected or removed from subsequent statistical analysis. Finally, we validate these new techniques within a full pipeline for cardiac segmentation applicable to large-scale cardiac MRI databases. 

  13. Tradeoffs between quality-of-control and quality-of-service in large-scale nonlinear networked control systems

    NARCIS (Netherlands)

    Borgers, D. P.; Geiselhart, R.; Heemels, W. P. M. H.

    2017-01-01

    In this paper we study input-to-state stability (ISS) of large-scale networked control systems (NCSs) in which sensors, controllers and actuators are connected via multiple (local) communication networks which operate asynchronously and independently of each other. We model the large-scale NCS as an

  14. Distributed and hierarchical control techniques for large-scale power plant systems

    International Nuclear Information System (INIS)

    Raju, G.V.S.; Kisner, R.A.

    1985-08-01

    In large-scale systems, integrated and coordinated control functions are required to maximize plant availability, to allow maneuverability through various power levels, and to meet externally imposed regulatory limitations. Nuclear power plants are large-scale systems. Prime subsystems are those that contribute directly to the behavior of the plant's ultimate output. The prime subsystems in a nuclear power plant include reactor, primary and intermediate heat transport, steam generator, turbine generator, and feedwater system. This paper describes and discusses the continuous-variable control system developed to supervise prime plant subsystems for optimal control and coordination

  15. Large-Scale Cubic-Scaling Random Phase Approximation Correlation Energy Calculations Using a Gaussian Basis.

    Science.gov (United States)

    Wilhelm, Jan; Seewald, Patrick; Del Ben, Mauro; Hutter, Jürg

    2016-12-13

    We present an algorithm for computing the correlation energy in the random phase approximation (RPA) in a Gaussian basis requiring [Formula: see text] operations and [Formula: see text] memory. The method is based on the resolution of the identity (RI) with the overlap metric, a reformulation of RI-RPA in the Gaussian basis, imaginary time, and imaginary frequency integration techniques, and the use of sparse linear algebra. Additional memory reduction without extra computations can be achieved by an iterative scheme that overcomes the memory bottleneck of canonical RPA implementations. We report a massively parallel implementation that is the key for the application to large systems. Finally, cubic-scaling RPA is applied to a thousand water molecules using a correlation-consistent triple-ζ quality basis.
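
    The underlying RPA expression is commonly written as an imaginary-frequency integral, E_c = (1/2π) ∫₀^∞ dω Tr[ln(1 − χ⁰(iω)V) + χ⁰(iω)V]. The toy sketch below evaluates this quadrature for invented model matrices; it illustrates only the structure of the integral, not the paper's RI reformulation, sparsity exploitation, or Gaussian-basis machinery:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6  # size of a toy auxiliary basis (illustrative, not a real system)

# Invented model matrices: a negative-semidefinite static response chi0
# and a positive-definite Coulomb metric V.
C = rng.standard_normal((n, n))
chi0_static = -(C @ C.T)
B = rng.standard_normal((n, n))
V = B @ B.T + n * np.eye(n)

# Precompute V^(1/2) so the product chi0*V can be symmetrized.
w_V, U = np.linalg.eigh(V)
V_half = U @ np.diag(np.sqrt(w_V)) @ U.T

def integrand(w):
    """Tr[ln(1 - chi0(iw) V) + chi0(iw) V], with a simple Lorentzian
    frequency dependence chi0(iw) = chi0(0) / (1 + w**2) assumed."""
    chi0 = chi0_static / (1.0 + w * w)
    a = np.linalg.eigvalsh(V_half @ chi0 @ V_half)  # real, all <= 0
    return np.sum(np.log(1.0 - a) + a)

# E_c = (1/2pi) * integral over w of the trace, via the trapezoidal rule.
ws = np.linspace(0.0, 40.0, 2001)
vals = np.array([integrand(w) for w in ws])
e_c = np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(ws)) / (2.0 * np.pi)
print(f"toy RPA correlation energy: {e_c:.4f} (always <= 0)")
```

    Since ln(1 − a) + a ≤ 0 for the non-positive eigenvalues a of the symmetrized product, the toy correlation energy is guaranteed to be negative, as a correlation energy must be.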

  16. Stability and Control of Large-Scale Dynamical Systems A Vector Dissipative Systems Approach

    CERN Document Server

    Haddad, Wassim M

    2011-01-01

    Modern complex large-scale dynamical systems exist in virtually every aspect of science and engineering, and are associated with a wide variety of physical, technological, environmental, and social phenomena, including aerospace, power, communications, and network systems, to name just a few. This book develops a general stability analysis and control design framework for nonlinear large-scale interconnected dynamical systems, and presents the most complete treatment on vector Lyapunov function methods, vector dissipativity theory, and decentralized control architectures. Large-scale dynami

  17. Use acupuncture to treat functional constipation: study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Li Ying

    2012-07-01

    Abstract Background: Whether acupuncture is effective for patients with functional constipation is still unclear. Therefore, we report the protocol of a randomized controlled trial of using acupuncture to treat functional constipation. Design: A randomized, controlled, four-arm, large-scale trial is currently under way in China. Seven hundred participants are randomly assigned to three acupuncture treatment groups and a Mosapride Citrate control group in a 1:1:1:1 ratio. Participants in the acupuncture groups receive 16 sessions of acupuncture treatment and are followed up for a period of 9 weeks after randomization. The acupuncture groups are: (1) Back-Shu and Front-Mu acupoints of the Large Intestine meridians (Shu-Mu points group); (2) He-Sea and Lower He-Sea acupoints of the Large Intestine meridians (He points group); (3) combined use of Back-Shu, Front-Mu, He-Sea, and Lower He-Sea acupoints of the Large Intestine meridians (Shu-Mu-He points group). The control group receives Mosapride Citrate. The primary outcome is frequency of defecation per week at the fourth week after randomization. The secondary outcomes include the Bristol stool scale, the extent of difficulty during defecation, the MOS 36-item Short Form health survey (SF-36), the Self-Rating Anxiety Scale (SAS), and the Self-Rating Depression Scale (SDS). The first two secondary outcomes are measured 1 week before randomization and 2, 4, and 8 weeks after randomization. The other secondary outcomes are measured 1 week before randomization and 2 and 4 weeks after randomization, except that the SF-36 is measured at randomization and 4 weeks after randomization. Discussion: The results of this trial (which will be available in 2012) will confirm whether acupuncture is effective for treating functional constipation and whether traditional acupuncture theories play an important role in it. Trial registration: ClinicalTrials.gov NCT01411501

  18. Spatiotemporal property and predictability of large-scale human mobility

    Science.gov (United States)

    Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin

    2018-04-01

    Spatiotemporal characteristics of human mobility emerging from complexity on the individual scale have been extensively studied for their application potential in human behavior prediction, recommendation, and control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, including both website browsing and mobile tower visits. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of scale-free random walks known as Lévy flights. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and high predictability. Furthermore, a scale-free mobility model with two essential ingredients, i.e., preferential return and exploration, and a Gaussian distribution assumption on the exploration tendency parameter is proposed, which outperforms existing human mobility models under scenarios of large geographical scales.
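
    The preferential-return-plus-exploration mechanism can be sketched as a simple stochastic process (the parameters rho and gamma below are illustrative, not values fitted in the paper):

```python
import random

def simulate_mobility(steps, rho=0.6, gamma=0.2, rng=random.Random(7)):
    """Sketch of an exploration / preferential-return mobility model:
    with probability rho * S**(-gamma) the walker explores a new location
    (S = number of distinct locations visited so far); otherwise it
    returns to a known location chosen with probability proportional
    to its past visit count."""
    visits = {0: 1}          # location id -> visit count
    next_loc = 1
    for _ in range(steps):
        s = len(visits)
        if rng.random() < rho * s ** (-gamma):
            visits[next_loc] = 1              # explore a brand-new location
            next_loc += 1
        else:                                 # preferential return
            locs = list(visits)
            weights = [visits[loc] for loc in locs]
            choice = rng.choices(locs, weights=weights)[0]
            visits[choice] += 1
    return visits

visits = simulate_mobility(5000)
print(f"distinct locations: {len(visits)}; "
      f"most visited share: {max(visits.values()) / 5001:.2f}")
```

    Because returns are weighted by past visit counts, a few locations quickly dominate, reproducing the heavy-tailed visitation patterns typical of human mobility data.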

  19. Modeling and control of a large nuclear reactor. A three-time-scale approach

    Energy Technology Data Exchange (ETDEWEB)

    Shimjith, S.R. [Indian Institute of Technology Bombay, Mumbai (India); Bhabha Atomic Research Centre, Mumbai (India); Tiwari, A.P. [Bhabha Atomic Research Centre, Mumbai (India); Bandyopadhyay, B. [Indian Institute of Technology Bombay, Mumbai (India). IDP in Systems and Control Engineering

    2013-07-01

    This monograph presents recent research on the modeling and control of a large nuclear reactor using a three-time-scale approach, written by leading experts in the field. Control analysis and design of large nuclear reactors requires a suitable mathematical model representing the steady-state and dynamic behavior of the reactor with reasonable accuracy. This task is, however, quite challenging because of the several complex dynamic phenomena existing in a reactor. Quite often, the models developed are of prohibitively large order, non-linear, and of a complex structure not readily amenable to control studies. Moreover, the existence of simultaneously occurring dynamic variations at different speeds makes the mathematical model susceptible to numerical ill-conditioning, inhibiting direct application of standard control techniques. This monograph introduces a technique for the mathematical modeling of large nuclear reactors in the framework of multi-point kinetics, to obtain a comparatively smaller-order model in standard state-space form, thus overcoming these difficulties. It further brings in innovative methods of controller design for systems exhibiting the multi-time-scale property, with emphasis on three-time-scale systems.

  20. Local, distributed topology control for large-scale wireless ad-hoc networks

    NARCIS (Netherlands)

    Nieberg, T.; Hurink, Johann L.

    In this document, topology control of a large-scale, wireless network by a distributed algorithm that uses only locally available information is presented. Topology control algorithms adjust the transmission power of wireless nodes to create a desired topology. The algorithm, named local power

  1. The linac control system for the large-scale synchrotron radiation facility (SPring-8)

    Energy Technology Data Exchange (ETDEWEB)

    Sakaki, Hironao; Yoshikawa, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Itoh, Yuichi [Atomic Energy General Services Corporation, Tokai, Ibaraki (Japan); Terashima, Yasushi [Information Technology System Co., Ltd. (ITECS), Tokyo (Japan)

    2000-09-01

    The linac for the large-scale synchrotron radiation facility (SPring-8) has been in operation since August 1996. The linac has handled user requests without any major trouble. In this report, the control system development policy, its details, and the operation of the linac are presented. These experiences are also described so that they can be applied to the control system of the large-scale proton accelerators to be developed in the High Intensity Proton Accelerator Project. (author)

  2. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first mile candidate to accommodate the data tsunami to be generated by the IoT. However, IoT devices are required in the cellular paradigm to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.
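
    The random access bottleneck can be illustrated with a back-of-the-envelope collision model: if N devices contend and each picks one of K preambles at random, a device succeeds only when no other device picks the same preamble. The value K = 54 below (roughly the number of contention-based preambles in LTE) is an illustrative assumption:

```python
# Toy collision model for random access (illustrative, not the paper's
# measurement-based case study): K preambles, N contending devices;
# a device succeeds when no other device picks its preamble.
def success_probability(n_devices, n_preambles):
    return (1.0 - 1.0 / n_preambles) ** (n_devices - 1)

for n in (10, 100, 1000):
    p = success_probability(n, 54)  # assumed LTE-like contention pool
    print(f"{n:5d} devices -> per-device success probability {p:.3f}")
```

    Per-device success collapses once the device count grows well past the preamble pool, which is the scaling argument for revisiting random access procedures in large-scale IoT.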

  3. Random number generators for large-scale parallel Monte Carlo simulations on FPGA

    Science.gov (United States)

    Lin, Y.; Wang, F.; Liu, B.

    2018-05-01

    Through parallelization, field programmable gate arrays (FPGAs) can achieve unprecedented speeds in large-scale parallel Monte Carlo (LPMC) simulations. FPGAs present both new constraints and new opportunities for the implementation of random number generators (RNGs), which are key elements of any Monte Carlo (MC) simulation system. Using empirical and application-based tests, this study evaluates all four of the RNGs used in previous FPGA-based MC studies, together with newly proposed FPGA implementations of two well-known high-quality RNGs that are suitable for LPMC studies on FPGA. One of the newly proposed FPGA implementations, a parallel version of the additive lagged Fibonacci generator (Parallel ALFG), is found to be the best among the evaluated RNGs in fulfilling the needs of LPMC simulations on FPGA.
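
    The recurrence behind the ALFG is short enough to sketch in software; this scalar version shows the core update x[n] = (x[n-s] + x[n-l]) mod m. The lags, modulus, and seeding scheme below are illustrative choices, and real FPGA designs run many such streams in parallel with lags proven to give a maximal period:

```python
def alfg(lag_short=5, lag_long=17, m=2**32, seed=12345):
    """Additive lagged Fibonacci generator sketch:
    x[n] = (x[n - lag_short] + x[n - lag_long]) mod m."""
    # Seed the initial lag table with a simple LCG; at least one odd
    # entry is required for a maximal period, so force all entries odd.
    state, x = [], seed
    for _ in range(lag_long):
        x = (1103515245 * x + 12345) % m
        state.append(x | 1)
    while True:
        new = (state[-lag_short] + state[-lag_long]) % m
        state.append(new)
        state.pop(0)  # keep the lag table at lag_long entries
        yield new

gen = alfg()
sample = [next(gen) for _ in range(10000)]
mean = sum(sample) / len(sample) / 2**32
print(f"normalized sample mean: {mean:.3f} (expect about 0.5)")
```

    Hardware implementations favor ALFGs because the update needs only an adder and a small lag table, which maps naturally onto FPGA fabric.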

  4. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large-scale model and data base system is presented. Based on experience in operating and developing a large-scale computerized system, the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large-scale models and data bases.

  5. Model Predictive Control for Flexible Power Consumption of Large-Scale Refrigeration Systems

    DEFF Research Database (Denmark)

    Shafiei, Seyed Ehsan; Stoustrup, Jakob; Rasmussen, Henrik

    2014-01-01

    A model predictive control (MPC) scheme is introduced to directly control the electrical power consumption of large-scale refrigeration systems. Deviations from the consumption baseline correspond to the storing and delivering of thermal energy. By virtue of such correspondence...
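
    The baseline-deviation idea can be sketched as a toy receding-horizon problem: a first-order thermal storage state driven by power deviations, with a finite-horizon least-squares solve at each step. The model, horizon, and weights below are invented for illustration, not the paper's refrigeration model:

```python
import numpy as np

# Toy model: thermal storage T[k+1] = a*T[k] + b*u[k], where u is the
# electrical power deviation from baseline (invented dynamics).
a, b, horizon = 0.95, 0.1, 10
T_set, lam = 0.0, 0.5

def mpc_step(T0, p_ref):
    """Solve the unconstrained finite-horizon problem by stacking the
    dynamics into one least-squares system; return the first input."""
    # Prediction: T = F*T0 + G*u over the whole horizon.
    F = np.array([a ** (k + 1) for k in range(horizon)])
    G = np.zeros((horizon, horizon))
    for i in range(horizon):
        for j in range(i + 1):
            G[i, j] = a ** (i - j) * b
    # Cost: ||u - p_ref||^2 + lam * ||T - T_set||^2  (soft tracking)
    A = np.vstack([np.eye(horizon), np.sqrt(lam) * G])
    rhs = np.concatenate([p_ref, np.sqrt(lam) * (T_set - F * T0)])
    u, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return u[0]

u0 = mpc_step(T0=2.0, p_ref=np.ones(horizon))
print(f"first planned power deviation: {u0:.3f}")
```

    In a receding-horizon loop, only u0 is applied before the problem is re-solved at the next sample; this is what lets consumption track a power reference while the thermal state absorbs the mismatch.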

  6. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of a large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of a large-scale magnetic field. For this purpose, we perform a nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to energy transfers from the velocity field at the forcing scales.

  7. Can administrative referenda be an instrument of control over large-scale technical installations?

    International Nuclear Information System (INIS)

    Rossnagel, A.

    1986-01-01

    An administrative referendum offers the possibility of direct participation of the citizens in decisions concerning large-scale technical installations. The article investigates the legal status of such a referendum on the basis of constitutional and democratic principles. The conclusion drawn is that any attempt to realize more direct democracy in a concrete field of jurisdiction of the state will meet with very large difficulties. On the other hand, the author clearly argues that more direct democratic control over the establishment of large-scale technology is sensible in terms of politics and principles of democracy, and possible within the constitutional system. Developments towards more direct democracy would mean an enhancement of representative democracy and would be adequate vis-à-vis the problems posed by large-scale technology.

  8. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  9. Economic Model Predictive Control for Large-Scale and Distributed Energy Systems

    DEFF Research Database (Denmark)

    Standardi, Laura

    In this thesis, we consider control strategies for large and distributed energy systems that are important for the implementation of smart grid technologies. An electrical grid has to ensure reliability and avoid long-term interruptions in the power supply. Moreover, the share of Renewable Energy Sources (RESs) in the smart grids is increasing. These energy sources bring uncertainty to the production due to their fluctuations. Hence, smart grids need suitable control systems that are able to continuously balance power production and consumption. We apply the Economic Model Predictive Control (EMPC) strategy to optimise the economic performances of the energy systems and to balance the power production and consumption. In the case of large-scale energy systems, the electrical grid connects a high number of power units. Because of this, the related control problem involves a high number of variables...
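
The EMPC idea above can be illustrated with a toy economic dispatch: choose unit outputs over a short horizon that meet a demand forecast at minimum cost, subject to capacity limits. All units, costs, and the demand profile below are invented for illustration; the thesis's actual models are far richer.

```python
# Toy economic-dispatch MPC step: one linear program over the horizon.
import numpy as np
from scipy.optimize import linprog

T = 4                                            # horizon length (steps)
demand = np.array([50.0, 80.0, 120.0, 60.0])     # forecast load per step
cost = np.array([10.0, 30.0])                    # cost per unit energy of units 1, 2
cap = np.array([70.0, 100.0])                    # capacity of each unit

n_units = len(cost)
c = np.tile(cost, T)                             # objective: total generation cost

# Equality constraints: at every step, the unit outputs must sum to demand.
A_eq = np.zeros((T, T * n_units))
for t in range(T):
    A_eq[t, t * n_units:(t + 1) * n_units] = 1.0
b_eq = demand

bounds = [(0.0, cap[i % n_units]) for i in range(T * n_units)]
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
plan = res.x.reshape(T, n_units)                 # dispatch plan: rows = steps
```

In a receding-horizon implementation only the first row of `plan` would be applied before re-solving with updated forecasts.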

  10. Time delay effects on large-scale MR damper based semi-active control strategies

    International Nuclear Information System (INIS)

    Cha, Y-J; Agrawal, A K; Dyke, S J

    2013-01-01

    This paper presents a detailed investigation on the robustness of large-scale 200 kN MR damper based semi-active control strategies in the presence of time delays in the control system. Although the effects of time delay on stability and performance degradation of an actively controlled system have been investigated extensively by many researchers, degradation in the performance of semi-active systems due to time delay has yet to be investigated. Since semi-active systems are inherently stable, instability problems due to time delay are unlikely to arise. This paper investigates the effects of time delay on the performance of a building with a large-scale MR damper, using numerical simulations of near- and far-field earthquakes. The MR damper is considered to be controlled by four different semi-active control algorithms, namely (i) clipped-optimal control (COC), (ii) decentralized output feedback polynomial control (DOFPC), (iii) Lyapunov control, and (iv) simple-passive control (SPC). It is observed that all controllers except for the COC are significantly robust with respect to time delay. On the other hand, the clipped-optimal controller should be integrated with a compensator to improve the performance in the presence of time delay. (paper)
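
The clipped-optimal control (COC) law referenced above can be sketched as follows: a primary controller (e.g. LQG) produces an ideal active force, and the MR damper voltage is clipped to either zero or the maximum depending on whether raising the damper force would move it toward that ideal force. The voltage level and example forces here are illustrative, not the parameters of the paper's 200 kN device.

```python
V_MAX = 10.0  # maximum command voltage (illustrative)

def clipped_optimal_voltage(f_desired, f_measured, v_max=V_MAX):
    # Heaviside clipping: apply full voltage only when the measured damper
    # force is aligned with, and smaller in magnitude than, the desired force.
    return v_max if (f_desired - f_measured) * f_measured > 0.0 else 0.0

v = clipped_optimal_voltage(100.0, 60.0)   # damper force too small -> V_MAX
```

This on/off switching is what makes the semi-active system inherently stable, but also what makes COC sensitive to time delay, motivating the compensator discussed in the paper.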

  11. Output Control Technologies for a Large-scale PV System Considering Impacts on a Power Grid

    Science.gov (United States)

    Kuwayama, Akira

    The mega-solar demonstration project named “Verification of Grid Stabilization with Large-scale PV Power Generation systems” had been completed in March 2011 at Wakkanai, the northernmost city of Japan. The major objectives of this project were to evaluate adverse impacts of large-scale PV power generation systems connected to the power grid and develop output control technologies with integrated battery storage system. This paper describes the outline and results of this project. These results show the effectiveness of battery storage system and also proposed output control methods for a large-scale PV system to ensure stable operation of power grids. NEDO, New Energy and Industrial Technology Development Organization of Japan conducted this project and HEPCO, Hokkaido Electric Power Co., Inc managed the overall project.
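
One output-control method that integrated battery storage enables is ramp-rate limiting of the grid feed-in; the battery absorbs or supplies the difference between raw PV output and the smoothed target. The sketch below uses made-up numbers and does not reproduce the project's actual algorithms.

```python
import numpy as np

def ramp_limited_feed_in(pv_power, max_ramp):
    """Limit step-to-step changes of the grid feed-in to +/- max_ramp;
    the battery covers the difference (positive = charging)."""
    grid = np.empty_like(pv_power)
    grid[0] = pv_power[0]
    for k in range(1, len(pv_power)):
        step = np.clip(pv_power[k] - grid[k - 1], -max_ramp, max_ramp)
        grid[k] = grid[k - 1] + step
    battery = pv_power - grid          # power into the battery
    return grid, battery

pv = np.array([1.0, 1.2, 0.3, 0.9, 1.0])   # cloud transient at step 2
grid, batt = ramp_limited_feed_in(pv, max_ramp=0.2)
```

A real controller would additionally enforce the battery's state-of-charge and power limits.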

  12. Control Algorithms for Large-scale Single-axis Photovoltaic Trackers

    Directory of Open Access Journals (Sweden)

    Dorian Schneider

    2012-01-01

    Full Text Available The electrical yield of large-scale photovoltaic power plants can be greatly improved by employing solar trackers. While fixed-tilt superstructures are stationary and immobile, trackers move the PV-module plane in order to optimize its alignment to the sun. This paper introduces control algorithms for single-axis trackers (SAT, including a discussion for optimal alignment and backtracking. The results are used to simulate and compare the electrical yield of fixed-tilt and SAT systems. The proposed algorithms have been field tested, and are in operation in solar parks worldwide.
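
The backtracking idea discussed above can be sketched with the standard geometry for a horizontal single-axis tracker (the paper's own algorithms may differ): at low sun, the ideal rotation toward the sun is pulled back so that adjacent rows, with ground coverage ratio GCR = collector width / row pitch, never shade each other.

```python
import numpy as np

def backtracking_angle(theta_true_deg, gcr):
    """Reduce the true-tracking rotation angle to avoid row-to-row shading."""
    ct = np.cos(np.radians(theta_true_deg))
    # Correction is zero while cos(theta)/gcr >= 1 (rows cannot shade each other).
    correction = np.degrees(np.arccos(np.clip(ct / gcr, -1.0, 1.0)))
    return theta_true_deg - np.sign(theta_true_deg) * correction

# High sun: unchanged. Low sun: the rotation is flattened to avoid shading.
angles = backtracking_angle(np.array([20.0, 80.0]), gcr=0.4)
```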

  13. Model of large scale man-machine systems with an application to vessel traffic control

    NARCIS (Netherlands)

    Wewerinke, P.H.; van der Ent, W.I.; ten Hove, D.

    1989-01-01

    Mathematical models are discussed to deal with complex large-scale man-machine systems such as vessel (air, road) traffic and process control systems. Only interrelationships between subsystems are assumed. Each subsystem is controlled by a corresponding human operator (HO). Because of the

  14. Hierarchical optimal control of large-scale nonlinear chemical processes.

    Science.gov (United States)

    Ramezani, Mohammad Hossein; Sadati, Nasser

    2009-01-01

    In this paper, a new approach is presented for optimal control of large-scale chemical processes. In this approach, the chemical process is decomposed into smaller sub-systems at the first level, and a coordinator at the second level, for which a two-level hierarchical control strategy is designed. For this purpose, each sub-system in the first level can be solved separately, by using any conventional optimization algorithm. In the second level, the solutions obtained from the first level are coordinated using a new gradient-type strategy, which is updated by the error of the coordination vector. The proposed algorithm is used to solve the optimal control problem of a complex nonlinear chemical stirred tank reactor (CSTR), where its solution is also compared with the ones obtained using the centralized approach. The simulation results show the efficiency and the capability of the proposed hierarchical approach, in finding the optimal solution, over the centralized method.
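
The two-level structure above can be illustrated with a classical dual-decomposition sketch (illustrative, not the paper's exact scheme): quadratic subproblems are solved independently at the first level for a given coordination variable, and a gradient-type coordinator at the second level updates that variable using the error in the coupling constraint.

```python
def solve_subsystem(a, lam):
    """First level: closed-form argmin of a*x**2 - lam*x for one subsystem."""
    return lam / (2 * a)

def coordinate(a_list, demand, step=0.5, iters=200):
    """Second level: gradient-type update of the coordination variable lam,
    driven by the error in the coupling constraint sum(x) == demand."""
    lam = 0.0
    for _ in range(iters):
        x = [solve_subsystem(a, lam) for a in a_list]
        lam += step * (demand - sum(x))
    return x, lam

x, lam = coordinate([1.0, 2.0], demand=3.0)
# x -> approximately [2.0, 1.0]
```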

  15. Multimode Resource-Constrained Multiple Project Scheduling Problem under Fuzzy Random Environment and Its Application to a Large Scale Hydropower Construction Project

    Science.gov (United States)

    Xu, Jiuping

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method. PMID:24550708

  17. Modeling and Control of a Large Nuclear Reactor A Three-Time-Scale Approach

    CERN Document Server

    Shimjith, S R; Bandyopadhyay, B

    2013-01-01

    Control analysis and design of large nuclear reactors requires a suitable mathematical model representing the steady state and dynamic behavior of the reactor with reasonable accuracy. This task is, however, quite challenging because of several complex dynamic phenomena existing in a reactor. Quite often, the models developed would be of prohibitively large order, non-linear and of complex structure not readily amenable for control studies. Moreover, the existence of simultaneously occurring dynamic variations at different speeds makes the mathematical model susceptible to numerical ill-conditioning, inhibiting direct application of standard control techniques. This monograph introduces a technique for mathematical modeling of large nuclear reactors in the framework of multi-point kinetics, to obtain a comparatively smaller order model in standard state space form thus overcoming these difficulties. It further brings in innovative methods for controller design for systems exhibiting multi-time-scale property,...

  18. Versatile synchronized real-time MEG hardware controller for large-scale fast data acquisition

    Science.gov (United States)

    Sun, Limin; Han, Menglai; Pratt, Kevin; Paulson, Douglas; Dinh, Christoph; Esch, Lorenz; Okada, Yoshio; Hämäläinen, Matti

    2017-05-01

    Versatile controllers for accurate, fast, and real-time synchronized acquisition of large-scale data are useful in many areas of science, engineering, and technology. Here, we describe the development of a controller software based on a technique called queued state machine for controlling the data acquisition (DAQ) hardware, continuously acquiring a large amount of data synchronized across a large number of channels (>400) at a fast rate (up to 20 kHz/channel) in real time, and interfacing with applications for real-time data analysis and display of electrophysiological data. This DAQ controller was developed specifically for a 384-channel pediatric whole-head magnetoencephalography (MEG) system, but its architecture is useful for wide applications. This controller running in a LabVIEW environment interfaces with microprocessors in the MEG sensor electronics to control their real-time operation. It also interfaces with a real-time MEG analysis software via transmission control protocol/internet protocol, to control the synchronous acquisition and transfer of the data in real time from >400 channels to acquisition and analysis workstations. The successful implementation of this controller for an MEG system with a large number of channels demonstrates the feasibility of employing the present architecture in several other applications.
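
The queued state machine pattern the controller is built on can be sketched as follows (a toy model, not the LabVIEW implementation): states are processed from a FIFO queue, and each handler may enqueue follow-up states, so acquisition steps stay ordered without blocking.

```python
from collections import deque

def run_qsm(initial_states, handlers):
    """Process states in FIFO order; each handler returns follow-up states."""
    queue = deque(initial_states)
    trace = []
    while queue:
        state = queue.popleft()
        trace.append(state)
        queue.extend(handlers.get(state, lambda: [])())
    return trace

# Hypothetical acquisition cycle: configure, start, drain two buffers, stop.
handlers = {
    "configure": lambda: ["start"],
    "start": lambda: ["read", "read", "stop"],
    "read": lambda: [],
    "stop": lambda: [],
}
trace = run_qsm(["configure"], handlers)
# trace -> ['configure', 'start', 'read', 'read', 'stop']
```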

  19. Multidimensional quantum entanglement with large-scale integrated optics.

    Science.gov (United States)

    Wang, Jianwei; Paesani, Stefano; Ding, Yunhong; Santagati, Raffaele; Skrzypczyk, Paul; Salavrakos, Alexia; Tura, Jordi; Augusiak, Remigiusz; Mančinska, Laura; Bacco, Davide; Bonneau, Damien; Silverstone, Joshua W; Gong, Qihuang; Acín, Antonio; Rottwitt, Karsten; Oxenløwe, Leif K; O'Brien, Jeremy L; Laing, Anthony; Thompson, Mark G

    2018-04-20

    The ability to control multidimensional quantum systems is central to the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control, and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimensions up to 15 × 15 on a large-scale silicon photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality, and controllability of our multidimensional technology, and further exploit these abilities to demonstrate previously unexplored quantum applications, such as quantum randomness expansion and self-testing on multidimensional states. Our work provides an experimental platform for the development of multidimensional quantum technologies. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  20. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy and the political sphere. In this very position, large-scale research and policy consulting lack the institutional guarantees and rational background that are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods under consideration of profitability, nor can it hope for full recognition by the basis-oriented scientific community. Policy consulting has neither the political system's assigned competence to make decisions, nor can it be judged successfully by the critical standards of the established social sciences, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three points, the thesis that this is a new form of institutionalization of science: 1) external control, 2) the organization form, 3) the theoretical conception of large-scale research and policy consulting. (orig.) [de

  1. Intelligent control for large-scale variable speed variable pitch wind turbines

    Institute of Scientific and Technical Information of China (English)

    Xinfang ZHANG; Daping XU; Yibing LIU

    2004-01-01

    Large-scale wind turbine generator systems have strong nonlinear multivariable characteristics with many uncertain factors and disturbances. Automatic control is crucial for the efficiency and reliability of wind turbines. On the basis of a simplified and proper model of variable speed variable pitch wind turbines, the effective wind speed is estimated using an extended Kalman filter. The intelligent control schemes proposed in the paper include two loops which operate in synchronism with each other. At below-rated wind speed, the inner loop adopts adaptive fuzzy control based on variable universe for generator torque regulation to realize maximum wind energy capture. At above-rated wind speed, a controller based on least squares support vector machines is proposed to adjust the pitch angle and keep rated output power. The simulation shows the effectiveness of the intelligent control.
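
The two operating regions described above follow a standard pattern that can be sketched simply (textbook laws with made-up constants, not the paper's fuzzy/SVM controllers): below rated speed, generator torque proportional to the square of rotor speed tracks the optimal tip-speed ratio; above rated speed, torque is reduced to hold rated power while pitch regulates speed.

```python
import numpy as np

RHO, R = 1.225, 40.0                 # air density, rotor radius (illustrative)
CP_MAX, LAMBDA_OPT = 0.48, 8.0       # peak power coefficient, optimal tip-speed ratio
K_OPT = 0.5 * RHO * np.pi * R**5 * CP_MAX / LAMBDA_OPT**3

def torque_command(omega, rated_power, rated_omega):
    """Quadratic MPPT torque below rated speed, constant power above."""
    if omega < rated_omega:
        return K_OPT * omega**2          # maximum wind-energy capture region
    return rated_power / omega           # hold rated electrical power
```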

  2. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so large-scale hydrogen production plants will need to be installed. In this context, the development of low-cost large-scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and center of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared. Then, a state of the art of the electrolysis modules currently available was compiled. A review of the large-scale electrolysis plants that have been installed in the world was also carried out, and the main projects related to large-scale electrolysis were listed. The economics of large-scale electrolysers were discussed, and the influence of energy prices on the cost of hydrogen production by large-scale electrolysis was evaluated. (authors)

  3. Large-scale application of highly-diluted bacteria for Leptospirosis epidemic control.

    Science.gov (United States)

    Bracho, Gustavo; Varela, Enrique; Fernández, Rolando; Ordaz, Barbara; Marzoa, Natalia; Menéndez, Jorge; García, Luis; Gilling, Esperanza; Leyva, Richard; Rufín, Reynaldo; de la Torre, Rubén; Solis, Rosa L; Batista, Niurka; Borrero, Reinier; Campa, Concepción

    2010-07-01

    Leptospirosis is a zoonotic disease of major importance in the tropics, where the incidence peaks in rainy seasons. Natural disasters represent a big challenge to Leptospirosis prevention strategies, especially in endemic regions. Vaccination is an effective option but of reduced effectiveness in emergency situations. Homeoprophylactic interventions might help to control epidemics by using highly-diluted pathogens to induce protection on a short time scale. We report the results of a very large-scale homeoprophylaxis (HP) intervention against Leptospirosis in a dangerous epidemic situation in three provinces of Cuba in 2007. Forecast models were used to estimate possible trends of disease incidence. A homeoprophylactic formulation was prepared from dilutions of four circulating strains of Leptospirosis. This formulation was administered orally to 2.3 million persons at high risk in an epidemic in a region affected by natural disasters. The data from surveillance were used to measure the impact of the intervention by comparing with historical trends and non-intervention regions. After the homeoprophylactic intervention a significant decrease of the disease incidence was observed in the intervention regions. No such modifications were observed in non-intervention regions. In the intervention region the incidence of Leptospirosis fell below the historic median. This observation was independent of rainfall. The homeoprophylactic approach was associated with a large reduction of disease incidence and control of the epidemic. The results suggest the use of HP as a feasible tool for epidemic control; further research is warranted. 2010 Elsevier Ltd. All rights reserved.

  4. Parents' perceived vulnerability and perceived control in preventing Meningococcal C infection: a large-scale interview study about vaccination

    Directory of Open Access Journals (Sweden)

    van der Wal Gerrit

    2008-02-01

    Full Text Available Abstract Background Parents' reported ambivalence toward large-scale vaccination programs for childhood diseases may be related to their perception of the risks of side-effects or safety of vaccination and the risk of contracting the disease. The aim of this study is to evaluate parents' perceptions of their child's risk of contracting a Meningococcal C infection and parents' perceived control in preventing infection, in relation to their evaluation of the safety, effectiveness and usefulness of vaccination. Methods In a large-scale interview study, a random sample of parents was interviewed after their children had received vaccination against Meningococcal C in a catch-up campaign. Questions were asked about the perceived relative vulnerability of their child contracting an infection, perceived control in preventing an infection, and parents' evaluation of the safety, usefulness and effectiveness of vaccination. Results 61% (N = 1763) of the 2910 parents who were approached participated. A higher perceived relative vulnerability of their own child contracting the disease was related to a more positive evaluation of the vaccination campaign, while a lower perceived vulnerability did not result in a more negative evaluation. A higher perceived control in being able to prevent an infection was, however, related to a more critical attitude toward the safety, usefulness and effectiveness of vaccination. Conclusion Perceived relative vulnerability of contracting an infection and parents' perceived control in preventing an infection seem to influence parents' evaluation of the vaccination programme. Future studies should determine if, and under which circumstances, these perceptions also affect parents' vaccination behaviour and would be relevant to take into account when educating parents about vaccination.

  5. Complex Formation Control of Large-Scale Intelligent Autonomous Vehicles

    Directory of Open Access Journals (Sweden)

    Ming Lei

    2012-01-01

    Full Text Available A new formation framework for large-scale intelligent autonomous vehicles is developed, which can realize complex formations while reducing data exchange. Using the proposed hierarchical formation method and the automatic dividing algorithm, vehicles are automatically divided into leaders and followers by exchanging information via a wireless network at the initial time. Then, leaders form the formation geometric shape using global formation information, and followers track their own virtual leaders to form a line formation using local information. The formation control laws of leaders and followers are designed based on consensus algorithms. Moreover, collision-avoidance problems are considered and solved using artificial potential functions. Finally, a simulation example consisting of 25 vehicles shows the effectiveness of the theory.
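
The consensus idea underlying such formation laws can be sketched with first-order agents (illustrative, not the paper's exact dynamics): each vehicle steps toward the average of its neighbours' positions offset by the desired formation displacement, so the relative positions converge to the prescribed shape.

```python
import numpy as np

def consensus_formation_step(pos, offsets, adjacency, gain=0.2):
    """pos, offsets: (n, 2) arrays; adjacency: (n, n) 0/1 matrix."""
    err = pos - offsets                       # agreement variables
    update = np.zeros_like(pos)
    for i in range(len(pos)):
        for j in np.flatnonzero(adjacency[i]):
            update[i] += gain * (err[j] - err[i])
    return pos + update

# Three vehicles on a line graph converging to a triangle formation.
offsets = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
pos = np.array([[0.0, 0.0], [3.0, -1.0], [-2.0, 2.0]])
for _ in range(200):
    pos = consensus_formation_step(pos, offsets, adj)
```

After convergence the vehicles' relative positions match the offset differences, regardless of where the formation ends up in the plane.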

  6. Human visual system automatically represents large-scale sequential regularities.

    Science.gov (United States)

    Kimura, Motohiro; Widmann, Andreas; Schröger, Erich

    2010-03-04

    Our brain recordings reveal that large-scale sequential regularities defined across non-adjacent stimuli can be automatically represented in visual sensory memory. To show that, we adopted an auditory paradigm developed by Sussman, E., Ritter, W., and Vaughan, H. G. Jr. (1998). Predictability of stimulus deviance and the mismatch negativity. NeuroReport, 9, 4167-4170, Sussman, E., and Gumenyuk, V. (2005). Organization of sequential sounds in auditory memory. NeuroReport, 16, 1519-1523 to the visual domain by presenting task-irrelevant infrequent luminance-deviant stimuli (D, 20%) inserted among task-irrelevant frequent stimuli being of standard luminance (S, 80%) in randomized (randomized condition, SSSDSSSSSDSSSSD...) and fixed manners (fixed condition, SSSSDSSSSDSSSSD...). Comparing the visual mismatch negativity (visual MMN), an event-related brain potential (ERP) index of memory-mismatch processes in human visual sensory system, revealed that visual MMN elicited by deviant stimuli was reduced in the fixed compared to the randomized condition. Thus, the large-scale sequential regularity being present in the fixed condition (SSSSD) must have been represented in visual sensory memory. Interestingly, this effect did not occur in conditions with stimulus-onset asynchronies (SOAs) of 480 and 800 ms but was confined to the 160-ms SOA condition supporting the hypothesis that large-scale regularity extraction was based on perceptual grouping of the five successive stimuli defining the regularity. 2010 Elsevier B.V. All rights reserved.

  7. Hierarchical hybrid control of manipulators: Artificial intelligence in large scale integrated circuits

    Science.gov (United States)

    Greene, P. H.

    1972-01-01

    Both in practical engineering and in control of muscular systems, low level subsystems automatically provide crude approximations to the proper response. Through low level tuning of these approximations, the proper response variant can emerge from standardized high level commands. Such systems are expressly suited to emerging large scale integrated circuit technology. A computer, using symbolic descriptions of subsystem responses, can select and shape responses of low level digital or analog microcircuits. A mathematical theory that reveals significant informational units in this style of control and software for realizing such information structures are formulated.

  8. Signatures of non-universal large scales in conditional structure functions from various turbulent flows

    International Nuclear Information System (INIS)

    Blum, Daniel B; Voth, Greg A; Bewley, Gregory P; Bodenschatz, Eberhard; Gibert, Mathieu; Xu Haitao; Gylfason, Ármann; Mydlarski, Laurent; Yeung, P K

    2011-01-01

    We present a systematic comparison of conditional structure functions in nine turbulent flows. The flows studied include forced isotropic turbulence simulated on a periodic domain, passive grid wind tunnel turbulence in air and in pressurized SF6, active grid wind tunnel turbulence (in both synchronous and random driving modes), the flow between counter-rotating discs, oscillating grid turbulence and the flow in the Lagrangian exploration module (in both constant and random driving modes). We compare longitudinal Eulerian second-order structure functions conditioned on the instantaneous large-scale velocity in each flow to assess the ways in which the large scales affect the small scales in a variety of turbulent flows. Structure functions are shown to have larger values when the large-scale velocity significantly deviates from the mean in most flows, suggesting that dependence on the large scales is typical in many turbulent flows. The effects of the large-scale velocity on the structure functions can be quite strong, with the structure function varying by up to a factor of 2 when the large-scale velocity deviates from the mean by ±2 standard deviations. In several flows, the effects of the large-scale velocity are similar at all the length scales we measured, indicating that the large-scale effects are scale independent. In a few flows, the effects of the large-scale velocity are larger on the smallest length scales. (paper)
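
The quantity being compared can be sketched for a 1-D velocity signal: the conditional second-order structure function averages squared longitudinal increments only over samples whose instantaneous large-scale velocity falls in a chosen bin. The large-scale proxy below (the mean of the two endpoint values) is a simplification of the estimators used in such studies.

```python
import numpy as np

def conditional_structure_function(u, r, lo, hi):
    """Second-order structure function of 1-D signal u at lag r,
    conditioned on the local large-scale velocity lying in [lo, hi)."""
    du = u[r:] - u[:-r]                       # longitudinal increments
    u_large = 0.5 * (u[r:] + u[:-r])          # crude large-scale proxy
    mask = (u_large >= lo) & (u_large < hi)
    return np.mean(du[mask] ** 2)

# Synthetic check: for white noise the increments are independent of the
# large-scale proxy, so the conditional value stays near the unconditional 2.
rng = np.random.default_rng(3)
u = rng.standard_normal(20000)
d_cond = conditional_structure_function(u, r=5, lo=-0.5, hi=0.5)
```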

  9. Modified truncated randomized singular value decomposition (MTRSVD) algorithms for large scale discrete ill-posed problems with general-form regularization

    Science.gov (United States)

    Jia, Zhongxiao; Yang, Yanfei

    2018-05-01

    In this paper, we propose new randomization based algorithms for large scale linear discrete ill-posed problems with general-form regularization: minimize ‖Lx‖ subject to a constraint on the residual norm ‖Ax − b‖, where L is a regularization matrix. Our algorithms are inspired by the modified truncated singular value decomposition (MTSVD) method, which suits only for small to medium scale problems, and randomized SVD (RSVD) algorithms that generate good low rank approximations to A. We use rank-k truncated randomized SVD (TRSVD) approximations to A by truncating the rank-(k + q) RSVD approximations to A, where q is an oversampling parameter. The resulting algorithms are called modified TRSVD (MTRSVD) methods. At every step, we use the LSQR algorithm to solve the resulting inner least squares problem, which is proved to become better conditioned as k increases so that LSQR converges faster. We present sharp bounds for the approximation accuracy of the RSVDs and TRSVDs for severely, moderately and mildly ill-posed problems, and substantially improve a known basic bound for TRSVD approximations. We prove how to choose the stopping tolerance for LSQR in order to guarantee that the computed and exact best regularized solutions have the same accuracy. Numerical experiments illustrate that the best regularized solutions by MTRSVD are as accurate as the ones by the truncated generalized singular value decomposition (TGSVD) algorithm, and at least as accurate as those by some existing truncated randomized generalized singular value decomposition (TRGSVD) algorithms. This work was supported in part by the National Science Foundation of China (Nos. 11771249 and 11371219).
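
The TRSVD building block described above can be sketched in a few lines: sketch A with a Gaussian test matrix using q extra oversampling columns, orthonormalize, project, and truncate the resulting small SVD to rank k. This is the generic randomized-SVD recipe, not the full MTRSVD method.

```python
import numpy as np

def trsvd(A, k, q=5, seed=0):
    """Rank-k truncated SVD of A obtained by truncating a rank-(k+q)
    randomized SVD approximation (q is the oversampling parameter)."""
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((A.shape[1], k + q))   # Gaussian test matrix
    Q, _ = np.linalg.qr(A @ Omega)         # orthonormal basis for the range
    U_small, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return (Q @ U_small)[:, :k], s[:k], Vt[:k, :]      # truncate to rank k

# Example: a 60 x 60 matrix with geometrically decaying singular values.
rng = np.random.default_rng(1)
P = np.linalg.qr(rng.standard_normal((60, 60)))[0]
W = np.linalg.qr(rng.standard_normal((60, 60)))[0]
A = P @ np.diag(0.5 ** np.arange(60)) @ W
U, s, Vt = trsvd(A, k=10)
```

For rapidly decaying spectra the truncation error is close to the (k+1)-st singular value of A, which is what makes TRSVD useful inside regularization methods.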

  10. Large-Scale Preventive Chemotherapy for the Control of Helminth Infection in Western Pacific Countries: Six Years Later

    Science.gov (United States)

    Montresor, Antonio; Cong, Dai Tran; Sinuon, Mouth; Tsuyuoka, Reiko; Chanthavisouk, Chitsavang; Strandgaard, Hanne; Velayudhan, Raman; Capuano, Corinne M.; Le Anh, Tuan; Tee Dató, Ah S.

    2008-01-01

    In 2001, Urbani and Palmer published a review of the epidemiological situation of helminthiases in the countries of the Western Pacific Region of the World Health Organization indicating the control needs in the region. Six years after this inspiring article, large-scale preventive chemotherapy for the control of helminthiasis has scaled up dramatically in the region. This paper analyzes the most recent published and unpublished country information on large-scale preventive chemotherapy and summarizes the progress made since 2000. Almost 39 million treatments were provided in 2006 in the region for the control of helminthiasis: nearly 14 million for the control of lymphatic filariasis, more than 22 million for the control of soil-transmitted helminthiasis, and over 2 million for the control of schistosomiasis. In general, control of these helminthiases is progressing well in the Mekong countries and Pacific Islands. In China, despite harboring the majority of the helminth infections of the region, the control activities have not reached the level of coverage of countries with much more limited financial resources. The control of food-borne trematodes is still limited, but pilot activities have been initiated in China, Lao People's Democratic Republic, and Vietnam. PMID:18846234

  11. Implementing Large-Scale Instructional Technology in Kenya: Changing Instructional Practice and Developing Accountability in a National Education System

    Science.gov (United States)

    Piper, Benjamin; Oyanga, Arbogast; Mejia, Jessica; Pouezevara, Sarah

    2017-01-01

    Previous large-scale education technology interventions have shown only modest impacts on student achievement. Building on results from an earlier randomized controlled trial of three different applications of information and communication technologies (ICTs) on primary education in Kenya, the Tusome Early Grade Reading Activity developed the…

  12. Evaluation of the effect of Spiritual care on patients with generalized anxiety and depression: a randomized controlled study.

    Science.gov (United States)

    Sankhe, A; Dalal, K; Save, D; Sarve, P

    2017-12-01

    The present study was conducted to assess the effect of spiritual care on patients with depression, anxiety or both in a randomized controlled design. The participants were randomized either to receive spiritual care or not, and the Hamilton anxiety rating scale (HAM-A), Hamilton depression rating scale (HAM-D), WHO quality of life-Brief (WHOQOL-BREF) and Functional assessment of chronic illness therapy - Spiritual well-being (FACIT-Sp) were assessed before therapy and at two follow-ups at 3 and 6 weeks. In the spiritual care therapy group, statistically significant differences were observed in both the HAM-A and HAM-D scales between baseline and visit 2, whereas no statistically significant differences were observed in these scales during the follow-up periods for the control group of participants. When the scores were compared between the study groups, HAM-A, HAM-D and FACIT-Sp 12 scores were significantly lower in the interventional group as compared to the control group at both the third and sixth weeks. This suggests a greater improvement in symptoms of anxiety and depression in the spiritual care therapy group than in the control group; however, large randomized controlled trials with robust design are needed to confirm the same.

  13. Use of ABB ADVANT Power for large scale instrumentation and controls replacements in nuclear power plants

    International Nuclear Information System (INIS)

    Pucak, J.L.; Brown, E.M.

    1999-01-01

    One of the major issues facing plants planning for life extension is the viability and feasibility of modernization of a plant's existing I and C systems including the safety systems and the control room. This paper discusses the ABB approach to the implementation of large scale Instrumentation and Controls (I and C) modernization. ABB applies a segmented architecture approach using the ADVANT Power control system to meet the numerous constraints of a major I and C upgrade program. The segmented architecture and how it supports implementation of a complete I and C upgrade either in one outage or in a series of outages is presented. ADVANT Power contains standardized industrial control equipment that is designed to support 1E applications as well as turbine and non-1E process control. This equipment forms the basis for the architecture proposed for future new nuclear plant sales as well as large scale retrofits. (author)

  14. Large Scale Investments in Infrastructure : Competing Policy regimes to Control Connections

    NARCIS (Netherlands)

    Otsuki, K.; Read, M.L.; Zoomers, E.B.

    2016-01-01

    This paper proposes to analyse implications of large-scale investments in physical infrastructure for social and environmental justice. While case studies on the global land rush and climate change have advanced our understanding of how large-scale investments in land, forests and water affect

  15. Bi-Level Decentralized Active Power Control for Large-Scale Wind Farm Cluster

    DEFF Research Database (Denmark)

    Huang, Shengli; Wu, Qiuwei; Guo, Yifei

    2018-01-01

    This paper presents a bi-level decentralized active power control (DAPC) for a large-scale wind farm cluster, consisting of several wind farms, for better active power dispatch. In the upper level, a distributed active power control scheme based on distributed consensus is designed to achieve...... fair active power sharing among multiple wind farms, which generates the power reference for each wind farm. A distributed estimator is used to estimate the total available power of all wind farms. In the lower level, a centralized control scheme based on Model Predictive Control (MPC) is proposed...... to regulate the active power outputs of all wind turbines (WTs) within a wind farm, which reduces the fatigue loads of WTs while tracking the power reference obtained from the upper-level control. A wind farm cluster with 8 wind farms and 160 WTs in total was used to test the control performance of the proposed...

  16. Solution Coating of Superior Large-Area Flexible Perovskite Thin Films with Controlled Crystal Packing

    KAUST Repository

    Li, Jianbo

    2017-05-08

    Solution coating of organohalide lead perovskites offers great potential for achieving low-cost manufacturing of large-area flexible optoelectronics. However, the rapid coating speed needed for industrial-scale production poses challenges to the control of crystal packing. Herein, this study reports using solution shearing to confine crystal nucleation and growth in large-area printed MAPbI3 thin films. Near single-crystalline perovskite microarrays are demonstrated with a high degree of controlled macroscopic alignment and crystal orientation, which exhibit significant improvements in optical and optoelectronic properties compared with their random counterparts, spherulitic and nanograined films. In particular, photodetectors based on the confined films show intense anisotropy in charge transport, and the devices exhibit significantly improved performance in all aspects, by one or more orders of magnitude, relative to their random counterparts. It is anticipated that perovskite films with controlled crystal packing may find applications in high-performance, large-area printed optoelectronics and solar cells.

  18. Dynamic model of frequency control in Danish power system with large scale integration of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Hansen, Anca Daniela; Sørensen, Poul Ejnar

    2013-01-01

    This work evaluates the impact of large scale integration of wind power in future power systems when 50% of load demand can be met from wind power. The focus is on active power balance control, where the main source of power imbalance is an inaccurate wind speed forecast. In this study, a Danish...... power system model with large scale of wind power is developed and a case study for an inaccurate wind power forecast is investigated. The goal of this work is to develop an adequate power system model that depicts relevant dynamic features of the power plants and compensates for load generation...... imbalances, caused by inaccurate wind speed forecast, by an appropriate control of the active power production from power plants....

  19. A novel artificial fish swarm algorithm for solving large-scale reliability-redundancy application problem.

    Science.gov (United States)

    He, Qiang; Hu, Xiangtao; Ren, Hong; Zhang, Hongqi

    2015-11-01

    A novel artificial fish swarm algorithm (NAFSA) is proposed for solving the large-scale reliability-redundancy allocation problem (RAP). In NAFSA, the social behaviors of the fish swarm are classified in three ways: foraging behavior, reproductive behavior, and random behavior. The foraging behavior uses two position-updating strategies, and the selection and crossover operators define the reproductive ability of an artificial fish. For the random behavior, which is essentially a mutation strategy, the basic cloud generator is used as the mutation operator. Finally, numerical results for four benchmark problems and a large-scale RAP are reported and compared. NAFSA shows good performance in terms of computational accuracy and computational efficiency for large-scale RAP. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
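
    The "basic cloud generator" mutation mentioned in the abstract can be sketched as a forward normal cloud generator. This is a minimal illustration under stated assumptions, not the paper's implementation; the function name and parameter values are hypothetical:

```python
import random

def cloud_mutation(x, En, He, rng=random):
    """Forward normal cloud generator used as a mutation operator:
    draw a dispersion En' from N(En, He^2), then the mutated value
    from N(x, En'^2).  He > 0 makes the mutation's spread itself
    random, giving heavier-tailed exploration than a plain Gaussian."""
    En_prime = rng.gauss(En, He)
    return rng.gauss(x, abs(En_prime))

# Mutate a position repeatedly; samples scatter around x = 5.0.
rng = random.Random(42)
samples = [cloud_mutation(5.0, En=0.5, He=0.05, rng=rng) for _ in range(1000)]
```

    In a fish-swarm loop, a call like this would perturb a fish's position during the random-behavior step.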

  20. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  1. Global asymptotic stabilization of large-scale hydraulic networks using positive proportional controls

    DEFF Research Database (Denmark)

    Jensen, Tom Nørgaard; Wisniewski, Rafal

    2014-01-01

    An industrial case study involving a large-scale hydraulic network underlying a district heating system subject to structural changes is considered. The problem of controlling the pressure drop across the so-called end-user valves in the network to a designated vector of reference values under...... directional actuator constraints is addressed. The proposed solution consists of a set of decentralized, positively constrained proportional control actions. The results show that the closed-loop system always has a globally asymptotically stable equilibrium point independently of the number of end......-users. Furthermore, by a proper design of controller gains, the closed-loop equilibrium point can be designed to belong to an arbitrarily small neighborhood of the desired equilibrium point. Since there exists a globally asymptotically stable equilibrium point independently of the number of end-users in the system...

  2. Some Statistics for Measuring Large-Scale Structure

    OpenAIRE

    Brandenberger, Robert H.; Kaplan, David M.; Ramsey, Stephen A.

    1993-01-01

    Good statistics for measuring large-scale structure in the Universe must be able to distinguish between different models of structure formation. In this paper, two- and three-dimensional "counts in cells" statistics and a new "discrete genus statistic" are applied to toy versions of several popular theories of structure formation: the random phase cold dark matter model, cosmic string models, and the global texture scenario. All three statistics appear quite promising in terms of differentiating betw...

  3. WAMS Based Intelligent Operation and Control of Modern Power System with large Scale Renewable Energy Penetration

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain

    security limits. Under such a scenario, progressive displacement of conventional generation by wind generation is expected to eventually lead to a complex power system with the least presence of central power plants. Consequently, the support from conventional power plants is expected to reach its all-time low...... system voltage control responsibility from conventional power plants to wind turbines. With increased wind penetration and displaced conventional central power plants, dynamic voltage security has been identified as one of the challenging issues for large-scale wind integration. To address the dynamic...... security issue, a WAMS-based systematic voltage control scheme for a large-scale wind-integrated power system has been proposed. Along with optimal reactive power compensation, the proposed scheme considers voltage support from wind farms (equipped with voltage support functionality) and refurbished...

  4. Random sampling of elementary flux modes in large-scale metabolic networks.

    Science.gov (United States)

    Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel

    2012-09-15

    The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample, that is representative of the complete set of EMs, should be suitable to most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. dmachado@deb.uminho.pt Supplementary data are available at Bioinformatics online.
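
    The filtering step of the sampling algorithm described above can be sketched as follows. This is a schematic, not the authors' code: the pairwise `combine` callback stands in for the actual elementary-mode combination rule of the canonical basis approach.

```python
import random

def filtered_combine(modes, combine, max_new, rng=random):
    """One iteration of the sampling variant: generate candidate
    combinations of the current modes, then keep only a uniform random
    subset so the working set cannot grow exponentially.  random.sample
    gives every candidate the same probability of being selected."""
    candidates = [combine(a, b)
                  for i, a in enumerate(modes)
                  for b in modes[i + 1:]]
    if len(candidates) <= max_new:
        return candidates
    return rng.sample(candidates, max_new)

# Toy example with integer "modes" and addition as the combiner.
out = filtered_combine([1, 2, 3, 4], lambda a, b: a + b, max_new=3,
                       rng=random.Random(0))
```

    Capping each iteration at `max_new` survivors is what keeps the sample size bounded while preserving unbiased selection among the candidates generated.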

  5. Event-triggered decentralized robust model predictive control for constrained large-scale interconnected systems

    Directory of Open Access Journals (Sweden)

    Ling Lu

    2016-12-01

    This paper considers the problem of event-triggered decentralized model predictive control (MPC) for constrained large-scale linear systems subject to additive bounded disturbances. The constraint tightening method is utilized to formulate the MPC optimization problem. The local predictive control law for each subsystem is determined aperiodically by a relevant triggering rule, which allows a considerable reduction of the computational load. The robust feasibility and closed-loop stability are then proved, and it is shown that every subsystem state will be driven into a robust invariant set. Finally, the effectiveness of the proposed approach is illustrated via numerical simulations.

  6. Literature Review: Herbal Medicine Treatment after Large-Scale Disasters.

    Science.gov (United States)

    Takayama, Shin; Kaneko, Soichiro; Numata, Takehiro; Kamiya, Tetsuharu; Arita, Ryutaro; Saito, Natsumi; Kikuchi, Akiko; Ohsawa, Minoru; Kohayagawa, Yoshitaka; Ishii, Tadashi

    2017-01-01

    Large-scale natural disasters, such as earthquakes, tsunamis, volcanic eruptions, and typhoons, occur worldwide. After the Great East Japan earthquake and tsunami, our medical support operation's experiences suggested that traditional medicine might be useful for treating the various symptoms of the survivors. However, little information is available regarding herbal medicine treatment in such situations. Considering that further disasters will occur, we performed a literature review and summarized the traditional medicine approaches for treatment after large-scale disasters. We searched PubMed and Cochrane Library for articles written in English, and Ichushi for those written in Japanese. Articles published before 31 March 2016 were included. Keywords "disaster" and "herbal medicine" were used in our search. Among studies involving herbal medicine after a disaster, we found two randomized controlled trials investigating post-traumatic stress disorder (PTSD), three retrospective investigations of trauma or common diseases, and seven case series or case reports of dizziness, pain, and psychosomatic symptoms. In conclusion, herbal medicine has been used to treat trauma, PTSD, and other symptoms after disasters. However, few articles have been published, likely due to the difficulty in designing high quality studies in such situations. Further study will be needed to clarify the usefulness of herbal medicine after disasters.

  7. Dynamic Control of Facts Devices to Enable Large Scale Penetration of Renewable Energy Resources

    Science.gov (United States)

    Chavan, Govind Sahadeo

    This thesis focuses on some of the problems caused by large-scale penetration of Renewable Energy Resources within EHV transmission networks, and investigates approaches to resolving these problems. In chapter 4, a reduced-order model of the 500 kV WECC transmission system is developed by estimating its key parameters from phasor measurement unit (PMU) data. The model was then implemented in RTDS and investigated for its accuracy with respect to the PMU data. Finally, it was tested for observing the effects of various contingencies, such as transmission line loss, generation loss, and large-scale penetration of wind farms, on EHV transmission systems. Chapter 5 introduces Static Series Synchronous Compensators (SSSCs), series-connected converters that can control real power flow along a transmission line. A new application of SSSCs in mitigating the Ferranti effect on unloaded transmission lines was demonstrated in PSCAD. A new control scheme for SSSCs based on the Cascaded H-bridge (CHB) converter configuration was proposed and demonstrated using PSCAD and RTDS. A new centralized controller was developed for the distributed SSSCs based on some of the concepts used in the CHB-based SSSC, and its efficacy was demonstrated using RTDS. Finally, chapter 6 introduces the problem of power oscillations induced by renewable sources in a transmission network. A power oscillation damping (POD) controller is designed using distributed SSSCs in NYPA's 345 kV three-bus AC system and its efficacy is demonstrated in PSCAD. A similar POD controller is then designed for the CHB-based SSSC in the IEEE 14-bus system in PSCAD. Both controllers significantly damped power oscillations in the transmission networks.

  8. Sampling strategy for a large scale indoor radiation survey - a pilot project

    International Nuclear Information System (INIS)

    Strand, T.; Stranden, E.

    1986-01-01

    Optimisation of a stratified random sampling strategy for large scale indoor radiation surveys is discussed. It is based on the results from a small scale pilot project where variances in dose rates within different categories of houses were assessed. By selecting a predetermined precision level for the mean dose rate in a given region, the number of measurements needed can be optimised. The results of a pilot project in Norway are presented together with the development of the final sampling strategy for a planned large scale survey. (author)
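
    The optimisation described above can be illustrated with a Neyman-type allocation: given per-stratum dose-rate standard deviations from the pilot, choose the smallest total sample that achieves a target standard error of the overall mean. This is a sketch under stated assumptions (large strata, finite-population correction ignored); the function and the pilot numbers are hypothetical:

```python
import math

def neyman_allocation(strata, target_se):
    """strata: list of (N_h, sigma_h) pairs -- number of houses in the
    stratum and the pilot-estimated dose-rate standard deviation.
    Returns per-stratum sample sizes n_h proportional to N_h * sigma_h,
    scaled so the stratified mean reaches the target standard error."""
    N = sum(n for n, _ in strata)
    weighted = [(n / N) * s for n, s in strata]     # W_h * sigma_h
    total = sum(weighted)
    n_total = (total / target_se) ** 2              # overall sample size
    return [math.ceil(n_total * w / total) for w in weighted]

# Three house categories: (count, sd in nGy/h); target SE of 1 nGy/h.
sizes = neyman_allocation([(5000, 20.0), (3000, 35.0), (2000, 50.0)],
                          target_se=1.0)
```

    The more variable strata receive proportionally more measurements, which is what makes the stratified design cheaper than simple random sampling at the same precision.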

  9. Large scale particle simulations in a virtual memory computer

    International Nuclear Information System (INIS)

    Gray, P.C.; Million, R.; Wagner, J.S.; Tajima, T.

    1983-01-01

    Virtual memory computers are capable of executing large-scale particle simulations even when the memory requirements exceed the computer core size. The required address space is automatically mapped onto slow disc memory by the operating system. When the simulation size is very large, frequent random accesses to slow memory occur during the charge accumulation and particle pushing processes. Accesses to slow memory significantly reduce the execution rate of the simulation. We demonstrate in this paper that with the proper choice of sorting algorithm, a nominal amount of sorting to keep physically adjacent particles near particles with neighboring array indices can reduce random access to slow memory, increase the efficiency of the I/O system, and hence, reduce the required computing time. (orig.)

  10. Large-scale particle simulations in a virtual-memory computer

    International Nuclear Information System (INIS)

    Gray, P.C.; Wagner, J.S.; Tajima, T.; Million, R.

    1982-08-01

    Virtual memory computers are capable of executing large-scale particle simulations even when the memory requirements exceed the computer core size. The required address space is automatically mapped onto slow disc memory by the operating system. When the simulation size is very large, frequent random accesses to slow memory occur during the charge accumulation and particle pushing processes. Accesses to slow memory significantly reduce the execution rate of the simulation. We demonstrate in this paper that with the proper choice of sorting algorithm, a nominal amount of sorting to keep physically adjacent particles near particles with neighboring array indices can reduce random access to slow memory, increase the efficiency of the I/O system, and hence, reduce the required computing time
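
    The sorting idea in the two abstracts above can be sketched in one dimension: order particles by grid-cell index so that neighbouring array entries touch neighbouring cells during charge accumulation. The function name and data below are illustrative only:

```python
def sort_particles_by_cell(positions, cell_size):
    """Stable sort of particle positions by grid-cell index, so
    particles in the same cell become adjacent in memory and the
    charge-accumulation pass makes mostly sequential (fast) accesses
    instead of random accesses to slow memory."""
    order = sorted(range(len(positions)),
                   key=lambda i: int(positions[i] // cell_size))
    return [positions[i] for i in order]

pos = [9.5, 0.2, 4.7, 0.9, 9.1]
sorted_pos = sort_particles_by_cell(pos, cell_size=1.0)  # cells 0,0,4,9,9
```

    In a real particle-in-cell code the same permutation would also be applied to the velocity and charge arrays, and the sort need only be approximate and occasional, which is why a "nominal amount of sorting" suffices.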

  11. Platelet-rich plasma for arthroscopic repair of medium to large rotator cuff tears: a randomized controlled trial.

    Science.gov (United States)

    Jo, Chris Hyunchul; Shin, Ji Sun; Shin, Won Hyoung; Lee, Seung Yeon; Yoon, Kang Sup; Shin, Sue

    2015-09-01

    Two main questions about the use of platelet-rich plasma (PRP) for regeneration purposes are its effect on the speed of healing and the quality of healing. Despite numerous recent studies, evidence is still lacking in this area, especially in a representative patient population with medium to large rotator cuff tears. To assess the efficacy of PRP augmentation on the speed and quality of healing in patients undergoing arthroscopic repair for medium to large rotator cuff tears. Randomized controlled trial; Level of evidence, 1. A total of 74 patients scheduled for arthroscopic repair of medium to large rotator cuff tears were randomly assigned to undergo either PRP-augmented repair (PRP group) or conventional repair (conventional group). In the PRP group, 3 PRP gels (3 × 3 mL) were applied to each patient between the torn end and the greater tuberosity. The primary outcome was the Constant score at 3 months after surgery. Secondary outcome measures included the visual analog scale (VAS) for pain, range of motion (ROM), muscle strength, overall satisfaction and function, functional scores, retear rate, and change in the cross-sectional area (CSA) of the supraspinatus muscle. There was no difference between the 2 groups in the Constant score at 3 months (P > .05). The 2 groups had similar results on the VAS for pain, ROM, muscle strength, overall satisfaction and function, and other functional scores (all P > .05) except for the VAS for worst pain (P = .043). The retear rate of the PRP group (3.0%) was significantly lower than that of the conventional group (20.0%) (P = .032). The change between the immediately postoperative and 1-year postoperative CSAs differed significantly between the 2 groups: -36.76 ± 45.31 mm² in the PRP group versus -67.47 ± 47.26 mm² in the conventional group (P = .014). Compared with repairs without PRP augmentation, the current PRP preparation and application methods for medium to large rotator cuff repairs significantly improved the

  12. How Can the Evidence from Global Large-scale Clinical Trials for Cardiovascular Diseases be Improved?

    Science.gov (United States)

    Sawata, Hiroshi; Tsutani, Kiichiro

    2011-06-29

    Clinical investigations are important for obtaining evidence to improve medical treatment. Large-scale clinical trials with thousands of participants are particularly important for this purpose in cardiovascular diseases, and conducting them entails high research costs. This study sought to investigate global trends in large-scale clinical trials in cardiovascular diseases. We searched for trials using clinicaltrials.gov (URL: http://www.clinicaltrials.gov/) using the key words 'cardio' and 'event' in all fields on 10 April, 2010. We then selected trials with 300 or more participants examining cardiovascular diseases. The search revealed 344 trials that met our criteria. Of the 344 trials, 71% were randomized controlled trials, 15% involved more than 10,000 participants, and 59% were funded by industry. In RCTs whose results were disclosed, 55% of industry-funded trials and 25% of non-industry-funded trials reported statistically significant superiority over control (p = 0.012, 2-sided Fisher's exact test). Our findings highlight concerns regarding potential bias related to funding sources, and researchers should be aware of the importance of trial information disclosure and conflicts of interest. Continued attention to the management of, and training in, information disclosure and conflicts of interest could lead to better clinical evidence and further improvements in the development of medical treatment worldwide.

  13. Fast selection of miRNA candidates based on large-scale pre-computed MFE sets of randomized sequences.

    Science.gov (United States)

    Warris, Sven; Boymans, Sander; Muiser, Iwe; Noback, Michiel; Krijnen, Wim; Nap, Jan-Peter

    2014-01-13

    Small RNAs are important regulators of genome function, yet their prediction in genomes is still a major computational challenge. Statistical analyses of pre-miRNA sequences indicated that their 2D structure tends to have a minimal free energy (MFE) significantly lower than the MFE values of equivalently randomized sequences with the same nucleotide composition, in contrast to other classes of non-coding RNA. The computation of many MFEs is, however, too intensive to allow for genome-wide screenings. Using a local grid infrastructure, MFE distributions of random sequences were pre-calculated on a large scale. These distributions follow a normal distribution and can be used to determine the MFE distribution for any given sequence composition by interpolation, allowing on-the-fly calculation of the normal distribution for any candidate sequence composition. The speedup achieved makes genome-wide screening with this characteristic of a pre-miRNA sequence practical. Although this property alone is not sufficiently discriminative to distinguish miRNAs from other sequences, the MFE-based P-value should be added to the parameters of choice to be included in the selection of potential miRNA candidates for experimental verification.
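
    The MFE-based P-value described above can be sketched as a one-sided normal tail probability, using a mean and standard deviation interpolated from the pre-computed distributions for the candidate's nucleotide composition. The numbers below are invented for illustration:

```python
import math

def mfe_p_value(mfe, mu, sigma):
    """P(a randomized sequence folds with MFE <= mfe), assuming the
    randomized-MFE distribution is normal(mu, sigma^2).  A lower MFE
    than random (a more stable fold) gives a smaller P-value."""
    z = (mfe - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # normal CDF

# A candidate folding 3 sigma below the randomized mean:
p = mfe_p_value(-40.0, mu=-25.0, sigma=5.0)
```

    Because only the interpolated (mu, sigma) pair is needed per composition, this check is cheap enough to run genome-wide.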

  14. Normalization of emotion control scale

    Directory of Open Access Journals (Sweden)

    Hojatoolah Tahmasebian

    2014-09-01

    Background: Emotion control skill teaches individuals how to identify their emotions and how to express and control them in various situations. The aim of this study was to normalize and measure the internal and external validity and reliability of the emotion control test. Methods: This standardization study was carried out on a statistical population including all pupils, students, teachers, nurses and university professors in Kermanshah in 2012, using Williams' emotion control scale. The subjects were 1,500 people (810 females and 690 males) selected by stratified random sampling. Williams' (1997) emotion control scale was used to collect the required data. The Emotional Control Scale is a tool for measuring the degree of control people have over their emotions. This scale has four subscales: anger, depressed mood, anxiety and positive affect. The collected data were analyzed in SPSS using correlation and Cronbach's alpha tests. Results: The internal consistency of the questionnaire, reported by Cronbach's alpha, was acceptable for the emotional control scale, and the correlation between the subscales of the test and between the items of the questionnaire was significant at the 0.01 confidence level. Conclusion: The validity of the emotion control scale among pupils, students, teachers, nurses and university professors in Iran is in an acceptable range, and the test items were correlated with each other, making them appropriate for measuring emotion control.
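
    As context for the internal-consistency result above, Cronbach's alpha is k/(k-1) times (1 minus the ratio of summed item variances to the variance of total scores). A small self-contained sketch with toy scores (the data are invented, not from the study):

```python
from statistics import variance

def cronbach_alpha(rows):
    """Cronbach's alpha for an n_respondents x n_items score table
    (list of rows).  Uses sample variances, as is conventional."""
    k = len(rows[0])
    cols = list(zip(*rows))
    item_var = sum(variance(c) for c in cols)      # sum of item variances
    total_var = variance([sum(r) for r in rows])   # variance of total scores
    return (k / (k - 1)) * (1.0 - item_var / total_var)

# Four respondents answering three strongly correlated items.
alpha = cronbach_alpha([[2, 2, 3], [4, 4, 4], [1, 2, 1], [5, 4, 5]])
```

    Highly correlated items inflate the variance of the totals relative to the item variances, pushing alpha toward 1.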

  15. Control protocol: large scale implementation at the CERN PS complex - a first assessment

    International Nuclear Information System (INIS)

    Abie, H.; Benincasa, G.; Coudert, G.; Davydenko, Y.; Dehavay, C.; Gavaggio, R.; Gelato, G.; Heinze, W.; Legras, M.; Lustig, H.; Merard, L.; Pearson, T.; Strubin, P.; Tedesco, J.

    1994-01-01

    The Control Protocol is a model-based, uniform access procedure from a control system to accelerator equipment. It was proposed at CERN about 5 years ago and prototypes were developed in the following years. More recently, this procedure has been finalized and implemented at a large scale in the PS Complex. More than 300 pieces of equipment are now using this protocol in normal operation and another 300 are under implementation. These include power converters, vacuum systems, beam instrumentation devices, RF equipment, etc. This paper describes how the single general procedure is applied to the different kinds of equipment. The advantages obtained are also discussed. (orig.)

  16. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ("the Task") and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  17. A theoretical bilevel control scheme for power networks with large-scale penetration of distributed renewable resources

    DEFF Research Database (Denmark)

    Boroojeni, Kianoosh; Amini, M. Hadi; Nejadpak, Arash

    2016-01-01

    In this paper, we present a bilevel control framework to achieve a highly-reliable smart distribution network with large-scale penetration of distributed renewable resources (DRRs). We assume that the power distribution network consists of several residential/commercial communities. In the first ...

  18. Environment and host as large-scale controls of ectomycorrhizal fungi.

    Science.gov (United States)

    van der Linde, Sietse; Suz, Laura M; Orme, C David L; Cox, Filipa; Andreae, Henning; Asi, Endla; Atkinson, Bonnie; Benham, Sue; Carroll, Christopher; Cools, Nathalie; De Vos, Bruno; Dietrich, Hans-Peter; Eichhorn, Johannes; Gehrmann, Joachim; Grebenc, Tine; Gweon, Hyun S; Hansen, Karin; Jacob, Frank; Kristöfel, Ferdinand; Lech, Paweł; Manninger, Miklós; Martin, Jan; Meesenburg, Henning; Merilä, Päivi; Nicolas, Manuel; Pavlenda, Pavel; Rautio, Pasi; Schaub, Marcus; Schröck, Hans-Werner; Seidling, Walter; Šrámek, Vít; Thimonier, Anne; Thomsen, Iben Margrete; Titeux, Hugues; Vanguelova, Elena; Verstraeten, Arne; Vesterdal, Lars; Waldner, Peter; Wijk, Sture; Zhang, Yuxin; Žlindra, Daniel; Bidartondo, Martin I

    2018-06-06

    Explaining the large-scale diversity of soil organisms that drive biogeochemical processes, and their responses to environmental change, is critical. However, identifying consistent drivers of belowground diversity and abundance for some soil organisms at large spatial scales remains problematic. Here we investigate a major guild, the ectomycorrhizal fungi, across European forests at a spatial scale and resolution that is, to our knowledge, unprecedented, to explore key biotic and abiotic predictors of ectomycorrhizal diversity and to identify dominant responses and thresholds for change across complex environmental gradients. We show the effect of 38 host, environment, climate and geographical variables on ectomycorrhizal diversity, and define thresholds of community change for key variables. We quantify host specificity and reveal plasticity in functional traits involved in soil foraging across gradients. We conclude that environmental and host factors explain most of the variation in ectomycorrhizal diversity, that the environmental thresholds used as major ecosystem assessment tools need adjustment and that the importance of belowground specificity and plasticity has previously been underappreciated.

  19. Scaling Argument of Anisotropic Random Walk

    International Nuclear Information System (INIS)

    Xu Bingzhen; Jin Guojun; Wang Feifeng

    2005-01-01

In this paper, we analytically discuss the scaling properties of the average square end-to-end distance ⟨R²⟩ for an anisotropic random walk in D-dimensional space (D ≥ 2), and the returning probability P_n(r₀) for the walker into a certain neighborhood of the origin. We not only give calculating formulas for ⟨R²⟩ and P_n(r₀), but also point out that if there is a symmetry axis for the distribution of the probability density of a single-step displacement, we always obtain ⟨R⊥,n²⟩ ∼ n, where ⊥ refers to the projections of the displacement perpendicular to each symmetry axis of the walk; in D-dimensional space with D mutually perpendicular symmetry axes, we always have ⟨R_n²⟩ ∼ n and the random walk behaves like purely random motion; if the number of mutually perpendicular symmetry axes is smaller than the dimension of the space, we must have ⟨R_n²⟩ ∼ n² for very large n and the walk behaves like ballistic motion. It is worth pointing out that, unlike the isotropic random walk in one and two dimensions, which is certain to return to the neighborhood of the origin, in general there is only a nonzero probability for the anisotropic random walker in two dimensions to return to that neighborhood.
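The diffusive-versus-ballistic dichotomy described above is easy to check numerically. The sketch below (illustrative only, not from the paper; the step distributions are made up) estimates ⟨R_n²⟩ by Monte Carlo for a symmetric 2-D walk and for a walk whose step distribution has a nonzero mean along x, then compares how ⟨R_n²⟩ grows when n doubles:

```python
import random

def msd(step_choices, n, walkers=2000, seed=1):
    """Monte Carlo estimate of the mean square end-to-end distance
    <R_n^2> for a 2-D walk whose single-step displacement is drawn
    uniformly from step_choices."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(walkers):
        x = y = 0.0
        for _ in range(n):
            dx, dy = rng.choice(step_choices)
            x += dx
            y += dy
        total += x * x + y * y
    return total / walkers

# Two perpendicular symmetry axes -> diffusive, <R_n^2> ~ n.
sym = [(1, 0), (-1, 0), (0, 1), (0, -1)]
# Nonzero mean step along x (no symmetry axis there) -> ballistic (n*mu)^2 term.
aniso = [(1, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]

# Growth factor when n doubles: ~2 for diffusive, approaching 4 for ballistic.
r_sym = msd(sym, 200) / msd(sym, 100)
r_aniso = msd(aniso, 200) / msd(aniso, 100)
```

For the biased distribution the exact mean is ⟨R_n²⟩ = 0.96 n + 0.04 n², so the doubling ratio is already about 3.6 at n = 100 and tends to 4 as the ballistic term dominates, while the symmetric walk stays at ratio 2.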

  20. Measuring the topology of large-scale structure in the universe

    Science.gov (United States)

    Gott, J. Richard, III

    1988-11-01

An algorithm for quantitatively measuring the topology of large-scale structure has now been applied to a large number of observational data sets. The present paper summarizes and provides an overview of some of these observational results. On scales significantly larger than the correlation length, larger than about 1200 km/s, the cluster and galaxy data are fully consistent with a sponge-like random phase topology. At a smoothing length of about 600 km/s, however, the observed genus curves show a small shift in the direction of a meatball topology. Cold dark matter (CDM) models show similar shifts at these scales but not generally as large as those seen in the data. Bubble models, with voids completely surrounded on all sides by walls of galaxies, show shifts in the opposite direction. The CDM model is overall the most successful in explaining the data.

  1. Measuring the topology of large-scale structure in the universe

    International Nuclear Information System (INIS)

    Gott, J.R. III

    1988-01-01

An algorithm for quantitatively measuring the topology of large-scale structure has now been applied to a large number of observational data sets. The present paper summarizes and provides an overview of some of these observational results. On scales significantly larger than the correlation length, larger than about 1200 km/s, the cluster and galaxy data are fully consistent with a sponge-like random phase topology. At a smoothing length of about 600 km/s, however, the observed genus curves show a small shift in the direction of a meatball topology. Cold dark matter (CDM) models show similar shifts at these scales but not generally as large as those seen in the data. Bubble models, with voids completely surrounded on all sides by walls of galaxies, show shifts in the opposite direction. The CDM model is overall the most successful in explaining the data. 45 references

  2. Large-scale cryopumping for controlled fusion

    International Nuclear Information System (INIS)

    Pittenger, L.C.

    1977-01-01

    Vacuum pumping by freezing out or otherwise immobilizing the pumped gas is an old concept. In several plasma physics experiments for controlled fusion research, cryopumping has been used to provide clean, ultrahigh vacua. Present day fusion research devices, which rely almost universally upon neutral beams for heating, are high gas throughput systems, the pumping of which is best accomplished by cryopumping in the high mass-flow, moderate-to-high vacuum regime. Cryopumping systems have been developed for neutral beam injection systems on several fusion experiments (HVTS, TFTR) and are being developed for the overall pumping of a large, high-throughput mirror containment experiment (MFTF). In operation, these large cryopumps will require periodic defrosting, some schemes for which are discussed, along with other operational considerations. The development of cryopumps for fusion reactors is begun with the TFTR and MFTF systems. Likely paths for necessary further development for power-producing reactors are also discussed

  3. Large-scale cryopumping for controlled fusion

    Energy Technology Data Exchange (ETDEWEB)

    Pittenger, L.C.

    1977-07-25

    Vacuum pumping by freezing out or otherwise immobilizing the pumped gas is an old concept. In several plasma physics experiments for controlled fusion research, cryopumping has been used to provide clean, ultrahigh vacua. Present day fusion research devices, which rely almost universally upon neutral beams for heating, are high gas throughput systems, the pumping of which is best accomplished by cryopumping in the high mass-flow, moderate-to-high vacuum regime. Cryopumping systems have been developed for neutral beam injection systems on several fusion experiments (HVTS, TFTR) and are being developed for the overall pumping of a large, high-throughput mirror containment experiment (MFTF). In operation, these large cryopumps will require periodic defrosting, some schemes for which are discussed, along with other operational considerations. The development of cryopumps for fusion reactors is begun with the TFTR and MFTF systems. Likely paths for necessary further development for power-producing reactors are also discussed.

  4. Parameters affecting the resilience of scale-free networks to random failures.

    Energy Technology Data Exchange (ETDEWEB)

    Link, Hamilton E.; LaViolette, Randall A.; Lane, Terran (University of New Mexico, Albuquerque, NM); Saia, Jared (University of New Mexico, Albuquerque, NM)

    2005-09-01

    It is commonly believed that scale-free networks are robust to massive numbers of random node deletions. For example, Cohen et al. in (1) study scale-free networks including some which approximate the measured degree distribution of the Internet. Their results suggest that if each node in this network failed independently with probability 0.99, most of the remaining nodes would still be connected in a giant component. In this paper, we show that a large and important subclass of scale-free networks are not robust to massive numbers of random node deletions. In particular, we study scale-free networks which have minimum node degree of 1 and a power-law degree distribution beginning with nodes of degree 1 (power-law networks). We show that, in a power-law network approximating the Internet's reported distribution, when the probability of deletion of each node is 0.5 only about 25% of the surviving nodes in the network remain connected in a giant component, and the giant component does not persist beyond a critical failure rate of 0.9. The new result is partially due to improved analytical accommodation of the large number of degree-0 nodes that result after node deletions. Our results apply to power-law networks with a wide range of power-law exponents, including Internet-like networks. We give both analytical and empirical evidence that such networks are not generally robust to massive random node deletions.
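The effect reported here can be reproduced qualitatively with a small, self-contained experiment (a toy configuration-model sketch with assumed parameters, not the authors' analysis): build a power-law network with minimum degree 1, delete nodes at random, and measure the share of survivors in the largest connected component.

```python
import random
from collections import Counter

def power_law_degrees(n, alpha=2.5, kmax=100, rng=None):
    """Sample n node degrees >= 1 from P(k) ~ k^-alpha via inverse CDF."""
    rng = rng or random.Random(0)
    ks = list(range(1, kmax + 1))
    weights = [k ** -alpha for k in ks]
    total = sum(weights)
    cdf, acc = [], 0.0
    for w in weights:
        acc += w / total
        cdf.append(acc)
    return [next((k for k, c in zip(ks, cdf) if u <= c), kmax)
            for u in (rng.random() for _ in range(n))]

def configuration_edges(degs, rng):
    """Configuration-model stub pairing (self-loops and multi-edges are
    kept for simplicity; they do not change connectivity)."""
    stubs = [i for i, d in enumerate(degs) for _ in range(d)]
    rng.shuffle(stubs)
    return list(zip(stubs[::2], stubs[1::2]))

def giant_fraction(n, edges, p_fail, rng):
    """Delete each node independently with prob p_fail; return the share
    of surviving nodes that lie in the largest connected component."""
    alive = [rng.random() >= p_fail for _ in range(n)]
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for a, b in edges:
        if alive[a] and alive[b]:
            parent[find(a)] = find(b)
    survivors = [i for i in range(n) if alive[i]]
    if not survivors:
        return 0.0
    sizes = Counter(find(i) for i in survivors)
    return max(sizes.values()) / len(survivors)

rng = random.Random(42)
n = 10000
degs = power_law_degrees(n, rng=rng)
edges = configuration_edges(degs, rng)
g_half = giant_fraction(n, edges, 0.5, rng)  # 50% random failures
g_nine = giant_fraction(n, edges, 0.9, rng)  # 90% random failures
```

With these illustrative parameters a giant component still exists after 50% failures but has essentially collapsed at 90%, consistent with the paper's qualitative claim about power-law networks with minimum degree 1.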

  5. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction; whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly in the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities, which lead to the formation of large eddies, is also key to understanding the behavior of large-scale fires. Here modeling tools can be effectively exploited in order to investigate the fluid flow phenomena, including RANS- and LES-based computational fluid dynamics codes. The latter are well-suited to representation of the turbulent motions, but a number of challenges remain with their practical application. Massively-parallel computational resources are likely to be necessary in order to be able to adequately address the complex coupled phenomena to the level of detail that is necessary.

  6. Web-Based and Mobile Stress Management Intervention for Employees: A Randomized Controlled Trial

    OpenAIRE

    Heber, Elena; Lehr, Dirk; Ebert, David Daniel; Berking, Matthias; Riper, Heleen

    2016-01-01

Background: Work-related stress is highly prevalent among employees and is associated with adverse mental health consequences. Web-based interventions offer the opportunity to deliver effective solutions on a large scale; however, the evidence is limited and the results conflicting. Objective: This randomized controlled trial evaluated the efficacy of guided Web- and mobile-based stress management training for employees. Methods: A total of 264 employees with elevated symptoms of stress (Perce...

  7. Performance of automatic generation control mechanisms with large-scale wind power

    Energy Technology Data Exchange (ETDEWEB)

    Ummels, B.C.; Gibescu, M.; Paap, G.C. [Delft Univ. of Technology (Netherlands); Kling, W.L. [Transmission Operations Department of TenneT bv (Netherlands)

    2007-11-15

The unpredictability and variability of wind power increasingly challenges real-time balancing of supply and demand in electric power systems. In liberalised markets, balancing is a responsibility jointly held by the TSO (transmission system operator, responsible for real-time power balancing) and PRPs (programme-responsible parties, responsible for energy programs). In this paper, a procedure is developed for the simulation of power system balancing and the assessment of AGC (automatic generation control) performance in the presence of large-scale wind power, using the Dutch control zone as a case study. The simulation results show that the performance of existing AGC mechanisms is adequate for keeping the ACE (area control error) within acceptable bounds. At higher wind power penetrations, however, the capabilities of the generation mix are increasingly challenged and additional reserves are required at the same level. (au)
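To make the AGC/ACE terminology concrete, here is a minimal single-area frequency-response sketch (all parameters are illustrative round numbers, not the Dutch-control-zone model used in the paper): a step imbalance leaves a steady frequency offset under primary response alone, and an integral AGC loop driving the area control error to zero removes it.

```python
def simulate(agc_gain, dP=0.1, H=5.0, D=1.0, B=1.0, t_end=100.0, dt=0.01):
    """One-area toy model: 2H * d(df)/dt = p_agc - dP - D*df, where the
    AGC action p_agc integrates the area control error ACE = B*df."""
    df = 0.0   # frequency deviation (p.u.)
    z = 0.0    # integral of ACE
    for _ in range(int(t_end / dt)):
        ace = B * df
        p_agc = -agc_gain * z        # secondary (AGC) control action
        ddf = (p_agc - dP - D * df) / (2.0 * H)
        z += ace * dt
        df += ddf * dt
    return df

df_no_agc = simulate(agc_gain=0.0)   # primary response only: df -> -dP/D
df_agc = simulate(agc_gain=0.5)      # integral AGC drives ACE (and df) to zero
```

Without AGC the deviation settles at −dP/D = −0.1 p.u.; with the integral loop it decays essentially to zero, which is what "keeping ACE within acceptable bounds" asks of the mechanism.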

  8. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek; Verma, Mahendra K.; Sukhatme, Jai

    2017-01-01

In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small-scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k^−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k^−3 power law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k^−11/5 form at large scales to a steeper approximate k^−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k^−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k^−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k^−1.6 power law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.
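Power-law exponents like the k^−11/5 and k^−3 quoted above are typically extracted from simulation spectra by a least-squares fit of log E(k) against log k over an inertial range. A minimal, self-contained illustration on a synthetic spectrum (not data from this paper; the prefactor and wavenumber range are made up):

```python
import math

def fit_power_law(ks, Es):
    """Least-squares slope of log E(k) versus log k, i.e. E(k) ~ k^slope."""
    xs = [math.log(k) for k in ks]
    ys = [math.log(E) for E in Es]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic Bolgiano-Obukhov-like spectrum over an assumed inertial range.
ks = list(range(8, 65))
Es = [0.7 * k ** (-11.0 / 5.0) for k in ks]
slope = fit_power_law(ks, Es)  # recovers the exponent -11/5 = -2.2
```

On real spectra the fitted index depends on the chosen wavenumber window, which is one reason the paper reports its indices as "approximate" scalings.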

  9. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek

    2017-01-11

In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small-scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k^−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k^−3 power law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k^−11/5 form at large scales to a steeper approximate k^−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k^−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k^−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k^−1.6 power law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  10. Communication: An effective linear-scaling atomic-orbital reformulation of the random-phase approximation using a contracted double-Laplace transformation

    International Nuclear Information System (INIS)

    Schurkus, Henry F.; Ochsenfeld, Christian

    2016-01-01

An atomic-orbital (AO) reformulation of the random-phase approximation (RPA) correlation energy is presented, which reduces the steep computational scaling to linear so that large systems can be studied on simple desktop computers with fully numerically controlled accuracy. Our AO-RPA formulation introduces a contracted double-Laplace transform and employs the overlap-metric resolution-of-the-identity. First timings of our pilot code illustrate the reduced scaling with systems comprising up to 1262 atoms and 10 090 basis functions.
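The key ingredient of such Laplace-transform RPA reformulations is the identity 1/x = ∫₀^∞ e^(−xt) dt, which turns coupled orbital-energy denominators into factorizable exponentials. The sketch below (the quadrature choice is illustrative; it is not the paper's contracted double-Laplace scheme) shows that a modest number of quadrature points reproduces 1/x accurately over a range of "energy gaps":

```python
import math

def laplace_quadrature(u_min=-12.0, u_max=4.0, n=161):
    """Trapezoidal points/weights for 1/x = integral_0^inf exp(-x*t) dt
    after substituting t = e^u, which makes the integrand decay fast at
    both ends of the u-interval."""
    h = (u_max - u_min) / (n - 1)
    pts = []
    for i in range(n):
        u = u_min + i * h
        w = h if 0 < i < n - 1 else h / 2.0
        pts.append((math.exp(u), w * math.exp(u)))  # (t_j, w_j)
    return pts

quad = laplace_quadrature()

def inv(x):
    """Approximate 1/x as a short sum of exponentials sum_j w_j exp(-x*t_j)."""
    return sum(w * math.exp(-x * t) for t, w in quad)

# Relative errors over a range of "energy gaps" x.
errs = [abs(inv(x) * x - 1.0) for x in (0.5, 1.0, 2.0, 5.0, 10.0)]
```

Production codes use far fewer, optimized exponents and weights; the point here is only that the denominator factorizes, which is what lets the quartic sums be rearranged into linear-scaling AO contractions.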

  11. Control for large scale demand response of thermostatic loads

    DEFF Research Database (Denmark)

    Totu, Luminita Cristiana; Leth, John; Wisniewski, Rafal

    2013-01-01

Demand response is an important Smart Grid concept that aims at facilitating the integration of volatile energy resources into the electricity grid. This paper considers a residential demand response scenario and specifically looks into the problem of managing a large number of thermostat-based appliances with on/off operation. The objective is to reduce the consumption peak of a group of loads composed of both flexible and inflexible units. The power-flexible units are the thermostat-based appliances. We discuss a centralized, model predictive approach and a distributed structure with a randomized …

  12. Large Scale Community Detection Using a Small World Model

    Directory of Open Access Journals (Sweden)

    Ranjan Kumar Behera

    2017-11-01

In a social network, small or large communities within the network play a major role in deciding the functionalities of the network. Despite diverse definitions, communities in a network may be defined as groups of nodes that are more densely connected to each other than to nodes outside the group. Revealing such hidden communities is a challenging research problem. Real-world social networks exhibit the small-world phenomenon, which indicates that any two social entities can be reached in a small number of steps. In this paper, nodes are mapped into communities based on random walks in the network. However, uncovering communities in large-scale networks is a challenging task due to the unprecedented growth in the size of social networks. A good number of community detection algorithms based on random walks exist in the literature, but when large-scale social networks are considered, these algorithms are observed to take considerably longer. In this work, with the objective of improving efficiency, the Map-Reduce parallel programming framework has been used to uncover hidden communities in social networks. The proposed approach has been compared with standard existing community detection algorithms on both synthetic and real-world datasets in order to examine its performance, and it is observed that the proposed algorithm is more efficient than the existing ones.
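As a toy illustration of the random-walk intuition (not the paper's Map-Reduce algorithm), short walks started inside a dense community rarely leave it, so visit frequencies alone already separate two planted communities:

```python
import random
from collections import Counter

# Two 5-cliques joined by a single bridge edge (4, 5): planted communities.
adj = {i: [] for i in range(10)}
def add_edge(a, b):
    adj[a].append(b)
    adj[b].append(a)
for group in (range(0, 5), range(5, 10)):
    nodes = list(group)
    for i in range(len(nodes)):
        for j in range(i + 1, len(nodes)):
            add_edge(nodes[i], nodes[j])
add_edge(4, 5)  # bridge between the cliques

def community_of(start, size=5, length=4, walks=300):
    """Run short random walks from `start` and take the `size` most
    visited nodes as its community: walks rarely cross the bridge."""
    rng = random.Random(start)          # seeded for reproducibility
    visits = Counter()
    for _ in range(walks):
        node = start
        for _ in range(length):
            node = rng.choice(adj[node])
            visits[node] += 1
    visits[start] += 1                  # make sure the seed itself is included
    return {n for n, _ in visits.most_common(size)}
```

Real algorithms turn this co-visitation signal into a walk-distance measure and merge nodes accordingly; the Map-Reduce contribution of the paper is to run such walk computations for all nodes in parallel.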

  13. How Can the Evidence from Global Large-scale Clinical Trials for Cardiovascular Diseases be Improved?

    Directory of Open Access Journals (Sweden)

    Tsutani Kiichiro

    2011-06-01

Background: Clinical investigations are important for obtaining evidence to improve medical treatment. Large-scale clinical trials with thousands of participants are particularly important for this purpose in cardiovascular diseases. Conducting large-scale clinical trials entails high research costs. This study sought to investigate global trends in large-scale clinical trials in cardiovascular diseases. Findings: We searched for trials using clinicaltrials.gov (URL: http://www.clinicaltrials.gov/) using the key words 'cardio' and 'event' in all fields on 10 April, 2010. We then selected trials with 300 or more participants examining cardiovascular diseases. The search revealed 344 trials that met our criteria. Of these 344 trials, 71% were randomized controlled trials, 15% involved more than 10,000 participants, and 59% were funded by industry. In RCTs whose results were disclosed, 55% of industry-funded trials and 25% of non-industry-funded trials reported statistically significant superiority over control (p = 0.012, 2-sided Fisher's exact test). Conclusions: Our findings highlight concerns regarding potential bias related to funding sources; researchers should be aware of the importance of trial information disclosure and conflicts of interest. Continued attention to management and training regarding information disclosure and conflicts of interest could lead to better clinical evidence and further improvements in the development of medical treatment worldwide.

  14. Large-deviation theory for diluted Wishart random matrices

    Science.gov (United States)

    Castillo, Isaac Pérez; Metz, Fernando L.

    2018-03-01

Wishart random matrices with a sparse or diluted structure are ubiquitous in the processing of large datasets, with applications in physics, biology, and economics. In this work, we develop a theory for the eigenvalue fluctuations of diluted Wishart random matrices based on the replica approach of disordered systems. We derive an analytical expression for the cumulant generating function of the number of eigenvalues I_N(x) smaller than x ∈ ℝ⁺, from which all cumulants of I_N(x) and the rate function Ψ_x(k) controlling its large-deviation probability Prob[I_N(x) = kN] ≍ exp(−N Ψ_x(k)) follow. Explicit results for the mean value and the variance of I_N(x), its rate function, and its third cumulant are discussed and thoroughly compared to numerical diagonalization, showing very good agreement. The present work establishes the theoretical framework put forward in a recent letter [Phys. Rev. Lett. 117, 104101 (2016), 10.1103/PhysRevLett.117.104101] as an exact and compelling approach to deal with eigenvalue fluctuations of sparse random matrices.

  15. Reporting funding source or conflict of interest in abstracts of randomized controlled trials, no evidence of a large impact on general practitioners' confidence in conclusions, a three-arm randomized controlled trial.

    Science.gov (United States)

    Buffel du Vaure, Céline; Boutron, Isabelle; Perrodeau, Elodie; Ravaud, Philippe

    2014-04-28

    Systematic reporting of funding sources is recommended in the CONSORT Statement for abstracts. However, no specific recommendation is related to the reporting of conflicts of interest (CoI). The objective was to compare physicians' confidence in the conclusions of abstracts of randomized controlled trials of pharmaceutical treatment indexed in PubMed. We planned a three-arm parallel-group randomized trial. French general practitioners (GPs) were invited to participate and were blinded to the study's aim. We used a representative sample of 75 abstracts of pharmaceutical industry-funded randomized controlled trials published in 2010 and indexed in PubMed. Each abstract was standardized and reported in three formats: 1) no mention of the funding source or CoI; 2) reporting the funding source only; and 3) reporting the funding source and CoI. GPs were randomized according to a computerized randomization on a secure Internet system at a 1:1:1 ratio to assess one abstract among the three formats. The primary outcome was GPs' confidence in the abstract conclusions (0, not at all, to 10, completely confident). The study was planned to detect a large difference with an effect size of 0.5. Between October 2012 and June 2013, among 605 GPs contacted, 354 were randomized, 118 for each type of abstract. The mean difference (95% confidence interval) in GPs' confidence in abstract findings was 0.2 (-0.6; 1.0) (P = 0.84) for abstracts reporting the funding source only versus no funding source or CoI; -0.4 (-1.3; 0.4) (P = 0.39) for abstracts reporting the funding source and CoI versus no funding source and CoI; and -0.6 (-1.5; 0.2) (P = 0.15) for abstracts reporting the funding source and CoI versus the funding source only. We found no evidence of a large impact of trial report abstracts mentioning funding sources or CoI on GPs' confidence in the conclusions of the abstracts. ClinicalTrials.gov identifier: NCT01679873.
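For context on the design, the standard two-sample formula n = 2(z₁₋α/₂ + z₁₋β)²/d² gives the per-group size needed to detect an effect size d = 0.5; the sketch below assumes two-sided α = 0.05 and 80% power (the abstract does not state the trial's exact power assumptions):

```python
import math

def n_per_group(effect_size, z_alpha=1.9600, z_beta=0.8416):
    """Per-group sample size for a two-sample comparison of means:
    n = 2 * (z_{1-alpha/2} + z_{1-beta})^2 / d^2, here with assumed
    two-sided alpha = 0.05 (z = 1.96) and 80% power (z = 0.8416)."""
    return math.ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

n = n_per_group(0.5)  # "large" standardized effect size d = 0.5
```

Under these assumptions about 63 participants per group suffice, so the 118 GPs randomized per arm leave headroom for the three pairwise comparisons and for dropout.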

  16. Low rank approximation methods for MR fingerprinting with large scale dictionaries.

    Science.gov (United States)

    Yang, Mingrui; Ma, Dan; Jiang, Yun; Hamilton, Jesse; Seiberlich, Nicole; Griswold, Mark A; McGivney, Debra

    2018-04-01

This work proposes new low rank approximation approaches with significant memory savings for large scale MR fingerprinting (MRF) problems. We introduce a compressed MRF with randomized singular value decomposition method to significantly reduce the memory requirement for calculating a low rank approximation of large sized MRF dictionaries. We further relax this requirement by exploiting the structures of MRF dictionaries in the randomized singular value decomposition space and fitting them to low-degree polynomials to generate high resolution MRF parameter maps. In vivo 1.5T and 3T brain scan data are used to validate the approaches. T1, T2, and off-resonance maps are in good agreement with those of the standard MRF approach. Moreover, the memory savings are up to 1000-fold for the MRF fast imaging with steady-state precession sequence and more than 15-fold for the MRF balanced steady-state free precession sequence. The proposed compressed MRF with randomized singular value decomposition and dictionary fitting methods are memory-efficient low rank approximation methods, which can benefit the usage of MRF in clinical settings. They also have great potential in large scale MRF problems, such as problems considering multi-component MRF parameters or high resolution in the parameter space. Magn Reson Med 79:2392-2400, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
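The memory advantage comes from the randomized range-finder step that underlies randomized SVD: sketch the column space with a few random projections instead of decomposing the full dictionary. A pure-Python sketch under stated assumptions (the "dictionary" here is a synthetic, exactly low-rank stand-in; the paper's dictionaries, polynomial fitting, and memory accounting are not reproduced):

```python
import random

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def gram_schmidt(cols):
    """Orthonormalize column vectors (modified Gram-Schmidt), dropping
    vectors numerically inside the span of earlier ones."""
    q = []
    for v in cols:
        v = list(v)
        for u in q:
            dot = sum(a * b for a, b in zip(u, v))
            v = [a - dot * b for a, b in zip(v, u)]
        norm = sum(a * a for a in v) ** 0.5
        if norm > 1e-8:
            q.append([a / norm for a in v])
    return q

rng = random.Random(0)
m, n, rank, sketch = 60, 40, 3, 6

# Synthetic rank-3 "dictionary" A = sum_k u_k v_k^T (a stand-in for a
# real MRF dictionary, which is approximately low rank).
us = [[rng.gauss(0, 1) for _ in range(m)] for _ in range(rank)]
vs = [[rng.gauss(0, 1) for _ in range(n)] for _ in range(rank)]
A = [[sum(us[k][i] * vs[k][j] for k in range(rank)) for j in range(n)]
     for i in range(m)]

# Randomized range finder: Y = A @ Omega captures range(A) w.h.p.,
# then A ~= Q (Q^T A) with Q an orthonormal basis for range(Y).
Omega = [[rng.gauss(0, 1) for _ in range(sketch)] for _ in range(n)]
Y = matmul(A, Omega)                           # m x sketch
Q = gram_schmidt([list(c) for c in zip(*Y)])   # rows of Q = basis vectors
B = matmul(Q, A)                               # Q^T A, rank x n
A_approx = matmul(transpose(Q), B)             # m x n low rank reconstruction

err = max(abs(A[i][j] - A_approx[i][j]) for i in range(m) for j in range(n))
scale = max(abs(A[i][j]) for i in range(m) for j in range(n))
```

Only the small factors Q and QᵀA ever need to be stored, which is the source of the large memory savings when m and n are the size of a real MRF dictionary.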

  17. Cosmological streaming velocities and large-scale density maxima

    International Nuclear Information System (INIS)

    Peacock, J.A.; Lumsden, S.L.; Heavens, A.F.

    1987-01-01

    The statistical testing of models for galaxy formation against the observed peculiar velocities on 10-100 Mpc scales is considered. If it is assumed that observers are likely to be sited near maxima in the primordial field of density perturbations, then the observed filtered velocity field will be biased to low values by comparison with a point selected at random. This helps to explain how the peculiar velocities (relative to the microwave background) of the local supercluster and the Rubin-Ford shell can be so similar in magnitude. Using this assumption to predict peculiar velocities on two scales, we test models with large-scale damping (i.e. adiabatic perturbations). Allowed models have a damping length close to the Rubin-Ford scale and are mildly non-linear. Both purely baryonic universes and universes dominated by massive neutrinos can account for the observed velocities, provided 0.1 ≤ Ω ≤ 1. (author)

  18. A Randomized trial of an Asthma Internet Self-management Intervention (RAISIN): study protocol for a randomized controlled trial.

    Science.gov (United States)

    Morrison, Deborah; Wyke, Sally; Thomson, Neil C; McConnachie, Alex; Agur, Karolina; Saunderson, Kathryn; Chaudhuri, Rekha; Mair, Frances S

    2014-05-24

The financial costs associated with asthma care continue to increase while care remains suboptimal. Promoting optimal self-management, including the use of asthma action plans, along with regular health professional review has been shown to be an effective strategy and is recommended in asthma guidelines internationally. Despite evidence of benefit, guided self-management remains underused; however, the potential for online resources to promote self-management behaviors is gaining increasing recognition. The aim of this paper is to describe the protocol for a pilot evaluation of a website, 'Living well with asthma', which has been developed with the aim of promoting self-management behaviors shown to improve outcomes. The study is a parallel randomized controlled trial, where adults with asthma are randomly assigned to either access to the website for 12 weeks, or usual asthma care for 12 weeks (followed by access to the website if desired). Individuals are included if they are over 16 years old, have a diagnosis of asthma with an Asthma Control Questionnaire (ACQ) score of greater than or equal to 1, and have access to the internet. Primary outcomes for this evaluation include recruitment and retention rates, changes at 12 weeks from baseline for both ACQ and Asthma Quality of Life Questionnaire (AQLQ) scores, and quantitative data describing website usage (number of times logged on, length of time logged on, number of times individual pages looked at, and for how long). Secondary outcomes include clinical outcomes (medication use, health services use, lung function) and patient reported outcomes (including adherence, patient activation measures, and health status). Piloting of complex interventions is considered best practice and will maximise the potential of any future large-scale randomized controlled trial to successfully recruit and be able to report on necessary outcomes. Here we will provide results across a range of outcomes which will provide estimates of …

  19. Balancing modern Power System with large scale of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela

    2014-01-01

Power system operators must ensure robust, secure and reliable power system operation even with a large-scale integration of wind power. Electricity generated from intermittent wind in large proportion may impact the control of the power system balance and thus cause deviations in the power system frequency in small or islanded power systems, or in tie-line power flows in interconnected power systems. Therefore, the large-scale integration of wind power into the power system strongly concerns secure and stable grid operation. To ensure stable power system operation, the evolving power system has to be analysed with improved analytical tools and techniques. This paper proposes techniques for active power balance control in future power systems with large-scale wind power integration, where the power balancing model provides the hour-ahead dispatch plan with reduced planning horizon and the real-time …

  20. Culturally adaptive storytelling intervention versus didactic intervention to improve hypertension control in Vietnam: a cluster-randomized controlled feasibility trial.

    Science.gov (United States)

    Nguyen, Hoa L; Allison, Jeroan J; Ha, Duc A; Chiriboga, Germán; Ly, Ha N; Tran, Hanh T; Nguyen, Cuong K; Dang, Diem M; Phan, Ngoc T; Vu, Nguyen C; Nguyen, Quang P; Goldberg, Robert J

    2017-01-01

    Vietnam is experiencing an epidemiologic transition with an increased prevalence of non-communicable diseases. Novel, large-scale, effective, and sustainable interventions to control hypertension in Vietnam are needed. We report the results of a cluster-randomized feasibility trial at 3 months follow-up conducted in Hung Yen province, Vietnam, designed to evaluate the feasibility and acceptability of two community-based interventions to improve hypertension control: a "storytelling" intervention, "We Talk about Our Hypertension," and a didactic intervention. The storytelling intervention included stories about strategies for coping with hypertension, with patients speaking in their own words, and didactic content about the importance of healthy lifestyle behaviors including salt reduction and exercise. The didactic intervention included only didactic content. The storytelling intervention was delivered by two DVDs at 3-month intervals; the didactic intervention included only one installment. The trial was conducted in four communes, equally randomized to the two interventions. The mean age of the 160 study patients was 66 years, and 54% were men. Most participants described both interventions as understandable, informative, and motivational. Between baseline and 3 months, mean systolic blood pressure declined by 8.2 mmHg (95% CI 4.1-12.2) in the storytelling group and by 5.5 mmHg (95% CI 1.4-9.5) in the didactic group. The storytelling group also reported a significant increase in hypertension medication adherence. Both interventions were well accepted in several rural communities and were shown to be potentially effective in lowering blood pressure. A large-scale randomized trial is needed to compare the effectiveness of the two interventions in controlling hypertension. ClinicalTrials.gov, NCT02483780.

  1. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024³ grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves toward lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic field at large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy flux, similar to SSD.

  2. Origin of the large scale structures of the universe

    International Nuclear Information System (INIS)

    Oaknin, David H.

    2004-01-01

We revisit the statistical properties of the primordial cosmological density anisotropies that, at the time of matter-radiation equality, seeded the gravitational development of large-scale structures in the otherwise homogeneous and isotropic Friedmann-Robertson-Walker flat universe. Our analysis shows that random fluctuations of the density field at the instant of equality with comoving wavelengths shorter than the causal horizon at that time can naturally account, when globally constrained to conserve the total mass (energy) of the system, for the observed scale invariance of the anisotropies over cosmologically large comoving volumes. Statistical systems with similar features are generically known as glasslike or latticelike. Obviously, these conclusions conflict with the widely accepted understanding of the primordial structures reported in the literature, which requires an epoch of inflationary cosmology to precede the standard expansion of the universe. The origin of the conflict must be found in the widespread, but unjustified, claim that scale-invariant mass (energy) anisotropies at the instant of equality over comoving volumes of cosmological size, larger than the causal horizon at the time, must be generated by fluctuations in the density field with comparably large comoving wavelengths.

  3. Behavior therapy for children with Tourette disorder: a randomized controlled trial.

    Science.gov (United States)

    Piacentini, John; Woods, Douglas W; Scahill, Lawrence; Wilhelm, Sabine; Peterson, Alan L; Chang, Susanna; Ginsburg, Golda S; Deckersbach, Thilo; Dziura, James; Levi-Pearl, Sue; Walkup, John T

    2010-05-19

Tourette disorder is a chronic and typically impairing childhood-onset neurologic condition. Antipsychotic medications, the first-line treatments for moderate to severe tics, are often associated with adverse effects. Behavioral interventions, although promising, have not been evaluated in large-scale controlled trials. To determine the efficacy of a comprehensive behavioral intervention for reducing tic severity in children and adolescents. Randomized, observer-blind, controlled trial of 126 children recruited from December 2004 through May 2007 and aged 9 through 17 years, with impairing Tourette or chronic tic disorder as a primary diagnosis, randomly assigned to 8 sessions during 10 weeks of behavior therapy (n = 61) or a control treatment consisting of supportive therapy and education (n = 65). Responders received 3 monthly booster treatment sessions and were reassessed at 3 and 6 months following treatment. Comprehensive behavioral intervention. Yale Global Tic Severity Scale (range 0-50, score >15 indicating clinically significant tics) and Clinical Global Impressions-Improvement Scale (range 1 [very much improved] to 8 [very much worse]). Behavioral intervention led to a significantly greater decrease on the Yale Global Tic Severity Scale (24.7 [95% confidence interval (CI), 23.1-26.3] to 17.1 [95% CI, 15.1-19.1]) from baseline to end point compared with the control treatment (24.6 [95% CI, 23.2-26.0] to 21.1 [95% CI, 19.2-23.0]) (P < .001). Tic worsening was reported by 4% of children (5/126). Treatment gains were durable, with 87% of available responders to behavior therapy exhibiting continued benefit 6 months following treatment. A comprehensive behavioral intervention, compared with supportive therapy and education, resulted in greater improvement in symptom severity among children with Tourette and chronic tic disorder. clinicaltrials.gov Identifier: NCT00218777.

  4. Scaling A Moment-Rate Function For Small To Large Magnitude Events

    Science.gov (United States)

    Archuleta, Ralph; Ji, Chen

    2017-04-01

Since the 1980s seismologists have recognized that peak ground acceleration (PGA) and peak ground velocity (PGV) scale differently with magnitude for large and moderate earthquakes. In a recent paper (Archuleta and Ji, GRL 2016) we introduced an apparent moment-rate function (aMRF) that accurately predicts the scaling with magnitude of PGA, PGV, PWA (peak Wood-Anderson displacement) and the ratio PGA/(2πPGV) (dominant frequency) for earthquakes 3.3 ≤ M ≤ 5.3. This apparent moment-rate function is controlled by two temporal parameters, tp and td, which are related to the time for the moment-rate function to reach its peak amplitude and the total duration of the earthquake, respectively. These two temporal parameters lead to a Fourier amplitude spectrum (FAS) of displacement that has two corners, between which the spectral amplitudes decay as 1/f, where f denotes frequency. At higher or lower frequencies, the FAS of the aMRF looks like a single-corner Aki-Brune omega-squared spectrum. However, in the presence of attenuation the higher corner is almost certainly masked. Attempting to correct the spectrum to an Aki-Brune omega-squared spectrum will produce an "apparent" corner frequency that falls between the two corner frequencies of the aMRF. We reason that the two corners of the aMRF are the reason that seismologists deduce a stress drop (e.g., Allmann and Shearer, JGR 2009) that is generally much smaller than the stress parameter used to produce ground motions from stochastic simulations (e.g., Boore, 2003 Pageoph.). The presence of two corners for the smaller magnitude earthquakes leads to several questions. Can deconvolution be successfully used to determine scaling from small to large earthquakes? Equivalently, will large earthquakes have a double corner? If large earthquakes are the sum of many smaller magnitude earthquakes, what should the displacement FAS look like for a large magnitude earthquake? Can a combination of such a double-corner spectrum and random

  5. The influence of control parameter estimation on large scale geomorphological interpretation of pointclouds

    Science.gov (United States)

    Dorninger, P.; Koma, Z.; Székely, B.

    2012-04-01

In recent years, laser scanning, also referred to as LiDAR, has proved to be an important tool for topographic data acquisition. Basically, laser scanning acquires a more or less homogeneously distributed point cloud. These points represent natural objects like terrain and vegetation as well as man-made objects such as buildings, streets, powerlines, or other constructions. Because current scanning systems capture up to several hundred thousand points per second, the immediate application of such point clouds to large-scale interpretation and analysis is often prohibitive given the restrictions of the hardware and software infrastructure. To overcome this, numerous methods for the determination of derived products exist. Commonly, Digital Terrain Models (DTM) or Digital Surface Models (DSM) are derived to represent the topography using a regular grid as the data structure. The obvious advantages are a significant reduction of the amount of data and the introduction of an implicit neighborhood topology enabling the application of efficient post-processing methods. The major disadvantages are the loss of 3D information (e.g., overhangs) as well as the loss of information due to the interpolation approach used. We introduced a segmentation approach enabling the determination of planar structures within a given point cloud. It was originally developed for the purpose of building modeling but has proven to be well suited for large-scale geomorphological analysis as well. The result is an assignment of the original points to a set of planes. Each plane is represented by its plane parameters. Additionally, numerous quality and quantity parameters are determined (e.g., aspect, slope, local roughness). In this contribution, we investigate the influence of the control parameters required for the plane segmentation on the geomorphological interpretation of the derived product. The respective control parameters may be determined

  6. Use of electronic healthcare records in large-scale simple randomized trials at the point of care for the documentation of value-based medicine.

    Science.gov (United States)

    van Staa, T-P; Klungel, O; Smeeth, L

    2014-06-01

    A solid foundation of evidence of the effects of an intervention is a prerequisite of evidence-based medicine. The best source of such evidence is considered to be randomized trials, which are able to avoid confounding. However, they may not always estimate effectiveness in clinical practice. Databases that collate anonymized electronic health records (EHRs) from different clinical centres have been widely used for many years in observational studies. Randomized point-of-care trials have been initiated recently to recruit and follow patients using the data from EHR databases. In this review, we describe how EHR databases can be used for conducting large-scale simple trials and discuss the advantages and disadvantages of their use. © 2014 The Association for the Publication of the Journal of Internal Medicine.

  7. Utilisation of ISA Reverse Genetics and Large-Scale Random Codon Re-Encoding to Produce Attenuated Strains of Tick-Borne Encephalitis Virus within Days.

    Science.gov (United States)

    de Fabritus, Lauriane; Nougairède, Antoine; Aubry, Fabien; Gould, Ernest A; de Lamballerie, Xavier

    2016-01-01

    Large-scale codon re-encoding is a new method of attenuating RNA viruses. However, the use of infectious clones to generate attenuated viruses has inherent technical problems. We previously developed a bacterium-free reverse genetics protocol, designated ISA, and now combined it with large-scale random codon-re-encoding method to produce attenuated tick-borne encephalitis virus (TBEV), a pathogenic flavivirus which causes febrile illness and encephalitis in humans. We produced wild-type (WT) and two re-encoded TBEVs, containing 273 or 273+284 synonymous mutations in the NS5 and NS5+NS3 coding regions respectively. Both re-encoded viruses were attenuated when compared with WT virus using a laboratory mouse model and the relative level of attenuation increased with the degree of re-encoding. Moreover, all infected animals produced neutralizing antibodies. This novel, rapid and efficient approach to engineering attenuated viruses could potentially expedite the development of safe and effective new-generation live attenuated vaccines.

  8. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  9. Dynamic Modeling, Optimization, and Advanced Control for Large Scale Biorefineries

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail

    with a complex conversion route. Computational fluid dynamics is used to model transport phenomena in large reactors capturing tank profiles, and delays due to plug flows. This work publishes for the first time demonstration scale real data for validation showing that the model library is suitable...

  10. Limited Impact of Music Therapy on Patient Anxiety with the Large Loop Excision of Transformation Zone Procedure - a Randomized Controlled Trial.

    Science.gov (United States)

    Kongsawatvorakul, Chompunoot; Charakorn, Chuenkamon; Paiwattananupant, Krissada; Lekskul, Navamol; Rattanasiri, Sasivimol; Lertkhachonsuk, Arb-Aroon

    2016-01-01

Many studies have pointed to strategies for coping with patient anxiety in colposcopy. Evidence shows that patients experience considerable distress with the large loop excision of the transformation zone (LLETZ) procedure, and suitable interventions should be introduced to reduce anxiety. This study aimed to investigate the effects of music therapy in patients undergoing LLETZ. A randomized controlled trial was conducted with patients undergoing LLETZ under local anesthesia in an outpatient setting at Ramathibodi Hospital, Bangkok, Thailand, from February 2015 to January 2016. After informed consent and demographic data were obtained, we assessed anxiety levels using the State Anxiety Inventory before and after the procedure. Music group patients listened to classical songs through headphones, while the control group received standard care. Pain was evaluated with a visual analog scale (VAS). Statistical analysis was conducted using Pearson's chi-square test, Fisher's exact test, and the t-test; p-values less than 0.05 were considered statistically significant. A total of 73 patients were enrolled and randomized, resulting in 36 women in the music group and 37 women in the non-music control group. The preoperative mean anxiety score was higher in the music group (46.8 vs 45.8 points). The postoperative mean anxiety scores in the music and non-music groups were 38.7 and 41.3 points, respectively. The VAS score was lower in the music group (2.55 vs 3.33). The percent change in anxiety was greater in the music group, although there was no significant difference between the two groups. Music therapy did not significantly reduce anxiety in patients undergoing the LLETZ procedure. However, different interventions should be developed to ease patients' apprehension during this procedure.

  11. Large-scale Estimates of Leaf Area Index from Active Remote Sensing Laser Altimetry

    Science.gov (United States)

    Hopkinson, C.; Mahoney, C.

    2016-12-01

Leaf area index (LAI) is a key parameter describing the spatial distribution of foliage within forest canopies, which in turn controls numerous relationships between the ground, canopy, and atmosphere. LAI has been retrieved successfully from in-situ digital hemispherical photography (DHP) and airborne laser scanning (ALS) data; however, field and ALS acquisitions are often spatially limited (100s of km^2) and costly. Large-scale (>1000s of km^2) retrievals have been demonstrated by optical sensors, but accuracies remain uncertain owing to these sensors' inability to penetrate the canopy. The spaceborne Geoscience Laser Altimeter System (GLAS) provides a possible solution, retrieving estimates at large scales while penetrating the canopy. LAI retrieved by multiple DHP at 6 Australian sites, representing a cross-section of Australian ecosystems, was employed to model ALS LAI, which in turn was used to infer LAI from GLAS data at 5 other sites. An optimally filtered GLAS dataset was then employed in conjunction with a host of supplementary data to build a Random Forest (RF) model to infer predictions (and uncertainties) of LAI at a 250 m resolution across the forested regions of Australia. Predictions were validated against ALS-based LAI from 20 sites (R^2=0.64, RMSE=1.1 m^2 m^-2); MODIS-based LAI was also assessed against these sites (R^2=0.30, RMSE=1.78 m^2 m^-2) to demonstrate the strength of the GLAS-based predictions. The large-scale nature of the predictions was also leveraged to demonstrate relationships of LAI with other environmental characteristics, such as canopy height, elevation, and slope. Such wide-scale quantification of LAI is key to the assessment and modification of forest management strategies across Australia, and assists Australia's Terrestrial Ecosystem Research Network in fulfilling its government-issued mandates.
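The continental prediction step above rests on a Random Forest regression. A minimal sketch, assuming scikit-learn is available; the feature set, coefficients, and data below are synthetic stand-ins for illustration, not the study's GLAS/ALS inputs:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(42)
n = 1000
# Assumed predictors: canopy height (m), elevation (m), slope (deg)
X = np.column_stack([
    rng.uniform(2, 40, n),      # canopy height
    rng.uniform(0, 1500, n),    # elevation
    rng.uniform(0, 30, n),      # slope
])
# Synthetic LAI: loosely increasing with canopy height, plus noise
y = 0.12 * X[:, 0] + 0.0003 * X[:, 1] + rng.normal(0, 0.5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
print(f"R2={r2_score(y_te, pred):.2f}  RMSE={mean_squared_error(y_te, pred) ** 0.5:.2f}")
```

An RF also exposes per-tree spread, which is one common (assumed here) route to the per-pixel uncertainties the abstract mentions.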

  12. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research by SINTEF Energy Research shows that, so far, approaches to large-scale network management may be structured according to three main challenges: centralization, decentralization, and outsourcing. The article is part of a planned series.

  13. A single point acupuncture treatment at large intestine meridian: a randomized controlled trial in acute tonsillitis and pharyngitis.

    Science.gov (United States)

    Fleckenstein, Johannes; Lill, Christian; Lüdtke, Rainer; Gleditsch, Jochen; Rasp, Gerd; Irnich, Dominik

    2009-09-01

One out of 4 patients visiting a general practitioner reports a sore throat associated with pain on swallowing. This study was established to examine the immediate pain-alleviating effect of a single-point acupuncture treatment applied to the large intestine meridian of patients with sore throat. Sixty patients with acute tonsillitis and pharyngitis were enrolled in this randomized placebo-controlled trial. They received either acupuncture or sham laser acupuncture, directed to the large intestine meridian section between acupuncture points LI 8 and LI 10. The main outcome measure was the change in pain intensity on swallowing a sip of water, evaluated by a visual analog scale 15 minutes after treatment. A credibility assessment regarding the respective treatment was performed. The pain intensity before and immediately after therapy was 5.6 ± 2.8 and 3.0 ± 3.0 for the acupuncture group, and 5.6 ± 2.5 and 3.8 ± 2.5 for the sham group, respectively. Despite a more pronounced improvement in the acupuncture group, there was no significant difference between groups (Δ = 0.9, confidence interval: -0.2 to 2.0; P = 0.12; analysis of covariance). Patients' satisfaction was high in both treatment groups. The study was prematurely terminated due to a subsequent lack of suitable patients. A single acupuncture treatment applied to a selected area of the large intestine meridian was no more effective in alleviating pain associated with clinical sore throat than sham laser acupuncture applied to the same area; nonetheless, clinically relevant improvement was achieved in both groups. Pain alleviation might partly be due to the intense palpation of the large intestine meridian. The benefit of a comprehensive acupuncture treatment protocol in this condition should be subject to further trials.

  14. The Cosmology Large Angular Scale Surveyor

    Science.gov (United States)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John; Bennett, Charles; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; hide

    2016-01-01

The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  15. Financial management of a large multisite randomized clinical trial.

    Science.gov (United States)

    Sheffet, Alice J; Flaxman, Linda; Tom, MeeLee; Hughes, Susan E; Longbottom, Mary E; Howard, Virginia J; Marler, John R; Brott, Thomas G

    2014-08-01

    The Carotid Revascularization Endarterectomy versus Stenting Trial (CREST) received five years' funding ($21 112 866) from the National Institutes of Health to compare carotid stenting to surgery for stroke prevention in 2500 randomized participants at 40 sites. Herein we evaluate the change in the CREST budget from a fixed to variable-cost model and recommend strategies for the financial management of large-scale clinical trials. Projections of the original grant's fixed-cost model were compared to the actual costs of the revised variable-cost model. The original grant's fixed-cost budget included salaries, fringe benefits, and other direct and indirect costs. For the variable-cost model, the costs were actual payments to the clinical sites and core centers based upon actual trial enrollment. We compared annual direct and indirect costs and per-patient cost for both the fixed and variable models. Differences between clinical site and core center expenditures were also calculated. Using a variable-cost budget for clinical sites, funding was extended by no-cost extension from five to eight years. Randomizing sites tripled from 34 to 109. Of the 2500 targeted sample size, 138 (5·5%) were randomized during the first five years and 1387 (55·5%) during the no-cost extension. The actual per-patient costs of the variable model were 9% ($13 845) of the projected per-patient costs ($152 992) of the fixed model. Performance-based budgets conserve funding, promote compliance, and allow for additional sites at modest additional cost. Costs of large-scale clinical trials can thus be reduced through effective management without compromising scientific integrity. © 2014 The Authors. International Journal of Stroke © 2014 World Stroke Organization.
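The abstract's headline comparison of the two budget models is simple arithmetic; as a quick check, using the per-patient figures quoted above:

```python
# Back-of-envelope check of the per-patient figures reported in the abstract.
fixed_per_patient = 152_992      # projected cost per patient, fixed-cost model ($)
variable_per_patient = 13_845    # actual cost per patient, variable-cost model ($)

ratio = variable_per_patient / fixed_per_patient
print(f"variable model cost per patient: {ratio:.0%} of fixed-model projection")
# prints: variable model cost per patient: 9% of fixed-model projection
```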

  16. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

The problems of vertically-integrated aquaculture are outlined; they are concerned with: species limitations (in the market, biological and technological); site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sector.

  17. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  18. Temporal flexibility and careers: The role of large-scale organizations for physicians

    OpenAIRE

    Forrest Briscoe

    2006-01-01

This study investigates how employment in large-scale organizations affects the work lives of practicing physicians. Well-established theory associates larger organizations with bureaucratic constraint, loss of workplace control, and dissatisfaction, but this author finds that large scale is also associated with greater schedule and career flexibility. Ironically, the bureaucratic p...

  19. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

      The subject of this paper is long-term large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed, which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  20. Sampling large random knots in a confined space

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M

    2007-01-01

DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order of O(n^2). Therefore, two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  1. Sampling large random knots in a confined space

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order of O(n^2). Therefore, two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  2. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order of O(n^2). Therefore, two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  3. Hypersingular integral equations, waveguiding effects in Cantorian Universe and genesis of large scale structures

    International Nuclear Information System (INIS)

    Iovane, G.; Giordano, P.

    2005-01-01

In this work we introduce hypersingular integral equations and analyze a realistic model of gravitational waveguides on a Cantorian space-time. A waveguiding effect is considered with respect to the large-scale structure of the Universe, where structure formation appears as if it were a classically self-similar random process at all astrophysical scales. The result is that it seems we live in an El Naschie ε^(∞) Cantorian space-time, where gravitational lensing and waveguiding effects can explain the appearing Universe. In particular, we consider filamentary and planar large-scale structures as possible refraction channels for electromagnetic radiation coming from cosmological structures. From this vision, supported by three numerical simulations, the Universe appears like a large self-similar set of adaptive mirrors. Consequently, an infinite Universe is just an optical illusion produced by mirroring effects connected with the large-scale structure of a finite, and not so large, Universe.

  4. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control System then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.

  5. Parallel Optimization of Polynomials for Large-scale Problems in Stability and Control

    Science.gov (United States)

    Kamyar, Reza

    In this thesis, we focus on some of the NP-hard problems in control theory. Thanks to converse Lyapunov theory, these problems can often be modeled as optimization over polynomials. To avoid the problem of intractability, we establish a trade-off between accuracy and complexity. In particular, we develop a sequence of tractable optimization problems --- in the form of Linear Programs (LPs) and/or Semi-Definite Programs (SDPs) --- whose solutions converge to the exact solution of the NP-hard problem. However, the computational and memory complexity of these LPs and SDPs grow exponentially with the progress of the sequence, meaning that improving the accuracy of the solutions requires solving SDPs with tens of thousands of decision variables and constraints. Setting up and solving such problems is a significant challenge. The existing optimization algorithms and software are only designed to use desktop computers or small cluster computers --- machines which do not have sufficient memory for solving such large SDPs. Moreover, the speed-up of these algorithms does not scale beyond dozens of processors. This is in fact why we seek parallel algorithms for setting up and solving large SDPs on large cluster- and/or super-computers. We propose parallel algorithms for stability analysis of two classes of systems: 1) linear systems with a large number of uncertain parameters; 2) nonlinear systems defined by polynomial vector fields. First, we develop a distributed parallel algorithm which applies Polya's and/or Handelman's theorems to some variants of parameter-dependent Lyapunov inequalities with parameters defined over the standard simplex. The result is a sequence of SDPs which possess a block-diagonal structure. We then develop a parallel SDP solver which exploits this structure in order to map the computation, memory and communication to a distributed parallel environment. Numerical tests on a supercomputer demonstrate the ability of the algorithm to
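
    Polya's theorem, one of the two results invoked above, states that if a homogeneous polynomial p is positive on the standard simplex, then for a sufficiently large degree d all coefficients of (x1 + ... + xm)^d · p are nonnegative, so certifying positivity reduces to linear (coefficient-sign) constraints. A minimal two-variable sketch (the exponent-dict representation and function names are mine, not the thesis's code):

```python
from collections import defaultdict

def poly_mul(p, q):
    """Multiply two polynomials stored as {(i, j): coeff} exponent dicts."""
    out = defaultdict(float)
    for (a, b), ca in p.items():
        for (c, d), cb in q.items():
            out[(a + c, b + d)] += ca * cb
    return dict(out)

def polya_degree(p, d_max=20):
    """Smallest d such that (x + y)^d * p has only nonnegative coefficients,
    certifying positivity of p on the simplex {x, y >= 0, x + y = 1}.
    Returns None if no certificate is found up to degree d_max."""
    simplex = {(1, 0): 1.0, (0, 1): 1.0}   # the polynomial x + y
    q = dict(p)
    for d in range(d_max + 1):
        if all(c >= 0 for c in q.values()):
            return d
        q = poly_mul(q, simplex)
    return None
```

    For x^2 - xy + y^2 (positive on the simplex despite the negative coefficient), a single multiplication by x + y already yields nonnegative coefficients; in the thesis's setting the same mechanism generates the LP/SDP constraints of the hierarchy.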

  6. Hierarchical, decentralized control system for large-scale smart-structures

    International Nuclear Information System (INIS)

    Algermissen, Stephan; Fröhlich, Tim; Monner, Hans Peter

    2014-01-01

    Active control of sound and vibration has gained much attention in all kinds of industries in the past decade. Future prospects for maximizing airline passenger comfort are especially promising. The objectives of recent research projects in this area are the reduction of noise transmission through thin-walled structures such as fuselages, linings or interior elements. Besides different external noise sources, such as the turbulent boundary layer, rotor or jet noise, the actuator and sensor placement as well as different control concepts are addressed. Mostly, the work is focused on a single panel or section of the fuselage, neglecting the fact that for effective noise reduction the entire fuselage has to be taken into account. Nevertheless, extending the scope of an active system from a single panel to the entire fuselage increases the effort for control hardware dramatically. This paper presents a control concept for large structures using distributed control nodes. Each node has the capability to execute a vibration or noise controller for a specific part or section of the fuselage. For maintenance, controller tuning or performance measurement, all nodes are connected to a host computer via Universal Serial Bus (USB). This topology allows a partitioning and distributing of tasks. The nodes execute the low-level control functions. High-level tasks like maintenance, system identification and control synthesis are operated by the host using streamed data from the nodes. By choosing low-cost nodes, a very cost-effective way of implementing an active system for large structures is realized. Besides the system identification and controller synthesis on the host computer, a detailed view on the hardware and software concept for the nodes is given. Finally, the results of an experimental test of a system running a robust vibration controller at an active panel demonstrator are shown.

  7. Gradient networks on uncorrelated random scale-free networks

    International Nuclear Information System (INIS)

    Pan Guijun; Yan Xiaoqing; Huang Zhongbing; Ma Weichuan

    2011-01-01

    Uncorrelated random scale-free (URSF) networks are useful null models for checking the effects of scale-free topology on network-based dynamical processes. Here, we present a comparative study of the jamming level of gradient networks based on URSF networks and Erdos-Renyi (ER) random networks. We find that the URSF networks are less congested than ER random networks for average degree ⟨k⟩ > k_c (where k_c ≈ 2 denotes a critical connectivity). In addition, by investigating the topological properties of the two kinds of gradient networks, we discuss the relations between the topological structure and the transport efficiency of the gradient networks. These findings show that the uncorrelated scale-free structure might allow more efficient transport than the random structure.
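
    The jamming measure compared here can be reproduced in miniature. In the standard gradient-network construction (Toroczkai and Bassler), each node carries a random scalar potential and directs its gradient edge toward the highest-potential node in its closed neighborhood; congestion is measured by the fraction of nodes receiving no gradient edge. A small sketch on an ER substrate (function names are mine):

```python
import random

def er_graph(n, p, rng):
    """Adjacency sets of an Erdos-Renyi G(n, p) random graph."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def jamming_coefficient(adj, rng):
    """J = 1 - R/N, where R is the number of nodes receiving at least
    one gradient edge; each node points to the highest-potential node
    in its closed neighborhood (itself included)."""
    n = len(adj)
    h = [rng.random() for _ in range(n)]          # random scalar potentials
    receivers = set()
    for i in range(n):
        candidates = adj[i] | {i}
        receivers.add(max(candidates, key=lambda j: h[j]))
    return 1.0 - len(receivers) / n
```

    On a complete graph every node points to the single global maximum, so J approaches 1 (maximal congestion); sparser substrates distribute the gradient edges over more receivers and lower J.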

  8. A Randomized Controlled Trial Evaluation of "Time to Read", a Volunteer Tutoring Program for 8- to 9-Year-Olds

    Science.gov (United States)

    Miller, Sarah; Connolly, Paul

    2013-01-01

    Tutoring is commonly employed to prevent early reading failure, and evidence suggests that it can have a positive effect. This article presents findings from a large-scale ("n" = 734) randomized controlled trial evaluation of the effect of "Time to Read"--a volunteer tutoring program aimed at children aged 8 to 9 years--on…

  9. Financial Management of a Large Multi-site Randomized Clinical Trial

    Science.gov (United States)

    Sheffet, Alice J.; Flaxman, Linda; Tom, MeeLee; Hughes, Susan E.; Longbottom, Mary E.; Howard, Virginia J.; Marler, John R.; Brott, Thomas G.

    2014-01-01

    Background The Carotid Revascularization Endarterectomy versus Stenting Trial (CREST) received five years’ funding ($21,112,866) from the National Institutes of Health to compare carotid stenting to surgery for stroke prevention in 2,500 randomized participants at 40 sites. Aims Herein we evaluate the change in the CREST budget from a fixed to variable-cost model and recommend strategies for the financial management of large-scale clinical trials. Methods Projections of the original grant’s fixed-cost model were compared to the actual costs of the revised variable-cost model. The original grant’s fixed-cost budget included salaries, fringe benefits, and other direct and indirect costs. For the variable-cost model, the costs were actual payments to the clinical sites and core centers based upon actual trial enrollment. We compared annual direct and indirect costs and per-patient cost for both the fixed and variable models. Differences between clinical site and core center expenditures were also calculated. Results Using a variable-cost budget for clinical sites, funding was extended by no-cost extension from five to eight years. Randomizing sites tripled from 34 to 109. Of the 2,500 targeted sample size, 138 (5.5%) were randomized during the first five years and 1,387 (55.5%) during the no-cost extension. The actual per-patient costs of the variable model were 9% ($13,845) of the projected per-patient costs ($152,992) of the fixed model. Conclusions Performance-based budgets conserve funding, promote compliance, and allow for additional sites at modest additional cost. Costs of large-scale clinical trials can thus be reduced through effective management without compromising scientific integrity. PMID:24661748
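
    The headline percentages can be checked directly from the figures reported in the abstract:

```python
# Reported CREST figures (USD and participant counts, from the abstract)
fixed_per_patient = 152_992     # projected per-patient cost, fixed model
variable_per_patient = 13_845   # actual per-patient cost, variable model
randomized_5yr, randomized_ext, target = 138, 1_387, 2_500

cost_ratio = variable_per_patient / fixed_per_patient   # ~9% of projection
enroll_5yr = randomized_5yr / target                    # ~5.5% in years 1-5
enroll_ext = randomized_ext / target                    # ~55.5% in extension
```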

  10. Large-scale modeling of rain fields from a rain cell deterministic model

    Science.gov (United States)

    Féral, Laurent; Sauvageot, Henri; Castanet, Laurent; Lemorton, Joël; Cornet, Frédéric; Leconte, Katia

    2006-04-01

    A methodology to simulate two-dimensional rain rate fields at large scale (1000 × 1000 km2, the scale of a satellite telecommunication beam or a terrestrial fixed broadband wireless access network) is proposed. It relies on a rain rate field cellular decomposition. At small scale (˜20 × 20 km2), the rain field is split up into its macroscopic components, the rain cells, described by the Hybrid Cell (HYCELL) cellular model. At midscale (˜150 × 150 km2), the rain field results from the conglomeration of rain cells modeled by HYCELL. To account for the rain cell spatial distribution at midscale, the latter is modeled by a doubly aggregative isotropic random walk, the optimal parameterization of which is derived from radar observations at midscale. The extension of the simulation area from the midscale to the large scale (1000 × 1000 km2) requires the modeling of the weather frontal area. The latter is first modeled by a Gaussian field with anisotropic covariance function. The Gaussian field is then turned into a binary field, giving the large-scale locations over which it is raining. This transformation requires the definition of the rain occupation rate over large-scale areas. Its probability distribution is determined from observations by the French operational radar network ARAMIS. The coupling with the rain field modeling at midscale is immediate whenever the large-scale field is split up into midscale subareas. The rain field thus generated accounts for the local CDF at each point, defining a structure spatially correlated at small scale, midscale, and large scale. It is then suggested that this approach be used by system designers to evaluate diversity gain, terrestrial path attenuation, or slant path attenuation for different azimuth and elevation angle directions.
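
    The Gaussian-to-binary step described for the weather frontal area can be illustrated simply: generate a correlated Gaussian field, then threshold it at the quantile that reproduces a prescribed rain occupation rate. A 1D toy version (an AR(1) recursion stands in for the anisotropic covariance; names and parameters are mine):

```python
import math
import random

def gaussian_field(n, corr, rng):
    """Toy 1D correlated Gaussian field built with an AR(1) recursion
    (a stand-in for the anisotropic covariance used at large scale)."""
    x = [rng.gauss(0.0, 1.0)]
    innov = math.sqrt(1.0 - corr * corr)
    for _ in range(n - 1):
        x.append(corr * x[-1] + innov * rng.gauss(0.0, 1.0))
    return x

def binarize(field, occupation_rate):
    """Threshold the Gaussian field so that a fraction `occupation_rate`
    of the points is raining (value 1), the rest dry (value 0)."""
    k = int(round(len(field) * occupation_rate))
    if k <= 0:
        return [0] * len(field)
    thresh = sorted(field, reverse=True)[k - 1]
    return [1 if v >= thresh else 0 for v in field]
```

    Because the underlying field is correlated, the raining points come out spatially clustered rather than scattered, which is the property the frontal-area model needs.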

  11. Investigation of the Contamination Control in a Cleaning Room with a Moving AGV by 3D Large-Scale Simulation

    Directory of Open Access Journals (Sweden)

    Qing-He Yao

    2013-01-01

    The motions of the airflow induced by the movement of an automatic guided vehicle (AGV) in a cleanroom are numerically studied by large-scale simulation. For this purpose, a numerical experiment scheme based on a domain decomposition method is designed. In contrast to related past research, the high Reynolds number is treated by large-scale computation in this work. A domain decomposition Lagrange-Galerkin method is employed to approximate the Navier-Stokes equations and the convection-diffusion equation; the stiffness matrix is symmetric, and an incomplete balancing preconditioned conjugate gradient (PCG) method is employed to solve the linear algebraic system iteratively. The end-wall effects are readily viewed, and the necessity of the extension to 3 dimensions is confirmed. The effect of the high-efficiency particulate air (HEPA) filter on contamination control is studied, and the proper setting of the clean airflow speed is also investigated. More details of the recirculation zones are revealed by the 3D large-scale simulation.

  12. LARGE-SCALE MERCURY CONTROL TECHNOLOGY TESTING FOR LIGNITE-FIRED UTILITIES - OXIDATION SYSTEMS FOR WET FGD

    Energy Technology Data Exchange (ETDEWEB)

    Michael J. Holmes; Steven A. Benson; Jeffrey S. Thompson

    2004-03-01

    The Energy & Environmental Research Center (EERC) is conducting a consortium-based effort directed toward resolving the mercury (Hg) control issues facing the lignite industry. Specifically, the EERC team--the EERC, EPRI, URS, ADA-ES, Babcock & Wilcox, the North Dakota Industrial Commission, SaskPower, and the Mercury Task Force, which includes Basin Electric Power Cooperative, Otter Tail Power Company, Great River Energy, Texas Utilities (TXU), Montana-Dakota Utilities Co., Minnkota Power Cooperative, BNI Coal Ltd., Dakota Westmoreland Corporation, and the North American Coal Company--has undertaken a project to significantly and cost-effectively oxidize elemental mercury in lignite combustion gases, followed by capture in a wet scrubber. This approach will be applicable to virtually every lignite utility in the United States and Canada and potentially impact subbituminous utilities. The oxidation process has been proven at pilot scale and in short-term full-scale tests. Additional optimization is continuing on oxidation technologies, and this project focuses on longer-term full-scale testing. The lignite industry has been proactive in advancing the understanding of and identifying control options for Hg in lignite combustion flue gases. Approximately 1 year ago, the EERC and EPRI began a series of Hg-related discussions with the Mercury Task Force as well as utilities firing Texas and Saskatchewan lignites. This project is one of three being undertaken by the consortium to perform large-scale Hg control technology testing to address the specific needs and challenges to be met in controlling Hg from lignite-fired power plants. This project involves Hg oxidation upstream of a system equipped with an electrostatic precipitator (ESP) followed by wet flue gas desulfurization (FGD). The team involved in conducting the technical aspects of the project includes the EERC, Babcock & Wilcox, URS, and ADA-ES. The host sites include Minnkota Power Cooperative Milton R. Young

  13. Sustainability Risk Evaluation for Large-Scale Hydropower Projects with Hybrid Uncertainty

    Directory of Open Access Journals (Sweden)

    Weiyao Tang

    2018-01-01

    As large-scale hydropower projects are influenced by many factors, risk evaluations are complex. This paper considers a hydropower project as a complex system from the perspective of sustainability risk, and divides it into three subsystems: the natural environment subsystem, the eco-environment subsystem, and the socioeconomic subsystem. Risk-related factors and quantitative dimensions of each subsystem are comprehensively analyzed, with the uncertainty of some quantitative dimensions handled by hybrid uncertainty methods, covering fuzzy uncertainty (e.g., national health degree, national happiness degree, protection of cultural heritage), random uncertainty (e.g., underground water levels, river width), and fuzzy random uncertainty (e.g., runoff volumes, precipitation). By calculating the sustainability risk-related degree of each risk-related factor, a sustainability risk-evaluation model is built. Based on the calculation results, the critical sustainability risk-related factors are identified and targeted to reduce the losses caused by sustainability risk factors of the hydropower project. A case study at the under-construction Baihetan hydropower station is presented to demonstrate the viability of the risk-evaluation model and to provide a reference for the sustainability risk evaluation of other large-scale hydropower projects.

  14. Large-scale control site selection for population monitoring: an example assessing Sage-grouse trends

    Science.gov (United States)

    Fedy, Bradley C.; O'Donnell, Michael; Bowen, Zachary H.

    2015-01-01

    Human impacts on wildlife populations are widespread and prolific, and understanding wildlife responses to human impacts is a fundamental component of wildlife management. The first step to understanding wildlife responses is the documentation of changes in wildlife population parameters, such as population size. Meaningful assessment of population changes in potentially impacted sites requires the establishment of monitoring at similar, nonimpacted, control sites. However, it is often difficult to identify appropriate control sites in wildlife populations. We demonstrated the use of Geographic Information System (GIS) data across large spatial scales to select biologically relevant control sites for population monitoring. Greater sage-grouse (Centrocercus urophasianus; hereafter, sage-grouse) are negatively affected by energy development, and monitoring of sage-grouse populations within energy development areas is necessary to detect population-level responses. We used population data (1995–2012) from an energy development area in Wyoming, USA, the Atlantic Rim Project Area (ARPA), and GIS data to identify control sites that were not impacted by energy development for population monitoring. Control sites were surrounded by similar habitat and were within similar climate areas to the ARPA. We developed nonlinear trend models for both the ARPA and control sites and compared long-term trends from the 2 areas. We found little difference between the ARPA and control site trends over time. This research demonstrated an approach for control site selection across large landscapes and can be used as a template for similar impact-monitoring studies. It is important to note that identification of changes in population parameters between control and treatment sites is only the first step in understanding the mechanisms that underlie those changes. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
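
    The selection logic generalizes readily: score candidate sites by the similarity of their GIS covariates to the impact area and keep the closest matches. A schematic version (the covariates, site names, and values below are invented for illustration, not from the study):

```python
def select_control_sites(impact, candidates, k=2):
    """Rank candidate sites by Euclidean distance between standardized
    GIS covariates (e.g., habitat, climate) and the impact area's values,
    and return the k most similar as control sites."""
    def dist(feats):
        return sum((a - b) ** 2 for a, b in zip(impact, feats)) ** 0.5
    ranked = sorted(candidates, key=lambda item: dist(item[1]))
    return [name for name, _ in ranked[:k]]

# Hypothetical standardized covariates: (sagebrush cover, precip, elevation)
impact_area = (0.6, 0.3, 0.5)
candidates = [("A", (0.58, 0.31, 0.52)),
              ("B", (0.10, 0.90, 0.20)),
              ("C", (0.61, 0.29, 0.51))]
controls = select_control_sites(impact_area, candidates)
```

    Site B, with very different covariates, is excluded; trend models would then be fit separately to the impact area and the selected controls, as in the study.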

  15. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminant impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications.

  16. Computational Techniques for Model Predictive Control of Large-Scale Systems with Continuous-Valued and Discrete-Valued Inputs

    Directory of Open Access Journals (Sweden)

    Koichi Kobayashi

    2013-01-01

    We propose computational techniques for model predictive control of large-scale systems with both continuous-valued and discrete-valued control inputs, which are a class of hybrid systems. In the proposed method, we introduce the notion of virtual control inputs, which are obtained by relaxing discrete-valued control inputs to continuous variables. In online computation, we first find the continuous-valued control inputs and virtual control inputs minimizing a cost function. Next, using the obtained virtual control inputs, only the discrete-valued control inputs at the current time are computed in each subsystem. We also discuss the effect of quantization errors. Finally, the effectiveness of the proposed method is shown by a numerical example. The proposed method enables us to reduce and decentralize the computation load.
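
    The virtual-control-input idea can be illustrated with a deliberately tiny example. This is my sketch of the relax-then-quantize pattern, not the paper's algorithm: relax the discrete inputs to continuous "virtual" values, minimize the (here, separable quadratic) cost in the relaxed problem, then snap each discrete-valued input to its nearest admissible level.

```python
def relax_and_quantize(setpoint, levels, discrete_idx):
    """Relax-then-quantize sketch for the separable quadratic cost
    sum_i (u_i - setpoint_i)^2: the relaxed (virtual) minimizer is simply
    u_i = setpoint_i; inputs listed in discrete_idx are then snapped to
    their nearest admissible level, the rest remain continuous."""
    u = list(setpoint)                    # virtual control inputs
    for i in discrete_idx:
        u[i] = min(levels, key=lambda lv: abs(lv - u[i]))
    return u

def quantization_error(setpoint, u):
    """Worst-case deviation introduced by snapping to discrete levels."""
    return max(abs(a - b) for a, b in zip(setpoint, u))
```

    With on/off levels {0, 1}, an input relaxed to 0.8 is snapped to 1 and the induced quantization error is 0.2; the paper's analysis concerns how such errors propagate through the closed loop.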

  17. A Review of Control Strategies for Large-Scale Electric Vehicle Charging and Discharging Behavior

    Science.gov (United States)

    Kong, Lingyu; Han, Jiming; Xiong, Wenting; Wang, Hao; Shen, Yaqi; Li, Ying

    2017-05-01

    Large-scale access of electric vehicles will bring huge challenges to the safe operation of the power grid, so it is important to control the charging and discharging of electric vehicles. First, in terms of power quality and network loss, this paper points out the influence of electric vehicle charging behavior on the grid. Control strategies for electric vehicle charging and discharging are then surveyed and summarized under two headings: direct and indirect control. Direct control strategies manage charging behavior by directly controlling the charging and discharging power of each electric vehicle, while indirect control strategies operate by controlling the price of charging and discharging. For the convenience of the reader, this paper also proposes a complete outline of research methods for studying such control strategies, taking into consideration the adaptability of electric vehicle control strategies and the possibility of their failure. Finally, suggestions on key areas for future research are put forward.
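
    Indirect (price-based) control can be sketched in a few lines: the operator publishes a price per time slot, and each vehicle independently charges during its cheapest available slots. A hypothetical toy, with made-up prices (not from the review):

```python
def schedule_charging(prices, energy_needed, available):
    """Indirect-control sketch: a vehicle responds to the published price
    signal by charging during its `energy_needed` cheapest available
    time slots; returns the chosen slot indices in time order."""
    slots = sorted((p, t) for t, p in enumerate(prices) if t in available)
    return sorted(t for _, t in slots[:energy_needed])

prices = [0.30, 0.12, 0.10, 0.25, 0.11, 0.28]   # $/kWh per slot (hypothetical)
plan = schedule_charging(prices, 3, set(range(6)))
```

    Direct control would instead set each vehicle's charging power centrally; the price signal here only shifts demand toward cheap (typically off-peak) slots.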

  18. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...

  19. Combined cognitive-strategy and task-specific training improves transfer to untrained activities in sub-acute stroke: An exploratory randomized controlled trial

    Science.gov (United States)

    McEwen, Sara; Polatajko, Helene; Baum, Carolyn; Rios, Jorge; Cirone, Dianne; Doherty, Meghan; Wolf, Timothy

    2014-01-01

    Purpose The purpose of this study was to estimate the effect of the Cognitive Orientation to daily Occupational Performance (CO-OP) approach compared to usual outpatient rehabilitation on activity and participation in people less than 3 months post stroke. Methods An exploratory, single-blind, randomized controlled trial with a usual care control arm was conducted. Participants referred to 2 stroke rehabilitation outpatient programs were randomized to receive either Usual Care or CO-OP. The primary outcome was actual performance of trained and untrained self-selected activities, measured using the Performance Quality Rating Scale (PQRS). Additional outcomes included the Canadian Occupational Performance Measure (COPM), the Stroke Impact Scale Participation Domain, the Community Participation Index, and the Self-Efficacy Gauge. Results Thirty-five (35) eligible participants were randomized; 26 completed the intervention. Post-intervention, PQRS change scores demonstrated that CO-OP had a medium effect over Usual Care on trained self-selected activities (d=0.5) and a large effect on untrained activities (d=1.2). At a 3-month follow-up, PQRS change scores indicated a large effect of CO-OP on both trained (d=1.6) and untrained (d=1.1) activities. CO-OP had a small effect on the COPM and a medium effect on the Community Participation Index perceived control and the Self-Efficacy Gauge. Conclusion CO-OP was associated with a large treatment effect on follow-up performance of self-selected activities, and demonstrated transfer to untrained activities. A larger trial is warranted. PMID:25416738

  20. The Cosmology Large Angular Scale Surveyor (CLASS)

    Science.gov (United States)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John W.; Bennett, Charles L.; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; et al.

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  1. Secure access control and large scale robust representation for online multimedia event detection.

    Science.gov (United States)

    Liu, Changyu; Lu, Bin; Li, Huiling

    2014-01-01

    We developed an online multimedia event detection (MED) system. However, a secure access control issue and a large-scale robust representation issue arise when traditional event detection algorithms are integrated into the online environment. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role-based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for the access control of dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag-of-words tiling approach was then adopted to encode these feature vectors in order to bridge the gap between objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms the state-of-the-art approaches.
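
    The spatial bag-of-words tiling step can be sketched as max-pooling each object detector's response map over a coarse spatial grid and concatenating the pooled values into a single event descriptor. A minimal stand-in (not the authors' code; the grid size and pooling choice are illustrative):

```python
def spatial_bow(response_maps, grid=2):
    """Encode per-detector response maps by max-pooling each map over a
    grid x grid spatial tiling and concatenating the pooled values.
    With K maps the descriptor has K * grid**2 entries."""
    feats = []
    for rmap in response_maps:            # rmap: 2D list of responses
        h, w = len(rmap), len(rmap[0])
        for gy in range(grid):
            for gx in range(grid):
                ys = range(gy * h // grid, (gy + 1) * h // grid)
                xs = range(gx * w // grid, (gx + 1) * w // grid)
                feats.append(max(rmap[y][x] for y in ys for x in xs))
    return feats
```

    Pooling over tiles (rather than the whole map) retains coarse location information about where each object fires, which is what bridges object responses to event-level structure.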

  2. Secure Access Control and Large Scale Robust Representation for Online Multimedia Event Detection

    Directory of Open Access Journals (Sweden)

    Changyu Liu

    2014-01-01

    We developed an online multimedia event detection (MED) system. However, a secure access control issue and a large-scale robust representation issue arise when traditional event detection algorithms are integrated into the online environment. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role-based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for the access control of dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag-of-words tiling approach was then adopted to encode these feature vectors in order to bridge the gap between objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms the state-of-the-art approaches.

  3. Membrane biofilm communities in full-scale membrane bioreactors are not randomly assembled and consist of a core microbiome

    KAUST Repository

    Matar, Gerald Kamil

    2017-06-21

    Finding efficient biofouling control strategies requires a better understanding of the microbial ecology of membrane biofilm communities in membrane bioreactors (MBRs). Studies that characterized the membrane biofilm communities in lab- and pilot-scale MBRs are numerous, yet similar studies in full-scale MBRs are limited. Also, most of these studies have characterized the mature biofilm communities, with very few addressing early biofilm communities. In this study, five full-scale MBRs located in Seattle (Washington, U.S.A.) were selected to address two questions concerning membrane biofilm communities (early and mature): (i) Is the assembly of biofilm communities (early and mature) the result of random immigration of species from the source community (i.e., activated sludge)? and (ii) Is there a core membrane biofilm community in full-scale MBRs? Membrane biofilm (early and mature) and activated sludge (AS) samples were collected from the five MBRs, and 16S rRNA gene sequencing was applied to investigate the bacterial communities of AS and membrane biofilms (early and mature). Alpha and beta diversity measures revealed clear differences in the bacterial community structure between the AS and biofilm (early and mature) samples in the five full-scale MBRs. These differences were mainly due to the presence of a large number of unique but rare operational taxonomic units (∼13% of total reads in each MBR) in each sample. In contrast, a high percentage (∼87% of total reads in each MBR) of sequence reads was shared between AS and biofilm samples in each MBR, and these shared sequence reads mainly belong to the dominant taxa in these samples. Despite the large fraction of shared sequence reads between AS and biofilm samples, simulated biofilm communities from random sampling of the respective AS community revealed that the biofilm communities differed significantly from the random assemblages (P < 0.001 for each MBR), indicating that the biofilm communities (early
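
    The null-model comparison reported here (observed biofilms versus random assemblages drawn from the sludge) can be sketched as a resampling test: draw simulated communities from the activated-sludge relative abundances and build a null distribution of community distances. Bray-Curtis dissimilarity is shown; the function names and toy counts are illustrative, not the study's pipeline:

```python
import random

def bray_curtis(x, y):
    """Bray-Curtis dissimilarity between two taxon count vectors."""
    num = sum(abs(a - b) for a, b in zip(x, y))
    return num / (sum(x) + sum(y))

def null_distances(source_counts, n_reads, n_sim, rng):
    """Distances between the source (AS) community and random assemblages
    drawn from it by multinomial resampling of n_reads reads each."""
    taxa = list(range(len(source_counts)))
    total = sum(source_counts)
    weights = [c / total for c in source_counts]
    dists = []
    for _ in range(n_sim):
        sample = [0] * len(taxa)
        for t in rng.choices(taxa, weights=weights, k=n_reads):
            sample[t] += 1
        dists.append(bray_curtis(source_counts, sample))
    return dists
```

    A one-sided p-value is then the fraction of null distances at least as large as the observed biofilm-to-sludge distance; a tiny p-value, as in the study, indicates the biofilm is not a random draw from the sludge.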

  4. The Landscape Evolution Observatory: a large-scale controllable infrastructure to study coupled Earth-surface processes

    Science.gov (United States)

    Pangle, Luke A.; DeLong, Stephen B.; Abramson, Nate; Adams, John; Barron-Gafford, Greg A.; Breshears, David D.; Brooks, Paul D.; Chorover, Jon; Dietrich, William E.; Dontsova, Katerina; Durcik, Matej; Espeleta, Javier; Ferré, T.P.A.; Ferriere, Regis; Henderson, Whitney; Hunt, Edward A.; Huxman, Travis E.; Millar, David; Murphy, Brendan; Niu, Guo-Yue; Pavao-Zuckerman, Mitch; Pelletier, Jon D.; Rasmussen, Craig; Ruiz, Joaquin; Saleska, Scott; Schaap, Marcel; Sibayan, Michael; Troch, Peter A.; Tuller, Markus; van Haren, Joost; Zeng, Xubin

    2015-01-01

    Zero-order drainage basins, and their constituent hillslopes, are the fundamental geomorphic unit comprising much of Earth's uplands. The convergent topography of these landscapes generates spatially variable substrate and moisture content, facilitating biological diversity and influencing how the landscape filters precipitation and sequesters atmospheric carbon dioxide. In light of these significant ecosystem services, refining our understanding of how these functions are affected by landscape evolution, weather variability, and long-term climate change is imperative. In this paper we introduce the Landscape Evolution Observatory (LEO): a large-scale controllable infrastructure consisting of three replicated artificial landscapes (each 330 m2 surface area) within the climate-controlled Biosphere 2 facility in Arizona, USA. At LEO, experimental manipulation of rainfall, air temperature, relative humidity, and wind speed are possible at unprecedented scale. The Landscape Evolution Observatory was designed as a community resource to advance understanding of how topography, physical and chemical properties of soil, and biological communities coevolve, and how this coevolution affects water, carbon, and energy cycles at multiple spatial scales. With well-defined boundary conditions and an extensive network of sensors and samplers, LEO enables an iterative scientific approach that includes numerical model development and virtual experimentation, physical experimentation, data analysis, and model refinement. We plan to engage the broader scientific community through public dissemination of data from LEO, collaborative experimental design, and community-based model development.

  5. Limited accessibility to designs and results of Japanese large-scale clinical trials for cardiovascular diseases.

    Science.gov (United States)

    Sawata, Hiroshi; Ueshima, Kenji; Tsutani, Kiichiro

    2011-04-14

    Clinical evidence is important for improving the treatment of patients by health care providers. In the study of cardiovascular diseases, large-scale clinical trials involving thousands of participants are required to evaluate the risks of cardiac events and/or death. The problems encountered in conducting the Japanese Acute Myocardial Infarction Prospective (JAMP) study highlighted the difficulties involved in obtaining the financial and infrastructural resources necessary for conducting large-scale clinical trials. The objectives of the current study were: 1) to clarify the current funding and infrastructural environment surrounding large-scale clinical trials in cardiovascular and metabolic diseases in Japan, and 2) to find ways to improve the environment surrounding clinical trials in Japan more generally. We examined clinical trials examining cardiovascular diseases that evaluated true endpoints and involved 300 or more participants using Pub-Med, Ichushi (by the Japan Medical Abstracts Society, a non-profit organization), websites of related medical societies, the University Hospital Medical Information Network (UMIN) Clinical Trials Registry, and clinicaltrials.gov at three points in time: 30 November, 2004, 25 February, 2007 and 25 July, 2009. We found a total of 152 trials that met our criteria for 'large-scale clinical trials' examining cardiovascular diseases in Japan. Of these, 72.4% were randomized controlled trials (RCTs). Of 152 trials, 9.2% of the trials examined more than 10,000 participants, and 42.8% examined between 1,000 and 10,000 participants. The number of large-scale clinical trials markedly increased from 2001 to 2004, but suddenly decreased in 2007, then began to increase again. Ischemic heart disease (39.5%) was the most common target disease. Most of the larger-scale trials were funded by private organizations such as pharmaceutical companies. The designs and results of 13 trials were not disclosed. To improve the quality of clinical

  6. Limited accessibility to designs and results of Japanese large-scale clinical trials for cardiovascular diseases

    Directory of Open Access Journals (Sweden)

    Tsutani Kiichiro

    2011-04-01

    Full Text Available Abstract Background Clinical evidence is important for improving the treatment of patients by health care providers. In the study of cardiovascular diseases, large-scale clinical trials involving thousands of participants are required to evaluate the risks of cardiac events and/or death. The problems encountered in conducting the Japanese Acute Myocardial Infarction Prospective (JAMP) study highlighted the difficulties involved in obtaining the financial and infrastructural resources necessary for conducting large-scale clinical trials. The objectives of the current study were: 1) to clarify the current funding and infrastructural environment surrounding large-scale clinical trials in cardiovascular and metabolic diseases in Japan, and 2) to find ways to improve the environment surrounding clinical trials in Japan more generally. Methods We examined clinical trials examining cardiovascular diseases that evaluated true endpoints and involved 300 or more participants using Pub-Med, Ichushi (by the Japan Medical Abstracts Society, a non-profit organization), websites of related medical societies, the University Hospital Medical Information Network (UMIN) Clinical Trials Registry, and clinicaltrials.gov at three points in time: 30 November, 2004, 25 February, 2007 and 25 July, 2009. Results We found a total of 152 trials that met our criteria for 'large-scale clinical trials' examining cardiovascular diseases in Japan. Of these, 72.4% were randomized controlled trials (RCTs). Of 152 trials, 9.2% of the trials examined more than 10,000 participants, and 42.8% examined between 1,000 and 10,000 participants. The number of large-scale clinical trials markedly increased from 2001 to 2004, but suddenly decreased in 2007, then began to increase again. Ischemic heart disease (39.5%) was the most common target disease. Most of the larger-scale trials were funded by private organizations such as pharmaceutical companies. 
The designs and results of 13 trials were not disclosed.

  7. Large Deviations for Two-Time-Scale Diffusions, with Delays

    International Nuclear Information System (INIS)

    Kushner, Harold J.

    2010-01-01

    We consider the problem of large deviations for a two-time-scale reflected diffusion process, possibly with delays in the dynamical terms. The Dupuis-Ellis weak convergence approach is used. It is perhaps the most intuitive and simplest for the problems of concern. The results have applications to the problem of approximating optimal controls for two-time-scale systems via use of the averaged equation.

  8. Accuracy assessment of planimetric large-scale map data for decision-making

    Directory of Open Access Journals (Sweden)

    Doskocz Adam

    2016-06-01

    Full Text Available This paper presents decision-making risk estimation based on planimetric large-scale map data, i.e. data sets or databases useful for creating planimetric maps at scales of 1:5,000 or larger. The studies were conducted on four data sets of large-scale map data. Errors in the map data were used to assess the risk of decision-making about the location of objects, e.g. in land-use planning for investments. The analysis was performed on a large statistical sample of shift vectors of control points, which were identified with the position errors of these points (errors of map data).
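
    The error measure behind this record can be illustrated with a minimal sketch: shift vectors between reference coordinates and map-derived coordinates of control points, summarized as a root-mean-square position error. The function names and the coordinates in the usage below are hypothetical, not taken from the paper's data sets.

    ```python
    import math

    def position_errors(ref_pts, map_pts):
        """Shift vectors (dx, dy) between reference and large-scale map
        coordinates, treated as position errors of the control points."""
        return [(xm - xr, ym - yr)
                for (xr, yr), (xm, ym) in zip(ref_pts, map_pts)]

    def rmse(errors):
        """Root-mean-square position error of a set of shift vectors."""
        return math.sqrt(sum(dx * dx + dy * dy for dx, dy in errors) / len(errors))
    ```

    For example, `rmse(position_errors(reference_points, digitized_points))` gives a single accuracy figure that can feed a risk assessment for object localization.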

  9. Growth of wrinkle-free graphene on texture-controlled platinum films and thermal-assisted transfer of large-scale patterned graphene.

    Science.gov (United States)

    Choi, Jae-Kyung; Kwak, Jinsung; Park, Soon-Dong; Yun, Hyung Duk; Kim, Se-Yang; Jung, Minbok; Kim, Sung Youb; Park, Kibog; Kang, Seoktae; Kim, Sung-Dae; Park, Dong-Yeon; Lee, Dong-Su; Hong, Suk-Kyoung; Shin, Hyung-Joon; Kwon, Soon-Yong

    2015-01-27

    Growth of large-scale patterned, wrinkle-free graphene and a gentle transfer technique that avoids further damage are the most important requirements for the practical use of graphene. Here we report the growth of wrinkle-free, strictly uniform monolayer graphene films by chemical vapor deposition on a platinum (Pt) substrate with texture-controlled giant grains and the thermal-assisted transfer of large-scale patterned graphene onto arbitrary substrates. The designed Pt surfaces, with limited numbers of grain boundaries, improved surface perfectness and a small thermal-expansion-coefficient mismatch with graphene, provide a venue for uniform growth of monolayer graphene with wrinkle-free characteristics. The thermal-assisted transfer technique allows the complete transfer of large-scale patterned graphene films onto arbitrary substrates without any ripples, tears, or folds. The transferred graphene shows high crystalline quality with an average carrier mobility of ∼5500 cm(2) V(-1) s(-1) at room temperature. Furthermore, this transfer technique shows a high tolerance to variations in the types and morphologies of underlying substrates.

  10. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes evaluation method of faultless function of large scale integration circuits (LSI) and very large scale integration circuits (VLSI). In the article there is a comparative analysis of factors which determine faultless of integrated circuits, analysis of already existing methods and model of faultless function evaluation of LSI and VLSI. The main part describes a proposed algorithm and program for analysis of fault rate in LSI and VLSI circuits.

  11. Improved decomposition–coordination and discrete differential dynamic programming for optimization of large-scale hydropower system

    International Nuclear Information System (INIS)

    Li, Chunlong; Zhou, Jianzhong; Ouyang, Shuo; Ding, Xiaoling; Chen, Lu

    2014-01-01

    Highlights: • Optimization of large-scale hydropower system in the Yangtze River basin. • Improved decomposition–coordination and discrete differential dynamic programming. • Generating initial solution randomly to reduce generation time. • Proposing relative coefficient for more power generation. • Proposing adaptive bias corridor technology to enhance convergence speed. - Abstract: With the construction of major hydro plants, more and more large-scale hydropower systems are taking shape gradually, which poses the challenge of optimizing these systems. Optimization of large-scale hydropower system (OLHS), which is to determine the water discharges or water levels of all hydro plants that maximize total power generation subject to many constraints, is a high-dimensional, nonlinear and coupled complex problem. In order to solve the OLHS problem effectively, an improved decomposition–coordination and discrete differential dynamic programming (IDC–DDDP) method is proposed in this paper. A strategy in which the initial solution is generated randomly is adopted to reduce generation time. Meanwhile, a relative coefficient based on maximum output capacity is proposed for more power generation. Moreover, an adaptive bias corridor technology is proposed to enhance convergence speed. The proposed method is applied to long-term optimal dispatch of the large-scale hydropower system (LHS) in the Yangtze River basin. Compared to other methods, IDC–DDDP has competitive performance in both total power generation and convergence speed, providing a new method to solve the OLHS problem.
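
    Discrete differential dynamic programming of the kind improved in this record searches a narrow "corridor" of states around the current trajectory and re-centres on the best corridor trajectory until no improvement is found. The following single-reservoir toy sketch conveys only that corridor idea; the stage-benefit function, integer storage levels and fixed boundary levels are assumptions, and the paper's IDC–DDDP additionally involves decomposition–coordination, a relative coefficient and an adaptive bias corridor.

    ```python
    def dddp(init_traj, benefit, corridor=1, n_iter=50):
        """Toy one-reservoir DDDP (T >= 2 stages, integer storage levels).

        init_traj : list of storage levels per stage; first/last are fixed.
        benefit(t, s_prev, s_next) -> stage benefit (e.g. power generated).
        Each iteration runs a DP over a narrow corridor of states around
        the current trajectory and re-centres on the best trajectory found.
        """
        traj = list(init_traj)
        T = len(traj)
        for _ in range(n_iter):
            # candidate states per stage: a corridor around the trajectory
            states = [[traj[t] + d for d in range(-corridor, corridor + 1)]
                      for t in range(T)]
            states[0], states[-1] = [traj[0]], [traj[-1]]  # fixed boundaries
            # forward DP over corridor states
            best = {s: 0.0 for s in states[0]}
            back = [dict() for _ in range(T)]
            for t in range(1, T):
                new = {}
                for s in states[t]:
                    val, arg = max((best[p] + benefit(t, p, s), p)
                                   for p in states[t - 1])
                    new[s] = val
                    back[t][s] = arg
                best = new
            # backtrack the best trajectory within the corridor
            s = max(best, key=best.get)
            new_traj = [s]
            for t in range(T - 1, 0, -1):
                s = back[t][s]
                new_traj.append(s)
            new_traj.reverse()
            if new_traj == traj:   # converged: corridor offers no improvement
                break
            traj = new_traj
        return traj
    ```

    With a benefit that peaks at an interior storage level, the trajectory climbs one corridor step per iteration and stops when re-centring no longer helps.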

  12. Kinetically controlled synthesis of large-scale morphology-tailored silver nanostructures at low temperature

    Science.gov (United States)

    Zhang, Ling; Zhao, Yuda; Lin, Ziyuan; Gu, Fangyuan; Lau, Shu Ping; Li, Li; Chai, Yang

    2015-08-01

    Ag nanostructures are widely used in catalysis, energy conversion and chemical sensing. Morphology-tailored synthesis of Ag nanostructures is critical to tune physical and chemical properties. In this study, we develop a method for synthesizing the morphology-tailored Ag nanostructures in aqueous solution at a low temperature (45 °C). With the use of AgCl nanoparticles as the precursor, the growth kinetics of Ag nanostructures can be tuned with the pH value of solution and the concentration of Pd cubes which catalyze the reaction. Ascorbic acid and cetylpyridinium chloride are used as the mild reducing agent and capping agent in aqueous solution, respectively. High-yield Ag nanocubes, nanowires, right triangular bipyramids/cubes with twinned boundaries, and decahedra are successfully produced. Our method opens up a new environmentally-friendly and economical route to synthesize large-scale and morphology-tailored Ag nanostructures, which is significant to the controllable fabrication of Ag nanostructures and fundamental understanding of the growth kinetics.

  13. A Randomized, Controlled Clinical Trial Comparing Efficacy, Safety ...

    African Journals Online (AJOL)

    A Randomized, Controlled Clinical Trial Comparing Efficacy, Safety and Cost Effectiveness of Lornoxicam with Diclofenac Sodium in Patients of Osteoarthritis Knee. ... All patients were assessed with visual analogue scale and 100 meter walking test before starting of therapy, at 15 days and at 1, 2 and 3 months of therapy.

  14. Validation of the process control system of an automated large scale manufacturing plant.

    Science.gov (United States)

    Neuhaus, H; Kremers, H; Karrer, T; Traut, R H

    1998-02-01

    The validation procedure for the process control system of a plant for the large scale production of human albumin from plasma fractions is described. A validation master plan is developed, defining the system and elements to be validated, the interfaces with other systems with the validation limits, a general validation concept and supporting documentation. Based on this master plan, the validation protocols are developed. For the validation, the system is subdivided into a field level, which is the equipment part, and an automation level. The automation level is further subdivided into sections according to the different software modules. Based on a risk categorization of the modules, the qualification activities are defined. The test scripts for the different qualification levels (installation, operational and performance qualification) are developed according to a previously performed risk analysis.

  15. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Full Text Available Abstract Background The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally-identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
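
    The Detrended Fluctuation Analysis central to this record can be sketched in a few lines: integrate the deviations from the mean, detrend the profile within boxes of increasing size, and read the scaling exponent α off the log-log slope of the fluctuation function. This is a minimal DFA-1 sketch; the box sizes and names are illustrative, and the paper's method additionally tracks local variations of the exponent along the sequence.

    ```python
    import math

    def _linfit(xs, ys):
        """Ordinary least-squares slope and intercept."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        b = sxy / sxx
        return b, my - b * mx

    def dfa_exponent(seq, box_sizes=(8, 16, 32, 64)):
        """DFA-1 scaling exponent alpha of a numeric sequence (e.g. a
        numeric coding of a DNA stretch). alpha ~ 0.5 for an uncorrelated
        sequence; larger alpha indicates long-range correlations."""
        mean = sum(seq) / len(seq)
        profile, acc = [], 0.0           # integrated deviations from the mean
        for v in seq:
            acc += v - mean
            profile.append(acc)
        logs, logf = [], []
        for n in box_sizes:
            sq = []
            for start in range(0, len(profile) - n + 1, n):
                box = profile[start:start + n]
                xs = list(range(n))
                b, a = _linfit(xs, box)  # linear detrend within the box
                sq.extend((y - (a + b * x)) ** 2 for x, y in zip(xs, box))
            logs.append(math.log(n))
            logf.append(0.5 * math.log(sum(sq) / len(sq)))
        alpha, _ = _linfit(logs, logf)   # slope of log F(n) vs log n
        return alpha
    ```

    Patchy, isochore-like structure shows up as α well above 0.5 at large box sizes.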

  16. Multi-parameter decoupling and slope tracking control strategy of a large-scale high altitude environment simulation test cabin

    Directory of Open Access Journals (Sweden)

    Li Ke

    2014-12-01

    Full Text Available A large-scale high altitude environment simulation test cabin was developed to accurately control the temperatures and pressures encountered at high altitudes. The system was developed to provide slope-tracking dynamic control of the two parameters, temperature and pressure, and to overcome the control difficulties inherent to a large-inertia lag link in a complex control system composed of a turbine refrigeration device, a vacuum device and a liquid nitrogen cooling device. The system includes multi-parameter decoupling of the cabin itself to avoid damage to the air-refrigeration turbine caused by improper operation. Based on analysis of the dynamic characteristics and modeling of the variations in temperature, pressure and rotation speed, an intelligent controller was implemented that combines decoupling and fuzzy arithmetic with an expert PID controller to control the test parameters through a decoupling and slope-tracking control strategy. The control system employs centralized management in an open industrial Ethernet architecture with an industrial computer at its core. Simulation results and field debugging show that this method solves the poor anti-interference performance typical of a conventional PID and the overshoot that can readily damage equipment. The steady-state characteristics meet the system requirements.
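
    The slope-tracking idea, ramping the setpoint so a large-inertia plant follows a slope rather than a step, can be illustrated with a textbook discrete PID. This is only a sketch: the gains, time step and the first-order plant in the usage note are assumptions, not the paper's expert-PID/fuzzy hybrid with decoupling.

    ```python
    class PID:
        """Textbook discrete PID controller (illustrative only)."""
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_err = None

        def step(self, setpoint, measured):
            e = setpoint - measured
            self.integral += e * self.dt
            d = 0.0 if self.prev_err is None else (e - self.prev_err) / self.dt
            self.prev_err = e
            return self.kp * e + self.ki * self.integral + self.kd * d

    def ramp_setpoint(t, start, slope):
        """Slope-tracking reference: ramp the target instead of stepping it,
        so a large-inertia plant is not driven into overshoot."""
        return start + slope * t
    ```

    Driving a first-order lag `x' = (u - x)/tau` with this controller and a ramped reference keeps the tracking error bounded, whereas a step reference would provoke the overshoot the abstract warns about.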

  17. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  18. Nonlinear evolution of large-scale structure in the universe

    International Nuclear Information System (INIS)

    Frenk, C.S.; White, S.D.M.; Davis, M.

    1983-01-01

    Using N-body simulations we study the nonlinear development of primordial density perturbations in an Einstein--de Sitter universe. We compare the evolution of an initial distribution without small-scale density fluctuations to evolution from a random Poisson distribution. These initial conditions mimic the assumptions of the adiabatic and isothermal theories of galaxy formation. The large-scale structures which form in the two cases are markedly dissimilar. In particular, the correlation function ξ(r) and the visual appearance of our adiabatic (or ''pancake'') models match better the observed distribution of galaxies. This distribution is characterized by large-scale filamentary structure. Because the pancake models do not evolve in a self-similar fashion, the slope of ξ(r) steepens with time; as a result there is a unique epoch at which these models fit the galaxy observations. We find the ratio of cutoff length to correlation length at this time to be λ_min/r₀ = 5.1; its expected value in a neutrino-dominated universe is 4(Ωh)⁻¹ (H₀ = 100h km s⁻¹ Mpc⁻¹). At early epochs these models predict a negligible amplitude for ξ(r) and could explain the lack of measurable clustering in the Lyα absorption lines of high-redshift quasars. However, large-scale structure in our models collapses after z = 2. If this collapse precedes galaxy formation as in the usual pancake theory, galaxies formed uncomfortably recently. The extent of this problem may depend on the cosmological model used; the present series of experiments should be extended in the future to include models with Ω < 1
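
    The correlation function ξ(r) referred to in this record can be estimated, in a deliberately crude form, by comparing pair counts in the data with those in a same-size uniform random catalogue (the "natural estimator" DD/RR − 1). The O(n²) loop, single distance bin and unit-square geometry below are simplifications for illustration, not the simulation analysis of the paper.

    ```python
    import random

    def pair_count(pts, rmax):
        """Number of point pairs separated by less than rmax (O(n^2) sketch)."""
        n, c = len(pts), 0
        for i in range(n):
            for j in range(i + 1, n):
                dx = pts[i][0] - pts[j][0]
                dy = pts[i][1] - pts[j][1]
                if dx * dx + dy * dy < rmax * rmax:
                    c += 1
        return c

    def xi(data, rmax, seed=0):
        """Natural estimator xi = DD/RR - 1 for separations below rmax,
        against a uniform random catalogue of the same size in the unit
        square. xi ~ 0 for an unclustered (Poisson) distribution; xi > 0
        indicates clustering at that scale."""
        rng = random.Random(seed)
        rand = [(rng.random(), rng.random()) for _ in data]
        return pair_count(data, rmax) / pair_count(rand, rmax) - 1
    ```

    A strongly clustered point set yields ξ well above zero at small separations, which is the sense in which the pancake models "match better" the galaxy distribution.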

  19. Selective vulnerability related to aging in large-scale resting brain networks.

    Science.gov (United States)

    Zhang, Hong-Ying; Chen, Wen-Xin; Jiao, Yun; Xu, Yao; Zhang, Xiang-Rong; Wu, Jing-Tao

    2014-01-01

    Normal aging is associated with cognitive decline. Evidence indicates that large-scale brain networks are affected by aging; however, it has not been established whether aging has equivalent effects on specific large-scale networks. In the present study, 40 healthy subjects including 22 older (aged 60-80 years) and 18 younger (aged 22-33 years) adults underwent resting-state functional MRI scanning. Four canonical resting-state networks, including the default mode network (DMN), executive control network (ECN), dorsal attention network (DAN) and salience network, were extracted, and the functional connectivities in these canonical networks were compared between the younger and older groups. We found distinct, disruptive alterations present in the large-scale aging-related resting brain networks: the ECN was affected the most, followed by the DAN. However, the DMN and salience networks showed limited functional connectivity disruption. The visual network served as a control and was similarly preserved in both groups. Our findings suggest that the aged brain is characterized by selective vulnerability in large-scale brain networks. These results could help improve our understanding of the mechanism of degeneration in the aging brain. Additional work is warranted to determine whether selective alterations in the intrinsic networks are related to impairments in behavioral performance.
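
    Functional connectivity of the kind compared between the age groups in this record is commonly summarized as the mean pairwise Pearson correlation among a network's regional (ROI) time series. The following is a minimal sketch with illustrative function names and toy series, not the study's preprocessing or statistics.

    ```python
    import math

    def pearson(x, y):
        """Pearson correlation coefficient between two time series."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        vx = sum((a - mx) ** 2 for a in x)
        vy = sum((b - my) ** 2 for b in y)
        return cov / math.sqrt(vx * vy)

    def network_connectivity(roi_series):
        """Mean pairwise Pearson correlation among a network's ROI time
        series -- a common functional-connectivity summary that could be
        compared between groups (e.g. younger vs older)."""
        m = len(roi_series)
        pairs = [(i, j) for i in range(m) for j in range(i + 1, m)]
        return sum(pearson(roi_series[i], roi_series[j])
                   for i, j in pairs) / len(pairs)
    ```

    Lower values of such a summary in one group are what "functional connectivity disruption" refers to.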

  20. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between the large-scale structure formation and the baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120h⁻¹ Mpc in the distribution of the visible matter of the universe is provided. The possibility to generate a periodic distribution with the characteristic scale 120h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low-temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  1. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: administrators carry a heavy workload, and much time is spent on management and maintenance. The nodes of a large-scale cluster system easily fall into disorder; with thousands of nodes housed in large machine rooms, administrators can easily confuse machines. How can accurate management be carried out effectively in a large-scale cluster system? This article introduces ELFms for the large-scale cluster system and, furthermore, proposes an approach to realize its automatic management. (authors)

  2. Large-scale data analysis of power grid resilience across multiple US service regions

    Science.gov (United States)

    Ji, Chuanyi; Wei, Yun; Mei, Henry; Calzada, Jorge; Carey, Matthew; Church, Steve; Hayes, Timothy; Nugent, Brian; Stella, Gregory; Wallace, Matthew; White, Joe; Wilcox, Robert

    2016-05-01

    Severe weather events frequently result in large-scale power failures, affecting millions of people for extended durations. However, the lack of comprehensive, detailed failure and recovery data has impeded large-scale resilience studies. Here, we analyse data from four major service regions representing Upstate New York during Super Storm Sandy and daily operations. Using non-stationary spatiotemporal random processes that relate infrastructural failures to recoveries and cost, our data analysis shows that local power failures have a disproportionally large non-local impact on people (that is, the top 20% of failures interrupted 84% of services to customers). A large number (89%) of small failures, represented by the bottom 34% of customers and commonplace devices, resulted in 56% of the total cost of 28 million customer interruption hours. Our study shows that extreme weather does not cause, but rather exacerbates, existing vulnerabilities, which are obscured in daily operations.
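
    The "top 20% of failures interrupted 84% of services" finding is a Pareto-style concentration statistic; given a list of per-failure customer impacts it reduces to a simple ranked-share computation. The function name and the numbers in the test are hypothetical, not the study's data.

    ```python
    def impact_concentration(impacts, top_frac=0.2):
        """Fraction of total impact (e.g. customer interruption hours)
        attributable to the top `top_frac` of failures, ranked by impact.
        A value near 1 means impact is highly concentrated in few failures."""
        ranked = sorted(impacts, reverse=True)
        k = max(1, round(top_frac * len(ranked)))
        return sum(ranked[:k]) / sum(ranked)
    ```

    Applied to real outage records, a value like 0.84 at `top_frac=0.2` would reproduce the disproportionate non-local impact the study reports.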

  3. Large-scale use of mosquito larval source management for malaria control in Africa: a cost analysis.

    Science.gov (United States)

    Worrall, Eve; Fillinger, Ulrike

    2011-11-08

    At present, large-scale use of two malaria vector control methods, long-lasting insecticidal nets (LLINs) and indoor residual spraying (IRS) is being scaled up in Africa with substantial funding from donors. A third vector control method, larval source management (LSM), has been historically very successful and is today widely used for mosquito control globally, except in Africa. With increasing risk of insecticide resistance and a shift to more exophilic vectors, LSM is now under re-evaluation for use against afro-tropical vector species. Here the costs of this intervention were evaluated. The 'ingredients approach' was used to estimate the economic and financial costs per person protected per year (pppy) for large-scale LSM using microbial larvicides in three ecologically diverse settings: (1) the coastal metropolitan area of Dar es Salaam in Tanzania, (2) a highly populated Kenyan highland area (Vihiga District), and (3) a lakeside setting in rural western Kenya (Mbita Division). Two scenarios were examined to investigate the cost implications of using alternative product formulations. Sensitivity analyses on product prices were carried out. The results show that for programmes using the same granular formulation larviciding costs the least pppy in Dar es Salaam (US$0.94), approximately 60% more in Vihiga District (US$1.50) and the most in Mbita Division (US$2.50). However, these costs are reduced substantially if an alternative water-dispensable formulation is used; in Vihiga, this would reduce costs to US$0.79 and, in Mbita Division, to US$1.94. Larvicide and staff salary costs each accounted for approximately a third of the total economic costs per year. The cost pppy depends mainly on: (1) the type of formulation required for treating different aquatic habitats, (2) the human population density relative to the density of aquatic habitats and (3) the potential to target the intervention in space and/or time. 
Costs for LSM compare favourably with costs for IRS
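
    The "ingredients approach" arithmetic in this record reduces to summing itemized annual costs and dividing by the protected population, and the formulation-price sensitivity analysis rescales a single ingredient. The sketch below uses hypothetical budget lines and figures, not the study's actual cost data.

    ```python
    def cost_pppy(ingredients, people_protected):
        """Cost per person protected per year (pppy): the sum of itemized
        annual 'ingredient' costs divided by the protected population."""
        return sum(ingredients.values()) / people_protected

    def price_sensitivity(ingredients, people_protected, item, factor):
        """Recompute pppy after scaling one ingredient's price, e.g. to
        compare a granular larvicide formulation with a cheaper
        water-dispersible one."""
        adjusted = dict(ingredients, **{item: ingredients[item] * factor})
        return cost_pppy(adjusted, people_protected)
    ```

    Halving the larvicide line while holding salaries constant lowers pppy by roughly a sixth in the toy budget below, mirroring how formulation choice drove the cost differences between settings.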

  4. Large-scale use of mosquito larval source management for malaria control in Africa: a cost analysis

    Science.gov (United States)

    2011-01-01

    Background At present, large-scale use of two malaria vector control methods, long-lasting insecticidal nets (LLINs) and indoor residual spraying (IRS) is being scaled up in Africa with substantial funding from donors. A third vector control method, larval source management (LSM), has been historically very successful and is today widely used for mosquito control globally, except in Africa. With increasing risk of insecticide resistance and a shift to more exophilic vectors, LSM is now under re-evaluation for use against afro-tropical vector species. Here the costs of this intervention were evaluated. Methods The 'ingredients approach' was used to estimate the economic and financial costs per person protected per year (pppy) for large-scale LSM using microbial larvicides in three ecologically diverse settings: (1) the coastal metropolitan area of Dar es Salaam in Tanzania, (2) a highly populated Kenyan highland area (Vihiga District), and (3) a lakeside setting in rural western Kenya (Mbita Division). Two scenarios were examined to investigate the cost implications of using alternative product formulations. Sensitivity analyses on product prices were carried out. Results The results show that for programmes using the same granular formulation larviciding costs the least pppy in Dar es Salaam (US$0.94), approximately 60% more in Vihiga District (US$1.50) and the most in Mbita Division (US$2.50). However, these costs are reduced substantially if an alternative water-dispensable formulation is used; in Vihiga, this would reduce costs to US$0.79 and, in Mbita Division, to US$1.94. Larvicide and staff salary costs each accounted for approximately a third of the total economic costs per year. The cost pppy depends mainly on: (1) the type of formulation required for treating different aquatic habitats, (2) the human population density relative to the density of aquatic habitats and (3) the potential to target the intervention in space and/or time. 
Conclusion Costs for LSM

  5. Received signal strength in large-scale wireless relay sensor network: a stochastic ray approach

    NARCIS (Netherlands)

    Hu, L.; Chen, Y.; Scanlon, W.G.

    2011-01-01

    The authors consider a point percolation lattice representation of a large-scale wireless relay sensor network (WRSN) deployed in a cluttered environment. Each relay sensor corresponds to a grid point in the random lattice and the signal sent by the source is modelled as an ensemble of photons that

  6. Large-scale control of mosquito vectors of disease

    International Nuclear Information System (INIS)

    Curtis, C.F.; Andreasen, M.H.

    2000-01-01

    By far the most important vector-borne disease is malaria, transmitted by Anopheles mosquitoes and causing an estimated 300-500 million clinical cases per year and 1.4-2.6 million deaths, mostly in tropical Africa (WHO 1995). The second most important mosquito-borne disease is lymphatic filariasis, but there are now such effective, convenient and cheap drugs for its treatment that vector control will now have at most a supplementary role (Maxwell et al. 1999a). The only other mosquito-borne disease likely to justify large-scale vector control is dengue, which is carried in urban areas of Southeast Asia and Latin America by Aedes aegypti L., which was also the urban vector of yellow fever in Latin America. This mosquito was eradicated from most countries of Latin America between the 1930s and 60s but, unfortunately, in recent years it has been allowed to re-infest and cause serious dengue epidemics, except in Cuba where it has been held close to eradication (Reiter and Gubler 1997). In the 1930s and 40s, invasions by An. gambiae Giles s.l., the main tropical African malaria vector, were eradicated from Brazil (Soper and Wilson 1943) and Egypt (Shousha 1947). It is surprising that greatly increased air traffic has not led to more such invasions of apparently climatically suitable areas, e.g., of Polynesia which has no anophelines and therefore no malaria. The above-mentioned temporary or permanent eradications were achieved before the advent of DDT, using larvicidal methods (of a kind which would now be considered environmentally unacceptable) carried out by rigorously disciplined teams. MALARIA: Between the end of the Second World War and the 1960s, the availability of DDT for spraying of houses allowed eradication of malaria from the Soviet Union, southern Europe, the USA, northern Venezuela and Guyana, Taiwan and the Caribbean Islands, apart from Hispaniola. Its range and intensity were also greatly reduced in China, India and South Africa and, at least temporarily, in

  7. [A large-scale accident in Alpine terrain].

    Science.gov (United States)

    Wildner, M; Paal, P

    2015-02-01

    Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication to the control room, which is required to dispatch the necessary additional support. Only with a clear "concept", to which all have to adhere, can the subsequent chaos phase be limited. In this respect, time pressure compounded by adverse weather conditions or darkness is enormous. Additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, treatment and procedure algorithms have proven successful. For evacuation of casualties, helicopter transport should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute the patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to optimization of rescue measures, regular training and exercises are rational, as is the analysis of previous large-scale Alpine accidents.

  8. On Event-Triggered Adaptive Architectures for Decentralized and Distributed Control of Large-Scale Modular Systems.

    Science.gov (United States)

    Albattat, Ali; Gruenwald, Benjamin C; Yucelen, Tansel

    2016-08-16

    The last decade has witnessed an increased interest in physical systems controlled over wireless networks (networked control systems). These systems allow the computation of control signals via processors that are not attached to the physical systems, and the feedback loops are closed over wireless networks. The contribution of this paper is to design and analyze event-triggered decentralized and distributed adaptive control architectures for uncertain networked large-scale modular systems; that is, systems consisting of physically interconnected modules controlled over wireless networks. Specifically, the proposed adaptive architectures guarantee overall system stability while reducing wireless network utilization and achieving a given system performance in the presence of system uncertainties that can result from modeling errors and degraded modes of operation of the modules and their interconnections. In addition to the theoretical findings, including a rigorous stability and boundedness analysis of the closed-loop dynamical system and a characterization of the effect of user-defined event-triggering thresholds and design parameters on overall system performance, an illustrative numerical example is provided to demonstrate the efficacy of the proposed decentralized and distributed control approaches.

  9. On Event-Triggered Adaptive Architectures for Decentralized and Distributed Control of Large-Scale Modular Systems

    Directory of Open Access Journals (Sweden)

    Ali Albattat

    2016-08-01

    Full Text Available The last decade has witnessed an increased interest in physical systems controlled over wireless networks (networked control systems). These systems allow the computation of control signals via processors that are not attached to the physical systems, and the feedback loops are closed over wireless networks. The contribution of this paper is to design and analyze event-triggered decentralized and distributed adaptive control architectures for uncertain networked large-scale modular systems; that is, systems consisting of physically interconnected modules controlled over wireless networks. Specifically, the proposed adaptive architectures guarantee overall system stability while reducing wireless network utilization and achieving a given system performance in the presence of system uncertainties that can result from modeling errors and degraded modes of operation of the modules and their interconnections. In addition to the theoretical findings, including a rigorous stability and boundedness analysis of the closed-loop dynamical system and a characterization of the effect of user-defined event-triggering thresholds and design parameters on overall system performance, an illustrative numerical example is provided to demonstrate the efficacy of the proposed decentralized and distributed control approaches.
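    The event-triggering idea described in the two records above can be sketched in a few lines. The scalar plant, gains, and threshold below are hypothetical stand-ins, not values from the paper; the point is only that the sensor transmits a fresh state sample over the network when the state has drifted from the last transmitted value by more than a threshold, trading control accuracy against network utilization.

```python
import math

def simulate_event_triggered(a=-1.0, b=1.0, k=2.0, eps=0.05, dt=0.01, steps=500):
    """Scalar plant x' = a*x + b*u with event-triggered feedback u = -k*x_hat.

    The controller holds the last transmitted sample x_hat and only receives
    a new one (a 'network transmission') when |x - x_hat| exceeds the
    event-triggering threshold eps. All parameters are illustrative.
    """
    x = 1.0          # plant state
    x_hat = x        # last state sample sent over the network
    transmissions = 0
    for _ in range(steps):
        if abs(x - x_hat) > eps:   # event condition: send a fresh sample
            x_hat = x
            transmissions += 1
        u = -k * x_hat
        x += dt * (a * x + b * u)  # forward-Euler plant update
    return x, transmissions

x_final, n_tx = simulate_event_triggered()
```

    Raising eps reduces the number of transmissions further, at the cost of a larger ultimate bound on the state, which mirrors the threshold/performance trade-off characterized in the paper.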

  10. Large scale Brownian dynamics of confined suspensions of rigid particles

    Science.gov (United States)

    Sprinkle, Brennan; Balboa Usabiaga, Florencio; Patankar, Neelesh A.; Donev, Aleksandar

    2017-12-01

    We introduce methods for large-scale Brownian Dynamics (BD) simulation of many rigid particles of arbitrary shape suspended in a fluctuating fluid. Our method adds Brownian motion to the rigid multiblob method [F. Balboa Usabiaga et al., Commun. Appl. Math. Comput. Sci. 11(2), 217-296 (2016)] at a cost comparable to the cost of deterministic simulations. We demonstrate that we can efficiently generate deterministic and random displacements for many particles using preconditioned Krylov iterative methods, if kernel methods to efficiently compute the action of the Rotne-Prager-Yamakawa (RPY) mobility matrix and its "square" root are available for the given boundary conditions. These kernel operations can be computed with near linear scaling for periodic domains using the positively split Ewald method. Here we study particles partially confined by gravity above a no-slip bottom wall using a graphical processing unit implementation of the mobility matrix-vector product, combined with a preconditioned Lanczos iteration for generating Brownian displacements. We address a major challenge in large-scale BD simulations, capturing the stochastic drift term that arises because of the configuration-dependent mobility. Unlike the widely used Fixman midpoint scheme, our methods utilize random finite differences and do not require the solution of resistance problems or the computation of the action of the inverse square root of the RPY mobility matrix. We construct two temporal schemes which are viable for large-scale simulations, an Euler-Maruyama traction scheme and a trapezoidal slip scheme, which minimize the number of mobility problems to be solved per time step while capturing the required stochastic drift terms. We validate and compare these schemes numerically by modeling suspensions of boomerang-shaped particles sedimented near a bottom wall. 
Using the trapezoidal scheme, we investigate the steady-state active motion in dense suspensions of confined microrollers, whose
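    The random finite difference (RFD) idea mentioned above can be illustrated for a single particle with a scalar, configuration-dependent mobility. The mobility function below is invented for illustration and is not the wall-corrected RPY mobility of the paper; the sketch shows how the stochastic drift kT ∂M/∂x is estimated by averaging a centered difference over random displacements, with no analytic derivative or resistance solve.

```python
import numpy as np

rng = np.random.default_rng(0)
kT = 1.0  # thermal energy (arbitrary units)

def mobility(x):
    # Hypothetical scalar mobility rising with wall distance x > 0,
    # a crude stand-in for hindered mobility near a no-slip boundary.
    return x / (1.0 + x)

def rfd_drift(x, delta=1e-4, samples=1000):
    """Random finite difference estimate of the stochastic drift kT * dM/dx.

    Averages (kT/delta) * [M(x + delta*w/2) - M(x - delta*w/2)] * w over
    Gaussian displacements w; the expectation converges to kT * M'(x).
    """
    w = rng.standard_normal(samples)
    return kT * np.mean((mobility(x + delta * w / 2)
                         - mobility(x - delta * w / 2)) * w) / delta

def em_step(x, force, dt):
    """One Euler-Maruyama step of overdamped Brownian dynamics,
    dx = [M(x) F + kT dM/dx] dt + sqrt(2 kT M(x) dt) N(0,1)."""
    m = mobility(x)
    return (x + dt * (m * force + rfd_drift(x))
            + np.sqrt(2 * kT * m * dt) * rng.standard_normal())

drift = rfd_drift(1.0)          # analytic value: kT * d/dx[x/(1+x)] at x=1 is 0.25
x1 = em_step(1.0, force=0.0, dt=0.01)
```

    Omitting the drift term biases the particle's steady-state distribution whenever the mobility depends on configuration, which is why the schemes in the paper are built to capture it.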

  11. Culturally adaptive storytelling method to improve hypertension control in Vietnam - "We talk about our hypertension": study protocol for a feasibility cluster-randomized controlled trial.

    Science.gov (United States)

    Allison, Jeroan J; Nguyen, Hoa L; Ha, Duc A; Chiriboga, Germán; Ly, Ha N; Tran, Hanh T; Phan, Ngoc T; Vu, Nguyen C; Kim, Minjin; Goldberg, Robert J

    2016-01-14

    Vietnam is experiencing an epidemiologic transition with an increased prevalence of non-communicable diseases. At present, the major risk factors for cardiovascular disease (CVD) are either on the rise or at alarming levels in Vietnam; as such, the burden of CVD will continue to increase unless effective prevention and control measures are put in place. A national survey in 2008 found that the prevalence of hypertension (HTN) was approximately 25% among Vietnamese adults and that it increased with advancing age. Therefore, novel, large-scale, and sustainable public health education interventions to promote engagement in the process of detecting and treating HTN in Vietnam are urgently needed. A feasibility randomized trial will be conducted in Hung Yen province, Vietnam to evaluate the feasibility and acceptability of a novel community-based intervention using the "storytelling" method to enhance the control of HTN in adults residing in four rural communities. The intervention will center on stories about living with HTN, with patients speaking in their own words. The stories will be obtained from particularly eloquent patients, or "video stars," identified during Story Development Groups. The study will involve two phases: (i) developing a HTN intervention using the storytelling method, which is designed to empower patients to facilitate changes in their lifestyle practices, and (ii) conducting a feasibility cluster-randomized trial to investigate the feasibility, acceptability, and potential efficacy of the intervention compared with usual care in HTN control among rural residents. The trial will be conducted at four communes, and within each commune, 25 individuals 50 years or older with HTN will be enrolled, resulting in a total sample size of 100 patients. This feasibility trial will provide the necessary groundwork for a subsequent large-scale, fully powered, cluster-randomized controlled trial to test the efficacy of our novel

  12. Nonlinear Model-Based Predictive Control applied to Large Scale Cryogenic Facilities

    CERN Document Server

    Blanco Vinuela, Enrique; de Prada Moraga, Cesar

    2001-01-01

    The thesis addresses the study, analysis, development, and finally the real implementation of an advanced control system for the 1.8 K Cooling Loop of the LHC (Large Hadron Collider) accelerator. The LHC is the next accelerator being built at CERN (European Center for Nuclear Research); it will use superconducting magnets operating below a temperature of 1.9 K along a circumference of 27 kilometers. The temperature of these magnets is a control parameter with strict operating constraints. The first control implementations applied a procedure that included linear identification, modelling and regulation using a linear predictive controller. This largely improved the overall performance of the plant with respect to a classical PID regulator, but the nature of the cryogenic processes pointed out the need for a more adequate technique, such as a nonlinear methodology. This thesis is a first step towards a global regulation strategy for the overall control of the LHC cells when they operate simultaneously....

  13. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems. Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continue to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  14. Simulation test of PIUS-type reactor with large scale experimental apparatus

    International Nuclear Information System (INIS)

    Tamaki, M.; Tsuji, Y.; Ito, T.; Tasaka, K.; Kukita, Yutaka

    1995-01-01

    A large scale experimental apparatus for simulating the PIUS-type reactor has been constructed, preserving the volumetric scaling ratio with respect to the realistic reactor model. Fundamental experiments such as steady-state operation and a pump trip simulation were performed. Experimental results were compared with those obtained with the small scale apparatus at JAERI. We have already reported the effectiveness of feedback control of the primary loop pump speed (PI control) for stable operation. In this paper, this feedback system is modified and PID control is introduced. The new system worked well for the operation of the PIUS-type reactor even in a rapid transient condition. (author)
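    As a rough sketch of the PID feedback described in the record above (the gains, time step, and first-order pump-speed model are hypothetical, not those of the experimental apparatus):

```python
class PID:
    """Minimal discrete PID controller (gains below are illustrative only)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt              # removes steady-state offset
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical first-order pump-speed response driven by the PID output.
pid = PID(kp=2.0, ki=1.0, kd=0.1, dt=0.05)
speed = 0.0
for _ in range(400):
    u = pid.update(setpoint=1.0, measurement=speed)
    speed += 0.05 * (-speed + u)
```

    Relative to PI control, the added derivative term reacts to the rate of change of the error, which is what helps during rapid transients of the kind reported above.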

  15. Complex modular structure of large-scale brain networks

    Science.gov (United States)

    Valencia, M.; Pastor, M. A.; Fernández-Seara, M. A.; Artieda, J.; Martinerie, J.; Chavez, M.

    2009-06-01

    Modular structure is ubiquitous among real-world networks from related proteins to social groups. Here we analyze the modular organization of brain networks at a large scale (voxel level) extracted from functional magnetic resonance imaging signals. By using a random-walk-based method, we unveil the modularity of brain webs and show modules with a spatial distribution that matches anatomical structures with functional significance. The functional role of each node in the network is studied by analyzing its patterns of inter- and intramodular connections. Results suggest that the modular architecture constitutes the structural basis for the coexistence of functional integration of distant and specialized brain areas during normal brain activities at rest.

  16. Large-scale Ising-machines composed of magnetic neurons

    Science.gov (United States)

    Mizushima, Koichi; Goto, Hayato; Sato, Rie

    2017-10-01

    We propose Ising-machines composed of magnetic neurons, that is, magnetic bits in a recording track. In large-scale machines, the sizes of both neurons and synapses need to be reduced, and neat and smart connections among neurons are also required to achieve all-to-all connectivity among them. These requirements can be fulfilled by adopting magnetic recording technologies such as race-track memories and skyrmion tracks because the area of a magnetic bit is almost two orders of magnitude smaller than that of static random access memory, which has normally been used as a semiconductor neuron, and the smart connections among neurons are realized by using the read and write methods of these technologies.

  17. The Transition to Large-scale Cosmic Homogeneity in the WiggleZ Dark Energy Survey

    Science.gov (United States)

    Scrimgeour, Morag; Davis, T.; Blake, C.; James, B.; Poole, G. B.; Staveley-Smith, L.; Dark Energy Survey, WiggleZ

    2013-01-01

    The most fundamental assumption of the standard cosmological model (ΛCDM) is that the universe is homogeneous on large scales. This is clearly not true on small scales, where clusters and voids exist, and some studies seem to suggest that galaxies follow a fractal distribution up to very large scales (200 h-1 Mpc or more), whereas the ΛCDM model predicts a transition to homogeneity at scales of ~100 h-1 Mpc. Any cosmological measurements made below the scale of homogeneity (such as the power spectrum) could be misleading, so it is crucial to measure the scale of homogeneity in the Universe. We have used the WiggleZ Dark Energy Survey to make the largest volume measurement to date of the transition to homogeneity in the galaxy distribution. WiggleZ is a UV-selected spectroscopic survey of ~200,000 luminous blue galaxies up to z=1, made with the Anglo-Australian Telescope. We have corrected for survey incompleteness using random catalogues that account for the various survey selection criteria, and tested the robustness of our results using a suite of fractal mock catalogues. The large volume and depth of WiggleZ allow us to probe the transition of the galaxy distribution to homogeneity on large scales and over several epochs, and to see whether this is consistent with the ΛCDM prediction.
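    The counts-in-spheres measurement underlying such homogeneity tests can be sketched on a toy catalogue (a uniform random box, not WiggleZ data; sizes and radii are arbitrary): the slope of log⟨N(<r)⟩ versus log r approaches the ambient dimension 3 for a homogeneous 3D distribution, and falls below 3 for a fractal one.

```python
import numpy as np

rng = np.random.default_rng(3)

def counts_in_spheres_slope(points, radii, centers=200):
    """Slope of log<N(<r)> vs log r over randomly chosen sphere centres.

    For a homogeneous 3D point set the slope tends to 3 (edge effects pull
    it slightly lower in a finite box); a fractal set gives a smaller value.
    """
    idx = rng.choice(len(points), size=centers, replace=False)
    mean_counts = []
    for r in radii:
        counts = [np.sum(np.linalg.norm(points - points[i], axis=1) < r) - 1
                  for i in idx]                      # -1 excludes the centre itself
        mean_counts.append(np.mean(counts))
    slope, _ = np.polyfit(np.log(radii), np.log(mean_counts), 1)
    return slope

points = rng.uniform(0.0, 1.0, size=(4000, 3))       # homogeneous toy catalogue
radii = np.array([0.06, 0.09, 0.13, 0.2])
d2 = counts_in_spheres_slope(points, radii)          # close to 3 for homogeneity
```

    A real survey analysis additionally weights by random catalogues to correct for the selection function, as the abstract describes.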

  18. Localization Algorithm Based on a Spring Model (LASM) for Large Scale Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Shuai Li

    2008-03-01

    Full Text Available A navigation method for a lunar rover based on large scale wireless sensor networks is proposed. To obtain high navigation accuracy and a large exploration area, high node localization accuracy and a large network scale are required. However, the computational and communication complexity and time consumption are greatly increased with the increase of the network scale. A localization algorithm based on a spring model (LASM) is proposed to reduce the computational complexity, while maintaining the localization accuracy in large scale sensor networks. The algorithm simulates the dynamics of a physical spring system to estimate the positions of nodes. The sensor nodes are set as particles with masses and connected with neighbor nodes by virtual springs. The virtual springs force the particles to move from the randomly set positions to the original positions, the node positions correspondingly. Therefore, a blind node position can be determined by the LASM algorithm by calculating the related forces with the neighbor nodes. The computational and communication complexity are O(1) for each node, since the number of neighbor nodes does not increase proportionally with the network scale. Three patches are proposed to avoid local optimization, kick out bad nodes and deal with node variation. Simulation results show that the computational and communication complexity are almost constant despite the increase of the network scale. The time consumption has also been proven to remain almost constant, since the calculation steps are almost unrelated to the network scale.
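    The spring relaxation at the heart of LASM can be sketched for a single blind node with four anchors. The positions and noise-free ranges below are made up for illustration, and the paper's patches for local minima, bad nodes and node variation are omitted:

```python
import numpy as np

def lasm_estimate(anchors, ranges, start, iters=500, step=0.1):
    """Relax one blind node under virtual spring forces (LASM-style sketch).

    Each anchor-node link is a spring whose rest length is the measured
    range: a compressed spring (current distance < range) pushes the node
    away from the anchor, a stretched one pulls it closer. For noise-free
    ranges the forces balance at the true position.
    """
    pos = np.asarray(start, dtype=float)
    for _ in range(iters):
        force = np.zeros(2)
        for a, d in zip(anchors, ranges):
            diff = pos - a
            dist = np.linalg.norm(diff)
            if dist > 1e-9:
                force += (d - dist) * diff / dist
        pos += step * force            # move along the net spring force
    return pos

anchors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
true_node = np.array([0.3, 0.6])
ranges = np.linalg.norm(anchors - true_node, axis=1)   # noise-free measured distances
est = lasm_estimate(anchors, ranges, start=[0.9, 0.1])
```

    Because each node only needs the ranges to its own neighbors, the per-node cost stays constant as the network grows, which is the O(1) property the abstract emphasizes.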

  19. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (≳1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in ≳1/4 of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  20. Money matters: evidence from a large-scale randomized field experiment with vouchers for adult training

    OpenAIRE

    Messer, Dolores; Wolter, Stefan C.

    2009-01-01

    This paper presents the results of a randomized experiment analyzing the use of vouchers for adult training. In 2006, 2,400 people were issued with a training voucher which they were entitled to use in payment for a training course of their choice. User behavior was compared with a control group of 14,000 people. People in the treatment and in the control group were not aware at any time that they were part of an experiment. The experiment shows that the voucher had a significant causal impac...

  1. Weighted Scaling in Non-growth Random Networks

    International Nuclear Information System (INIS)

    Chen Guang; Yang Xuhua; Xu Xinli

    2012-01-01

    We propose a weighted model to explain the self-organizing formation of the scale-free phenomenon in non-growth random networks. In this model, we use multiple-edges to represent the connections between vertices and define the weight of a multiple-edge as the total weight of all single-edges within it and the strength of a vertex as the sum of weights of the multiple-edges attached to it. The network evolves according to a vertex-strength preferential selection mechanism. During the evolution process, the network always holds its total number of vertices and its total number of single-edges constant. We show analytically and numerically that a network will form a steady scale-free distribution with our model. The results show that a weighted non-growth random network can evolve into a scale-free state. Interestingly, the network also obtains an exponential edge weight distribution; namely, coexistence of a scale-free distribution and an exponential distribution emerges.
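    A toy simulation in the spirit of this model can make the conserved-quantity dynamics concrete. The update rule below is a simplified stand-in, not the authors' exact formulation: the network keeps its vertex and single-edge totals fixed while edge-ends are re-attached with probability proportional to vertex strength.

```python
import numpy as np

rng = np.random.default_rng(1)

def evolve(n=200, m=1000, steps=5000):
    """Rewire a fixed-size multigraph by strength-preferential re-attachment.

    The network keeps n vertices and m unit-weight single-edges throughout;
    each step detaches one randomly chosen edge-end and re-attaches it to a
    vertex drawn with probability proportional to its strength (its total
    attached edge weight).
    """
    ends = rng.integers(0, n, size=2 * m)            # two ends per single-edge
    strength = np.bincount(ends, minlength=n).astype(float)
    for _ in range(steps):
        e = rng.integers(0, 2 * m)                   # random edge-end to detach
        strength[ends[e]] -= 1.0
        p = strength / strength.sum()                # strength-preferential choice
        ends[e] = rng.choice(n, p=p)
        strength[ends[e]] += 1.0
    return strength

strength = evolve()   # vertex count and single-edge count are conserved
```

    Note how both conserved totals follow directly from the update: every step removes exactly one edge-end and adds exactly one back, so no growth is needed for preferential selection to reshape the strength distribution.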

  2. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

    The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales already observed in other works depends on the crosswise location. Large-scale positive fluctuations correlate with a stronger activity of the small scales on the low-speed side of the mixing layer, and with a reduced activity on the high-speed side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, rather than the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has additionally been investigated.

  3. Novel probabilistic and distributed algorithms for guidance, control, and nonlinear estimation of large-scale multi-agent systems

    Science.gov (United States)

    Bandyopadhyay, Saptarshi

    Multi-agent systems are widely used for constructing a desired formation shape, exploring an area, surveillance, coverage, and other cooperative tasks. This dissertation introduces novel algorithms in the three main areas of shape formation, distributed estimation, and attitude control of large-scale multi-agent systems. In the first part of this dissertation, we address the problem of shape formation for thousands to millions of agents. Here, we present two novel algorithms for guiding a large-scale swarm of robotic systems into a desired formation shape in a distributed and scalable manner. These probabilistic swarm guidance algorithms adopt an Eulerian framework, where the physical space is partitioned into bins and the swarm's density distribution over each bin is controlled using tunable Markov chains. In the first algorithm - Probabilistic Swarm Guidance using Inhomogeneous Markov Chains (PSG-IMC) - each agent determines its bin transition probabilities using a time-inhomogeneous Markov chain that is constructed in real-time using feedback from the current swarm distribution. This PSG-IMC algorithm minimizes the expected cost of the transitions required to achieve and maintain the desired formation shape, even when agents are added to or removed from the swarm. The algorithm scales well with a large number of agents and complex formation shapes, and can also be adapted for area exploration applications. In the second algorithm - Probabilistic Swarm Guidance using Optimal Transport (PSG-OT) - each agent determines its bin transition probabilities by solving an optimal transport problem, which is recast as a linear program. In the presence of perfect feedback of the current swarm distribution, this algorithm minimizes the given cost function, guarantees faster convergence, reduces the number of transitions for achieving the desired formation, and is robust to disturbances or damages to the formation. 
We demonstrate the effectiveness of these two proposed swarm
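    The core idea of steering a swarm's bin occupancy with a tunable Markov chain, as in the PSG-IMC algorithm above, can be sketched with Metropolis weights. This is a simple homogeneous stand-in for the paper's inhomogeneous, feedback-driven chains; the bin count and target density are arbitrary:

```python
import numpy as np

def metropolis_chain(target):
    """Column-stochastic Markov matrix whose stationary distribution is the
    desired bin occupancy (Metropolis weights with uniform proposals).

    Detailed balance holds by construction: target[i] * P[j, i] equals
    target[j] * P[i, j], so repeated application preserves the target.
    """
    n = len(target)
    P = np.zeros((n, n))
    for i in range(n):               # column i: transition probabilities out of bin i
        for j in range(n):
            if j != i:
                P[j, i] = min(1.0, target[j] / target[i]) / n
        P[i, i] = 1.0 - P[:, i].sum()
    return P

target = np.array([0.1, 0.2, 0.3, 0.4])   # desired swarm density over 4 bins
P = metropolis_chain(target)
density = np.array([1.0, 0.0, 0.0, 0.0])  # every agent starts in bin 0
for _ in range(200):
    density = P @ density                  # swarm density evolves toward target
```

    Because the chain is reversible with respect to the target and has positive self-loop probabilities, any initial density converges to the desired formation; the paper's inhomogeneous chains additionally minimize expected transition cost using feedback from the current distribution.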

  4. Multi-Time Scale Coordinated Scheduling Strategy with Distributed Power Flow Controllers for Minimizing Wind Power Spillage

    Directory of Open Access Journals (Sweden)

    Yi Tang

    2017-11-01

    Full Text Available The inherent variability and randomness of large-scale wind power integration have brought great challenges to power flow control and dispatch. The distributed power flow controller (DPFC) has higher flexibility and capacity for power flow control in systems with wind generation. This paper proposes a multi-time scale coordinated scheduling model with DPFC to minimize wind power spillage. The configuration of DPFCs is initially determined by a stochastic method. Afterward, two sequential procedures covering day-ahead and real-time scales are applied for determining the maximum schedulable wind sources, the optimal outputs of generating units and the operation settings of the DPFCs. The generating plan is obtained initially in the day-ahead scheduling stage and modified in the real-time scheduling model, while considering the uncertainty of wind power and the fast operation of the DPFC. Numerical simulation results on the IEEE-RTS79 system illustrate that wind power is scheduled to the maximum with the optimal deployment and operation of DPFCs, which confirms the applicability and effectiveness of the proposed method.

  5. The efficacy of imagery rescripting (IR) for social phobia: a randomized controlled trial.

    Science.gov (United States)

    Lee, Seung Won; Kwon, Jung-Hye

    2013-12-01

    There is a need for brief effective treatment of social phobia and Imagery Rescripting (IR) is a potential candidate. The purpose of this study was to examine the efficacy of IR preceded by cognitive restructuring as a stand-alone brief treatment using a randomized controlled design. Twenty-three individuals with social phobia were randomly assigned to an IR group or to a control group. Participants in the IR group were provided with one session of imagery interviewing and two sessions of cognitive restructuring and Imagery Rescripting. Those in the control group had one session of clinical interviewing and two sessions of supportive therapy. Outcome measures including the Korean version of the social avoidance and distress scale (K-SADS) were administered before and after treatment, and at three-month follow-up. The short version of the Questionnaire upon Mental Imagery and the Traumatic Experience Scale were also administered before treatment. Participants in the IR group improved significantly on K-SADS and other outcome measures, compared to the control group. The beneficial effects of IR were maintained at three-month follow-up. It was also found that mental imagery ability and the severity of the traumatic experience did not moderate the outcome of IR. Further studies are needed to replicate the findings of our study using a large sample. The efficacy of IR as a stand-alone brief treatment was demonstrated for social phobia. The findings indicate that IR could be utilized as a cost-effective intervention for social phobia. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is an observed phenomenon whereby galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. Conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, further revealed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  7. Treating major depression with yoga: A prospective, randomized, controlled pilot trial.

    Directory of Open Access Journals (Sweden)

    Sudha Prathikanti

    Full Text Available Conventional pharmacotherapies and psychotherapies for major depression are associated with limited adherence to care and relatively low remission rates. Yoga may offer an alternative treatment option, but rigorous studies are few. This randomized controlled trial with blinded outcome assessors examined an 8-week hatha yoga intervention as mono-therapy for mild-to-moderate major depression. Investigators recruited 38 adults in San Francisco meeting criteria for major depression of mild-to-moderate severity, per structured psychiatric interview and scores of 14-28 on the Beck Depression Inventory-II (BDI). At screening, individuals engaged in psychotherapy, antidepressant pharmacotherapy, herbal or nutraceutical mood therapies, or mind-body practices were excluded. Participants were 68% female, with mean age 43.4 years (SD = 14.8, range = 22-72) and mean BDI score 22.4 (SD = 4.5). Twenty participants were randomized to 90-minute hatha yoga practice groups twice weekly for 8 weeks. Eighteen participants were randomized to 90-minute attention control education groups twice weekly for 8 weeks. Certified yoga instructors delivered both interventions at a university clinic. The primary outcome was depression severity, measured by BDI scores every 2 weeks from baseline to 8 weeks. Secondary outcomes were self-efficacy and self-esteem, measured by scores on the General Self-Efficacy Scale (GSES) and Rosenberg Self-Esteem Scale (RSES) at baseline and at 8 weeks. In intent-to-treat analysis, yoga participants exhibited significantly greater 8-week decline in BDI scores than controls (p-value = 0.034). In sub-analyses of participants completing final 8-week measures, yoga participants were more likely to achieve remission, defined per final BDI score ≤ 9 (p-value = 0.018). The effect size of yoga in reducing BDI scores was large, per Cohen's d = -0.96 [95% CI, -1.81 to -0.12]. Intervention groups did not differ significantly in 8-week change scores for either the GSES or
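    The effect size reported above, Cohen's d, is the difference in group means divided by the pooled standard deviation. A minimal sketch of the computation follows; the change scores below are invented for illustration and are not data from the trial:

```python
import math

def cohens_d(group1, group2):
    """Cohen's d with a pooled standard deviation across two groups."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)   # sample variances
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical BDI change scores (negative = improvement), not trial data.
yoga_change = [-10, -12, -8, -11]
control_change = [-3, -2, -4, -5]
d = cohens_d(yoga_change, control_change)   # large negative effect
```

    By the usual convention, |d| around 0.2 is small, 0.5 medium, and 0.8 or above large, which is why the trial's d = -0.96 is described as a large effect.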

  8. Architecture for large-scale automatic web accessibility evaluation based on the UWEM methodology

    DEFF Research Database (Denmark)

    Ulltveit-Moe, Nils; Olsen, Morten Goodwin; Pillai, Anand B.

    2008-01-01

    The European Internet Accessibility project (EIAO) has developed an Observatory for performing large scale automatic web accessibility evaluations of public sector web sites in Europe. The architecture includes a distributed web crawler that crawls web sites for links until either a given budget...... of web pages have been identified or the web site has been crawled exhaustively. Subsequently, a uniform random subset of the crawled web pages is sampled and sent for accessibility evaluation and the evaluation results are stored in a Resource Description Format (RDF) database that is later loaded...... challenges that the project faced and the solutions developed towards building a system capable of regular large-scale accessibility evaluations with sufficient capacity and stability. It also outlines some possible future architectural improvements....
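    Drawing a uniform random subset of crawled pages, as the Observatory does before evaluation, can be done in a single pass with reservoir sampling. This is a standard technique shown for illustration; the URLs below are placeholders, and EIAO's actual sampler may differ:

```python
import random

def reservoir_sample(stream, k, seed=42):
    """Uniform random sample of k items from a stream of unknown length
    (Vitter's Algorithm R): each stream item ends up in the sample with
    equal probability, without storing the full stream."""
    rng = random.Random(seed)
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)          # fill the reservoir first
        else:
            j = rng.randint(0, i)        # item i survives with probability k/(i+1)
            if j < k:
                sample[j] = item
    return sample

crawled = (f"https://example.org/page{i}" for i in range(10_000))
subset = reservoir_sample(crawled, k=100)
```

    The one-pass property matters at crawl scale: the sampler never needs the crawl frontier in memory, only the current reservoir of k pages.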

  9. A Randomized Controlled Trial of an Eczema Care Plan.

    Science.gov (United States)

    Rea, Corinna J; Tran, Katherine D; Jorina, Maria; Wenren, Larissa M; Hawryluk, Elena B; Toomey, Sara L

    2018-03-02

    To test whether an eczema care plan (ECP) would improve provider documentation and management, decrease eczema severity, and increase patient quality of life (QOL) in the pediatric primary care setting. We conducted a randomized controlled trial from June 2015 to September 2016 at a large hospital-based pediatric primary care clinic. Participants included children from 1 month to 16 years of age with a diagnosis of eczema. The intervention group received the ECP and the control group received usual care. Both groups completed a validated eczema severity scale (Patient-Oriented Eczema Measure [POEM]) and a QOL scale (Infant's Dermatitis Quality of Life Index [IDQOL]) or Children's Dermatology Life Quality Index [CDLQI]) before the visit and again ~1 month later. A total of 211 caregivers completed both the pre- and postintervention surveys (100 control group and 111 intervention group [94% completion]). Intervention group providers were more likely to recommend a comprehensive "step-up" plan (88%) vs 28%; P plan to families (80%) vs 2%; P improved between the pre- and postintervention periods. However, there was not a significant difference between the groups on either measure: POEM difference -0.8, 95% confidence interval (CI) -3.2 to 1.7; IDQOL difference -0.1, 95% CI -1.8 to 1.6; CDLQI difference 0.8, 95% CI -0.9 to 2.6. Intervention group providers documented more comprehensive eczema care than control group providers. Although patients improved on all measures in the postintervention period, the ECP did not augment that improvement. Copyright © 2018 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  10. Integration and segregation of large-scale brain networks during short-term task automatization.

    Science.gov (United States)

    Mohr, Holger; Wolfensteller, Uta; Betzel, Richard F; Mišić, Bratislav; Sporns, Olaf; Richiardi, Jonas; Ruge, Hannes

    2016-11-03

    The human brain is organized into large-scale functional networks that can flexibly reconfigure their connectivity patterns, supporting both rapid adaptive control and long-term learning processes. However, it has remained unclear how short-term network dynamics support the rapid transformation of instructions into fluent behaviour. Comparing fMRI data of a learning sample (N=70) with a control sample (N=67), we find that increasingly efficient task processing during short-term practice is associated with a reorganization of large-scale network interactions. Practice-related efficiency gains are facilitated by enhanced coupling between the cingulo-opercular network and the dorsal attention network. Simultaneously, short-term task automatization is accompanied by decreasing activation of the fronto-parietal network, indicating a release of high-level cognitive control, and a segregation of the default mode network from task-related networks. These findings suggest that short-term task automatization is enabled by the brain's ability to rapidly reconfigure its large-scale network organization involving complementary integration and segregation processes.

  11. Initial condition effects on large scale structure in numerical simulations of plane mixing layers

    Science.gov (United States)

    McMullan, W. A.; Garrett, S. J.

    2016-01-01

    In this paper, Large Eddy Simulations are performed on the spatially developing plane turbulent mixing layer. The simulated mixing layers originate from initially laminar conditions. The focus of this research is on the effect of the nature of the imposed fluctuations on the large-scale spanwise and streamwise structures in the flow. Two simulations are performed; one with low-level three-dimensional inflow fluctuations obtained from pseudo-random numbers, the other with physically correlated fluctuations of the same magnitude obtained from an inflow generation technique. Where white-noise fluctuations provide the inflow disturbances, no spatially stationary streamwise vortex structure is observed, and the large-scale spanwise turbulent vortical structures grow continuously and linearly. These structures are observed to have a three-dimensional internal geometry with branches and dislocations. Where physically correlated fluctuations provide the inflow disturbances, a "streaky" streamwise structure that is spatially stationary is observed, with the large-scale turbulent vortical structures growing with the square root of time. These large-scale structures are quasi-two-dimensional, on top of which the secondary structure rides. The simulation results are discussed in the context of the varying interpretations of mixing layer growth that have been postulated. Recommendations are made concerning the data required from experiments in order to produce accurate numerical simulation recreations of real flows.
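
    The difference between the two inflow conditions can be illustrated with a toy digital-filter generator: convolving white noise with a Gaussian kernel imposes a finite correlation length while approximately preserving the fluctuation magnitude. This is a minimal one-dimensional sketch of the general idea, not the inflow generation technique used in the paper; the signal length and filter width below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4096
white = rng.standard_normal(n)  # uncorrelated "white-noise" disturbances

# Impose a finite correlation length by convolving with a Gaussian kernel,
# the core idea behind digital-filter inflow generation (width is arbitrary).
half_width = 20
k = np.arange(-2 * half_width, 2 * half_width + 1)
kernel = np.exp(-np.pi * (k / half_width) ** 2)
kernel /= np.sqrt(np.sum(kernel ** 2))  # keep the output variance near unity
correlated = np.convolve(white, kernel, mode="same")

def autocorr(u, lag):
    """Normalized autocorrelation of signal u at a given positive lag."""
    u = u - u.mean()
    return np.dot(u[:-lag], u[lag:]) / np.dot(u, u)
```

    The filtered signal stays correlated over roughly `half_width` points, while the raw white noise decorrelates immediately; that is the distinction the two simulations probe.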

  12. Large Scale Solar Heating

    DEFF Research Database (Denmark)

    Heller, Alfred

    2001-01-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the simulation tool for design studies and on a local energy planning case. The evaluation was mainly carried out...... model is designed and validated on the Marstal case. Applying the Danish Reference Year, a design tool is presented. The simulation tool is used for proposals for application of alternative designs, including high-performance solar collector types (trough solar collectors, vacuum pipe collectors......). Simulation programs are proposed as control supporting tools for daily operation and performance prediction of central solar heating plants. Finally, the CSHP technology is put into perspective with respect to alternatives, and a short discussion of the barriers and breakthrough of the technology is given....

  13. Multilevel method for modeling large-scale networks.

    Energy Technology Data Exchange (ETDEWEB)

    Safro, I. M. (Mathematics and Computer Science)

    2012-02-24

    Understanding the behavior of real complex networks is of great theoretical and practical significance. It includes developing accurate artificial models whose topological properties are similar to the real networks, generating the artificial networks at different scales under special conditions, investigating network dynamics, reconstructing missing data, predicting network response, detecting anomalies and other tasks. Network generation, reconstruction, and prediction of its future topology are central issues of this field. In this project, we address the questions related to the understanding of network modeling, investigating its structure and properties, and generating artificial networks. Most of the modern network generation methods are based either on various random graph models (reinforced by a set of properties such as power law distribution of node degrees, graph diameter, and number of triangles) or on the principle of replicating an existing model with elements of randomization, such as the R-MAT generator and Kronecker product modeling. Hierarchical models operate at different levels of network hierarchy but with the same finest elements of the network. However, in many cases the methods that include randomization and replication elements on the finest relationships between network nodes, and modeling that addresses the problem of preserving a set of simplified properties, do not fit the real networks accurately enough. Among the unsatisfactory features are numerically inadequate results, instability on real (artificial) data of algorithms that have been tested on artificial (real) data, and incorrect behavior at different scales. One reason is that randomization and replication of existing structures can create conflicts between fine and coarse scales of the real network geometry. Moreover, the randomization and satisfying of some attribute at the same time can abolish those topological attributes that have been undefined or hidden from
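
    Kronecker product modeling, one of the replication-based generators named above, can be sketched in a few lines: Kronecker powers of a small initiator matrix give a self-similar edge-probability matrix from which a (directed) graph is sampled. The initiator values below are arbitrary illustrative choices, not fitted to any real network.

```python
import numpy as np

rng = np.random.default_rng(7)

# 2x2 initiator of edge probabilities (values are illustrative, not fitted).
initiator = np.array([[0.9, 0.5],
                      [0.5, 0.1]])

def kronecker_graph(initiator, k):
    """Bernoulli-sample an adjacency matrix from k Kronecker powers of the initiator."""
    P = initiator.copy()
    for _ in range(k - 1):
        P = np.kron(P, initiator)  # replicate the structure at the next scale
    return (rng.random(P.shape) < P).astype(int)

A = kronecker_graph(initiator, 5)  # 2**5 = 32 nodes
degrees = A.sum(axis=1)            # out-degrees of the sampled directed graph
```

    The self-similar probability matrix makes some rows far denser than others, which is how this family of generators reproduces heavy-tailed degree distributions.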

  14. Large-scale block adjustment without use of ground control points based on the compensation of geometric calibration for ZY-3 images

    Science.gov (United States)

    Yang, Bo; Wang, Mi; Xu, Wen; Li, Deren; Gong, Jianya; Pi, Yingdong

    2017-12-01

    The potential of large-scale block adjustment (BA) without ground control points (GCPs) has long been a concern among photogrammetric researchers, as it would provide effective guidance for global mapping. However, significant problems with the accuracy and efficiency of this method remain to be solved. In this study, we analyzed the effects of geometric errors on BA, and then developed a step-wise BA method to conduct integrated processing of large-scale ZY-3 satellite images without GCPs. We first pre-processed the BA data, by adopting a geometric calibration (GC) method based on the viewing-angle model to compensate for systematic errors, such that the BA input images were of good initial geometric quality. The second step was integrated BA without GCPs, in which a series of technical methods were used to solve bottleneck problems and ensure accuracy and efficiency. The BA model, based on virtual control points (VCPs), was constructed to address the rank deficiency problem caused by lack of absolute constraints. We then developed a parallel matching strategy to improve the efficiency of tie-point (TP) matching, and adopted a three-array data structure based on sparsity to relieve the storage and calculation burden of the high-order modified equation. Finally, we used the conjugate gradient method to improve the speed of solving the high-order equations. To evaluate the feasibility of the presented large-scale BA method, we conducted three experiments on real data collected by the ZY-3 satellite. The experimental results indicate that the presented method can effectively improve the geometric accuracies of ZY-3 satellite images. This study demonstrates the feasibility of large-scale mapping without GCPs.
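
    The conjugate gradient method is attractive in this setting because the adjustment equations are sparse and symmetric positive definite, so the solver only ever needs matrix-vector products. The following is a minimal sketch of such a solver, not the authors' implementation; the small dense system at the end is a stand-in for the far larger equations of a real block adjustment.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A via conjugate gradients.

    A only needs to support A @ x, so a sparse matrix works unchanged.
    """
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # next direction, A-conjugate to the last
        rs_old = rs_new
    return x

# Toy SPD system standing in for the (much larger, sparse) adjustment equations.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M.T @ M + 50 * np.eye(50)  # symmetric positive definite by construction
b = rng.standard_normal(50)
x = conjugate_gradient(A, b)
```

    Because each iteration touches the matrix only through `A @ p`, the same loop runs on a sparse three-array (CSR-style) representation without modification, which is the point of pairing it with the sparsity-based storage described above.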

  15. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  16. Multidimensional quantum entanglement with large-scale integrated optics

    DEFF Research Database (Denmark)

    Wang, Jianwei; Paesani, Stefano; Ding, Yunhong

    2018-01-01

    The ability to control multidimensional quantum systems is key for the investigation of fundamental science and for the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimension up to 15 × 15 on a large-scale silicon-photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality......

  17. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  18. Testing a workplace physical activity intervention: a cluster randomized controlled trial.

    Science.gov (United States)

    McEachan, Rosemary R C; Lawton, Rebecca J; Jackson, Cath; Conner, Mark; Meads, David M; West, Robert M

    2011-04-11

    Increased physical activity levels benefit both an individual's health and productivity at work. The purpose of the current study was to explore the impact and cost-effectiveness of a workplace physical activity intervention designed to increase physical activity levels. A total of 1260 participants from 44 UK worksites (based within 5 organizations) were recruited to a cluster randomized controlled trial with worksites randomly allocated to an intervention or control condition. Measurement of physical activity and other variables occurred at baseline, and at 0 months, 3 months and 9 months post-intervention. Health outcomes were measured during a 30 minute health check conducted in worksites at baseline and 9 months post intervention. The intervention consisted of a 3 month tool-kit of activities targeting components of the Theory of Planned Behavior, delivered in-house by nominated facilitators. Self-reported physical activity (measured using the IPAQ short-form) and health outcomes were assessed. Multilevel modelling found no significant effect of the intervention on MET minutes of activity (from the IPAQ) at any of the follow-up time points controlling for baseline activity. However, the intervention did significantly reduce systolic blood pressure (B=-1.79 mm/Hg) and resting heart rate (B=-2.08 beats) and significantly increased body mass index (B=.18 units) compared to control. The intervention was found not to be cost-effective; however, the substantial variability around this estimate suggested that further research is warranted. The current study found mixed support for this worksite physical activity intervention. The paper discusses some of the tensions involved in conducting rigorous evaluations of large-scale randomized controlled trials in real-world settings. © 2011 McEachan et al; licensee BioMed Central Ltd.

  19. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

    © 2016 Cambridge University Press. The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale of , via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small scales) and from low-pass filtered (large scales) velocity fields tends to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.

  20. Arthroscopic Rotator Cuff Repair With Graft Augmentation of 3-Dimensional Biological Collagen for Moderate to Large Tears: A Randomized Controlled Study.

    Science.gov (United States)

    Cai, You-Zhi; Zhang, Chi; Jin, Ri-Long; Shen, Tong; Gu, Peng-Cheng; Lin, Xiang-Jin; Chen, Jian-De

    2018-05-01

    Due to the highly organized tissue and avascular nature of the rotator cuff, rotator cuff tears have limited ability to heal after the tendon is reinserted directly on the greater tubercle of the humerus. Consequently, retears are among the most common complications after rotator cuff repair. Augmentation of rotator cuff repairs with patches has been an active area of research in recent years to reduce retear rate. Graft augmentation with 3D collagen could prevent retears of the repaired tendon and improve tendon-bone healing in moderate to large rotator cuff tears. Randomized controlled study; Level of evidence, 2. A prospective, randomized controlled study was performed in a consecutive series of 112 patients age 50 to 85 years who underwent rotator cuff repair with the suture-bridge technique (58 patients, control group) or the suture-bridge technique augmented with 3-dimensional (3D) collagen (54 patients, study group). All patients were followed for 28.2 months (range, 24-36 months). Visual analog scale score for pain, University of California Los Angeles (UCLA) shoulder score, and Constant score were determined. Magnetic resonance imaging was performed pre- and postoperatively (at a minimum of 24 months) to evaluate the integrity of the rotator cuff and the retear rate of the repaired tendon. Three patients in each group had biopsies at nearly 24 months after surgery with histological assessment and transmission electron microscopy. A total of 104 patients completed the final follow-up. At the 12-month follow-up, the UCLA shoulder score was 28.1 ± 1.9 in the study group, which was significantly better than that in the control group (26.9 ± 2.1, P = .002). The Constant score was also significantly better in the study group (87.1 ± 3.2) than in the control group (84.9 ± 4.2, P = .003). However, at the final follow-up, no significant differences were found in the UCLA shoulder scores (29.4 ± 1.9 in the control group and 30.0 ± 1.6 in the study group, P

  1. Decentralized control of large-scale systems: Fixed modes, sensitivity and parametric robustness. Ph.D. Thesis - Universite Paul Sabatier, 1985

    Science.gov (United States)

    Tarras, A.

    1987-01-01

    The problem of stabilization/pole placement under structural constraints of large scale linear systems is discussed. The existence of a solution to this problem is expressed in terms of fixed modes. The aim is to provide a bibliographic survey of the available results concerning the fixed modes (characterization, elimination, control structure selection to avoid them, control design in their absence) and to present the author's contribution to this problem which can be summarized by the use of the mode sensitivity concept to detect or to avoid them, the use of vibrational control to stabilize them, and the addition of parametric robustness considerations to design an optimal decentralized robust control.

  2. Studies of Sub-Synchronous Oscillations in Large-Scale Wind Farm Integrated System

    Science.gov (United States)

    Yue, Liu; Hang, Mend

    2018-01-01

    With the rapid development and construction of large-scale wind farms and their grid-connected operation, series-compensated AC transmission of wind power is gradually becoming the main way to improve wind power availability and grid stability, but the integration of wind farms will change the SSO (sub-synchronous oscillation) damping characteristics of the synchronous generator system. Regarding the SSO problems caused by integration of large-scale wind farms, this paper focuses on doubly fed induction generator (DFIG) based wind farms and summarizes the SSO mechanisms in large-scale wind power integrated systems with series compensation, which can be classified into three types: sub-synchronous control interaction (SSCI), sub-synchronous torsional interaction (SSTI), and sub-synchronous resonance (SSR). Then, SSO modelling and analysis methods are categorized and compared by their applicable areas. Furthermore, this paper summarizes the suppression measures of actual SSO projects based on different control objectives. Finally, the research prospects in this field are explored.

  3. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    Subroutine package ''ATLAS'' has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines, i.e., basic arithmetic routines, routines for solving linear simultaneous equations and for solving general eigenvalue problems and utility routines. The subroutines are useful in large scale plasma-fluid simulations. (auth.)

  4. Adaptive Fuzzy Output-Constrained Fault-Tolerant Control of Nonlinear Stochastic Large-Scale Systems With Actuator Faults.

    Science.gov (United States)

    Li, Yongming; Ma, Zhiyao; Tong, Shaocheng

    2017-09-01

    The problem of adaptive fuzzy output-constrained tracking fault-tolerant control (FTC) is investigated for the large-scale stochastic nonlinear systems of pure-feedback form. The nonlinear systems considered in this paper possess the unstructured uncertainties, unknown interconnected terms and unknown nonaffine nonlinear faults. The fuzzy logic systems are employed to identify the unknown lumped nonlinear functions so that the problems of structured uncertainties can be solved. An adaptive fuzzy state observer is designed to solve the nonmeasurable state problem. By combining the barrier Lyapunov function theory, adaptive decentralized and stochastic control principles, a novel fuzzy adaptive output-constrained FTC approach is constructed. All the signals in the closed-loop system are proved to be bounded in probability and the system outputs are constrained in a given compact set. Finally, the applicability of the proposed controller is well carried out by a simulation example.

  5. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. Objectives were to improve the performance and reduce costs of a large-scale solar heating system. As a result of the project the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the designing stage. (orig.)

  6. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  7. Dietary Soy Supplement on Fibromyalgia Symptoms: A Randomized, Double-Blind, Placebo-Controlled, Early Phase Trial

    Science.gov (United States)

    Wahner-Roedler, Dietlind L.; Thompson, Jeffrey M.; Luedtke, Connie A.; King, Susan M.; Cha, Stephen S.; Elkin, Peter L.; Bruce, Barbara K.; Townsend, Cynthia O.; Bergeson, Jody R.; Eickhoff, Andrea L.; Loehrer, Laura L.; Sood, Amit; Bauer, Brent A.

    2011-01-01

    Most patients with fibromyalgia use complementary and alternative medicine (CAM). Properly designed controlled trials are necessary to assess the effectiveness of these practices. This study was a randomized, double-blind, placebo-controlled, early phase trial. Fifty patients seen at a fibromyalgia outpatient treatment program were randomly assigned to a daily soy or placebo (casein) shake. Outcome measures were scores of the Fibromyalgia Impact Questionnaire (FIQ) and the Center for Epidemiologic Studies Depression Scale (CES-D) at baseline and after 6 weeks of intervention. Analysis was with standard statistics based on the null hypothesis, and separation test for early phase CAM comparative trials. Twenty-eight patients completed the study. Use of standard statistics with intent-to-treat analysis showed that total FIQ scores decreased by 14% in the soy group (P = .02) and by 18% in the placebo group (P fibromyalgia treatment program, provide a decrease in fibromyalgia symptoms. Separation between the effects of soy and casein (control) shakes did not favor the intervention. Therefore, large-sample studies using soy for patients with fibromyalgia are probably not indicated. PMID:18990724

  8. Legal control of technical large-scale projects

    International Nuclear Information System (INIS)

    Kuhnt, D.

    1981-01-01

    The principle derived from experience that large projects require approval by the courts may no longer be valid. On the contrary, the courts are only entitled to real legal control according to the principle of the division of powers. If imprecisely defined legal terms cannot be dispensed with, the administration has to set the frame for review by the courts through technical standards given in statutory ordinances, administrative provisions and administrative instructions. The average term of administrative proceedings has to be shortened considerably. The plaintiff as well as the beneficiaries of the act of licensing have a right to a prompt decision. The immediate execution of a decision can, on principle, also not be waived in future. More than up to now, the careful consideration of the interests, and not an anticipated judgement on the main issue, has to be the subject of legal examination according to section 80, subsection 5 of the German code of administrative procedure (Verwaltungsgerichtsordnung). (orig./HP) [de

  9. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility,...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  10. Scalable multi-objective control for large scale water resources systems under uncertainty

    Science.gov (United States)

    Giuliani, Matteo; Quinn, Julianne; Herman, Jonathan; Castelletti, Andrea; Reed, Patrick

    2016-04-01

    The use of mathematical models to support the optimal management of environmental systems has been rapidly expanding over the last years due to advances in scientific knowledge of the natural processes, efficiency of the optimization techniques, and availability of computational resources. However, ongoing changes in climate and society introduce additional challenges for controlling these systems, ultimately motivating the emergence of complex models to explore key causal relationships and dependencies on uncontrolled sources of variability. In this work, we contribute a novel implementation of the evolutionary multi-objective direct policy search (EMODPS) method for controlling environmental systems under uncertainty. The proposed approach combines direct policy search (DPS) with hierarchical parallelization of multi-objective evolutionary algorithms (MOEAs) and offers a threefold advantage: the DPS simulation-based optimization can be combined with any simulation model and does not add any constraint on modeled information, allowing the use of exogenous information in conditioning the decisions. Moreover, the combination of DPS and MOEAs prompts the generation of a Pareto-approximate set of solutions for up to 10 objectives, thus overcoming the decision biases produced by cognitive myopia, where narrow or restrictive definitions of optimality strongly limit the discovery of decision-relevant alternatives. Finally, the use of large-scale MOEA parallelization improves the ability of the designed solutions to handle the uncertainty due to severe natural variability. The proposed approach is demonstrated on a challenging water resources management problem represented by the optimal control of a network of four multipurpose water reservoirs in the Red River basin (Vietnam). As part of the medium-long term energy and food security national strategy, four large reservoirs have been constructed on the Red River tributaries, which are mainly operated for hydropower

  11. Topological relics of symmetry breaking: winding numbers and scaling tilts from random vortex–antivortex pairs

    International Nuclear Information System (INIS)

    Zurek, W H

    2013-01-01

    I show that random distributions of vortex–antivortex pairs (rather than of individual vortices) lead to scaling of typical winding numbers W trapped inside a loop of circumference C with the square root of that circumference, W ∼ √C, when the expected winding numbers are large, |W| ≫ 1. Such scaling is consistent with the Kibble–Zurek mechanism (KZM), with ⟨W²⟩ inversely proportional to ξ̂, the typical size of the domain that can break symmetry in unison. (The dependence of ξ̂ on quench rate is predicted by KZM from critical exponents of the phase transition.) Thus, according to KZM, the dispersion √⟨W²⟩ scales as √(C/ξ̂) for large W. By contrast, a distribution of individual vortices with randomly assigned topological charges would result in the dispersion scaling with the square root of the area inside C (i.e., √⟨W²⟩ ∼ C). Scaling of the dispersion of W as well as of the probability of detection of non-zero W with C and ξ̂ can also be studied for loops so small that non-zero windings are rare. In this case I show that the dispersion varies not as 1/√ξ̂ but as 1/ξ̂, which results in a doubling of the scaling of the dispersion with the quench rate when compared to the large-|W| regime. Moreover, the probability of trapping of non-zero W becomes approximately equal to ⟨W²⟩, and scales as 1/ξ̂². This quadruples (as compared with √⟨W²⟩ ≃ √(C/ξ̂), valid for large W) the exponent in the power-law dependence of the frequency of trapping of |W| = 1 on ξ̂ when the probability of |W| > 1 is negligible. This change of the power-law exponent by a factor of four, from 1/√ξ̂ for the dispersion of large W to 1/ξ̂² for the frequency of non-zero W when |W| > 1 is negligibly rare, is of paramount importance for experimental tests of KZM. (paper)
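
    The contrast between paired and independently charged vortices can be checked with a small Monte Carlo sketch (an illustration of the scaling argument, not the paper's calculation; the loop radii, pair separation and sample sizes are arbitrary choices): only pairs straddling the loop contribute to W, so Var(W) grows with the circumference C, while independently assigned charges give Var(W) proportional to the number of enclosed vortices, i.e. to the area.

```python
import numpy as np

rng = np.random.default_rng(42)

def winding_variance(radius, n_pairs=20000, sep=0.01, trials=600, paired=True):
    """Monte Carlo variance of the net winding number W inside a circular loop.

    Vortices live in the unit square; the loop is centred at (0.5, 0.5).
    With paired=True each antivortex sits a distance `sep` from its vortex;
    with paired=False the two charges get independent random positions.
    """
    centre = np.array([0.5, 0.5])
    w_samples = []
    for _ in range(trials):
        pos_plus = rng.random((n_pairs, 2))
        if paired:
            angle = rng.random(n_pairs) * 2.0 * np.pi
            pos_minus = pos_plus + sep * np.column_stack([np.cos(angle), np.sin(angle)])
        else:
            pos_minus = rng.random((n_pairs, 2))
        n_plus = (np.linalg.norm(pos_plus - centre, axis=1) < radius).sum()
        n_minus = (np.linalg.norm(pos_minus - centre, axis=1) < radius).sum()
        w_samples.append(n_plus - n_minus)  # net winding number for this trial
    return np.var(w_samples)

# Doubling the radius doubles the circumference but quadruples the area:
v_pair_small, v_pair_large = winding_variance(0.1), winding_variance(0.2)
v_ind_small, v_ind_large = winding_variance(0.1, paired=False), winding_variance(0.2, paired=False)
```

    For paired charges the variance ratio tracks the circumference ratio (about 2), whereas independent charges track the enclosed count, close to the area ratio of 4.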

  12. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  13. Large-scale analysis of phosphorylation site occupancy in eukaryotic proteins

    DEFF Research Database (Denmark)

    Rao, R Shyama Prasad; Møller, Ian Max

    2012-01-01

    Many recent high throughput technologies have enabled large-scale discoveries of new phosphorylation sites and phosphoproteins. Although they have provided a number of insights into protein phosphorylation and the related processes, an inclusive analysis of the nature of phosphorylated sites in proteins is currently lacking. We have therefore analyzed the occurrence and occupancy of phosphorylated sites (~ 100,281) in a large set of eukaryotic proteins (~ 22,995). Phosphorylation probability was found to be much higher in both the termini of protein sequences and this is much pronounced...... maximum randomness. An analysis of phosphorylation motifs indicated that just 40 motifs and a much lower number of associated kinases might account for nearly 50% of the known phosphorylations in eukaryotic proteins. Our results provide a broad picture of the phosphorylation sites in eukaryotic proteins.

  14. Volumetric water control in a large-scale open canal irrigation system with many smallholders: The case of Chancay-Lambayeque in Peru

    NARCIS (Netherlands)

    Vos, J.M.C.; Vincent, L.F.

    2011-01-01

    Volumetric water control (VWC) is widely seen as a means to increase productivity through flexible scheduling and user incentives to apply just enough water. However, the technical and social requirements for VWC are poorly understood. Also, many experts assert that VWC in large-scale open canals

  15. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large-scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. Results are presented from testing the material resistance to non-ductile fracture, covering both base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  16. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high workload and a high risk of detection, and thus demand a higher level of organizational skills than for small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production will imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face and the lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed.

  17. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large scale dimensional metrology Enables practitioners to study distributed large scale dimensional metrology independently Includes specific examples of the development of new system prototypes

  18. LARGE-SCALE TOPOLOGICAL PROPERTIES OF MOLECULAR NETWORKS.

    Energy Technology Data Exchange (ETDEWEB)

Maslov, S.; Sneppen, K.

    2003-11-17

Bio-molecular networks lack a top-down design. Instead, selective forces of biological evolution shape them from raw material provided by random events such as gene duplications and single gene mutations. As a result, individual connections in these networks are characterized by a large degree of randomness. One may wonder which connectivity patterns are indeed random, and which arose due to the network's growth, evolution, and/or its fundamental design principles and limitations. Here we introduce a general method allowing one to construct a random null-model version of a given network while preserving the desired set of its low-level topological features, such as, e.g., the number of neighbors of individual nodes, the average level of modularity, preferential connections between particular groups of nodes, etc. Such a null-model network can then be used to detect and quantify the non-random topological patterns present in large networks. In particular, we measured correlations between degrees of interacting nodes in protein interaction and regulatory networks in yeast. It was found that in both these networks, links between highly connected proteins are systematically suppressed. This effect decreases the likelihood of cross-talk between different functional modules of the cell, and increases the overall robustness of a network by localizing effects of deleterious perturbations. It also teaches us about the overall computational architecture of such networks and points at the origin of large differences in the number of neighbors of individual nodes.
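A null model that preserves each node's number of neighbors is typically built by repeated double-edge swaps. Below is a minimal, illustrative Python sketch of that idea; the function and variable names are my own, and this is a simplified stand-in for the paper's more general method (which can additionally preserve modularity and group-wise connection patterns):

```python
import random

def degree_preserving_randomize(edges, n_swaps, seed=0):
    """Return a randomized copy of an undirected edge list in which every
    node keeps its original degree, via repeated double-edge swaps."""
    rng = random.Random(seed)
    edges = [tuple(e) for e in edges]
    edge_set = set(frozenset(e) for e in edges)
    swaps = attempts = 0
    while swaps < n_swaps and attempts < 100 * n_swaps:
        attempts += 1
        i, j = rng.sample(range(len(edges)), 2)
        a, b = edges[i]
        c, d = edges[j]
        # Propose rewiring (a-b, c-d) -> (a-d, c-b).
        if len({a, b, c, d}) < 4:
            continue  # would create a self-loop
        if frozenset((a, d)) in edge_set or frozenset((c, b)) in edge_set:
            continue  # would create a duplicate edge
        edge_set.discard(frozenset((a, b)))
        edge_set.discard(frozenset((c, d)))
        edge_set.add(frozenset((a, d)))
        edge_set.add(frozenset((c, b)))
        edges[i], edges[j] = (a, d), (c, b)
        swaps += 1
    return edges
```

Comparing a topological statistic (e.g. degree-degree correlations) on the real network against an ensemble of such randomized copies then quantifies which patterns are non-random.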

  19. The anomalous scaling exponents of turbulence in general dimension from random geometry

    Energy Technology Data Exchange (ETDEWEB)

    Eling, Christopher [Rudolf Peierls Centre for Theoretical Physics, University of Oxford, 1 Keble Road, Oxford OX1 3NP (United Kingdom); Oz, Yaron [Raymond and Beverly Sackler School of Physics and Astronomy, Tel Aviv University, Tel Aviv 69978 (Israel)

    2015-09-22

    We propose an analytical formula for the anomalous scaling exponents of inertial range structure functions in incompressible fluid turbulence. The formula is a Knizhnik-Polyakov-Zamolodchikov (KPZ)-type relation and is valid in any number of space dimensions. It incorporates intermittency in a novel way by dressing the Kolmogorov linear scaling via a coupling to a lognormal random geometry. The formula has one real parameter γ that depends on the number of space dimensions. The scaling exponents satisfy the convexity inequality, and the supersonic bound constraint. They agree with the experimental and numerical data in two and three space dimensions, and with numerical data in four space dimensions. Intermittency increases with γ, and in the infinite γ limit the scaling exponents approach the value one, as in Burgers turbulence. At large n the nth order exponent scales as √n. We discuss the relation between fluid flows and black hole geometry that inspired our proposal.

  20. Dynamic Output Feedback Control for Nonlinear Networked Control Systems with Random Packet Dropout and Random Delay

    Directory of Open Access Journals (Sweden)

    Shuiqing Yu

    2013-01-01

    Full Text Available This paper investigates the dynamic output feedback control for nonlinear networked control systems with both random packet dropout and random delay. Random packet dropout and random delay are modeled as two independent random variables. An observer-based dynamic output feedback controller is designed based upon the Lyapunov theory. The quantitative relationship of the dropout rate, transition probability matrix, and nonlinear level is derived by solving a set of linear matrix inequalities. Finally, an example is presented to illustrate the effectiveness of the proposed method.

  1. Yoga for High‑Risk Pregnancy: A Randomized Controlled Trial ...

    African Journals Online (AJOL)

The study was a single‑blind randomized controlled clinical trial. The Perceived Stress Scale (PSS) was measured during the 12th, 20th, and 28th weeks of pregnancy. SPSS version 16.0 (Chicago, IL, USA) was used for all data analysis. When the data were found to be normally distributed, repeated-measures ANOVA (RM-ANOVA) was used to assess ...

  2. Massage Therapy for Pain and Function in Patients With Arthritis: A Systematic Review of Randomized Controlled Trials.

    Science.gov (United States)

    Nelson, Nicole L; Churilla, James R

    2017-09-01

Massage therapy is gaining interest as a therapeutic approach to managing osteoarthritis and rheumatoid arthritis symptoms. To date, there have been no systematic reviews investigating the effects of massage therapy on these conditions. A systematic review design was used. The primary aim of this review was to critically appraise and synthesize the current evidence regarding the effects of massage therapy as a stand-alone treatment on pain and functional outcomes among those with osteoarthritis or rheumatoid arthritis. Relevant randomized controlled trials were searched using the electronic databases Google Scholar, MEDLINE, and PEDro. The PEDro scale was used to assess risk of bias, and the quality of evidence was assessed with the GRADE approach. This review found seven randomized controlled trials representing 352 participants who satisfied the inclusion criteria. PEDro risk-of-bias scores ranged from four to seven. Our results found low- to moderate-quality evidence that massage therapy is superior to nonactive therapies in reducing pain and improving certain functional outcomes. It is unclear whether massage therapy is more effective than other forms of treatment. There is a need for large, methodologically rigorous randomized controlled trials investigating the effectiveness of massage therapy as an intervention for individuals with arthritis.

  3. Aerobic exercise in obese diabetic patients with chronic kidney disease: a randomized and controlled pilot study

    Directory of Open Access Journals (Sweden)

    Cooper Cheryl

    2009-12-01

Full Text Available Abstract Background Patients with obesity, diabetes, and chronic kidney disease (CKD) are generally physically inactive, have a high mortality rate, and may benefit from an exercise program. Methods We performed a 24-week randomized controlled feasibility study comparing aerobic exercise plus optimal medical management to medical management alone in patients with type 2 diabetes, obesity (body mass index [BMI] > 30 kg/m2), and stage 2-4 CKD (estimated glomerular filtration rate [eGFR] 15-90 mL/min/1.73 m2) with persistent proteinuria. Subjects randomized to exercise underwent thrice-weekly aerobic training for 6 weeks, followed by 18 weeks of supervised home exercise. The primary outcome variable was change in proteinuria. Results Seven subjects randomized to exercise and 4 control subjects completed the study. Exercise training resulted in an increase in exercise duration during treadmill testing, which was accompanied by slight but insignificant decreases in resting systolic blood pressure and 24-hour proteinuria. Exercise did not alter GFR, hemoglobin, glycated hemoglobin, serum lipids, or C-reactive protein (CRP). Caloric intake and body weight and composition also did not change with exercise training. Conclusion Exercise training in obese diabetic patients with CKD is feasible and may have clinical benefits. A large-scale randomized controlled trial to determine the effects of exercise on renal functions, cardiovascular fitness, inflammation, and oxidative stress in diabetic patients with CKD is planned.

  4. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the large-scale gradients modulation of the small scales has been additionally investigated.

  5. Long term effectiveness on prescribing of two multifaceted educational interventions: results of two large scale randomized cluster trials.

    Directory of Open Access Journals (Sweden)

    Nicola Magrini

Full Text Available INTRODUCTION: Information on benefits and risks of drugs is a key element affecting doctors' prescribing decisions. Outreach visits promoting independent information have proved moderately effective in changing prescribing behaviours. OBJECTIVES: Testing the short- and long-term effectiveness on general practitioners' prescribing of small-group meetings led by pharmacists. METHODS: Two cluster open randomised controlled trials (RCTs) were carried out in a large-scale NHS setting. Ad hoc evidence-based materials were prepared, considering a therapeutic area approach (TEA, with information materials on osteoporosis or prostatic hyperplasia) and a single-drug-oriented approach (SIDRO, with information materials on me-too drugs of 2 different classes: barnidipine or prulifloxacin). In each study, all 115 Primary Care Groups in a Northern Italy area (2.2 million inhabitants, 1737 general practitioners) were randomised to educational small-group meetings, in which available evidence was provided together with drug utilization data and clinical scenarios. Main outcomes were changes in the six-month prescription of targeted drugs. Longer-term results (24 and 48 months) were also evaluated. RESULTS: In the TEA trial, one of the four primary outcomes showed a reduction (prescription of alfuzosin compared to tamsulosin and terazosin in benign prostatic hyperplasia: prescribing ratio -8.5%, p = 0.03). Another primary outcome (prescription of risedronate) showed a reduction at 24 and 48 months (-7.6%, p = 0.02; and -9.8%, p = 0.03), but not at six months (-5.1%, p = 0.36). In the SIDRO trial both primary outcomes showed a statistically significant reduction (prescription of barnidipine -9.8%, p = 0.02; prescription of prulifloxacin -11.1%, p = 0.04), which persisted or increased over time. INTERPRETATION: These two cluster RCTs showed the large-scale feasibility of a complex educational program in a NHS setting, and its potentially

  6. A Randomized Central Limit Theorem

    International Nuclear Information System (INIS)

    Eliazar, Iddo; Klafter, Joseph

    2010-01-01

    The Central Limit Theorem (CLT), one of the most elemental pillars of Probability Theory and Statistical Physics, asserts that: the universal probability law of large aggregates of independent and identically distributed random summands with zero mean and finite variance, scaled by the square root of the aggregate-size (√(n)), is Gaussian. The scaling scheme of the CLT is deterministic and uniform - scaling all aggregate-summands by the common and deterministic factor √(n). This Letter considers scaling schemes which are stochastic and non-uniform, and presents a 'Randomized Central Limit Theorem' (RCLT): we establish a class of random scaling schemes which yields universal probability laws of large aggregates of independent and identically distributed random summands. The RCLT universal probability laws, in turn, are the one-sided and the symmetric Levy laws.
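For contrast with the stochastic, non-uniform scaling schemes introduced above, the classical deterministic √n scaling of the CLT is easy to check numerically. A small numpy-only sketch on synthetic data (all values here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 400, 20000

# Zero-mean, finite-variance summands: uniform on [-0.5, 0.5], variance 1/12.
samples = rng.uniform(-0.5, 0.5, size=(trials, n))

# Deterministic CLT scaling: divide each aggregate of n summands by sqrt(n).
scaled_sums = samples.sum(axis=1) / np.sqrt(n)

# The scaled aggregate should be approximately N(0, 1/12).
print(scaled_sums.mean())  # near 0
print(scaled_sums.std())   # near sqrt(1/12) ~ 0.289
```

The RCLT result concerns replacing the common factor √n by a random, summand-dependent scaling, which shifts the limiting law from Gaussian to the one-sided or symmetric Levy laws.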

  7. Wordless intervention for epilepsy in learning disabilities (WIELD): study protocol for a randomized controlled feasibility trial.

    Science.gov (United States)

    Durand, Marie-Anne; Gates, Bob; Parkes, Georgina; Zia, Asif; Friedli, Karin; Barton, Garry; Ring, Howard; Oostendorp, Linda; Wellsted, David

    2014-11-20

    Epilepsy is the most common neurological problem that affects people with learning disabilities. The high seizure frequency, resistance to treatments, associated skills deficit and co-morbidities make the management of epilepsy particularly challenging for people with learning disabilities. The Books Beyond Words booklet for epilepsy uses images to help people with learning disabilities manage their condition and improve quality of life. Our aim is to conduct a randomized controlled feasibility trial exploring key methodological, design and acceptability issues, in order to subsequently undertake a large-scale randomized controlled trial of the Books Beyond Words booklet for epilepsy. We will use a two-arm, single-centre randomized controlled feasibility design, over a 20-month period, across five epilepsy clinics in Hertfordshire, United Kingdom. We will recruit 40 eligible adults with learning disabilities and a confirmed diagnosis of epilepsy and will randomize them to use either the Books Beyond Words booklet plus usual care (intervention group) or to receive routine information and services (control group). We will collect quantitative data about the number of eligible participants, number of recruited participants, demographic data, discontinuation rates, variability of the primary outcome measure (quality of life: Epilepsy and Learning Disabilities Quality of Life scale), seizure severity, seizure control, intervention's patterns of use, use of other epilepsy-related information, resource use and the EQ-5D-5L health questionnaire. We will also gather qualitative data about the feasibility and acceptability of the study procedures and the Books Beyond Words booklet. Ethical approval for this study was granted on 28 April 2014, by the Wales Research Ethics Committee 5. Recruitment began on 1 July 2014. The outcomes of this feasibility study will be used to inform the design and methodology of a definitive study, adequately powered to determine the impact of

  8. Large-scale agent-based social simulation : A study on epidemic prediction and control

    NARCIS (Netherlands)

    Zhang, M.

    2016-01-01

    Large-scale agent-based social simulation is gradually proving to be a versatile methodological approach for studying human societies, which could make contributions from policy making in social science, to distributed artificial intelligence and agent technology in computer science, and to theory

  9. Tile-Based Semisupervised Classification of Large-Scale VHR Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Haikel Alhichri

    2018-01-01

Full Text Available This paper deals with the problem of the classification of large-scale very high-resolution (VHR) remote sensing (RS) images in a semisupervised scenario, where we have a limited training set (less than ten training samples per class). Typical pixel-based classification methods are unfeasible for large-scale VHR images. Thus, as a practical and efficient solution, we propose to subdivide the large image into a grid of tiles and then classify the tiles instead of classifying pixels. Our proposed method uses the power of a pretrained convolutional neural network (CNN) to first extract descriptive features from each tile. Next, a neural network classifier (composed of 2 fully connected layers) is trained in a semisupervised fashion and used to classify all remaining tiles in the image. This basically presents a coarse classification of the image, which is sufficient for many RS applications. The second contribution deals with the employment of semisupervised learning to improve the classification accuracy. We present a novel semisupervised approach which exploits both the spectral and spatial relationships embedded in the remaining unlabelled tiles. In particular, we embed a spectral graph Laplacian in the hidden layer of the neural network. In addition, we apply regularization of the output labels using a spatial graph Laplacian and the random walker algorithm. Experimental results obtained by testing the method on two large-scale images acquired by the IKONOS2 sensor reveal promising capabilities of this method in terms of classification accuracy even with less than ten training samples per class.
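The tile-grid subdivision step described above can be sketched in a few lines of numpy. The image dimensions and tile size below are hypothetical, and the CNN feature-extraction and graph-Laplacian stages are omitted:

```python
import numpy as np

def image_to_tiles(image, tile_h, tile_w):
    """Split an H x W x C image into a grid of non-overlapping tiles.
    Border pixels that do not fill a whole tile are cropped, mirroring
    the coarse tile-grid classification described above."""
    H, W = image.shape[:2]
    rows, cols = H // tile_h, W // tile_w
    tiles = []
    for r in range(rows):
        for c in range(cols):
            tiles.append(image[r * tile_h:(r + 1) * tile_h,
                               c * tile_w:(c + 1) * tile_w])
    return np.stack(tiles), (rows, cols)

# Hypothetical 3-band image, 1000 x 1200 pixels, split into 100 x 100 tiles.
img = np.zeros((1000, 1200, 3), dtype=np.uint8)
tiles, grid = image_to_tiles(img, 100, 100)
print(tiles.shape)  # (120, 100, 100, 3)
print(grid)         # (10, 12)
```

Each tile would then be passed through the pretrained CNN to obtain a feature vector, and the tile-level classifier operates on those vectors rather than on individual pixels.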

  10. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost, in particular limits its use. As computer models have grown in size, such as number of degrees of freedom, the advent of computer graphics has made possible very realistic representation of results - results that may not accurately represent reality. A necessary condition to avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasing more complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  11. Numerical Modeling of Large-Scale Rocky Coastline Evolution

    Science.gov (United States)

    Limber, P.; Murray, A. B.; Littlewood, R.; Valvo, L.

    2008-12-01

    Seventy-five percent of the world's ocean coastline is rocky. On large scales (i.e. greater than a kilometer), many intertwined processes drive rocky coastline evolution, including coastal erosion and sediment transport, tectonics, antecedent topography, and variations in sea cliff lithology. In areas such as California, an additional aspect of rocky coastline evolution involves submarine canyons that cut across the continental shelf and extend into the nearshore zone. These types of canyons intercept alongshore sediment transport and flush sand to abyssal depths during periodic turbidity currents, thereby delineating coastal sediment transport pathways and affecting shoreline evolution over large spatial and time scales. How tectonic, sediment transport, and canyon processes interact with inherited topographic and lithologic settings to shape rocky coastlines remains an unanswered, and largely unexplored, question. We will present numerical model results of rocky coastline evolution that starts with an immature fractal coastline. The initial shape is modified by headland erosion, wave-driven alongshore sediment transport, and submarine canyon placement. Our previous model results have shown that, as expected, an initial sediment-free irregularly shaped rocky coastline with homogeneous lithology will undergo smoothing in response to wave attack; headlands erode and mobile sediment is swept into bays, forming isolated pocket beaches. As this diffusive process continues, pocket beaches coalesce, and a continuous sediment transport pathway results. However, when a randomly placed submarine canyon is introduced to the system as a sediment sink, the end results are wholly different: sediment cover is reduced, which in turn increases weathering and erosion rates and causes the entire shoreline to move landward more rapidly. The canyon's alongshore position also affects coastline morphology. 
When placed offshore of a headland, the submarine canyon captures local sediment

  12. Impedance Scaling and Impedance Control

    International Nuclear Information System (INIS)

    Chou, W.; Griffin, J.

    1997-06-01

    When a machine becomes really large, such as the Very Large Hadron Collider (VLHC), of which the circumference could reach the order of megameters, beam instability could be an essential bottleneck. This paper studies the scaling of the instability threshold vs. machine size when the coupling impedance scales in a ''normal'' way. It is shown that the beam would be intrinsically unstable for the VLHC. As a possible solution to this problem, it is proposed to introduce local impedance inserts for controlling the machine impedance. In the longitudinal plane, this could be done by using a heavily detuned rf cavity (e.g., a biconical structure), which could provide large imaginary impedance with the right sign (i.e., inductive or capacitive) while keeping the real part small. In the transverse direction, a carefully designed variation of the cross section of a beam pipe could generate negative impedance that would partially compensate the transverse impedance in one plane

  13. Control problems in very large accelerators

    International Nuclear Information System (INIS)

    Crowley-Milling, M.C.

    1985-06-01

There is no fundamental difference of kind in the control requirements between a small and a large accelerator since they are built of the same types of components, which individually have similar control inputs and outputs. The main difference is one of scale; the large machine has many more components of each type, and the distances involved are much greater. Both of these factors must be taken into account in determining the optimum way of carrying out the control functions. Small machines should use standard equipment and software for control as much as possible, as special developments for small quantities cannot normally be justified if all costs are taken into account. On the other hand, the very great number of devices needed for a large machine means that, if special developments can result in simplification, they may make possible an appreciable reduction in the control equipment costs. It is the purpose of this report to look at the special control problems of large accelerators, which the author shall arbitrarily define as those with a length of circumference in excess of 10 km, and point out where special developments, or the adoption of developments from outside the accelerator control field, can be of assistance in minimizing the cost of the control system. Most of the first part of this report was presented as a paper to the 1985 Particle Accelerator Conference. It has now been extended to include a discussion on the special case of the controls for the SSC

  14. Effectiveness of Wii-based rehabilitation in stroke: A randomized controlled study

    OpenAIRE

    Ayça Utkan Karasu; Elif Balevi Batur; Gülçin Kaymak Karataş

    2018-01-01

Objective: To investigate the efficacy of Nintendo Wii Fit®-based balance rehabilitation as an adjunctive therapy to conventional rehabilitation in stroke patients. Methods: During the study period, 70 stroke patients were evaluated. Of these, 23 who met the study criteria were randomly assigned to either the experimental group (n = 12) or the control group (n = 11) by block randomization. Primary outcome measures were Berg Balance Scale, Functional Reach Test, Postural Asses...

  15. Large scale biomimetic membrane arrays

    DEFF Research Database (Denmark)

    Hansen, Jesper Søndergaard; Perry, Mark; Vogel, Jörg

    2009-01-01

To establish planar biomimetic membranes across large scale partition aperture arrays, we created a disposable single-use horizontal chamber design that supports combined optical-electrical measurements. Functional lipid bilayers could easily and efficiently be established across CO2 laser micro-structured 8 x 8 aperture partition arrays with average aperture diameters of 301 +/- 5 mu m. We addressed the electro-physical properties of the lipid bilayers established across the micro-structured scaffold arrays by controllable reconstitution of biotechnologically and physiologically relevant membrane peptides and proteins. Next, we tested the scalability of the biomimetic membrane design by establishing lipid bilayers in rectangular 24 x 24 and hexagonal 24 x 27 aperture arrays, respectively. The results presented show that the design is suitable for further developments of sensitive biosensor assays...

  16. A topological analysis of large-scale structure, studied using the CMASS sample of SDSS-III

    International Nuclear Information System (INIS)

    Parihar, Prachi; Gott, J. Richard III; Vogeley, Michael S.; Choi, Yun-Young; Kim, Juhan; Kim, Sungsoo S.; Speare, Robert; Brownstein, Joel R.; Brinkmann, J.

    2014-01-01

We study the three-dimensional genus topology of large-scale structure using the northern region of the CMASS Data Release 10 (DR10) sample of the SDSS-III Baryon Oscillation Spectroscopic Survey. We select galaxies with redshift 0.452 < z < 0.625 and with a stellar mass M_stellar > 10^11.56 M_☉. We study the topology at two smoothing lengths: R_G = 21 h^-1 Mpc and R_G = 34 h^-1 Mpc. The genus topology studied at the R_G = 21 h^-1 Mpc scale results in the highest genus amplitude observed to date. The CMASS sample yields a genus curve that is characteristic of one produced by Gaussian random phase initial conditions. The data thus support the standard model of inflation where random quantum fluctuations in the early universe produced Gaussian random phase initial conditions. Modest deviations in the observed genus from random phase are as expected from shot noise effects and the nonlinear evolution of structure. We suggest the use of a fitting formula motivated by perturbation theory to characterize the shift and asymmetries in the observed genus curve with a single parameter. We construct 54 mock SDSS CMASS surveys along the past light cone from the Horizon Run 3 (HR3) N-body simulations, where gravitationally bound dark matter subhalos are identified as the sites of galaxy formation. We study the genus topology of the HR3 mock surveys with the same geometry and sampling density as the observational sample and find the observed genus topology to be consistent with ΛCDM as simulated by the HR3 mock samples. We conclude that the topology of the large-scale structure in the SDSS CMASS sample is consistent with cosmological models having primordial Gaussian density fluctuations growing in accordance with general relativity to form galaxies in massive dark matter halos.
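For reference, the genus curve of a Gaussian random field, against which the observed curve is compared, has the standard analytic form g(nu) proportional to (1 - nu^2) exp(-nu^2/2). A small numpy sketch, with the amplitude A left as a free parameter since it depends on the power spectrum and smoothing length:

```python
import numpy as np

def genus_gaussian(nu, amplitude=1.0):
    """Theoretical genus-per-unit-volume curve of a Gaussian random field,
    g(nu) = A * (1 - nu**2) * exp(-nu**2 / 2), where nu is the density
    threshold in units of the standard deviation of the smoothed field."""
    nu = np.asarray(nu, dtype=float)
    return amplitude * (1.0 - nu**2) * np.exp(-nu**2 / 2.0)

nu = np.linspace(-3, 3, 61)
g = genus_gaussian(nu)
# g peaks at nu = 0 (sponge-like median-density surfaces) and changes
# sign at nu = +/-1, where the topology shifts toward isolated clusters
# (high nu) or isolated voids (low nu).
```

Shifts and asymmetries of the measured genus curve relative to this symmetric template are what the single-parameter fitting formula mentioned above is designed to capture.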

  17. Large-scale exact diagonalizations reveal low-momentum scales of nuclei

    Science.gov (United States)

    Forssén, C.; Carlsson, B. D.; Johansson, H. T.; Sääf, D.; Bansal, A.; Hagen, G.; Papenbrock, T.

    2018-03-01

Ab initio methods aim to solve the nuclear many-body problem with controlled approximations. Virtually exact numerical solutions for realistic interactions can only be obtained for certain special cases such as few-nucleon systems. Here we extend the reach of exact diagonalization methods to handle model spaces with dimension exceeding 10^10 on a single compute node. This allows us to perform no-core shell model (NCSM) calculations for 6Li in model spaces up to N_max = 22 and to reveal the 4He+d halo structure of this nucleus. Still, the use of a finite harmonic-oscillator basis implies truncations in both infrared (IR) and ultraviolet (UV) length scales. These truncations impose finite-size corrections on observables computed in this basis. We perform IR extrapolations of energies and radii computed in the NCSM and with the coupled-cluster method at several fixed UV cutoffs. It is shown that this strategy enables information gain also from data that is not fully UV converged. IR extrapolations improve the accuracy of relevant bound-state observables for a range of UV cutoffs, thus making them profitable tools. We relate the momentum scale that governs the exponential IR convergence to the threshold energy for the first open decay channel. Using large-scale NCSM calculations we numerically verify this small-momentum scale of finite nuclei.

  18. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  19. Muscle energy technique compared to eccentric loading exercise in the management of achilles tendinitis: A pilot randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Hariharasudhan Ravichandran

    2017-01-01

Background: Achilles tendinitis is a common overuse injury among both elite and recreational athletes involved in activities such as repetitive jumping and running. Aim: The aim of this single-blinded randomized study was to compare the efficacy of muscle energy technique (MET) and eccentric loading exercise (ELE) interventions on improving functional ability and pain reduction among athletes with Achilles tendinitis. Methods: A single-blinded, pilot, randomized study was conducted in the Department of Physical Therapy, Global Hospitals and Health City, India, with 6-week follow-up. A total of 30 patients with Achilles tendinitis were randomly allocated to receive either MET (n = 15) or ELE (n = 15) treatment. Treatment effects were evaluated by pre- and post-treatment assessment of the visual analog scale (VAS) and the Victorian Institute of Sports Assessment-Achilles (VISA-A) questionnaire. Measures were performed by single-blinded evaluators at baseline and at 2, 4, and 6 weeks of treatment. Results: Both groups showed a significant difference in VAS after 6 weeks; the ELE group showed a significant improvement at 2 and 4 weeks in comparison with the MET group. The VISA-A scale score significantly improved in both groups. Yet, comparison of VISA-A scores between groups showed a marginally significant difference (P = 0.012). Conclusion: This pilot randomized controlled trial (RCT) showed the efficacy of ELE in reducing pain and improving functional ability among patients with Achilles tendinitis. The findings of this study provide the rationale for undertaking a large-scale RCT. A larger trial is needed to establish evidence for the clinical practice of ELE in Achilles tendinitis cases.
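Between-group comparisons of the kind reported above (MET vs. ELE outcome scores) are typically made with a two-sample test; a minimal sketch using Welch's t-test, which does not assume equal variances. All numbers are synthetic stand-ins, not the trial's data, and the test choice is an assumption:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical post-treatment VAS pain scores (0-10) for two groups of 15;
# these are simulated, illustrative values only.
vas_met = rng.normal(loc=4.0, scale=1.0, size=15)
vas_ele = rng.normal(loc=2.5, scale=1.0, size=15)

# Welch's t-test compares group means without assuming equal variances.
t_stat, p_value = stats.ttest_ind(vas_met, vas_ele, equal_var=False)
print(round(float(t_stat), 2), round(float(p_value), 4))
```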

  20. The topology of large-scale structure. III. Analysis of observations

    International Nuclear Information System (INIS)

    Gott, J.R. III; Weinberg, D.H.; Miller, J.; Thuan, T.X.; Schneider, S.E.

    1989-01-01

    A recently developed algorithm for quantitatively measuring the topology of large-scale structures in the universe was applied to a number of important observational data sets. The data sets included an Abell (1958) cluster sample out to Vmax = 22,600 km/sec, the Giovanelli and Haynes (1985) sample out to Vmax = 11,800 km/sec, the CfA sample out to Vmax = 5000 km/sec, the Thuan and Schneider (1988) dwarf sample out to Vmax = 3000 km/sec, and the Tully (1987) sample out to Vmax = 3000 km/sec. It was found that, when the topology is studied on smoothing scales significantly larger than the correlation length (i.e., smoothing length, lambda, not below 1200 km/sec), the topology is spongelike and is consistent with the standard model in which the structure seen today has grown from small fluctuations caused by random noise in the early universe. When the topology is studied on the scale of lambda of about 600 km/sec, a small shift is observed in the genus curve in the direction of a meatball topology. 66 refs

  1. The topology of large-scale structure. III - Analysis of observations

    Science.gov (United States)

    Gott, J. Richard, III; Miller, John; Thuan, Trinh X.; Schneider, Stephen E.; Weinberg, David H.; Gammie, Charles; Polk, Kevin; Vogeley, Michael; Jeffrey, Scott; Bhavsar, Suketu P.; Melott, Adrian L.; Giovanelli, Riccardo; Hayes, Martha P.; Tully, R. Brent; Hamilton, Andrew J. S.

    1989-05-01

    A recently developed algorithm for quantitatively measuring the topology of large-scale structures in the universe was applied to a number of important observational data sets. The data sets included an Abell (1958) cluster sample out to Vmax = 22,600 km/sec, the Giovanelli and Haynes (1985) sample out to Vmax = 11,800 km/sec, the CfA sample out to Vmax = 5000 km/sec, the Thuan and Schneider (1988) dwarf sample out to Vmax = 3000 km/sec, and the Tully (1987) sample out to Vmax = 3000 km/sec. It was found that, when the topology is studied on smoothing scales significantly larger than the correlation length (i.e., smoothing length, lambda, not below 1200 km/sec), the topology is spongelike and is consistent with the standard model in which the structure seen today has grown from small fluctuations caused by random noise in the early universe. When the topology is studied on the scale of lambda of about 600 km/sec, a small shift is observed in the genus curve in the direction of a 'meatball' topology.

  2. Comparison of prestellar core elongations and large-scale molecular cloud structures in the Lupus I region

    Energy Technology Data Exchange (ETDEWEB)

Poidevin, Frédérick [UCL, KLB, Department of Physics and Astronomy, Gower Place, London WC1E 6BT (United Kingdom); Ade, Peter A. R.; Hargrave, Peter C.; Nutter, David [School of Physics and Astronomy, Cardiff University, Queens Buildings, The Parade, Cardiff CF24 3AA (United Kingdom); Angile, Francesco E.; Devlin, Mark J.; Klein, Jeffrey [Department of Physics and Astronomy, University of Pennsylvania, 209 South 33rd Street, Philadelphia, PA 19104 (United States); Benton, Steven J.; Netterfield, Calvin B. [Department of Physics, University of Toronto, 60 St. George Street, Toronto, ON M5S 1A7 (Canada); Chapin, Edward L. [XMM SOC, ESAC, Apartado 78, E-28691 Villanueva de la Cañada, Madrid (Spain); Fissel, Laura M.; Gandilo, Natalie N. [Department of Astronomy and Astrophysics, University of Toronto, 50 St. George Street, Toronto, ON M5S 3H4 (Canada); Fukui, Yasuo [Department of Physics, Nagoya University, Chikusa-ku, Nagoya, Aichi 464-8601 (Japan); Gundersen, Joshua O. [Department of Physics, University of Miami, 1320 Campo Sano Drive, Coral Gables, FL 33146 (United States); Korotkov, Andrei L. [Department of Physics, Brown University, 182 Hope Street, Providence, RI 02912 (United States); Matthews, Tristan G.; Novak, Giles [Department of Physics and Astronomy, Northwestern University, 2145 Sheridan Road, Evanston, IL 60208 (United States); Moncelsi, Lorenzo; Mroczkowski, Tony K. [California Institute of Technology, 1200 East California Boulevard, Pasadena, CA 91125 (United States); Olmi, Luca, E-mail: fpoidevin@iac.es [Physics Department, University of Puerto Rico, Rio Piedras Campus, Box 23343, UPR station, San Juan, PR 00931 (United States); and others

    2014-08-10

    Turbulence and magnetic fields are expected to be important for regulating molecular cloud formation and evolution. However, their effects on sub-parsec to 100 parsec scales, leading to the formation of starless cores, are not well understood. We investigate the prestellar core structure morphologies obtained from analysis of the Herschel-SPIRE 350 μm maps of the Lupus I cloud. This distribution is first compared on a statistical basis to the large-scale shape of the main filament. We find the distribution of the elongation position angle of the cores to be consistent with a random distribution, which means no specific orientation of the morphology of the cores is observed with respect to the mean orientation of the large-scale filament in Lupus I, nor relative to a large-scale bent filament model. This distribution is also compared to the mean orientation of the large-scale magnetic fields probed at 350 μm with the Balloon-borne Large Aperture Telescope for Polarimetry during its 2010 campaign. Here again we do not find any correlation between the core morphology distribution and the average orientation of the magnetic fields on parsec scales. Our main conclusion is that the local filament dynamics—including secondary filaments that often run orthogonally to the primary filament—and possibly small-scale variations in the local magnetic field direction, could be the dominant factors for explaining the final orientation of each core.
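A statistical comparison of core elongation position angles against a random distribution, as described above, can be sketched with a Kolmogorov-Smirnov test against a uniform distribution on [0°, 180°). The angles below are synthetic stand-ins, and the choice of the KS test is an assumption for illustration (the paper's exact statistic is not specified here):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical core elongation position angles in degrees on [0, 180);
# simulated placeholders for measured values.
angles = rng.uniform(0.0, 180.0, size=40)

# KS test against uniform(loc=0, scale=180): a large p-value means the
# angles are statistically consistent with random core orientations.
stat, p_value = stats.kstest(angles, "uniform", args=(0.0, 180.0))
print(round(float(stat), 3), round(float(p_value), 3))
```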

  3. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

Research on small-scale 3D printing has rapidly evolved, where numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  4. From polymers to quantum gravity: Triple-scaling in rectangular random matrix models

    International Nuclear Information System (INIS)

    Myers, R.C.; Periwal, V.

    1993-01-01

Rectangular NxM matrix models can be solved in several qualitatively distinct large-N limits, since two independent parameters govern the size of the matrix. Regarded as models of random surfaces, these matrix models interpolate between branched polymer behaviour and two-dimensional quantum gravity. We solve such models in a 'triple-scaling' regime in this paper, with N and M becoming large independently. A correspondence between phase transitions and singularities of mappings from R² to R² is indicated. At different critical points, the scaling behaviour is determined by (i) two decoupled ordinary differential equations; (ii) an ordinary differential equation and a finite-difference equation; or (iii) two coupled partial differential equations. The Painlevé II equation arises (in conjunction with a difference equation) at a point associated with branched polymers. For critical points described by partial differential equations, there are dual weak-coupling/strong-coupling expansions. It is conjectured that the new physics is related to microscopic topology fluctuations. (orig.)
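For reference, the Painlevé II equation invoked above reads, in its standard form (with the constant term α set to zero, as is usual for the string equations of double-scaled matrix models):

```latex
u''(x) = 2\,u(x)^{3} + x\,u(x)
```

At the branched-polymer point this ordinary differential equation appears in conjunction with a finite-difference equation, as noted in the abstract.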

  5. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems ... the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its ...

  6. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large scale scientific infrastructures held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need for addressing energy issues in relation with science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  7. Large Scale Gaussian Processes for Atmospheric Parameter Retrieval and Cloud Screening

    Science.gov (United States)

    Camps-Valls, G.; Gomez-Chova, L.; Mateo, G.; Laparra, V.; Perez-Suay, A.; Munoz-Mari, J.

    2017-12-01

Current Earth-observation (EO) applications for image classification have to deal with an unprecedented amount of heterogeneous and complex data sources. Spatio-temporally explicit classification methods are a requirement in a variety of Earth system data processing applications. Upcoming missions such as the super-spectral Copernicus Sentinels, EnMAP, and FLEX will soon provide unprecedented data streams. Very high resolution (VHR) sensors like Worldview-3 also pose big challenges to data processing. The challenge is not only attached to optical sensors but also to infrared sounders and radar images, which have increased in spectral, spatial and temporal resolution. Besides, we should not forget the availability of the extremely large remote sensing data archives already collected by several past missions, such as ENVISAT, Cosmo-SkyMed, Landsat, SPOT, or Seviri/MSG. These large-scale data problems require enhanced processing techniques that should be accurate, robust and fast. Standard parameter retrieval and classification algorithms cannot cope with this new scenario efficiently. In this work, we review the field of large scale kernel methods for both atmospheric parameter retrieval and cloud detection using infrared sounding IASI data and optical Seviri/MSG imagery. We propose novel Gaussian Processes (GPs) to train problems with millions of instances and a high number of input features. The algorithms can cope with non-linearities efficiently, accommodate multi-output problems, and provide confidence intervals for the predictions. Several strategies to speed up the algorithms are devised: random Fourier features and variational approaches for cloud classification using IASI data and Seviri/MSG, and engineered randomized kernel functions and emulation in temperature, moisture and ozone atmospheric profile retrieval from IASI as a proxy to the upcoming MTG-IRS sensor. An excellent compromise between accuracy and scalability is obtained in all applications.
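The random Fourier features mentioned above approximate a shift-invariant kernel by an explicit finite-dimensional feature map z(x), so that z(x)·z(y) ≈ k(x, y) (Rahimi and Recht); this is what makes kernel methods tractable at large scale. A minimal sketch for the RBF kernel, with dimensions, σ, and the feature count chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Small synthetic dataset; sizes and the bandwidth sigma are illustrative.
X = rng.normal(size=(50, 5))
sigma = 1.5
D = 4000  # number of random Fourier features

# Draw the random projection ONCE and reuse it for every input point.
W = rng.normal(0.0, 1.0 / sigma, size=(X.shape[1], D))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)
Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)   # explicit feature map

# Exact RBF kernel for comparison: k(x, y) = exp(-||x-y||^2 / (2 sigma^2)).
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-sq_dists / (2.0 * sigma**2))
K_approx = Z @ Z.T

print(np.abs(K_exact - K_approx).max())  # shrinks as D grows
```

With the explicit map in hand, a linear model on Z stands in for kernel regression/classification at a cost linear in the number of samples.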

  8. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test with a view to ensuring the safety of light water reactors was started in fiscal 1976 based on the special account act for power source development promotion measures by the entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April, 1980. Thereupon, the large-scale reflood test is now included in this program. It consists of two tests using a cylindrical core testing apparatus for examining the overall system effect and a plate core testing apparatus for testing individual effects. Each apparatus is composed of the mock-ups of pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  9. Volume measurement study for large scale input accountancy tank

    International Nuclear Information System (INIS)

    Uchikoshi, Seiji; Watanabe, Yuichi; Tsujino, Takeshi

    1999-01-01

The Large Scale Tank Calibration (LASTAC) facility, including an experimental tank which has the same volume and structure as the input accountancy tank of the Rokkasho Reprocessing Plant (RRP), was constructed at the Nuclear Material Control Center of Japan. Demonstration experiments have been carried out to evaluate the precision of solution volume measurement and to establish a procedure for highly accurate pressure measurement in a large scale tank with a dip-tube bubbler probe system, to be applied to the input accountancy tank of RRP. Solution volume in a tank is determined by substituting the measured solution level into a calibration function obtained in advance, which expresses the relation between the solution level and its volume in the tank. Therefore, precise solution volume measurement needs a precise calibration function that is determined carefully. The LASTAC calibration experiments using pure water showed good reproducibility. (J.P.N.)
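The level-to-volume calibration function described above can be sketched as a simple polynomial fit to calibration-run data; the levels, volumes, and polynomial order below are hypothetical placeholders, not LASTAC values:

```python
import numpy as np

# Hypothetical calibration run: measured solution levels (mm) and the known
# cumulative volumes (L) metered in at each step; illustrative numbers only.
levels = np.array([100.0, 300.0, 500.0, 700.0, 900.0, 1100.0])
volumes = np.array([50.0, 160.0, 280.0, 410.0, 550.0, 700.0])

# Fit a low-order polynomial calibration function: volume = f(level).
coeffs = np.polyfit(levels, volumes, deg=2)
calibration = np.poly1d(coeffs)

# Later, any measured level is converted to a volume via the fitted function.
print(float(calibration(600.0)))  # ≈ 343.75 for this synthetic data
```

In practice the calibration data would be taken over many fine increments, and the fit residuals would feed into the volume-measurement uncertainty budget.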

  10. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

A wide range of large scale observations hint towards possible modifications of the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (“alpha dipole”), and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or it could indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is very much constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints for Hubble scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints, and their potential to explain the large scale cosmic anomalies.

  11. The composing technique of fast and large scale nuclear data acquisition and control system with single chip microcomputers and PC computers

    International Nuclear Information System (INIS)

    Xu Zurun; Wu Shiying; Liu Haitao; Yao Yangsen; Wang Yingguan; Yang Chaowen

    1998-01-01

    The technique of employing single-chip microcomputers and PC computers to compose a fast and large scale nuclear data acquisition and control system was discussed in detail. The optimum composition mode of this kind of system, the acquisition and control circuit unit based on single-chip microcomputers, the real-time communication methods and the software composition under the Windows 3.2 were also described. One, two and three dimensional spectra measured by this system were demonstrated

  12. The composing technique of fast and large scale nuclear data acquisition and control system with single chip microcomputers and PC computers

    International Nuclear Information System (INIS)

    Xu Zurun; Wu Shiying; Liu Haitao; Yao Yangsen; Wang Yingguan; Yang Chaowen

    1997-01-01

    The technique of employing single-chip microcomputers and PC computers to compose a fast and large scale nuclear data acquisition and control system was discussed in detail. The optimum composition mode of this kind of system, the acquisition and control circuit unit based on single-chip microcomputers, the real-time communication methods and the software composition under the Windows 3.2 were also described. One, two and three dimensional spectra measured by this system were demonstrated

  13. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10^5 and 10^8 and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied

  14. Approaches to large scale unsaturated flow in heterogeneous, stratified, and fractured geologic media

    International Nuclear Information System (INIS)

    Ababou, R.

    1991-08-01

This report develops a broad review and assessment of quantitative modeling approaches and data requirements for large-scale subsurface flow in a radioactive waste geologic repository. The data review includes discussions of controlled field experiments, existing contamination sites, and site-specific hydrogeologic conditions at Yucca Mountain. Local-scale constitutive models for the unsaturated hydrodynamic properties of geologic media are analyzed, with particular emphasis on the effect of structural characteristics of the medium. The report further reviews and analyzes large-scale hydrogeologic spatial variability from aquifer data, unsaturated soil data, and fracture network data gathered from the literature. Finally, various modeling strategies toward large-scale flow simulations are assessed, including direct high-resolution simulation, and coarse-scale simulation based on auxiliary hydrodynamic models such as single equivalent continuum and dual-porosity continuum. The roles of anisotropy, fracturing, and broad-band spatial variability are emphasized. 252 refs

  15. Large scale and performance tests of the ATLAS online software

    International Nuclear Information System (INIS)

    Alexandrov; Kotov, V.; Mineev, M.; Roumiantsev, V.; Wolters, H.; Amorim, A.; Pedro, L.; Ribeiro, A.; Badescu, E.; Caprini, M.; Burckhart-Chromek, D.; Dobson, M.; Jones, R.; Kazarov, A.; Kolos, S.; Liko, D.; Lucio, L.; Mapelli, L.; Nassiakou, M.; Schweiger, D.; Soloviev, I.; Hart, R.; Ryabov, Y.; Moneta, L.

    2001-01-01

One of the sub-systems of the Trigger/DAQ system of the future ATLAS experiment is the Online Software system. It encompasses the functionality needed to configure, control and monitor the DAQ. Its architecture is based on a component structure described in the ATLAS Trigger/DAQ technical proposal. Regular integration tests ensure its smooth operation in test beam setups during its evolutionary development towards the final ATLAS online system. Feedback is received and returned into the development process. Studies of the system behavior have been performed on a set of up to 111 PCs on a configuration which is getting closer to the final size. Large scale and performance tests of the integrated system were performed on this setup, with emphasis on investigating the aspects of the inter-dependence of the components and the performance of the communication software. Of particular interest were the run control state transitions in various configurations of the run control hierarchy. For the purpose of the tests, the software from other Trigger/DAQ sub-systems has been emulated. The authors present a brief overview of the online system structure, its components, and the large scale integration tests and their results

  16. Assessment of long-term and large-scale even-odd license plate controlled plan effects on urban air quality and its implication

    Science.gov (United States)

    Zhao, Suping; Yu, Ye; Qin, Dahe; Yin, Daiying; He, Jianjun

    2017-12-01

To ease traffic congestion and improve urban air quality, a long-lasting and large-scale even-odd license plate control plan was implemented by the local government from 20 November to 26 December 2016 in urban Lanzhou, a semi-arid valley city of northwest China. The traffic control measures provided an invaluable opportunity to evaluate their effects on urban air quality in less developed cities of northwest China. Based on simultaneously measured air pollutants and meteorological parameters, the abatement of traffic-related pollutants induced by the implemented control measures, such as CO, PM2.5 and PM10 (particulate matter with diameter less than 2.5 μm and 10 μm, respectively) concentrations, was first quantified by comparing the air quality data in urban areas with those in rural areas (uncontrolled zones). The concentrations of CO and NO2 from motor vehicles and of fine particulate matter (PM2.5) showed significant decreases of 15%-23% during the traffic control period relative to those measured before the control period, with hourly maximum CO, PM2.5, and NO2/SO2 reductions of 43%, 35% and 141.4%, respectively. The influence of the control measures on the AQI (air quality index) and ozone was smaller than their effect on other air pollutants. Therefore, to alleviate serious winter haze pollution in China and to protect human health, stringent long-term and large-scale even-odd license plate control plans should be implemented aperiodically in urban areas, especially during periods with poor diffusion conditions.
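The quoted abatements are relative reductions between the pre-control and control periods; as a worked example of the arithmetic (the concentrations below are illustrative, not measured values from the study):

```python
def percent_reduction(before, during):
    """Relative concentration reduction, in percent, between the
    pre-control period and the control period."""
    return 100.0 * (before - during) / before

# Illustrative numbers: a pollutant averaging 80 ug/m3 before the control
# period and 62 ug/m3 during it corresponds to a 22.5% reduction.
print(percent_reduction(80.0, 62.0))  # → 22.5
```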

  17. On a Game of Large-Scale Projects Competition

    Science.gov (United States)

    Nikonov, Oleg I.; Medvedeva, Marina A.

    2009-09-01

    The paper is devoted to game-theoretical control problems motivated by economic decision making situations arising in realization of large-scale projects, such as designing and putting into operations the new gas or oil pipelines. A non-cooperative two player game is considered with payoff functions of special type for which standard existence theorems and algorithms for searching Nash equilibrium solutions are not applicable. The paper is based on and develops the results obtained in [1]-[5].

  18. Addressing challenges in scaling up TB and HIV treatment integration in rural primary healthcare clinics in South Africa (SUTHI): a cluster randomized controlled trial protocol.

    Science.gov (United States)

    Naidoo, Kogieleum; Gengiah, Santhanalakshmi; Yende-Zuma, Nonhlanhla; Padayatchi, Nesri; Barker, Pierre; Nunn, Andrew; Subrayen, Priashni; Abdool Karim, Salim S

    2017-11-13

A large and compelling clinical evidence base has shown that integrated TB and HIV services lead to a reduction in human immunodeficiency virus (HIV)- and tuberculosis (TB)-associated mortality and morbidity. Despite official policies and guidelines recommending TB and HIV care integration, its poor implementation has resulted in TB and HIV remaining the commonest causes of death in several countries in sub-Saharan Africa, including South Africa. This study aims to reduce mortality due to TB-HIV co-infection through a quality improvement strategy for scaling up of TB and HIV treatment integration in rural primary healthcare clinics in South Africa. The study is designed as an open-label cluster randomized controlled trial. Sixteen clinic supervisors who oversee 40 primary health care (PHC) clinics in two rural districts of KwaZulu-Natal, South Africa will be randomized to either the control group (provision of standard government guidance for TB-HIV integration) or the intervention group (provision of standard government guidance with active enhancement of TB-HIV care integration through a quality improvement approach). The primary outcome is all-cause mortality among TB-HIV patients. Secondary outcomes include time to antiretroviral therapy (ART) initiation among TB-HIV co-infected patients, as well as TB and HIV treatment outcomes at 12 months. In addition, factors that may affect the intervention, such as conditions in the clinic and staff availability, will be closely monitored and documented. This study has the potential to address the gap between the establishment of TB-HIV care integration policies and guidelines and their implementation in the provision of integrated care in PHC clinics. If successful, an evidence-based intervention comprising change ideas, tools, and approaches for quality improvement could inform the future rapid scale up, implementation, and sustainability of improved TB-HIV integration across sub-Sahara Africa and other resource

  19. Internet-based cognitive-behavior therapy for procrastination: A randomized controlled trial.

    Science.gov (United States)

    Rozental, Alexander; Forsell, Erik; Svensson, Andreas; Andersson, Gerhard; Carlbring, Per

    2015-08-01

Procrastination can be a persistent behavior pattern associated with personal distress. However, research investigating different treatment interventions is scarce, and no randomized controlled trial has examined the efficacy of cognitive-behavior therapy (CBT). Meanwhile, Internet-based CBT has been found promising for several conditions, but has not yet been used for procrastination. Participants (N = 150) were randomized to guided self-help, unguided self-help, and wait-list control. Outcome measures were administered before and after treatment, or weekly throughout the treatment period. They included the Pure Procrastination Scale, the Irrational Procrastination Scale, the Susceptibility to Temptation Scale, the Montgomery Åsberg Depression Rating Scale-Self-report version, the Generalized Anxiety Disorder Assessment, and the Quality of Life Inventory. The intention-to-treat principle was used for all statistical analyses. Mixed-effects models revealed moderate between-groups effect sizes comparing guided and unguided self-help with wait-list control; the Pure Procrastination Scale, Cohen's d = 0.70, 95% confidence interval (CI) [0.29, 1.10], and d = 0.50, 95% CI [0.10, 0.90], and the Irrational Procrastination Scale, d = 0.81, 95% CI [0.40, 1.22], and d = 0.69, 95% CI [0.29, 1.09]. Clinically significant change was achieved among 31.3-40.0% for guided self-help, compared with 24.0-36.0% for unguided self-help. Neither of the treatment conditions was found to be superior on any of the outcome measures, Fs(98, 65.17-72.55) .19. Internet-based CBT could be useful for managing self-reported difficulties due to procrastination, both with and without the guidance of a therapist. (c) 2015 APA, all rights reserved.
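The between-groups effect sizes quoted above are Cohen's d values; a minimal implementation using the pooled standard deviation (the toy scores are illustrative, not the trial's data):

```python
import math

def cohens_d(x, y):
    """Cohen's d for two independent samples, using the pooled standard
    deviation with the usual n-1 denominators for the sample variances."""
    nx, ny = len(x), len(y)
    mx = sum(x) / nx
    my = sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    pooled_sd = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    return (mx - my) / pooled_sd

# Toy data, not the trial's scores: means 4 and 3, pooled SD 2,
# so d = (4 - 3) / 2 = 0.5, a "moderate" effect by convention.
print(cohens_d([2, 4, 6], [1, 3, 5]))  # → 0.5
```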

  20. Can long-term antibiotic treatment prevent progression of peripheral arterial occlusive disease? A large, randomized, double-blinded, placebo-controlled trial

    DEFF Research Database (Denmark)

    Joensen, J B; Juul, Svend; Henneberg, E

    2007-01-01

PURPOSE: The purpose was to investigate in a large, randomized, double-blinded, placebo-controlled trial, whether antibiotic treatment can prevent progression of peripheral arterial disease (PAD). MATERIAL AND METHODS: Five hundred and seven patients were included; all patients had an established ... analyzed mainly by Cox regression and linear regression. RESULTS: Included patients with PAD were randomized. Two patients withdrew. Of the remaining, 248 received roxithromycin and 257 placebo. In the treatment group 55% were seropositive and 53% in the placebo group. Mean follow-up was 2.1 years (range 0.06-5.1 years). In the placebo group, 26 died and 80 primary events occurred in total. In the treatment group, 28 died and 74 primary events were observed. The hazard ratio of death was 1.13 (95% CI: 0.68; 1.90), and of primary events 0.92 (95% CI: 0.67; 1.26). Also on secondary events and ABPI changes ...
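The reported hazard ratios come from Cox regression; as a rough sanity check, one can compute the crude (unadjusted) risk ratio of primary events directly from the counts in the abstract — 74 events among 248 treated vs. 80 among 257 on placebo — which lands near the reported HR of 0.92. This ignores follow-up time and censoring, so it only approximates the regression estimate:

```python
# Crude risk ratio from the event counts quoted in the abstract.
# This ignores person-time and censoring, so it only roughly tracks
# the Cox-regression hazard ratio (0.92) reported in the trial.
events_treatment, n_treatment = 74, 248
events_placebo, n_placebo = 80, 257

risk_treatment = events_treatment / n_treatment
risk_placebo = events_placebo / n_placebo
risk_ratio = risk_treatment / risk_placebo
print(round(risk_ratio, 2))  # → 0.96
```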

  1. Diffusion of charged particles in strong large-scale random and regular magnetic fields

    International Nuclear Information System (INIS)

    Mel'nikov, Yu.P.

    2000-01-01

    The nonlinear collision integral for the Green's function averaged over a random magnetic field is transformed using an iteration procedure taking account of the strong random scattering of particles on the correlation length of the random magnetic field. Under this transformation the regular magnetic field is assumed to be uniform at distances of the order of the correlation length. The single-particle Green's functions of the scattered particles in the presence of a regular magnetic field are investigated. The transport coefficients are calculated taking account of the broadening of the cyclotron and Cherenkov resonances as a result of strong random scattering. The mean-free path lengths parallel and perpendicular to the regular magnetic field are found for a power-law spectrum of the random field. The analytical results obtained are compared with the experimental data on the transport ranges of solar and galactic cosmic rays in the interplanetary magnetic field. As a result, the conditions for the propagation of cosmic rays in the interplanetary space and a more accurate idea of the structure of the interplanetary magnetic field are determined

  2. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    Mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw material powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube, made with a large-scale hollow capsule, and long-length claddings were manufactured, and the applicability of these processes was evaluated. The following results were obtained. (1) A large-scale mother tube with dimensions of 32 mm OD, 21 mm ID, and 2 m length was successfully manufactured using a large-scale hollow capsule; this mother tube has a high degree of dimensional accuracy. (2) The chemical composition and microstructure of the manufactured mother tube are similar to those of the existing mother tube manufactured with a small-scale can, and no remarkable difference between the bottom and top ends of the manufactured mother tube was observed. (3) Long-length cladding was successfully manufactured from the large-scale mother tube that was made using a large-scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, manufacturing mother tubes using large-scale hollow capsules is promising. (author)

  3. Efficacy of electroacupuncture for symptoms of menopausal transition: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Liu, Zhishun; Wang, Yang; Xu, Huanfang; Wu, Jiani; He, Liyun; Jiang, John Yi; Yan, Shiyan; Du, Ruosang; Liu, Baoyan

    2014-06-21

    Previous studies have shown that acupuncture can alleviate postmenopausal symptoms, such as hot flashes, but few studies have assessed symptoms during the menopausal transition (MT) period. Thus, the effect of acupuncture upon MT symptoms is unclear. We designed a large-scale trial aimed at evaluating the efficacy of electroacupuncture (EA) for MT symptoms compared with sham electroacupuncture, and at observing the safety of electroacupuncture. In this multicenter randomized controlled trial, 360 women will be randomized to either an electroacupuncture group or a sham electroacupuncture group. During the 8-week treatment, a menopause rating scale, the average 24-hour hot flash score, the Menopause-Specific Quality of Life Questionnaire score, and levels of female hormones will be observed. Follow-ups will be made at the 20th and 32nd weeks. Though there is no completely inert placebo acupuncture and blinding is difficult in acupuncture trials, the placebo effect of EA can still be partially excluded in this study. For the placebo control, we use non-acupuncture points and a tailor-made sham needle. This needle is different from a retractable needle, which is usually used for sham acupuncture; the needle in this trial is more simply constructed and more acceptable to Chinese people. We expect to evaluate the efficacy of electroacupuncture for MT symptoms and clarify its effect on these symptoms. ClinicalTrials.gov Identifier: NCT01849172 (Date of registration: 05/05/2013).

  4. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global changes, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability, or provide future scenarios of water resources. With the aim of better understanding hydrological changes, it is of crucial importance to determine how, and to what extent, trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling, and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed and of the North Atlantic sea level pressure (SLP), in order to gain additional insights into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) defining those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant according to time-scale (i.e., for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictand: precipitation and streamflow on the Seine river catchment) from a large-scale predictor (SLP over the Euro-Atlantic sector) on a monthly time-step. This approach...
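The wavelet multiresolution step described above can be illustrated with a minimal, pure-NumPy Haar-style decomposition. The series below is synthetic (a slow oscillation plus noise), not the Seine data, and the level count is an arbitrary choice:

```python
import numpy as np

def haar_multires(x, levels):
    """Additive Haar-style multiresolution: one detail component per
    dyadic scale plus a final smooth, summing exactly back to the input."""
    x = np.asarray(x, dtype=float)
    components, smooth, block = [], x.copy(), 1
    for _ in range(levels):
        block *= 2  # block averages over 2, 4, 8, ... samples
        avg = smooth.reshape(-1, block).mean(axis=1)
        next_smooth = np.repeat(avg, block)
        components.append(smooth - next_smooth)  # detail at this scale
        smooth = next_smooth
    components.append(smooth)  # residual trend
    return components

# Synthetic "monthly" series: slow oscillation plus noise (length 128 = 2**7)
t = np.arange(128)
x = np.sin(2 * np.pi * t / 64) + 0.3 * np.random.default_rng(0).standard_normal(128)
parts = haar_multires(x, levels=4)
print("components:", len(parts), "| reconstruction exact:",
      bool(np.allclose(sum(parts), x)))
```

Each component isolates variability at one dyadic time scale, which is the property the downscaling approach exploits when linking each wavelength band to its own large-scale predictor pattern.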

  5. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  6. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
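The decomposition-plus-envelope idea behind the amplitude-modulation result can be sketched numerically. In the toy signal below the modulation is built in by construction, and the spectral cutoff and Hilbert-transform envelope are standard generic choices, not the authors' exact procedure:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4096
t = np.arange(n)

# Toy signal: small-scale noise whose amplitude follows a large-scale wave
large = np.sin(2 * np.pi * t / 512)
small = (1.0 + 0.5 * large) * rng.standard_normal(n)
u = large + small

# Split at an assumed spectral cutoff (period 128 samples)
U = np.fft.rfft(u)
keep = np.fft.rfftfreq(n) < 1.0 / 128
u_large = np.fft.irfft(np.where(keep, U, 0.0), n)
u_small = u - u_large

# Small-scale envelope via an FFT-based Hilbert transform (n is even)
spec = np.fft.fft(u_small)
weights = np.zeros(n)
weights[0], weights[1:n // 2], weights[n // 2] = 1.0, 2.0, 1.0
envelope = np.abs(np.fft.ifft(spec * weights))

r = np.corrcoef(u_large, envelope)[0, 1]
print(f"corr(large-scale signal, small-scale envelope) = {r:.2f}")
```

A clearly positive correlation between the large-scale component and the small-scale envelope is the signature of amplitude modulation that the hot-wire analysis detects.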

  7. Linear velocity fields in non-Gaussian models for large-scale structure

    Science.gov (United States)

    Scherrer, Robert J.

    1992-01-01

    Linear velocity fields in two types of physically motivated non-Gaussian models are examined for large-scale structure: seed models, in which the density field is a convolution of a density profile with a distribution of points, and local non-Gaussian fields, derived from a local nonlinear transformation on a Gaussian field. The distribution of a single component of the velocity is derived for seed models with randomly distributed seeds, and these results are applied to the seeded hot dark matter model and the global texture model with cold dark matter. An expression for the distribution of a single component of the velocity in arbitrary local non-Gaussian models is given, and these results are applied to such fields with chi-squared and lognormal distributions. It is shown that all seed models with randomly distributed seeds and all local non-Gaussian models have single-component velocity distributions with positive kurtosis.
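The positive-kurtosis conclusion is easy to reproduce for the two local transformations mentioned. In this sketch the smoothing kernel stands in for the linear integral relating velocity to density and is purely illustrative:

```python
import numpy as np

def excess_kurtosis(x):
    """Fourth standardized moment minus 3 (zero for a Gaussian)."""
    x = x - x.mean()
    return (x**4).mean() / (x**2).mean()**2 - 3.0

rng = np.random.default_rng(2)
g = rng.standard_normal(1 << 16)  # underlying Gaussian field (1-D toy)

# Local non-Gaussian fields via a pointwise nonlinear transform
fields = {
    "chi-squared": g**2 - 1.0,             # mean-removed chi^2 field
    "lognormal": np.exp(g) - np.exp(0.5),  # mean-removed lognormal field
}

# Toy "velocity": convolve the density field with a broad kernel,
# standing in for the linear velocity-density integral (an assumption)
k = np.exp(-0.5 * (np.arange(-15, 16) / 5.0) ** 2)
k /= k.sum()

kurt = {name: excess_kurtosis(np.convolve(f, k, mode="valid"))
        for name, f in fields.items()}
for name, val in kurt.items():
    print(f"{name}: excess kurtosis of smoothed field = {val:.2f}")
```

Even after smoothing, which drives distributions toward Gaussianity, both transformed fields retain positive excess kurtosis, the lognormal case markedly more so.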

  8. Cosmology Large Angular Scale Surveyor (CLASS) Focal Plane Development

    Science.gov (United States)

    Chuss, D. T.; Ali, A.; Amiri, M.; Appel, J.; Bennett, C. L.; Colazo, F.; Denis, K. L.; Dunner, R.; Essinger-Hileman, T.; Eimer, J.; et al.

    2015-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) will measure the polarization of the Cosmic Microwave Background to search for and characterize the polarized signature of inflation. CLASS will operate from the Atacama Desert and observe approx.70% of the sky. A variable-delay polarization modulator provides modulation of the polarization at approx.10Hz to suppress the 1/f noise of the atmosphere and enable the measurement of the large angular scale polarization modes. The measurement of the inflationary signal across angular scales that spans both the recombination and reionization features allows a test of the predicted shape of the polarized angular power spectra in addition to a measurement of the energy scale of inflation. CLASS is an array of telescopes covering frequencies of 38, 93, 148, and 217 GHz. These frequencies straddle the foreground minimum and thus allow the extraction of foregrounds from the primordial signal. Each focal plane contains feedhorn-coupled transition-edge sensors that simultaneously detect two orthogonal linear polarizations. The use of single-crystal silicon as the dielectric for the on-chip transmission lines enables both high efficiency and uniformity in fabrication. Integrated band definition has been implemented that both controls the bandpass of the single-mode transmission on the chip and prevents stray light from coupling to the detectors.

  9. An industrial perspective on bioreactor scale-down: what we can learn from combined large-scale bioprocess and model fluid studies.

    Science.gov (United States)

    Noorman, Henk

    2011-08-01

    For industrial bioreactor design, operation, control and optimization, the scale-down approach is often advocated to efficiently generate data on a small scale, and effectively apply suggested improvements to the industrial scale. In all cases it is important to ensure that the scale-down conditions are representative of the real large-scale bioprocess. Progress is hampered by limited detailed and local information from large-scale bioprocesses. Complementary to real fermentation studies, physical aspects of model fluids such as air-water in large bioreactors provide useful information with limited effort and cost. Still, in industrial practice, investments of time, capital and resources often prohibit systematic work, although, in the end, savings obtained in this way are trivial compared to the expenses that result from real process disturbances, batch failures, and non-flyers with loss of business opportunity. Here we try to highlight what can be learned from real large-scale bioprocess in combination with model fluid studies, and to provide suitable computation tools to overcome data restrictions. Focus is on a specific well-documented case for a 30-m(3) bioreactor. Areas for further research from an industrial perspective are also indicated. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Compensating active power imbalances in power system with large-scale wind power penetration

    DEFF Research Database (Denmark)

    Basit, Abdul; Hansen, Anca Daniela; Altin, Müfit

    2016-01-01

    Large-scale wind power penetration can affect the supply continuity in the power system. This is a matter of high priority to investigate, as more regulating reserves and specified control strategies for generation control are required in the future power system with even higher wind power penetrat...

  11. Galaxies distribution in the universe: large-scale statistics and structures

    International Nuclear Information System (INIS)

    Maurogordato, Sophie

    1988-01-01

    This research thesis addresses the distribution of galaxies in the Universe, and more particularly large-scale statistics and structures. Based on an assessment of the main statistical techniques in use, the author outlines the need to develop additional tools beyond correlation functions in order to characterise the distribution. She introduces a new indicator: the probability that a volume randomly placed in the distribution is empty. This allows a characterisation of void properties at the working scales (up to 10h⁻¹ Mpc) in the Harvard-Smithsonian Center for Astrophysics Redshift Survey (CfA catalog). A systematic analysis of the statistical properties of different sub-samples has then been performed with respect to size and location, luminosity class, and morphological type. This analysis is then extended to different scenarios of structure formation. A program of radial velocity measurements based on observations allows the determination of possible relationships between apparent structures. The author also presents results of the search for southern extensions of the Perseus supercluster. [fr]
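The void-probability indicator introduced in this thesis is straightforward to estimate by Monte Carlo. The sketch below uses an unclustered Poisson point set, for which the answer is known analytically, rather than a galaxy catalogue:

```python
import numpy as np

rng = np.random.default_rng(3)
points = rng.random((20000, 3))  # Poisson (unclustered) points in the unit cube

def void_probability(pts, radius, n_trials=2000, rng=rng):
    """Fraction of randomly placed spheres of the given radius
    that contain no points at all."""
    # Keep test spheres fully inside the cube to avoid edge effects
    centers = rng.random((n_trials, 3)) * (1 - 2 * radius) + radius
    empty = 0
    for c in centers:
        d2 = ((pts - c) ** 2).sum(axis=1)
        empty += not (d2 < radius**2).any()
    return empty / n_trials

r = 0.02
p0 = void_probability(points, r)
poisson = np.exp(-len(points) * (4.0 / 3.0) * np.pi * r**3)  # P0 = e^(-nV)
print(f"measured P0 = {p0:.3f}, Poisson prediction = {poisson:.3f}")
```

For a clustered (galaxy-like) distribution, the measured P0 would exceed the Poisson prediction at the same density, which is exactly what makes the indicator useful for characterising voids.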

  12. Apparent scale correlations in a random multifractal process

    DEFF Research Database (Denmark)

    Cleve, Jochen; Schmiegel, Jürgen; Greiner, Martin

    2008-01-01

    We discuss various properties of a homogeneous random multifractal process, which are related to the issue of scale correlations. By design, the process has no built-in scale correlations. However, when it comes to observables like breakdown coefficients, which are based on a coarse-graining of the multifractal field, scale correlations do appear. In the log-normal limit of the model process, the conditional distributions and moments of breakdown coefficients reproduce the observations made in fully developed small-scale turbulence. These findings help to understand several puzzling empirical details...
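A minimal dyadic cascade makes the notion of breakdown coefficients concrete. The log-normal weight parameters below are arbitrary and the construction is a generic multiplicative cascade, not the authors' specific process:

```python
import numpy as np

rng = np.random.default_rng(4)

def lognormal_cascade(levels, sigma=0.4):
    """Dyadic multiplicative cascade with independent mean-one
    log-normal weights."""
    field = np.ones(1)
    for _ in range(levels):
        w = np.exp(sigma * rng.standard_normal(2 * field.size) - 0.5 * sigma**2)
        field = np.repeat(field, 2) * w
    return field

eps = lognormal_cascade(levels=14)  # 2**14 = 16384 cells

# Breakdown coefficient: mass of the left half relative to its parent interval
parent = eps.reshape(-1, 2).sum(axis=1)
q_fine = eps[0::2] / parent
# Coarse-grain one level and repeat to obtain the next-scale coefficients
parent2 = parent.reshape(-1, 2).sum(axis=1)
q_coarse = parent[0::2] / parent2

print("mean breakdown coefficient (fine):  ", round(q_fine.mean(), 3))
print("mean breakdown coefficient (coarse):", round(q_coarse.mean(), 3))
```

Although the cascade's multipliers are independent by construction, coefficients obtained by coarse-graining (as above) share underlying weights across scales, which is the mechanism behind the "apparent" scale correlations the abstract describes.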

  13. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, many PKIs (Public Key Infrastructures) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, particularly when deployed over large-scale healthcare networks. In this paper, we propose a PKI to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI addresses the trust issues that arise in a large-scale healthcare network comprising multi-domain PKIs.

  14. Scaling of coercivity in a 3d random anisotropy model

    Energy Technology Data Exchange (ETDEWEB)

    Proctor, T.C., E-mail: proctortc@gmail.com; Chudnovsky, E.M., E-mail: EUGENE.CHUDNOVSKY@lehman.cuny.edu; Garanin, D.A.

    2015-06-15

    The random-anisotropy Heisenberg model is numerically studied on lattices containing over ten million spins. The study is focused on hysteresis and metastability due to topological defects, and is relevant to magnetic properties of amorphous and sintered magnets. We are interested in the limit when ferromagnetic correlations extend beyond the size of the grain inside which the magnetic anisotropy axes are correlated. In that limit the coercive field computed numerically roughly scales as the fourth power of the random anisotropy strength and as the sixth power of the grain size. Theoretical arguments are presented that provide an explanation of numerical results. Our findings should be helpful for designing amorphous and nanosintered materials with desired magnetic properties. - Highlights: • We study the random-anisotropy model on lattices containing up to ten million spins. • Irreversible behavior due to topological defects (hedgehogs) is elucidated. • Hysteresis loop area scales as the fourth power of the random anisotropy strength. • In nanosintered magnets the coercivity scales as the sixth power of the grain size.
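Power-law scalings like the reported fourth-power law are typically extracted as slopes in log-log space. The numbers below are synthetic values generated to follow the quoted H_c ~ D_R⁴ law plus scatter, not simulation output:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical sweep of random-anisotropy strength D_R (arbitrary units):
# coercive fields following the quoted fourth-power law with mild
# multiplicative scatter (illustrative values only)
d_r = np.array([0.5, 0.7, 1.0, 1.4, 2.0])
h_c = 0.03 * d_r**4 * np.exp(0.05 * rng.standard_normal(d_r.size))

# The scaling exponent is the slope of a straight-line fit in log-log space
slope, intercept = np.polyfit(np.log(d_r), np.log(h_c), 1)
print(f"fitted exponent = {slope:.2f}")
```

The same log-log regression applied to coercivity versus grain size would recover the reported sixth-power exponent.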

  15. Modeling and Coordinated Control Strategy of Large Scale Grid-Connected Wind/Photovoltaic/Energy Storage Hybrid Energy Conversion System

    Directory of Open Access Journals (Sweden)

    Lingguo Kong

    2015-01-01

    An AC-linked large-scale wind/photovoltaic (PV)/energy storage (ES) hybrid energy conversion system for grid-connected application was proposed in this paper. The wind energy conversion system (WECS) and the PV generation system are the primary power sources of the hybrid system. The ES system, including battery and fuel cell (FC), is used as a backup and a power regulation unit to ensure continuous power supply and to take care of the intermittent nature of wind and photovoltaic resources. A static synchronous compensator (STATCOM) is employed to support the AC-linked bus voltage and improve the low voltage ride through (LVRT) capability of the proposed system. An overall power coordinated control strategy is designed to manage real-power and reactive-power flows among the different energy sources, the storage unit, and the STATCOM system in the hybrid system. A simulation case study of the large-scale hybrid energy conversion system, carried out on the Western System Coordinating Council (WSCC) 3-machine 9-bus test system, has been developed using the DIgSILENT/Power Factory software platform. The hybrid system performance under different scenarios has been verified by simulation studies using practical load demand profiles and real weather data.

  16. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  17. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  18. Synthesizing large-scale pyroclastic flows: Experimental design, scaling, and first results from PELE

    Science.gov (United States)

    Lube, G.; Breard, E. C. P.; Cronin, S. J.; Jones, J.

    2015-03-01

    Pyroclastic flow eruption large-scale experiment (PELE) is a large-scale facility for experimental studies of pyroclastic density currents (PDCs). It is used to generate high-energy currents involving 500-6500 m³ of natural volcanic material and air that achieve velocities of 7-30 m s⁻¹, flow thicknesses of 2-4.5 m, and runouts of >35 m. The experimental PDCs are synthesized by a controlled "eruption column collapse" of ash-lapilli suspensions onto an instrumented channel. The first set of experiments are documented here and used to elucidate the main flow regimes that influence PDC dynamic structure. Four phases are identified: (1) mixture acceleration during eruption column collapse, (2) column-slope impact, (3) PDC generation, and (4) ash cloud diffusion. The currents produced are fully turbulent flows and scale well to natural PDCs including small to large scales of turbulent transport. PELE is capable of generating short, pulsed, and sustained currents over periods of several tens of seconds, and dilute surge-like PDCs through to highly concentrated pyroclastic flow-like currents. The surge-like variants develop a basal <0.05 m thick regime of saltating/rolling particles and shifting sand waves, capped by a 2.5-4.5 m thick, turbulent suspension that grades upward to lower particle concentrations. Resulting deposits include stratified dunes, wavy and planar laminated beds, and thin ash cloud fall layers. Concentrated currents segregate into a dense basal underflow of <0.6 m thickness that remains aerated. This is capped by an upper ash cloud surge (1.5-3 m thick) with 10⁰ to 10⁻⁴ vol% particles. Their deposits include stratified, massive, normally and reversely graded beds, lobate fronts, and laterally extensive veneer facies beyond channel margins.

  19. Bilateral robotic priming before task-oriented approach in subacute stroke rehabilitation: a pilot randomized controlled trial.

    Science.gov (United States)

    Hsieh, Yu-Wei; Wu, Ching-Yi; Wang, Wei-En; Lin, Keh-Chung; Chang, Ku-Chou; Chen, Chih-Chi; Liu, Chien-Ting

    2017-02-01

    To investigate the treatment effects of bilateral robotic priming combined with the task-oriented approach on motor impairment, disability, daily function, and quality of life in patients with subacute stroke. A randomized controlled trial. Occupational therapy clinics in medical centers. Thirty-one subacute stroke patients were recruited. Participants were randomly assigned to receive bilateral priming combined with the task-oriented approach (i.e., primed group) or the task-oriented approach alone (i.e., unprimed group) for 90 minutes/day, 5 days/week, for 4 weeks. The primed group began with the bilateral priming technique, using a bimanual robot-aided device. Motor impairments were assessed by the Fugl-Meyer Assessment, grip strength, and the Box and Block Test. Disability and daily function were measured by the modified Rankin Scale, the Functional Independence Measure, and actigraphy. Quality of life was examined by the Stroke Impact Scale. The primed and unprimed groups improved significantly on most outcomes over time. The primed group demonstrated significantly better improvement on the Stroke Impact Scale strength subscale (p = 0.012) and a trend for greater improvement on the modified Rankin Scale (p = 0.065) than the unprimed group. Bilateral priming combined with the task-oriented approach elicited more improvement in self-reported strength and degree of disability than the task-oriented approach by itself. Further large-scale research with at least 31 participants in each intervention group is suggested to confirm the study findings.

  20. The relationship between large-scale and convective states in the tropics - Towards an improved representation of convection in large-scale models

    Energy Technology Data Exchange (ETDEWEB)

    Jakob, Christian [Monash Univ., Melbourne, VIC (Australia)

    2015-02-26

    This report summarises an investigation into the relationship of tropical thunderstorms to the atmospheric conditions they are embedded in. The study is based on the use of radar observations at the Atmospheric Radiation Measurement site in Darwin run under the auspices of the DOE Atmospheric Systems Research program. Linking the larger scales of the atmosphere with the smaller scales of thunderstorms is crucial for the development of the representation of thunderstorms in weather and climate models, which is carried out by a process termed parametrisation. Through the analysis of radar and wind profiler observations the project made several fundamental discoveries about tropical storms and quantified the relationship of the occurrence and intensity of these storms to the large-scale atmosphere. We were able to show that the rainfall averaged over an area the size of a typical climate model grid-box is largely controlled by the number of storms in the area, and less so by the storm intensity. This allows us to completely rethink the way we represent such storms in climate models. We also found that storms occur in three distinct categories based on their depth and that the transition between these categories is strongly related to the larger scale dynamical features of the atmosphere more so than its thermodynamic state. Finally, we used our observational findings to test and refine a new approach to cumulus parametrisation which relies on the stochastic modelling of the area covered by different convective cloud types.

  1. An innovative large scale integration of silicon nanowire-based field effect transistors

    Science.gov (United States)

    Legallais, M.; Nguyen, T. T. T.; Mouis, M.; Salem, B.; Robin, E.; Chenevier, P.; Ternon, C.

    2018-05-01

    Since the early 2000s, silicon nanowire field effect transistors have been emerging as ultrasensitive biosensors offering label-free, portable and rapid detection. Nevertheless, their large-scale production remains an ongoing challenge due to time-consuming, complex and costly technology. In order to bypass these issues, we report here on the first integration of silicon nanowire networks, called nanonets, into long-channel field effect transistors using a standard microelectronic process. Special attention is paid to the silicidation of the contacts, which involves a large number of SiNWs. The electrical characteristics of these FETs, constituted by randomly oriented silicon nanowires, are also studied. Compatible integration on the back-end of CMOS readout and promising electrical performance open new opportunities for sensing applications.

  2. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  3. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  4. Large-scale alcohol use and socioeconomic position of origin: longitudinal study from ages 15 to 19 years

    DEFF Research Database (Denmark)

    Andersen, Anette; Holstein, Bjørn E; Due, Pernille

    2008-01-01

    AIM: To examine socioeconomic position (SEP) of origin as predictor of large-scale alcohol use in adolescence. METHODS: The study population was a random sample of 15-year-olds at baseline (n=843) with a first follow-up 4 years later (n=729). Excess alcohol intake was assessed by consumption last...

  5. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo...

  6. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987. Approved for public release. ...arises, to reduce the spread in the LSGT 50% gap value.) The worst charges, such as those with the highest or lowest densities, the largest re-pressed...

  7. Collisionless magnetic reconnection in large-scale electron-positron plasmas

    International Nuclear Information System (INIS)

    Daughton, William; Karimabadi, Homa

    2007-01-01

    One of the most fundamental questions in reconnection physics is how the dynamical evolution will scale to macroscopic systems of physical relevance. This issue is examined for electron-positron plasmas using two-dimensional fully kinetic simulations with both open and periodic boundary conditions. The resulting evolution is complex and highly dynamic throughout the entire duration. The initial phase is distinguished by the coalescence of tearing islands to larger scale while the later phase is marked by the expansion of diffusion regions into elongated current layers that are intrinsically unstable to plasmoid generation. It appears that the repeated formation and ejection of plasmoids plays a key role in controlling the average structure of a diffusion region and preventing the further elongation of the layer. The reconnection rate is modulated in time as the current layers expand and new plasmoids are formed. Although the specific details of this evolution are affected by the boundary and initial conditions, the time averaged reconnection rate remains fast and is remarkably insensitive to the system size for sufficiently large systems. This dynamic scenario offers an alternative explanation for fast reconnection in large-scale systems

  8. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  9. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2018-05-01

    Full Text Available Computing speed is a significant issue for large-scale flood simulations aiming at real-time response for disaster prevention and mitigation. Even today, most large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computation involved. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal computer, a Graphics Processing Unit (GPU)-based, high-performance computing method using OpenACC was adopted to parallelize the shallow water model. An unstructured data management method was presented to control the data transportation between the GPU and CPU (Central Processing Unit) with minimum overhead, and then both computation and data were offloaded from the CPU to the GPU, which exploited the computational capability of the GPU as much as possible. The parallel model was validated using various benchmarks and real-world case studies. The results demonstrate that speed-ups of up to one order of magnitude can be achieved in comparison with the serial model. The proposed parallel model provides a fast and reliable tool with which to quickly assess flood hazards in large-scale areas and thus has bright application prospects for dynamic inundation risk identification and disaster assessment.
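In an explicit finite-volume scheme like the one described above, each cell update reads only neighbouring cells, which is exactly the data-parallel structure that GPU offloading exploits. As a rough, self-contained illustration of that structure (a 1D Lax-Friedrichs shallow water step on a periodic grid in NumPy, not the paper's unstructured Godunov scheme or its OpenACC implementation; all names and values are illustrative):

```python
import numpy as np

G = 9.81  # gravitational acceleration (m/s^2)

def lf_step(h, hu, dx, dt):
    """One Lax-Friedrichs step for 1D shallow water on a periodic grid.

    Each cell update depends only on its two neighbours, so the loop
    body is trivially parallel -- the property a GPU implementation
    (e.g. via OpenACC loop directives) exploits.
    """
    u = hu / h
    f1, f2 = hu, hu * u + 0.5 * G * h**2  # physical fluxes [hu, hu^2/h + g h^2/2]
    # Lax-Friedrichs numerical flux at the i+1/2 interfaces
    flux1 = 0.5 * (f1 + np.roll(f1, -1)) - 0.5 * dx / dt * (np.roll(h, -1) - h)
    flux2 = 0.5 * (f2 + np.roll(f2, -1)) - 0.5 * dx / dt * (np.roll(hu, -1) - hu)
    # conservative update: U_i -= dt/dx * (F_{i+1/2} - F_{i-1/2})
    h_new = h - dt / dx * (flux1 - np.roll(flux1, 1))
    hu_new = hu - dt / dx * (flux2 - np.roll(flux2, 1))
    return h_new, hu_new

# smooth free-surface bump over a flat bed, fluid initially at rest
x = np.linspace(0.0, 100.0, 200, endpoint=False)
h = 1.0 + 0.1 * np.exp(-0.01 * (x - 50.0) ** 2)
hu = np.zeros_like(x)
for _ in range(100):  # CFL ~ 0.33 for these parameters
    h, hu = lf_step(h, hu, dx=0.5, dt=0.05)
```

Because the update is written in flux-difference form, total water volume is conserved to rounding error on the periodic domain, which is a convenient sanity check for any parallelized version.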

  10. Quantity and quality assessment of randomized controlled trials on orthodontic practice in PubMed.

    Science.gov (United States)

    Shimada, Tatsuo; Takayama, Hisako; Nakamura, Yoshiki

    2010-07-01

    To find current high-quality evidence for orthodontic practice within a reasonable time, we tested the performance of a PubMed search. PubMed was searched using the publication type "randomized controlled trial" and the medical subject heading term "orthodontics" for articles published between 2003 and 2007. The PubMed search results were compared with those from a hand search of four orthodontic journals to determine the sensitivity of the PubMed search. We evaluated the precision of the PubMed search results and assessed the quality of individual randomized controlled trials using the Jadad scale. Sensitivity and precision were 97.46% and 58.12%, respectively. In PubMed, of the 277 articles retrieved, 161 (58.12%) were randomized controlled trials on orthodontic practice, and 115 of the 161 articles (71.42%) were published in four orthodontic journals: American Journal of Orthodontics and Dentofacial Orthopedics, The Angle Orthodontist, the European Journal of Orthodontics, and the Journal of Orthodontics. Assessment by the Jadad scale revealed 60 high-quality randomized controlled trials on orthodontic practice, of which 45 (75%) were published in these four journals. PubMed is a highly desirable search engine for evidence-based orthodontic practice. To stay current and get high-quality evidence, it is reasonable to look through four orthodontic journals: American Journal of Orthodontics and Dentofacial Orthopedics, The Angle Orthodontist, the European Journal of Orthodontics, and the Journal of Orthodontics.
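The reported retrieval metrics follow directly from the counts in the abstract: precision is the fraction of retrieved articles that are relevant, sensitivity the fraction of all relevant articles that were retrieved. A quick check (note: the denominator of 118 hand-searched trials is inferred from the reported 97.46% figure, not stated in the abstract):

```python
def precision(relevant_retrieved, total_retrieved):
    # fraction of retrieved articles that are actually relevant RCTs
    return relevant_retrieved / total_retrieved

def sensitivity(relevant_retrieved, total_relevant):
    # fraction of all relevant RCTs that the search retrieved
    return relevant_retrieved / total_relevant

# 161 of the 277 PubMed hits were orthodontic RCTs
print(f"{100 * precision(161, 277):.2f}%")    # 58.12%
# 115 retrieved RCTs vs. an inferred 118 found by hand search
print(f"{100 * sensitivity(115, 118):.2f}%")  # 97.46%
```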

  11. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    Large-scale ringing schemes (e.g. Peach et al., 1998; DeSante et al., 2001) are generally co-ordinated by ringing centres such as those that make up the membership of EURING. In some countries volunteer census work (often called Breeding Bird Surveys) is undertaken by the same organizations, while in others different bodies may co-ordinate this aspect of the work. This session was concerned with the analysis of such extensive data sets and the approaches that are being developed to address the key theoretical and applied issues outlined above. The papers reflect the development of more spatially explicit approaches to analyses of data gathered at large spatial scales. They show that while the statistical tools that have been developed in recent years can be used to derive useful biological conclusions from such data, there is a need for further developments. Future work should also consider how best to implement such analytical developments within future study designs. In his plenary paper Andy Royle (Royle, 2004) addresses this theme directly by describing a general framework for modelling spatially replicated abundance data. The approach is based on the idea that a set of spatially referenced local populations constitutes a metapopulation, within which local abundance is determined as a random process. This provides an elegant and general approach in which the metapopulation model as described above is combined with a data-generating model specific to the type of data being analysed to define a simple hierarchical model that can be analysed using conventional methods. It should be noted, however, that further software development will be needed if the approach is to be made readily available to biologists. The approach is well suited to dealing with sparse data and avoids the need for data aggregation prior to analysis. Spatial synchrony has received most attention in studies of species whose populations show cyclic fluctuations, particularly certain game birds and small mammals.

  12. Large-scale innovation and change in UK higher education

    Directory of Open Access Journals (Sweden)

    Stephen Brown

    2013-09-01

    Full Text Available This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ technology to deliver such changes. Key lessons that emerged from these experiences are reviewed covering themes of pervasiveness, unofficial systems, project creep, opposition, pressure to deliver, personnel changes and technology issues. The paper argues that collaborative approaches to project management offer greater prospects of effective large-scale change in universities than either management-driven top-down or more champion-led bottom-up methods. It also argues that while some diminution of control over project outcomes is inherent in this approach, this is outweighed by potential benefits of lasting and widespread adoption of agreed changes.

  13. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters at the scale of 1000s of processors, to be used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community

  14. Base Station Placement Algorithm for Large-Scale LTE Heterogeneous Networks.

    Science.gov (United States)

    Lee, Seungseob; Lee, SuKyoung; Kim, Kyungsoo; Kim, Yoon Hyuk

    2015-01-01

    Data traffic demands in cellular networks today are increasing at an exponential rate, giving rise to the development of heterogeneous networks (HetNets), in which small cells complement traditional macro cells by extending coverage to indoor areas. However, the deployment of small cells as parts of HetNets requires careful network planning by operators. In particular, massive and unplanned deployment of base stations can cause high interference, severely degrading network performance. Although different mathematical modeling and optimization methods have been used to approach various problems related to this issue, most traditional network planning models are ill-equipped to deal with HetNet-specific characteristics due to their focus on classical cellular network designs. Furthermore, increased wireless data demands have driven mobile operators to roll out large-scale networks of small long term evolution (LTE) cells. Therefore, in this paper, we aim to derive an optimum network planning algorithm for large-scale LTE HetNets. Recently, attempts have been made to apply evolutionary algorithms (EAs) to the field of radio network planning, since they are characterized as global optimization methods. Yet, EA performance often deteriorates rapidly with the growth of search space dimensionality. To overcome this limitation when designing optimum network deployments for large-scale LTE HetNets, we attempt to decompose the problem and tackle its subcomponents individually. Particularly noting that some HetNet cells have strong correlations due to inter-cell interference, we propose a correlation grouping approach in which cells are grouped together according to their mutual interference. Both the simulation and analytical results indicate that the proposed solution outperforms the random-grouping based EA as well as an EA that detects interacting variables by monitoring the changes in the objective function algorithm in terms of system
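The correlation-grouping idea, cells that interfere strongly being optimized together, can be sketched as a connected-components pass over a thresholded interference matrix. A toy illustration (the matrix values, threshold, and grouping rule here are illustrative stand-ins, not the paper's exact procedure):

```python
from collections import deque

def interference_groups(interference, threshold):
    """Group cell indices whose mutual interference exceeds `threshold`.

    Cells linked directly or transitively by strong interference land in
    the same group, so an EA can then optimize each group independently,
    shrinking the effective search-space dimensionality.
    """
    n = len(interference)
    seen, groups = set(), []
    for start in range(n):
        if start in seen:
            continue
        group, queue = [], deque([start])
        seen.add(start)
        while queue:  # BFS over the thresholded interference graph
            i = queue.popleft()
            group.append(i)
            for j in range(n):
                if j not in seen and interference[i][j] > threshold:
                    seen.add(j)
                    queue.append(j)
        groups.append(sorted(group))
    return groups

# symmetric pairwise interference between 5 small cells (made-up values)
I = [[0.0, 0.9, 0.1, 0.0, 0.0],
     [0.9, 0.0, 0.8, 0.0, 0.0],
     [0.1, 0.8, 0.0, 0.0, 0.0],
     [0.0, 0.0, 0.0, 0.0, 0.7],
     [0.0, 0.0, 0.0, 0.7, 0.0]]
print(interference_groups(I, threshold=0.5))  # [[0, 1, 2], [3, 4]]
```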

  15. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    , the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large...... sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land...... commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role...

  16. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  17. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  18. Preference towards Control in Risk Taking: Control, No Control, or Randomize?

    OpenAIRE

    Li, King King

    2010-01-01

    This paper experimentally investigates preference towards different methods of control in risk taking. Participants are asked to choose between different ways of choosing which numbers to bet on for a gamble. They can choose the numbers themselves (control), let the experimenter choose (no control), or randomize. It is found that, in addition to the more conventional preference for control, some participants prefer no control or randomization. These preferences are robust as participants...

  19. Efficacy of a medical food in mild Alzheimer's disease: A randomized, controlled trial.

    Science.gov (United States)

    Scheltens, Philip; Kamphuis, Patrick J G H; Verhey, Frans R J; Olde Rikkert, Marcel G M; Wurtman, Richard J; Wilkinson, David; Twisk, Jos W R; Kurz, Alexander

    2010-01-01

    To investigate the effect of a medical food on cognitive function in people with mild Alzheimer's disease (AD). A total of 225 drug-naïve AD patients participated in this randomized, double-blind controlled trial. Patients were randomized to active product, Souvenaid, or a control drink, taken once-daily for 12 weeks. Primary outcome measures were the delayed verbal recall task of the Wechsler Memory Scale-revised, and the 13-item modified Alzheimer's Disease Assessment Scale-cognitive subscale at week 12. At 12 weeks, significant improvement in the delayed verbal recall task was noted in the active group compared with control (P = .021). Modified Alzheimer's Disease Assessment Scale-cognitive subscale and other outcome scores (e.g., Clinician Interview Based Impression of Change plus Caregiver Input, 12-item Neuropsychiatric Inventory, Alzheimer's disease Co-operative Study-Activities of Daily Living, Quality of Life in Alzheimer's Disease) were unchanged. The control group neither deteriorated nor improved. Compliance was excellent (95%) and the product was well tolerated. Supplementation with a medical food including phosphatide precursors and cofactors for 12 weeks improved memory (delayed verbal recall) in mild AD patients. This proof-of-concept study justifies further clinical trials. 2010 The Alzheimer's Association. All rights reserved.

  20. Scaling behaviour of randomly alternating surface growth processes

    International Nuclear Information System (INIS)

    Raychaudhuri, Subhadip; Shapir, Yonathan

    2002-01-01

    The scaling properties of the roughness of surfaces grown by two different processes randomly alternating in time are addressed. The duration of each application of the two primary processes is assumed to be independently drawn from given distribution functions. We analytically address processes in which the two primary processes are linear and extend the conclusions to nonlinear processes as well. The growth scaling exponent of the average roughness with the number of applications is found to be determined by the long time tail of the distribution functions. For processes in which both mean application times are finite, the scaling behaviour follows that of the corresponding cyclical process in which the uniform application time of each primary process is given by its mean. If the distribution functions decay with a small enough power law for the mean application times to diverge, the growth exponent is found to depend continuously on this power-law exponent. In contrast, the roughness exponent does not depend on the timing of the applications. The analytical results are supported by numerical simulations of various pairs of primary processes and with different distribution functions. Self-affine surfaces grown by two randomly alternating processes are common in nature (e.g., due to randomly changing weather conditions) and in man-made devices such as rechargeable batteries

  1. Cerebral methodology based computing to estimate real phenomena from large-scale nuclear simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2011-01-01

    Our final goal is to estimate real phenomena from large-scale nuclear simulations by using computing processes. Large-scale simulations here are those whose scale variety and physical complexity mean that corresponding experiments and/or theories do not exist. In the nuclear field, it is indispensable to estimate real phenomena from simulations in order to improve the safety and security of nuclear power plants. Here, the analysis of uncertainty included in simulations is needed to reveal the sensitivity to uncertainty due to randomness, to reduce the uncertainty due to lack of knowledge, and to arrive at a degree of certainty through verification and validation (V and V) and uncertainty quantification (UQ) processes. To realize this, we propose 'Cerebral Methodology based Computing (CMC)' as a set of computing processes with deductive and inductive approaches modelled on human reasoning processes. Our idea is to execute deductive and inductive simulations contrasted with deductive and inductive approaches. We have established a prototype system and applied it to a thermal displacement analysis of a nuclear power plant. The result shows that our idea is effective in reducing the uncertainty and obtaining a degree of certainty. (author)

  2. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in these regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space as obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced-dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convective activity. We investigate for individual sites the exceedance probability with which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large-scale).

  3. Water hammer and column separation due to accidental simultaneous closure of control valves in a large scale two-phase flow experimental test rig

    NARCIS (Netherlands)

    Bergant, A.; Westende, van 't J.M.C.; Koppel, T.; Gale, J.; Hou, Q.; Pandula, Z.; Tijsseling, A.S.

    2010-01-01

    A large-scale pipeline test rig at Deltares, Delft, The Netherlands has been used for filling and emptying experiments. Tests have been conducted in a horizontal 250 mm diameter PVC pipe of 258 m length with control valves at the downstream and upstream ends. This paper investigates the accidental

  4. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  5. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylemethacrylate; (2) gas dynamic and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamic and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10μ) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  6. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  7. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    12 pages, 2 figures This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  8. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed; Elsawy, Hesham; Gharbieh, Mohammad; Alouini, Mohamed-Slim; Adinoyi, Abdulkareem; Alshaalan, Furaih

    2017-01-01

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end

  9. A Decentralized Multivariable Robust Adaptive Voltage and Speed Regulator for Large-Scale Power Systems

    Science.gov (United States)

    Okou, Francis A.; Akhrif, Ouassima; Dessaint, Louis A.; Bouchard, Derrick

    2013-05-01

    This paper introduces a decentralized multivariable robust adaptive voltage and frequency regulator to ensure the stability of large-scale interconnected generators. Interconnection parameters (i.e., load, line and transformer parameters) are assumed to be unknown. The proposed design approach requires the reformulation of conventional power system models into a multivariable model with generator terminal voltages as state variables, and excitation and turbine valve inputs as control signals. This model, while suitable for the application of modern control methods, introduces problems with regard to current design techniques for large-scale systems. Interconnection terms, which are treated as perturbations, do not meet the common matching condition assumption. A new adaptive method for a certain class of large-scale systems is therefore introduced that does not require the matching condition. The proposed controller consists of nonlinear inputs that cancel some nonlinearities of the model. Auxiliary controls with linear and nonlinear components are used to stabilize the system. They compensate for unknown parameters of the model by updating both the nonlinear component gains and excitation parameters. The adaptation algorithms involve the sigma-modification approach for auxiliary control gains, and the projection approach for excitation parameters to prevent estimation drift. The computation of the matrix gain of the controller's linear component requires the resolution of an algebraic Riccati equation and helps to solve the perturbation-mismatching problem. A realistic power system is used to assess the proposed controller's performance. The results show that both stability and transient performance are considerably improved following a severe contingency.
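The algebraic Riccati equation mentioned for the controller's linear matrix gain can be solved with standard numerical routines. A generic sketch using SciPy (the 2-state system matrices below are arbitrary placeholders for illustration, not the paper's power-system model):

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# placeholder 2-state, 1-input linear system: x' = A x + B u
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)  # state weighting
R = np.eye(1)  # control weighting

# solve the continuous algebraic Riccati equation:
#   A'X + XA - XB R^{-1} B'X + Q = 0
X = solve_continuous_are(A, B, Q, R)

# resulting state-feedback matrix gain, u = -K x
K = np.linalg.solve(R, B.T @ X)
```

Substituting X back into the equation gives a near-zero residual, and the closed-loop matrix A - BK has eigenvalues in the left half-plane, which is the stabilizing property such a gain is computed for.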

  10. Stabilization of Continuous-Time Random Switching Systems via a Fault-Tolerant Controller

    Directory of Open Access Journals (Sweden)

    Guoliang Wang

    2017-01-01

    Full Text Available This paper focuses on the stabilization problem of continuous-time random switching systems via a fault-tolerant controller, where the dwell time of each subsystem consists of a fixed part and a random part. With traditional design methods, the computational complexity of the LMIs grows with the number of fault combinations and becomes very large, particularly when the system dimension or the number of subsystems is large. In order to reduce the number of fault combinations used, new sufficient LMI conditions for designing such a controller are established by a robust approach; these conditions are fault-free and can be solved directly. Moreover, fault-tolerant stabilization realized by a mode-independent controller is considered and applied to a practical case without mode information. Finally, a numerical example is used to demonstrate the effectiveness and superiority of the proposed methods.

  11. Conference on Large Scale Optimization

    CERN Document Server

    Hearn, D; Pardalos, P

    1994-01-01

    On February 15-17, 1993, a conference on Large Scale Optimization, hosted by the Center for Applied Optimization, was held at the University of Florida. The conference was supported by the National Science Foundation, the U.S. Army Research Office, and the University of Florida, with endorsements from SIAM, MPS, ORSA and IMACS. Forty-one invited speakers presented papers on mathematical programming and optimal control topics with an emphasis on algorithm development, real world applications and numerical results. Participants from Canada, Japan, Sweden, The Netherlands, Germany, Belgium, Greece, and Denmark gave the meeting an important international component. Attendees also included representatives from IBM, American Airlines, US Air, United Parcel Service, AT&T Bell Labs, Thinking Machines, Army High Performance Computing Research Center, and Argonne National Laboratory. In addition, the NSF sponsored attendance of thirteen graduate students from universities in the United States and abro...

  12. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In future power systems with additional wind power capacity there will be an increased need for large-scale power management as well as reliable balancing and reserve capabilities. Different technologies for large-scale electricity storage provide solutions to the different challenges arising w...

  13. Large scale replacement of fuel channels in the Pickering CANDU reactor using a man-in-the-loop remote control system

    International Nuclear Information System (INIS)

    Stratton, D.

    1991-01-01

    Spar Aerospace Limited of Toronto is presently under contract to Ontario Hydro to design a Remote Manipulation and Control System (RMCS) to be used during the large scale replacement of the fuel channels in the Pickering A Nuclear Generating Station. The system is designed to support the replacement of all 390 fuel channels in each of the four reactors at the Pickering A station in a safe manner that minimizes worker radiation exposure and unit outage time

  14. Radiations: large scale monitoring in Japan

    International Nuclear Information System (INIS)

    Linton, M.; Khalatbari, A.

    2011-01-01

    As the consequences of radioactive leaks on their health are a matter of concern for Japanese people, a large-scale epidemiological study has been launched by Fukushima Medical University. It concerns the two million inhabitants of the Fukushima Prefecture. On the national level and with the support of public funds, medical care and follow-up, as well as systematic controls, are foreseen, notably to check the thyroids of 360,000 young people under 18 years old and of 20,000 pregnant women in the Fukushima Prefecture. Some measurements have already been performed on young children. Despite the sometimes rather low measured values, and because they know that some parts of the area are at least as contaminated as the area around Chernobyl was, some people are reluctant to go back home

  15. After-School Multifamily Groups: A Randomized Controlled Trial Involving Low-Income, Urban, Latino Children

    Science.gov (United States)

    McDonald, Lynn; Moberg, D. Paul; Brown, Roger; Rodriguez-Espiricueta, Ismael; Flores, Nydia I.; Burke, Melissa P.; Coover, Gail

    2006-01-01

    This randomized controlled trial evaluated a culturally representative parent engagement strategy with Latino parents of elementary school children. Ten urban schools serving low-income children from mixed cultural backgrounds participated in a large study. Classrooms were randomly assigned either to an after-school, multifamily support…

  16. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representative scaling concepts that play a significant role in the study of complexity science. Their coexistence motivates different understandings of the dependence between the two scalings, which has hardly been clarified. In this article, we observe an evolution process of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at the initial time, while a crossover comes with the emergence of their inconsistency at later times, before a stable state is reached in which Heaps' law persists while strict Zipf's law disappears. Such findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results for pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of the disease. Employing United States domestic air transportation and demographic data to construct a metapopulation model simulating pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help us understand the temporal evolution of scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and that the heterogeneity of epidemic spread underlines the significance of performing targeted containment strategies at the early stage of a pandemic disease.
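
    The interplay between the two laws can be illustrated numerically: sampling tokens i.i.d. from a Zipf-distributed population and tracking vocabulary growth reproduces Heaps' sublinear scaling V(n) ~ n^beta, with beta ≈ 1/alpha for a Zipf exponent alpha > 1. The sketch below is a minimal, hypothetical illustration of that relation only; it does not reproduce the record's metapopulation epidemic model, and all parameters (alpha = 2, one million types) are arbitrary choices.

```python
import bisect
import math
import random

def zipf_sampler(n_types, alpha, seed=0):
    """Return a function sampling type indices with P(k) proportional to k^-alpha."""
    rng = random.Random(seed)
    cum, total = [], 0.0
    for k in range(1, n_types + 1):
        total += k ** -alpha
        cum.append(total)
    return lambda: bisect.bisect_left(cum, rng.random() * total)

def heaps_exponent(n_tokens=50000, n_types=10**6, alpha=2.0):
    """Estimate the Heaps exponent beta in V(n) ~ n^beta by a two-point
    log-log fit of vocabulary size V against token count n."""
    sample = zipf_sampler(n_types, alpha)
    seen, checkpoints = set(), {}
    for n in range(1, n_tokens + 1):
        seen.add(sample())
        if n in (n_tokens // 10, n_tokens):
            checkpoints[n] = len(seen)
    (n1, v1), (n2, v2) = sorted(checkpoints.items())
    return math.log(v2 / v1) / math.log(n2 / n1)

beta = heaps_exponent()
print(f"estimated Heaps exponent: {beta:.2f}")  # theory predicts ~1/alpha = 0.5
```

    Note that the i.i.d. sampling here lacks the temporal crossover the abstract describes; it only shows how a Zipf-distributed source already implies Heaps-type sublinear vocabulary growth.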

  17. Coordinated SLNR based Precoding in Large-Scale Heterogeneous Networks

    KAUST Repository

    Boukhedimi, Ikram

    2017-03-06

    This work focuses on the downlink of large-scale two-tier heterogeneous networks composed of a macro-cell overlaid by micro-cell networks. Our interest is on the design of coordinated beamforming techniques that allow to mitigate the inter-cell interference. Particularly, we consider the case in which the coordinating base stations (BSs) have imperfect knowledge of the channel state information. Under this setting, we propose a regularized SLNR based precoding design in which the regularization factor is used to allow better resilience with respect to the channel estimation errors. Based on tools from random matrix theory, we provide an analytical analysis of the SINR and SLNR performances. These results are then exploited to propose a proper setting of the regularization factor. Simulation results are finally provided in order to validate our findings and to confirm the performance of the proposed precoding scheme.
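
    The structure of the regularized SLNR precoder can be sketched compactly: for user k with channel h_k, take w_k proportional to (sum_{j≠k} h_j h_j^H + alpha I)^{-1} h_k, where alpha is the regularization factor. The toy example below, with made-up dimensions and a Gaussian channel, checks that with alpha equal to the noise power this precoder attains at least the SLNR of a plain matched filter; it is a minimal sketch under those assumptions, not the record's large-system analysis or its random-matrix-based tuning of alpha.

```python
import numpy as np

def slnr(w, k, H, sigma2):
    """SLNR of a unit-norm precoder w for user k; columns of H are channels."""
    sig = abs(H[:, k].conj() @ w) ** 2
    leak = sum(abs(H[:, j].conj() @ w) ** 2
               for j in range(H.shape[1]) if j != k)
    return sig / (sigma2 + leak)

def slnr_precoder(k, H, alpha):
    """Regularized SLNR precoder: w_k ~ (sum_{j!=k} h_j h_j^H + alpha*I)^-1 h_k."""
    M, K = H.shape
    A = alpha * np.eye(M, dtype=complex)
    for j in range(K):
        if j != k:
            A += np.outer(H[:, j], H[:, j].conj())
    w = np.linalg.solve(A, H[:, k])
    return w / np.linalg.norm(w)

rng = np.random.default_rng(0)
M, K, sigma2 = 8, 4, 0.1   # hypothetical antennas, users, noise power
H = (rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))) / np.sqrt(2)

for k in range(K):
    w_mf = H[:, k] / np.linalg.norm(H[:, k])   # matched-filter baseline
    w_sl = slnr_precoder(k, H, alpha=sigma2)   # alpha = sigma2 is SLNR-optimal
    assert slnr(w_sl, k, H, sigma2) + 1e-9 >= slnr(w_mf, k, H, sigma2)
print("SLNR precoder matches or beats the matched filter for all users")
```

    With alpha set exactly to the noise power, the closed form above is the true SLNR maximizer (the numerator is rank one, so the generalized eigenproblem collapses to a linear solve); larger alpha trades SLNR for robustness to channel estimation errors, which is the trade-off the record studies.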

  18. Coordinated SLNR based Precoding in Large-Scale Heterogeneous Networks

    KAUST Repository

    Boukhedimi, Ikram; Kammoun, Abla; Alouini, Mohamed-Slim

    2017-01-01

    This work focuses on the downlink of large-scale two-tier heterogeneous networks composed of a macro-cell overlaid by micro-cell networks. Our interest is on the design of coordinated beamforming techniques that allow to mitigate the inter-cell interference. Particularly, we consider the case in which the coordinating base stations (BSs) have imperfect knowledge of the channel state information. Under this setting, we propose a regularized SLNR based precoding design in which the regularization factor is used to allow better resilience with respect to the channel estimation errors. Based on tools from random matrix theory, we provide an analytical analysis of the SINR and SLNR performances. These results are then exploited to propose a proper setting of the regularization factor. Simulation results are finally provided in order to validate our findings and to confirm the performance of the proposed precoding scheme.

  19. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.

  20. A novel adaptive synchronization control of a class of master-slave large-scale systems with unknown channel time-delay

    Science.gov (United States)

    Shen, Qikun; Zhang, Tianping

    2015-05-01

    The paper addresses a practical issue of adaptive synchronization in master-slave large-scale systems with constant channel time-delay. A novel adaptive synchronization control scheme is proposed to guarantee that the synchronization errors asymptotically converge to the origin, without the matching condition assumed in the related literature. The true value of the channel time-delay is estimated online by a proper adaptation mechanism, which removes the requirement in existing works that the channel time-delay be known exactly. Finally, simulation results demonstrate the effectiveness of the approach.

  1. The Effect of India's Total Sanitation Campaign on Defecation Behaviors and Child Health in Rural Madhya Pradesh: A Cluster Randomized Controlled Trial

    Science.gov (United States)

    Patil, Sumeet R.; Arnold, Benjamin F.; Salvatore, Alicia L.; Briceno, Bertha; Ganguly, Sandipan; Colford, John M.; Gertler, Paul J.

    2014-01-01

    Background Poor sanitation is thought to be a major cause of enteric infections among young children. However, there are no previously published randomized trials to measure the health impacts of large-scale sanitation programs. India's Total Sanitation Campaign (TSC) is one such program that seeks to end the practice of open defecation by changing social norms and behaviors, and providing technical support and financial subsidies. The objective of this study was to measure the effect of the TSC implemented with capacity building support from the World Bank's Water and Sanitation Program in Madhya Pradesh on availability of individual household latrines (IHLs), defecation behaviors, and child health (diarrhea, highly credible gastrointestinal illness [HCGI], parasitic infections, anemia, growth). Methods and Findings We conducted a cluster-randomized, controlled trial in 80 rural villages. Field staff collected baseline measures of sanitation conditions, behaviors, and child health (May–July 2009), and revisited households 21 months later (February–April 2011) after the program was delivered. The study enrolled a random sample of 5,209 children <5 years old from 3,039 households that had at least one child <24 months at the beginning of the study. A random subsample of 1,150 children <24 months at enrollment were tested for soil transmitted helminth and protozoan infections in stool. The randomization successfully balanced intervention and control groups, and we estimated differences between groups in an intention to treat analysis. The intervention increased the percentage of households in a village with improved sanitation facilities as defined by the WHO/UNICEF Joint Monitoring Programme by an average of 19% (95% CI for difference: 12%–26%; group means: 22% control versus 41% intervention), and decreased open defecation among adults by an average of 10% (95% CI for difference: 4%–15%; group means: 73% intervention versus 84% control). However, the intervention

  2. The effect of India's total sanitation campaign on defecation behaviors and child health in rural Madhya Pradesh: a cluster randomized controlled trial.

    Directory of Open Access Journals (Sweden)

    Sumeet R Patil

    2014-08-01

    Full Text Available Poor sanitation is thought to be a major cause of enteric infections among young children. However, there are no previously published randomized trials to measure the health impacts of large-scale sanitation programs. India's Total Sanitation Campaign (TSC) is one such program that seeks to end the practice of open defecation by changing social norms and behaviors, and providing technical support and financial subsidies. The objective of this study was to measure the effect of the TSC implemented with capacity building support from the World Bank's Water and Sanitation Program in Madhya Pradesh on availability of individual household latrines (IHLs), defecation behaviors, and child health (diarrhea, highly credible gastrointestinal illness [HCGI], parasitic infections, anemia, growth). We conducted a cluster-randomized, controlled trial in 80 rural villages. Field staff collected baseline measures of sanitation conditions, behaviors, and child health (May–July 2009), and revisited households 21 months later (February–April 2011) after the program was delivered. The study enrolled a random sample of 5,209 children <5 years old from 3,039 households that had at least one child <24 months at the beginning of the study. A random subsample of 1,150 children <24 months at enrollment were tested for soil transmitted helminth and protozoan infections in stool. The randomization successfully balanced intervention and control groups, and we estimated differences between groups in an intention to treat analysis. The intervention increased the percentage of households in a village with improved sanitation facilities as defined by the WHO/UNICEF Joint Monitoring Programme by an average of 19% (95% CI for difference: 12%–26%; group means: 22% control versus 41% intervention), and decreased open defecation among adults by an average of 10% (95% CI for difference: 4%–15%; group means: 73% intervention versus 84% control). However, the intervention did not improve child health

  3. Applications of random forest feature selection for fine-scale genetic population assignment.

    Science.gov (United States)

    Sylvester, Emma V A; Bentzen, Paul; Bradbury, Ian R; Clément, Marie; Pearce, Jon; Horne, John; Beiko, Robert G

    2018-02-01

    Genetic population assignment used to inform wildlife management and conservation efforts requires panels of highly informative genetic markers and sensitive assignment tests. We explored the utility of machine-learning algorithms (random forest, regularized random forest and guided regularized random forest) compared with F_ST ranking for selection of single nucleotide polymorphisms (SNPs) for fine-scale population assignment. We applied these methods to an unpublished SNP data set for Atlantic salmon (Salmo salar) and a published SNP data set for Alaskan Chinook salmon (Oncorhynchus tshawytscha). In each species, we identified the minimum panel size required to obtain a self-assignment accuracy of at least 90%, using each method to create panels of 50-700 markers. Panels of SNPs identified using random forest-based methods performed up to 7.8 and 11.2 percentage points better than F_ST-selected panels of similar size for the Atlantic salmon and Chinook salmon data, respectively. Self-assignment accuracy ≥90% was obtained with panels of 670 and 384 SNPs for each data set, respectively, a level of accuracy never reached for these species using F_ST-selected panels. Our results demonstrate a role for machine-learning approaches in marker selection across large genomic data sets to improve assignment for management and conservation of exploited populations.
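
    As a minimal sketch of the panel-selection workflow (illustrating only the F_ST-ranking baseline, not the record's random-forest methods), the toy example below ranks synthetic SNPs by a simplified two-population F_ST, keeps the top-ranked panel, and measures self-assignment accuracy with a naive binomial likelihood classifier. All data, allele frequencies, and panel sizes are invented for illustration.

```python
import math
import random

rng = random.Random(42)
N_SNPS, N_INFO, N_IND = 100, 10, 60   # 10 informative SNPs with divergent freqs
freqs = {"pop1": [0.2] * N_INFO + [0.5] * (N_SNPS - N_INFO),
         "pop2": [0.8] * N_INFO + [0.5] * (N_SNPS - N_INFO)}

def genotype(p):  # 0/1/2 copies of the reference allele, diploid
    return (rng.random() < p) + (rng.random() < p)

inds = [(pop, [genotype(freqs[pop][s]) for s in range(N_SNPS)])
        for pop in ("pop1", "pop2") for _ in range(N_IND)]

def allele_freq(pop, s):
    g = [geno[s] for p, geno in inds if p == pop]
    return sum(g) / (2 * len(g))

def fst(s):  # simplified two-population F_ST-style divergence statistic
    p1, p2 = allele_freq("pop1", s), allele_freq("pop2", s)
    pbar = (p1 + p2) / 2
    return 0.0 if pbar in (0.0, 1.0) else (p1 - p2) ** 2 / (2 * pbar * (1 - pbar))

panel = sorted(range(N_SNPS), key=fst, reverse=True)[:N_INFO]

def log_lik(geno, pop):  # binomial(2, p) genotype likelihood over the panel
    ll = 0.0
    for s in panel:
        p = min(max(allele_freq(pop, s), 1e-3), 1 - 1e-3)
        ll += math.log([(1 - p) ** 2, 2 * p * (1 - p), p * p][geno[s]])
    return ll

correct = sum(max(("pop1", "pop2"), key=lambda q: log_lik(geno, q)) == pop
              for pop, geno in inds)
accuracy = correct / len(inds)
print(f"self-assignment accuracy with top-{N_INFO} F_ST panel: {accuracy:.2f}")
```

    In the study, panels chosen by random-forest importance outperformed this kind of F_ST ranking; here the baseline alone suffices to show how marker ranking and self-assignment fit together.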

  4. Distributed Model Predictive Control over Multiple Groups of Vehicles in Highway Intelligent Space for Large Scale System

    Directory of Open Access Journals (Sweden)

    Tang Xiaofeng

    2014-01-01

    Full Text Available The paper presents three time-based warning distances for safe driving of a large-scale system of multiple groups of vehicles in a highway tunnel environment, based on a distributed model predictive control approach. The system comprises two parts. First, the vehicles are divided into multiple groups, and the distributed model predictive control approach is used to compute the information framework of each group. Each group's optimization objective considers both local optimization and the optimization characteristics of neighboring subgroups, which ensures global optimization performance. Second, the three time-based warning distances are derived from the basic principles of highway intelligent space (HIS), and the information framework concept is applied to the multiple groups of vehicles. A mathematical model is built to avoid chain collisions between vehicles. The results demonstrate that the proposed highway intelligent space method can effectively ensure the driving safety of multiple groups of vehicles in fog, rain, or snow.

  5. Development of a Shipboard Remote Control and Telemetry Experimental System for Large-Scale Model's Motions and Loads Measurement in Realistic Sea Waves.

    Science.gov (United States)

    Jiao, Jialong; Ren, Huilong; Adenya, Christiaan Adika; Chen, Chaohe

    2017-10-29

    Wave-induced motion and load responses are important criteria for ship performance evaluation. Physical experiments have long been an indispensable tool in predicting a ship's navigation state, speed, motions, accelerations, sectional loads, and wave impact pressure. Currently, the majority of such experiments are conducted in laboratory tanks, where the wave environment differs from realistic sea waves. In this paper, a laboratory tank testing system for ship motion and load measurement is reviewed first. Then, a novel large-scale model measurement technique is developed on this foundation to obtain accurate motion and load responses of ships in realistic sea conditions. For this purpose, a suite of advanced remote control and telemetry experimental systems was developed in-house to allow large-scale model seakeeping measurements at sea. The experimental system includes a series of sensors, e.g., a Global Positioning System/Inertial Navigation System (GPS/INS) module, course top, optical fiber sensors, strain gauges, pressure sensors, and accelerometers. The developed measurement system was tested in field experiments in coastal seas, which indicate that the proposed large-scale model testing scheme is capable and feasible. Meaningful data, including ocean environment parameters, ship navigation state, motions, and loads, were obtained through the sea trial campaign.

  6. Bioinspired large-scale aligned porous materials assembled with dual temperature gradients.

    Science.gov (United States)

    Bai, Hao; Chen, Yuan; Delattre, Benjamin; Tomsia, Antoni P; Ritchie, Robert O

    2015-12-01

    Natural materials, such as bone, teeth, shells, and wood, exhibit outstanding properties despite being porous and made of weak constituents. Frequently, they represent a source of inspiration to design strong, tough, and lightweight materials. Although many techniques have been introduced to create such structures, a long-range order of the porosity as well as a precise control of the final architecture remain difficult to achieve. These limitations severely hinder the scale-up fabrication of layered structures aimed for larger applications. We report on a bidirectional freezing technique to successfully assemble ceramic particles into scaffolds with large-scale aligned, lamellar, porous, nacre-like structure and long-range order at the centimeter scale. This is achieved by modifying the cold finger with a polydimethylsiloxane (PDMS) wedge to control the nucleation and growth of ice crystals under dual temperature gradients. Our approach could provide an effective way of manufacturing novel bioinspired structural materials, in particular advanced materials such as composites, where a higher level of control over the structure is required.

  7. Double inflation: A possible resolution of the large-scale structure problem

    International Nuclear Information System (INIS)

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of ∼100 Mpc, while the small-scale structure over ≤ 10 Mpc resembles that in a low density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations. 38 refs., 6 figs

  8. Large-scale fracture mechanics testing -- requirements and possibilities

    International Nuclear Information System (INIS)

    Brumovsky, M.

    1993-01-01

    Application of fracture mechanics to very important and/or complicated structures, like reactor pressure vessels, raises questions about the reliability and precision of such calculations. These problems become more pronounced under elastic-plastic loading conditions and/or in parts with non-homogeneous materials (base metal and austenitic cladding, property gradients through the material thickness) or with non-homogeneous stress fields (nozzles, bolt threads, residual stresses, etc.). For such special cases, verification by large-scale testing is necessary and valuable. This paper discusses problems connected with planning such experiments with respect to their limitations and the requirements for a good transfer of the results to an actual vessel. At the same time, the possibilities of small-scale model experiments are analyzed, mostly in connection with transferring results between standard, small-scale and large-scale experiments. Experience from 30 years of large-scale testing at SKODA is used as an example to support this analysis. 1 fig

  9. Randomized controlled pilot study to compare Homeopathy and Conventional therapy in Acute Otitis Media.

    Science.gov (United States)

    Sinha, M N; Siddiqui, V A; Nayak, C; Singh, Vikram; Dixit, Rupali; Dewan, Deepti; Mishra, Alok

    2012-01-01

    To compare the effectiveness of homeopathy and conventional therapy in acute otitis media (AOM), a randomized placebo-controlled parallel-group pilot study of homeopathic versus conventional treatment for AOM was conducted in Jaipur, India. Patients were randomized by a computer-generated random number list to receive either individualized homeopathic medicines in fifty-millesimal (LM) potencies, or conventional treatment including analgesics, antipyretics and anti-inflammatory drugs. Patients who did not improve were prescribed antibiotics on the 3rd day. Outcomes were assessed by the Acute Otitis Media-Severity of Symptoms (AOM-SOS) Scale and tympanic membrane examination over 21 days. 81 patients were included and 80 completed follow-up: 41 in the conventional group and 40 in the homeopathy group. In the conventional group, all 40 (100%) patients were cured; in the homeopathy group, 38 (95%) patients were cured, while 2 (5%) patients were lost to the last two follow-ups. By the 3rd day of treatment, 4 patients were cured in the homeopathy group, versus only one in the conventional group. In the conventional group antibiotics were prescribed in 39 (97.5%) patients; no antibiotics were required in the homeopathy group. 85% of patients were prescribed six homeopathic medicines. Individualized homeopathy appeared comparable to conventional treatment in AOM; there were no significant differences between groups in the main outcome. Symptomatic improvement was quicker in the homeopathy group, and there was a large difference in antibiotic requirements, favouring homeopathy. Further work on a larger scale should be conducted. Copyright © 2011 The Faculty of Homeopathy. Published by Elsevier Ltd. All rights reserved.

  10. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    ...which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  11. Bio-inspired wooden actuators for large scale applications.

    Science.gov (United States)

    Rüggeberg, Markus; Burgert, Ingo

    2015-01-01

    Implementing programmable actuation into materials and structures is a major topic in the field of smart materials. In particular the bilayer principle has been employed to develop actuators that respond to various kinds of stimuli. A multitude of small scale applications down to micrometer size have been developed, but up-scaling remains challenging due to either limitations in mechanical stiffness of the material or in the manufacturing processes. Here, we demonstrate the actuation of wooden bilayers in response to changes in relative humidity, making use of the high material stiffness and a good machinability to reach large scale actuation and application. Amplitude and response time of the actuation were measured and can be predicted and controlled by adapting the geometry and the constitution of the bilayers. Field tests in full weathering conditions revealed long-term stability of the actuation. The potential of the concept is shown by a first demonstrator. With the sensor and actuator intrinsically incorporated in the wooden bilayers, the daily change in relative humidity is exploited for an autonomous and solar powered movement of a tracker for solar modules.

  12. Large-scale Rectangular Ruler Automated Verification Device

    Science.gov (United States)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces a large-scale rectangular ruler automated verification device, which consists of a photoelectric autocollimator, a self-designed mechanical drive car, and an automatic data acquisition system. The mechanical design covers the optical axis, the drive, the fixture device, and the wheels. The control system design covers hardware and software: the hardware is based on a single-chip microcontroller, while the software handles the photoelectric autocollimator readout and the automatic data acquisition process. The device can acquire vertical measurement data automatically. Its reliability is verified by experimental comparison, and the conclusions meet the requirements of the right-angle verification procedure.

  13. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small scale model tests performed at the Hydraulic & Coastal Engineering Laboratory, Aalborg University, Denmark, and large scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from small and large scale model tests shows no clear evidence of scale effects for overtopping above a threshold value. In the large scale model no overtopping was measured for wave heights below Hs = 0.5 m, as the water sank into the voids between the stones on the crest. For low overtopping, scale effects...

  14. Local properties of the large-scale peaks of the CMB temperature

    Energy Technology Data Exchange (ETDEWEB)

    Marcos-Caballero, A.; Martínez-González, E.; Vielva, P., E-mail: marcos@ifca.unican.es, E-mail: martinez@ifca.unican.es, E-mail: vielva@ifca.unican.es [Instituto de Física de Cantabria, CSIC-Universidad de Cantabria, Avda. de los Castros s/n, 39005 Santander (Spain)

    2017-05-01

    In the present work, we study the largest structures of the CMB temperature measured by Planck in terms of the most prominent peaks on the sky, which, in particular, are located in the southern galactic hemisphere. Besides these large-scale features, the well-known Cold Spot anomaly is included in the analysis. All these peaks would contribute significantly to some of the CMB large-scale anomalies, such as the parity and hemispherical asymmetries, the dipole modulation, the alignment between the quadrupole and the octopole, or, in the case of the Cold Spot, the non-Gaussianity of the field. The analysis of the peaks is performed by using their multipolar profiles, which characterize the local shape of the peaks in terms of the discrete Fourier transform of the azimuthal angle. In order to quantify the local anisotropy of the peaks, the distribution of the phases of the multipolar profiles is studied by using the Rayleigh random walk methodology. Finally, a direct analysis of the 2-dimensional field around the peaks is performed in order to take into account the effect of the galactic mask. The results of the analysis conclude that, once the peak amplitude and its first and second order derivatives at the centre are conditioned, the rest of the field is compatible with the standard model. In particular, it is observed that the Cold Spot anomaly is caused by the large value of curvature at the centre.
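
    The core of a Rayleigh-type uniformity test on a set of phases can be sketched in a few lines: compute the mean resultant length R of the unit phasors and convert it to an approximate p-value, where a small p indicates a preferred direction. The implementation below is a generic textbook version (with Wilkie's finite-sample correction) applied to made-up sample data; it is not the authors' random-walk pipeline.

```python
import cmath
import math
import random

def rayleigh_test(phases):
    """Rayleigh test for uniformity of circular data.
    Returns (R, p): mean resultant length and an approximate p-value;
    small p rejects uniformity, i.e. indicates a preferred direction."""
    n = len(phases)
    R = abs(sum(cmath.exp(1j * phi) for phi in phases)) / n
    z = n * R * R
    # Wilkie's approximation to the Rayleigh p-value, clamped to [0, 1]
    p = math.exp(-z) * (1 + (2 * z - z * z) / (4 * n))
    return R, max(0.0, min(1.0, p))

rng = random.Random(1)
uniform = [rng.uniform(0, 2 * math.pi) for _ in range(200)]   # isotropic phases
aligned = [rng.gauss(1.0, 0.3) for _ in range(200)]           # concentrated phases

_, p_uni = rayleigh_test(uniform)
_, p_ali = rayleigh_test(aligned)
print(f"p(uniform) = {p_uni:.3f}, p(aligned) = {p_ali:.2e}")
```

    Concentrated phases yield a vanishing p-value while isotropic phases do not, which is the kind of discrimination the peak-anisotropy analysis relies on.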

  15. Finite-size scaling of the entanglement entropy of the quantum Ising chain with homogeneous, periodically modulated and random couplings

    International Nuclear Information System (INIS)

    Iglói, Ferenc; Lin, Yu-Cheng

    2008-01-01

    Using free-fermionic techniques we study the entanglement entropy of a block of contiguous spins in a large finite quantum Ising chain in a transverse field, with couplings of different types: homogeneous, periodically modulated and random. We carry out a systematic study of finite-size effects at the quantum critical point, and evaluate subleading corrections both for open and for periodic boundary conditions. For a block corresponding to a half of a finite chain, the position of the maximum of the entropy as a function of the control parameter (e.g. the transverse field) can define the effective critical point in the finite sample. On the basis of homogeneous chains, we demonstrate that the scaling behavior of the entropy near the quantum phase transition is in agreement with the universality hypothesis, and calculate the shift of the effective critical point, which has different scaling behaviors for open and for periodic boundary conditions
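
    While the record relies on free-fermionic techniques to reach large chains, the quantity itself, the half-chain entanglement entropy of the transverse-field Ising ground state, can be illustrated by brute-force exact diagonalization of a tiny open chain. The sketch below (8 sites, homogeneous couplings set to 1) is such a toy computation under those assumptions, not the paper's method, and cannot reach the system sizes needed for finite-size scaling.

```python
import numpy as np

# Pauli matrices
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.array([[1.0, 0.0], [0.0, -1.0]])
id2 = np.eye(2)

def op(site, pauli, L):
    """Embed a single-site Pauli operator at `site` in an L-site chain."""
    mats = [id2] * L
    mats[site] = pauli
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def tfim_half_chain_entropy(L, h):
    """Ground-state entanglement entropy of half of an open transverse-field
    Ising chain: H = -sum_i sz_i sz_{i+1} - h sum_i sx_i."""
    H = sum(-op(i, sz, L) @ op(i + 1, sz, L) for i in range(L - 1))
    H += sum(-h * op(i, sx, L) for i in range(L))
    _, vecs = np.linalg.eigh(H)
    # Schmidt decomposition of the ground state across the half-chain cut
    psi = vecs[:, 0].reshape(2 ** (L // 2), 2 ** (L - L // 2))
    lam = np.linalg.svd(psi, compute_uv=False) ** 2
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log(lam)))

S_crit = tfim_half_chain_entropy(8, h=1.0)   # near the critical point
S_para = tfim_half_chain_entropy(8, h=3.0)   # deep in the paramagnet
print(f"S(h=1) = {S_crit:.3f}, S(h=3) = {S_para:.3f}")
```

    Even at this tiny size, the entropy is visibly larger near the critical field h = 1 than deep in the paramagnetic phase, which is the qualitative behavior whose finite-size corrections the record quantifies.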

  16. Characterizing Android apps’ behavior for effective detection of malapps at large scale

    KAUST Repository

    Wang, Xing

    2017-05-06

    Android malicious applications (malapps) have surged in number and sophistication, posing a great threat to users. How to characterize, understand and detect Android malapps at a large scale is thus a big challenge. In this work, we are motivated to discover discriminatory and persistent features extracted from Android APK files for automated malapp detection at a large scale. To achieve this goal, we first extract a very large number of features from each app and categorize them into two groups, namely, app-specific features and platform-defined features. These feature sets are then fed into four classifiers (i.e., Logistic Regression, linear SVM, Decision Tree and Random Forest) for the detection of malapps. Second, we evaluate the persistence of app-specific and platform-defined features with respect to classification performance on two data sets collected in different time periods. Third, we comprehensively analyze the relevant features selected by the Logistic Regression classifier to identify the contributions of each feature set. We conduct extensive experiments on large real-world app sets consisting of 213,256 benign apps collected from six app markets, 4,363 benign apps from the Google Play market, and 18,363 malapps. The experimental results and our analysis give insights into which discriminatory features are most effective at characterizing malapps for building an effective and efficient malapp detection system. With the selected discriminatory features, the Logistic Regression classifier yields a true positive rate of 96% with a false positive rate of 0.06%.
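
    The detection pipeline, binary app features fed to a linear classifier, can be sketched with a self-contained toy: synthetic permission-like feature vectors, a from-scratch logistic regression trained by batch gradient descent, and the resulting true/false positive rates. All features, rates, and parameters below are fabricated for illustration and unrelated to the study's real app corpora or its reported numbers.

```python
import math
import random

rng = random.Random(7)
N_FEAT = 20

def make_app(malicious):
    """Synthetic app: binary feature vector (e.g. requested permissions).
    The first 5 features are made more common in malapps."""
    probs = [0.8 if malicious and f < 5 else 0.2 for f in range(N_FEAT)]
    return [1.0 if rng.random() < p else 0.0 for p in probs], float(malicious)

data = [make_app(i % 2 == 1) for i in range(400)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-max(-30.0, min(30.0, z))))

# Logistic regression trained by batch gradient descent
w, b, lr = [0.0] * N_FEAT, 0.0, 0.5
for _ in range(200):
    gw, gb = [0.0] * N_FEAT, 0.0
    for x, y in data:
        err = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) - y
        gb += err
        for f in range(N_FEAT):
            gw[f] += err * x[f]
    w = [wi - lr * gi / len(data) for wi, gi in zip(w, gw)]
    b -= lr * gb / len(data)

def predict(x):
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) > 0.5

tp = sum(predict(x) for x, y in data if y == 1.0)
fp = sum(predict(x) for x, y in data if y == 0.0)
n_mal = sum(y == 1.0 for _, y in data)
print(f"TPR = {tp / n_mal:.2f}, FPR = {fp / (len(data) - n_mal):.2f}")
```

    Inspecting the learned weights w after training also mirrors the study's feature analysis: the informative features acquire large positive weights while the uninformative ones stay near zero.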

  17. A polymer, random walk model for the size-distribution of large DNA fragments after high linear energy transfer radiation

    Science.gov (United States)

    Ponomarev, A. L.; Brenner, D.; Hlatky, L. R.; Sachs, R. K.

    2000-01-01

    DNA double-strand breaks (DSBs) produced by densely ionizing radiation are not located randomly in the genome: recent data indicate DSB clustering along chromosomes. Stochastic DSB clustering at large scales, from >100 Mbp downward, is modeled using computer simulations and analytic equations. A random-walk, coarse-grained polymer model for chromatin is combined with a simple track-structure model in Monte Carlo software called DNAbreak and is applied to data on alpha-particle irradiation of V-79 cells. The chromatin model neglects molecular details but systematically incorporates an increase in average spatial separation between two DNA loci as the number of base pairs between the loci increases. Fragment-size distributions obtained using DNAbreak match data on large fragments about as well as distributions previously obtained with a less mechanistic approach. Dose-response relations, linear at small doses of high linear energy transfer (LET) radiation, are obtained. They are found to be non-linear when the dose becomes so large that there is a significant probability of overlapping or close juxtaposition, along one chromosome, of DSB clusters from different tracks. The non-linearity is more evident for large fragments than for small. The DNAbreak results furnish an example of the RLC (randomly located clusters) analytic formalism, which generalizes the broken-stick fragment-size distribution of the random-breakage model that is often applied to low-LET data.
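
    The random-breakage model that the RLC formalism generalizes has a simple Monte Carlo form: place breaks uniformly at random along a chromosome and collect the fragment lengths, whose mean for n breaks on length L is L/(n+1). The sketch below uses invented parameters (a 100 Mbp chromosome, 9 breaks per trial); it is the broken-stick baseline only, not the DNAbreak polymer/track-structure simulation.

```python
import random

def random_breakage(length, n_breaks, rng):
    """Fragment sizes after n breaks placed uniformly at random on [0, length]."""
    cuts = sorted(rng.uniform(0, length) for _ in range(n_breaks))
    edges = [0.0] + cuts + [length]
    return [b - a for a, b in zip(edges, edges[1:])]

rng = random.Random(3)
L_MBP, N_BREAKS, N_TRIALS = 100.0, 9, 2000   # hypothetical chromosome and dose

fragments = []
for _ in range(N_TRIALS):
    fragments.extend(random_breakage(L_MBP, N_BREAKS, rng))

mean_size = sum(fragments) / len(fragments)
expected = L_MBP / (N_BREAKS + 1)            # broken-stick expectation
print(f"mean fragment: {mean_size:.2f} Mbp (theory: {expected:.1f} Mbp)")
```

    Clustered breakage, as produced by high-LET tracks, skews this distribution toward an excess of very small and very large fragments relative to the uniform baseline, which is what the RLC formalism captures.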

  18. Large-scale compositional heterogeneity in the Earth's mantle

    Science.gov (United States)

    Ballmer, M.

    2017-12-01

    Seismic imaging of subducted Farallon and Tethys lithosphere in the lower mantle has been taken as evidence for whole-mantle convection, and efficient mantle mixing. However, cosmochemical constraints point to a lower-mantle composition that has a lower Mg/Si compared to upper-mantle pyrolite. Moreover, geochemical signatures of magmatic rocks indicate the long-term persistence of primordial reservoirs somewhere in the mantle. In this presentation, I establish geodynamic mechanisms for sustaining large-scale (primordial) heterogeneity in the Earth's mantle using numerical models. Mantle flow is controlled by rock density and viscosity. Variations in intrinsic rock density, such as due to heterogeneity in basalt or iron content, can induce layering or partial layering in the mantle. Layering can be sustained in the presence of persistent whole mantle convection due to active "unmixing" of heterogeneity in low-viscosity domains, e.g. in the transition zone or near the core-mantle boundary [1]. On the other hand, lateral variations in intrinsic rock viscosity, such as due to heterogeneity in Mg/Si, can strongly affect the mixing timescales of the mantle. In the extreme case, intrinsically strong rocks may remain unmixed through the age of the Earth, and persist as large-scale domains in the mid-mantle due to focusing of deformation along weak conveyor belts [2]. That large-scale lateral heterogeneity and/or layering can persist in the presence of whole-mantle convection can explain the stagnation of some slabs, as well as the deflection of some plumes, in the mid-mantle. These findings indeed motivate new seismic studies for rigorous testing of model predictions. [1] Ballmer, M. D., N. C. Schmerr, T. Nakagawa, and J. Ritsema (2015), Science Advances, doi:10.1126/sciadv.1500815. [2] Ballmer, M. D., C. Houser, J. W. Hernlund, R. Wentzcovitch, and K. Hirose (2017), Nature Geoscience, doi:10.1038/ngeo2898.

  19. Large Scale Simulation Platform for NODES Validation Study

    Energy Technology Data Exchange (ETDEWEB)

    Sotorrio, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Qin, Y. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Min, L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-04-27

    This report summarizes the Large Scale (LS) simulation platform created for the Eaton NODES project. The simulation environment couples a wholesale-market simulator with a distribution simulator and includes the CAISO wholesale market model and a PG&E footprint of 25-75 feeders, used to validate scalability under a scenario of 33% RPS in California with an additional 17% of DERs coming from distribution and customers. The simulator can generate hourly unit commitment, 5-minute economic dispatch, and 4-second AGC regulation signals, and is capable of simulating more than 10,000 individual controllable devices. Simulated DERs include water heaters, EVs, residential and light-commercial HVAC/buildings, and residential-level battery storage. Feeder-level voltage regulators and capacitor banks are also simulated for feeder-level real and reactive power management and Volt/VAR control.

  20. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of large-scale systems in order to facilitate the development of a true large-systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering, and manufacturing/operations research for their ideas concerning large-scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large-scale systems research. He was also asked to convene a conference that included three experts in each area as panel members to discuss the general area of large-scale systems research. The conference was held on March 26--27, 1984 in Pittsburgh with nine panel members and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  1. HiQuant: Rapid Postquantification Analysis of Large-Scale MS-Generated Proteomics Data.

    Science.gov (United States)

    Bryan, Kenneth; Jarboui, Mohamed-Ali; Raso, Cinzia; Bernal-Llinares, Manuel; McCann, Brendan; Rauch, Jens; Boldt, Karsten; Lynn, David J

    2016-06-03

    Recent advances in mass-spectrometry-based proteomics are now facilitating ambitious large-scale investigations of the spatial and temporal dynamics of the proteome; however, the increasing size and complexity of these data sets are overwhelming current downstream computational methods, specifically those that support the postquantification analysis pipeline. Here we present HiQuant, a novel application that enables the design and execution of a postquantification workflow, including common data-processing steps such as assay normalization and grouping, experimental replicate quality control, and statistical analysis. HiQuant also enables the interpretation of results generated from large-scale data sets by supporting interactive heatmap analysis and direct export to Cytoscape and Gephi, two leading network analysis platforms. HiQuant may be run via a user-friendly graphical interface and also supports complete one-touch automation via a command-line mode. We evaluate HiQuant's performance by analyzing a large-scale, complex interactome-mapping data set and demonstrate a 200-fold improvement in execution time over current methods. We also demonstrate HiQuant's general utility by analyzing proteome-wide quantification data generated from both a large-scale public tyrosine kinase siRNA knock-down study and an in-house investigation into the temporal dynamics of the KSR1 and KSR2 interactomes. HiQuant, sample data sets, and supporting documentation are available at http://hiquant.primesdb.eu.
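    Two of the postquantification steps named above, assay normalization and replicate quality control, can be sketched generically. The function names and the 0.9 correlation cutoff below are illustrative assumptions, not HiQuant's actual API:

```python
import numpy as np

def median_normalize(intensities):
    """Scale each assay (column) so its median matches the global median."""
    col_medians = np.median(intensities, axis=0)
    global_median = np.median(intensities)
    return intensities * (global_median / col_medians)

def replicate_qc(a, b, min_r=0.9):
    """Flag a replicate pair whose Pearson correlation falls below min_r."""
    r = np.corrcoef(a, b)[0, 1]
    return r >= min_r, r
```

    After normalization every assay shares the same median, so downstream grouping and statistics compare like with like; the QC step simply screens replicate pairs before averaging.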

  2. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the ''Large-scale Structure of the Universe'' symposium are considered at a popular level. The cell structure of the galaxy distribution in the Universe and the principles of its mathematical modelling are described, and images of cell structures obtained after computer processing are given. Three hypotheses (vortical, entropic and adiabatic), suggesting different processes for the origin of galaxies and galaxy clusters, are discussed; a considerable advantage of the adiabatic hypothesis is recognized. Relict radiation, as a method of directly studying the processes taking place in the Universe, is considered. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the turbulence properties at the pre-galaxy stage. The discussion of problems pertaining to the hot gas contained in galaxy clusters, and to interactions within galaxy clusters and with the intergalactic medium, is recognized as a notable contribution to the development of theoretical and observational cosmology.

  3. Randomized Trial of Anger Control Training for Adolescents with Tourette's Syndrome and Disruptive Behavior

    Science.gov (United States)

    Sukhdolsky, Denis G.; Vitulano, Lawrence A.; Carroll, Deirdre H.; McGuire, Joseph; Leckman, James F.; Scahill, Lawrence

    2009-01-01

    A randomized trial examining the efficacy of anger control training for treating adolescents with Tourette's syndrome and disruptive behavior found that those administered the anger control training decreased their Disruptive Behavior Rating Scale scores by 52 percent, compared with a decrease of 11 percent in the treatment as…

  4. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    Mining enterprises use drilling and blasting to prepare hard rock for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out assessments for coal mines and iron-ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter recording to a laptop. The registration results of surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The safety of the seismic effect was evaluated against the permissible vibration velocity. For cases exceeding permissible values, recommendations were developed to reduce the level of seismic impact.
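    The safety evaluation described above reduces to comparing a measured peak particle velocity against a permissible limit. A minimal sketch follows; the limit values and structure classes are purely hypothetical placeholders (real limits come from the applicable seismic-safety standard):

```python
# Hypothetical permissible peak particle velocities (cm/s) by structure class;
# actual limits depend on the governing standard and local regulations.
PERMISSIBLE_PPV = {"residential": 1.0, "industrial": 3.0}

def seismic_safety_check(measured_ppv, structure_class):
    """Return True if the measured peak particle velocity is within the limit."""
    limit = PERMISSIBLE_PPV[structure_class]
    return measured_ppv <= limit
```

    Blasts whose measured velocities fail this check are the cases for which the study's mitigation recommendations apply.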

  5. Acupuncture lowering blood pressure for secondary prevention of stroke: a study protocol for a multicenter randomized controlled trial.

    Science.gov (United States)

    Du, Yu-Zheng; Gao, Xin-Xin; Wang, Cheng-Ting; Zheng, Hai-Zhen; Lei, Yun; Wu, Meng-Han; Shi, Xue-Min; Ban, Hai-Peng; Gu, Wen-Long; Meng, Xiang-Gang; Wei, Mao-Ti; Hu, Chun-Xiao

    2017-09-15

    Stroke is a leading cause of morbidity and mortality in the general population, and hypertension increases the recurrence and mortality of stroke. We report a protocol of a pragmatic randomized controlled trial (RCT) using blood pressure (BP)-lowering acupuncture as an add-on treatment for patients with hypertension and stroke. This is a large-scale, multicenter, subject-, assessor- and analyst-blinded, pragmatic RCT. A total of 480 patients with hypertension and ischemic stroke will be randomly assigned to two groups: an experimental group and a control group. The experimental group will receive "HuoXueSanFeng" acupuncture combined with one antihypertensive medication in addition to routine ischemic stroke treatment. The control group will receive only one antihypertensive medication and basic treatments for ischemic stroke. HuoXueSanFeng acupuncture will be given in six sessions weekly for the first 6 weeks and three sessions weekly for the next 6 weeks. A 9-month follow-up will thereafter be conducted. Antihypertensive medication will be adjusted based on BP levels. The primary outcome will be the recurrence of stroke. The secondary outcomes, including 24-h ambulatory BP, the TCM syndrome score, the Short Form 36-item Health Survey (SF-36), the National Institutes of Health Stroke Scale (NIHSS), and the Barthel Index (BI) scale, will be assessed at baseline and at 6 and 12 weeks after initiating treatment; cardiac ultrasound, carotid artery ultrasound, transcranial Doppler, and lower extremity ultrasound will be evaluated at baseline and 12 weeks after treatment. The safety of acupuncture will also be assessed. We aim to determine the clinical effects of controlling BP for secondary prevention of stroke with acupuncture add-on treatment. ClinicalTrials.gov, ID: NCT02967484. Registered on 13 February 2017; last updated on 27 June 2017.
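    Balanced 1:1 assignment of the 480 patients is commonly implemented with permuted-block randomization. The protocol above does not specify the method or block size, so both are assumptions in this minimal sketch:

```python
import random

def permuted_block_randomization(n_patients, block_size=4, seed=0):
    """Assign patients to 'experimental' or 'control' in balanced blocks,
    so group sizes never differ by more than block_size // 2."""
    assert block_size % 2 == 0
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_patients:
        block = (["experimental"] * (block_size // 2)
                 + ["control"] * (block_size // 2))
        rng.shuffle(block)  # random order within each balanced block
        assignments.extend(block)
    return assignments[:n_patients]
```

    With 480 patients and a block size of 4, this yields exactly 240 patients per arm.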

  6. Topology of Large-Scale Structure by Galaxy Type: Hydrodynamic Simulations

    Science.gov (United States)

    Gott, J. Richard, III; Cen, Renyue; Ostriker, Jeremiah P.

    1996-07-01

    The topology of large-scale structure is studied as a function of galaxy type using the genus statistic. In hydrodynamical cosmological cold dark matter simulations, galaxies form on caustic surfaces (Zeldovich pancakes) and then slowly drain onto filaments and clusters. The earliest forming galaxies in the simulations (defined as "ellipticals") are thus seen at the present epoch preferentially in clusters (tending toward a meatball topology), while the latest forming galaxies (defined as "spirals") are seen currently in a spongelike topology. The topology is measured by the genus (number of "doughnut" holes minus number of isolated regions) of the smoothed density-contour surfaces. The measured genus curve for all galaxies as a function of density obeys approximately the theoretical curve expected for random-phase initial conditions, but the early-forming elliptical galaxies show a shift toward a meatball topology relative to the late-forming spirals. Simulations using standard biasing schemes fail to show such an effect. Large observational samples separated by galaxy type could be used to test for this effect.
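    The theoretical genus curve for random-phase (Gaussian) initial conditions mentioned above has the standard form g(nu) proportional to (1 - nu^2) exp(-nu^2 / 2), where nu is the density threshold in units of the field's standard deviation. A minimal sketch (the amplitude depends on the power spectrum and smoothing, so it is left as a free parameter):

```python
import numpy as np

def gaussian_genus_curve(nu, amplitude=1.0):
    """Genus per unit volume of isodensity surfaces of a Gaussian
    random-phase field, at threshold nu (in standard deviations)."""
    nu = np.asarray(nu, dtype=float)
    return amplitude * (1.0 - nu**2) * np.exp(-nu**2 / 2.0)
```

    Positive genus near nu = 0 corresponds to a spongelike topology; negative genus at |nu| > 1 corresponds to isolated clusters ("meatballs") or voids, which is why a shift of the measured curve signals a meatball topology.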

  7. Effect of an Ecological Executive Skill Training Program for School-aged Children with Attention Deficit Hyperactivity Disorder: A Randomized Controlled Clinical Trial.

    Science.gov (United States)

    Qian, Ying; Chen, Min; Shuai, Lan; Cao, Qing-Jiu; Yang, Li; Wang, Yu-Feng

    2017-07-05

    As medication does not normalize outcomes in children with attention deficit hyperactivity disorder (ADHD), especially real-life functioning, nonpharmacological methods are important in this field. This randomized controlled clinical trial was designed to evaluate the effects of a comprehensive executive-skill training program for school-aged children with ADHD in a relatively large sample. Children with ADHD (aged 6-12 years) were randomized to an intervention or a waitlist group. A healthy control group was composed of gender- and age-matched healthy children. The intervention group received a 12-session training program covering multiple executive skills. Executive function (EF), ADHD symptoms, and social functioning in the intervention and waitlist groups were evaluated at baseline and at the end of the final training session; the healthy controls (HCs) were assessed once, at baseline. Repeated-measures analyses of variance were used to compare EF, ADHD symptoms, and social function between the intervention and waitlist groups. Thirty-eight children with ADHD in the intervention group, 30 in the waitlist group, and 23 healthy children in the healthy control group were included in the final analysis. At posttreatment, the intervention group showed significantly lower Behavior Rating Inventory of Executive Function (BRIEF) total scores (135.89 ± 16.80 vs. 146.09 ± 23.92, P = 0.04), monitoring scores (18.05 ± 2.67 vs. 19.77 ± 3.10, P = 0.02), and ADHD-IV overall scores (41.11 ± 7.48 vs. 47.20 ± 8.47; F = 21.72) than the waitlist group; significant differences were also found on the BRIEF, the ADHD rating scale-IV, and the WEISS Functional Impairment Scale-Parent form (WFIRS-P) among the intervention and waitlist groups at posttreatment and HCs at baseline. This randomized controlled study of executive-skill training in a relatively large sample provided some evidence that the training can improve EF deficits, reduce problematic symptoms, and potentially enhance social functioning in school-aged children with ADHD.

  8. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight into large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI to visualizing large flow-field pathline data. The goal of our work is to provide an optimized image-based method that scales well with the dataset size. Our approach is based on constructing a per-pixel linked-list data structure in which each pixel contains a list of pathline segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
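    The per-pixel list idea can be sketched on the CPU: each pixel keys a list of pathline-segment records that can be filtered and depth-sorted without re-reading the original simulation data. The class and field names below are illustrative assumptions, not the thesis's GPU implementation:

```python
from collections import defaultdict

class ExplorableImage:
    """Toy per-pixel fragment lists for view-dependent pathline exploration."""

    def __init__(self, width, height):
        self.width, self.height = width, height
        self.fragments = defaultdict(list)  # (x, y) -> list of segment records

    def insert(self, x, y, depth, pathline_id, attribute):
        self.fragments[(x, y)].append(
            {"depth": depth, "id": pathline_id, "attr": attribute})

    def filter(self, x, y, predicate):
        """Return this pixel's segments passing a user predicate,
        sorted front to back -- filtering without touching the raw data."""
        hits = [f for f in self.fragments[(x, y)] if predicate(f)]
        return sorted(hits, key=lambda f: f["depth"])
```

    On the GPU the same structure is typically built with an atomic head-pointer texture and a node buffer; the CPU version only illustrates why filtering and color-coding stay interactive once the lists exist.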

  9. The topology of large-scale structure. III - Analysis of observations

    Science.gov (United States)

    Gott, J. Richard, III; Weinberg, David H.; Miller, John; Thuan, Trinh X.; Schneider, Stephen E.

    1989-01-01

    A recently developed algorithm for quantitatively measuring the topology of large-scale structures in the universe was applied to a number of important observational data sets. The data sets included an Abell (1958) cluster sample out to Vmax = 22,600 km/sec, the Giovanelli and Haynes (1985) sample out to Vmax = 11,800 km/sec, the CfA sample out to Vmax = 5000 km/sec, the Thuan and Schneider (1988) dwarf sample out to Vmax = 3000 km/sec, and the Tully (1987) sample out to Vmax = 3000 km/sec. It was found that, when the topology is studied on smoothing scales significantly larger than the correlation length (i.e., smoothing length, lambda, not below 1200 km/sec), the topology is spongelike and is consistent with the standard model in which the structure seen today has grown from small fluctuations caused by random noise in the early universe. When the topology is studied on the scale of lambda of about 600 km/sec, a small shift is observed in the genus curve in the direction of a 'meatball' topology.

  10. Combination Analgesia for Neonatal Circumcision: A Randomized Controlled Trial.

    Science.gov (United States)

    Sharara-Chami, Rana; Lakissian, Zavi; Charafeddine, Lama; Milad, Nadine; El-Hout, Yaser

    2017-12-01

    There is no consensus on the most effective pain management for neonatal circumcision. We sought to compare different modalities. This is a double-blinded randomized controlled trial comparing 3 combination analgesics used during circumcision (EMLA + sucrose; EMLA + sucrose + dorsal penile nerve block [DPNB]; EMLA + sucrose + ring block [RB]) with the traditional topical analgesic cream EMLA alone. The trial was set in the normal nursery of a teaching hospital. The sample included 70 healthy male newborns, randomly assigned to intervention and control groups at a 2:1 ratio. Infants were videotaped (face and torso) during the procedure for assessment of pain by 2 blinded, independent reviewers. The primary outcome measure is the Neonatal Infant Pain Scale score. Secondary outcomes include heart rate, oxygen saturation, and crying time. Neonatal Infant Pain Scale scores were significantly lower in the intervention groups (EMLA + sucrose, mean [SD]: 3.1 [1.33]; EMLA + sucrose + DPNB: 3 [1.33]; EMLA + sucrose + RB: 2.45 [1.27]) compared with the control (5.5 [0.53]). Between-group analyses showed EMLA + sucrose + RB to be significantly more effective than EMLA + sucrose and EMLA + sucrose + DPNB (P = .009 and P = .002, respectively). Interrater reliability was κ = 0.843. A significant increase in heart rate (139.27 [9.63] to 163 [13.23] beats per minute) and crying time (5.78 [6.4] to 45.37 [12.39] seconds) was noted in the EMLA group. During neonatal circumcision in boys, the most effective analgesia is RB combined with oral sucrose and EMLA cream. Copyright © 2017 by the American Academy of Pediatrics.

  11. Large head metal-on-metal cementless total hip arthroplasty versus 28mm metal-on-polyethylene cementless total hip arthroplasty: design of a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    van Raaij Jos JAM

    2008-10-01

    Abstract Background Osteoarthritis of the hip is successfully treated by total hip arthroplasty with metal-on-polyethylene articulation. Polyethylene wear debris can, however, lead to osteolysis, aseptic loosening and failure of the implant. Large head metal-on-metal total hip arthroplasty may overcome polyethylene-wear-induced prosthetic failure, but can increase systemic cobalt and chromium ion concentrations. The objective of this study is to compare two cementless total hip arthroplasties: a conventional 28 mm metal-on-polyethylene articulation and a large head metal-on-metal articulation. We hypothesize that the latter arthroplasties show less bone density loss and higher serum metal ion concentrations. We expect equal functional scores, greater range of motion, fewer dislocations, fewer periprosthetic radiolucencies and increased prosthetic survival with the metal-on-metal articulation. Methods A randomized controlled trial will be conducted. Patients to be included suffer from non-inflammatory degenerative joint disease of the hip, are aged between 18 and 80, and are admitted for primary cementless unilateral total hip arthroplasty. Patients in the metal-on-metal group will receive a cementless titanium alloy acetabular component with a cobalt-chromium liner and a cobalt-chromium femoral head varying from 38 to 60 mm. Patients in the metal-on-polyethylene group will receive a cementless titanium alloy acetabular component with a polyethylene liner and a 28 mm cobalt-chromium femoral head. We will assess acetabular bone mineral density by dual energy x-ray absorptiometry (DEXA), serum ion concentrations of cobalt, chromium and titanium, self-reported functional status (Oxford hip score), physician-reported functional status and range of motion (Harris hip score), number of dislocations, and prosthetic survival. Measurements will take place preoperatively, perioperatively, and postoperatively (6 weeks, 1 year, 5 years and 10 years). Discussion

  12. Time-sliced perturbation theory for large scale structure I: general formalism

    Energy Technology Data Exchange (ETDEWEB)

    Blas, Diego; Garny, Mathias; Sibiryakov, Sergey [Theory Division, CERN, CH-1211 Genève 23 (Switzerland); Ivanov, Mikhail M., E-mail: diego.blas@cern.ch, E-mail: mathias.garny@cern.ch, E-mail: mikhail.ivanov@cern.ch, E-mail: sergey.sibiryakov@cern.ch [FSB/ITP/LPPC, École Polytechnique Fédérale de Lausanne, CH-1015, Lausanne (Switzerland)

    2016-07-01

    We present a new analytic approach to describe large scale structure formation in the mildly non-linear regime. The central object of the method is the time-dependent probability distribution function generating correlators of the cosmological observables at a given moment of time. Expanding the distribution function around the Gaussian weight we formulate a perturbative technique to calculate non-linear corrections to cosmological correlators, similar to the diagrammatic expansion in a three-dimensional Euclidean quantum field theory, with time playing the role of an external parameter. For the physically relevant case of cold dark matter in an Einstein-de Sitter universe, the time evolution of the distribution function can be found exactly and is encapsulated by a time-dependent coupling constant controlling the perturbative expansion. We show that all building blocks of the expansion are free from spurious infrared enhanced contributions that plague the standard cosmological perturbation theory. This paves the way towards the systematic resummation of infrared effects in large scale structure formation. We also argue that the approach proposed here provides a natural framework to account for the influence of short-scale dynamics on larger scales along the lines of effective field theory.

  13. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
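    The distinction that motivates the paper, ecological diffusion u_t = (mu u)_xx versus Fickian (mu u_x)_x, can be seen in one dimension. Below is a minimal explicit finite-difference sketch; the grid, time step, and periodic boundaries are assumptions for illustration only:

```python
import numpy as np

def ecological_diffusion_step(u, mu, dt, dx):
    """One explicit step of u_t = (mu * u)_xx (ecological diffusion)
    with periodic boundaries; contrast with Fickian (mu * u_x)_x,
    which would leave a spatially uniform population unchanged."""
    q = mu * u  # motility times density enters the Laplacian together
    lap = (np.roll(q, -1) - 2.0 * q + np.roll(q, 1)) / dx**2
    return u + dt * lap
```

    Because the Laplacian acts on mu*u rather than on u alone, animals drain out of high-motility habitat even when the density is uniform, exactly the local-information behavior the abstract describes; total population is still conserved by the conservative stencil.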

  14. Distributed Random Process for a Large-Scale Peer-to-Peer Lottery

    OpenAIRE

    Grumbach, Stéphane; Riemann, Robert

    2017-01-01

    Most online lotteries today fail to ensure the verifiability of the random process and rely on a trusted third party. This issue has received little attention since the emergence of distributed protocols like Bitcoin that demonstrated the potential of protocols with no trusted third party. We argue that the security requirements of online lotteries are similar to those of online voting, and propose a novel distributed online lottery protocol that applies techniques dev...

  15. The role of large-scale, extratropical dynamics in climate change

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, T.G. [ed.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  16. The role of large-scale, extratropical dynamics in climate change

    International Nuclear Information System (INIS)

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database

  17. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's, an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 Watt) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors has once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 Watt 1.8K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5 kW at 2.0K. Magnetic refrigerators of 10 Watt capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K will be reviewed. 28 refs., 4 figs., 7 tabs

  18. Large-scale weakly supervised object localization via latent category learning.

    Science.gov (United States)

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image condition, objects usually have large ambiguity with backgrounds. Besides, there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose latent category learning (LCL) for large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with a semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy that evaluates each category's discrimination. Finally, we propose online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Classes (VOC) 2007 and the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) 2013 detection data sets shows that the method can improve annotation precision by 10% over previous methods. More importantly, we achieve a detection precision which outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.
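    The first step, latent semantic analysis over image-level representations, is in essence a truncated SVD of an image-by-feature count matrix. The sketch below shows that generic construction only; the matrix contents, the choice of k, and the interpretation of components as objects, parts, or backgrounds follow the paper, while the function itself is not the authors' pipeline:

```python
import numpy as np

def latent_categories(counts, k):
    """Truncated SVD of an image-by-visual-word count matrix: the top-k
    right singular vectors act as latent 'categories', and each image
    gets a k-dimensional loading onto them."""
    u, s, vt = np.linalg.svd(counts, full_matrices=False)
    loadings = u[:, :k] * s[:k]   # image coordinates in latent space
    categories = vt[:k]           # latent category directions
    return loadings, categories
```

    A category selection step would then score each of the k directions by how well it discriminates positive from negative images of the target class.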

  19. The use of production management techniques in the construction of large scale physics detectors

    International Nuclear Information System (INIS)

    Bazan, A.; Chevenier, G.; Estrella, F.

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability, and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. Faced with similar problems, engineers in industry employ so-called Product Data Management (PDM) systems to control access to documented versions of designs, and managers employ so-called Workflow Management Software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an Information System for Tracking Assembly Lifecycles) in use in CMS, which successfully integrates PDM and WfMS techniques in managing large-scale physics detector construction. This is the first time industrial production techniques have been deployed to this extent in detector construction.

  20. On the random cascading model study of anomalous scaling in multiparticle production with continuously diminishing scale

    International Nuclear Information System (INIS)

    Liu Lianshou; Zhang Yang; Wu Yuanfang

    1996-01-01

The anomalous scaling of factorial moments with continuously diminishing scale is studied using a random cascading model. It is shown that the models currently in use have the property of anomalous scaling only for discrete values of the elementary cell size. A revised model is proposed which also gives good scaling properties for a continuously varying scale. It turns out that the strip integral has good scaling properties provided the integration regions are chosen correctly, and that this property is insensitive to the concrete way of self-similar subdivision of phase space in the models. (orig.)

  1. RECOVERY OF LARGE ANGULAR SCALE CMB POLARIZATION FOR INSTRUMENTS EMPLOYING VARIABLE-DELAY POLARIZATION MODULATORS

    Energy Technology Data Exchange (ETDEWEB)

    Miller, N. J.; Marriage, T. A.; Appel, J. W.; Bennett, C. L.; Eimer, J.; Essinger-Hileman, T.; Harrington, K.; Rostem, K.; Watts, D. J. [Department of Physics and Astronomy, Johns Hopkins University, 3400 N. Charles St., Baltimore, MD 21218 (United States); Chuss, D. T. [Department of Physics, Villanova University, 800 E Lancaster, Villanova, PA 19085 (United States); Wollack, E. J.; Fixsen, D. J.; Moseley, S. H.; Switzer, E. R., E-mail: Nathan.J.Miller@nasa.gov [Observational Cosmology Laboratory, Code 665, NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States)

    2016-02-20

    Variable-delay Polarization Modulators (VPMs) are currently being implemented in experiments designed to measure the polarization of the cosmic microwave background on large angular scales because of their capability for providing rapid, front-end polarization modulation and control over systematic errors. Despite the advantages provided by the VPM, it is important to identify and mitigate any time-varying effects that leak into the synchronously modulated component of the signal. In this paper, the effect of emission from a 300 K VPM on the system performance is considered and addressed. Though instrument design can greatly reduce the influence of modulated VPM emission, some residual modulated signal is expected. VPM emission is treated in the presence of rotational misalignments and temperature variation. Simulations of time-ordered data are used to evaluate the effect of these residual errors on the power spectrum. The analysis and modeling in this paper guides experimentalists on the critical aspects of observations using VPMs as front-end modulators. By implementing the characterizations and controls as described, front-end VPM modulation can be very powerful for mitigating 1/f noise in large angular scale polarimetric surveys. None of the systematic errors studied fundamentally limit the detection and characterization of B-modes on large scales for a tensor-to-scalar ratio of r = 0.01. Indeed, r < 0.01 is achievable with commensurately improved characterizations and controls.

  2. An Novel Architecture of Large-scale Communication in IOT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

In recent years, many scholars have done a great deal of research on the development of the Internet of Things (IOT) and networked physical systems. However, few have presented a detailed view of the large-scale communication architecture in the IOT. In fact, the non-uniform technology between IPv6 and access points has led to a lack of broad principles for large-scale communication architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IOT.

  3. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  4. Expected Future Conditions for Secure Power Operation with Large Scale of RES Integration

    International Nuclear Information System (INIS)

    Majstrovic, G.; Majstrovic, M.; Sutlovic, E.

    2015-01-01

    EU energy strategy is strongly focused on the large scale integration of renewable energy sources. The most dominant part here is taken by variable sources - wind power plants. Grid integration of intermittent sources along with keeping the system stable and secure is one of the biggest challenges for the TSOs. This part is often neglected by the energy policy makers, so this paper deals with expected future conditions for secure power system operation with large scale wind integration. It gives an overview of expected wind integration development in EU, as well as expected P/f regulation and control needs. The paper is concluded with several recommendations. (author).

  5. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

A large scale measurement channel allows the processing of the signal coming from a single neutron sensor during three different running modes: impulse, fluctuation and current. The study described in this note comprises three parts: - A theoretical study of the large scale channel and a brief description of it are given, together with the results obtained so far in that domain. - The fluctuation mode is studied thoroughly and the improvements to be made are defined. The study of a linear fluctuation channel with automatic scale switching is described and the test results are given. In this large scale channel, the data processing method is analogue. - To become independent of the problems generated by analogue processing of the fluctuation signal, a digital data processing method is tested and its validity is proved. The results obtained on a test system built according to this method are given, and a preliminary plan for further research is defined [fr]

  6. Behavioral effects of neurofeedback in adolescents with ADHD: a randomized controlled trial.

    Science.gov (United States)

    Bink, Marleen; van Nieuwenhuizen, Chijs; Popma, Arne; Bongers, Ilja L; van Boxtel, Geert J M

    2015-09-01

Neurofeedback has been proposed as a potentially effective intervention for reducing Attention Deficit Hyperactivity Disorder (ADHD) symptoms. However, it remains unclear whether neurofeedback is of additional value to treatment as usual (TAU) for adolescents with clinical ADHD symptoms. Using a multicenter parallel-randomized controlled trial design, adolescents with ADHD symptoms were randomized to receive either a combination of TAU and neurofeedback (NFB + TAU, n = 45) or TAU-only (n = 26). Randomization was computer generated and stratified for age group (ages 12 through 16, 16 through 20, 20 through 24). Neurofeedback treatment consisted of approximately 37 sessions of theta/sensorimotor rhythm (SMR) training on the vertex (Cz). Primary behavioral outcome measures included the ADHD rating scale, Youth Self Report, and Child Behavior Checklist, all assessed pre- and post-intervention. Behavioral problems decreased equally for both groups, with medium to large effect sizes (range of partial η2 = 0.08-0.31). Hence, the combination of neurofeedback and TAU was as effective as TAU-only for adolescents with ADHD symptoms. Considering the absence of additional behavioral effects in the current study, in combination with the limited knowledge of specific treatment effects, it is questionable whether theta/SMR neurofeedback should be used in clinical practice for adolescents with ADHD and comorbid disorders. Further research is warranted to investigate possible working mechanisms and (long-term) specific treatment effects of neurofeedback.

  7. Large scale structures in liquid crystal/clay colloids

    Science.gov (United States)

    van Duijneveldt, Jeroen S.; Klein, Susanne; Leach, Edward; Pizzey, Claire; Richardson, Robert M.

    2005-04-01

    Suspensions of three different clays in K15, a thermotropic liquid crystal, have been studied by optical microscopy and small angle x-ray scattering. The three clays were claytone AF, a surface treated natural montmorillonite, laponite RD, a synthetic hectorite, and mined sepiolite. The claytone and laponite were sterically stabilized whereas sepiolite formed a relatively stable suspension in K15 without any surface treatment. Micrographs of the different suspensions revealed that all three suspensions contained large scale structures. The nature of these aggregates was investigated using small angle x-ray scattering. For the clays with sheet-like particles, claytone and laponite, the flocs contain a mixture of stacked and single platelets. The basal spacing in the stacks was independent of particle concentration in the suspension and the phase of the solvent. The number of platelets in the stack and their percentage in the suspension varied with concentration and the aspect ratio of the platelets. The lath shaped sepiolite did not show any tendency to organize into ordered structures. Here the aggregates are networks of randomly oriented single rods.

  8. Large scale structures in liquid crystal/clay colloids

    International Nuclear Information System (INIS)

    Duijneveldt, Jeroen S van; Klein, Susanne; Leach, Edward; Pizzey, Claire; Richardson, Robert M

    2005-01-01

    Suspensions of three different clays in K15, a thermotropic liquid crystal, have been studied by optical microscopy and small angle x-ray scattering. The three clays were claytone AF, a surface treated natural montmorillonite, laponite RD, a synthetic hectorite, and mined sepiolite. The claytone and laponite were sterically stabilized whereas sepiolite formed a relatively stable suspension in K15 without any surface treatment. Micrographs of the different suspensions revealed that all three suspensions contained large scale structures. The nature of these aggregates was investigated using small angle x-ray scattering. For the clays with sheet-like particles, claytone and laponite, the flocs contain a mixture of stacked and single platelets. The basal spacing in the stacks was independent of particle concentration in the suspension and the phase of the solvent. The number of platelets in the stack and their percentage in the suspension varied with concentration and the aspect ratio of the platelets. The lath shaped sepiolite did not show any tendency to organize into ordered structures. Here the aggregates are networks of randomly oriented single rods

  9. CoDuSe group exercise programme improves balance and reduces falls in people with multiple sclerosis: A multi-centre, randomized, controlled pilot study.

    Science.gov (United States)

    Carling, Anna; Forsberg, Anette; Gunnarsson, Martin; Nilsagård, Ylva

    2017-09-01

Imbalance leading to falls is common in people with multiple sclerosis (PwMS). The aim was to evaluate the effects of a balance group exercise programme (CoDuSe) on balance and walking in PwMS (Expanded Disability Status Scale 4.0-7.5). This was a multi-centre, randomized, controlled, single-blinded pilot study with random allocation to early or late start of exercise, with the latter group serving as control group for the physical function measures. In total, 14 supervised 60-minute exercise sessions were delivered over 7 weeks. Pretest-posttest analyses were conducted for self-reported near falls and falls in the group starting late. The primary outcome was the Berg Balance Scale (BBS). A total of 51 participants were initially enrolled; three were lost to follow-up. Post-intervention, the exercise group showed statistically significant improvement (p = 0.015) in BBS and borderline significant improvement in the MS Walking Scale (p = 0.051), both with large effect sizes (3.66; -2.89). No other significant differences were found between groups. In the group starting late, the numbers of falls and near falls were statistically significantly reduced after exercise compared to before. The exercise programme improved balance and reduced perceived walking limitations, compared to no exercise, and reduced the frequency of falls and near falls.

  10. The effect of India's total sanitation campaign on defecation behaviors and child health in rural Madhya Pradesh: a cluster randomized controlled trial.

    Science.gov (United States)

    Patil, Sumeet R; Arnold, Benjamin F; Salvatore, Alicia L; Briceno, Bertha; Ganguly, Sandipan; Colford, John M; Gertler, Paul J

    2014-08-01

Poor sanitation is thought to be a major cause of enteric infections among young children. However, there are no previously published randomized trials measuring the health impacts of large-scale sanitation programs. India's Total Sanitation Campaign (TSC) is one such program that seeks to end the practice of open defecation by changing social norms and behaviors, and providing technical support and financial subsidies. The objective of this study was to measure the effect of the TSC, implemented with capacity building support from the World Bank's Water and Sanitation Program in Madhya Pradesh, on availability of individual household latrines (IHLs), defecation behaviors, and child health (diarrhea, highly credible gastrointestinal illness [HCGI], parasitic infections, anemia, growth). We conducted a cluster-randomized, controlled trial in 80 rural villages. Field staff collected baseline measures of sanitation conditions, behaviors, and child health (May-July 2009), and revisited households 21 months later (February-April 2011) after the program was delivered. The study enrolled a random sample of 5,209 children. The intervention increased the share of households per village with improved sanitation facilities, as defined by the WHO/UNICEF Joint Monitoring Programme, by an average of 19% (95% CI for difference: 12%-26%; group means: 22% control versus 41% intervention), and decreased open defecation among adults by an average of 10% (95% CI for difference: 4%-15%; group means: 73% intervention versus 84% control). However, the intervention did not improve child health measured in terms of multiple health outcomes (diarrhea, HCGI, helminth infections, anemia, growth). Limitations of the study included a relatively short follow-up period following implementation, evidence for contamination in ten of the 40 control villages, and possible bias in self-reported outcomes for diarrhea, HCGI, and open defecation behaviors. The intervention led to modest increases in availability of IHLs and even more modest reductions in open defecation.

  11. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

ERDC/CHL CHETN-I-88, April 2016. Approved for public release; distribution is unlimited. This technical note describes the Large-Scale Sediment Transport Facility (LSTF) and recent upgrades to the measurement systems, including pump flow meters, sediment trap weigh tanks, and beach profiling lidar. The purpose of these upgrades was to increase ... A detailed discussion of the original LSTF features and capabilities can be ...

  12. Bio-inspired wooden actuators for large scale applications.

    Directory of Open Access Journals (Sweden)

    Markus Rüggeberg

    Full Text Available Implementing programmable actuation into materials and structures is a major topic in the field of smart materials. In particular the bilayer principle has been employed to develop actuators that respond to various kinds of stimuli. A multitude of small scale applications down to micrometer size have been developed, but up-scaling remains challenging due to either limitations in mechanical stiffness of the material or in the manufacturing processes. Here, we demonstrate the actuation of wooden bilayers in response to changes in relative humidity, making use of the high material stiffness and a good machinability to reach large scale actuation and application. Amplitude and response time of the actuation were measured and can be predicted and controlled by adapting the geometry and the constitution of the bilayers. Field tests in full weathering conditions revealed long-term stability of the actuation. The potential of the concept is shown by a first demonstrator. With the sensor and actuator intrinsically incorporated in the wooden bilayers, the daily change in relative humidity is exploited for an autonomous and solar powered movement of a tracker for solar modules.

  13. A randomized placebo-controlled trial to evaluate a novel noninjectable anesthetic gel with thermosetting agent during scaling and root planing in chronic periodontitis patients.

    Science.gov (United States)

    Dayakar, M M; Akbar, S M

    2016-01-01

    To study the efficacy of a noninjectable anesthetic gel with a thermosetting agent in the reduction of pain during scaling and root planing (SRP) in untreated chronic periodontitis patients. This study is a randomized, double-masked, split-mouth, placebo-controlled trial. Thirty patients were enrolled who underwent SRP in a split-mouth (right side/left side) manner. Before commencement of SRP, both quadrants on each side were isolated and had a randomized gel (either placebo or test gel) placed in the periodontal pockets for 30 s. The pain was measured using numerical rating scale (NRS) and verbal rating scale (VRS). The median NRS pain score for the patients treated with the anesthetic test gel was 1 (range: 0-4) as opposed to 5 (range: 3-7) in the placebo treated patients. The mean rank of pain score using NRS in test gel was 16.18 as compared to 44.82 in placebo treated sites. Hence, significant reduction in pain was found in test gel as compared to placebo using NRS (P < 0.001). The VRS showed that the majority of patients reported no pain or mild pain with a median of 1 as compared to placebo treated sites with a median of 2 suggestive of moderate pain. The NRS and VRS pain scores showed that the side treated with anesthetic gel was statistically more effective than the placebo in reducing pain during SRP.
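The reported mean ranks (16.18 for the test gel versus 44.82 for placebo) come from a rank-based comparison of the two sides, a Wilcoxon/Mann-Whitney-style procedure. A small pure-Python sketch of how such mean ranks are computed, using invented NRS scores rather than the trial's data:

```python
# Hypothetical 0-10 NRS pain scores (lower = less pain); the values are
# invented for illustration, only the ranking procedure is the point.
test_gel = [0, 1, 1, 2, 2, 3, 4]
placebo = [3, 4, 5, 5, 6, 6, 7]

scores = [(v, "gel") for v in test_gel] + [(v, "placebo") for v in placebo]
scores.sort(key=lambda t: t[0])

# Assign average ranks to tied values, accumulating per-group rank sums.
rank_sum = {"gel": 0.0, "placebo": 0.0}
i = 0
while i < len(scores):
    j = i
    while j < len(scores) and scores[j][0] == scores[i][0]:
        j += 1
    avg_rank = (i + j + 1) / 2          # mean of 1-based ranks i+1 .. j
    for _, grp in scores[i:j]:
        rank_sum[grp] += avg_rank
    i = j

mean_rank_gel = rank_sum["gel"] / len(test_gel)
mean_rank_placebo = rank_sum["placebo"] / len(placebo)
print(mean_rank_gel < mean_rank_placebo)  # True: gel side ranks lower (less pain)
```

The Mann-Whitney U statistic and its p-value follow directly from these rank sums; libraries such as scipy expose the full test, but the mean-rank bookkeeping above is the part the abstract reports.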

  14. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso: a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, materials science, and to its usage in large-scale parallel computing.

  15. A permutation test to analyse systematic bias and random measurement errors of medical devices via boosting location and scale models.

    Science.gov (United States)

    Mayr, Andreas; Schmid, Matthias; Pfahlberg, Annette; Uter, Wolfgang; Gefeller, Olaf

    2017-06-01

    Measurement errors of medico-technical devices can be separated into systematic bias and random error. We propose a new method to address both simultaneously via generalized additive models for location, scale and shape (GAMLSS) in combination with permutation tests. More precisely, we extend a recently proposed boosting algorithm for GAMLSS to provide a test procedure to analyse potential device effects on the measurements. We carried out a large-scale simulation study to provide empirical evidence that our method is able to identify possible sources of systematic bias as well as random error under different conditions. Finally, we apply our approach to compare measurements of skin pigmentation from two different devices in an epidemiological study.
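The core idea, separating systematic bias (location) from random error (scale) and testing each by permutation, can be illustrated at a much smaller scale: shuffle the device labels and ask how often the observed difference in means or in variances arises by chance. A stdlib-only sketch with simulated measurements follows; the GAMLSS boosting machinery itself is far richer and is not reproduced here.

```python
import random
import statistics

# Simulated measurements (not from the cited study): device B reads
# 1.0 unit high (systematic bias) but has the same spread (random error).
random.seed(1)
device_a = [10.0 + random.gauss(0, 0.5) for _ in range(50)]
device_b = [11.0 + random.gauss(0, 0.5) for _ in range(50)]

def perm_pvalue(x, y, stat, n_perm=2000):
    """Permutation p-value for |stat(x) - stat(y)| under label shuffling."""
    observed = abs(stat(x) - stat(y))
    pooled = x + y
    hits = 0
    for _ in range(n_perm):
        random.shuffle(pooled)
        if abs(stat(pooled[:len(x)]) - stat(pooled[len(x):])) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

p_bias = perm_pvalue(device_a, device_b, statistics.mean)       # location
p_noise = perm_pvalue(device_a, device_b, statistics.variance)  # scale
print(p_bias < 0.05)  # True: the systematic bias is detected
# p_noise is typically large here: the devices differ in location, not spread
```

Replacing `statistics.mean` and `statistics.variance` with fitted GAMLSS location and scale submodels gives the flavor of the proposed procedure.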

  16. Challenges and options for large scale integration of wind power

    International Nuclear Information System (INIS)

    Tande, John Olav Giaever

    2006-01-01

Challenges and options for large-scale integration of wind power are examined. Immediate challenges are related to weak grids. Assessment of system stability requires numerical simulation; models are being developed, and validation is essential. Coordination of wind and hydro generation is key to allowing more wind power capacity in areas with limited transmission corridors. For the case-study grid, depending on technology and control, the allowed wind farm size is increased from 50 to 200 MW. The real-life example from 8 January 2005 demonstrates that existing market-based mechanisms can handle large amounts of wind power. In wind integration studies it is essential to take account of the controllability of modern wind farms, the flexibility of the power system and the smoothing effect of geographically dispersed wind farms. Modern wind farms contribute to system adequacy; combining wind and hydro constitutes a win-win system. (ml)

  17. Six-month exercise training program to treat post-thrombotic syndrome: a randomized controlled two-centre trial

    Science.gov (United States)

    Kahn, Susan R.; Shrier, Ian; Shapiro, Stan; Houweling, Adrielle H.; Hirsch, Andrew M.; Reid, Robert D.; Kearon, Clive; Rabhi, Khalil; Rodger, Marc A.; Kovacs, Michael J.; Anderson, David R.; Wells, Philip S.

    2011-01-01

Background Exercise training may have the potential to improve post-thrombotic syndrome, a frequent, chronic complication of deep venous thrombosis. We conducted a randomized controlled two-centre pilot trial to assess the feasibility of a multicentre-based evaluation of a six-month exercise training program to treat post-thrombotic syndrome and to obtain preliminary data on the effectiveness of such a program. Methods Patients were randomized to receive exercise training (a six-month trainer-supervised program) or control treatment (an education session with monthly phone follow-ups). Levels of eligibility, consent, adherence and retention were used as indicators of study feasibility. Primary outcomes were change from baseline to six months in venous disease-specific quality of life (as measured using the Venous Insufficiency Epidemiological and Economic Study Quality of Life [VEINES-QOL] questionnaire) and severity of post-thrombotic syndrome (as measured by scores on the Villalta scale) in the exercise training group versus the control group, assessed by t tests. Secondary outcomes were change in generic quality of life (as measured using the Short-Form Health Survey-36 [SF-36] questionnaire), category of severity of post-thrombotic syndrome, leg strength, leg flexibility and time on treadmill. Results Of 95 patients with post-thrombotic syndrome, 69 were eligible, 43 consented and were randomized, and 39 completed the study. Exercise training was associated with improvement in VEINES-QOL scores (exercise training mean change 6.0, standard deviation [SD] 5.1 v. control mean change 1.4, SD 7.2; difference 4.6, 95% CI 0.54 to 8.7; p = 0.027) and improvement in scores on the Villalta scale (exercise training mean change −3.6, SD 3.7 v. control mean change −1.6, SD 4.3; difference −2.0, 95% CI −4.6 to 0.6; p = 0.14). Most secondary outcomes also showed greater improvement in the exercise training group. Interpretation Exercise training may improve post-thrombotic syndrome.

  18. Sham-controlled, randomized, feasibility trial of acupuncture for prevention of radiation-induced xerostomia among patients with nasopharyngeal carcinoma

    Science.gov (United States)

    Meng, Zhiqiang; Garcia, M. Kay; Hu, Chaosu; Chiang, Joseph; Chambers, Mark; Rosenthal, David I.; Peng, Huiting; Wu, Caijun; Zhao, Qi; Zhao, Genming; Liu, Luming; Spelman, Amy; Palmer, J. Lynn; Wei, Qi; Cohen, Lorenzo

    2013-01-01

Background Xerostomia (dry mouth) after head/neck radiation is a common problem among cancer patients. Quality of life (QOL) is impaired, and available treatments are of little benefit. This trial determined the feasibility of conducting a sham-controlled trial of acupuncture and whether acupuncture could prevent xerostomia among head/neck patients undergoing radiotherapy. Methods A sham-controlled feasibility trial was conducted at Fudan University Shanghai Cancer Center, Shanghai, China among patients with nasopharyngeal carcinoma undergoing radiotherapy. To determine feasibility of a sham procedure, 23 patients were randomized to real acupuncture (N = 11) or to sham acupuncture (N = 12). Patients were treated 3 times/week during their course of radiotherapy. Subjective measures were the Xerostomia Questionnaire (XQ) and MD Anderson Symptom Inventory for Head and Neck Cancer (MDASI-HN). Objective measures were unstimulated whole salivary flow rates (UWSFR) and stimulated salivary flow rates (SSFR). Patients were followed for 1 month after radiotherapy. Results XQ scores for acupuncture were significantly lower than for sham controls starting in week 3 and lasting through the 1-month follow-up. Acupuncture reduced xerostomia symptoms and improved QOL when compared with sham acupuncture. Large-scale, multi-center, randomized, placebo-controlled trials are now needed. PMID:22285177

  19. Large-scale electrophysiology: acquisition, compression, encryption, and storage of big data.

    Science.gov (United States)

    Brinkmann, Benjamin H; Bower, Mark R; Stengel, Keith A; Worrell, Gregory A; Stead, Matt

    2009-05-30

    The use of large-scale electrophysiology to obtain high spatiotemporal resolution brain recordings (>100 channels) capable of probing the range of neural activity from local field potential oscillations to single-neuron action potentials presents new challenges for data acquisition, storage, and analysis. Our group is currently performing continuous, long-term electrophysiological recordings in human subjects undergoing evaluation for epilepsy surgery using hybrid intracranial electrodes composed of up to 320 micro- and clinical macroelectrode arrays. DC-capable amplifiers, sampling at 32kHz per channel with 18-bits of A/D resolution are capable of resolving extracellular voltages spanning single-neuron action potentials, high frequency oscillations, and high amplitude ultra-slow activity, but this approach generates 3 terabytes of data per day (at 4 bytes per sample) using current data formats. Data compression can provide several practical benefits, but only if data can be compressed and appended to files in real-time in a format that allows random access to data segments of varying size. Here we describe a state-of-the-art, scalable, electrophysiology platform designed for acquisition, compression, encryption, and storage of large-scale data. Data are stored in a file format that incorporates lossless data compression using range-encoded differences, a 32-bit cyclically redundant checksum to ensure data integrity, and 128-bit encryption for protection of patient information.
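The described pipeline (difference the signal, compress losslessly, checksum the block) can be sketched in a few lines of stdlib Python. Note the hedge: the actual format uses range-encoded differences; here `zlib` stands in as a generic lossless coder, and the simulated signal is merely a random walk, so this is an analogue of the idea, not the real file format.

```python
import random
import struct
import zlib

# Simulate a slowly varying extracellular-style trace as a random walk:
# neighbouring samples are correlated, so first differences are tiny.
random.seed(0)
signal, acc = [], 0
for _ in range(10_000):
    acc += random.randint(-5, 5)
    signal.append(acc)

# Encode path: first-order differences, lossless compression, 32-bit CRC.
deltas = [signal[0]] + [b - a for a, b in zip(signal, signal[1:])]
raw = struct.pack(f"<{len(deltas)}i", *deltas)   # 4 bytes per sample
block = zlib.compress(raw, level=9)              # stands in for range coding
crc = zlib.crc32(block)                          # integrity checksum

# Decode path: verify integrity, decompress, undo the differencing.
assert zlib.crc32(block) == crc
restored, acc = [], 0
for d in struct.unpack(f"<{len(deltas)}i", zlib.decompress(block)):
    acc += d
    restored.append(acc)
assert restored == signal                        # lossless round trip
print(len(block) < len(raw))  # True: deltas compress well below raw size
```

Storing per-block CRCs (rather than one per file) is what makes random access to segments of varying size practical, since each block can be validated and decoded independently.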

  20. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

Full Text Available Background Large-scale molecular evolutionary analyses of protein-coding sequences require a number of preparatory, inter-related steps, from finding gene families to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can represent significant challenges, particularly when working with entire proteomes (all protein-coding sequences in a genome) from a large number of species. Methods We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results We have benchmarked VESPA and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  1. The use of production management techniques in the construction of large scale physics detectors

    CERN Document Server

    Bazan, A; Estrella, F; Kovács, Z; Le Flour, T; Le Goff, J M; Lieunard, S; McClatchey, R; Murray, S; Varga, L Z; Vialle, J P; Zsenei, M

    1999-01-01

The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. With similar problems in industry, engineers employ so-called Product Data Management (PDM) systems to control access to documented versions of designs and managers employ so-called Workflow Management software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an information System for Tracking Assembly Lifecycles) in use in CMS which successfully integrates PDM and WfMS techniques in managing large scale physics detector ...

  2. Randomized controlled trial of a patient decision-making aid for orthodontics.

    Science.gov (United States)

    Parker, Kate; Cunningham, Susan J; Petrie, Aviva; Ryan, Fiona S

    2017-08-01

    Patient decision-making aids (PDAs) are instruments that facilitate shared decision making and enable patients to reach informed, individual decisions regarding health care. The objective of this study was to assess the efficacy of a PDA compared with traditional information provision for adolescent patients considering fixed appliance orthodontic treatment. Before treatment, orthodontic patients were randomly allocated into 2 groups: the intervention group received the PDA and standard information regarding fixed appliances, and the control group received the standard information only. Decisional conflict was measured using the Decisional Conflict Scale, and the levels of decisional conflict were compared between the 2 groups. Seventy-two patients were recruited and randomized in a ratio of 1:1 to the PDA and control groups. Seventy-one patients completed the trial (control group, 36; PDA group, 35); this satisfied the sample size calculation. The median total Decisional Conflict Scale score in the PDA group was lower than in the control group (15.63 and 19.53, respectively). However, this difference was not statistically significant (difference between groups, 3.90; 95% confidence interval of the difference, -4.30 to 12.11). Sex, ethnicity, age, and the time point at which patients were recruited did not have significant effects on Decisional Conflict Scale scores. No harm was observed or reported for any participant in the study. The results of this study showed that the provision of a PDA to adolescents before they consented for fixed appliances did not significantly reduce decisional conflict. There may be a benefit in providing a PDA for some patients, but it is not yet possible to say how these patients could be identified. This trial was registered with the Harrow National Research Ethics Committee (reference 12/LO/0279). The protocol was not published before trial commencement. Copyright © 2017. Published by Elsevier Inc.

  3. A randomized controlled trial of single point acupuncture in primary dysmenorrhea.

    Science.gov (United States)

    Liu, Cun-Zhi; Xie, Jie-Ping; Wang, Lin-Peng; Liu, Yu-Qi; Song, Jia-Shan; Chen, Yin-Ying; Shi, Guang-Xia; Zhou, Wei; Gao, Shu-Zhong; Li, Shi-Liang; Xing, Jian-Min; Ma, Liang-Xiao; Wang, Yan-Xia; Zhu, Jiang; Liu, Jian-Ping

    2014-06-01

Acupuncture is often used for primary dysmenorrhea, but convincing evidence is lacking owing to the low methodological quality of previous studies. We aimed to assess the immediate effect of acupuncture at a specific acupoint compared with an unrelated acupoint and a nonacupoint on primary dysmenorrhea. The Acupuncture Analgesia Effect in Primary Dysmenorrhoea-II study is a multicenter controlled trial conducted in six large hospitals in China. Patients who met the inclusion criteria were randomly assigned to the classic acupoint (N = 167), unrelated acupoint (N = 167), or nonacupoint (N = 167) group on a 1:1:1 basis. They received three sessions of electroacupuncture at a classic acupoint (Sanyinjiao, SP6), an unrelated acupoint (Xuanzhong, GB39), or a nonacupoint location, respectively. The primary outcome was subjective pain as measured by a 100-mm visual analog scale (VAS). Measurements were obtained at 0, 5, 10, 30, and 60 minutes following the first intervention. In addition, patients scored changes in general complaints using the Cox retrospective symptom scales (RSS-Cox) and a 7-point verbal rating scale (VRS) during three menstrual cycles. Secondary outcomes included VAS score for average pain, total pain time, additional in-bed time, and the proportion of participants using analgesics during three menstrual cycles. Five hundred and one people underwent random assignment. The primary comparison of VAS scores following the first intervention demonstrated that the classic acupoint group was more effective than both the unrelated acupoint (-4.0 mm, 95% CI -7.1 to -0.9, P = 0.010) and nonacupoint (-4.0 mm, 95% CI -7.0 to -0.9, P = 0.012) groups. However, no significant differences were detected among the three acupuncture groups for RSS-Cox or VRS outcomes. The per-protocol analysis showed a similar pattern. No serious adverse events were noted. Specific acupoint acupuncture produced a statistically, but not clinically, significant effect compared with unrelated acupoint and nonacupoint acupuncture in patients with primary dysmenorrhea.

  4. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

Full Text Available One of the best ways for agriculture to become independent of precipitation shortages is irrigation. In the seventies and eighties of the last century a number of large-scale sprinkler systems were built in Wielkopolska. At the end of the 1970s, 67 sprinkler systems with a total area of 6400 ha were installed in the Poznan province; the average size of a system reached 95 ha. In 1989 there were 98 systems, covering more than 10 130 ha. The study was conducted on 7 large sprinkler systems with areas ranging from 230 to 520 hectares over the years 1986-1998. After the introduction of the market economy in the early 1990s and the ownership changes in agriculture, the large-scale sprinkler systems underwent significant or total devastation. Land of the State Farms, held by the State Agricultural Property Agency, was leased or sold, and the new owners used the existing sprinklers only to a very small extent. This was accompanied by changes in crop structure and demand structure and by an increase in operating costs, including a threefold increase in electricity prices. In practice, the operation of large-scale irrigation encountered barriers of all kinds: limitations of the system solutions, supply difficulties, and high levels of equipment failure, none of which encouraged rational use of the available sprinklers. A survey of the local area documented the current status of the remaining irrigation infrastructure. The scheme adopted for the restructuring of Polish agriculture was not the best solution, causing massive destruction of the assets previously invested in the sprinkler systems.

  5. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

    structure and chemical purity of 99⋅1% by inductively coupled plasma optical emission spectroscopy on a large scale. Keywords. Sol–gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method. 1. Introduction. Zirconia has attracted the attention of many scientists because of its tremendous thermal, mechanical ...

  6. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the potential risk to the public and property from accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data are much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.

  7. Large Neighborhood Search and Adaptive Randomized Decompositions for Flexible Jobshop Scheduling

    DEFF Research Database (Denmark)

    Pacino, Dario; Van Hentenryck, Pascal

    2011-01-01

This paper considers a constraint-based scheduling approach to the flexible jobshop, a generalization of the traditional jobshop scheduling problem in which activities have a choice of machines. It studies both large neighborhood search (LNS) and adaptive randomized decomposition (ARD) schemes, using random...
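LNS works by repeatedly destroying part of an incumbent solution and reoptimizing the freed part. As a minimal sketch of that loop on a toy machine-assignment problem (not the paper's constraint-based flexible-jobshop model; the destroy/repair operators and data here are illustrative assumptions):

```python
import random

def makespan(assign, durations, n_machines):
    # Completion time of the most loaded machine.
    loads = [0.0] * n_machines
    for job, m in enumerate(assign):
        loads[m] += durations[job]
    return max(loads)

def greedy_repair(removed, assign, durations, n_machines):
    # Repair: reinsert freed jobs, longest first, onto the least-loaded machine.
    loads = [0.0] * n_machines
    for job, m in enumerate(assign):
        if m is not None:
            loads[m] += durations[job]
    for job in sorted(removed, key=lambda j: -durations[j]):
        m = loads.index(min(loads))
        assign[job] = m
        loads[m] += durations[job]
    return assign

def lns(durations, n_machines, iters=200, destroy=3, seed=0):
    rng = random.Random(seed)
    n = len(durations)
    best = [rng.randrange(n_machines) for _ in range(n)]
    best_cost = makespan(best, durations, n_machines)
    for _ in range(iters):
        cand = best[:]
        removed = rng.sample(range(n), destroy)   # destroy: free a few jobs
        for j in removed:
            cand[j] = None
        cand = greedy_repair(removed, cand, durations, n_machines)
        cost = makespan(cand, durations, n_machines)
        if cost <= best_cost:                     # accept equal-or-better candidates
            best, best_cost = cand, cost
    return best, best_cost

durations = [4, 3, 2, 3, 2, 2]        # toy processing times
best, cost = lns(durations, n_machines=2)
```

The "adaptive" part of ARD would additionally learn which decompositions pay off; this sketch uses a fixed random destroy for brevity.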

  8. Self-* and Adaptive Mechanisms for Large Scale Distributed Systems

    Science.gov (United States)

    Fragopoulou, P.; Mastroianni, C.; Montero, R.; Andrjezak, A.; Kondo, D.

    Large-scale distributed computing systems and infrastructure, such as Grids, P2P systems and desktop Grid platforms, are decentralized, pervasive, and composed of a large number of autonomous entities. The complexity of these systems is such that human administration is nearly impossible and centralized or hierarchical control is highly inefficient. These systems need to run on highly dynamic environments, where content, network topologies and workloads are continuously changing. Moreover, they are characterized by the high degree of volatility of their components and the need to provide efficient service management and to handle efficiently large amounts of data. This paper describes some of the areas for which adaptation emerges as a key feature, namely, the management of computational Grids, the self-management of desktop Grid platforms and the monitoring and healing of complex applications. It also elaborates on the use of bio-inspired algorithms to achieve self-management. Related future trends and challenges are described.

  9. Electroacupuncture treatment for pancreatic cancer pain: a randomized controlled trial.

    Science.gov (United States)

    Chen, Hao; Liu, Tang-Yi; Kuai, Le; Zhu, Ji; Wu, Cai-Jun; Liu, Lu-Ming

    2013-01-01

Pancreatic cancer is often accompanied by severe abdominal or back pain. This is the first study to evaluate the analgesic effect of electroacupuncture on pancreatic cancer pain. A randomized controlled trial compared electroacupuncture with control acupuncture using a placebo needle. Sixty patients with pancreatic cancer pain were randomly assigned to the electroacupuncture group (n = 30) or the placebo control group (n = 30). Patients were treated on Jiaji (Ex-B2) points T8-T12 bilaterally for 30 min once a day for 3 days. Pain intensity was assessed with a numerical rating scale (NRS) before the treatment (baseline), after 3 treatments, and at 2 days' follow-up. Baseline characteristics were similar in the two groups. After 3 treatments, pain intensity on the NRS decreased compared with baseline (-1.67, 95% confidence interval [CI] -1.46 to -1.87) in the electroacupuncture group; there was little change (-0.13, 95% CI 0.08 to -0.35) in the control group; the difference between the two groups was statistically significant, with a greater decrease in the electroacupuncture group than in the control group. Electroacupuncture was an effective treatment for relieving pancreatic cancer pain. Copyright © 2013 IAP and EPC. Published by Elsevier B.V. All rights reserved.

  10. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
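The record describes evaluating multiple user-defined criteria to rank candidate sites. A minimal weighted-sum sketch of that idea (not NREL's GIS tool or its algorithm; the site names, criteria, and weights below are hypothetical):

```python
def score_sites(sites, weights):
    """Weighted-sum score per site, after min-max normalizing each criterion."""
    names = list(weights)
    lo = {c: min(s[c] for s in sites.values()) for c in names}
    hi = {c: max(s[c] for s in sites.values()) for c in names}
    def norm(c, v):
        return 0.0 if hi[c] == lo[c] else (v - lo[c]) / (hi[c] - lo[c])
    return {name: sum(weights[c] * norm(c, vals[c]) for c in names)
            for name, vals in sites.items()}

# Hypothetical sites and criteria; values are scaled so higher is better.
sites = {
    "A": {"solar_resource": 7.2, "grid_proximity": 0.9, "low_slope": 0.8},
    "B": {"solar_resource": 6.5, "grid_proximity": 0.4, "low_slope": 1.0},
    "C": {"solar_resource": 8.5, "grid_proximity": 0.7, "low_slope": 0.6},
}
weights = {"solar_resource": 0.5, "grid_proximity": 0.3, "low_slope": 0.2}
scores = score_sites(sites, weights)
best = max(scores, key=scores.get)   # → "C" for these weights
```

Letting stakeholders supply their own weights is what makes such a tool "user-driven": the same site data can yield different rankings under different priorities.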

  11. Utility of the Conners' Adult ADHD Rating Scale validity scales in identifying simulated attention-deficit hyperactivity disorder and random responding.

    Science.gov (United States)

    Walls, Brittany D; Wallace, Elizabeth R; Brothers, Stacey L; Berry, David T R

    2017-12-01

    Recent concern about malingered self-report of symptoms of attention-deficit hyperactivity disorder (ADHD) in college students has resulted in an urgent need for scales that can detect feigning of this disorder. The present study provided further validation data for a recently developed validity scale for the Conners' Adult ADHD Rating Scale (CAARS), the CAARS Infrequency Index (CII), as well as for the Inconsistency Index (INC). The sample included 139 undergraduate students: 21 individuals with diagnoses of ADHD, 29 individuals responding honestly, 54 individuals responding randomly (full or half), and 35 individuals instructed to feign. Overall, the INC showed moderate sensitivity to random responding (.44-.63) and fairly high specificity to ADHD (.86-.91). The CII demonstrated modest sensitivity to feigning (.31-.46) and excellent specificity to ADHD (.91-.95). Sequential application of validity scales had correct classification rates of honest (93.1%), ADHD (81.0%), feigning (57.1%), half random (42.3%), and full random (92.9%). The present study suggests that the CII is modestly sensitive (true positive rate) to feigned ADHD symptoms, and highly specific (true negative rate) to ADHD. Additionally, this study highlights the utility of applying the CAARS validity scales in a sequential manner for identifying feigning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
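The "sequential application of validity scales" the record highlights amounts to checking consistency before infrequency. A schematic sketch of that decision flow (the cutoff values are illustrative placeholders, not published CAARS cutoffs):

```python
def screen_profile(inc_score, cii_score, inc_cutoff=8, cii_cutoff=21):
    """Sequential screen: response consistency first, then symptom infrequency.

    Cutoffs here are hypothetical placeholders, NOT the CAARS manual's values.
    """
    if inc_score >= inc_cutoff:      # inconsistent -> likely random responding
        return "possible random responding"
    if cii_score >= cii_cutoff:      # too many rare symptoms -> possible feigning
        return "possible feigning"
    return "interpretable"           # neither flag raised

result = screen_profile(inc_score=3, cii_score=5)   # → "interpretable"
```

Ordering matters: a randomly completed protocol can elevate any scale by chance, so screening out inconsistency first keeps the feigning index from being applied to uninterpretable profiles.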

  12. A randomized controlled trial evaluating a brief parenting program with children with autism spectrum disorders.

    Science.gov (United States)

    Tellegen, Cassandra L; Sanders, Matthew R

    2014-12-01

This randomized controlled trial evaluated the efficacy of Primary Care Stepping Stones Triple P, a brief individualized parenting program, in a sample of parents of children with autism spectrum disorder (ASD). Sixty-four parents of children aged 2-9 years (M = 5.67, SD = 2.14) with an ASD diagnosis participated in the study. Eighty-six percent of children were male, and 89% of parents identified their child's ethnicity as Australian/White. Families were randomly assigned to 1 of 2 conditions (intervention or care-as-usual) and were assessed at 3 time points (preintervention, postintervention, and 6-month follow-up). Parents completed a range of questionnaires to assess changes in child behavior (Eyberg Child Behavior Inventory) and parent outcomes (Parenting Scale, Depression Anxiety Stress Scale-21, Parent Problem Checklist, Relationship Quality Inventory, Parental Stress Scale), and took part in 30-min home observations of parent-child interactions. Relative to the care-as-usual group, significant short-term improvements were found in the intervention group in parent-reported child behavior problems, dysfunctional parenting styles, parenting confidence, parental stress, parental conflict, and relationship happiness. No significant intervention effects were found on levels of parental depression or anxiety, or on observed child disruptive and parent aversive behavior. The effect sizes for significant variables ranged from medium to large. Short-term effects were predominantly maintained at 6-month follow-up, and parents reported high levels of goal achievement and satisfaction with the program. The results indicate that a brief low-intensity version of Stepping Stones Triple P is an efficacious intervention for parents of children with ASD.

  13. Network Partitioning Domain Knowledge Multiobjective Application Mapping for Large-Scale Network-on-Chip

    Directory of Open Access Journals (Sweden)

    Yin Zhen Tei

    2014-01-01

    Full Text Available This paper proposes a multiobjective application mapping technique targeted for large-scale network-on-chip (NoC. As the number of intellectual property (IP cores in multiprocessor system-on-chip (MPSoC increases, NoC application mapping to find optimum core-to-topology mapping becomes more challenging. Besides, the conflicting cost and performance trade-off makes multiobjective application mapping techniques even more complex. This paper proposes an application mapping technique that incorporates domain knowledge into genetic algorithm (GA. The initial population of GA is initialized with network partitioning (NP while the crossover operator is guided with knowledge on communication demands. NP reduces the large-scale application mapping complexity and provides GA with a potential mapping search space. The proposed genetic operator is compared with state-of-the-art genetic operators in terms of solution quality. In this work, multiobjective optimization of energy and thermal-balance is considered. Through simulation, knowledge-based initial mapping shows significant improvement in Pareto front compared to random initial mapping that is widely used. The proposed knowledge-based crossover also shows better Pareto front compared to state-of-the-art knowledge-based crossover.
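The record's core idea, seeding a genetic algorithm's initial population with domain knowledge instead of pure random mappings, can be sketched with a toy permutation GA. This is an illustrative minimum, not the paper's network-partitioning initialization or its knowledge-guided crossover; the 2x2 mesh, traffic volumes, and operators are assumptions:

```python
import random

def comm_cost(mapping, traffic, dist):
    # Hop-weighted communication volume for a core -> tile mapping.
    return sum(vol * dist[mapping[a]][mapping[b]]
               for (a, b), vol in traffic.items())

def ga_map(n_tiles, traffic, dist, seed_pop=None, gens=100, pop_size=30, seed=0):
    rng = random.Random(seed)
    seed_pop = seed_pop or []
    def random_perm():
        p = list(range(n_tiles))
        rng.shuffle(p)
        return p
    pop = seed_pop + [random_perm() for _ in range(pop_size - len(seed_pop))]
    for _ in range(gens):
        pop.sort(key=lambda m: comm_cost(m, traffic, dist))
        parents = pop[:pop_size // 2]            # elitism keeps good mappings
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_tiles)      # simplified order crossover
            child = a[:cut] + [g for g in b if g not in a[:cut]]
            if rng.random() < 0.2:               # swap mutation
                i, j = rng.sample(range(n_tiles), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = parents + children
    return min(pop, key=lambda m: comm_cost(m, traffic, dist))

# 2x2 mesh: tile t sits at (t % 2, t // 2); Manhattan hop distance.
coords = [(t % 2, t // 2) for t in range(4)]
dist = [[abs(x1 - x2) + abs(y1 - y2) for x2, y2 in coords] for x1, y1 in coords]
traffic = {(0, 1): 10, (2, 3): 10, (0, 2): 1}    # hypothetical volumes
seeded = [[0, 1, 2, 3]]   # "domain knowledge": heavy communicators on adjacent tiles
best = ga_map(4, traffic, dist, seed_pop=seeded)
```

Seeding plays the role the paper assigns to network partitioning: it hands the GA a region of the search space where heavily communicating cores are already close, so the search refines rather than rediscovers good mappings.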

  14. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine large-scale agricultural land acquisitions in nine West African countries -Burkina Faso, Guinea-Bissau, Guinea, Benin, Mali, Togo, Senegal, Niger, and Côte d'Ivoire. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  15. Output regulation of large-scale hydraulic networks with minimal steady state power consumption

    NARCIS (Netherlands)

    Jensen, Tom Nørgaard; Wisniewski, Rafał; De Persis, Claudio; Kallesøe, Carsten Skovmose

    2014-01-01

An industrial case study involving a large-scale hydraulic network is examined. The hydraulic network underlies a district heating system, with an arbitrary number of end-users. The problem of output regulation is addressed along with an optimization criterion for the control. The fact that the

  16. Effectiveness of aerobic gymnastic exercise on stress, fatigue, and sleep quality during postpartum: A pilot randomized controlled trial.

    Science.gov (United States)

    Yang, Chiu-Ling; Chen, Chung-Hey

    2018-01-01

Gymnastics is a safe exercise for postnatal women when performed regularly. The aim of this pilot randomized controlled trial was to determine whether aerobic gymnastic exercise improves stress, fatigue, sleep quality and depression in postpartum women. Single-blinded, randomized controlled trial held from December 2014 until September 2015. Postnatal clinic of a medical center in southern Taiwan. 140 eligible postnatal women were systematically assigned, with a random start, to an experimental (n=70) or a control (n=70) group. Participants engaged in aerobic gymnastic exercise at least three times a week (15 min per session) for three months, guided by a compact disc at home. Perceived Stress Scale, Postpartum Fatigue Scale, Postpartum Sleep Quality Scale, and Edinburgh Postnatal Depression Scale. In a two-way ANOVA with repeated measures, the aerobic gymnastic exercise group showed a significant decrease in fatigue after practicing the exercise for 4 weeks, and the positive effects extended to the 12-week posttests. Paired t-tests revealed that aerobic gymnastic exercise participants had improved significantly in perceived stress and fatigue after 4 weeks of gymnastic exercise; these positive effects extended to the 12-week posttests. In addition, the changes in physical-symptom-related sleep inefficiency after 12 weeks of gymnastic exercise were significantly decreased in the experimental group compared with the control group. The findings can be used to encourage postnatal women to perform moderate-intensity gymnastic exercise in their daily life to reduce their stress and fatigue and improve sleep quality. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Internet treatment for depression: a randomized controlled trial comparing clinician vs. technician assistance.

    Science.gov (United States)

    Titov, Nickolai; Andrews, Gavin; Davies, Matthew; McIntyre, Karen; Robinson, Emma; Solley, Karen

    2010-06-08

Internet-based cognitive behavioural therapy (iCBT) for depression is effective when guided by a clinician, less so if unguided. Would guidance from a technician be as effective as guidance from a clinician? Randomized controlled non-inferiority trial comparing three groups: clinician-assisted vs. technician-assisted vs. delayed treatment. Community-based volunteers applied to the VirtualClinic (www.virtualclinic.org.au) research program, and 141 participants with major depressive disorder were randomized. Participants in the clinician- and technician-assisted groups received access to an iCBT program for depression comprising 6 online lessons, weekly homework assignments, and weekly supportive contact over a treatment period of 8 weeks. Participants in the clinician-assisted group also received access to a moderated online discussion forum. The main outcome measures were the Beck Depression Inventory (BDI-II) and the Patient Health Questionnaire-9 Item (PHQ-9). Completion rates were high, and at post-treatment both treatment groups had reduced scores on the BDI-II; effect sizes on the PHQ-9 were 1.54 and 1.60 for the clinician- and technician-assisted groups, respectively. At 4-month follow-up, participants in the technician group had made further improvements and had significantly lower scores on the PHQ-9 than those in the clinician group. A total of approximately 60 minutes of clinician or technician time was required per participant during the 8-week treatment program. Both clinician- and technician-assisted treatment resulted in large effect sizes and clinically significant improvements comparable to those associated with face-to-face treatment, while a delayed treatment control group did not improve. These results provide support for large-scale trials to determine the clinical effectiveness and acceptability of technician-assisted iCBT programs for depression. This form of treatment has potential to increase the capacity of existing mental health services. Australian New Zealand Clinical Trials Registry.

  18. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ∼5000 km s -1 (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader into this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)

  19. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.

    2013-01-01

    Soil moisture is an essential climate variable influencing land atmosphere interactions, an essential hydrologic variable impacting rainfall runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  20. Affect-focused psychodynamic psychotherapy for depression and anxiety through the Internet: a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Robert Johansson

    2013-07-01

Full Text Available Background. Psychodynamic psychotherapy is a psychological treatment approach that has a growing empirical base. Research has indicated an association between therapist-facilitated affective experience and outcome in psychodynamic therapy. Affect-phobia therapy (APT), as outlined by McCullough et al., is a psychodynamic treatment that emphasizes a strong focus on the expression and experience of affect. This model has been evaluated for neither depression nor anxiety disorders in a randomized controlled trial. While Internet-delivered psychodynamic treatments for depression and generalized anxiety disorder exist, they have not been based on APT. The aim of this randomized controlled trial was to investigate the efficacy of an Internet-based, psychodynamic, guided self-help treatment based on APT for depression and anxiety disorders. Methods. One hundred participants with diagnoses of mood and anxiety disorders participated in a randomized (1:1 ratio) controlled trial of an active group versus a control condition. The treatment group received a 10-week, psychodynamic, guided self-help treatment based on APT that was delivered through the Internet. The treatment consisted of eight text-based treatment modules and included therapist contact (9.5 min per client per week, on average) in a secure online environment. Participants in the control group also received online therapist support and clinical monitoring of symptoms, but received no treatment modules. Outcome measures were the 9-item Patient Health Questionnaire Depression Scale (PHQ-9) and the 7-item Generalized Anxiety Disorder Scale (GAD-7). Process measures were also included. All measures were administered weekly during the treatment period and at a 7-month follow-up. Results. Mixed models analyses using the full intention-to-treat sample revealed significant interaction effects of group and time on all outcome measures, when comparing treatment to the control group.
A large between-group effect size

  1. Effectiveness of Wii-based rehabilitation in stroke: A randomized controlled study.

    Science.gov (United States)

    Karasu, Ayça Utkan; Batur, Elif Balevi; Karataş, Gülçin Kaymak

    2018-05-08

To investigate the efficacy of Nintendo Wii Fit®-based balance rehabilitation as an adjunctive therapy to conventional rehabilitation in stroke patients. During the study period, 70 stroke patients were evaluated. Of these, 23 who met the study criteria were randomly assigned to either the experimental group (n = 12) or the control group (n = 11) by block randomization. Primary outcome measures were Berg Balance Scale, Functional Reach Test, Postural Assessment Scale for Stroke Patients, Timed Up and Go Test and Static Balance Index. Secondary outcome measures were postural sway, as assessed with Emed-X, Functional Independence Measure Transfer and Ambulation Scores. An evaluator who was blinded to the groups made assessments immediately before (baseline), immediately after (post-treatment), and 4 weeks after completion of the study (follow-up). Group-time interaction was significant in the Berg Balance Scale, Functional Reach Test, anteroposterior and mediolateral centre of pressure displacement with eyes open, anteroposterior centre of pressure displacement with eyes closed, centre of pressure displacement during weight shifting to affected side, to unaffected side and total centre of pressure displacement during weight shifting. Demonstrating significant group-time interaction in those parameters suggests that, while both groups exhibited significant improvement, the experimental group showed greater improvement than the control group. Virtual reality exercises with the Nintendo Wii system could represent a useful adjunctive therapy to traditional treatment to improve static and dynamic balance in stroke patients.

  2. Effectiveness of Wii-based rehabilitation in stroke: A randomized controlled study

    Directory of Open Access Journals (Sweden)

    Ayça Utkan Karasu

    2018-03-01

Full Text Available Objective: To investigate the efficacy of Nintendo Wii Fit®-based balance rehabilitation as an adjunctive therapy to conventional rehabilitation in stroke patients. Methods: During the study period, 70 stroke patients were evaluated. Of these, 23 who met the study criteria were randomly assigned to either the experimental group (n = 12) or the control group (n = 11) by block randomization. Primary outcome measures were Berg Balance Scale, Functional Reach Test, Postural Assessment Scale for Stroke Patients, Timed Up and Go Test and Static Balance Index. Secondary outcome measures were postural sway, as assessed with Emed-X, Functional Independence Measure Transfer and Ambulation Scores. An evaluator who was blinded to the groups made assessments immediately before (baseline), immediately after (post-treatment), and 4 weeks after completion of the study (follow-up). Results: Group-time interaction was significant in the Berg Balance Scale, Functional Reach Test, anteroposterior and mediolateral centre of pressure displacement with eyes open, anteroposterior centre of pressure displacement with eyes closed, centre of pressure displacement during weight shifting to affected side, to unaffected side and total centre of pressure displacement during weight shifting. Demonstrating significant group-time interaction in those parameters suggests that, while both groups exhibited significant improvement, the experimental group showed greater improvement than the control group. Conclusion: Virtual reality exercises with the Nintendo Wii system could represent a useful adjunctive therapy to traditional treatment to improve static and dynamic balance in stroke patients.

  3. Large scale silver nanowires network fabricated by MeV hydrogen (H+) ion beam irradiation

    International Nuclear Information System (INIS)

    S, Honey; S, Naseem; A, Ishaq; M, Maaza; M T, Bhatti; D, Wan

    2016-01-01

    A random two-dimensional large scale nano-network of silver nanowires (Ag-NWs) is fabricated by MeV hydrogen (H+) ion beam irradiation. Ag-NWs are irradiated under the H+ ion beam at different ion fluences at room temperature. The Ag-NW network is fabricated by H+ ion beam-induced welding of Ag-NWs at intersecting positions. H+ ion beam-induced welding is confirmed by transmission electron microscopy (TEM) and scanning electron microscopy (SEM). Moreover, the structure of the Ag-NWs remains stable under the H+ ion beam, and the networks are optically transparent. The morphology also remains stable under H+ ion beam irradiation. No slicing or cutting of Ag-NWs is observed under MeV H+ ion beam irradiation. The results show that the formation of the Ag-NW network proceeds through three steps: ion beam-induced thermal spikes lead to the local heating of Ag-NWs, the formation of simple junctions on a small scale, and the formation of a large scale network. This observation is useful for deploying Ag-NW-based devices in space, where protons are abundant in an energy range from MeV to GeV. This high-quality Ag-NW network can also be used as a transparent electrode for optoelectronic devices. (paper)

  4. Potassium supplementation and heart rate : A meta-analysis of randomized controlled trials

    NARCIS (Netherlands)

    Gijsbers, L.; Moelenberg, F. J. M.; Bakker, S. J. L.; Geleijnse, J. M.

    Background and aims: Increasing the intake of potassium has been shown to lower blood pressure, but whether it also affects heart rate (HR) is largely unknown. We therefore assessed the effect of potassium supplementation on HR in a meta-analysis of randomized controlled trials. Methods and results:

  5. Communication and Low Mood (CALM): a randomized controlled trial of behavioural therapy for stroke patients with aphasia.

    Science.gov (United States)

    Thomas, Shirley A; Walker, Marion F; Macniven, Jamie A; Haworth, Helen; Lincoln, Nadina B

    2013-05-01

    The aim was to evaluate behavioural therapy as a treatment for low mood in people with aphasia. The design was a randomized controlled trial comparing behavioural therapy plus usual care with a usual care control. Potential participants with aphasia after stroke were screened for the presence of low mood. Those who met the criteria and gave consent were randomly allocated. Participants were recruited from hospital wards, community rehabilitation, speech and language therapy services and stroke groups. Of 511 people with aphasia identified, 105 had low mood and were recruited. Behavioural therapy was offered for up to three months. Outcomes were assessed three and six months after random allocation using the Stroke Aphasic Depression Questionnaire, the Visual Analog Mood Scales 'sad' item, and the Visual Analogue Self-Esteem Scale. Participants were aged 29 to 94 years (mean 67.0, SD 13.5) and 66 (63%) were men. Regression analysis showed that at three months, when baseline values and communication impairment were controlled for, group allocation was a significant predictor of scores on the Stroke Aphasic Depression Questionnaire.

  6. A route to explosive large-scale magnetic reconnection in a super-ion-scale current sheet

    Directory of Open Access Journals (Sweden)

    K. G. Tanaka

    2009-01-01

    Full Text Available How to trigger magnetic reconnection is one of the most interesting and important problems in space plasma physics. Recently, electron temperature anisotropy (αeo = Te⊥/Te||) at the center of a current sheet and the non-local effect of the lower-hybrid drift instability (LHDI) that develops at the current sheet edges have attracted attention in this context. In addition to these effects, here we also study the effects of ion temperature anisotropy (αio = Ti⊥/Ti||). Electron anisotropy effects are known to be ineffective in a current sheet whose thickness is of ion scale. In this range of current sheet thickness, the LHDI effects are shown to weaken substantially with a small increase in thickness, and the obtained saturation level is too low for a large-scale reconnection to be achieved. We then investigate whether introducing electron and ion temperature anisotropies in the initial stage would couple with the LHDI effects to revive quick triggering of large-scale reconnection in a super-ion-scale current sheet. The results are as follows. (1) The initial electron temperature anisotropy is consumed very quickly when a number of minuscule magnetic islands (each with a lateral length 1.5~3 times the ion inertial length) form. These minuscule islands do not coalesce into a large-scale island to enable large-scale reconnection. (2) The subsequent LHDI effects disturb the current sheet filled with the small islands. This substantially accelerates the triggering time scale but does not enhance the saturation level of reconnected flux. (3) When the ion temperature anisotropy is added, it survives through the small-island formation stage and leads to even quicker triggering when the LHDI effects set in. Furthermore, the saturation level is elevated by a factor of ~2, and large-scale reconnection is achieved only in this case. Comparison with two-dimensional simulations that exclude the LHDI effects confirms that the saturation level

  7. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  8. Radiation Transport in Random Media With Large Fluctuations

    Science.gov (United States)

    Olson, Aaron; Prinja, Anil; Franke, Brian

    2017-09-01

    Neutral particle transport in media exhibiting large and complex material property spatial variation is modeled by representing cross sections as lognormal random functions of space, generated through a nonlinear memory-less transformation of a Gaussian process with covariance uniquely determined by the covariance of the cross section. A Karhunen-Loève decomposition of the Gaussian process is implemented to efficiently generate realizations of the random cross sections, and Woodcock Monte Carlo is used to transport particles on each realization and generate benchmark solutions for the mean and variance of the particle flux as well as probability densities of the particle reflectance and transmittance. A computationally efficient stochastic collocation method is implemented to directly compute statistical moments such as the mean and variance, while a polynomial chaos expansion in conjunction with stochastic collocation provides a convenient surrogate model that also produces probability densities of output quantities of interest. Extensive numerical testing demonstrates that the use of stochastic reduced-order modeling provides an accurate and cost-effective alternative to random sampling for particle transport in random media.
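    The first stage of the pipeline described here, lognormal cross sections generated by exponentiating a Gaussian process whose realizations are drawn from a truncated Karhunen-Loève expansion, can be sketched in a few lines. The grid size, exponential covariance parameters and truncation order below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Discrete Karhunen-Loeve expansion of a Gaussian process G(x) on [0, L]
# with exponential covariance C(x, x') = sigma^2 * exp(-|x - x'| / lc).
# The cross section is the lognormal field Sigma(x) = exp(mu + G(x)).
n, L, sigma, lc, mu = 200, 10.0, 0.5, 1.0, 0.0
x = np.linspace(0.0, L, n)
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / lc)

# Eigendecomposition of the covariance matrix gives the discrete KL modes.
vals, vecs = np.linalg.eigh(C)
vals, vecs = vals[::-1], vecs[:, ::-1]          # descending order
m = 20                                          # truncation: keep 20 modes

def realization():
    """One realization of the lognormal cross section via truncated KL."""
    xi = rng.standard_normal(m)                 # i.i.d. N(0,1) coefficients
    G = vecs[:, :m] @ (np.sqrt(np.clip(vals[:m], 0.0, None)) * xi)
    return np.exp(mu + G)

samples = np.array([realization() for _ in range(2000)])
```

    A Woodcock Monte Carlo transport sweep would then be run on each such realization; the sketch above only generates the random medium, whose sample mean should approach the lognormal value exp(mu + var/2) pointwise as the truncation order grows.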

  9. Acupuncture in subjects with cold hands sensation: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Seo, Jung-Chul; Lee, Hyun-jong; Kwak, Min-Ah; Park, Sung-Hoon; Shin, ImHee; Yun, Woo-Sung; Park, Kihyuk

    2014-09-04

    Cold hands sensation is a common disorder within the Korean population. Many Korean family physicians believe that it is a mild early manifestation of Raynaud's phenomenon (RP), or may be related to RP. RP is characterized by reversible digital vasospasm provoked by cold temperatures and/or emotional stress, and doctors often prescribe medications that are used in treatment of RP for subjects with cold hands. However, this has not shown a clear benefit, and these medications can cause unwanted side effects. It is also reported that traditional Korean medicine, including acupuncture, is widely used to treat cold hands, although the current level of evidence for this approach is also poor and to date, there have been no published randomized controlled clinical trials (RCTs) evaluating the efficacy and safety of acupuncture for cold hands. We have therefore designed a pilot RCT to obtain information for the design of a further full-scale trial. The proposed study is a five-week pilot RCT. A total of 14 subjects will be recruited and randomly allocated to two groups: an acupuncture plus medication group (experimental group) and a medication-only group (control group). All subjects will take nifedipine (5 mg once daily) and beraprost (20 mg three times daily) for three weeks. The experimental group will receive additional treatment with three acupuncture sessions per week for three weeks (nine sessions total). The primary outcome will be measured using a visual analogue scale. Secondary outcomes will be measured by blood perfusion in laser Doppler perfusion imaging of the hands, frequency and duration of episodes of cold hands, and heart rate variability. Assessments will be made at baseline and at one, three, and five weeks thereafter. This study will provide an indication of the feasibility and a clinical foundation for a future large-scale trial. This study was registered at Korean Clinical Research Information Service (CRIS) registry on 5 August 2013 with the

  10. Large-scale structure observables in general relativity

    International Nuclear Information System (INIS)

    Jeong, Donghui; Schmidt, Fabian

    2015-01-01

    We review recent studies that rigorously define several key observables of the large-scale structure of the Universe in a general relativistic context. Specifically, we consider (i) redshift perturbation of cosmic clock events; (ii) distortion of cosmic rulers, including weak lensing shear and magnification; and (iii) observed number density of tracers of the large-scale structure. We provide covariant and gauge-invariant expressions of these observables. Our expressions are given for a linearly perturbed flat Friedmann–Robertson–Walker metric including scalar, vector, and tensor metric perturbations. While we restrict ourselves to linear order in perturbation theory, the approach can be straightforwardly generalized to higher order. (paper)

  11. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    Full Text Available The paper investigates top-flange fatigue damage in a large-scale wind turbine generator. It establishes a finite element model of the top-flange connection system with the finite element analysis software MSC.Marc/Mentat and analyzes its fatigue strain, implements load simulation of the flange fatigue working condition with Bladed software, acquires the flange fatigue load spectrum with the rain-flow counting method and, finally, performs fatigue analysis of the top flange with the fatigue analysis software MSC.Fatigue and Palmgren-Miner linear cumulative damage theory. The result provides new thinking for flange fatigue analysis of large-scale wind turbine generators and possesses practical engineering value.
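    The last step of the workflow, combining a rain-flow load spectrum with Palmgren-Miner linear cumulative damage, reduces to a short calculation. The Basquin-type S-N constants and the spectrum bins below are illustrative placeholders, not data from the study:

```python
# Palmgren-Miner linear cumulative damage, as used in the paper's final
# step.  The S-N curve constants C and m and the load spectrum below are
# illustrative placeholders, not values from the study.
def cycles_to_failure(stress_range, C=1.0e12, m=3.0):
    """Basquin-type S-N curve: N = C / S**m."""
    return C / stress_range**m

def miner_damage(spectrum):
    """spectrum: list of (stress_range_MPa, cycle_count) bins,
    e.g. as produced by rain-flow counting of a load history."""
    return sum(n / cycles_to_failure(S) for S, n in spectrum)

# A toy rain-flow result: three stress-range bins with their counts.
spectrum = [(80.0, 2.0e5), (120.0, 5.0e4), (200.0, 1.0e4)]
D = miner_damage(spectrum)
print(f"cumulative damage D = {D:.3f}")
```

    A damage sum D reaching 1 would predict fatigue failure under Miner's linear damage hypothesis.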

  12. Real-time simulation of large-scale floods

    Science.gov (United States)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water conditions, the real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional, shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.
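    The conservative flux-difference update at the heart of a Godunov-type finite volume method can be illustrated in one dimension. This is a minimal first-order upwind sketch for linear advection with periodic boundaries, an illustrative stand-in for the paper's 2D shallow-water solver on unstructured meshes:

```python
# Minimal 1D first-order Godunov-type finite-volume step for linear
# advection q_t + a q_x = 0, illustrating the conservative
# flux-difference update such schemes are built on (the paper's model
# solves the 2D shallow-water equations on unstructured meshes).
a, dx, dt = 1.0, 0.1, 0.05          # CFL = a*dt/dx = 0.5
n = 100
q = [1.0 if 20 <= i < 40 else 0.0 for i in range(n)]  # square pulse

def step(q):
    # Upwind (Godunov) flux for a > 0: F_{i+1/2} = a * q_i, periodic BCs.
    F = [a * q[i] for i in range(n)]
    return [q[i] - dt / dx * (F[i] - F[i - 1]) for i in range(n)]

total0 = sum(q) * dx
for _ in range(200):
    q = step(q)
# The flux-difference form conserves the total "mass" up to round-off.
assert abs(sum(q) * dx - total0) < 1e-9
```

    Because each cell update is a flux difference, whatever leaves one cell enters its neighbour, which is the conservation property that makes such schemes robust for flood fronts.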

  13. Adsorption of Caesium in Urine on Copper Hexacyanoferrate(II) - A Contamination Control Kit for Large-Scale In-Situ Use

    International Nuclear Information System (INIS)

    Johansson, L.; Samuelsson, C.; Holm, E.

    1999-01-01

    A kit containing copper hexacyanoferrate(II) was created for large-scale distribution to caesium-contaminated subjects for in situ sampling in the event of radiocaesium release from a nuclear accident or other nuclear activities. The kit is to be used for screening the internal contamination level of a population exposed to radiocaesium fallout and can be seen as a fast alternative to whole-body counting, suitable for large-scale determinations. The separation of caesium from urine by adsorption on the copper compound was studied, and caesium was found to adsorb efficiently from urine. The contamination control kit is a practical alternative to urine sampling since caesium is concentrated into a small volume by the subject using the kit in situ, giving advantages in handling, distribution, storage and measuring geometry in the subsequent gamma-ray analysis. The kit consists of cotton filters impregnated with copper hexacyanoferrate(II) held by plastic filter holders and allows a rapid flow-through. In order to obtain full caesium adsorption, less than 0.5 g of the compound is required for a 2 litre urine sample. No chemical preparation or change in pH of the urine sample is needed before adsorption. When the kit was used in an authentic internal caesium contamination situation, the adsorbed fraction of caesium was 97 ± 3% (SD) in ten samples. (author)

  14. Precipitation patterns control the distribution and export of large wood at the catchment scale

    OpenAIRE

    Il Seo, Jung; Nakamura, Futoshi; Chun, Kun Woo; Kim, Suk Woo; Grant, Gordon E.

    2015-01-01

    Large wood (LW) plays an important role in river ecosystems, but LW-laden floods may cause serious damage to human lives and property. The relationship between precipitation patterns and variations in LW distribution and export at the watershed scale is poorly understood. To explore these linkages, we examined differences in LW distribution as a function of channel morphologies in six watersheds located in southern and northern Japan and analysed the impacts of different precipitation pattern...

  15. Study of multi-functional precision optical measuring system for large scale equipment

    Science.gov (United States)

    Jiang, Wei; Lao, Dabao; Zhou, Weihu; Zhang, Wenying; Jiang, Xingjian; Wang, Yongxi

    2017-10-01

    The effective application of high-performance measurement technology can greatly improve large-scale equipment manufacturing capability. The measurement of geometric parameters such as size, attitude and position therefore requires a measurement system with high precision, multiple functions, portability and other characteristics. However, existing measuring instruments, such as the laser tracker, total station and photogrammetry system, mostly have a single function and require station moving, among other shortcomings. A laser tracker needs to work with a cooperative target, and it can hardly meet the requirements of measurement in extreme environments. A total station is mainly used for outdoor surveying and mapping and can hardly achieve the accuracy demanded in industrial measurement. A photogrammetry system can achieve wide-range multi-point measurement, but the measuring range is limited and the station needs to be moved repeatedly. This paper presents a non-contact opto-electronic measuring instrument that can work both by scanning the measurement path and by tracking a cooperative target. The system is based on several key technologies: absolute distance measurement, two-dimensional angle measurement, automatic target recognition and accurate aiming, precision control, assembly of complex mechanical systems and multi-functional 3D visualization software. Among them, the absolute distance measurement module ensures measurement with high accuracy, and the two-dimensional angle measuring module provides precision angle measurement. The system is suitable for non-contact measurement of large-scale equipment; it can ensure the quality and performance of large-scale equipment throughout the manufacturing process and improve the manufacturing ability of large-scale and high-end equipment.

  16. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    The recent trend toward large-scale simulations of fusion plasmas and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  17. Pilot study of large-scale production of mutant pigs by ENU mutagenesis.

    Science.gov (United States)

    Hai, Tang; Cao, Chunwei; Shang, Haitao; Guo, Weiwei; Mu, Yanshuang; Yang, Shulin; Zhang, Ying; Zheng, Qiantao; Zhang, Tao; Wang, Xianlong; Liu, Yu; Kong, Qingran; Li, Kui; Wang, Dayu; Qi, Meng; Hong, Qianlong; Zhang, Rui; Wang, Xiupeng; Jia, Qitao; Wang, Xiao; Qin, Guosong; Li, Yongshun; Luo, Ailing; Jin, Weiwu; Yao, Jing; Huang, Jiaojiao; Zhang, Hongyong; Li, Menghua; Xie, Xiangmo; Zheng, Xuejuan; Guo, Kenan; Wang, Qinghua; Zhang, Shibin; Li, Liang; Xie, Fei; Zhang, Yu; Weng, Xiaogang; Yin, Zhi; Hu, Kui; Cong, Yimei; Zheng, Peng; Zou, Hailong; Xin, Leilei; Xia, Jihan; Ruan, Jinxue; Li, Hegang; Zhao, Weiming; Yuan, Jing; Liu, Zizhan; Gu, Weiwang; Li, Ming; Wang, Yong; Wang, Hongmei; Yang, Shiming; Liu, Zhonghua; Wei, Hong; Zhao, Jianguo; Zhou, Qi; Meng, Anming

    2017-06-22

    N-ethyl-N-nitrosourea (ENU) mutagenesis is a powerful tool to generate mutants on a large scale efficiently, and to discover genes with novel functions at the whole-genome level in Caenorhabditis elegans, flies, zebrafish and mice, but it has never been tried in large model animals. We describe a successful systematic three-generation ENU mutagenesis screening in pigs with the establishment of the Chinese Swine Mutagenesis Consortium. A total of 6,770 G1 and 6,800 G3 pigs were screened, 36 dominant and 91 recessive novel pig families with various phenotypes were established. The causative mutations in 10 mutant families were further mapped. As examples, the mutation of SOX10 (R109W) in pig causes inner ear malfunctions and mimics human Mondini dysplasia, and upregulated expression of FBXO32 is associated with congenital splay legs. This study demonstrates the feasibility of artificial random mutagenesis in pigs and opens an avenue for generating a reservoir of mutants for agricultural production and biomedical research.

  18. Development of a Shipboard Remote Control and Telemetry Experimental System for Large-Scale Model’s Motions and Loads Measurement in Realistic Sea Waves

    Directory of Open Access Journals (Sweden)

    Jialong Jiao

    2017-10-01

    Full Text Available Wave-induced motion and load responses are important criteria for ship performance evaluation. Physical experiments have long been an indispensable tool in the prediction of a ship's navigation state, speed, motions, accelerations, sectional loads and wave impact pressure. Currently, the majority of such experiments are conducted in laboratory tanks, where the wave environments differ from realistic sea waves. In this paper, a laboratory tank testing system for ship motion and load measurement is reviewed and reported first. Then, a novel large-scale model measurement technique is developed on this laboratory testing foundation to obtain accurate motion and load responses of ships in realistic sea conditions. For this purpose, an advanced remote control and telemetry experimental system was developed in-house to allow the implementation of large-scale model seakeeping measurements at sea. The experimental system includes a series of sensors, e.g., a Global Position System/Inertial Navigation System (GPS/INS) module, course top, optical fiber sensors, strain gauges, pressure sensors and accelerometers. The developed measurement system was tested in field experiments in coastal seas, which indicate that the proposed large-scale model testing scheme is feasible. Meaningful data, including ocean environment parameters, ship navigation state, motions and loads, were obtained through the sea trial campaign.

  19. Nearly incompressible fluids: Hydrodynamics and large scale inhomogeneity

    International Nuclear Information System (INIS)

    Hunana, P.; Zank, G. P.; Shaikh, D.

    2006-01-01

    A system of hydrodynamic equations in the presence of large-scale inhomogeneities for a high plasma beta solar wind is derived. The theory is derived under the assumption of low turbulent Mach number and is developed for flows where the usual incompressible description is not satisfactory and a full compressible treatment is too complex for any analytical studies. When the effects of compressibility are incorporated only weakly, a new description, referred to as 'nearly incompressible hydrodynamics', is obtained. The nearly incompressible theory was originally applied to homogeneous flows. However, large-scale gradients in density, pressure, temperature, etc., are typical in the solar wind, and it was unclear how inhomogeneities would affect the usual incompressible and nearly incompressible descriptions. In the homogeneous case, the lowest order expansion of the fully compressible equations leads to the usual incompressible equations, followed at higher orders by the nearly incompressible equations, as introduced by Zank and Matthaeus. With this work we show that the inclusion of large-scale inhomogeneities (in this case a time-independent and radially symmetric background solar wind) modifies the leading-order incompressible description of solar wind flow. We find, for example, that the divergence of velocity fluctuations is nonsolenoidal and that density fluctuations can be described to leading order as a passive scalar. Locally (for small length scales), this system of equations converges to the usual incompressible equations, and we therefore use the term 'locally incompressible' to describe the equations. This term should be distinguished from the term 'nearly incompressible', which is reserved for higher-order corrections. Furthermore, we find that density fluctuations scale linearly with Mach number, in contrast to the original homogeneous nearly incompressible theory, in which density fluctuations scale with the square of Mach number.
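    The contrasting Mach-number scalings reported above can be written compactly, with M the turbulent Mach number:

```latex
\frac{\delta \rho}{\rho_0} \sim O(M) \quad \text{(inhomogeneous background)},
\qquad
\frac{\delta \rho}{\rho_0} \sim O(M^2) \quad \text{(homogeneous nearly incompressible theory)}.
```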

  20. Synthetically chemical-electrical mechanism for controlling large scale reversible deformation of liquid metal objects

    Science.gov (United States)

    Zhang, Jie; Sheng, Lei; Liu, Jing

    2014-11-01

    Reversible deformation of a machine holds enormous promise across many scientific areas ranging from mechanical engineering to applied physics. So far, such capabilities are still hard to achieve through conventional rigid materials, or depend mainly on elastomeric materials, which offer rather limited performance and require complicated manipulation. Here, we show a basic strategy, fundamentally different from existing ones, to realize large-scale reversible deformation by controlling the working materials via a synthetically chemical-electrical mechanism (SCHEME). Such activity incorporates an object of liquid metal gallium whose surface area can spread up to five times its original size and back again under low energy consumption. In particular, the alterable surface tension, based on a combination of chemical dissolution and electrochemical oxidation, is responsible for the reversible shape transformation, which works much more flexibly than many former deformation principles that convert electrical energy into mechanical movement. A series of very unusual phenomena regarding the reversible configurational shifts are disclosed, with the dominant factors clarified. This study opens a generalized way to combine liquid metal, serving as a shape-variable element, with the SCHEME to compose functional soft machines, which implies huge potential for developing future smart robots to fulfill various complicated tasks.

  1. Iterative learning-based decentralized adaptive tracker for large-scale systems: a digital redesign approach.

    Science.gov (United States)

    Tsai, Jason Sheng-Hong; Du, Yan-Yi; Huang, Pei-Hsiang; Guo, Shu-Mei; Shieh, Leang-San; Chen, Yuhua

    2011-07-01

    In this paper, a digital redesign methodology of the iterative learning-based decentralized adaptive tracker is proposed to improve the dynamic performance of sampled-data linear large-scale control systems consisting of N interconnected multi-input multi-output subsystems, so that the system output will follow any trajectory which may not be presented by the analytic reference model initially. To overcome the interference of each sub-system and simplify the controller design, the proposed model reference decentralized adaptive control scheme constructs a decoupled well-designed reference model first. Then, according to the well-designed model, this paper develops a digital decentralized adaptive tracker based on the optimal analog control and prediction-based digital redesign technique for the sampled-data large-scale coupling system. In order to enhance the tracking performance of the digital tracker at specified sampling instants, we apply the iterative learning control (ILC) to train the control input via continual learning. As a result, the proposed iterative learning-based decentralized adaptive tracker not only has robust closed-loop decoupled property but also possesses good tracking performance at both transient and steady state. Besides, evolutionary programming is applied to search for a good learning gain to speed up the learning process of ILC. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
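    The ILC idea of training the control input via continual learning over repeated trials can be illustrated with a scalar first-order plant. The plant parameters, horizon and learning gain below are arbitrary illustrative choices, far simpler than the paper's decentralized multivariable digital redesign:

```python
# Scalar illustration of the iterative learning control (ILC) update
# u_{k+1}(t) = u_k(t) + gamma * e_k(t+1) on the sampled-data plant
# x(t+1) = a*x(t) + b*u(t).  This is a toy stand-in for the paper's
# decentralized adaptive tracker, not its actual design; convergence
# here needs |1 - gamma*b| < 1.
a, b, gamma = 0.4, 0.5, 1.2
T = 20
ref = [1.0] * (T + 1)                      # desired output trajectory

def run_trial(u):
    """Simulate one trial from x(0) = 0; return the state trajectory."""
    x = [0.0]
    for t in range(T):
        x.append(a * x[t] + b * u[t])
    return x

u = [0.0] * T                              # initial control guess
errors = []
for _ in range(50):                        # learn over repeated trials
    x = run_trial(u)
    e = [ref[t] - x[t] for t in range(T + 1)]
    errors.append(max(abs(v) for v in e[1:]))
    u = [u[t] + gamma * e[t + 1] for t in range(T)]  # ILC update

# Tracking error at the sampling instants shrinks across trials.
print(f"first/last trial error: {errors[0]:.3f} / {errors[-1]:.2e}")
```

    Each pass replays the same finite-horizon task and corrects the stored input with the previous trial's error, which is the "training via continual learning" step the abstract describes; the paper additionally tunes the learning gain by evolutionary programming.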

  2. Coupled continuous time-random walks in quenched random environment

    Science.gov (United States)

    Magdziarz, M.; Szczotka, W.

    2018-02-01

    We introduce a coupled continuous-time random walk with coupling which is characteristic for Lévy walks. Additionally we assume that the walker moves in a quenched random environment, i.e. the site disorder at each lattice point is fixed in time. We analyze the scaling limit of such a random walk. We show that for large times the behaviour of the analyzed process is exactly the same as in the case of uncoupled quenched trap model for Lévy flights.
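    The quenched-environment setting, where site disorder is fixed in time, is easy to illustrate with a Monte Carlo sketch. The toy below is an uncoupled quenched trap model on the integer lattice, not the authors' coupled Lévy-walk process; the tail exponent, step counts and waiting-time law are illustrative assumptions:

```python
import random

random.seed(1)

# Quenched trap model sketch: each lattice site x has a FIXED mean
# trapping time tau(x) drawn once from a heavy-tailed (Pareto-type) law,
# tau = u**(-1/alpha) with u uniform on (0, 1]; the walker steps to a
# random neighbour after waiting at its current site.  This illustrates
# the "site disorder fixed in time" setting of the paper, not its
# coupled Levy-walk dynamics.
alpha = 0.5                      # tail exponent: infinite mean waiting time
tau = {}                         # quenched landscape, generated lazily

def mean_wait(x):
    if x not in tau:             # draw once; revisits reuse the same value
        u = 1.0 - random.random()          # uniform on (0, 1]
        tau[x] = u ** (-1.0 / alpha)
    return tau[x]

def walk(n_steps):
    """One walker on the shared landscape; return final site and clock."""
    x, t = 0, 0.0
    for _ in range(n_steps):
        t += random.expovariate(1.0 / mean_wait(x))
        x += random.choice((-1, 1))
    return x, t

# Many walkers on the SAME disorder realization (quenched average).
positions, times = zip(*(walk(500) for _ in range(50)))
```

    Because `tau` is populated once and shared by every walker, deep traps sit at fixed locations, which is precisely what distinguishes the quenched model from an annealed continuous-time random walk where waiting times are redrawn on every visit.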

  3. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  4. Genetic and Environmental Influences on Self-Control : Assessing Self-Control with the ASEBA Self-Control Scale

    NARCIS (Netherlands)

    Willems, Yayouk E; Dolan, Conor V.; van Beijsterveldt, Catharina E M; de Zeeuw, Eveline L; Boomsma, Dorret I; Bartels, Meike; Finkenauer, Catrin

    This study used a theoretically-derived set of items of the Achenbach System of Empirically Based Assessment to develop the Achenbach Self-Control Scale (ASCS) for 7-16 year olds. Using a large dataset of over 20,000 children, who are enrolled in the Netherlands Twin Register, we demonstrated the

  5. Exploring the large-scale structure of Taylor–Couette turbulence through Large-Eddy Simulations

    Science.gov (United States)

    Ostilla-Mónico, Rodolfo; Zhu, Xiaojue; Verzicco, Roberto

    2018-04-01

    Large eddy simulations (LES) of Taylor-Couette (TC) flow, the flow between two co-axial and independently rotating cylinders, are performed in an attempt to explore the large-scale axially-pinned structures seen in experiments and simulations. Both static and dynamic LES models are used. The Reynolds number is kept fixed at Re = 3.4 · 10^4, and the radius ratio η = r_i/r_o is set to η = 0.909, limiting the effects of curvature and resulting in frictional Reynolds numbers of around Re_τ ≈ 500. Four rotation ratios from Rot = ‑0.0909 to Rot = 0.3 are simulated. First, the LES of TC is benchmarked for different rotation ratios. Both the Smagorinsky model with a constant of c_s = 0.1 and the dynamic model are found to produce reasonable results for no mean rotation and cyclonic rotation, but deviations increase for increasing rotation. This is attributed to the increasingly anisotropic character of the fluctuations. Second, “over-damped” LES, i.e. LES with a large Smagorinsky constant, is performed and is shown to reproduce some features of the large-scale structures, even when the near-wall region is not adequately modeled. This shows the potential for using over-damped LES for fast explorations of the parameter space where large-scale structures are found.
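    The static Smagorinsky closure benchmarked above computes the eddy viscosity algebraically from the resolved strain rate, ν_t = (c_s Δ)² |S| with |S| = √(2 S_ij S_ij). A minimal evaluation for a uniform 2D shear flow, with an illustrative filter width and shear rate (the dynamic model would instead compute c_s on the fly from a test filter):

```python
import math

# Smagorinsky eddy viscosity nu_t = (c_s * Delta)**2 * |S| with
# |S| = sqrt(2 S_ij S_ij), evaluated for the simple 2D shear flow
# u = (A*y, 0).  c_s = 0.1 matches the static model benchmarked in the
# paper; the filter width and shear rate are illustrative.
cs, delta, A = 0.1, 0.05, 4.0    # constant, filter width, shear rate

# For u = (A*y, 0): S_11 = S_22 = 0 and S_12 = S_21 = A/2, so
# |S| = sqrt(2 * 2 * (A/2)**2) = A.
S = [[0.0, A / 2.0], [A / 2.0, 0.0]]
S_mag = math.sqrt(2.0 * sum(S[i][j] ** 2 for i in range(2) for j in range(2)))
nu_t = (cs * delta) ** 2 * S_mag
print(f"|S| = {S_mag:.3f}, nu_t = {nu_t:.3e}")
```

    An "over-damped" LES in the sense used above simply replaces c_s = 0.1 with a larger constant, uniformly raising ν_t and damping the small scales.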

  6. Efficacy of the epilepsy nurse: Results of a randomized controlled study.

    Science.gov (United States)

    Pfäfflin, Margarete; Schmitz, Bettina; May, Theodor W

    2016-07-01

    We investigated the efficacy of epilepsy nurses on satisfaction with counseling about epilepsy in a randomized, controlled, prospective trial. Patients with epilepsy treated by neurologists in outpatient clinics were consecutively enrolled and randomly allocated to either the epilepsy nurse (EN) group (n = 92) or the control group (n = 95). Patients in the EN group were advised according to their needs by epilepsy nurses. The control group received routine care without additional counseling. The EN group completed the questionnaires before the first consultation (T1) and 6 months later (T2); the control group completed the questionnaires twice with an interval of 6 months. Primary outcome measure was satisfaction of patients with information and support. Secondary outcome measures were satisfaction with patient-doctor relationship, organization of treatment, epilepsy knowledge, coping, and restrictions in daily life. Anxiety and depression (Hospital Anxiety and Depression Scale) and global Quality of Life (item from QOLIE-31) were also assessed. Statistical analysis included generalized estimating equation (GEE) and nonparametric tests. Satisfaction with information and support improved significantly in the EN group compared to the control group (GEE, interaction group × time, p = 0.001). In addition, Epilepsy Knowledge (p = 0.014) and Coping (subscale Information Seeking) (p = 0.023) improved. Increase in satisfaction with counseling was dependent on patients' needs for information and on the amount of received information (Jonckheere-Terpstra test, p < 0.001). No differences between the groups were observed on other epilepsy-specific scales. A reliable questionnaire for satisfaction with epilepsy care has been developed. Epilepsy nurses improve the satisfaction of patients with counseling and information about epilepsy and concomitant problems. Wiley Periodicals, Inc. © 2016 International League Against Epilepsy.

  7. Oral analgesia vs intravenous conscious sedation during Essure Micro-Insert sterilization procedure: randomized, double-blind, controlled trial.

    Science.gov (United States)

    Thiel, John A; Lukwinski, Angelina; Kamencic, Huse; Lim, Hyung

    2011-01-01

    To compare the pain reported by patients during the Essure Micro-Insert sterilization procedure using either intravenous conscious sedation or oral analgesia. Randomized, double-blind, placebo-controlled trial (Canadian Task Force classification I). Tertiary care ambulatory women's clinic. Eighty women of reproductive age requesting permanent sterilization. Hysteroscopic placement of the Essure Micro-Insert permanent birth control system. Patients undergoing placement of the Essure Micro-Insert system for permanent contraception were randomized to receive either intravenous conscious sedation, oral analgesia, or placebo. During the procedure, pain scores were recorded using a visual analog scale. Patients in the oral analgesia group reported slightly more pain during insertion of the hysteroscope and placement of the second micro-insert; the groups were otherwise equivalent. The groups were also equivalent when all visual analog scale scores were combined. Oral analgesia is an effective method of pain control during placement of the Essure Micro-Insert permanent birth control system. Copyright © 2011 AAGL. Published by Elsevier Inc. All rights reserved.

  8. Large-scale preparation of hollow graphitic carbon nanospheres

    International Nuclear Information System (INIS)

    Feng, Jun; Li, Fu; Bai, Yu-Jun; Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning; Lu, Xi-Feng

    2013-01-01

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of approximately 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g−1 after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C, and they exhibit superior electrochemical performance to graphite. Highlights: ► Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C. ► The preparation is simple, effective and eco-friendly. ► The in situ yielded MgO nanocrystals promote the graphitization. ► The HGCNSs exhibit superior electrochemical performance to graphite.

  9. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

    Full Text Available A new package for accelerating large-scale phase-field simulations was developed by using GPU based on the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneity, including long-range elastic, magnetostatic, and electrostatic interactions. Using a specific algorithm in the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated into the GPU package. The Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interaction were solved with the algorithm running on the GPU to test the performance of the package. Comparison of the calculation results between the solver executed on a single CPU and the one on the GPU shows that execution on the GPU is up to 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
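
The semi-implicit Fourier method named above treats the stiff gradient term implicitly in Fourier space while keeping the nonlinear bulk term explicit, which is what allows large stable time steps. Below is a minimal CPU-side sketch for the Allen-Cahn equation using NumPy only; the grid size, time step, and mobility are illustrative assumptions, and a real GPU version would move the FFTs and element-wise updates onto the device via CUDA:

```python
import numpy as np

def allen_cahn_step(phi, dt=0.05, kappa=1.0):
    """One semi-implicit Fourier step: explicit bulk term, implicit gradient term."""
    n = phi.shape[0]
    k = 2.0 * np.pi * np.fft.fftfreq(n)
    k2 = k[:, None] ** 2 + k[None, :] ** 2
    bulk = phi ** 3 - phi                      # derivative of the double-well potential
    phi_hat = np.fft.fft2(phi) - dt * np.fft.fft2(bulk)
    phi_hat /= 1.0 + dt * kappa * k2           # implicit treatment of kappa * laplacian
    return np.real(np.fft.ifft2(phi_hat))

rng = np.random.default_rng(0)
phi = 0.1 * rng.standard_normal((64, 64))      # small random initial fluctuations
for _ in range(200):
    phi = allen_cahn_step(phi)
# The order parameter coarsens into domains near phi = -1 and phi = +1.
```

The per-step work is a handful of FFTs and element-wise operations, exactly the pattern that maps well onto a GPU.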

  10. Using Distributed Fiber Optic Sensing to Monitor Large Scale Permafrost Transitions: Preliminary Results from a Controlled Thaw Experiment

    Science.gov (United States)

    Ajo Franklin, J. B.; Wagner, A. M.; Lindsey, N.; Dou, S.; Bjella, K.; Daley, T. M.; Freifeld, B. M.; Ulrich, C.; Gelvin, A.; Morales, A.; James, S. R.; Saari, S.; Ekblaw, I.; Wood, T.; Robertson, M.; Martin, E. R.

    2016-12-01

    In a warming world, permafrost landscapes are being rapidly transformed by thaw, yielding surface subsidence and groundwater flow alteration. The same transformations pose a threat to arctic infrastructure and can induce catastrophic failure of the roads, runways, and pipelines on which human habitation depends. Scalable solutions for monitoring permafrost thaw dynamics are required both to quantitatively understand biogeochemical feedbacks and to protect built infrastructure from damage. Unfortunately, permafrost alteration happens over the time scale of climate change, years to decades, a decided challenge for testing new sensing technologies in a limited context. One solution is to engineer systems capable of rapidly thawing large permafrost units to allow short duration experiments targeting next-generation sensing approaches. We present preliminary results from a large-scale controlled permafrost thaw experiment designed to evaluate the utility of different geophysical approaches for tracking the cause, precursors, and early phases of thaw subsidence. We focus on the use of distributed fiber optic sensing for this challenge and deployed distributed temperature (DTS), strain (DSS), and acoustic (DAS) sensing systems in a 2D array to detect thaw signatures. A 10 x 15 x 1 m section of subsurface permafrost was heated using an array of 120 downhole heaters (60 W) at an experimental site near Fairbanks, AK. Ambient noise analysis of DAS datasets collected at the plot, coupled to shear wave inversion, was utilized to evaluate changes in shear wave velocity associated with heating and thaw. These measurements were confirmed by seismic surveys collected using a semi-permanent orbital seismic source activated on a daily basis. Fiber optic measurements were complemented by subsurface thermistor and thermocouple arrays, timelapse total station surveys, LIDAR, secondary seismic measurements (geophone and broadband recordings), timelapse ERT, borehole NMR, soil

  11. Thermal power generation projects "Large Scale Solar Heating" (EU THERMIE)

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large Scale Solar Heating" programme for a Europe-wide development of this technology. The demonstration programme developed from it was judged favourably by the reviewers but was not immediately (1996) accepted for financial support. In November 1997 the EU Commission provided 1.5 million ECU, which allowed the realisation of an updated project proposal. By mid-1997 a smaller project had already been approved, which had been requested under the lead of Chalmers Industriteknik (CIT) in Sweden and mainly serves the transfer of technology. (orig.)

  12. The method of measurement and synchronization control for large-scale complex loading system

    International Nuclear Information System (INIS)

    Liao Min; Li Pengyuan; Hou Binglin; Chi Chengfang; Zhang Bo

    2012-01-01

    With the development of modern industrial technology, measurement and control systems have been widely used in high-precision, complex industrial control equipment and large-tonnage loading devices. Such a system is often used to analyze the distribution of stress and displacement under a complex bearing load or within the mechanical structure itself. In the ITER GS mock-up with 5 flexible plates, for each load combination it is necessary to detect and measure potential slippage between the central flexible plate and the neighboring spacers, as well as the potential slippage between each pre-stressing bar and its neighboring plate. The measurement and control system consists of seven sets of EDC controllers and boards, a computer system, a 16-channel quasi-dynamic strain gauge, 25 sets of displacement sensors, and 7 sets of load and displacement sensors in the cylinders. This paper demonstrates the principles and methods by which the EDC220 digital controller achieves synchronization control, and the R and D process of the multi-channel loading control software and measurement software. (authors)

  13. Large-scale retrieval for medical image analytics: A comprehensive review.

    Science.gov (United States)

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced at ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, covering a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval that can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
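
The pipeline stages named in the abstract (feature representation, feature indexing, searching) can be sketched end-to-end. The sketch below is an illustrative assumption, not an algorithm from the review: random vectors stand in for learned image descriptors, and an exact brute-force scan stands in for the approximate indexes (hashing, quantization) that make searching scale:

```python
import numpy as np

rng = np.random.default_rng(1)

# Feature representation stage: each image becomes a d-dimensional descriptor.
database = rng.standard_normal((10_000, 128)).astype(np.float32)
query = database[42] + 0.01 * rng.standard_normal(128).astype(np.float32)

# L2-normalize so that an inner product equals cosine similarity.
db = database / np.linalg.norm(database, axis=1, keepdims=True)
q = query / np.linalg.norm(query)

# Searching stage: brute-force scan; large-scale systems replace this
# with an approximate index (hashing, product quantization, etc.).
scores = db @ q
top5 = np.argsort(-scores)[:5]
print(top5)  # index 42 should rank first
```

Swapping the `db @ q` scan for an approximate index is precisely the feature-indexing step the review surveys.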

  14. Remote collaboration system based on large scale simulation

    International Nuclear Information System (INIS)

    Kishimoto, Yasuaki; Sugahara, Akihiro; Li, J.Q.

    2008-01-01

    Large-scale simulation using supercomputers, which generally requires long CPU time and produces large amounts of data, has been extensively studied as a third pillar in various advanced science fields, in parallel to theory and experiment. Such simulations are expected to lead to new scientific discoveries through the elucidation of various complex phenomena that can hardly be identified by conventional theoretical and experimental approaches alone. In order to assist such large simulation studies, in which many collaborators working at geographically different places participate and contribute, we have developed a unique remote collaboration system, referred to as SIMON (simulation monitoring system), which is based on client-server control and introduces an idea of up-date processing, in contrast to the widely used post-processing. As a key ingredient, we have developed a trigger method that transmits various requests for up-date processing from the simulation (client) running on a supercomputer to a workstation (server). Namely, the simulation running on the supercomputer actively controls the timing of up-date processing. The server, having received requests from the ongoing simulation such as data transfer, data analyses, and visualizations, starts operations according to the requests during the simulation. The server makes the latest results available to web browsers, so that collaborators can monitor the results at any place and time in the world. By applying the system to a specific simulation project of laser-matter interaction, we have confirmed that the system works well and plays an important role as a collaboration platform on which many collaborators work with one another.
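
The trigger method described above inverts the usual post-processing flow: the running simulation (client) pushes processing requests to the server while it runs. A toy sketch of that control flow using a socket pair and a server thread follows; the JSON message format and the `visualize`/`finish` actions are invented for illustration and are not SIMON's actual protocol:

```python
import json
import socket
import threading

def server(conn, log):
    """Server side: executes up-date-processing requests as they arrive."""
    for line in conn.makefile("r"):
        req = json.loads(line)
        log.append(f"processed {req['action']} at step {req['step']}")
        if req["action"] == "finish":
            break

def simulation(conn, steps=5):
    """Client side: the running simulation actively triggers the server."""
    f = conn.makefile("w")
    for step in range(steps):
        # ... advance the simulation here ...
        f.write(json.dumps({"action": "visualize", "step": step}) + "\n")
        f.flush()                               # push the request immediately
    f.write(json.dumps({"action": "finish", "step": steps}) + "\n")
    f.flush()

log = []
a, b = socket.socketpair()
t = threading.Thread(target=server, args=(b, log))
t.start()
simulation(a)
t.join()
print(log[0])  # processed visualize at step 0
```

The essential point is that the client decides when the server performs processing, so results become available while the job is still running.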

  15. Massage therapy for fibromyalgia: a systematic review and meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Li, Yan-hui; Wang, Feng-yun; Feng, Chun-qing; Yang, Xia-feng; Sun, Yi-hua

    2014-01-01

    Although some studies have evaluated the effectiveness of massage therapy for fibromyalgia (FM), the role of massage therapy in the management of FM remains controversial. The purpose of this systematic review is to evaluate the evidence for massage therapy in patients with FM. Electronic databases (up to June 2013) were searched to identify relevant studies. The main outcome measures were pain, anxiety, depression, and sleep disturbance. Two reviewers independently abstracted data and appraised risk of bias. The risk of bias of eligible studies was assessed based on Cochrane tools. Standardised mean differences (SMD) and 95% confidence intervals (CI) were calculated with the more conservative random-effects model, and heterogeneity was assessed based on the I² statistic. Nine randomized controlled trials involving 404 patients met the inclusion criteria. The meta-analyses showed that massage therapy with a duration of ≥ 5 weeks significantly improved pain (SMD, 0.62; 95% CI 0.05 to 1.20; p = 0.03), anxiety (SMD, 0.44; 95% CI 0.09 to 0.78; p = 0.01), and depression (SMD, 0.49; 95% CI 0.15 to 0.84; p = 0.005) in patients with FM, but not sleep disturbance (SMD, 0.19; 95% CI -0.38 to 0.75; p = 0.52). Massage therapy with a duration of ≥ 5 weeks had beneficial immediate effects on pain, anxiety, and depression in patients with FM. Massage therapy should be one of the viable complementary and alternative treatments for FM. However, given the few eligible studies in subgroup meta-analyses and the lack of evidence on follow-up effects, large-scale randomized controlled trials with long follow-up are warranted to confirm the current findings.
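
The pooling used above (standardized mean differences combined under a random-effects model, heterogeneity summarized by the I² statistic) can be reproduced with the standard DerSimonian-Laird estimator. The per-study effect sizes and variances below are made-up illustrations, not the nine trials from the review:

```python
import numpy as np

def pool_random_effects(smd, var):
    """DerSimonian-Laird random-effects pooling of standardized mean differences."""
    smd, var = np.asarray(smd, float), np.asarray(var, float)
    w = 1.0 / var                               # fixed-effect (inverse-variance) weights
    mu_fe = np.sum(w * smd) / np.sum(w)
    q = np.sum(w * (smd - mu_fe) ** 2)          # Cochran's Q
    df = len(smd) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)               # between-study variance estimate
    w_re = 1.0 / (var + tau2)                   # random-effects weights
    mu = np.sum(w_re * smd) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    i2 = 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0
    return mu, (mu - 1.96 * se, mu + 1.96 * se), i2

# Hypothetical per-study SMDs and variances (for illustration only).
mu, ci, i2 = pool_random_effects([0.2, 0.9, 0.5], [0.04, 0.06, 0.05])
print(round(mu, 2), [round(x, 2) for x in ci], round(i2, 1))
```

When Q ≤ df, tau² is truncated to zero and the result collapses to the fixed-effect estimate, which is why the random-effects model is described as the more conservative choice only under heterogeneity.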

  16. Control Coordination of Large Scale Hereditary Systems.

    Science.gov (United States)

    1985-07-01

    Theory - A Hilbert Space Approach (Academic Press, New York, 1982). [4] W. Findeisen, F. N. Bailey, M. Brdys, K. Malinowski, P. Tatjewski and A. Wozniak... Findeisen et al. (1980), in the sense that local models are used in the design of component control laws and a higher level coordination problem is... Vol. 1, pp. 590-591, 1985. 3. W. Findeisen, F. N. Bailey, M. Brdys, K. Malinowski, P. Tatjewski and A. Wozniak, Control Coordination in Hierarchical

  17. Dry cupping for plantar fasciitis: a randomized controlled trial.

    Science.gov (United States)

    Ge, Weiqing; Leson, Chelsea; Vukovic, Corey

    2017-05-01

    [Purpose] The purpose of this study was to determine the effects of dry cupping on pain and function of patients with plantar fasciitis. [Subjects and Methods] Twenty-nine subjects (age 15 to 59 years, 20 females and 9 males), randomly assigned to two groups (dry cupping therapy and electrical stimulation therapy groups), participated in this study. The research design was a randomized controlled trial (RCT). Treatments were provided to the subjects twice a week for 4 weeks. Outcome measurements included the Visual Analogue Pain Scale (VAS) (at rest, first in the morning, and with activities), the Foot and Ankle Ability Measure (FAAM), the Lower Extremity Functional Scale (LEFS), as well as the pressure pain threshold. [Results] The data indicated that both dry cupping therapy and electrical stimulation therapy could reduce pain and increase function significantly in the population tested, as all the 95% Confidence Intervals (CIs) did not include 0 except for the pressure pain threshold. There was no significant difference between the dry cupping therapy and electrical stimulation groups in any of the outcome measurements. [Conclusion] These results support that both dry cupping therapy and electrical stimulation therapy could reduce pain and increase function in the population tested.

  18. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of large-scale environments is therefore imperative for the success of such applications, since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments remains time-consuming and manual. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic building identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which, unlike existing techniques, can recover missing or occluded texture information by integrating multiple information captured from different optical sensors (ground, aerial, and satellite).

  19. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised-learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine-learning benchmark data sets.

  20. On the Fluctuating Component of the Sun's Large-Scale Magnetic Field

    Science.gov (United States)

    Wang, Y.-M.; Sheeley, N. R., Jr.

    2003-06-01

    The Sun's large-scale magnetic field and its proxies are known to undergo substantial variations on timescales much less than a solar cycle but longer than a rotation period. Examples of such variations include the double activity maximum inferred by Gnevyshev, the large peaks in the interplanetary field strength observed in 1982 and 1991, and the 1.3-1.4 yr periodicities detected over limited time intervals in solar wind speed and geomagnetic activity. We consider the question of the extent to which these variations are stochastic in nature. For this purpose, we simulate the evolution of the Sun's equatorial dipole strength and total open flux under the assumption that the active region sources (BMRs) are distributed randomly in longitude. The results are then interpreted with the help of a simple random walk model including dissipation. We find that the equatorial dipole and open flux generally exhibit multiple peaks during each 11 yr cycle, with the highest peak as likely to occur during the declining phase as at sunspot maximum. The widths of the peaks are determined by the timescale τ~1 yr for the equatorial dipole to decay through the combined action of meridional flow, differential rotation, and supergranular diffusion. The amplitudes of the fluctuations depend on the strengths and longitudinal phase relations of the BMRs, as well as on the relative rates of flux emergence and decay. We conclude that stochastic processes provide a viable explanation for the "Gnevyshev gaps" and for the existence of quasi-periodicities in the range ~1-3 yr.
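
The stochastic picture described above (randomly phased source emergence plus decay on a timescale τ ~ 1 yr) is straightforward to simulate. In the sketch below, the emergence rate, unit kick size, and cycle modulation are illustrative assumptions, not the paper's calibrated values:

```python
import numpy as np

rng = np.random.default_rng(3)
tau, dt, years = 1.0, 0.01, 22.0               # decay time and time step, in years
t = np.arange(0.0, years, dt)

# Source emergence rate modulated over an 11-yr activity cycle (illustrative).
rate = 50.0 * np.sin(np.pi * t / 11.0) ** 2

D = np.zeros((len(t), 2))                      # equatorial dipole as a 2-D vector
for i in range(1, len(t)):
    D[i] = D[i - 1] * np.exp(-dt / tau)        # dissipation
    for _ in range(rng.poisson(rate[i] * dt)): # randomly phased sources (BMRs)
        phi = rng.uniform(0.0, 2.0 * np.pi)
        D[i] += np.array([np.cos(phi), np.sin(phi)])

strength = np.linalg.norm(D, axis=1)
# |D| typically shows several peaks per cycle rather than one smooth maximum.
```

Because the kicks add with random phases while the dipole decays, the highest peak of |D| need not coincide with the maximum of the emergence rate, mirroring the paper's conclusion.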

  1. The Physiotherapy for Femoroacetabular Impingement Rehabilitation STudy (physioFIRST): A Pilot Randomized Controlled Trial.

    Science.gov (United States)

    Kemp, Joanne L; Coburn, Sally L; Jones, Denise M; Crossley, Kay M

    2018-04-01

    Study Design A pilot double-blind randomized controlled trial (RCT). Background The effectiveness of physical therapy for femoroacetabular impingement syndrome (FAIS) is unknown. Objectives To determine the feasibility of an RCT investigating the effectiveness of a physical therapy intervention for FAIS. Methods Participants were 17 women and 7 men (mean ± SD age, 37 ± 8 years; body mass index, 25.4 ± 3.4 kg/m²) with FAIS who received physical therapy interventions provided over 12 weeks. The FAIS-specific physical therapy group received personalized progressive strengthening and functional retraining. The control group received standardized stretching exercises. In addition, both groups received manual therapy, progressive physical activity, and education. The primary outcome was feasibility, including integrity of the protocol, recruitment and retention, outcome measures, randomization procedure, and sample-size estimate. Secondary outcomes included hip pain and function (international Hip Outcome Tool-33 [iHOT-33]) and hip muscle strength. Poststudy interviews were conducted to determine potential improvements for future studies. Results Twenty-four (100%) patients with known eligibility agreed to participate. Four patients (17%) were lost to follow-up. All participants and the tester remained blinded, and the control intervention was acceptable to participants. The between-group mean differences in change scores were 16 (95% confidence interval [CI]: -9, 38) for the iHOT-33 and 0.24 (95% CI: 0.02, 0.47) Nm/kg for hip adduction strength, favoring the FAIS-specific physical therapy group. Using an effect size of 0.61, between-group improvements for the iHOT-33 suggest that 144 participants are required for a full-scale RCT. Conclusion A full-scale RCT of physical therapy for FAIS is feasible. A FAIS-specific physical therapy program has the potential for a moderate to large positive effect on hip pain, function, and hip adductor strength. Level of Evidence

  2. Mixed methods evaluation of a randomized control pilot trial targeting sugar-sweetened beverage behaviors.

    Science.gov (United States)

    Zoellner, Jamie; Cook, Emily; Chen, Yvonnes; You, Wen; Davy, Brenda; Estabrooks, Paul

    2013-02-01

    Excessive sugar-sweetened beverage (SSB) consumption and low health literacy skills have emerged as two public health concerns in the United States (US); however, there is limited research on how to effectively address these issues among adults. Guided by health literacy concepts and the Theory of Planned Behavior (TPB), this randomized controlled pilot trial applied the RE-AIM framework and a mixed methods approach to examine an SSB intervention (SipSmartER), as compared to a matched-contact control intervention targeting physical activity (MoveMore). Both 5-week interventions included two interactive group sessions and three support telephone calls. Executing a patient-centered developmental process, the primary aim of this paper was to evaluate patient feedback on intervention content and structure. The secondary aim was to understand the potential reach (i.e., proportion enrolled, representativeness) and effectiveness (i.e., health behaviors, theorized mediating variables, quality of life) of SipSmartER. Twenty-five participants were randomized to SipSmartER (n=14) or MoveMore (n=11). Participants' intervention feedback was positive, ranging from 4.2-5.0 on a 5-point scale. Qualitative assessments revealed several opportunities to improve clarity of learning materials, enhance instructions and communication, and refine research protocols. Although SSB consumption decreased more among the SipSmartER participants (-256.9 ± 622.6 kcals), there were no significant group differences when compared to control participants (-199.7 ± 404.6 kcals). Across both groups, there were significant improvements for SSB attitudes, SSB behavioral intentions, and two media literacy constructs. The value of using a patient-centered approach in the developmental phases of this intervention was apparent, and pilot findings suggest decreased SSB consumption may be achieved through targeted health literacy and TPB strategies. Future efforts are needed to examine

  3. EFT of large scale structures in redshift space

    Science.gov (United States)

    Lewandowski, Matthew; Senatore, Leonardo; Prada, Francisco; Zhao, Cheng; Chuang, Chia-Hsun

    2018-03-01

    We further develop the description of redshift-space distortions within the effective field theory of large scale structures. First, we generalize the counterterms to include the effect of baryonic physics and primordial non-Gaussianity. Second, we evaluate the IR resummation of the dark matter power spectrum in redshift space. This requires us to identify a controlled approximation that makes the numerical evaluation straightforward and efficient. Third, we compare the predictions of the theory at one loop with the power spectrum from numerical simulations up to ℓ = 6. We find that the IR resummation allows us to correctly reproduce the baryon acoustic oscillation peak. The k reach—or, equivalently, the precision for a given k—depends on additional counterterms that need to be matched to simulations. Since the nonlinear scale for the velocity is expected to be longer than the one for the overdensity, we consider a minimal and a nonminimal set of counterterms. The quality of our numerical data makes it hard to firmly establish the performance of the theory at high wave numbers. Within this limitation, we find that the theory at redshift z = 0.56 and up to ℓ = 2 matches the data at the percent level approximately up to k ~ 0.13 h Mpc⁻¹ or k ~ 0.18 h Mpc⁻¹, depending on the number of counterterms used, with a potentially large improvement over former analytical techniques.

  4. Moderator Roles of Optimism and Weight Control on the Impact of Playing Exergames on Happiness: The Perspective of Social Cognitive Theory Using A Randomized Controlled Trial.

    Science.gov (United States)

    Nguyen, Huynh Van; Huang, Han-Chung; Wong, May-Kuen; Yang, Ya-Hui; Huang, Tzu-Ling; Teng, Ching-I

    2018-04-30

    The literature on exergames has examined their impact on user-perceived psychological health (i.e., user-perceived happiness), but little is known about whether such an impact depends on user characteristics. Therefore, this study used the perspective of social cognitive theory (SCT) to identify potential moderators (i.e., whether the user is optimistic or attempting to control his or her weight) of the impact of playing exergames on user-perceived happiness. This large-scale randomized controlled trial recruited 337 college students (of whom 57.3% were female and all were aged 20-40 years) as participants. The participants were randomly assigned to the intervention group or the control group. Participants in the intervention group were asked to use an Xbox 360 to play 1 of 10 exergame programs for 30 minutes once a week for 2 weeks. Participants in the control group were not required to do so. Repeated-measures ANOVA was used for the analyses. The analytical results indicate that playing exergames helped to maintain happiness levels and prevented them from decreasing. The maintained happiness was more prominent among participants who were trying to control their weight, but did not differ between participants who were highly optimistic and those who were less optimistic. This study is the first using SCT to explain the contingent effect of playing exergames on user happiness. Exergames can maintain happiness among users, and such maintenance can be the strongest among users who are trying to control their weight.

  5. The impact of occupational therapy in Parkinson's disease: a randomized controlled feasibility study.

    Science.gov (United States)

    Sturkenboom, Ingrid H; Graff, Maud J; Borm, George F; Veenhuizen, Yvonne; Bloem, Bastiaan R; Munneke, Marten; Nijhuis-van der Sanden, Maria W

    2013-02-01

    To evaluate the feasibility of a randomized controlled trial including process and potential impact of occupational therapy in Parkinson's disease. Process and outcome were quantitatively and qualitatively evaluated in an exploratory multicentre, two-armed randomized controlled trial at three months. Forty-three community-dwelling patients with Parkinson's disease and difficulties in daily activities, their primary caregivers and seven occupational therapists. Ten weeks of home-based occupational therapy according to the Dutch guidelines of occupational therapy in Parkinson's disease versus no occupational therapy in the control group. Process evaluation measured accrual, drop-out, intervention delivery and protocol adherence. Primary outcome measures of patients assessed daily functioning: Canadian Occupational Performance Measure (COPM) and Assessment of Motor and Process Skills. Primary outcome for caregivers was caregiver burden: Zarit Burden Inventory. Participants' perspectives of the intervention were explored using questionnaires and in-depth interviews. Inclusion was 23% (43/189), drop-out 7% (3/43) and unblinding of assessors 33% (13/40). Full intervention protocol adherence was 74% (20/27), but only 60% (71/119) of baseline Canadian Occupational Performance Measure priorities were addressed in the intervention. The outcome measures revealed negligible to small effects in favour of the intervention group. Almost all patients and caregivers of the intervention group were satisfied with the results. They perceived: 'more grip on the situation' and used 'practical advices that make life easier'. Therapists were satisfied, but wished for a longer intervention period. The positive perceived impact of occupational therapy warrants a large-scale trial. Adaptations in instructions and training are needed to use the Canadian Occupational Performance Measure as primary outcome measure.

  6. The health system and population health implications of large-scale diabetes screening in India: a microsimulation model of alternative approaches.

    Directory of Open Access Journals (Sweden)

    Sanjay Basu

    2015-05-01

    Full Text Available Like a growing number of rapidly developing countries, India has begun to develop a system for large-scale community-based screening for diabetes. We sought to identify the implications of using alternative screening instruments to detect people with undiagnosed type 2 diabetes among diverse populations across India. We developed and validated a microsimulation model that incorporated data from 58 studies from across the country into a nationally representative sample of Indians aged 25-65 y old. We estimated the diagnostic and health system implications of three major survey-based screening instruments and random glucometer-based screening. Of the 567 million Indians eligible for screening, depending on which of four screening approaches is utilized, between 158 and 306 million would be expected to screen as "high risk" for type 2 diabetes and be referred for confirmatory testing. Between 26 million and 37 million of these people would be expected to meet international diagnostic criteria for diabetes, but between 126 million and 273 million would be "false positives." The ratio of false positives to true positives varied from 3.9 (when using random glucose screening) to 8.2 (when using a survey-based screening instrument) in our model. The cost per case found would be expected to range from US$5.28 (when using random glucose screening) to US$17.06 (when using a survey-based screening instrument), representing a total cost of between US$169 and US$567 million. The major limitation of our analysis is its dependence on published cohort studies that are unlikely to fully capture the poorest and most rural areas of the country. Because these areas are thought to have the lowest diabetes prevalence, this may result in overestimation of the efficacy and health benefits of screening. Large-scale community-based screening is anticipated to produce a large number of false-positive results, particularly if using currently available survey-based screening
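
The aggregate figures in the abstract are internally consistent, and the derived quantities can be checked directly. In the sketch below, the true-positive count of ~32 million for glucometer screening is an inferred assumption (US$169 million total cost divided by US$5.28 per case found); the flagged count of 158 million is taken from the abstract:

```python
def screening_yield(flagged_millions, true_pos_millions, total_cost_musd):
    """Derive the FP:TP ratio and the cost per case found from aggregate figures."""
    false_pos = flagged_millions - true_pos_millions
    fp_per_tp = false_pos / true_pos_millions
    cost_per_case = total_cost_musd / true_pos_millions  # millions cancel: US$ per case
    return fp_per_tp, cost_per_case

# Random glucometer-based screening: 158 M flagged (from the abstract),
# ~32 M true positives (assumed, inferred as above), US$169 M total cost.
ratio, cost = screening_yield(158, 32, 169)
print(round(ratio, 1), round(cost, 2))  # → 3.9 5.28
```

The same arithmetic with the survey-instrument figures (306 M flagged, ~33 M true positives, US$567 M) reproduces the 8.2 ratio and US$17.06 per case quoted in the abstract.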

  7. Psychosocial benefits of workplace physical exercise: cluster randomized controlled trial.

    Science.gov (United States)

    Jakobsen, Markus D; Sundstrup, Emil; Brandt, Mikkel; Andersen, Lars L

    2017-10-10

    While the benefits of workplace physical exercise on physical health are well known, little is known about the psychosocial effects of such initiatives. This study evaluates the effect of workplace versus home-based physical exercise on psychosocial factors among healthcare workers. A total of 200 female healthcare workers (age: 42.0, BMI: 24.1) from 18 departments at three hospitals were cluster-randomized to 10 weeks of: 1) home-based physical exercise (HOME) performed alone during leisure time for 10 min 5 days per week or 2) workplace physical exercise (WORK) performed in groups during working hours for 10 min 5 days per week, with up to 5 group-based coaching sessions on motivation for regular physical exercise. Vitality and mental health (SF-36, scale 0-100), psychosocial work environment (COPSOQ, scale 0-100), work and leisure disability (DASH, 0-100), control of pain (Bournemouth, scale 0-10) and concern about pain (Pain Catastrophizing Scale, scale 0-10) were assessed at baseline and at 10-week follow-up. Vitality as well as control of and concern about pain improved more following WORK than HOME, whereas mental health remained unchanged. Between-group differences at follow-up (WORK vs. HOME) were 7 [95% confidence interval (95% CI) 3 to 10] for vitality, -0.8 [95% CI -1.3 to -0.3] for control of pain and -0.9 [95% CI -1.4 to -0.5] for concern about pain, respectively. Performing physical exercise together with colleagues during working hours was more effective than home-based exercise in improving vitality and control of and concern about pain among healthcare workers. These benefits occurred in spite of an increased work pace. NCT01921764 at ClinicalTrials.gov. Registered 10 August 2013.
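
    The between-group differences with 95% CIs quoted above can be illustrated with a simple normal-theory computation (a sketch with made-up scores: the group means and SDs below are hypothetical, and the trial itself used adjusted models):

```python
import numpy as np

def diff_ci(a, b, z=1.96):
    """Between-group difference at follow-up with a normal-theory 95% CI."""
    d = a.mean() - b.mean()
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    return d - z * se, d, d + z * se

rng = np.random.default_rng(0)
work = rng.normal(67, 15, 100)   # hypothetical vitality scores, WORK arm
home = rng.normal(60, 15, 100)   # hypothetical vitality scores, HOME arm
lo, d, hi = diff_ci(work, home)
print(lo < d < hi)   # → True: the CI brackets the point estimate
```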

  8. A scaling analysis of electronic localization in two-dimensional random media

    International Nuclear Information System (INIS)

    Ye Zhen

    2003-01-01

    By an improved scaling analysis, we suggest that two possibilities may arise concerning electronic localization in two-dimensional random media. The first is that all electronic states are localized in two dimensions, as conjectured previously. The second is that electronic behaviors in two- and three-dimensional random systems are similar, in agreement with a recent direct calculation of the conductance using the Kubo formula. In this case, non-localized states are possible in two dimensions and have some peculiar properties. A few predictions are proposed. Moreover, the present analysis accommodates results from the previous scaling analysis.

  9. GAT: a graph-theoretical analysis toolbox for analyzing between-group differences in large-scale structural and functional brain networks.

    Science.gov (United States)

    Hosseini, S M Hadi; Hoeft, Fumiko; Kesler, Shelli R

    2012-01-01

    In recent years, graph theoretical analyses of neuroimaging data have increased our understanding of the organization of large-scale structural and functional brain networks. However, tools for pipeline application of graph theory for analyzing the topology of brain networks are still lacking. In this report, we describe the development of a graph-analysis toolbox (GAT) that facilitates analysis and comparison of structural and functional brain networks. GAT provides a graphical user interface (GUI) that facilitates construction and analysis of brain networks, comparison of regional and global topological properties between networks, analysis of network hubs and modules, and analysis of the resilience of the networks to random failure and targeted attacks. Area under a curve (AUC) and functional data analyses (FDA), in conjunction with permutation testing, are employed for testing the differences in network topologies; these analyses are less sensitive to the thresholding process. We demonstrated the capabilities of GAT by investigating the differences in the organization of regional gray-matter correlation networks in survivors of acute lymphoblastic leukemia (ALL) and healthy matched Controls (CON). The results revealed an alteration in small-world characteristics of the brain networks in the ALL survivors; an observation that confirms our hypothesis suggesting widespread neurobiological injury in ALL survivors. Along with demonstration of the capabilities of GAT, this is the first report of altered large-scale structural brain networks in ALL survivors.
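
    The AUC-across-thresholds idea described above can be sketched with plain NumPy: compute a network metric at many binarization thresholds and integrate, so the group comparison is not tied to one arbitrary cutoff (names and data here are illustrative, not GAT's actual API):

```python
import numpy as np

def avg_clustering(adj):
    """Average clustering coefficient of a binary undirected graph:
    (A^3)_ii counts closed triangles through node i."""
    a3 = adj @ adj @ adj
    k = adj.sum(axis=1)
    denom = k * (k - 1)
    c = np.where(denom > 0, np.diagonal(a3) / np.where(denom > 0, denom, 1), 0.0)
    return c.mean()

def metric_auc(corr, thresholds, metric):
    """Integrate a graph metric over binarization thresholds (trapezoid rule)."""
    values = []
    for t in thresholds:
        adj = (np.abs(corr) > t).astype(float)
        np.fill_diagonal(adj, 0.0)   # no self-connections
        values.append(metric(adj))
    auc = 0.0
    for i in range(1, len(thresholds)):
        auc += 0.5 * (values[i] + values[i - 1]) * (thresholds[i] - thresholds[i - 1])
    return auc

rng = np.random.default_rng(0)
data = rng.standard_normal((60, 12))            # 60 scans, 12 regions (toy)
corr = np.corrcoef(data, rowvar=False)
print(metric_auc(corr, np.linspace(0.1, 0.5, 9), avg_clustering) >= 0.0)  # → True
```

    A group comparison would compute this AUC for each group's network and test the difference against a permutation distribution.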

  11. The influence of Seychelles Dome on the large scale Tropical Variability

    Science.gov (United States)

    Manola, Iris; Selten, Frank; Hazeleger, Wilco

    2013-04-01

    The Seychelles Dome (SD) is the thermocline ridge just south of the equator in the western Indian Ocean basin. It is characterized by strong atmospheric convection and a shallow thermocline and is associated with large intraseasonal convection and SST variability (Harrison and Vecchi 2001). The SD is influenced by surface and subsurface processes, such as air-sea fluxes, Ekman upwelling from wind stress curl, ocean dynamics (vertical mixing) and oceanic Rossby waves from the southeastern Indian Ocean. The favored season for a strong SD is boreal winter, when the thermocline is most shallow. Then the southeasterly trade winds converge with the northwesterly monsoonal winds over the intertropical convergence zone and cause cyclonic wind stress curl that drives Ekman divergence and a ridging of the thermocline. The subseasonal and interannual variability of the SD is influenced by large-scale events, such as the Indian Ocean Dipole (IOD), ENSO and the Madden-Julian Oscillation (MJO) (Tozuka et al., 2010; Lloyd and Vecchi, 2010). The SD is enhanced by cooling events in the western Indian Ocean and easterly winds that raise the thermocline and increase the upwelling. This can be associated with a strong Walker circulation, such as negative IOD or La Nina-like conditions. So far, studies have focused on the origins of SD variability, but the influence of the SD itself on regional or large-scale climate is largely unknown. In this study we focus on the influence of SD variations on the large-scale tropical circulation. We analyze the covariance of the SD variations and the tropical circulation in a 200-year control simulation of the climate model EC-EARTH and perform idealized SST-forced simulations to study the character of the atmospheric response and its relation to ENSO, IOD and MJO. References: Harrison, D. E. and G. A. Vecchi, 2001: January 1999 Indian Ocean cooling event. Geophys. Res. Lett., 28, 3717-3720. Lloyd, I. D., and G. A

  12. Prednisolone and acupuncture in Bell's palsy: study protocol for a randomized, controlled trial

    Directory of Open Access Journals (Sweden)

    Wang Kangjun

    2011-06-01

    Background: There are a variety of treatment options for Bell's palsy. Evidence from randomized controlled trials indicates that corticosteroids are a proven therapy for Bell's palsy. Acupuncture is one of the most commonly used methods to treat Bell's palsy in China. Recent studies suggest that staging treatment is more suitable for Bell's palsy, according to the different path-stages of this disease. The aim of this study is to compare the effects of prednisolone and staging acupuncture on the recovery of the affected facial nerve, and to verify whether prednisolone in combination with staging acupuncture is more effective than prednisolone alone for Bell's palsy in a large number of patients. Methods/Design: In this article, we report the design and protocol of a large-sample multi-center randomized controlled trial to treat Bell's palsy with prednisolone and/or acupuncture. In total, 1200 patients aged 18 to 75 years within 72 h of onset of acute, unilateral, peripheral facial palsy will be assessed. There are six treatment groups, four treated according to different path-stages and two not. Patients are randomly assigned to one of the following six treatment groups: (1) placebo prednisolone, (2) prednisolone, (3) placebo prednisolone plus acute-stage acupuncture, (4) prednisolone plus acute-stage acupuncture, (5) placebo prednisolone plus resting-stage acupuncture, (6) prednisolone plus resting-stage acupuncture. The primary outcome is the time to complete recovery of facial function, assessed by the Sunnybrook system and the House-Brackmann scale. The secondary outcomes include the incidence of ipsilateral pain in the early stage of palsy (and the duration of this pain), the proportion of patients with severe pain, the occurrence of synkinesis, facial spasm or contracture, and the severity of residual facial symptoms during the study period. Discussion: The result of this trial will assess the

  13. Randomness confidence bands of fractal scaling exponents for financial price returns

    International Nuclear Information System (INIS)

    Ibarra-Valdez, C.; Alvarez, J.; Alvarez-Ramirez, J.

    2016-01-01

    Highlights: • A robust test for randomness of price returns is proposed. • The DFA scaling exponent is contrasted against confidence bands for random sequences. • The size of the band depends on the sequence length. • Crude oil and USA stock markets have rarely been inefficient. - Abstract: The weak form of the efficient market hypothesis (EMH) establishes that price returns behave as a pure random process, so their outcomes cannot be forecasted. The detrended fluctuation analysis (DFA) has been widely used to test the weak form of the EMH by checking whether time series of price returns are serially uncorrelated; in the presence of serial correlations, the DFA scaling exponent deviates from the theoretical value of 0.5. This work considers a test of the EMH based on DFA implemented over a sliding window, an approach intended to monitor the evolution of markets. Under these conditions, the scaling exponent exhibits important variations over the scrutinized period that can offer valuable insights into the behavior of the market, provided the estimated scaling exponent is subjected to strict statistical tests for the presence or absence of serial correlations in the price returns. In this work, the statistical tests are based on comparing the estimated scaling exponent with the values obtained from pure Gaussian sequences of the same length as the real time series. In this way, the presence of serial correlations can be asserted only relative to the confidence bands of a pure Gaussian process. The crude oil (WTI) and the USA stock (DJIA) markets are used to illustrate the methodology.
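
    The test described above, comparing a windowed DFA exponent against confidence bands built from pure Gaussian sequences of the same length, can be sketched as follows (a simplified DFA with a fixed scale set; all parameter choices are illustrative):

```python
import numpy as np

def dfa_exponent(x, scales=(8, 16, 32, 64)):
    """DFA: slope of log F(s) versus log s after linear detrending in boxes."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    flucts = []
    for s in scales:
        t = np.arange(s)
        sq = []
        for i in range(len(y) // s):
            seg = y[i * s:(i + 1) * s]
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            sq.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(sq)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

def gaussian_band(n, n_surrogates=100, q=(2.5, 97.5), seed=0):
    """Empirical confidence band of the exponent for Gaussian noise of length n."""
    rng = np.random.default_rng(seed)
    return np.percentile([dfa_exponent(rng.standard_normal(n))
                          for _ in range(n_surrogates)], q)

lo, hi = gaussian_band(512)
# A return series whose exponent falls outside [lo, hi] shows evidence of
# serial correlation; inside the band, randomness cannot be rejected.
print(round(lo, 2), round(hi, 2))
```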

  14. Accelerating Relevance Vector Machine for Large-Scale Data on Spark

    Directory of Open Access Journals (Sweden)

    Liu Fang

    2017-01-01

    Relevance vector machine (RVM) is a machine learning algorithm based on a sparse Bayesian framework, which performs well when running classification and regression tasks on small-scale datasets. However, RVM also has certain drawbacks that restrict its practical applications, such as (1) a slow training process and (2) poor performance when training on large-scale datasets. In order to solve these problems, we first propose Discrete AdaBoost RVM (DAB-RVM), which incorporates ensemble learning into RVM. This method performs well with large-scale low-dimensional datasets. However, as the number of features increases, the training time of DAB-RVM increases as well. To avoid this phenomenon, we utilize the abundant training samples of large-scale datasets and propose all-features boosting RVM (AFB-RVM), which modifies the way weak classifiers are obtained. In our experiments we study the differences between various boosting techniques with RVM, demonstrating the performance of the proposed approaches on Spark. As a result of this paper, two proposed approaches on Spark for different types of large-scale datasets are available.

  15. Incipient multiple fault diagnosis in real time with applications to large-scale systems

    International Nuclear Information System (INIS)

    Chung, H.Y.; Bien, Z.; Park, J.H.; Seon, P.H.

    1994-01-01

    By using a modified signed directed graph (SDG) together with distributed artificial neural networks and a knowledge-based system, a method of incipient multi-fault diagnosis is presented for large-scale physical systems with complex pipes and instrumentation such as valves, actuators, sensors, and controllers. The proposed method is designed so as to (1) make real-time incipient fault diagnosis possible for large-scale systems, (2) perform the fault diagnosis not only in the steady-state case but also in the transient case by using a concept of fault propagation time, which is newly adopted in the SDG model, (3) provide highly reliable diagnosis results and an expert-system-like capability to explain the faults diagnosed, and (4) diagnose pipe damage such as leaking, break, or throttling. This method is applied for diagnosis of a pressurizer in the Kori Nuclear Power Plant (NPP) unit 2 in Korea under a transient condition, and its result is reported to show satisfactory performance of the method for incipient multi-fault diagnosis of such a large-scale system in real time.
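
    The signed-directed-graph idea above can be illustrated in a few lines: nodes carry +1/-1 deviations, edges carry signs, and a candidate root fault is retained if propagating its deviation reproduces the observed pattern. The pressurizer-like topology below is purely hypothetical:

```python
# Hypothetical miniature SDG: each edge carries a sign (+1/-1); a candidate
# root fault "explains" the observations if propagating its deviation along
# the edges reproduces the observed deviation signs.
edges = {
    "leak": [("pressure", -1)],
    "pressure": [("level", -1), ("heater_duty", +1)],
}

def propagate(root, deviation, graph):
    """Breadth-first sign propagation from a root deviation."""
    signs = {root: deviation}
    frontier = [root]
    while frontier:
        node = frontier.pop()
        for child, sign in graph.get(node, []):
            if child not in signs:
                signs[child] = signs[node] * sign
                frontier.append(child)
    return signs

observed = {"pressure": -1, "level": +1, "heater_duty": -1}
predicted = propagate("leak", +1, edges)
consistent = all(predicted.get(k) == v for k, v in observed.items())
print(consistent)  # → True: a leak is a consistent root-cause hypothesis
```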

  16. Bayesian hierarchical model for large-scale covariance matrix estimation.

    Science.gov (United States)

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating large-scale covariance matrices. The traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework, and introduce dependency between covariance parameters. We demonstrate the advantages of our approaches over the traditional approaches using simulations and OMICS data analysis.
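
    The overfitting problem mentioned above can be illustrated with a simple shrinkage estimator (a stand-in for the paper's Bayesian hierarchical model; the shrinkage weight lam is hand-set here, whereas a hierarchical approach infers the pooling from the data):

```python
import numpy as np

def shrink_covariance(X, lam):
    """Shrink the sample covariance toward a diagonal target.
    lam=0 is the raw (overfit) estimate; lam=1 is fully diagonal."""
    S = np.cov(X, rowvar=False)
    target = np.diag(np.diag(S))
    return (1 - lam) * S + lam * target

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 50))   # few samples, many variables: S is singular
S = np.cov(X, rowvar=False)
S_shrunk = shrink_covariance(X, lam=0.5)

# Shrinkage restores the invertibility that the raw estimate lacks.
print(np.linalg.matrix_rank(S), np.linalg.matrix_rank(S_shrunk))
```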

  17. Controlling high-throughput manufacturing at the nano-scale

    Science.gov (United States)

    Cooper, Khershed P.

    2013-09-01

    Interest in nano-scale manufacturing research and development is growing. The reason is to accelerate the translation of discoveries and inventions of nanoscience and nanotechnology into products that would benefit industry, economy and society. Ongoing research in nanomanufacturing is focused primarily on developing novel nanofabrication techniques for a variety of applications—materials, energy, electronics, photonics, biomedical, etc. Our goal is to foster the development of high-throughput methods of fabricating nano-enabled products. Large-area parallel processing and high-speed continuous processing are high-throughput means for mass production. An example of large-area processing is step-and-repeat nanoimprinting, by which nanostructures are reproduced again and again over a large area, such as a 12 in wafer. Roll-to-roll processing is an example of continuous processing, by which it is possible to print and imprint multi-level nanostructures and nanodevices on a moving flexible substrate. The big pay-off is high-volume production and low unit cost. However, the anticipated cost benefits can only be realized if the increased production rate is accompanied by high yields of high quality products. To ensure product quality, we need to design and construct manufacturing systems such that the processes can be closely monitored and controlled. One approach is to bring cyber-physical systems (CPS) concepts to nanomanufacturing. CPS involves the control of a physical system such as manufacturing through modeling, computation, communication and control. Such a closely coupled system will involve in-situ metrology and closed-loop control of the physical processes guided by physics-based models and driven by appropriate instrumentation, sensing and actuation. This paper will discuss these ideas in the context of controlling high-throughput manufacturing at the nano-scale.
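
    The closed-loop control idea sketched above, sensing, comparing to a setpoint, and actuating each cycle, reduces to a few lines in its simplest proportional form (all names and values here are illustrative, not from the paper):

```python
def run_loop(setpoint, reading, gain=0.5, steps=20):
    """Proportional feedback: each cycle applies a correction
    proportional to the measured error."""
    for _ in range(steps):
        error = setpoint - reading   # in-situ metrology
        reading += gain * error      # actuation
    return reading

# The process variable converges to the setpoint (error halves each cycle).
print(round(run_loop(100.0, 80.0), 3))  # → 100.0
```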

  18. Talking About The Smokes: a large-scale, community-based participatory research project.

    Science.gov (United States)

    Couzos, Sophia; Nicholson, Anna K; Hunt, Jennifer M; Davey, Maureen E; May, Josephine K; Bennet, Pele T; Westphal, Darren W; Thomas, David P

    2015-06-01

    To describe the Talking About The Smokes (TATS) project according to the World Health Organization guiding principles for conducting community-based participatory research (PR) involving indigenous peoples, to assist others planning large-scale PR projects. The TATS project was initiated in Australia in 2010 as part of the International Tobacco Control Policy Evaluation Project, and surveyed a representative sample of 2522 Aboriginal and Torres Strait Islander adults to assess the impact of tobacco control policies. The PR process of the TATS project, which aimed to build partnerships to create equitable conditions for knowledge production, was mapped and summarised onto a framework adapted from the WHO principles. Processes describing consultation and approval, partnerships and research agreements, communication, funding, ethics and consent, data and benefits of the research. The TATS project involved baseline and follow-up surveys conducted in 34 Aboriginal community-controlled health services and one Torres Strait community. Consistent with the WHO PR principles, the TATS project built on community priorities and strengths through strategic partnerships from project inception, and demonstrated the value of research agreements and trusting relationships to foster shared decision making, capacity building and a commitment to Indigenous data ownership. Community-based PR methodology, by definition, needs adaptation to local settings and priorities. The TATS project demonstrates that large-scale research can be participatory, with strong Indigenous community engagement and benefits.

  19. Stochastic inflation lattice simulations: Ultra-large scale structure of the universe

    International Nuclear Information System (INIS)

    Salopek, D.S.

    1990-11-01

    Non-Gaussian fluctuations for structure formation may arise in inflation from the nonlinear interaction of long wavelength gravitational and scalar fields. Long wavelength fields have spatial gradients α⁻¹∇ small compared to the Hubble radius, and they are described in terms of classical random fields that are fed by short wavelength quantum noise. Lattice Langevin calculations are given for a "toy model" with a scalar field interacting with an exponential potential, where one can obtain exact analytic solutions of the Fokker-Planck equation. For single scalar field models that are consistent with current microwave background fluctuations, the fluctuations are Gaussian. However, for scales much larger than our observable Universe, one expects large metric fluctuations that are non-Gaussian. This example illuminates non-Gaussian models involving multiple scalar fields which are consistent with current microwave background limits. 21 refs., 3 figs
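
    A lattice Langevin calculation of the kind described above integrates a stochastic differential equation driven by short-wavelength noise. A toy Euler-Maruyama integrator shows the mechanics (the linear drift below is purely illustrative, not the paper's exponential potential):

```python
import numpy as np

def langevin_ensemble(drift, x0, dt, steps, sigma, n_paths=500, seed=0):
    """Euler-Maruyama integration of dx = drift(x) dt + sigma dW
    for an ensemble of paths."""
    rng = np.random.default_rng(seed)
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(steps):
        x += drift(x) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    return x

# For drift(x) = -x the stationary variance is sigma^2 / 2, which the
# ensemble reproduces after relaxation.
final = langevin_ensemble(lambda x: -x, 0.0, 0.01, 2000, 1.0)
print(round(final.var(), 2))
```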

  20. Randomized central limit theorems: A unified theory.

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic: all ensemble components are scaled by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic: the ensemble components are scaled by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs), in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes, and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.
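
    A quick simulation contrasts the two scaling schemes: a common deterministic scale yields near-Gaussian sums, while per-component random scales drawn from a power law (a Pareto draw here, as an illustrative stand-in for the paper's Poissonian power-law scheme) produce heavy tails:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 1000, 2000

# Deterministic scaling: every component scaled by the common factor 1/sqrt(n).
det_sums = rng.standard_normal((reps, n)).sum(axis=1) / np.sqrt(n)

# Stochastic scaling: each component multiplied by its own random scale
# drawn from a power law, the "random environment" of the abstract.
scales = rng.pareto(1.5, size=(reps, n)) + 1.0
sto_sums = (scales * rng.standard_normal((reps, n))).sum(axis=1) / np.sqrt(n)

def excess_kurtosis(x):
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 4) - 3.0

# Deterministically scaled sums are close to Gaussian (excess kurtosis ~ 0);
# randomly scaled sums are visibly heavy-tailed (large positive kurtosis).
print(excess_kurtosis(det_sums), excess_kurtosis(sto_sums))
```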

  1. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  3. The analysis of MAI in large scale MIMO-CDMA system

    Science.gov (United States)

    Berceanu, Madalina-Georgiana; Voicu, Carmen; Halunga, Simona

    2016-12-01

    Recently, technological development has imposed rapid growth in the use of data carried by cellular services, which also implies the need for higher data rates and lower latency. To meet users' demands, a series of new data processing techniques has been brought into discussion. In this paper, we approach MIMO technology, which uses multiple antennas at the receiver and transmitter ends. To study the performance obtained by this technology, we propose a MIMO-CDMA system in which image transmission is used instead of random data transmission, to take advantage of a larger range of quality indicators. In the simulations we increased the number of antennas and observed how the performance of the system changed; based on that, we were able to compare a conventional MIMO and a Large Scale MIMO system in terms of BER and the MSSIM index, a metric that compares the quality of the image before transmission with that of the received one.

  4. Acupuncture intervention in ischemic stroke: a randomized controlled prospective study.

    Science.gov (United States)

    Shen, Peng-Fei; Kong, Li; Ni, Li-Wei; Guo, Hai-Long; Yang, Sha; Zhang, Li-Li; Zhang, Zhi-Long; Guo, Jia-Kui; Xiong, Jie; Zhen, Zhong; Shi, Xue-Min

    2012-01-01

    Stroke is one of the most common causes of death and few pharmacological therapies show benefits in ischemic stroke. In this study, 290 patients aged 40-75 years old with first onset of acute ischemic stroke (more than 24 hours but within 14 days) were treated with standard treatments, and then were randomly allocated into an intervention group (treated with resuscitating acupuncture) and a control group (treated using sham-acupoints). Primary outcome measures included Barthel Index (BI), relapse and death up to six months. Of the 290 patients, one case in the intervention group and two cases in the control group died from the disease (p = 0.558). Six of the 144 cases in the intervention group had relapse, whereas 34 of the 143 patients in the control group did. For the National Institute of Health Stroke Scale (NIHSS), the between-group difference was not significant at two weeks (7.03 ± 3.201 vs. 8.13 ± 3.634; p = 0.067) but favored the intervention group at four weeks (4.15 ± 2.032 vs. 6.35 ± 3.131). The Chinese Stroke Scale (CSS) at four weeks showed more improvement in the intervention group than in the control group (9.40 ± 4.51 vs. 13.09 ± 5.80). The Stroke Specific Quality of Life Scale (SS-QOL) at six months was higher in the intervention group (166.63 ± 45.70) than the control group (143.60 ± 50.24; p < 0.01). The results of this clinical trial showed a clinically relevant decrease of relapse in patients treated with resuscitating acupuncture by the end of six months, compared with needling at sham-acupoints. The resuscitating acupuncture intervention could also improve self-care ability and quality of life, as evaluated with the BI, NIHSS, CSS, Oxford Handicap Scale (OHS), and SS-QOL.

  5. Synthesis of ordered large-scale ZnO nanopore arrays

    International Nuclear Information System (INIS)

    Ding, G.Q.; Shen, W.Z.; Zheng, M.J.; Fan, D.H.

    2006-01-01

    An effective approach is demonstrated for growing ordered large-scale ZnO nanopore arrays through radio-frequency magnetron sputtering deposition on porous alumina membranes (PAMs). The realization of highly ordered hexagonal ZnO nanopore arrays benefits from the unique properties of ZnO (hexagonal structure, polar surfaces, and preferable growth directions) and PAMs (controllable hexagonal nanopores and localized negative charges). Further evidence has been shown through the effects of nanorod size and thermal treatment of PAMs on the yielded morphology of ZnO nanopore arrays. This approach opens the possibility of creating regular semiconducting nanopore arrays for applications such as filters, sensors, and templates.

  6. Quantum Coherence and Random Fields at Mesoscopic Scales

    International Nuclear Information System (INIS)

    Rosenbaum, Thomas F.

    2016-01-01

    We seek to explore and exploit model, disordered and geometrically frustrated magnets where coherent spin clusters stably detach themselves from their surroundings, leading to extreme sensitivity to finite frequency excitations and the ability to encode information. Global changes in either the spin concentration or the quantum tunneling probability via the application of an external magnetic field can tune the relative weights of quantum entanglement and random field effects on the mesoscopic scale. These same parameters can be harnessed to manipulate domain wall dynamics in the ferromagnetic state, with technological possibilities for magnetic information storage. Finally, extensions from quantum ferromagnets to antiferromagnets promise new insights into the physics of quantum fluctuations and effective dimensional reduction. A combination of ac susceptometry, dc magnetometry, noise measurements, hole burning, non-linear Fano experiments, and neutron diffraction as functions of temperature, magnetic field, frequency, excitation amplitude, dipole concentration, and disorder address issues of stability, overlap, coherence, and control. We have been especially interested in probing the evolution of the local order in the progression from spin liquid to spin glass to long-range-ordered magnet.

  8. Research on the impacts of large-scale electric vehicles integration into power grid

    Science.gov (United States)

    Su, Chuankun; Zhang, Jian

    2018-06-01

    Because of their distinctive energy-delivery mode, electric vehicles can improve the efficiency of energy utilization and reduce environmental pollution, and they are therefore attracting growing attention. However, the charging behavior of electric vehicles is random and intermittent. Large-scale disordered charging places great pressure on the structure and operation of the power grid and affects its safe and economical operation. With the development of vehicle-to-grid (V2G) technology, studying the charging and discharging characteristics of electric vehicles is of great significance for improving the safe operation of the power grid and the efficiency of energy utilization.
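The grid stress from uncoordinated charging can be illustrated with a toy Monte Carlo aggregation; the fleet size, charger rating, and arrival-time distribution below are illustrative assumptions, not values from the paper:

```python
import random

random.seed(42)

N_EVS = 1000        # fleet size (assumed for illustration)
POWER_KW = 7.0      # per-vehicle charger rating (assumed)

# Uncoordinated ("disordered") charging: each EV plugs in at a random
# evening hour and draws full power for a random 1-4 h session.
load = [0.0] * 24   # aggregate demand per hour of day, in kW
for _ in range(N_EVS):
    start = random.choice([17, 18, 19, 20, 21, 22])
    duration = random.randint(1, 4)
    for h in range(start, start + duration):
        load[h % 24] += POWER_KW

peak = max(load)
average = sum(load) / 24.0
# Demand piles up in the evening: the peak far exceeds the daily mean,
# which is the kind of grid pressure the abstract describes.
print(f"peak={peak:.0f} kW  average={average:.0f} kW  ratio={peak / average:.1f}")
```

Coordinated (V2G-style) scheduling would flatten this profile by shifting sessions into off-peak hours.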

  9. Promoting Handwashing Behavior: The Effects of Large-scale Community and School-level Interventions.

    Science.gov (United States)

    Galiani, Sebastian; Gertler, Paul; Ajzenman, Nicolas; Orsola-Vidal, Alexandra

    2016-12-01

    This paper analyzes a randomized experiment that uses novel strategies to promote handwashing with soap at critical points in time in Peru. It evaluates a large-scale comprehensive initiative that involved both community and school activities in addition to communication campaigns. The analysis indicates that the initiative was successful in reaching the target audience and in increasing the treated population's knowledge about appropriate handwashing behavior. These improvements translated into higher self-reported and observed handwashing with soap at critical junctures. However, no significant improvements in the health of children under the age of 5 years were observed. Copyright © 2015 John Wiley & Sons, Ltd.

  10. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  11. Similitude and scaling of large structural elements: Case study

    Directory of Open Access Journals (Sweden)

    M. Shehadeh

    2015-06-01

    Scaled-down models are widely used for experimental investigations of large structures due to the limited capacities of testing facilities and the expense of experimentation. The modeling accuracy depends upon the model material properties, fabrication accuracy and loading techniques. In the present work the Buckingham π theorem is used to develop the relations (i.e. geometry, loading and properties) between the model and a large structural element such as those present in huge existing petroleum oil drilling rigs. The model is to be designed, loaded and treated according to a set of similitude requirements that relate the model to the large structural element. Three independent scale factors, representing the three fundamental dimensions of mass, length and time, need to be selected for designing the scaled-down model. Numerical prediction of the stress distribution within the model and its elastic deformation under steady loading is to be made. The results are compared with those obtained from numerical computations on the full-scale structure. The effect of scaled-down model size and material on the accuracy of the modeling technique is thoroughly examined.
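As an illustration of the three-primary-factor approach described above, the sketch below derives secondary scale factors from assumed mass, length, and time scales; the 1:10 ratio and Froude-type time scaling are demonstration assumptions, not values from the case study:

```python
# Buckingham-pi style similitude: choose three primary scale factors
# (mass, length, time) and derive the rest from dimensional formulas.

lam_L = 1.0 / 10.0        # length scale: 1:10 model (assumed)
lam_M = lam_L ** 3        # mass scale if model and prototype share density
lam_T = lam_L ** 0.5      # time scale that preserves the Froude number

# Secondary factors follow from each quantity's dimensions [M^a L^b T^c]:
lam_velocity = lam_L / lam_T               # [L T^-1]
lam_accel    = lam_L / lam_T ** 2          # [L T^-2] -> 1, gravity unchanged
lam_force    = lam_M * lam_L / lam_T ** 2  # [M L T^-2]
lam_stress   = lam_force / lam_L ** 2      # [M L^-1 T^-2]

print(lam_velocity, lam_accel, lam_force, lam_stress)
```

With these choices, accelerations are preserved (the model sees full gravity) and stresses scale with the geometric ratio, which is why model material selection matters so much.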

  12. Large-scale preparation of hollow graphitic carbon nanospheres

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jun; Li, Fu [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Bai, Yu-Jun, E-mail: byj97@126.com [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); State Key laboratory of Crystal Materials, Shandong University, Jinan 250100 (China); Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Lu, Xi-Feng [Lunan Institute of Coal Chemical Engineering, Jining 272000 (China)

    2013-01-15

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g⁻¹ after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C, and exhibit electrochemical performance superior to graphite. Highlights: • Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C. • The preparation is simple, effective and eco-friendly. • The in situ yielded MgO nanocrystals promote the graphitization. • The HGCNSs exhibit electrochemical performance superior to graphite.

  13. Large-scale impact cratering on the terrestrial planets

    International Nuclear Information System (INIS)

    Grieve, R.A.F.

    1982-01-01

    The crater densities on the earth and moon form the basis for a standard flux-time curve that can be used in dating unsampled planetary surfaces and constraining the temporal history of endogenic geologic processes. Abundant evidence is seen not only that impact cratering was an important surface process in planetary history but also that large impact events produced effects that were crustal in scale. By way of example, it is noted that the formation of multiring basins on the early moon was as important in defining the planetary tectonic framework as plate tectonics is on the earth. Evidence from several planets suggests that the effects of very-large-scale impacts go beyond the simple formation of an impact structure and serve to localize increased endogenic activity over an extended period of geologic time. Even though no longer occurring with the frequency and magnitude of early solar system history, large-scale impact events continue to affect the local geology of the planets. 92 references

  14. Evaluation of Co-Q10 anti-gingivitis effect on plaque induced gingivitis: A randomized controlled clinical trial

    Directory of Open Access Journals (Sweden)

    Anirban Chatterjee

    2012-01-01

    Background: Deficiency of Co-Q10 has been found to be responsible for periodontal destruction; therefore, this study was undertaken to evaluate the anti-gingivitis effect of Co-Q10 on plaque-induced gingivitis. Materials and Methods: Thirty subjects with plaque-induced gingivitis were enrolled in a split-mouth randomized controlled trial. For each subject, scaling was randomly performed for any two quadrants, followed by the topical application of Co-Q10 randomly in a previously scaled and an unscaled quadrant for a period of 28 days. The planned treatment options included option A: scaling only; option B: Co-Q10 along with scaling; and option C: Co-Q10 alone. Results: Marked reductions in gingival, bleeding, and plaque scores were recorded at the sites where Co-Q10 was applied. Mean ± SD of the aforementioned periodontal parameters on day 28 showed significant reductions for options A, B, and C when compared with baseline. Conclusion: Promising results were obtained after the solitary application of Co-Q10 as well as when it was used as an adjunct to scaling and root planing for treatment of plaque-induced gingivitis.

  15. Scaling Limit of Symmetric Random Walk in High-Contrast Periodic Environment

    Science.gov (United States)

    Piatnitski, A.; Zhizhina, E.

    2017-11-01

    The paper deals with the asymptotic properties of a symmetric random walk in a high contrast periodic medium in Z^d, d≥1. From the existing homogenization results it follows that under diffusive scaling the limit behaviour of this random walk need not be Markovian. The goal of this work is to show that if in addition to the coordinate of the random walk in Z^d we introduce an extra variable that characterizes the position of the random walk inside the period then the limit dynamics of this two-component process is Markov. We describe the limit process and observe that the components of the limit process are coupled. We also prove the convergence in the path space for the said random walk.
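For the simple homogeneous case (no high-contrast medium), the diffusive scaling mentioned above can be checked numerically; this sketch is a plain symmetric walk on Z, not the paper's two-component construction:

```python
import random

random.seed(0)

def srw_endpoint(n_steps):
    """Endpoint of a simple symmetric random walk on Z."""
    return sum(random.choice((-1, 1)) for _ in range(n_steps))

# Diffusive scaling: X_n / sqrt(n) should have mean ~0 and variance ~1
# (unit steps), approaching a Brownian limit.  In a high-contrast medium
# the limit can fail to be Markovian unless the walk is augmented with an
# extra coordinate, as the paper describes.
n, trials = 400, 5000
samples = [srw_endpoint(n) / n ** 0.5 for _ in range(trials)]
mean = sum(samples) / trials
var = sum((x - mean) ** 2 for x in samples) / trials
print(f"mean≈{mean:.3f}, var≈{var:.3f}")
```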

  16. Quenched Large Deviations for Simple Random Walks on Percolation Clusters Including Long-Range Correlations

    Science.gov (United States)

    Berger, Noam; Mukherjee, Chiranjib; Okamura, Kazuki

    2018-03-01

    We prove a quenched large deviation principle (LDP) for a simple random walk on a supercritical percolation cluster (SRWPC) on {Z^d} ({d ≥ 2}). The models under interest include classical Bernoulli bond and site percolation as well as models that exhibit long range correlations, like the random cluster model, the random interlacement and the vacant set of random interlacements (for {d ≥ 3}) and the level sets of the Gaussian free field ({d≥ 3}). Inspired by the methods developed by Kosygina et al. (Commun Pure Appl Math 59:1489-1521, 2006) for proving quenched LDP for elliptic diffusions with a random drift, and by Yilmaz (Commun Pure Appl Math 62(8):1033-1075, 2009) and Rosenbluth (Quenched large deviations for multidimensional random walks in a random environment: a variational formula. Ph.D. thesis, NYU, arXiv:0804.1444v1) for similar results regarding elliptic random walks in random environment, we take the point of view of the moving particle and prove a large deviation principle for the quenched distribution of the pair empirical measures of the environment Markov chain in the non-elliptic case of SRWPC. Via a contraction principle, this reduces easily to a quenched LDP for the distribution of the mean velocity of the random walk and both rate functions admit explicit variational formulas. The main difficulty in our set up lies in the inherent non-ellipticity as well as the lack of translation-invariance stemming from conditioning on the fact that the origin belongs to the infinite cluster. We develop a unifying approach for proving quenched large deviations for SRWPC based on exploiting coercivity properties of the relative entropies in the context of convex variational analysis, combined with input from ergodic theory and invoking geometric properties of the supercritical percolation cluster.

  17. The effect of adaptive versus static practicing on student learning - evidence from a randomized field experiment

    NARCIS (Netherlands)

    van Klaveren, Chris; Vonk, Sebastiaan; Cornelisz, Ilja

    2017-01-01

    Schools and governments are increasingly investing in adaptive practice software. To date, the evidence whether adaptivity improves learning outcomes is limited and mixed. A large-scale randomized control trial is conducted in Dutch secondary schools to evaluate the effectiveness of an adaptive

  18. Beneficial Effect of Mindfulness-Based Art Therapy in Patients with Breast Cancer-A Randomized Controlled Trial.

    Science.gov (United States)

    Jang, Seung-Ho; Kang, Seung-Yeon; Lee, Hye-Jin; Lee, Sang-Yeol

    2016-01-01

    Mindfulness-based art therapy (MBAT) induces emotional relaxation in cancer patients and is a treatment known to improve psychological stability. The objective of this research was to evaluate the treatment effects of MBAT for breast cancer patients. Overall, 24 breast cancer patients were selected as subjects of the study. Two groups, the MBAT group and control group with 12 patients each, were randomly assigned. The patients in the MBAT group were given 12 sessions of treatment. To measure depression and anxiety, subscales of the Personality Assessment Inventory (PAI) were used. Health-related quality of life was evaluated using the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire (EORTC QLQ-C30). The treatment results were analyzed using analysis of covariance (ANCOVA) and two-way repeated measures analysis of variance (ANOVA). The results showed that depression and anxiety decreased significantly and health-related quality of life improved significantly in the MBAT group. In the control group, however, there was no significant change. MBAT can be seen as an effective treatment method that improves breast cancer patients' psychological stability and quality of life. Evaluation of treatment effects using program development and large-scale research for future clinical application is needed. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Hierarchical Cantor set in the large scale structure with torus geometry

    Energy Technology Data Exchange (ETDEWEB)

    Murdzek, R. [Physics Department, ' Al. I. Cuza' University, Blvd. Carol I, Nr. 11, Iassy 700506 (Romania)], E-mail: rmurdzek@yahoo.com

    2008-12-15

    The formation of large-scale structures is considered within a string model on a toroidal space-time. First, the space-time geometry is presented; in this geometry, the Universe is represented by a string describing a torus surface. Thereafter, the large-scale structure of the Universe is derived from the string oscillations. The results are in agreement with the cellular structure of the large-scale distribution and with the theory of a Cantorian space-time.

  20. Canopy-scale biophysical controls on transpiration and evaporation in the Amazon Basin

    DEFF Research Database (Denmark)

    Mallick, Kaniska; Trebs, Ivonne; Bøgh, Eva

    2016-01-01

    to directly quantify the canopy-scale biophysical controls on λET and λEE over multiple plant functional types (PFTs) in the Amazon Basin. Combining data from six LBA (Large-scale Biosphere-Atmosphere Experiment in Amazonia) eddy covariance tower sites and a TR-driven physically based modeling approach, we...

  1. Large-scale Motion of Solar Filaments

    Indian Academy of Sciences (India)

    tribpo

    Large-scale Motion of Solar Filaments. Pavel Ambrož, Astronomical Institute of the Acad. Sci. of the Czech Republic, CZ-25165. Ondrejov, The Czech Republic. e-mail: pambroz@asu.cas.cz. Alfred Schroll, Kanzelhöehe Solar Observatory of the University of Graz, A-9521 Treffen,. Austria. e-mail: schroll@solobskh.ac.at.

  2. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.

  3. Topology Optimization of Large Scale Stokes Flow Problems

    DEFF Research Database (Denmark)

    Aage, Niels; Poulsen, Thomas Harpsøe; Gersborg-Hansen, Allan

    2008-01-01

    This note considers topology optimization of large scale 2D and 3D Stokes flow problems using parallel computations. We solve problems with up to 1,125,000 elements in 2D and 128,000 elements in 3D on a shared memory computer consisting of Sun UltraSparc IV CPUs.

  4. CLASS: The Cosmology Large Angular Scale Surveyor

    Science.gov (United States)

    Essinger-Hileman, Thomas; Ali, Aamir; Amiri, Mandana; Appel, John W.; Araujo, Derek; Bennett, Charles L.; Boone, Fletcher; Chan, Manwei; Cho, Hsiao-Mei; Chuss, David T.; hide

    2014-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is an experiment to measure the signature of a gravitational wave background from inflation in the polarization of the cosmic microwave background (CMB). CLASS is a multi-frequency array of four telescopes operating from a high-altitude site in the Atacama Desert in Chile. CLASS will survey 70% of the sky in four frequency bands centered at 38, 93, 148, and 217 GHz, which are chosen to straddle the Galactic-foreground minimum while avoiding strong atmospheric emission lines. This broad frequency coverage ensures that CLASS can distinguish Galactic emission from the CMB. The sky fraction of the CLASS survey will allow the full shape of the primordial B-mode power spectrum to be characterized, including the signal from reionization at low multipoles. Its unique combination of large sky coverage, control of systematic errors, and high sensitivity will allow CLASS to measure or place upper limits on the tensor-to-scalar ratio at a level of r = 0.01 and make a cosmic-variance-limited measurement of the optical depth to the surface of last scattering, tau. (c) (2014) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  5. The MIRAGE project: large scale radionuclide transport investigations and integral migration experiments

    International Nuclear Information System (INIS)

    Come, B.; Bidoglio, G.; Chapman, N.

    1986-01-01

    Predictions of radionuclide migration through the geosphere must be supported by large-scale, long-term investigations. Several research areas of the MIRAGE Project are devoted to acquiring reliable data for developing and validating models. Apart from man-made migration experiments in boreholes and/or underground galleries, attention is paid to natural geological migration systems which have been active for very long time spans. The potential role of microbial activity, either resident or introduced into the host media, is also considered. In order to clarify basic mechanisms, smaller scale "integral" migration experiments under fully controlled laboratory conditions are also carried out using real waste forms and representative geological media. (author)

  6. Prehospital Acute Stroke Severity Scale to Predict Large Artery Occlusion: Design and Comparison With Other Scales.

    Science.gov (United States)

    Hastrup, Sidsel; Damgaard, Dorte; Johnsen, Søren Paaske; Andersen, Grethe

    2016-07-01

    We designed and validated a simple prehospital stroke scale to identify emergent large vessel occlusion (ELVO) in patients with acute ischemic stroke and compared the scale to other published scales for prediction of ELVO. A national historical test cohort of 3127 patients with information on intracranial vessel status (angiography) before reperfusion therapy was identified. National Institutes of Health Stroke Scale (NIHSS) items with the highest predictive value for occlusion of a large intracranial artery were identified, and the most optimal combination meeting predefined criteria to ensure usefulness in the prehospital phase was determined. The predictive performance of the Prehospital Acute Stroke Severity (PASS) scale was compared with other published scales for ELVO. The PASS scale was composed of 3 NIHSS scores: level of consciousness (month/age), gaze palsy/deviation, and arm weakness. In the derivation of PASS, 2/3 of the test cohort was used, showing an accuracy (area under the curve) of 0.76 for detecting large arterial occlusion. The optimal cut point ≥2 abnormal scores showed: sensitivity=0.66 (95% CI, 0.62-0.69), specificity=0.83 (0.81-0.85), and area under the curve=0.74 (0.72-0.76). Validation on 1/3 of the test cohort showed similar performance. Patients with a large artery occlusion on angiography with PASS ≥2 had a median NIHSS score of 17 (interquartile range=6) as opposed to PASS <2 with a median NIHSS score of 6 (interquartile range=5). The PASS scale performed comparably to other scales predicting ELVO while being simpler. The PASS scale is simple and has promising accuracy for prediction of ELVO in the field. © 2016 American Heart Association, Inc.
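A minimal sketch of the scoring rule as described above; the exact NIHSS item dichotomization used in the study may differ, and the function name and inputs are illustrative:

```python
def pass_score(abnormal_consciousness, gaze_palsy, arm_weakness):
    """Prehospital Acute Stroke Severity (PASS) scale as described above:
    one point each for an abnormal level of consciousness (month/age
    questions), gaze palsy/deviation, and arm weakness."""
    score = int(abnormal_consciousness) + int(gaze_palsy) + int(arm_weakness)
    return score, score >= 2   # cut point >= 2 flags a likely ELVO

# Example: gaze deviation plus arm weakness, orientation questions intact.
score, suspect_elvo = pass_score(False, True, True)
print(score, suspect_elvo)   # 2 True
```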

  7. Low frequency steady-state brain responses modulate large scale functional networks in a frequency-specific means.

    Science.gov (United States)

    Wang, Yi-Feng; Long, Zhiliang; Cui, Qian; Liu, Feng; Jing, Xiu-Juan; Chen, Heng; Guo, Xiao-Nan; Yan, Jin H; Chen, Hua-Fu

    2016-01-01

    Neural oscillations are essential for brain functions. Research has suggested that the frequency of neural oscillations is lower for more integrative and remote communications. In this vein, some resting-state studies have suggested that large scale networks function in the very low frequency range. However, it is difficult to examine the frequency characteristics of brain networks because both resting-state studies and conventional frequency tagging approaches cannot simultaneously capture multiple large scale networks in controllable cognitive activities. In this preliminary study, we aimed to examine whether large scale networks can be modulated by task-induced low frequency steady-state brain responses (lfSSBRs) in a frequency-specific pattern. In a revised attention network test, the lfSSBRs were evoked in the triple network system and the sensory-motor system, indicating that large scale networks can be modulated in a frequency tagging way. Furthermore, the inter- and intranetwork synchronizations as well as coherence were increased at the fundamental frequency and the first harmonic rather than at other frequency bands, indicating a frequency-specific modulation of information communication. However, there was no difference among attention conditions, indicating that lfSSBRs modulate the general attention state much more strongly than they distinguish attention conditions. This study provides insights into the advantages and mechanism of lfSSBRs. More importantly, it paves a new way to investigate frequency-specific large scale brain activities. © 2015 Wiley Periodicals, Inc.

  8. Ancillary Frequency Control of Direct Drive Full-Scale Converter Based Wind Power Plants

    DEFF Research Database (Denmark)

    Hu, Weihao; Su, Chi; Fang, Jiakun

    2013-01-01

    This paper presents a simulation model of a wind power plant based on a MW-level variable speed wind turbine with a full-scale back-to-back power converter, developed in the simulation tool DIgSILENT PowerFactory. Three different kinds of ancillary frequency control strategies, namely inertia emulation, primary frequency control and secondary frequency control, are proposed in order to improve the frequency stability of power systems. The modified IEEE 39-bus test system with a large-scale wind power penetration is chosen as the studied power system. Simulation results show that the proposed control strategies are effective means for providing ancillary frequency control of variable speed wind turbines with full-scale back-to-back power converters.

  9. Lumbar Sympathetic Plexus Block as a Treatment for Postamputation Pain: Methodology for a Randomized Controlled Trial.

    Science.gov (United States)

    McCormick, Zachary L; Hendrix, Andrew; Dayanim, David; Clay, Bryan; Kirsling, Amy; Harden, Norman

    2018-03-08

    Objective: We present a technical protocol for rigorous assessment of patient-reported outcomes and psychophysical testing relevant to lumbar sympathetic blocks for the treatment of postamputation pain (PAP). This description is intended to inform future prospective investigation. Design: Series of four participants from a blinded randomized sham-controlled trial. Setting: Tertiary, urban, academic pain medicine center. Subjects: Four participants with a single lower limb amputation and associated chronic PAP. Methods: Participants were randomized to receive a lumbar sympathetic block with 0.25% bupivacaine or sham needle placement. Patient-rated outcome measures included the numerical rating scale (NRS) for pain, the McGill Pain Questionnaire-Short Form, Center for Epidemiological Studies Depression Scale, Pain and Anxiety Symptoms Scale-short version, and Pain Disability Index (PDI). Psychophysical and biometric testing was also performed, which included vibration, pinprick, and brush sensation testing, Von Frey repeated weighted pinprick sensation, and thermal quantitative sensory testing. Results: In the four described cases, treatment of PAP with a single lumbar sympathetic block, but not sham intervention, resulted in reduction of both residual limb pain and phantom limb pain as well as perceived disability on the PDI at three-month follow-up. Conclusions: An appropriately powered randomized controlled study using this methodology may not only aid in determining the possible clinical efficacy of lumbar sympathetic block in PAP, but could also improve our understanding of the underlying pathophysiologic mechanisms of PAP.

  10. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    OpenAIRE

    Qiang Liu; Yi Qin; Guodong Li

    2018-01-01

    Computing speed is a significant issue of large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most of the large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal...

  11. Managing Risk and Uncertainty in Large-Scale University Research Projects

    Science.gov (United States)

    Moore, Sharlissa; Shangraw, R. F., Jr.

    2011-01-01

    Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…

  12. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    The recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtime are required for cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction and the affinity propagation algorithm. The memory-shared architecture is used to construct the similarity matrix, and the distributed system is used for the affinity propagation algorithm, because of its large memory size and great computing capacity. An appropriate scheme of data partition and reduction is designed in our method, in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves a good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
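For reference, the affinity propagation message-passing updates themselves (responsibilities and availabilities) can be sketched serially in a few lines; this toy version uses assumed synthetic data and omits the paper's parallel data partition:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a biological data set: two well-separated 2-D blobs.
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)),
               rng.normal(5.0, 0.3, (20, 2))])
n = len(X)

# Similarity matrix: negative squared Euclidean distance between pairs;
# the diagonal "preference" (here the median similarity) steers how many
# exemplars emerge.
S = -((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
np.fill_diagonal(S, np.median(S))

R = np.zeros((n, n))   # responsibilities
A = np.zeros((n, n))   # availabilities
damp = 0.7
for _ in range(200):
    # r(i,k) = s(i,k) - max_{k' != k} [a(i,k') + s(i,k')]
    AS = A + S
    idx = AS.argmax(axis=1)
    first = AS[np.arange(n), idx]
    AS[np.arange(n), idx] = -np.inf
    second = AS.max(axis=1)
    Rnew = S - first[:, None]
    Rnew[np.arange(n), idx] = S[np.arange(n), idx] - second
    R = damp * R + (1 - damp) * Rnew
    # a(i,k) = min(0, r(k,k) + sum_{i' not in {i,k}} max(0, r(i',k)))
    Rp = np.maximum(R, 0)
    np.fill_diagonal(Rp, R.diagonal())
    Anew = Rp.sum(axis=0)[None, :] - Rp
    dA = Anew.diagonal().copy()
    Anew = np.minimum(Anew, 0)
    np.fill_diagonal(Anew, dA)
    A = damp * A + (1 - damp) * Anew

exemplars = np.flatnonzero((R + A).diagonal() > 0)
labels = S[:, exemplars].argmax(axis=1)   # assign each point to an exemplar
print(len(exemplars), "exemplars found")
```

The similarity matrix construction (the first bottleneck the abstract mentions) is the O(n²) step that the paper parallelizes on shared memory; the message updates are what the distributed system handles.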

  13. Stochastically Estimating Modular Criticality in Large-Scale Logic Circuits Using Sparsity Regularization and Compressive Sensing

    Directory of Open Access Journals (Sweden)

    Mohammed Alawad

    2015-03-01

    This paper considers the problem of how to efficiently measure a large and complex information field with optimally few observations. Specifically, we investigate how to stochastically estimate modular criticality values in a large-scale digital circuit with a very limited number of measurements in order to minimize the total measurement effort and time. We prove that, through sparsity-promoting transform domain regularization and by strategically integrating compressive sensing with Bayesian learning, more than 98% of the overall measurement accuracy can be achieved with fewer than 10% of the measurements required in a conventional approach that uses exhaustive measurements. Furthermore, we illustrate that the obtained criticality results can be utilized to selectively fortify large-scale digital circuits for operation with narrow voltage headrooms and in the presence of soft errors arising at near-threshold voltage levels, without excessive hardware overheads. Our numerical simulation results have shown that, by optimally allocating only 10% circuit redundancy, for some large-scale benchmark circuits, we can achieve more than a three-fold reduction in overall error probability, whereas randomly distributing the same 10% hardware resource yields less than a 2% improvement in the target circuit's overall robustness. Finally, we conjecture that our proposed approach can be readily applied to estimate other essential properties of digital circuits that are critical to designing and analyzing them, such as the observability measure in reliability analysis and the path delay estimation in stochastic timing analysis. The only key requirement of our proposed methodology is that these global information fields exhibit a certain degree of smoothness, which is universally true for almost any physical phenomenon.

  14. A dynamically adaptive wavelet approach to stochastic computations based on polynomial chaos - capturing all scales of random modes on independent grids

    International Nuclear Information System (INIS)

    Ren Xiaoan; Wu Wenquan; Xanthis, Leonidas S.

    2011-01-01

    Highlights: → New approach for stochastic computations based on polynomial chaos. → Development of dynamically adaptive wavelet multiscale solver using space refinement. → Accurate capture of steep gradients and multiscale features in stochastic problems. → All scales of each random mode are captured on independent grids. → Numerical examples demonstrate the need for different space resolutions per mode. - Abstract: In stochastic computations, or uncertainty quantification methods, the spectral approach based on the polynomial chaos expansion in random space leads to a coupled system of deterministic equations for the coefficients of the expansion. The size of this system increases drastically when the number of independent random variables and/or order of polynomial chaos expansions increases. This is invariably the case for large scale simulations and/or problems involving steep gradients and other multiscale features; such features are variously reflected on each solution component or random/uncertainty mode requiring the development of adaptive methods for their accurate resolution. In this paper we propose a new approach for treating such problems based on a dynamically adaptive wavelet methodology involving space-refinement on physical space that allows all scales of each solution component to be refined independently of the rest. We exemplify this using the convection-diffusion model with random input data and present three numerical examples demonstrating the salient features of the proposed method. Thus we establish a new, elegant and flexible approach for stochastic problems with steep gradients and multiscale features based on polynomial chaos expansions.
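The polynomial chaos machinery referred to above can be illustrated in one random dimension; this sketch computes probabilists' Hermite PCE coefficients by quadrature and reads off the mean and variance (a textbook example under assumed inputs, not the paper's adaptive wavelet solver):

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

# Probabilists' Hermite PCE of f(xi) = xi**2 with xi ~ N(0,1):
#   f(xi) = sum_n c_n He_n(xi),   c_n = E[f(xi) He_n(xi)] / n!
nodes, weights = He.hermegauss(20)   # Gauss quadrature for weight e^{-x^2/2}
weights = weights / sqrt(2 * pi)     # normalize to the standard normal density

f_vals = nodes ** 2
coeffs = []
for order in range(5):
    basis = He.hermeval(nodes, [0] * order + [1])   # He_order at the nodes
    coeffs.append(float((weights * f_vals * basis).sum()) / factorial(order))

# Mean and variance follow directly from the PCE coefficients.
mean = coeffs[0]
variance = sum(coeffs[n] ** 2 * factorial(n) for n in range(1, 5))
print(mean, variance)   # analytically: E[xi^2] = 1, Var(xi^2) = 2
```

With several random variables, each coefficient becomes a field coupled to the others through the governing equations, which is exactly the system whose steep gradients the paper resolves adaptively.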

  15. An assessment of the effectiveness of a large, national-scale invasive alien plant control strategy in South Africa

    CSIR Research Space (South Africa)

    Van Wilgen, BW

    2012-04-01

    Full Text Available extent of invasive species control operations, assessments of the effectiveness of biological control, and smaller-scale studies. The 19 most important invasive taxa, mainly trees, in terrestrial biomes were identified. The effectiveness of control...

  16. A Dynamic Optimization Strategy for the Operation of Large Scale Seawater Reverse Osmosis System

    Directory of Open Access Journals (Sweden)

    Aipeng Jiang

    2014-01-01

    Full Text Available In this work, an efficient strategy was proposed for the solution of the dynamic model of an SWRO system. Since the dynamic model is formulated as a set of differential-algebraic equations, a simultaneous strategy based on collocation on finite elements was used to transform the dynamic optimization problem into a large-scale nonlinear programming problem, named Opt2. Then, simulation of the RO process and storage tanks was carried out element by element and step by step with fixed control variables. All the obtained values of these variables were then used as the initial point for the optimal solution of the SWRO system. Finally, in order to accelerate computation while retaining sufficient accuracy in the solution of Opt2, a simple but efficient finite-element refinement rule was used to reduce the scale of Opt2. The proposed strategy was applied to a large-scale SWRO system with 8 RO plants and 4 storage tanks as a case study. Computing results show that the proposed strategy is quite effective for optimal operation of the large-scale SWRO system; the optimization problem can be solved successfully within dozens of iterations and several minutes when load and other operating parameters fluctuate.

  17. Large-Scale Gene-Centric Meta-Analysis across 39 Studies Identifies Type 2 Diabetes Loci

    NARCIS (Netherlands)

    Saxena, Richa; Elbers, Clara C.; Guo, Yiran; Peter, Inga; Gaunt, Tom R.; Mega, Jessica L.; Lanktree, Matthew B.; Tare, Archana; Almoguera Castillo, Berta; Li, Yun R.; Johnson, Toby; Bruinenberg, Marcel; Gilbert-Diamond, Diane; Rajagopalan, Ramakrishnan; Voight, Benjamin F.; Balasubramanyam, Ashok; Barnard, John; Bauer, Florianne; Baumert, Jens; Bhangale, Tushar; Boehm, Bernhard O.; Braund, Peter S.; Burton, Paul R.; Chandrupatla, Hareesh R.; Clarke, Robert; Cooper-DeHoff, Rhonda M.; Crook, Errol D.; Davey-Smith, George; Day, Ian N.; de Boer, Anthonius; de Groot, Mark C. H.; Drenos, Fotios; Ferguson, Jane; Fox, Caroline S.; Furlong, Clement E.; Gibson, Quince; Gieger, Christian; Gilhuijs-Pederson, Lisa A.; Glessner, Joseph T.; Goel, Anuj; Gong, Yan; Grant, Struan F. A.; Kumari, Meena; van der Harst, Pim; van Vliet-Ostaptchouk, Jana V.; Verweij, Niek; Wolffenbuttel, Bruce H. R.; Hofker, Marten H.; Asselbergs, Folkert W.; Wijmenga, Cisca

    2012-01-01

    To identify genetic factors contributing to type 2 diabetes (T2D), we performed large-scale meta-analyses by using a custom ~50,000 SNP genotyping array (the ITMAT-Broad-CARe array) with ~2000 candidate genes in 39 multiethnic population-based studies, case-control studies, and
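
    The pooling step in such a meta-analysis is typically inverse-variance weighting of the per-study effect estimates for each SNP. A minimal sketch with hypothetical per-study log-odds ratios (illustrative values, not data from this study):

```python
import numpy as np

# Hypothetical per-study effect estimates (log odds ratios) and standard
# errors for one SNP across five studies -- not values from the paper.
beta = np.array([0.12, 0.08, 0.15, 0.05, 0.10])
se   = np.array([0.05, 0.04, 0.07, 0.06, 0.03])

w = 1.0 / se**2                              # inverse-variance weights
beta_meta = np.sum(w * beta) / np.sum(w)     # fixed-effect pooled estimate
se_meta = np.sqrt(1.0 / np.sum(w))           # standard error of the pooled estimate
z = beta_meta / se_meta                      # meta-analysis Z statistic
print(f"pooled beta = {beta_meta:.4f}, SE = {se_meta:.4f}, Z = {z:.2f}")
```

    Precisely estimated studies (small SE) dominate the pooled estimate, and combining 39 studies shrinks the pooled SE enough to detect modest per-allele effects that no single study could.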

  18. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    Science.gov (United States)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  19. A mental health intervention for schoolchildren exposed to violence: a randomized controlled trial.

    Science.gov (United States)

    Stein, Bradley D; Jaycox, Lisa H; Kataoka, Sheryl H; Wong, Marleen; Tu, Wenli; Elliott, Marc N; Fink, Arlene

    2003-08-06

    No randomized controlled studies have been conducted to date on the effectiveness of psychological interventions for children with symptoms of posttraumatic stress disorder (PTSD) resulting from personally witnessing or being personally exposed to violence. To evaluate the effectiveness of a collaboratively designed school-based intervention for reducing children's symptoms of PTSD and depression resulting from exposure to violence. A randomized controlled trial conducted during the 2001-2002 academic year. Sixth-grade students at 2 large middle schools in Los Angeles who reported exposure to violence and had clinical levels of symptoms of PTSD. Students were randomly assigned to a 10-session standardized cognitive-behavioral therapy (the Cognitive-Behavioral Intervention for Trauma in Schools) early intervention group (n = 61) or to a wait-list delayed intervention comparison group (n = 65) conducted by trained school mental health clinicians. Students were assessed before the intervention and 3 months after the intervention on measures assessing child-reported symptoms of PTSD (Child PTSD Symptom Scale; range, 0-51 points) and depression (Child Depression Inventory; range, 0-52 points), parent-reported psychosocial dysfunction (Pediatric Symptom Checklist; range, 0-70 points), and teacher-reported classroom problems using the Teacher-Child Rating Scale (acting out, shyness/anxiousness, and learning problems; range of subscales, 6-30 points). Compared with the wait-list delayed intervention group (no intervention), after 3 months of intervention students who were randomly assigned to the early intervention group had significantly lower scores on symptoms of PTSD (8.9 vs 15.5; adjusted mean difference, -7.0; 95% confidence interval [CI], -10.8 to -3.2), depression (9.4 vs 12.7; adjusted mean difference, -3.4; 95% CI, -6.5 to -0.4), and psychosocial dysfunction (12.5 vs 16.5; adjusted mean difference, -6.4; 95% CI, -10.4 to -2.3). Adjusted
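
    The reported adjusted mean differences come from regression models on the trial data, but the basic construction of a 95% CI for a between-group difference can be sketched from summary statistics. The standard deviations and the unadjusted calculation below are hypothetical; the abstract reports only group means and covariate-adjusted differences:

```python
import math

def mean_diff_ci(m1, sd1, n1, m2, sd2, n2, z=1.96):
    """Unadjusted 95% CI for the difference of two group means (Welch-style SE)."""
    diff = m1 - m2
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)   # SE of the difference
    return diff, diff - z * se, diff + z * se

# Hypothetical inputs: PTSD scores at 3 months, intervention (n=61) vs
# wait-list (n=65); the SDs of 9.0 and 10.0 are assumed for illustration.
diff, lo, hi = mean_diff_ci(8.9, 9.0, 61, 15.5, 10.0, 65)
print(f"difference = {diff:.1f}, 95% CI [{lo:.1f}, {hi:.1f}]")
```

    A CI that excludes zero, as here and for all three primary outcomes in the trial, indicates a statistically significant group difference; covariate adjustment in the actual analysis shifts the point estimate and narrows the interval relative to this raw sketch.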

  20. The genetic etiology of Tourette Syndrome: Large-scale collaborative efforts on the precipice of discovery

    Directory of Open Access Journals (Sweden)

    Marianthi Georgitsi

    2016-08-01

    Full Text Available Gilles de la Tourette Syndrome (TS) is a childhood-onset neurodevelopmental disorder that is characterized by multiple motor and phonic tics. It has a complex etiology, with multiple genes likely interacting with environmental factors to lead to the onset of symptoms. The genetic basis of the disorder remains elusive; however, multiple resources and large-scale projects are coming together, launching a new era in the field and bringing us to the verge of discovery. The large-scale efforts outlined in this report are complementary and represent a range of different approaches to the study of disorders with complex inheritance. The Tourette Syndrome Association International Consortium for Genetics (TSAICG) has focused on large families, parent-proband trios, and cases for large case-control designs such as genomewide association studies (GWAS), copy number variation (CNV) scans, and exome/genome sequencing. TIC Genetics targets rare, large-effect-size mutations in simplex trios and multigenerational families. The European Multicentre Tics in Children Study (EMTICS) seeks to elucidate gene-environment interactions, including the involvement of infection and immune mechanisms in TS etiology. Finally, TS-EUROTRAIN, a Marie Curie Initial Training Network, aims to act as a platform to unify large-scale projects in the field and to educate the next generation of experts. Importantly, these complementary large-scale efforts are joining forces to uncover the full range of genetic variation and environmental risk factors for TS, holding great promise for identifying definitive TS susceptibility genes and shedding light on the complex pathophysiology of this disorder.