WorldWideScience

Sample records for models large group

  1. Model Selection and Hypothesis Testing for Large-Scale Network Models with Overlapping Groups

    Directory of Open Access Journals (Sweden)

    Tiago P. Peixoto

    2015-03-01

    Full Text Available The effort to understand network systems in increasing detail has resulted in a diversity of methods designed to extract their large-scale structure from data. Unfortunately, many of these methods yield diverging descriptions of the same network, making both the comparison and understanding of their results a difficult challenge. A possible solution to this outstanding issue is to shift the focus away from ad hoc methods and move towards more principled approaches based on statistical inference of generative models. As a result, we face instead the more well-defined task of selecting between competing generative processes, which can be done under a unified probabilistic framework. Here, we consider the comparison between a variety of generative models including features such as degree correction, where nodes with arbitrary degrees can belong to the same group, and community overlap, where nodes are allowed to belong to more than one group. Because such model variants possess an increasing number of parameters, they become prone to overfitting. In this work, we present a method of model selection based on the minimum description length criterion and posterior odds ratios that is capable of fully accounting for the increased degrees of freedom of the larger models and selects the best one according to the statistical evidence available in the data. In applying this method to many empirical unweighted networks from different fields, we observe that community overlap is very often not supported by statistical evidence and is selected as a better model only for a minority of them. On the other hand, we find that degree correction tends to be almost universally favored by the available data, implying that intrinsic node properties (as opposed to group properties) are often an essential ingredient of network formation.
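
    A minimal sketch (Python, not taken from the paper) of the kind of comparison described above: once two candidate generative models have been fitted, their description lengths can be turned into a posterior odds ratio, assuming equal prior probabilities for the models. The numerical values below are placeholders.

      # Minimal sketch (not the paper's code): comparing two fitted generative models
      # by description length, as in MDL-based model selection.
      import math

      def posterior_odds(desc_len_a: float, desc_len_b: float) -> float:
          """Posterior odds ratio P(A | data) / P(B | data) assuming equal model priors,
          where desc_len_* is the description length (in nats) of the data under each model."""
          return math.exp(desc_len_b - desc_len_a)

      # Hypothetical description lengths for a degree-corrected and a non-corrected variant:
      sigma_degree_corrected = 12040.3   # placeholder value
      sigma_non_corrected = 12187.9      # placeholder value

      odds = posterior_odds(sigma_degree_corrected, sigma_non_corrected)
      print(f"Posterior odds in favour of the degree-corrected variant: {odds:.3g}")
      # Odds much larger than 1 indicate that the statistical evidence supports the
      # larger model despite its extra degrees of freedom.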

  2. Group Active Engagements Using Quantitative Modeling of Physiology Concepts in Large-Enrollment Biology Classes

    Directory of Open Access Journals (Sweden)

    Karen L. Carleton

    2016-12-01

    Full Text Available Organismal Biology is the third introductory biology course taught at the University of Maryland. Students learn about the geometric, physical, chemical, and thermodynamic constraints that are common to all life, and their implications for the evolution of multicellular organisms based on a common genetic “toolbox.” An additional goal is helping students to improve their scientific logic and comfort with quantitative modeling. We recently developed group active engagement exercises (GAEs) for this Organismal Biology class. Currently, our class is built around twelve GAE activities implemented in an auditorium lecture hall in a large enrollment class. The GAEs examine scientific concepts using a variety of models including physical models, qualitative models, and Excel-based quantitative models. Three quantitative GAEs give students an opportunity to build their understanding of key physiological ideas. (1) The Escape from Planet Ranvier exercise reinforces student understanding that membrane permeability means that ions move through open channels in the membrane. (2) The Stressing and Straining exercise requires students to quantify the elastic modulus from data gathered either in class or from scientific literature. (3) In the Leveraging Your Options exercise, students learn about lever systems and apply this knowledge to biological systems.

  3. A model for the use of blended learning in large group teaching sessions

    Directory of Open Access Journals (Sweden)

    Cristan Herbert

    2017-11-01

    modules were described as enjoyable and motivating, and were appreciated for their flexibility, which enabled students to work at their own pace. Conclusions: In transforming this introductory Pathology course, we have demonstrated a model for the use of blended learning in large group teaching sessions, which achieved high levels of completion, satisfaction and value for learning.

  4. A model for the use of blended learning in large group teaching sessions.

    Science.gov (United States)

    Herbert, Cristan; Velan, Gary M; Pryor, Wendy M; Kumar, Rakesh K

    2017-11-09

    appreciated for their flexibility, which enabled students to work at their own pace. In transforming this introductory Pathology course, we have demonstrated a model for the use of blended learning in large group teaching sessions, which achieved high levels of completion, satisfaction and value for learning.

  5. Distributed Model Predictive Control over Multiple Groups of Vehicles in Highway Intelligent Space for Large Scale System

    OpenAIRE

    Tang Xiaofeng; Gao Feng; Xu Guoyan; Ding Nenggen; Cai Yao; Liu Jian Xing

    2014-01-01

    The paper presents three time warning distances for ensuring the safe driving of multiple groups of vehicles in a highway tunnel environment, based on a distributed model predictive control approach. Generally speaking, the system includes two parts. First, multiple vehicles are divided into multiple groups. Meanwhile, the distributed model predictive control approach is proposed to calculate the information framework of each group. Each group of optimi...

  6. Quantitative Modeling of Membrane Transport and Anisogamy by Small Groups Within a Large-Enrollment Organismal Biology Course

    Directory of Open Access Journals (Sweden)

    Eric S. Haag

    2016-12-01

    Full Text Available Quantitative modeling is not a standard part of undergraduate biology education, yet is routine in the physical sciences. Because of the obvious biophysical aspects, classes in anatomy and physiology offer an opportunity to introduce modeling approaches to the introductory curriculum. Here, we describe two in-class exercises for small groups working within a large-enrollment introductory course in organismal biology. Both build and derive biological insights from quantitative models, implemented using spreadsheets. One exercise models the evolution of anisogamy (i.e., small sperm and large eggs from an initial state of isogamy. Groups of four students work on Excel spreadsheets (from one to four laptops per group. The other exercise uses an online simulator to generate data related to membrane transport of a solute, and a cloud-based spreadsheet to analyze them. We provide tips for implementing these exercises gleaned from two years of experience.
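
    A spreadsheet of this kind typically iterates a simple flux law over small time steps. The following Python sketch shows one plausible version of such a passive-transport calculation; the flux law J = P·A·(C_out − C_in) is standard, but all parameter values are hypothetical and this is not the course's actual exercise.

      # Illustrative spreadsheet-style model of passive membrane transport (hypothetical
      # parameters, not the published exercise): solute flux J = P * A * (C_out - C_in),
      # iterated over small time steps with forward Euler updates.
      P = 1e-5       # membrane permeability (cm/s), assumed
      A = 1e-4       # membrane area (cm^2), assumed
      V_in = 1e-6    # intracellular volume (cm^3), assumed
      C_out = 10.0   # external concentration (mM), held constant
      C_in = 0.0     # initial internal concentration (mM)
      dt = 1.0       # time step (s)

      for t in range(0, 3001):
          if t % 500 == 0:
              print(f"t = {t:5d} s   C_in = {C_in:6.3f} mM")
          J = P * A * (C_out - C_in)   # net inward transport (Fick's law)
          C_in += J * dt / V_in        # update the internal concentration
      # C_in relaxes exponentially towards C_out, the behaviour students recover
      # when fitting the simulator output in the spreadsheet.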

  7. Effects of core models and neutron energy group structures on xenon oscillation in large graphite-moderated reactors

    International Nuclear Information System (INIS)

    Yamasita, Kiyonobu; Harada, Hiroo; Murata, Isao; Shindo, Ryuichi; Tsuruoka, Takuya.

    1993-01-01

    Xenon oscillations of large graphite-moderated reactors have been analyzed by a multi-group diffusion code with two- and three-dimensional core models to study the effects of the geometric core models and the neutron energy group structures on the evaluation of the Xe oscillation behavior. The study clarified the following. It is important for accurate Xe oscillation simulations to use the neutron energy group structure that describes well the large change in the absorption cross section of Xe in the thermal energy range of 0.1∼0.65 eV, because the energy structure in this energy range has significant influences on the amplitude and the period of oscillations in power distributions. Two-dimensional R-Z models can be used instead of three-dimensional R-θ-Z models for evaluation of the threshold power of Xe oscillation, but two-dimensional R-θ models cannot be used for evaluation of the threshold power. Although the threshold power evaluated with the R-θ-Z models coincides with that of the R-Z models, it does not coincide with that of the R-θ models. (author)

  8. Distributed Model Predictive Control over Multiple Groups of Vehicles in Highway Intelligent Space for Large Scale System

    Directory of Open Access Journals (Sweden)

    Tang Xiaofeng

    2014-01-01

    Full Text Available The paper presents three time warning distances for ensuring the safe driving of multiple groups of vehicles in a highway tunnel environment, based on a distributed model predictive control approach. Generally speaking, the system includes two parts. First, multiple vehicles are divided into multiple groups, and the distributed model predictive control approach is proposed to calculate the information framework of each group. The optimization of each group considers both its local performance and that of the neighboring subgroups, which ensures good global optimization performance. Second, the three time warning distances are studied based on the basic principles used for highway intelligent space (HIS), and the information framework concept is proposed for the multiple groups of vehicles. A mathematical model is built to support chain-collision avoidance among vehicles. The results demonstrate that the proposed highway intelligent space method can effectively ensure the driving safety of multiple groups of vehicles under fog, rain, or snow conditions.

  9. Teacher-student co-construction processes in biology: Strategies for developing mental models in large group discussions

    Science.gov (United States)

    Nunez Oviedo, Maria Cecilia

    The aim of this study was to describe co-construction processes in large group discussions. Co-construction, as used here, is a process by which the teacher and the students work together to construct and evaluate mental models of a target concept. Data were collected for an in-depth case study of a single teacher instructing middle school students with an innovative curriculum on human respiration. Data came from transcripts of videotaped lessons, drawings, and pre- and post-test scores. Quantitative and qualitative analyses were conducted. In the quantitative analysis, differences in gains between one and two standard deviations in size were found between the pre- and post-test scores, indicating that the students increased their understanding of human respiration. In the qualitative analysis, a generative exploratory method followed by a convergent coded method was conducted to examine teacher-student interaction patterns. The aim of this part was to determine how learning occurred by attempting to connect dialogue patterns with underlying cognitive processes. The main outcome of the study is a hypothesized model containing four layers of nested teaching strategies. Listed from large to small time scales, these are: the Macro Cycle, the Co-construction Modes, the Micro Cycle, and the Teaching Tactics. The most intensive analysis focused on identifying and articulating the Co-construction Modes---Accretion Mode, Disconfirmation Mode, Modification Mode, Evolution Mode, and Competition Mode---and their relations to the other levels of the model. These modes can describe the construction and evaluation either of individual model elements or of entire models, giving a total of ten modes. The frequency of these co-construction modes was then determined by coding twenty-six hours of transcripts. The most frequent modes were the Accretion Mode and the Disconfirmation Mode. The teacher's and the students' contributions to the co-construction process were also examined.

  10. Large-scale effects of migration and conflict in pre-agricultural groups: Insights from a dynamic model.

    Directory of Open Access Journals (Sweden)

    Francesco Gargano

    Full Text Available The debate on the causes of conflict in human societies has deep roots. In particular, the extent of conflict in hunter-gatherer groups remains unclear. Some authors suggest that large-scale violence only arose with the spreading of agriculture and the building of complex societies. To shed light on this issue, we developed a model based on operatorial techniques simulating population-resource dynamics within a two-dimensional lattice, with humans and natural resources interacting in each cell of the lattice. The model outcomes under different conditions were compared with recently available demographic data for prehistoric South America. Only under conditions that include migration among cells and conflict was the model able to consistently reproduce the empirical data at a continental scale. We argue that the interplay between resource competition, migration, and conflict drove the population dynamics of South America after the colonization phase and before the introduction of agriculture. The relation between population and resources indeed emerged as a key factor leading to migration and conflict once the carrying capacity of the environment has been reached.
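
    The record does not give the model equations (the paper uses operatorial techniques), but the qualitative ingredients it names, local growth limited by resources, migration towards richer cells, and conflict losses at high density, can be illustrated with a much simpler classical lattice sketch in Python. All rates below are invented for illustration only.

      # Simplified classical sketch of a lattice population-resource model with
      # migration and conflict (illustrative only; not the operatorial model of the paper).
      SIZE, STEPS = 10, 200
      GROWTH, REGEN, CAPACITY = 0.05, 0.02, 100.0
      MIGRATION, CONFLICT = 0.10, 0.01   # hypothetical rates

      pop = [[10.0] * SIZE for _ in range(SIZE)]
      res = [[CAPACITY] * SIZE for _ in range(SIZE)]

      def neighbours(i, j):
          return [((i + di) % SIZE, (j + dj) % SIZE)
                  for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]

      for _ in range(STEPS):
          new_pop = [row[:] for row in pop]
          for i in range(SIZE):
              for j in range(SIZE):
                  # growth limited by local resources, which are consumed and regenerate
                  growth = GROWTH * pop[i][j] * res[i][j] / CAPACITY
                  res[i][j] = max(0.0, res[i][j] - growth + REGEN * (CAPACITY - res[i][j]))
                  new_pop[i][j] += growth
                  # conflict losses once local population exceeds what resources support
                  new_pop[i][j] -= CONFLICT * max(0.0, pop[i][j] - res[i][j])
                  # migration of a fixed fraction towards the resource-richest neighbour
                  ni, nj = max(neighbours(i, j), key=lambda c: res[c[0]][c[1]])
                  if res[ni][nj] > res[i][j]:
                      moved = MIGRATION * pop[i][j]
                      new_pop[i][j] -= moved
                      new_pop[ni][nj] += moved
          pop = new_pop

      print("total population after", STEPS, "steps:",
            round(sum(sum(row) for row in pop), 1))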

  11. Large-scale effects of migration and conflict in pre-agricultural groups: Insights from a dynamic model.

    Science.gov (United States)

    Gargano, Francesco; Tamburino, Lucia; Bagarello, Fabio; Bravo, Giangiacomo

    2017-01-01

    The debate on the causes of conflict in human societies has deep roots. In particular, the extent of conflict in hunter-gatherer groups remains unclear. Some authors suggest that large-scale violence only arose with the spreading of agriculture and the building of complex societies. To shed light on this issue, we developed a model based on operatorial techniques simulating population-resource dynamics within a two-dimensional lattice, with humans and natural resources interacting in each cell of the lattice. The model outcomes under different conditions were compared with recently available demographic data for prehistoric South America. Only under conditions that include migration among cells and conflict was the model able to consistently reproduce the empirical data at a continental scale. We argue that the interplay between resource competition, migration, and conflict drove the population dynamics of South America after the colonization phase and before the introduction of agriculture. The relation between population and resources indeed emerged as a key factor leading to migration and conflict once the carrying capacity of the environment has been reached.

  12. Functional renormalization group approach to SU(N) Heisenberg models: Momentum-space renormalization group for the large-N limit

    Science.gov (United States)

    Roscher, Dietrich; Buessen, Finn Lasse; Scherer, Michael M.; Trebst, Simon; Diehl, Sebastian

    2018-02-01

    In frustrated magnetism, making a stringent connection between microscopic spin models and macroscopic properties of spin liquids remains an important challenge. A recent step towards this goal has been the development of the pseudofermion functional renormalization group approach (pf-FRG) which, building on a fermionic parton construction, enables the numerical detection of the onset of spin liquid states as temperature is lowered. In this work, focusing on the SU(N) Heisenberg model at large N, we extend this approach in a way that allows us to directly enter the low-temperature spin liquid phase, and to probe its character. Our approach proceeds in momentum space, making it possible to keep the truncation minimalistic, while also avoiding the bias introduced by an explicit decoupling of the fermionic parton interactions into a given channel. We benchmark our findings against exact mean-field results in the large-N limit, and show that even without prior knowledge the pf-FRG approach identifies the correct mean-field decoupling channel. On a technical level, we introduce an alternative finite temperature regularization scheme that is necessitated to access the spin liquid ordered phase. In a companion paper [Buessen et al., Phys. Rev. B 97, 064415 (2018), 10.1103/PhysRevB.97.064415] we present a different set of modifications of the pf-FRG scheme that allow us to study SU(N) Heisenberg models (using a real-space RG approach) for arbitrary values of N, albeit only up to the phase transition towards spin liquid physics.

  13. A COMPARISON OF DOSE-RESPONSE MODELS FOR THE PAROTID GLAND IN A LARGE GROUP OF HEAD-AND-NECK CANCER PATIENTS

    NARCIS (Netherlands)

    Houweling, Antonetta C.; Philippens, Marielle E. P.; Dijkema, Tim; Roesink, Judith M.; Terhaard, Chris H. J.; Schilstra, Cornelis; Ten Haken, Randall K.; Eisbruch, Avraham; Raaijmakers, Cornelis P. J.

    2010-01-01

    Purpose: The dose response relationship of the parotid gland has been described most frequently using the Lyman-Kutcher-Burman model. However, various other normal tissue complication probability (NTCP) models exist. We evaluated in a large group of patients the value of six NTCP models that

  14. The large-Nc renormalization group

    International Nuclear Information System (INIS)

    Dorey, N.

    1995-01-01

    In this talk, we review how effective theories of mesons and baryons become exactly soluble in the large-N_c limit. We start with a generic hadron Lagrangian constrained only by certain well-known large-N_c selection rules. The bare vertices of the theory are dressed by an infinite class of UV divergent Feynman diagrams at leading order in 1/N_c. We show how all these leading-order diagrams can be summed exactly using semiclassical techniques. The saddle-point field configuration is reminiscent of the chiral bag: hedgehog pions outside a sphere of radius Λ⁻¹ (Λ being the UV cutoff of the effective theory) matched onto nucleon degrees of freedom for r ≤ Λ⁻¹. The effect of this pion cloud is to renormalize the bare nucleon mass, nucleon-Δ hyperfine mass splitting, and Yukawa couplings of the theory. The corresponding large-N_c renormalization group equations for these parameters are presented, and solved explicitly in a series of simple models. We explain under what conditions the Skyrmion emerges as a UV fixed-point of the RG flow as Λ → ∞. (author)

  15. A comparison of dose-response models for the parotid gland in a large group of head-and-neck cancer patients.

    Science.gov (United States)

    Houweling, Antonetta C; Philippens, Marielle E P; Dijkema, Tim; Roesink, Judith M; Terhaard, Chris H J; Schilstra, Cornelis; Ten Haken, Randall K; Eisbruch, Avraham; Raaijmakers, Cornelis P J

    2010-03-15

    The dose-response relationship of the parotid gland has been described most frequently using the Lyman-Kutcher-Burman model. However, various other normal tissue complication probability (NTCP) models exist. We evaluated in a large group of patients the value of six NTCP models that describe the parotid gland dose response 1 year after radiotherapy. A total of 347 patients with head-and-neck tumors were included in this prospective parotid gland dose-response study. The patients were treated with either conventional radiotherapy or intensity-modulated radiotherapy. Dose-volume histograms for the parotid glands were derived from three-dimensional dose calculations using computed tomography scans. Stimulated salivary flow rates were measured before and 1 year after radiotherapy. A threshold of 25% of the pretreatment flow rate was used to define a complication. The evaluated models included the Lyman-Kutcher-Burman model, the mean dose model, the relative seriality model, the critical volume model, the parallel functional subunit model, and the dose-threshold model. The goodness of fit (GOF) was determined by the deviance and a Monte Carlo hypothesis test. Ranking of the models was based on Akaike's information criterion (AIC). None of the models was rejected based on the evaluation of the GOF. The mean dose model was ranked as the best model based on the AIC. The TD(50) in these models was approximately 39 Gy. The mean dose model was preferred for describing the dose-response relationship of the parotid gland. Copyright 2010 Elsevier Inc. All rights reserved.
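
    For readers unfamiliar with the models being compared, the following Python sketch shows the standard Lyman-Kutcher-Burman NTCP calculation. The TD50 of roughly 39 Gy comes from the abstract; the slope parameter m, the volume parameter n and the dose-volume histogram are hypothetical, and with n = 1 the calculation reduces to the mean dose model that the study ranked best.

      # Hedged sketch of the Lyman-Kutcher-Burman (LKB) NTCP model (parameters other
      # than TD50 ~ 39 Gy are hypothetical; the DVH below is invented).
      from math import erf, sqrt

      def lkb_ntcp(dvh, td50=39.0, m=0.45, n=1.0):
          """dvh: list of (dose in Gy, fractional volume) bins.
          With n = 1 the generalized EUD equals the mean dose (the 'mean dose model')."""
          eud = sum(v * d ** (1.0 / n) for d, v in dvh) ** n
          t = (eud - td50) / (m * td50)
          return 0.5 * (1.0 + erf(t / sqrt(2.0)))   # cumulative normal (probit) response

      # Hypothetical parotid-gland DVH: 40% of volume at 20 Gy, 40% at 35 Gy, 20% at 60 Gy
      dvh = [(20.0, 0.4), (35.0, 0.4), (60.0, 0.2)]
      print(f"Predicted complication probability: {lkb_ntcp(dvh):.2f}")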

  16. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large scale model testing performed using large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. The results are described from testing the material resistance to fracture (non-ductile). The testing included the base materials and welded joints. The rated specimen thickness was 150 mm with defects of a depth between 15 and 100 mm. The results are also presented of nozzles of 850 mm inner diameter in a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  17. Group Capability Model

    Science.gov (United States)

    Olejarski, Michael; Appleton, Amy; Deltorchio, Stephen

    2009-01-01

    The Group Capability Model (GCM) is a software tool that allows an organization, from first line management to senior executive, to monitor and track the health (capability) of various groups in performing their contractual obligations. GCM calculates a Group Capability Index (GCI) by comparing actual head counts, certifications, and/or skills within a group. The model can also be used to simulate the effects of employee usage, training, and attrition on the GCI. A universal tool and common method was required due to the high risk of losing skills necessary to complete the Space Shuttle Program and meet the needs of the Constellation Program. During this transition from one space vehicle to another, the uncertainty among the critical skilled workforce is high and attrition has the potential to be unmanageable. GCM allows managers to establish requirements for their group in the form of head counts, certification requirements, or skills requirements. GCM then calculates a Group Capability Index (GCI), where a score of 1 indicates that the group is at the appropriate level; anything less than 1 indicates a potential for improvement. This shows the health of a group, both currently and over time. GCM accepts as input head count, certification needs, critical needs, competency needs, and competency critical needs. In addition, team members are categorized by years of experience, percentage of contribution, ex-members and their skills, availability, function, and in-work requirements. Outputs are several reports, including actual vs. required head count, actual vs. required certificates, GCI change over time (by month), and more. The program stores historical data for summary and historical reporting, which is done via an Excel spreadsheet that is color-coded to show health statistics at a glance. GCM has provided the Shuttle Ground Processing team with a quantifiable, repeatable approach to assessing and managing the skills in their organization. They now have a common
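
    The exact GCI formula is not given in the record, so the Python sketch below is only one plausible reading of "comparing actual head counts, certifications, and/or skills" against requirements: each category contributes a ratio capped at 1, and the index is their average, so that 1 means the group meets its requirements.

      # Hypothetical Group Capability Index calculation (an assumed aggregation,
      # not NASA's documented formula).
      def group_capability_index(actual: dict, required: dict) -> float:
          ratios = []
          for category, req in required.items():
              if req > 0:
                  ratios.append(min(actual.get(category, 0) / req, 1.0))  # cap each ratio at 1
          return sum(ratios) / len(ratios) if ratios else 1.0

      required = {"head_count": 25, "certifications": 40, "critical_skills": 12}
      actual = {"head_count": 22, "certifications": 40, "critical_skills": 9}
      print(f"GCI = {group_capability_index(actual, required):.2f}")  # < 1 flags a shortfall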

  18. GRIP LANGLEY AEROSOL RESEARCH GROUP EXPERIMENT (LARGE) V1

    Data.gov (United States)

    National Aeronautics and Space Administration — The GRIP Langley Aerosol Research Group Experiment (LARGE) dataset was collected by the Langley Aerosol Research Group Experiment (LARGE), which measures ultrafine...

  19. Large scale composting model

    OpenAIRE

    Henon , Florent; Debenest , Gérald; Tremier , Anne; Quintard , Michel; Martel , Jean-Luc; Duchalais , Guy

    2012-01-01

    One way to treat organic wastes in accordance with environmental policies is to develop biological treatments such as composting. Nevertheless, this development largely relies on the quality of the final product and, as a consequence, on the quality of the biological activity during the treatment. Favourable conditions (oxygen concentration, temperature and moisture content) in the waste bed largely contribute to the establishment of a good aerobic biological activity an...

  20. Continuous Assessment in a Large Group of Psychology Undergraduates

    Science.gov (United States)

    Clariana, Merce; Gotzens, Concepcion; Badia, Mar

    2011-01-01

    Introduction: A continuous classroom assessment method was applied to a higher education course aimed at a large group of educational psychology students at the "Universitat Autonoma de Barcelona". Following the Bologna directions and the constructivist model, both declarative and procedural knowledge was taught in the module, and the…

  1. Report of the large solenoid detector group

    International Nuclear Information System (INIS)

    Hanson, G.G.; Mori, S.; Pondrom, L.G.

    1987-09-01

    This report presents a conceptual design of a large solenoid for studying physics at the SSC. The parameters and nature of the detector have been chosen based on present estimates of what is required to allow the study of heavy quarks, supersymmetry, heavy Higgs particles, WW scattering at large invariant masses, new W and Z bosons, and very large momentum transfer parton-parton scattering. Simply stated, the goal is to obtain optimum detection and identification of electrons, muons, neutrinos, jets, W's and Z's over a large rapidity region. The primary region of interest extends over ±3 units of rapidity, although the calorimetry must extend to ±5.5 units if optimal missing energy resolution is to be obtained. A magnetic field was incorporated because of the importance of identifying the signs of the charges for both electrons and muons and because of the added possibility of identifying tau leptons and secondary vertices. In addition, the existence of a magnetic field may prove useful for studying new physics processes about which we currently have no knowledge. Since hermeticity of the calorimetry is extremely important, the entire central and endcap calorimeters were located inside the solenoid. This does not at the moment seem to produce significant problems (although many issues remain to be resolved) and in fact leads to a very effective muon detector in the central region

  2. Mining Behavioral Groups in Large Wireless LANs

    OpenAIRE

    Hsu, Wei-jen; Dutta, Debojyoti; Helmy, Ahmed

    2006-01-01

    One vision of future wireless networks is that they will be deeply integrated and embedded in our lives and will involve the use of personalized mobile devices. User behavior in such networks is bound to affect the network performance. It is imperative to study and characterize the fundamental structure of wireless user behavior in order to model, manage, leverage and design efficient mobile networks. It is also important to make such study as realistic as possible, based on extensive measure...

  3. Modelling group dynamic animal movement

    DEFF Research Database (Denmark)

    Langrock, Roland; Hopcraft, J. Grant C.; Blackwell, Paul G.

    2014-01-01

    Group dynamic movement is a fundamental aspect of many species' movements. The need to adequately model individuals' interactions with other group members has been recognised, particularly in order to differentiate the role of social forces in individual movement from environmental factors. However, to date, practical statistical methods which can include group dynamics in animal movement models have been lacking. We consider a flexible modelling framework that distinguishes a group-level model, describing the movement of the group's centre, and an individual-level model, such that each individual ... In non-ideal scenarios, we show that generally the estimation of models of this type is both feasible and ecologically informative. We illustrate the approach using real movement data from 11 reindeer (Rangifer tarandus). Results indicate a directional bias towards a group centroid for reindeer ...

  4. NAMMA LANGLEY AEROSOL RESEARCH GROUP EXPERIMENT (LARGE) V1

    Data.gov (United States)

    National Aeronautics and Space Administration — The NAMMA Langley Aerosol Research Group Experiment (LARGE) dataset contains data collected from the following in situ aerosol sensors: condensation nuclei counters,...

  5. GRIP LANGLEY AEROSOL RESEARCH GROUP EXPERIMENT (LARGE) V1

    Data.gov (United States)

    National Aeronautics and Space Administration — Langley Aerosol Research Group Experiment (LARGE) measures ultrafine aerosol number density, total and non-volatile aerosol number density, dry aerosol size...

  6. NAMMA LANGLEY AEROSOL RESEARCH GROUP EXPERIMENT (LARGE) V1

    Data.gov (United States)

    National Aeronautics and Space Administration — The NAMMA Langley Aerosol Research Group Experiment (LARGE) dataset is data collected from in situ aerosol sensors: condensation nuclei counters, optical particle...

  7. Selmer groups of elliptic curves that can be arbitrarily large

    NARCIS (Netherlands)

    Schaefer, EF; Kloosterman, Remke

    In this article, it is shown that certain kinds of Selmer groups of elliptic curves can be arbitrarily large. The main result is that if p is a prime at least 5, then p-Selmer groups of elliptic curves can be arbitrarily large if one ranges over number fields of degree at most g + 1 over the

  8. Model of large pool fires

    International Nuclear Information System (INIS)

    Fay, J.A.

    2006-01-01

    A two zone entrainment model of pool fires is proposed to depict the fluid flow and flame properties of the fire. Consisting of combustion and plume zones, it provides a consistent scheme for developing non-dimensional scaling parameters for correlating and extrapolating pool fire visible flame length, flame tilt, surface emissive power, and fuel evaporation rate. The model is extended to include grey gas thermal radiation from soot particles in the flame zone, accounting for emission and absorption in both optically thin and thick regions. A model of convective heat transfer from the combustion zone to the liquid fuel pool, and from a water substrate to cryogenic fuel pools spreading on water, provides evaporation rates for both adiabatic and non-adiabatic fires. The model is tested against field measurements of large scale pool fires, principally of LNG, and is generally in agreement with experimental values of all variables

  9. Highly Scalable Trip Grouping for Large Scale Collective Transportation Systems

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Pedersen, Torben Bach; Risch, Tore

    2008-01-01

    Transportation-related problems, like road congestion, parking, and pollution, are increasing in most cities. In order to reduce traffic, recent work has proposed methods for vehicle sharing, for example for sharing cabs by grouping "closeby" cab requests and thus minimizing transportation cost and utilizing cab space. However, the methods published so far do not scale to large data volumes, which is necessary to facilitate large-scale collective transportation systems, e.g., ride-sharing systems for large cities. This paper presents highly scalable trip grouping algorithms, which generalize previous...
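
    The record describes grouping "closeby" cab requests without giving the algorithm, so the Python sketch below is only a toy greedy version of the idea: requests are bundled when their pickup points fall within a distance threshold, up to a cab capacity. It conveys the grouping concept, not the paper's scalable algorithms.

      # Toy greedy grouping of nearby trip requests (illustrative; the paper's
      # scalable algorithms are not reproduced here).
      from math import hypot

      def group_trips(requests, max_dist=1.0, capacity=4):
          """requests: list of (x, y) pickup coordinates. Returns a list of groups."""
          remaining = list(requests)
          groups = []
          while remaining:
              seed = remaining.pop(0)
              group = [seed]
              for r in remaining[:]:            # iterate over a copy while removing
                  if len(group) >= capacity:
                      break
                  if hypot(r[0] - seed[0], r[1] - seed[1]) <= max_dist:
                      group.append(r)
                      remaining.remove(r)
              groups.append(group)
          return groups

      pickups = [(0.0, 0.0), (0.3, 0.4), (5.0, 5.0), (0.8, 0.1), (5.2, 4.9)]
      for g in group_trips(pickups):
          print(g)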

  10. Net Based Examination: Small Group Tutoring, Home Assignments, and Large Group Automatic and Peer Assessment

    Directory of Open Access Journals (Sweden)

    G. Karlsson

    2007-09-01

    Full Text Available This paper deals with net based examination, tutoring and scaffolding of groups of different sizes: first for very small groups, then for normal sized groups of around 100 students, and finally for very large groups. The three different methods can be applied to internationally based courses. Methods which support deep learning through tutoring, scaffolding, project work and peer learning are also mentioned.

  11. Will Large DSO-Managed Group Practices Be the Predominant Setting for Oral Health Care by 2025? Two Viewpoints: Viewpoint 1: Large DSO-Managed Group Practices Will Be the Setting in Which the Majority of Oral Health Care Is Delivered by 2025 and Viewpoint 2: Increases in DSO-Managed Group Practices Will Be Offset by Models Allowing Dentists to Retain the Independence and Freedom of a Traditional Practice.

    Science.gov (United States)

    Cole, James R; Dodge, William W; Findley, John S; Young, Stephen K; Horn, Bruce D; Kalkwarf, Kenneth L; Martin, Max M; Winder, Ronald L

    2015-05-01

    This Point/Counterpoint article discusses the transformation of dental practice from the traditional solo/small-group (partnership) model of the 1900s to large Dental Support Organizations (DSO) that support affiliated dental practices by providing nonclinical functions such as, but not limited to, accounting, human resources, marketing, and legal and practice management. Many feel that DSO-managed group practices (DMGPs) with employed providers will become the setting in which the majority of oral health care will be delivered in the future. Viewpoint 1 asserts that the traditional dental practice patterns of the past are shifting as many younger dentists gravitate toward employed positions in large group practices or the public sector. Although educational debt is relevant in predicting graduates' practice choices, other variables such as gender, race, and work-life balance play critical roles as well. Societal characteristics demonstrated by aging Gen Xers and those in the Millennial generation blend seamlessly with the opportunities DMGPs offer their employees. Viewpoint 2 contends the traditional model of dental care delivery-allowing entrepreneurial practitioners to make decisions in an autonomous setting-is changing but not to the degree nor as rapidly as Viewpoint 1 professes. Millennials entering the dental profession, with characteristics universally attributed to their generation, see value in the independence and flexibility that a traditional practice allows. Although DMGPs provide dentists one option for practice, several alternative delivery models offer current dentists and future dental school graduates many of the advantages of DMGPs while allowing them to maintain the independence and freedom a traditional practice provides.

  12. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large scale model and data base system is presented. Based on experience in operating and developing a large scale computerized system, the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large scale models and data bases.

  13. Working group report: Dictionary of Large Hadron Collider signatures

    Indian Academy of Sciences (India)

    Working group report: Dictionary of Large Hadron Collider signatures. A Belyaev, I A Christidi, A De Roeck, R M Godbole, B Mellado, A Nyffeler, C Petridou and D P Roy. School of Physics & Astronomy, University of Southampton, Southampton SO17 1BJ, UK; Particle Physics Department, ...

  14. Characteristic properties of large subgroups in primary abelian groups

    Indian Academy of Sciences (India)


    Suppose G is an arbitrary additively written primary abelian group with a fixed large subgroup L. It is shown that G is (a) summable; ...

  15. The EU model evaluation group

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1999-01-01

    The model evaluation group (MEG) was launched in 1992, growing out of the Major Technological Hazards Programme with EU/DG XII. The goal of MEG was to improve the culture in which models were developed, particularly by encouraging voluntary model evaluation procedures based on a formalised and consensus protocol. The evaluation intended to assess the fitness-for-purpose of the models being used as a measure of their quality. The approach adopted was focused on developing a generic model evaluation protocol and subsequently targeting it onto specific areas of application. Five such developments have been initiated, on heavy gas dispersion, liquid pool fires, gas explosions, human factors and momentum fires. The quality of models is an important element when complying with the 'Seveso Directive', which requires that the safety reports submitted to the authorities comprise an assessment of the extent and severity of the consequences of identified major accidents. Further, the quality of models becomes important in the land use planning process, where the proximity of industrial sites to vulnerable areas may be critical. (Copyright (c) 1999 Elsevier Science B.V., Amsterdam. All rights reserved.)

  16. Working group report: Dictionary of Large Hadron Collider signatures

    Indian Academy of Sciences (India)

    universal extra dimensions with KK-parity for generic cases of their realization in a wide range of the model space. Discriminating signatures are tabulated and will need a further detailed analysis. Keywords. Large Hadron Collider; dark matter; discrimination; underlying theory. PACS Nos 11.30.Pb; 12.60.Jv.

  17. The effect of continuous grouping of pigs in large groups on stress response and haematological parameters

    DEFF Research Database (Denmark)

    Damgaard, Birthe Marie; Studnitz, Merete; Jensen, Karin Hjelholt

    2009-01-01

    The consequences of an ‘all in-all out' static group of uniform age vs. a continuously dynamic group with litter introduction and exit every third week were examined with respect to stress response and haematological parameters in large groups of 60 pigs. The experiment included a total of 480 pigs from weaning at the age of 4 weeks to the age of 18 weeks after weaning. Limited differences were found in stress and haematological parameters between pigs in dynamic and static groups. The cortisol response to the stress test was increasing with the duration of the stress test in pigs from the dynamic group while it was decreasing in the static group. The health condition and the growth performance were reduced in the dynamic groups compared with the static groups. In the dynamic groups the haematological parameters indicated an activation of the immune system characterised by an increased

  18. Memory efficient PCA methods for large group ICA

    Directory of Open Access Journals (Sweden)

    Srinivas eRachakonda

    2016-02-01

    Full Text Available Principal component analysis (PCA) is widely used for data reduction in group independent component analysis (ICA) of fMRI data. Commonly, group-level PCA of temporally concatenated datasets is computed prior to ICA of the group principal components. This work focuses on reducing very high dimensional temporally concatenated datasets into its group PCA space. Existing randomized PCA methods can determine the PCA subspace with minimal memory requirements and, thus, are ideal for solving large PCA problems. Since the number of dataloads is not typically optimized, we extend one of these methods to compute PCA of very large datasets with a minimal number of dataloads. This method is coined multi power iteration (MPOWIT). The key idea behind MPOWIT is to estimate a subspace larger than the desired one, while checking for convergence of only the smaller subset of interest. The number of iterations is reduced considerably (as well as the number of dataloads), accelerating convergence without loss of accuracy. More importantly, in the proposed implementation of MPOWIT, the memory required for successful recovery of the group principal components becomes independent of the number of subjects analyzed. Highly efficient subsampled eigenvalue decomposition techniques are also introduced, furnishing excellent PCA subspace approximations that can be used for intelligent initialization of randomized methods such as MPOWIT. Together, these developments enable efficient estimation of accurate principal components, as we illustrate by solving a 1600-subject group-level PCA of fMRI with standard acquisition parameters, on a regular desktop computer with only 4 GB RAM, in just a few hours. MPOWIT is also highly scalable and could realistically solve group-level PCA of fMRI on thousands of subjects, or more, using standard hardware, limited only by time, not memory. Also, the MPOWIT algorithm is highly parallelizable, which would enable fast, distributed implementations

  19. Memory Efficient PCA Methods for Large Group ICA.

    Science.gov (United States)

    Rachakonda, Srinivas; Silva, Rogers F; Liu, Jingyu; Calhoun, Vince D

    2016-01-01

    Principal component analysis (PCA) is widely used for data reduction in group independent component analysis (ICA) of fMRI data. Commonly, group-level PCA of temporally concatenated datasets is computed prior to ICA of the group principal components. This work focuses on reducing very high dimensional temporally concatenated datasets into its group PCA space. Existing randomized PCA methods can determine the PCA subspace with minimal memory requirements and, thus, are ideal for solving large PCA problems. Since the number of dataloads is not typically optimized, we extend one of these methods to compute PCA of very large datasets with a minimal number of dataloads. This method is coined multi power iteration (MPOWIT). The key idea behind MPOWIT is to estimate a subspace larger than the desired one, while checking for convergence of only the smaller subset of interest. The number of iterations is reduced considerably (as well as the number of dataloads), accelerating convergence without loss of accuracy. More importantly, in the proposed implementation of MPOWIT, the memory required for successful recovery of the group principal components becomes independent of the number of subjects analyzed. Highly efficient subsampled eigenvalue decomposition techniques are also introduced, furnishing excellent PCA subspace approximations that can be used for intelligent initialization of randomized methods such as MPOWIT. Together, these developments enable efficient estimation of accurate principal components, as we illustrate by solving a 1600-subject group-level PCA of fMRI with standard acquisition parameters, on a regular desktop computer with only 4 GB RAM, in just a few hours. MPOWIT is also highly scalable and could realistically solve group-level PCA of fMRI on thousands of subjects, or more, using standard hardware, limited only by time, not memory. Also, the MPOWIT algorithm is highly parallelizable, which would enable fast, distributed implementations ideal for big
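
    The core idea, iterating with a subspace larger than the number of components actually wanted, can be illustrated with a generic randomized subspace (power) iteration in Python. This sketch holds the whole matrix in memory and omits MPOWIT's minimal-dataload and per-component convergence machinery, so it only conveys the principle.

      # Generic randomized subspace (power) iteration for the leading principal
      # components (didactic sketch; not the MPOWIT implementation).
      import numpy as np

      def top_principal_components(X, k, oversample=10, iters=20, seed=0):
          """X: (voxels x time) data matrix, assumed mean-centered.
          Estimates a subspace of size k + oversample but returns only the top k components."""
          rng = np.random.default_rng(seed)
          Q, _ = np.linalg.qr(rng.standard_normal((X.shape[0], k + oversample)))
          for _ in range(iters):
              Q, _ = np.linalg.qr(X @ (X.T @ Q))   # one power iteration on X X^T
          B = Q.T @ X                              # small projected matrix
          U_small, s, _ = np.linalg.svd(B, full_matrices=False)
          return (Q @ U_small)[:, :k], s[:k]

      X = np.random.default_rng(1).standard_normal((2000, 300))
      X -= X.mean(axis=1, keepdims=True)
      U, s = top_principal_components(X, k=30)
      print(U.shape, s[:5])   # top-30 PCA subspace and leading singular values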

  20. Two-group interfacial area concentration correlations of two-phase flows in large diameter pipes

    International Nuclear Information System (INIS)

    Shen, Xiuzhong; Hibiki, Takashi

    2015-01-01

    The reliable empirical correlations and models are one of the important ways to predict the interfacial area concentration (IAC) in two-phase flows. However, up to now, no correlation or model is available for the prediction of the IAC in the two-phase flows in large diameter pipes. This study collected an IAC experimental database of two-phase flows taken under various flow conditions in large diameter pipes and presented a systematic way to predict the IAC for two-phase flows from bubbly, cap-bubbly to churn flow in large diameter pipes by categorizing bubbles into two groups (group-1: spherical and distorted bubble, group-2: cap bubble). Correlations were developed to predict the group-1 void fraction from the void fraction of all bubbles. The IAC contribution from group-1 bubbles was modeled by using the dominant parameters of group-1 bubble void fraction and Reynolds number based on the parameter-dependent analysis of Hibiki and Ishii (2001, 2002) using one-dimensional bubble number density and interfacial area transport equations. A new drift velocity correlation for two-phase flow with large cap bubbles in large diameter pipes was derived in this study. By comparing the newly-derived drift velocity correlation with the existing drift velocity correlation of Kataoka and Ishii (1987) for large diameter pipes and using the characteristics of the representative bubbles among the group 2 bubbles, we developed the model of IAC and bubble size for group 2 cap bubbles. The developed models for estimating the IAC are compared with the entire collected database. A reasonable agreement was obtained with average relative errors of ±28.1%, ±54.4% and ±29.6% for group 1, group 2 and all bubbles respectively. (author)

  1. Very Large System Dynamics Models - Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Jacob J. Jacobson; Leonard Malczynski

    2008-10-01

    This paper provides lessons learned from developing several large system dynamics (SD) models. System dynamics modeling practice emphasizes the need to keep models small so that they are manageable and understandable. This practice is generally reasonable and prudent; however, there are times when large SD models are necessary. This paper outlines two large SD projects that were done at two Department of Energy National Laboratories, the Idaho National Laboratory and Sandia National Laboratories. This paper summarizes the models and then discusses some of the valuable lessons learned during these two modeling efforts.

  2. MOVES Model Review Work Group

    Science.gov (United States)

    The FACA MOVES Review Work Group was formed under the Mobile Sources Technical Review Subcommittee (MSTRS), and is charged to provide input to EPA via the MSTRS and the Clean Air Act Advisory Committee on specific issues regarding MOVES development.

  3. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  4. Facilitating Active Engagement of the University Student in a Large-Group Setting Using Group Work Activities

    Science.gov (United States)

    Kinsella, Gemma K.; Mahon, Catherine; Lillis, Seamus

    2017-01-01

    It is envisaged that small-group exercises as part of a large-group session would facilitate not only group work exercises (a valuable employability skill), but also peer learning. In this article, such a strategy to facilitate the active engagement of the student in a large-group setting was explored. The production of student-led resources was…

  5. Working group report: Beyond the standard model

    Indian Academy of Sciences (India)

    The working group on Beyond the Standard Model concentrated on identifying interesting physics issues in models ... In view of the range of current interest in the high energy physics community, this working group was organised ... the computational tools currently relevant for particle phenomenology. Thus in this group, ...

  6. Working group report: Physics at the Large Hadron Collider

    Indian Academy of Sciences (India)

    This is a summary of the activities of the Physics at the LHC working group in the XIth Workshop on High Energy Physics Phenomenology (WHEPP-XI) held at the Physical Research Laboratory, Ahmedabad, India in January 2010. We discuss the activities of each sub-working group on physics issues at colliders such as ...

  7. MODELLING GROUP ACTION OF UNMANNED AERIAL VEHICLES

    Directory of Open Access Journals (Sweden)

    S. V. Korevanov

    2015-01-01

    Full Text Available The problems of modeling and planning group flights of unmanned aerial vehicles are considered. For each stage of the group flight planning procedure, a neural network structure is designed.

  8. Model of trust in work groups

    OpenAIRE

    Sidorenkov, Andrey; Sidorenkova, Irina

    2013-01-01

    A multi-dimensional model of trust in a small group has been developed and validated. This model includes two dimensions: trust levels (interpersonal trust, micro-group trust, group trust, trust between subgroups, trust between subgroups and group) and types of trust (activity-coping, information-influential and confidentially-protective trust). Each level of trust is manifested in three types, so there are fifteen varieties of trust. Two corresponding questionnaires were developed for the stu...

  9. Measuring stereotypies in large groups of veal calves

    NARCIS (Netherlands)

    Webb, L.E.

    2015-01-01

    Diets fed to veal calves seem to lead to frustration and chronic stress in the calves, due to the limited opportunity to chew and ruminate on solid feed (e.g. Veissier et al., 1998). Veal calves are typically fed only twice a day, and their meals consist of large quantities of milk replacer

  10. Regularization modeling for large-eddy simulation

    NARCIS (Netherlands)

    Geurts, Bernardus J.; Holm, D.D.

    2003-01-01

    A new modeling approach for large-eddy simulation (LES) is obtained by combining a "regularization principle" with an explicit filter and its inversion. This regularization approach allows a systematic derivation of the implied subgrid model, which resolves the closure problem. The central role of

  11. Models for large superconducting toroidal magnet systems

    International Nuclear Information System (INIS)

    Arendt, F.; Brechna, H.; Erb, J.; Komarek, P.; Krauth, H.; Maurer, W.

    1976-01-01

    Prior to the design of large GJ toroidal magnet systems it is appropriate to procure small scale models, which can simulate their pertinent properties and allow their relevant phenomena to be investigated. The important feature of the model is to show under which circumstances the system performance can be extrapolated to large magnets. Based on parameters such as the maximum magnetic field, the current density, and the maximum tolerable magneto-mechanical stresses, a simple method of designing model magnets is presented. It is shown how pertinent design parameters are changed when the toroidal dimensions are altered. In addition some conductor cost estimations are given based on reactor power output and wall loading.

  12. Unilateral neglect and perceptual parsing: a large-group study.

    Science.gov (United States)

    Neppi-Mòdona, Marco; Savazzi, Silvia; Ricci, Raffaella; Genero, Rosanna; Berruti, Giuseppina; Pepi, Riccardo

    2002-01-01

    Array-centred and subarray-centred neglect were disambiguated in a group of 116 patients with left neglect by means of a modified version of the Albert test in which the central column of segments was deleted so as to create two separate sets of targets grouped by proximity. The results indicated that neglect was more frequent in array- than subarray-centred coordinates and that, in a minority of cases, neglect co-occurred in both coordinate-systems. The two types of neglect were functionally but not anatomically dissociated. Presence of visual field defects was not prevalent in one type of neglect with respect to the other. These data contribute further evidence to previous single-case and small-group studies by showing that neglect can occur in single or multiple reference frames simultaneously, in agreement with current neuropsychological, neurophysiological and computational concepts of space representation.

  13. Commons Dilemma Choices in Small vs. Large Groups.

    Science.gov (United States)

    Powers, Richard B.; Boyle, William

    The purpose of the Commons Game is to teach students how social traps work; that is, that short-term individual gain tends to dominate long-term collective gain. Simulations of Commons Dilemma have grown considerably in the last decade; however, the research has used small face-to-face groups to study behavior in the Commons. To compare the…

  14. Working group report: Physics at the Large Hadron Collider

    Indian Academy of Sciences (India)

    This is a summary of the activities of the Physics at the LHC working group in the XIth Workshop on High Energy Physics Phenomenology (WHEPP-XI) held at the Physical Research Laboratory, Ahmedabad, India in January 2010.

  15. Conjugacy in relatively extra-large Artin groups

    Directory of Open Access Journals (Sweden)

    Arye Juhasz

    2015-09-01

    Full Text Available Let A be an Artin group with standard generators X = {x_1, …, x_n}, n ≥ 1, and defining graph Γ_A. A standard parabolic subgroup of A is a subgroup generated by a subset of X. For elements u and v of A we say (as usual) that u is conjugate to v by an element h of A if h⁻¹uh = v holds in A. Similarly, if K and L are subsets of A then K is conjugate to L by an element h of A if h⁻¹Kh = L. In this work we consider the conjugacy of elements and standard parabolic subgroups of a certain type of Artin groups. Results in this direction occur in papers by Duncan, Kazachkov, Remeslennikov, Fenn, Dale, Jun, Godelle, Gonzalez-Meneses, Wiest, Paris, Rolfsen, for example. Of particular interest are centralisers of elements and of standard parabolic subgroups, normalisers of standard parabolic subgroups, and commensurators of parabolic subgroups. In this work we consider similar problems in a new class of Artin groups, introduced in the paper "On relatively extralarge Artin groups and their relative asphericity", by Juhasz, where the word problem is solved, among other things. Also, intersections of parabolic subgroups and their conjugates are considered.

  16. Curvature Properties of Lorentzian Manifolds with Large Isometry Groups

    Energy Technology Data Exchange (ETDEWEB)

    Batat, Wafaa [Ecole Normale Superieure de L'Enseignement Technique d'Oran, Departement de Mathematiques et Informatique (Algeria)], E-mail: wafa.batat@enset-oran.dz; Calvaruso, Giovanni, E-mail: giovanni.calvaruso@unile.it; Leo, Barbara De [University of Salento, Dipartimento di Matematica 'E. De Giorgi' (Italy)], E-mail: barbara.deleo@unile.it

    2009-08-15

    The curvature of Lorentzian manifolds (M^n, g), admitting a group of isometries of dimension at least (1/2)n(n − 1) + 1, is completely described. Interesting behaviours are found, in particular as concerns local symmetry, local homogeneity and conformal flatness.
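
    For context (a standard fact, not stated in the record itself): the isometry group of an n-dimensional pseudo-Riemannian manifold has dimension at most n(n+1)/2, so the hypothesis above sits just below that maximum. In LaTeX form:

      \dim \mathrm{Isom}(M^{n},g) \;\le\; \frac{n(n+1)}{2},
      \qquad \text{while the record assumes} \qquad
      \dim \mathrm{Isom}(M^{n},g) \;\ge\; \frac{n(n-1)}{2} + 1 .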

  17. Large Group Teaching, an Effective and Efficient Teaching Methodology

    OpenAIRE

    Afshan Sumera

    2014-01-01

    In a general sense, education is a form of learning in which knowledge, skills and attitudes are transferred from generation to generation by means of teaching, training, research, or self-directed learning (Dewey, 1916/1944). The word teaching is defined in the Oxford dictionary as "to impart knowledge to or instruct (someone) as to how to do something". Dolmans defines learning as a collaborative, constructive, contextual and self-directed process (Dolmans et al., 2005). Large g...

  18. Characteristic properties of large subgroups in primary abelian groups

    Indian Academy of Sciences (India)


    1. Introduction. The main purpose of this article is to study the relations between the structures of primary abelian groups and their ... Case 2: γ − 2 exists. Let G_{γ−1} be a direct summand of G_γ. We remark, in connection with Case 1, that any p^{γ−1}-high subgroup of G_γ is isomorphic to G_{γ−1}. As far as Case 2 is concerned, ...

  19. Constituent models and large transverse momentum reactions

    International Nuclear Information System (INIS)

    Brodsky, S.J.

    1975-01-01

    The discussion of constituent models and large transverse momentum reactions includes the structure of hard scattering models, dimensional counting rules for large transverse momentum reactions, dimensional counting and exclusive processes, the deuteron form factor, applications to inclusive reactions, predictions for meson and photon beams, the charge-cubed test for the e±p → e±γX asymmetry, the quasi-elastic peak in inclusive hadronic reactions, correlations, and the multiplicity bump at large transverse momentum. Also covered are the partition method for bound state calculations, proofs of dimensional counting, minimal neutralization and quark--quark scattering, the development of the constituent interchange model, and the A dependence of high transverse momentum reactions.

  20. Working group report: Dictionary of Large Hadron Collider signatures

    Indian Academy of Sciences (India)

    We report on a plan to establish a `Dictionary of LHC Signatures', an initiative that started at the WHEPP-X workshop in Chennai, January 2008. This study aims at strategies for distinguishing three classes of dark-matter-motivated scenarios, such as R-parity conserving supersymmetry, little Higgs models with T-parity ...

  1. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.
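
    A minimal Monte Carlo sketch of the identification problem described above (illustrative only; not the authors' code, and all parameter values are arbitrary): two groups share the same latent effect of a predictor, yet their linear probability model slopes differ because the groups differ in residual scale.

```python
# Illustrative sketch (not the authors' code): both groups share the same
# latent slope, but the group with the noisier latent outcome yields a
# smaller linear probability model (LPM) coefficient.
import numpy as np

rng = np.random.default_rng(0)

def lpm_slope(n, noise_scale):
    x = rng.normal(size=n)
    y_star = 0.5 * x + rng.logistic(scale=noise_scale, size=n)  # same latent effect 0.5
    y = (y_star > 0).astype(float)                               # observed binary outcome
    return np.polyfit(x, y, 1)[0]                                # OLS slope of the LPM

b_group1 = np.mean([lpm_slope(5000, noise_scale=1.0) for _ in range(200)])
b_group2 = np.mean([lpm_slope(5000, noise_scale=2.0) for _ in range(200)])
print(f"average LPM slope, group 1: {b_group1:.3f}")
print(f"average LPM slope, group 2: {b_group2:.3f}")  # smaller, despite equal latent effect
```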

  2. Risk contracting and operational capabilities in large medical groups during national healthcare reform.

    Science.gov (United States)

    Mechanic, Robert E; Zinner, Darren

    2016-06-01

    Little is known about the scope of alternative payment models outside of Medicare. This study measures the full complement of public and private payment arrangements in large, multi-specialty group practices as a barometer of payment reform among advanced organizations. We collected information from 33 large, multi-specialty group practices about the proportion of their total revenue in 7 payment models, physician compensation strategies, and the implementation of selected performance management initiatives. We grouped respondents into 3 categories based on the proportion of their revenue in risk arrangements: risk-based (45%-100%), mixed (10%-35%), and fee-for-service (FFS) (0%-10%). We analyzed changes in contracting and operating characteristics between 2011 and 2013. In 2013, 68% of groups' total patient revenue was from FFS payments and 32% was from risk arrangements (unweighted average). Risk-based groups had 26% FFS revenue, whereas mixed-payment and FFS groups had 75% and 98%, respectively. Between 2011 and 2013, 9 groups increased risk contract revenue by about 15 percentage points and 22 reported few changes. Risk-based groups reported more advanced implementation of performance management strategies and were more likely to have physician financial incentives for quality and patient experience. The groups in this study are well positioned to manage risk-based contracts successfully, but less than one-third receive a majority of their revenue from risk arrangements. The experience of these relatively advanced groups suggests that expanding risk-based arrangements across the US health system will likely be slower and more challenging than many people assume.

  3. Large Mammalian Animal Models of Heart Disease

    Directory of Open Access Journals (Sweden)

    Paula Camacho

    2016-10-01

    Full Text Available Due to the biological complexity of the cardiovascular system, animal models are urgently needed in pre-clinical research to advance our knowledge of cardiovascular disease and to explore new drugs to repair the damaged heart. Ideally, a model system should be inexpensive, easily manipulated, reproducible, biologically representative of human disease, and ethically sound. Although a larger animal model is more expensive and difficult to manipulate, its genetic, structural, functional, and even disease similarities to humans make it an ideal model to consider first. This review presents the commonly used large animals (dog, sheep, pig, and non-human primates), while other, less used large animals (cows, horses) are excluded. The review attempts to introduce unique points for each species regarding its biological properties, degree of susceptibility to developing certain types of heart disease, and methodology for inducing conditions. For example, dogs rarely develop myocardial infarction, while dilated cardiomyopathy is developed quite often. Based on the similarities of each species to the human, model selection may first consider non-human primates, pig, sheep, and then dog, but it also depends on other factors, for example purpose, funding, ethics, and policy. We hope this review can serve as a basic outline of large animal models for cardiovascular researchers and clinicians.

  4. Modeling capillary forces for large displacements

    NARCIS (Netherlands)

    Mastrangeli, M.; Arutinov, G.; Smits, E.C.P.; Lambert, P.

    2014-01-01

    Originally applied to the accurate, passive positioning of submillimetric devices, capillary self-alignment has recently been shown to be effective also for larger components and relatively large initial offsets. In this paper, we describe an analytic quasi-static model of 1D capillary restoring forces

  5. Pronunciation Modeling for Large Vocabulary Speech Recognition

    Science.gov (United States)

    Kantor, Arthur

    2010-01-01

    The large pronunciation variability of words in conversational speech is one of the major causes of low accuracy in automatic speech recognition (ASR). Many pronunciation modeling approaches have been developed to address this problem. Some explicitly manipulate the pronunciation dictionary as well as the set of the units used to define the…

  6. Behavioral Health Integration in Large Multi-group Pediatric Practice.

    Science.gov (United States)

    Schlesinger, Abigail Boden

    2017-03-01

    There is increasing interest in methods to improve access to behavioral health services for children and adolescents. Children's Community Pediatric Behavioral Health Service (CCPBHS) is an integrated behavioral health service whose method of (a) creating a leadership team with empowered administrative and clinical stakeholders who can act on a commitment to change and (b) having a clear mission statement with integrated administrative and clinical care processes can serve as a model for implementing integration efforts within the medical home. Community Pediatrics Behavioral Health Service (CPBHS) is a sustainable initiative that improved the utilization of physical health and behavioral health systems for youth and improved the utilization of evidence-based interventions for youth served in primary care.

  7. Generation and analysis of large reliability models

    Science.gov (United States)

    Palumbo, Daniel L.; Nicol, David M.

    1990-01-01

    An effort has been underway for several years at NASA's Langley Research Center to extend the capability of Markov modeling techniques for reliability analysis to the designers of highly reliable avionic systems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG), a software tool which uses as input a graphical, object-oriented block diagram of the system, is discussed. RMG uses an automated failure modes-effects analysis algorithm to produce the reliability model from the graphical description. Also considered is the ASSURE software tool, a parallel processing program which uses the ASSIST modeling language and SURE semi-Markov solution technique. An executable failure modes-effects analysis is used by ASSURE. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that large system architectures can now be analyzed.
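
    A minimal sketch of the kind of Markov reliability model such tools generate and solve (this is not RMG or ASSURE output; the system, rates and mission times are assumed for illustration): a duplex system with repair, whose transient failure probability follows from the matrix exponential of the generator.

```python
# Minimal sketch of a Markov reliability model (not RMG or ASSURE): a duplex
# system with per-unit failure rate lam and repair rate mu, with states
# 0 = both units up, 1 = one unit up, 2 = system failed (absorbing).
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-4, 1e-2               # assumed failure and repair rates (per hour)
Q = np.array([
    [-2 * lam,     2 * lam,   0.0],   # transitions out of state 0
    [      mu, -(mu + lam),   lam],   # transitions out of state 1
    [     0.0,         0.0,   0.0],   # state 2 is absorbing
])

p0 = np.array([1.0, 0.0, 0.0])        # start with both units working
for t in (10.0, 100.0, 1000.0):       # mission times in hours
    p_t = p0 @ expm(Q * t)            # transient state probabilities
    print(f"t = {t:7.1f} h  P(system failed) = {p_t[2]:.3e}")
```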

  8. Dark radiation in LARGE volume models

    Science.gov (United States)

    Cicoli, Michele; Conlon, Joseph P.; Quevedo, Fernando

    2013-02-01

    We consider reheating driven by volume modulus decays in the LARGE volume scenario. Such reheating always generates nonzero dark radiation through the decays to the axion partner, while the only competitive visible sector decays are Higgs pairs via the Giudice-Masiero term. In the framework of sequestered models where the cosmological moduli problem is absent, the simplest model with a shift-symmetric Higgs sector generates 1.56 ≤ ΔN_eff ≤ 1.74. For more general cases, the known experimental bounds on ΔN_eff strongly constrain the parameters and matter content of the models.

  9. Modelling and control of large cryogenic refrigerator

    International Nuclear Information System (INIS)

    Bonne, Francois

    2014-01-01

    This manuscript is concerned with both the modeling and the derivation of control schemes for large cryogenic refrigerators, in particular those subjected to highly variable pulsed heat loads. A model of each object that normally composes a large cryo-refrigerator is proposed, together with the methodology for gathering object models into the model of a subsystem. The manuscript also shows how to obtain a linear equivalent model of the subsystem. Based on the derived models, advanced control schemes are proposed. Specifically, a linear quadratic controller for warm compression stations operating with either two or three pressure states is derived, and a constrained predictive controller is obtained for the cold-box. The particularity of these control schemes is that they fit the computing and data storage capabilities of Programmable Logic Controllers (PLCs), which are widely used in industry. The open-loop model prediction capability is assessed using experimental data. The developed control schemes are validated in simulation and experimentally on the 400W1.8K SBT's cryogenic test facility and on the CERN's LHC warm compression station. (author) [fr

  10. Model of trust in work groups

    Directory of Open Access Journals (Sweden)

    Sidorenkov, Andrey V.

    2013-09-01

    Full Text Available A multi-dimensional model of trust in a small group has been developed and validated. This model includes two dimensions: trust levels (interpersonal trust, micro-group trust, group trust, trust between subgroups, and trust between subgroups and the group) and types of trust (activity-coping, information-influential and confidentially-protective trust). Each level of trust is manifested in three types, so there are fifteen varieties of trust. Two corresponding questionnaires were developed for the study. 347 persons from 32 work groups participated in the research. It was determined that in a small group there is an asymmetry of trust levels within the group. In particular, micro-group trust is demonstrated the most in comparison with other trust levels. There is also an asymmetry in the manifestation of interpersonal trust in a group structure. This is demonstrated by the fact that in informal subgroups, in comparison with the group as a whole, interpersonal confidential and performance trust is the most manifested. In a small group and in informal subgroups there are relationships between trust levels which have certain regularities.

  11. Computational Modeling of Large Wildfires: A Roadmap

    KAUST Repository

    Coen, Janice L.

    2010-08-01

    Wildland fire behavior, particularly that of large, uncontrolled wildfires, has not been well understood or predicted. Our methodology to simulate this phenomenon uses high-resolution dynamic models made of numerical weather prediction (NWP) models coupled to fire behavior models to simulate fire behavior. NWP models are capable of modeling very high resolution (< 100 m) atmospheric flows. The wildland fire component is based upon semi-empirical formulas for fireline rate of spread, post-frontal heat release, and a canopy fire. The fire behavior is coupled to the atmospheric model such that low level winds drive the spread of the surface fire, which in turn releases sensible heat, latent heat, and smoke fluxes into the lower atmosphere, feeding back to affect the winds directing the fire. These coupled dynamic models capture the rapid spread downwind, flank runs up canyons, bifurcations of the fire into two heads, and rough agreement in area, shape, and direction of spread at periods for which fire location data is available. Yet, intriguing computational science questions arise in applying such models in a predictive manner, including physical processes that span a vast range of scales, processes such as spotting that cannot be modeled deterministically, estimating the consequences of uncertainty, the efforts to steer simulations with field data ("data assimilation"), lingering issues with short term forecasting of weather that may show skill only on the order of a few hours, and the difficulty of gathering pertinent data for verification and initialization in a dangerous environment. © 2010 IEEE.

  12. Point groups in the Vibron model

    Energy Technology Data Exchange (ETDEWEB)

    Leviatan, A.

    1989-08-01

    The question of incorporating the notion of point groups in the algebraic Vibron model for molecular rotation-vibration spectra is addressed. Boson transformations which act on intrinsic states are identified as the algebraic analog of the discrete point group transformations. A prescription for assigning point group labels to states of the Vibron model is obtained. In the case of nonlinear triatomic molecules the Jacobi coordinates are found to be a convenient choice for the geometric counterparts of the algebraic shape parameters. The work focuses on rigid diatomic and triatomic molecules (linear and bent).

  13. Beyond the Standard Model: Working group report

    Indian Academy of Sciences (India)

    tion within the 'Beyond the Standard Model' working group of WHEPP-6. These problems addressed various extensions of the Standard Model (SM) currently under consideration in the particle physics phenomenology community. Smaller subgroups were formed to focus on each of these problems. The progress till the end ...

  14. Group size of a permanent large group of agile mangabeys (Cercocebus agilis) at Bai Hokou, Central African Republic.

    Science.gov (United States)

    Devreese, Lieven; Huynen, Marie-Claude; Stevens, Jeroen M G; Todd, Angelique

    2013-01-01

    White-eyelid mangabeys (genus Cercocebus) live in groups of highly variable size. Because of their semi-terrestrial behaviour and preference for dense forest habitats, reliable data on group size are scarce. During a 5-month study, we collected 17 group counts on a habituated group of agile mangabeys (C. agilis) at Bai Hokou in the Central African Republic. We found a stable group size of approximately 135 individuals. This permanent large grouping pattern is known to occur among several populations of white-eyelid mangabeys and is congruent with extreme group sizes reported in mandrills at Lopé in Gabon. Copyright © 2013 S. Karger AG, Basel.

  15. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  16. On spinfoam models in large spin regime

    International Nuclear Information System (INIS)

    Han, Muxin

    2014-01-01

    We study the semiclassical behavior of the Lorentzian Engle–Pereira–Rovelli–Livine (EPRL) spinfoam model, by taking into account the sum over spins in the large spin regime. We also employ the method of stationary phase analysis with parameters and the so-called almost-analytic machinery, in order to find the asymptotic behavior of the contributions from all possible large spin configurations in the spinfoam model. The spins contributing to the sum are written as J_f = λj_f, where λ is a large parameter resulting in an asymptotic expansion via stationary phase approximation. The analysis shows that, at least for the simplicial Lorentzian geometries (as spinfoam critical configurations), they contribute the leading order approximation of the spinfoam amplitude only when their deficit angles satisfy γΘ̊_f ≤ λ^{-1/2} mod 4πZ. Our analysis results in a curvature expansion of the semiclassical low energy effective action from the spinfoam model, where the UV modifications of Einstein gravity appear as subleading high-curvature corrections. (paper)

  17. Modelling large-scale hydrogen infrastructure development

    International Nuclear Information System (INIS)

    De Groot, A.; Smit, R.; Weeda, M.

    2005-08-01

    In modelling a possible H2 infrastructure development the following questions are answered in this presentation: How could the future demand for H2 develop in the Netherlands?; and In which year and where would it be economically viable to construct a H2 infrastructure in the Netherlands? Conclusions are that: A model for describing a possible future H2 infrastructure is successfully developed; The model is strongly regional and time dependent; Decrease of fuel cell cost appears to be a sensitive parameter for development of H2 demand; Cost-margin between large-scale and small-scale H2 production is a main driver for development of a H2 infrastructure; A H2 infrastructure seems economically viable in the Netherlands starting from the year 2022

  18. A Large Group Decision Making Approach Based on TOPSIS Framework with Unknown Weights Information

    Directory of Open Access Journals (Sweden)

    Li Yupeng

    2017-01-01

    Full Text Available Large group decision making considering multiple attributes is imperative in many decision areas. The weights of the decision makers (DMs) are difficult to obtain because of the large number of DMs. To cope with this issue, an integrated multiple-attribute large group decision making framework is proposed in this article. The fuzziness and hesitation of the linguistic decision variables are described by interval-valued intuitionistic fuzzy sets. The weights of the DMs are optimized by constructing a non-linear programming model, in which the original decision matrices are aggregated by using the interval-valued intuitionistic fuzzy weighted average operator. By solving the non-linear programming model with MATLAB®, the weights of the DMs and the fuzzy comprehensive decision matrix are determined. Then the weights of the criteria are calculated based on information entropy theory. Finally, the TOPSIS framework is employed to establish the decision process. The divergence between interval-valued intuitionistic fuzzy numbers is calculated by interval-valued intuitionistic fuzzy cross entropy. A real-world case study is constructed to demonstrate the feasibility and effectiveness of the proposed methodology.
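
    A hedged sketch of the entropy-weight and TOPSIS steps on a crisp decision matrix (the paper itself works with interval-valued intuitionistic fuzzy numbers and cross entropy; the matrix below is made up and only illustrates the ranking pipeline).

```python
# Sketch of entropy weighting followed by TOPSIS on ordinary numbers
# (the decision matrix and the benefit-criteria assumption are invented).
import numpy as np

X = np.array([          # alternatives x criteria (all treated as benefit criteria)
    [7.0, 9.0, 9.0, 8.0],
    [8.0, 7.0, 8.0, 7.0],
    [9.0, 6.0, 8.0, 9.0],
])

# 1) entropy weights of the criteria
P = X / X.sum(axis=0)
E = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
w = (1 - E) / (1 - E).sum()

# 2) TOPSIS: weighted normalised matrix, ideal/anti-ideal points, closeness coefficient
V = w * X / np.linalg.norm(X, axis=0)
v_pos, v_neg = V.max(axis=0), V.min(axis=0)
d_pos = np.linalg.norm(V - v_pos, axis=1)
d_neg = np.linalg.norm(V - v_neg, axis=1)
closeness = d_neg / (d_pos + d_neg)

print("criteria weights:", np.round(w, 3))
print("ranking (best first):", np.argsort(-closeness))
```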

  19. Large animal models for stem cell therapy.

    Science.gov (United States)

    Harding, John; Roberts, R Michael; Mirochnitchenko, Oleg

    2013-03-28

    The field of regenerative medicine is approaching translation to clinical practice, and significant safety concerns and knowledge gaps have become clear as clinical practitioners are considering the potential risks and benefits of cell-based therapy. It is necessary to understand the full spectrum of stem cell actions and preclinical evidence for safety and therapeutic efficacy. The role of animal models for gaining this information has increased substantially. There is an urgent need for novel animal models to expand the range of current studies, most of which have been conducted in rodents. Extant models are providing important information but have limitations for a variety of disease categories and can have different size and physiology relative to humans. These differences can preclude the ability to reproduce the results of animal-based preclinical studies in human trials. Larger animal species, such as rabbits, dogs, pigs, sheep, goats, and non-human primates, are better predictors of responses in humans than are rodents, but in each case it will be necessary to choose the best model for a specific application. There is a wide spectrum of potential stem cell-based products that can be used for regenerative medicine, including embryonic and induced pluripotent stem cells, somatic stem cells, and differentiated cellular progeny. The state of knowledge and availability of these cells from large animals vary among species. In most cases, significant effort is required for establishing and characterizing cell lines, comparing behavior to human analogs, and testing potential applications. Stem cell-based therapies present significant safety challenges, which cannot be addressed by traditional procedures and require the development of new protocols and test systems, for which the rigorous use of larger animal species more closely resembling human behavior will be required. In this article, we discuss the current status and challenges of and several major directions

  20. Group Centric Networking: Large Scale Over the Air Testing of Group Centric Networking

    Science.gov (United States)

    2016-11-01

    … performance of Group Centric Networking (GCN), a networking protocol developed for robust and scalable communications in lossy networks where users are … automobiles, will connect to one another. In a given location (house, factory, etc.), we expect 10s to 100s of devices to communicate with one another.

  1. Beyond the Standard Model: Working group report

    Indian Academy of Sciences (India)

    Vol. 55, Nos 1 & 2, July & August 2000, pp. 307–313. Beyond the Standard Model: Working group report. Gautam Bhattacharyya. … Consider the possibility that these neutrinos are of Majorana nature … Then the initial condition of degeneracy stated above …

  2. Diagrammatic group theory in quark models

    International Nuclear Information System (INIS)

    Canning, G.P.

    1977-05-01

    A simple and systematic diagrammatic method is presented for calculating the numerical factors arising from group theory in quark models: dimensions, casimir invariants, vector coupling coefficients and especially recoupling coefficients. Some coefficients for the coupling of 3 quark objects are listed for SU(n) and SU(2n). (orig.) [de

  3. Affine Poisson Groups and WZW Model

    Directory of Open Access Journals (Sweden)

    Ctirad Klimcík

    2008-01-01

    Full Text Available We give a detailed description of a dynamical system which enjoys a Poisson-Lie symmetry with two non-isomorphic dual groups. The system is obtained by taking the q → ∞ limit of the q-deformed WZW model and the understanding of its symmetry structure results in uncovering an interesting duality of its exchange relations.

  4. Models of group psychotherapy: sifting through confusion.

    Science.gov (United States)

    Dies, R R

    1992-01-01

    This editorial introduces a series of articles by leading proponents of the ten major models of group psychotherapy to appear in the International Journal of Group Psychotherapy. These theoretical contributions will be published throughout 1992 as a dedication to the American Group Psychotherapy Association's (AGPA) 50th anniversary. In the present article, the author reports results from a recent survey of senior clinicians within AGPA who expressed their opinions about the central issues that practitioners should understand during the working phase of group treatments. Statistical comparisons among action-oriented, interpersonal, and psychodynamic respondents to the questionnaire revealed striking differences in how therapeutic interventions were conceptualized. These findings are outlined as a preface to the first three articles in the series.

  5. A Life-Cycle Model of Human Social Groups Produces a U-Shaped Distribution in Group Size.

    Directory of Open Access Journals (Sweden)

    Gul Deniz Salali

    Full Text Available One of the central puzzles in the study of sociocultural evolution is how and why transitions from small-scale human groups to large-scale, hierarchically more complex ones occurred. Here we develop a spatially explicit agent-based model as a first step towards understanding the ecological dynamics of small and large-scale human groups. By analogy with the interactions between single-celled and multicellular organisms, we build a theory of group lifecycles as an emergent property of single cell demographic and expansion behaviours. We find that once the transition from small-scale to large-scale groups occurs, a few large-scale groups continue expanding while small-scale groups gradually become scarcer, and large-scale groups become larger in size and fewer in number over time. Demographic and expansion behaviours of groups are largely influenced by the distribution and availability of resources. Our results conform to a pattern of human political change in which religions and nation states come to be represented by a few large units and many smaller ones. Future enhancements of the model should include decision-making rules and probabilities of fragmentation for large-scale societies. We suggest that the synthesis of population ecology and social evolution will generate increasingly plausible models of human group dynamics.
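
    A minimal, non-spatial sketch in the spirit of such a group life-cycle simulation (not the authors' spatially explicit agent-based model; all parameters are arbitrary): groups grow towards a resource limit and occasionally fission, which leaves a mixture of small daughter groups alongside a few groups that keep expanding.

```python
# Toy group life-cycle simulation (illustrative only; parameters are invented):
# each group grows logistically towards an assumed carrying capacity and may
# split into two daughter groups at each step.
import random
from collections import Counter

random.seed(1)
groups = [5.0] * 20                  # 20 founding groups of size 5
CARRYING_CAPACITY = 2000.0           # assumed resource limit per group
FISSION_PROB = 0.01                  # assumed per-step chance a group splits
GROWTH_RATE = 0.05                   # assumed per-step logistic growth rate

for step in range(300):
    next_groups = []
    for size in groups:
        size += GROWTH_RATE * size * (1.0 - size / CARRYING_CAPACITY)
        if size > 10 and random.random() < FISSION_PROB:
            next_groups.extend([size / 2, size / 2])   # fission into two halves
        else:
            next_groups.append(size)
    groups = next_groups

summary = Counter("large" if s > 1000 else "small" for s in groups)
print(f"{len(groups)} groups after 300 steps:", dict(summary))
```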

  6. Black holes from large N singlet models

    Science.gov (United States)

    Amado, Irene; Sundborg, Bo; Thorlacius, Larus; Wintergerst, Nico

    2018-03-01

    The emergent nature of spacetime geometry and black holes can be directly probed in simple holographic duals of higher spin gravity and tensionless string theory. To this end, we study time dependent thermal correlation functions of gauge invariant observables in suitably chosen free large N gauge theories. At low temperature and on short time scales the correlation functions encode propagation through an approximate AdS spacetime while interesting departures emerge at high temperature and on longer time scales. This includes the existence of evanescent modes and the exponential decay of time dependent boundary correlations, both of which are well known indicators of bulk black holes in AdS/CFT. In addition, a new time scale emerges after which the correlation functions return to a bulk thermal AdS form up to an overall temperature dependent normalization. A corresponding length scale was seen in equal time correlation functions in the same models in our earlier work.

  7. Cooperative Coevolution with Formula-Based Variable Grouping for Large-Scale Global Optimization.

    Science.gov (United States)

    Wang, Yuping; Liu, Haiyan; Wei, Fei; Zong, Tingting; Li, Xiaodong

    2017-08-09

    For a large-scale global optimization (LSGO) problem, divide-and-conquer is usually considered an effective strategy to decompose the problem into smaller subproblems, each of which can then be solved individually. Among these decomposition methods, variable grouping has been shown to be promising in recent years. Existing variable grouping methods usually assume the problem to be black-box (i.e., assuming that an analytical model of the objective function is unknown), and they attempt to learn appropriate variable grouping that would allow for a better decomposition of the problem. In such cases, these variable grouping methods do not make direct use of the formula of the objective function. However, it can be argued that many real-world problems are white-box problems, that is, the formulas of objective functions are often known a priori. These formulas of the objective functions provide rich information which can then be used to design an effective variable grouping method. In this article, a formula-based grouping strategy (FBG) for white-box problems is first proposed. It groups variables directly via the formula of an objective function which usually consists of a finite number of operations (i.e., the four arithmetic operations "+", "−", "×", "÷" and composite operations of basic elementary functions). In FBG, the operations are classified into two classes: one resulting in nonseparable variables, and the other resulting in separable variables. In FBG, variables can be automatically grouped into a suitable number of non-interacting subcomponents, with variables in each subcomponent being interdependent. FBG can easily be applied to any white-box problem and can be integrated into a cooperative coevolution framework. Based on FBG, a novel cooperative coevolution algorithm with formula-based variable grouping (so-called CCF) is proposed in this article for decomposing a large-scale white-box problem
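
    A small sketch of the underlying idea of formula-based grouping (not the FBG algorithm from the paper): split an explicit white-box objective into additive terms with sympy and merge variables that appear together in the same term into one subcomponent.

```python
# Sketch of grouping variables from the formula of a white-box objective
# (illustrative only; the objective function below is invented).
import sympy as sp

x1, x2, x3, x4, x5 = sp.symbols("x1:6")
f = (x1 * x2) ** 2 + sp.sin(x3) + (x4 - x5) ** 2 + x3 ** 4   # white-box objective

groups = []                      # each group is a set of interacting variables
for term in sp.Add.make_args(sp.expand(f)):
    vars_in_term = set(term.free_symbols)
    overlapping = [g for g in groups if g & vars_in_term]
    merged = vars_in_term.union(*overlapping) if overlapping else vars_in_term
    groups = [g for g in groups if not (g & vars_in_term)] + [merged]

print([sorted(map(str, g)) for g in groups])
# -> e.g. [['x1', 'x2'], ['x3'], ['x4', 'x5']]
```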

  8. Spectra of operators in large N tensor models

    Science.gov (United States)

    Bulycheva, Ksenia; Klebanov, Igor R.; Milekhin, Alexey; Tarnopolsky, Grigory

    2018-01-01

    We study the operators in the large N tensor models, focusing mostly on the fermionic quantum mechanics with O(N)^3 symmetry which may be either global or gauged. In the model with global symmetry, we study the spectra of bilinear operators, which are in either the symmetric traceless or the antisymmetric representation of one of the O(N) groups. In the symmetric traceless case, the spectrum of scaling dimensions is the same as in the Sachdev-Ye-Kitaev (SYK) model with real fermions; it includes the h = 2 zero mode. For the operators antisymmetric in the two indices, the scaling dimensions are the same as in the additional sector found in the complex tensor and SYK models; the lowest h = 0 eigenvalue corresponds to the conserved O(N) charges. A class of singlet operators may be constructed from contracted combinations of m symmetric traceless or antisymmetric two-particle operators. Their two-point functions receive contributions from m melonic ladders. Such multiple ladders are a new phenomenon in the tensor model, which does not seem to be present in the SYK model. The more typical 2k-particle operators do not receive any ladder corrections and have quantized large N scaling dimensions k/2. We construct pictorial representations of various singlet operators with low k. For larger k, we use available techniques to count the operators and show that their number grows as 2^k k!. As a consequence, the theory has a Hagedorn phase transition at a temperature which approaches zero in the large N limit. We also study the large N spectrum of low-lying operators in the Gurau-Witten model, which has O(N)^6 symmetry. We argue that it corresponds to one of the generalized SYK models constructed by Gross and Rosenhaus. Our paper also includes studies of the invariants in large N tensor integrals with various symmetries.

  9. Computational social dynamic modeling of group recruitment.

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Nina M.; Lee, Marinna; Pickett, Marc; Turnley, Jessica Glicken (Sandia National Laboratories, Albuquerque, NM); Smrcka, Julianne D. (Sandia National Laboratories, Albuquerque, NM); Ko, Teresa H.; Moy, Timothy David (Sandia National Laboratories, Albuquerque, NM); Wu, Benjamin C.

    2004-01-01

    The Seldon software toolkit combines concepts from agent-based modeling and social science to create a computational social dynamic model for group recruitment. The underlying recruitment model is based on a unique three-level hybrid agent-based architecture that contains simple agents (level one), abstract agents (level two), and cognitive agents (level three). The uniqueness of this architecture begins with abstract agents that permit the model to include social concepts (gang) or institutional concepts (school) in a typical software simulation environment. The future addition of cognitive agents to the recruitment model will provide a unique entity that does not exist in any agent-based modeling toolkits to date. We use social networks to provide an integrated mesh within and between the different levels. This Java-based toolkit is used to analyze different social concepts based on initialization input from the user. The input alters a set of parameters used to influence the values associated with the simple agents, abstract agents, and the interactions (simple agent-simple agent or simple agent-abstract agent) between these entities. The results of the phase-1 Seldon toolkit provide insight into how certain social concepts apply to different scenario development for inner city gang recruitment.

  10. Exposing earth surface process model simulations to a large audience

    Science.gov (United States)

    Overeem, I.; Kettner, A. J.; Borkowski, L.; Russell, E. L.; Peddicord, H.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) represents a diverse group of >1300 scientists who develop and apply numerical models to better understand the Earth's surface. CSDMS has a mandate to make the public more aware of model capabilities and therefore started sharing state-of-the-art surface process modeling results with large audiences. One platform to reach audiences outside the science community is through museum displays on 'Science on a Sphere' (SOS). Developed by NOAA, SOS is a giant globe, linked with computers and multiple projectors, that can display data and animations on a sphere. CSDMS has developed and contributed model simulation datasets for the SOS system since 2014, including hydrological processes, coastal processes, and human interactions with the environment. Model simulations of a hydrological and sediment transport model (WBM-SED) illustrate global river discharge patterns. WAVEWATCH III simulations have been specifically processed to show the impacts of hurricanes on ocean waves, with a focus on Hurricane Katrina and Superstorm Sandy. A large world dataset of dams built over the last two centuries gives an impression of the profound influence of humans on water management. Given the exposure of SOS, CSDMS aims to contribute at least 2 model datasets a year, and will soon provide displays of global river sediment fluxes and changes of the sea-ice-free season along the Arctic coast. Over 100 facilities worldwide show these numerical model displays to an estimated 33 million people every year. Datasets, storyboards, and teacher follow-up materials associated with the simulations are developed to address common core science K-12 standards. CSDMS dataset documentation aims to make people aware of the fact that they look at numerical model results, that underlying models have inherent assumptions and simplifications, and that limitations are known. CSDMS contributions aim to familiarize large audiences with the use of numerical

  11. Large Sets in Boolean and Non-Boolean Groups and Topology

    Directory of Open Access Journals (Sweden)

    Ol’ga V. Sipacheva

    2017-10-01

    Full Text Available Various notions of large sets in groups, including the classical notions of thick, syndetic, and piecewise syndetic sets and the new notion of vast sets in groups, are studied with emphasis on the interplay between such sets in Boolean groups. Natural topologies closely related to vast sets are considered; as a byproduct, interesting relations between vast sets and ultrafilters are revealed.

  12. Incidence and characterization of beta-hemolytic Streptococcus milleri and differentiation from S. pyogenes (group A), S. equisimilis (group C), and large-colony group G streptococci.

    Science.gov (United States)

    Lawrence, J; Yajko, D M; Hadley, W K

    1985-01-01

    The biochemical characteristics of 172 clinical isolates of group A, C, F, or G or "nongroupable" beta-hemolytic streptococci were examined. Among these isolates, 91 were identified as beta-hemolytic strains of Streptococcus milleri. The remaining isolates included 20 Streptococcus pyogenes, 21 Streptococcus equisimilis, 37 large-colony group G streptococci, and 3 unidentified nongroupable isolates. A majority (84%) of the S. milleri strains possessed Lancefield group antigen (3 A, 27 C, 41 F, and 5 G), whereas 15 S. milleri strains (16%) were nongroupable. Serological tests did not differentiate S. milleri isolates with group A, C, or G antigen from S. pyogenes (group A), S. equisimilis (group C), or large-colony group G streptococci. Biochemical tests which were found useful for differentiation included the Voges-Proskauer test, hydrolysis of pyroglutamic acid and beta-D-glucuronide, bacitracin susceptibility, and acid production from ribose. S. milleri represented 56% of the group C, 100% of the group F, and 83% of the nongroupable beta-hemolytic streptococci isolated in our clinical laboratory, whereas the incidence of S. milleri among group A and group G streptococci was estimated to be low. The role of beta-hemolytic S. milleri as a cause of human infection remains obscured by the failure to routinely differentiate S. milleri from other beta-hemolytic streptococci. PMID:3902878

  13. Problem-based learning: facilitating multiple small teams in a large group setting.

    Science.gov (United States)

    Hyams, Jennifer H; Raidal, Sharanne L

    2013-01-01

    Problem-based learning (PBL) is often described as resource demanding due to the high staff-to-student ratio required in a traditional PBL tutorial class where there is commonly one facilitator to every 5-16 students. The veterinary science program at Charles Sturt University, Australia, has developed a method of group facilitation which readily allows one or two staff members to facilitate up to 30 students at any one time while maintaining the benefits of a small PBL team of six students. Multi-team facilitation affords obvious financial and logistic advantages, but there are also important pedagogical benefits derived from uniform facilitation across multiple groups, enhanced discussion and debate between groups, and the development of self-facilitation skills in students. There are few disadvantages to the roaming facilitator model, provided that several requirements are addressed. These requirements include a suitable venue, large whiteboards, a structured approach to support student engagement with each disclosure, a detailed facilitator guide, and an open, collaborative, and communicative environment.

  14. Novel web service selection model based on discrete group search.

    Science.gov (United States)

    Zhai, Jie; Shao, Zhiqing; Guo, Yi; Zhang, Haiteng

    2014-01-01

    In our earlier work, we present a novel formal method for the semiautomatic verification of specifications and for describing web service composition components by using abstract concepts. After verification, the instantiations of components were selected to satisfy the complex service performance constraints. However, selecting an optimal instantiation, which comprises different candidate services for each generic service, from a large number of instantiations is difficult. Therefore, we present a new evolutionary approach on the basis of the discrete group search service (D-GSS) model. With regard to obtaining the optimal multiconstraint instantiation of the complex component, the D-GSS model has competitive performance compared with other service selection models in terms of accuracy, efficiency, and ability to solve high-dimensional service composition component problems. We propose the cost function and the discrete group search optimizer (D-GSO) algorithm and study the convergence of the D-GSS model through verification and test cases.
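
    A toy illustration of the selection problem itself (not the D-GSO algorithm; the candidate costs, response times and constraint are invented): pick one candidate per generic service so that a response-time constraint holds and total cost is minimised. Exhaustive search works only because this instance is tiny, which is why a metaheuristic such as group search is used for realistic numbers of candidates.

```python
# Toy multiconstraint service-selection instance, solved by brute force
# (not the discrete group search optimizer described in the paper).
from itertools import product

# (cost, response_time) of the candidates for each generic service (assumed data)
candidates = [
    [(5, 30), (8, 20), (12, 10)],   # generic service A
    [(4, 25), (9, 15)],             # generic service B
    [(6, 40), (7, 35), (10, 20)],   # generic service C
]
MAX_RESPONSE_TIME = 70

best = None
for choice in product(*candidates):
    cost = sum(c for c, _ in choice)
    rt = sum(t for _, t in choice)
    if rt <= MAX_RESPONSE_TIME and (best is None or cost < best[0]):
        best = (cost, rt, choice)

print("cheapest feasible instantiation:", best)
```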

  15. Configured-groups hypothesis: fast comparison of exact large quantities without counting.

    Science.gov (United States)

    Miravete, Sébastien; Tricot, André; Kalyuga, Slava; Amadieu, Franck

    2017-11-01

    Our innate number sense cannot distinguish between two large exact numbers of objects (e.g., 45 dots vs 46). Configured groups (e.g., 10 blocks, 20 frames) are traditionally used in schools to represent large numbers. Previous studies suggest that these external representations make it easier to use symbolic strategies such as counting ten by ten, enabling humans to differentiate exactly two large numbers. The main hypothesis of this work is that configured groups also allow for a differentiation of large exact numbers, even when symbolic strategies become ineffective. In experiment 1, the children from grade 3 were asked to compare two large collections of objects for 5 s. When the objects were organized in configured groups, the success rate was over .90. Without this configured grouping, the children were unable to make a successful comparison. Experiments 2 and 3 controlled for a strategy based on non-numerical parameters (areas delimited by dots or the sum areas of dots, etc.) or use symbolic strategies. These results suggest that configured grouping enables humans to distinguish between two large exact numbers of objects, even when innate number sense and symbolic strategies are ineffective. These results are consistent with what we call "the configured group hypothesis": configured groups play a fundamental role in the acquisition of exact numerical abilities.

  16. Fuzzy classification of phantom parent groups in an animal model

    Directory of Open Access Journals (Sweden)

    Fikse Freddy

    2009-09-01

    Full Text Available Background: Genetic evaluation models often include genetic groups to account for the unequal genetic level of animals with unknown parentage. The definition of phantom parent groups usually includes a time component (e.g. years). Combining several time periods to ensure sufficiently large groups may create problems since all phantom parents in a group are considered contemporaries. Methods: To avoid the downside of such distinct classification, a fuzzy logic approach is suggested. A phantom parent can be assigned to several genetic groups, with proportions between zero and one that sum to one. Rules were presented for assigning coefficients to the inverse of the relationship matrix for fuzzy-classified genetic groups. This approach was illustrated with simulated data from ten generations of mass selection. Observations and pedigree records were randomly deleted. Phantom parent groups were defined on the basis of gender and generation number. In one scenario, uncertainty about generation of birth was simulated for some animals with unknown parents. In the distinct classification, one of the two possible generations of birth was randomly chosen to assign phantom parents to genetic groups for animals with simulated uncertainty, whereas the phantom parents were assigned to both possible genetic groups in the fuzzy classification. Results: The empirical prediction error variance (PEV) was somewhat lower for fuzzy-classified genetic groups. The ranking of animals with unknown parents was more correct and less variable across replicates in comparison with distinct genetic groups. In another scenario, each phantom parent was assigned to three groups, one pertaining to its gender, and two pertaining to the first and last generation, with proportion depending on the (true) generation of birth. Due to the lower number of groups, the empirical PEV of breeding values was smaller when genetic groups were fuzzy-classified. Conclusion: Fuzzy
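
    A small numeric sketch of the fuzzy assignment idea (illustrative only; the proportions and group effects are invented, and the actual method works through coefficients in the inverse relationship matrix): each phantom parent receives group proportions between zero and one that sum to one, instead of a single 0/1 group code.

```python
# Fuzzy genetic-group assignment sketch: rows are phantom parents, columns are
# genetic groups; fractional memberships replace hard 0/1 codes.
import numpy as np

group_names = ["sire_gen1", "sire_gen2", "dam_gen1", "dam_gen2"]

assignments = np.array([
    [1.0, 0.0, 0.0, 0.0],   # male phantom parent, birth generation known
    [0.5, 0.5, 0.0, 0.0],   # male phantom parent, generation 1 or 2 equally likely
    [0.0, 0.0, 0.3, 0.7],   # female phantom parent, uncertain generation (assumed 0.3/0.7)
])

assert np.allclose(assignments.sum(axis=1), 1.0)   # fuzzy proportions sum to one

# expected genetic level of each phantom parent, given hypothetical group effects
group_effects = np.array([0.0, 2.0, 1.0, 3.0])
print("expected phantom-parent levels:", assignments @ group_effects)
```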

  17. Large and small sets with respect to homomorphisms and products of groups

    Directory of Open Access Journals (Sweden)

    Riccardo Gusso

    2002-10-01

    Full Text Available We study the behaviour of large, small and medium subsets with respect to homomorphisms and products of groups. Then we introduce the definition of a P-small set in abelian groups and we investigate the relations between this kind of smallness and the previous one, giving some examples that distinguish them.

  18. Renormalisation group improved leptogenesis in family symmetry models

    International Nuclear Information System (INIS)

    Cooper, Iain K.; King, Stephen F.; Luhn, Christoph

    2012-01-01

    We study renormalisation group (RG) corrections relevant for leptogenesis in the case of family symmetry models such as the Altarelli-Feruglio A_4 model of tri-bimaximal lepton mixing or its extension to tri-maximal mixing. Such corrections are particularly relevant since in large classes of family symmetry models, to leading order, the CP violating parameters of leptogenesis would be identically zero at the family symmetry breaking scale, due to the form dominance property. We find that RG corrections violate form dominance and enable such models to yield viable leptogenesis at the scale of right-handed neutrino masses. More generally, the results of this paper show that RG corrections to leptogenesis cannot be ignored for any family symmetry model involving sizeable neutrino and τ Yukawa couplings.

  19. Multilevel Modeling for Research in Group Work

    Science.gov (United States)

    Selig, James P.; Trott, Arianna; Lemberger, Matthew E.

    2017-01-01

    Researchers in group counseling often encounter complex data from individual clients who are members of a group. Clients in the same group may be more similar than clients from different groups and this can lead to violations of statistical assumptions. The complexity of the data also means that predictors and outcomes can be measured at both the…
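
    A hedged sketch of the kind of two-level model the article discusses, fitted to simulated data with statsmodels (variable names such as "cohesion" are hypothetical): a random intercept per counseling group absorbs the within-group similarity that would otherwise violate independence assumptions.

```python
# Two-level (clients nested in groups) random-intercept model on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_groups, n_per_group = 30, 8
group = np.repeat(np.arange(n_groups), n_per_group)
group_effect = rng.normal(0, 0.8, n_groups)[group]      # shared group-level variation
cohesion = rng.normal(0, 1, n_groups * n_per_group)     # client-level predictor
outcome = 2.0 + 0.5 * cohesion + group_effect + rng.normal(0, 1, n_groups * n_per_group)

df = pd.DataFrame({"outcome": outcome, "cohesion": cohesion, "group": group})

# random intercept for each counseling group accounts for within-group similarity
model = smf.mixedlm("outcome ~ cohesion", data=df, groups=df["group"])
result = model.fit()
print(result.summary())
```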

  20. Random Coefficient Logit Model for Large Datasets

    NARCIS (Netherlands)

    C. Hernández-Mireles (Carlos); D. Fok (Dennis)

    2010-01-01

    We present an approach for analyzing market shares and product price elasticities based on large datasets containing aggregate sales data for many products, several markets and for relatively long time periods. We consider the recently proposed Bayesian approach of Jiang et al [Jiang,
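
    An illustrative sketch only (not the authors' Bayesian estimator; products, prices and coefficients are invented): simulating aggregate market shares from a random coefficient logit by Monte Carlo integration over a heterogeneous price coefficient, the kind of share equation such approaches fit to aggregate sales data.

```python
# Random coefficient (mixed) logit share simulation (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
prices = np.array([1.0, 1.5, 2.0])        # three products (hypothetical data)
quality = np.array([0.5, 1.0, 1.5])

def market_shares(mean_alpha, sd_alpha, n_draws=10_000):
    alpha = rng.normal(mean_alpha, sd_alpha, size=(n_draws, 1))   # heterogeneous price sensitivity
    utility = quality + alpha * prices                             # n_draws x 3 utilities
    expu = np.exp(utility)
    probs = expu / (1.0 + expu.sum(axis=1, keepdims=True))         # outside good utility normalised to 0
    return probs.mean(axis=0)                                      # integrate over consumer draws

shares = market_shares(mean_alpha=-2.0, sd_alpha=0.5)
print("simulated inside-good shares:", np.round(shares, 3))
print("outside-good share:", round(1 - shares.sum(), 3))
```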

  1. Functional renormalisation group approach for tensorial group field theory: a rank-3 model

    Energy Technology Data Exchange (ETDEWEB)

    Benedetti, Dario [Max Planck Institute for Gravitational Physics, Albert Einstein Institute,Am Mühlenberg 1, Potsdam, 14476 (Germany); Geloun, Joseph Ben [Max Planck Institute for Gravitational Physics, Albert Einstein Institute,Am Mühlenberg 1, Potsdam, 14476 (Germany); International Chair in Mathematical Physics and Applications, ICMPA-UNESCO Chair,University of Abomey-Calavi, Cotonou (Benin); Oriti, Daniele [Max Planck Institute for Gravitational Physics, Albert Einstein Institute,Am Mühlenberg 1, Potsdam, 14476 (Germany)

    2015-03-17

    We set up the Functional Renormalisation Group formalism for Tensorial Group Field Theory in full generality. We then apply it to a rank-3 model over U(1)^3, endowed with a kinetic term linear in the momenta and with nonlocal interactions. The system of FRG equations turns out to be non-autonomous in the RG flow parameter. This feature is explained by the existence of a hidden scale, the radius of the group manifold. We investigate in detail the opposite regimes of large cut-off (UV) and small cut-off (IR) of the FRG equations, where the system becomes autonomous, and we find, in both cases, Gaussian and non-Gaussian fixed points. We derive and interpret the critical exponents and flow diagrams associated with these fixed points, and discuss how the UV and IR regimes are matched. Finally, we discuss the evidence for a phase transition from a symmetric phase to a broken or condensed phase, from an RG perspective, finding that this seems to exist only in the approximate regime of very large radius of the group manifold, as is to be expected for systems on compact manifolds.

  2. Large Scale Management of Physicists Personal Analysis Data Without Employing User and Group Quotas

    International Nuclear Information System (INIS)

    Norman, A.; Diesbug, M.; Gheith, M.; Illingworth, R.; Lyon, A.; Mengel, M.

    2015-01-01

    The ability of modern HEP experiments to acquire and process unprecedented amounts of data and simulation has led to an explosion in the volume of information that individual scientists deal with on a daily basis. This explosion has resulted in a need for individuals to generate and keep large personal analysis data sets which represent the skimmed portions of official data collections, pertaining to their specific analysis. While a significant reduction in size compared to the original data, these personal analysis and simulation sets can be many terabytes or 10s of TB in size and consist of 10s of thousands of files. When this personal data is aggregated across the many physicists in a single analysis group or experiment, it can represent data volumes on par with or exceeding the official production samples, which require special data handling techniques to deal with effectively. In this paper we explore the changes to the Fermilab computing infrastructure and computing models which have been developed to allow experimenters to effectively manage their personal analysis data and other data that falls outside of the typically centrally managed production chains. In particular we describe the models and tools that are being used to provide the modern neutrino experiments like NOvA with storage resources that are sufficient to meet their analysis needs, without imposing specific quotas on users or groups of users. We discuss the storage mechanisms and the caching algorithms that are being used, as well as the toolkits that have been developed to allow the users to easily operate with terascale+ datasets. (paper)

  3. Large Scale Management of Physicists Personal Analysis Data Without Employing User and Group Quotas

    Science.gov (United States)

    Norman, A.; Diesbug, M.; Gheith, M.; Illingworth, R.; Lyon, A.; Mengel, M.

    2015-12-01

    The ability of modern HEP experiments to acquire and process unprecedented amounts of data and simulation has led to an explosion in the volume of information that individual scientists deal with on a daily basis. This explosion has resulted in a need for individuals to generate and keep large personal analysis data sets which represent the skimmed portions of official data collections, pertaining to their specific analysis. While a significant reduction in size compared to the original data, these personal analysis and simulation sets can be many terabytes or 10s of TB in size and consist of 10s of thousands of files. When this personal data is aggregated across the many physicists in a single analysis group or experiment, it can represent data volumes on par with or exceeding the official production samples, which require special data handling techniques to deal with effectively. In this paper we explore the changes to the Fermilab computing infrastructure and computing models which have been developed to allow experimenters to effectively manage their personal analysis data and other data that falls outside of the typically centrally managed production chains. In particular we describe the models and tools that are being used to provide the modern neutrino experiments like NOvA with storage resources that are sufficient to meet their analysis needs, without imposing specific quotas on users or groups of users. We discuss the storage mechanisms and the caching algorithms that are being used, as well as the toolkits that have been developed to allow the users to easily operate with terascale+ datasets.

  4. An Audit of the Effectiveness of Large Group Neurology Tutorials for Irish Undergraduate Medical Students

    LENUS (Irish Health Repository)

    Kearney, H

    2016-07-01

    The aim of this audit was to determine the effectiveness of large group tutorials for teaching neurology to medical students. Students were asked to complete a questionnaire rating their confidence on a ten-point Likert scale in a number of domains in the undergraduate education guidelines from the Association of British Neurologists (ABN). We then arranged a series of interactive large group tutorials for the class and repeated the questionnaire one month after teaching. In the three core domains of neurological history taking, examination and differential diagnosis, none of the students rated their confidence as nine or ten out of ten prior to teaching. This increased to 6% for history taking, 12% for examination and 25% for differential diagnosis after eight weeks of tutorials. This audit demonstrates that in our centre, large group tutorials were an effective means of teaching, as measured by the ABN guidelines in undergraduate neurology.

  5. From evolution to revolution: understanding mutability in large and disruptive human groups

    Science.gov (United States)

    Whitaker, Roger M.; Felmlee, Diane; Verma, Dinesh C.; Preece, Alun; Williams, Grace-Rose

    2017-05-01

    Over the last 70 years there has been a major shift in the threats to global peace. While the 1950's and 1960's were characterised by the cold war and the arms race, many security threats are now characterised by group behaviours that are disruptive, subversive or extreme. In many cases such groups are loosely and chaotically organised, but their ideals are sociologically and psychologically embedded in group members to the extent that the group represents a major threat. As a result, insights into how human groups form, emerge and change are critical, but surprisingly limited insights into the mutability of human groups exist. In this paper we argue that important clues to understand the mutability of groups come from examining the evolutionary origins of human behaviour. In particular, groups have been instrumental in human evolution, used as a basis to derive survival advantage, leaving all humans with a basic disposition to navigate the world through social networking and managing their presence in a group. From this analysis we present five critical features of social groups that govern mutability, relating to social norms, individual standing, status rivalry, ingroup bias and cooperation. We argue that understanding how these five dimensions interact and evolve can provide new insights into group mutation and evolution. Importantly, these features lend themselves to digital modeling. Therefore computational simulation can support generative exploration of groups and the discovery of latent factors, relevant to both internal group and external group modelling. Finally we consider the role of online social media in relation to understanding the mutability of groups. This can play an active role in supporting collective behaviour, and analysis of social media in the context of the five dimensions of group mutability provides a fresh basis to interpret the forces affecting groups.

  6. Holonic Modelling of Large Scale Geographic Environments

    Science.gov (United States)

    Mekni, Mehdi; Moulin, Bernard

    In this paper, we propose a novel approach to model Virtual Geographic Environments (VGE) which uses the holonic approach as a computational geographic methodology and holarchy as organizational principle. Our approach allows to automatically build VGE using data provided by Geographic Information Systems (GIS) and enables an explicit representation of the geographic environment for Situated Multi-Agent Systems (SMAS) in which agents are situated and with which they interact. In order to take into account geometric, topologic, and semantic characteristics of the geographic environment, we propose the use of the holonic approach to build the environment holarchy. We illustrate our holonic model using two different environments: an urban environment and a natural environment.

  7. Revisiting the merits of a mandatory large group classroom learning format: an MD-MBA perspective.

    Science.gov (United States)

    Li, Shawn X; Pinto-Powell, Roshini

    2017-01-01

    The role of classroom learning in medical education is rapidly changing. To promote active learning and reduce student stress, medical schools have adopted policies such as pass/fail curricula and recorded lectures. These policies, along with the rising importance of the USMLE (United States Medical Licensing Examination) exams, have made asynchronous learning popular to the detriment of classroom learning. In contrast to this model, modern-day business schools employ mandatory large group classes with assigned seating and cold-calling. Despite similar student demographics, medical and business schools have adopted vastly different approaches to the classroom. When examining the classroom dynamic at business schools with mandatory classes, it is evident that there is an abundance of engaging discourse and peer learning, objectives that medical schools share. Mandatory classes leverage the network effect just like social media forums such as Facebook and Twitter. That is, the value of a classroom discussion increases when more students are present to participate. At a time when students are savvy consumers of knowledge, the classroom is competing against an explosion of study aids dedicated to USMLE preparation. Certainly, the purpose of medical school is not solely the efficient transfer of knowledge, but the training of authentic, competent, and complete physicians. To accomplish this, we must promote the inimitable and deeply personal interactions amongst faculty and students. When viewed through this lens, mandatory classes might just be a way for medical schools to leverage their competitive advantage in educating the complete physician.

  8. Establishing Peer Mentor-Led Writing Groups in Large First-Year Courses

    Science.gov (United States)

    Marcoux, Sarah; Marken, Liv; Yu, Stan

    2012-01-01

    This paper describes the results of a pilot project designed to improve students' academic writing in a large (200-student) first-year Agriculture class at the University of Saskatchewan. In collaboration with the course's professor, the Writing Centre coordinator and a summer student designed curriculum for four two-hour Writing Group sessions…

  9. Counting irreducible representations of large degree of the upper triangular groups

    OpenAIRE

    Le, Tung

    2009-01-01

    Let $U_n(q)$ be the upper triangular group of degree $n$ over the finite field $\mathbb{F}_q$ with $q$ elements. In this paper, we present constructions of large degree ordinary irreducible representations of $U_n(q)$ where $n \geq 7$, and then determine the number of irreducible representations of largest, second largest and third largest degrees.

  10. Learning through Discussions: Comparing the Benefits of Small-Group and Large-Class Settings

    Science.gov (United States)

    Pollock, Philip H.; Hamann, Kerstin; Wilson, Bruce M.

    2011-01-01

    The literature on teaching and learning heralds the benefits of discussion for student learner outcomes, especially its ability to improve students' critical thinking skills. Yet, few studies compare the effects of different types of face-to-face discussions on learners. Using student surveys, we analyze the benefits of small-group and large-class…

  11. All polymer chip for amperometric studies of transmitter release from large groups of neuronal cells

    DEFF Research Database (Denmark)

    Larsen, Simon T.; Taboryski, Rafael

    2012-01-01

    We present an all polymer electrochemical chip for simple detection of transmitter release from large groups of cultured PC 12 cells. Conductive polymer PEDOT:tosylate microelectrodes were used together with constant potential amperometry to obtain easy-to-analyze oxidation signals from potassium...

  12. Assessing Activity and Location of Individual Laying Hens in Large Groups Using Modern Technology

    Science.gov (United States)

    Siegford, Janice M.; Berezowski, John; Biswas, Subir K.; Daigle, Courtney L.; Gebhardt-Henrich, Sabine G.; Hernandez, Carlos E.; Thurner, Stefan; Toscano, Michael J.

    2016-01-01

    Simple Summary Tracking of individual animals within large groups is increasingly possible offering an exciting opportunity to researchers. Whereas previously only relatively indistinguishable groups of individual animals could be observed and combined into pen level data, we can now focus on individual actors and track their activities across time and space with minimal intervention and disturbance. We describe several tracking systems that are currently in use for laying hens and review each, highlighting their strengths and weaknesses, as well as environments or conditions for which they may be most suited, and relevant issues to fit the best technology for the intended purpose. Abstract Tracking individual animals within large groups is increasingly possible, offering an exciting opportunity to researchers. Whereas previously only relatively indistinguishable groups of individual animals could be observed and combined into pen level data, we can now focus on individual actors within these large groups and track their activities across time and space with minimal intervention and disturbance. The development is particularly relevant to the poultry industry as, due to a shift away from battery cages, flock sizes are increasingly becoming larger and environments more complex. Many efforts have been made to track individual bird behavior and activity in large groups using a variety of methodologies with variable success. Of the technologies in use, each has associated benefits and detriments, which can make the approach more or less suitable for certain environments and experiments. Within this article, we have divided several tracking systems that are currently available into two major categories (radio frequency identification and radio signal strength) and review the strengths and weaknesses of each, as well as environments or conditions for which they may be most suitable. We also describe related topics including types of analysis for the data and concerns with selecting focal birds.

  13. Working group report: Beyond the standard model

    Indian Academy of Sciences (India)

    Superstring-inspired phenomenology: this included models of low-scale quantum gravity with one or more extra dimensions, noncommutative geometry and gauge theories, and string-inspired grand unification. Models of supersymmetry-breaking: this included supersymmetry-breaking in minimal supergravity ...

  14. Qualitative Analysis of Collaborative Learning Groups in Large Enrollment Introductory Astronomy

    Science.gov (United States)

    Skala, Chija; Slater, Timothy F.; Adams, Jeffrey P.

    2000-08-01

    Large-lecture introductory astronomy courses for undergraduate, non-science majors present numerous problems for faculty. As part of a systematic effort to improve the course learning environment, a series of small-group, collaborative learning activities were implemented in an otherwise conventional lecture astronomy survey course. These activities were used once each week during the regularly scheduled lecture period. After eight weeks, ten focus group interviews were conducted to qualitatively assess the impact and dynamics of these small group learning activities. Overall, the data strongly suggest that students enjoy participating in the in-class learning activities in learning teams of three to four students. These students firmly believe that they are learning more than they would from lectures alone. Inductive analysis of the transcripts revealed five major themes prevalent among the students' perspectives: (1) self-formed, cooperative group composition and formation should be more regulated by the instructor; (2) team members' assigned roles should be less formally structured by the instructors; (3) cooperative groups helped in learning the course content; (4) time constraints on lectures and activities need to be more carefully aligned; and (5) gender issues can exist within the groups. These themes serve as a guide for instructors who are developing instructional interventions for large lecture courses.

  15. Group Modeling in Social Learning Environments

    Science.gov (United States)

    Stankov, Slavomir; Glavinic, Vlado; Krpan, Divna

    2012-01-01

    Students' collaboration while learning could provide better learning environments. Collaboration assumes social interactions which occur in student groups. Social theories emphasize positive influence of such interactions on learning. In order to create an appropriate learning environment that enables social interactions, it is important to…

  16. Assessing Activity and Location of Individual Laying Hens in Large Groups Using Modern Technology.

    Science.gov (United States)

    Siegford, Janice M; Berezowski, John; Biswas, Subir K; Daigle, Courtney L; Gebhardt-Henrich, Sabine G; Hernandez, Carlos E; Thurner, Stefan; Toscano, Michael J

    2016-02-02

    Tracking individual animals within large groups is increasingly possible, offering an exciting opportunity to researchers. Whereas previously only relatively indistinguishable groups of individual animals could be observed and combined into pen level data, we can now focus on individual actors within these large groups and track their activities across time and space with minimal intervention and disturbance. The development is particularly relevant to the poultry industry as, due to a shift away from battery cages, flock sizes are increasingly becoming larger and environments more complex. Many efforts have been made to track individual bird behavior and activity in large groups using a variety of methodologies with variable success. Of the technologies in use, each has associated benefits and detriments, which can make the approach more or less suitable for certain environments and experiments. Within this article, we have divided several tracking systems that are currently available into two major categories (radio frequency identification and radio signal strength) and review the strengths and weaknesses of each, as well as environments or conditions for which they may be most suitable. We also describe related topics including types of analysis for the data and concerns with selecting focal birds.

  17. An investigation into the factors that encourage learner participation in a large group medical classroom.

    Science.gov (United States)

    Moffett, Jennifer; Berezowski, John; Spencer, Dustine; Lanning, Shari

    2014-01-01

    Effective lectures often incorporate activities that encourage learner participation. A challenge for educators is how to facilitate this in the large group lecture setting. This study investigates the individual student characteristics involved in encouraging (or dissuading) learners to interact, ask questions, and make comments in class. Students enrolled in a Doctor of Veterinary Medicine program at Ross University School of Veterinary Medicine, St Kitts, were invited to complete a questionnaire canvassing their participation in the large group classroom. Data from the questionnaire were analyzed using Excel (Microsoft, Redmond, WA, USA) and the R software environment (http://www.r-project.org/). One hundred and ninety-two students completed the questionnaire (response rate, 85.7%). The results showed statistically significant differences between male and female students when asked to self-report their level of participation (P=0.011) and their confidence to participate (P<0.001) in class. Male students were more likely to participate in class and reported feeling more confident to participate than female students. Female students in this study commonly identified aversion to public speaking as a factor which held them back from participating in the large group lecture setting. These are important findings for veterinary and medical educators aiming to improve learner participation in the classroom. Potential ways of addressing this challenge include addition of small group activities and audience response systems during lectures, and inclusion of training interventions in public speaking at an early stage of veterinary and medical curricula.

  18. Assessing Activity and Location of Individual Laying Hens in Large Groups Using Modern Technology

    Directory of Open Access Journals (Sweden)

    Janice M. Siegford

    2016-02-01

    Full Text Available Tracking individual animals within large groups is increasingly possible, offering an exciting opportunity to researchers. Whereas previously only relatively indistinguishable groups of individual animals could be observed and combined into pen level data, we can now focus on individual actors within these large groups and track their activities across time and space with minimal intervention and disturbance. The development is particularly relevant to the poultry industry as, due to a shift away from battery cages, flock sizes are increasingly becoming larger and environments more complex. Many efforts have been made to track individual bird behavior and activity in large groups using a variety of methodologies with variable success. Of the technologies in use, each has associated benefits and detriments, which can make the approach more or less suitable for certain environments and experiments. Within this article, we have divided several tracking systems that are currently available into two major categories (radio frequency identification and radio signal strength) and review the strengths and weaknesses of each, as well as environments or conditions for which they may be most suitable. We also describe related topics including types of analysis for the data and concerns with selecting focal birds.

  19. Large Animal Stroke Models vs. Rodent Stroke Models, Pros and Cons, and Combination?

    Science.gov (United States)

    Cai, Bin; Wang, Ning

    2016-01-01

    Stroke is a leading cause of serious long-term disability worldwide and the second leading cause of death in many countries. Long-standing attempts to salvage dying neurons via various neuroprotective agents have failed in stroke translational research, owing in part to the huge gap between animal stroke models and stroke patients, which also suggests that rodent models have limited predictive value and that alternate large animal models are likely to become important in future translational research. The genetic background, physiological characteristics, behavioral characteristics, and brain structure of large animals, especially nonhuman primates, are analogous to those of humans, and stroke in these animals resembles stroke in humans. Moreover, relatively new regional imaging techniques, measurements of regional cerebral blood flow, and sophisticated physiological monitoring can be more easily performed on the same animal at multiple time points. As a result, we can use large animal stroke models to decrease the gap and promote translation of basic science stroke research. At the same time, we should not neglect the disadvantages of the large animal stroke model, such as the significant expense and ethical considerations, which can be overcome by rodent models. Rodents should be selected as stroke models for initial testing, and primates or cats are desirable as a second species, as recommended by the Stroke Therapy Academic Industry Roundtable (STAIR) group in 2009.

  20. Modelling large scale human activity in San Francisco

    Science.gov (United States)

    Gonzalez, Marta

    2010-03-01

    Diverse groups of people with a wide variety of schedules, activities and travel needs compose our cities nowadays. This represents a big challenge for modeling travel behavior in urban environments; such models are of crucial interest for a wide variety of applications such as traffic forecasting, the spreading of viruses, or measuring human exposure to air pollutants. The traditional means of obtaining knowledge about travel behavior is limited to surveys on travel journeys. The information obtained is based on questionnaires that are usually costly to implement, have intrinsic limitations in covering large numbers of individuals, and suffer from reliability problems. Using mobile phone data, we explore the basic characteristics of a model of human travel: the distribution of agents is proportional to the population density of a given region, and each agent has a characteristic trajectory size containing information on the frequency of visits to different locations. Additionally, we use a complementary data set from smart subway fare cards, which gives us the exact time at which each passenger enters or exits a subway station and the station's coordinates. This allows us to uncover the temporal aspects of mobility. Since we have the actual time and place of each individual's origin and destination, we can understand the temporal patterns in each visited location in further detail. Integrating the two data sets, we provide a dynamical model of human travel that incorporates the different aspects observed empirically.
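
    The ingredients described above (agents placed in proportion to population density, each with a characteristic number of visited locations and skewed visit frequencies) can be illustrated with a toy Python sketch. The grid, parameter values and Zipf-like frequency rule below are assumptions for illustration only, not the calibrated model from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy population density over a 10x10 grid of city zones (normalized to sum to 1).
density = rng.random((10, 10))
density /= density.sum()

def place_agents(n_agents):
    """Assign each agent a home zone with probability proportional to density."""
    zones = rng.choice(density.size, size=n_agents, p=density.ravel())
    return np.column_stack(np.unravel_index(zones, density.shape))

def visit_frequencies(n_locations):
    """Zipf-like visit frequencies: a few locations are visited often,
    many are visited rarely (a common empirical finding in mobility data)."""
    ranks = np.arange(1, n_locations + 1)
    f = 1.0 / ranks
    return f / f.sum()

homes = place_agents(1000)
# Each agent gets a characteristic trajectory size (number of distinct locations).
trajectory_sizes = rng.integers(2, 12, size=len(homes))

# Simulate one day of trips for the first agent as an illustration.
n_loc = trajectory_sizes[0]
freqs = visit_frequencies(n_loc)
day_trips = rng.choice(n_loc, size=8, p=freqs)   # 8 trips among the agent's locations
print("agent 0 visits location indices:", day_trips)
```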

  1. Long-Term Calculations with Large Air Pollution Models

    DEFF Research Database (Denmark)

    Ambelas Skjøth, C.; Bastrup-Birk, A.; Brandt, J.

    1999-01-01

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  2. Constituent rearrangement model and large transverse momentum reactions

    International Nuclear Information System (INIS)

    Igarashi, Yuji; Imachi, Masahiro; Matsuoka, Takeo; Otsuki, Shoichiro; Sawada, Shoji.

    1978-01-01

    In this chapter, two models based on the constituent rearrangement picture for large p_t phenomena are summarized. One is the quark-junction model, and the other is the correlating quark rearrangement model. Counting rules of the models apply to both two-body reactions and hadron productions. (author)

  3. Seasonal and group effects on dairy cow behavior in large yards

    OpenAIRE

    NIKKHAH, Akbar; KOWSAR, Rasool

    2014-01-01

    In mechanized modern dairy facilities with competitive environments, monitoring behavior provides opportunities to manipulate and optimize the nutritional, health, and social status of high-merit cows. The objective of the current study was to determine seasonal and cow group effects on the eating, ruminating, standing, and lying behaviors of dairy cows in large yards. Seasonal data on various behaviors of lactating cows in different production and lactation stages were collected continuously...

  4. An investigation into the factors that encourage learner participation in a large group medical classroom

    Directory of Open Access Journals (Sweden)

    Moffett J

    2014-03-01

    Full Text Available Jennifer Moffett, John Berezowski, Dustine Spencer, Shari Lanning Ross University School of Veterinary Medicine, West Farm, St Kitts, West Indies Background: Effective lectures often incorporate activities that encourage learner participation. A challenge for educators is how to facilitate this in the large group lecture setting. This study investigates the individual student characteristics involved in encouraging (or dissuading) learners to interact, ask questions, and make comments in class. Methods: Students enrolled in a Doctor of Veterinary Medicine program at Ross University School of Veterinary Medicine, St Kitts, were invited to complete a questionnaire canvassing their participation in the large group classroom. Data from the questionnaire were analyzed using Excel (Microsoft, Redmond, WA, USA) and the R software environment (http://www.r-project.org/). Results: One hundred and ninety-two students completed the questionnaire (response rate, 85.7%). The results showed statistically significant differences between male and female students when asked to self-report their level of participation (P=0.011) and their confidence to participate (P<0.001) in class. No statistically significant difference was identified between different age groups of students (P=0.594). Student responses reflected that an "aversion to public speaking" acted as the main deterrent to participating during a lecture. Female participants were 3.56 times more likely to report a fear of public speaking than male participants (odds ratio 3.56, 95% confidence interval 1.28–12.33, P=0.01). Students also reported "smaller sizes of class and small group activities" and "other students participating" as factors that made it easier for them to participate during a lecture. Conclusion: In this study, sex likely played a role in learner participation in the large group veterinary classroom. Male students were more likely to participate in class and reported feeling more confident to

  5. Multiloop functional renormalization group for general models

    Science.gov (United States)

    Kugler, Fabian B.; von Delft, Jan

    2018-02-01

    We present multiloop flow equations in the functional renormalization group (fRG) framework for the four-point vertex and self-energy, formulated for a general fermionic many-body problem. This generalizes the previously introduced vertex flow [F. B. Kugler and J. von Delft, Phys. Rev. Lett. 120, 057403 (2018), 10.1103/PhysRevLett.120.057403] and provides the necessary corrections to the self-energy flow in order to complete the derivative of all diagrams involved in the truncated fRG flow. Due to its iterative one-loop structure, the multiloop flow is well suited for numerical algorithms, enabling improvement of many fRG computations. We demonstrate its equivalence to a solution of the (first-order) parquet equations in conjunction with the Schwinger-Dyson equation for the self-energy.

  6. The impact of group model building on behaviour

    NARCIS (Netherlands)

    Rouwette, E.A.J.A.

    2017-01-01

    Group model building refers to a process of building system dynamics models with decision makers, experts, and other stakeholders. Involving stakeholders in building system dynamics models has a long history going back several decades (Andersen, Vennix, Richardson, & Rouwette, 2007). In

  7. Utilizing Focus Groups with Potential Participants and Their Parents: An Approach to Inform Study Design in a Large Clinical Trial.

    Science.gov (United States)

    Kadimpati, Sandeep; McCormick, Jennifer B; Chiu, Yichen; Parker, Ashley B; Iftikhar, Aliya Z; Flick, Randall P; Warner, David O

    2014-01-01

    In the recent literature, there has been some evidence that exposure of children to anesthetic procedures during the first two years of life may impair cognitive function and learning in later life. We planned a clinical study to quantify this risk, a study involving testing 1,000 children for neurodevelopmental deficits. As a part of this planning, we conducted focus groups involving potential participants and their parents to elicit information regarding three issues: communications with the community and potential participants, recruitment and consent processes, and the return of neurodevelopmental testing results. Three focus groups were conducted with the parents of potential participants and one focus group was conducted with an 18–19-year-old group; each group consisted of 6-10 participants. The moderated discussions included questions about recruitment, consenting issues, and expectations from the study about return of both overall trial findings and individual research test results. The focus group data gave us insight into potential participants' views on recruitment, consenting, communications about the study, and expectations about return of both overall trial findings and individual research test results. The concerns expressed were largely addressable. In addition, the concern we had about some parents enrolling their children in the study solely for the sake of getting their child's cognitive function results was dispelled. We found that the individuals participating in our focus groups were generally enthusiastic about the large clinical study and could see the value in answering the study question. The data from the focus groups were used to inform changes to the recruitment and consent process. Focus group input was also instrumental in affirming the study design regarding return of results. Our experience suggests that the approach we used may serve as a model for other investigators to help inform the various elements of clinical study design, in

  8. Large scale stochastic spatio-temporal modelling with PCRaster

    NARCIS (Netherlands)

    Karssenberg, D.J.; Drost, N.; Schmitz, O.; Jong, K. de; Bierkens, M.F.P.

    2013-01-01

    PCRaster is a software framework for building spatio-temporal models of land surface processes (http://www.pcraster.eu). Building blocks of models are spatial operations on raster maps, including a large suite of operations for water and sediment routing. These operations are available to model
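
    PCRaster itself provides map-algebra operators and a Python modelling framework for such models; the sketch below is deliberately not PCRaster code but a plain numpy stand-in that illustrates the general pattern of a spatio-temporal raster model, iterating a naive water-routing operation over time. All names and parameters are illustrative assumptions.

```python
import numpy as np

def route_downhill(water, elevation):
    """Toy routing step: each cell sends its water to its lowest 4-neighbour
    (or keeps it if no neighbour is lower). Real frameworks such as PCRaster
    provide optimized operators for this kind of map algebra."""
    rows, cols = water.shape
    out = np.zeros_like(water)
    for r in range(rows):
        for c in range(cols):
            best_r, best_c, best_z = r, c, elevation[r, c]
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and elevation[rr, cc] < best_z:
                    best_r, best_c, best_z = rr, cc, elevation[rr, cc]
            out[best_r, best_c] += water[r, c]
    return out

rng = np.random.default_rng(1)
elevation = rng.random((20, 20))
water = np.ones((20, 20))          # uniform rainfall at t = 0

for t in range(10):                # the temporal dimension of the model
    water = route_downhill(water, elevation)
    water += 0.1                   # constant rainfall at each time step

print("total water (initial mass plus rainfall):", water.sum())
```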

  9. An accurate and simple large signal model of HEMT

    DEFF Research Database (Denmark)

    Liu, Qing

    1989-01-01

    A large-signal model of discrete HEMTs (high-electron-mobility transistors) has been developed. It is simple and suitable for SPICE simulation of hybrid digital ICs. The model parameters are extracted by using computer programs and data provided by the manufacturer. Based on this model, a hybrid...

  10. Seasonal patterns of mixed species groups in large East African mammals.

    Science.gov (United States)

    Kiffner, Christian; Kioko, John; Leweri, Cecilia; Krause, Stefan

    2014-01-01

    Mixed mammal species groups are common in East African savannah ecosystems. Yet, it is largely unknown if co-occurrences of large mammals result from random processes or social preferences and if interspecific associations are consistent across ecosystems and seasons. Because species may exchange important information and services, understanding patterns and drivers of heterospecific interactions is crucial for advancing animal and community ecology. We recorded 5403 single and multi-species clusters in the Serengeti-Ngorongoro and Tarangire-Manyara ecosystems during dry and wet seasons and used social network analyses to detect patterns of species associations. We found statistically significant associations between multiple species and association patterns differed spatially and seasonally. Consistently, wildebeest and zebras preferred being associated with other species, whereas carnivores, African elephants, Maasai giraffes and Kirk's dik-diks avoided being in mixed groups. During the dry season, we found that the betweenness (a measure of importance in the flow of information or disease) of species did not differ from a random expectation based on species abundance. In contrast, in the wet season, we found that these patterns were not simply explained by variations in abundances, suggesting that heterospecific associations were actively formed. These seasonal differences in observed patterns suggest that interspecific associations may be driven by resource overlap when resources are limited and by resource partitioning or anti-predator advantages when resources are abundant. We discuss potential mechanisms that could drive seasonal variation in the cost-benefit tradeoffs that underpin the formation of mixed-species groups.

  11. Group therapy model for refugee and torture survivors.

    Science.gov (United States)

    Kira, Ibrahim A; Ahmed, Asha; Mahmoud, Vanessa; Wasim, Fatima

    2010-01-01

    The paper discusses the Center for Torture and Trauma Survivors' therapy group model for torture survivors and describes two of its variants: The Bashal group for African and Somali women and the Bhutanese multi-family therapy group. Group therapies in this model extend to community healing. Groups develop their cohesion to graduate to a social community club or initiate a community organization. New graduates from the group join the club and become part of the social advocacy process and of group and individual support and community healing. The BASHAL Somali women's group that developed spontaneously into a socio-political club for African women, and the Bhutanese family group that consciously developed into a Bhutanese community organization are discussed as two variants of this new model of group therapy with torture survivors.

  12. Rapid monitoring of large groups of internally contaminated people following a radiation accident

    International Nuclear Information System (INIS)

    1994-05-01

    In the management of an emergency, it is necessary to assess the radiation exposures of people in the affected areas. An essential component in the programme is the monitoring of internal contamination. Existing fixed installations for the assessment of incorporated radionuclides may be of limited value in these circumstances because they may be inconveniently sited, oversensitive for the purpose, or inadequately equipped and staffed to cope with the large numbers referred to them. The IAEA considered it important to produce guidance on rapid monitoring of large groups of internally contaminated people. The purpose of this document is to provide Member States with an overview on techniques that can be applied during abnormal or accidental situations. Refs and figs

  13. Drivers Advancing Oral Health in a Large Group Dental Practice Organization.

    Science.gov (United States)

    Simmons, Kristen; Gibson, Stephanie; White, Joel M

    2016-06-01

    Three change drivers are being implemented to achieve high standards of patient-centric and evidence-based oral health care within the context of a large multispecialty dental group practice organization, based on the commitment of the dental hygienist chief operating officer and her team. A recent environmental scan elucidated 6 change drivers that can impact the provision of oral health care. Practitioners who can embrace and maximize aspects of these change drivers will move dentistry forward and create future opportunities. This article explains how 3 of these change drivers are being applied in a privately held, accountable risk-bearing entity that provides individualized treatment programs for more than 417,000 members. To facilitate integration of the conceptual changes related to the drivers, a multi-institutional, multidisciplinary, highly functioning collaborative work group was formed. The document Dental Hygiene at a Crossroads for Change(1) inspired the first author, a dental hygienist in a unique position as chief operating officer of a large group practice, to pursue evidence-based organizational change and to impact the quality of patient care. This was accomplished by implementing technological advances including dental diagnosis terminology in the electronic health record, clinical decision support, standardized treatment guidelines, quality metrics, and patient engagement to improve oral health outcomes at the patient and population levels. The systems and processes used to implement 3 change drivers in a large multi-practice dental setting are presented to inform and inspire others to implement change drivers with the potential for advancing oral health. Technology implementing best practices and improving patient engagement are excellent drivers to advance oral health and are an effective use of oral health care dollars. Improved oral health can be leveraged through technological advances to improve clinical practice.

  14. Evaluation of receptivity of the medical students in a lecture of a large group

    Directory of Open Access Journals (Sweden)

    Vidyarthi SurendraK, Nayak RoopaP, GuptaSandeep K

    2014-04-01

    Full Text Available Background: Lecturing is a widely used teaching method in higher education. Instructors of large classes may have no option but to deliver lectures to convey information to large groups of students. Aims and Objectives: The present study aimed to evaluate the effectiveness/receptivity of interactive lecturing in a large group of second-year MBBS students. Material and Methods: The study was conducted in the well-equipped lecture theater of Dhanalakshmi Srinivasan Medical College and Hospital (DSMCH), Tamil Nadu. A fully prepared interactive lecture on a specific topic was delivered to second-year MBBS students using a PowerPoint presentation. Before starting the lecture, the instructor distributed a 10-item multiple choice questionnaire to be attempted within 10 minutes. After 30 minutes of lecture delivery, the instructor again distributed the same 10-item multiple choice questionnaire, to be attempted within 10 minutes. The topic was never disclosed to the students before the lecture. Statistics: We analyzed each student's pre-lecture and post-lecture answers by applying the paired t-test, using the www.openepi.com version 3.01 online/offline software and Microsoft Excel (Windows 2010). Results: Among the 111 students (31 male, 80 female) of average age 18.58 years, the baseline (pre-lecture) receptivity mean % was 30.99 ± 14.64, and the post-lecture receptivity mean % increased to 53.51 ± 19.52. Only 12 of the 111 students had post-lecture receptivity values (mean % 25.8 ± 10.84) lower than their baseline values (mean % 45 ± 9.05), i.e., their receptivity decreased. Conclusion: In an interactive lecture session with a PowerPoint presentation, students can learn even in large-class environments, but the session should be active-learner centered.
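
    The pre/post comparison described above is a textbook use of the paired t-test. A minimal sketch using scipy is shown below; the scores are synthetic numbers loosely matched to the reported means and are not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical pre- and post-lecture scores (out of 10) for 111 students,
# loosely matching the reported means (~31% pre, ~54% post).
pre = np.clip(rng.normal(3.1, 1.5, size=111), 0, 10)
post = np.clip(pre + rng.normal(2.3, 1.5, size=111), 0, 10)

# Paired t-test: each student serves as their own control.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"mean pre = {pre.mean():.2f}, mean post = {post.mean():.2f}")
print(f"paired t = {t_stat:.2f}, p = {p_value:.3g}")
```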

  15. Group-based modeling of ecological trajectories in restored wetlands.

    Science.gov (United States)

    Matthews, Jeffrey W

    2015-03-01

    Repeated measures taken at the same restoration sites over time are used to describe restoration trajectories and identify sites that are trending toward unexpected outcomes. Analogously, social scientists use repeated measures of individuals to describe developmental trajectories of behaviors or other outcomes. Group-based trajectory modeling (GBTM) is one statistical method used in behavioral and health sciences for this purpose. I introduce the use of GBTM to identify clusters of similar restoration trajectories within a sample of sites. Data collected at 54 restored wetlands in Illinois for up to 15 years post-restoration were used to describe trajectories of six indicators: plant species richness, number of Carex (sedge) species, mean coefficient of conservatism (mean C), native plant cover, perennial plant cover, and planted species cover. For each indicator, I used GBTM to classify wetlands into three to four groups with distinct trajectories. In general, cover by native and planted species declined, while species richness and mean C increased over time or peaked then declined. Site context and management may explain trajectory group membership. Specifically, wetlands restored more recently and those restored within forested contexts were more likely to follow increasing trajectories. I show GBTM to be useful for identifying typical restoration trajectory patterns, developing hypotheses regarding factors driving those patterns and pinpointing critical times for intervention. Furthermore, GBTM might be applied more broadly in ecological research to identify common patterns of community assembly in large numbers of plots or sites.
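
    GBTM proper fits a finite mixture of polynomial trajectories by maximum likelihood (commonly via PROC TRAJ in SAS or R packages such as lcmm). As a rough Python stand-in for the idea, the sketch below summarizes each site's time series by quadratic coefficients and clusters those coefficients with a Gaussian mixture; the data are synthetic and the approach is an approximation of, not a substitute for, GBTM.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic monitoring data: 54 restored wetlands observed over 15 years.
n_sites, years = 54, np.arange(15)
true_group = rng.integers(0, 3, size=n_sites)
shapes = {0: lambda t: 10 + 1.2 * t,                # steadily increasing richness
          1: lambda t: 10 + 2.0 * t - 0.12 * t**2,  # peak then decline
          2: lambda t: 12 - 0.3 * t}                # slow decline
data = np.array([shapes[int(g)](years) + rng.normal(0, 1.5, years.size)
                 for g in true_group])

# Summarize each site's trajectory by quadratic polynomial coefficients,
# then cluster those coefficients into trajectory groups.
coefs = np.array([np.polyfit(years, y, deg=2) for y in data])
gmm = GaussianMixture(n_components=3, random_state=0).fit(coefs)
groups = gmm.predict(coefs)

for g in range(3):
    members = data[groups == g]
    print(f"group {g}: n = {len(members)}, "
          f"year-0 mean = {members[:, 0].mean():.1f}, "
          f"year-14 mean = {members[:, -1].mean():.1f}")
```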

  16. Regularization modeling for large-eddy simulation of diffusion flames

    NARCIS (Netherlands)

    Geurts, Bernardus J.; Wesseling, P.; Oñate, E.; Périaux, J.

    We analyze the evolution of a diffusion flame in a turbulent mixing layer using large-eddy simulation. The large-eddy simulation includes Leray regularization of the convective transport and approximate inverse filtering to represent the chemical source terms. The Leray model is compared to the more

  17. Advances in Modelling of Large Scale Coastal Evolution

    NARCIS (Netherlands)

    Stive, M.J.F.; De Vriend, H.J.

    1995-01-01

    The attention for climate change impact on the world's coastlines has established large scale coastal evolution as a topic of wide interest. Some more recent advances in this field, focusing on the potential of mathematical models for the prediction of large scale coastal evolution, are discussed.

  18. Sexuality and the Elderly: A Group Counseling Model.

    Science.gov (United States)

    Capuzzi, Dave; Gossman, Larry

    1982-01-01

    Describes a 10-session group counseling model to facilitate awareness of sexuality and the legitimacy of its expression for older adults. Considers member selection, session length and setting, and group leadership. (Author/MCF)

  19. Nuclear spectroscopy in large shell model spaces: recent advances

    International Nuclear Information System (INIS)

    Kota, V.K.B.

    1995-01-01

    Three different approaches are now available for carrying out nuclear spectroscopy studies in large shell model spaces and they are: (i) the conventional shell model diagonalization approach but taking into account new advances in computer technology; (ii) the recently introduced Monte Carlo method for the shell model; (iii) the spectral averaging theory, based on central limit theorems, in indefinitely large shell model spaces. The various principles, recent applications and possibilities of these three methods are described and the similarity between the Monte Carlo method and the spectral averaging theory is emphasized. (author). 28 refs., 1 fig., 5 tabs

  20. Social management of laboratory rhesus macaques housed in large groups using a network approach: A review.

    Science.gov (United States)

    McCowan, Brenda; Beisner, Brianne; Hannibal, Darcy

    2017-12-07

    Biomedical facilities across the nation and worldwide aim to develop cost-effective methods for the reproductive management of macaque breeding groups, typically by housing macaques in large, multi-male multi-female social groups that provide monkey subjects for research as well as appropriate socialization for their psychological well-being. One of the most difficult problems in managing socially housed macaques is their propensity for deleterious aggression. From a management perspective, deleterious aggression (as opposed to less intense aggression that serves to regulate social relationships) is undoubtedly the most problematic behavior observed in group-housed macaques, which can readily escalate to the degree that it causes social instability, increases serious physical trauma leading to group dissolution, and reduces psychological well-being. Thus for both welfare and other management reasons, aggression among rhesus macaques at primate centers and facilities needs to be addressed with a more proactive approach. Management strategies need to be instituted that maximize social housing while also reducing problematic social aggression due to instability, using efficacious methods for detection and prevention in the most cost-effective manner. Herein we review a new proactive approach using social network analysis to assess and predict deleterious aggression in macaque groups. We discovered three major pathways leading to instability, such as unusually high rates and severity of trauma and social relocations. These pathways are linked either directly or indirectly to network structure in rhesus macaque societies. We define these pathways according to the key intrinsic and extrinsic variables (e.g., demographic, genetic or social factors) that influence network and behavioral measures of stability (see Fig. 1). They are: (1) presence of natal males, (2) matrilineal genetic fragmentation, and (3) the power structure and conflict policing behavior supported by this

  1. Efficacy of formative evaluation using a focus group for a large classroom setting in an accelerated pharmacy program.

    Science.gov (United States)

    Nolette, Shaun; Nguyen, Alyssa; Kogan, David; Oswald, Catherine; Whittaker, Alana; Chakraborty, Arup

    2017-07-01

    Formative evaluation is a process utilized to improve communication between students and faculty. This evaluation method allows the ability to address pertinent issues in a timely manner; however, implementation of formative evaluation can be a challenge, especially in a large classroom setting. Using mediated formative evaluation, the purpose of this study is to determine if a student based focus group is a viable option to improve efficacy of communication between an instructor and students as well as time management in a large classroom setting. Out of 140 total students, six students were selected to form a focus group - one from each of six total sections of the classroom. Each focus group representative was responsible for collecting all the questions from students of their corresponding sections and submitting them to the instructor two to three times a day. Responses from the instructor were either passed back to pertinent students by the focus group representatives or addressed directly with students by the instructor. This study was conducted using a fifteen-question survey after the focus group model was utilized for one month. A printed copy of the survey was distributed in the class by student investigators. Questions were of varying types, including Likert scale, yes/no, and open-ended response. One hundred forty surveys were administered, and 90 complete responses were collected. Surveys showed that 93.3% of students found that use of the focus group made them more likely to ask questions for understanding. The surveys also showed 95.5% of students found utilizing the focus group for questions allowed for better understanding of difficult concepts. General open-ended answer portions of the survey showed that most students found the focus group allowed them to ask questions more easily since they did not feel intimidated by asking in front of the whole class. No correlation was found between demographic characteristics and survey responses. This may

  2. Engaging the public with low-carbon energy technologies: Results from a Scottish large group process

    International Nuclear Information System (INIS)

    Howell, Rhys; Shackley, Simon; Mabon, Leslie; Ashworth, Peta; Jeanneret, Talia

    2014-01-01

    This paper presents the results of a large group process conducted in Edinburgh, Scotland investigating public perceptions of climate change and low-carbon energy technologies, specifically carbon dioxide capture and storage (CCS). The quantitative and qualitative results reported show that the participants were broadly supportive of efforts to reduce carbon dioxide emissions, and that there is an expressed preference for renewable energy technologies to be employed to achieve this. CCS was considered in detail during the research due to its climate mitigation potential; results show that the workshop participants were cautious about its deployment. The paper discusses a number of interrelated factors which appear to influence perceptions of CCS; factors such as the perceived costs and benefits of the technology, and people's personal values and trust in others all impacted upon participants’ attitudes towards the technology. The paper thus argues for the need to provide the public with broad-based, balanced and trustworthy information when discussing CCS, and to take seriously the full range of factors that influence public perceptions of low-carbon technologies. - Highlights: • We report the results of a Scottish large group workshop on energy technologies. • There is strong public support for renewable energy and mixed opinions towards CCS. • The workshop was successful in initiating discussion around climate change and energy technologies. • Issues of trust, uncertainty, costs, benefits, values and emotions all inform public perceptions. • Need to take seriously the full range of factors that inform perceptions

  3. Frequency and phase synchronization in large groups: Low dimensional description of synchronized clapping, firefly flashing, and cricket chirping

    Science.gov (United States)

    Ott, Edward; Antonsen, Thomas M.

    2017-05-01

    A common observation is that large groups of oscillatory biological units often have the ability to synchronize. A paradigmatic model of such behavior is provided by the Kuramoto model, which achieves synchronization through coupling of the phase dynamics of individual oscillators, while each oscillator maintains a different constant inherent natural frequency. Here we consider the biologically likely possibility that the oscillatory units may be capable of enhancing their synchronization ability by adaptive frequency dynamics. We propose a simple augmentation of the Kuramoto model which does this. We also show that, by the use of a previously developed technique [Ott and Antonsen, Chaos 18, 037113 (2008)], it is possible to reduce the resulting dynamics to a lower dimensional system for the macroscopic evolution of the oscillator ensemble. By employing this reduction, we investigate the dynamics of our system, finding a characteristic hysteretic behavior and enhancement of the quality of the achieved synchronization.
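
    A minimal simulation of the mean-field Kuramoto model is sketched below, with an assumed adaptation rule (each natural frequency relaxes slowly toward the ensemble-average phase velocity) standing in for the paper's augmentation, which is not reproduced here; the Ott-Antonsen reduction is likewise omitted.

```python
import numpy as np

rng = np.random.default_rng(3)

N, K, dt, steps = 500, 1.5, 0.01, 5000
eps = 0.05                           # adaptation rate (0 recovers the standard model)

theta = rng.uniform(0, 2 * np.pi, N)           # oscillator phases
omega = rng.standard_cauchy(N)                 # heterogeneous natural frequencies

for _ in range(steps):
    z = np.mean(np.exp(1j * theta))            # Kuramoto order parameter r*exp(i*psi)
    r, psi = np.abs(z), np.angle(z)
    dtheta = omega + K * r * np.sin(psi - theta)   # mean-field form of the coupling
    theta = (theta + dt * dtheta) % (2 * np.pi)
    # Assumed adaptation: each natural frequency relaxes toward the ensemble mean
    # phase velocity, enhancing the population's ability to synchronize.
    omega += dt * eps * (dtheta.mean() - omega)

print(f"final order parameter r = {np.abs(np.mean(np.exp(1j * theta))):.3f}")
```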

  4. WORK GROUP DEVELOPMENT MODELS – THE EVOLUTION FROM SIMPLE GROUP TO EFFECTIVE TEAM

    Directory of Open Access Journals (Sweden)

    Raluca ZOLTAN

    2016-02-01

    Full Text Available Currently, work teams are increasingly studied by virtue of the advantages they have compared to work groups. However, a true team does not appear overnight; it must complete several steps to move beyond the initial stage of its existence as a group. The question that arises is at what point a simple group turns into an effective team. Even though the development of a group into a team is not a linear process, the models found in the literature provide a rich framework for analyzing and identifying the features a group acquires over time until it becomes a team in the true sense of the word. Thus, in this article we propose an analysis of the main models of group development in order to point out, even in a relative manner, the stage at which a simple work group becomes an effective work team.

  5. Seasonal patterns of mixed species groups in large East African mammals.

    Directory of Open Access Journals (Sweden)

    Christian Kiffner

    Full Text Available Mixed mammal species groups are common in East African savannah ecosystems. Yet, it is largely unknown if co-occurrences of large mammals result from random processes or social preferences and if interspecific associations are consistent across ecosystems and seasons. Because species may exchange important information and services, understanding patterns and drivers of heterospecific interactions is crucial for advancing animal and community ecology. We recorded 5403 single and multi-species clusters in the Serengeti-Ngorongoro and Tarangire-Manyara ecosystems during dry and wet seasons and used social network analyses to detect patterns of species associations. We found statistically significant associations between multiple species and association patterns differed spatially and seasonally. Consistently, wildebeest and zebras preferred being associated with other species, whereas carnivores, African elephants, Maasai giraffes and Kirk's dik-diks avoided being in mixed groups. During the dry season, we found that the betweenness (a measure of importance in the flow of information or disease) of species did not differ from a random expectation based on species abundance. In contrast, in the wet season, we found that these patterns were not simply explained by variations in abundances, suggesting that heterospecific associations were actively formed. These seasonal differences in observed patterns suggest that interspecific associations may be driven by resource overlap when resources are limited and by resource partitioning or anti-predator advantages when resources are abundant. We discuss potential mechanisms that could drive seasonal variation in the cost-benefit tradeoffs that underpin the formation of mixed-species groups.

  6. Modelling and measurements of wakes in large wind farms

    DEFF Research Database (Denmark)

    Barthelmie, Rebecca Jane; Rathmann, Ole; Frandsen, Sten Tronæs

    2007-01-01

    The paper presents research conducted in the Flow workpackage of the EU funded UPWIND project which focuses on improving models of flow within and downwind of large wind farms in complex terrain and offshore. The main activity is modelling the behaviour of wind turbine wakes in order to improve p...

  7. Modeling and Forecasting Large Realized Covariance Matrices and Portfolio Choice

    NARCIS (Netherlands)

    Callot, Laurent A.F.; Kock, Anders B.; Medeiros, Marcelo C.

    2017-01-01

    We consider modeling and forecasting large realized covariance matrices by penalized vector autoregressive models. We consider Lasso-type estimators to reduce the dimensionality and provide strong theoretical guarantees on the forecast capability of our procedure. We show that we can forecast
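
    The described approach can be illustrated by half-vectorizing each day's realized covariance matrix and fitting one Lasso regression per element of a first-order VAR. The sketch below uses synthetic data and assumed settings (lag order 1, penalty strength, no matrix transformation), so it only mirrors the general recipe, not the authors' estimator or theory.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(5)

# Synthetic daily realized covariance matrices for k assets over T days.
k, T = 5, 500
A = rng.normal(size=(k, k))
true_cov = A @ A.T / k
rcov = np.array([np.cov(rng.multivariate_normal(np.zeros(k), true_cov, 100).T)
                 for _ in range(T)])

# Half-vectorize each matrix (vech) so a day becomes a k*(k+1)/2 vector.
idx = np.tril_indices(k)
y = np.array([m[idx] for m in rcov])          # shape (T, k*(k+1)/2)

# Penalized VAR(1): predict today's vech from yesterday's, one Lasso per equation.
X, Y = y[:-1], y[1:]
models = [Lasso(alpha=1e-3, max_iter=10000).fit(X, Y[:, j]) for j in range(Y.shape[1])]

forecast_vech = np.array([m.predict(y[-1:])[0] for m in models])   # one-step-ahead forecast
nonzero = sum(int((np.abs(m.coef_) > 1e-8).sum()) for m in models)
print("forecast length:", forecast_vech.size, "| nonzero VAR coefficients:", nonzero)
```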

  8. A Grouping Particle Swarm Optimizer with Personal-Best-Position Guidance for Large Scale Optimization.

    Science.gov (United States)

    Guo, Weian; Si, Chengyong; Xue, Yu; Mao, Yanfen; Wang, Lei; Wu, Qidi

    2017-05-04

    Particle Swarm Optimization (PSO) is a popular algorithm which is widely investigated and well implemented in many areas. However, the canonical PSO does not perform well in maintaining population diversity, which usually leads to premature convergence or local optima. To address this issue, we propose a variant of PSO named Grouping PSO with Personal-Best-Position (Pbest) Guidance (GPSO-PG), which maintains population diversity by preserving the diversity of exemplars. On one hand, we adopt a uniform random allocation strategy to assign particles into different groups, and in each group the losers learn from the winner. On the other hand, we employ the personal historical best position of each particle in social learning rather than the current global best particle. In this way, the diversity of exemplars increases and the effect of the global best particle is eliminated. We test the proposed algorithm on the CEC 2008 and CEC 2010 benchmarks, which concern large scale optimization problems (LSOPs). Compared with several current peer algorithms, GPSO-PG exhibits competitive performance in maintaining population diversity and achieves satisfactory performance on these problems.
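
    A simplified sketch of the grouping-plus-pbest-guidance idea is given below: particles are randomly regrouped each iteration, losers in each group move toward their group winner, and the social term pulls toward a randomly chosen particle's personal best instead of the global best. The velocity update coefficients are assumptions, not the paper's exact equations.

```python
import numpy as np

rng = np.random.default_rng(7)

def sphere(x):                       # toy objective (minimize)
    return np.sum(x**2, axis=-1)

n, dim, groups, iters = 60, 30, 6, 300
phi = 0.1                            # pull toward a peer's personal best (assumed)

pos = rng.uniform(-5, 5, (n, dim))
vel = np.zeros((n, dim))
pbest = pos.copy()
pbest_val = sphere(pbest)

for _ in range(iters):
    order = rng.permutation(n).reshape(groups, -1)   # uniform random grouping
    fit = sphere(pos)
    for g in order:
        winner = g[np.argmin(fit[g])]
        for i in g:
            if i == winner:
                continue             # the group winner keeps its position this round
            r1, r2, r3 = rng.random((3, dim))
            peer = rng.integers(n)   # social learning uses a peer's personal best,
                                     # not the current global best particle
            vel[i] = (r1 * vel[i]
                      + r2 * (pos[winner] - pos[i])
                      + phi * r3 * (pbest[peer] - pos[i]))
            pos[i] = pos[i] + vel[i]
    # update personal bests
    fit = sphere(pos)
    improved = fit < pbest_val
    pbest[improved] = pos[improved]
    pbest_val[improved] = fit[improved]

print(f"best objective found: {pbest_val.min():.4f}")
```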

  9. A cellular automation model accounting for bicycle's group behavior

    Science.gov (United States)

    Tang, Tie-Qiao; Rui, Ying-Xu; Zhang, Jian; Shang, Hua-Yan

    2018-02-01

    Recently, the bicycle has once again become an important traffic tool in China. Due to the merits of the bicycle, group behavior widely exists in urban traffic systems. However, little effort has been made to explore the impacts of group behavior on bicycle flow. In this paper, we propose a CA (cellular automaton) model with group behavior to explore the complex traffic phenomena caused by shoulder group behavior and following group behavior on an open road. The numerical results illustrate that the proposed model can qualitatively describe the impacts of the two kinds of group behavior on bicycle flow and that the effects are related to the mode and size of the group behavior. The results can help us to better understand the impacts of bicycles' group behaviors on urban traffic systems and effectively control bicycle group behavior.
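
    The abstract does not spell out the CA rules, so the sketch below uses a standard Nagel-Schreckenberg single-lane automaton on a periodic lattice with one crude "following group" tweak (group members copy their leader's random slowdown, with membership assigned by index). It is purely illustrative and not the proposed model.

```python
import numpy as np

rng = np.random.default_rng(11)

L, n_bikes, v_max, p_slow, steps = 200, 60, 3, 0.2, 300
group_size = 3                                   # assumed size of a "following group"
n_groups = n_bikes // group_size
group_id = np.arange(n_bikes) // group_size      # membership assigned by index for simplicity

positions = np.sort(rng.choice(L, n_bikes, replace=False))
speeds = np.zeros(n_bikes, dtype=int)

for _ in range(steps):
    order = np.argsort(-positions)               # bikes from front to back
    gaps = np.empty(n_bikes, dtype=int)
    for k, i in enumerate(order):
        ahead = positions[order[k - 1]] if k > 0 else positions[order[-1]] + L
        gaps[i] = (ahead - positions[i] - 1) % L
    # Nagel-Schreckenberg-style rules: accelerate, brake to gap, randomize, move.
    speeds = np.minimum(speeds + 1, v_max)
    speeds = np.minimum(speeds, gaps)
    # "Following group" tweak: all members copy their group leader's random slowdown.
    leader_slows = rng.random(n_groups) < p_slow
    speeds = np.where(leader_slows[group_id], np.maximum(speeds - 1, 0), speeds)
    positions = (positions + speeds) % L         # periodic lattice for simplicity

print(f"mean speed after {steps} steps: {speeds.mean():.2f} cells/step")
```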

  10. The LGBTQ Responsive Model for Supervision of Group Work

    Science.gov (United States)

    Goodrich, Kristopher M.; Luke, Melissa

    2011-01-01

    Although supervision of group work has been linked to the development of multicultural and social justice competencies, there are no models for supervision of group work specifically designed to address the needs of lesbian, gay, bisexual, transgender, and questioning (LGBTQ) persons. This manuscript presents the LGBTQ Responsive Model for…

  11. Finite Mixture Multilevel Multidimensional Ordinal IRT Models for Large Scale Cross-Cultural Research

    Science.gov (United States)

    de Jong, Martijn G.; Steenkamp, Jan-Benedict E. M.

    2010-01-01

    We present a class of finite mixture multilevel multidimensional ordinal IRT models for large scale cross-cultural research. Our model is proposed for confirmatory research settings. Our prior for item parameters is a mixture distribution to accommodate situations where different groups of countries have different measurement operations, while…

  12. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

    can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate...... the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach we examined how uncertainty in demand and variable costs affects the optimal choice...

  13. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper

    2003-01-01

    Experimental modelling is an important tool for the study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models, and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied for improvement of the reliability of physical model results. This paper demonstrates by examples that numerical modelling benefits in various ways from experimental studies (in large and small laboratory facilities). The examples range from very general hydrodynamic descriptions of wave phenomena to specific hydrodynamic interaction with structures. The examples also show that numerical model development benefits from international co-operation and sharing of high quality results.

  14. Active Exploration of Large 3D Model Repositories.

    Science.gov (United States)

    Gao, Lin; Cao, Yan-Pei; Lai, Yu-Kun; Huang, Hao-Zhi; Kobbelt, Leif; Hu, Shi-Min

    2015-12-01

    With broader availability of large-scale 3D model repositories, the need for efficient and effective exploration becomes more and more urgent. Existing model retrieval techniques do not scale well with the size of the database since often a large number of very similar objects are returned for a query, and the possibilities to refine the search are quite limited. We propose an interactive approach where the user feeds an active learning procedure by labeling either entire models or parts of them as "like" or "dislike" such that the system can automatically update an active set of recommended models. To provide an intuitive user interface, candidate models are presented based on their estimated relevance for the current query. From the methodological point of view, our main contribution is to exploit not only the similarity between a query and the database models but also the similarities among the database models themselves. We achieve this by an offline pre-processing stage, where global and local shape descriptors are computed for each model and a sparse distance metric is derived that can be evaluated efficiently even for very large databases. We demonstrate the effectiveness of our method by interactively exploring a repository containing over 100 K models.

  15. Modeling and Simulation Techniques for Large-Scale Communications Modeling

    National Research Council Canada - National Science Library

    Webb, Steve

    1997-01-01

    Tests of random number generators were also developed and applied to CECOM models. It was found that synchronization of random number strings in simulations is easy to implement and can provide significant savings for making comparative studies. If synchronization is in place, then statistical experiment design can be used to provide information on the sensitivity of the output to input parameters. The report concludes with recommendations and an implementation plan.

  16. Renormalization-group flow of the effective action of cosmological large-scale structures

    CERN Document Server

    Floerchinger, Stefan

    2017-01-01

    Following an approach of Matarrese and Pietroni, we derive the functional renormalization group (RG) flow of the effective action of cosmological large-scale structures. Perturbative solutions of this RG flow equation are shown to be consistent with standard cosmological perturbation theory. Non-perturbative approximate solutions can be obtained by truncating the a priori infinite set of possible effective actions to a finite subspace. Using for the truncated effective action a form dictated by dissipative fluid dynamics, we derive RG flow equations for the scale dependence of the effective viscosity and sound velocity of non-interacting dark matter, and we solve them numerically. Physically, the effective viscosity and sound velocity account for the interactions of long-wavelength fluctuations with the spectrum of smaller-scale perturbations. We find that the RG flow exhibits an attractor behaviour in the IR that significantly reduces the dependence of the effective viscosity and sound velocity on the input ...

  17. Group Membership, Group Change, and Intergroup Attitudes: A Recategorization Model Based on Cognitive Consistency Principles

    Directory of Open Access Journals (Sweden)

    Jenny Roth

    2018-04-01

    Full Text Available The present article introduces a model based on cognitive consistency principles to predict how new identities become integrated into the self-concept, with consequences for intergroup attitudes. The model specifies four concepts (self-concept, stereotypes, identification, and group compatibility) as associative connections. The model builds on two cognitive principles, balance-congruity and imbalance-dissonance, to predict identification with social groups that people currently belong to, belonged to in the past, or newly belong to. More precisely, the model suggests that the relative strength of self-group associations (i.e., identification) depends in part on the (in)compatibility of the different social groups. Combining insights into cognitive representation of knowledge, intergroup bias, and explicit/implicit attitude change, we further derive predictions for intergroup attitudes. We suggest that intergroup attitudes alter depending on the relative associative strength between the social groups and the self, which in turn is determined by the (in)compatibility between social groups. This model unifies existing models on the integration of social identities into the self-concept by suggesting that basic cognitive mechanisms play an important role in facilitating or hindering identity integration and thus contribute to reducing or increasing intergroup bias.

  18. Group Membership, Group Change, and Intergroup Attitudes: A Recategorization Model Based on Cognitive Consistency Principles.

    Science.gov (United States)

    Roth, Jenny; Steffens, Melanie C; Vignoles, Vivian L

    2018-01-01

    The present article introduces a model based on cognitive consistency principles to predict how new identities become integrated into the self-concept, with consequences for intergroup attitudes. The model specifies four concepts (self-concept, stereotypes, identification, and group compatibility) as associative connections. The model builds on two cognitive principles, balance-congruity and imbalance-dissonance, to predict identification with social groups that people currently belong to, belonged to in the past, or newly belong to. More precisely, the model suggests that the relative strength of self-group associations (i.e., identification) depends in part on the (in)compatibility of the different social groups. Combining insights into cognitive representation of knowledge, intergroup bias, and explicit/implicit attitude change, we further derive predictions for intergroup attitudes. We suggest that intergroup attitudes alter depending on the relative associative strength between the social groups and the self, which in turn is determined by the (in)compatibility between social groups. This model unifies existing models on the integration of social identities into the self-concept by suggesting that basic cognitive mechanisms play an important role in facilitating or hindering identity integration and thus contribute to reducing or increasing intergroup bias.

  19. Dynamics of group knowledge production in facilitated modelling workshops

    DEFF Research Database (Denmark)

    Tavella, Elena; Franco, L. Alberto

    2015-01-01

    The term 'facilitated modelling' is used in the literature to characterise an approach to structuring problems, developing options and evaluating decisions by groups working in a model-supported workshop environment, and assisted by a facilitator. The approach involves an interactive process by which models are jointly developed with group members interacting face-to-face, with or without computer support. The models produced are used to inform negotiations about the nature of the issues faced by the group, and how to address them. While the facilitated modelling literature is impressive… …, the workshop. Drawing on the knowledge-perspective of group communication, we conducted a micro-level analysis of a transcript of a facilitated modelling workshop held with the management team of an Alternative Food Network in the UK. Our analysis suggests that facilitated modelling interactions can take…

  20. Estimation and Inference for Very Large Linear Mixed Effects Models

    OpenAIRE

    Gao, K.; Owen, A. B.

    2016-01-01

    Linear mixed models with large imbalanced crossed random effects structures pose severe computational problems for maximum likelihood estimation and for Bayesian analysis. The costs can grow as fast as $N^{3/2}$ when there are N observations. Such problems arise in any setting where the underlying factors satisfy a many to many relationship (instead of a nested one) and in electronic commerce applications, the N can be quite large. Methods that do not account for the correlation structure can...
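
    The following Python sketch only illustrates the data structure the abstract refers to, an imbalanced crossed (many-to-many) random effects design such as customers rating products; the variable names are illustrative and this is not the estimation method proposed by the authors.

        import numpy as np

        rng = np.random.default_rng(0)
        n_customers, n_products, n_obs = 200, 150, 5000

        # Random effects for two crossed factors (many-to-many, not nested).
        u_customer = rng.normal(0.0, 0.5, n_customers)
        u_product = rng.normal(0.0, 0.3, n_products)

        cust = rng.integers(0, n_customers, n_obs)   # which customer produced each observation
        prod = rng.integers(0, n_products, n_obs)    # which product it concerns
        y = 3.0 + u_customer[cust] + u_product[prod] + rng.normal(0.0, 1.0, n_obs)

        # Any two rows sharing a customer or a product are correlated; methods that
        # ignore this crossed structure treat the n_obs rows as independent, which is
        # exactly the shortcut the abstract warns against for large N.
        print(y[:5])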

  1. Modelling and transient stability of large wind farms

    DEFF Research Database (Denmark)

    Akhmatov, Vladislav; Knudsen, Hans; Nielsen, Arne Hejde

    2003-01-01

    The paper is dealing with modelling and short-term voltage stability considerations of large wind farms. A physical model of a large offshore wind farm consisting of a large number of windmills is implemented in the dynamic simulation tool PSS/E. Each windmill in the wind farm is represented by a physical model of grid-connected windmills. The windmill generators are conventional induction generators and the wind farm is ac-connected to the power system. Improvements of short-term voltage stability in case of failure events in the external power system are treated with use of conventional generator technology. This subject is treated as a parameter study with respect to the windmill electrical and mechanical parameters and with use of control strategies within the conventional generator technology. Stability improvements on the wind farm side of the connection point lead to significant reduction…

  2. Large group influence for decreased drug use: findings from two contemporary religious sects.

    Science.gov (United States)

    Galanter, M; Buckley, P; Deutsch, A; Rabkin, R; Rabkin, J

    1980-01-01

    This paper reports on studies designed to clarify the role of large cohesive groups in effecting diminished drug use among their members. Subjects were drawn from two contemporary religious sects and data were obtained by administering self-report questionnaires under controlled conditions, in cooperation with the sects' leadership. Data which bear directly on changes in drug use are reported here. Members of the Divine Light Mission (DLM), many of whom had been involved in the "counterculture" of the early 1970s, reported incidence of drug use prior to joining which was much above that of a nonmember comparison group. Reported levels were considerably lower after joining, and the decline was maintained over an average membership of 2 years. Unification Church (UC) members showed a similar pattern but their drug use began at a somewhat lower level and declined further still; this reflects a stricter stance toward illicit intoxicants in the UC, and relatively less openness to transcendental altered consciousness, which is an integral part of DLM meditation. Data from persons registered for UC recruitment workshops corroborated retrospective reports of the long-standing members. Changes in the consumption of tranquilizers were also considered. Data on caffeine consumption reflected less strict commitment to controls over this agent. The decline in drug use was considered in relation to feelings of social cohesiveness toward fellow group members, which was a significant predictor of change in drug use in multiple regression analysis. The findings are examined in relation to the interplay between behavioral norms in a close-knit subculture and the role of its beliefs and values in determining levels of drug use.

  3. How the group affects the mind : A cognitive model of idea generation in groups

    NARCIS (Netherlands)

    Nijstad, Bernard A.; Stroebe, Wolfgang

    2006-01-01

    A model called search for ideas in associative memory (SIAM) is proposed to account for various research findings in the area of group idea generation. The model assumes that idea generation is a repeated search for ideas in associative memory, which proceeds in 2 stages (knowledge activation and

  4. Disinformative data in large-scale hydrological modelling

    Directory of Open Access Journals (Sweden)

    A. Kauffeldt

    2013-07-01

    Full Text Available Large-scale hydrological modelling has become an important tool for the study of global and regional water resources, climate impacts, and water-resources management. However, modelling efforts over large spatial domains are fraught with problems of data scarcity, uncertainties and inconsistencies between model forcing and evaluation data. Model-independent methods to screen and analyse data for such problems are needed. This study aimed at identifying data inconsistencies in global datasets using a pre-modelling analysis, inconsistencies that can be disinformative for subsequent modelling. The consistency between (i) basin areas for different hydrographic datasets, and (ii) between climate data (precipitation and potential evaporation) and discharge data, was examined in terms of how well basin areas were represented in the flow networks and the possibility of water-balance closure. It was found that (i) most basins could be well represented in both gridded basin delineations and polygon-based ones, but some basins exhibited large area discrepancies between flow-network datasets and archived basin areas, (ii) basins exhibiting too-high runoff coefficients were abundant in areas where precipitation data were likely affected by snow undercatch, and (iii) the occurrence of basins exhibiting losses exceeding the potential-evaporation limit was strongly dependent on the potential-evaporation data, both in terms of numbers and geographical distribution. Some inconsistencies may be resolved by considering sub-grid variability in climate data, surface-dependent potential-evaporation estimates, etc., but further studies are needed to determine the reasons for the inconsistencies found. Our results emphasise the need for pre-modelling data analysis to identify dataset inconsistencies as an important first step in any large-scale study. Applying data-screening methods before modelling should also increase our chances to draw robust conclusions from subsequent…
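
    A small Python/pandas sketch of the kind of pre-modelling screening described above; the column names, numbers and threshold rules are illustrative, not the study's exact criteria.

        import pandas as pd

        # Hypothetical per-basin annual averages (mm/yr); values are made up for illustration.
        basins = pd.DataFrame({
            "basin_id":       [1, 2, 3, 4],
            "precipitation":  [800.0, 600.0, 1200.0, 400.0],
            "discharge":      [300.0, 650.0, 500.0, 100.0],
            "potential_evap": [700.0, 550.0, 900.0, 250.0],
        })

        basins["runoff_coeff"] = basins["discharge"] / basins["precipitation"]
        basins["losses"] = basins["precipitation"] - basins["discharge"]

        # Screening rules in the spirit of the paper: a runoff coefficient above 1 suggests
        # precipitation undercatch; losses above potential evaporation cannot close the
        # water balance. Either flag marks the basin as disinformative for modelling.
        basins["suspect"] = (basins["runoff_coeff"] > 1.0) | (basins["losses"] > basins["potential_evap"])
        print(basins[["basin_id", "runoff_coeff", "suspect"]])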

  5. Coupled SWAT-MODFLOW Model Development for Large Basins

    Science.gov (United States)

    Aliyari, F.; Bailey, R. T.; Tasdighi, A.

    2017-12-01

    Water management in semi-arid river basins requires allocating water resources between urban, industrial, energy, and agricultural sectors, with the latter competing for necessary irrigation water to sustain crop yield. Competition between these sectors will intensify due to changes in climate and population growth. In this study, the recently developed SWAT-MODFLOW coupled hydrologic model is modified for application in a large managed river basin that provides both surface water and groundwater resources for urban and agricultural areas. Specific modifications include the linkage of groundwater pumping and irrigation practices and code changes to allow for the large number of SWAT hydrologic response units (HRU) required for a large river basin. The model is applied to the South Platte River Basin (SPRB), a 56,980 km2 basin in northeastern Colorado dominated by large urban areas along the front range of the Rocky Mountains and agriculture regions to the east. Irregular seasonal and annual precipitation and 150 years of urban and agricultural water management history in the basin provide an ideal test case for the SWAT-MODFLOW model. SWAT handles land surface and soil zone processes whereas MODFLOW handles groundwater flow and all sources and sinks (pumping, injection, bedrock inflow, canal seepage, recharge areas, groundwater/surface water interaction), with recharge and stream stage provided by SWAT. The model is tested against groundwater levels, deep percolation estimates, and stream discharge. The model will be used to quantify spatial groundwater vulnerability in the basin under scenarios of climate change and population growth.

  6. Group size, grooming and fission in primates: a modeling approach based on group structure.

    Science.gov (United States)

    Sueur, Cédric; Deneubourg, Jean-Louis; Petit, Odile; Couzin, Iain D

    2011-03-21

    In social animals, fission is a common mode of group proliferation and dispersion and may be affected by genetic or other social factors. Sociality implies preserving relationships between group members. An increase in group size and/or in competition for food within the group can result in a decrease in certain social interactions between members, and the group may split irreversibly as a consequence. One individual may try to maintain bonds with a maximum of group members in order to keep group cohesion, i.e. proximity and stable relationships. However, this strategy needs time and time is often limited. In addition, previous studies have shown that whatever the group size, an individual interacts only with certain grooming partners. Here, we develop a computational model to assess how dynamics of group cohesion are related to group size and to the structure of grooming relationships. Group sizes after simulated fission are compared to observed sizes of 40 groups of primates. Results showed that the relationship between grooming time and group size depends on how each individual attributes grooming time to its social partners, i.e. grooming a small number of preferred partners or grooming all partners, equally or not. The number of partners seemed to be more important for group cohesion than the grooming time itself. This structural constraint has important consequences for group sociality, as it gives rise to the possibility of competition for grooming partners and attraction to high-ranking individuals, as found in primate groups. It could, however, also have implications when considering the cognitive capacities of primates. Copyright © 2010 Elsevier Ltd. All rights reserved.

  7. Large animal and primate models of spinal cord injury for the testing of novel therapies.

    Science.gov (United States)

    Kwon, Brian K; Streijger, Femke; Hill, Caitlin E; Anderson, Aileen J; Bacon, Mark; Beattie, Michael S; Blesch, Armin; Bradbury, Elizabeth J; Brown, Arthur; Bresnahan, Jacqueline C; Case, Casey C; Colburn, Raymond W; David, Samuel; Fawcett, James W; Ferguson, Adam R; Fischer, Itzhak; Floyd, Candace L; Gensel, John C; Houle, John D; Jakeman, Lyn B; Jeffery, Nick D; Jones, Linda Ann Truett; Kleitman, Naomi; Kocsis, Jeffery; Lu, Paul; Magnuson, David S K; Marsala, Martin; Moore, Simon W; Mothe, Andrea J; Oudega, Martin; Plant, Giles W; Rabchevsky, Alexander Sasha; Schwab, Jan M; Silver, Jerry; Steward, Oswald; Xu, Xiao-Ming; Guest, James D; Tetzlaff, Wolfram

    2015-07-01

    Large animal and primate models of spinal cord injury (SCI) are being increasingly utilized for the testing of novel therapies. While these represent intermediary animal species between rodents and humans and offer the opportunity to pose unique research questions prior to clinical trials, the role that such large animal and primate models should play in the translational pipeline is unclear. In this initiative we engaged members of the SCI research community in a questionnaire and round-table focus group discussion around the use of such models. Forty-one SCI researchers from academia, industry, and granting agencies were asked to complete a questionnaire about their opinion regarding the use of large animal and primate models in the context of testing novel therapeutics. The questions centered around how large animal and primate models of SCI would be best utilized in the spectrum of preclinical testing, and how much testing in rodent models was warranted before employing these models. Further questions were posed at a focus group meeting attended by the respondents. The group generally felt that large animal and primate models of SCI serve a potentially useful role in the translational pipeline for novel therapies, and that the rational use of these models would depend on the type of therapy and specific research question being addressed. While testing within these models should not be mandatory, the detection of beneficial effects using these models lends additional support for translating a therapy to humans. These models provide an opportunity to evaluate and refine surgical procedures prior to use in humans, and to assess safety and bio-distribution in a spinal cord more similar in size and anatomy to that of humans. Our results reveal that while many feel that these models are valuable in the testing of novel therapies, important questions remain unanswered about how they should be used and how data derived from them should be interpreted. Copyright © 2015 Elsevier

  8. Large N scalars: From glueballs to dynamical Higgs models

    Science.gov (United States)

    Sannino, Francesco

    2016-05-01

    We construct effective Lagrangians, and corresponding counting schemes, valid to describe the dynamics of the lowest lying large N stable massive composite state emerging in strongly coupled theories. The large N counting rules can now be employed when computing quantum corrections via an effective Lagrangian description. The framework allows for systematic investigations of composite dynamics of a non-Goldstone nature. Relevant examples are the lightest glueball states emerging in any Yang-Mills theory. We further apply the effective approach and associated counting scheme to composite models at the electroweak scale. To illustrate the formalism we consider the possibility that the Higgs emerges as the lightest glueball of a new composite theory; the large N scalar meson in models of dynamical electroweak symmetry breaking; the large N pseudodilaton useful also for models of near-conformal dynamics. For each of these realizations we determine the leading N corrections to the electroweak precision parameters. The results nicely elucidate the underlying large N dynamics and can be used to confront first principle lattice results featuring composite scalars with a systematic effective approach.

  9. The Beyond the standard model working group: Summary report

    Energy Technology Data Exchange (ETDEWEB)

    G. Azuelos et al.

    2004-03-18

    In this working group we have investigated a number of aspects of searches for new physics beyond the Standard Model (SM) at the running or planned TeV-scale colliders. For the most part, we have considered hadron colliders, as they will define particle physics at the energy frontier for the next ten years at least. The variety of models for Beyond the Standard Model (BSM) physics has grown immensely. It is clear that only future experiments can provide the needed direction to clarify the correct theory. Thus, our focus has been on exploring the extent to which hadron colliders can discover and study BSM physics in various models. We have placed special emphasis on scenarios in which the new signal might be difficult to find or of a very unexpected nature. For example, in the context of supersymmetry (SUSY), we have considered: how to make fully precise predictions for the Higgs bosons as well as the superparticles of the Minimal Supersymmetric Standard Model (MSSM) (parts III and IV); MSSM scenarios in which most or all SUSY particles have rather large masses (parts V and VI); the ability to sort out the many parameters of the MSSM using a variety of signals and study channels (part VII); whether the no-lose theorem for MSSM Higgs discovery can be extended to the next-to-minimal Supersymmetric Standard Model (NMSSM) in which an additional singlet superfield is added to the minimal collection of superfields, potentially providing a natural explanation of the electroweak value of the parameter μ (part VIII); sorting out the effects of CP violation using Higgs plus squark associate production (part IX); the impact of lepton flavor violation of various kinds (part X); experimental possibilities for the gravitino and its sgoldstino partner (part XI); what the implications for SUSY would be if the NuTeV signal for di-muon events were interpreted as a sign of R-parity violation (part XII). Our other main focus was on the phenomenological implications of extra…

  10. Solving large linear systems in an implicit thermohaline ocean model

    NARCIS (Netherlands)

    de Niet, Arie Christiaan

    2007-01-01

    The climate on earth is largely determined by the global ocean circulation. Hence it is important to predict how the flow will react to perturbation by for example melting icecaps. To answer questions about the stability of the global ocean flow, a computer model has been developed that is able to

  11. Application of Logic Models in a Large Scientific Research Program

    Science.gov (United States)

    O'Keefe, Christine M.; Head, Richard J.

    2011-01-01

    It is the purpose of this article to discuss the development and application of a logic model in the context of a large scientific research program within the Commonwealth Scientific and Industrial Research Organisation (CSIRO). CSIRO is Australia's national science agency and is a publicly funded part of Australia's innovation system. It conducts…

  12. Searches for phenomena beyond the Standard Model at the Large ...

    Indian Academy of Sciences (India)

    metry searches at the LHC is thus the channel with large missing transverse momentum and jets of high transverse momentum. No excess above the expected SM background is observed and limits are set on supersymmetric models. Figures 1 and 2 show the limits from ATLAS [11] and CMS [12]. In addition to setting limits ...

  13. A stochastic large deformation model for computational anatomy

    DEFF Research Database (Denmark)

    Arnaudon, Alexis; Holm, Darryl D.; Pai, Akshay Sadananda Uppinakudru

    2017-01-01

    In the study of shapes of human organs using computational anatomy, variations are found to arise from inter-subject anatomical differences, disease-specific effects, and measurement noise. This paper introduces a stochastic model for incorporating random variations into the Large Deformation...

  14. Quantifying fish escape behaviour through large mesh panels in trawls based on catch comparision data – model development and a case study from Skagerrak In: ICES (2012) Report of the ICES-FAO Working Group on Fishing Gear Technology and Fish Behaivour (WGFTFB), 23-27 April 2012, Lorient, France, . ICES CM 2012/SSGESST:07

    DEFF Research Database (Denmark)

    Krag, Ludvig Ahm; Herrmann, Bent; Karlsen, Junita

    Based on catch comparison data, it is demonstrated how detailed and quantitative information about species-specific and size dependent escape behaviour in relation to a large mesh panel can be extracted. A new analytical model is developed, applied, and compared to the traditional modelling… …the interpretation of the length based escapement behaviour over the large mesh panel. Our length based behavioural description is in good agreement with direct observations of the same species in the trawl cavity reported in literature. Fish behaviour understanding is essential. Observations are often difficult…

  15. The supersymmetric t-J model with quantum group invariance

    International Nuclear Information System (INIS)

    Foerster, A.; Karowski, M.

    1993-04-01

    An integrable quantum group deformation of the supersymmetric t-J model is introduced. Open boundary conditions lead to an spl_q(2,1)-invariant Hamiltonian. A general procedure to obtain such invariant models is proposed. To solve the model a generalized nested algebraic Bethe ansatz is constructed and the Bethe ansatz equations are obtained. The quantum supergroup structure of the model is investigated. (orig.)

  16. Adaptive Gaussian Predictive Process Models for Large Spatial Datasets

    Science.gov (United States)

    Guhaniyogi, Rajarshi; Finley, Andrew O.; Banerjee, Sudipto; Gelfand, Alan E.

    2011-01-01

    Large point referenced datasets occur frequently in the environmental and natural sciences. Use of Bayesian hierarchical spatial models for analyzing these datasets is undermined by onerous computational burdens associated with parameter estimation. Low-rank spatial process models attempt to resolve this problem by projecting spatial effects to a lower-dimensional subspace. This subspace is determined by a judicious choice of “knots” or locations that are fixed a priori. One such representation yields a class of predictive process models (e.g., Banerjee et al., 2008) for spatial and spatial-temporal data. Our contribution here expands upon predictive process models with fixed knots to models that accommodate stochastic modeling of the knots. We view the knots as emerging from a point pattern and investigate how such adaptive specifications can yield more flexible hierarchical frameworks that lead to automated knot selection and substantial computational benefits. PMID:22298952
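
    A minimal Python sketch of a fixed-knot Gaussian predictive process, the construction that the paper generalises to stochastic knots; the covariance function, sizes and jitter are arbitrary choices for illustration.

        import numpy as np

        def exp_cov(a, b, sigma2=1.0, phi=5.0):
            """Exponential covariance between two sets of 2-D locations."""
            d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
            return sigma2 * np.exp(-phi * d)

        rng = np.random.default_rng(1)
        locs = rng.uniform(0, 1, (2000, 2))    # observation locations
        knots = rng.uniform(0, 1, (50, 2))     # fixed knots; the paper makes these adaptive

        C_kk = exp_cov(knots, knots) + 1e-10 * np.eye(len(knots))
        C_sk = exp_cov(locs, knots)

        # Draw the process at the knots, then project: w_tilde(s) = c(s, knots) C_kk^{-1} w(knots).
        # Only 50x50 systems are solved instead of working with a 2000x2000 covariance.
        w_knots = np.linalg.cholesky(C_kk) @ rng.standard_normal(len(knots))
        w_tilde = C_sk @ np.linalg.solve(C_kk, w_knots)
        print(w_tilde.shape)    # (2000,): low-rank spatial random effect at every site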

  17. Particle production at large transverse momentum and hard collision models

    International Nuclear Information System (INIS)

    Ranft, G.; Ranft, J.

    1977-04-01

    The majority of the presently available experimental data is consistent with hard scattering models. Therefore the hard scattering model seems to be well established. There is good evidence for jets in large transverse momentum reactions as predicted by these models. The overall picture is however not yet well enough understood. We mention only the empirical hard scattering cross section introduced in most of the models, the lack of a deep theoretical understanding of the interplay between quark confinement and jet production, and the fact that we are not yet able to discriminate conclusively between the many proposed hard scattering models. The status of different hard collision models discussed in this paper is summarized. (author)

  18. A numerical shoreline model for shorelines with large curvature

    DEFF Research Database (Denmark)

    Kærgaard, Kasper Hauberg; Fredsøe, Jørgen

    2013-01-01

    This paper presents a new numerical model for shoreline change which can be used to model the evolution of shorelines with large curvature. The model is based on a one-line formulation in terms of coordinates which follow the shape of the shoreline, instead of the more common approach where the two orthogonal horizontal directions are used. The volume error in the sediment continuity equation which is thereby introduced is removed through an iterative procedure. The model treats the shoreline changes by computing the sediment transport in a 2D coastal area model, and then integrating the sediment transport field across the coastal profile to obtain the longshore sediment transport variation along the shoreline. The model is used to compute the evolution of a shoreline with a 90° change in shoreline orientation; due to this drastic change in orientation a migrating shoreline spit develops…

  19. An Automatic User Grouping Model for a Group Recommender System in Location-Based Social Networks

    Directory of Open Access Journals (Sweden)

    Elahe Khazaei

    2018-02-01

    Full Text Available Spatial group recommendation refers to suggesting places to a given set of users. In a group recommender system, members of a group should have similar preferences in order to increase the level of satisfaction. Location-based social networks (LBSNs) provide rich content, such as user interactions and location/event descriptions, which can be leveraged for group recommendations. In this paper, an automatic user grouping model is introduced that obtains information about users and their preferences through an LBSN. The preferences of the users, proximity of the places the users have visited in terms of spatial range, users' free days, and the social relationships among users are extracted automatically from location histories and users' profiles in the LBSN. These factors are combined to determine the similarities among users. The users are partitioned into groups based on these similarities. Group size is the key to coordinating group members and enhancing their satisfaction. Therefore, a modified k-medoids method is developed to cluster users into groups with specific sizes. To evaluate the efficiency of the proposed method, its mean intra-cluster distance and its distribution of cluster sizes are compared to those of general clustering algorithms. The results reveal that the proposed method compares favourably with general clustering approaches, such as k-medoids and spectral clustering, in separating users into groups of a specific size with a lower mean intra-cluster distance.
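
    A hedged Python sketch of a size-constrained k-medoids assignment in the spirit of the modified k-medoids method mentioned above; the greedy capacity rule and the medoid update are illustrative simplifications, not the authors' algorithm.

        import numpy as np

        def fixed_size_kmedoids(dist, group_size, n_iter=20, seed=0):
            """Greedy, size-constrained k-medoids sketch.

            dist : (n, n) symmetric user-dissimilarity matrix.
            group_size : target number of users per group.
            Returns one group label per user.
            """
            n = len(dist)
            k = n // group_size
            rng = np.random.default_rng(seed)
            medoids = rng.choice(n, size=k, replace=False)

            for _ in range(n_iter):
                labels = np.full(n, -1)
                capacity = np.full(k, group_size)
                # Assign users (most constrained first) to the closest medoid with room left.
                for u in np.argsort(dist[:, medoids].min(axis=1)):
                    for g in np.argsort(dist[u, medoids]):
                        if capacity[g] > 0:
                            labels[u], capacity[g] = g, capacity[g] - 1
                            break
                    else:
                        labels[u] = int(np.argsort(dist[u, medoids])[0])  # overflow users
                # Re-centre each group on the member minimising within-group dissimilarity.
                new_medoids = medoids.copy()
                for g in range(k):
                    members = np.where(labels == g)[0]
                    if len(members):
                        new_medoids[g] = members[dist[np.ix_(members, members)].sum(axis=1).argmin()]
                if np.array_equal(new_medoids, medoids):
                    break
                medoids = new_medoids
            return labels

    The capacity counter is what pushes every group towards the target size; plain k-medoids would drop it and simply assign each user to the nearest medoid.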

  20. The Beyond the Standard Model Working Group: Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Rizzo, Thomas G.

    2002-08-08

    Various theoretical aspects of physics beyond the Standard Model at hadron colliders are discussed. Our focus will be on those issues that most immediately impact the projects pursued as part of the BSM group at this meeting.

  1. AMPO Travel Modeling Working Group Meeting on Dynamic Traffic Assignment

    Science.gov (United States)

    2016-03-01

    On December 17-18, 2015, the Association of Metropolitan Planning Organizations (AMPO) convened a travel modeling working group meeting for the purpose of discussing Dynamic Traffic Assignment (DTA). Participants discussed the uses of DTA, challenges...

  2. Veal calves’ clinical/health status in large groups fed with automatic feeding devices

    Directory of Open Access Journals (Sweden)

    Giulio Cozzi

    2010-01-01

    Full Text Available The aim of the current study was to evaluate the clinical/health status of veal calves in 3 farms in Italy that adopt large group housing and automatic feeding stations. Visits were scheduled in three phases of the rearing cycle (early, middle, and end). Results showed a high incidence of coughing, skin infection and bloated rumen, particularly in the middle phase, while cross-sucking signs were present at the early stage when calves' nibbling proclivity is still high. Throughout the rearing cycle, the frequency of bursitis increased, reaching 53% of calves at the end. The percentage of calves with a poorer body condition than the mid-range of the batch rose gradually as well, likely due to the unbalanced teat-to-calf ratio, which increases competition for feed and reduces the milk intake of low-ranking animals. The marked growth differences among pen-mates and the mortality rate close to 7% observed with the use of automatic feeding devices for milk delivery do not seem to be compensated by the lower labour demand; the sustainability of this system in its present form is therefore doubtful, both for the veal calves' welfare and for farm incomes.

  3. Large deflection of viscoelastic beams using fractional derivative model

    International Nuclear Information System (INIS)

    Bahranini, Seyed Masoud Sotoodeh; Eghtesad, Mohammad; Ghavanloo, Esmaeal; Farid, Mehrdad

    2013-01-01

    This paper deals with large deflection of viscoelastic beams using a fractional derivative model. For this purpose, a nonlinear finite element formulation of viscoelastic beams in conjunction with the fractional derivative constitutive equations has been developed. The four-parameter fractional derivative model has been used to describe the constitutive equations. The deflected configuration for a uniform beam with different boundary conditions and loads is presented. The effect of the order of the fractional derivative on the large deflection of the cantilever viscoelastic beam is investigated after 10, 100, and 1000 hours. The main contribution of this paper is the finite element implementation for nonlinear analysis of the viscoelastic fractional model using the storage of both strain and stress histories. The validity of the present analysis is confirmed by comparing the results with those found in the literature.
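
    For reference, one common form of a four-parameter fractional derivative (fractional Zener) constitutive law, written here as a hedged guess at the class of model the abstract refers to rather than the authors' exact parameterisation:

        % assumed generic fractional Zener form, not the paper's equation
        \sigma(t) + \tau^{\alpha} D^{\alpha}\sigma(t)
            = E_{0}\,\varepsilon(t) + E_{1}\,\tau^{\alpha} D^{\alpha}\varepsilon(t),
        \qquad
        D^{\alpha}f(t) = \frac{1}{\Gamma(1-\alpha)}\,\frac{d}{dt}\int_{0}^{t}\frac{f(s)}{(t-s)^{\alpha}}\,ds,
        \quad 0 < \alpha < 1,

    with the four parameters E_0, E_1, tau and alpha controlling the relaxed and instantaneous stiffness and the breadth of the relaxation spectrum.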

  4. Engineering Large Animal Species to Model Human Diseases.

    Science.gov (United States)

    Rogers, Christopher S

    2016-07-01

    Animal models are an important resource for studying human diseases. Genetically engineered mice are the most commonly used species and have made significant contributions to our understanding of basic biology, disease mechanisms, and drug development. However, they often fail to recreate important aspects of human diseases and thus can have limited utility as translational research tools. Developing disease models in species more similar to humans may provide a better setting in which to study disease pathogenesis and test new treatments. This unit provides an overview of the history of genetically engineered large animals and the techniques that have made their development possible. Factors to consider when planning a large animal model, including choice of species, type of modification and methodology, characterization, production methods, and regulatory compliance, are also covered. © 2016 by John Wiley & Sons, Inc. Copyright © 2016 John Wiley & Sons, Inc.

  5. Functional renormalization group approach to SU(N ) Heisenberg models: Real-space renormalization group at arbitrary N

    Science.gov (United States)

    Buessen, Finn Lasse; Roscher, Dietrich; Diehl, Sebastian; Trebst, Simon

    2018-02-01

    The pseudofermion functional renormalization group (pf-FRG) is one of the few numerical approaches that has been demonstrated to quantitatively determine the ordering tendencies of frustrated quantum magnets in two and three spatial dimensions. The approach, however, relies on a number of presumptions and approximations, in particular the choice of pseudofermion decomposition and the truncation of an infinite number of flow equations to a finite set. Here we generalize the pf-FRG approach to SU (N )-spin systems with arbitrary N and demonstrate that the scheme becomes exact in the large-N limit. Numerically solving the generalized real-space renormalization group equations for arbitrary N , we can make a stringent connection between the physically most significant case of SU(2) spins and more accessible SU (N ) models. In a case study of the square-lattice SU (N ) Heisenberg antiferromagnet, we explicitly demonstrate that the generalized pf-FRG approach is capable of identifying the instability indicating the transition into a staggered flux spin liquid ground state in these models for large, but finite, values of N . In a companion paper [Roscher et al., Phys. Rev. B 97, 064416 (2018), 10.1103/PhysRevB.97.064416] we formulate a momentum-space pf-FRG approach for SU (N ) spin models that allows us to explicitly study the large-N limit and access the low-temperature spin liquid phase.

  6. Ecohydrological modeling for large-scale environmental impact assessment.

    Science.gov (United States)

    Woznicki, Sean A; Nejadhashemi, A Pouyan; Abouali, Mohammad; Herman, Matthew R; Esfahanian, Elaheh; Hamaamin, Yaseen A; Zhang, Zhen

    2016-02-01

    Ecohydrological models are frequently used to assess the biological integrity of unsampled streams. These models vary in complexity and scale, and their utility depends on their final application. Tradeoffs are usually made in model scale, where large-scale models are useful for determining broad impacts of human activities on biological conditions, and regional-scale (e.g. watershed or ecoregion) models provide stakeholders greater detail at the individual stream reach level. Given these tradeoffs, the objective of this study was to develop large-scale stream health models with reach level accuracy similar to regional-scale models thereby allowing for impacts assessments and improved decision-making capabilities. To accomplish this, four measures of biological integrity (Ephemeroptera, Plecoptera, and Trichoptera taxa (EPT), Family Index of Biotic Integrity (FIBI), Hilsenhoff Biotic Index (HBI), and fish Index of Biotic Integrity (IBI)) were modeled based on four thermal classes (cold, cold-transitional, cool, and warm) of streams that broadly dictate the distribution of aquatic biota in Michigan. The Soil and Water Assessment Tool (SWAT) was used to simulate streamflow and water quality in seven watersheds and the Hydrologic Index Tool was used to calculate 171 ecologically relevant flow regime variables. Unique variables were selected for each thermal class using a Bayesian variable selection method. The variables were then used in development of adaptive neuro-fuzzy inference systems (ANFIS) models of EPT, FIBI, HBI, and IBI. ANFIS model accuracy improved when accounting for stream thermal class rather than developing a global model. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Large time periodic solutions to coupled chemotaxis-fluid models

    Science.gov (United States)

    Jin, Chunhua

    2017-12-01

    In this paper, we deal with the time periodic problem to coupled chemotaxis-fluid models. We prove the existence of large time periodic strong solutions for the full chemotaxis-Navier-Stokes system in spatial dimension N=2, and the existence of large time periodic strong solutions for the chemotaxis-Stokes system in spatial dimension N=3. On the basis of these, the regularity of the solutions can be further improved. More precisely speaking, if the time periodic source g and the potential force …

  8. Large Animal Models for Foamy Virus Vector Gene Therapy

    Directory of Open Access Journals (Sweden)

    Peter A. Horn

    2012-12-01

    Full Text Available Foamy virus (FV) vectors have shown great promise for hematopoietic stem cell (HSC) gene therapy. Their ability to efficiently deliver transgenes to multi-lineage long-term repopulating cells in large animal models suggests they will be effective for several human hematopoietic diseases. Here, we review FV vector studies in large animal models, including the use of FV vectors with the mutant O6-methylguanine-DNA methyltransferase, MGMTP140K, to increase the number of genetically modified cells after transplantation. In these studies, FV vectors have mediated efficient gene transfer to polyclonal repopulating cells using short ex vivo transduction protocols designed to minimize the negative effects of ex vivo culture on stem cell engraftment. In this regard, FV vectors appear superior to gammaretroviral vectors, which require longer ex vivo culture to effect efficient transduction. FV vectors have also compared favorably with lentiviral vectors when directly compared in the dog model. FV vectors have corrected leukocyte adhesion deficiency and pyruvate kinase deficiency in the dog large animal model. FV vectors also appear safer than gammaretroviral vectors based on a reduced frequency of integrants near promoters and also near proto-oncogenes in canine repopulating cells. Together, these studies suggest that FV vectors should be highly effective for several human hematopoietic diseases, including those that will require relatively high percentages of gene-modified cells to achieve clinical benefit.

  9. Global Bedload Flux Modeling and Analysis in Large Rivers

    Science.gov (United States)

    Islam, M. T.; Cohen, S.; Syvitski, J. P.

    2017-12-01

    Proper sediment transport quantification has long been an area of interest for both scientists and engineers in the fields of geomorphology and management of rivers and coastal waters. Bedload flux is important for monitoring water quality and for sustainable development of coastal and marine bioservices. Bedload measurements, especially for large rivers, are extremely scarce across time, and many rivers have never been monitored. The scarcity of bedload measurements is particularly acute in developing countries where changes in sediment yields are high. The paucity of bedload measurements is the result of 1) the nature of the problem (large spatial and temporal uncertainties), and 2) field costs including the time-consuming nature of the measurement procedures (repeated bedform migration tracking, bedload samplers). Here we present a first of its kind methodology for calculating bedload in large global rivers (basins are >1,000 km. Evaluation of model skill is based on 113 bedload measurements. The model predictions are compared with an empirical model developed from the observational dataset in an attempt to evaluate the differences between a physically-based numerical model and a lumped relationship between bedload flux and fluvial and basin parameters (e.g., discharge, drainage area, lithology). The initial success of the study opens up various applications in global fluvial geomorphology (e.g. including the relationship between suspended sediment (wash load) and bedload). Simulated results with known uncertainties offer a new research product as a valuable resource for the whole scientific community.

  10. Pile group program for full material modeling and progressive failure.

    Science.gov (United States)

    2008-12-01

    Strain wedge (SW) model formulation has been used, in previous work, to evaluate the response of a single pile or a group of piles (including its pile cap) in layered soils to lateral loading. The SW model approach provides appropriate prediction f...

  11. A Creative Therapies Model for the Group Supervision of Counsellors.

    Science.gov (United States)

    Wilkins, Paul

    1995-01-01

    Sets forth a model of group supervision, drawing on a creative therapies approach which provides an effective way of delivering process issues, conceptualization issues, and personalization issues. The model makes particular use of techniques drawn from art therapy and from psychodrama, and should be applicable to therapists of many orientations.…

  12. Investigating the LGBTQ Responsive Model for Supervision of Group Work

    Science.gov (United States)

    Luke, Melissa; Goodrich, Kristopher M.

    2013-01-01

    This article reports an investigation of the LGBTQ Responsive Model for Supervision of Group Work, a trans-theoretical supervisory framework to address the needs of lesbian, gay, bisexual, transgender, and questioning (LGBTQ) persons (Goodrich & Luke, 2011). Findings partially supported applicability of the LGBTQ Responsive Model for Supervision…

  13. Deciphering the Crowd: Modeling and Identification of Pedestrian Group Motion

    Directory of Open Access Journals (Sweden)

    Norihiro Hagita

    2013-01-01

    Full Text Available Associating attributes to pedestrians in a crowd is relevant for various areas like surveillance, customer profiling and service providing. The attributes of interest greatly depend on the application domain and might involve such social relations as friends or family as well as the hierarchy of the group including the leader or subordinates. Nevertheless, the complex social setting inherently complicates this task. We attack this problem by exploiting the small group structures in the crowd. The relations among individuals and their peers within a social group are reliable indicators of social attributes. To that end, this paper identifies social groups based on explicit motion models integrated through a hypothesis testing scheme. We develop two models relating positional and directional relations. A pair of pedestrians is identified as belonging to the same group or not by utilizing the two models in parallel, which defines a compound hypothesis testing scheme. By testing the proposed approach on three datasets with different environmental properties and group characteristics, it is demonstrated that we achieve an identification accuracy of 87% to 99%. The contribution of this study lies in its definition of positional and directional relation models, its description of compound evaluations, and the resolution of ambiguities with our proposed uncertainty measure based on the local and global indicators of group relation.

  14. Deciphering the crowd: modeling and identification of pedestrian group motion.

    Science.gov (United States)

    Yücel, Zeynep; Zanlungo, Francesco; Ikeda, Tetsushi; Miyashita, Takahiro; Hagita, Norihiro

    2013-01-14

    Associating attributes to pedestrians in a crowd is relevant for various areas like surveillance, customer profiling and service providing. The attributes of interest greatly depend on the application domain and might involve such social relations as friends or family as well as the hierarchy of the group including the leader or subordinates. Nevertheless, the complex social setting inherently complicates this task. We attack this problem by exploiting the small group structures in the crowd. The relations among individuals and their peers within a social group are reliable indicators of social attributes. To that end, this paper identifies social groups based on explicit motion models integrated through a hypothesis testing scheme. We develop two models relating positional and directional relations. A pair of pedestrians is identified as belonging to the same group or not by utilizing the two models in parallel, which defines a compound hypothesis testing scheme. By testing the proposed approach on three datasets with different environmental properties and group characteristics, it is demonstrated that we achieve an identification accuracy of 87% to 99%. The contribution of this study lies in its definition of positional and directional relation models, its description of compound evaluations, and the resolution of ambiguities with our proposed uncertainty measure based on the local and global indicators of group relation.
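
    A toy Python sketch of the compound test described above: a pair of pedestrians is accepted as belonging to the same group only if both a positional relation model and a directional relation model accept it. The features and thresholds here are placeholders, not the authors' calibrated models.

        import numpy as np

        def same_group(traj_a, traj_b, d_max=1.5, d_std_max=0.5, cos_min=0.9):
            """Crude pairwise group test combining a positional and a directional model.

            traj_a, traj_b : (T, 2) arrays of positions sampled at the same instants.
            Both sub-tests must accept for the pair to be labelled "same group".
            """
            gaps = np.linalg.norm(traj_a - traj_b, axis=1)
            positional_ok = gaps.mean() < d_max and gaps.std() < d_std_max

            va, vb = np.diff(traj_a, axis=0), np.diff(traj_b, axis=0)
            cos = (va * vb).sum(axis=1) / (np.linalg.norm(va, axis=1) * np.linalg.norm(vb, axis=1) + 1e-9)
            directional_ok = cos.mean() > cos_min

            return positional_ok and directional_ok

        # Toy example: two pedestrians walking side by side along the x-axis.
        t = np.linspace(0, 10, 50)[:, None]
        a = np.hstack([t, np.zeros_like(t)])
        b = np.hstack([t, 0.8 * np.ones_like(t)])
        print(same_group(a, b))   # True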

  15. What is special about the group of the standard model?

    International Nuclear Information System (INIS)

    Nielsen, H.B.; Brene, N.

    1989-03-01

    The standard model is based on the algebra of U 1 xSU 2 xSU 3 . The systematics of charges of the fundamental fermions seems to suggest the importance of a particular group having this algebra, viz. S(U 2 xU 3 ). This group is distinguished from all other connected compact non semisimple groups with dimensionality up to 12 by a characteristic property: it is very 'skew'. By this we mean that the group has relatively few 'generalised outer automorphisms'. One may speculate about physical reasons for this fact. (orig.)

  16. A model of interaction between anticorruption authority and corruption groups

    Energy Technology Data Exchange (ETDEWEB)

    Neverova, Elena G.; Malafeyef, Oleg A. [Saint-Petersburg State University, Saint-Petersburg, Russia, 35, Universitetskii prospekt, Petrodvorets, 198504 Email:elenaneverowa@gmail.com, malafeyevoa@mail.ru (Russian Federation)

    2015-03-10

    The paper provides a model of interaction between anticorruption unit and corruption groups. The main policy functions of the anticorruption unit involve reducing corrupt practices in some entities through an optimal approach to resource allocation and effective anticorruption policy. We develop a model based on Markov decision-making process and use Howard’s policy-improvement algorithm for solving an optimal decision strategy. We examine the assumption that corruption groups retaliate against the anticorruption authority to protect themselves. This model was implemented through stochastic game.

  17. A model of interaction between anticorruption authority and corruption groups

    International Nuclear Information System (INIS)

    Neverova, Elena G.; Malafeyef, Oleg A.

    2015-01-01

    The paper provides a model of interaction between anticorruption unit and corruption groups. The main policy functions of the anticorruption unit involve reducing corrupt practices in some entities through an optimal approach to resource allocation and effective anticorruption policy. We develop a model based on Markov decision-making process and use Howard’s policy-improvement algorithm for solving an optimal decision strategy. We examine the assumption that corruption groups retaliate against the anticorruption authority to protect themselves. This model was implemented through stochastic game
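
    Both records above refer to Howard's policy-improvement (policy iteration) algorithm for Markov decision processes. The following Python sketch shows the generic algorithm on a toy finite MDP; the transition and reward numbers are arbitrary and have nothing to do with the paper's corruption model.

        import numpy as np

        def howard_policy_iteration(P, R, gamma=0.95):
            """Howard's policy iteration for a finite MDP.

            P : (A, S, S) transition probabilities, R : (A, S) expected rewards.
            Returns the optimal policy (one action per state) and its value function.
            """
            n_actions, n_states, _ = P.shape
            policy = np.zeros(n_states, dtype=int)
            while True:
                # Policy evaluation: solve (I - gamma * P_pi) v = r_pi exactly.
                P_pi = P[policy, np.arange(n_states)]
                r_pi = R[policy, np.arange(n_states)]
                v = np.linalg.solve(np.eye(n_states) - gamma * P_pi, r_pi)
                # Policy improvement: act greedily with respect to v.
                q = R + gamma * (P @ v)
                new_policy = q.argmax(axis=0)
                if np.array_equal(new_policy, policy):
                    return policy, v
                policy = new_policy

        # Toy 2-state, 2-action problem (numbers are purely illustrative).
        P = np.array([[[0.9, 0.1], [0.2, 0.8]],
                      [[0.5, 0.5], [0.6, 0.4]]])
        R = np.array([[1.0, 0.0],
                      [0.3, 2.0]])
        print(howard_policy_iteration(P, R))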

  18. Does the interpersonal model apply across eating disorder diagnostic groups? A structural equation modeling approach.

    Science.gov (United States)

    Ivanova, Iryna V; Tasca, Giorgio A; Proulx, Geneviève; Bissada, Hany

    2015-11-01

    The interpersonal model has been validated with binge-eating disorder (BED), but it is not yet known if the model applies across a range of eating disorders (ED). The goal of this study was to investigate the validity of the interpersonal model in anorexia nervosa (restricting type, ANR, and binge-eating/purge type, ANBP), bulimia nervosa (BN), BED, and eating disorder not otherwise specified (EDNOS). Data from a cross-sectional sample of 1459 treatment-seeking women diagnosed with ANR, ANBP, BN, BED and EDNOS were examined for indirect effects of interpersonal problems on ED psychopathology mediated through negative affect. Findings from structural equation modeling demonstrated the mediating role of negative affect in four of the five diagnostic groups. There were significant, medium to large (.239 to .558), indirect effects in the ANR, BN, BED and EDNOS groups but not in the ANBP group. The results of the first reverse model of interpersonal problems as a mediator between negative affect and ED psychopathology were nonsignificant, suggesting the specificity of these hypothesized paths. However, in the second reverse model ED psychopathology was related to interpersonal problems indirectly through negative affect. This is the first study to find support for the interpersonal model of ED in a clinical sample of women with diverse ED diagnoses, though there may be a reciprocal relationship between ED psychopathology and relationship problems through negative affect. Negative affect partially explains the relationship between interpersonal problems and ED psychopathology in women diagnosed with ANR, BN, BED and EDNOS. Interpersonal psychotherapies for ED may be addressing the underlying interpersonal-affective difficulties, thereby reducing ED psychopathology. Copyright © 2015 Elsevier Inc. All rights reserved.
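
    A simplified Python sketch of the mediation logic being tested (interpersonal problems -> negative affect -> ED psychopathology): the indirect effect is the product of the two regression paths, with a bootstrap interval. This is a toy illustration on synthetic data, not the authors' structural equation models.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 500
        interpersonal = rng.normal(size=n)                                 # predictor
        negative_affect = 0.5 * interpersonal + rng.normal(size=n)         # mediator
        ed_symptoms = 0.4 * negative_affect + 0.1 * interpersonal + rng.normal(size=n)  # outcome

        def indirect_effect(x, m, y):
            """a*b estimate from two least-squares fits (x -> m, then m -> y controlling for x)."""
            a = np.polyfit(x, m, 1)[0]
            X = np.column_stack([np.ones_like(x), x, m])
            b = np.linalg.lstsq(X, y, rcond=None)[0][2]
            return a * b

        boot = []
        for _ in range(2000):
            idx = rng.integers(0, n, n)
            boot.append(indirect_effect(interpersonal[idx], negative_affect[idx], ed_symptoms[idx]))
        est = indirect_effect(interpersonal, negative_affect, ed_symptoms)
        print(round(est, 3), np.percentile(boot, [2.5, 97.5]))   # point estimate and 95% CI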

  19. Extended Group Contribution Model for Polyfunctional Phase Equilibria

    DEFF Research Database (Denmark)

    Abildskov, Jens

    …-liquid equilibria from data on binary mixtures, composed of structurally simple molecules with a single functional group. More complex is the situation with mixtures composed of structurally more complicated molecules or molecules with more than one functional group. The UNIFAC method is extended to handle … on ideas applied to modelling of pure component properties. Chapter 2 describes the conceptual background of the approach. Three extensions of the present first-order UNIFAC model are formulated in chapter 3. These obey the Gibbs-Duhem restriction, and satisfy other traditional consistency requirements. … In chapter 4 parameters are estimated for the first-order UNIFAC model, based on which parameters are estimated for one of the second-order models described in chapter 3. The parameter estimation is based on measured binary data on around 4000 systems, covering 11 C-, H- and O-containing functional groups…

  20. Spatial associations between socioeconomic groups and NO2 air pollution exposure within three large Canadian cities.

    Science.gov (United States)

    Pinault, Lauren; Crouse, Daniel; Jerrett, Michael; Brauer, Michael; Tjepkema, Michael

    2016-05-01

    Previous studies of environmental justice in Canadian cities have linked lower socioeconomic status to greater air pollution exposures at coarse geographic scales, (i.e., Census Tracts). However, studies that examine these associations at finer scales are less common, as are comparisons among cities. To assess differences in exposure to air pollution among socioeconomic groups, we assigned estimates of exposure to ambient nitrogen dioxide (NO2), a marker for traffic-related pollution, from city-wide land use regression models to respondents of the 2006 Canadian census long-form questionnaire in Toronto, Montreal, and Vancouver. Data were aggregated at a finer scale than in most previous studies (i.e., by Dissemination Area (DA), which includes approximately 400-700 persons). We developed simultaneous autoregressive (SAR) models, which account for spatial autocorrelation, to identify associations between NO2 exposure and indicators of social and material deprivation. In Canada's three largest cities, DAs with greater proportions of tenants and residents who do not speak either English or French were characterised by greater exposures to ambient NO2. We also observed positive associations between NO2 concentrations and indicators of social deprivation, including the proportion of persons living alone (in Toronto), and the proportion of persons who were unmarried/not in a common-law relationship (in Vancouver). Other common measures of deprivation (e.g., lone-parent families, unemployment) were not associated with NO2 exposures. DAs characterised by selected indicators of deprivation were associated with higher concentrations of ambient NO2 air pollution in the three largest cities in Canada. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.
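
    For readers unfamiliar with SAR models, one common specification (the spatial error form) is, as a hedged illustration of the model class rather than the study's exact equation:

        % assumed generic spatial-error SAR form, not necessarily the study's specification
        y = X\beta + u, \qquad u = \lambda W u + \varepsilon, \qquad \varepsilon \sim \mathcal{N}(0,\sigma^{2} I),

    where y holds the DA-level NO2 exposures, X the deprivation indicators, W a spatial weights matrix linking neighbouring DAs, and lambda the spatial autocorrelation parameter that ordinary least squares would ignore.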

  1. Penalized Estimation in Large-Scale Generalized Linear Array Models

    DEFF Research Database (Denmark)

    Lund, Adam; Vincent, Martin; Hansen, Niels Richard

    2017-01-01

    Large-scale generalized linear array models (GLAMs) can be challenging to fit. Computation and storage of their tensor product design matrix can be impossible due to time and memory constraints, and previously considered design matrix free algorithms do not scale well with the dimension of the parameter vector. A new design matrix free algorithm is proposed for computing the penalized maximum likelihood estimate for GLAMs, which, in particular, handles nondifferentiable penalty functions. The proposed algorithm is implemented and available via the R package glamlasso. It combines several ideas…
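
    The "design matrix free" idea rests on the Kronecker identity (X1 ⊗ X2) vec(B) = vec(X2 B X1^T), which lets the linear predictor be formed without ever building the tensor product design matrix. A short Python check of this identity on small, arbitrary matrices (not the glamlasso implementation itself):

        import numpy as np

        rng = np.random.default_rng(0)
        X1, X2 = rng.normal(size=(40, 6)), rng.normal(size=(30, 5))   # marginal design matrices
        B = rng.normal(size=(5, 6))                                   # coefficient array

        # Naive: form the full tensor-product design matrix (40*30 x 6*5 here,
        # quickly infeasible for realistic array dimensions).
        eta_naive = np.kron(X1, X2) @ B.flatten(order="F")

        # Design-matrix-free: the same linear predictor via two small multiplications.
        eta_fast = (X2 @ B @ X1.T).flatten(order="F")

        print(np.allclose(eta_naive, eta_fast))   # True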

  2. Precise MRI-based stereotaxic surgery in large animal models

    DEFF Research Database (Denmark)

    Glud, A. N.; Bech, J.; Tvilling, L.

    BACKGROUND: Stereotaxic neurosurgery in large animals is used widely in different sophisticated models, where precision is becoming more crucial as desired anatomical target regions are becoming smaller. Individually calculated coordinates are necessary in large animal models with cortical and subcortical anatomical differences. NEW METHOD: We present a convenient method to make an MRI-visible skull fiducial for 3D MRI-based stereotaxic procedures in larger experimental animals. Plastic screws were filled with either copper-sulphate solution or MRI-visible paste from a commercially available cranial head marker. The screw fiducials were inserted in the animal skulls and T1 weighted MRI was performed allowing identification of the inserted skull marker. RESULTS: Both types of fiducial markers were clearly visible on the MRIs. This allows high precision in the stereotaxic space. COMPARISON...

  3. Model for large scale circulation of nuclides in nature, 1

    Energy Technology Data Exchange (ETDEWEB)

    Ohnishi, Teruaki

    1988-12-01

    A model for large scale circulation of nuclides was developed, and a computer code named COCAIN was made which simulates this circulation system-dynamically. The natural environment considered in the present paper consists of 2 atmospheres, 8 geospheres and 2 lithospheres. The biosphere is composed of 4 types of edible plants, 5 cattles and their products, 4 water biota and 16 human organs. The biosphere is assumed to be given nuclides from the natural environment mentioned above. With the use of COCAIN, two numerical case studies were carried out; the one is the study on nuclear pollution in nature by the radioactive nuclides originating from the past nuclear bomb tests, and the other is the study on the response of environment and biota to the pulse injection of nuclides into one compartment. From the former case study it was verified that this model can well explain the observation and properly simulate the large scale circulation of nuclides in nature.
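    The kind of compartment dynamics COCAIN simulates can be written as a linear system of ODEs with first-order transfer and radioactive decay. The minimal sketch below (compartments, rates, and half-life are purely illustrative, not COCAIN's actual compartment structure or parameters) reproduces the "pulse injection into one compartment" case qualitatively.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative 3-compartment chain (atmosphere -> geosphere -> biosphere) with
# first-order transfer and radioactive decay; all rates are invented for the example.
k12, k23, k30 = 0.5, 0.2, 0.05          # transfer rates [1/yr]
lam = np.log(2) / 30.0                  # decay constant for a ~30-yr half-life nuclide

def rhs(t, q):
    q1, q2, q3 = q
    return [-(k12 + lam) * q1,
            k12 * q1 - (k23 + lam) * q2,
            k23 * q2 - (k30 + lam) * q3]

# Pulse injection: unit activity placed in the first compartment at t = 0
sol = solve_ivp(rhs, (0.0, 100.0), [1.0, 0.0, 0.0], dense_output=True)
for ti in np.linspace(0.0, 100.0, 6):
    q1, q2, q3 = sol.sol(ti)
    print(f"t={ti:5.1f} yr  atmosphere={q1:.3f}  geosphere={q2:.3f}  biosphere={q3:.3f}")
```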

  4. Large animal models and new therapies for glycogen storage disease.

    Science.gov (United States)

    Brooks, Elizabeth D; Koeberl, Dwight D

    2015-05-01

    Glycogen storage diseases (GSD), a unique category of inherited metabolic disorders, were first described early in the twentieth century. Since then, the biochemical and genetic bases of these disorders have been determined, and an increasing number of animal models for GSD have become available. At least seven large mammalian models have been developed for laboratory research on GSDs. These models have facilitated the development of new therapies, including gene therapy, which are undergoing clinical translation. For example, gene therapy prolonged survival and prevented hypoglycemia during fasting for greater than one year in dogs with GSD type Ia, and the need for periodic re-administration to maintain efficacy was demonstrated in that dog model. The further development of gene therapy could provide curative therapy for patients with GSD and other inherited metabolic disorders.

  5. Large Scale Community Detection Using a Small World Model

    Directory of Open Access Journals (Sweden)

    Ranjan Kumar Behera

    2017-11-01

    Full Text Available In a social network, small or large communities within the network play a major role in deciding the functionalities of the network. Despite diverse definitions, communities in the network may be defined as groups of nodes that are more densely connected to each other than to nodes outside the group. Revealing such hidden communities is a challenging research problem. A real-world social network follows the small-world phenomenon, which indicates that any two social entities are reachable in a small number of steps. In this paper, nodes are mapped into communities based on random walks in the network. However, uncovering communities in large-scale networks is a challenging task due to the unprecedented growth in the size of social networks. A good number of community detection algorithms based on random walks exist in the literature, but when large-scale social networks are considered, these algorithms take considerably more time. In this work, with the objective of improving efficiency, a parallel programming framework such as Map-Reduce has been adopted for uncovering hidden communities in social networks. The proposed approach has been compared with standard existing community detection algorithms on both synthetic and real-world datasets in order to examine its performance, and it is observed that the proposed algorithm is more efficient than the existing ones.
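    The paper's contribution is the Map-Reduce implementation; the serial sketch below only illustrates the underlying random-walk intuition on a small example graph: nodes whose t-step walk distributions are similar are grouped together. It assumes networkx and SciPy are available and is a stand-in, not the authors' parallel algorithm.

```python
import numpy as np
import networkx as nx
from scipy.cluster.hierarchy import linkage, fcluster

# Example graph with well-known two-community structure
G = nx.karate_club_graph()
A = nx.to_numpy_array(G)
P = A / A.sum(axis=1, keepdims=True)      # one-step random-walk transition matrix

# t-step visiting probabilities from each node: rows of P^t act as "walk profiles";
# nodes in the same community tend to have similar profiles.
steps = 4
Pt = np.linalg.matrix_power(P, steps)

# Agglomerative clustering of the walk profiles into two communities
Z = linkage(Pt, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
print(dict(zip(G.nodes(), labels)))
```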

  6. ARMA modelling of neutron stochastic processes with large measurement noise

    International Nuclear Information System (INIS)

    Zavaljevski, N.; Kostic, Lj.; Pesic, M.

    1994-01-01

    An autoregressive moving average (ARMA) model of the neutron fluctuations with large measurement noise is derived from langevin stochastic equations and validated using time series data obtained during prompt neutron decay constant measurements at the zero power reactor RB in Vinca. Model parameters are estimated using the maximum likelihood (ML) off-line algorithm and an adaptive pole estimation algorithm based on the recursive prediction error method (RPE). The results show that subcriticality can be determined from real data with high measurement noise using much shorter statistical sample than in standard methods. (author)
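    A minimal Python analogue of the off-line ML step (assuming statsmodels is installed; the series is simulated, not the RB reactor data): generate an ARMA(2,1) signal by direct recursion, add large measurement noise, and fit an ARMA model by maximum likelihood.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)

# Simulate a stationary ARMA(2,1) "fluctuation" signal by direct recursion
n, phi1, phi2, theta1 = 5000, 1.2, -0.4, 0.5
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(2, n):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + e[t] + theta1 * e[t - 1]

# Add large measurement noise, as in the detector time series discussed above
y = x + rng.normal(scale=2.0, size=n)

# Off-line maximum-likelihood fit of an ARMA model to the noisy series
res = ARIMA(y, order=(2, 0, 1)).fit()
print(res.params)
```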

  7. Modeling large underground experimental halls for the superconducting super collider

    International Nuclear Information System (INIS)

    Duan, F.; Mrugala, M.

    1993-01-01

    Geomechanical aspects of the excavation design, and analysis of two large underground experimental halls for the Superconducting Super Collider (SSC), being built in Texas, have been extensively investigated using computer modeling. Each chamber, measuring approximately 350 ft long, 110 ft wide, and 190 ft high, is to be excavated mainly through soft marl and overlying competent limestone. Wall stability is essential not only for ensuring excavation safety but also for meeting strict requirements for chamber stability over the 30-yr design life of the facility. Extensive numerical modeling has played a significant role in the selection of excavation methods, excavation sequence, and rock reinforcement systems. (Author)

  8. A Modeling & Simulation Implementation Framework for Large-Scale Simulation

    Directory of Open Access Journals (Sweden)

    Song Xiao

    2012-10-01

    Full Text Available Classical High Level Architecture (HLA systems are facing development problems for lack of supporting fine-grained component integration and interoperation in large-scale complex simulation applications. To provide efficient methods of this issue, an extensible, reusable and composable simulation framework is proposed. To promote the reusability from coarse-grained federate to fine-grained components, this paper proposes a modelling & simulation framework which consists of component-based architecture, modelling methods, and simulation services to support and simplify the process of complex simulation application construction. Moreover, a standard process and simulation tools are developed to ensure the rapid and effective development of simulation application.

  9. Simulating large cosmology surveys with calibrated halo models

    OpenAIRE

    Lynn, Stuart

    2011-01-01

    In this thesis I present a novel method for constructing large scale mock galaxy and halo catalogues and apply this model to a number of important topics in modern cosmology. Traditionally such mocks are created through first evolving a high resolution particle simulation from a set of initial conditions to the present epoch, identifying bound structures and their evolution, and finally applying a semi-analytic prescription for galaxy formation. In contrast to this computatio...

  10. Shear viscosity from a large-Nc NJL model

    Energy Technology Data Exchange (ETDEWEB)

    Lang, Robert; Kaiser, Norbert [TUM Physik Department, Garching (Germany); Weise, Wolfram [ECT, Villa Tambosi, Villazzano (Italy); TUM Physik Department, Garching (Germany)

    2015-07-01

    We calculate the ratio of shear viscosity to entropy density within a large-Nc Nambu-Jona-Lasinio model. A consistent treatment of the Kubo formalism incorporating the full Dirac structure of the quark self-energy from mesonic fluctuations is presented. We compare our results to common approximation schemes applied to the Kubo formalism and to the quark self-energy.

  11. Protein homology model refinement by large-scale energy optimization.

    Science.gov (United States)

    Park, Hahnbeom; Ovchinnikov, Sergey; Kim, David E; DiMaio, Frank; Baker, David

    2018-03-20

    Proteins fold to their lowest free-energy structures, and hence the most straightforward way to increase the accuracy of a partially incorrect protein structure model is to search for the lowest-energy nearby structure. This direct approach has met with little success for two reasons: first, energy function inaccuracies can lead to false energy minima, resulting in model degradation rather than improvement; and second, even with an accurate energy function, the search problem is formidable because the energy only drops considerably in the immediate vicinity of the global minimum, and there are a very large number of degrees of freedom. Here we describe a large-scale energy optimization-based refinement method that incorporates advances in both search and energy function accuracy that can substantially improve the accuracy of low-resolution homology models. The method refined low-resolution homology models into correct folds for 50 of 84 diverse protein families and generated improved models in recent blind structure prediction experiments. Analyses of the basis for these improvements reveal contributions from both the improvements in conformational sampling techniques and the energy function.

  12. Crossed categorization beyond the two-group model.

    Science.gov (United States)

    Urada, Darren; Stenstrom, Douglas M; Miller, Norman

    2007-04-01

    Four studies examined processing of in-group and out-group information with stimuli that are more complex than those used in previous crossed categorization studies. A diverse set of predictions is generated by previous theoretical work to account for how participants will integrate information stemming from multiple group memberships. Heuristic, threshold-based processing of information was supported over algebraic processing. Participants appeared to divide stimuli into "in-grouplike" and "out-grouplike" metacategories. However, the threshold at which this distinction was made, and whether it was based on in-group favoritism or out-group derogation, was influenced by the nature of the situation and the task participants performed. Advantages of crossed categorization research that moves beyond the traditional two-group model are discussed.

  13. New Pathways between Group Theory and Model Theory

    CERN Document Server

    Fuchs, László; Goldsmith, Brendan; Strüngmann, Lutz

    2017-01-01

    This volume focuses on group theory and model theory with a particular emphasis on the interplay of the two areas. The survey papers provide an overview of the developments across group, module, and model theory while the research papers present the most recent study in those same areas. With introductory sections that make the topics easily accessible to students, the papers in this volume will appeal to beginning graduate students and experienced researchers alike. As a whole, this book offers a cross-section view of the areas in group, module, and model theory, covering topics such as DP-minimal groups, Abelian groups, countable 1-transitive trees, and module approximations. The papers in this book are the proceedings of the conference “New Pathways between Group Theory and Model Theory,” which took place February 1-4, 2016, in Mülheim an der Ruhr, Germany, in honor of the editors’ colleague Rüdiger Göbel. This publication is dedicated to Professor Göbel, who passed away in 2014. He was one of th...

  14. Classifiers as a model-free group comparison test.

    Science.gov (United States)

    Kim, Bommae; Oertzen, Timo von

    2018-02-01

    The conventional statistical methods to detect group differences assume correct model specification, including the origin of difference. Researchers should be able to identify a source of group differences and choose a corresponding method. In this paper, we propose a new approach of group comparison without model specification using classification algorithms in machine learning. In this approach, the classification accuracy is evaluated against a binomial distribution using Independent Validation. As an application example, we examined false-positive errors and statistical power of support vector machines to detect group differences in comparison to conventional statistical tests such as t test, Levene's test, K-S test, Fisher's z-transformation, and MANOVA. The SVMs detected group differences regardless of their origins (mean, variance, distribution shape, and covariance), and showed comparably consistent power across conditions. When a group difference originated from a single source, the statistical power of SVMs was lower than the most appropriate conventional test of the study condition; however, the power of SVMs increased when differences originated from multiple sources. Moreover, SVMs showed substantially improved performance with more variables than with fewer variables. Most importantly, SVMs were applicable to any types of data without sophisticated model specification. This study demonstrates a new application of classification algorithms as an alternative or complement to the conventional group comparison test. With the proposed approach, researchers can test two-sample data even when they are not certain which statistical test to use or when data violates the statistical assumptions of conventional methods.
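    A small sketch of the general idea (not the authors' exact Independent Validation procedure): two groups that differ only in covariance structure, a held-out classification accuracy from an SVM, and a binomial test of that accuracy against chance. Assumes scikit-learn and SciPy >= 1.7.

```python
import numpy as np
from scipy.stats import binomtest
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(3)

# Two groups with identical means but opposite correlation structure
n = 200
g1 = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], n)
g2 = rng.multivariate_normal([0, 0], [[1.0, -0.8], [-0.8, 1.0]], n)
X = np.vstack([g1, g2])
y = np.repeat([0, 1], n)

# Independent validation: train on one half, evaluate accuracy on the held-out half
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, stratify=y, random_state=0)
acc = SVC(kernel="rbf").fit(X_tr, y_tr).score(X_te, y_te)

# Compare the held-out accuracy against a chance-level binomial distribution (p = 0.5)
k = int(round(acc * len(y_te)))
test = binomtest(k, len(y_te), p=0.5, alternative="greater")
print(f"held-out accuracy = {acc:.2f}, p-value vs. chance = {test.pvalue:.4g}")
```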

  15. Resolving Microzooplankton Functional Groups In A Size-Structured Planktonic Model

    Science.gov (United States)

    Taniguchi, D.; Dutkiewicz, S.; Follows, M. J.; Jahn, O.; Menden-Deuer, S.

    2016-02-01

    Microzooplankton are important marine grazers, often consuming a large fraction of primary productivity. They consist of a great diversity of organisms with different behaviors, characteristics, and rates. This functional diversity, and its consequences, are not currently reflected in large-scale ocean ecological simulations. How should these organisms be represented, and what are the implications for their biogeography? We develop a size-structured, trait-based model to characterize a diversity of microzooplankton functional groups. We compile and examine size-based laboratory data on the traits, revealing some patterns with size and functional group that we interpret with mechanistic theory. Fitting the model to the data provides parameterizations of key rates and properties, which we employ in a numerical ocean model. The diversity of grazing preference, rates, and trophic strategies enables the coexistence of different functional groups of micro-grazers under various environmental conditions, and the model produces testable predictions of the biogeography.

  16. Group Elevator Peak Scheduling Based on Robust Optimization Model

    Directory of Open Access Journals (Sweden)

    ZHANG, J.

    2013-08-01

    Full Text Available Scheduling of an Elevator Group Control System (EGCS) is a typical combinatorial optimization problem, and uncertain group scheduling under peak traffic flows has recently become a research focus and difficulty. Robust Optimization (RO) is a novel and effective way to deal with uncertain scheduling problems. In this paper, a peak scheduling method based on an RO model for a multi-elevator system is proposed. The method is immune to the uncertainty of peak traffic flows: optimal scheduling is realized without knowing the exact number of waiting passengers on each calling floor. Specifically, an energy-saving-oriented multi-objective scheduling price is proposed, and an uncertain RO peak scheduling model is built to minimize this price. Because the uncertain RO model cannot be solved directly, it is transformed into a certain (deterministic) RO model through robust counterparts of the elevator scheduling constraints. Because the solution space of elevator scheduling is enormous, an ant colony algorithm is proposed to solve the certain model in a short time. Based on this algorithm, optimal scheduling solutions are found quickly, and the group elevators are scheduled accordingly. Simulation results show that the method effectively improves scheduling performance in the peak pattern, and efficient operation of the group elevators is realized by the RO scheduling method.

  17. Simulation of large-scale rule-based models

    Energy Technology Data Exchange (ETDEWEB)

    Hlavacek, William S [Los Alamos National Laboratory]; Monnie, Michael I [Los Alamos National Laboratory]; Colvin, Joshua [NON LANL]; Faseder, James [NON LANL]

    2008-01-01

    Interactions of molecules, such as signaling proteins, with multiple binding sites and/or multiple sites of post-translational covalent modification can be modeled using reaction rules. Rules comprehensively, but implicitly, define the individual chemical species and reactions that molecular interactions can potentially generate. Although rules can be automatically processed to define a biochemical reaction network, the network implied by a set of rules is often too large to generate completely or to simulate using conventional procedures. To address this problem, we present DYNSTOC, a general-purpose tool for simulating rule-based models. DYNSTOC implements a null-event algorithm for simulating chemical reactions in a homogenous reaction compartment. The simulation method does not require that a reaction network be specified explicitly in advance, but rather takes advantage of the availability of the reaction rules in a rule-based specification of a network to determine if a randomly selected set of molecular components participates in a reaction during a time step. DYNSTOC reads reaction rules written in the BioNetGen language which is useful for modeling protein-protein interactions involved in signal transduction. The method of DYNSTOC is closely related to that of STOCHSIM. DYNSTOC differs from STOCHSIM by allowing for model specification in terms of BNGL, which extends the range of protein complexes that can be considered in a model. DYNSTOC enables the simulation of rule-based models that cannot be simulated by conventional methods. We demonstrate the ability of DYNSTOC to simulate models accounting for multisite phosphorylation and multivalent binding processes that are characterized by large numbers of reactions. DYNSTOC is free for non-commercial use. The C source code, supporting documentation and example input files are available at .
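    The essence of a null-event algorithm can be sketched in a few lines: in each fixed time step, randomly chosen molecules either match a rule and react with some probability, or nothing happens (a "null event") and time simply advances. The toy A + B -> C system below is illustrative only; it is not DYNSTOC's implementation and does not use the BioNetGen language.

```python
import random

random.seed(0)

pool = ["A"] * 500 + ["B"] * 500            # individual molecules
dt, p_react, t, t_end = 0.01, 0.2, 0.0, 50.0

while t < t_end:
    i, j = random.sample(range(len(pool)), 2)     # pick two distinct molecules at random
    if {pool[i], pool[j]} == {"A", "B"} and random.random() < p_react:
        # the rule fires: consume A and B, produce C
        for k in sorted((i, j), reverse=True):
            pool.pop(k)
        pool.append("C")
    # otherwise this step is a null event
    t += dt                                        # time advances either way

print({species: pool.count(species) for species in set(pool)})
```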

  18. Thermodynamic Modeling of Organic-Inorganic Aerosols with the Group-Contribution Model AIOMFAC

    Science.gov (United States)

    Zuend, A.; Marcolli, C.; Luo, B. P.; Peter, T.

    2009-04-01

    Liquid aerosol particles are - from a physicochemical viewpoint - mixtures of inorganic salts, acids, water and a large variety of organic compounds (Rogge et al., 1993; Zhang et al., 2007). Molecular interactions between these aerosol components lead to deviations from ideal thermodynamic behavior. Strong non-ideality between organics and dissolved ions may influence the aerosol phases at equilibrium by means of liquid-liquid phase separations into a mainly polar (aqueous) and a less polar (organic) phase. A number of activity models exists to successfully describe the thermodynamic equilibrium of aqueous electrolyte solutions. However, the large number of different, often multi-functional, organic compounds in mixed organic-inorganic particles is a challenging problem for the development of thermodynamic models. The group-contribution concept as introduced in the UNIFAC model by Fredenslund et al. (1975), is a practical method to handle this difficulty and to add a certain predictability for unknown organic substances. We present the group-contribution model AIOMFAC (Aerosol Inorganic-Organic Mixtures Functional groups Activity Coefficients), which explicitly accounts for molecular interactions between solution constituents, both organic and inorganic, to calculate activities, chemical potentials and the total Gibbs energy of mixed systems (Zuend et al., 2008). This model enables the computation of vapor-liquid (VLE), liquid-liquid (LLE) and solid-liquid (SLE) equilibria within one framework. Focusing on atmospheric applications we considered eight different cations, five anions and a wide range of alcohols/polyols as organic compounds. With AIOMFAC, the activities of the components within an aqueous electrolyte solution are very well represented up to high ionic strength. We show that the semi-empirical middle-range parametrization of direct organic-inorganic interactions in alcohol-water-salt solutions enables accurate computations of vapor-liquid and liquid

  19. Validating modeled turbulent heat fluxes across large freshwater surfaces

    Science.gov (United States)

    Lofgren, B. M.; Fujisaki-Manome, A.; Gronewold, A.; Anderson, E. J.; Fitzpatrick, L.; Blanken, P.; Spence, C.; Lenters, J. D.; Xiao, C.; Charusambot, U.

    2017-12-01

    Turbulent fluxes of latent and sensible heat are important physical processes that influence the energy and water budgets of the Great Lakes. Validation and improvement of bulk flux algorithms to simulate these turbulent heat fluxes are critical for accurate prediction of hydrodynamics, water levels, weather, and climate over the region. Here we consider five heat flux algorithms from several model systems: the Finite-Volume Community Ocean Model, the Weather Research and Forecasting model, and the Large Lake Thermodynamics Model, which are used in research and operational environments and concentrate on different aspects of the Great Lakes' physical system, but interface at the lake surface. The heat flux algorithms were isolated from each model and driven by meteorological data from over-lake stations in the Great Lakes Evaporation Network. The simulation results were compared with eddy covariance flux measurements at the same stations. All models show the capacity to capture the seasonal cycle of the turbulent heat fluxes. Overall, the Coupled Ocean Atmosphere Response Experiment algorithm in FVCOM has the best agreement with eddy covariance measurements. Simulations with the other four algorithms are overall improved by updating the parameterization of roughness length scales of temperature and humidity. Agreement between modelled and observed fluxes notably varied with geographical locations of the stations. For example, at the Long Point station in Lake Erie, observed fluxes are likely influenced by the upwind land surface while the simulations do not take account of the land surface influence, and therefore the agreement is worse in general.
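    All of the compared algorithms are elaborations of the same bulk-aerodynamic form; a bare-bones version is sketched below with generic, constant transfer coefficients (the operational algorithms such as COARE compute stability-dependent coefficients and roughness lengths, which is precisely where they differ).

```python
# Bulk aerodynamic estimates of sensible and latent heat flux over a water surface.
# Constants are generic textbook values, not those used by FVCOM, WRF, or LLTM.
RHO_AIR = 1.2        # air density [kg m-3]
CP = 1004.0          # specific heat of air [J kg-1 K-1]
LV = 2.5e6           # latent heat of vaporisation [J kg-1]
CH = CE = 1.3e-3     # bulk transfer coefficients [-]

def sensible_heat_flux(wind_speed, t_surface, t_air):
    """H = rho * cp * C_H * U * (Ts - Ta), in W m-2."""
    return RHO_AIR * CP * CH * wind_speed * (t_surface - t_air)

def latent_heat_flux(wind_speed, q_surface, q_air):
    """LE = rho * Lv * C_E * U * (qs - qa), in W m-2 (q in kg/kg)."""
    return RHO_AIR * LV * CE * wind_speed * (q_surface - q_air)

# Example: late-autumn conditions with warm water under cold, dry air
print(sensible_heat_flux(8.0, 12.0, 7.0))
print(latent_heat_flux(8.0, 0.0088, 0.0050))
```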

  20. Pelagic functional group modeling: Progress, challenges and prospects

    Science.gov (United States)

    Hood, Raleigh R.; Laws, Edward A.; Armstrong, Robert A.; Bates, Nicholas R.; Brown, Christopher W.; Carlson, Craig A.; Chai, Fei; Doney, Scott C.; Falkowski, Paul G.; Feely, Richard A.; Friedrichs, Marjorie A. M.; Landry, Michael R.; Keith Moore, J.; Nelson, David M.; Richardson, Tammi L.; Salihoglu, Baris; Schartau, Markus; Toole, Dierdre A.; Wiggert, Jerry D.

    2006-03-01

    In this paper, we review the state of the art and major challenges in current efforts to incorporate biogeochemical functional groups into models that can be applied on basin-wide and global scales, with an emphasis on models that might ultimately be used to predict how biogeochemical cycles in the ocean will respond to global warming. We define the term "biogeochemical functional group" to refer to groups of organisms that mediate specific chemical reactions in the ocean. Thus, according to this definition, "functional groups" have no phylogenetic meaning—these are composed of many different species with common biogeochemical functions. Substantial progress has been made in the last decade toward quantifying the rates of these various functions and understanding the factors that control them. For some of these groups, we have developed fairly sophisticated models that incorporate this understanding, e.g. for diazotrophs (e.g. Trichodesmium), silica producers (diatoms) and calcifiers (e.g. coccolithophorids and specifically Emiliania huxleyi). However, current representations of nitrogen fixation and calcification are incomplete, i.e., based primarily upon models of Trichodesmium and E. huxleyi, respectively, and many important functional groups have not yet been considered in open-ocean biogeochemical models. Progress has been made over the last decade in efforts to simulate dimethylsulfide (DMS) production and cycling (i.e., by dinoflagellates and prymnesiophytes) and denitrification, but these efforts are still in their infancy, and many significant problems remain. One obvious gap is that virtually all functional group modeling efforts have focused on autotrophic microbes, while higher trophic levels have been completely ignored. It appears that in some cases (e.g., calcification), incorporating higher trophic levels may be essential not only for representing a particular biogeochemical reaction, but also for modeling export. Another serious problem is our

  1. Architectural Large Constructed Environment. Modeling and Interaction Using Dynamic Simulations

    Science.gov (United States)

    Fiamma, P.

    2011-09-01

    How can the simulation derived from a large-size data model be used for architectural design? The topic relates to the phase that usually follows the acquisition of the data, during the construction of the model and especially afterwards, when designers must interact with the simulation in order to develop and verify their idea. In the case study, the concept of interaction includes the concept of real-time "flows". The work develops contents and results that can be part of the broader debate about the current connection between "architecture" and "movement". The focus of the work is to realize a collaborative and participative virtual environment in which different specialist actors, the client and the final users can share knowledge, targets and constraints to better achieve the intended result. The goal is to use a dynamic micro-simulation digital resource that allows all the actors to explore the model in a powerful and realistic way and to have a new type of interaction in a complex architectural scenario. On the one hand, the work represents a base of knowledge that can be extended further; on the other hand, it represents an attempt to understand the simulation of large constructed architecture as a way of life, a way of being in time and space. The architectural design before, and the architectural fact after, both happen in a sort of "Spatial Analysis System". The way is open to offer to this "system" knowledge and theories that can support architectural design work for every application and scale. We think that the presented work represents an attempt to understand the simulation of large constructed architecture as a way of life, a way of being in time and space. Architecture is a spatial configuration that can also be reconfigured through designing.

  2. ARCHITECTURAL LARGE CONSTRUCTED ENVIRONMENT. MODELING AND INTERACTION USING DYNAMIC SIMULATIONS

    Directory of Open Access Journals (Sweden)

    P. Fiamma

    2012-09-01

    Full Text Available How can the simulation derived from a large-size data model be used for architectural design? The topic relates to the phase that usually follows the acquisition of the data, during the construction of the model and especially afterwards, when designers must interact with the simulation in order to develop and verify their idea. In the case study, the concept of interaction includes the concept of real-time "flows". The work develops contents and results that can be part of the broader debate about the current connection between "architecture" and "movement". The focus of the work is to realize a collaborative and participative virtual environment in which different specialist actors, the client and the final users can share knowledge, targets and constraints to better achieve the intended result. The goal is to use a dynamic micro-simulation digital resource that allows all the actors to explore the model in a powerful and realistic way and to have a new type of interaction in a complex architectural scenario. On the one hand, the work represents a base of knowledge that can be extended further; on the other hand, it represents an attempt to understand the simulation of large constructed architecture as a way of life, a way of being in time and space. The architectural design before, and the architectural fact after, both happen in a sort of "Spatial Analysis System". The way is open to offer to this "system" knowledge and theories that can support architectural design work for every application and scale. We think that the presented work represents an attempt to understand the simulation of large constructed architecture as a way of life, a way of being in time and space. Architecture is a spatial configuration that can also be reconfigured through designing.

  3. Discrete time duration models with group-level heterogeneity

    DEFF Research Database (Denmark)

    Frederiksen, Anders; Honoré, Bo; Hu, Loujia

    2007-01-01

    Dynamic discrete choice panel data models have received a great deal of attention. In those models, the dynamics is usually handled by including the lagged outcome as an explanatory variable. In this paper we consider an alternative model in which the dynamics is handled by using the duration in the current state as a covariate. We propose estimators that allow for group-specific effects in parametric and semiparametric versions of the model. The proposed method is illustrated by an empirical analysis of job durations allowing for firm-level effects.

  4. Group Elevator Peak Scheduling Based on Robust Optimization Model

    OpenAIRE

    ZHANG, J.; ZONG, Q.

    2013-01-01

    Scheduling of Elevator Group Control System (EGCS) is a typical combinatorial optimization problem. Uncertain group scheduling under peak traffic flows has become a research focus and difficulty recently. RO (Robust Optimization) method is a novel and effective way to deal with uncertain scheduling problem. In this paper, a peak scheduling method based on RO model for multi-elevator system is proposed. The method is immune to the uncertainty of peak traffic flows, optimal scheduling is re...

  5. Acquisition Integration Models: How Large Companies Successfully Integrate Startups

    Directory of Open Access Journals (Sweden)

    Peter Carbone

    2011-10-01

    Full Text Available Mergers and acquisitions (M&A) have been popular means for many companies to address the increasing pace and level of competition that they face. Large companies have pursued acquisitions to more quickly access technology, markets, and customers, and this approach has always been a viable exit strategy for startups. However, not all deals deliver the anticipated benefits, in large part due to poor integration of the acquired assets into the acquiring company. Integration can greatly impact the success of the acquisition and, indeed, the combined company’s overall market success. In this article, I explore the implementation of several integration models that have been put into place by a large company and extract principles that may assist negotiating parties with maximizing success. This perspective may also be of interest to smaller companies as they explore exit options while trying to ensure continued market success after acquisition. I assert that business success with acquisitions is dependent on an appropriate integration model, but that asset integration is not formulaic. Any integration effort must consider the specific market context and personnel involved.

  6. Hydrogen combustion modelling in large-scale geometries

    International Nuclear Information System (INIS)

    Studer, E.; Beccantini, A.; Kudriakov, S.; Velikorodny, A.

    2014-01-01

    Hydrogen risk mitigation based on catalytic recombiners cannot exclude the formation of flammable clouds during the course of a severe accident in a Nuclear Power Plant. Consequences of combustion processes have to be assessed based on existing knowledge and the state of the art in CFD combustion modelling. The Fukushima accidents have also revealed the need to take the hydrogen explosion phenomena into account in risk management. Thus, combustion modelling in a large-scale geometry is one of the remaining severe accident safety issues. At present, no combustion model exists which can accurately describe a combustion process inside a geometrical configuration typical of the Nuclear Power Plant (NPP) environment. Therefore, the major attention in model development has to be paid to the adoption of existing approaches or the creation of new ones capable of reliably predicting the possibility of flame acceleration in geometries of that type. A set of experiments performed previously in the RUT facility and the Heiss Dampf Reactor (HDR) facility is used as a validation database for the development of a three-dimensional gas dynamic model for the simulation of hydrogen-air-steam combustion in large-scale geometries. The combustion regimes include slow deflagration, fast deflagration, and detonation. Modelling is based on the Reactive Discrete Equation Method (RDEM), where the flame is represented as an interface separating reactants and combustion products. The transport of the progress variable is governed by different flame surface wrinkling factors. The results of numerical simulation are presented together with comparisons, critical discussions and conclusions. (authors)

  7. Effective models of new physics at the Large Hadron Collider

    International Nuclear Information System (INIS)

    Llodra-Perez, J.

    2011-07-01

    With the start of the Large Hadron Collider runs in 2010, particle physicists will soon be able to gain a better understanding of electroweak symmetry breaking. They might also answer many experimental and theoretical open questions raised by the Standard Model. Building on this favorable situation, we first present in this thesis a highly model-independent parametrization to characterize new physics effects on the production and decay mechanisms of the Higgs boson. This original tool will be easily and directly usable in the data analysis of CMS and ATLAS, the large general-purpose experiments at the LHC; indeed, it will help to significantly exclude or validate new theories beyond the Standard Model. In another approach, based on model building, we considered a scenario of new physics where the Standard Model fields can propagate in a flat six-dimensional space. The new spatial extra dimensions are compactified on a Real Projective Plane. This orbifold is the unique six-dimensional geometry which possesses chiral fermions and a natural Dark Matter candidate. The scalar photon, which is the lightest particle of the first Kaluza-Klein tier, is stabilized by a symmetry that is a relic of the six-dimensional Lorentz invariance. Using the current constraints from cosmological observations and our first analytical calculation, we derived a characteristic mass range around a few hundred GeV for the Kaluza-Klein scalar photon. The new states of our Universal Extra-Dimension model are therefore light enough to be produced with clear signatures at the Large Hadron Collider. We thus used a more sophisticated analysis of the particle mass spectrum and couplings, including one-loop radiative corrections, to establish our first predictions and constraints on the expected LHC phenomenology. (author)

  8. Ising model for collective decision making during group motion

    Science.gov (United States)

    Pinkoviezky, Itai; Gov, Nir; Couzin, Iain

    Collective decision making is a key feature during natural motion of animal groups and is also crucial for human groups. This phenomenon can be exemplified by the scenario of two subgroups that hold conflicting preferred directions of motion. The constraint of group cohesion drives the motion either towards a compromise or towards one of the preferred targets. The transition between compromise and decision has been found in simulations of flock models, but the nature of this transition is not well understood. We present a minimal spin model for this system where we interpret the spin-spin interaction as a social force. This model exhibits both first and second order transitions. The group motion changes from size-dependent diffusion at high temperatures to run-and-tumble motion below the critical temperature. In the presence of minority and majority subgroups, we find that there is a trade-off between the speed of reaching a target and the accuracy. We then compare the results of the spin model to detailed simulations of a flock model, and find overall very similar dynamics, with the role of the temperature taken by the inverse of the number of uninformed individuals.
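    A minimal Metropolis sketch of the kind of spin model described above (all parameters and group sizes are invented): each spin is an individual's choice between the two targets, informed minority and majority members feel opposite external fields, uninformed individuals feel none, and a fully connected coupling plays the role of the social cohesion force.

```python
import numpy as np

rng = np.random.default_rng(4)

N, J, T, steps = 100, 1.0, 0.8, 20000
field = np.zeros(N)
field[:15] = +1.0              # informed minority preferring target +1
field[15:40] = -1.0            # informed majority preferring target -1
s = rng.choice([-1, 1], size=N)

for _ in range(steps):
    i = rng.integers(N)
    # energy change for flipping spin i in a fully connected Ising model with local fields
    dE = 2 * s[i] * (J * (s.sum() - s[i]) / (N - 1) + field[i])
    if dE <= 0 or rng.random() < np.exp(-dE / T):
        s[i] = -s[i]           # Metropolis acceptance

print("group consensus (mean spin):", s.mean())
```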

  9. Multilevel Modeling of Individual and Group Level Mediated Effects.

    Science.gov (United States)

    Krull, J L; MacKinnon, D P

    2001-04-01

    This article combines procedures for single-level mediational analysis with multilevel modeling techniques in order to appropriately test mediational effects in clustered data. A simulation study compared the performance of these multilevel mediational models with that of single-level mediational models in clustered data with individual- or group-level initial independent variables, individual- or group-level mediators, and individual level outcomes. The standard errors of mediated effects from the multilevel solution were generally accurate, while those from the single-level procedure were downwardly biased, often by 20% or more. The multilevel advantage was greatest in those situations involving group-level variables, larger group sizes, and higher intraclass correlations in mediator and outcome variables. Multilevel mediational modeling methods were also applied to data from a preventive intervention designed to reduce intentions to use steroids among players on high school football teams. This example illustrates differences between single-level and multilevel mediational modeling in real-world clustered data and shows how the multilevel technique may lead to more accurate results.
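    A compact illustration of the multilevel mediation idea using statsmodels mixed models, with simulated clustered data standing in for the football-team example: path a (X -> M) and path b (M -> Y, controlling for X) are each fit with a random intercept per team, and the mediated effect is the product a*b. Standard errors for a*b (delta method or bootstrap) are omitted for brevity.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)

# Simulated clustered data: 40 teams of 25, a group-level treatment x,
# an individual-level mediator m and outcome y (all values invented).
teams = np.repeat(np.arange(40), 25)
x = np.repeat(rng.integers(0, 2, 40), 25).astype(float)
u_m = rng.normal(0.0, 0.5, 40)[teams]          # team-level random effects
u_y = rng.normal(0.0, 0.5, 40)[teams]
m = 0.5 * x + u_m + rng.normal(size=teams.size)
y = 0.4 * m + 0.1 * x + u_y + rng.normal(size=teams.size)
df = pd.DataFrame({"team": teams, "x": x, "m": m, "y": y})

# Path a (X -> M) and path b (M -> Y given X), each with a random team intercept
fit_a = smf.mixedlm("m ~ x", df, groups=df["team"]).fit()
fit_b = smf.mixedlm("y ~ m + x", df, groups=df["team"]).fit()
a, b = fit_a.params["x"], fit_b.params["m"]
print("indirect (mediated) effect a*b =", round(a * b, 3))
```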

  10. Report of cases of and taxonomic considerations for large-colony-forming Lancefield group C streptococcal bacteremia.

    OpenAIRE

    Carmeli, Y; Ruoff, K L

    1995-01-01

    Traditionally, group C streptococci include four species: Streptococcus equisimilis, S. zooepidemicus, S. equi, and S. dysgalactiae, the first three of which are group C beta-hemolytic streptococci (GCBHS). However, many of the beta-hemolytic streptococci carrying Lancefield group C antigen isolated from clinical specimens are S. milleri. These organisms can be differentiated by colony size. We retrospectively collected data concerning large-colony-forming GCBHS bacteremia that occurred durin...

  11. Distribution of ABO blood groups and rhesus factor in a Large Scale ...

    African Journals Online (AJOL)

    J. Torabizade maatoghi

    2015-08-20

    Aug 20, 2015 ... other blood transfusion dependent disorders in this province. Aim of the study: Due to the presence of various ethnic groups in Khuzestan province, several types of blood components are required. Knowing the distribution of blood groups in different blood collection centers and tribes is vital for proper ...

  12. Interactive modeling, design and analysis of large spacecraft

    Science.gov (United States)

    Garrett, L. B.

    1982-01-01

    An efficient computer-aided design and analysis capability applicable to large space structures was developed to relieve the engineer of much of the effort required in the past. The automated capabilities can be used to rapidly synthesize, evaluate, and determine performance characteristics and costs for future large spacecraft concepts. The Interactive Design and Evaluation of Advanced Spacecraft (IDEAS) program is used to illustrate the power, efficiency, and versatility of the approach. The coupling of space environment modeling algorithms with simplified analysis and design modules in the IDEAS program permits rapid evaluation of competing spacecraft and mission designs. The approach is particularly useful in the conceptual design phase of advanced space missions, when a multiplicity of concepts must be considered before a limited set can be selected for more detailed analysis. Integrated spacecraft systems-level data and data files are generated for subsystem and mission reexamination and/or refinement and for more rigorous analyses.

  13. Dimensional reduction of Markov state models from renormalization group theory

    Science.gov (United States)

    Orioli, S.; Faccioli, P.

    2016-09-01

    Renormalization Group (RG) theory provides the theoretical framework to define rigorous effective theories, i.e., systematic low-resolution approximations of arbitrary microscopic models. Markov state models are shown to be rigorous effective theories for Molecular Dynamics (MD). Based on this fact, we use real space RG to vary the resolution of the stochastic model and define an algorithm for clustering microstates into macrostates. The result is a lower dimensional stochastic model which, by construction, provides the optimal coarse-grained Markovian representation of the system's relaxation kinetics. To illustrate and validate our theory, we analyze a number of test systems of increasing complexity, ranging from synthetic toy models to two realistic applications, built from all-atom MD simulations. The computational cost of computing the low-dimensional model remains affordable on a desktop computer even for thousands of microstates.
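    A toy version of the microstate-to-macrostate clustering (not the paper's RG algorithm): estimate a Markov state model from a simulated discrete trajectory, then group microstates using the slowest non-trivial eigenvector of the transition matrix, a PCCA-like shortcut. The six-state system and its rates are invented.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy system: 6 microstates in two metastable basins {0,1,2} and {3,4,5};
# within-basin jumps are frequent, between-basin jumps rare (rates invented).
T_true = np.full((6, 6), 0.01)
for basin in ((0, 1, 2), (3, 4, 5)):
    for i in basin:
        for j in basin:
            T_true[i, j] = 0.3
T_true /= T_true.sum(axis=1, keepdims=True)

# Simulate a discrete "MD" trajectory and estimate the microstate MSM from counts
traj = [0]
for _ in range(20000):
    traj.append(rng.choice(6, p=T_true[traj[-1]]))
C = np.zeros((6, 6))
for a, b in zip(traj[:-1], traj[1:]):
    C[a, b] += 1
T_est = C / C.sum(axis=1, keepdims=True)

# Coarse-grain: split microstates by the slowest non-trivial right eigenvector
evals, evecs = np.linalg.eig(T_est)
slow = evecs[:, np.argsort(-evals.real)[1]].real
macro = (slow > np.median(slow)).astype(int)
print("microstate -> macrostate:", dict(enumerate(macro)))
```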

  14. Large animal models for vaccine development and testing.

    Science.gov (United States)

    Gerdts, Volker; Wilson, Heather L; Meurens, Francois; van Drunen Littel-van den Hurk, Sylvia; Wilson, Don; Walker, Stewart; Wheler, Colette; Townsend, Hugh; Potter, Andrew A

    2015-01-01

    The development of human vaccines continues to rely on the use of animals for research. Regulatory authorities require novel vaccine candidates to undergo preclinical assessment in animal models before being permitted to enter the clinical phase in human subjects. Substantial progress has been made in recent years in reducing and replacing the number of animals used for preclinical vaccine research through the use of bioinformatics and computational biology to design new vaccine candidates. However, the ultimate goal of a new vaccine is to instruct the immune system to elicit an effective immune response against the pathogen of interest, and no alternatives to live animal use currently exist for evaluation of this response. Studies identifying the mechanisms of immune protection; determining the optimal route and formulation of vaccines; establishing the duration and onset of immunity, as well as the safety and efficacy of new vaccines, must be performed in a living system. Importantly, no single animal model provides all the information required for advancing a new vaccine through the preclinical stage, and research over the last two decades has highlighted that large animals more accurately predict vaccine outcome in humans than do other models. Here we review the advantages and disadvantages of large animal models for human vaccine development and demonstrate that much of the success in bringing a new vaccine to market depends on choosing the most appropriate animal model for preclinical testing. © The Author 2015. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  15. Large Scale Computing for the Modelling of Whole Brain Connectivity

    DEFF Research Database (Denmark)

    Albers, Kristoffer Jon

    The human brain constitutes an impressive network formed by the structural and functional connectivity patterns between billions of neurons. Modern functional and diffusion magnetic resonance imaging (fMRI and dMRI) provides unprecedented opportunities for exploring the functional and structural organization of the brain in continuously increasing resolution. From these images, networks of structural and functional connectivity can be constructed. Bayesian stochastic block modelling provides a prominent data-driven approach for uncovering the latent organization, by clustering the networks into groups ... which allows us to couple and explore different models and sampling procedures in runtime, still being applied to full-sized data. Using the implemented tools, we demonstrate that the models successfully can be applied for clustering whole-brain connectivity networks. Without being informed of spatial...

  16. Working group report: Flavor physics and model building

    Indian Academy of Sciences (India)

    While activities in flavor physics have been mainly focused on -physics, those in model building have been primarily devoted to neutrino physics. We present summary of working group discussions carried out during the workshop in the above fields, and also briefly review the progress made in some projects subsequently ...

  17. A stochastic large deformation model for computational anatomy

    DEFF Research Database (Denmark)

    Arnaudon, Alexis; Holm, Darryl D.; Pai, Akshay Sadananda Uppinakudru

    2017-01-01

    In the study of shapes of human organs using computational anatomy, variations are found to arise from inter-subject anatomical differences, disease-specific effects, and measurement noise. This paper introduces a stochastic model for incorporating random variations into the Large Deformation Diffeomorphic Metric Mapping (LDDMM) framework. By accounting for randomness in a particular setup which is crafted to fit the geometrical properties of LDDMM, we formulate the template estimation problem for landmarks with noise and give two methods for efficiently estimating the parameters of the noise fields...

  18. The Cognitive Complexity in Modelling the Group Decision Process

    Directory of Open Access Journals (Sweden)

    Barna Iantovics

    2010-06-01

    Full Text Available The paper investigates, for some basic contextual factors (such as the problem complexity, the users' creativity and the problem space complexity), the cognitive complexity associated with modelling the group decision processes (GDP) in e-meetings. The analysis is done by conducting a socio-simulation experiment for an envisioned collaborative software tool that acts as a stigmergic environment for modelling the GDP. The simulation results reveal some interesting design guidelines for engineering contextual functionalities that minimize the cognitive complexity associated with modelling the GDP.

  19. Development and In silico Evaluation of Large-Scale Metabolite Identification Methods using Functional Group Detection for Metabolomics

    Directory of Open Access Journals (Sweden)

    Joshua M Mitchell

    2014-07-01

    Full Text Available Large-scale identification of metabolites is key to elucidating and modeling metabolism at the systems level. Advances in metabolomics technologies, particularly ultra-high resolution mass spectrometry, enable comprehensive and rapid analysis of metabolites. However, a significant barrier to meaningful data interpretation is the identification of a wide range of metabolites, including unknowns, and the determination of their role(s) in various metabolic networks. Chemoselective (CS) probes to tag metabolite functional groups combined with high mass accuracy provide additional structural constraints for metabolite identification and quantification. We have developed a novel algorithm, Chemically Aware Substructure Search (CASS), that efficiently detects functional groups within existing metabolite databases, allowing for combined molecular formula and functional group (from CS tagging) queries to aid in metabolite identification without a priori knowledge. Analysis of the isomeric compounds in both the Human Metabolome Database (HMDB) and KEGG Ligand demonstrated a high percentage of isomeric molecular formulae (43% and 28%, respectively), indicating the necessity for techniques such as CS-tagging. Furthermore, these two databases have only moderate overlap in molecular formulae. Thus, it is prudent to use multiple databases in metabolite assignment, since each major metabolite database represents different portions of metabolism within the biosphere. In silico analysis of various CS-tagging strategies under different conditions for adduct formation demonstrates that combined FT-MS derived molecular formulae and CS-tagging can uniquely identify up to 71% of KEGG and 37% of the combined KEGG/HMDB database versus 41% and 17%, respectively, without adduct formation. This difference in database isomer disambiguation highlights the strength of CS-tagging for non-lipid metabolite identification. However, unique identification of complex lipids still needs
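    As a rough stand-in for the kind of functional-group detection CASS performs on database entries (this is not the CASS algorithm, and it assumes RDKit is available), one can match SMARTS patterns for a few chemoselectively taggable groups against candidate metabolite structures:

```python
from rdkit import Chem

# A few illustrative functional-group SMARTS patterns (not CASS's actual rule set)
PATTERNS = {
    "carboxylic acid": Chem.MolFromSmarts("C(=O)[OX2H1]"),
    "primary amine": Chem.MolFromSmarts("[NX3;H2][#6]"),
    "hydroxyl": Chem.MolFromSmarts("[OX2H]"),
}

# Hypothetical candidate metabolites given as SMILES strings
metabolites = {
    "alanine": "CC(N)C(=O)O",
    "glucose": "OCC1OC(O)C(O)C(O)C1O",
}

for name, smiles in metabolites.items():
    mol = Chem.MolFromSmiles(smiles)
    groups = [g for g, patt in PATTERNS.items() if mol.HasSubstructMatch(patt)]
    print(name, "->", groups)
```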

  20. Soil carbon management in large-scale Earth system modelling

    DEFF Research Database (Denmark)

    Olin, S.; Lindeskog, M.; Pugh, T. A. M.

    2015-01-01

    Croplands are vital ecosystems for human well-being and provide important ecosystem services such as crop yields, retention of nitrogen and carbon storage. On large (regional to global)-scale levels, assessment of how these different services will vary in space and time, especially in response......, carbon sequestration and nitrogen leaching from croplands are evaluated and discussed. Compared to the version of LPJ-GUESS that does not include land-use dynamics, estimates of soil carbon stocks and nitrogen leaching from terrestrial to aquatic ecosystems were improved. Our model experiments allow us...... modelling C–N interactions in agricultural ecosystems under future environmental change and the effects these have on terrestrial biogeochemical cycles....

  1. Large-Signal DG-MOSFET Modelling for RFID Rectification

    Directory of Open Access Journals (Sweden)

    R. Rodríguez

    2016-01-01

    Full Text Available This paper analyses the undoped DG-MOSFETs capability for the operation of rectifiers for RFIDs and Wireless Power Transmission (WPT at microwave frequencies. For this purpose, a large-signal compact model has been developed and implemented in Verilog-A. The model has been numerically validated with a device simulator (Sentaurus. It is found that the number of stages to achieve the optimal rectifier performance is inferior to that required with conventional MOSFETs. In addition, the DC output voltage could be incremented with the use of appropriate mid-gap metals for the gate, as TiN. Minor impact of short channel effects (SCEs on rectification is also pointed out.

  2. Design and modelling of innovative machinery systems for large ships

    DEFF Research Database (Denmark)

    Larsen, Ulrik

    ... recovery (WHR) systems. Studies of alternative WHR systems in other applications suggest that the Kalina cycle and the organic Rankine cycle (ORC) can provide significant advantages over the steam Rankine cycle, which is currently used for marine WHR. This thesis aims at creating a better understanding of the Kalina cycle and the ORC in the application on board large ships; the thermodynamic performances of the mentioned power cycles are compared. Recommendations of suitable system layouts and working fluids for the marine applications are provided along with methodologies useful for the design ... consisting of a two-zone combustion and NOx emission model, a double Wiebe heat release model, the Redlich-Kwong equation of state and the Woschni heat loss correlation. A novel methodology is presented and used to determine the optimum organic Rankine cycle process layout, working fluid and process...

  3. Using Facebook Groups to Encourage Science Discussions in a Large-Enrollment Biology Class

    Science.gov (United States)

    Pai, Aditi; McGinnis, Gene; Bryant, Dana; Cole, Megan; Kovacs, Jennifer; Stovall, Kyndra; Lee, Mark

    2017-01-01

    This case study reports the instructional development, impact, and lessons learned regarding the use of Facebook as an educational tool within a large enrollment Biology class at Spelman College (Atlanta, GA). We describe the use of this social networking site to (a) engage students in active scientific discussions, (b) build community within the…

  4. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of the large-scale environments is therefore imperative for the success of such applications since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments still remains a time-consuming and manual work. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic building identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which unlike existing techniques, can recover missing or occluded texture information by integrating multiple information captured from different optical sensors (ground, aerial, and satellite).

  5. Topic modeling for cluster analysis of large biological and medical datasets.

    Science.gov (United States)

    Zhao, Weizhong; Zou, Wen; Chen, James J

    2014-01-01

    The big data moniker is nowhere better deserved than to describe the ever-increasing prodigiousness and complexity of biological and medical datasets. New methods are needed to generate and test hypotheses, foster biological interpretation, and build validated predictors. Although multivariate techniques such as cluster analysis may allow researchers to identify groups, or clusters, of related variables, the accuracies and effectiveness of traditional clustering methods diminish for large and hyper-dimensional datasets. Topic modeling is an active research field in machine learning and has been mainly used as an analytical tool to structure large textual corpora for data mining. Its ability to reduce high dimensionality to a small number of latent variables makes it suitable as a means for clustering or overcoming clustering difficulties in large biological and medical datasets. In this study, three topic model-derived clustering methods, highest probable topic assignment, feature selection and feature extraction, are proposed and tested on the cluster analysis of three large datasets: a Salmonella pulsed-field gel electrophoresis (PFGE) dataset, a lung cancer dataset, and a breast cancer dataset, which represent various types of large biological or medical datasets. All three methods are shown to improve the effectiveness of clustering results on the three datasets in comparison to traditional methods. A preferable cluster analysis method emerged for each of the three datasets on the basis of replicating known biological truths. Topic modeling could be advantageously applied to the large datasets of biological or medical research. The three proposed topic model-derived clustering methods, highest probable topic assignment, feature selection and feature extraction, yield clustering improvements for the three different data types. Clusters more efficaciously represent truthful groupings and subgroupings in the data than traditional methods, suggesting
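    The first of the three proposed methods, "highest probable topic assignment", can be sketched with scikit-learn's LDA implementation: fit a topic model to a count matrix and use each sample's most probable topic as its cluster label. The count data below are simulated and are not the Salmonella PFGE or cancer datasets from the study.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(7)

# Hypothetical count matrix: 60 samples drawn from two different multinomial profiles
profile_a = rng.dirichlet(np.ones(50) * 0.3)
profile_b = rng.dirichlet(np.ones(50) * 0.3)
X = np.vstack([rng.multinomial(200, profile_a, size=30),
               rng.multinomial(200, profile_b, size=30)])

# Highest probable topic assignment: each sample's dominant topic is its cluster label
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topic = lda.fit_transform(X)
labels = doc_topic.argmax(axis=1)
print(labels)
```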

  6. Geometric algorithms for electromagnetic modeling of large scale structures

    Science.gov (United States)

    Pingenot, James

    With the rapid increase in the speed and complexity of integrated circuit designs, 3D full wave and time domain simulation of chip, package, and board systems becomes more and more important for the engineering of modern designs. Much effort has been applied to the problem of electromagnetic (EM) simulation of such systems in recent years. Major advances in boundary element EM simulations have led to O(n log n) simulations using iterative methods and advanced Fast Fourier Transform (FFT), Multi-Level Fast Multi-pole Methods (MLFMM), and low-rank matrix compression techniques. These advances have been augmented with an explosion of multi-core and distributed computing technologies; however, realization of the full scale of these capabilities has been hindered by cumbersome and inefficient geometric processing. Anecdotal evidence from industry suggests that users may spend around 80% of turn-around time manipulating the geometric model and mesh. This dissertation addresses this problem by developing fast and efficient data structures and algorithms for 3D modeling of chips, packages, and boards. The methods proposed here harness the regular, layered 2D nature of the models (often referred to as "2.5D") to optimize these systems for large geometries. First, an architecture is developed for efficient storage and manipulation of 2.5D models. The architecture gives special attention to native representation of structures across various input models and special issues particular to 3D modeling. The 2.5D structure is then used to optimize the mesh systems. First, circuit/EM co-simulation techniques are extended to provide electrical connectivity between objects. This concept is used to connect independently meshed layers, allowing simple and efficient 2D mesh algorithms to be used in creating a 3D mesh. Here, adaptive meshing is used to ensure that the mesh accurately models the physical unknowns (current and charge). Utilizing the regularized nature of 2.5D objects and

  7. Modeling Resource Utilization of a Large Data Acquisition System

    CERN Document Server

    AUTHOR|(SzGeCERN)756497; The ATLAS collaboration; Garcia Garcia, Pedro Javier; Vandelli, Wainer; Froening, Holger

    2017-01-01

    The ATLAS 'Phase-II' upgrade, scheduled to start in 2024, will significantly change the requirements under which the data-acquisition system operates. The input data rate, currently fixed around 150 GB/s, is anticipated to reach 5 TB/s. In order to deal with the challenging conditions, and exploit the capabilities of newer technologies, a number of architectural changes are under consideration. Of particular interest is a new component, known as the Storage Handler, which will provide a large buffer area decoupling real-time data taking from event filtering. Dynamic operational models of the upgraded system can be used to identify the required resources and to select optimal techniques. In order to achieve a robust and dependable model, the current data-acquisition architecture has been used as a test case. This makes it possible to verify and calibrate the model against real operation data. Such a model can then be evolved toward the future ATLAS Phase-II architecture. In this paper we introduce the current ...
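    The buffering role of the Storage Handler can be illustrated with a toy, time-stepped occupancy model; all rates and duty cycles below are invented for illustration and are not ATLAS parameters or the model described in the paper.

```python
# Toy sketch of a buffer decoupling a bursty data-taking input rate from a slower,
# steady event-filter drain rate; the simulation tracks the buffer space required.
def simulate_buffer(hours=12.0, dt=1.0, fill_rate_tb_s=5.0, duty_cycle=0.6,
                    drain_rate_tb_s=3.2):
    """Time-stepped buffer occupancy; dt in seconds, rates in TB/s."""
    occupancy, peak = 0.0, 0.0
    steps = int(hours * 3600 / dt)
    for step in range(steps):
        taking_data = (step * dt % 3600) < duty_cycle * 3600   # beam on for part of each hour
        inflow = fill_rate_tb_s * dt if taking_data else 0.0
        outflow = min(occupancy + inflow, drain_rate_tb_s * dt)
        occupancy = occupancy + inflow - outflow
        peak = max(peak, occupancy)
    return peak

print(f"peak buffer occupancy ~ {simulate_buffer():.0f} TB")
```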

  8. Modelling Resource Utilization of a Large Data Acquisition System

    CERN Document Server

    Santos, Alejandro; The ATLAS collaboration

    2017-01-01

    The ATLAS 'Phase-II' upgrade, scheduled to start in 2024, will significantly change the requirements under which the data-acquisition system operates. The input data rate, currently fixed around 150 GB/s, is anticipated to reach 5 TB/s. In order to deal with the challenging conditions, and exploit the capabilities of newer technologies, a number of architectural changes are under consideration. Of particular interest is a new component, known as the Storage Handler, which will provide a large buffer area decoupling real-time data taking from event filtering. Dynamic operational models of the upgraded system can be used to identify the required resources and to select optimal techniques. In order to achieve a robust and dependable model, the current data-acquisition architecture has been used as a test case. This makes it possible to verify and calibrate the model against real operation data. Such a model can then be evolved toward the future ATLAS Phase-II architecture. In this paper we introduce the current ...

  9. Graphs of groups on surfaces interactions and models

    CERN Document Server

    White, AT

    2001-01-01

    The book, suitable both as an introductory reference and as a textbook in the rapidly growing field of topological graph theory, models both maps (as in map-coloring problems) and groups by means of graph imbeddings on surfaces. Automorphism groups of both graphs and maps are studied. In addition, connections are made to other areas of mathematics, such as hypergraphs, block designs, finite geometries, and finite fields. There are chapters on the emerging subfields of enumerative topological graph theory and random topological graph theory, as well as a chapter on the composition of English ...

  10. Distribution of ABO blood groups and rhesus factor in a Large Scale ...

    African Journals Online (AJOL)

    Background: The demand for blood and blood products has increased due to advances in medical science, population growth and increased life expectancy. This has increased the need for various blood groups in Khuzestan province because of the higher incidence of thalassemia and other blood transfusion dependent ...

  11. Small groups, large profits: Calculating interest rates in community-managed microfinance

    DEFF Research Database (Denmark)

    Rasmussen, Ole Dahl

    2012-01-01

    ... it is impossible to compare returns in savings groups with returns elsewhere. Moreover, the interest on savings is incomparable to the interest rate on loans. I argue for the use of a standardized comparable metric and suggest easy ways to implement it. Development of new tools and standards along these lines...

  12. Compact groups in theory and practice - IV. The connection to large-scale structure

    Science.gov (United States)

    Mendel, J. Trevor; Ellison, Sara L.; Simard, Luc; Patton, David R.; McConnachie, Alan W.

    2011-12-01

    We investigate the properties of photometrically selected compact groups (CGs) in the Sloan Digital Sky Survey. In this paper, the fourth in a series, we focus on understanding the characteristics of our observed CG sample with particular attention paid to quantifying and removing contamination from projected foreground or background galaxies. Based on a simple comparison of pairwise redshift likelihoods, we find that approximately half of CGs in the parent sample contain one or more projected (interloping) members; our final clean sample contains 4566 galaxies in 1086 CGs. We show that half of the remaining CGs are associated with rich groups (or clusters), i.e. they are embedded sub-structure. The other half have spatial distributions and number-density profiles consistent with the interpretation that they are either independently distributed structures within the field (i.e. they are isolated) or associated with relatively poor structures. Comparisons of late-type and red-sequence fractions in radial annuli show that galaxies around apparently isolated CGs resemble the field population by 300 to 500 kpc from the group centre. In contrast, the galaxy population surrounding embedded CGs appears to remain distinct from the field out beyond 1 to 2 Mpc, consistent with results for rich groups. We take this as additional evidence that the observed distinction between CGs, i.e. isolated versus embedded, is a separation between different host environments.

  13. Bismut's way of the Malliavin calculus for large order generators on a Lie group

    Science.gov (United States)

    Léandre, Rémi

    2018-01-01

    We adapt Bismut's mechanism of the Malliavin calculus to right-invariant, large-order generators on a Lie group. We make deep use of the symmetry in order to avoid the use of the Malliavin matrix. As an application, we deduce logarithmic estimates in small time for the heat kernel.

  14. Correlates of sedentary time in different age groups: results from a large cross sectional Dutch survey

    NARCIS (Netherlands)

    Bernaards, C.; Hildebrandt, V.H.; Hendriksen, I.J.

    2016-01-01

    Background. Evidence shows that prolonged sitting is associated with an increased risk of mortality, independent of physical activity (PA). The aim of the study was to identify correlates of sedentary time (ST) in different age groups and day types (i.e. school-/work day versus non-school-/non-work day).

  15. Data Mining Projects, Discoveries and Statistics in Large Astronomical Archives: The Astrostatistics Group of the Spanish Virtual Observatory

    Science.gov (United States)

    Sarro, L. M.; Torres, M. García; López, M.; Berihuete, A.; Márquez, M. J.; Sedano, F. García

    Part of the work carried out by the Spanish Virtual Observatory (SVO) is the development and testing of techniques for the discovery of knowledge from large astronomical databases. The Virtual Observatory (VO) technology provides the astronomical community with archives containing large amounts of information which, analyzed with the proper tools, can lead to new scientific discoveries. In the SVO Astrostatistics Group we work on the application of techniques from the fields of statistics and artificial intelligence to large astronomical databases. In this paper we present some examples.

  16. Modeling containment of large wildfires using generalized linear mixed-model analysis

    Science.gov (United States)

    Mark Finney; Isaac C. Grenfell; Charles W. McHugh

    2009-01-01

    Billions of dollars are spent annually in the United States to contain large wildland fires, but the factors contributing to suppression success remain poorly understood. We used a regression model (generalized linear mixed-model) to model containment probability of individual fires, assuming that containment was a repeated-measures problem (fixed effect) and...
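    As an illustration of the modelling idea only, a simplified sketch follows, using invented covariate names and a plain logistic GLM; the study itself fits a generalized linear mixed model, and the repeated-measures (random) effect is omitted here to keep the example self-contained.

```python
# Hedged sketch: containment of a fire on a given day as a binary outcome modelled with
# a logistic GLM on synthetic data; covariates and coefficients are made up.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "fire_size_ha": rng.lognormal(6, 1, n),          # hypothetical covariates
    "fuel_moisture": rng.uniform(5, 25, n),
    "crew_days": rng.poisson(20, n),
})
logit = -2 + 0.05 * df.fuel_moisture + 0.04 * df.crew_days - 0.0001 * df.fire_size_ha
df["contained"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.glm("contained ~ np.log(fire_size_ha) + fuel_moisture + crew_days",
                data=df, family=sm.families.Binomial()).fit()
print(model.summary())
```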

  17. Report of cases of and taxonomic considerations for large-colony-forming Lancefield group C streptococcal bacteremia.

    Science.gov (United States)

    Carmeli, Y; Ruoff, K L

    1995-08-01

    Traditionally, group C streptococci include four species: Streptococcus equisimilis, S. zooepidemicus, S. equi, and S. dysgalactiae, the first three of which are group C beta-hemolytic streptococci (GCBHS). However, many of the beta-hemolytic streptococci carrying Lancefield group C antigen isolated from clinical specimens are S. milleri. These organisms can be differentiated by colony size. We retrospectively collected data concerning large-colony-forming GCBHS bacteremia that occurred during a period of 8 years at the Massachusetts General Hospital. A total of 222 cases of beta-hemolytic streptococcal bacteremia were identified; data on the Lancefield grouping were available in 192 cases: 45 cases (23.6%) were group A, 96 cases (50%) were group B, 7 cases (3.6%) were group C (large colony forming), and 44 cases (22.9%) were group G. The medical records for cases of large-colony-forming GCBHS bacteremia were reviewed. In one case, the isolate was thought to be a contaminant; the other six cases are reported (five males and one female; mean age, 55 years). All patients had severe underlying conditions, and none had a history of exposure to animals. The clinical syndromes included two cases of cellulitis and one case each of endocarditis, myocardial infarction complicated by infection, pneumonia, and myofasciitis. The diagnoses for two patients with endovascular infections were delayed. Three of the six patients had fatal outcomes, and the other two, after prolonged hospitalization, were transferred to a long-term rehabilitation center. We concluded that the severe outcomes reflect delay in diagnosis and treatment as well as the severity of the underlying diseases. The taxonomy of GCBHS is discussed. More reports differentiating large- and small-colony-forming GCBHS are needed.

  18. The monster sporadic group and a theory underlying superstring models

    International Nuclear Information System (INIS)

    Chapline, G.

    1996-09-01

    The pattern of duality symmetries acting on the states of compactified superstring models reinforces an earlier suggestion that the Monster sporadic group is a hidden symmetry for superstring models. This in turn points to a supersymmetric theory of self-dual and anti-self-dual K3 manifolds joined by Dirac strings and evolving in a 13 dimensional spacetime as the fundamental theory. In addition to the usual graviton and dilaton this theory contains matter-like degrees of freedom resembling the massless states of the heterotic string, thus providing a completely geometric interpretation for ordinary matter. 25 refs

  19. Dynamics of a large extra dimension inspired hybrid inflation model

    International Nuclear Information System (INIS)

    Green, Anne M.; Mazumdar, Anupam

    2002-01-01

    In low scale quantum gravity scenarios the fundamental scale of nature can be as low as 1 TeV, in order to address the naturalness of the electroweak scale. A number of difficulties arise in constructing specific models: stabilization of the radius of the extra dimensions, avoidance of overproduction of Kaluza-Klein modes, achieving successful baryogenesis and production of a close to scale-invariant spectrum of density perturbations with the correct amplitude. We examine in detail the dynamics, including radion stabilization, of a hybrid inflation model that has been proposed in order to address these difficulties, where the inflaton is a gauge singlet residing in the bulk. We find that for a low fundamental scale the phase transition, which in standard four dimensional hybrid models usually ends inflation, is slow and there is a second phase of inflation lasting for a large number of e-foldings. The density perturbations on cosmologically interesting scales exit the Hubble radius during this second phase of inflation, and we find that their amplitude is far smaller than is required. We find that the duration of the second phase of inflation can be short, so that cosmologically interesting scales exit the Hubble radius prior to the phase transition, and the density perturbations have the correct amplitude, only if the fundamental scale takes an intermediate value. Finally we comment briefly on the implications of an intermediate fundamental scale for the production of primordial black holes and baryogenesis

  20. Improving large-scale groundwater models by considering fossil gradients

    Science.gov (United States)

    Schulz, Stephan; Walther, Marc; Michelsen, Nils; Rausch, Randolf; Dirks, Heiko; Al-Saud, Mohammed; Merz, Ralf; Kolditz, Olaf; Schüth, Christoph

    2017-05-01

    Due to limited availability of surface water, many arid to semi-arid countries rely on their groundwater resources. Despite the quasi-absence of present day replenishment, some of these groundwater bodies contain large amounts of water, which was recharged during pluvial periods of the Late Pleistocene to Early Holocene. These mostly fossil, non-renewable resources require different management schemes compared to those which are usually applied in renewable systems. Fossil groundwater is a finite resource and its withdrawal implies mining of aquifer storage reserves. Although they receive almost no recharge, some of them show notable hydraulic gradients and a flow towards their discharge areas, even without pumping. As a result, these systems have more discharge than recharge and hence are not in steady state, which makes their modelling, in particular the calibration, very challenging. In this study, we introduce a new calibration approach, composed of four steps: (i) estimating the fossil discharge component, (ii) determining the origin of fossil discharge, (iii) fitting the hydraulic conductivity with a pseudo steady-state model, and (iv) fitting the storage capacity with a transient model by reconstructing head drawdown induced by pumping activities. Finally, we test the relevance of our approach and evaluate the effect of considering or ignoring fossil gradients on aquifer parameterization for the Upper Mega Aquifer (UMA) on the Arabian Peninsula.

  1. A controlled trial of active versus passive learning strategies in a large group setting.

    Science.gov (United States)

    Haidet, Paul; Morgan, Robert O; O'Malley, Kimberly; Moran, Betty Jeanne; Richards, Boyd F

    2004-01-01

    To compare the effects of active and didactic teaching strategies on learning- and process-oriented outcomes. Controlled trial. After-hours residents' teaching session. Family and Community Medicine, Internal Medicine, and Pediatrics residents at two academic medical institutions. We randomly assigned residents to two groups. One group received a didactic lecture on effective use of diagnostic tests; during this session, the teacher spent a full hour delivering content. The other group received the same content in a session structured to foster resident-to-resident interactions. In the latter session, the teacher spent only 30 minutes directly delivering content to residents. We measured residents' knowledge about and attitudes toward the session content before, immediately after, and one month after each session. We measured residents' perceptions of engagement and session value immediately after each session. We employed blinded observers who used a structured instrument to observe residents' activities during each session. Both teaching methods led to improvements in residents' scores on both knowledge and attitude assessments. The amount of improvement was not statistically different between groups. Residents in the active learning session perceived themselves, and were observed to be, more engaged with the session content and each other than residents in the didactic session. Residents in the didactic session perceived greater educational value from the session compared to residents in the active session. We reduced the amount of time spent in teacher-driven content delivery by 50 percent and covered the same amount of content with no detrimental effects on knowledge acquisition or attitude enhancement. Teaching strategies that foster learner-to-learner interactions will lead to more active engagement among learners; however, these learners may value the session less. Further research is needed to explore learner perceptions of the teaching process and other ...

  2. Comparison of 12-step groups to mutual help alternatives for AUD in a large, national study: Differences in membership characteristics and group participation, cohesion, and satisfaction.

    Science.gov (United States)

    Zemore, Sarah E; Kaskutas, Lee Ann; Mericle, Amy; Hemberg, Jordana

    2017-02-01

    Many studies suggest that participation in 12-step groups contributes to better recovery outcomes, but people often object to such groups and most do not sustain regular involvement. Yet, research on alternatives to 12-step groups is very sparse. The present study aimed to extend the knowledge base on mutual help group alternatives for those with an alcohol use disorder (AUD), sampling from large, active, abstinence-focused groups including Women for Sobriety (WFS), LifeRing, and SMART Recovery (SMART). This paper presents a cross-sectional analysis of this longitudinal study, using baseline data to describe the profile and participation characteristics of attendees of these groups in comparison to 12-step members. Data from participants 18 and over with a lifetime AUD (N=651) were collected using Web-based surveys. Members of the 12-step alternatives were recruited in collaboration with group directors, who helped publicize the study by emailing meeting conveners and attendees and posting announcements on social media. A comparison group of current (past-30-day) 12-step attendees was recruited from an online meeting hub for recovering persons. Interested parties were directed to a Webpage where they were screened, and eligible participants completed an online survey assessing demographic and clinical variables; in-person and online mutual help involvement; and group satisfaction and cohesion. Analyses involved comparing those identifying WFS, SMART, and LifeRing as their primary group to 12-step members on the above characteristics. Compared to 12-step members, members of the mutual help alternatives were less religious and generally higher on education and income. WFS and LifeRing members were also older, more likely to be married, and lower on lifetime drug and psychiatric severity; meanwhile, LifeRing and SMART members were less likely to endorse the most stringent abstinence goal. Finally, despite lower levels of in-person meeting attendance, members of all ...

  3. A model for amalgamation in group decision making

    Science.gov (United States)

    Cutello, Vincenzo; Montero, Javier

    1992-01-01

    In this paper we present a generalization of the model proposed by Montero, by allowing non-complete fuzzy binary relations for individuals. A degree of unsatisfaction can be defined in this case, suggesting that any democratic aggregation rule should take into account not only ethical conditions or some degree of rationality in the amalgamating procedure, but also a minimum support for the set of alternatives subject to the group analysis.

  4. Study on dynamic multi-objective approach considering coal and water conflict in large scale coal group

    Science.gov (United States)

    Feng, Qing; Lu, Li

    2018-01-01

    In the process of coal mining, the destruction and pollution of groundwater have reached a critical point, and groundwater is not only related to the ecological environment but also affects human health. Similarly, the conflict between coal and water remains one of the major problems in large-scale coal mining regions. Based on this, this paper presents a dynamic multi-objective optimization model to deal with the conflict between coal and water in a coal group with multiple subordinate collieries and to arrive at a comprehensive arrangement that achieves an environmentally friendly coal mining strategy. Through calculation, this paper derives the output of each subordinate coal mine. On this basis, the environmental protection parameters are adjusted to compare coal production at different collieries at different stages under different governmental attitudes. Finally, the paper concludes that, in either case, priority should first be given to the production of low-drainage, high-yield coal mines.

  5. Comparing Indirect Effects in Different Groups in Single-Group and Multi-Group Structural Equation Models

    Directory of Open Access Journals (Sweden)

    Ehri Ryu

    2017-05-01

    In this article, we evaluated the performance of statistical methods in single-group and multi-group analysis approaches for testing group difference in indirect effects and for testing simple indirect effects in each group. We also investigated whether the performance of the methods in the single-group approach was affected when the assumption of equal variance was not satisfied. The assumption was critical for the performance of the two methods in the single-group analysis: the method using a product term for testing the group difference in a single path coefficient, and the Wald test for testing the group difference in the indirect effect. Bootstrap confidence intervals in the single-group approach and all methods in the multi-group approach were not affected by the violation of the assumption. We compared the performance of the methods and provided recommendations.
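    For the single-group approach, the bootstrap confidence interval for an indirect effect can be sketched as follows; this toy example uses plain OLS regressions rather than a full SEM, and all data and variable names are invented.

```python
# Minimal sketch of a percentile bootstrap CI for an indirect effect a*b (X -> M -> Y).
import numpy as np

rng = np.random.default_rng(2)
n = 200
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)          # mediator
y = 0.4 * m + 0.1 * x + rng.normal(size=n)

def indirect_effect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                         # slope of M on X
    X = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(X, y, rcond=None)[0][1]        # slope of Y on M, controlling for X
    return a * b

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)                        # resample cases with replacement
    boot.append(indirect_effect(x[idx], m[idx], y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect_effect(x, m, y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```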

  6. Large geospatial images discovery: metadata model and technological framework

    Directory of Open Access Journals (Sweden)

    Lukáš Brůha

    2015-12-01

    The advancements in geospatial web technology triggered efforts for disclosure of valuable resources of historical collections. This paper focuses on the role of spatial data infrastructures (SDI) in such efforts. The work describes the interplay between SDI technologies and potential use cases in libraries such as cartographic heritage. The metadata model is introduced to link up the sources from these two distinct fields. To enhance the data search capabilities, the work focuses on the representation of the content-based metadata of raster images, which is the crucial prerequisite to target the search in a more effective way. The architecture of the prototype system for automatic raster data processing, storage, analysis and distribution is introduced. The architecture responds to the characteristics of input datasets, namely to the continuous flow of very large raster data and related metadata. Proposed solutions are illustrated on the case study of cartometric analysis of digitised early maps and related metadata encoding.

  7. A large animal model for boron neutron capture therapy

    International Nuclear Information System (INIS)

    Gavin, P.R.; Kraft, S.L.; DeHaan, C.E.; Moore, M.P.; Griebenow, M.L.

    1992-01-01

    An epithermal neutron beam is needed to treat relatively deep-seated tumors. The scattering characteristics of neutrons in this energy range dictate that in vivo experiments be conducted in a large animal to prevent unacceptable total body irradiation. The canine species has proven an excellent model for evaluating the various problems of boron neutron capture utilizing an epithermal neutron beam. This paper discusses three major components of the authors' study: (1) the pharmacokinetics of borocaptate sodium (Na2B12H11SH, or BSH) in dogs with spontaneously occurring brain tumors, (2) the radiation tolerance of normal tissues in the dog using an epithermal beam alone and in combination with borocaptate sodium, and (3) initial treatment of dogs with spontaneously occurring brain tumors utilizing borocaptate sodium and an epithermal neutron beam.

  8. Modeling of large-scale oxy-fuel combustion processes

    DEFF Research Database (Denmark)

    Yin, Chungen

    2012-01-01

    Quite a few studies have been conducted in order to implement oxy-fuel combustion with flue gas recycle in conventional utility boilers as an effective approach to carbon capture and storage. However, combustion under oxy-fuel conditions is significantly different from conventional air-fuel firing ..., among which radiative heat transfer under oxy-fuel conditions is one of the fundamental issues. This paper demonstrates the non-gray-gas effects in modeling of large-scale oxy-fuel combustion processes. Oxy-fuel combustion of natural gas in a 609 MW utility boiler is numerically studied, in which ... calculation of the oxy-fuel WSGGM remarkably over-predicts the radiative heat transfer to the furnace walls and under-predicts the gas temperature at the furnace exit plane, which also results in a higher incomplete combustion in the gray calculation. Moreover, the gray and non-gray calculations of the same ...

  9. Parameterization of Fire Injection Height in Large Scale Transport Model

    Science.gov (United States)

    Paugam, R.; Wooster, M.; Atherton, J.; Val Martin, M.; Freitas, S.; Kaiser, J. W.; Schultz, M. G.

    2012-12-01

    The parameterization of fire injection height in global chemistry transport models is currently a subject of debate in the atmospheric community. The approach usually proposed in the literature is based on relationships linking injection height and remote sensing products like the Fire Radiative Power (FRP), which can measure active fire properties. In this work we present an approach based on the Plume Rise Model (PRM) developed by Freitas et al. (2007, 2010). This plume model is already used in different host models (e.g. WRF, BRAMS). In its original version, the fire is modeled by a convective heat flux (CHF; pre-defined by the land cover and evaluated as a fixed part of the total heat released) and a plume radius (derived from the GOES Wildfire-ABBA product) which defines the fire extension where the CHF is homogeneously distributed. Here, in our approach, the Freitas model is modified; in particular we added (i) an equation for mass conservation, (ii) a scheme to parameterize horizontal entrainment/detrainment, and (iii) a new initialization module which estimates the sensible heat released by the fire on the basis of measured FRP rather than fuel cover type. The FRP and Active Fire (AF) area necessary for the initialization of the model are directly derived from a modified version of the Dozier algorithm applied to the MOD14 product. An optimization (using the simulated annealing method) of this new version of the PRM is then proposed based on fire plume characteristics derived from the official MISR plume height project and atmospheric profiles extracted from the ECMWF analysis. The data set covers the main fire regions (Africa, Siberia, Indonesia, and North and South America) and is set up to (i) retain fires where plume height and FRP can be easily linked (i.e. avoiding large fire clusters where individual plumes might interact), and (ii) keep fires which show a decrease of FRP and AF area after the MISR overpass (i.e. to minimize the effect of the time period needed for the plume to ...

  10. Large scale solar district heating. Evaluation, modelling and designing

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.

    2000-07-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the tool in design studies and in a local energy planning case. The evaluation of the central solar heating technology is based on measurements on the case plant in Marstal, Denmark, and on published and unpublished data for other, mainly Danish, CSDHP plants. Evaluations of the thermal, economical and environmental performances are reported, based on the experiences from the last decade. The measurements from the Marstal case are analysed, experiences extracted and minor improvements to the plant design proposed. For the detailed designing and energy planning of CSDHPs, a computer simulation model is developed and validated on the measurements from the Marstal case. The final model is then generalised to a 'generic' model for CSDHPs in general. The meteorological reference data, the Danish Reference Year, is applied to find the mean performance for the plant designs. To find the expected variability of the thermal performance of such plants, a method is proposed in which data from a year with poor solar irradiation and a year with strong solar irradiation are applied. Equipped with a simulation tool, design studies are carried out, ranging from parameter analysis, through energy planning for a new settlement, to a proposal for the combination of plane solar collectors with high-performance solar collectors, exemplified by a trough solar collector. The methodology of utilising computer simulation proved to be a cheap and relevant tool in the design of future solar heating plants. The thesis also exposed the need to develop computer models for the more advanced solar collector designs and especially for the control and operation of CSDHPs. In the final chapter the CSDHP technology is put into perspective with respect to other possible technologies to find the relevance of the application.

  11. Multistability in Large Scale Models of Brain Activity.

    Directory of Open Access Journals (Sweden)

    Mathieu Golos

    2015-12-01

    Noise driven exploration of a brain network's dynamic repertoire has been hypothesized to be causally involved in cognitive function, aging and neurodegeneration. The dynamic repertoire crucially depends on the network's capacity to store patterns, as well as their stability. Here we systematically explore the capacity of networks derived from human connectomes to store attractor states, as well as various network mechanisms to control the brain's dynamic repertoire. Using a deterministic graded response Hopfield model with connectome-based interactions, we reconstruct the system's attractor space through a uniform sampling of the initial conditions. Large fixed-point attractor sets are obtained in the low temperature condition, with a bigger number of attractors than ever reported so far. Different variants of the initial model, including (i) a uniform activation threshold or (ii) a global negative feedback, produce a similarly robust multistability in a limited parameter range. A numerical analysis of the distribution of the attractors identifies spatially-segregated components, with a centro-medial core and several well-delineated regional patches. Those different modes share similarity with the fMRI independent components observed in the "resting state" condition. We demonstrate non-stationary behavior in noise-driven generalizations of the models, with different meta-stable attractors visited along the same time course. Only the model with a global dynamic density control is found to display robust and long-lasting non-stationarity with no tendency toward either overactivity or extinction. The best fit with empirical signals is observed at the edge of multistability, a parameter region that also corresponds to the highest entropy of the attractors.
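    The attractor-reconstruction procedure can be sketched as below; a random symmetric coupling matrix stands in for the human connectome, and all parameter values are illustrative only, not those used in the study.

```python
# Illustrative sketch: sample initial conditions uniformly, iterate the deterministic
# graded-response dynamics to a fixed point, and count the distinct attractors found.
import numpy as np

rng = np.random.default_rng(3)
n = 66                                          # e.g. number of cortical regions
W = rng.normal(0, 1 / np.sqrt(n), (n, n))
W = (W + W.T) / 2                               # symmetric couplings
np.fill_diagonal(W, 0)
beta = 3.0                                      # gain (inverse "temperature")

def run_to_fixed_point(x, tol=1e-9, max_iter=5000):
    for _ in range(max_iter):
        x_new = np.tanh(beta * (W @ x))
        if np.max(np.abs(x_new - x)) < tol:
            return x_new, True
        x = x_new
    return x, False

attractors = set()
for _ in range(500):                            # uniform sampling of initial conditions
    fp, converged = run_to_fixed_point(rng.uniform(-1, 1, n))
    if converged:
        attractors.add(tuple(np.round(fp, 3)))  # group numerically identical fixed points
print(f"{len(attractors)} distinct fixed-point attractors found")
```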

  12. Correlates of sedentary time in different age groups: results from a large cross sectional Dutch survey

    Directory of Open Access Journals (Sweden)

    Claire M. Bernaards

    2016-10-01

    Background: Evidence shows that prolonged sitting is associated with an increased risk of mortality, independent of physical activity (PA). The aim of the study was to identify correlates of sedentary time (ST) in different age groups and day types (i.e. school-/work day versus non-school-/non-work day). Methods: The study sample consisted of 1895 Dutch children (4–11 years), 1131 adolescents (12–17 years), 8003 adults (18–64 years) and 1569 elderly (65 years and older) who enrolled in the Dutch continuous national survey 'Injuries and Physical Activity in the Netherlands' between 2006 and 2011. Respondents estimated the number of sitting hours during a regular school-/workday and a regular non-school/non-work day. Multiple linear regression analyses on cross-sectional data were used to identify correlates of ST. Results: Significant positive associations with ST were observed for: higher age (4-to-17-year-olds and elderly), male gender (adults), overweight (children), higher education (adults ≥ 30 years), urban environment (adults), chronic disease (adults ≥ 30 years), sedentary work (adults), not meeting the moderate to vigorous PA (MVPA) guideline (children and adults ≥ 30 years) and not meeting the vigorous PA (VPA) guideline (4-to-17-year-olds). Correlates of ST that significantly differed between day types were working hours and meeting the VPA guideline. More working hours were associated with more ST on school-/work days. In children and adolescents, meeting the VPA guideline was associated with less ST on non-school/non-working days only. Conclusions: This study provides new insights into the correlates of ST in different age groups and thus possibilities for interventions in these groups. Correlates of ST appear to differ between age groups and to a lesser degree between day types. This implies that interventions to reduce ST should be age specific. Longitudinal studies are needed to draw conclusions on causality of the relationship between identified correlates and ST.

  13. Correlates of sedentary time in different age groups: results from a large cross sectional Dutch survey.

    Science.gov (United States)

    Bernaards, Claire M; Hildebrandt, Vincent H; Hendriksen, Ingrid J M

    2016-10-26

    Evidence shows that prolonged sitting is associated with an increased risk of mortality, independent of physical activity (PA). The aim of the study was to identify correlates of sedentary time (ST) in different age groups and day types (i.e. school-/work day versus non-school-/non-work day). The study sample consisted of 1895 Dutch children (4-11 years), 1131 adolescents (12-17 years), 8003 adults (18-64 years) and 1569 elderly (65 years and older) who enrolled in the Dutch continuous national survey 'Injuries and Physical Activity in the Netherlands' between 2006 and 2011. Respondents estimated the number of sitting hours during a regular school-/workday and a regular non-school/non-work day. Multiple linear regression analyses on cross-sectional data were used to identify correlates of ST. Significant positive associations with ST were observed for: higher age (4-to-17-year-olds and elderly), male gender (adults), overweight (children), higher education (adults ≥ 30 years), urban environment (adults), chronic disease (adults ≥ 30 years), sedentary work (adults), not meeting the moderate to vigorous PA (MVPA) guideline (children and adults ≥ 30 years) and not meeting the vigorous PA (VPA) guideline (4-to-17-year-olds). Correlates of ST that significantly differed between day types were working hours and meeting the VPA guideline. More working hours were associated with more ST on school-/work days. In children and adolescents, meeting the VPA guideline was associated with less ST on non-school/non-working days only. This study provides new insights in the correlates of ST in different age groups and thus possibilities for interventions in these groups. Correlates of ST appear to differ between age groups and to a lesser degree between day types. This implies that interventions to reduce ST should be age specific. Longitudinal studies are needed to draw conclusions on causality of the relationship between identified correlates and ST.

  14. Large-dimension configuration-interaction calculations of positron binding to the group-II atoms

    International Nuclear Information System (INIS)

    Bromley, M. W. J.; Mitroy, J.

    2006-01-01

    The configuration-interaction (CI) method is applied to the calculation of the structures of a number of positron-binding systems, including e⁺Be, e⁺Mg, e⁺Ca, and e⁺Sr. These calculations were carried out in orbital spaces containing about 200 electron and 200 positron orbitals up to l=12. Despite the very large dimensions, the binding energy and annihilation rate converge slowly with l, and the final values do contain an appreciable correction obtained by extrapolating the calculation to the l→∞ limit. The binding energies were 0.00317 hartree for e⁺Be, 0.0170 hartree for e⁺Mg, 0.0189 hartree for e⁺Ca, and 0.0131 hartree for e⁺Sr.
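    The l→∞ correction mentioned above is typically obtained by fitting the partial-wave increments to an inverse power form and summing the estimated tail; the sketch below shows one simple single-term variant of such an extrapolation with invented increment values, not the paper's data.

```python
# Hedged sketch of an l -> infinity extrapolation: fit increments Delta E_l to
# A / (l + 1/2)**p and add the estimated tail beyond the largest l in the calculation.
import numpy as np

l = np.arange(6, 13)                                   # highest partial waves used in the fit
dE = 2.1e-3 / (l + 0.5) ** 4                           # toy increments (hartree)

# Linear fit in log-log space gives the exponent p and prefactor A.
slope, logA = np.polyfit(np.log(l + 0.5), np.log(dE), 1)
A, p = np.exp(logA), -slope

tail_l = np.arange(l[-1] + 1, 2000)                    # numerically sum the extrapolated tail
tail = np.sum(A / (tail_l + 0.5) ** p)
print(f"fitted exponent p = {p:.2f}, estimated tail correction = {tail:.2e} hartree")
```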

  15. Use of New Methodologies for Students Assessment in Large Groups in Engineering Education

    Directory of Open Access Journals (Sweden)

    B. Tormos

    2014-03-01

    In this paper, a student evaluation methodology applying the concept of continuous assessment proposed by Bologna is presented for new degrees in higher education. An important part of the student's final grade is based on the performance of several individual works throughout the semester. The paper describes the correction system used, which is based on a spreadsheet with macros and a template in which the student provides the solution to each task. The use of this correction system, together with the available e-learning platform, allows teachers to perform automatic task evaluations, compatible with courses with large numbers of students. The paper also discusses the different solutions adopted to avoid plagiarism and to ensure that the final grade reflects, as closely as possible, the knowledge acquired by the students.
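    The correction workflow can be sketched roughly as follows; the file names, column names, tolerance, and grading rule are hypothetical, and pandas stands in for the spreadsheet macros described in the paper.

```python
# Hypothetical sketch of automatic correction: each student submits a template whose
# cells hold numeric answers; a script compares them to the reference solution within a
# tolerance and writes a grade sheet.
import glob
import pandas as pd

solution = pd.read_excel("solution_template.xlsx").set_index("task")   # columns: task, value
tolerance = 0.02                                                        # 2% relative error allowed

records = []
for path in glob.glob("submissions/*.xlsx"):
    answers = pd.read_excel(path).set_index("task").reindex(solution.index)
    rel_err = (answers["value"] - solution["value"]).abs() / solution["value"].abs()
    correct = rel_err <= tolerance                      # missing answers count as wrong
    records.append({"file": path, "grade": 10 * correct.mean()})

pd.DataFrame(records).to_excel("grades.xlsx", index=False)
```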

  16. Examining the Content of Head Start Teachers' Literacy Instruction within Two Activity Contexts during Large-Group Circle Time

    Science.gov (United States)

    Zhang, Chenyi; Diamond, Karen E.; Powell, Douglas R.

    2015-01-01

    Large-group circle time is an important component of many preschool classrooms' daily schedules. This study scrutinized the teaching content of Head Start teachers' literacy instruction (i.e., the types of literacy concept embedded within the instruction, lexical characteristics of teachers' talk, and elaborations on literacy knowledge) in two…

  17. Large scale inference in the Infinite Relational Model: Gibbs sampling is not enough

    DEFF Research Database (Denmark)

    Albers, Kristoffer Jon; Moth, Andreas Leon Aagard; Mørup, Morten

    2013-01-01

    The stochastic block-model and its non-parametric extension, the Infinite Relational Model (IRM), have become key tools for discovering group structure in complex networks. Identifying these groups is a combinatorial inference problem which is usually solved by Gibbs sampling. However, whether Gibbs sampling suffices and can be scaled to the modeling of large-scale, real-world complex networks has not been examined sufficiently. In this paper we evaluate the performance and mixing ability of Gibbs sampling in the Infinite Relational Model (IRM) by implementing a high-performance Gibbs sampler. We find that Gibbs sampling can be computationally scaled to handle millions of nodes and billions of links. Investigating the behavior of the Gibbs sampler for different sizes of networks, we find that the mixing ability decreases drastically with the network size, clearly indicating a need ...
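    A stripped-down sketch of the underlying idea follows: a blocked Gibbs sampler for a finite-K stochastic block-model (the parametric special case of the IRM) on a directed toy network. The high-performance sampler in the paper is a collapsed, non-parametric implementation; nothing below reflects that code.

```python
# Simplified Gibbs sampler for a directed finite-K stochastic block-model.
import numpy as np

rng = np.random.default_rng(4)
n, K = 60, 3
z_true = rng.integers(0, K, n)
eta_true = np.where(np.eye(K, dtype=bool), 0.6, 0.05)       # dense within, sparse between
A = (rng.random((n, n)) < eta_true[z_true][:, z_true]).astype(int)
np.fill_diagonal(A, 0)                                       # no self-links

z = rng.integers(0, K, n)                                    # random initial assignment
for sweep in range(200):
    # 1) sample eta | z, A from Beta posteriors of link/non-link counts per block pair
    eta = np.empty((K, K))
    for a in range(K):
        for b in range(K):
            mask = np.outer(z == a, z == b) & ~np.eye(n, dtype=bool)
            links = A[mask].sum()
            eta[a, b] = rng.beta(1 + links, 1 + mask.sum() - links)
    # 2) sample z_i | z_-i, eta, A for every node (uniform prior over blocks)
    for i in range(n):
        others = np.arange(n) != i
        logp = np.zeros(K)
        for k in range(K):
            p_out, p_in = eta[k, z[others]], eta[z[others], k]
            logp[k] = np.sum(A[i, others] * np.log(p_out) + (1 - A[i, others]) * np.log(1 - p_out)) \
                    + np.sum(A[others, i] * np.log(p_in) + (1 - A[others, i]) * np.log(1 - p_in))
        probs = np.exp(logp - logp.max()); probs /= probs.sum()
        z[i] = rng.choice(K, p=probs)
print("recovered block sizes:", np.bincount(z, minlength=K))
```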

  18. GIS for large-scale watershed observational data model

    Science.gov (United States)

    Patino-Gomez, Carlos

    Because integrated management of a river basin requires the development of models that are used for many purposes, e.g., to assess risks and possible mitigation of droughts and floods, manage water rights, assess water quality, and simply to understand the hydrology of the basin, the development of a relational database from which models can access the various data needed to describe the systems being modeled is fundamental. In order for this concept to be useful and widely applicable, however, it must have a standard design. The recently developed ArcHydro data model facilitates the organization of data according to the "basin" principle and allows access to hydrologic information by models. The development of a basin-scale relational database for the Rio Grande/Bravo basin implemented in a Geographic Information System is one of the contributions of this research. This geodatabase represents the first major attempt to establish a more complete understanding of the basin as a whole, including spatial and temporal information obtained from the United States of America and Mexico. Difficulties in processing raster datasets over large regions are studied in this research. One of the most important contributions is the application of a Raster-Network Regionalization technique, which utilizes raster-based analysis at the subregional scale in an efficient manner and combines the resulting subregional vector datasets into a regional database. Another important contribution of this research is focused on implementing a robust structure for handling huge temporal data sets related to monitoring points such as hydrometric and climatic stations, reservoir inlets and outlets, water rights, etc. For the Rio Grande study area, the ArcHydro format is applied to the historical information collected in order to include and relate these time series to the monitoring points in the geodatabase. Its standard time series format is changed to include a relationship to the agency from

  19. SCIMAP: Modelling Diffuse Pollution in Large River Basins

    Science.gov (United States)

    Milledge, D.; Heathwaite, L.; Lane, S. N.; Reaney, S. M.

    2009-12-01

    Polluted rivers are a problem for the plants and animals that require clean water to survive. Watershed-scale processes can influence instream aquatic ecosystems by delivering fine sediment, solutes and organic matter from diffuse sources. To improve our rivers we need to identify the pollution sources. Models can help us to do this, but they rarely address the extent to which risky land uses are hydrologically connected to the drainage network, and hence able to deliver to it. Those that do tend to apply a full hydrological scheme, which is unfeasible for large watersheds. Here we develop a risk-based modelling framework, SCIMAP, for diffuse pollution from agriculture (nitrate, phosphate and fine sediment). In each case the basis of the analysis is the joint consideration of the probability of a unit of land (25 m² cell) producing a particular environmental risk and of that risk then reaching the river. The components share a common treatment of hydrological connectivity but differ in their treatment of each pollution type. We test and apply SCIMAP using spatially distributed instream water quality data for some of the UK's largest catchments to infer the processes, and the associated process parameters, that matter in defining their concentrations. We use these to identify a series of risky field locations where this land use is readily connected to the river system by overland flow.
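    The joint-probability framing can be illustrated on a single one-dimensional flow path; the numbers are invented and this is not the SCIMAP code, just the risk-times-connectivity idea.

```python
# Conceptual sketch: delivered risk for each cell is its generation probability times the
# product of the transmission probabilities of all cells downstream of it.
import numpy as np

p_generate = np.array([0.9, 0.1, 0.7, 0.2, 0.05])    # e.g. risky land use on cells 0 and 2
p_connect  = np.array([0.8, 0.9, 0.3, 0.95, 1.0])    # cell 2 is a poorly connected (dry) cell

# probability that risk generated in cell i reaches the stream at the end of the path
delivery = np.array([np.prod(p_connect[i + 1:]) for i in range(len(p_connect))])
delivered_risk = p_generate * delivery

for i, r in enumerate(delivered_risk):
    print(f"cell {i}: generation {p_generate[i]:.2f} x delivery {delivery[i]:.2f} = {r:.3f}")
print(f"relative in-stream risk loading: {delivered_risk.sum():.3f}")
```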

  20. Exploring medical student learning in the large group teaching environment: examining current practice to inform curricular development.

    Science.gov (United States)

    Luscombe, Ciara; Montgomery, Julia

    2016-07-19

    Lectures continue to be an efficient and standardised way to deliver information to large groups of students. It has been well documented that students prefer interactive lectures, based on active learning principles, to didactic teaching in the large group setting. Despite this, it is often the case that many students do not engage with active learning tasks and attempts at interaction. By exploring student experiences, expectations and how they use lectures in their learning, we will provide recommendations for faculty to support student learning both in the lecture theatre and during personal study time. This research employed a hermeneutic phenomenological approach. Three focus groups, consisting of 19 students in total, were used to explore the experiences of second-year medical students in large group teaching sessions. Using generic thematic data analysis, these accounts have been developed into a meaningful account of experience. This study found there to be a well-established learning culture amongst students and, with it, expectations as to the format of teaching sessions. Furthermore, there were set perceptions about the student role within the learning environment which had many implications, including the way that innovative teaching methods were received. Student learning was perceived to take place outside the lecture theatre, with a large emphasis placed on creating resources that can be taken away to use in personal study time. Presented here is a constructive review of reasons for student participation, interaction and engagement in large group teaching sessions. Based on this are recommendations constructed with a view to aiding educators in engaging students within this setting. Short term, educators can implement strategies that capitalise on the established learning culture of students to encourage engagement with active learning strategies. Long term, it would be beneficial for educators to consider ways to shift the current student learning ...

  1. High Luminosity Large Hadron Collider A description for the European Strategy Preparatory Group

    CERN Document Server

    Rossi, L

    2012-01-01

    The Large Hadron Collider (LHC) is the largest scientific instrument ever built. It has been exploring the new energy frontier since 2009, gathering a global user community of 7,000 scientists. It will remain the most powerful accelerator in the world for at least two decades, and its full exploitation is the highest priority in the European Strategy for Particle Physics, adopted by the CERN Council and integrated into the ESFRI Roadmap. To extend its discovery potential, the LHC will need a major upgrade around 2020 to increase its luminosity (rate of collisions) by a factor of 10 beyond its design value. As a highly complex and optimized machine, such an upgrade of the LHC must be carefully studied and requires about 10 years to implement. The novel machine configuration, called High Luminosity LHC (HL-LHC), will rely on a number of key innovative technologies, representing exceptional technological challenges, such as cutting-edge 13 tesla superconducting magnets, very compact and ultra-precise superconduc...

  2. Modelling of heat transfer during torrefaction of large lignocellulosic biomass

    Science.gov (United States)

    Regmi, Bharat; Arku, Precious; Tasnim, Syeda Humaira; Mahmud, Shohel; Dutta, Animesh

    2018-02-01

    Preparation of feedstock is a major energy-intensive step in the thermochemical conversion of biomass into fuel. By eliminating the need to grind biomass prior to the torrefaction process, there would be a potential gain in the energy requirements as the entire step would be eliminated. With regard to the commercialization of torrefaction technology, this study has examined heat transfer inside large cylindrical biomass, both numerically and experimentally, during torrefaction. A numerical axisymmetric 2-D model for heat transfer during torrefaction at 270 °C for 1 h was created in COMSOL Multiphysics 5.1, considering heat generation evaluated from the experiment. The model analyzed the temperature distribution within the core and on the surface of the biomass during torrefaction for various sizes. The model results showed similarities with experimental results. The effect of the L/D ratio on the temperature distribution within the biomass was observed by varying length and diameter and compared with experiments in the literature to find an optimal range of cylindrical biomass size suitable for torrefaction. The research demonstrated that a cylindrical biomass sample of 50 mm length with an L/D ratio of 2 can be torrefied with a core-surface temperature difference of less than 30 °C. The research also demonstrated that sample length has a negligible effect on the core-surface temperature difference during torrefaction when the diameter is fixed at 25 mm. This information will help to design a torrefaction processing system and develop a value chain for biomass supply without using an energy-intensive grinding process.

  3. Use of the LQ model with large fraction sizes results in underestimation of isoeffect doses

    International Nuclear Information System (INIS)

    Sheu, Tommy; Molkentine, Jessica; Transtrum, Mark K.; Buchholz, Thomas A.; Withers, Hubert Rodney; Thames, Howard D.; Mason, Kathy A.

    2013-01-01

    Purpose: To test the appropriateness of the linear-quadratic (LQ) model to describe survival of jejunal crypt clonogens after split doses with variable (small 1–6 Gy, large 8–13 Gy) first dose, as a model of its appropriateness for both small and large fraction sizes. Methods: C3Hf/KamLaw mice were exposed to whole body irradiation using 300 kVp X-rays at a dose rate of 1.84 Gy/min, and the number of viable jejunal crypts was determined using the microcolony assay. A 14 Gy total dose was split into unequal first and second fractions separated by 4 h. Data were analyzed using the LQ model, the lethal potentially lethal (LPL) model, and a repair-saturation (RS) model. Results: Cell kill was greater in the group receiving the larger fraction first, creating an asymmetry in the plot of survival vs size of first dose, as opposed to the symmetric response predicted by the LQ model. There was a significant difference in the estimated βs (higher β after larger first doses), but no significant difference in the αs, when large doses were given first vs small doses first. This difference results in underestimation (based on the present data, by approximately 8%) of isoeffect doses using LQ model parameters based on small fraction sizes. While the LPL model also predicted a symmetric response inconsistent with the data, the RS model results were consistent with the observed asymmetry. Conclusion: The LQ model underestimates doses for isoeffective crypt-cell survival with large fraction sizes (in the present setting, >9 Gy)
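    For reference, the standard LQ bookkeeping that the paper tests can be written out as a short worked example; the parameter values below are illustrative only, not the fitted values from these data.

```python
# Worked example of LQ-model arithmetic: survival after n fractions of size d is
# S = exp(-n(alpha*d + beta*d^2)), and isoeffective total doses follow from equal BED.
import numpy as np

def survival(n, d, alpha, beta):
    return np.exp(-n * (alpha * d + beta * d ** 2))

def isoeffective_total_dose(D1, d1, d2, alpha_beta):
    """Total dose at fraction size d2 giving the same BED as D1 delivered in fractions of d1."""
    return D1 * (d1 + alpha_beta) / (d2 + alpha_beta)

alpha, beta = 0.2, 0.2 / 7.0                                   # illustrative alpha/beta = 7 Gy
print(survival(n=2, d=7.0, alpha=alpha, beta=beta))            # 14 Gy split into 2 x 7 Gy
print(isoeffective_total_dose(D1=14.0, d1=7.0, d2=11.0, alpha_beta=7.0))
```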

  4. Environmental Impacts of Large Scale Biochar Application Through Spatial Modeling

    Science.gov (United States)

    Huber, I.; Archontoulis, S.

    2017-12-01

    In an effort to study the environmental (emissions, soil quality) and production (yield) impacts of biochar application at regional scales, we coupled the APSIM-Biochar model with the pSIMS parallel platform. So far the majority of biochar research has been concentrated on lab-to-field studies to advance scientific knowledge. Regional scale assessments are highly needed to assist decision making. The overall objective of this simulation study was to identify areas in the USA that gain the most environmentally from biochar application, as well as areas where our model predicts a notable yield increase due to the addition of biochar. We present the modifications in both the APSIM-Biochar and pSIMS components that were necessary to facilitate these large-scale model runs across several regions in the United States at a resolution of 5 arcminutes. This study uses the AgMERRA global climate data set (1980-2010) and the Global Soil Dataset for Earth Systems modeling as a basis for creating its simulations, as well as local management operations for maize and soybean cropping systems and different biochar application rates. The regional scale simulation analysis is in progress. Preliminary results showed that the model predicts that high quality soils (particularly those common to Iowa cropping systems) do not receive much, if any, production benefit from biochar. However, soils with low soil organic matter (<0.5%) do get a noteworthy yield increase of around 5-10% in the best cases. We also found N2O emissions to be spatially and temporally specific, increasing in some areas and decreasing in others due to biochar application. In contrast, we found increases in soil organic carbon and plant available water in all soils (top 30 cm) due to biochar application. The magnitude of these increases (% change from the control) was larger in soils with low organic matter (below 1.5%) and smaller in soils with high organic matter (above 3%) and also dependent on biochar ...

  5. Clustering of local group distances: publication bias or correlated measurements? I. The large Magellanic cloud

    Energy Technology Data Exchange (ETDEWEB)

    De Grijs, Richard [Kavli Institute for Astronomy and Astrophysics, Peking University, Yi He Yuan Lu 5, Hai Dian District, Beijing 100871 (China); Wicker, James E. [National Astronomical Observatories, Chinese Academy of Sciences, 20A Datun Road, Chaoyang District, Beijing 100012 (China); Bono, Giuseppe [Dipartimento di Fisica, Università di Roma Tor Vergata, via Della Ricerca Scientifica 1, I-00133 Roma (Italy)

    2014-05-01

    The distance to the Large Magellanic Cloud (LMC) represents a key local rung of the extragalactic distance ladder, yet the galaxy's distance modulus has long been an issue of contention, in particular in view of claims that most newly determined distance moduli cluster tightly, and with a small spread, around the 'canonical' distance modulus, (m – M)₀ = 18.50 mag. We compiled 233 separate LMC distance determinations published between 1990 and 2013. Our analysis of the individual distance moduli, as well as of their two-year means and standard deviations resulting from this largest data set of LMC distance moduli available to date, focuses specifically on Cepheid and RR Lyrae variable-star tracer populations, as well as on distance estimates based on features in the observational Hertzsprung-Russell diagram. We conclude that strong publication bias is unlikely to have been the main driver of the majority of published LMC distance moduli. However, for a given distance tracer, the body of publications leading to the tightly clustered distances is based on highly non-independent tracer samples and analysis methods, hence leading to significant correlations among the LMC distances reported in subsequent articles. Based on a careful, weighted combination, in a statistical sense, of the main stellar population tracers, we recommend that a slightly adjusted canonical distance modulus of (m – M)₀ = 18.49 ± 0.09 mag be used for all practical purposes that require a general distance scale without the need for accuracies of better than a few percent.
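    The final weighted combination can be illustrated with a standard inverse-variance weighting of tracer-level moduli; the numbers below are placeholders, not the values from this compilation, and the actual combination in the paper may weight tracers differently.

```python
# Minimal sketch of an inverse-variance weighted combination of distance moduli.
import numpy as np

mu = np.array([18.48, 18.52, 18.45, 18.53])       # distance moduli from several tracers (mag)
sigma = np.array([0.10, 0.12, 0.15, 0.11])        # their quoted uncertainties (mag)

w = 1.0 / sigma ** 2
mu_combined = np.sum(w * mu) / np.sum(w)
sigma_combined = np.sqrt(1.0 / np.sum(w))
print(f"(m - M)_0 = {mu_combined:.2f} +/- {sigma_combined:.2f} mag")
```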

  6. GROUP GUIDANCE SERVICES MANAGEMENT OF BEHAVIORAL TECHNIC HOMEWORK MODEL

    Directory of Open Access Journals (Sweden)

    Juhri A M.

    2013-09-01

    Abstract: This paper describes the management of group guidance services using the behavioral homework (home-assignment) technique model. The ideas outlined in this paper are intended to give counselors insight into managing group counseling services so that they are carried out effectively. The paper is expected to serve as a theoretical reference on the management of group guidance services, for counselors and for the students who need guidance services as they face various problems, difficulties and obstacles in their learning. In general, this study aims to provide insight into the development of social skills for students, especially the ability to communicate with the other participants of the service (students). More specifically, it aims to encourage the development of feelings, thoughts, perceptions, insights and attitudes that support more creative and effective behavior and improve communication skills, both verbal and non-verbal, for students. Keyword: counselor, counseling, group, student

  7. A Three-groups Model for High Throughput Survival Screens

    Science.gov (United States)

    Shaby, Benjamin A.; Skibinski, Gaia; Ando, Michael; LaDow, Eva S.; Finkbeiner, Steven

    2016-01-01

    Amyotrophic lateral sclerosis (ALS) is a neurodegenerative condition characterized by the progressive deterioration of motor neurons in the cortex and spinal cord. Using an automated robotic microscope platform that enables the longitudinal tracking of thousands of single neurons, we examine the effects of a large library of compounds on modulating the survival of primary neurons expressing a mutation known to cause ALS. The goal of our analysis is to identify the few potentially beneficial compounds among the many assayed, the vast majority of which do not extend neuronal survival. This resembles the large-scale simultaneous inference scenario familiar from microarray analysis, but transferred to the survival analysis setting due to the novel experimental setup. We apply a three-component mixture model to censored survival times of thousands of individual neurons subjected to hundreds of different compounds. The shrinkage induced by our model significantly improves performance in simulations relative to performing treatment-wise survival analysis and subsequent multiple testing adjustment. Our analysis identified compounds that provide insight into potential novel therapeutic strategies for ALS. PMID:26821783
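    A hedged sketch of the modelling idea follows: a three-component mixture for right-censored survival times, here with exponential components fit by maximum likelihood on toy data. The paper's actual three-groups specification and shrinkage structure are richer than this; everything below is invented for illustration.

```python
# Toy maximum-likelihood fit of a three-component exponential mixture with right censoring.
import numpy as np
from scipy.optimize import minimize
from scipy.special import softmax

rng = np.random.default_rng(5)
comp = rng.choice(3, size=1000, p=[0.6, 0.3, 0.1])
t = rng.exponential(scale=np.array([5.0, 20.0, 60.0])[comp])       # true survival times
c = np.minimum(t, 48.0)                                            # censor at end of imaging
event = (t <= 48.0).astype(float)

def neg_log_lik(params):
    logits, log_rates = params[:3], params[3:]
    pi, lam = softmax(logits), np.exp(log_rates)
    dens = pi * lam * np.exp(-np.outer(c, lam))          # pi_k * f_k(t) per component
    surv = pi * np.exp(-np.outer(c, lam))                # pi_k * S_k(t) per component
    lik = np.where(event[:, None] == 1, dens, surv).sum(axis=1)
    return -np.sum(np.log(lik + 1e-300))

x0 = np.array([0, 0, 0, np.log(1 / 5), np.log(1 / 20), np.log(1 / 60)])
res = minimize(neg_log_lik, x0=x0, method="Nelder-Mead", options={"maxiter": 20000})
pi_hat, rates = softmax(res.x[:3]), np.exp(res.x[3:])
print("mixing weights:", np.round(pi_hat, 2), "mean survival times:", np.round(1 / rates, 1))
```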

  8. Computation of Large Molecules with the Hartree-Fock Model

    Science.gov (United States)

    Clementi, Enrico

    1972-01-01

    The usual way to compute Hartree-Fock type functions for molecules is by an expansion of the one-electron functions (molecular orbitals) in a linear combination of analytical functions (LCAO-MO-SCF, linear combination of atomic orbitals—Molecular Orbital—Self Consistent field). The expansion coefficients are obtained variationally. This technique requires the computation of several multicenter two-electron integrals (representing the electron-electron interaction) proportional to the fourth power of the basis set size. There are several types of basis sets; the Gaussian type introduced by S. F. Boys is used herein. Since it requires from a minimum of 10 (or 15) Gaussian-type functions to about 25 (or 30) Gaussian functions to describe a second-row atom in a molecule, the fourth power dependency of the basis set has been the de facto bottleneck of quantum chemical computations in the last decade. In this paper, the concept is introduced of a “dynamical” basis set, which allows for drastic computational simplifications while retaining full numerical accuracy. Examples are given that show that computational saving in computer time of more than a factor of one hundred is achieved and that large basis sets (up to the order of several hundred Gaussian functions per molecule) can be used routinely. It is noted that the limitation in the Hartree-Fock energy (correlation energy error) can be easily computed by use of a statistical model introduced by Wigner for solid-state systems in 1934. Thus, large molecules can now be simulated by computational techniques without reverting to semi-empirical parameterization and without requiring enormous computational time and storage. PMID:16592020
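
    The fourth-power dependence on basis-set size mentioned above comes from the number of two-electron integrals (ij|kl); a small counting sketch using the standard 8-fold permutational symmetry (an illustration of the scaling, not Clementi's dynamical-basis scheme):

```python
def n_unique_eri(nbasis: int) -> int:
    """Number of symmetry-unique two-electron integrals (ij|kl) for a real
    basis of size nbasis, using 8-fold permutational symmetry. Grows roughly
    as nbasis**4 / 8 for large bases -- the bottleneck noted above.
    """
    npair = nbasis * (nbasis + 1) // 2      # unique (ij) index pairs
    return npair * (npair + 1) // 2         # unique pairs of pairs

for n in (10, 25, 100, 300):
    print(n, n_unique_eri(n))
```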

  9. Extended Group Contribution Model for Polyfunctional Phase Equilibria

    DEFF Research Database (Denmark)

    Abildskov, Jens

    Material and energy balances and equilibrium data form the basis of most design calculations. While material and energy balances may be stated without much difficulty, the design engineer is left with a choice between a wide variety of models for describing phase equilibria in the design...... of physical separation processes. In a thermodynamic sense, design requires detailed knowledge of activity coefficients in the phases at equilibrium. The prediction of these quantities from a minimum of experimental data is the broad scope of this thesis. Adequate equations exist for predicting vapor......-liquid equilibria from data on binary mixtures, composed of structurally simple molecules with a single functional group. More complex is the situation with mixtures composed of structurally more complicated molecules or molecules with more than one functional group. The UNIFAC method is extended to handle...

  10. Trials of large group teaching in Malaysian private universities: a cross sectional study of teaching medicine and other disciplines

    Directory of Open Access Journals (Sweden)

    Too LaySan

    2011-09-01

    Full Text Available Abstract Background This is a pilot cross sectional study using both quantitative and qualitative approach towards tutors teaching large classes in private universities in the Klang Valley (comprising Kuala Lumpur, its suburbs, adjoining towns in the State of Selangor) and the State of Negeri Sembilan, Malaysia. The general aim of this study is to determine the difficulties faced by tutors when teaching large group of students and to outline appropriate recommendations in overcoming them. Findings Thirty-two academics from six private universities from different faculties such as Medical Sciences, Business, Information Technology, and Engineering disciplines participated in this study. SPSS software was used to analyse the data. The results in general indicate that the conventional instructor-student approach has its shortcoming and requires changes. Interestingly, tutors from Medicine and IT less often faced difficulties and had positive experience in teaching large group of students. Conclusion However several suggestions were proposed to overcome these difficulties ranging from breaking into smaller classes, adopting innovative teaching, use of interactive learning methods incorporating interactive assessment and creative technology which enhanced students learning. Furthermore the study provides insights on the trials of large group teaching which are clearly identified to help tutors realise its impact on teaching. The suggestions to overcome these difficulties and to maximize student learning can serve as a guideline for tutors who face these challenges.

  11. Trials of large group teaching in Malaysian private universities: a cross sectional study of teaching medicine and other disciplines.

    Science.gov (United States)

    Thomas, Susan; Subramaniam, Shamini; Abraham, Mathew; Too, Laysan; Beh, Loosee

    2011-09-09

    This is a pilot cross sectional study using both quantitative and qualitative approach towards tutors teaching large classes in private universities in the Klang Valley (comprising Kuala Lumpur, its suburbs, adjoining towns in the State of Selangor) and the State of Negeri Sembilan, Malaysia. The general aim of this study is to determine the difficulties faced by tutors when teaching large group of students and to outline appropriate recommendations in overcoming them. Thirty-two academics from six private universities from different faculties such as Medical Sciences, Business, Information Technology, and Engineering disciplines participated in this study. SPSS software was used to analyse the data. The results in general indicate that the conventional instructor-student approach has its shortcoming and requires changes. Interestingly, tutors from Medicine and IT less often faced difficulties and had positive experience in teaching large group of students. However several suggestions were proposed to overcome these difficulties ranging from breaking into smaller classes, adopting innovative teaching, use of interactive learning methods incorporating interactive assessment and creative technology which enhanced students learning. Furthermore the study provides insights on the trials of large group teaching which are clearly identified to help tutors realise its impact on teaching. The suggestions to overcome these difficulties and to maximize student learning can serve as a guideline for tutors who face these challenges.

  12. Empirical Models of Social Learning in a Large, Evolving Network.

    Directory of Open Access Journals (Sweden)

    Ayşe Başar Bener

    Full Text Available This paper advances theories of social learning through an empirical examination of how social networks change over time. Social networks are important for learning because they constrain individuals' access to information about the behaviors and cognitions of other people. Using data on a large social network of mobile device users over a one-month time period, we test three hypotheses: 1) attraction homophily causes individuals to form ties on the basis of attribute similarity, 2) aversion homophily causes individuals to delete existing ties on the basis of attribute dissimilarity, and 3) social influence causes individuals to adopt the attributes of others they share direct ties with. Statistical models offer varied degrees of support for all three hypotheses and show that these mechanisms are more complex than assumed in prior work. Although homophily is normally thought of as a process of attraction, people also avoid relationships with others who are different. These mechanisms have distinct effects on network structure. While social influence does help explain behavior, people tend to follow global trends more than they follow their friends.

  13. Empirical Models of Social Learning in a Large, Evolving Network.

    Science.gov (United States)

    Bener, Ayşe Başar; Çağlayan, Bora; Henry, Adam Douglas; Prałat, Paweł

    2016-01-01

    This paper advances theories of social learning through an empirical examination of how social networks change over time. Social networks are important for learning because they constrain individuals' access to information about the behaviors and cognitions of other people. Using data on a large social network of mobile device users over a one-month time period, we test three hypotheses: 1) attraction homophily causes individuals to form ties on the basis of attribute similarity, 2) aversion homophily causes individuals to delete existing ties on the basis of attribute dissimilarity, and 3) social influence causes individuals to adopt the attributes of others they share direct ties with. Statistical models offer varied degrees of support for all three hypotheses and show that these mechanisms are more complex than assumed in prior work. Although homophily is normally thought of as a process of attraction, people also avoid relationships with others who are different. These mechanisms have distinct effects on network structure. While social influence does help explain behavior, people tend to follow global trends more than they follow their friends.

  14. Analysis and Design Environment for Large Scale System Models and Collaborative Model Development Project

    Data.gov (United States)

    National Aeronautics and Space Administration — As NASA modeling efforts grow more complex and more distributed among many working groups, new tools and technologies are required to integrate their efforts...

  15. Analysis and Design Environment for Large Scale System Models and Collaborative Model Development, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — As NASA modeling efforts grow more complex and more distributed among many working groups, new tools and technologies are required to integrate their efforts...

  16. Renormalization group approach to a p-wave superconducting model

    International Nuclear Information System (INIS)

    Continentino, Mucio A.; Deus, Fernanda; Caldas, Heron

    2014-01-01

    We present in this work an exact renormalization group (RG) treatment of a one-dimensional p-wave superconductor. The model proposed by Kitaev consists of a chain of spinless fermions with a p-wave gap. It is a paradigmatic model of great current interest, since it presents a weak-pairing superconducting phase that has Majorana fermions at the ends of the chain, which are predicted to be useful for quantum computation. The RG allows us to obtain the phase diagram of the model and to study the quantum phase transition from the weak- to the strong-pairing phase. It yields the attractors of these phases and the critical exponents of the weak- to strong-pairing transition. We show that the weak-pairing phase of the model is governed by a chaotic attractor that is non-trivial in both its topological and RG properties. In the strong-pairing phase the RG flow is towards a conventional strong-coupling fixed point. Finally, we propose an alternative way of obtaining p-wave superconductivity in a one-dimensional system without spin–orbit interaction.
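
    For reference, the Kitaev chain referred to above is usually written in the following standard textbook form (not quoted from the article):

```latex
% Kitaev's one-dimensional p-wave superconductor (spinless fermions c_j):
H = -\mu \sum_{j} c_j^{\dagger} c_j
    - \sum_{j} \left( t\, c_j^{\dagger} c_{j+1}
    + \Delta\, c_j c_{j+1} + \mathrm{h.c.} \right)
% t: hopping, \Delta: p-wave pairing amplitude, \mu: chemical potential.
% The weak-pairing (topological) phase with Majorana end modes occurs for
% |\mu| < 2t; the strong-pairing (trivial) phase for |\mu| > 2t.
```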

  17. One decade of the Data Fusion Information Group (DFIG) model

    Science.gov (United States)

    Blasch, Erik

    2015-05-01

    The revision of the Joint Directors of the Laboratories (JDL) Information Fusion model in 2004 discussed information processing, incorporated the analyst, and was coined the Data Fusion Information Group (DFIG) model. Since that time, developments in information technology (e.g., cloud computing, applications, and multimedia) have altered the role of the analyst. Data production has outpaced the analyst; however the analyst still has the role of data refinement and information reporting. In this paper, we highlight three examples being addressed by the DFIG model. One example is the role of the analyst to provide semantic queries (through an ontology) so that vast amount of data available can be indexed, accessed, retrieved, and processed. The second idea is reporting which requires the analyst to collect the data into a condensed and meaningful form through information management. The last example is the interpretation of the resolved information from data that must include contextual information not inherent in the data itself. Through a literature review, the DFIG developments in the last decade demonstrate the usability of the DFIG model to bring together the user (analyst or operator) and the machine (information fusion or manager) in a systems design.

  18. Dynamic Model Averaging in Large Model Spaces Using Dynamic Occam's Window.

    Science.gov (United States)

    Onorante, Luca; Raftery, Adrian E

    2016-01-01

    Bayesian model averaging has become a widely used approach to accounting for uncertainty about the structural form of the model generating the data. When data arrive sequentially and the generating model can change over time, Dynamic Model Averaging (DMA) extends model averaging to deal with this situation. Often in macroeconomics, however, many candidate explanatory variables are available and the number of possible models becomes too large for DMA to be applied in its original form. We propose a new method for this situation which allows us to perform DMA without considering the whole model space, but using a subset of models and dynamically optimizing the choice of models at each point in time. This yields a dynamic form of Occam's window. We evaluate the method in the context of the problem of nowcasting GDP in the Euro area. We find that its forecasting performance compares well with that of other methods.

  19. Dynamic Model Averaging in Large Model Spaces Using Dynamic Occam’s Window*

    Science.gov (United States)

    Onorante, Luca; Raftery, Adrian E.

    2015-01-01

    Bayesian model averaging has become a widely used approach to accounting for uncertainty about the structural form of the model generating the data. When data arrive sequentially and the generating model can change over time, Dynamic Model Averaging (DMA) extends model averaging to deal with this situation. Often in macroeconomics, however, many candidate explanatory variables are available and the number of possible models becomes too large for DMA to be applied in its original form. We propose a new method for this situation which allows us to perform DMA without considering the whole model space, but using a subset of models and dynamically optimizing the choice of models at each point in time. This yields a dynamic form of Occam’s window. We evaluate the method in the context of the problem of nowcasting GDP in the Euro area. We find that its forecasting performance compares well with that of other methods. PMID:26917859
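
    A minimal sketch of the forgetting-factor weight recursion that underlies DMA, with a hard threshold standing in for the dynamic Occam's window (the thresholding rule here is an assumption, simplified from the paper's method):

```python
import numpy as np

def dma_update(weights, predictive_liks, alpha=0.99, occam_cut=0.01):
    """One step of Dynamic Model Averaging weight updating.

    weights         : current model probabilities (sum to 1)
    predictive_liks : each model's one-step-ahead predictive likelihood
    alpha           : forgetting factor (alpha = 1 recovers static BMA)
    occam_cut       : models whose weight falls below this fraction of the
                      best model's weight are dropped -- a crude stand-in
                      for the dynamic Occam's window described above.
    """
    w = np.asarray(weights, dtype=float) ** alpha      # forgetting step
    w = w / w.sum()
    w = w * np.asarray(predictive_liks, dtype=float)   # Bayes update
    w = w / w.sum()
    keep = w >= occam_cut * w.max()                    # Occam's window
    w = np.where(keep, w, 0.0)
    return w / w.sum(), keep

w, keep = dma_update([0.5, 0.3, 0.2], [0.8, 1.2, 0.1])
print(w, keep)
```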

  20. Progress in lung modelling by the ICRP Task Group

    International Nuclear Information System (INIS)

    James, A.C.; Birchall, A.

    1989-01-01

    The Task Group has reviewed the data on: (a) morphology and physiology of the human respiratory tract; (b) inspirability of aerosols and their deposition in anatomical regions as functions of respiratory parameters; (c) clearance of particles within and from the respiratory tract; (d) absorption of different materials into the blood in humans and in animals. The Task Group proposes a new model which predicts the deposition, retention and systemic uptake of materials, enabling doses absorbed by different respiratory tissues and other body organs to be evaluated. In the proposed model, clearance is described in terms of competition between the processes moving particles to the oropharynx or to lymph nodes and that of absorption into the blood. From studies with human subjects, characteristic rates and pathways are defined to represent mechanical clearance of particles from each region, which do not depend on the material. Conversely, the absorption rate is determined solely by the material: it is assumed to be the same in all parts of the respiratory tract and in other animal species. For several of the radiologically important forms of actinides, absorption rates can be derived from animal experiments, or, in some cases, directly from human data. Otherwise, default values are used, based on the current D, W and Y classification system. (author)
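
    The competition between mechanical clearance and absorption described above amounts to two first-order removal paths acting on the same compartment; a sketch with illustrative rate constants, not the Task Group's parameter values:

```python
import numpy as np

def lung_compartment(t, m=0.01, s=0.005, q0=1.0):
    """Activity remaining in a lung region cleared by two competing
    first-order processes: mechanical transport (rate m, per day) and
    absorption to blood (rate s, per day). Rates are illustrative only.
    """
    q = q0 * np.exp(-(m + s) * t)                         # remaining activity
    absorbed = q0 * s / (m + s) * (1 - np.exp(-(m + s) * t))  # cumulative uptake
    return q, absorbed

for day in (1, 10, 100):
    q, a = lung_compartment(day)
    print(f"day {day}: remaining {q:.3f}, absorbed to blood {a:.3f}")
```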

  1. Description of group-theoretical model of developed turbulence

    International Nuclear Information System (INIS)

    Saveliev, V L; Gorokhovski, M A

    2008-01-01

    We propose to associate the phenomenon of stationary turbulence with special self-similar solutions of the Euler equations. These solutions represent the linear superposition of eigenfields of spatial symmetry subgroup generators and imply their dependence on time through the parameter of the symmetry transformation only. From this model, it follows that for a developed turbulent process, changing the scale of averaging (filtering) of the velocity field is equivalent to a composition of scaling, translation and rotation transformations. We call this property a renormalization-group invariance of filtered turbulent fields. The renormalization-group invariance provides an opportunity to transform the Navier-Stokes equation averaged over a small scale (the inner threshold of the turbulence) to larger scales by simple scaling. From the methodological point of view, it is significant to note that the turbulent viscosity term appears not as a result of averaging of the nonlinear term in the Navier-Stokes equation, but from the molecular viscosity term with the help of a renormalization group transformation.

  2. Description of group-theoretical model of developed turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Saveliev, V L [Institute of Ionosphere, Almaty 050020 (Kazakhstan); Gorokhovski, M A [Laboratoire de Mecanique des Fluides et Acoustique, Ecole Centrale de Lyon, 36, Avenue Guy de Collongues, F69134 Ecully-Cedex (France)], E-mail: saveliev@topmail.kz, E-mail: mikhael.gorokhovski@ec-lyon.fr

    2008-12-15

    We propose to associate the phenomenon of stationary turbulence with special self-similar solutions of the Euler equations. These solutions represent the linear superposition of eigenfields of spatial symmetry subgroup generators and imply their dependence on time through the parameter of the symmetry transformation only. From this model, it follows that for a developed turbulent process, changing the scale of averaging (filtering) of the velocity field is equivalent to a composition of scaling, translation and rotation transformations. We call this property a renormalization-group invariance of filtered turbulent fields. The renormalization-group invariance provides an opportunity to transform the Navier-Stokes equation averaged over a small scale (the inner threshold of the turbulence) to larger scales by simple scaling. From the methodological point of view, it is significant to note that the turbulent viscosity term appears not as a result of averaging of the nonlinear term in the Navier-Stokes equation, but from the molecular viscosity term with the help of a renormalization group transformation.

  3. Renormalization group flow of scalar models in gravity

    International Nuclear Information System (INIS)

    Guarnieri, Filippo

    2014-01-01

    In this Ph.D. thesis we study the issue of renormalizability of gravitation in the context of the renormalization group (RG), employing both perturbative and non-perturbative techniques. In particular, we focus on different gravitational models and approximations in which a central role is played by a scalar degree of freedom, since their RG flow is easier to analyze. We restrict our interest in particular to two quantum gravity approaches that have gained a lot of attention recently, namely the asymptotic safety scenario for gravity and the Horava-Lifshitz quantum gravity. In the so-called asymptotic safety conjecture the high energy regime of gravity is controlled by a non-Gaussian fixed point which ensures non-perturbative renormalizability and finiteness of the correlation functions. We then investigate the existence of such a non trivial fixed point using the functional renormalization group, a continuum version of the non-perturbative Wilson's renormalization group. In particular we quantize the sole conformal degree of freedom, which is an approximation that has been shown to lead to a qualitatively correct picture. The question of the existence of a non-Gaussian fixed point in an infinite-dimensional parameter space, that is for a generic f(R) theory, cannot however be studied using such a conformally reduced model. Hence we study it by quantizing a dynamically equivalent scalar-tensor theory, i.e. a generic Brans-Dicke theory with ω=0 in the local potential approximation. Finally, we investigate, using a perturbative RG scheme, the asymptotic freedom of the Horava-Lifshitz gravity, that is an approach based on the emergence of an anisotropy between space and time which lifts the Newton's constant to a marginal coupling and explicitly preserves unitarity. In particular we evaluate the one-loop correction in 2+1 dimensions quantizing only the conformal degree of freedom.

  4. Large Sanjiang basin groups outside of the Songliao Basin Meso-Senozoic Tectonic-sediment evolution and hydrocarbon accumulation

    Science.gov (United States)

    Zheng, M.; Wu, X.

    2015-12-01

    The basic geological problem is still the bottleneck of exploration work in the larger Sanjiang basin groups. In general terms, the problems comprise two aspects: the prototype basins and the basin-forming mechanism. In this paper, the prototype basins and basin-forming mechanism of the large Sanjiang basin groups are discussed using field geological survey and investigation, logging data analysis and seismic data interpretation. The main conclusions are as follows: 1. Group-level formations in the Sanjiang region can be completely correlated. 2. Tension faults, compressive faults, shear structures and combinations of these four kinds of compound fracture are mainly developed in the study area. Their strikes can be divided into SN, EW, NNE, NEE, NNW and NWW groups of fractures. 3. The large Sanjiang basin has two main directions of tectonic evolution, SN and EW. The Cenozoic basin groups of the Sanjiang region formed at the interchange of two tectonic domains, the ancient Paleo-Asian Ocean and the Pacific. 4. The large Sanjiang basin experienced a late Mesozoic tectonic evolution of two stages and nine episodes. The first stage, basement development: ① the Mesozoic era before the Jurassic; ② the Early Jurassic period. The second stage, cover development: ③ the Late Jurassic compressional depression stage; ④ the Early Cretaceous rifting stage; ⑤ the mid-Early Cretaceous depression stage; ⑥ the late Early Cretaceous tensile rifting stage; ⑦ the Late Cretaceous compressional tectonic inversion stage; ⑧ the Paleogene - Neogene; ⑨ the most recent Baoquan Ridge sedimentation stage. 5. The large Sanjiang basin group is actually a residual basin structure and can be divided into residual-superimposed (Founder, Tangyuan depression, Hulin Basin), residual-inherited (Sanjiang basin) and residual-reformed (Jixi, Boli, Hegang basin) types. There are two developed depressions, and the mechanism

  5. The art therapy large group as a teaching method for the institutional and political aspects of professional training

    OpenAIRE

    Skaife, Sally; Jones, Kevin

    2009-01-01

    This paper discusses a unique experiential teaching method in the context of training for art psychotherapists and raises issues relevant to teaching for all workers in health and social care. The art therapy large experiential group of all the students and all the staff (80+), which is held six times a year on the 2-year full-time/3-year part-time programme, is identified with three educational components: learning about art therapy processes, learning about the educational process of becomi...

  6. Two self-splicing group I introns in the ribonucleotide reductase large subunit gene of Staphylococcus aureus phage Twort

    OpenAIRE

    Landthaler, Markus; Begley, Ulrike; Lau, Nelson C.; Shub, David A.

    2002-01-01

    We have recently described three group I introns inserted into a single gene, orf142, of the staphylococcal bacteriophage Twort and suggested the presence of at least two additional self-splicing introns in this phage genome. Here we report that two previously uncharacterized introns, 429 and 1087 nt in length, interrupt the Twort gene coding for the large subunit of ribonucleotide reductase (nrdE). Reverse transcription-polymerase chain reaction (RT-PCR) of RNA isolated from Staphylococcus a...

  7. An Example of Large-group Drama and Cross-year Peer Assessment for Teaching Science in Higher Education

    Science.gov (United States)

    Sloman, Katherine; Thompson, Richard

    2010-09-01

    Undergraduate students pursuing a three-year marine biology degree programme (n = 86) experienced a large-group drama aimed at allowing them to explore how scientific research is funded and the associated links between science and society. In the drama, Year 1 students played the "general public" who decided which environmental research areas should be prioritised for funding, Year 2 students were the "scientists" who had to prepare research proposals which they hoped to get funded, and Year 3 students were the "research panel" who decided which proposals to fund with input from the priorities set by the "general public". The drama, therefore, included an element of cross-year peer assessment where Year 3 students evaluated the research proposals prepared by the Year 2 students. Questionnaires were distributed at the end of the activity to gather: (1) student perceptions on the cross-year nature of the exercise, (2) the use of peer assessment, and (3) their overall views on the drama. The students valued the opportunity to interact with their peers from other years of the degree programme and most were comfortable with the use of cross-year peer assessment. The majority of students felt that they had increased their knowledge of how research proposals are funded and the perceived benefits of the large-group drama included increased critical thinking ability, confidence in presenting work to others, and enhanced communication skills. Only one student did not strongly advocate the use of this large-group drama in subsequent years.

  8. What determines area burned in large landscapes? Insights from a decade of comparative landscape-fire modelling

    Science.gov (United States)

    Geoffrey J. Cary; Robert E. Keane; Mike D. Flannigan; Ian D. Davies; Russ A. Parsons

    2015-01-01

    Understanding what determines area burned in large landscapes is critical for informing wildland fire management in fire-prone environments and for representing fire activity in Dynamic Global Vegetation Models. For the past ten years, a group of landscape-fire modellers have been exploring the relative influence of key determinants of area burned in temperate and...

  9. Exploring the Impact of Students' Learning Approach on Collaborative Group Modeling of Blood Circulation

    Science.gov (United States)

    Lee, Shinyoung; Kang, Eunhee; Kim, Heui-Baik

    2015-01-01

    This study aimed to explore the effect on group dynamics of statements associated with deep learning approaches (DLA) and their contribution to cognitive collaboration and model development during group modeling of blood circulation. A group was selected for an in-depth analysis of collaborative group modeling. This group constructed a model in a…

  10. Sensitivity in forward modeled hyperspectral reflectance due to phytoplankton groups

    Science.gov (United States)

    Manzo, Ciro; Bassani, Cristiana; Pinardi, Monica; Giardino, Claudia; Bresciani, Mariano

    2016-04-01

    Phytoplankton is an integral part of the ecosystem, affecting trophic dynamics, nutrient cycling, habitat condition, and fisheries resources. The types of phytoplankton and their concentrations are used to describe the status of a water body and the processes within it. This study investigates bio-optical modeling of phytoplankton functional types (PFT) in terms of pigment composition, demonstrating the capability of remote sensing to recognize freshwater phytoplankton. In particular, a sensitivity analysis of simulated hyperspectral water reflectance (with the band settings of HICO, APEX, EnMAP, PRISMA and Sentinel-3) of the productive eutrophic waters of the Mantua lakes (Italy) environment is presented. The bio-optical model adopted for simulating the hyperspectral water reflectance takes into account the dependency of reflectance on the geometric conditions of the light field, on inherent optical properties (backscattering and absorption coefficients) and on concentrations of water quality parameters (WQPs). The model works in the 400-750 nm wavelength range, while the model parametrization is based on a comprehensive dataset of WQP concentrations and specific inherent optical properties of the study area, collected in field surveys carried out from May to September of 2011 and 2014. The following phytoplankton groups, with their specific absorption coefficients, a*Φi(λ), were used in the simulation: Chlorophyta, Cyanobacteria with phycocyanin, Cyanobacteria and Cryptophytes with phycoerythrin, Diatoms with carotenoids, and mixed phytoplankton. The phytoplankton absorption coefficient aΦ(λ) is modelled by multiplying the weighted sum of the PFT contributions, Σ pi a*Φi(λ), by the chlorophyll-a concentration (Chl-a). To highlight the variability of water reflectance due to variation of phytoplankton pigments, the sensitivity analysis was performed by keeping the WQPs constant (i.e., Chl-a = 80 mg/l, total suspended matter = 12.58 g/l and yellow substances = 0.27 m-1). The sensitivity analysis was
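
    A short sketch of the weighted-sum absorption model quoted above, aΦ(λ) = Chl-a · Σ pi a*Φi(λ); the spectra and weights below are placeholders, not the Mantua lakes parametrization:

```python
import numpy as np

def phytoplankton_absorption(chl_a, weights, specific_abs):
    """a_phi(lambda) = Chl-a * sum_i p_i * a*_phi,i(lambda).

    chl_a        : chlorophyll-a concentration
    weights      : fractional contribution p_i of each functional group
    specific_abs : array (n_groups, n_wavelengths) of a*_phi,i(lambda)
    """
    weights = np.asarray(weights, dtype=float)[:, None]
    specific_abs = np.asarray(specific_abs, dtype=float)
    return chl_a * np.sum(weights * specific_abs, axis=0)

# Placeholder spectra at three wavelengths for two groups.
a_star = [[0.03, 0.01, 0.02],    # e.g. Chlorophyta
          [0.02, 0.03, 0.01]]    # e.g. Cyanobacteria with phycocyanin
print(phytoplankton_absorption(80.0, [0.6, 0.4], a_star))
```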

  11. Modelling radicalization: how small violent fringe sects develop into large indoctrinated societies

    Science.gov (United States)

    Short, Martin B.; McCalla, Scott G.; D'Orsogna, Maria R.

    2017-08-01

    We model radicalization in a society consisting of two competing religious, ethnic or political groups. Each of the `sects' is divided into moderate and radical factions, with intra-group transitions occurring either spontaneously or through indoctrination. We also include the possibility of one group violently attacking the other. The intra-group transition rates of one group are modelled to explicitly depend on the actions and characteristics of the other, including violent episodes, effectively coupling the dynamics of the two sects. We use a game theoretic framework and assume that radical factions may tune `strategic' parameters to optimize given utility functions aimed at maximizing their ranks while minimizing the damage inflicted by their rivals. Constraints include limited overall resources that must be optimally allocated between indoctrination and external attacks on the other group. Various scenarios are considered, from symmetric sects whose behaviours mirror each other, to totally asymmetric ones where one sect may have a larger population or a superior resource availability. We discuss under what conditions sects preferentially employ indoctrination or violence, and how allowing sects to readjust their strategies allows for small, violent sects to grow into large, indoctrinated communities.
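
    A minimal sketch of the moderate/radical dynamics of a single sect, coupled to its rival only through an external attack rate; the functional forms and rate constants are assumptions for illustration, not the authors' equations:

```python
def step(m, r, dt=0.01, beta=0.3, gamma=0.05, attack=0.1):
    """One Euler step for a single sect's moderate (m) and radical (r)
    fractions. Indoctrination (rate beta*r) moves moderates to the radical
    faction; spontaneous moderation (gamma) moves radicals back; attacks by
    the rival sect (attack) push additional moderates towards radicalism.
    """
    dm = -(beta * r + attack) * m + gamma * r
    dr = (beta * r + attack) * m - gamma * r
    return m + dt * dm, r + dt * dr

m, r = 0.95, 0.05
for _ in range(2000):
    m, r = step(m, r)
print(f"moderates {m:.2f}, radicals {r:.2f}")
```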

  12. Large-scale parallel configuration interaction. II. Two- and four-component double-group general active space implementation with application to BiH

    DEFF Research Database (Denmark)

    Knecht, Stefan; Jensen, Hans Jørgen Aagaard; Fleig, Timo

    2010-01-01

    We present a parallel implementation of a large-scale relativistic double-group configuration interaction CIprogram. It is applicable with a large variety of two- and four-component Hamiltonians. The parallel algorithm is based on a distributed data model in combination with a static load balancing...... scheme. The excellent scalability of our parallelization scheme is demonstrated in large-scale four-component multireference CI (MRCI) benchmark tests on two of the most common computer architectures, and we also discuss hardware-dependent aspects with respect to possible speedup limitations....... With the new code we have been able to calculate accurate spectroscopic properties for the ground state and the first excited state of the BiH molecule using extensive basis sets. We focused, in particular, on an accurate description of the splitting of these two states which is caused by spin-orbit coupling...

  13. Modeling the spreading of large-scale wildland fires

    Science.gov (United States)

    Mohamed Drissi

    2015-01-01

    The objective of the present study is twofold. First, the last developments and validation results of a hybrid model designed to simulate fire patterns in heterogeneous landscapes are presented. The model combines the features of a stochastic small-world network model with those of a deterministic semi-physical model of the interaction between burning and non-burning...

  14. Modeling Large Time Series for Efficient Approximate Query Processing

    DEFF Research Database (Denmark)

    Perera, Kasun S; Hahmann, Martin; Lehner, Wolfgang

    2015-01-01

    -wise aggregation to derive the models. These models are initially created from the original data and are kept in the database along with it. Subsequent queries are answered using the stored models rather than scanning and processing the original datasets. In order to support model query processing, we maintain...

  15. Key Informant Models for Measuring Group-Level Variables in Small Groups: Application to Plural Subject Theory

    Science.gov (United States)

    Algesheimer, René; Bagozzi, Richard P.; Dholakia, Utpal M.

    2018-01-01

    We offer a new conceptualization and measurement models for constructs at the group-level of analysis in small group research. The conceptualization starts with classical notions of group behavior proposed by Tönnies, Simmel, and Weber and then draws upon plural subject theory by philosophers Gilbert and Tuomela to frame a new perspective…

  16. A Mathematical Images Group Model to Estimate the Sound Level in a Close-Fitting Enclosure

    Directory of Open Access Journals (Sweden)

    Michael J. Panza

    2014-01-01

    Full Text Available This paper describes a special mathematical images model for determining the sound level inside a close-fitting sound enclosure. Such an enclosure is defined as the internal air volume bounded by a machine vibration noise source at one wall and a parallel reflecting wall located very close to it, which acts as the outside radiating wall of the enclosure. Four smaller surfaces define a parallelepiped for the volume. The main reverberation group is between the two large parallel planes. Viewed as a discrete line-type source, the main group is extended into additional discrete line-type source image groups due to reflections from the four smaller surfaces. The images-group approach provides a convergent solution for the case where hard reflective surfaces are modeled with absorption coefficients equal to zero. Numerical examples are used to calculate the sound pressure level incident on the outside wall and the effect of adding high absorption to the front wall. This is compared to the result from the general large-room diffuse reverberant field enclosure formula for several hard-wall absorption coefficients and distances between machine and front wall. The images-group method is shown to have low sensitivity to the hard-wall absorption coefficient value, and provides a method where zero sound absorption can be used for hard surfaces, rather than an initial estimate or measurement of hard-surface sound absorption, to predict the internal sound levels and the effect of adding absorption.
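
    The images idea of summing reflections between two parallel hard planes can be illustrated with a one-dimensional image-source sum (1/r² spreading, zero absorption); a rough sketch of the general technique, not the paper's close-fitting enclosure formulation:

```python
import numpy as np

def image_source_level(Lw_db, L, z_s, z_r, n_images=100):
    """Crude 1-D image-source sum for a point source between two parallel
    hard walls at z = 0 and z = L (zero absorption, 1/r^2 spreading only).
    Images sit at 2nL + z_s and 2nL - z_s; the sum converges, mirroring the
    hard-wall convergence property noted above.
    """
    p_sq = 0.0
    for n in range(-n_images, n_images + 1):
        for img in (2 * n * L + z_s, 2 * n * L - z_s):
            d = abs(z_r - img)
            if d > 1e-9:
                p_sq += 1.0 / d**2
    return Lw_db + 10 * np.log10(p_sq)

print(image_source_level(90.0, L=0.2, z_s=0.05, z_r=0.15))
```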

  17. Multi-scale inference of interaction rules in animal groups using Bayesian model selection.

    Directory of Open Access Journals (Sweden)

    Richard P Mann

    2012-01-01

    Full Text Available Inference of interaction rules of animals moving in groups usually relies on an analysis of large scale system behaviour. Models are tuned through repeated simulation until they match the observed behaviour. More recent work has used the fine scale motions of animals to validate and fit the rules of interaction of animals in groups. Here, we use a Bayesian methodology to compare a variety of models to the collective motion of glass prawns (Paratya australiensis). We show that these exhibit a stereotypical 'phase transition', whereby an increase in density leads to the onset of collective motion in one direction. We fit models to this data, which range from: a mean-field model where all prawns interact globally; to a spatial Markovian model where prawns are self-propelled particles influenced only by the current positions and directions of their neighbours; up to non-Markovian models where prawns have 'memory' of previous interactions, integrating their experiences over time when deciding to change behaviour. We show that the mean-field model fits the large scale behaviour of the system, but does not capture fine scale rules of interaction, which are primarily mediated by physical contact. Conversely, the Markovian self-propelled particle model captures the fine scale rules of interaction but fails to reproduce global dynamics. The most sophisticated model, the non-Markovian model, provides a good match to the data at both the fine scale and in terms of reproducing global dynamics. We conclude that prawns' movements are influenced by not just the current direction of nearby conspecifics, but also those encountered in the recent past. Given the simplicity of prawns as a study system our research suggests that self-propelled particle models of collective motion should, if they are to be realistic at multiple biological scales, include memory of previous interactions and other non-Markovian effects.

  18. Parallel runs of a large air pollution model on a grid of Sun computers

    DEFF Research Database (Denmark)

    Alexandrov, V.N.; Owczarz, W.; Thomsen, Per Grove

    2004-01-01

    Large -scale air pollution models can successfully be used in different environmental studies. These models are described mathematically by systems of partial differential equations. Splitting procedures followed by discretization of the spatial derivatives leads to several large systems of ordin...

  19. Multicriteria decision group model for the selection of suppliers

    Directory of Open Access Journals (Sweden)

    Luciana Hazin Alencar

    2008-08-01

    Full Text Available Several authors have been studying group decision making over the years, which indicates how relevant it is. This paper presents a multicriteria group decision model based on the ELECTRE IV and VIP Analysis methods, suited to cases where there is great divergence among the decision makers. The model includes two stages. In the first, the ELECTRE IV method is applied and a collective criteria ranking is obtained. In the second, using the criteria ranking, VIP Analysis is applied and the alternatives are selected. To illustrate the model, a numerical application in the context of the selection of suppliers in project management is used. The suppliers that form part of the project team have a crucial role in project management. They are involved in a network of connected activities that can jeopardize the success of the project if they are not undertaken in an appropriate way. The question tackled is how to select service suppliers for a project, on behalf of an enterprise, in a way that addresses the multiple objectives of the decision makers.

  20. Computer-aided polymer design using group contribution plus property models

    DEFF Research Database (Denmark)

    Satyanarayana, Kavitha Chelakara; Abildskov, Jens; Gani, Rafiqul

    2009-01-01

    The preliminary step for polymer product design is to identify the basic repeat unit structure of the polymer that matches the target properties. Computer-aided molecular design (CAMD) approaches can be applied for generating the polymer repeat unit structures that match the required constraints....... Polymer repeat unit property prediction models are required to calculate the properties of the generated repeat units. A systematic framework incorporating recently developed group contribution plus (GC(+)) models and an extended CAMD technique to include design of polymer repeat units is highlighted...... in this paper. The advantage of a GC(+) model in CAMD applications is that a very large number of polymer structures can be considered even though some of the group parameters may not be available. A number of case studies involving different polymer design problems have been solved through the developed...

  1. Modeling phytoplankton community in reservoirs. A comparison between taxonomic and functional groups-based models.

    Science.gov (United States)

    Di Maggio, Jimena; Fernández, Carolina; Parodi, Elisa R; Diaz, M Soledad; Estrada, Vanina

    2016-01-01

    In this paper we address the formulation of two mechanistic water quality models that differ in the way the phytoplankton community is described. We carry out parameter estimation subject to differential-algebraic constraints and validation for each model, and a comparison of the models' performance. The first approach aggregates phytoplankton species based on their phylogenetic characteristics (Taxonomic group model) and the second one, on their morpho-functional properties following Reynolds' classification (Functional group model). The latter approach takes into account tolerance and sensitivity to environmental conditions. The constrained parameter estimation problems are formulated within an equation-oriented framework, with a maximum likelihood objective function. The study site is Paso de las Piedras Reservoir (Argentina), which supplies drinking water to a population of 450,000. Numerical results show that phytoplankton morpho-functional groups more closely represent the growth requirements of the species within each group. Each model's performance is quantitatively assessed by three diagnostic measures. Parameter estimation results for the seasonal dynamics of the phytoplankton community and the main biogeochemical variables over a one-year time horizon are presented and compared for both models, showing the enhanced performance of the functional group model. Finally, we explore increasing nutrient loading scenarios and predict their effect on phytoplankton dynamics throughout a one-year time horizon. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Large animal models of rare genetic disorders: sheep as phenotypically relevant models of human genetic disease.

    Science.gov (United States)

    Pinnapureddy, Ashish R; Stayner, Cherie; McEwan, John; Baddeley, Olivia; Forman, John; Eccles, Michael R

    2015-09-02

    Animals that accurately model human disease are invaluable in medical research, allowing a critical understanding of disease mechanisms, and the opportunity to evaluate the effect of therapeutic compounds in pre-clinical studies. Many types of animal models are used world-wide, with the most common being small laboratory animals, such as mice. However, rodents often do not faithfully replicate human disease, despite their predominant use in research. This discordancy is due in part to physiological differences, such as body size and longevity. In contrast, large animal models, including sheep, provide an alternative to mice for biomedical research due to their greater physiological parallels with humans. Completion of the full genome sequences of many species, and the advent of Next Generation Sequencing (NGS) technologies, means it is now feasible to screen large populations of domesticated animals for genetic variants that resemble human genetic diseases, and generate models that more accurately model rare human pathologies. In this review, we discuss the notion of using sheep as large animal models, and their advantages in modelling human genetic disease. We exemplify several existing naturally occurring ovine variants in genes that are orthologous to human disease genes, such as the Cln6 sheep model for Batten disease. These, and other sheep models, have contributed significantly to our understanding of the relevant human disease process, in addition to providing opportunities to trial new therapies in animals with similar body and organ size to humans. Therefore sheep are a significant species with respect to the modelling of rare genetic human disease, which we summarize in this review.

  3. Many large medical groups will need to acquire new skills and tools to be ready for payment reform.

    Science.gov (United States)

    Mechanic, Robert; Zinner, Darren E

    2012-09-01

    Federal and state policy makers are now experimenting with programs that hold health systems accountable for delivering care under predetermined budgets to help control health care spending. To assess how well prepared medical groups are to participate in these arrangements, we surveyed twenty-one large, multispecialty groups. We evaluated their participation in risk contracts such as capitation and the degree of operational support associated with these arrangements. On average, about 25 percent of the surveyed groups' patient care revenue stemmed from global capitation contracts and 9 percent from partial capitation or shared risk contracts. Groups with a larger share of revenue from risk contracts were more likely than others to have salaried physicians, advanced data management capabilities, preferred relationships with efficient specialists, and formal programs to coordinate care for high-risk patients. Our findings suggest that medical groups that lack risk contracting experience may need to develop new competencies and infrastructure to successfully navigate federal payment reform programs, including information systems that track performance and support clinicians in delivering good care; physician-level reward systems that are aligned with organizational goals; sound physician leadership; and an organizational commitment to supporting performance improvement. The difficulty of implementing these changes in complex health care organizations should not be underestimated.

  4. METHODOLOGY AND CALCULATIONS FOR THE ASSIGNMENT OF WASTE GROUPS FOR THE LARGE UNDERGROUND WASTE STORAGE TANKS AT THE HANFORD SITE

    Energy Technology Data Exchange (ETDEWEB)

    WEBER RA

    2009-01-16

    The Hanford Site contains 177 large underground radioactive waste storage tanks (28 double-shell tanks and 149 single-shell tanks). These tanks are categorized into one of three waste groups (A, B, and C) based on their waste and tank characteristics. These waste group assignments reflect a tank's propensity to retain a significant volume of flammable gases and the potential of the waste to release retained gas by a buoyant displacement gas release event. Assignments of waste groups to the 177 double-shell tanks and single-shell tanks, as reported in this document, are based on a Monte Carlo analysis of three criteria. The first criterion is the headspace flammable gas concentration following release of retained gas. This criterion determines whether the tank contains sufficient retained gas such that the well-mixed headspace flammable gas concentration would reach 100% of the lower flammability limit if the entire tank's retained gas were released. If the volume of retained gas is not sufficient to reach 100% of the lower flammability limit, then flammable conditions cannot be reached and the tank is classified as a waste group C tank independent of the method the gas is released. The second criterion is the energy ratio and considers whether there is sufficient supernatant on top of the saturated solids such that gas-bearing solids have the potential energy required to break up the material and release gas. Tanks that are not waste group C tanks and that have an energy ratio < 3.0 do not have sufficient potential energy to break up material and release gas and are assigned to waste group B. These tanks are considered to represent a potential induced flammable gas release hazard, but no spontaneous buoyant displacement flammable gas release hazard. Tanks that are not waste group C tanks and have an energy ratio {ge} 3.0, but that pass the third criterion (buoyancy ratio < 1.0, see below) are also assigned to waste group B. Even though the designation as
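
    The three criteria above can be read as a simple decision rule; the sketch below is a single deterministic evaluation, whereas the document's methodology applies these criteria within a Monte Carlo analysis over uncertain tank parameters (assignment to group A for tanks failing all three criteria is implied by the text):

```python
def waste_group(headspace_reaches_lfl: bool,
                energy_ratio: float,
                buoyancy_ratio: float) -> str:
    """Deterministic sketch of the three-criterion waste group assignment."""
    # Criterion 1: retained gas insufficient to reach 100% of the lower
    # flammability limit -> waste group C regardless of release mechanism.
    if not headspace_reaches_lfl:
        return "C"
    # Criterion 2: energy ratio < 3.0 -> insufficient potential energy to
    # break up solids and release gas spontaneously -> waste group B.
    if energy_ratio < 3.0:
        return "B"
    # Criterion 3: buoyancy ratio < 1.0 -> still waste group B; otherwise
    # the tank is assigned to waste group A.
    return "B" if buoyancy_ratio < 1.0 else "A"

print(waste_group(True, 3.5, 1.2))   # -> "A"
```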

  5. METHODOLOGY AND CALCULATIONS FOR THE ASSIGNMENT OF WASTE GROUPS FOR THE LARGE UNDERGROUND WASTE STORAGE TANKS AT THE HANFORD SITE

    Energy Technology Data Exchange (ETDEWEB)

    FOWLER KD

    2007-12-27

    This document categorizes each of the large waste storage tanks into one of several categories based on each tank's waste characteristics. These waste group assignments reflect a tank's propensity to retain a significant volume of flammable gases and the potential of the waste to release retained gas by a buoyant displacement event. Revision 7 is the annual update of the calculations of the flammable gas Waste Groups for DSTs and SSTs. The Hanford Site contains 177 large underground radioactive waste storage tanks (28 double-shell tanks and 149 single-shell tanks). These tanks are categorized into one of three waste groups (A, B, and C) based on their waste and tank characteristics. These waste group assignments reflect a tank's propensity to retain a significant volume of flammable gases and the potential of the waste to release retained gas by a buoyant displacement gas release event. Assignments of waste groups to the 177 double-shell tanks and single-shell tanks, as reported in this document, are based on a Monte Carlo analysis of three criteria. The first criterion is the headspace flammable gas concentration following release of retained gas. This criterion determines whether the tank contains sufficient retained gas such that the well-mixed headspace flammable gas concentration would reach 100% of the lower flammability limit if the entire tank's retained gas were released. If the volume of retained gas is not sufficient to reach 100% of the lower flammability limit, then flammable conditions cannot be reached and the tank is classified as a waste group C tank independent of the method the gas is released. The second criterion is the energy ratio and considers whether there is sufficient supernatant on top of the saturated solids such that gas-bearing solids have the potential energy required to break up the material and release gas. Tanks that are not waste group C tanks and that have an energy ratio < 3.0 do not have sufficient

  6. Long-Run Properties of Large-Scale Macroeconometric Models

    OpenAIRE

    Kenneth F. WALLIS-; John D. WHITLEY

    1987-01-01

    We consider alternative approaches to the evaluation of the long-run properties of dynamic nonlinear macroeconometric models, namely dynamic simulation over an extended database, or the construction and direct solution of the steady-state version of the model. An application to a small model of the UK economy is presented. The model is found to be unstable, but a stable form can be produced by simple alterations to the structure.

  7. Group Contribution Based Process Flowsheet Synthesis, Design and Modelling

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2005-01-01

    the flowsheet structure. Just as a functional group is a collection of atoms, a process-group is a collection of operations forming an "unit" operation or a set of "unit" operations. The link between the process-groups are the streams similar to the bonds that are attachments to atoms/groups. Each process...

  8. Evaluation of drought propagation in an ensemble mean of large-scale hydrological models

    NARCIS (Netherlands)

    Loon, van A.F.; Huijgevoort, van M.H.J.; Lanen, van H.A.J.

    2012-01-01

    Hydrological drought is increasingly studied using large-scale models. It is, however, not certain whether large-scale models reproduce the development of hydrological drought correctly. The pressing question is how well large-scale models simulate the propagation from meteorological to hydrological

  9. Study newsletters, community and ethics advisory boards, and focus group discussions provide ongoing feedback for a large biobank.

    Science.gov (United States)

    McCarty, Catherine A; Garber, Ann; Reeser, Jonathan C; Fost, Norman C

    2011-04-01

    The Personalized Medicine Research Project (PMRP) is a population-based biobank with more than 20,000 adult participants in central Wisconsin. A Community Advisory Group (CAG) and Ethics and Security Advisory Board (ESAB) provide ongoing feedback. In addition, the study newsletter is used as a two-way communication tool with study participants. The aim of this study was to assess and compare feedback received from these communication/consultation strategies with results from focus group discussions in relation to protocol changes. In summer 2009, enrollee focus groups were held addressing these topics: newsletter format, readability, and content of three articles written to solicit PMRP subject feedback. The CAG and ESAB jointly reviewed focus group results, discussed protocol changes to access residual blood samples, and made recommendations about the general communication approach. Nearly everyone in three focus groups stated that they wanted more information about PMRP. No focus group participant said that accessing stored samples would have changed their enrollment decision. Most said they wanted to be informed directly about changes affecting their original consent. For minimal-risk PMRP protocol changes, the community, CAG, and ESAB were comfortable with an opt-out model because of the initial broad consent. The planned duration of the biobank extends for decades; therefore regular, ongoing communication to enrollees is necessary to maintain awareness and trust, especially relating to protocol changes reflecting evolving science. The multi-faceted approach to communication including newsletters, external advisory boards, and focus group discussions has been successful for the PMRP biobank and may be a model for others to consider. Copyright © 2011 Wiley-Liss, Inc.

  10. Integrated modeling of the Canadian Very Large Optical Telescope

    Science.gov (United States)

    Roberts, Scott C.; Pazder, John S.; Fitzsimmons, Joeleff T.; Herriot, Glen; Loewen, Nathan; Smith, Malcolm J.; Dunn, Jennifer; Saddlemyer, Leslie K.

    2004-07-01

    We describe the VLOT integrated model, which simulates the telescope optical performance under the influence of external disturbances including wind. Details of the implementation in the MATLAB/SIMULINK environment are given, and the data structures are described. The structural to optical interface is detailed, including a discussion of coordinate transformations. The optical model includes both an interface with ZEMAX to perform raytracing analysis and an efficient Linear Optics Model for producing telescope optical path differences from within MATLAB. An extensive set of optical analysis routines has been developed for use with the integrated model. The telescope finite element model, state-space formulation and the high fidelity 1500 mode modal state-space structural dynamics model are presented. Control systems and wind models are described. We present preliminary results, showing the delivered image quality under the influence of wind on the primary mirror, with and without primary mirror control.

  11. Development of a Large Animal Model of Osteochondritis Dissecans of the Knee

    Science.gov (United States)

    Pfeifer, Christian G.; Kinsella, Stuart D.; Milby, Andrew H.; Fisher, Matthew B.; Belkin, Nicole S.; Mauck, Robert L.; Carey, James L.

    2015-01-01

    Background: Treatment of osteochondritis dissecans (OCD) of the knee is challenging, and evidence for stage-dependent treatment options is lacking. Basic science approaches utilizing animal models have provided insight into the etiology of OCD but have yet to produce a reliable and reproducible large animal model of the disease on which to test new surgical strategies. Purpose/Hypotheses: The purpose of this study was to develop an animal model featuring an OCD-like lesion in terms of size, location, and International Cartilage Repair Society (ICRS) grading. The hypothesis was that surgical creation of an osteochondral defect followed by placement of a barrier between parent bone and progeny fragment would generate a reproducible OCD-like lesion. Study Design: Controlled laboratory study. Methods: Bilateral osteochondral lesions were created in the medial femoral condyles of 9 Yucatan minipigs. After lesion creation, a biodegradable membrane was interposed between the progeny and parent bone. Five different treatment groups were evaluated at 2 weeks: a control with no membrane (ctrl group; n = 4), a slowly degrading nanofibrous poly(∊-caprolactone) membrane (PCL group; n = 4), a fenestrated PCL membrane with 1.5-mm holes covering 25% of surface area (fenPCL group; n = 4), a collagen membrane (Bio-Gide) (CM group; n = 3), and a fenestrated CM (fenCM group; n = 3). Five unperturbed lateral condyles (1 from each treatment group) served as sham controls. After euthanasia on day 14, the lesion was evaluated by gross inspection, fluoroscopy, micro–computed tomography (micro-CT), and histology. To quantify changes between groups, a scoring system based on gross appearance (0-2), fluoroscopy (0-2), and micro-CT (0-6) was established. Micro-CT was used to quantify bone volume per total volume (BV/TV) in a defined region surrounding and inclusive of the defect. Results: The no scaffold group showed healing of the subchondral bone at 2 weeks, with continuity of

  12. Thermal conductivity of the accidental degeneracy and enlarged symmetry group models for superconducting UPt3

    International Nuclear Information System (INIS)

    Graf, M.J.; Los Alamos National Lab., NM; Yip, S.K.; Sauls, J.A.

    1999-01-01

    The authors present theoretical calculations of the thermal conductivity for the accidental degeneracy and enlarged symmetry group models that have been proposed to explain the phase diagram of UPt3. The order parameters for these models possess point nodes or cross nodes, reflecting the broken symmetries of the ground state. These broken symmetries lead to robust predictions for the ratio of the low-temperature thermal conductivity for heat flow along the c axis and in the basal plane. The anisotropy of the heat current response at low temperatures is determined by the phase space for scattering by impurities. The measured anisotropy ratio, κc/κb, provides a strong constraint on theoretical models for the ground state order parameter. The accidental degeneracy and enlarged symmetry group models based on no spin-orbit coupling do not account for the thermal conductivity of UPt3. The models for the order parameter that fit the experimental data for the c and b directions of the heat current are the 2D E1g and E2u models, for which the order parameters possess line nodes in the ab-plane and point nodes along the c axis, and the A1g ⊕ E1g model of Zhitomirsky and Ueda. This model spontaneously breaks rotational symmetry in the ab-plane below Tc2 and predicts a large anisotropy for the ab-plane heat current

  13. Modeling Temporal Behavior in Large Networks: A Dynamic Mixed-Membership Model

    Energy Technology Data Exchange (ETDEWEB)

    Rossi, R; Gallagher, B; Neville, J; Henderson, K

    2011-11-11

    Given a large time-evolving network, how can we model and characterize the temporal behaviors of individual nodes (and network states)? How can we model the behavioral transition patterns of nodes? We propose a temporal behavior model that captures the 'roles' of nodes in the graph and how they evolve over time. The proposed dynamic behavioral mixed-membership model (DBMM) is scalable, fully automatic (no user-defined parameters), non-parametric/data-driven (no specific functional form or parameterization), interpretable (identifies explainable patterns), and flexible (applicable to dynamic and streaming networks). Moreover, the interpretable behavioral roles are generalizable, computationally efficient, and natively supports attributes. We applied our model for (a) identifying patterns and trends of nodes and network states based on the temporal behavior, (b) predicting future structural changes, and (c) detecting unusual temporal behavior transitions. We use eight large real-world datasets from different time-evolving settings (dynamic and streaming). In particular, we model the evolving mixed-memberships and the corresponding behavioral transitions of Twitter, Facebook, IP-Traces, Email (University), Internet AS, Enron, Reality, and IMDB. The experiments demonstrate the scalability, flexibility, and effectiveness of our model for identifying interesting patterns, detecting unusual structural transitions, and predicting the future structural changes of the network and individual nodes.
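
    The DBMM itself is not reproduced here, but the general idea (factor per-snapshot structural features into node-role memberships, then estimate a role-transition matrix) can be sketched as follows; the feature set, the NMF factorization, and the least-squares transition estimate are all simplifying assumptions of this illustration:

```python
# Minimal sketch of role-based temporal behavior modeling (NOT the authors'
# DBMM implementation): extract per-snapshot node roles with NMF on structural
# features, then estimate a role-transition matrix between snapshots.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

def random_graph(n, p):
    """Toy undirected random graph as a symmetric 0/1 adjacency matrix."""
    upper = np.triu(rng.random((n, n)) < p, k=1)
    return (upper | upper.T).astype(float)

def node_features(adj):
    """Toy structural features per node: degree, 2-hop reach, triangle count."""
    deg = adj.sum(axis=1)
    two_hop = (adj @ adj > 0).sum(axis=1).astype(float)
    tri = np.diag(adj @ adj @ adj) / 2.0
    return np.column_stack([deg, two_hop, tri])

snapshots = [random_graph(50, 0.08) for _ in range(2)]   # stand-ins for real data
n_roles = 3
nmf = NMF(n_components=n_roles, init="nndsvda", max_iter=500, random_state=0)

memberships = []
for adj in snapshots:
    G = nmf.fit_transform(node_features(adj))             # node-by-role loadings
    memberships.append(G / (G.sum(axis=1, keepdims=True) + 1e-12))

# Crude estimate of a role-transition matrix T (role at t -> role at t+1).
prev, curr = memberships[0], memberships[1]
T, *_ = np.linalg.lstsq(prev, curr, rcond=None)
T = np.clip(T, 0, None)
T /= T.sum(axis=1, keepdims=True) + 1e-12
print("estimated role-transition matrix:\n", T.round(2))
```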

  14. A hierarchical causal modeling for large industrial plants supervision

    International Nuclear Information System (INIS)

    Dziopa, P.; Leyval, L.

    1994-01-01

    A supervision system has to analyse the current state of the process and the way it will evolve after a modification of the inputs or a disturbance. It is proposed to base this analysis on a hierarchy of models, which differ by the number of involved variables and the abstraction level used to describe their temporal evolution. In a first step, special attention is paid to building the causal models, starting from the most abstract one. Once the hierarchy of models has been built, the most detailed model parameters are estimated. Several models of different abstraction levels can be used for on-line prediction. These methods have been applied to a nuclear reprocessing plant. The abstraction level can be chosen on line by the operator. Moreover, when an abnormal process behaviour is detected, a more detailed model is automatically triggered in order to focus the operator's attention on the suspected subsystem. (authors). 11 refs., 11 figs

  15. Testing Group Mean Differences of Latent Variables in Multilevel Data Using Multiple-Group Multilevel CFA and Multilevel MIMIC Modeling.

    Science.gov (United States)

    Kim, Eun Sook; Cao, Chunhua

    2015-01-01

    Considering that group comparisons are common in social science, we examined two latent group mean testing methods when groups of interest were either at the between or within level of multilevel data: multiple-group multilevel confirmatory factor analysis (MG ML CFA) and multilevel multiple-indicators multiple-causes modeling (ML MIMIC). The performance of these methods was investigated through three Monte Carlo studies. In Studies 1 and 2, either factor variances or residual variances were manipulated to be heterogeneous between groups. In Study 3, which focused on within-level multiple-group analysis, six different model specifications were considered depending on how to model the intra-class group correlation (i.e., correlation between random effect factors for groups within cluster). The results of simulations generally supported the adequacy of MG ML CFA and ML MIMIC for multiple-group analysis with multilevel data. The two methods did not show any notable difference in the latent group mean testing across the three studies. Finally, a demonstration with real data and guidelines for selecting an appropriate approach to multilevel multiple-group analysis are provided.

  16. Benchmark of Deep Learning Models on Large Healthcare MIMIC Datasets

    OpenAIRE

    Purushotham, Sanjay; Meng, Chuizheng; Che, Zhengping; Liu, Yan

    2017-01-01

    Deep learning models (aka Deep Neural Networks) have revolutionized many fields including computer vision, natural language processing, speech recognition, and are being increasingly used in clinical healthcare applications. However, few works exist which have benchmarked the performance of the deep learning models with respect to the state-of-the-art machine learning models and prognostic scoring systems on publicly available healthcare datasets. In this paper, we present the benchmarking res...

  17. VLSI (Very Large Scale Integrated Circuits) Device Reliability Models.

    Science.gov (United States)

    1984-12-01

    components have been particularly effective on phased array radars, including Cobra Dane, Pave Paws, Cobra Judy and AN/TPS-59. In spite of the large number... [The remainder of this record is a fragmentary vendor data-availability table; recoverable entries include California Devices (San Jose, CA) and Micro-Pac Industries (Garland, TX), both listed as "Promised Data", and Teledyne Philbrick, listed as "No Data Available".]

  18. Truck Route Choice Modeling using Large Streams of GPS Data

    Science.gov (United States)

    2017-07-31

    The primary goal of this research was to use large streams of truck-GPS data to analyze travel routes (or paths) chosen by freight trucks to travel between different origin and destination (OD) location pairs in metropolitan regions of Florida. Two s...

  19. Symmetry in stochasticity: Random walk models of large-scale ...

    Indian Academy of Sciences (India)

    This paper describes the insights gained from the excursion set approach, in which various questions about the phenomenology of large-scale structure formation can be mapped to problems associated with the first crossing distribution of appropriately defined barriers by random walks. Much of this is summarized in R K ...
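
    A minimal sketch of the central object mentioned above, the first-crossing distribution of a barrier by random walks, is shown below; it uses plain Monte Carlo with an assumed constant barrier and toy step sizes, not the paper's calculation:

```python
# Hedged illustration (not from the paper): Monte Carlo estimate of the first-
# crossing distribution of a constant barrier by discrete random walks, the
# basic object used in excursion-set treatments of large-scale structure.
import numpy as np

rng = np.random.default_rng(1)
n_walks, n_steps, barrier = 100_000, 400, 1.0          # assumed toy parameters

steps = rng.normal(0.0, 1.0 / np.sqrt(n_steps), size=(n_walks, n_steps))
walks = np.cumsum(steps, axis=1)                        # variance grows linearly
crossed = walks >= barrier
first = np.where(crossed.any(axis=1), crossed.argmax(axis=1), -1)

# Fraction of walks that have first-crossed by each step (cumulative distribution).
hist = np.bincount(first[first >= 0], minlength=n_steps)
f_first = np.cumsum(hist) / n_walks
print("fraction crossed by final step:", f_first[-1].round(3))
```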

  20. Modelling the exposure of wildlife to radiation: key findings and activities of IAEA working groups

    Energy Technology Data Exchange (ETDEWEB)

    Beresford, Nicholas A. [NERC Centre for Ecology and Hydrology, Lancaster Environment Center, Library Av., Bailrigg, Lancaster, LA1 4AP (United Kingdom); School of Environment and Life Sciences, University of Salford, Manchester, M4 4WT (United Kingdom); Vives i Batlle, Jordi; Vandenhove, Hildegarde [Belgian Nuclear Research Centre, Belgian Nuclear Research Centre, Boeretang 200, 2400 Mol (Belgium); Beaugelin-Seiller, Karine [Institut de Radioprotection et de Surete Nucleaire (IRSN), PRP-ENV, SERIS, LM2E, Cadarache (France); Johansen, Mathew P. [ANSTO Australian Nuclear Science and Technology Organisation, New Illawarra Rd, Menai, NSW (Australia); Goulet, Richard [Canadian Nuclear Safety Commission, Environmental Risk Assessment Division, 280 Slater, Ottawa, K1A0H3 (Canada); Wood, Michael D. [School of Environment and Life Sciences, University of Salford, Manchester, M4 4WT (United Kingdom); Ruedig, Elizabeth [Department of Environmental and Radiological Health Sciences, Colorado State University, Fort Collins (United States); Stark, Karolina; Bradshaw, Clare [Department of Ecology, Environment and Plant Sciences, Stockholm University, SE-10691 (Sweden); Andersson, Pal [Swedish Radiation Safety Authority, SE-171 16, Stockholm (Sweden); Copplestone, David [Biological and Environmental Sciences, University of Stirling, Stirling, FK9 4LA (United Kingdom); Yankovich, Tamara L.; Fesenko, Sergey [International Atomic Energy Agency, Vienna International Centre, 1400, Vienna (Austria)

    2014-07-01

    In total, participants from 14 countries, representing 19 organisations, actively participated in the model application/inter-comparison activities of the IAEA's EMRAS II programme Biota Modelling Group. A range of models/approaches were used by participants (e.g. the ERICA Tool, RESRAD-BIOTA, the ICRP Framework). The agreed objectives of the group were: 'To improve Member States' capabilities for protection of the environment by comparing and validating models being used, or developed, for biota dose assessment (that may be used) as part of the regulatory process of licensing and compliance monitoring of authorised releases of radionuclides.' The activities of the group, the findings of which will be described, included: - An assessment of the predicted unweighted absorbed dose rates for 74 radionuclides estimated by 10 approaches for five of the ICRP's Reference Animal and Plant geometries assuming 1 Bq per unit organism or media. - Modelling the effect of heterogeneous distributions of radionuclides in sediment profiles on the estimated exposure of organisms. - Model prediction - field data comparisons for freshwater ecosystems in a uranium mining area and a number of wetland environments. - An evaluation of the application of available models to a scenario considering radioactive waste buried in shallow trenches. - Estimating the contribution of ²³⁵U to dose rates in freshwater environments. - Evaluation of the factors contributing to variation in modelling results. The work of the group continues within the framework of the IAEA's MODARIA programme, which was initiated in 2012. The work plan of the MODARIA working group has largely been defined by the findings of the previous EMRAS programme. On-going activities of the working group, which will be described, include the development of a database of dynamic parameters for wildlife dose assessment and exercises involving modelling the exposure of organisms in the marine coastal

  1. Comparative transcriptome analysis of muscular dystrophy models Large(myd), Dmd(mdx)/Large(myd) and Dmd(mdx): what makes them different?

    Science.gov (United States)

    Almeida, Camila F; Martins, Poliana Cm; Vainzof, Mariz

    2016-08-01

    Muscular dystrophies (MD) are a clinically and genetically heterogeneous group of Mendelian diseases. The underlying pathophysiology and phenotypic variability in each form are much more complex, suggesting the involvement of many other genes. Thus, here we studied the whole genome expression profile in muscles from three mice models for MD, at different time points: Dmd(mdx) (mutation in dystrophin gene), Large(myd-/-) (mutation in Large) and Dmd(mdx)/Large(myd-/-) (both mutations). The identification of altered biological functions can contribute to understand diseases and to find prognostic biomarkers and points for therapeutic intervention. We identified a substantial number of differentially expressed genes (DEGs) in each model, reflecting diseases' complexity. The main biological process affected in the three strains was immune system, accounting for the majority of enriched functional categories, followed by degeneration/regeneration and extracellular matrix remodeling processes. The most notable differences were in 21-day-old Dmd(mdx), with a high proportion of DEGs related to its regenerative capacity. A higher number of positive embryonic myosin heavy chain (eMyHC) fibers confirmed this. The new Dmd(mdx)/Large(myd-/-) model did not show a highly different transcriptome from the parental lineages, with a profile closer to Large(myd-/-), but not bearing the same regenerative potential as Dmd(mdx). This is the first report about transcriptome profile of a mouse model for congenital MD and Dmd(mdx)/Large(myd). By comparing the studied profiles, we conclude that alterations in biological functions due to the dystrophic process are very similar, and that the intense regeneration in Dmd(mdx) involves a large number of activated genes, not differentially expressed in the other two strains.

  2. Misspecified poisson regression models for large-scale registry data

    DEFF Research Database (Denmark)

    Grøn, Randi; Gerds, Thomas A.; Andersen, Per K.

    2016-01-01

    working models that are then likely misspecified. To support and improve conclusions drawn from such models, we discuss methods for sensitivity analysis, for estimation of average exposure effects using aggregated data, and a semi-parametric bootstrap method to obtain robust standard errors. The methods...

  3. Modelling expected train passenger delays on large scale railway networks

    DEFF Research Database (Denmark)

    Landex, Alex; Nielsen, Otto Anker

    2006-01-01

    Forecasts of regularity for railway systems have traditionally – if at all – been computed for trains, not for passengers. Relatively recently it has become possible to model and evaluate the actual passenger delays by a passenger regularity model for the operation already carried out. First...

  4. Comparing large-scale computational approaches to epidemic modeling: Agent-based versus structured metapopulation models

    Directory of Open Access Journals (Sweden)

    Merler Stefano

    2010-06-01

    Full Text Available Abstract Background In recent years large-scale computational models for the realistic simulation of epidemic outbreaks have been used with increased frequency. Methodologies adapt to the scale of interest and range from very detailed agent-based models to spatially-structured metapopulation models. One major issue thus concerns to what extent the geotemporal spreading pattern found by different modeling approaches may differ and depend on the different approximations and assumptions used. Methods We provide for the first time a side-by-side comparison of the results obtained with a stochastic agent-based model and a structured metapopulation stochastic model for the progression of a baseline pandemic event in Italy, a large and geographically heterogeneous European country. The agent-based model is based on the explicit representation of the Italian population through highly detailed data on the socio-demographic structure. The metapopulation simulations use the GLobal Epidemic and Mobility (GLEaM model, based on high-resolution census data worldwide, and integrating airline travel flow data with short-range human mobility patterns at the global scale. The model also considers age structure data for Italy. GLEaM and the agent-based models are synchronized in their initial conditions by using the same disease parameterization, and by defining the same importation of infected cases from international travels. Results The results obtained show that both models provide epidemic patterns that are in very good agreement at the granularity levels accessible by both approaches, with differences in peak timing on the order of a few days. The relative difference of the epidemic size depends on the basic reproductive ratio, R0, and on the fact that the metapopulation model consistently yields a larger incidence than the agent-based model, as expected due to the differences in the structure in the intra-population contact pattern of the approaches. The age

  5. Comparing large-scale computational approaches to epidemic modeling: agent-based versus structured metapopulation models.

    Science.gov (United States)

    Ajelli, Marco; Gonçalves, Bruno; Balcan, Duygu; Colizza, Vittoria; Hu, Hao; Ramasco, José J; Merler, Stefano; Vespignani, Alessandro

    2010-06-29

    In recent years large-scale computational models for the realistic simulation of epidemic outbreaks have been used with increased frequency. Methodologies adapt to the scale of interest and range from very detailed agent-based models to spatially-structured metapopulation models. One major issue thus concerns to what extent the geotemporal spreading pattern found by different modeling approaches may differ and depend on the different approximations and assumptions used. We provide for the first time a side-by-side comparison of the results obtained with a stochastic agent-based model and a structured metapopulation stochastic model for the progression of a baseline pandemic event in Italy, a large and geographically heterogeneous European country. The agent-based model is based on the explicit representation of the Italian population through highly detailed data on the socio-demographic structure. The metapopulation simulations use the GLobal Epidemic and Mobility (GLEaM) model, based on high-resolution census data worldwide, and integrating airline travel flow data with short-range human mobility patterns at the global scale. The model also considers age structure data for Italy. GLEaM and the agent-based models are synchronized in their initial conditions by using the same disease parameterization, and by defining the same importation of infected cases from international travels. The results obtained show that both models provide epidemic patterns that are in very good agreement at the granularity levels accessible by both approaches, with differences in peak timing on the order of a few days. The relative difference of the epidemic size depends on the basic reproductive ratio, R0, and on the fact that the metapopulation model consistently yields a larger incidence than the agent-based model, as expected due to the differences in the structure in the intra-population contact pattern of the approaches. The age breakdown analysis shows that similar attack rates are
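
    As a toy-scale illustration of the metapopulation style of model discussed in these two records (not GLEaM itself), a chain-binomial stochastic SIR model over a handful of patches coupled by an assumed mobility matrix might look like this:

```python
# Minimal stochastic SIR metapopulation sketch in the spirit of GLEaM-style
# models (NOT the published model): subpopulations coupled by a mobility
# matrix, with binomial infection/recovery draws in each patch.
import numpy as np

rng = np.random.default_rng(2)
n_patch, beta, gamma, days = 4, 0.3, 0.1, 200
N = np.array([500_000, 200_000, 100_000, 50_000], dtype=float)   # assumed sizes
mobility = np.full((n_patch, n_patch), 0.001)
np.fill_diagonal(mobility, 0.0)

S, I, R = N.copy(), np.zeros(n_patch), np.zeros(n_patch)
I[0], S[0] = 10, S[0] - 10                      # seed the outbreak in patch 0

for _ in range(days):
    # local transmission and recovery (chain-binomial step)
    p_inf = 1.0 - np.exp(-beta * I / N)
    new_inf = rng.binomial(S.astype(int), p_inf)
    new_rec = rng.binomial(I.astype(int), 1.0 - np.exp(-gamma))
    S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
    # coupling: expected exchange of infectious individuals between patches
    I = I + mobility.T @ I - mobility.sum(axis=1) * I

print("final attack rates per patch:", (R / N).round(3))
```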

  6. Assessing the reliability of predictive activity coefficient models for molecules consisting of several functional groups

    Directory of Open Access Journals (Sweden)

    R. P. Gerber

    2013-03-01

    Full Text Available Currently, the most successful predictive models for activity coefficients are those based on functional groups such as UNIFAC. On the other hand, these models require a large amount of experimental data for the determination of their parameter matrix. A more recent alternative is the models based on COSMO, for which only a small set of universal parameters must be calibrated. In this work, a recalibrated COSMO-SAC model was compared with the UNIFAC (Do) model, employing experimental infinite dilution activity coefficient data for 2236 non-hydrogen-bonding binary mixtures at different temperatures. As expected, UNIFAC (Do) presented better overall performance, with a mean absolute error of 0.12 ln-units against 0.22 for our COSMO-SAC implementation. However, in cases involving molecules with several functional groups or when functional groups appear in an unusual way, the deviation for UNIFAC was 0.44 as opposed to 0.20 for COSMO-SAC. These results show that COSMO-SAC provides more reliable predictions for multi-functional or more complex molecules, reaffirming its future prospects.
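
    The comparison metric quoted above, the mean absolute error in ln-units of infinite-dilution activity coefficients, is straightforward to compute; the values below are invented placeholders, not data from the paper:

```python
# Toy illustration of the comparison metric: mean absolute error in ln-units
# between predicted and experimental infinite-dilution activity coefficients.
# The numbers are made up for illustration only.
import numpy as np

ln_gamma_exp = np.log(np.array([2.1, 15.0, 0.8, 45.0]))
ln_gamma_unifac = np.log(np.array([2.3, 13.5, 0.85, 40.0]))
ln_gamma_cosmo = np.log(np.array([2.6, 11.0, 0.95, 52.0]))

def mae(pred):
    return np.mean(np.abs(pred - ln_gamma_exp))

print(f"UNIFAC (Do): {mae(ln_gamma_unifac):.2f} ln-units")
print(f"COSMO-SAC  : {mae(ln_gamma_cosmo):.2f} ln-units")
```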

  7. Multilevel method for modeling large-scale networks.

    Energy Technology Data Exchange (ETDEWEB)

    Safro, I. M. (Mathematics and Computer Science)

    2012-02-24

    Understanding the behavior of real complex networks is of great theoretical and practical significance. It includes developing accurate artificial models whose topological properties are similar to the real networks, generating the artificial networks at different scales under special conditions, investigating a network dynamics, reconstructing missing data, predicting network response, detecting anomalies and other tasks. Network generation, reconstruction, and prediction of its future topology are central issues of this field. In this project, we address the questions related to the understanding of the network modeling, investigating its structure and properties, and generating artificial networks. Most of the modern network generation methods are based either on various random graph models (reinforced by a set of properties such as power law distribution of node degrees, graph diameter, and number of triangles) or on the principle of replicating an existing model with elements of randomization such as R-MAT generator and Kronecker product modeling. Hierarchical models operate at different levels of network hierarchy but with the same finest elements of the network. However, in many cases the methods that include randomization and replication elements on the finest relationships between network nodes and modeling that addresses the problem of preserving a set of simplified properties do not fit accurately enough the real networks. Among the unsatisfactory features are numerically inadequate results, non-stability of algorithms on real (artificial) data, that have been tested on artificial (real) data, and incorrect behavior at different scales. One reason is that randomization and replication of existing structures can create conflicts between fine and coarse scales of the real network geometry. Moreover, the randomization and satisfying of some attribute at the same time can abolish those topological attributes that have been undefined or hidden from
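
    One of the replication-based generators mentioned above, stochastic Kronecker graph generation, can be sketched generically as follows; this is a textbook construction with an assumed 2x2 initiator matrix, not the project's own method:

```python
# Hedged sketch of stochastic Kronecker graph generation (generic construction,
# not the project's method): Kronecker-power an initiator probability matrix and
# sample each edge independently.
import numpy as np

rng = np.random.default_rng(3)
P1 = np.array([[0.9, 0.5],
               [0.5, 0.2]])          # 2x2 initiator probability matrix (assumed)
k = 8                                # number of Kronecker powers -> 2**k nodes

P = P1.copy()
for _ in range(k - 1):
    P = np.kron(P, P1)               # edge-probability matrix of the full graph

adj = (rng.random(P.shape) < P).astype(np.uint8)   # sample each edge independently
deg = adj.sum(axis=1)
print("nodes:", P.shape[0], "edges:", int(adj.sum()), "max degree:", int(deg.max()))
```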

  8. A model for recovery kinetics of aluminum after large strain

    DEFF Research Database (Denmark)

    Yu, Tianbo; Hansen, Niels

    2012-01-01

    A model is suggested to analyze recovery kinetics of heavily deformed aluminum. The model is based on the hardness of isothermal annealed samples before recrystallization takes place, and it can be extrapolated to longer annealing times to factor out the recrystallization component of the hardness...... for conditions where recovery and recrystallization overlap. The model is applied to the isothermal recovery at temperatures between 140 and 220°C of commercial purity aluminum deformed to true strain 5.5. EBSD measurements have been carried out to detect the onset of discontinuous recrystallization. Furthermore...
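
    The paper's specific recovery model is not reproduced in this record, so the sketch below only illustrates the general technique: fitting an assumed logarithmic recovery law to hardness-versus-annealing-time data and extrapolating it to longer times, where recovery and recrystallization overlap. All numbers are invented:

```python
# Illustrative sketch only: a common empirical choice is logarithmic recovery of
# hardness with annealing time, H(t) = H0 - a*ln(1 + t/t0), fitted by least squares.
import numpy as np
from scipy.optimize import curve_fit

def hardness(t, H0, a, t0):
    return H0 - a * np.log1p(t / t0)

t_min = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)   # made-up data
H_meas = np.array([62.0, 60.5, 58.8, 57.2, 55.4, 54.0, 52.7])

popt, _ = curve_fit(hardness, t_min, H_meas, p0=(62.0, 2.0, 1.0))
H0, a, t0 = popt
print(f"fit: H0={H0:.1f} HV, a={a:.2f}, t0={t0:.2f} min")

# Extrapolating hardness(t) to long times gives the recovery-only component that
# can be subtracted from measurements where recrystallization overlaps.
print("extrapolated recovered hardness at 10^4 min:", round(hardness(1e4, *popt), 1))
```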

  9. TOPICAL REVIEW: Nonlinear aspects of the renormalization group flows of Dyson's hierarchical model

    Science.gov (United States)

    Meurice, Y.

    2007-06-01

    We review recent results concerning the renormalization group (RG) transformation of Dyson's hierarchical model (HM). This model can be seen as an approximation of a scalar field theory on a lattice. We introduce the HM and show that its large symmetry group drastically simplifies the block-spinning procedure. Several equivalent forms of the recursion formula are presented with unified notations. Rigorous and numerical results concerning the recursion formula are summarized. It is pointed out that the recursion formula of the HM is inequivalent to both Wilson's approximate recursion formula and Polchinski's equation in the local potential approximation (despite the very small difference with the exponents of the latter). We draw a comparison between the RG of the HM and functional RG equations in the local potential approximation. The construction of the linear and nonlinear scaling variables is discussed in an operational way. We describe the calculation of non-universal critical amplitudes in terms of the scaling variables of two fixed points. This question appears as a problem of interpolation between these fixed points. Universal amplitude ratios are calculated. We discuss the large-N limit and the complex singularities of the critical potential calculable in this limit. The interpolation between the HM and more conventional lattice models is presented as a symmetry breaking problem. We briefly introduce models with an approximate supersymmetry. One important goal of this review is to present a configuration space counterpart, suitable for lattice formulations, of functional RG equations formulated in momentum space (often called exact RG equations and abbreviated ERGE).

  10. Large deviations for Gaussian queues modelling communication networks

    CERN Document Server

    Mandjes, Michel

    2007-01-01

    Michel Mandjes, Centre for Mathematics and Computer Science (CWI) Amsterdam, The Netherlands, and Professor, Faculty of Engineering, University of Twente. At CWI Mandjes is a senior researcher and Director of the Advanced Communications Network group.  He has published over 60 papers on queuing theory, networks, scheduling, and pricing of networks.

  11. Traffic assignment models in large-scale applications

    DEFF Research Database (Denmark)

    Rasmussen, Thomas Kjær

    of observations of actual behaviour to obtain estimates of the (monetary) value of different travel time components, thereby increasing the behavioural realism of large-scale models. The generation of choice sets is a vital component in route choice models. This is, however, not a straightforward task in real......-perceptions. It is the commonly adopted assumption that the distributed elements follow unbounded distributions which induces the need to enumerate all paths in the SUE, no matter how unattractive they might be. The Deterministic User Equilibrium (DUE), on the other hand, has a built-in criterion distinguishing definitely unused...... non-universal choice sets and (ii) flow distribution according to random utility maximisation theory. One model allows distinction between used and unused routes based on the distribution of the random error terms, while the other model allows this distinction by posing restrictions on the costs...

  12. Nonequilibrium Dynamics of Anisotropic Large Spins in the Kondo Regime: Time-Dependent Numerical Renormalization Group Analysis

    Science.gov (United States)

    Roosen, David; Wegewijs, Maarten R.; Hofstetter, Walter

    2008-02-01

    We investigate the time-dependent Kondo effect in a single-molecule magnet (SMM) strongly coupled to metallic electrodes. Describing the SMM by a Kondo model with large spin S>1/2, we analyze the underscreening of the local moment and the effect of anisotropy terms on the relaxation dynamics of the magnetization. Underscreening by single-channel Kondo processes leads to a logarithmically slow relaxation, while finite uniaxial anisotropy causes a saturation of the SMM’s magnetization. Additional transverse anisotropy terms induce quantum spin tunneling and a pseudospin-1/2 Kondo effect sensitive to the spin parity.

  13. Large scale Bayesian nuclear data evaluation with consistent model defects

    International Nuclear Information System (INIS)

    Schnabel, G

    2015-01-01

    The aim of nuclear data evaluation is the reliable determination of cross sections and related quantities of the atomic nuclei. To this end, evaluation methods are applied which combine the information of experiments with the results of model calculations. The evaluated observables with their associated uncertainties and correlations are assembled into data sets, which are required for the development of novel nuclear facilities, such as fusion reactors for energy supply, and accelerator driven systems for nuclear waste incineration. The efficiency and safety of such future facilities is dependent on the quality of these data sets and thus also on the reliability of the applied evaluation methods. This work investigated the performance of the majority of available evaluation methods in two scenarios. The study indicated the importance of an essential component in these methods, which is the frequently ignored deficiency of nuclear models. Usually, nuclear models are based on approximations and thus their predictions may deviate from reliable experimental data. As demonstrated in this thesis, the neglect of this possibility in evaluation methods can lead to estimates of observables which are inconsistent with experimental data. Due to this finding, an extension of Bayesian evaluation methods is proposed to take into account the deficiency of the nuclear models. The deficiency is modeled as a random function in terms of a Gaussian process and combined with the model prediction. This novel formulation conserves sum rules and allows to explicitly estimate the magnitude of model deficiency. Both features are missing in available evaluation methods so far. Furthermore, two improvements of existing methods have been developed in the course of this thesis. The first improvement concerns methods relying on Monte Carlo sampling. A Metropolis-Hastings scheme with a specific proposal distribution is suggested, which proved to be more efficient in the studied scenarios than the
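
    A minimal sketch of the central idea, treating the model defect as a Gaussian-process term added to a deficient model prediction and inferring it from residuals against experimental data, is shown below; the toy model, kernel, and data are assumptions of this illustration, not the thesis' formulation:

```python
# Minimal sketch (assumptions throughout) of a Gaussian-process model defect:
# data are described as model(E) + defect(E), and the defect is inferred by
# standard GP regression on the residuals.
import numpy as np

def model(E):                 # deliberately deficient physics model (toy)
    return 1.0 / (1.0 + E)

def kernel(x, y, amp=0.05, length=0.5):
    return amp**2 * np.exp(-0.5 * (x[:, None] - y[None, :])**2 / length**2)

rng = np.random.default_rng(4)
E_exp = np.linspace(0.1, 3.0, 12)                         # measurement energies
sigma = 0.01
truth = 1.0 / (1.0 + E_exp) + 0.04 * np.sin(2.0 * E_exp)  # "true" observable
y_exp = truth + rng.normal(0.0, sigma, E_exp.size)

# GP posterior mean for the defect, conditioned on residuals y_exp - model(E_exp).
resid = y_exp - model(E_exp)
K = kernel(E_exp, E_exp) + sigma**2 * np.eye(E_exp.size)
E_new = np.linspace(0.1, 3.0, 200)
defect_mean = kernel(E_new, E_exp) @ np.linalg.solve(K, resid)

evaluated = model(E_new) + defect_mean                    # evaluated observable
print("max estimated defect magnitude:", np.abs(defect_mean).max().round(3))
```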

  14. A working group`s conclusion on site specific flow and transport modelling

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, J. [Golder Associates AB (Sweden); Ahokas, H. [Fintact Oy, Helsinki (Finland); Koskinen, L.; Poteri, A. [VTT Energy, Espoo (Finland); Niemi, A. [Royal Inst. of Technology, Stockholm (Sweden). Hydraulic Engineering; Hautojaervi, A. [Posiva Oy, Helsinki (Finland)

    1998-03-01

    This document suggests a strategy plan for the groundwater flow and transport modelling to be used in the site-specific performance assessment of spent nuclear fuel disposal for the site selection planned by the year 2000. Considering suggested general regulations in Finland, as well as suggested regulations in Sweden and the approach taken in recent safety assessment exercises conducted in these countries, it is clear that in such an analysis, in addition to showing that the proposed repository is safe, there is a need to strengthen the link between field data, groundwater flow modelling and the derivation of safety assessment parameters, and to assess uncertainty and variability. The suggested strategy plan builds on an evaluation of different approaches to modelling the groundwater flow in crystalline basement rock, the abundance of data collected in the site investigation programme in Finland, and the modelling methodology developed in the programme so far. It is suggested to model the whole system using nested models, where larger-scale models provide the boundary conditions for the smaller ones. 62 refs.

  15. Nonlinear Synapses for Large-Scale Models: An Efficient Representation Enables Complex Synapse Dynamics Modeling in Large-Scale Simulations

    Directory of Open Access Journals (Sweden)

    Eric eHu

    2015-09-01

    Full Text Available Chemical synapses are composed of a wide collection of intricate signaling pathways involving complex dynamics. These mechanisms are often reduced to simple spikes or exponential representations in order to enable computer simulations at higher spatial levels of complexity. However, these representations cannot capture important nonlinear dynamics found in synaptic transmission. Here, we propose an input-output (IO) synapse model capable of generating complex nonlinear dynamics while maintaining low computational complexity. This IO synapse model is an extension of a detailed mechanistic glutamatergic synapse model capable of capturing the input-output relationships of the mechanistic model using the Volterra functional power series. We demonstrate that the IO synapse model is able to successfully track the nonlinear dynamics of the synapse up to the third order with high accuracy. We also evaluate the accuracy of the IO synapse model at different input frequencies and compare its performance with that of kinetic models in compartmental neuron models. Our results demonstrate that the IO synapse model is capable of efficiently replicating complex nonlinear dynamics that were represented in the original mechanistic model and provides a method to replicate complex and diverse synaptic transmission within neuron network simulations.
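
    A generic discrete second-order Volterra series response, the mathematical form underlying the IO synapse model, can be sketched as follows; the kernels and spike-train input below are arbitrary placeholders rather than the kernels estimated in the paper:

```python
# Generic sketch of a discrete second-order Volterra series response. The actual
# IO synapse kernels are estimated from the mechanistic model in the paper; the
# kernels here are arbitrary placeholders.
import numpy as np

rng = np.random.default_rng(5)
M = 20                                         # kernel memory length (samples)
lags = np.arange(M)
k1 = 0.8 * np.exp(-lags / 5.0)                 # first-order kernel (assumed shape)
k2 = 0.05 * np.outer(np.exp(-lags / 8.0), np.exp(-lags / 8.0))  # second-order kernel

def volterra_response(x, k0=0.0):
    """y[n] = k0 + sum_i k1[i] x[n-i] + sum_{i,j} k2[i,j] x[n-i] x[n-j]."""
    y = np.full(x.size, k0)
    for n in range(x.size):
        h = x[max(0, n - M + 1):n + 1][::-1]   # most recent samples first
        h = np.pad(h, (0, M - h.size))
        y[n] += k1 @ h + h @ k2 @ h
    return y

spikes = (rng.random(200) < 0.1).astype(float)  # presynaptic spike train input
print("peak response:", volterra_response(spikes).max().round(3))
```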

  16. Group spike-and-slab lasso generalized linear models for disease prediction and associated genes detection by incorporating pathway information.

    Science.gov (United States)

    Tang, Zaixiang; Shen, Yueping; Li, Yan; Zhang, Xinyan; Wen, Jia; Qian, Chen'ao; Zhuang, Wenzhuo; Shi, Xinghua; Yi, Nengjun

    2018-03-15

    Large-scale molecular data have been increasingly used as an important resource for prognostic prediction of diseases and detection of associated genes. However, standard approaches for omics data analysis ignore the group structure among genes encoded in functional relationships or pathway information. We propose new Bayesian hierarchical generalized linear models, called group spike-and-slab lasso GLMs, for predicting disease outcomes and detecting associated genes by incorporating large-scale molecular data and group structures. The proposed model employs a mixture double-exponential prior for coefficients that induces self-adaptive shrinkage amounts on different coefficients. The group information is incorporated into the model by setting group-specific parameters. We have developed a fast and stable deterministic algorithm to fit the proposed hierarchical GLMs, which can perform variable selection within groups. We assess the performance of the proposed method on several simulated scenarios, by varying the overlap among groups, group size, number of non-null groups, and the correlation within groups. Compared with existing methods, the proposed method provides not only more accurate estimates of the parameters but also better prediction. We further demonstrate the application of the proposed procedure on three cancer datasets by utilizing pathway structures of genes. Our results show that the proposed method generates powerful models for predicting disease outcomes and detecting associated genes. The methods have been implemented in a freely available R package BhGLM (http://www.ssg.uab.edu/bhglm/). nyi@uab.edu. Supplementary data are available at Bioinformatics online.

  17. Topological σ Models and Large N Matrix Integral

    Science.gov (United States)

    Eguchi, Tohru; Hori, Kentaro; Yang, Sung-Kil

    In this paper we describe in some detail the representation of the topological CP1 model in terms of a matrix integral which we have introduced in a previous article. We first discuss the integrable structure of the CP1 model and show that it is governed by an extension of the one-dimensional Toda hierarchy. We then introduce a matrix model which reproduces the sum over holomorphic maps from arbitrary Riemann surfaces onto CP1. We compute intersection numbers on the moduli space of curves using a geometrical method and show that the results agree with those predicted by the matrix model. We also develop a Landau-Ginzburg (LG) description of the CP1 model using a superpotential e^X + e^{t_{0,Q}} e^{-X} given by the Lax operator of the Toda hierarchy (X is the LG field and t_{0,Q} is the coupling constant of the Kähler class). The form of the superpotential indicates the close connection between CP1 and N=2 supersymmetric sine-Gordon theory which was noted some time ago by several authors. We also discuss possible generalizations of our construction to other manifolds and present an LG formulation of the topological CP2 model.

  18. Bayesian latent feature modeling for modeling bipartite networks with overlapping groups

    DEFF Research Database (Denmark)

    Jørgensen, Philip H.; Mørup, Morten; Schmidt, Mikkel Nørgaard

    2016-01-01

    by the notion of community structure such that the edge density within groups is higher than between groups. Our model further assumes that entities can have different propensities of generating links in one of the modes. The proposed framework is contrasted on both synthetic and real bi-partite networks...... to the infinite relational model and the infinite Bernoulli mixture model. We find that the model provides a new latent feature representation of structure while in link-prediction performing close to existing models. Our current extension of the notion of communities and collapsed inference to binary latent...... feature representations in bipartite networks provides a new framework for accounting for structure in bi-partite networks using binary latent feature representations providing interpretable representations that well characterize structure as quantified by link prediction....

  19. Efficient stochastic approaches for sensitivity studies of an Eulerian large-scale air pollution model

    Science.gov (United States)

    Dimov, I.; Georgieva, R.; Todorov, V.; Ostromsky, Tz.

    2017-10-01

    Reliability of large-scale mathematical models is an important issue when such models are used to support decision makers. Sensitivity analysis of model outputs to variation or natural uncertainties of model inputs is crucial for improving the reliability of mathematical models. A comprehensive experimental study of Monte Carlo algorithms based on Sobol sequences for multidimensional numerical integration has been done. A comparison with Latin hypercube sampling and a particular quasi-Monte Carlo lattice rule based on generalized Fibonacci numbers has been presented. The algorithms have been successfully applied to compute global Sobol sensitivity measures corresponding to the influence of several input parameters (six chemical reaction rates and four different groups of pollutants) on the concentrations of important air pollutants. The concentration values have been generated by the Unified Danish Eulerian Model. The sensitivity study has been done for the areas of several European cities with different geographical locations. The numerical tests show that the stochastic algorithms under consideration are efficient for multidimensional integration and especially for computing sensitivity indices that are small in value. This is crucial, since even small indices may need to be estimated in order to achieve a more accurate attribution of input influence and a more reliable interpretation of the mathematical model results.
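
    The pick-freeze Monte Carlo estimator for first-order Sobol indices, the quantity computed in this study, can be illustrated on the standard Ishigami test function; plain pseudo-random sampling is used below instead of Sobol sequences, and the test function is a stand-in for the air pollution model:

```python
# Hedged sketch (not the UNI-DEM study itself): first-order Sobol indices for
# the Ishigami test function via the standard pick-freeze Monte Carlo estimator.
import numpy as np

rng = np.random.default_rng(6)

def ishigami(x, a=7.0, b=0.1):
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2 + b * x[:, 2]**4 * np.sin(x[:, 0])

n, d = 100_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))
fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                     # replace only column i ("pick-freeze")
    fABi = ishigami(ABi)
    S_i = np.mean(fB * (fABi - fA)) / var   # Saltelli (2010) first-order estimator
    print(f"S_{i+1} ~ {S_i:.3f}")
```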

  20. Report of study group 7.2: comparison of medium or large scale CHP and combined cycles, in various countries

    Energy Technology Data Exchange (ETDEWEB)

    Roncato, J.P. [Finergaz (France); Macchi, E. [Politecnico di Milano, Milan (Italy)

    2000-07-01

    At the turn of the third millennium, important changes are occurring in terms of energy policy and deregulation of the energy market, in numerous countries. It is clear that cogeneration has undergone an impressive development in a large number of countries, over the last five years. But it is also clear that gas fired electricity production will have to face a much more uncertain situation in the coming years with less predictability concerning both energy prices and costs of access to the grids in a de-regulated context, and more generally regarding economic environment of the projects. The aim of the present report is to show that even with different situations, concerning energy prices, conditions of access to the grid and incentives, there is a logical link between the profitability and the development rate of cogeneration or combined cycles in different countries. Detailed data have therefore been collected from a selection of countries, in order to compare on a consistent basis the profitability of several typical projects. From these data, the study group has then been able to compute the pay-back of these projects for each country, and to perform a sensitivity analysis to different parameters. Data were collected from Japan and 10 European countries represented in the study group. In spite of several contacts, it was unfortunately not possible to collect consistent data from a larger number of other countries, nevertheless the study group believes that the results obtained are representative of a significant range of situations. (authors)

  1. A wide-range model of two-group cross sections in the dynamics code HEXTRAN

    International Nuclear Information System (INIS)

    Kaloinen, E.; Peltonen, J.

    2002-01-01

    In dynamic analyses the thermal hydraulic conditions within the reactor core may have a large variation, which sets a special requirement on the modeling of cross sections. The standard model in the dynamics code HEXTRAN is the same as in the static design code HEXBU-3D/MODS. It is based on a linear and second-order fitting of two-group cross sections on fuel and moderator temperature, moderator density and boron density. A new, wide-range model of cross sections developed in Fortum Nuclear Services for HEXBU-3D/MOD6 has been included as an option in HEXTRAN. In this model the nodal cross sections are constructed from seven state variables in a polynomial of more than 40 terms. Coefficients of the polynomial are obtained by a least-squares fit to the results of a large number of fuel assembly calculations. Depending on the choice of state variables for the spectrum calculations, the new cross section model is capable of covering local conditions from cold zero power to boiling at full power. The fifth dynamic benchmark problem of AER is analyzed with the new option, and results are compared to calculations with the standard cross section model in HEXTRAN. (Authors)
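
    The fitting technique described above, least-squares determination of polynomial coefficients for a cross section as a function of state variables, can be sketched with a deliberately small, made-up example (two state variables and four terms instead of seven variables and more than 40 terms):

```python
# Illustrative sketch of the general technique only: least-squares fit of one
# cross section on a small polynomial basis of state variables. All data and
# basis terms are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)
n = 500
T_fuel = rng.uniform(400.0, 1500.0, n)       # fuel temperature [K]
rho_mod = rng.uniform(300.0, 800.0, n)       # moderator density [kg/m3]

# Synthetic "assembly calculation" results for an absorption cross section.
sigma_a = (0.012 + 1.5e-6 * np.sqrt(T_fuel) + 8.0e-6 * rho_mod
           - 4.0e-9 * rho_mod * np.sqrt(T_fuel) + rng.normal(0, 1e-5, n))

# Design matrix of polynomial terms; coefficients found by least squares.
X = np.column_stack([np.ones(n), np.sqrt(T_fuel), rho_mod, rho_mod * np.sqrt(T_fuel)])
coef, *_ = np.linalg.lstsq(X, sigma_a, rcond=None)
print("fitted coefficients:", coef)
print("max residual:", np.abs(X @ coef - sigma_a).max())
```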

  2. Using Breakout Groups as an Active Learning Technique in a Large Undergraduate Nutrition Classroom at the University of Guelph

    Directory of Open Access Journals (Sweden)

    Genevieve Newton

    2012-12-01

    Full Text Available Breakout groups have been widely used under many different conditions, but the lack of published information related to their use in undergraduate settings highlights the need for research related to their use in this context. This paper describes a study investigating the use of breakout groups in undergraduate education as it specifically relates to teaching a large 4th year undergraduate Nutrition class in a physically constrained lecture space. In total, 220 students completed a midterm survey and 229 completed a final survey designed to measure student satisfaction. Survey results were further analyzed to measure relationships between student perception of breakout group effectiveness and (1) gender and (2) cumulative GPA. Results of both surveys revealed that over 85% of students either agreed or strongly agreed that using breakout groups enhanced their learning experience, with females showing a significantly greater level of satisfaction and higher final course grade than males. Although not stratified by gender, a consistent finding between surveys was a lower perception of breakout group effectiveness by students with a cumulative GPA above 90%. The majority of respondents felt that despite the awkward room space, the breakout groups were easy to create and participate in, which suggests that breakout groups can be successfully used in a large undergraduate classroom despite physical constraints. The findings of this work are relevant given the applicability of breakout groups to a wide range of disciplines, and the relative ease of integration into a traditional lecture format.

  3. Modeling and simulation of large scale stirred tank

    Science.gov (United States)

    Neuville, John R.

    The purpose of this dissertation is to provide a written record of the evaluation performed on the DWPF mixing process by the construction of numerical models that resemble the geometry of this process. There were seven numerical models constructed to evaluate the DWPF mixing process and four pilot plants. The models were developed with Fluent software, and the results from these models were used to evaluate the structure of the flow field and the power demand of the agitator. The results from the numerical models were compared with empirical data collected from these pilot plants that had been operated at an earlier date. Mixing is commonly used in a variety of ways throughout industry to blend miscible liquids, disperse gas through liquid, form emulsions, promote heat transfer, and suspend solid particles. The DOE sites at Hanford in Richland, Washington; West Valley in New York; and the Savannah River Site in Aiken, South Carolina have developed a process that immobilizes highly radioactive liquid waste. The radioactive liquid waste at DWPF is an opaque sludge that is mixed in a stirred tank with glass frit particles and water to form a slurry of specified proportions. The DWPF mixing process is composed of a flat bottom cylindrical mixing vessel with a centrally located helical coil and an agitator. The helical coil is used to heat and cool the contents of the tank and can improve flow circulation. The agitator shaft has two impellers: a radial blade and a hydrofoil blade. The hydrofoil is used to circulate the mixture between the top region and bottom region of the tank. The radial blade sweeps the bottom of the tank and pushes the fluid in the outward radial direction. The full scale vessel contains about 9500 gallons of slurry with flow behavior characterized as a Bingham plastic. Particles in the mixture have an abrasive characteristic that causes excessive erosion to internal vessel components at higher impeller speeds. The desire for this mixing process is to ensure the

  4. The pig as a large preclinical model for therapeutic human anti-cancer vaccine development

    DEFF Research Database (Denmark)

    Overgaard, Nana Haahr; Frøsig, Thomas Mørch; Welner, Simon

    2016-01-01

    Development of therapeutic cancer vaccines has largely been based on rodent models and the majority failed to establish therapeutic responses in clinical trials. We therefore used pigs as a large animal model for human cancer vaccine development due to the large similarity between the porcine...

  5. Description of the East Brazil Large Marine Ecosystem using a trophic model

    Directory of Open Access Journals (Sweden)

    Kátia M.F. Freire

    2008-09-01

    Full Text Available The objective of this study was to describe the marine ecosystem off northeastern Brazil. A trophic model was constructed for the 1970s using Ecopath with Ecosim. The impact of most of the forty-one functional groups was modest, probably due to the highly reticulated diet matrix. However, seagrass and macroalgae exerted a strong positive impact on manatee and herbivorous reef fishes, respectively. A high negative impact of omnivorous reef fishes on spiny lobsters and of sharks on swordfish was observed. Spiny lobsters and swordfish had the largest biomass changes for the simulation period (1978-2000; tunas, other large pelagics and sharks showed intermediate rates of biomass decline; and a slight increase in biomass was observed for toothed cetaceans, large carnivorous reef fishes, and dolphinfish. Recycling was an important feature of this ecosystem with low phytoplankton-originated primary production. The mean transfer efficiency between trophic levels was 11.4%. The gross efficiency of the fisheries was very low (0.00002, probably due to the low exploitation rate of most of the resources in the 1970s. Basic local information was missing for many groups. When information gaps are filled, this model may serve more credibly for the exploration of fishing policies for this area within an ecosystem approach.

  6. Dynamic Modeling, Optimization, and Advanced Control for Large Scale Biorefineries

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail

    Second generation biorefineries transform agricultural wastes into biochemicals with higher added value, e.g. bioethanol, which is thought to become a primary component in liquid fuels [1]. Extensive endeavors have been conducted to make the production process feasible on a large scale, and recen......-time monitoring. The Inbicon biorefinery converts wheat straw into bioethanol utilizing steam, enzymes, and genetically modified yeast. The biomass is first pretreated in a steam pressurized and continuous thermal reactor where lignin is relocated, and hemicellulose partially hydrolyzed such that cellulose...... becomes more accessible to enzymes. The biorefinery is integrated with a nearby power plant following the Integrated Biomass Utilization System (IBUS) principle for reducing steam costs [4]. During the pretreatment, by-products are also created such as organic acids, furfural, and pseudo-lignin, which act

  7. Multiple-membership multiple-classification models for social network and group dependences.

    Science.gov (United States)

    Tranmer, Mark; Steel, David; Browne, William J

    2014-02-01

    The social network literature on network dependences has largely ignored other sources of dependence, such as the school that a student attends, or the area in which an individual lives. The multilevel modelling literature on school and area dependences has, in turn, largely ignored social networks. To bridge this divide, a multiple-membership multiple-classification modelling approach for jointly investigating social network and group dependences is presented. This allows social network and group dependences on individual responses to be investigated and compared. The approach is used to analyse a subsample of the Adolescent Health Study data set from the USA, where the response variable of interest is individual level educational attainment, and the three individual level covariates are sex, ethnic group and age. Individual, network, school and area dependences are accounted for in the analysis. The network dependences can be accounted for by including the network as a classification in the model, using various network configurations, such as ego-nets and cliques. The results suggest that ignoring the network affects the estimates of variation for the classifications that are included in the random part of the model (school, area and individual), as well as having some influence on the point estimates and standard errors of the estimates of regression coefficients for covariates in the fixed part of the model. From a substantive perspective, this approach provides a flexible and practical way of investigating variation in an individual level response due to social network dependences, and estimating the share of variation of an individual response for network, school and area classifications.

  8. How Hyperarousal and Sleep Reactivity Are Represented in Different Adult Age Groups: Results from a Large Cohort Study on Insomnia.

    Science.gov (United States)

    Altena, Ellemarije; Chen, Ivy Y; Daviaux, Yannick; Ivers, Hans; Philip, Pierre; Morin, Charles M

    2017-04-14

    Hyperarousal is a 24-h state of elevated cognitive and physiological activation, and is a core feature of insomnia. The extent to which sleep quality is affected by stressful events (so-called sleep reactivity) is a vulnerability factor for developing insomnia. Given the increasing prevalence of insomnia with age, we aimed to investigate how hyperarousal and sleep reactivity were related to insomnia severity in different adult age groups. Data were derived from a large cohort study investigating the natural history of insomnia in a population-based sample (n = 1693). Baseline data of the Arousal Predisposition Scale (APS) and Ford Insomnia Response to Stress Test (FIRST) were examined across age and sleep/insomnia subgroups: 25-35 (n = 448), 35-45 (n = 528), and 45-55 year olds (n = 717); good sleepers (n = 931), individuals with insomnia symptoms (n = 450), and individuals with an insomnia syndrome (n = 312). Results from factorial analyses of variance (ANOVA) showed that APS scores decreased with increasing age, but increased with more severe sleep problems. FIRST scores were not significantly different across age groups, but showed the same strong increase as a function of sleep problem severity. The findings indicate that though arousal predisposition and sleep reactivity increase with more severe sleep problems, only arousal decreases with age. How arousing events affect an individual during daytime thus decreases with age, but how this arousal disrupts sleep is equivalent across different adult age groups. The main implication of these findings is that treatment of insomnia could be adapted for different age groups and take into consideration vulnerability factors such as hyperarousal and stress reactivity.

  9. Large-area dry bean yield prediction modeling in Mexico

    Science.gov (United States)

    Given the importance of dry bean in Mexico, crop yield predictions before harvest are valuable for authorities of the agricultural sector, in order to define support for producers. The aim of this study was to develop an empirical model to estimate the yield of dry bean at the regional level prior t...

  10. Models of 'obesity' in large animals and birds.

    Science.gov (United States)

    Clarke, Iain J

    2008-01-01

    Most laboratory-based research on obesity is carried out in rodents, but there are a number of other interesting models in the animal kingdom that are instructive. This includes domesticated animal species such as pigs and sheep, as well as wild, migrating and hibernating species. Larger animals allow particular experimental manipulations that are not possible in smaller animals and especially useful models have been developed to address issues such as manipulation of fetal development. Although some of the most well-studied models are ruminants, with metabolic control that differs from monogastrics, the general principles of metabolic regulation still pertain. It is possible to obtain much more accurate endocrine profiles in larger animals and this has provided important data in relation to leptin and ghrelin physiology. Genetic models have been created in domesticated animals through selection and these complement those of the laboratory rodent. This short review highlights particular areas of research in domesticated and wild species that expand our knowledge of systems that are important for our understanding of obesity and metabolism.

  11. Searches for phenomena beyond the Standard Model at the Large ...

    Indian Academy of Sciences (India)

    Keywords. LHC; ATLAS; CMS; BSM; supersymmetry; exotic. Abstract. The LHC has delivered several fb-1 of data in spring and summer 2011, opening new windows of opportunity for discovering phenomena beyond the Standard Model. A summary of the searches conducted by the ATLAS and CMS experiments based on ...

  12. Optimisation of a large WWTP thanks to mathematical modelling.

    Science.gov (United States)

    Printemps, C; Baudin, A; Dormoy, T; Zug, M; Vanrolleghem, P A

    2004-01-01

    Better control and optimisation of plant processes has become a priority for WWTP (Wastewater Treatment Plant) managers. The main objective of this project is to develop a simplified mathematical tool able to reproduce and anticipate the behaviour of the Tougas WWTP (Nantes, France), intended for direct use by the managers of the site. The mathematical WWTP model was created using the software WEST. This paper describes the studied site and the modelling results obtained during model calibration and validation. The simulation results show that, despite a first very simple description of the WWTP, the model correctly predicts the nitrogen composition (ammonia and nitrate) of the effluent and the daily sludge extraction. A second, more detailed configuration of the WWTP was then implemented, which allows the behaviour of each of the four biological trains to be studied independently. Once this first stage is completely achieved, the remainder of the study will focus on the operational use of a simplified simulator with the purpose of optimising the Tougas WWTP operation.

  13. Energy-aware semantic modeling in large scale infrastructures

    NARCIS (Netherlands)

    Zhu, H.; van der Veldt, K.; Grosso, P.; Zhao, Z.; Liao, X.; de Laat, C.

    2012-01-01

    Including the energy profile of the computing infrastructure in the decision process for scheduling computing tasks and allocating resources is essential to improve the system energy efficiency. However, the lack of an effective model of the infrastructure energy information makes it difficult for

  14. Searches for phenomena beyond the Standard Model at the Large

    Indian Academy of Sciences (India)

    The LHC has delivered several fb-1 of data in spring and summer 2011, opening new windows of opportunity for discovering phenomena beyond the Standard Model. A summary of the searches conducted by the ATLAS and CMS experiments based on about 1 fb-1 of data is presented.

  15. Symmetry-guided large-scale shell-model theory

    Czech Academy of Sciences Publication Activity Database

    Launey, K. D.; Dytrych, Tomáš; Draayer, J. P.

    2016-01-01

    Roč. 89, JUL (2016), s. 101-136 ISSN 0146-6410 R&D Projects: GA ČR GA16-16772S Institutional support: RVO:61389005 Keywords : Ab initio shell-model theory * Symplectic symmetry * Collectivity * Clusters * Hoyle state * Orderly patterns in nuclei from first principles Subject RIV: BE - Theoretical Physics Impact factor: 11.229, year: 2016

  16. Misspecified poisson regression models for large-scale registry data: inference for 'large n and small p'.

    Science.gov (United States)

    Grøn, Randi; Gerds, Thomas A; Andersen, Per K

    2016-03-30

    Poisson regression is an important tool in register-based epidemiology where it is used to study the association between exposure variables and event rates. In this paper, we will discuss the situation with 'large n and small p', where n is the sample size and p is the number of available covariates. Specifically, we are concerned with modeling options when there are time-varying covariates that can have time-varying effects. One problem is that tests of the proportional hazards assumption, of no interactions between exposure and other observed variables, or of other modeling assumptions have large power due to the large sample size and will often indicate statistical significance even for numerically small deviations that are unimportant for the subject matter. Another problem is that information on important confounders may be unavailable. In practice, this situation may lead to simple working models that are then likely misspecified. To support and improve conclusions drawn from such models, we discuss methods for sensitivity analysis, for estimation of average exposure effects using aggregated data, and a semi-parametric bootstrap method to obtain robust standard errors. The methods are illustrated using data from the Danish national registries investigating the diabetes incidence for individuals treated with antipsychotics compared with the general unexposed population. Copyright © 2015 John Wiley & Sons, Ltd.
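
    As a rough illustration of the "large n, small p" working model discussed above, the following sketch fits a Poisson rate model on aggregated data with a log person-time offset and heteroscedasticity-robust standard errors. Variable names and the input file are hypothetical, and the paper's semi-parametric bootstrap and sensitivity analyses are not reproduced here.

    ```python
    # Illustrative Poisson regression for aggregated registry data:
    # event counts with a log person-years offset and robust (sandwich) standard errors.
    # Variable names and the input file are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.read_csv("registry_aggregated.csv")  # columns: events, person_years, exposed, age_band

    fit = smf.glm(
        "events ~ exposed + C(age_band)",
        data=df,
        family=sm.families.Poisson(),
        offset=np.log(df["person_years"]),
    ).fit(cov_type="HC0")  # robust standard errors as a simple guard against misspecification
    print(fit.summary())
    ```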

  17. The blink reflex test does not show abnormalities in a large group of patients with chronic migraine

    Directory of Open Access Journals (Sweden)

    Joseph Bruno Bidin Brooks

    2013-11-01

    Full Text Available The blink reflex – a simple, non-invasive and inexpensive test – may be indicative of lesions or dysfunctions of the brainstem, and particularly assesses the trigeminal-facial arch. Studies of blink reflex alterations in patients with headaches have provided controversial data. Method: Registration of the waves R1 and R2 (ipsilateral to the stimulus) and R2c (contralateral to the stimulus) by electroneuromyography. Results: A large number of controls (n=160) and patients with chronic migraine (n=160) were studied. No significant differences were observed between the two groups. Conclusion: It is possible that this relatively simple and primitive reflex is not affected unless there is significant damage to the brainstem.

  18. Airflow and radon transport modeling in four large buildings

    International Nuclear Information System (INIS)

    Fan, J.B.; Persily, A.K.

    1995-01-01

    Computer simulations of multizone airflow and contaminant transport were performed in four large buildings using the program CONTAM88. This paper describes the physical characteristics of the buildings and their idealizations as multizone building airflow systems. These buildings include a twelve-story multifamily residential building, a five-story mechanically ventilated office building with an atrium, a seven-story mechanically ventilated office building with an underground parking garage, and a one-story school building. The air change rates and interzonal airflows of these buildings are predicted for a range of wind speeds, indoor-outdoor temperature differences, and percentages of outdoor air intake in the supply air. Simulations of radon transport were also performed in the buildings to investigate the effects of indoor-outdoor temperature difference and wind speed on indoor radon concentrations.

  19. Informational and emotional elements in online support groups: a Bayesian approach to large-scale content analysis.

    Science.gov (United States)

    Deetjen, Ulrike; Powell, John A

    2016-05-01

    This research examines the extent to which informational and emotional elements are employed in online support forums for 14 purposively sampled chronic medical conditions and the factors that influence whether posts are of a more informational or emotional nature. Large-scale qualitative data were obtained from Dailystrength.org. Based on a hand-coded training dataset, all posts were classified into informational or emotional using a Bayesian classification algorithm to generalize the findings. Posts that could not be classified with a probability of at least 75% were excluded. The overall tendency toward emotional posts differs by condition: mental health (depression, schizophrenia) and Alzheimer's disease forums consist of more emotional posts, while informational posts relate more to nonterminal physical conditions (irritable bowel syndrome, diabetes, asthma). There is no gender difference across conditions, although prostate cancer forums are oriented toward informational support, whereas breast cancer forums feature more emotional support. Across diseases, the best predictors for emotional content are lower age and a higher number of overall posts by the support group member. The results are in line with previous empirical research and unify empirical findings from single- and two-condition research. Limitations include the analytical restriction to predefined categories (informational, emotional) through the chosen machine-learning approach. Our findings provide an empirical foundation for building theory on informational versus emotional support across conditions, give insights for practitioners to better understand the role of online support groups for different patients, and show the usefulness of machine-learning approaches to analyze large-scale qualitative health data from online settings. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
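
    The classification step described above (a Bayesian classifier trained on hand-coded posts, with a 75% probability cut-off) can be sketched with standard tooling. The snippet below is an illustration under those assumptions; the example posts, labels and the specific naive Bayes variant are hypothetical, not the study's actual pipeline.

    ```python
    # Illustrative Bayesian text classification of forum posts into "informational"
    # vs "emotional", keeping only predictions made with at least 75% probability.
    # Training posts and labels are hypothetical stand-ins for a hand-coded dataset.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB

    train_posts = ["what dosage worked for you?", "I feel so alone with this diagnosis"]
    train_labels = ["informational", "emotional"]

    vectorizer = CountVectorizer()
    X_train = vectorizer.fit_transform(train_posts)
    clf = MultinomialNB().fit(X_train, train_labels)

    new_posts = ["has anyone tried the new medication?"]
    probs = clf.predict_proba(vectorizer.transform(new_posts))
    for post, p in zip(new_posts, probs):
        label = clf.classes_[p.argmax()]
        # exclude posts that cannot be classified with at least 75% probability
        print(post, "->", label if p.max() >= 0.75 else "unclassified (below 75% threshold)")
    ```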

  20. Renormalization group study of the one-dimensional quantum Potts model

    International Nuclear Information System (INIS)

    Solyom, J.; Pfeuty, P.

    1981-01-01

    The phase transition of the classical two-dimensional Potts model, in particular the order of the transition as the number of components q increases, is studied by constructing renormalization group transformations on the equivalent one-dimensional quantum problem. It is shown that the block transformation with two sites per cell indicates the existence of a critical q_c separating the small-q and large-q regions with different critical behaviours. The physically accessible fixed point for q > q_c is a discontinuity fixed point where the specific heat exponent α = 1 and therefore the transition is of first order. (author)

  1. Forward Modeling of Large-scale Structure: An Open-source Approach with Halotools

    Science.gov (United States)

    Hearin, Andrew P.; Campbell, Duncan; Tollerud, Erik; Behroozi, Peter; Diemer, Benedikt; Goldbaum, Nathan J.; Jennings, Elise; Leauthaud, Alexie; Mao, Yao-Yuan; More, Surhud; Parejko, John; Sinha, Manodeep; Sipöcz, Brigitta; Zentner, Andrew

    2017-11-01

    We present the first stable release of Halotools (v0.2), a community-driven Python package designed to build and test models of the galaxy-halo connection. Halotools provides a modular platform for creating mock universes of galaxies starting from a catalog of dark matter halos obtained from a cosmological simulation. The package supports many of the common forms used to describe galaxy-halo models: the halo occupation distribution, the conditional luminosity function, abundance matching, and alternatives to these models that include effects such as environmental quenching or variable galaxy assembly bias. Satellite galaxies can be modeled to live in subhalos or to follow custom number density profiles within their halos, including spatial and/or velocity bias with respect to the dark matter profile. The package has an optimized toolkit to make mock observations on a synthetic galaxy population—including galaxy clustering, galaxy-galaxy lensing, galaxy group identification, RSD multipoles, void statistics, pairwise velocities and others—allowing direct comparison to observations. Halotools is object-oriented, enabling complex models to be built from a set of simple, interchangeable components, including those of your own creation. Halotools has an automated testing suite and is exhaustively documented on http://halotools.readthedocs.io, which includes quickstart guides, source code notes and a large collection of tutorials. The documentation is effectively an online textbook on how to build and study empirical models of galaxy formation with Python.
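
    The workflow described above (populate a halo catalog with an occupation model, then measure clustering on the mock) can be sketched as follows. The API names are recalled from the Halotools v0.x documentation and the simulation/model choices are assumptions that may differ by release; this is a hedged illustration, not a verbatim excerpt from the package docs.

    ```python
    # Hedged sketch of a typical Halotools workflow: populate a cached halo catalog
    # with a prebuilt HOD model and measure mock galaxy clustering.
    # Catalog name, model name and thresholds are assumptions.
    import numpy as np
    from halotools.sim_manager import CachedHaloCatalog
    from halotools.empirical_models import PrebuiltHodModelFactory
    from halotools.mock_observables import tpcf

    halocat = CachedHaloCatalog(simname="bolshoi", redshift=0)   # assumes this catalog is installed
    model = PrebuiltHodModelFactory("zheng07", threshold=-20)    # halo occupation distribution model
    model.populate_mock(halocat)                                 # build the synthetic galaxy population

    gals = model.mock.galaxy_table
    sample = np.vstack([gals["x"], gals["y"], gals["z"]]).T
    rbins = np.logspace(-1, 1.2, 15)
    xi = tpcf(sample, rbins, period=halocat.Lbox)                # two-point correlation function
    print(xi)
    ```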

  2. Forward Modeling of Large-scale Structure: An Open-source Approach with Halotools

    Energy Technology Data Exchange (ETDEWEB)

    Hearin, Andrew P.; Campbell, Duncan; Tollerud, Erik; Behroozi, Peter; Diemer, Benedikt; Goldbaum, Nathan J.; Jennings, Elise; Leauthaud, Alexie; Mao, Yao-Yuan; More, Surhud; Parejko, John; Sinha, Manodeep; Sipöcz, Brigitta; Zentner, Andrew

    2017-10-18

    We present the first stable release of Halotools (v0.2), a community-driven Python package designed to build and test models of the galaxy-halo connection. Halotools provides a modular platform for creating mock universes of galaxies starting from a catalog of dark matter halos obtained from a cosmological simulation. The package supports many of the common forms used to describe galaxy-halo models: the halo occupation distribution, the conditional luminosity function, abundance matching, and alternatives to these models that include effects such as environmental quenching or variable galaxy assembly bias. Satellite galaxies can be modeled to live in subhalos or to follow custom number density profiles within their halos, including spatial and/or velocity bias with respect to the dark matter profile. The package has an optimized toolkit to make mock observations on a synthetic galaxy population—including galaxy clustering, galaxy–galaxy lensing, galaxy group identification, RSD multipoles, void statistics, pairwise velocities and others—allowing direct comparison to observations. Halotools is object-oriented, enabling complex models to be built from a set of simple, interchangeable components, including those of your own creation. Halotools has an automated testing suite and is exhaustively documented on http://halotools.readthedocs.io, which includes quickstart guides, source code notes and a large collection of tutorials. The documentation is effectively an online textbook on how to build and study empirical models of galaxy formation with Python.

  3. Implementation of an Online Chemistry Model to a Large Eddy Simulation Model (PALM-4U)

    Science.gov (United States)

    Mauder, M.; Khan, B.; Forkel, R.; Banzhaf, S.; Russo, E. E.; Sühring, M.; Kanani-Sühring, F.; Raasch, S.; Ketelsen, K.

    2017-12-01

    Large Eddy Simulation (LES) models resolve the relevant scales of turbulent motion, so that they can capture the inherent unsteadiness of atmospheric turbulence. However, LES models have so far hardly been applied to urban air quality studies, in particular to the chemical transformation of pollutants. In this context, the BMBF (Bundesministerium für Bildung und Forschung) funded a joint project, MOSAIK (Modellbasierte Stadtplanung und Anwendung im Klimawandel / Model-based city planning and application in climate change), with the main goal of developing a new, highly efficient urban climate model (UCM) that also includes atmospheric chemical processes. The state-of-the-art LES model PALM (Maronga et al., 2015, Geosci. Model Dev., 8, doi:10.5194/gmd-8-2515-2015) has been used as the core model for the new UCM, named PALM-4U. For the gas phase chemistry, a fully coupled 'online' chemistry model has been implemented into PALM. The latest version of the Kinetic PreProcessor (KPP), Version 2.3, has been utilized for the numerical integration of the chemical species. Due to the high computational demands of the LES model, compromises in the description of chemical processes are required. Therefore, a reduced chemistry mechanism has been implemented, which includes only major pollutants (O3, NO, NO2, CO), a highly simplified VOC chemistry and a small number of products. This work shows preliminary results of the advection and chemical transformation of atmospheric pollutants. Non-cyclic boundaries have been used for inflow and outflow in the east-west direction, while periodic boundary conditions have been applied at the south-north lateral boundaries. For practical applications, our approach is to go beyond the simulation of single street canyons to chemical transformation, advection and deposition of air pollutants in the larger urban canopy. Tests of the chemistry schemes and initial studies of chemistry-turbulence interaction, transport and transformation are presented.
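
    The flavour of such an 'online', operator-split gas-phase chemistry step can be conveyed with a toy NO-NO2-O3 photostationary cycle integrated per grid box per time step. The rate constants, species set and forward-Euler integration below are illustrative assumptions only; they are not the PALM-4U/KPP mechanism.

    ```python
    # Toy illustration of an operator-split gas-phase chemistry step (NO/NO2/O3
    # photostationary cycle) of the kind an online LES chemistry module integrates
    # in every grid box at every time step. Rate constants are illustrative values.
    def chemistry_step(c, dt, j_no2=8.0e-3, k_no_o3=4.0e-4):
        """Advance concentrations c = {'NO', 'NO2', 'O3'} (ppb) by dt seconds (forward Euler)."""
        p = j_no2 * c["NO2"]              # NO2 + hv -> NO + O(3P); O(3P) + O2 -> O3
        l = k_no_o3 * c["NO"] * c["O3"]   # NO + O3 -> NO2 + O2
        return {
            "NO":  c["NO"]  + dt * (p - l),
            "NO2": c["NO2"] + dt * (l - p),
            "O3":  c["O3"]  + dt * (p - l),
        }

    c = {"NO": 5.0, "NO2": 20.0, "O3": 40.0}
    for _ in range(600):          # 10 minutes at dt = 1 s
        c = chemistry_step(c, 1.0)
    print(c)
    ```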

  4. Uncertainty Quantification for Large-Scale Ice Sheet Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [Univ. of Texas, Austin, TX (United States)

    2016-02-05

    This report summarizes our work to develop advanced forward and inverse solvers and uncertainty quantification capabilities for a nonlinear 3D full Stokes continental-scale ice sheet flow model. The components include: (1) forward solver: a new state-of-the-art parallel adaptive scalable high-order-accurate mass-conservative Newton-based 3D nonlinear full Stokes ice sheet flow simulator; (2) inverse solver: a new adjoint-based inexact Newton method for solution of deterministic inverse problems governed by the above 3D nonlinear full Stokes ice flow model; and (3) uncertainty quantification: a novel Hessian-based Bayesian method for quantifying uncertainties in the inverse ice sheet flow solution and propagating them forward into predictions of quantities of interest such as ice mass flux to the ocean.
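
    For orientation, the "Hessian-based Bayesian method" mentioned above is generally of the Laplace (linearized Gaussian) type: the posterior over the uncertain parameter field is approximated by a Gaussian centred at the MAP point, with covariance given by the inverse of the data-misfit Hessian plus the prior precision. The expression below is that generic form, stated as an assumption rather than a formula quoted from the report.

    ```latex
    % Generic Laplace approximation used in Hessian-based Bayesian inversion:
    % Gaussian posterior centred at the MAP point, covariance from the misfit
    % Hessian and the prior precision.
    \pi_{\text{post}}(m \mid d) \approx \mathcal{N}\!\left(m_{\text{MAP}},\, \Gamma_{\text{post}}\right),
    \qquad
    \Gamma_{\text{post}} = \left(H_{\text{misfit}}(m_{\text{MAP}}) + \Gamma_{\text{prior}}^{-1}\right)^{-1}
    ```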

  5. A comparison of updating algorithms for large $N$ reduced models

    CERN Document Server

    Pérez, Margarita García; Keegan, Liam; Okawa, Masanori; Ramos, Alberto

    2015-01-01

    We investigate Monte Carlo updating algorithms for simulating $SU(N)$ Yang-Mills fields on a single-site lattice, such as for the Twisted Eguchi-Kawai model (TEK). We show that performing only over-relaxation (OR) updates of the gauge links is a valid simulation algorithm for the Fabricius and Haan formulation of this model, and that this decorrelates observables faster than using heat-bath updates. We consider two different methods of implementing the OR update: either updating the whole $SU(N)$ matrix at once, or iterating through $SU(2)$ subgroups of the $SU(N)$ matrix, we find the same critical exponent in both cases, and only a slight difference between the two.

  6. A noise generation and propagation model for large wind farms

    DEFF Research Database (Denmark)

    Bertagnolio, Franck

    2016-01-01

    A wind turbine noise calculation model is combined with a ray tracing method in order to estimate wind farm noise in its surroundings, assuming an arbitrary topography. The wind turbine noise model is used to generate noise spectra for which each turbine is approximated as a point source. However, the detailed three-dimensional directivity features are taken into account for the further calculation of noise propagation over the surrounding terrain. An arbitrary number of turbines constituting a wind farm can be spatially distributed. The noise from each individual turbine is propagated into the far-field using the ray tracing method. These results are added up assuming the noise from each turbine is uncorrelated. The methodology permits a wind farm noise map over the surrounding terrain to be estimated in a reasonable amount of computational time on a personal computer.
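
    The uncorrelated summation mentioned above amounts to adding the per-turbine contributions on an energy basis. The sketch below shows that bookkeeping, with a crude free-field spherical-spreading term standing in for the ray-tracing propagation step; turbine positions, source levels and the receiver location are invented for illustration.

    ```python
    # Illustrative wind farm noise bookkeeping: each turbine is a point source, its
    # level at the receiver is attenuated (here: simple spherical spreading instead
    # of ray tracing), and the uncorrelated contributions are summed on an energy basis.
    import numpy as np

    turbines = np.array([[0.0, 0.0], [400.0, 0.0], [800.0, 300.0]])  # (x, y) positions in m
    source_lw = np.array([105.0, 105.0, 104.0])                      # sound power levels in dB(A)
    receiver = np.array([1200.0, 600.0])

    r = np.linalg.norm(turbines - receiver, axis=1)
    lp_each = source_lw - 20.0 * np.log10(r) - 11.0               # free-field spherical spreading
    lp_total = 10.0 * np.log10(np.sum(10.0 ** (lp_each / 10.0)))  # incoherent (energy) sum
    print(f"Receiver level: {lp_total:.1f} dB(A)")
    ```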

  7. Biofidelic Human Activity Modeling and Simulation with Large Variability

    Science.gov (United States)

    2014-11-25

    ...exact match or a close representation. Efforts were made to ensure that the activity models can be integrated into widely used game engines and image generators. Human modeling and simulation (M&S) has been increasingly used in simulation-based training and virtual reality (VR). However, human M&S technology currently used in various ...

  8. Large area application of a corn hazard model. [Soviet Union

    Science.gov (United States)

    Ashburn, P.; Taylor, T. W. (Principal Investigator)

    1981-01-01

    An application test of the crop calendar portion of a corn (maize) stress indicator model developed by the early warning, crop condition assessment component of AgRISTARS was performed over the corn for grain producing regions of the U.S.S.R. during the 1980 crop year using real data. Performance of the crop calendar submodel was favorable; efficiency gains in meteorological data analysis time were on the order of 85 to 90 percent.

  9. Chemical Modeling for Large-Eddy Simulation of Turbulent Combustion

    Science.gov (United States)

    2009-03-31

    Report topics include a swirl burner and the development of an interactive platform for generation, comparison, and evaluation of kinetic models for JP-8 surrogate fuels. ...the refined mesh resolution is increased. Application of the RLSG to a turbulent Bunsen flame, however, showed that the flame front solution remained... A schematic of this LES shows the temperature field on a contour cut plane, while the isocontour shows the level ...

  10. Radiation Therapy for Primary Cutaneous Anaplastic Large Cell Lymphoma: An International Lymphoma Radiation Oncology Group Multi-institutional Experience

    International Nuclear Information System (INIS)

    Million, Lynn; Yi, Esther J.; Wu, Frank; Von Eyben, Rie; Campbell, Belinda A.; Dabaja, Bouthaina; Tsang, Richard W.; Ng, Andrea; Wilson, Lynn D.; Ricardi, Umberto; Kirova, Youlia; Hoppe, Richard T.

    2016-01-01

    Purpose: To collect response rates of primary cutaneous anaplastic large cell lymphoma, a rare cutaneous T-cell lymphoma, to radiation therapy (RT), and to determine potential prognostic factors predictive of outcome. Methods and Materials: The study was a retrospective analysis of patients with primary cutaneous anaplastic large cell lymphoma who received RT as primary therapy or after surgical excision. Data collected include initial stage of disease, RT modality (electron/photon), total dose, fractionation, response to treatment, and local recurrence. Radiation therapy was delivered at 8 participating International Lymphoma Radiation Oncology Group institutions worldwide. Results: Fifty-six patients met the eligibility criteria, and 63 tumors were treated: head and neck (27%), trunk (14%), upper extremities (27%), and lower extremities (32%). Median tumor size was 2.25 cm (range, 0.6-12 cm). T classification included T1, 40 patients (71%); T2, 12 patients (21%); and T3, 4 patients (7%). The median radiation dose was 35 Gy (range, 6-45 Gy). Complete clinical response (CCR) was achieved in 60 of 63 tumors (95%) and partial response in 3 tumors (5%). After CCR, 1 tumor recurred locally (1.7%) after 36 Gy and 7 months after RT. This was the only patient to die of disease. Conclusions: Primary cutaneous anaplastic large cell lymphoma is a rare, indolent cutaneous lymphoma with a low death rate. This analysis, which was restricted to patients selected for treatment with radiation, indicates that achieving CCR was independent of radiation dose. Because there were too few failures (<2%) for statistical analysis on dose response, 30 Gy seems to be adequate for local control, and even lower doses may suffice.

  11. METHODOLOGY & CALCULATIONS FOR THE ASSIGNMENT OF WASTE GROUPS FOR THE LARGE UNDERGROUND WASTE STORAGE TANKS AT THE HANFORD SITE

    Energy Technology Data Exchange (ETDEWEB)

    BARKER, S.A.

    2006-07-27

    Waste stored within tank farm double-shell tanks (DST) and single-shell tanks (SST) generates flammable gas (principally hydrogen) to varying degrees depending on the type, amount, geometry, and condition of the waste. The waste generates hydrogen through the radiolysis of water and organic compounds, thermolytic decomposition of organic compounds, and corrosion of a tank's carbon steel walls. Radiolysis and thermolytic decomposition also generate ammonia. Nonflammable gases, which act as diluents (such as nitrous oxide), are also produced. Additional flammable gases (e.g., methane) are generated by chemical reactions between various degradation products of organic chemicals present in the tanks. Volatile and semi-volatile organic chemicals in tanks also produce organic vapors. The generated gases in tank waste are either released continuously to the tank headspace or are retained in the waste matrix. Retained gas may be released in a spontaneous or induced gas release event (GRE) that can significantly increase the flammable gas concentration in the tank headspace as described in RPP-7771. The document categorizes each of the large waste storage tanks into one of several categories based on each tank's waste characteristics. These waste group assignments reflect a tank's propensity to retain a significant volume of flammable gases and the potential of the waste to release retained gas by a buoyant displacement event. Revision 5 is the annual update of the methodology and calculations of the flammable gas Waste Groups for DSTs and SSTs.

  12. Effects of uncertainty in model predictions of individual tree volume on large area volume estimates

    Science.gov (United States)

    Ronald E. McRoberts; James A. Westfall

    2014-01-01

    Forest inventory estimates of tree volume for large areas are typically calculated by adding model predictions of volumes for individual trees. However, the uncertainty in the model predictions is generally ignored with the result that the precision of the large area volume estimates is overestimated. The primary study objective was to estimate the effects of model...

  13. Large animals as potential models of human mental and behavioral disorders.

    Science.gov (United States)

    Danek, Michał; Danek, Janusz; Araszkiewicz, Aleksander

    2017-12-30

    Many animal models in different species have been developed for mental and behavioral disorders. This review presents large animals (dog, ovine, swine, horse) as potential models of these disorders. The article is based on research published in peer-reviewed journals; a literature search was carried out using the PubMed database. The above issues were discussed in several problem groups in accordance with the WHO International Statistical Classification of Diseases and Related Health Problems 10th Revision (ICD-10), in particular regarding: organic, including symptomatic, mental disorders (Alzheimer's disease and Huntington's disease, pernicious anemia and hepatic encephalopathy, epilepsy, Parkinson's disease, Creutzfeldt-Jakob disease); behavioral disorders due to psychoactive substance use (alcoholic intoxication, abuse of morphine); schizophrenia and other schizotypal disorders (puerperal psychosis); mood (affective) disorders (depressive episode); neurotic, stress-related and somatoform disorders (posttraumatic stress disorder, obsessive-compulsive disorder); behavioral syndromes associated with physiological disturbances and physical factors (anxiety disorders, anorexia nervosa, narcolepsy); mental retardation (Cohen syndrome, Down syndrome, Hunter syndrome); and behavioral and emotional disorders (attention deficit hyperactivity disorder). These data indicate many large animal disorders that can serve as models for examining the above human mental and behavioral disorders.

  14. Modeling a Large Data Acquisition Network in a Simulation Framework

    CERN Document Server

    Colombo, Tommaso; The ATLAS collaboration

    2015-01-01

    The ATLAS detector at CERN records particle collision “events” delivered by the Large Hadron Collider. Its data-acquisition system is a distributed software system that identifies, selects, and stores interesting events in near real-time, with an aggregate throughput of several 10 GB/s. It is executed on a farm of roughly 2000 commodity worker nodes communicating via TCP/IP on an Ethernet network. Event data fragments are received from the many detector readout channels and are buffered, collected together, analyzed and either stored permanently or discarded. This system, and data-acquisition systems in general, are sensitive to the latency of the data transfer from the readout buffers to the worker nodes. Challenges affecting this transfer include the many-to-one communication pattern and the inherently bursty nature of the traffic. In this paper we introduce the main performance issues brought about by this workload, focusing in particular on the so-called TCP incast pathol...

  15. On-line core monitoring system based on buckling corrected modified one group model

    International Nuclear Information System (INIS)

    Freire, Fernando S.

    2011-01-01

    Nuclear power reactors require core monitoring during plant operation. To provide safe, clean and reliable power, core conditions must be evaluated continuously. Currently, the reactor core monitoring process is carried out by nuclear code systems that, together with data from plant instrumentation such as thermocouples, ex-core detectors and fixed or movable in-core detectors, can easily predict and monitor a variety of plant conditions. Typically, standard nodal methods can be found at the heart of such nuclear monitoring code systems. However, standard nodal methods require large computer running times when compared with standard coarse-mesh finite difference schemes. Unfortunately, classic finite-difference models require a fine-mesh representation of the reactor core. To overcome this limitation, the classic modified one-group model can be used to account for the main core neutronic behavior; with this model a coarse-mesh core representation can be easily evaluated with a crude treatment of thermal neutron leakage. In this work, an improvement of the classic modified one-group model based on a buckling thermal correction was used to obtain a fast, accurate and reliable core monitoring methodology for future applications, providing a powerful tool for the core monitoring process. (author)
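
    For context, the "modified one-group" language refers to the classical approximation in which thermal neutron leakage is folded into an effective migration area. A common textbook form of the resulting multiplication factor, with the kind of buckling-dependent correction alluded to above, is sketched below; this is the generic expression, not necessarily the author's exact formulation.

    ```latex
    % Classical modified one-group expression for the effective multiplication factor,
    % with thermal leakage folded into the migration area M^2 = L^2 + tau.
    k_{\text{eff}} \;=\; \frac{k_\infty}{1 + M^2 B^2},
    \qquad M^2 = L^2 + \tau ,
    % a buckling-dependent (thermal) correction then adjusts the leakage term locally
    % instead of treating it as a core-wide constant.
    ```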

  16. Wess-Zumino-Witten model based on a nonsemisimple group

    International Nuclear Information System (INIS)

    Nappi, C.R.; Witten, E.

    1993-01-01

    We present a conformal field theory which describes a homogeneous four dimensional Lorentz-signature space-time. The model is an ungauged Wess-Zumino-Witten model based on a central extension of the Poincare algebra. The central charge of this theory is exactly four, just like four dimensional Minkowski space. The model can be interpreted as a four dimensional monochromatic plane wave. As there are three commuting isometries, other interesting geometries are expected to emerge via O(3,3) duality

  17. Systems Execution Modeling Technologies for Large-Scale Net-Centric Department of Defense Systems

    Science.gov (United States)

    2011-12-01

    problem, the design groups may use different constraint-specific models, such as Ptolemy, RT-Maude, Excel, or UML, to model design constraints and...different types. For example, one group may use a Ptolemy model for analyzing fault tolerance requirements while another person may use an Excel model for

  18. Approximate Tests of Hypotheses in Regression Models with Grouped Data

    Science.gov (United States)

    1979-02-01

    in terms of the Kolmogorov-Smirnov statistic in the next section. 4. Simulations: Two models have been considered for simulations. Model I. Yuk...

  19. Simulation of Two-group IATE models with EAGLE code

    International Nuclear Information System (INIS)

    Nguyen, V. T.; Bae, B. U.; Song, C. H.

    2011-01-01

    The two-group transport equation should be employed in order to describe correctly the interfacial area transport in various two-phase flow regimes, especially at the bubbly-to-slug flow transition. This is because the differences in bubble sizes or shapes cause substantial differences in their transport mechanisms and interaction phenomena. The basic concept of two-group interfacial area transport equations has been formulated and demonstrated for vertical gas-liquid bubbly-to-slug flow transition by Hibiki and his coworkers. More than twelve adjustable parameters need to be determined based on an extensive experimental database. It should be noted that these parameters were adjusted only in a one-dimensional approach using area-averaged flow parameters in a vertical pipe under adiabatic and steady conditions. This obviously brings up the following experimental issue: how to adjust all these parameters as independently as possible by considering experiments where a single physical phenomenon is of importance. The vertical air-water loop (VAWL) has been used for investigating the transport phenomena of two-phase flow at the Korea Atomic Energy Research Institute (KAERI). The data for local void fraction and interfacial area concentration are measured by using a five-sensor conductivity probe method and classified into two groups, the small spherical bubble group and the cap/slug one. The initial bubble size, which has a big influence on the interaction mechanism between phases, was controlled. In the present work, the two-group interfacial area transport equation (IATE) was implemented in the EAGLE code and assessed against VAWL data. The purpose of this study is to investigate the capability of the coefficients derived by Hibiki for the two-group interfacial area transport equations within a CFD code

  20. Client perception of therapeutic factors in group psychotherapy and growth groups: an empirically-based hierarchical model.

    Science.gov (United States)

    Dierick, Paul; Lietaer, Germain

    2008-04-01

    To assess group participants' perceptions of therapeutic factors, we developed an extensive questionnaire of 155 items that was administered to 489 members of 78 psychotherapy and growth groups of client-centered/experiential, psychoanalytic, behavioral, Gestalt and drama- and bodily oriented orientations. Using multivariate analyses we found a model that reveals the structure and connections of therapeutic factors as they are differentiated in the experience of the group members. Our model encompasses three hierarchical levels of abstraction: 28 Basic scales that appeared to be structured into seven main scales (Group Cohesion, Interactional Confirmation, Cathartic Self-Revelation, Self-Insight and Progress, Observational Experiences, Getting Directives, and Interactional Confrontation) and two dimensions (Relational Climate and Psychological Work). Validity for these therapeutic factors was found in their grounded content, statistically analyzed constructs, importance ratings, and correlations to intermediate outcome measures.

  1. Working Group 2: A critical appraisal of model simulations

    International Nuclear Information System (INIS)

    MacCracken, M.; Cubasch, U.; Gates, W.L.; Harvey, L.D.; Hunt, B.; Katz, R.; Lorenz, E.; Manabe, S.; McAvaney, B.; McFarlane, N.; Meehl, G.; Meleshko, V.; Robock, A.; Stenchikov, G.; Stouffer, R.; Wang, W.C.; Washington, W.; Watts, R.; Zebiak, S.

    1990-01-01

    The complexity of the climate system and the absence of definitive analogs to the evolving climatic situation force the use of theoretical models to project the future climatic influence of the relatively rapid and ongoing increase in the atmospheric concentrations of CO2 and other trace gases. A wide variety of climate models has been developed to look at particular aspects of the problem and to vary the mix of complexity and resource requirements needed to study various aspects of the problem; all such models have contributed insights into the problem.

  2. A Renormalization Group Like Model for a Democratic Dictatorship

    Science.gov (United States)

    Galam, Serge

    2015-03-01

    We review a model of sociophysics which deals with democratic voting in bottom up hierarchical systems. The connection to the original physical model and techniques are outlined, underlining both the similarities and the differences. Emphasis is put on the numerous novel and counterintuitive results obtained with respect to the associated social and political framework. Using this model a real political event was successfully predicted with the victory of the French extreme right party in the 2000 first round of French presidential elections. The perspectives and the challenges to make sociophysics a predictive solid field of science are discussed.

  3. A Model for Establishing an Astronomy Education Discussion Group

    Science.gov (United States)

    Deming, Grace; Hayes-Gehrke, M.; Zauderer, B. A.; Bovill, M. S.; DeCesar, M.

    2010-01-01

    In October 2005, a group of astronomy faculty and graduate students met to establish departmental support for participants in the UM Center for Teaching Excellence University Teaching and Learning Program. This program seeks to increase graduate students’ understanding of effective teaching methods, awareness of student learning, and appreciation of education as a scholarly pursuit. Our group has facilitated the submission of successful graduate student educational development grant proposals to the Center for Teaching Excellence (CTE). Completion of the CTE program results in a notation on the graduate student's transcript. Our discussion group met monthly during the first two years. The Astronomy Education Review, The Physics Teacher, The Washington Post, The Chronicle of Higher Education, and National Research Council publications were used to provide background for discussion. Beginning in 2007, the group began sponsoring monthly astronomy education lunches during the academic year to which the entire department was invited. Over the past two years, speakers have included graduate students, faculty, and guests, such as Jay Labov from the National Research Council. Topics have included the Astronomy Diagnostic Test, intelligent design versus evolution, active learning techniques, introducing the use of lecture tutorials, using effective demonstrations, confronting student misconceptions, engagement through clickers (or cards), and fostering critical thinking with ranking tasks. The results of an informal evaluation will be presented.

  4. Explorations in combining cognitive models of individuals and system dynamics models of groups.

    Energy Technology Data Exchange (ETDEWEB)

    Backus, George A.

    2008-07-01

    This report documents a demonstration model of interacting insurgent leadership, military leadership, government leadership, and societal dynamics under a variety of interventions. The primary focus of the work is the portrayal of a token societal model that responds to leadership activities. The model also includes a linkage between leadership and society that implicitly represents the leadership subordinates as they directly interact with the population. The societal model is meant to demonstrate the efficacy and viability of using System Dynamics (SD) methods to simulate populations and that these can then connect to cognitive models depicting individuals. SD models typically focus on average behavior and thus have limited applicability to describe small groups or individuals. On the other hand, cognitive models readily describe individual behavior but can become cumbersome when used to describe populations. Realistic security situations are invariably a mix of individual and population dynamics. Therefore, the ability to tie SD models to cognitive models provides a critical capability that would be otherwise be unavailable.

  5. Do not Lose Your Students in Large Lectures: A Five-Step Paper-Based Model to Foster Students’ Participation

    Directory of Open Access Journals (Sweden)

    Mona Hassan Aburahma

    2015-07-01

    Full Text Available Like most of the pharmacy colleges in developing countries with high population growth, public pharmacy colleges in Egypt are experiencing a significant increase in students’ enrollment annually due to the large youth population, accompanied with the keenness of students to join pharmacy colleges as a step to a better future career. In this context, large lectures represent a popular approach for teaching the students as economic and logistic constraints prevent splitting them into smaller groups. Nevertheless, the impact of large lectures in relation to student learning has been widely questioned due to their educational limitations, which are related to the passive role the students maintain in lectures. Despite the reported feebleness underlying large lectures and lecturing in general, large lectures will likely continue to be taught in the same format in these countries. Accordingly, to soften the negative impacts of large lectures, this article describes a simple and feasible 5-step paper-based model to transform lectures from a passive information delivery space into an active learning environment. This model mainly suits educational establishments with financial constraints, nevertheless, it can be applied in lectures presented in any educational environment to improve active participation of students. The components and the expected advantages of employing the 5-step paper-based model in large lectures as well as its limitations and ways to overcome them are presented briefly. The impact of applying this model on students’ engagement and learning is currently being investigated.

  6. Do not Lose Your Students in Large Lectures: A Five-Step Paper-Based Model to Foster Students’ Participation

    Science.gov (United States)

    Aburahma, Mona Hassan

    2015-01-01

    Like most of the pharmacy colleges in developing countries with high population growth, public pharmacy colleges in Egypt are experiencing a significant increase in students’ enrollment annually due to the large youth population, accompanied with the keenness of students to join pharmacy colleges as a step to a better future career. In this context, large lectures represent a popular approach for teaching the students as economic and logistic constraints prevent splitting them into smaller groups. Nevertheless, the impact of large lectures in relation to student learning has been widely questioned due to their educational limitations, which are related to the passive role the students maintain in lectures. Despite the reported feebleness underlying large lectures and lecturing in general, large lectures will likely continue to be taught in the same format in these countries. Accordingly, to soften the negative impacts of large lectures, this article describes a simple and feasible 5-step paper-based model to transform lectures from a passive information delivery space into an active learning environment. This model mainly suits educational establishments with financial constraints, nevertheless, it can be applied in lectures presented in any educational environment to improve active participation of students. The components and the expected advantages of employing the 5-step paper-based model in large lectures as well as its limitations and ways to overcome them are presented briefly. The impact of applying this model on students’ engagement and learning is currently being investigated. PMID:28975906

  7. Do not Lose Your Students in Large Lectures: A Five-Step Paper-Based Model to Foster Students' Participation.

    Science.gov (United States)

    Aburahma, Mona Hassan

    2015-07-27

    Like most of the pharmacy colleges in developing countries with high population growth, public pharmacy colleges in Egypt are experiencing a significant increase in students' enrollment annually due to the large youth population, accompanied with the keenness of students to join pharmacy colleges as a step to a better future career. In this context, large lectures represent a popular approach for teaching the students as economic and logistic constraints prevent splitting them into smaller groups. Nevertheless, the impact of large lectures in relation to student learning has been widely questioned due to their educational limitations, which are related to the passive role the students maintain in lectures. Despite the reported feebleness underlying large lectures and lecturing in general, large lectures will likely continue to be taught in the same format in these countries. Accordingly, to soften the negative impacts of large lectures, this article describes a simple and feasible 5-step paper-based model to transform lectures from a passive information delivery space into an active learning environment. This model mainly suits educational establishments with financial constraints, nevertheless, it can be applied in lectures presented in any educational environment to improve active participation of students. The components and the expected advantages of employing the 5-step paper-based model in large lectures as well as its limitations and ways to overcome them are presented briefly. The impact of applying this model on students' engagement and learning is currently being investigated.

  8. Highly efficient model updating for structural condition assessment of large-scale bridges.

    Science.gov (United States)

    2015-02-01

    For efficiently updating models of large-scale structures, the response surface (RS) method based on radial basis functions (RBFs) is proposed to model the input-output relationship of structures. The key issues for applying the proposed method a...
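
    The response-surface idea can be sketched as: sample the expensive structural model at a handful of parameter points, fit radial-basis-function surrogates to the computed responses (e.g., natural frequencies), then update the parameters by minimising the surrogate-vs-measurement discrepancy. The sketch below is illustrative only; run_fe_model, the parameter ranges and the measured values are hypothetical placeholders, not the cited report's method.

    ```python
    # Sketch of RBF response-surface model updating: build cheap surrogates of an
    # expensive structural model and calibrate parameters against measurements.
    # run_fe_model() and the "measured" frequencies are hypothetical placeholders.
    import numpy as np
    from scipy.interpolate import RBFInterpolator
    from scipy.optimize import minimize

    def run_fe_model(theta):
        """Placeholder for the expensive finite-element run; returns two natural frequencies (Hz)."""
        e_mod, density = theta
        scale = np.sqrt(e_mod / 2.0e11) * np.sqrt(7800.0 / density)
        return np.array([6.0, 17.0]) * scale

    # 1) Design of experiments: sample the parameter space and run the full model.
    samples = np.random.default_rng(0).uniform([180e9, 7500.0], [220e9, 8100.0], size=(30, 2))
    responses = np.array([run_fe_model(s) for s in samples])

    # 2) Fit one RBF surrogate per response quantity.
    surrogates = [RBFInterpolator(samples, responses[:, i]) for i in range(responses.shape[1])]

    # 3) Update parameters by minimising the surrogate-vs-measurement discrepancy.
    measured = np.array([6.1, 17.4])
    def discrepancy(theta):
        pred = np.array([s(theta[None, :])[0] for s in surrogates])
        return np.sum(((pred - measured) / measured) ** 2)

    result = minimize(discrepancy, x0=samples.mean(axis=0), method="Nelder-Mead")
    print("updated parameters:", result.x)
    ```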

  9. The Spanish human papillomavirus vaccine consensus group: a working model.

    Science.gov (United States)

    Cortés-Bordoy, Javier; Martinón-Torres, Federico

    2010-08-01

    Successful implementation of Human Papillomavirus (HPV) vaccine in each country can only be achieved from a complementary and synergistic perspective, integrating all the different points of view of the diverse related professionals. It is this context where the Spanish HPV Vaccine Consensus Group (Grupo Español de Consenso sobre la Vacuna VPH, GEC-VPH) was created. GEC-VPH philosophy, objectives and experience are reported in this article, with particular attention to the management of negative publicity and anti-vaccine groups. Initiatives as GEC-VPH--adapted to each country's particular idiosyncrasies--might help to overcome the existing barriers and to achieve wide and early implementation of HPV vaccination.

  10. Small- and large-signal modeling of InP HBTs in transferred-substrate technology

    DEFF Research Database (Denmark)

    Johansen, Tom Keinicke; Rudolph, Matthias; Jensen, Thomas

    2014-01-01

    a direct parameter extraction methodology dedicated to III–V based HBTs. It is shown that the modeling of measured S-parameters can be improved in the millimeter-wave frequency range by augmenting the small-signal model with a description of AC current crowding. The extracted elements of the small-signal model structure are employed as a starting point for the extraction of a large-signal model. The developed large-signal model for the TS-HBTs accurately predicts the DC over temperature and small-signal performance over bias as well as the large-signal performance at millimeter-wave frequencies.

  11. A simple model of group selection that cannot be analyzed with inclusive fitness

    NARCIS (Netherlands)

    van Veelen, M.; Luo, S.; Simon, B.

    2014-01-01

    A widespread claim in evolutionary theory is that every group selection model can be recast in terms of inclusive fitness. Although there are interesting classes of group selection models for which this is possible, we show that it is not true in general. With a simple set of group selection models,

  12. Modelling fuel demand for different socio-economic groups

    International Nuclear Information System (INIS)

    Wadud, Zia; Graham, Daniel J.; Noland, Robert B.

    2009-01-01

    The fuel demand literature provides a range of estimates of the long and short-run price and income elasticities of gasoline demand for different countries and states. These estimates can be very useful in predicting the overall impacts of policy approaches designed to reduce fuel consumption and to address concerns of carbon emissions or energy security. However, analysis of policy options based on elasticities that are homogenous across income groups provides no information about the relative distributional burden that may be faced by different sectors of the population. Different responses to the same change in price or income are likely to occur, dependent on both travel needs and income levels. This paper estimates gasoline demand elasticities for different income quintiles in the United States to test for heterogeneity in demand response. Group-wise summary consumer expenditure data covering 20 years are used to derive the elasticity estimates. The results show that the elasticities do vary across groups and follow a U-pattern from the lowest to the highest income quintile. The lowest income quintile is found to have the largest price elasticity. The lowest and the highest income quintiles appear to be statistically insensitive to any changes in income. The rebound effect also follows the U-pattern, with the highest rebound observed among the wealthiest households. Rural households appear to have lower price elasticity than households in urban areas. (author)
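
    In the simplest static form, price and income elasticities of this kind are the coefficients of a log-log demand regression fitted separately within each income quintile. The sketch below shows that baseline specification on hypothetical expenditure data (column names and the file are invented); the paper's actual group-wise specification is richer than this.

    ```python
    # Baseline log-log gasoline demand regression, fitted separately by income
    # quintile so the price and income elasticities can differ across groups.
    # Column names and the data file are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("gasoline_by_quintile.csv")  # columns: quintile, gallons, price, income
    for q, grp in df.groupby("quintile"):
        fit = smf.ols("np.log(gallons) ~ np.log(price) + np.log(income)", data=grp).fit()
        print(q,
              "price elasticity:", round(fit.params["np.log(price)"], 2),
              "income elasticity:", round(fit.params["np.log(income)"], 2))
    ```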

  13. Fast three-dimensional core optimization based on modified one-group model

    International Nuclear Information System (INIS)

    Freire, Fernando S.; Martinez, Aquilino S.; Silva, Fernando C. da

    2009-01-01

    The optimization of any nuclear reactor core is an extremely complex process that consumes a large amount of computer time. Fortunately, the nuclear designer can rely on a variety of methodologies able to approximate the analysis of each available core loading pattern. Two-dimensional codes are usually used to analyze the loading scheme. However, when particular axial effects are present in the core, two-dimensional analysis cannot produce good results and three-dimensional analysis may be required. This paper presents the major advantages of using the modified one-group diffusion theory coupled with a buckling correction model in the optimization process. The results of the proposed model are very accurate when compared to benchmark results obtained from detailed calculations using three-dimensional nodal codes. (author)

  14. Fast three-dimensional core optimization based on modified one-group model

    Energy Technology Data Exchange (ETDEWEB)

    Freire, Fernando S. [ELETROBRAS Termonuclear S.A. - ELETRONUCLEAR, Rio de Janeiro, RJ (Brazil). Dept. GCN-T], e-mail: freire@eletronuclear.gov.br; Martinez, Aquilino S.; Silva, Fernando C. da [Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear], e-mail: aquilino@con.ufrj.br, e-mail: fernando@con.ufrj.br

    2009-07-01

    The optimization of any nuclear reactor core is an extremely complex process that consumes a large amount of computer time. Fortunately, the nuclear designer can rely on a variety of methodologies able to approximate the analysis of each available core loading pattern. Two-dimensional codes are usually used to analyze the loading scheme. However, when particular axial effects are present in the core, two-dimensional analysis cannot produce good results and three-dimensional analysis may be required. This paper presents the major advantages of using the modified one-group diffusion theory coupled with a buckling correction model in the optimization process. The results of the proposed model are very accurate when compared to benchmark results obtained from detailed calculations using three-dimensional nodal codes. (author)

  15. Integrated wetland management: an analysis with group model building based on system dynamics model.

    Science.gov (United States)

    Chen, Hsin; Chang, Yang-Chi; Chen, Kung-Chen

    2014-12-15

    The wetland system possesses diverse functions such as preserving water sources, mediating flooding, providing habitats for wildlife and stabilizing coastlines. Nonetheless, rapid economic growth and the increasing population have significantly deteriorated the wetland environment. To secure the sustainability of the wetland, it is essential to introduce integrated and systematic management. This paper examines the resource management of the Jiading Wetland by applying group model building (GMB) and system dynamics (SD). We systematically identify local stakeholders' mental model regarding the impact brought by the yacht industry, and further establish a SD model to simulate the dynamic wetland environment. The GMB process improves the stakeholders' understanding about the interaction between the wetland environment and management policies. Differences between the stakeholders' perceptions and the behaviors shown by the SD model also suggest that our analysis would facilitate the stakeholders to broaden their horizons and achieve consensus on the wetland resource management. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Assessment of various natural orbitals as the basis of large active space density-matrix renormalization group calculations.

    Science.gov (United States)

    Ma, Yingjin; Ma, Haibo

    2013-06-14

    It is well-known that not only the orbital ordering but also the choice of the orbitals itself as the basis may significantly influence the computational efficiency of density-matrix renormalization group (DMRG) calculations. In this study, for assessing the efficiency of using various natural orbitals (NOs) as the DMRG basis, we performed benchmark DMRG calculations with different bases, which included the NOs obtained by various traditional electron correlation methods, as well as NOs acquired from preliminary moderate DMRG calculations (e.g., preserved states less than 500). The tested systems included N2, transition metal Cr2 systems, as well as 1D hydrogen polyradical chain systems under equilibrium and dissociation conditions and 2D hydrogen aggregates. The results indicate that a good compromise between the requirement for low computational costs of acquiring NOs and the demand for high efficiency of NOs as the basis of DMRG calculations may be very dependent on the studied systems' diverse electron correlation characteristics and the size of the active space. It is also shown that a DMRG-complete active space configuration interaction (DMRG-CASCI) calculation in a basis of carefully chosen NOs can provide a less expensive alternative to the standard DMRG-complete active space self-consistent field (DMRG-CASSCF) calculation and avoid the convergence difficulties of orbital optimization for large active spaces. The effect of different NO ordering schemes on DMRG-CASCI calculations is also discussed.

  17. Enhancement in evaluating small group work in courses with large number of students. Machine theory at industrial engineering degrees

    Directory of Open Access Journals (Sweden)

    Lluïsa Jordi Nebot

    2013-03-01

    Full Text Available This article examines new tutoring evaluation methods to be adopted in the course, Machine Theory, in the Escola Tècnica Superior d’Enginyeria Industrial de Barcelona (ETSEIB, Universitat Politècnica de Catalunya. These new methods have been developed in order to facilitate teaching staff work and include students in the evaluation process. Machine Theory is a required course with a large number of students. These students are divided into groups of three, and required to carry out a supervised work constituting 20% of their final mark. These new evaluation methods were proposed in response to the significant increase of students in spring semester of 2010-2011, and were pilot tested during fall semester of academic year 2011-2012, in the previous Industrial Engineering degree program. Pilot test results were highly satisfactory for students and teachers, alike, and met proposed educational objectives. For this reason, the new evaluation methodology was adopted in spring semester of 2011-2012, in the current bachelor’s degree program in Industrial Technology (Grau en Enginyeria en Tecnologies Industrials, GETI, where it has also achieved highly satisfactory results.

  18. Integrating an agent-based model into a large-scale hydrological model for evaluating drought management in California

    Science.gov (United States)

    Sheffield, J.; He, X.; Wada, Y.; Burek, P.; Kahil, M.; Wood, E. F.; Oppenheimer, M.

    2017-12-01

    California has endured record-breaking drought since winter 2011 and will likely experience more severe and persistent drought in the coming decades under changing climate. At the same time, human water management practices can also affect drought frequency and intensity, which underscores the importance of human behaviour in effective drought adaptation and mitigation. Currently, although a few large-scale hydrological and water resources models (e.g., PCR-GLOBWB) consider human water use and management practices (e.g., irrigation, reservoir operation, groundwater pumping), none of them includes the dynamic feedback between local human behaviors/decisions and the natural hydrological system. It is, therefore, vital to integrate social and behavioral dimensions into current hydrological modeling frameworks. This study applies the agent-based modeling (ABM) approach and couples it with a large-scale hydrological model (i.e., Community Water Model, CWatM) in order to have a balanced representation of social, environmental and economic factors and a more realistic representation of the bi-directional interactions and feedbacks in coupled human and natural systems. In this study, we focus on drought management in California and consider two types of agents, (groups of) farmers and state management authorities, whose objectives are assumed to be maximizing net crop profit and maintaining sufficient water supply, respectively. Farmers' behaviors are linked with local agricultural practices such as cropping patterns and deficit irrigation. More precisely, farmers' decisions are incorporated into CWatM across different time scales in terms of daily irrigation amount, seasonal/annual decisions on crop types and irrigated area, as well as long-term investment in irrigation infrastructure. This simulation-based optimization framework is further applied by performing different sets of scenarios to investigate and evaluate the effectiveness
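
    The farmer-agent logic described here (choose irrigation to maximise net crop profit subject to the water the hydrological model makes available) can be illustrated with a deliberately simple one-season agent. The yield response, prices, costs and the coupling below are invented for illustration and are not the CWatM-ABM implementation.

    ```python
    # Toy farmer agent of the kind coupled to a hydrological model: each season it
    # picks an irrigation depth that maximises net profit given the water allocation
    # reported by the hydrology side. Yield curve, prices and costs are invented.
    import numpy as np

    class FarmerAgent:
        def __init__(self, area_ha, crop_price=180.0, water_cost=0.15, max_yield=10.0):
            self.area_ha = area_ha
            self.crop_price = crop_price    # $ per tonne
            self.water_cost = water_cost    # $ per m3
            self.max_yield = max_yield      # tonnes per ha at full irrigation

        def expected_yield(self, irrigation_mm):
            # Diminishing returns to irrigation (illustrative response curve).
            return self.max_yield * (1.0 - np.exp(-irrigation_mm / 300.0))

        def decide_irrigation(self, allocation_mm):
            """Choose the irrigation depth (<= allocation) that maximises net profit."""
            candidates = np.linspace(0.0, allocation_mm, 200)
            revenue = self.crop_price * self.expected_yield(candidates) * self.area_ha
            cost = self.water_cost * candidates * 10.0 * self.area_ha  # 1 mm over 1 ha = 10 m3
            return candidates[np.argmax(revenue - cost)]

    farmer = FarmerAgent(area_ha=50.0)
    for allocation in (600.0, 300.0, 120.0):   # wet, normal and drought-year allocations
        print(allocation, "mm available ->", round(farmer.decide_irrigation(allocation), 1), "mm applied")
    ```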

  19. Multiple Imputation Strategies for Multiple Group Structural Equation Models

    Science.gov (United States)

    Enders, Craig K.; Gottschall, Amanda C.

    2011-01-01

    Although structural equation modeling software packages use maximum likelihood estimation by default, there are situations where one might prefer to use multiple imputation to handle missing data rather than maximum likelihood estimation (e.g., when incorporating auxiliary variables). The selection of variables is one of the nuances associated…
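
    The abstract is cut off here, but the pooling step it alludes to is standard; the sketch below applies Rubin's rules to combine a parameter estimate across imputed data sets. The numbers are invented and are not from the study.

```python
# Rubin's rules: pool point estimates and variances from m imputed data sets.
import numpy as np

def pool_rubins_rules(estimates, variances):
    m = len(estimates)
    q_bar = np.mean(estimates)            # pooled point estimate
    u_bar = np.mean(variances)            # average within-imputation variance
    b = np.var(estimates, ddof=1)         # between-imputation variance
    total_var = u_bar + (1.0 + 1.0 / m) * b
    return q_bar, total_var

# Hypothetical path coefficient from m = 5 imputations of an SEM fit
estimates = [0.42, 0.45, 0.40, 0.44, 0.43]
variances = [0.010, 0.011, 0.009, 0.010, 0.012]
print(pool_rubins_rules(estimates, variances))
```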

  20. Cox's regression model for dynamics of grouped unemployment data

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2003-01-01

    Roč. 10, č. 19 (2003), s. 151-162 ISSN 1212-074X R&D Projects: GA ČR GA402/01/0539 Institutional research plan: CEZ:AV0Z1075907 Keywords : mathematical statistics * survival analysis * Cox's model Subject RIV: BB - Applied Statistics, Operational Research
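
    Only the bibliographic record is available here. As a generic illustration of the method named in the keywords, the sketch below fits Cox's proportional hazards model to toy unemployment-duration data with the lifelines package; the data and covariate are invented and unrelated to the paper.

```python
# Fit Cox's proportional hazards model to toy duration data (illustrative only).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "duration_months": [3, 8, 12, 5, 20, 15, 7, 30],  # time until re-employment
    "reemployed":      [1, 1, 0, 1, 0, 1, 1, 0],       # 1 = event observed, 0 = censored
    "age_group":       [0, 0, 1, 0, 1, 1, 0, 1],       # hypothetical covariate
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration_months", event_col="reemployed")
cph.print_summary()
```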

  1. The Cauchy problem for a model of immiscible gas flow with large data

    Energy Technology Data Exchange (ETDEWEB)

    Sande, Hilde

    2008-12-15

    The thesis consists of an introduction and two papers; 1. The solution of the Cauchy problem with large data for a model of a mixture of gases. 2. Front tracking for a model of immiscible gas flow with large data. (AG) refs, figs

  2. Induction of continuous expanding infrarenal aortic aneurysms in a large porcine animal model

    DEFF Research Database (Denmark)

    Kloster, Brian Ozeraitis; Lund, Lars; Lindholt, Jes S.

    2015-01-01

    Background: A large animal model with a continuous expanding infrarenal aortic aneurysm gives access to a more realistic AAA model with anatomy and physiology similar to humans, and thus allows for new experimental research in the natural history and treatment options of the disease. Methods: 10 pigs..., hereafter the pigs were euthanized for inspection and AAA wall sampling for histological analysis. Results: In group A, all pigs developed continuous expanding AAAs with a mean increase in AP-diameter to 16.26 ± 0.93 mm, equivalent to a 57% increase. In group B the AP-diameters increased to 11.33 ± 0.13 mm... The most frequent complication was a neurological deficit in the lower limbs. Conclusion: In pigs it's possible to induce continuous expanding AAAs based upon proteolytic degradation and pathological flow, resembling the real-life dynamics of human aneurysms. Because the lumbars are preserved, it's also a potential...

  3. International workshop of the Confinement Database and Modelling Expert Group in collaboration with the Edge and Pedestal Physics Expert Group

    International Nuclear Information System (INIS)

    Cordey, J.; Kardaun, O.

    2001-01-01

    A Workshop of the Confinement Database and Modelling Expert Group (EG) was held on 2-6 April at the Plasma Physics Research Center of Lausanne (CRPP), Switzerland. Presentations were held on the present status of the plasma pedestal (temperature and energy) scalings from an empirical and theoretical perspective. An integrated approach to modelling tokamaks incorporating core transport, edge pedestal and SOL, together with a model for ELMs, was presented by JCT. New experimental data on global H-mode confinement were discussed and presentations on L-H threshold power were made

  4. Working group report: Flavor physics and model building

    Indian Academy of Sciences (India)

    of the flavor physics subgroup were motivated by this experimental information as ... the study of physics beyond the Standard Model. This was ... L_2 = c_12 h_u^T iτ_2 h_d (χ_1 + χ_2 + χ_3). (4) The term L_2 breaks A4 softly and, in conjunction with L_1, gives rise to a new radiative contribution often known as the Zee mechanism [22].

  5. Validation of CALMET/CALPUFF models simulations around a large power plant stack

    International Nuclear Information System (INIS)

    Hernandez-Garces, A.; Souto, J. A.; Rodriguez, A.; Saavedra, S.; Casares, J. J.

    2015-01-01

    The CALMET/CALPUFF modeling system is frequently used in the study of atmospheric processes and pollution, and several validation tests were performed until now; nevertheless, most of them were based on experiments with a large compilation of surface and aloft meteorological measurements, rarely available. At the same time, the use of a large operational smokestack as tracer/pollutant source is not usual. In this work, first the CALMET meteorological diagnostic model is nested to WRF meteorological prognostic model simulations (3x3 km2 horizontal resolution) over a complex terrain and coastal domain at NW Spain, covering 100x100 km2, with a coal-fired power plant emitting SO2. Simulations were performed during three different periods when SO2 hourly glc peaks were observed. NCEP reanalyses were applied as initial and boundary conditions. The Yonsei University-Pleim-Chang (YSU) PBL scheme was selected in the WRF model to provide the best input to three different CALMET horizontal resolutions, 1x1 km2, 0.5x0.5 km2, and 0.2x0.2 km2. The best results, very similar between them, were achieved using the last two resolutions; therefore, the 0.5x0.5 km2 resolution was selected to test different CALMET meteorological inputs, using several combinations of WRF outputs and/or surface and upper-air measurements available in the simulation domain. With respect to aloft meteorological model output, CALMET PBL depth estimations are very similar to PBL depth estimations using upper-air measurements (rawinsondes), and significantly better than WRF PBL depth results. Regarding surface model output, the available meteorological sites were divided in two groups, one to provide meteorological input to CALMET (when applied), and another for model validation. Comparing WRF and CALMET outputs against surface measurements (from sites for model validation) the lowest RMSE was achieved using as CALMET input dataset WRF output combined with surface measurements (from sites for
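
    The validation in this record turns on comparing the RMSE of model output against surface observations. The snippet below shows that comparison in miniature with invented wind-speed values; it illustrates the metric only and does not reproduce the study's data.

```python
# RMSE of two model configurations against (hypothetical) surface observations.
import numpy as np

def rmse(model, obs):
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((model - obs) ** 2)))

obs        = [2.1, 3.4, 4.0, 3.2, 2.8]   # observed 10 m wind speed (m/s)
wrf_only   = [2.8, 3.9, 4.8, 3.9, 3.5]   # WRF output alone
calmet_mix = [2.3, 3.5, 4.2, 3.4, 2.9]   # CALMET fed with WRF output + surface data

print("WRF only          RMSE:", round(rmse(wrf_only, obs), 2))
print("CALMET + surface  RMSE:", round(rmse(calmet_mix, obs), 2))
```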

  6. Validation of CALMET/CALPUFF models simulations around a large power plant stack

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez-Garces, A.; Souto Rodriguez, J.A.; Saavedra, S.; Casares, J.J.

    2015-07-01

    The CALMET/CALPUFF modeling system is frequently used in the study of atmospheric processes and pollution, and several validation tests were performed until now; nevertheless, most of them were based on experiments with a large compilation of surface and aloft meteorological measurements, rarely available. At the same time, the use of a large operational smokestack as tracer/pollutant source is not usual. In this work, first the CALMET meteorological diagnostic model is nested to WRF meteorological prognostic model simulations (3x3 km2 horizontal resolution) over a complex terrain and coastal domain at NW Spain, covering 100x100 km2, with a coal-fired power plant emitting SO2. Simulations were performed during three different periods when SO2 hourly glc peaks were observed. NCEP reanalyses were applied as initial and boundary conditions. The Yonsei University-Pleim-Chang (YSU) PBL scheme was selected in the WRF model to provide the best input to three different CALMET horizontal resolutions, 1x1 km2, 0.5x0.5 km2, and 0.2x0.2 km2. The best results, very similar between them, were achieved using the last two resolutions; therefore, the 0.5x0.5 km2 resolution was selected to test different CALMET meteorological inputs, using several combinations of WRF outputs and/or surface and upper-air measurements available in the simulation domain. With respect to aloft meteorological model output, CALMET PBL depth estimations are very similar to PBL depth estimations using upper-air measurements (rawinsondes), and significantly better than WRF PBL depth results. Regarding surface model output, the available meteorological sites were divided in two groups, one to provide meteorological input to CALMET (when applied), and another for model validation. Comparing WRF and CALMET outputs against surface measurements (from sites for model validation) the lowest RMSE was achieved using as CALMET input dataset WRF output combined with surface measurements (from sites for CALMET model

  7. Validation of CALMET/CALPUFF models simulations around a large power plant stack

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez-Garces, A.; Souto, J. A.; Rodriguez, A.; Saavedra, S.; Casares, J. J.

    2015-07-01

    The CALMET/CALPUFF modeling system is frequently used in the study of atmospheric processes and pollution, and several validation tests were performed until now; nevertheless, most of them were based on experiments with a large compilation of surface and aloft meteorological measurements, rarely available. At the same time, the use of a large operational smokestack as tracer/pollutant source is not usual. In this work, first the CALMET meteorological diagnostic model is nested to WRF meteorological prognostic model simulations (3x3 km{sup 2} horizontal resolution) over a complex terrain and coastal domain at NW Spain, covering 100x100 km{sup 2}, with a coal-fired power plant emitting SO{sub 2}. Simulations were performed during three different periods when SO{sub 2} hourly glc peaks were observed. NCEP reanalyses were applied as initial and boundary conditions. The Yonsei University-Pleim-Chang (YSU) PBL scheme was selected in the WRF model to provide the best input to three different CALMET horizontal resolutions, 1x1 km{sup 2}, 0.5x0.5 km{sup 2}, and 0.2x0.2 km{sup 2}. The best results, very similar between them, were achieved using the last two resolutions; therefore, the 0.5x0.5 km{sup 2} resolution was selected to test different CALMET meteorological inputs, using several combinations of WRF outputs and/or surface and upper-air measurements available in the simulation domain. With respect to aloft meteorological model output, CALMET PBL depth estimations are very similar to PBL depth estimations using upper-air measurements (rawinsondes), and significantly better than WRF PBL depth results. Regarding surface model output, the available meteorological sites were divided in two groups, one to provide meteorological input to CALMET (when applied), and another for model validation. Comparing WRF and CALMET outputs against surface measurements (from sites for model validation) the lowest RMSE was achieved using as CALMET input dataset WRF output combined with

  8. Comparing the performance of SIMD computers by running large air pollution models

    DEFF Research Database (Denmark)

    Brown, J.; Hansen, Per Christian; Wasniewski, J.

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on these computers. Using a realistic large-scale model, we gained detailed insight about the performance of the computers involved when used to solve large-scale scientific problems that involve several types of numerical computations. The computers used in our study are the Connection Machines CM-200 and CM-5, and the MasPar MP-2216...

  9. Black women, work, stress, and perceived discrimination: the focused support group model as an intervention for stress reduction.

    Science.gov (United States)

    Mays, V M

    1995-01-01

    This exploratory study examined the use of two components (small and large groups) of a community-based intervention, the Focused Support Group (FSG) model, to alleviate employment-related stressors in Black women. Participants were assigned to small groups based on occupational status. Groups met for five weekly 3-hr sessions in didactic or small- and large-group formats. Two evaluations following the didactic session and the small and large group sessions elicited information on satisfaction with each of the formats, self-reported change in stress, awareness of interpersonal and sociopolitical issues affecting Black women in the labor force, assessing support networks, and usefulness of specific discussion topics to stress reduction. Results indicated the usefulness of the small- and large-group formats in reduction of self-reported stress and increases in personal and professional sources of support. Discussions on race and sex discrimination in the workplace were effective in overall stress reduction. The study highlights labor force participation as a potential source of stress for Black women, and supports the development of culture- and gender-appropriate community interventions as viable and cost-effective methods for stress reduction.

  10. The sheep as a large osteoporotic model for orthopaedic research in humans

    DEFF Research Database (Denmark)

    Cheng, L.; Ding, Ming; Li, Z.

    2008-01-01

    Although small animals such as rodents are very popular for osteoporosis models, large animal models are necessary for research on human osteoporotic diseases. Sheep osteoporosis models are becoming more important because of their unique advantages for osteoporosis research. Sheep are docile... intake restriction and glucocorticoid application are the most effective methods for the sheep osteoporosis model. The sheep osteoporosis model is an ideal animal model for studying various medicines for osteoporosis and other treatment methods, such as prosthetic replacement, in osteoporotic...

  11. Leader-based and self-organized communication: modelling group-mass recruitment in ants.

    Science.gov (United States)

    Collignon, Bertrand; Deneubourg, Jean Louis; Detrain, Claire

    2012-11-21

    For collective decisions to be made, the information acquired by experienced individuals about resources' location has to be shared with naïve individuals through recruitment. Here, we investigate the properties of collective responses arising from a leader-based recruitment and a self-organized communication by chemical trails. We develop a generalized model based on biological data drawn from the Tetramorium caespitum ant species, whose collective foraging relies on the coupling of group leading and trail recruitment. We show that for leader-based recruitment, small groups of recruits have to be guided in a very efficient way to allow a collective exploitation of food, while large groups require less attention from their leader. In the case of self-organized recruitment through a chemical trail, a critical value of trail amount has to be laid per forager in order to launch collective food exploitation. Thereafter, ants can maintain collective foraging by emitting signal intensity below this threshold. Finally, we demonstrate how the coupling of both recruitment mechanisms may benefit collectively foraging species. These theoretical results are then compared with experimental data from recruitment by T. caespitum ant colonies performing group-mass recruitment towards a single food source. We evidence the key role of leaders as initiators and catalysts of recruitment before this leader-based process is overtaken by self-organised communication through trails. This model brings new insights as well as a theoretical background to empirical studies about cooperative foraging in group-living species. Copyright © 2012 Elsevier Ltd. All rights reserved.
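
    The threshold behaviour described for trail recruitment can be shown with a toy dynamical model: foragers are recruited in proportion to the trail, each forager deposits a fixed amount of pheromone, and the trail evaporates. The rate constants below are illustrative and are not the parameters fitted to T. caespitum in the paper.

```python
# Toy trail-recruitment dynamics: collective exploitation only takes off when the
# per-forager trail deposition q exceeds a critical value (here around 0.02).
def simulate(q_trail_per_forager, steps=20000, dt=0.01, colony=100.0):
    foragers, trail = 1.0, 0.0
    for _ in range(steps):
        recruit = 0.5 * trail * (colony - foragers) / colony   # trail-driven recruitment
        give_up = 0.05 * foragers                              # spontaneous abandonment
        foragers += dt * (recruit - give_up)
        trail += dt * (q_trail_per_forager * foragers - 0.2 * trail)  # deposition - evaporation
    return foragers

for q in (0.005, 0.01, 0.02, 0.04):
    print(f"q = {q:.3f} -> foragers at the food source: {simulate(q):6.1f}")
```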

  12. Student perceptions of gamified audience response system interactions in large group lectures and via lecture capture technology.

    Science.gov (United States)

    Pettit, Robin K; McCoy, Lise; Kinney, Marjorie; Schwartz, Frederic N

    2015-05-22

    Higher education students have positive attitudes about the use of audience response systems (ARS), but even technology-enhanced lessons can become tiresome if the pedagogical approach is exactly the same with each implementation. Gamification is the notion that gaming mechanics can be applied to routine activities. In this study, TurningPoint (TP) ARS interactions were gamified and implemented in 22 large group medical microbiology lectures throughout an integrated year 1 osteopathic medical school curriculum. A 32-item questionnaire was used to measure students' perceptions of the gamified TP interactions at the end of their first year. The survey instrument generated both Likert scale and open-ended response data that addressed game design and variety, engagement and learning features, use of TP questions after class, and any value of lecture capture technology for reviewing these interactive presentations. The Chi Square Test was used to analyze grouped responses to Likert scale questions. Responses to open-ended prompts were categorized using open-coding. Ninety-one students out of 106 (86 %) responded to the survey. A significant majority of the respondents agreed or strongly agreed that the games were engaging, and an effective learning tool. The questionnaire investigated the degree to which specific features of these interactions were engaging (nine items) and promoted learning (seven items). The most highly ranked engagement aspects were peer competition and focus on the activity (tied for highest ranking), and the most highly ranked learning aspect was applying theoretical knowledge to clinical scenarios. Another notable item was the variety of interactions, which ranked in the top three in both the engagement and learning categories. Open-ended comments shed light on how students use TP questions for exam preparation, and revealed engaging and non-engaging attributes of these interactive sessions for students who review them via lecture capture
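
    The study analyses grouped Likert responses with a chi-square test. The snippet below runs the same kind of test on an invented 2x3 table of response counts, purely to illustrate the procedure, not to reproduce the survey results.

```python
# Chi-square test of independence on grouped Likert counts (made-up numbers).
from scipy.stats import chi2_contingency

table = [
    [70, 15, 6],   # agree / neutral / disagree: "the games were engaging"
    [62, 20, 9],   # agree / neutral / disagree: "the games were an effective learning tool"
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```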

  13. On the renormalization group flow in two dimensional superconformal models

    International Nuclear Information System (INIS)

    Ahn, Changrim; Stanishkov, Marian

    2014-01-01

    We extend the results on the RG flow in the next-to-leading order to the case of the supersymmetric minimal models SM_p for p ≫ 1. We explain how to compute the NS and Ramond fields conformal blocks in the leading order in 1/p and follow the renormalization scheme proposed in [1]. As a result we obtain the anomalous dimensions of certain NS and Ramond fields. It turns out that the linear combination expressing the infrared limit of these fields in terms of the IR theory SM_{p-2} is exactly the same as those of the nonsupersymmetric minimal theory

  14. Toward a Group Empowerment Model in Mexican Organizations: A Structural Equation Modeling Approach-Edición Única

    OpenAIRE

    Mendoza Gómez, Joel

    2005-01-01

    A model of group empowerment within the context of Mexican organizations is proposed and empirically tested. Studying groups in the workplace has attracted increasing attention during the last years from academics and practitioners. The construct of group empowerment has been scarcely studied; however, group motivation is a crucial element of group effectiveness. The study of group motivation has not completely covered the process through which group empowerment is gener...

  15. Group Peer Mentoring: An Answer to the Faculty Mentoring Problem? A Successful Program at a Large Academic Department of Medicine.

    Science.gov (United States)

    Pololi, Linda H; Evans, Arthur T

    2015-01-01

    To address a dearth of mentoring and to avoid the pitfalls of dyadic mentoring, the authors implemented and evaluated a novel collaborative group peer mentoring program in a large academic department of medicine. The mentoring program aimed to facilitate faculty in their career planning, and targeted either early-career or midcareer faculty in 5 cohorts over 4 years, from 2010 to 2014. Each cohort of 9-12 faculty participated in a yearlong program with foundations in adult learning, relationship formation, mindfulness, and culture change. Participants convened for an entire day, once a month. Sessions incorporated facilitated stepwise and values-based career planning, skill development, and reflective practice. Early-career faculty participated in an integrated writing program and midcareer faculty in leadership development. Overall attendance of the 51 participants was 96%, and only 3 of 51 faculty who completed the program left the medical school during the 4 years. All faculty completed a written detailed structured academic development plan. Participants experienced an enhanced, inclusive, and appreciative culture; clarified their own career goals, values, strengths and priorities; enhanced their enthusiasm for collaboration; and developed skills. The program results highlight the need for faculty to personally experience the power of forming deep relationships with their peers for fostering successful career development and vitality. The outcomes of faculty humanity, vitality, professionalism, relationships, appreciation of diversity, and creativity are essential to the multiple missions of academic medicine. © 2015 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on Continuing Medical Education, Association for Hospital Medical Education.

  16. Very large intermediate breaking scale in the Gepner/Schimmrigk three generation model

    International Nuclear Information System (INIS)

    Wu Jizhi; Arnowitt, Richard

    1994-01-01

    A detailed study of the intermediate symmetry breaking scale, via the renormalization group equations, for a three generation heterotic string model arising from the N=2 superconformal construction is reported. The numerical study shows that the model admits a very large intermediate breaking scale ≳ 1.0×10^16 GeV. The role of the gauge singlets in this model is studied, and it is found that these fields play a crucial role in determining the directions and the scale of the intermediate symmetry breaking. The importance of the mixing in generation space is also studied. The generation mixing terms are found to have special effects in the intermediate symmetry breaking. Remarkably these terms can produce some new Yukawa couplings (not present at the Planck scale) through loops. These couplings are in general very small compared to the ones with non-vanishing tree level values and thus offer a new mechanism to solve the lepton/quark mass hierarchy problem. (orig.)

  17. Large intraspecific genetic variation within the Saffron-Crocus group (Crocus L., Series Crocus; Iridaceae)

    DEFF Research Database (Denmark)

    Larsen, Bjarne; Orabi, Jihad; Pedersen, Carsten

    2015-01-01

    generally were grouped with C. sativus samples. Pollination and maintenance of genetic variation are discussed. The large intraspecific variation found within the three specifically studied species reflects dynamic population structures with potential to meet future ecological fluctuations. It emphasises...

  18. Formation of Large-Amplitude Wave Groups in an Experimental Model Basin

    Science.gov (United States)

    2008-08-01

    field: varying blower motor speeds supplying air to the pneumatic domes and motion amplitude variation of the flapper valve that controls air being...pumped in and out of the domes. Hydraulic cylinders with a ± 10V control signal are employed to actuate the flapper valves. The wave-maker also has a...of the four regular waves was controlled by blower rpm, maximum voltage (the amplitude of flapper motion), frequency, and the number of wave cycles

  19. Development of an adolescent inpatient sexual abuse group: application of Lewin's model of change.

    Science.gov (United States)

    Riddle, C R

    1994-01-01

    The development and implementation of an adolescent sexual abuse group on an inpatient psychiatric unit is described. Steps of Kurt Lewin's model of change are used as a framework for this planned change. Specific issues concerning group procedure and process are detailed. Recommendations for this group and broader use of the Lewin model are included.

  20. Framing Negotiation: Dynamics of Epistemological and Positional Framing in Small Groups during Scientific Modeling

    Science.gov (United States)

    Shim, Soo-Yean; Kim, Heui-Baik

    2018-01-01

    In this study, we examined students' epistemological and positional framing during small group scientific modeling to explore their context-dependent perceptions about knowledge, themselves, and others. We focused on two small groups of Korean eighth-grade students who participated in six modeling activities about excretion. The two groups were…

  1. A friendly Maple module for one and two group reactor model

    International Nuclear Information System (INIS)

    Baptista, Camila O.; Pavan, Guilherme A.; Braga, Kelmo L.; Silva, Marcelo V.; Pereira, P.G.S.; Werner, Rodrigo; Antunes, Valdir; Vellozo, Sergio O.

    2015-01-01

    The well-known two-energy-group core reactor design model is revisited. A simple and friendly Maple module was built to cover the step-by-step calculations of a plate reactor in five situations: (1) one-group bare reactor, (2) two-group bare reactor, (3) one-group reflected reactor, (4) 1-1/2-group reflected reactor, and (5) two-group reflected reactor. The results show the convergent path of the critical size, as it should be. (author)
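
    For the first of the five cases listed above, the one-group bare-reactor criticality condition can be written down in a few lines. The sketch below is in Python rather than Maple, and the cross-section values are illustrative, not those used in the paper.

```python
# One-group bare slab: critical thickness from material vs. geometric buckling.
import math

nu_sigma_f = 0.157   # nu * Sigma_f, macroscopic fission production (1/cm), illustrative
sigma_a    = 0.150   # macroscopic absorption cross-section (1/cm), illustrative
D          = 0.90    # diffusion coefficient (cm), illustrative

k_inf = nu_sigma_f / sigma_a
L2 = D / sigma_a                               # diffusion area (cm^2)
B2_material = (k_inf - 1.0) / L2               # material buckling (1/cm^2)

# Criticality: the slab's geometric buckling (pi / a)^2 equals the material buckling
a_critical = math.pi / math.sqrt(B2_material)  # extrapolated critical thickness (cm)
print(f"k_inf = {k_inf:.3f}, critical slab thickness ~ {a_critical:.1f} cm")
```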

  2. Modeling the Effect of Climate Change on Large Fire Size, Counts, and Intensities Using the Large Fire Simulator (FSim)

    Science.gov (United States)

    Riley, K. L.; Haas, J. R.; Finney, M.; Abatzoglou, J. T.

    2013-12-01

    Changes in climate can be expected to cause changes in wildfire activity due to a combination of shifts in weather (temperature, precipitation, relative humidity, wind speed and direction) and vegetation. Changes in vegetation could include type conversions, altered forest structure, and shifts in species composition, the effects of which could be mitigated or exacerbated by management activities. Further, changes in suppression response and effectiveness may alter potential wildfire activity, as well as the consequences of wildfire. Feedbacks among these factors are extremely complex and uncertain. The ability to anticipate changes driven by fire weather (largely outside of human control) can lead to development of fire and fuel management strategies aimed at mitigating current and future risk. Therefore, in this study we focus on isolating the effects of climate-induced changes in weather on wildfire activity. Specifically, we investigated the effect of changes in weather on fire activity in the Canadian Rockies ecoregion, which encompasses Glacier National Park and several large wilderness areas to the south. To model the ignition, growth, and containment of wildfires, we used the Large Fire Simulator (FSim), which we coupled with current and projected future climatic conditions. Weather streams were based on data from 14 downscaled Global Circulation Models (GCMs) from the Coupled Model Intercomparison Project Phase 5 (CMIP5) using the Representative Concentration Pathways (RCP) 45 and 85 for the years 2040-2060. While all GCMs indicate increases in temperature for this area, which would be expected to exacerbate fire activity, precipitation predictions for the summer wildfire season are more variable, ranging from a decrease of approximately 50 mm to an increase of approximately 50 mm. Windspeeds are generally predicted to decrease, which would reduce rates of spread and fire intensity. The net effect of these weather changes on the size, number, and intensity

  3. The Effects of Individual versus Group Incentive Systems on Student Learning and Attitudes in a Large Lecture Course

    Science.gov (United States)

    Shariff, Sya Azmeela Binti

    2012-01-01

    Promoting active learning among students may result in greater learning and more positive attitudes in university-level large lecture classes. One way of promoting active learning in large lecture classes is via the use of a think-pair-share instructional strategy, which combines student participation in class discussions via clicker technology…

  4. Using radar altimetry to update a large-scale hydrological model of the Brahmaputra river basin

    DEFF Research Database (Denmark)

    Finsen, F.; Milzow, Christian; Smith, R.

    2014-01-01

    of the Brahmaputra is excellent (17 high-quality virtual stations from ERS-2, 6 from Topex and 10 from Envisat are available for the Brahmaputra). In this study, altimetry data are used to update a large-scale Budyko-type hydrological model of the Brahmaputra river basin in real time. Altimetry measurements...... improved model performance considerably. The Nash-Sutcliffe model efficiency increased from 0.77 to 0.83. Real-time river basin modelling using radar altimetry has the potential to improve the predictive capability of large-scale hydrological models elsewhere on the planet....
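
    The improvement from 0.77 to 0.83 quoted in this record is a Nash-Sutcliffe efficiency. The snippet below computes that metric for a prior and an updated simulation against observations; the discharge values are invented for illustration.

```python
# Nash-Sutcliffe efficiency of two simulations against (hypothetical) observed discharge.
import numpy as np

def nash_sutcliffe(simulated, observed):
    sim, obs = np.asarray(simulated, float), np.asarray(observed, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

observed      = [4200, 5100, 8800, 12500, 9800, 6100]   # discharge (m^3/s)
prior_model   = [3900, 4600, 7900, 10800, 9100, 5400]   # before assimilating altimetry
updated_model = [4100, 4900, 8500, 12000, 9600, 5900]   # after the altimetry update

print("NSE prior  :", round(nash_sutcliffe(prior_model, observed), 3))
print("NSE updated:", round(nash_sutcliffe(updated_model, observed), 3))
```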

  5. Renormalization group study of the minimal Majoronic dark radiation and dark matter model

    Energy Technology Data Exchange (ETDEWEB)

    Chang, We-Fu [Department of Physics, National Tsing Hua University,101, Sec. 2, KuangFu Rd., Hsinchu 300, Taiwan (China); Ng, John N. [Theory Group, TRIUMF,4004 Wesbrook Mall, Vancouver BC V6T 2A3 (Canada)

    2016-07-18

    We study the 1-loop renormalization group equation running in the simplest singlet Majoron model constructed by us earlier to accommodate the dark radiation and dark matter content in the universe. A comprehensive numerical study was performed to explore the whole model parameter space. A smaller effective number of neutrinos ΔN_eff ∼ 0.05, or a Majoron decoupling temperature higher than the charm quark mass, is preferred. We found that a heavy scalar dark matter, ρ, of mass 1.5–4 TeV is required by the stability of the scalar potential and an operational type-I see-saw mechanism for neutrino masses. A neutral scalar, S, of mass in the 10–100 GeV range and its mixing with the standard model Higgs as large as 0.1 is also predicted. The dominant decay modes are S into bb-bar and/or ωω. A sensitive search will come from rare Z decays via the chain Z→S+ff-bar, where f is a Standard Model fermion, followed by S into a pair of Majoron and/or b-quarks. The interesting consequences of dark matter bound state due to the sizable Sρρ-coupling are discussed as well. In particular, shower-like events with an apparent neutrino energy at M_ρ could contribute to the observed effective neutrino flux in underground neutrino detectors such as IceCube.

  6. Analytical model of the statistical properties of contrast of large-scale ionospheric inhomogeneities.

    Science.gov (United States)

    Vsekhsvyatskaya, I. S.; Evstratova, E. A.; Kalinin, Yu. K.; Romanchuk, A. A.

    1989-08-01

    A new analytical model is proposed for the distribution of variations of the relative electron-density contrast of large-scale ionospheric inhomogeneities. The model is characterized by nonzero skewness and kurtosis. It is shown that the model is applicable in the interval of horizontal dimensions of inhomogeneities from hundreds to thousands of kilometers.
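
    Since the model is characterised by its skewness and kurtosis, the snippet below shows how those two moments are estimated from a sample; the synthetic lognormal draw merely stands in for real contrast data and is not the paper's distribution.

```python
# Sample skewness and excess kurtosis of synthetic electron-density contrast values.
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(0)
# A shifted lognormal draw plays the role of measured relative contrast variations.
contrast = rng.lognormal(mean=-3.0, sigma=0.6, size=10_000) - np.exp(-3.0 + 0.5 * 0.6**2)

print("skewness       :", round(float(skew(contrast)), 3))
print("excess kurtosis:", round(float(kurtosis(contrast)), 3))
```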

  7. Large-signal PIN diode model for ultra-fast photodetectors

    DEFF Research Database (Denmark)

    Krozer, Viktor; Fritsche, C

    2005-01-01

    A large-signal model for PIN photodetector is presented, which can be applied to ultra-fast photodetection and THz signal generation. The model takes into account the tunnelling and avalanche breakdown, which is important for avalanche photodiodes. The model is applied to ultra-fast superlattice ...

  8. Psicodinâmica da violência de grandes grupos e da violência de massas Large-group psychodynamics and massive violence

    Directory of Open Access Journals (Sweden)

    Vamik D. Volkan

    2006-01-01

    Full Text Available Beginning with Freud, psychoanalytic theories concerning large groups have mainly focused on individuals' perceptions of what their large groups psychologically mean to them. This text examines some aspects of large-group psychology in its own right and studies the psychodynamics of ethnic, national, religious or ideological groups, the membership of which originates in childhood. I will compare the mourning process in individuals with the mourning process in large groups to illustrate why we need to study large-group psychology as a subject in itself. As part of this discussion I will also describe signs and symptoms of large-group regression. When there is a threat against a large group's identity, massive violence may be initiated and this violence, in turn, has an obvious impact on public health.

  9. Large-group psychodynamics and massive violence Psicodinâmica da violência de grandes grupos e da violência de massas

    Directory of Open Access Journals (Sweden)

    Vamik D. Volkan

    2006-06-01

    Full Text Available Beginning with Freud, psychoanalytic theories concerning large groups have mainly focused on individuals' perceptions of what their large groups psychologically mean to them. This chapter examines some aspects of large-group psychology in its own right and studies the psychodynamics of ethnic, national, religious or ideological groups, the membership of which originates in childhood. I will compare the mourning process in individuals with the mourning process in large groups to illustrate why we need to study large-group psychology as a subject in itself. As part of this discussion I will also describe signs and symptoms of large-group regression. When there is a threat against a large group's identity, massive violence may be initiated and this violence, in turn, has an obvious impact on public health.

  10. Modeling perspectives on echolocation strategies inspired by bats flying in groups.

    Science.gov (United States)

    Lin, Yuan; Abaid, Nicole

    2015-12-21

    Bats navigating with echolocation - which is a type of active sensing achieved by interpreting echoes resulting from self-generated ultrasonic pulses - exhibit unique behaviors during group flight. While bats may benefit from eavesdropping on their peers' echolocation, they also potentially suffer from confusion between their own and peers' pulses, caused by an effect called frequency jamming. This hardship of group flight is supported by experimental observations of bats simplifying their sound-scape by shifting their pulse frequencies or suppressing echolocation altogether. Here, we investigate eavesdropping and varying pulse emission rate from a modeling perspective to understand these behaviors' potential benefits and detriments. We define an agent-based model of echolocating bats avoiding collisions in a three-dimensional tunnel. Through simulation, we show that bats with reasonably accurate eavesdropping can reduce collisions compared to those neglecting information from peers. In large populations, bats minimize frequency jamming by decreasing pulse emission rate, while collision risk increases; conversely, increasing pulse emission rate minimizes collisions by allowing more sensing information generated per bat. These strategies offer benefits for both biological and engineered systems, since frequency jamming is a concern in systems using active sensing. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Franchising as a Strategy for Combining Small and Large Group Advantages (Logics) in Social Entrepreneurship: A Hayekian Perspective

    OpenAIRE

    Beckmann, Markus; Zeyen, Anica

    2014-01-01

    This article develops a Hayekian perspective on social franchising that distinguishes between the end-connected logic of the small group and the rule-connected logic of the big group. Our key claim is that mission-driven social entrepreneurs often draw on the small-group logic when starting their social ventures and then face difficulties when the process of scaling shifts their operations toward a big-group logic. In this situation, social franchising offers a strategy to replicate the small...

  12. Association between ABO Blood Group and Risk of Congenital Heart Disease: A 6-year large cohort study.

    Science.gov (United States)

    Zu, Bailing; You, Guoling; Fu, Qihua; Wang, Jing

    2017-02-17

    ABO blood group, apart from its direct clinical implications for transfusion and organ transplantation, is generally accepted as a factor affecting coronary heart disease risk, but the associations between ABO blood group and congenital heart disease (CHD) reported previously are not consistent. In this study, we evaluated the potential relationship between ABO blood group and CHD risk. In 39,042 consecutive inpatients (19,795 CHD vs 19,247 controls), we used multivariable logistic regression to evaluate the roles of ABO blood group, gender, and Rh factor for CHD. The associations between ABO blood group and CHD subgroups were further evaluated using stratification analysis, adjusted by gender. Individuals with blood group A showed a decreased risk of isolated CHD (OR 0.82; 95% CI 0.78-0.87) in the overall cohort analysis, and the finding was consistently replicated in independent subgroup analysis. ABO blood group may play a role in CHD, and this novel finding suggests ABO blood group as a possible marker for CHD, but more studies need to be done.
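
    The OR 0.82 (95% CI 0.78-0.87) quoted above is an odds ratio with a Wald interval. The snippet below computes that statistic from a 2x2 table; the split of the 19,795 cases and 19,247 controls into blood group A vs non-A is invented purely for illustration.

```python
# Odds ratio and Wald 95% CI for blood group A vs non-A by CHD status (toy counts).
import math

a, b = 5200, 14595   # CHD cases: group A, non-A   (hypothetical split)
c, d = 5900, 13347   # controls:  group A, non-A   (hypothetical split)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lower:.2f}-{upper:.2f})")
```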

  13. Estimation of group means when adjusting for covariates in generalized linear models.

    Science.gov (United States)

    Qu, Yongming; Luo, Junxiang

    2015-01-01

    Generalized linear models are commonly used to analyze categorical data such as binary, count, and ordinal outcomes. Adjusting for important prognostic factors or baseline covariates in generalized linear models may improve estimation efficiency. The model-based mean for a treatment group produced by most software packages estimates the response at the mean covariate, not the mean response for this treatment group in the studied population. Although this is not an issue for linear models, the model-based group mean estimates in generalized linear models can be seriously biased with respect to the true group means. We propose a new method to estimate the group mean consistently, with the corresponding variance estimation. Simulations showed that the proposed method produces an unbiased estimator of the group means and provides the correct coverage probability. The proposed method was applied to analyze hypoglycemia data from clinical trials in diabetes. Copyright © 2014 John Wiley & Sons, Ltd.
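
    The distinction the abstract draws can be seen numerically: in a logistic model, the predicted response at the mean covariate is not the mean of the predicted responses. The coefficients and covariate distribution below are synthetic.

```python
# Response at the mean covariate vs. mean of predicted responses in a logistic model.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(0.0, 2.0, size=5_000)    # baseline covariate values for one treatment group
beta0, beta1 = -0.5, 0.8                # assumed fitted coefficients

def inv_logit(z):
    return 1.0 / (1.0 + np.exp(-z))

at_mean_covariate = inv_logit(beta0 + beta1 * x.mean())     # what most software reports
mean_of_predicted = inv_logit(beta0 + beta1 * x).mean()     # group mean over the population

print("response at mean covariate:", round(float(at_mean_covariate), 3))
print("mean predicted response   :", round(float(mean_of_predicted), 3))
```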

  14. How uncertainty in socio-economic variables affects large-scale transport model forecasts

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2015-01-01

    A strategic task assigned to large-scale transport models is to forecast the demand for transport over long periods of time to assess transport projects. However, by modelling complex systems, transport models have an inherent uncertainty which increases over time. As a consequence, the longer the period forecasted, the less reliable is the forecasted model output. Describing uncertainty propagation patterns over time is therefore important in order to provide complete information to the decision makers. Among the existing literature, only few studies analyze uncertainty propagation patterns over time, especially with respect to large-scale transport models. The study described in this paper contributes to fill the gap by investigating the effects of uncertainty in socio-economic variables growth rate projections on large-scale transport model forecasts, using the Danish National Transport...

  15. Does company size matter? Validation of an integrative model of safety behavior across small and large construction companies.

    Science.gov (United States)

    Guo, Brian H W; Yiu, Tak Wing; González, Vicente A

    2018-02-01

    Previous safety climate studies primarily focused on either large construction companies or the construction industry as a whole, while little is known about whether company size has significant effects on workers' understanding of safety climate measures and relationships between safety climate factors and safety behavior. Thus, this study aims to: (a) test the measurement equivalence (ME) of a safety climate measure across workers from small and large companies; (b) investigate if company size alters the causal structure of the integrative model developed by Guo, Yiu, and González (2016). Data were collected from 253 construction workers in New Zealand using a safety climate measure. This study used multi-group confirmatory factor analyses (MCFA) to test the measurement equivalence of the safety climate measure and structure invariance of the integrative model. Results indicate that workers from small and large companies understood the safety climate measure in a similar manner. In addition, it was suggested that company size does not change the causal structure and mediational processes of the integrative model. Both measurement equivalence of the safety climate measure and structural invariance of the integrative model were supported by this study. Practical applications: Findings of this study provided strong support for a meaningful use of the safety climate measure across construction companies in different sizes. Safety behavior promotion strategies designed based on the integrative model may be well suited for both large and small companies. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.

  16. Large-Signal Code TESLA: Improvements in the Implementation and in the Model

    National Research Council Canada - National Science Library

    Chernyavskiy, Igor A; Vlasov, Alexander N; Anderson, Jr., Thomas M; Cooke, Simon J; Levush, Baruch; Nguyen, Khanh T

    2006-01-01

    We describe the latest improvements made in the large-signal code TESLA, which include transformation of the code to a Fortran-90/95 version with dynamical memory allocation and extension of the model...

  17. Large Deviations for Stochastic Models of Two-Dimensional Second Grade Fluids

    Energy Technology Data Exchange (ETDEWEB)

    Zhai, Jianliang, E-mail: zhaijl@ustc.edu.cn [University of Science and Technology of China, School of Mathematical Sciences (China); Zhang, Tusheng, E-mail: Tusheng.Zhang@manchester.ac.uk [University of Manchester, School of Mathematics (United Kingdom)

    2017-06-15

    In this paper, we establish a large deviation principle for stochastic models of incompressible second grade fluids. The weak convergence method introduced by Budhiraja and Dupuis (Probab Math Statist 20:39–61, 2000) plays an important role.

  18. Various approaches to the modelling of large scale 3-dimensional circulation in the Ocean

    Digital Repository Service at National Institute of Oceanography (India)

    Shaji, C.; Bahulayan, N.; Rao, A.D.; Dube, S.K.

    In this paper, the three different approaches to the modelling of large scale 3-dimensional flow in the ocean such as the diagnostic, semi-diagnostic (adaptation) and the prognostic are discussed in detail. Three-dimensional solutions are obtained...

  19. Analysis and Modelling of Pedestrian Movement Dynamics at Large-scale Events

    NARCIS (Netherlands)

    Duives, D.C.

    2016-01-01

    To what extent can we model the movements of pedestrians who walk across a large-scale event terrain? This dissertation answers this question by analysing the operational movement dynamics of pedestrians in crowds at several large music and sport events in the Netherlands and extracting the key

  20. Measurements in SUGRA Models with Large $\\tan\\beta$ at LHC

    CERN Document Server

    Hinchliffe, Ian

    1999-01-01

    We present an example of a scenario of particle production and decay in supersymmetry models in which the supersymmetry breaking is transmitted to the observable world via gravitational interactions. The case is chosen so that there is a large production of tau leptons in the final state. It is characteristic of large $\tan\beta$ that decays into muons and electrons may be suppressed.

  1. Monte Carlo model of light transport in scintillating fibers and large scintillators

    International Nuclear Information System (INIS)

    Chakarova, R.

    1995-01-01

    A Monte Carlo model is developed which simulates the light transport in a scintillator surrounded by a transparent layer with different surface properties. The model is applied to analyse the light collection properties of scintillating fibers and a large scintillator wrapped in aluminium foil. The influence of the fiber interface characteristics on the light yield is investigated in detail. Light output results as well as time distributions are obtained for the large scintillator case. 15 refs, 16 figs
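
    As a flavour of this kind of Monte Carlo, the sketch below estimates the fraction of scintillation photons trapped in a fibre core by total internal reflection, in the meridional-ray approximation; the refractive indices are typical core/cladding values and are not taken from the paper.

```python
# Monte Carlo estimate of the trapping fraction for isotropic emission in a fibre core.
import numpy as np

rng = np.random.default_rng(7)
n_core, n_clad = 1.59, 1.49              # typical polystyrene core / PMMA-like cladding
cos_trap = n_clad / n_core               # trapped if |cos(angle to fibre axis)| > n_clad / n_core

cos_theta = rng.uniform(-1.0, 1.0, size=1_000_000)   # isotropic emission directions
trapped = np.abs(cos_theta) > cos_trap

print("Monte Carlo trapped fraction:", round(float(trapped.mean()), 4))
print("analytic value (both ends)  :", round(1.0 - n_clad / n_core, 4))
```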

  2. Regional modeling of large wildfires under current and potential future climates in Colorado and Wyoming, USA

    Science.gov (United States)

    West, Amanda; Kumar, Sunil; Jarnevich, Catherine S.

    2016-01-01

    Regional analysis of large wildfire potential given climate change scenarios is crucial to understanding areas most at risk in the future, yet wildfire models are not often developed and tested at this spatial scale. We fit three historical climate suitability models for large wildfires (i.e. ≥ 400 ha) in Colorado and Wyoming using topography and decadal climate averages corresponding to wildfire occurrence at the same temporal scale. The historical models classified points of known large wildfire occurrence with high accuracies. Using a novel approach in wildfire modeling, we applied the historical models to independent climate and wildfire datasets, and the resulting sensitivities were 0.75, 0.81, and 0.83 for Maxent, Generalized Linear, and Multivariate Adaptive Regression Splines, respectively. We projected the historic models into future climate space using data from 15 global circulation models and two representative concentration pathway scenarios. Maps from these geospatial analyses can be used to evaluate the changing spatial distribution of climate suitability of large wildfires in these states. April relative humidity was the most important covariate in all models, providing insight to the climate space of large wildfires in this region. These methods incorporate monthly and seasonal climate averages at a spatial resolution relevant to land management (i.e. 1 km2) and provide a tool that can be modified for other regions of North America, or adapted for other parts of the world.
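
    The sensitivities of 0.75-0.83 quoted above are true-positive rates on independent fire-occurrence points. The snippet below computes that metric on a toy set of presence points, just to make the definition concrete; the values are not the study's data.

```python
# Sensitivity (true-positive rate) of a suitability model on known fire-occurrence points.
import numpy as np

def sensitivity(predicted_suitable, observed_fire):
    pred = np.asarray(predicted_suitable, bool)
    obs = np.asarray(observed_fire, bool)
    true_pos = np.sum(pred & obs)
    false_neg = np.sum(~pred & obs)
    return true_pos / (true_pos + false_neg)

observed  = [1, 1, 1, 1, 1, 1, 1, 1]   # independent points of known large-fire occurrence
predicted = [1, 1, 0, 1, 1, 1, 0, 1]   # model classifies 6 of the 8 as climatically suitable
print("sensitivity:", sensitivity(predicted, observed))
```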

  3. Large-watershed flood simulation and forecasting based on different-resolution distributed hydrological model

    Science.gov (United States)

    Li, J.

    2017-12-01

    Large-watershed flood simulation and forecasting is very important for the application of a distributed hydrological model, and it faces several challenges, including the effect of the model's spatial resolution on model performance and accuracy. To examine the spatial resolution effect, different resolutions (1000m*1000m, 600m*600m, 500m*500m, 400m*400m and 200m*200m) were used to build the distributed hydrological model, the Liuxihe model. The purpose is to find the best resolution for the Liuxihe model in large-watershed flood simulation and forecasting. This study sets up a physically based distributed hydrological model for flood forecasting of the Liujiang River basin in south China. Terrain data (digital elevation model, DEM), soil type and land use type were downloaded freely from the web. The model parameters are optimized using an improved Particle Swarm Optimization (PSO) algorithm; parameter optimization reduces the uncertainty that exists when model parameters are derived physically. The different model resolutions (200m*200m to 1000m*1000m) are used for modeling the Liujiang River basin flood with the Liuxihe model. The best spatial resolution for flood simulation and forecasting is 200m*200m, and as the spatial resolution becomes coarser, model performance and accuracy deteriorate. When the model resolution is 1000m*1000m, the flood simulation and forecasting result is the worst, and the river channel network derived at this resolution differs from the actual one. To keep the model at an acceptable performance, a minimum model spatial resolution is needed. The suggested threshold spatial resolution for modeling the Liujiang River basin flood is a 500m*500m grid cell, but a 200m*200m grid cell is recommended in this study to keep the model at its best performance.
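
    Parameter calibration in this record relies on an improved particle swarm optimisation. The sketch below shows a plain PSO loop in Python with a stand-in objective function; it is a generic illustration, not the improved algorithm or the Liuxihe objective used in the study.

```python
# Plain particle swarm optimisation; objective() would normally run the hydrological
# model for a candidate parameter set and return, e.g., 1 - NSE against observations.
import numpy as np

def objective(params):
    return float(np.sum((params - 0.3) ** 2))   # placeholder test function

rng = np.random.default_rng(42)
n_particles, n_dims, iterations = 20, 4, 100
pos = rng.uniform(0.0, 1.0, (n_particles, n_dims))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iterations):
    r1, r2 = rng.random((2, n_particles, n_dims))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1.0)
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best parameter set:", np.round(gbest, 3))
```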

  4. Evaluation of drought propagation in an ensemble mean of large-scale hydrological models

    Directory of Open Access Journals (Sweden)

    A. F. Van Loon

    2012-11-01

    Full Text Available Hydrological drought is increasingly studied using large-scale models. It is, however, not certain whether large-scale models reproduce the development of hydrological drought correctly. The pressing question is how well do large-scale models simulate the propagation from meteorological to hydrological drought? To answer this question, we evaluated the simulation of drought propagation in an ensemble mean of ten large-scale models, both land-surface models and global hydrological models, that participated in the model intercomparison project of WATCH (WaterMIP). For a selection of case study areas, we studied drought characteristics (number of droughts, duration, severity), drought propagation features (pooling, attenuation, lag, lengthening), and hydrological drought typology (classical rainfall deficit drought, rain-to-snow-season drought, wet-to-dry-season drought, cold snow season drought, warm snow season drought, composite drought).

    Drought characteristics simulated by large-scale models clearly reflected drought propagation; i.e. drought events became fewer and longer when moving through the hydrological cycle. However, more differentiation was expected between fast and slowly responding systems, with slowly responding systems having fewer and longer droughts in runoff than fast responding systems. This was not found using large-scale models. Drought propagation features were poorly reproduced by the large-scale models, because runoff reacted immediately to precipitation, in all case study areas. This fast reaction to precipitation, even in cold climates in winter and in semi-arid climates in summer, also greatly influenced the hydrological drought typology as identified by the large-scale models. In general, the large-scale models had the correct representation of drought types, but the percentages of occurrence had some important mismatches, e.g. an overestimation of classical rainfall deficit droughts, and an

  5. Multiconformation, Density Functional Theory-Based pKa Prediction in Application to Large, Flexible Organic Molecules with Diverse Functional Groups.

    Science.gov (United States)

    Bochevarov, Art D; Watson, Mark A; Greenwood, Jeremy R; Philipp, Dean M

    2016-12-13

    We consider the conformational flexibility of molecules and its implications for micro- and macro-pKa. The corresponding formulas are derived and discussed against the background of a comprehensive scientific and algorithmic description of the latest version of our computer program Jaguar pKa, a density functional theory-based pKa predictor, which is now capable of acting on multiple conformations explicitly. Jaguar pKa is essentially a complex computational workflow incorporating research and technologies from the fields of cheminformatics, molecular mechanics, quantum mechanics, and implicit solvation models. The workflow also makes use of automatically applied empirical corrections which account for the systematic errors resulting from the neglect of explicit solvent interactions in the algorithm's implicit solvent model. Applications of our program to large, flexible organic molecules representing several classes of functional groups are shown, with particular emphasis on drug-like molecules. It is demonstrated that a combination of aggressive conformational search and an explicit consideration of multiple conformations nearly eliminates the dependence of results on the initially chosen conformation. In certain cases this leads to unprecedented accuracy, which is sufficient for distinguishing stereoisomers that have slightly different pKa values. An application of Jaguar pKa to proton sponges, the pKa of which are strongly influenced by steric effects, showcases the advantages that pKa predictors based on quantum mechanical calculations have over similar empirical programs.
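
    One concrete piece of a multi-conformation treatment is Boltzmann-averaging conformer free energies before forming a pKa. The sketch below shows that averaging and the conversion of a deprotonation free energy to a pKa; all energies, including the lumped proton/correction term, are invented for illustration and are not Jaguar pKa values.

```python
# Boltzmann-average conformer free energies, then convert a deprotonation delta G to a pKa.
import math

RT = 0.593  # kcal/mol at 298 K

def effective_free_energy(conformer_energies_kcal):
    """G_eff = -RT * ln( sum_i exp(-G_i / RT) ) over explicitly treated conformers."""
    return -RT * math.log(sum(math.exp(-g / RT) for g in conformer_energies_kcal))

g_protonated   = effective_free_energy([0.00, 0.35, 1.10])   # HA conformers (relative, kcal/mol)
g_deprotonated = effective_free_energy([0.00, 0.20, 0.90])   # A- conformers (relative, kcal/mol)
proton_and_corrections = 9.4   # hypothetical lumped proton free energy + empirical corrections

delta_g = (g_deprotonated + proton_and_corrections) - g_protonated
print("pKa =", round(delta_g / (RT * math.log(10)), 2))
```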

  6. Patient satisfaction with the laborist model of care in a large urban hospital

    Directory of Open Access Journals (Sweden)

    Srinivas SK

    2013-03-01

    Full Text Available Background: The obstetric practice environment is evolving to include more laborists staffing obstetric units, with the hope of improving quality of care and provider satisfaction, yet there are scant data on the impact of a laborist care model on patient satisfaction or delivery outcomes. We sought to assess patient satisfaction after implementation of the laborist model of obstetric care in a large urban teaching hospital. Methods: Postpartum patients were asked to complete an anonymous survey assessing their satisfaction with care, particularly with regard to the laborist model. Survey questions included rating the overall experience of labor and delivery. All responses were based on a five-point Likert scale. Press-Ganey results were compared from before and after initiation of the model. Descriptive statistics were used to analyze the results. Results: Post-implementation obstetric and delivery experience surveys were collected from 4166 patients, representing a 54% response rate. Ninety percent of patients reported that they were highly satisfied with the overall experience in the labor and delivery unit. A subgroup was asked to rate their experience with the practitioner for their current delivery. Of the 687 respondents, 75% answered excellent, 18% answered good/very good, and 3.4% answered neutral. Eighty-five percent of this subgroup stated that they were informed during prenatal care that they may be delivered by someone other than the practitioner or group that they saw during the pregnancy. Thirty-seven percent (n = 1553) of the total respondents reported that

  7. Development of an Outdoor Learning Model Assisted by the Group Investigation Model to Develop Scientific Attitudes

    Directory of Open Access Journals (Sweden)

    Novi Yuliyanti

    2015-06-01

    Full Text Available A field study at SDN 2 Dukuh Tengah showed that science teaching was dominated by cognitive aspects and did little to develop students' scientific attitudes, and that the characteristics of the OLGI model (outdoor learning assisted by the group investigation model) were not yet clearly visible. The aims of this research were to describe the OLGI model, to measure its effectiveness, and to determine the improvement in students' scientific attitudes achieved with the developed model. The research consisted of five phases: (1) preliminary investigation, (2) design, (3) realization or construction, (4) testing, evaluation and revision, and (5) implementation. The subjects were 21 grade V students of Sekolah Dasar Negeri 2 Kersana as the limited-scale trial class, 28 grade VA students of SDN 2 Dukuh Tengah as the control class, and 28 grade VB students of SDN 2 Dukuh Tengah as the experimental class. Data were collected through interviews, teacher and student activity observation sheets, a scientific attitude questionnaire, and validation sheets. Data analysis comprised descriptive analysis, instrument analysis, and a two-sample comparison test. The results show that the OLGI model can significantly improve students' scientific attitudes, with a gain score of 0.55, in the medium category (range 0.30 ≤ g ≤ 0.70). Based on a t-test analysis, the learning outcomes of students in the experimental class (mean 85.32) were higher than those of the control class (mean 76.96).

  8. Modeling and Control of a Large Nuclear Reactor A Three-Time-Scale Approach

    CERN Document Server

    Shimjith, S R; Bandyopadhyay, B

    2013-01-01

    Control analysis and design of large nuclear reactors requires a suitable mathematical model representing the steady state and dynamic behavior of the reactor with reasonable accuracy. This task is, however, quite challenging because of several complex dynamic phenomena existing in a reactor. Quite often, the models developed would be of prohibitively large order, non-linear and of complex structure not readily amenable for control studies. Moreover, the existence of simultaneously occurring dynamic variations at different speeds makes the mathematical model susceptible to numerical ill-conditioning, inhibiting direct application of standard control techniques. This monograph introduces a technique for mathematical modeling of large nuclear reactors in the framework of multi-point kinetics, to obtain a comparatively smaller order model in standard state space form thus overcoming these difficulties. It further brings in innovative methods for controller design for systems exhibiting multi-time-scale property,...
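
    As background to the multi-point kinetics framework mentioned above, the standard one-point kinetics equations (with six delayed-neutron precursor groups) that such models generalize can be written, in textbook form, as

        \frac{dn}{dt} = \frac{\rho - \beta}{\Lambda}\, n + \sum_{i=1}^{6} \lambda_i C_i ,
        \qquad
        \frac{dC_i}{dt} = \frac{\beta_i}{\Lambda}\, n - \lambda_i C_i , \quad i = 1,\dots,6,

    where n is the neutron density, ρ the reactivity, β = Σ_i β_i the total delayed-neutron fraction, Λ the prompt-neutron generation time, and λ_i, C_i the decay constant and concentration of precursor group i. The prompt term evolves much faster than the precursor terms, which is the kind of multi-time-scale behaviour the monograph addresses; note that this is the generic point-kinetics statement, not the specific multi-point model developed in the book.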

  9. Recent Advances in Detailed Chemical Kinetic Models for Large Hydrocarbon and Biodiesel Transportation Fuels

    Energy Technology Data Exchange (ETDEWEB)

    Westbrook, C K; Pitz, W J; Curran, H J; Herbinet, O; Mehl, M

    2009-03-30

    n-Hexadecane and 2,2,4,4,6,8,8-heptamethylnonane represent the primary reference fuels for diesel that are used to determine cetane number, a measure of the ignition property of diesel fuel. With the development of chemical kinetics models for these two primary reference fuels for diesel, a new capability is now available to model diesel fuel ignition. Also, we have developed chemical kinetic models for a whole series of large n-alkanes and a large iso-alkane to represent these chemical classes in fuel surrogates for conventional and future fuels. Methyl decanoate and methyl stearate are large methyl esters that are closely related to biodiesel fuels, and kinetic models for these molecules have also been developed. These chemical kinetic models are used to predict the effect of the fuel molecule size and structure on ignition characteristics under conditions found in internal combustion engines.

  10. Optimization of large-scale heterogeneous system-of-systems models.

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, Ojas; Watson, Jean-Paul; Phillips, Cynthia Ann; Siirola, John; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Lee, Herbert K. H. (University of California, Santa Cruz, Santa Cruz, CA); Hart, William Eugene; Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Woodruff, David L. (University of California, Davis, Davis, CA)

    2012-01-01

    Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

  11. PVA gel as a potential adhesion barrier: a safety study in a large animal model of intestinal surgery.

    Science.gov (United States)

    Renz, Bernhard W; Leitner, Kurt; Odermatt, Erich; Worthley, Daniel L; Angele, Martin K; Jauch, Karl-Walter; Lang, Reinhold A

    2014-03-01

    Intra-abdominal adhesions following surgery are a major source of morbidity and mortality including abdominal pain and small bowel obstruction. This study evaluated the safety of PVA gel (polyvinyl alcohol and carboxymethylated cellulose gel) on intestinal anastomoses and its potential effectiveness in preventing adhesions in a clinically relevant large animal model. Experiments were performed in a pig model with median laparotomy and intestinal anastomosis following small bowel resection. The primary endpoint was the safety of PVA on small intestinal anastomoses. We also measured the incidence of postoperative adhesions in PVA vs. control groups: group A (eight pigs): stapled anastomosis with PVA gel compared to group B (eight pigs), which had no PVA gel; group C (eight pigs): hand-sewn anastomosis with PVA gel compared to group B (eight pigs), which had no anti-adhesive barrier. Animals were sacrificed 14 days after surgery and analyzed. All anastomoses had a patent lumen without any stenosis. No anastomoses leaked at an intraluminal pressure of 40 cmH2O. Thus, anastomoses healed very well in both groups, regardless of whether PVA was administered. PVA-treated animals, however, had significantly fewer adhesions in the area of stapled anastomoses. The hand-sewn PVA group also had weaker adhesions and trended towards fewer adhesions to adjacent organs. These results suggest that PVA gel does not jeopardize the integrity of intestinal anastomoses. However, larger trials are needed to investigate the potential of PVA gel to prevent adhesions in gastrointestinal surgery.

  12. A Regression Algorithm for Model Reduction of Large-Scale Multi-Dimensional Problems

    Science.gov (United States)

    Rasekh, Ehsan

    2011-11-01

    Model reduction is an approach for fast and cost-efficient modelling of large-scale systems governed by Ordinary Differential Equations (ODEs). Multi-dimensional model reduction has been suggested for reduction of the linear systems simultaneously with respect to frequency and any other parameter of interest. Multi-dimensional model reduction is also used to reduce the weakly nonlinear systems based on Volterra theory. Multiple dimensions degrade the efficiency of reduction by increasing the size of the projection matrix. In this paper a new methodology is proposed to efficiently build the reduced model based on regression analysis. A numerical example confirms the validity of the proposed regression algorithm for model reduction.
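
    For readers unfamiliar with projection-based model reduction, the minimal Python sketch below illustrates the generic Galerkin-projection step that such methods build on; the POD basis from snapshot data is only one possible choice of projection matrix, and the example is not the regression algorithm proposed in the paper.

        import numpy as np

        # Full-order linear system: dx/dt = A x + B u, y = C x
        n, m = 200, 1
        rng = np.random.default_rng(0)
        A = -np.eye(n) + 0.01 * rng.standard_normal((n, n))   # stable-ish test matrix
        B = rng.standard_normal((n, m))
        C = rng.standard_normal((1, n))

        # Collect state snapshots (here: response to an impulse-like input)
        dt, steps = 0.01, 500
        x = np.zeros(n)
        snapshots = []
        for k in range(steps):
            u = 1.0 if k == 0 else 0.0
            x = x + dt * (A @ x + B[:, 0] * u)   # explicit Euler, for illustration only
            snapshots.append(x.copy())
        X = np.array(snapshots).T                # n x steps snapshot matrix

        # POD: leading left singular vectors span the reduced subspace
        U, s, _ = np.linalg.svd(X, full_matrices=False)
        r = 10                                    # reduced order (chosen by inspecting s)
        V = U[:, :r]                              # projection matrix, V^T V = I

        # Galerkin projection: reduced operators of the small state-space model
        Ar, Br, Cr = V.T @ A @ V, V.T @ B, C @ V
        print("reduced system dimensions:", Ar.shape, Br.shape, Cr.shape)

    The size of V is exactly where multiple reduction dimensions (frequency plus other parameters) hurt efficiency, which is the problem the regression approach in the paper targets.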

  13. A dynamic programming approach for quickly estimating large network-based MEV models

    DEFF Research Database (Denmark)

    Mai, Tien; Frejinger, Emma; Fosgerau, Mogens

    2017-01-01

    We propose a way to estimate a family of static Multivariate Extreme Value (MEV) models with large choice sets in short computational time. The resulting model is also straightforward and fast to use for prediction. Following Daly and Bierlaire (2006), the correlation structure is defined by a rooted, directed graph where each node without successor is an alternative. We formulate a family of MEV models as dynamic discrete choice models on graphs of correlation structures and show that the dynamic models are consistent with MEV theory and generalize the network MEV model (Daly and Bierlaire...
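
    As a simple illustration of the kind of leaf-to-root recursion that network MEV models rely on, the sketch below computes choice probabilities for a two-level nested logit model (a special case of MEV) by propagating log-sum ("inclusive value") terms up a rooted graph whose leaves are the alternatives. The utilities and scale parameters are made up for the example; this is not the estimation method of the paper.

        import math

        # Toy correlation structure: root -> two nests -> alternatives (leaves)
        nests = {
            "car":     {"alts": {"drive_alone": 1.2, "carpool": 0.4}, "mu": 0.5},   # mu = nest scale, 0 < mu <= 1
            "transit": {"alts": {"bus": 0.1, "rail": 0.8},            "mu": 0.7},
        }

        # Bottom-up pass: inclusive value (log-sum) of each nest
        iv = {name: math.log(sum(math.exp(v / n["mu"]) for v in n["alts"].values()))
              for name, n in nests.items()}

        # Top level: probability of each nest, then of each alternative within its nest
        denom = sum(math.exp(n["mu"] * iv[name]) for name, n in nests.items())
        probs = {}
        for name, n in nests.items():
            p_nest = math.exp(n["mu"] * iv[name]) / denom
            within = sum(math.exp(v / n["mu"]) for v in n["alts"].values())
            for alt, v in n["alts"].items():
                probs[alt] = p_nest * math.exp(v / n["mu"]) / within

        print(probs)   # probabilities over all four alternatives sum to 1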

  14. Dark and luminous matter in the NGC 3992 group of galaxies - I. The large barred spiral NGC 3992

    NARCIS (Netherlands)

    Bottema, R; Verheijen, MAW

    Detailed neutral hydrogen observations have been obtained of the large barred spiral galaxy NGC 3992 and its three small companion galaxies, UGC 6923, UGC 6940, and UGC 6969. For the main galaxy, the Hi distribution is regular with a low level radial extension outside the stellar disc. However, at

  15. Implementation of Lifestyle Modification Program Focusing on Physical Activity and Dietary Habits in a Large Group, Community-Based Setting

    Science.gov (United States)

    Stoutenberg, Mark; Falcon, Ashley; Arheart, Kris; Stasi, Selina; Portacio, Francia; Stepanenko, Bryan; Lan, Mary L.; Castruccio-Prince, Catarina; Nackenson, Joshua

    2017-01-01

    Background: Lifestyle modification programs improve several health-related behaviors, including physical activity (PA) and nutrition. However, few of these programs have been expanded to impact a large number of individuals in one setting at one time. Therefore, the purpose of this study was to determine whether a PA- and nutrition-based lifestyle…

  16. Assessing Student Perceptions of the Benefits of Discussions in Small-Group, Large-Class, and Online Learning Contexts

    Science.gov (United States)

    Hamann, Kerstin; Pollock, Philip H.; Wilson, Bruce M.

    2012-01-01

    A large literature establishes the benefits of discussions for stimulating student engagement and critical thinking skills. However, we know considerably less about the differential effects of various discussion environments on student learning. In this study, we assess student perceptions concerning the benefits of discussions in an upper-level…

  17. Modeling of the Critical Micelle Concentration (CMC) of Nonionic Surfactants with an Extended Group-Contribution Method

    DEFF Research Database (Denmark)

    Mattei, Michele; Kontogeorgis, Georgios; Gani, Rafiqul

    2013-01-01

    A group-contribution (GC) property prediction model for estimating the critical micelle concentration (CMC) of nonionic surfactants in water at 25 °C is presented. The model is based on the Marrero and Gani GC method. A systematic analysis of the model performance against experimental data...... is carried out using data for a wide range of nonionic surfactants covering a wide range of molecular structures. As a result of this procedure, new third order groups based on the characteristic structures of nonionic surfactants are defined and are included in the Marrero and Gani GC model. In this way...... of 150 experimental measurements covering a large variety of nonionic surfactants including linear, branched, and phenyl alkyl ethoxylates; alkanediols; alkyl mono- and disaccharide ethers and esters; ethoxylated alkyl amines and amides; fluorinated linear ethoxylates and amides; polyglycerol esters...
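
    A first-order group-contribution estimate of a property is simply a sum of tabulated contributions weighted by how often each group occurs in the molecule; the Marrero and Gani method adds second- and third-order corrections on top of this. The sketch below shows only the first-order arithmetic, with entirely hypothetical group names, a hypothetical constant, and placeholder contribution values; the fitted parameters and functional form of the actual CMC model are given in the paper.

        # First-order group-contribution estimate: property = f0 + sum_i n_i * c_i
        # Contribution values below are placeholders, NOT the fitted Marrero-Gani parameters.
        contributions = {"CH3": -0.45, "CH2": -0.48, "OCH2CH2": 0.11, "OH": 0.65}
        universal_constant = 2.0      # hypothetical model constant f0

        def estimate_log_cmc(group_counts):
            """Sum occurrences times contributions (first-order terms only)."""
            return universal_constant + sum(n * contributions[g] for g, n in group_counts.items())

        # Hypothetical nonionic surfactant: CH3-(CH2)11-(OCH2CH2)8-OH
        surfactant = {"CH3": 1, "CH2": 11, "OCH2CH2": 8, "OH": 1}
        print("estimated log(CMC):", round(estimate_log_cmc(surfactant), 3))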

  18. The Hamburg large scale geostrophic ocean general circulation model. Cycle 1

    International Nuclear Information System (INIS)

    Maier-Reimer, E.; Mikolajewicz, U.

    1992-02-01

    The rationale for the Large Scale Geostrophic ocean circulation model (LSG-OGCM) is based on the observations that for a large scale ocean circulation model designed for climate studies, the relevant characteristic spatial scales are large compared with the internal Rossby radius throughout most of the ocean, while the characteristic time scales are large compared with the periods of gravity modes and barotropic Rossby wave modes. In the present version of the model, the fast modes have been filtered out by a conventional technique of integrating the full primitive equations, including all terms except the nonlinear advection of momentum, by an implicit time integration method. The free surface is also treated prognostically, without invoking a rigid lid approximation. The numerical scheme is unconditionally stable and has the additional advantage that it can be applied uniformly to the entire globe, including the equatorial and coastal current regions. (orig.)
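
    The key numerical point in the abstract, that an implicit treatment of the fast modes removes the stability limit on the time step, can be seen on the scalar test equation dy/dt = λy with a fast decay rate. The sketch below is a generic numerical-analysis illustration with invented numbers and has nothing to do with the LSG code itself.

        # Explicit vs implicit Euler on dy/dt = lam * y with a "fast" rate lam
        lam, dt, steps = -500.0, 0.1, 20      # dt far above the explicit limit |1 + lam*dt| < 1
        y_exp, y_imp = 1.0, 1.0
        for _ in range(steps):
            y_exp = y_exp * (1.0 + lam * dt)  # explicit Euler: blows up for this dt
            y_imp = y_imp / (1.0 - lam * dt)  # implicit Euler: decays for any dt > 0
        print("explicit:", y_exp, " implicit:", y_imp)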

  19. Implementation and assessment of the renormalization group (Rng) k - ε model in gothic

    International Nuclear Information System (INIS)

    Analytis, G.Th.

    2001-01-01

    In GOTHIC, the standard k - ε model is used to model turbulence. In an attempt to enhance the turbulence modelling capabilities of the code for simulation of mixing driven by highly buoyant discharges, we implemented the Renormalization Group (RNG) k - ε model. This model, which for the time being is only implemented in the "gas" phase, was tested with different simple test problems and its predictions were compared to the corresponding ones obtained when the standard k - ε model was used. (author)
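
    For reference, the high-Reynolds-number k - ε transport equations and the extra RNG term usually quoted in the literature (following Yakhot and Orszag) take the form below; the constants and the exact implementation in GOTHIC may differ from this textbook statement.

        \frac{\partial(\rho k)}{\partial t} + \nabla\cdot(\rho k \mathbf{u}) =
          \nabla\cdot\!\Big[\Big(\mu + \frac{\mu_t}{\sigma_k}\Big)\nabla k\Big] + P_k - \rho\varepsilon

        \frac{\partial(\rho \varepsilon)}{\partial t} + \nabla\cdot(\rho \varepsilon \mathbf{u}) =
          \nabla\cdot\!\Big[\Big(\mu + \frac{\mu_t}{\sigma_\varepsilon}\Big)\nabla \varepsilon\Big]
          + C_{1\varepsilon}\,\frac{\varepsilon}{k}\, P_k - C_{2\varepsilon}^{*}\,\rho\,\frac{\varepsilon^{2}}{k}

    with \mu_t = \rho C_\mu k^2/\varepsilon and, in the RNG variant,

        C_{2\varepsilon}^{*} = C_{2\varepsilon} + \frac{C_\mu \eta^{3}(1 - \eta/\eta_0)}{1 + \beta\eta^{3}},
        \qquad \eta = S\,k/\varepsilon,

    so the main difference from the standard model is the strain-dependent modification of the dissipation term, together with RNG-derived values of the constants.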

  20. Glucocorticoid induced osteopenia in cancellous bone of sheep: validation of large animal model for spine fusion and biomaterial research.

    Science.gov (United States)

    Ding, Ming; Cheng, Liming; Bollen, Peter; Schwarz, Peter; Overgaard, Søren

    2010-02-15

    Glucocorticoid with low calcium and phosphorus intake induces osteopenia in cancellous bone of sheep. To validate a large animal model for spine fusion and biomaterial research. A variety of ovariectomized animals has been used to study osteoporosis. Most experimental spine fusions were based on normal animals, and there is a great need for suitable large animal models with adequate bone size that closely resemble osteoporosis in humans. Eighteen skeletally mature female sheep were randomly allocated into 3 groups, 6 each. Group 1 (GC-1) received prednisolone (GC) treatment (0.60 mg/kg/day, 5 times weekly) for 7 months. Group 2 (GC-2) received the same treatment as GC-1 for 7 months followed by 3 months without treatment. Group 3 was left untreated and served as the controls. All sheep received a restricted diet low in calcium and phosphorus during the experiment. After killing the animals, cancellous bone specimens from the vertebra, femurs, and tibias were micro-CT scanned and tested mechanically. Serum biomarkers were determined. In the lumbar vertebrae, GC treatment resulted in a significant decrease in cancellous bone volume fraction, trabecular thickness, and bone strength. However, the microarchitecture and bone strength of GC-2 recovered to a level similar to that of the controls. A similar trend of microarchitectural changes was also observed in the distal femur and proximal tibia of both GC-treated groups. The bone formation marker serum osteocalcin was largely reduced in GC-1 compared to the controls, but recovered with a rebound increase at month 10 in GC-2. The current investigation demonstrates that the changes in microarchitecture and mechanical properties were comparable with those observed in humans after long-term GC treatment. Prolonged GC treatment is needed to maintain the osteopenic bone during long-term observation. This model resembles a long-term glucocorticoid-treated osteoporosis model and is useful in preclinical studies.

  1. Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling (Final Report)

    Energy Technology Data Exchange (ETDEWEB)

    William J. Schroeder

    2011-11-13

    This report contains the comprehensive summary of the work performed on the SBIR Phase II, Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling at Kitware Inc. in collaboration with Stanford Linear Accelerator Center (SLAC). The goal of the work was to develop collaborative visualization tools for large-scale data as illustrated in the figure below. The solutions we proposed address the typical problems faced by geographically- and organizationally-separated research and engineering teams, who produce large data (either through simulation or experimental measurement) and wish to work together to analyze and understand their data. Because the data is large, we expect that it cannot be easily transported to each team member's work site, and that the visualization server must reside near the data. Further, we also expect that each work site has heterogeneous resources: some with large computing clients, tiled (or large) displays and high bandwidth; other sites as simple as a team member on a laptop computer. Our solution is based on the open-source, widely used ParaView large-data visualization application. We extended this tool to support multiple collaborative clients who may locally visualize data, and then periodically rejoin and synchronize with the group to discuss their findings. Options for managing session control, adding annotation, and defining the visualization pipeline, among others, were incorporated. We also developed and deployed a Web visualization framework based on ParaView that enables the Web browser to act as a participating client in a collaborative session. The ParaView Web Visualization framework leverages various Web technologies including WebGL, JavaScript, Java and Flash to enable interactive 3D visualization over the web using ParaView as the visualization server. We steered the development of this technology by teaming with the SLAC National Accelerator Laboratory. SLAC has a computationally

  2. Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling (Final Report)

    International Nuclear Information System (INIS)

    Schroeder, William J.

    2011-01-01

    This report contains the comprehensive summary of the work performed on the SBIR Phase II, Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling at Kitware Inc. in collaboration with Stanford Linear Accelerator Center (SLAC). The goal of the work was to develop collaborative visualization tools for large-scale data as illustrated in the figure below. The solutions we proposed address the typical problems faced by geographically- and organizationally-separated research and engineering teams, who produce large data (either through simulation or experimental measurement) and wish to work together to analyze and understand their data. Because the data is large, we expect that it cannot be easily transported to each team member's work site, and that the visualization server must reside near the data. Further, we also expect that each work site has heterogeneous resources: some with large computing clients, tiled (or large) displays and high bandwidth; other sites as simple as a team member on a laptop computer. Our solution is based on the open-source, widely used ParaView large-data visualization application. We extended this tool to support multiple collaborative clients who may locally visualize data, and then periodically rejoin and synchronize with the group to discuss their findings. Options for managing session control, adding annotation, and defining the visualization pipeline, among others, were incorporated. We also developed and deployed a Web visualization framework based on ParaView that enables the Web browser to act as a participating client in a collaborative session. The ParaView Web Visualization framework leverages various Web technologies including WebGL, JavaScript, Java and Flash to enable interactive 3D visualization over the web using ParaView as the visualization server. We steered the development of this technology by teaming with the SLAC National Accelerator Laboratory. SLAC has a computationally-intensive problem

  3. On Applications of Rasch Models in International Comparative Large-Scale Assessments: A Historical Review

    Science.gov (United States)

    Wendt, Heike; Bos, Wilfried; Goy, Martin

    2011-01-01

    Several current international comparative large-scale assessments of educational achievement (ICLSA) make use of "Rasch models", to address functions essential for valid cross-cultural comparisons. From a historical perspective, ICLSA and Georg Rasch's "models for measurement" emerged at about the same time, half a century ago. However, the…

  4. A simple atmospheric boundary layer model applied to large eddy simulations of wind turbine wakes

    DEFF Research Database (Denmark)

    Troldborg, Niels; Sørensen, Jens Nørkær; Mikkelsen, Robert Flemming

    2014-01-01

    A simple model for including the influence of the atmospheric boundary layer in connection with large eddy simulations of wind turbine wakes is presented and validated by comparing computed results with measurements as well as with direct numerical simulations. The model is based on an immersed...

  5. Induction of continuous expanding infrarenal aortic aneurysms in a large porcine animal model

    DEFF Research Database (Denmark)

    Kloster, Brian Ozeraitis; Lund, Lars; Lindholt, Jes S.

    2015-01-01

    Background: A large animal model with a continuous expanding infrarenal aortic aneurysm gives access to a more realistic AAA model with anatomy and physiology similar to humans, and thus allows for new experimental research in the natural history and treatment options of the disease. Methods: 10 pigs

  6. Extrapolated renormalization group calculation of the surface tension in square-lattice Ising model

    International Nuclear Information System (INIS)

    Curado, E.M.F.; Tsallis, C.; Levy, S.V.F.; Oliveira, M.J. de

    1980-06-01

    By using self-dual clusters (whose sizes are characterized by the numbers b=2, 3, 4, 5) within a real space renormalization group framework, the longitudinal surface tension of the square-lattice first-neighbour 1/2-spin ferromagnetic Ising model is calculated. The exact critical temperature T sub(c) is recovered for any value of b; the exact asymptotic behaviour of the surface tension in the limit of low temperatures is analytically recovered; the approximate correlation length critical exponents monotonically tend towards the exact value ν=1 (which, at two dimensions, coincides with the surface tension critical exponent μ) for increasingly large cells; the same behaviour is observed for the approximate values of the surface tension amplitude in the limit T→T sub(c). Four different numerical procedures are developed for extrapolating the renormalization group results for the surface tension to b→∞, and quite satisfactory agreement is obtained with Onsager's exact expression (the error varying from zero to a few percent over the whole temperature domain). Furthermore, the set of RG surface tensions is compared with a set of biased surface tensions (associated with appropriate misfit seams), and only fortuitous coincidence is found among them. (Author) [pt
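
    The extrapolation step described above (taking finite-cell RG estimates to the b→∞ limit) can be mimicked with a simple power-law fit of the form σ(b) = σ_∞ + a·b^(−x). The sketch below uses invented values of σ(b); the actual RG surface tensions and the four extrapolation procedures are those of the paper, not this generic fit.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical finite-cell estimates sigma(b) at one temperature (made-up numbers)
        b = np.array([2.0, 3.0, 4.0, 5.0])
        sigma_b = np.array([0.92, 0.95, 0.965, 0.973])

        def finite_size_form(b, sigma_inf, a, x):
            """sigma(b) = sigma_inf + a * b**(-x): approach to the infinite-cell limit."""
            return sigma_inf + a * b ** (-x)

        params, _ = curve_fit(finite_size_form, b, sigma_b, p0=(1.0, -0.3, 1.0))
        print("extrapolated sigma(b -> infinity):", params[0])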

  7. Item Construction Using Reflective, Formative, or Rasch Measurement Models: Implications for Group Work

    Science.gov (United States)

    Peterson, Christina Hamme; Gischlar, Karen L.; Peterson, N. Andrew

    2017-01-01

    Measures that accurately capture the phenomenon are critical to research and practice in group work. The vast majority of group-related measures were developed using the reflective measurement model rooted in classical test theory (CTT). Depending on the construct definition and the measure's purpose, the reflective model may not always be the…
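
    For readers unfamiliar with the third framework named in the title, the dichotomous Rasch model expresses the probability that person p endorses (or correctly answers) item i in terms of a person parameter θ_p and an item parameter δ_i; this is the standard statement of the model, added here only as background to the comparison in the abstract.

        P(X_{pi} = 1 \mid \theta_p, \delta_i) = \frac{\exp(\theta_p - \delta_i)}{1 + \exp(\theta_p - \delta_i)}

    Under this model, persons and items are placed on a common logit scale, which is part of what distinguishes it from reflective (CTT-based) and formative measurement models when constructing group-work measures.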

  8. Reviewing the Role of Stakeholders in Operational Research: Opportunities for Group Model Building

    NARCIS (Netherlands)

    Gooyert, V. de; Rouwette, E.A.J.A.; Kranenburg, H.L. van

    2013-01-01

    Stakeholders have always received much attention in system dynamics, especially in the group model building tradition, which emphasizes the deep involvement of a client group in building a system dynamics model. In organizations, stakeholders are gaining more and more attention by managers who try

  9. Examination of a Group Counseling Model of Career Decision Making with College Students

    Science.gov (United States)

    Rowell, P. Clay; Mobley, A. Keith; Kemer, Gulsah; Giordano, Amanda

    2014-01-01

    The authors examined the effectiveness of a group career counseling model (Pyle, K. R., 2007) on college students' career decision-making abilities. They used a Solomon 4-group design and found that students who participated in the career counseling groups had significantly greater increases in career decision-making abilities than those who…

  10. Brief heterogeneous inpatient psychotherapy groups: a process-oriented psychoeducational (POP) model.

    Science.gov (United States)

    Cook, Wesley G; Arechiga, Adam; Dobson, Leslie Ann V; Boyd, Kenny

    2014-04-01

    In the United States, there is currently an increase in admissions to psychiatric hospitals, diagnostic heterogeneity, briefer stays, and a lack of inpatient research. Most traditional group therapy models are constructed for longer-term homogeneous patients. Diagnostically homogeneous groups even outperform heterogeneous groups. However, changes in health care have created a clinical dilemma in psychiatric hospitals where groups have become characterized by brief duration, rapid turnover, and diagnostic heterogeneity. A literature review offered little in the way of treatment recommendations, let alone a model or empirical basis, for facilitating these types of groups. Common factors from group therapy studies were extracted. Based on an integration of these studies, a process-oriented psychoeducational (POP) treatment model is recommended. This model is theoretically constructed and outlined for future study.

  11. Threshold dose for peripheral neuropathy following intraoperative radiotherapy (IORT) in a large animal model

    International Nuclear Information System (INIS)

    Kinsella, T.J.; DeLuca, A.M.; Barnes, M.; Anderson, W.; Terrill, R.; Sindelar, W.F.

    1991-01-01

    Radiation injury to peripheral nerve is a dose-limiting toxicity in the clinical application of intraoperative radiotherapy, particularly for pelvic and retroperitoneal tumors. Intraoperative radiotherapy-related peripheral neuropathy in humans receiving doses of 20-25 Gy is manifested as a mixed motor-sensory deficit beginning 6-9 months following treatment. In a previous experimental study of intraoperative radiotherapy-related neuropathy of the lumbro-sacral plexus, an approximate inverse linear relationship was reported between the intraoperative dose (20-75 Gy range) and the time to onset of hind limb paresis (1-12 mos following intraoperative radiotherapy). The principal histological lesion in irradiated nerve was loss of large nerve fibers and perineural fibrosis without significant vascular injury. Similar histological changes in irradiated nerves were found in humans. To assess peripheral nerve injury to lower doses of intraoperative radiotherapy in this same large animal model, groups of four adult American Foxhounds received doses of 10, 15, or 20 Gy to the right lumbro-sacral plexus and sciatic nerve using 9 MeV electrons. The left lumbro-sacral plexus and sciatic nerve were excluded from the intraoperative field to allow each animal to serve as its own control. Following treatment, a complete neurological exam, electromyogram, and nerve conduction studies were performed monthly for 1 year. Monthly neurological exams were performed in years 2 and 3 whereas electromyogram and nerve conduction studies were performed every 3 months during this follow-up period. With follow-up of greater than or equal to 42 months, no dog receiving 10 or 15 Gy IORT shows any clinical or laboratory evidence of peripheral nerve injury. However, all four dogs receiving 20 Gy developed right hind limb paresis at 8, 9, 9, and 12 mos following intraoperative radiotherapy

  12. A process-oriented group model for university students: a semi-structured approach.

    Science.gov (United States)

    Johnson, Chad V

    2009-10-01

    University students present several challenges for group therapists in terms of establishing and sustaining interpersonal process groups in college counseling center settings. These challenges may result from a lack of client preparation and/or a mismatch of therapy practices with the unique developmental characteristics of today's college students. This paper discusses these developmental needs and proposes a model for successful interpersonal group therapy with university students. The proposed model encourages structured activities at the initial and final stages of a process-oriented therapy group to assist leaders and teach the members skills to promote cohesion, skill development, and interpersonal learning. This model may also be used to train novice group counselors how to facilitate here-and-now interactions in group and shape group process.

  13. Control-Oriented Model of Molar Scavenge Oxygen Fraction for Exhaust Recirculation in Large Diesel Engines

    DEFF Research Database (Denmark)

    Nielsen, Kræn Vodder; Blanke, Mogens; Eriksson, Lars

    2016-01-01

    Exhaust gas recirculation (EGR) systems have been introduced to large marine engines in order to reduce NOx formation. Adequate modelling for control design is one of the bottlenecks to design EGR control that also meets emission requirements during transient loading conditions. This paper therefore focuses on deriving and validating a mean-value model of a large two-stroke crosshead diesel engine with EGR. The model introduces a number of amendments and extensions to previous, complex models and shows in theory and practice that a simplified nonlinear model captures all essential dynamics... the behavior of the scavenge oxygen fraction well over the entire envelope of load and blower speed range that is relevant for EGR. The simplicity of the new model makes it suitable for observer and control design, which are essential steps to meet the emission requirements for marine diesel engines that take...
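
    As a rough illustration of what a mean-value scavenge oxygen model looks like, the sketch below integrates a single mixing-volume oxygen balance in which intake air and recirculated exhaust gas mix in the scavenge receiver. All parameter values, the constant-flow assumption, and the lumped-volume simplification are invented for the example and do not come from the paper.

        # Mixing-volume oxygen balance for a scavenge receiver (illustrative only):
        #   d(O_scav)/dt = ( m_air*(O_amb - O_scav) + m_egr*(O_egr - O_scav) ) / m_scav
        O_amb, O_egr = 0.2095, 0.14      # ambient and (assumed) recirculated-gas O2 fractions
        m_air, m_egr = 8.0, 2.0          # assumed mass flows into the receiver [kg/s]
        m_scav = 50.0                    # assumed gas mass held in the receiver [kg]
        O_scav, dt = 0.2095, 0.05        # initial condition and time step [s]

        for step in range(int(60 / dt)):                 # simulate one minute
            dOdt = (m_air * (O_amb - O_scav) + m_egr * (O_egr - O_scav)) / m_scav
            O_scav += dt * dOdt
        print("steady scavenge O2 fraction:", round(O_scav, 4))
        # With constant flows the steady state is the flow-weighted mean of the inlet fractions:
        print("flow-weighted mean:", round((m_air * O_amb + m_egr * O_egr) / (m_air + m_egr), 4))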

  14. Geochemical stratigraphy and correlation within large igneous provinces: The final preserved stages of the Faroe Islands Basalt Group

    Science.gov (United States)

    Millett, J. M.; Hole, M. J.; Jolley, D. W.; Passey, S. R.

    2017-08-01

    The Faroe Islands Basalt Group (FIBG) comprises a gross stratigraphic thickness of over 6.5 km of dominantly extrusive basaltic facies erupted during the Late Palaeocene to Early Eocene. In this study we present 140 major and trace element analyses from flow-by-flow field and borehole sample profiles through the Enni Formation, which comprises the final phase of volcanism preserved on the Faroe Islands. The sample profiles target geographically spaced and overlapping stratigraphic sequences tied relative to a 3D ArcGIS surface for the regionally extensive volcaniclastic Argir Beds marker unit. From these profiles, five geochemical groups, comprising one low-TiO2 group (Low-Ti, < 1.5 wt%) and four high-TiO2 groups (High-Ti, > 1.5 wt%) differentiated by Nb, Zr, Y and V variations, are identified in conjunction with previous studies. The spatial and stratigraphic distribution of these groups is mapped across the islands and demonstrates a complex inter-digitated flow field evolution. Within the finer scale variations, broad spatial and temporal development trends are identified, demonstrating the potential for correlation within the volcanic succession at the local, tens of kilometers scale. Low-Ti lavas formed in association with lithospheric thinning and developed extensive flow fields between the Faroe Islands and East Greenland contemporaneous with the eruption of High-Ti smaller melt fraction lava flows in both locations. The progression of High-Ti lava groups preserved on either side of the developing rift zone is very similar, but is not, however, chronostratigraphic due to multiple inter-digitations of the chemical types. We tentatively suggest that a previously proposed rift-oblique transfer zone between the Faroe Islands and East Greenland enabled non-uniform lithospheric thinning and the preservation of a near-continuous High-Ti melting region between these areas beyond the onset of Low-Ti eruptions which were initially fed from the west. This study highlights the complex nature of late stage flood basalt

  15. The development and use of a molecular model for soybean maturity groups.

    Science.gov (United States)

    Langewisch, Tiffany; Lenis, Julian; Jiang, Guo-Liang; Wang, Dechun; Pantalone, Vince; Bilyeu, Kristin

    2017-05-30

    Achieving appropriate maturity in a target environment is essential to maximizing crop yield potential. In soybean [Glycine max (L.) Merr.], the time to maturity is largely dependent on developmental response to dark periods. Once the critical photoperiod is reached, flowering is initiated and reproductive development proceeds. Therefore, soybean adaptation has been attributed to genetic changes and natural or artificial selection to optimize plant development in specific, narrow latitudinal ranges. In North America, these regions have been classified into twelve maturity groups (MG), with lower MG being shorter season than higher MG. Growing soybean lines not adapted to a particular environment typically results in poor growth and significant yield reductions. The objective of this study was to develop a molecular model for soybean maturity based on the alleles underlying the major maturity loci: E1, E2, and E3. We determined the allelic variation and diversity of the E maturity genes in a large collection of soybean landraces, North American ancestors, Chinese cultivars, North American cultivars or expired Plant Variety Protection lines, and private-company lines. The E gene status of accessions in the USDA Soybean Germplasm Collection with SoySNP50K Beadchip data was also predicted. We determined the E allelic combinations needed to adapt soybean to different MGs in the United States (US) and discovered a strong signal of selection for E genotypes released in North America, particularly the US and Canada. The E gene maturity model proposed will enable plant breeders to more effectively transfer traits into different MGs and increase the overall efficiency of soybean breeding in the US and Canada. The powerful yet simple selection strategy for increasing soybean breeding efficiency can be used alone or to directly enhance genomic prediction/selection schemes. The results also revealed previously unrecognized aspects of artificial selection in soybean imposed by

  16. Importance of hemodialysis-related outcomes: comparison of ratings by a self-help group, clinicians, and health technology assessment authors with those by a large reference group of patients.

    Science.gov (United States)

    Janssen, Inger M; Scheibler, Fueloep; Gerhardus, Ansgar

    2016-01-01

    The selection of important outcomes is a crucial decision for clinical research and health technology assessment (HTA), and there is ongoing debate about which stakeholders should be involved. Hemodialysis is a complex treatment for chronic kidney disease (CKD) and affects many outcomes. Apart from obvious outcomes, such as mortality, morbidity and health-related quality of life (HRQoL), others, such as those concerning daily living or health care provision, may also be important. The aim of our study was to analyze to what extent the preferences for patient-relevant outcomes differed between various stakeholders. We compared preferences of stakeholders normally or occasionally involved in outcome prioritization (patients from a self-help group, clinicians and HTA authors) with those of a large reference group of patients. The reference group consisted of 4,518 CKD patients investigated previously. We additionally recruited CKD patients via a regional self-help group, nephrologists via an online search and HTA authors via an expert database or personal contacts. All groups assessed the relative importance of the 23 outcomes by means of a discrete visual analog scale. We used descriptive statistics to rank outcomes and compare the results between groups. We received completed questionnaires from 49 self-help group patients, 19 nephrologists and 18 HTA authors. Only the following 3 outcomes were ranked within the top 7 outcomes by all 4 groups: safety, HRQoL and emotional state. The ratings by the self-help group were generally more concordant with the reference group ratings than those by nephrologists, while HTA authors showed the least concordance. Preferences of CKD patients from a self-help group, nephrologists and HTA authors differ to a varying extent from those of a large reference group of patients with CKD. The preferences of all stakeholders should form the basis of a transparent approach so as to generate a valid list of important outcomes.

  17. Analogue scale modelling of extensional tectonic processes using a large state-of-the-art centrifuge

    Science.gov (United States)

    Park, Heon-Joon; Lee, Changyeol

    2017-04-01

    Analogue scale modelling of extensional tectonic processes such as rifting and basin opening has been conducted extensively. Among the controlling factors, gravitational acceleration (g) on the scale models was treated as a constant (Earth's gravity) in most analogue model studies, and only a few studies considered larger gravitational acceleration by using a centrifuge (an apparatus generating large centrifugal force by rotating the model at high speed). Although analogue models using a centrifuge allow large scale-down and accelerated deformation driven by density differences, such as salt diapirism, the possible model size is mostly limited to about 10 cm. A state-of-the-art centrifuge installed at the KOCED Geotechnical Centrifuge Testing Center, Korea Advanced Institute of Science and Technology (KAIST), allows a large surface area of the scale models, up to 70 by 70 cm, under the maximum capacity of 240 g-tons. Using the centrifuge, we will conduct analogue scale modelling of extensional tectonic processes such as the opening of back-arc basins. Acknowledgement: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (grant number 2014R1A6A3A04056405).
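
    The reason a centrifuge permits such aggressive scale-down is the standard geotechnical scaling argument: if lengths are reduced by a factor N and the model is spun so that it experiences an acceleration of N·g, the self-weight stresses in the model match those at the corresponding prototype depth,

        \sigma_{\text{model}} = \rho\,(N g)\!\left(\frac{h}{N}\right) = \rho\, g\, h = \sigma_{\text{prototype}},

    so density-driven processes such as diapirism or crustal extension develop under prototype-like stress conditions despite the small model size. This is the generic scaling relation, not a statement about the specific KOCED experiments.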

  18. Optimizing Prediction Using Bayesian Model Averaging: Examples Using Large-Scale Educational Assessments.

    Science.gov (United States)

    Kaplan, David; Lee, Chansoon

    2018-01-01

    This article provides a review of Bayesian model averaging as a means of optimizing the predictive performance of common statistical models applied to large-scale educational assessments. The Bayesian framework recognizes that in addition to parameter uncertainty, there is uncertainty in the choice of models themselves. A Bayesian approach to addressing the problem of model uncertainty is the method of Bayesian model averaging. Bayesian model averaging searches the space of possible models for a set of submodels that satisfy certain scientific principles and then averages the coefficients across these submodels weighted by each model's posterior model probability (PMP). Using the weighted coefficients for prediction has been shown to yield optimal predictive performance according to certain scoring rules. We demonstrate the utility of Bayesian model averaging for prediction in education research with three examples: Bayesian regression analysis, Bayesian logistic regression, and a recently developed approach for Bayesian structural equation modeling. In each case, the model-averaged estimates are shown to yield better prediction of the outcome of interest than any submodel based on predictive coverage and the log-score rule. Implications for the design of large-scale assessments when the goal is optimal prediction in a policy context are discussed.
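
    A common way to approximate the posterior model probabilities used in Bayesian model averaging is through BIC weights; the sketch below averages the predictions of a few candidate linear regressions in this way. It uses simulated data and the BIC approximation, so it is only a schematic of the approach reviewed in the article, not its implementation.

        import numpy as np
        from itertools import combinations

        rng = np.random.default_rng(1)
        n = 200
        X = rng.standard_normal((n, 3))                                      # three candidate predictors
        y = 2.0 + 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.standard_normal(n)     # third predictor is irrelevant

        def fit(cols):
            """OLS fit on an intercept plus the selected columns; return columns, coefficients, BIC."""
            Z = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            rss = np.sum((y - Z @ beta) ** 2)
            bic = n * np.log(rss / n) + Z.shape[1] * np.log(n)
            return cols, beta, bic

        # Candidate models: every non-empty subset of the predictors
        models = [fit(c) for r in (1, 2, 3) for c in combinations(range(3), r)]

        # Posterior model probabilities approximated by BIC weights
        bics = np.array([m[2] for m in models])
        weights = np.exp(-0.5 * (bics - bics.min()))
        weights /= weights.sum()

        # Model-averaged prediction at a new point
        x_new = np.array([0.5, -1.0, 0.2])
        preds = np.array([beta[0] + np.dot(beta[1:], x_new[list(cols)]) for cols, beta, _ in models])
        print("PMP-weighted prediction:", float(weights @ preds))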

  19. The impact of the Art Therapy Large Group, an educational tool in the training of art therapists, on post-qualification professional practice

    OpenAIRE

    Skaife, Sally; Jones, Kevin; Pentaris, Panagiotis

    2016-01-01

    This article reports the findings of a Likert scale survey that was sent to past graduates of the MA Art Psychotherapy, Goldsmiths, University of London asking them about the relevance of their experience in the Art Therapy Large Group (ATLG) to their subsequent employment as art therapists or work in another capacity. The ATLG comprises all the students and staff in a psychodynamically based experiential group that meets six times during the year. Survey questions were drawn from previously ...

  20. Large-scale hydrology in Europe : observed patterns and model performance

    Energy Technology Data Exchange (ETDEWEB)

    Gudmundsson, Lukas

    2011-06-15

    In a changing climate, terrestrial water storages are of great interest as water availability impacts key aspects of ecosystem functioning. Thus, a better understanding of the variations of wet and dry periods will help to fully grasp processes of the earth system such as nutrient cycling and vegetation dynamics. Currently, river runoff from small, nearly natural, catchments is one of the few variables of the terrestrial water balance that is regularly monitored with detailed spatial and temporal coverage on large scales. River runoff, therefore, provides a foundation for approaching European hydrology in terms of observed large-scale patterns and the ability of models to capture them. The analysis of observed river flow from small catchments focused on the identification and description of spatial patterns of simultaneous temporal variations of runoff. These are dominated by large-scale variations of climatic variables but also altered by catchment processes. It was shown that time series of annual low, mean and high flows follow the same atmospheric drivers. The observation that high flows are more closely coupled to large-scale atmospheric drivers than low flows indicates the increasing influence of catchment properties on runoff under dry conditions. Further, it was shown that the low-frequency variability of European runoff is dominated by two opposing centres of simultaneous variations, such that dry years in the north are accompanied by wet years in the south. Large-scale hydrological models are simplified representations of our current perception of the terrestrial water balance on large scales. Quantification of the models' strengths and weaknesses is a prerequisite for a reliable interpretation of simulation results. Model evaluations may also reveal shortcomings in model assumptions and thus enable a refinement of the current perception of hydrological systems. The ability of a multi-model ensemble of nine large