Fast and Rigorous Assignment Algorithm Multiple Preference and Calculation
Directory of Open Access Journals (Sweden)
Ümit Çiftçi
2010-03-01
Full Text Available The goal of this paper is to develop an algorithm that evaluates students and then places them in their desired choices according to dependent preferences. The developed algorithm was also used to implement software. The success and accuracy of both the software and the algorithm were tested by applying them to the ability test at Beykent University. This ability test is repeated several times each academic year in order to fill all available places in the Fine Art Faculty departments. Application during the 2008-2009 and 2009-2010 academic years has shown that this algorithm is fast and rigorous. Key Words: Assignment algorithm, student placement, ability test
Abd El–Naser A. Mohammed; Ahmed Nabih Zaki Rashed; Osama S. Fragallah; Mohamed G. El-Abyad
2013-01-01
In simple wavelength-division multiplexed (WDM) networks, a connection must be established along a route using a common wavelength on all of the links along the route. The introduction of wavelength converters into WDM cross connects increases the hardware cost and complexity. Given a set of connection requests, the routing and wavelength assignment problem involves finding a route (routing) and assigning a wavelength to each request. This paper has presented the WDM technology is being exten...
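The wavelength-continuity constraint described above (without converters, one common wavelength must be free on every link of the route) can be illustrated with a first-fit heuristic, a standard baseline for the assignment half of the problem. This is a generic sketch, not the paper's method; the function name and data layout are assumptions for illustration.

```python
def first_fit_wavelength(route_links, used, n_wavelengths):
    """First-fit wavelength assignment under the continuity constraint:
    the chosen wavelength must be free on *every* link of the route.
    Returns the lowest free wavelength index, or None if blocked."""
    for w in range(n_wavelengths):
        if all(w not in used.get(link, set()) for link in route_links):
            return w
    return None

# Route A-B-C: wavelength 0 is busy on link A-B, wavelength 1 on B-C.
usage = {('A', 'B'): {0}, ('B', 'C'): {1}}
w = first_fit_wavelength([('A', 'B'), ('B', 'C')], usage, n_wavelengths=4)
# -> 2 (the lowest index free on both links)
```

If no wavelength is free on all links, the request is blocked; wavelength converters relax exactly this constraint, at the hardware cost the abstract mentions.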
SCRAED - Simple and Complex Random Assignment in Experimental Designs
Alferes, Valentim R.
2009-01-01
SCRAED is a package of 37 self-contained SPSS syntax files that performs simple and complex random assignment in experimental designs. For between-subjects designs, SCRAED includes simple random assignment (no restrictions, forced equal sizes, forced unequal sizes, and unequal probabilities), block random assignment (simple and generalized blocks), and stratified random assignment (no restrictions, forced equal sizes, forced unequal sizes, and unequal probabilities). For within-subject...
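As an illustration of one scheme in the list above, here is a minimal sketch of simple random assignment with forced equal group sizes. SCRAED itself is SPSS syntax; this Python stand-in is only illustrative and assumes the number of participants is divisible by the number of groups.

```python
import random

def assign_forced_equal(participants, n_groups, seed=None):
    """Simple random assignment with forced equal group sizes:
    shuffle the full participant list, then deal it round-robin.
    Assumes len(participants) is divisible by n_groups."""
    rng = random.Random(seed)
    pool = list(participants)
    rng.shuffle(pool)
    return {g: pool[g::n_groups] for g in range(n_groups)}

groups = assign_forced_equal(range(12), 3, seed=42)  # three groups of 4
```

The "no restrictions" variant would instead draw each participant's group independently, which balances sizes only in expectation.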
A Computerized Approach to Trickle-Process, Random Assignment.
Braucht, G. Nicholas; Reichardt, Charles S.
1993-01-01
Procedures for implementing random assignment with trickle processing and ways they can be corrupted are described. A computerized method for implementing random assignment with trickle processing is presented as a desirable alternative in many situations and a way of protecting against threats to assignment validity. (SLD)
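Trickle processing means participants arrive and are assigned one at a time rather than in a single batch. A common safeguard in that setting (this sketch is generic, not the article's specific procedure) is to pre-generate a permuted-block sequence and consume it in arrival order, so staff cannot anticipate or manipulate upcoming assignments:

```python
import random

def permuted_block_sequence(n_blocks, conditions, block_reps, seed=None):
    """Pre-generate an assignment sequence as shuffled blocks.
    Each block holds `block_reps` copies of every condition, so group
    sizes stay balanced after every completed block."""
    rng = random.Random(seed)
    seq = []
    for _ in range(n_blocks):
        block = list(conditions) * block_reps
        rng.shuffle(block)
        seq.extend(block)
    return seq

seq = permuted_block_sequence(5, ["treatment", "control"], 2, seed=1)
first_arrival = seq[0]  # assignments are consumed in arrival order
```

Keeping the pre-generated sequence under the control of a computer (rather than field staff) is precisely the protection against assignment corruption that the abstract describes.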
Rigorously testing multialternative decision field theory against random utility models.
Berkowitsch, Nicolas A J; Scheibehenne, Benjamin; Rieskamp, Jörg
2014-06-01
Cognitive models of decision making aim to explain the process underlying observed choices. Here, we test a sequential sampling model of decision making, multialternative decision field theory (MDFT; Roe, Busemeyer, & Townsend, 2001), on empirical grounds and compare it against 2 established random utility models of choice: the probit and the logit model. Using a within-subject experimental design, participants in 2 studies repeatedly choose among sets of options (consumer products) described on several attributes. The results of Study 1 showed that all models predicted participants' choices equally well. In Study 2, in which the choice sets were explicitly designed to distinguish the models, MDFT had an advantage in predicting the observed choices. Study 2 further revealed the occurrence of multiple context effects within single participants, indicating an interdependent evaluation of choice options and correlations between different context effects. In sum, the results indicate that sequential sampling models can provide relevant insights into the cognitive process underlying preferential choices and thus can lead to better choice predictions. PsycINFO Database Record (c) 2014 APA, all rights reserved.
Random Assignment: Practical Considerations from Field Experiments.
Dunford, Franklyn W.
1990-01-01
Seven qualitative issues associated with randomization that have the potential to weaken or destroy otherwise sound experimental designs are reviewed and illustrated via actual field experiments. Issue areas include ethics and legality, liability risks, manipulation of randomized outcomes, hidden bias, design intrusiveness, case flow, and…
Randomized Assignments for Barter Exchanges: Fairness vs Efficiency
DEFF Research Database (Denmark)
Fang, Wenyi; Filos-Ratsikas, Aris; Frederiksen, Søren Kristoffer Stiil
2015-01-01
We study fairness and efficiency properties of randomized algorithms for barter exchanges with direct applications to kidney exchange problems. It is well documented that randomization can serve as a tool to ensure fairness among participants. However, in many applications, practical constraints...
Pettigrew, Jonathan; Miller-Day, Michelle; Krieger, Janice L.; Zhou, Jiangxiu; Hecht, Michael L.
2014-01-01
Random assignment to groups is the foundation of scientifically rigorous clinical trials. But assignment is challenging in group-randomized trials when only a few units (schools) are assigned to each condition. In the DRSR project, we assigned 39 rural Pennsylvania and Ohio schools to three conditions (rural, classic, control). Even with 13 schools per condition, achieving pretest equivalence on important variables is not guaranteed. We collected data on six important school-level variables: rurality, number of grades in the school, enrollment per grade, percent white, percent receiving free/assisted lunch, and test scores. Key to our procedure was the inclusion of school-level drug use data, available for a subset of the schools; also key was that we handled the partial data with modern missing-data techniques. We chose to create one composite stratifying variable based on the seven school-level variables available. Principal components analysis of the seven variables yielded two factors, which were averaged to form the composite inflate-suppress (CIS) score, the basis of stratification. The CIS score was broken into three strata within each state; schools were assigned at random to the three program conditions from within each stratum, within each state. Results showed that program group membership was unrelated to the CIS score, the two factors making up the CIS score, and the seven items making up the factors. Program group membership was also not significantly related to pretest measures of drug use (alcohol, cigarettes, marijuana, chewing tobacco; smallest p > .15), verifying that pretest equivalence was achieved. PMID:23722619
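The stratified procedure described above (composite score, strata, random assignment within each stratum) can be sketched generically. In this sketch the composite is simply a supplied score rather than a PCA-based CIS score, and the within-stratum assignment is an illustrative simplification:

```python
import random

def stratified_assign(units, scores, conditions, n_strata=3, seed=None):
    """Rank units by a composite score, cut the ranking into
    equal-sized strata, and randomly assign conditions within
    each stratum (one unit per condition per pass)."""
    rng = random.Random(seed)
    ranked = sorted(units, key=lambda u: scores[u])
    size = -(-len(ranked) // n_strata)  # ceiling division
    assignment = {}
    for s in range(n_strata):
        stratum = ranked[s * size:(s + 1) * size]
        rng.shuffle(stratum)
        for i, u in enumerate(stratum):
            assignment[u] = conditions[i % len(conditions)]
    return assignment

scores = {u: u * 0.5 for u in range(9)}  # toy composite scores
out = stratified_assign(range(9), scores,
                        ["rural", "classic", "control"], seed=7)
```

Because each stratum contributes one school to each condition, the conditions end up balanced on the stratifying score by construction, which is the equivalence property the abstract verifies empirically.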
Random Schroedinger operators and the theory of disordered systems: some rigorous results
International Nuclear Information System (INIS)
Kunz, H.; Souillard, B.
1981-01-01
The authors report results on a class of finite difference Schroedinger operators with stochastic potentials. The Hamiltonian is then H(V)=-Δ+V; where Δ is the discretized Laplacian and the potential V acts as a multiplication operator. The potential V is random. (Auth.)
Pólya number and first return of bursty random walk: Rigorous solutions
Wan, J.; Xu, X. P.
2012-03-01
The recurrence properties of random walks can be characterized by the Pólya number, i.e., the probability that the walker returns to the origin at least once. In this paper, we investigate the Pólya number and first return properties of a bursty random walk on a line, in which the walk has different step sizes and moving probabilities. Using the concept of Catalan numbers, we obtain, for the first time, exact results for the first return probability, the average first return time, and the Pólya number. We show that the Pólya number displays two different functional behaviors when the walk deviates from the recurrent point. Utilizing the Lagrange inversion formula, we interpret our findings by transferring the Pólya number to the closed-form solutions of an inverse function. We also calculate the Pólya number using another approach, which corroborates our results and conclusions. Finally, we consider the recurrence properties and Pólya number of two variations of the bursty random walk model.
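The recurrence notion behind the Pólya number can be illustrated numerically: estimate the probability that a walk returns to the origin at least once within a finite horizon. This sketch uses a plain (non-bursty) 1D walk with a single bias parameter, so it is only a finite-horizon Monte Carlo proxy for the quantities the paper derives exactly:

```python
import random

def return_probability(p_right=0.5, steps=1000, trials=2000, seed=3):
    """Monte Carlo estimate of the probability that a 1D random walk
    returns to the origin at least once within `steps` steps
    (a finite-horizon proxy for the Polya number)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        pos = 0
        for _ in range(steps):
            pos += 1 if rng.random() < p_right else -1
            if pos == 0:
                hits += 1
                break
    return hits / trials

p_sym = return_probability(0.5)     # symmetric walk: recurrent, near 1
p_biased = return_probability(0.8)  # biased walk: transient, well below 1
```

For the symmetric walk the true Pólya number is 1 (the finite horizon shaves off a small tail), while for a biased walk it drops to 1 - |p - q|; the simulation reproduces that gap.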
Applications of random forest feature selection for fine-scale genetic population assignment.
Sylvester, Emma V A; Bentzen, Paul; Bradbury, Ian R; Clément, Marie; Pearce, Jon; Horne, John; Beiko, Robert G
2018-02-01
Genetic population assignment used to inform wildlife management and conservation efforts requires panels of highly informative genetic markers and sensitive assignment tests. We explored the utility of machine-learning algorithms (random forest, regularized random forest, and guided regularized random forest) compared with F_ST ranking for selection of single nucleotide polymorphisms (SNPs) for fine-scale population assignment. We applied these methods to an unpublished SNP data set for Atlantic salmon (Salmo salar) and a published SNP data set for Alaskan Chinook salmon (Oncorhynchus tshawytscha). In each species, we identified the minimum panel size required to obtain a self-assignment accuracy of at least 90%, using each method to create panels of 50-700 markers. Panels of SNPs identified using random-forest-based methods performed up to 7.8 and 11.2 percentage points better than F_ST-selected panels of similar size for the Atlantic salmon and Chinook salmon data, respectively. Self-assignment accuracy ≥90% was obtained with panels of 670 and 384 SNPs for each data set, respectively, a level of accuracy never reached for these species using F_ST-selected panels. Our results demonstrate a role for machine-learning approaches in marker selection across large genomic data sets to improve assignment for management and conservation of exploited populations.
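The panel-selection workflow in the abstract (rank markers, build a panel, measure self-assignment accuracy) can be sketched on toy data. Note the ranking below uses a crude allele-frequency difference as a stand-in for random-forest importance or F_ST, and nearest-centroid assignment as a stand-in for the real assignment test; everything here is illustrative:

```python
import random

def rank_markers(genos, pops):
    """Rank marker indices by a crude differentiation score: the
    absolute allele-frequency difference between two populations
    (a stand-in for random-forest importance or F_ST)."""
    def freq(pop, m):
        vals = [g[m] for g, p in zip(genos, pops) if p == pop]
        return sum(vals) / len(vals)

    def score(m):
        return abs(freq(0, m) - freq(1, m))

    return sorted(range(len(genos[0])), key=score, reverse=True)

def self_assign_accuracy(genos, pops, panel):
    """Nearest-centroid self-assignment using only the panel's markers."""
    cent = {p: [sum(g[m] for g, q in zip(genos, pops) if q == p) /
                pops.count(p) for m in panel] for p in (0, 1)}
    correct = 0
    for g, p in zip(genos, pops):
        d = {q: sum((g[m] - c) ** 2 for m, c in zip(panel, cent[q]))
             for q in (0, 1)}
        correct += min(d, key=d.get) == p
    return correct / len(genos)

# Toy data: marker 0 is informative, markers 1-2 are pure noise.
rng = random.Random(0)
pops = [0] * 30 + [1] * 30
genos = [[(1 if rng.random() < (0.9 if p == 0 else 0.1) else 0),
          rng.randint(0, 1), rng.randint(0, 1)] for p in pops]
ranking = rank_markers(genos, pops)
acc = self_assign_accuracy(genos, pops, ranking[:1])
```

The informative marker tops the ranking, and even a one-marker panel assigns most individuals correctly; the paper's point is that random-forest-based rankings find such panels more efficiently than F_ST at genomic scale.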
Directory of Open Access Journals (Sweden)
Sorana D. BOLBOACĂ
2011-06-01
Full Text Available Aim: The properness of random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative structure-activity relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during leave-many-out analysis. Assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether the assignment of compounds was proper. Euclidean distance and maximization of the initial distance, using cross-validation with a v-fold of 10, were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging to both the training and the test sets. The observed activity of the carboquinone derivatives proved to be normally distributed within every cluster. The presence of training and test compounds in all clusters identified using generalized cluster analysis with the K-means algorithm, together with the distribution of observed activity within clusters, supports a proper assignment of compounds to training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
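The check described in the Results (every cluster should contain both training- and test-set members) can be sketched with a tiny one-dimensional K-means; the data and the single clustering variable are illustrative, not the carboquinone descriptors:

```python
def kmeans_1d(values, k, iters=50):
    """Minimal 1D k-means with spread-out initial centers;
    returns one cluster label per value."""
    srt = sorted(values)
    centers = [srt[i * (len(srt) - 1) // (k - 1)] for i in range(k)]
    labels = [0] * len(values)
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: abs(v - centers[c]))
                  for v in values]
        for c in range(k):
            members = [v for v, l in zip(values, labels) if l == c]
            if members:
                centers[c] = sum(members) / len(members)
    return labels

def clusters_mix_sets(labels, sets):
    """True if every cluster contains both 'train' and 'test' members,
    i.e. the random split does not coincide with cluster structure."""
    return all({s for l, s in zip(labels, sets) if l == c}
               == {"train", "test"} for c in range(max(labels) + 1))

activity = [1.0, 1.1, 1.2, 5.0, 5.1, 5.2, 9.0, 9.1, 9.2]
split = ["train", "test", "train"] * 3
labels = kmeans_1d(activity, 3)
ok = clusters_mix_sets(labels, split)  # every cluster holds both sets
```

If some cluster contained only training (or only test) compounds, the split would be confounded with structure in the data, which is exactly what the paper's procedure is designed to rule out.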
No Randomization? No Problem: Experimental Control and Random Assignment in Single Case Research
Ledford, Jennifer R.
2018-01-01
Randomization of large number of participants to different treatment groups is often not a feasible or preferable way to answer questions of immediate interest to professional practice. Single case designs (SCDs) are a class of research designs that are experimental in nature but require only a few participants, all of whom receive the…
Teacher-Child Interaction Training: A Pilot Study With Random Assignment.
Fernandez, Melanie A; Adelstein, Jonathan S; Miller, Samantha P; Areizaga, Margaret J; Gold, Dylann C; Sanchez, Amanda L; Rothschild, Sara A; Hirsch, Emily; Gudiño, Omar G
2015-07-01
Teacher-Child Interaction Training (TCIT), adapted from Parent-Child Interaction Therapy (PCIT), is a classroom-based program designed to provide teachers with behavior management skills that foster positive teacher-student relationships and to improve student behavior by creating a more constructive classroom environment. The purpose of this pilot study was to evaluate TCIT in more classrooms than previously reported in the literature, with older children than previously reported, using random assignment of classrooms to TCIT or to a no-TCIT control condition and conducting all but two sessions within the classroom to enhance feasibility. Participants included 11 kindergarten and first grade classroom teachers and their 118 students from three urban, public schools in Manhattan, with five classrooms randomly assigned to receive TCIT and six to the no-TCIT control condition. Observations of teacher skill acquisition were conducted before, during, and after TCIT for all 11 teachers, and teacher reports of student behavior were obtained at these same time points. Teacher satisfaction with TCIT was assessed following training. Results suggested that after receiving TCIT, teachers increased rates of positive attention to students' appropriate behavior, decreased rates of negative attention to misbehavior, reported significantly less distress related to student disruptive behavior, and reported high satisfaction with the training program. Our study supports the growing evidence-base suggesting that TCIT is a promising approach for training teachers in positive behavior management strategies and for improving student disruptive behavior in the classroom. Copyright © 2015. Published by Elsevier Ltd.
An efficient randomized algorithm for contact-based NMR backbone resonance assignment.
Kamisetty, Hetunandan; Bailey-Kellogg, Chris; Pandurangan, Gopal
2006-01-15
Backbone resonance assignment is a critical bottleneck in studies of protein structure, dynamics and interactions by nuclear magnetic resonance (NMR) spectroscopy. A minimalist approach to assignment, which we call 'contact-based', seeks to dramatically reduce experimental time and expense by replacing the standard suite of through-bond experiments with the through-space (nuclear Overhauser enhancement spectroscopy, NOESY) experiment. In the contact-based approach, spectral data are represented in a graph with vertices for putative residues (of unknown relation to the primary sequence) and edges for hypothesized NOESY interactions, such that observed spectral peaks could be explained if the residues were 'close enough'. Due to experimental ambiguity, several incorrect edges can be hypothesized for each spectral peak. An assignment is derived by identifying consistent patterns of edges (e.g. for alpha-helices and beta-sheets) within a graph and by mapping the vertices to the primary sequence. The key algorithmic challenge is to be able to uncover these patterns even when they are obscured by significant noise. This paper develops, analyzes and applies a novel algorithm for the identification of polytopes representing consistent patterns of edges in a corrupted NOESY graph. Our randomized algorithm aggregates simplices into polytopes and fixes inconsistencies with simple local modifications, called rotations, that maintain most of the structure already uncovered. In characterizing the effects of experimental noise, we employ an NMR-specific random graph model in proving that our algorithm gives optimal performance in expected polynomial time, even when the input graph is significantly corrupted. We confirm this analysis in simulation studies with graphs corrupted by up to 500% noise. Finally, we demonstrate the practical application of the algorithm on several experimental beta-sheet datasets. Our approach is able to eliminate a large majority of noise edges and to
NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel
2017-08-01
Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.
McCoy, Thomasin E; Conrad, Amy L; Richman, Lynn C; Lindgren, Scott D; Nopoulos, Peg C; Bell, Edward F
2011-01-01
Preterm infants are frequently transfused with red blood cells based on standardized guidelines or clinical concerns that anemia taxes infants' physiological compensatory mechanisms and thereby threatens their health and well-being. The impact of various transfusion guidelines on long-term neurocognitive outcome is not known. The purpose of this study was to evaluate long-term neurocognitive outcomes in children born prematurely and treated at birth under different transfusion guidelines. Neurocognitive outcomes were examined at school age for 56 preterm infants randomly assigned to a liberal (n = 33) or restrictive (n = 23) transfusion strategy. Tests of intelligence, achievement, language, visual-spatial/motor, and memory skills were administered, and between-group differences were assessed. Those in the liberal transfusion group performed more poorly than those in the restrictive group on measures of associative verbal fluency, visual memory, and reading. Findings highlight possible long-term neurodevelopmental consequences of maintaining higher hematocrit levels.
Sequential, Multiple Assignment, Randomized Trial Designs in Immuno-oncology Research.
Kidwell, Kelley M; Postow, Michael A; Panageas, Katherine S
2018-02-15
Clinical trials investigating immune checkpoint inhibitors have led to the approval of anti-CTLA-4 (cytotoxic T-lymphocyte antigen-4), anti-PD-1 (programmed death-1), and anti-PD-L1 (PD-ligand 1) drugs by the FDA for numerous tumor types. In the treatment of metastatic melanoma, combinations of checkpoint inhibitors are more effective than single-agent inhibitors, but combination immunotherapy is associated with increased frequency and severity of toxicity. There are questions about the use of combination immunotherapy or single-agent anti-PD-1 as initial therapy and the number of doses of either approach required to sustain a response. In this article, we describe a novel use of the sequential, multiple assignment, randomized trial (SMART) design to evaluate immune checkpoint inhibitors to find treatment regimens that adapt within an individual based on intermediate response and lead to the longest overall survival. We provide a hypothetical example SMART design for BRAF wild-type metastatic melanoma as a framework for investigating immunotherapy treatment regimens. We compare implementing a SMART design to implementing multiple traditional randomized clinical trials. We illustrate the benefits of a SMART over traditional trial designs and acknowledge the complexity of a SMART. SMART designs may be an optimal way to find treatment strategies that yield durable response, longer survival, and lower toxicity. Clin Cancer Res; 24(4); 730-6. ©2017 American Association for Cancer Research (AACR).
Day hospital as an alternative to inpatient care for cancer patients: a random assignment trial.
Mor, V; Stalker, M Z; Gralla, R; Scher, H I; Cimma, C; Park, D; Flaherty, A M; Kiss, M; Nelson, P; Laliberte, L
1988-01-01
A stratified, random-assignment trial of 442 cancer patients was conducted to evaluate medical, psychosocial, and financial outcomes of day hospital treatment as an alternative to inpatient care for certain cancer patients. Eligible patients required: a 4- to 8-hour treatment plan, including chemotherapy and other long-term intravenous (i.v.) treatment; a stable cardiovascular status; mental competence; no skilled overnight nursing; and a helper to assist with home care. Patients were ineligible if standard outpatient treatment was possible. No statistically significant (p < 0.05) differences were found between the Adult Day Hospital (ADH) and Inpatient care in medical or psychosocial outcomes over the 60-day study period. The major difference was in medical costs, which were approximately one-third lower for ADH patients (p < 0.001) than for the Inpatient group. The study demonstrates that day hospital care of medical oncology patients is clinically equivalent to Inpatient care, causes no negative psychosocial effects, and costs less than Inpatient care. Findings support the trend toward dehospitalization of medical treatment.
Meurer, William J; Seewald, Nicholas J; Kidwell, Kelley
2017-04-01
Modern clinical trials in stroke reperfusion fall into 2 categories: alternative systemic pharmacological regimens to alteplase and "rescue" endovascular approaches using targeted thrombectomy devices and/or medications delivered directly for persistently occluded vessels. Clinical trials in stroke have not evaluated how initial pharmacological thrombolytic management might influence subsequent rescue strategy. A sequential multiple assignment randomized trial (SMART) is a novel trial design that can test these dynamic treatment regimens and lead to treatment guidelines that more closely mimic practice. To characterize a SMART design in comparison to traditional approaches for stroke reperfusion trials. We conducted a numerical simulation study that evaluated the performance of contrasting acute stroke clinical trial designs of both initial reperfusion and rescue therapy. We compare a SMART design where the same patients are followed through initial reperfusion and rescue therapy within 1 trial to a standard phase III design comparing 2 reperfusion treatments and a separate phase II futility design of rescue therapy in terms of sample size, power, and ability to address particular research questions. Traditional trial designs can be well powered and have optimal design characteristics for independent treatment effects. When treatments, such as the reperfusion and rescue therapies, may interact, commonly used designs fail to detect this. A SMART design, with similar sample size to standard designs, can detect treatment interactions. The use of SMART designs to investigate effective and realistic dynamic treatment regimens is a promising way to accelerate the discovery of new, effective treatments for stroke. Copyright © 2017 National Stroke Association. Published by Elsevier Inc. All rights reserved.
The High/Scope Perry Preschool Study: A Case Study in Random Assignment.
Schweinhart, Lawrence J.
2000-01-01
Studied the long-term benefits of preschool programs for young children living in poverty in the High/Scope Perry Preschool Study, which examined the lives of 123 African Americans randomly divided into a preschool treatment group and a no-preschool comparison group. Cost-benefit analyses of data on these students to age 27 show beneficial effects…
Kelleher, Sarah A; Dorfman, Caroline S; Plumb Vilardaga, Jen C; Majestic, Catherine; Winger, Joseph; Gandhi, Vicky; Nunez, Christine; Van Denburg, Alyssa; Shelby, Rebecca A; Reed, Shelby D; Murphy, Susan; Davidian, Marie; Laber, Eric B; Kimmick, Gretchen G; Westbrook, Kelly W; Abernethy, Amy P; Somers, Tamara J
2017-06-01
Pain is common in cancer patients and results in lower quality of life, depression, poor physical functioning, financial difficulty, and decreased survival time. Behavioral pain interventions are effective and nonpharmacologic. Traditional randomized controlled trials (RCT) test interventions of fixed time and dose, which poorly represent successive treatment decisions in clinical practice. We utilize a novel approach to conduct a RCT, the sequential multiple assignment randomized trial (SMART) design, to provide comparative evidence of: 1) response to differing initial doses of a pain coping skills training (PCST) intervention and 2) intervention dose sequences adjusted based on patient response. We also examine: 3) participant characteristics moderating intervention responses and 4) cost-effectiveness and practicality. Breast cancer patients (N=327) having pain (ratings≥5) are recruited and randomly assigned to: 1) PCST-Full or 2) PCST-Brief. PCST-Full consists of 5 PCST sessions. PCST-Brief consists of one 60-min PCST session. Five weeks post-randomization, participants re-rate their pain and are re-randomized, based on intervention response, to receive additional PCST sessions, maintenance calls, or no further intervention. Participants complete measures of pain intensity, interference and catastrophizing. Novel RCT designs may provide information that can be used to optimize behavioral pain interventions to be adaptive, better meet patients' needs, reduce barriers, and match with clinical practice. This is one of the first trials to use a novel design to evaluate symptom management in cancer patients and in chronic illness; if successful, it could serve as a model for future work with a wide range of chronic illnesses. Copyright © 2016. Published by Elsevier Inc.
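The two-stage logic of the SMART protocol above (initial randomization, five-week pain re-rating, response-dependent re-randomization) can be sketched as follows. The response cutoff (pain < 5) and the mapping of responders and non-responders to second-stage options are illustrative assumptions, not the trial's exact decision rules:

```python
import random

def smart_assign(pain_at_week5, rng):
    """Two-stage SMART sketch: stage 1 randomizes the initial PCST dose;
    stage 2 re-randomizes based on intermediate pain response.
    The cutoff and second-stage option sets are assumptions."""
    stage1 = rng.choice(["PCST-Full", "PCST-Brief"])
    if pain_at_week5 < 5:  # assumed response criterion
        stage2 = rng.choice(["maintenance-calls", "no-further-intervention"])
    else:
        stage2 = rng.choice(["additional-sessions", "maintenance-calls"])
    return stage1, stage2

rng = random.Random(11)
regimen = smart_assign(pain_at_week5=3, rng=rng)
```

Because each patient traverses both stages, a single SMART yields data on whole adaptive regimens (e.g. "start Brief, escalate if non-responsive") rather than on fixed doses, which is the design's advantage over a traditional RCT.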
Park, Hyunjoon; Behrman, Jere R; Choi, Jaesung
2013-04-01
Despite the voluminous literature on the potentials of single-sex schools, there is no consensus on the effects of single-sex schools because of student selection of school types. We exploit a unique feature of schooling in Seoul-the random assignment of students into single-sex versus coeducational high schools-to assess causal effects of single-sex schools on college entrance exam scores and college attendance. Our validation of the random assignment shows comparable socioeconomic backgrounds and prior academic achievement of students attending single-sex schools and coeducational schools, which increases the credibility of our causal estimates of single-sex school effects. The three-level hierarchical model shows that attending all-boys schools or all-girls schools, rather than coeducational schools, is significantly associated with higher average scores on Korean and English test scores. Applying the school district fixed-effects models, we find that single-sex schools produce a higher percentage of graduates who attended four-year colleges and a lower percentage of graduates who attended two-year junior colleges than do coeducational schools. The positive effects of single-sex schools remain substantial, even after we take into account various school-level variables, such as teacher quality, the student-teacher ratio, the proportion of students receiving lunch support, and whether the schools are public or private.
Li, Yi; Guo, Guang
2016-09-01
Identifying causal peer influence is a long-standing challenge for social scientists. Using data from a natural experiment of randomly assigned college roommates (N = 2,059), which removes the threat of friend selection, we investigate peer effects on aggressive behavior, smoking, and concurrent sexual partnering. The findings suggest that the magnitude and direction of peer influence depend on predisposition, gender, and the nature of the behavior. Peer effects on individuals predisposed toward a given behavior tend to be larger than peer effects on individuals without such a predisposition. We find that the influence of roommates on aggressive behavior is more pronounced among male students than among female students; roommate effects on smoking are negative among female students and among male students who did not smoke before college. For concurrent sexual partnering, a highly private behavior, we find no evidence of peer effects. © American Sociological Association 2016.
Choi, Jaesung; Park, Hyunjoon; Behrman, Jere R
2015-06-01
A growing body of research reports associations of school contexts with adolescents' weight and weight-related behaviors. One interesting, but under-researched, dimension of school context that potentially matters for adolescents' weight is the gender composition. If boys and girls are separated into single-sex schools, they might be less concerned about physical appearance, which may result in increased weight. Utilizing a unique setting in Seoul, Korea where students are randomly assigned to single-sex and coeducational schools within school districts, we estimate causal effects of single-sex schools on weight and weight-related behaviors. Our results show that students attending single-sex schools are more likely to be overweight, and that the effects are more pronounced for girls. We also find that girls in single-sex schools are less likely to engage in strenuous activities than their coeducational counterparts. Copyright © 2015 Elsevier Ltd. All rights reserved.
Scientific rigor through videogames.
Treuille, Adrien; Das, Rhiju
2014-11-01
Hypothesis-driven experimentation - the scientific method - can be subverted by fraud, irreproducibility, and lack of rigorous predictive tests. A robust solution to these problems may be the 'massive open laboratory' model, recently embodied in the internet-scale videogame EteRNA. Deploying similar platforms throughout biology could enforce the scientific method more broadly. Copyright © 2014 Elsevier Ltd. All rights reserved.
Statistical mechanics rigorous results
Ruelle, David
1999-01-01
This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.
Ryum, Truls; Stiles, Tore C.; Svartberg, Martin; McCullough, Leigh
2010-01-01
Therapist competence in assigning homework was used to predict mid- and posttreatment outcome for patients with Cluster C personality disorders in cognitive therapy (CT). Twenty-five patients that underwent 40 sessions of CT were taken from a randomized controlled trial (Svartberg, Stiles, & Seltzer, 2004). Therapist competence in assigning…
2017-05-06
In an effort to address performance gaps we devised a teaching paradigm, called 'Task-Oriented Role Assignment', in which we have delegated a...task delegation, thereby improving NRP performance. Health care professionals taking the NRP course were randomized to either the control group, which...such as leadership (...; p = 0.05). However, both groups scored similarly in overall NRP task performance with mean scores of
Putrefactive rigor: apparent rigor mortis due to gas distension.
Gill, James R; Landi, Kristen
2011-09-01
Artifacts due to decomposition may cause confusion for the initial death investigator, leading to an incorrect suspicion of foul play. Putrefaction is a microorganism-driven process that results in foul odor, skin discoloration, purge, and bloating. Various decompositional gases including methane, hydrogen sulfide, carbon dioxide, and hydrogen will cause the body to bloat. We describe 3 instances of putrefactive gas distension (bloating) that produced the appearance of inappropriate rigor, so-called putrefactive rigor. These gases may distend the body to an extent that the extremities extend and lose contact with their underlying support surface. The medicolegal investigator must recognize that this is not true rigor mortis and the body was not necessarily moved after death for this gravity-defying position to occur.
Evaluating the Flipped Classroom: A Randomized Controlled Trial
Wozny, Nathan; Balser, Cary; Ives, Drew
2018-01-01
Despite recent interest in flipped classrooms, rigorous research evaluating their effectiveness is sparse. In this study, the authors implement a randomized controlled trial to evaluate the effect of a flipped classroom technique relative to a traditional lecture in an introductory undergraduate econometrics course. Random assignment enables the…
Mathematical Rigor in Introductory Physics
Vandyke, Michael; Bassichis, William
2011-10-01
Calculus-based introductory physics courses intended for future engineers and physicists are often designed and taught in the same fashion as those intended for students of other disciplines. A more mathematically rigorous curriculum should be more appropriate and, ultimately, more beneficial for the student in his or her future coursework. This work investigates the effects of mathematical rigor on student understanding of introductory mechanics. Using a series of diagnostic tools in conjunction with individual student course performance, a statistical analysis will be performed to examine student learning of introductory mechanics and its relation to student understanding of the underlying calculus.
A case of instantaneous rigor?
Pirch, J; Schulz, Y; Klintschar, M
2013-09-01
The question of whether instantaneous rigor mortis (IR), the hypothetical sudden onset of muscle stiffening upon death, actually exists has been controversially debated over the last 150 years. While modern German forensic literature rejects this concept, the contemporary British literature is more willing to embrace it. We present the case of a young woman who suffered from diabetes and was found dead in an upright standing position, with back and shoulders leaning against a punchbag and a cupboard. Rigor mortis was fully established, and livor mortis was strong and consistent with the position in which the body was found. After autopsy and toxicological analysis, death was attributed most probably to a ketoacidotic coma, with markedly increased values of glucose and lactate in the cerebrospinal fluid as well as acetone in blood and urine. Although the position of the body is most unusual, a detailed analysis revealed that it is a stable position even without rigor mortis. Therefore, this case does not further support the controversial concept of IR.
Realizing rigor in the mathematics classroom
Hull, Ted H (Henry); Balka, Don S
2014-01-01
Rigor put within reach! Rigor: The Common Core has made it policy, and this first-of-its-kind guide takes math teachers and leaders through the process of making it reality. Using the Proficiency Matrix as a framework, the authors offer proven strategies and practical tools for successful implementation of the CCSS mathematical practices, with rigor as a central objective. You'll learn how to define rigor in the context of each mathematical practice, and how to identify and overcome potential issues, including differentiating instruction and using data
Rigorous theory of molecular orientational nonlinear optics
International Nuclear Information System (INIS)
Kwak, Chong Hoon; Kim, Gun Yeup
2015-01-01
Classical statistical mechanics of the molecular optics theory proposed by Buckingham [A. D. Buckingham and J. A. Pople, Proc. Phys. Soc. A 68, 905 (1955)] has been extended to describe the field-induced molecular orientational polarization effects on nonlinear optics. In this paper, we present the generalized molecular orientational nonlinear optical processes (MONLO) through the calculation of the classical orientational averaging using the Boltzmann type time-averaged orientational interaction energy in the randomly oriented molecular system under the influence of applied electric fields. The focal points of the calculation are (1) the derivation of rigorous tensorial components of the effective molecular hyperpolarizabilities, (2) the molecular orientational polarizations and the electronic polarizations including the well-known third-order dc polarization, dc electric field induced Kerr effect (dc Kerr effect), optical Kerr effect (OKE), dc electric field induced second harmonic generation (EFISH), degenerate four wave mixing (DFWM) and third harmonic generation (THG). We also present some of the new predictive MONLO processes. For second-order MONLO, second-order optical rectification (SOR), Pockels effect and difference frequency generation (DFG) are described in terms of the anisotropic coefficients of first hyperpolarizability. For third-order MONLO, third-order optical rectification (TOR), dc electric field induced difference frequency generation (EFIDFG) and pump-probe transmission are presented.
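The orientational averaging at the core of such calculations can be sketched as a generic Boltzmann average (a textbook form, not the paper's specific derivation): for a molecular quantity $X$ and orientational interaction energy $U(\Omega)$ at orientation $\Omega$,

```latex
\langle X \rangle
  = \frac{\int X(\Omega)\, e^{-U(\Omega)/k_{\mathrm{B}}T}\, \mathrm{d}\Omega}
         {\int e^{-U(\Omega)/k_{\mathrm{B}}T}\, \mathrm{d}\Omega}
```

Expanding this average in powers of the applied electric fields is what yields the effective hyperpolarizability tensors referred to above.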
Classroom Talk for Rigorous Reading Comprehension Instruction
Wolf, Mikyung Kim; Crosson, Amy C.; Resnick, Lauren B.
2004-01-01
This study examined the quality of classroom talk and its relation to academic rigor in reading-comprehension lessons. Additionally, the study aimed to characterize effective questions to support rigorous reading comprehension lessons. The data for this study included 21 reading-comprehension lessons in several elementary and middle schools from…
Rigorous Science: a How-To Guide
Directory of Open Access Journals (Sweden)
Arturo Casadevall
2016-11-01
Full Text Available Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word “rigor” is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education.
DEFF Research Database (Denmark)
Hougaard, Jens Leth; Moreno-Ternero, Juan D.; Østerdal, Lars Peter Raahave
2014-01-01
minimizing modification of the classic random priority method to solve this class of problems. We also provide some logical relations in our setting among standard axioms in the literature on assignment problems, and explore the robustness of our results to several extensions of our setting....
Kröger, Christoph; Kliem, Sören; Zimmermann, Peter; Kowalski, Jens
2018-04-01
This study examines the short-term effectiveness of a relationship education program designed for military couples. Distressed couples were randomly placed in either a wait-list control group or an intervention group. We conducted training sessions before a 3-month foreign assignment, and refresher courses approximately 6-week post-assignment. We analyzed the dyadic data of 32 couples, using hierarchical linear modeling in a two-level model. Reduction in unresolved conflicts was found in the intervention group, with large pre-post effects for both partners. Relationship satisfaction scores were improved, with moderate-to-large effects only for soldiers, rather than their partners. Post-follow-up effect sizes suggested further improvement in the intervention group. Future research should examine the long-term effectiveness of this treatment. © 2017 American Association for Marriage and Family Therapy.
Landsheer, Johannes A.; Smit, Johannes H.; van Oppen, Patricia; van Balkom, Anton J L M
2015-01-01
The effectiveness of Fluvoxamine was compared to that of Cognitive Therapy (CT) in a 12-week randomized controlled trial (RCT) in 48 patients with obsessive-compulsive disorder (OCD) who were treatment-resistant to a previous behavior therapy (BT). A considerable number of patients did not comply
Krompecher, T
1981-01-01
Objective measurements were carried out to study the evolution of rigor mortis on rats at various temperatures. Our experiments showed that: (1) at 6 degrees C rigor mortis reaches full development between 48 and 60 hours post mortem, and is resolved at 168 hours post mortem; (2) at 24 degrees C rigor mortis reaches full development at 5 hours post mortem, and is resolved at 16 hours post mortem; (3) at 37 degrees C rigor mortis reaches full development at 3 hours post mortem, and is resolved at 6 hours post mortem; (4) the intensity of rigor mortis grows with increase in temperature (difference between values obtained at 24 degrees C and 37 degrees C); and (5) at 6 degrees C a "cold rigidity" was found, in addition to and independent of rigor mortis.
Inferential backbone assignment for sparse data
International Nuclear Information System (INIS)
Vitek, Olga; Bailey-Kellogg, Chris; Craig, Bruce; Vitek, Jan
2006-01-01
This paper develops an approach to protein backbone NMR assignment that effectively assigns large proteins while using limited sets of triple-resonance experiments. Our approach handles proteins with large fractions of missing data and many ambiguous pairs of pseudoresidues, and provides a statistical assessment of confidence in global and position-specific assignments. The approach is tested on an extensive set of experimental and synthetic data of up to 723 residues, with match tolerances of up to 0.5 ppm for Cα and Cβ resonance types. The tests show that the approach is particularly helpful when data contain experimental noise and require large match tolerances. The keys to the approach are an empirical Bayesian probability model that rigorously accounts for uncertainty in the data at all stages in the analysis, and a hybrid stochastic tree-based search algorithm that effectively explores the large space of possible assignments.
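To make the matching problem concrete, here is a minimal hypothetical sketch (not the authors' Bayesian/tree-search algorithm): it enumerates assignments of observed pseudoresidue shifts to sequence positions that agree with predicted shifts within a match tolerance, with `None` standing in for missing data.

```python
from itertools import permutations

# Toy sketch, not the paper's algorithm: enumerate assignments of
# observed pseudoresidue shifts to sequence positions that agree with
# predicted shifts within a match tolerance. None marks missing data,
# which is always treated as compatible.
TOLERANCE = 0.5  # ppm, the tolerance scale discussed in the abstract

def compatible(obs, pred, tol=TOLERANCE):
    return obs is None or abs(obs - pred) <= tol

def assignments(predicted, observed):
    """Yield every full assignment consistent with the tolerance."""
    for perm in permutations(observed):
        if all(compatible(o, p) for o, p in zip(perm, predicted)):
            yield perm

predicted = [56.2, 58.9, 62.1]   # made-up predicted C-alpha shifts per position
observed = [62.3, 56.0, 58.7]    # made-up observed pseudoresidue shifts

print(list(assignments(predicted, observed)))  # → [(56.0, 58.7, 62.3)]
```

A real assignment engine scores candidates with a probability model and prunes a search tree rather than enumerating permutations; larger tolerances admit more candidate assignments, which is why a statistical confidence assessment matters.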
"Rigor mortis" in a live patient.
Chakravarthy, Murali
2010-03-01
Rigor mortis is conventionally a postmortem change. Its occurrence suggests that death occurred at least a few hours earlier. The authors report a case of "rigor mortis" in a live patient after cardiac surgery. The factors likely to have predisposed to such premortem muscle stiffening in this patient are an intense low-cardiac-output state, the use of unusually high doses of inotropic and vasopressor agents, and probable sepsis. Such an event may be of importance when determining the time of death in individuals such as the one described in this report. It also suggests that patients with muscle stiffening require careful examination before death is declared. This report is published to point out the controversies that might arise from muscle stiffening, which should not always be termed rigor mortis and/or assumed to be postmortem.
[Rigor mortis -- a definite sign of death?].
Heller, A R; Müller, M P; Frank, M D; Dressler, J
2005-04-01
In recent years there has been an ongoing controversial debate in Germany regarding the quality of the coroner's inquest and the declaration of death by physicians. We report the case of a 90-year-old female who was found after an unknown interval following a suicide attempt with benzodiazepine. Examination of the patient showed livores (mortis?) on the left forearm and left lower leg. Moreover, rigor (mortis?) of the left arm was apparent, preventing arm flexion and extension. The hypothermic patient with insufficient respiration was intubated and mechanically ventilated. Chest compressions were not performed because central pulses were (barely) palpable and a sinus bradycardia of 45/min (second-degree AV block and isolated premature ventricular complexes) was present. After placement of an intravenous line (17 G, external jugular vein), the hemodynamic situation was stabilized with intermittent boluses of epinephrine and with sodium bicarbonate. With improved circulation, the livores and rigor disappeared. In the present case a minimal central circulation was noted, which could be stabilized despite the presence of seemingly certain signs of death (livores and rigor mortis). Considering the finding of abrogated peripheral perfusion (livores), we postulate a centripetal collapse of glycogen and ATP supply in the patient's left arm (rigor), which was restored after resuscitation and reperfusion. Thus, it appears that livores and rigor are not sensitive enough to exclude a vita minima, in particular in hypothermic patients with intoxications. Consequently, a careful ABC check should be performed even in the presence of apparently certain signs of death, to avoid underdiagnosing a vita minima. Additional ECG monitoring is required to reduce the rate of false-positive declarations of death. To what extent basic life support by paramedics should be commenced when rigor and livores are present, until a physician issues a DNR order, deserves further discussion.
An ultramicroscopic study on rigor mortis.
Suzuki, T
1976-01-01
Gastrocnemius muscles taken from decapitated mice at various intervals after death, and from mice killed by 2,4-dinitrophenol or mono-iodoacetic acid injection to induce rigor mortis soon after death, were observed by electron microscopy. The prominent appearance of many fine cross-striations in the myofibrils (occurring about every 400 Å) was considered characteristic of rigor mortis. These striations were caused by minute granules studded along the surfaces of both thick and thin filaments; they appeared to be bridges connecting the two kinds of filaments and accounted for the hardness and rigidity of the muscle.
Molina, Brooke S G; Hinshaw, Stephen P; Eugene Arnold, L; Swanson, James M; Pelham, William E; Hechtman, Lily; Hoza, Betsy; Epstein, Jeffery N; Wigal, Timothy; Abikoff, Howard B; Greenhill, Laurence L; Jensen, Peter S; Wells, Karen C; Vitiello, Benedetto; Gibbons, Robert D; Howard, Andrea; Houck, Patricia R; Hur, Kwan; Lu, Bo; Marcus, Sue
2013-03-01
To determine long-term effects on substance use and substance use disorder (SUD), up to 8 years after childhood enrollment, of the randomly assigned 14-month treatments in the multisite Multimodal Treatment Study of Children with Attention-Deficit/Hyperactivity Disorder (MTA; n = 436); to test whether medication at follow-up, cumulative psychostimulant treatment over time, or both relate to substance use/SUD; and to compare substance use/SUD in the ADHD sample to the non-ADHD childhood classmate comparison group (n = 261). Mixed-effects regression models with planned contrasts were used for all tests except the important cumulative stimulant treatment question, for which propensity score matching analysis was used. The originally randomized treatment groups did not differ significantly on substance use/SUD by the 8-year follow-up or earlier (mean age = 17 years). Neither medication at follow-up (mostly stimulants) nor cumulative stimulant treatment was associated with adolescent substance use/SUD. Substance use at all time points, including use of two or more substances and SUD, were each greater in the ADHD than in the non-ADHD samples, regardless of sex. Medication for ADHD did not protect from, or contribute to, visible risk of substance use or SUD by adolescence, whether analyzed as randomized treatment assignment in childhood, as medication at follow-up, or as cumulative stimulant treatment over an 8-year follow-up from childhood. These results suggest the need to identify alternative or adjunctive adolescent-focused approaches to substance abuse prevention and treatment for boys and girls with ADHD, especially given their increased risk for use and abuse of multiple substances that is not improved with stimulant medication. Clinical trial registration information: Multimodal Treatment Study of Children With Attention Deficit and Hyperactivity Disorder (MTA); http://clinicaltrials.gov/; NCT00000388. Copyright © 2013 American Academy of Child and Adolescent Psychiatry.
The Rigor Mortis of Education: Rigor Is Required in a Dying Educational System
Mixon, Jason; Stuart, Jerry
2009-01-01
In an effort to answer the "Educational Call to Arms", our national public schools have turned to Advanced Placement (AP) courses as the predominate vehicle used to address the lack of academic rigor in our public high schools. Advanced Placement is believed by many to provide students with the rigor and work ethic necessary to…
Trends: Rigor Mortis in the Arts.
Blodget, Alden S.
1991-01-01
Outlines how past art education provided a refuge for students from the rigors of other academic subjects. Observes that in recent years art education has become "discipline based." Argues that art educators need to reaffirm their commitment to a humanistic way of knowing. (KM)
Photoconductivity of amorphous silicon-rigorous modelling
International Nuclear Information System (INIS)
Brada, P.; Schauer, F.
1991-01-01
It is our great pleasure to express our gratitude to Prof. Grigorovici, a pioneer of the exciting field of the amorphous state, through our modest contribution to this area. This paper presents an outline of the rigorous modelling program for the steady-state photoconductivity in amorphous silicon and related materials. (Author)
Accelerating Biomedical Discoveries through Rigor and Transparency.
Hewitt, Judith A; Brown, Liliana L; Murphy, Stephanie J; Grieder, Franziska; Silberberg, Shai D
2017-07-01
Difficulties in reproducing published research findings have garnered a lot of press in recent years. As a funder of biomedical research, the National Institutes of Health (NIH) has taken measures to address underlying causes of low reproducibility. Extensive deliberations resulted in a policy, released in 2015, to enhance reproducibility through rigor and transparency. We briefly explain what led to the policy, describe its elements, provide examples and resources for the biomedical research community, and discuss the potential impact of the policy on translatability with a focus on research using animal models. Importantly, while increased attention to rigor and transparency may lead to an increase in the number of laboratory animals used in the near term, it will lead to more efficient and productive use of such resources in the long run. The translational value of animal studies will be improved through more rigorous assessment of experimental variables and data, leading to better assessments of the translational potential of animal models, for the benefit of the research community and society. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Plagiarism-Proofing Assignments
Johnson, Doug
2004-01-01
Mr. Johnson has discovered that the higher the level of student engagement and creativity, the lower the probability of plagiarism. For teachers who would like to see such desirable results, he describes the characteristics of assignments that are most likely to produce them. Two scenarios of types of assignments that avoid plagiarism are…
Software metrics a rigorous and practical approach
Fenton, Norman
2014-01-01
A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and Processes. Reflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems. New to the Third Edition: This edition contains new material relevant
Development of rigor mortis is not affected by muscle volume.
Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H
2001-04-01
There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.
Lahaie, Sébastien; Parkes, David C.
We consider the problem of fair allocation in the package assignment model, where a set of indivisible items, held by a single seller, must be efficiently allocated to agents with quasi-linear utilities. A fair assignment is one that is efficient and envy-free. We consider a model where bidders have superadditive valuations, meaning that items are pure complements. Our central result is that core outcomes are fair and even coalition-fair over this domain, while fair distributions may not even exist for general valuations. Of relevance to auction design, we also establish that the core is equivalent to the set of anonymous-price competitive equilibria, and that superadditive valuations are a maximal domain that guarantees the existence of anonymous-price competitive equilibrium. Our results are analogs of core equivalence results for linear prices in the standard assignment model, and for nonlinear, non-anonymous prices in the package assignment model with general valuations.
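The envy-freeness condition can be illustrated with a small sketch (all values and payments are made up, not results from the paper): with quasi-linear utilities, agent i's payoff from a (bundle, payment) pair is v_i(bundle) − payment, and an outcome is envy-free if no agent strictly prefers another agent's pair to its own.

```python
# Hypothetical sketch of the envy-freeness test (made-up numbers).
values = {
    "a": {"": 0, "x": 5, "y": 4, "xy": 12},  # superadditive: 12 >= 5 + 4
    "b": {"": 0, "x": 4, "y": 3, "xy": 10},  # superadditive: 10 >= 4 + 3
}
bundles = {"a": "xy", "b": ""}   # agent a wins the package of both items
payments = {"a": 10, "b": 0}

def envy_free(values, bundles, payments):
    """Return True if no agent strictly envies another agent's outcome."""
    for i in values:
        u_own = values[i][bundles[i]] - payments[i]
        for j in values:
            # would agent i rather have agent j's bundle at j's payment?
            if values[i][bundles[j]] - payments[j] > u_own:
                return False
    return True

print(envy_free(values, bundles, payments))  # → True
```

Lowering a's payment to 8 would make b envious (b's payoff from a's pair becomes 10 − 8 = 2 > 0), so the same allocation would no longer be fair.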
ABCA Bulletin, 1983
1983-01-01
Describes three assignments for enticing business communication students to undertake library research: an analysis of a Fortune 500 company, a career choice report, and a report on an organization that offers potential employment. (AEA)
Historical WBAN ID Assignments
National Oceanic and Atmospheric Administration, Department of Commerce — 4"x6" index cards represent the first written assignments of Weather Bureau Army Navy (WBAN) station identifier numbers by the National Climatic Data Center....
1983-12-01
AD-A136 548: Dynamic Sequence Assignment (U). Advanced Information and Decision Systems, Mountain View, CA 94040; C. A. O'Reilly et al.; Unclassified; December 1983. ...reviews some important heuristic algorithms developed for faster solution of the sequence assignment problem. 3.1. Dynamic Programming Formulation for
Directory of Open Access Journals (Sweden)
2016-01-01
Full Text Available The article is devoted to the airline scheduling process and methods of its modeling. It describes the main stages of the airline scheduling process (scheduling, fleet assignment, revenue management, and operations), their features, and their interactions. The central part of the scheduling process is fleet assignment. An optimal solution of the fleet assignment problem enables airlines to increase their incomes by up to 3% owing to improved connection quality and to operating the planned number of flights with fewer aircraft than usual or than planned earlier. The fleet assignment stage of the scheduling process is examined and the Conventional Leg-Based Fleet Assignment Model (FAM) is analyzed. Finally, the strong and weak aspects of the model (SWOT) are identified and discussed. The article gives a critical analysis of the FAM model with the purpose of identifying possible options for and constraints on its use (for example, in cases of short-term and long-term planning, changing the schedule, or replacing an aircraft), as well as possible ways to improve the model.
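A toy version of leg-based fleet assignment can be sketched as follows (all legs, fleet types, and profits are hypothetical, and each aircraft is assumed to fly a single leg, which ignores the time-space network constraints of a real FAM):

```python
from itertools import product

# Toy leg-based fleet assignment (hypothetical data, not the article's
# FAM): pick one fleet type per flight leg to maximize total profit
# without exceeding the number of available aircraft of each type.
legs = ["L1", "L2", "L3"]
fleets = {"A320": 1, "B737": 2}          # aircraft available per type
profit = {                               # profit of flying leg with type
    ("L1", "A320"): 9, ("L1", "B737"): 7,
    ("L2", "A320"): 6, ("L2", "B737"): 5,
    ("L3", "A320"): 8, ("L3", "B737"): 4,
}

def best_assignment(legs, fleets, profit):
    """Brute-force search over all fleet-type choices per leg."""
    best, best_val = None, float("-inf")
    for choice in product(fleets, repeat=len(legs)):
        if any(choice.count(f) > n for f, n in fleets.items()):
            continue  # would need more aircraft of some type than available
        val = sum(profit[leg, f] for leg, f in zip(legs, choice))
        if val > best_val:
            best, best_val = dict(zip(legs, choice)), val
    return best, best_val

print(best_assignment(legs, fleets, profit))
# → ({'L1': 'B737', 'L2': 'B737', 'L3': 'A320'}, 20)
```

Production fleet assignment models solve an integer program over a time-space network with cover, balance, and count constraints; the brute-force search above only conveys the shape of the objective and the fleet-count restriction.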
Rigor in Qualitative Supply Chain Management Research
DEFF Research Database (Denmark)
Goffin, Keith; Raja, Jawwad; Claes, Björn
2012-01-01
Purpose – The purpose of this paper is to share the authors' experiences of using the repertory grid technique in two supply chain management studies. The paper aims to demonstrate how the two studies provided insights into how qualitative techniques such as the repertory grid can be made more rigorous than in the past, and how results can be generated that are inaccessible using quantitative methods. Design/methodology/approach – This paper presents two studies undertaken using the repertory grid technique to illustrate its application in supply chain management research. Findings – The paper ... reliability, and theoretical saturation. Originality/value – It is the authors' contention that the addition of the repertory grid technique to the toolset of methods used by logistics and supply chain management researchers can only enhance insights and the building of robust theories. Qualitative studies...
Statistics for mathematicians a rigorous first course
Panaretos, Victor M
2016-01-01
This textbook provides a coherent introduction to the main concepts and methods of one-parameter statistical inference. Intended for students of Mathematics taking their first course in Statistics, the focus is on Statistics for Mathematicians rather than on Mathematical Statistics. The goal is not to focus on the mathematical/theoretical aspects of the subject, but rather to provide an introduction to the subject tailored to the mindset and tastes of Mathematics students, who are sometimes turned off by the informal nature of Statistics courses. This book can be used as the basis for an elementary semester-long first course on Statistics with a firm sense of direction that does not sacrifice rigor. The deeper goal of the text is to attract the attention of promising Mathematics students.
Martins, Cesario; Garly, May-Lill; Bale, Carlitos; Rodrigues, Amabelia; Njie-Jobe, Jainaba; Benn, Christine S; Whittle, Hilton; Aaby, Peter
2014-09-01
The World Health Organization recommends administration of measles vaccine (MV) at age 9 months in low-income countries. We tested the measles virus antibody response at 4.5, 9, 18, and 24 months of age for children randomly assigned to receive standard-titer Edmonston-Zagreb MV at 4.5 and 9 months, at 9 months, or at 9 and 18 months of age. At 4.5 months of age, 75% had nonprotective measles virus antibody levels. Following receipt of MV at 4.5 months of age, 77% (316/408) had protective antibody levels at 9 months of age; after a second dose at 9 months of age, 97% (326/337) had protective levels at 24 months of age. In addition, the response at both 9 and 24 months of age was inversely correlated with the antibody level at receipt of the first dose of MV, and the second dose of MV, received at 9 months of age, provided a significant boost in antibody level to children who had low antibody levels. In the group of 318 children who received MV at 9 months of age, with or without a second dose at 18 months of age, 99% (314) had protective levels at 24 months of age. The geometric mean titer at 24 months of age was significantly lower in the group that received MV at 4.5 and 9 months of age than in the group that received MV at 9 months of age (P = .0001). In conclusion, an early 2-dose MV schedule was associated with protective measles virus antibody levels at 24 months of age in nearly all children. Clinical Trials Registration. NCT00168558. © The Author 2014. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
LENUS (Irish Health Repository)
Bonati, Leo H
2009-10-01
In the Carotid and Vertebral Artery Transluminal Angioplasty Study (CAVATAS), early recurrent carotid stenosis was more common in patients assigned to endovascular treatment than it was in patients assigned to endarterectomy (CEA), raising concerns about the long-term effectiveness of endovascular treatment. We aimed to investigate the long-term risks of restenosis in patients included in CAVATAS.
Personnel dose assignment practices
International Nuclear Information System (INIS)
Fix, J.J.
1993-04-01
Implementation of DOE N 5480.6 Radiological Control Manual Article 511(3) requirements to minimize the assignment of personnel dosimeters should be undertaken only within a broader context ensuring that capabilities are in place to monitor and record personnel exposure, both for compliance and for potential litigation. As noted in NCRP Report No. 114, personnel dosimetry programs are conducted to meet four major objectives: radiation safety program control and evaluation; regulatory compliance; epidemiological research; and litigation. A change to Article 511(3) is proposed that would require that minimizing the assignment of personnel dosimeters take place only following full evaluation of overall capabilities (e.g., access control, area dosimetry, etc.) to meet the NCRP objectives.
Scaffolding students’ assignments
DEFF Research Database (Denmark)
Slot, Marie Falkesgaard
2013-01-01
This article discusses scaffolding in typical student assignments in mother tongue learning materials in upper secondary education in Denmark and the United Kingdom. It has been determined that assignments do not have sufficient scaffolding and features to help pupils understand concepts and build… objects. The article presents the results of empirical research on tasks given in Danish and British learning materials. This work is based on a further development of my PhD thesis: "Learning materials in the subject of Danish" (Slot 2010). The main focus is how cognitive models (and subsidiary explicit… learning goals) can help students structure their argumentative and communicative learning processes, and how various multimodal representations can give more open-ended learning possibilities for collaboration. The article presents a short introduction of the skills for 21st century learning and defines…
Dominguez-Martinez, S.
2009-01-01
An important task of a manager is to motivate her subordinates. One way in which a manager can give incentives to junior employees is through the assignment of tasks. How a manager allocates tasks in an organization provides information to the junior employees about his ability. Without coaching from a manager, the junior employee only has information about his past performance. Based on his past performance, a talented junior who has performed a difficult task sometimes decides to leave the...
Rigor or mortis: best practices for preclinical research in neuroscience.
Steward, Oswald; Balice-Gordon, Rita
2014-11-05
Numerous recent reports document a lack of reproducibility of preclinical studies, raising concerns about potential lack of rigor. Examples of lack of rigor have been extensively documented and proposals for practices to improve rigor are appearing. Here, we discuss some of the details and implications of previously proposed best practices and consider some new ones, focusing on preclinical studies relevant to human neurological and psychiatric disorders.
[Experimental study of restiffening of the rigor mortis].
Wang, X; Li, M; Liao, Z G; Yi, X F; Peng, X M
2001-11-01
To observe changes in sarcomere length in rats during restiffening of rigor mortis, we measured the sarcomere length of the quadriceps in 40 rats under different conditions by scanning electron microscopy. The sarcomere length in rigor mortis without breaking is clearly shorter than that after restiffening, and sarcomere length is negatively correlated with the intensity of rigor mortis. Measuring sarcomere length can therefore indicate the intensity of rigor mortis and provide evidence for estimating the time since death.
Rigorous Results for the Distribution of Money on Connected Graphs
Lanchier, Nicolas; Reed, Stephanie
2018-05-01
This paper is concerned with general spatially explicit versions of three stochastic models for the dynamics of money that have been introduced and studied numerically by statistical physicists: the uniform reshuffling model, the immediate exchange model and the model with saving propensity. All three models consist of systems of economical agents that consecutively engage in pairwise monetary transactions. Computer simulations performed in the physics literature suggest that, when the number of agents and the average amount of money per agent are large, the limiting distribution of money as time goes to infinity approaches the exponential distribution for the first model, the gamma distribution with shape parameter two for the second model and a distribution similar but not exactly equal to a gamma distribution whose shape parameter depends on the saving propensity for the third model. The main objective of this paper is to give rigorous proofs of these conjectures and also extend these conjectures to generalizations of the first two models and a variant of the third model that include local rather than global interactions, i.e., instead of choosing the two interacting agents uniformly at random from the system, the agents are located on the vertex set of a general connected graph and can only interact with their neighbors.
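The uniform reshuffling dynamics described above are easy to simulate. The sketch below is a mean-field version (two agents picked uniformly at random, rather than the graph-local interactions the paper analyzes rigorously); all parameter values are illustrative. It checks the exponential limit via the tail fraction above the mean, which for an exponential distribution tends to 1/e.

```python
import random

def uniform_reshuffling(n_agents=2000, avg_money=10.0, steps=200000, seed=1):
    """Mean-field uniform reshuffling: two agents picked uniformly at
    random pool their money and split the pot uniformly at random."""
    random.seed(seed)
    money = [avg_money] * n_agents
    for _ in range(steps):
        i, j = random.sample(range(n_agents), 2)
        pot = money[i] + money[j]
        u = random.random()
        money[i], money[j] = u * pot, (1 - u) * pot
    return money

wealth = uniform_reshuffling()
mean = sum(wealth) / len(wealth)
# For an exponential limit with mean m, the fraction above m tends to 1/e ≈ 0.368
frac_above_mean = sum(w > mealth > mean for mealth in []) if False else sum(w > mean for w in wealth) / len(wealth)
print(round(mean, 2), round(frac_above_mean, 3))
```

Since each transaction conserves the pot, the average amount of money per agent stays fixed while the shape of the distribution converges.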
Long persistence of rigor mortis at constant low temperature.
Varetto, Lorenzo; Curto, Ombretta
2005-01-06
We studied the persistence of rigor mortis by using physical manipulation. We tested the mobility of the knee on 146 corpses kept under refrigeration at Torino's city mortuary at a constant temperature of +4 degrees C. We found a persistence of complete rigor lasting for 10 days in all the cadavers we kept under observation; in one case, rigor lasted for 16 days. Between the 11th and the 17th days, a progressively increasing number of corpses showed a change from complete into partial rigor (characterized by partial bending of the articulation). After the 17th day, all the remaining corpses showed partial rigor, and in the two cadavers that were kept under observation "à outrance" we found that the absolute resolution of rigor mortis occurred on the 28th day. Our results prove that it is possible to find a persistence of rigor mortis that is much longer than expected when environmental conditions resemble average outdoor winter temperatures in temperate zones. Therefore, this datum must be considered when a corpse is found in those environmental conditions, so that in estimating the time of death we are not misled by the long persistence of rigor mortis.
Rigorous solution to Bargmann-Wigner equation for integer spin
Huang Shi Zhong; Wu Ning; Zheng Zhi Peng
2002-01-01
A rigorous method is developed to solve the Bargmann-Wigner equation for arbitrary integer spin in coordinate representation, step by step. The Bargmann-Wigner equation is first transformed to a form easier to solve; the new equations are then solved rigorously in coordinate representation, and the wave functions are derived in closed form.
Using grounded theory as a method for rigorously reviewing literature
Wolfswinkel, J.; Furtmueller-Ettinger, Elfriede; Wilderom, Celeste P.M.
2013-01-01
This paper offers guidance to conducting a rigorous literature review. We present this in the form of a five-stage process in which we use Grounded Theory as a method. We first probe the guidelines explicated by Webster and Watson, and then we show the added value of Grounded Theory for rigorously…
Evaluating Rigor in Qualitative Methodology and Research Dissemination
Trainor, Audrey A.; Graue, Elizabeth
2014-01-01
Despite previous and successful attempts to outline general criteria for rigor, researchers in special education have debated the application of rigor criteria, the significance or importance of small n research, the purpose of interpretivist approaches, and the generalizability of qualitative empirical results. Adding to these complications, the…
Job Assignments under Moral Hazard
DEFF Research Database (Denmark)
Koch, Alexander; Nafziger, Julia
Inefficient job assignments are usually explained with incomplete information about employees' abilities or contractual imperfections. We show that inefficient assignments arise even without uncertainty about the employee's ability and with complete contracts. Building on this result we provide...
Krompecher, T; Bergerioux, C; Brandt-Casadevall, C; Gujer, H R
1983-07-01
The evolution of rigor mortis was studied in cases of nitrogen asphyxia, drowning and strangulation, as well as in fatal intoxications due to strychnine, carbon monoxide and curariform drugs, using a modified method of measurement. Our experiments demonstrated that: (1) Strychnine intoxication hastens the onset and passing of rigor mortis. (2) CO intoxication delays the resolution of rigor mortis. (3) The intensity of rigor may vary depending upon the cause of death. (4) If the stage of rigidity is to be used to estimate the time of death, it is necessary: (a) to perform a succession of objective measurements of rigor mortis intensity; and (b) to verify the eventual presence of factors that could play a role in the modification of its development.
Krompecher, T; Bergerioux, C
1988-01-01
The influence of electrocution on the evolution of rigor mortis was studied in rats. Our experiments showed that: (1) Electrocution hastens the onset of rigor mortis. After an electrocution of 90 s, complete rigor develops as early as 1 h post-mortem (p.m.), compared with 5 h p.m. in the controls. (2) Electrocution hastens the passing of rigor mortis. After an electrocution of 90 s, the first significant decrease occurs at 3 h p.m. (8 h p.m. in the controls). (3) These modifications in rigor mortis evolution are less pronounced in the limbs not directly touched by the electric current. (4) In cases of post-mortem electrocution, the changes are slightly less pronounced, the resistance is higher and the absorbed energy is lower compared with the ante-mortem electrocution cases. The results are complemented by two practical observations on human electrocution cases.
Stochastic Geometry and Quantum Gravity: Some Rigorous Results
Zessin, H.
The aim of these lectures is a short introduction into some recent developments in stochastic geometry which have one of its origins in simplicial gravity theory (see Regge Nuovo Cimento 19: 558-571, 1961). The aim is to define and construct rigorously point processes on spaces of Euclidean simplices in such a way that the configurations of these simplices are simplicial complexes. The main interest then is concentrated on their curvature properties. We illustrate certain basic ideas from a mathematical point of view. An excellent representation of this area can be found in Schneider and Weil (Stochastic and Integral Geometry, Springer, Berlin, 2008. German edition: Stochastische Geometrie, Teubner, 2000). In Ambjørn et al. (Quantum Geometry Cambridge University Press, Cambridge, 1997) you find a beautiful account from the physical point of view. More recent developments in this direction can be found in Ambjørn et al. ("Quantum gravity as sum over spacetimes", Lect. Notes Phys. 807. Springer, Heidelberg, 2010). After an informal axiomatic introduction into the conceptual foundations of Regge's approach the first lecture recalls the concepts and notations used. It presents the fundamental zero-infinity law of stochastic geometry and the construction of cluster processes based on it. The second lecture presents the main mathematical object, i.e. Poisson-Delaunay surfaces possessing an intrinsic random metric structure. The third and fourth lectures discuss their ergodic behaviour and present the two-dimensional Regge model of pure simplicial quantum gravity. We terminate with the formulation of basic open problems. Proofs are given in detail only in a few cases. In general the main ideas are developed. Sufficiently complete references are given.
Study Design Rigor in Animal-Experimental Research Published in Anesthesia Journals.
Hoerauf, Janine M; Moss, Angela F; Fernandez-Bustamante, Ana; Bartels, Karsten
2018-01-01
Lack of reproducibility of preclinical studies has been identified as an impediment for translation of basic mechanistic research into effective clinical therapies. Indeed, the National Institutes of Health has revised its grant application process to require more rigorous study design, including sample size calculations, blinding procedures, and randomization steps. We hypothesized that the reporting of such metrics of study design rigor has increased over time for animal-experimental research published in anesthesia journals. PubMed was searched for animal-experimental studies published in 2005, 2010, and 2015 in primarily English-language anesthesia journals. A total of 1466 publications were graded on the performance of sample size estimation, randomization, and blinding. Cochran-Armitage test was used to assess linear trends over time for the primary outcome of whether or not a metric was reported. Interrater agreement for each of the 3 metrics (power, randomization, and blinding) was assessed using the weighted κ coefficient in a 10% random sample of articles rerated by a second investigator blinded to the ratings of the first investigator. A total of 1466 manuscripts were analyzed. Reporting for all 3 metrics of experimental design rigor increased over time (2005 to 2010 to 2015): for power analysis, from 5% (27/516), to 12% (59/485), to 17% (77/465); for randomization, from 41% (213/516), to 50% (243/485), to 54% (253/465); and for blinding, from 26% (135/516), to 38% (186/485), to 47% (217/465). The weighted κ coefficients and 98.3% confidence interval indicate almost perfect agreement between the 2 raters beyond that which occurs by chance alone (power, 0.93 [0.85, 1.0], randomization, 0.91 [0.85, 0.98], and blinding, 0.90 [0.84, 0.96]). Our hypothesis that reported metrics of rigor in animal-experimental studies in anesthesia journals have increased during the past decade was confirmed. More consistent reporting, or explicit justification for absence
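The linear-trend analysis reported above can be reproduced from the published counts. The sketch below implements the standard Cochran-Armitage test statistic with default integer scores for the three years (an assumption, since the abstract does not state which scores were used), applied to the power-analysis proportions 27/516, 59/485, 77/465.

```python
from math import sqrt, erfc

def cochran_armitage(successes, totals, scores=None):
    """Cochran-Armitage test for a linear trend in proportions across
    ordered groups. Returns (z, two_sided_p) under the normal approximation."""
    k = len(successes)
    scores = scores or list(range(k))
    N = sum(totals)
    pbar = sum(successes) / N
    # Trend statistic: score-weighted deviation from the pooled proportion
    t = sum(s * (r - n * pbar) for s, r, n in zip(scores, successes, totals))
    var = pbar * (1 - pbar) * (
        sum(n * s * s for s, n in zip(scores, totals))
        - sum(n * s for s, n in zip(scores, totals)) ** 2 / N
    )
    z = t / sqrt(var)
    return z, erfc(abs(z) / sqrt(2))  # two-sided normal p-value

# Reported power-analysis counts for 2005, 2010, 2015: 27/516, 59/485, 77/465
z, p = cochran_armitage([27, 59, 77], [516, 485, 465])
print(round(z, 2), p < 0.001)
```

The strongly positive z confirms the reported increasing trend in power-analysis reporting over the decade.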
Monitoring muscle optical scattering properties during rigor mortis
Xia, J.; Ranasinghesagara, J.; Ku, C. W.; Yao, G.
2007-09-01
The sarcomere is the fundamental functional unit for force generation in skeletal muscle. Sarcomere structure is also an important factor affecting the eating quality of muscle food, i.e. meat. The sarcomere structure is altered significantly during rigor mortis, the critical stage in transforming muscle into meat. In this paper, we investigated optical scattering changes during the rigor process in Sternomandibularis muscles. The measured optical scattering parameters were analyzed along with the simultaneously measured passive tension, pH value, and histology. We found that the temporal changes of optical scattering, passive tension, pH value and fiber microstructure were closely correlated during the rigor process. These results suggest that sarcomere structure changes during rigor mortis can be monitored and characterized by optical scattering, which may find practical application in predicting meat quality.
Recent Development in Rigorous Computational Methods in Dynamical Systems
Arai, Zin; Kokubu, Hiroshi; Pilarczyk, Paweł
2009-01-01
We highlight selected results of recent development in the area of rigorous computations which use interval arithmetic to analyse dynamical systems. We describe general ideas and selected details of different ways of approach and we provide specific sample applications to illustrate the effectiveness of these methods. The emphasis is put on a topological approach, which combined with rigorous calculations provides a broad range of new methods that yield mathematically rel...
Estimation of the time since death--reconsidering the re-establishment of rigor mortis.
Anders, Sven; Kunz, Michaela; Gehl, Axel; Sehner, Susanne; Raupach, Tobias; Beck-Bornholdt, Hans-Peter
2013-01-01
In forensic medicine, there is an undefined data background for the phenomenon of re-establishment of rigor mortis after mechanical loosening, a method used in establishing time since death in forensic casework that is thought to occur up to 8 h post-mortem. Nevertheless, the method is widely described in textbooks on forensic medicine. We examined 314 joints (elbow and knee) of 79 deceased at defined time points up to 21 h post-mortem (hpm). Data were analysed using a random intercept model. Here, we show that re-establishment occurred in 38.5% of joints at 7.5 to 19 hpm. Therefore, the maximum time span for the re-establishment of rigor mortis appears to be 2.5-fold longer than thought so far. These findings have major impact on the estimation of time since death in forensic casework.
Ant Colony Algorithm and Simulation for Robust Airport Gate Assignment
Directory of Open Access Journals (Sweden)
Hui Zhao
2014-01-01
Full Text Available Airport gate assignment is a core task in airport ground operations. Because the departure and arrival times of flights may be influenced by many random factors, an airport gate assignment scheme may encounter gate conflicts and other problems. This paper aims at finding a robust solution to the airport gate assignment problem. A mixed integer model is proposed to formulate the problem, and an ant colony algorithm is designed to solve the model. Simulation results show that, with robustness taken into account, the resistance of the airport gate assignment scheme to disturbances is much improved.
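The abstract does not detail the model, but the feasibility issue it addresses, gate conflicts under uncertain timings, can be illustrated with a simple greedy assignment that enforces a buffer between consecutive flights at a gate. This is a hypothetical stand-in for the paper's robustness criterion, not its mixed integer model or ant colony algorithm.

```python
def assign_gates(flights, n_gates, buffer=15):
    """Greedy gate assignment for flights given as (arrival, departure)
    minutes: a gate is reusable only once the previous occupant has been
    gone for `buffer` minutes, a crude proxy for delay robustness.
    Returns one gate index per flight (None = no gate available)."""
    free_at = [float("-inf")] * n_gates   # earliest time each gate is reusable
    plan = []
    for arr, dep in sorted(flights):
        gate = next((g for g in range(n_gates) if free_at[g] <= arr), None)
        if gate is not None:
            free_at[gate] = dep + buffer
        plan.append(gate)
    return plan

flights = [(0, 45), (10, 70), (30, 90), (65, 120)]
plan = assign_gates(flights, n_gates=2)
print(plan)  # [0, 1, None, 0] — the third flight conflicts at both gates
```

A robust scheme would choose buffers (or gate orderings) so that such `None` conflicts remain unlikely under realistic delay distributions.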
Tenderness of pre- and post rigor lamb longissimus muscle.
Geesink, Geert; Sujang, Sadi; Koohmaraie, Mohammad
2011-08-01
Lamb longissimus muscle (n=6) sections were cooked at different times post mortem (prerigor, at rigor, 1 day p.m., and 7 days p.m.) using two cooking methods. Using a boiling water bath, samples were either cooked to a core temperature of 70 °C or boiled for 3 h. The latter method was meant to reflect the traditional cooking method employed in countries where preparation of prerigor meat is practiced. The time post mortem at which the meat was prepared had a large effect on the tenderness (shear force) of the meat (P < …). Cooking prerigor and at-rigor meat to 70 °C resulted in higher shear force values than their post-rigor counterparts at 1 and 7 days p.m. (9.4 and 9.6 vs. 7.2 and 3.7 kg, respectively). The differences in tenderness between the treatment groups could be largely explained by a difference in the contraction status of the meat after cooking and the effect of ageing on tenderness. Cooking pre- and at-rigor meat resulted in severe muscle contraction, as evidenced by the differences in sarcomere length of the cooked samples. Mean sarcomere lengths in the pre- and at-rigor samples ranged from 1.05 to 1.20 μm; the mean sarcomere length in the post-rigor samples was 1.44 μm. Cooking for 3 h at 100 °C did improve the tenderness of pre- and at-rigor prepared meat compared to cooking to 70 °C, but not to the extent that ageing did. It is concluded that additional intervention methods are needed to improve the tenderness of prerigor cooked meat.
Incorporating breeding abundance into spatial assignments on continuous surfaces.
Rushing, Clark S; Marra, Peter P; Studds, Colin E
2017-06-01
Determining the geographic connections between breeding and nonbreeding populations, termed migratory connectivity, is critical to advancing our understanding of the ecology and conservation of migratory species. Assignment models based on stable isotopes have historically been an important tool for studying the migratory connectivity of small-bodied species, but the low resolution of these assignments has generated interest in combining isotopes with other sources of information. Abundance is one of the most appealing data sources to include in isotope-based assignments, but there are currently no statistical methods or guidelines for optimizing the contribution of stable isotopes and abundance for inferring migratory connectivity. Using known-origin stable-hydrogen isotope samples of six Neotropical migratory bird species, we rigorously assessed the performance of assignment models that differentially weight the contribution of the isotope and abundance data. For two species with adequate sample sizes, we used Pareto optimality to determine the set of models that simultaneously minimized both assignment error rate and assignment area. We then assessed the ability of the top models from these two species to improve assignments of the remaining four species compared to assignments based on isotopes alone. We show that the increased precision of models that include abundance is often offset by a large increase in assignment error. However, models that optimally weight the abundance data relative to the isotope data can result in higher precision and, in some cases, lower error than models based on isotopes alone. The top models, however, depended on the distribution of relative breeding abundance, with patchier distributions requiring stronger downweighting of abundance, and we present general guidelines for future studies. These results confirm that breeding abundance can be an important source of information for studies investigating broad-scale movements of
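For two objectives, the Pareto-optimality step described above reduces to filtering out dominated candidates. A minimal sketch, using hypothetical (error rate, relative assignment area) pairs for candidate isotope/abundance weightings:

```python
def pareto_front(points):
    """Keep the points not dominated by any other point, where both
    coordinates (here: assignment error rate, assignment area) are
    to be minimized."""
    return [p for p in points
            if not any(q != p and q[0] <= p[0] and q[1] <= p[1] for q in points)]

# Hypothetical (error rate, relative assignment area) per candidate weighting
models = [(0.10, 0.90), (0.15, 0.55), (0.25, 0.40), (0.20, 0.60), (0.30, 0.35)]
front = pareto_front(models)
print(front)
```

The model at (0.20, 0.60) drops out because (0.15, 0.55) beats it on both error and area; the remaining models trade precision against error along the front.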
Estimation of the breaking of rigor mortis by myotonometry.
Vain, A; Kauppila, R; Vuori, E
1996-05-31
Myotonometry was used to detect the breaking of rigor mortis. The myotonometer is a new instrument that measures the decaying oscillations of a muscle after a brief mechanical impact. The method gives two numerical parameters for rigor mortis, namely the period and the decrement of the oscillations, both of which depend on the time elapsed after death. When rigor mortis was broken by lengthening the muscle, both the oscillation period and the decrement decreased, whereas shortening the muscle caused the opposite changes. Fourteen hours after breaking, the stiffness characteristics (oscillation periods) of the right and left m. biceps brachii had become similar. However, the values for the decrement of the muscle, reflecting the dissipation of mechanical energy, maintained their differences.
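The two myotonometric parameters, oscillation period and logarithmic decrement, can be illustrated on synthetic data. The sketch below estimates both from successive positive peaks of a simulated decaying oscillation; the instrument's actual signal processing is not described in the abstract, so frequency, damping and sampling rate here are illustrative.

```python
from math import exp, sin, pi, log

def damped_wave(freq_hz, damping, t):
    """Displacement of an exponentially decaying oscillation at time t (s)."""
    return exp(-damping * t) * sin(2 * pi * freq_hz * t)

def period_and_decrement(samples, dt):
    """Estimate the oscillation period and logarithmic decrement from the
    first two positive peaks of a sampled decaying oscillation."""
    peaks = [(i * dt, y) for i, y in enumerate(samples)
             if 0 < i < len(samples) - 1
             and samples[i - 1] < y > samples[i + 1] and y > 0]
    (t1, a1), (t2, a2) = peaks[0], peaks[1]
    return t2 - t1, log(a1 / a2)

dt = 0.0005                                   # 2 kHz sampling
samples = [damped_wave(20.0, 8.0, i * dt) for i in range(400)]
period, decrement = period_and_decrement(samples, dt)
print(round(period, 3), round(decrement, 2))  # period 0.05 s, decrement 0.4
```

For a linearly damped oscillation the decrement equals damping × period, so a stiffer, less dissipative muscle shows a shorter period and smaller decrement.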
Physiological studies of muscle rigor mortis in the fowl
International Nuclear Information System (INIS)
Nakahira, S.; Kaneko, K.; Tanaka, K.
1990-01-01
A simple system was developed for continuous measurement of muscle contraction during rigor mortis. Longitudinal muscle strips dissected from the Peroneus Longus were suspended in a plastic tube containing liquid paraffin. Mechanical activity was transmitted to a strain-gauge transducer connected to a potentiometric pen recorder. At the onset of measurement a load of 1.2 g was applied to the muscle strip. This model was used to study the muscle response to various treatments during rigor mortis. All measurements were carried out under anaerobic conditions at 17°C, unless otherwise stated. 1. The present system was found to be quite useful for continuous measurement of the course of muscle rigor. 2. Muscle contraction under the anaerobic condition at 17°C reached a peak about 2 hours after the onset of measurement and thereafter relaxed at a slow rate. In contrast, the aerobic condition under high humidity resulted in a strong rigor, about three times stronger than that in the anaerobic condition. 3. Ultrasonic treatment (37,000-47,000 Hz) at 25°C for 10 minutes resulted in a moderate muscle rigor. 4. Treatment of the muscle strip with 2 mM EGTA at 30°C for 30 minutes led to relaxation of the muscle. 5. Muscle from birds killed during anesthesia with pentobarbital sodium showed a slow rate of rigor, whereas birds killed one day after hypophysectomy showed a quick muscle rigor, as seen in intact controls. 6. A slight muscle rigor was observed when the muscle strip was placed in a refrigerator at 0°C for 18.5 hours and thereafter the temperature was kept at 17°C. (author)
Contact replacement for NMR resonance assignment.
Xiong, Fei; Pandurangan, Gopal; Bailey-Kellogg, Chris
2008-07-01
Complementing its traditional role in structural studies of proteins, nuclear magnetic resonance (NMR) spectroscopy is playing an increasingly important role in functional studies. NMR dynamics experiments characterize motions involved in target recognition, ligand binding, etc., while NMR chemical shift perturbation experiments identify and localize protein-protein and protein-ligand interactions. The key bottleneck in these studies is to determine the backbone resonance assignment, which allows spectral peaks to be mapped to specific atoms. This article develops a novel approach to address that bottleneck, exploiting an available X-ray structure or homology model to assign the entire backbone from a set of relatively fast and cheap NMR experiments. We formulate contact replacement for resonance assignment as the problem of computing correspondences between a contact graph representing the structure and an NMR graph representing the data; the NMR graph is a significantly corrupted, ambiguous version of the contact graph. We first show that by combining connectivity and amino acid type information, and exploiting the random structure of the noise, one can provably determine unique correspondences in polynomial time with high probability, even in the presence of significant noise (a constant number of noisy edges per vertex). We then detail an efficient randomized algorithm and show that, over a variety of experimental and synthetic datasets, it is robust to typical levels of structural variation (1-2 AA), noise (250-600%) and missing data (10-40%). Our algorithm achieves very good overall assignment accuracy, above 80% in alpha-helices, 70% in beta-sheets and 60% in loop regions. Our contact replacement algorithm is implemented in platform-independent Python code. The software can be freely obtained for academic use by request from the authors.
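The graph-correspondence formulation above can be illustrated on a toy instance. The brute-force search below is not the paper's polynomial-time randomized algorithm, and it assumes a noise-free NMR graph; it simply enumerates mappings of spin systems onto residues that preserve amino-acid type and connectivity. All sequences and edges are invented for illustration.

```python
from itertools import permutations

def assign_spins(residue_types, contact_edges, spin_types, spin_edges):
    """Toy contact replacement: enumerate all mappings of NMR spin systems
    onto residues and keep those preserving amino-acid type and graph
    connectivity. Real instances are noisy and far too large for this."""
    edges = {frozenset(e) for e in contact_edges}
    n = len(residue_types)
    return [perm for perm in permutations(range(n))
            if all(spin_types[s] == residue_types[perm[s]] for s in range(n))
            and all(frozenset((perm[a], perm[b])) in edges for a, b in spin_edges)]

# Toy 4-residue chain A-G-A-L with one extra long-range contact (0, 2)
residue_types = ["A", "G", "A", "L"]
contact_edges = [(0, 1), (1, 2), (2, 3), (0, 2)]
# Spin systems observed in scrambled order, with matching adjacency data
spin_types = ["G", "A", "L", "A"]
spin_edges = [(1, 0), (0, 3), (3, 2), (1, 3)]
mappings = assign_spins(residue_types, contact_edges, spin_types, spin_edges)
print(mappings)
```

Even though the two alanines are interchangeable by type alone, the edge constraints single out a unique mapping, which is exactly the disambiguating role connectivity plays in the paper.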
Supersymmetry and the Parisi-Sourlas dimensional reduction: A rigorous proof
International Nuclear Information System (INIS)
Klein, A.; Landau, L.J.; Perez, J.F.
1984-01-01
Functional integrals that are formally related to the average correlation functions of a classical field theory in the presence of random external sources are given a rigorous meaning. Their dimensional reduction to the Schwinger functions of the corresponding quantum field theory in two fewer dimensions is proven. This is done by reexpressing those functional integrals as expectations of a supersymmetric field theory. The Parisi-Sourlas dimensional reduction of a supersymmetric field theory to a usual quantum field theory in two fewer dimensions is proven. (orig.)
Game theory and traffic assignment.
2013-09-01
Traffic assignment is used to determine the number of users on roadway links in a network. While this problem has been widely studied in the transportation literature, its use of the concept of equilibrium has attracted considerable interest in the f...
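The equilibrium concept referred to above is Wardrop's user equilibrium: no driver can reduce travel time by unilaterally switching routes. A minimal two-route example with hypothetical linear link costs solves for the flow split at which both used routes have equal travel time:

```python
def two_route_equilibrium(demand, cost_a, cost_b):
    """Wardrop user equilibrium on two parallel routes with linear travel
    times t(x) = a + b*x: every used route has equal travel time."""
    a1, b1 = cost_a
    a2, b2 = cost_b
    x = (a2 - a1 + b2 * demand) / (b1 + b2)  # solves a1+b1*x = a2+b2*(d-x)
    x = min(max(x, 0.0), demand)             # clamp if one route is unused
    return x, demand - x

# Route A: 10 + 0.01x min, Route B: 15 + 0.005x min, 1000 drivers
fa, fb = two_route_equilibrium(1000, (10, 0.01), (15, 0.005))
print(round(fa), round(fb), round(10 + 0.01 * fa, 1))
```

Realistic assignment uses nonlinear volume-delay functions and iterative algorithms over whole networks, but the equal-travel-time condition is the same.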
Reconciling the Rigor-Relevance Dilemma in Intellectual Capital Research
Andriessen, Daniel
2004-01-01
This paper raises the issue of research methodology for intellectual capital and other types of management research by focusing on the dilemma of rigour versus relevance. The more traditional explanatory approach to research often leads to rigorous results that are not of much help to solve practical problems. This paper describes an alternative…
Paper 3: Content and Rigor of Algebra Credit Recovery Courses
Walters, Kirk; Stachel, Suzanne
2014-01-01
This paper describes the content, organization and rigor of the face-to-face (f2f) and online summer algebra courses that were delivered in summers 2011 and 2012. Examining the content of both types of courses is important because research suggests that algebra courses with certain features may be better than others in promoting success for struggling students.…
A rigorous treatment of uncertainty quantification for Silicon damage metrics
International Nuclear Information System (INIS)
Griffin, P.
2016-01-01
This report summarizes the contributions made by Sandia National Laboratories in support of the International Atomic Energy Agency (IAEA) Nuclear Data Section (NDS) Technical Meeting (TM) on Nuclear Reaction Data and Uncertainties for Radiation Damage. The work focused on a rigorous treatment of the uncertainties affecting the characterization of displacement damage in silicon semiconductors. (author)
Effects of post mortem temperature on rigor tension, shortening and ...
African Journals Online (AJOL)
Fully developed rigor mortis in muscle is characterised by maximum loss of extensibility. The course of post mortem changes in ostrich muscle was studied by following isometric tension, shortening and change in pH during the first 24 h post mortem within muscle strips from the muscularis gastrocnemius, pars interna at ...
Characterization of rigor mortis of longissimus dorsi and triceps ...
African Journals Online (AJOL)
24 h) of the longissimus dorsi (LD) and triceps brachii (TB) muscles as well as the shear force (meat tenderness) and colour were evaluated, aiming at characterizing the rigor mortis in the meat during industrial processing. Data statistic treatment demonstrated that carcass temperature and pH decreased gradually during ...
Rigor, vigor, and the study of health disparities.
Adler, Nancy; Bush, Nicole R; Pantell, Matthew S
2012-10-16
Health disparities research spans multiple fields and methods and documents strong links between social disadvantage and poor health. Associations between socioeconomic status (SES) and health are often taken as evidence for the causal impact of SES on health, but alternative explanations, including the impact of health on SES, are plausible. Studies showing the influence of parents' SES on their children's health provide evidence for a causal pathway from SES to health, but have limitations. Health disparities researchers face tradeoffs between "rigor" and "vigor" in designing studies that demonstrate how social disadvantage becomes biologically embedded and results in poorer health. Rigorous designs aim to maximize precision in the measurement of SES and health outcomes through methods that provide the greatest control over temporal ordering and causal direction. To achieve precision, many studies use a single SES predictor and single disease. However, doing so oversimplifies the multifaceted, entwined nature of social disadvantage and may overestimate the impact of that one variable and underestimate the true impact of social disadvantage on health. In addition, SES effects on overall health and functioning are likely to be greater than effects on any one disease. Vigorous designs aim to capture this complexity and maximize ecological validity through more complete assessment of social disadvantage and health status, but may provide less-compelling evidence of causality. Newer approaches to both measurement and analysis may enable enhanced vigor as well as rigor. Incorporating both rigor and vigor into studies will provide a fuller understanding of the causes of health disparities.
A rigorous proof for the Landauer-Büttiker formula
DEFF Research Database (Denmark)
Cornean, Horia Decebal; Jensen, Arne; Moldoveanu, V.
Recently, Avron et al. shed new light on the question of quantum transport in mesoscopic samples coupled to particle reservoirs by semi-infinite leads. They rigorously treat the case when the sample undergoes an adiabatic evolution, thus generating a current through the leads, and prove the so-called…
Rigorous simulation: a tool to enhance decision making
Energy Technology Data Exchange (ETDEWEB)
Neiva, Raquel; Larson, Mel; Baks, Arjan [KBC Advanced Technologies plc, Surrey (United Kingdom)
2012-07-01
The world refining industries continue to be challenged by population growth (increased demand), regional market changes and the pressure of regulatory requirements to operate a 'green' refinery. Environmental regulations are reducing the value and use of heavy fuel oils and leading refiners to convert more of the heavier products, or even heavier crudes, into lighter products while meeting increasingly stringent transportation fuel specifications. As a result, action is required to establish a sustainable advantage for future success. Rigorous simulation provides a key advantage, improving the timing and efficient use of capital investment and maximizing profitability. Sustainably maximizing profit through rigorous modeling is achieved through enhanced performance monitoring and improved Linear Programme (LP) model accuracy. This paper contains examples of both, and the combination of the two increases overall rates of return. As refiners consider optimizing existing assets and expanding projects, the process used to achieve these goals is key to successful profit improvement. The benefit of rigorous kinetic simulation with detailed fractionation is that it allows existing asset utilization to be optimized while focusing capital investment on the new unit(s), thereby optimizing the overall strategic plan and return on investment. Monitoring of individual process units serves as a mechanism for validating and optimizing plant performance; unit monitoring is important to rectify poor performance and increase profitability. A good LP relies upon the accuracy of the data used to generate the LP sub-model data. The value of rigorous unit monitoring is that the results are heat- and mass-balanced consistently and are unique to a refiner's unit/refinery. With the improved match of refinery operation, rigorous simulation models will capture more accurately the nonlinearity of those process units and therefore provide correct
Unmet Need: Improving mHealth Evaluation Rigor to Build the Evidence Base.
Mookherji, Sangeeta; Mehl, Garrett; Kaonga, Nadi; Mechael, Patricia
2015-01-01
mHealth, the use of mobile technologies for health, is a growing element of health system activity globally, but evaluation of those activities remains scant, an important knowledge gap for advancing mHealth activities. In 2010, the World Health Organization and Columbia University implemented a small-scale survey to generate preliminary data on evaluation activities used by mHealth initiatives. The authors describe self-reported data from 69 projects in 29 countries. The majority (74%) reported some sort of evaluation activity, primarily nonexperimental in design (62%). The authors developed a 6-point scale of evaluation rigor comprising information on use of comparison groups, sample size calculation, data collection timing, and randomization. The mean score was low (2.4); about half (47%) were conducting evaluations meeting a minimum threshold (4+) of rigor, indicating use of a comparison group, while less than 20% had randomized the mHealth intervention. The authors were unable to assess whether the rigor score was appropriate for the type of mHealth activity being evaluated. What was clear was that although most data came from pilot mHealth projects aimed at scale-up, few had designed evaluations that would support crucial decisions on whether and how to scale up. Whether the mHealth activity is a strategy to improve health or a tool for achieving intermediate outcomes that should lead to better health, mHealth evaluations must be improved to generate robust evidence for cost-effectiveness assessment and to allow for accurate identification of the contribution of mHealth initiatives to health systems strengthening and their impact on actual health outcomes.
Einstein's Theory A Rigorous Introduction for the Mathematically Untrained
Grøn, Øyvind
2011-01-01
This book provides an introduction to the theory of relativity and the mathematics used in its formulation. Three elements of the book make it stand apart from previously published books on the theory of relativity. First, the book starts at a lower mathematical level than standard texts, yet develops the tensor calculus to sufficient maturity to make it possible to give detailed calculations of relativistic predictions of practical experiments. Self-contained introductions are given to, for example, vector calculus, differential calculus, and integration. Second, intermediate calculations have been included, making it possible for the non-technical reader to follow step-by-step calculations. Third, the conceptual development is gradual and rigorous in order to provide the inexperienced reader with a philosophically satisfying understanding of the theory. Einstein's Theory: A Rigorous Introduction for the Mathematically Untrained aims to provide the reader with a sound conceptual understanding of both the special and genera...
Rigor mortis in an unusual position: Forensic considerations.
D'Souza, Deepak H; Harish, S; Rajesh, M; Kiran, J
2011-07-01
We report a case in which the dead body was found with rigor mortis in an unusual position: lying on its back with the limbs raised, defying gravity. The direction of the salivary stains on the face also defied gravity. We opined that the scene where the body was found was unlikely to be the place of death. The clues pointed to a homicidal offence and an attempt to destroy the evidence. The forensic use of 'rigor mortis in an unusual position' lies in furthering the investigation and in the scientific confirmation of two facts: that the scene of death (occurrence) differs from the scene of disposal of the body, and the time gap between the two.
Some rigorous results concerning spectral theory for ideal MHD
International Nuclear Information System (INIS)
Laurence, P.
1986-01-01
Spectral theory for linear ideal MHD is laid on a firm foundation by defining appropriate function spaces for the operators associated with both the first- and second-order (in time and space) partial differential operators. Thus, it is rigorously established that a self-adjoint extension of F(xi) exists. It is shown that the operator L associated with the first-order formulation satisfies the conditions of the Hille--Yosida theorem. A foundation is laid thereby within which the domains associated with the first- and second-order formulations can be compared. This allows future work in a rigorous setting that will clarify the differences (in the two formulations) between the structure of the generalized eigenspaces corresponding to the marginal point of the spectrum ω = 0
Some rigorous results concerning spectral theory for ideal MHD
International Nuclear Information System (INIS)
Laurence, P.
1985-05-01
Spectral theory for linear ideal MHD is laid on a firm foundation by defining appropriate function spaces for the operators associated with both the first and second order (in time and space) partial differential operators. Thus, it is rigorously established that a self-adjoint extension of F(xi) exists. It is shown that the operator L associated with the first order formulation satisfies the conditions of the Hille-Yosida theorem. A foundation is laid thereby within which the domains associated with the first and second order formulations can be compared. This allows future work in a rigorous setting that will clarify the differences (in the two formulations) between the structure of the generalized eigenspaces corresponding to the marginal point of the spectrum ω = 0
Rigorous results on measuring the quark charge below color threshold
International Nuclear Information System (INIS)
Lipkin, H.J.
1979-01-01
Rigorous theorems are presented showing that contributions from a color nonsinglet component of the current to matrix elements of a second order electromagnetic transition are suppressed by factors inversely proportional to the energy of the color threshold. Parton models which obtain matrix elements proportional to the color average of the square of the quark charge are shown to neglect terms of the same order of magnitude as terms kept. (author)
A Rigorous Methodology for Analyzing and Designing Plug-Ins
DEFF Research Database (Denmark)
Fasie, Marieta V.; Haxthausen, Anne Elisabeth; Kiniry, Joseph
2013-01-01
. This paper addresses these problems by describing a rigorous methodology for analyzing and designing plug-ins. The methodology is grounded in the Extended Business Object Notation (EBON) and covers informal analysis and design of features, GUI, actions, and scenarios, formal architecture design, including...... behavioral semantics, and validation. The methodology is illustrated via a case study whose focus is an Eclipse environment for the RAISE formal method's tool suite....
Striation Patterns of Ox Muscle in Rigor Mortis
Locker, Ronald H.
1959-01-01
Ox muscle in rigor mortis offers a selection of myofibrils fixed at varying degrees of contraction from sarcomere lengths of 3.7 to 0.7 µ. A study of this material by phase contrast and electron microscopy has revealed four distinct successive patterns of contraction, including besides the familiar relaxed and contracture patterns, two intermediate types (2.4 to 1.9 µ, 1.8 to 1.5 µ) not previously well described. PMID:14417790
Rigorous Analysis of a Randomised Number Field Sieve
Lee, Jonathan; Venkatesan, Ramarathnam
2018-01-01
Factorisation of integers $n$ is of number theoretic and cryptographic significance. The Number Field Sieve (NFS), introduced circa 1990, is still the state-of-the-art algorithm, but no rigorous proof that it halts or generates relationships is known. We propose and analyse an explicitly randomised variant. For each $n$, we show that these randomised variants of the NFS and Coppersmith's multiple polynomial sieve find congruences of squares in expected times matching the best-known heuristic e...
Reciprocity relations in transmission electron microscopy: A rigorous derivation.
Krause, Florian F; Rosenauer, Andreas
2017-01-01
A concise derivation of the principle of reciprocity applied to realistic transmission electron microscopy setups is presented, making use of the multislice formalism. The equivalence of images acquired in conventional and scanning mode is thereby rigorously shown. The conditions for the applicability of the found reciprocity relations are discussed. Furthermore, the positions of apertures in relation to the corresponding lenses are considered, a subject which has scarcely been addressed in previous publications. Copyright © 2016 Elsevier Ltd. All rights reserved.
Critical Analysis of Strategies for Determining Rigor in Qualitative Inquiry.
Morse, Janice M
2015-09-01
Criteria for determining the trustworthiness of qualitative research were introduced by Guba and Lincoln in the 1980s when they replaced terminology for achieving rigor, reliability, validity, and generalizability with dependability, credibility, and transferability. Strategies for achieving trustworthiness were also introduced. This landmark contribution to qualitative research remains in use today, with only minor modifications in format. Despite the significance of this contribution over the past four decades, the strategies recommended to achieve trustworthiness have not been critically examined. Recommendations for where, why, and how to use these strategies have not been developed, and how well they achieve their intended goal has not been examined. We do not know, for example, what impact these strategies have on the completed research. In this article, I critique these strategies. I recommend that qualitative researchers return to the terminology of social sciences, using rigor, reliability, validity, and generalizability. I then make recommendations for the appropriate use of the strategies recommended to achieve rigor: prolonged engagement, persistent observation, and thick, rich description; inter-rater reliability, negative case analysis; peer review or debriefing; clarifying researcher bias; member checking; external audits; and triangulation. © The Author(s) 2015.
Effective Homework Assignments. Research Brief
Cooper, Harris
2008-01-01
Perhaps more than any question other than "How much time should students spend doing homework?" parents and educators want to know, "What kinds of homework assignments are most effective?" Clearly, the answers to this question vary according to many factors, especially the developmental level of students and the topic area. Generally, answers are…
New rigorous asymptotic theorems for inverse scattering amplitudes
International Nuclear Information System (INIS)
Lomsadze, Sh.Yu.; Lomsadze, Yu.M.
1984-01-01
The rigorous asymptotic theorems, both of integral and local types, obtained earlier, establishing logarithmic and in some cases even power correlations between the real and imaginary parts of scattering amplitudes F±, are extended to the inverse amplitudes 1/F±. One also succeeds in establishing power correlations of a new type between the real and imaginary parts, both for the amplitudes themselves and for the inverse ones. All the assertions obtained are convenient to test in high-energy experiments where the amplitudes show asymptotic behaviour
Sonoelasticity to monitor mechanical changes during rigor and ageing.
Ayadi, A; Culioli, J; Abouelkaram, S
2007-06-01
We propose the use of sonoelasticity as a non-destructive method to monitor changes in the resistance of muscle fibres, unaffected by connective tissue. Vibrations were applied at low frequency to induce oscillations in soft tissues and an ultrasound transducer was used to detect the motions. The experiments were carried out on the M. biceps femoris muscles of three beef cattle. In addition to the sonoelasticity measurements, the changes in meat during rigor and ageing were followed by measurements of both the mechanical resistance of myofibres and pH. The variations of mechanical resistance and pH were compared to those of the sonoelastic variables (velocity and attenuation) at two frequencies. The relationships between pH and velocity or attenuation, and between the velocity or attenuation and the stress at 20% deformation, were highly correlated. We concluded that sonoelasticity is a non-destructive method that can be used to monitor mechanical changes in muscle fibres during rigor mortis and ageing.
Rigorous quantum limits on monitoring free masses and harmonic oscillators
Roy, S. M.
2018-03-01
There are heuristic arguments proposing that the accuracy of monitoring the position of a free mass m is limited by the standard quantum limit (SQL): σ²(X(t)) ≥ σ²(X(0)) + (t²/m²)σ²(P(0)) ≥ ℏt/m, where σ²(X(t)) and σ²(P(t)) denote variances of the Heisenberg representation position and momentum operators. Yuen [Phys. Rev. Lett. 51, 719 (1983), 10.1103/PhysRevLett.51.719] discovered that there are contractive states for which this result is incorrect. Here I prove universally valid rigorous quantum limits (RQL), viz. rigorous upper and lower bounds on σ²(X(t)) in terms of σ²(X(0)) and σ²(P(0)), given by Eq. (12) for a free mass and by Eq. (36) for an oscillator. I also obtain the maximally contractive and maximally expanding states which saturate the RQL, and use the contractive states to set up an Ozawa-type measurement theory with accuracies respecting the RQL but beating the standard quantum limit. The contractive states for oscillators improve on the Schrödinger coherent states of constant variance and may be useful for gravitational wave detection and optical communication.
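The SQL inequality above can be reconstructed from free evolution, X(t) = X(0) + (t/m)P(0). The sketch below reproduces the standard heuristic step and the loophole Yuen identified; it is a reconstruction of the textbook argument, not the paper's Eq. (12).

```latex
% Free evolution gives, with \Delta A = A - \langle A \rangle,
\sigma^2\big(X(t)\big) = \sigma^2\big(X(0)\big)
  + \frac{t^2}{m^2}\,\sigma^2\big(P(0)\big)
  + \frac{t}{m}\,\big\langle \{\Delta X(0), \Delta P(0)\} \big\rangle .
% If the correlation (anticommutator) term is assumed to vanish,
% the arithmetic--geometric mean inequality and Heisenberg's relation
% \sigma(X)\,\sigma(P) \ge \hbar/2 yield the SQL:
\sigma^2\big(X(0)\big) + \frac{t^2}{m^2}\,\sigma^2\big(P(0)\big)
  \;\ge\; \frac{2t}{m}\,\sigma\big(X(0)\big)\,\sigma\big(P(0)\big)
  \;\ge\; \frac{\hbar t}{m}.
```

Contractive states have a negative correlation term, so the first line can dip below ℏt/m, which is why universally valid upper and lower bounds are needed.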
A rigorous test for a new conceptual model for collisions
International Nuclear Information System (INIS)
Peixoto, E.M.A.; Mu-Tao, L.
1979-01-01
A rigorous theoretical foundation for the previously proposed model is formulated and applied to electron scattering by H2 in the gas phase. A rigorous treatment of the interaction potential between the incident electron and the hydrogen molecule is carried out to calculate differential cross sections for 1 keV electrons, using Glauber's approximation and Wang's molecular wave function for the ground electronic state of H2. Moreover, it is shown for the first time that, when adequately done, the omission of two-center terms does not adversely influence the results of molecular calculations. It is shown that the new model is far superior to the Independent Atom Model (or Independent Particle Model). The accuracy and simplicity of the new model suggest that it may be fruitfully applied to the description of other collision phenomena (e.g., in molecular beam experiments and nuclear physics). A new technique is presented for calculations involving two-center integrals within the framework of the Glauber approximation for scattering. (Author) [pt
Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories
Gallego, Sergi; Neipp, Cristian; Estepa, Luis A.; Ortuño, Manuel; Márquez, Andrés; Francés, Jorge; Pascual, Inmaculada; Beléndez, Augusto
2012-01-01
There is no doubt that the concept of volume holography has led to a great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories, and in particular, the use of a photosensitive medium like a photopolymeric material to record information in all its volume. In this work we analyze the applicability of Kogelnik’s Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik’s theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik’s theory cannot be applied, it is necessary to use a more general Coupled Wave theory (CW) or the Rigorous Coupled Wave theory (RCW). The RCW does not incorporate any approximation and thus, since it is rigorous, permits judging the accuracy of the approximations included in Kogelnik’s and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We have demonstrated the agreement between the predictions of CW and RCW and the validity of Kogelnik’s theory only for gratings with spatial frequencies higher than 500 lines/mm, for the usual values of the refractive index modulation obtained in photopolymers.
Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories
Directory of Open Access Journals (Sweden)
Augusto Beléndez
2012-08-01
Full Text Available There is no doubt that the concept of volume holography has led to a great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories, and in particular, the use of a photosensitive medium like a photopolymeric material to record information in all its volume. In this work we analyze the applicability of Kogelnik’s Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik’s theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik’s theory cannot be applied, it is necessary to use a more general Coupled Wave theory (CW) or the Rigorous Coupled Wave theory (RCW). The RCW does not incorporate any approximation and thus, since it is rigorous, permits judging the accuracy of the approximations included in Kogelnik’s and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We have demonstrated the agreement between the predictions of CW and RCW and the validity of Kogelnik’s theory only for gratings with spatial frequencies higher than 500 lines/mm, for the usual values of the refractive index modulation obtained in photopolymers.
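For a lossless phase transmission grating at Bragg incidence, Kogelnik's theory reduces to a closed-form diffraction efficiency, η = sin²(πn₁d / (λ cosθ)). A minimal sketch of that formula follows; the photopolymer parameter values are illustrative assumptions, not taken from the article.

```python
import math

def kogelnik_efficiency(n1, d, wavelength, theta_bragg):
    """Diffraction efficiency of a lossless phase transmission grating at
    Bragg incidence, per Kogelnik's coupled-wave theory:
    eta = sin^2(nu), with nu = pi * n1 * d / (lambda * cos(theta))."""
    nu = math.pi * n1 * d / (wavelength * math.cos(theta_bragg))
    return math.sin(nu) ** 2

# Illustrative (hypothetical) photopolymer values: index modulation
# n1 = 2.5e-3, grating thickness d = 80 um, wavelength 633 nm,
# Bragg angle 20 degrees inside the medium.
eta = kogelnik_efficiency(2.5e-3, 80e-6, 633e-9, math.radians(20.0))
print(f"Kogelnik efficiency: {eta:.3f}")
```

The RCW comparison in the article is what justifies trusting this closed form above ~500 lines/mm; the formula itself contains no such warning.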
A methodology for the rigorous verification of plasma simulation codes
Riva, Fabio
2016-10-01
The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, a mathematical issue aimed at ensuring that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, aimed at ensuring that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
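The solution-verification step rests on comparing solutions at successive grid refinements. A minimal sketch of the observed-order estimate and Richardson extrapolation follows, applied to a manufactured forward-difference problem rather than to GBS itself (the test problem is our own illustration).

```python
import math

def observed_order(f_h, f_h2, f_h4, r=2.0):
    """Observed order of accuracy from solutions on three grids with
    refinement ratio r (spacings h, h/2, h/4):
    p = log(|f_h - f_h2| / |f_h2 - f_h4|) / log(r)."""
    return math.log(abs(f_h - f_h2) / abs(f_h2 - f_h4)) / math.log(r)

def richardson_extrapolate(f_h2, f_h4, p, r=2.0):
    """Higher-order estimate of the exact value from the two finest grids."""
    return f_h4 + (f_h4 - f_h2) / (r**p - 1.0)

# Manufactured problem: forward-difference derivative of sin(x) at x = 1,
# a first-order scheme, so p should come out close to 1.
def fwd_diff(h, x=1.0):
    return (math.sin(x + h) - math.sin(x)) / h

f1, f2, f4 = fwd_diff(0.1), fwd_diff(0.05), fwd_diff(0.025)
p = observed_order(f1, f2, f4)
print(f"observed order ~ {p:.2f}")
print(f"extrapolated   ~ {richardson_extrapolate(f2, f4, p):.6f}")
print(f"exact cos(1)   = {math.cos(1.0):.6f}")
```

Agreement of the observed order with the scheme's formal order is the acceptance criterion in solution verification; the extrapolated value then bounds the remaining discretization error.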
Krompecher, Thomas; Gilles, André; Brandt-Casadevall, Conception; Mangin, Patrice
2008-04-07
Objective measurements were carried out to study the possible re-establishment of rigor mortis in rats after "breaking" (mechanical solution). Our experiments showed that: (1) cadaveric rigidity can re-establish after breaking; (2) a significant rigidity can reappear if the breaking occurs before the process is complete; (3) the rigidity will be considerably weaker after the breaking; (4) the time course of the intensity does not change in comparison to the controls: the re-establishment begins immediately after the breaking, maximal values are reached at the same time as in the controls, and the course of the resolution is the same as in the controls.
A Statistical Programme Assignment Model
DEFF Research Database (Denmark)
Rosholm, Michael; Staghøj, Jonas; Svarer, Michael
When treatment effects of active labour market programmes are heterogeneous in an observable way across the population, the allocation of the unemployed into different programmes becomes a particularly important issue. In this paper, we present a statistical model designed to improve the present...... duration of unemployment spells may result if a statistical programme assignment model is introduced. We discuss several issues regarding the implementation of such a system, especially the interplay between the statistical model and case workers.
A note on ranking assignments using reoptimization
DEFF Research Database (Denmark)
Pedersen, Christian Roed; Nielsen, L.R.; Andersen, K.A.
2005-01-01
We consider the problem of ranking assignments according to cost in the classical linear assignment problem. An algorithm partitioning the set of possible assignments, as suggested by Murty, is presented where, for each partition, the optimal assignment is calculated using a new reoptimization...
An algorithm for ranking assignments using reoptimization
DEFF Research Database (Denmark)
Pedersen, Christian Roed; Nielsen, Lars Relund; Andersen, Kim Allan
2008-01-01
We consider the problem of ranking assignments according to cost in the classical linear assignment problem. An algorithm partitioning the set of possible assignments, as suggested by Murty, is presented where, for each partition, the optimal assignment is calculated using a new reoptimization...... technique. Computational results for the new algorithm are presented...
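The ranking problem these two papers address can be made concrete with a tiny brute-force version: enumerate every assignment and sort by cost. Murty's partitioning with reoptimization obtains the same ranking without exhaustive enumeration; the sketch below is illustrative only, with a made-up cost matrix, and does not implement the papers' reoptimization technique.

```python
from itertools import permutations

def ranked_assignments(cost, k=3):
    """Return the k cheapest assignments of a square cost matrix as
    (total_cost, column_permutation) pairs. Brute force over all n!
    permutations -- only viable for tiny n, but it makes the ranking idea
    concrete; Murty's method reaches the same ranking by partitioning the
    assignment space and solving one optimal assignment per partition."""
    n = len(cost)
    scored = sorted(
        (sum(cost[i][p[i]] for i in range(n)), p)
        for p in permutations(range(n))
    )
    return scored[:k]

# Hypothetical 3x3 cost matrix.
cost = [[4, 1, 3],
        [2, 0, 5],
        [3, 2, 2]]
for total, assign in ranked_assignments(cost, k=3):
    print(total, assign)   # best first: rows -> columns in 'assign'
```

Each of Murty's partitions fixes some row-column pairs and forbids others, so its optimal assignment can be recomputed cheaply from the parent solution, which is exactly where the reoptimization technique pays off.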
Rigorous approach to the comparison between experiment and theory in Casimir force measurements
International Nuclear Information System (INIS)
Klimchitskaya, G L; Chen, F; Decca, R S; Fischbach, E; Krause, D E; Lopez, D; Mohideen, U; Mostepanenko, V M
2006-01-01
In most experiments on the Casimir force the comparison between measurement data and theory was done using the concept of the root-mean-square deviation, a procedure that has been criticized in the literature. Here we propose a special statistical analysis which should be performed separately for the experimental data and for the results of the theoretical computations. In so doing, the random, systematic and total experimental errors are found as functions of separation, taking into account the distribution laws for each error at 95% confidence. Independently, all theoretical errors are combined to obtain the total theoretical error at the same confidence. Finally, the confidence interval for the differences between theoretical and experimental values is obtained as a function of separation. This rigorous approach is applied to two recent experiments on the Casimir effect
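The end product of the proposed analysis, a confidence interval for the theory-experiment differences at each separation, can be sketched as follows. The quadrature combination of total errors and all numerical values are simplifying assumptions for illustration, not the paper's actual error budget or distribution-law treatment.

```python
import math

def difference_band(exp_err, theo_err):
    """Half-width of a confidence band for (theory - experiment), combining
    independent total errors in quadrature -- a simplified stand-in for the
    separation-by-separation 95%-confidence analysis in the abstract."""
    return math.sqrt(exp_err**2 + theo_err**2)

# Hypothetical data: (separation in nm, theory-minus-experiment difference,
# total experimental error, total theoretical error), arbitrary units.
data = [(200, 0.8, 1.1, 0.6), (300, -0.4, 0.7, 0.4), (500, 0.2, 0.5, 0.3)]

for sep, diff, e_exp, e_theo in data:
    band = difference_band(e_exp, e_theo)
    verdict = "consistent" if abs(diff) <= band else "discrepant"
    print(f"{sep} nm: |diff| = {abs(diff):.2f}, band = {band:.2f} -> {verdict}")
```

The point of the rigorous approach is precisely that experimental and theoretical errors are characterized separately, at a stated confidence, before the difference is judged, rather than lumped into a single root-mean-square figure.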
Study design elements for rigorous quasi-experimental comparative effectiveness research.
Maciejewski, Matthew L; Curtis, Lesley H; Dowd, Bryan
2013-03-01
Quasi-experiments are likely to be the workhorse study design used to generate evidence about the comparative effectiveness of alternative treatments, because of their feasibility, timeliness, affordability and external validity compared with randomized trials. In this review, we outline potential sources of discordance in results between quasi-experiments and experiments, review study design choices that can improve the internal validity of quasi-experiments, and outline innovative data linkage strategies that may be particularly useful in quasi-experimental comparative effectiveness research. There is an urgent need to resolve the debate about the evidentiary value of quasi-experiments since equal consideration of rigorous quasi-experiments will broaden the base of evidence that can be brought to bear in clinical decision-making and governmental policy-making.
Provencher, Steeve; Archer, Stephen L; Ramirez, F Daniel; Hibbert, Benjamin; Paulin, Roxane; Boucherat, Olivier; Lacasse, Yves; Bonnet, Sébastien
2018-03-30
Despite advances in our understanding of the pathophysiology and the management of pulmonary arterial hypertension (PAH), significant therapeutic gaps remain for this devastating disease. Yet, few innovative therapies beyond the traditional pathways of endothelial dysfunction have reached clinical trial phases in PAH. Although there are inherent limitations of the currently available models of PAH, the leaky pipeline of innovative therapies relates, in part, to flawed preclinical research methodology, including lack of rigour in trial design, incomplete invasive hemodynamic assessment, and lack of careful translational studies that replicate randomized controlled trials in humans with attention to adverse effects and benefits. Rigorous methodology should include the use of prespecified eligibility criteria, sample sizes that permit valid statistical analysis, randomization, blinded assessment of standardized outcomes, and transparent reporting of results. Better design and implementation of preclinical studies can minimize inherent flaws in the models of PAH, reduce the risk of bias, and enhance external validity and our ability to distinguish truly promising therapies from many false-positive or overstated leads. Ideally, preclinical studies should use advanced imaging, study several preclinical pulmonary hypertension models, or correlate rodent and human findings and consider the fate of the right ventricle, which is the major determinant of prognosis in human PAH. Although these principles are widely endorsed, empirical evidence suggests that such rigour is often lacking in pulmonary hypertension preclinical research. The present article discusses the pitfalls in the design of preclinical pulmonary hypertension trials and discusses opportunities to create preclinical trials with improved predictive value in guiding early-phase drug development in patients with PAH, which will need support not only from researchers, peer reviewers, and editors but also from
Reframing Rigor: A Modern Look at Challenge and Support in Higher Education
Campbell, Corbin M.; Dortch, Deniece; Burt, Brian A.
2018-01-01
This chapter describes the limitations of the traditional notions of academic rigor in higher education, and brings forth a new form of rigor that has the potential to support student success and equity.
Rigorous force field optimization principles based on statistical distance minimization
Energy Technology Data Exchange (ETDEWEB)
Vlcek, Lukas, E-mail: vlcekl1@ornl.gov [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States); Joint Institute for Computational Sciences, University of Tennessee, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6173 (United States); Chialvo, Ariel A. [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States)
2015-10-14
We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. We exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
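The optimization principle, minimizing a statistical distance between a model and its target, can be illustrated with a toy one-parameter fit. The sketch below uses the squared Hellinger distance (closely related to the statistical/Bhattacharyya distance) and a hypothetical Gaussian target; it is our own illustration, not the authors' force-field implementation.

```python
import math

def hellinger(p, q):
    """Squared Hellinger distance between two discrete distributions, a
    simple measure of statistical distinguishability of model from target."""
    return 0.5 * sum((math.sqrt(a) - math.sqrt(b)) ** 2 for a, b in zip(p, q))

def gaussian_hist(mu, sigma, grid):
    """Normalized Gaussian weights on a fixed grid."""
    w = [math.exp(-0.5 * ((x - mu) / sigma) ** 2) for x in grid]
    z = sum(w)
    return [v / z for v in w]

# Hypothetical target "measurement": a Gaussian with sigma = 1.3.
grid = [i * 0.1 - 5.0 for i in range(101)]
target = gaussian_hist(0.0, 1.3, grid)

# Fit the model's sigma by scanning for minimal statistical distance.
best_sigma = min((s * 0.01 for s in range(50, 300)),
                 key=lambda s: hellinger(gaussian_hist(0.0, s, grid), target))
print(f"recovered sigma ~ {best_sigma:.2f}")
```

Minimizing distinguishability in this sense drives all static measurable properties of the model toward those of the target, which is the convergence property the paper exploits.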
From everyday communicative figurations to rigorous audience news repertoires
DEFF Research Database (Denmark)
Kobbernagel, Christian; Schrøder, Kim Christian
2016-01-01
In the last couple of decades there has been an unprecedented explosion of news media platforms and formats, as a succession of digital and social media have joined the ranks of legacy media. We live in a ‘hybrid media system’ (Chadwick, 2013), in which people build their cross-media news...... repertoires from the ensemble of old and new media available. This article presents an innovative mixed-method approach with considerable explanatory power to the exploration of patterns of news media consumption. This approach tailors Q-methodology in the direction of a qualitative study of news consumption......, in which a card sorting exercise serves to translate the participants’ news media preferences into a form that enables the researcher to undertake a rigorous factor-analytical construction of their news consumption repertoires. This interpretive, factor-analytical procedure, which results in the building...
Student’s rigorous mathematical thinking based on cognitive style
Fitriyani, H.; Khasanah, U.
2017-12-01
The purpose of this research was to determine the rigorous mathematical thinking (RMT) of mathematics education students in solving mathematics problems, in terms of reflective and impulsive cognitive styles. The research used a descriptive qualitative approach. Subjects were four students covering the reflective and impulsive cognitive styles, with one male and one female subject for each style. Data collection techniques were a problem-solving test and interviews. Data were analysed using the Miles and Huberman model: data reduction, data presentation, and conclusion drawing. The results showed that the impulsive male subject used all three levels of cognitive function required for RMT (qualitative thinking, quantitative thinking with precision, and relational thinking) while the other three subjects were only able to use cognitive function at the qualitative thinking level of RMT. The impulsive male subject therefore has better RMT ability than the other three research subjects.
Rigorous Quantum Field Theory A Festschrift for Jacques Bros
Monvel, Anne Boutet; Iagolnitzer, Daniel; Moschella, Ugo
2007-01-01
Jacques Bros has greatly advanced our present understanding of rigorous quantum field theory through numerous fundamental contributions. This book arose from an international symposium held in honour of Jacques Bros on the occasion of his 70th birthday, at the Department of Theoretical Physics of the CEA in Saclay, France. The impact of the work of Jacques Bros is evident in several articles in this book. Quantum fields are regarded as genuine mathematical objects, whose various properties and relevant physical interpretations must be studied in a well-defined mathematical framework. The key topics in this volume include analytic structures of Quantum Field Theory (QFT), renormalization group methods, gauge QFT, stability properties and extension of the axiomatic framework, QFT on models of curved spacetimes, QFT on noncommutative Minkowski spacetime. Contributors: D. Bahns, M. Bertola, R. Brunetti, D. Buchholz, A. Connes, F. Corbetta, S. Doplicher, M. Dubois-Violette, M. Dütsch, H. Epstein, C.J. Fewster, K....
Desarrollo constitucional, legal y jurisprudencia del principio de rigor subsidiario
Directory of Open Access Journals (Sweden)
Germán Eduardo Cifuentes Sandoval
2013-09-01
Full Text Available In Colombia, state environmental administration is the responsibility of the National Environmental System (SINA). SINA is made up of state entities that coexist under a mixed scheme of centralization and decentralization. SINA's decentralization expresses itself at the administrative and territorial levels, and the entities operating under this structure are expected to act in a coordinated way in order to reach the objectives set out in the national environmental policy. To achieve coordinated environmental administration among the entities that make up SINA, Colombian environmental legislation includes three basic principles: 1. the principle of regional harmony ("armonía regional"); 2. the principle of normative gradation ("gradación normativa"); 3. the principle of subsidiary rigor ("rigor subsidiario"). These principles belong to Article 63 of Law 99 of 1993. Although equivalents of the first two can be found in other norms of the Colombian legal system, this is not the case for subsidiary rigor, because its elements are unique to environmental law and do not resemble those of the principle of subsidiarity in Article 288 of the Political Constitution. Subsidiary rigor gives decentralized entities a special power to make the applicable environmental legislation more stringent in defence of the local ecological patrimony. It is an administrative power, founded on decentralized autonomy, that allows such entities to stand in for the regulatory function of the central level, on the condition that the new rules be more demanding than those issued at the central level.
Rigorous patient-prosthesis matching of Perimount Magna aortic bioprosthesis.
Nakamura, Hiromasa; Yamaguchi, Hiroki; Takagaki, Masami; Kadowaki, Tasuku; Nakao, Tatsuya; Amano, Atsushi
2015-03-01
Severe patient-prosthesis mismatch, defined as an effective orifice area index ≤0.65 cm²/m², is associated with poor long-term survival after aortic valve replacement. Reported rates of severe mismatch involving the Perimount Magna aortic bioprosthesis range from 4% to 20% in patients with a small annulus. Between June 2008 and August 2011, 251 patients (mean age 70.5 ± 10.2 years; mean body surface area 1.55 ± 0.19 m²) underwent aortic valve replacement with a Perimount Magna bioprosthesis, with or without concomitant procedures. We performed our procedure with rigorous patient-prosthesis matching to implant a valve appropriately sized to each patient, and carried out annular enlargement when a 19-mm valve did not fit. Bioprosthetic performance was evaluated by transthoracic echocardiography before discharge and at 1 and 2 years after surgery. Overall hospital mortality was 1.6%. Only 5 (2.0%) patients required annular enlargement. The mean follow-up period was 19.1 ± 10.7 months with a 98.4% completion rate. Predischarge data showed a mean effective orifice area index of 1.21 ± 0.20 cm²/m². Moderate mismatch, defined as an effective orifice area index ≤0.85 cm²/m², developed in 4 (1.6%) patients. None developed severe mismatch. Data at 1 and 2 years showed only two cases of moderate mismatch; neither was severe. Rigorous patient-prosthesis matching maximized the performance of the Perimount Magna, and no severe mismatch resulted in this Japanese population of aortic valve replacement patients.
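The mismatch thresholds above reduce to simple arithmetic: the effective orifice area index is the valve's effective orifice area divided by the patient's body surface area. A minimal sketch with hypothetical EOA and BSA values (not figures from the study):

```python
def eoa_index(eoa_cm2, bsa_m2):
    """Effective orifice area index in cm^2/m^2."""
    return eoa_cm2 / bsa_m2

# Hypothetical patient: EOA 1.3 cm^2, BSA 1.55 m^2.
idx = eoa_index(1.3, 1.55)
print(round(idx, 2))
print(idx <= 0.85, idx <= 0.65)  # moderate mismatch? severe mismatch?
```

With these illustrative numbers the index falls between the ≤0.85 (moderate) and ≤0.65 (severe) cut-offs, i.e. moderate but not severe mismatch.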
Using HL7 in hospital staff assignments.
Unluturk, Mehmet S
2014-02-01
Hospital staff assignments are the instructions that allocate the hospital staff members to the hospital beds. Currently, hospital administrators make the assignments without accessing the information regarding the occupancy of the hospital beds and the acuity of the patient. As a result, administrators cannot distinguish between occupied and unoccupied beds, and may therefore assign staff to unoccupied beds. This gives rise to uneven and inefficient staff assignments. In this paper, the hospital admission-discharge-transfer (ADT) system is employed both as a data source and an assignment device to create staff assignments. When the patient data is newly added or modified, the ADT system updates the assignment software client with the relevant data. Based on the relevant data, the assignment software client is able to construct staff assignments in a more efficient way.
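The occupancy-aware allocation described can be sketched in a few lines. The record schema below (bed id, occupancy flag, acuity score) is a hypothetical stand-in for fields parsed from an ADT feed, not the paper's actual data model:

```python
def assign_staff(beds, staff):
    """Allocate staff to occupied beds only, highest acuity first.

    beds:  list of (bed_id, occupied, acuity) tuples -- a hypothetical
           schema standing in for parsed HL7 ADT messages.
    staff: list of staff member names, cycled round-robin.
    """
    occupied = sorted((b for b in beds if b[1]), key=lambda b: -b[2])
    return {bed[0]: staff[i % len(staff)] for i, bed in enumerate(occupied)}

beds = [("B1", True, 3), ("B2", False, 0), ("B3", True, 5)]
assignments = assign_staff(beds, ["alice", "bob"])
print(assignments)
```

The key point mirrors the abstract: unoccupied beds (here "B2") never receive staff, avoiding the wasted assignments the paper describes.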
A method to assign failure rates for piping reliability assessments
International Nuclear Information System (INIS)
Gamble, R.M.; Tagart, S.W. Jr.
1991-01-01
This paper reports on a simplified method that has been developed to assign failure rates that can be used in reliability and risk studies of piping. The method can be applied on a line-by-line basis by identifying line and location specific attributes that can lead to piping unreliability from in-service degradation mechanisms and random events. A survey of service experience for nuclear piping reliability also was performed. The data from this survey provides a basis for identifying in-service failure attributes and assigning failure rates for risk and reliability studies
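The line-by-line attribute approach lends itself to a simple multiplicative sketch: a base failure rate is scaled by a factor for each degradation attribute identified on the line. All names and numbers below are illustrative assumptions, not values from the service-experience survey:

```python
def line_failure_rate(base_rate, attributes, factors):
    """Attribute-based failure-rate assignment (illustrative sketch).

    base_rate:  generic failure rate per line-year (hypothetical).
    attributes: degradation attributes identified for this line.
    factors:    multiplicative scale factor per attribute (hypothetical).
    """
    rate = base_rate
    for a in attributes:
        rate *= factors.get(a, 1.0)  # unknown attributes leave rate unchanged
    return rate

factors = {"vibration": 5.0, "IGSCC": 20.0, "water_hammer": 10.0}
rate = line_failure_rate(1e-6, ["vibration", "IGSCC"], factors)
print(rate)
```

A real assessment would derive both the base rate and the factors from service data, as the survey in the paper provides.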
Emergency cricothyrotomy for trismus caused by instantaneous rigor in cardiac arrest patients.
Lee, Jae Hee; Jung, Koo Young
2012-07-01
Instantaneous rigor as muscle stiffening occurring in the moment of death (or cardiac arrest) can be confused with rigor mortis. If trismus is caused by instantaneous rigor, orotracheal intubation is impossible and a surgical airway should be secured. Here, we report 2 patients who had emergency cricothyrotomy for trismus caused by instantaneous rigor. This case report aims to help physicians understand instantaneous rigor and to emphasize the importance of securing a surgical airway quickly on the occurrence of trismus.
Integrated assignment and path planning
Murphey, Robert A.
2005-11-01
A surge of interest in unmanned systems has exposed many new and challenging research problems across many fields of engineering and mathematics. These systems have the potential of transforming our society by replacing dangerous and dirty jobs with networks of moving machines. This vision is fundamentally separate from the modern view of robotics in that sophisticated behavior is realizable not by increasing individual vehicle complexity, but instead through collaborative teaming that relies on collective perception, abstraction, decision making, and manipulation. Obvious examples where collective robotics will make an impact include planetary exploration, space structure assembly, remote and undersea mining, hazardous material handling and clean-up, and search and rescue. Nonetheless, the phenomenon driving this technology trend is the increasing reliance of the US military on unmanned vehicles, specifically, aircraft. Only a few years ago, following years of resistance to the use of unmanned systems, the military and civilian leadership in the United States reversed itself and have recently demonstrated surprisingly broad acceptance of increasingly pervasive use of unmanned platforms in defense surveillance, and even attack. However, as rapidly as unmanned systems have gained acceptance, the defense research community has discovered the technical pitfalls that lie ahead, especially for operating collective groups of unmanned platforms. A great deal of talent and energy has been devoted to solving these technical problems, which tend to fall into two categories: resource allocation of vehicles to objectives, and path planning of vehicle trajectories. An extensive amount of research has been conducted in each direction, yet, surprisingly, very little work has considered the integrated problem of assignment and path planning. This dissertation presents a framework for studying integrated assignment and path planning and then moves on to suggest an exact
Managing voluntary turnover through challenging assignments
Preenen, P.T.Y.; de Pater, I.E.; van Vianen, A.E.M.; Keijzer, L.
2011-01-01
This study examines employees’ challenging assignments as manageable means to reduce turnover intentions, job search behaviors, and voluntary turnover. Results indicate that challenging assignments are negatively related to turnover intentions and job search behaviors and that these relationships
Memory sparing, fast scattering formalism for rigorous diffraction modeling
Iff, W.; Kämpfe, T.; Jourlin, Y.; Tishchenko, A. V.
2017-07-01
The basics and algorithmic steps of a novel scattering formalism suited for memory sparing and fast electromagnetic calculations are presented. The formalism, called ‘S-vector algorithm’ (by analogy with the known scattering-matrix algorithm), allows the calculation of the collective scattering spectra of individual layered micro-structured scattering objects. A rigorous method of linear complexity is applied to model the scattering at individual layers; here the generalized source method (GSM) resorting to Fourier harmonics as basis functions is used as one possible method of linear complexity. The concatenation of the individual scattering events can be achieved sequentially or in parallel, both having pros and cons. The present development will largely concentrate on a consecutive approach based on the multiple reflection series. The latter will be reformulated into an implicit formalism which will be associated with an iterative solver, resulting in improved convergence. The examples will first refer to 1D grating diffraction for the sake of simplicity and intelligibility, with a final 2D application example.
Rigorous vector wave propagation for arbitrary flat media
Bos, Steven P.; Haffert, Sebastiaan Y.; Keller, Christoph U.
2017-08-01
Precise modelling of the (off-axis) point spread function (PSF) to identify geometrical and polarization aberrations is important for many optical systems. In order to characterise the PSF of the system in all Stokes parameters, an end-to-end simulation of the system has to be performed in which Maxwell's equations are rigorously solved. We present the first results of a python code that we are developing to perform multiscale end-to-end wave propagation simulations that include all relevant physics. Currently we can handle plane-parallel near- and far-field vector diffraction effects of propagating waves in homogeneous isotropic and anisotropic materials, refraction and reflection of flat parallel surfaces, interference effects in thin films and unpolarized light. We show that the code has a numerical precision on the order of 10⁻¹⁶ for non-absorbing isotropic and anisotropic materials. For absorbing materials the precision is on the order of 10⁻⁸. The capabilities of the code are demonstrated by simulating a converging beam reflecting from a flat aluminium mirror at normal incidence.
Dynamics of harmonically-confined systems: Some rigorous results
Energy Technology Data Exchange (ETDEWEB)
Wu, Zhigang, E-mail: zwu@physics.queensu.ca; Zaremba, Eugene, E-mail: zaremba@sparky.phy.queensu.ca
2014-03-15
In this paper we consider the dynamics of harmonically-confined atomic gases. We present various general results which are independent of particle statistics, interatomic interactions and dimensionality. Of particular interest is the response of the system to external perturbations which can be either static or dynamic in nature. We prove an extended Harmonic Potential Theorem which is useful in determining the damping of the centre of mass motion when the system is prepared initially in a highly nonequilibrium state. We also study the response of the gas to a dynamic external potential whose position is made to oscillate sinusoidally in a given direction. We show in this case that either the energy absorption rate or the centre of mass dynamics can serve as a probe of the optical conductivity of the system. -- Highlights: •We derive various rigorous results on the dynamics of harmonically-confined atomic gases. •We derive an extension of the Harmonic Potential Theorem. •We demonstrate the link between the energy absorption rate in a harmonically-confined system and the optical conductivity.
Concrete ensemble Kalman filters with rigorous catastrophic filter divergence.
Kelly, David; Majda, Andrew J; Tong, Xin T
2015-08-25
The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature.
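For context, the perturbed-observation EnKF analysis step that such filters build on can be sketched as follows. This is a generic textbook update, not the authors' forecast model exhibiting catastrophic divergence; all dimensions and values are illustrative:

```python
import numpy as np

def enkf_analysis(ensemble, H, R, y, rng):
    """One perturbed-observation EnKF analysis step.

    ensemble: (n_state, n_members) forecast ensemble.
    H: linear observation operator; R: observation error covariance.
    y: observed vector; rng: numpy random Generator.
    """
    n = ensemble.shape[1]
    A = ensemble - ensemble.mean(axis=1, keepdims=True)  # state anomalies
    Y = H @ A                                            # obs-space anomalies
    Pyy = Y @ Y.T / (n - 1) + R                          # innovation covariance
    Pxy = A @ Y.T / (n - 1)                              # cross covariance
    K = Pxy @ np.linalg.inv(Pyy)                         # Kalman gain
    # Each member assimilates its own perturbed copy of the observation,
    # which keeps the analysis ensemble spread statistically consistent.
    obs = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=n).T
    return ensemble + K @ (obs - H @ ensemble)

rng = np.random.default_rng(0)
prior = rng.normal(0.0, 1.0, size=(2, 500))  # 2 states, 500 members
H = np.array([[1.0, 0.0]])                   # observe state 0 only
R = np.array([[0.1]])
posterior = enkf_analysis(prior, H, R, np.array([2.0]), rng)
print(posterior[0].mean())
```

With a tight observation (variance 0.1 versus prior variance ~1), the analysis mean of the observed state is pulled most of the way toward the observation, as the standard Kalman gain predicts.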
PRO development: rigorous qualitative research as the crucial foundation.
Lasch, Kathryn Eilene; Marquis, Patrick; Vigneux, Marc; Abetz, Linda; Arnould, Benoit; Bayliss, Martha; Crawford, Bruce; Rosa, Kathleen
2010-10-01
Recently published articles have described criteria to assess qualitative research in the health field in general, but very few articles have delineated qualitative methods to be used in the development of Patient-Reported Outcomes (PROs). In fact, how PROs are developed with subject input through focus groups and interviews has been given relatively short shrift in the PRO literature when compared to the plethora of quantitative articles on the psychometric properties of PROs. If documented at all, most PRO validation articles give little for the reader to evaluate the content validity of the measures and the credibility and trustworthiness of the methods used to develop them. Increasingly, however, scientists and authorities want to be assured that PRO items and scales have meaning and relevance to subjects. This article was developed by an international, interdisciplinary group of psychologists, psychometricians, regulatory experts, a physician, and a sociologist. It presents rigorous and appropriate qualitative research methods for developing PROs with content validity. The approach described combines an overarching phenomenological theoretical framework with grounded theory data collection and analysis methods to yield PRO items and scales that have content validity.
Rigorous derivation of porous-media phase-field equations
Schmuck, Markus; Kalliadasis, Serafim
2017-11-01
The evolution of interfaces in Complex heterogeneous Multiphase Systems (CheMSs) plays a fundamental role in a wide range of scientific fields such as thermodynamic modelling of phase transitions and materials science, and serves as a computational tool for interfacial flow studies and material design. Here, we focus on phase-field equations in CheMSs such as porous media. To the best of our knowledge, we present the first rigorous derivation of error estimates for fourth-order, upscaled, nonlinear evolution equations. For CheMSs with heterogeneity ɛ, we obtain the convergence rate ɛ^(1/4), which governs the error between the solution of the new upscaled formulation and the solution of the microscopic phase-field problem. This error behaviour has recently been validated computationally. Due to the wide range of applications of phase-field equations, we expect this upscaled formulation to open new modelling, analytic, and computational perspectives for interfacial transport and phase transformations in CheMSs. This work was supported by EPSRC, UK, through Grant Nos. EP/H034587/1, EP/L027186/1, EP/L025159/1, EP/L020564/1, EP/K008595/1, and EP/P011713/1, and by ERC Advanced Grant No. 247031.
Rigorous time slicing approach to Feynman path integrals
Fujiwara, Daisuke
2017-01-01
This book proves that Feynman's original definition of the path integral actually converges to the fundamental solution of the Schrödinger equation at least in the short term if the potential is differentiable sufficiently many times and its derivatives of order equal to or higher than two are bounded. The semi-classical asymptotic formula up to the second term of the fundamental solution is also proved by a method different from that of Birkhoff. A bound of the remainder term is also proved. The Feynman path integral is a method of quantization using the Lagrangian function, whereas Schrödinger's quantization uses the Hamiltonian function. These two methods are believed to be equivalent. But equivalence is not fully proved mathematically, because, compared with Schrödinger's method, there is still much to be done concerning rigorous mathematical treatment of Feynman's method. Feynman himself defined a path integral as the limit of a sequence of integrals over finite-dimensional spaces which is obtained by...
24 CFR 221.255 - Assignment option.
2010-04-01
... 24 Housing and Urban Development 2 2010-04-01 2010-04-01 false Assignment option. 221.255 Section... Assignment option. (a) A mortgagee holding a mortgage insured pursuant to a conditional or firm commitment issued on or before November 30, 1983 has the option to assign, transfer and deliver to the Commissioner...
24 CFR 221.770 - Assignment option.
2010-04-01
... 24 Housing and Urban Development 2 2010-04-01 2010-04-01 false Assignment option. 221.770 Section... § 221.770 Assignment option. A mortgagee holding a conditional or firm commitment issued on or before... mortgagee's approved underwriter on or before November 30, 1983) has the option to assign, transfer and...
Solving the rectangular assignment problem and applications
Bijsterbosch, J.; Volgenant, A.
2010-01-01
The rectangular assignment problem is a generalization of the linear assignment problem (LAP): one wants to assign a number of persons to a smaller number of jobs, minimizing the total corresponding costs. Applications are, e.g., in the fields of object recognition and scheduling. Further, we show
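As a rough illustration of the problem this paper generalizes, the sketch below solves a tiny rectangular instance by brute force: more persons than jobs, each job assigned a distinct person at minimum total cost. This exponential enumeration is for toy instances only; practical solvers use polynomial-time LAP algorithms such as the shortest-augmenting-path methods the paper builds on:

```python
from itertools import permutations

def rectangular_assignment(cost):
    """Brute-force rectangular assignment: assign each of n jobs (columns)
    a distinct person (row) out of m >= n candidates, minimizing cost."""
    m, n = len(cost), len(cost[0])
    best_cost, best_rows = float("inf"), None
    # Enumerate every ordered choice of n distinct persons for the n jobs.
    for rows in permutations(range(m), n):
        total = sum(cost[r][j] for j, r in enumerate(rows))
        if total < best_cost:
            best_cost, best_rows = total, rows
    return best_cost, best_rows

# 4 persons, 3 jobs: exactly one person stays unassigned.
cost = [[4, 1, 3],
        [2, 0, 5],
        [3, 2, 2],
        [1, 3, 4]]
total, rows = rectangular_assignment(cost)
print(total, rows)  # rows[j] is the person assigned to job j
```

Here the optimum assigns person 3 to job 0, person 1 to job 1, and person 2 to job 2, leaving person 0 unassigned.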
Directory of Open Access Journals (Sweden)
Spiros Pagiatakis
2009-10-01
Full Text Available In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models, are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using an Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at −40 °C, −20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain an optimal navigation solution for MEMS-based INS/GPS integration.
El-Diasty, Mohammed; Pagiatakis, Spiros
2009-01-01
In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at -40 °C, -20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain optimal navigation solution for MEMS-based INS/GPS integration.
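The first-order Gauss-Markov building block of such AR-based models has a correlation time τ that is recoverable from the lag-1 autocorrelation, since ρ₁ = exp(−Δt/τ) for sampling interval Δt. A self-contained sketch on synthetic data (not the ADIS16364 records used in the paper):

```python
import numpy as np

def fit_gm_correlation_time(x, dt):
    """Estimate the correlation time of a first-order Gauss-Markov process
    from its lag-1 autocorrelation: rho1 = exp(-dt / tau)."""
    x = x - x.mean()
    rho1 = np.dot(x[:-1], x[1:]) / np.dot(x, x)
    return -dt / np.log(rho1)

# Simulate a GM process with tau = 5 s, sampled at dt = 0.1 s.
rng = np.random.default_rng(1)
dt, tau, n = 0.1, 5.0, 100_000
phi = np.exp(-dt / tau)
x = np.empty(n)
x[0] = 0.0
for k in range(n - 1):
    # Discrete-time GM recursion with unit stationary variance.
    x[k + 1] = phi * x[k] + rng.normal(0.0, np.sqrt(1.0 - phi**2))

tau_hat = fit_gm_correlation_time(x, dt)
print(tau_hat)
```

With enough stationary data the estimate lands close to the true 5 s; the paper's point is that repeating this fit at different chamber temperatures yields different correlation times.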
Bringing scientific rigor to community-developed programs in Hong Kong
Directory of Open Access Journals (Sweden)
Fabrizio Cecilia S
2012-12-01
Full Text Available Abstract Background: This paper describes efforts to generate evidence for community-developed programs to enhance family relationships in the Chinese culture of Hong Kong, within the framework of community-based participatory research (CBPR). Methods: The CBPR framework was applied to help maximize the development of the intervention and the public health impact of the studies, while enhancing the capabilities of the social service sector partners. Results: Four academic-community research teams explored the process of designing and implementing randomized controlled trials in the community. In addition to the expected cultural barriers between teams of academics and community practitioners, with their different outlooks, concerns and languages, the teams navigated issues in applying the principles of CBPR unique to this Chinese culture. Eventually the teams developed tools for adaptation, such as an emphasis on building the relationship while respecting role delineation, and an iterative process of defining the non-negotiable parameters of research design while maintaining scientific rigor. Lessons learned include the risk of underemphasizing the size of the operational and skills shift between usual agency practices and research studies, the importance of minimizing non-negotiable parameters in implementing rigorous research designs in the community, and the need to view community capacity enhancement as a long-term process. Conclusions: The four pilot studies under the FAMILY Project demonstrated that nuanced design adaptations, such as wait-list controls and shorter assessments, better served the needs of the community and led to the successful development and vigorous evaluation of a series of preventive, family-oriented interventions in the Chinese culture of Hong Kong.
Bringing scientific rigor to community-developed programs in Hong Kong.
Fabrizio, Cecilia S; Hirschmann, Malia R; Lam, Tai Hing; Cheung, Teresa; Pang, Irene; Chan, Sophia; Stewart, Sunita M
2012-12-31
This paper describes efforts to generate evidence for community-developed programs to enhance family relationships in the Chinese culture of Hong Kong, within the framework of community-based participatory research (CBPR). The CBPR framework was applied to help maximize the development of the intervention and the public health impact of the studies, while enhancing the capabilities of the social service sector partners. Four academic-community research teams explored the process of designing and implementing randomized controlled trials in the community. In addition to the expected cultural barriers between teams of academics and community practitioners, with their different outlooks, concerns and languages, the team navigated issues in utilizing the principles of CBPR unique to this Chinese culture. Eventually the team developed tools for adaptation, such as an emphasis on building the relationship while respecting role delineation and an iterative process of defining the non-negotiable parameters of research design while maintaining scientific rigor. Lessons learned include the risk of underemphasizing the size of the operational and skills shift between usual agency practices and research studies, the importance of minimizing non-negotiable parameters in implementing rigorous research designs in the community, and the need to view community capacity enhancement as a long term process. The four pilot studies under the FAMILY Project demonstrated that nuanced design adaptations, such as wait list controls and shorter assessments, better served the needs of the community and led to the successful development and vigorous evaluation of a series of preventive, family-oriented interventions in the Chinese culture of Hong Kong.
International Nuclear Information System (INIS)
Chen, Xin
2014-01-01
Understanding the roles of the temporal and spatial structures of quantum functional noise in open multilevel quantum molecular systems has attracted considerable theoretical interest. I want to establish a rigorous and general framework for functional quantum noises from the constructive and computational perspectives, i.e., how to generate the random trajectories that reproduce the kernel and path ordering of the influence functional with effective Monte Carlo methods for arbitrary spectral densities. This construction approach aims to unify the existing stochastic models to rigorously describe the temporal and spatial structure of Gaussian quantum noises. In this paper, I review the Euclidean imaginary-time influence functional and propose the stochastic matrix multiplication scheme to calculate reduced equilibrium density matrices (REDM). In addition, I review and discuss the Feynman-Vernon influence functional according to the Gaussian quadratic integral, particularly its imaginary part, which is critical to the rigorous description of quantum detailed balance. As a result, I establish the conditions under which the influence functional can be interpreted as the average of an exponential functional operator over real-valued Gaussian processes for open multilevel quantum systems. I also show the difference between local and nonlocal phonons within this framework. With the stochastic matrix multiplication scheme, I compare the normalized REDM with the Boltzmann equilibrium distribution for open multilevel quantum systems
Rigorous Clinical Trial Design in Public Health Emergencies Is Essential
DEFF Research Database (Denmark)
Ellenberg, Susan S; Keusch, Gerald T; Babiker, Abdel G
2018-01-01
Randomized clinical trials are the most reliable approaches to evaluating the effects of new treatments and vaccines. During the 2014-15 West African Ebola epidemic, many argued that such trials were neither ethical nor feasible in an environment of limited health infrastructure and severe disease...
RIGOROUS GEOREFERENCING OF ALSAT-2A PANCHROMATIC AND MULTISPECTRAL IMAGERY
Directory of Open Access Journals (Sweden)
I. Boukerch
2013-04-01
Full Text Available The exploitation of the full geometric capabilities of High-Resolution Satellite Imagery (HRSI) requires the development of an appropriate sensor orientation model. Several authors have studied this problem; generally there are two categories of geometric models: physical and empirical. Based on analysis of the metadata provided with ALSAT-2A, a rigorous pushbroom camera model can be developed. This model has been successfully applied to many very high resolution imaging systems. The relation between image and ground coordinates, given by the time-dependent collinearity equations and involving several coordinate systems, has been tested. The interior orientation parameters must be integrated into the model; they can be estimated from the viewing angles corresponding to the pointing directions of any detector, and these values are derived from cubic polynomials provided in the metadata. The developed model integrates all the necessary elements, with 33 unknowns. Approximate values for all 33 unknown parameters may be derived from the information contained in the metadata files provided with the imagery technical specifications, or simply fixed to zero; the condition equation is then linearized and solved by SVD in a least-squares sense in order to correct the initial values, using a suitable number of well-distributed GCPs. Using ALSAT-2A images over the town of Toulouse in the south-west of France, three experiments were performed. The first examines 2D accuracy using several sets of parameters. The second concerns GCP number and distribution. The third addresses georeferencing the multispectral image by applying the model calculated from the panchromatic image.
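The adjustment step described, a linearized condition equation solved by SVD in a least-squares sense over the GCPs, can be sketched generically. The design matrix and observation vector below are toy numbers standing in for the linearized ALSAT-2A model, not actual values from the paper:

```python
import numpy as np

def solve_least_squares_svd(A, b, tol=1e-10):
    """Least-squares correction d = argmin ||A d - b|| via SVD, as used to
    update approximate orientation parameters from GCP residuals."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Invert only well-conditioned singular values to stabilize the solution.
    s_inv = np.where(s > tol * s[0], 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ b))

# Toy adjustment: overdetermined system from 4 hypothetical "GCP" equations
# in 2 unknown corrections.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [1.0, -1.0]])
b = np.array([1.0, 2.0, 3.1, -0.9])
d = solve_least_squares_svd(A, b)
print(d)
```

In the full model this solve is iterated: the corrections d update the 33 approximate parameters, the condition equation is re-linearized, and the process repeats until convergence.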
A rigorous derivation of gravitational self-force
International Nuclear Information System (INIS)
Gralla, Samuel E; Wald, Robert M
2008-01-01
There is general agreement that the MiSaTaQuWa equations should describe the motion of a 'small body' in general relativity, taking into account the leading order self-force effects. However, previous derivations of these equations have made a number of ad hoc assumptions and/or contain a number of unsatisfactory features. For example, all previous derivations have invoked, without proper justification, the step of 'Lorenz gauge relaxation', wherein the linearized Einstein equation is written in the form appropriate to the Lorenz gauge, but the Lorenz gauge condition is then not imposed-thereby making the resulting equations for the metric perturbation inequivalent to the linearized Einstein equations. (Such a 'relaxation' of the linearized Einstein equations is essential in order to avoid the conclusion that 'point particles' move on geodesics.) In this paper, we analyze the issue of 'particle motion' in general relativity in a systematic and rigorous way by considering a one-parameter family of metrics, g_ab(λ), corresponding to having a body (or black hole) that is 'scaled down' to zero size and mass in an appropriate manner. We prove that the limiting worldline of such a one-parameter family must be a geodesic of the background metric, g_ab(λ = 0). Gravitational self-force-as well as the force due to coupling of the spin of the body to curvature-then arises as a first-order perturbative correction in λ to this worldline. No assumptions are made in our analysis apart from the smoothness and limit properties of the one-parameter family of metrics, g_ab(λ). Our approach should provide a framework for systematically calculating higher order corrections to gravitational self-force, including higher multipole effects, although we do not attempt to go beyond first-order calculations here. The status of the MiSaTaQuWa equations is explained
Rigorous covariance propagation of geoid errors to geodetic MDT estimates
Pail, R.; Albertella, A.; Fecher, T.; Savcenko, R.
2012-04-01
The mean dynamic topography (MDT) is defined as the difference between the mean sea surface (MSS) derived from satellite altimetry, averaged over several years, and the static geoid. Assuming geostrophic conditions, ocean surface velocities, an important component of the global ocean circulation, can be derived from the MDT. Due to the availability of GOCE gravity field models, for the very first time the MDT can now be derived solely from satellite observations (altimetry and gravity) down to spatial length-scales of 100 km and even below. Global gravity field models, parameterized in terms of spherical harmonic coefficients, are complemented by the full variance-covariance matrix (VCM). Therefore, a realistic statistical error estimate is available for the geoid component, while the error description of the altimetric component is still an open issue and is, if at all, addressed empirically. In this study we attempt to perform, based on the full gravity VCM, a rigorous error propagation to the derived geostrophic surface velocities, thus also considering all correlations. For the definition of the static geoid we use the third release of the time-wise GOCE model, as well as the satellite-only combination model GOCO03S. In detail, we will investigate the velocity errors resulting from the geoid component as a function of the harmonic degree, and the impact of using or not using covariances on the MDT errors and their correlations. When deriving an MDT, it is spectrally filtered to a certain maximum degree, which is usually driven by the signal content of the geoid model, by applying isotropic or non-isotropic filters. Since this filtering also acts on the geoid component, the consistent integration of this filter process into the covariance propagation shall be performed, and its impact shall be quantified. The study will be performed for MDT estimates in specific test areas of particular oceanographic interest.
Medicare Part D Roulette, Potential Implications of Random..
U.S. Department of Health & Human Services — Medicare Part D Roulette, Potential Implications of Random Assignment and Plan Restrictions Dual-eligible (Medicare and Medicaid) beneficiaries are randomly assigned...
Krompecher, T; Fryc, O
1978-01-01
The use of new methods and an appropriate apparatus has allowed us to make successive measurements of rigor mortis and a study of its evolution in the rat. By a comparative examination on the front and hind limbs, we have determined the following: (1) The muscular mass of the hind limbs is 2.89 times greater than that of the front limbs. (2) In the initial phase rigor mortis is more pronounced in the front limbs. (3) The front and hind limbs reach maximum rigor mortis at the same time and this state is maintained for 2 hours. (4) Resolution of rigor mortis is accelerated in the front limbs during the initial phase, but both front and hind limbs reach complete resolution at the same time.
How Individual Scholars Can Reduce the Rigor-Relevance Gap in Management Research
Wolf, Joachim; Rosenberg, Timo
2012-01-01
This paper discusses a number of avenues management scholars could follow to reduce the existing gap between scientific rigor and practical relevance without relativizing the importance of the first goal dimension. Such changes are necessary because many management studies do not fully exploit the possibilities to increase their practical relevance while maintaining scientific rigor. We argue that this rigor-relevance gap is not only the consequence of the currently prevailing institutional c...
RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT.
Meltzer, S J; Auer, J
1908-01-01
Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions-nearly equimolecular to "physiological" solutions of sodium chloride-are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor, only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium hastens also the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle.
Blocked Randomization with Randomly Selected Block Sizes
Directory of Open Access Journals (Sweden)
Jimmy Efird
2010-12-01
Full Text Available When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
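The mechanics described above can be illustrated in code. The following is a hedged sketch (not taken from the paper, which concerns trial design rather than software): each block is balanced across arms, and drawing the block size at random keeps the next assignment unpredictable even to an unblinded observer. Arm names and block sizes are illustrative.

```python
# Blocked randomization with randomly selected block sizes (illustrative sketch).
import random

def blocked_randomization(n_participants, arms=("treatment", "control"),
                          block_sizes=(2, 4, 6), seed=None):
    # block_sizes should be multiples of the number of arms so each block balances
    rng = random.Random(seed)
    sequence = []
    while len(sequence) < n_participants:
        size = rng.choice(block_sizes)       # randomly selected block size
        block = list(arms) * (size // len(arms))
        rng.shuffle(block)                   # balanced within the block
        sequence.extend(block)
    return sequence[:n_participants]

seq = blocked_randomization(12, seed=42)
print(seq)
```

Because the sequence is truncated to the requested sample size, the final partial block can introduce a small imbalance, bounded by half the largest block size; full blocks are always balanced.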
32 CFR 884.2 - Assigned responsibilities.
2010-07-01
... OF PERSONNEL TO UNITED STATES CIVILIAN AUTHORITIES FOR TRIAL § 884.2 Assigned responsibilities. (a... 32 National Defense 6 2010-07-01 2010-07-01 false Assigned responsibilities. 884.2 Section 884.2... requests for return of members to the United States for delivery to civilian authorities when the request...
12 CFR 563e.28 - Assigned ratings.
2010-01-01
... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Assigned ratings. 563e.28 Section 563e.28 Banks... for Assessing Performance § 563e.28 Assigned ratings. (a) Ratings in general. Subject to paragraphs (b... performance under the lending, investment and service tests, the community development test, the small savings...
Stress Assignment in Reading Italian Polysyllabic Pseudowords
Sulpizio, Simone; Arduino, Lisa S.; Paizi, Despina; Burani, Cristina
2013-01-01
In 4 naming experiments we investigated how Italian readers assign stress to pseudowords. We assessed whether participants assign stress following distributional information such as stress neighborhood (the proportion and number of existent words sharing orthographic ending and stress pattern) and whether such distributional information affects…
Assignment of element and isotope factors
International Nuclear Information System (INIS)
Schneider, R.A.
1984-01-01
Element and isotope factors are assigned in the NICS internal accounting system at the Exxon Fuel Fabrication Facility on the basis of coded information included on the material transfer documents. This paper explains more fully the manner in which NICS assigns these factors.
Detecting Plagiarism in MS Access Assignments
Singh, Anil
2013-01-01
Assurance of individual effort from students in computer-based assignments is a challenge. Due to digitization, students can easily use a copy of their friend's work and submit it as their own. Plagiarism in assignments puts students who cheat at par with those who work honestly and this compromises the learning evaluation process. Using a…
DEFF Research Database (Denmark)
Cappeln, Gertrud; Jessen, Flemming
2002-01-01
Variation in glycogen, ATP, and IMP contents within individual cod muscles was studied in ice-stored fish during the progress of rigor mortis. Rigor index was determined before muscle samples for chemical analyses were taken at 16 different positions on the fish. During development of rigor, the contents of glycogen and ATP decreased differently in relation to rigor index depending on sampling location. Although fish were considered to be in strong rigor according to the rigor index method, parts of the muscle were not in rigor, as high ATP concentrations were found in dorsal and tail muscle.
Trends in Methodological Rigor in Intervention Research Published in School Psychology Journals
Burns, Matthew K.; Klingbeil, David A.; Ysseldyke, James E.; Petersen-Brown, Shawna
2012-01-01
Methodological rigor in intervention research is important for documenting evidence-based practices and has been a recent focus in legislation, including the No Child Left Behind Act. The current study examined the methodological rigor of intervention research in four school psychology journals since the 1960s. Intervention research has increased…
Kim, Hyun-Wook; Hwang, Ko-Eun; Song, Dong-Heon; Kim, Yong-Jae; Ham, Youn-Kyung; Yeo, Eui-Joo; Jeong, Tae-Jun; Choi, Yun-Sang; Kim, Cheon-Jei
2015-01-01
This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of a high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). However, an increase in pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. Nevertheless, the difference in NaCl concentration between 3% and 4% led to no great differences in physicochemical and textural properties due to pre-rigor salting effects (p>0.05). Therefore, our study confirmed the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that a 2% NaCl concentration is minimally required to ensure a definite pre-rigor salting effect on chicken breast muscle.
Rigorous bounds on the free energy of electron-phonon models
Raedt, Hans De; Michielsen, Kristel
1997-01-01
We present a collection of rigorous upper and lower bounds to the free energy of electron-phonon models with linear electron-phonon interaction. These bounds are used to compare different variational approaches. It is shown rigorously that the ground states corresponding to the sharpest bounds do
The Relationship between Project-Based Learning and Rigor in STEM-Focused High Schools
Edmunds, Julie; Arshavsky, Nina; Glennie, Elizabeth; Charles, Karen; Rice, Olivia
2016-01-01
Project-based learning (PjBL) is an approach often favored in STEM classrooms, yet some studies have shown that teachers struggle to implement it with academic rigor. This paper explores the relationship between PjBL and rigor in the classrooms of ten STEM-oriented high schools. Utilizing three different data sources reflecting three different…
Moving beyond Data Transcription: Rigor as Issue in Representation of Digital Literacies
Hagood, Margaret Carmody; Skinner, Emily Neil
2015-01-01
Rigor in qualitative research has been based upon criteria of credibility, dependability, confirmability, and transferability. Drawing upon articles published during our editorship of the "Journal of Adolescent & Adult Literacy," we illustrate how the use of digital data in research study reporting may enhance these areas of rigor,…
Real life working shift assignment problem
Sze, San-Nah; Kwek, Yeek-Ling; Tiong, Wei-King; Chiew, Kang-Leng
2017-07-01
This study concerns the working shift assignment in an outlet of Supermarket X in Eastern Mall, Kuching. The working shift assignment needs to be solved at least once every month. The current approval process for working shifts is too troublesome and time-consuming. Furthermore, the management staff cannot get an overview of manpower and the working shift schedule. Thus, the aim of this study is to develop a working shift assignment simulation and propose a working shift assignment solution. The main objective of this study is to fulfill manpower demand at minimum operation cost. Besides, the day-off and meal-break policies should be fulfilled accordingly. A demand-based heuristic is proposed to assign working shifts, and the quality of the solution is evaluated using real data.
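The abstract does not specify the heuristic's details, so the following is only a hedged greedy sketch of the stated objective, meeting manpower demand at minimum cost; staff names and wages are made up, and a real roster would also have to enforce the day-off and meal-break policies mentioned above.

```python
# Greedy demand-based shift assignment (illustrative sketch only).
def assign_shifts(demand, staff_cost):
    """demand: shift -> required headcount; staff_cost: name -> hourly cost."""
    roster, total = {}, 0.0
    by_cost = sorted(staff_cost, key=staff_cost.get)  # cheapest staff first
    for shift, need in demand.items():
        roster[shift] = by_cost[:need]                # fill demand at lowest cost
        total += sum(staff_cost[name] for name in roster[shift])
    return roster, total

roster, cost = assign_shifts({"morning": 2, "evening": 1},
                             {"Ana": 8.0, "Ben": 9.5, "Cai": 7.0})
print(roster, cost)
```

Note that this sketch happily reuses the same worker across shifts; rest-period and day-off constraints would rule that out in practice.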
Mars - robust automatic backbone assignment of proteins
International Nuclear Information System (INIS)
Jung, Young-Sang; Zweckstetter, Markus
2004-01-01
MARS, a program for robust automatic backbone assignment of 13C/15N-labeled proteins, is presented. MARS does not require tight thresholds for establishing sequential connectivity or detailed adjustment of these thresholds, and it can work with a wide variety of NMR experiments. Using only 13Cα/13Cβ connectivity information, MARS allows automatic, error-free assignment of 96% of the 370-residue maltose-binding protein. MARS can successfully be used when data are missing for a substantial portion of residues or for proteins with very high chemical shift degeneracy, such as partially or fully unfolded proteins. Other sources of information, such as residue-specific information or known assignments from a homologous protein, can be included in the assignment process. MARS exports its results in SPARKY format. This allows visual validation and integration of automated and manual assignment.
Probability, random variables, and random processes theory and signal processing applications
Shynk, John J
2012-01-01
Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily of random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features: Several app
Onset of rigor mortis is earlier in red muscle than in white muscle.
Kobayashi, M; Takatori, T; Nakajima, M; Sakurada, K; Hatanaka, K; Ikegaya, H; Matsuda, Y; Iwase, H
2000-01-01
Rigor mortis is thought to be related to falling ATP levels in muscles postmortem. We measured rigor mortis as tension determined isometrically in three rat leg muscles in liquid paraffin kept at 37 degrees C or 25 degrees C--two red muscles, red gastrocnemius (RG) and soleus (SO) and one white muscle, white gastrocnemius (WG). Onset, half and full rigor mortis occurred earlier in RG and SO than in WG both at 37 degrees C and at 25 degrees C even though RG and WG were portions of the same muscle. This suggests that rigor mortis directly reflects the postmortem intramuscular ATP level, which decreases more rapidly in red muscle than in white muscle after death. Rigor mortis was more retarded at 25 degrees C than at 37 degrees C in each type of muscle.
Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design
Wagler, Amy; Wagler, Ron
2014-01-01
Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…
High and low rigor temperature effects on sheep meat tenderness and ageing.
Devine, Carrick E; Payne, Steven R; Peachey, Bridget M; Lowe, Timothy E; Ingram, John R; Cook, Christian J
2002-02-01
Immediately after electrical stimulation, the paired m. longissimus thoracis et lumborum (LT) of 40 sheep were boned out and wrapped tightly with a polyethylene cling film. One of the paired LTs was chilled in 15°C air to reach a rigor mortis (rigor) temperature of 18°C and the other side was placed in a water bath at 35°C and achieved rigor at this temperature. Wrapping reduced rigor shortening and mimicked meat left on the carcass. After rigor, the meat was aged at 15°C for 0, 8, 26 and 72 h and then frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values obtained from a 1×1 cm cross-section. The shear force values of meat for 18 and 35°C rigor were similar at zero ageing, but as ageing progressed, the 18°C rigor meat aged faster and became more tender than meat that went into rigor at 35°C (P<0.05). Shear force values for 18 and 35°C rigor at each ageing time were significantly different (P<0.05), and after 72 h of ageing those for 35°C rigor were still significantly greater. Thus the toughness of 35°C meat was not a consequence of muscle shortening and appears to be due to both a faster rate of tenderisation and the meat tenderising to a greater extent at the lower temperature. The cook loss at 35°C rigor (30.5%) was greater than that at 18°C rigor (28.4%) (P<0.01) and the colour Hunter L values were higher at 35°C (P<0.01) compared with 18°C, but there were no significant differences in a or b values.
Random walks, random fields, and disordered systems
Černý, Jiří; Kotecký, Roman
2015-01-01
Focusing on the mathematics that lies at the intersection of probability theory, statistical physics, combinatorics and computer science, this volume collects together lecture notes on recent developments in the area. The common ground of these subjects is perhaps best described by the three terms in the title: Random Walks, Random Fields and Disordered Systems. The specific topics covered include a study of Branching Brownian Motion from the perspective of disordered (spin-glass) systems, a detailed analysis of weakly self-avoiding random walks in four spatial dimensions via methods of field theory and the renormalization group, a study of phase transitions in disordered discrete structures using a rigorous version of the cavity method, a survey of recent work on interacting polymers in the ballisticity regime and, finally, a treatise on two-dimensional loop-soup models and their connection to conformally invariant systems and the Gaussian Free Field. The notes are aimed at early graduate students with a mod...
Auditing the Assignments of Top-Level Semantic Types in the UMLS Semantic Network to UMLS Concepts.
He, Zhe; Perl, Yehoshua; Elhanan, Gai; Chen, Yan; Geller, James; Bian, Jiang
2017-11-01
The Unified Medical Language System (UMLS) is an important terminological system. By the policy of its curators, each concept of the UMLS should be assigned the most specific Semantic Types (STs) in the UMLS Semantic Network (SN). Hence, the Semantic Types of most UMLS concepts are assigned at or near the bottom (leaves) of the UMLS Semantic Network. While most ST assignments are correct, some errors do occur. Therefore, Quality Assurance efforts of UMLS curators for ST assignments should concentrate on automatically detected sets of UMLS concepts with higher error rates than random sets. In this paper, we investigate the assignments of top-level semantic types in the UMLS semantic network to concepts, identify potential erroneous assignments, define four categories of errors, and thus provide assistance to curators of the UMLS to avoid these assignment errors. Human experts analyzed samples of concepts assigned 10 of the top-level semantic types and categorized the erroneous ST assignments into these four logical categories. Two thirds of the concepts assigned these 10 top-level semantic types are erroneous. Our results demonstrate that reviewing top-level semantic type assignments to concepts provides an effective way for UMLS quality assurance, compared to reviewing a random selection of semantic type assignments.
Random geometry and Yang-Mills theory
International Nuclear Information System (INIS)
Froehlich, J.
1981-01-01
The author states various problems and discusses a very few preliminary rigorous results in a branch of mathematics and mathematical physics which one might call random (or stochastic) geometry. Furthermore, he points out why random geometry is important in the quantization of Yang-Mills theory. (Auth.)
Postgraduate diploma collaborative assignment: Implications for ...
African Journals Online (AJOL)
Postgraduate diploma collaborative assignment: Implications for ESL students ... and collaborative teaching/learning model involving the major course convenors. ... The quality of the work and mood of all concerned improved tremendously.
Dynamic traffic assignment : genetic algorithms approach
1997-01-01
Real-time route guidance is a promising approach to alleviating congestion on the nation's highways. A dynamic traffic assignment model is central to the development of guidance strategies. The artificial intelligence technique of genetic algorithm...
Statistical aspects of optimal treatment assignment
van der Linden, Willem J.
1980-01-01
The issue of treatment assignment is ordinarily dealt with within the framework of testing the aptitude-treatment interaction (ATI) hypothesis. ATI research mostly uses linear regression techniques, and an ATI exists when the aptitude-treatment (AT) regression lines cross each other within the relevant interval of the aptitude variable. Consistent with this approach is the use of the points of intersection of AT regression lines as treatment-assignment rule. The replacement of such rules by monot...
On pole structure assignment in linear systems
Czech Academy of Sciences Publication Activity Database
Loiseau, J.-J.; Zagalak, Petr
2009-01-01
Roč. 82, č. 7 (2009), s. 1179-1192 ISSN 0020-7179 R&D Projects: GA ČR(CZ) GA102/07/1596 Institutional research plan: CEZ:AV0Z10750506 Keywords : linear systems * linear state feedback * pole structure assignment Subject RIV: BC - Control Systems Theory Impact factor: 1.124, year: 2009 http://library.utia.cas.cz/separaty/2009/AS/zagalak-on pole structure assignment in linear systems.pdf
Competitive Traffic Assignment in Road Networks
Directory of Open Access Journals (Sweden)
Krylatov Alexander Y.
2016-09-01
Full Text Available In-vehicle route guidance and information systems have recently been developing rapidly. Such systems are expected to reduce congestion in urban traffic areas. This social benefit is believed to be achieved by imposing route choices on the network users that lead to the system optimum traffic assignment. However, guidance services could be offered by different competing business companies. Then the route choices of different, mutually independent groups of users may drive the traffic assignment away from the system optimum state. In this paper, a game-theoretic approach is shown to be very efficient for formalizing the competitive traffic assignment problem with various groups of users as a non-cooperative network game with a Nash equilibrium search. The relationships between the Wardrop system optimum associated with the traffic assignment problem and the Nash equilibrium associated with the competitive traffic assignment problem are investigated. Moreover, some related aspects of the Nash equilibrium and the Wardrop user equilibrium assignments are also discussed.
Bamberger, Michael; Tarsilla, Michele; Hesse-Biber, Sharlene
2016-04-01
Many widely-used impact evaluation designs, including randomized control trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences (UCs) of development programs. This seems surprising, as experienced planners and evaluators are well aware that unintended consequences frequently occur. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based or narrative) that programs have achieved their intended objectives, and the logic of many evaluation designs, even those that are considered the most "rigorous," does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs as they are considered by many to be the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an "RCT+" design) can strengthen the credibility of these designs and can also capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies nine ways in which UCs can occur, and we apply this framework to two of the case studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
Flexible taxonomic assignment of ambiguous sequencing reads
Directory of Open Access Journals (Sweden)
Jansson Jesper
2011-01-01
Full Text Available Abstract Background To characterize the diversity of bacterial populations in metagenomic studies, sequencing reads need to be accurately assigned to taxonomic units in a given reference taxonomy. Reads that cannot be reliably assigned to a unique leaf in the taxonomy (ambiguous reads) are typically assigned to the lowest common ancestor of the set of species that match it. This introduces a potentially severe error in the estimation of bacteria present in the sample due to false positives, since all species in the subtree rooted at the ancestor are implicitly assigned to the read even though many of them may not match it. Results We present a method that maps each read to a node in the taxonomy that minimizes a penalty score while balancing the relevance of precision and recall in the assignment through a parameter q. This mapping can be obtained in time linear in the number of matching sequences, because LCA queries to the reference taxonomy take constant time. When applied to six different metagenomic datasets, our algorithm produces different taxonomic distributions depending on whether coverage or precision is maximized. Including information on the quality of the reads reduces the number of unassigned reads but increases the number of ambiguous reads, stressing the relevance of our method. Finally, two measures of performance are described and results with a set of artificially generated datasets are discussed. Conclusions The assignment strategy of sequencing reads introduced in this paper is a versatile and a quick method to study bacterial communities. The bacterial composition of the analyzed samples can vary significantly depending on how ambiguous reads are assigned according to the value of the q parameter. Validation of our results in an artificial dataset confirms that a combination of values of q produces the most accurate results.
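The penalty-minimizing idea in the Results section can be sketched as follows. This is a hedged illustration, not the paper's implementation: an ambiguous read is mapped to the taxonomy node minimizing q * false_positives + (1 - q) * false_negatives, so q trades precision against recall instead of always falling back to the LCA. The toy taxonomy and species names are made up.

```python
# Penalty-based placement of an ambiguous read in a toy taxonomy (sketch).
class Node:
    def __init__(self, name, children=()):
        self.name, self.children = name, list(children)
    def leaves(self):
        if not self.children:
            return {self.name}
        return set().union(*(c.leaves() for c in self.children))
    def walk(self):
        yield self
        for c in self.children:
            yield from c.walk()

def assign(root, matches, q):
    """Return the name of the node with minimal penalty for this read."""
    matches = set(matches)
    best_name, best_pen = None, float("inf")
    for node in root.walk():
        under = node.leaves()
        fp = len(under - matches)   # species implied by the node but not matched
        fn = len(matches - under)   # matched species the node fails to cover
        pen = q * fp + (1 - q) * fn
        if pen < best_pen:
            best_name, best_pen = node.name, pen
    return best_name

taxonomy = Node("root", [
    Node("genusA", [Node("sp1"), Node("sp2"), Node("sp3")]),
    Node("genusB", [Node("sp4")]),
])
# With q = 0.3 (recall weighted above precision), a read matching sp1 and sp2
# is placed below the root but above any single species.
print(assign(taxonomy, {"sp1", "sp2"}, q=0.3))  # → genusA
```

Raising q toward 1 penalizes false positives more heavily and pushes assignments toward the leaves; lowering it pushes them toward the LCA, mirroring the coverage-versus-precision trade-off described above.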
Kobayashi, Masahiko; Takemori, Shigeru; Yamaguchi, Maki
2004-02-10
Based on the molecular mechanism of rigor mortis, we have proposed that stiffness (elastic modulus evaluated with tension response against minute length perturbations) can be a suitable index of post-mortem rigidity in skeletal muscle. To trace the developmental process of rigor mortis, we measured stiffness and tension in both red and white rat skeletal muscle kept in liquid paraffin at 37 and 25 degrees C. White muscle (in which type IIB fibres predominate) developed stiffness and tension significantly more slowly than red muscle, except for soleus red muscle at 25 degrees C, which showed disproportionately slow rigor development. In each of the examined muscles, stiffness and tension developed more slowly at 25 degrees C than at 37 degrees C. In each specimen, tension always reached its maximum level earlier than stiffness, and then decreased more rapidly and markedly than stiffness. These phenomena may account for the sequential progress of rigor mortis in human cadavers.
Studies on the estimation of the postmortem interval. 3. Rigor mortis (author's transl).
Suzutani, T; Ishibashi, H; Takatori, T
1978-11-01
The authors have devised a method for classifying rigor mortis into 10 types based on its appearance and strength in various parts of a cadaver. By applying the method to the findings of 436 cadavers which were subjected to medico-legal autopsies in our laboratory during the last 10 years, it has been demonstrated that the classifying method is effective for analyzing the phenomenon of onset, persistence and disappearance of rigor mortis statistically. The investigation of the relationship between each type of rigor mortis and the postmortem interval has demonstrated that rigor mortis may be utilized as a basis for estimating the postmortem interval but the values have greater deviation than those described in current textbooks.
2010-05-27
... rigorous knowledge and skills in English-language arts and mathematics that employers and colleges expect... specialists and to access the student outcome data needed to meet annual evaluation and reporting requirements...
Rigorous derivation from Landau-de Gennes theory to Ericksen-Leslie theory
Wang, Wei; Zhang, Pingwen; Zhang, Zhifei
2013-01-01
Starting from the Beris-Edwards system for liquid crystals, we present a rigorous derivation of the Ericksen-Leslie system with general Ericksen stress and Leslie stress by using the Hilbert expansion method.
Performance evaluation of distributed wavelength assignment in WDM optical networks
Hashiguchi, Tomohiro; Wang, Xi; Morikawa, Hiroyuki; Aoyama, Tomonori
2004-04-01
In WDM wavelength-routed networks, prior to a data transfer, a call setup procedure is required to reserve a wavelength path between the source-destination node pairs. A distributed approach to connection setup can achieve very high speed, while improving the reliability and reducing the implementation cost of the networks. However, along with many advantages, the distributed scheme poses several major challenges in how the management and allocation of wavelengths can be carried out efficiently. In this paper, we apply a distributed wavelength assignment algorithm named priority-based wavelength assignment (PWA), originally proposed for use in burst-switched optical networks, to the problem of reserving wavelengths with path reservation protocols in distributed-control optical networks. Instead of assigning wavelengths randomly, this approach lets each node select the "safest" wavelengths based on information about wavelength utilization history, thus preventing unnecessary future contention. The simulation results presented in this paper show that the proposed protocol can enhance the performance of the system without introducing any apparent drawbacks.
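The history-based selection idea can be illustrated with a hedged sketch (not the paper's implementation; class and method names are invented): each node keeps a per-wavelength score that reservation successes raise and contentions lower, and it reserves the free wavelength with the best record rather than a random one.

```python
# "Safest-wavelength" selection from utilization history (illustrative sketch).
from collections import defaultdict

class PWANode:
    def __init__(self):
        # running score per wavelength: successes raise it, contentions lower it
        self.score = defaultdict(float)

    def choose(self, free):
        """Pick the free wavelength with the highest historical score."""
        return max(free, key=lambda w: self.score[w])

    def record(self, wavelength, success):
        """Update the history after a reservation attempt."""
        self.score[wavelength] += 1.0 if success else -1.0

node = PWANode()
node.record(2, True)    # wavelength 2 was reserved successfully before
node.record(0, False)   # wavelength 0 saw contention
print(node.choose({0, 1, 2, 3}))  # → 2
```

Because each node updates its scores only from its own reservation outcomes, no extra signaling is needed, which is what makes this style of heuristic attractive in a distributed setting.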
2016-04-01
AU/ACSC/2016 AIR COMMAND AND STAFF COLLEGE AIR UNIVERSITY MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF...establishing unit level certified Masters of Analytic Tradecraft (MAT) analysts to be trained and entrusted to evaluate and rate the standards and...cues) ideally should meet or exceed effective rigor (based on analytical process).4 To accomplish this, decision makers should not be left to their
The Kinetics of Chirality Assignment in Catalytic Single Walled Carbon Nanotube Growth
Xu, Ziwei; Yan, Tianying; Ding, Feng
2014-01-01
Chirality-selected single-walled carbon nanotubes (SWCNTs) offer great potential for building ~1 nm sized electronics. However, a reliable method for chirality-selected SWCNT growth is still lacking. Here we present a theoretical study of the SWCNT's chirality assignment and control during catalytic growth. This study reveals that the chirality of a SWCNT is determined by the kinetics of pentagon formation during SWCNT nucleation. Therefore, chirality is randomly assigned on...
First Trimester Fetal Gender Assignment by Ultrasound
Directory of Open Access Journals (Sweden)
Sabahattin Altunyurt
2010-03-01
Full Text Available Objective: To investigate the efficiency of the genital tubercle angle in detecting fetal gender in the first trimester by ultrasonography. Material-Method: Fetal sex assignment by ultrasound was carried out in 172 pregnancies at 11-13+6 weeks between June 2007 and December 2007. Gestational age was determined by measurement of the crown-rump length (CRL). The ultrasound predictions were compared with the actual sex at birth. Mid-sagittal sections of the fetal genital tubercle were obtained to identify the gender. Results: Data were obtained for 155 of the 172 patients. The overall success rate in sonographic assignment of fetal sex was 92.3%. The correct assignment rate in female fetuses was significantly higher than in males (95.9% vs. 88.8% [p=0.001]). The correct identification of fetal sex improved with advancing gestational age, from 89.3% at 11-11+6 weeks to 92.5% at 12-12+6 weeks and 93.4% at 13-13+6 weeks (p=0.96). Conclusion: Fetal sex assignment by ultrasonography at 11-13+6 weeks had a high success rate. The sensitivity of fetal sex assignment was not affected by fetal position or gestational age.
Directory of Open Access Journals (Sweden)
de Brevern Alexandre G
2005-09-01
Full Text Available Abstract Background A number of methods are now available to perform automatic assignment of periodic secondary structures from atomic coordinates, based on different characteristics of the secondary structures. In general these methods exhibit a broad consensus as to the location of most helix and strand core segments in protein structures. However the termini of the segments are often ill-defined and it is difficult to decide unambiguously which residues at the edge of the segments have to be included. In addition, there is a "twilight zone" where secondary structure segments depart significantly from the idealized models of Pauling and Corey. For these segments, one has to decide whether the observed structural variations are merely distortions or whether they constitute a break in the secondary structure. Methods To address these problems, we have developed a method for secondary structure assignment, called KAKSI. Assignments made by KAKSI are compared with assignments given by DSSP, STRIDE, XTLSSTR, PSEA and SECSTR, as well as secondary structures found in PDB files, on 4 datasets (X-ray structures with different resolution ranges, NMR structures). Results A detailed comparison of KAKSI assignments with those of STRIDE and PSEA reveals that KAKSI assigns slightly longer helices and strands than STRIDE in case of one-to-one correspondence between the segments. However, KAKSI also tends to favor the assignment of several short helices where STRIDE and PSEA assign longer, kinked, helices. Helices assigned by KAKSI have geometrical characteristics close to those described in the PDB. They are more linear than helices assigned by other methods. The same tendency to split long segments is observed for strands, although less systematically. We present a number of cases of secondary structure assignments that illustrate this behavior. Conclusion Our method provides valuable assignments which favor the regularity of secondary structure segments.
Writing Assignments that Promote Active Learning
Narayanan, M.
2014-12-01
Encourage students to write a detailed, analytical report correlating classroom discussions to an important historical event or a current event. Motivate students to interview an expert from industry on a topic that was discussed in class. Ask the students to submit a report with supporting sketches, drawings, circuit diagrams and graphs. Propose that the students generate a complete set of reading responses pertaining to an assigned topic. Require each student to bring in one comment or one question about an assigned reading. The assignment should be a recent publication in an appropriate journal. Have the students conduct a web search on an assigned topic. Ask them to generate a set of ideas that can relate to classroom discussions. Provide the students with a study guide. The study guide should cover about 10 or 15 short topics. Quiz the students on one or two of the topics. Encourage the students to design or develop some creative real-world examples based on a chapter discussed or a topic of interest. Require that students originate, develop, support and defend a viewpoint using specifically assigned material. Make the students practice using a set of new technical terms they have encountered in an assigned chapter. Have students develop original examples explaining the different terms. Ask the students to select one important term from the previous classroom discussions. Encourage the students to explain why they selected that particular word. Ask them to talk about the importance of the term from the point of view of their educational objectives and future career. Angelo, T. A. (1991). Ten easy pieces: Assessing higher learning in four dimensions. In T. A. Angelo (Ed.), Classroom research: Early lessons from success (pp. 17-31). New Directions for Teaching and Learning, No. 46. San Francisco: Jossey-Bass.
Grouping puts figure-ground assignment in context by constraining propagation of edge assignment.
Brooks, Joseph L; Driver, Jon
2010-05-01
Figure-ground organization involves the assignment of edges to a figural shape on one or the other side of each dividing edge. Established visual cues for edge assignment primarily concern relatively local rather than contextual factors. In the present article, we show that an assignment for a locally unbiased edge can be affected by an assignment of a remote contextual edge that has its own locally biased assignment. We find that such propagation of edge assignment from the biased remote context occurs only when the biased and unbiased edges are grouped. This new principle, whereby grouping constrains the propagation of figural edge assignment, emerges from both subjective reports and an objective short-term edge-matching task. It generalizes from moving displays involving grouping by common fate and collinearity, to static displays with grouping by similarity of edge-contrast polarity, or apparent occlusion. Our results identify a new contextual influence on edge assignment. They also identify a new mechanistic relation between grouping and figure-ground processes, whereby grouping between remote elements can constrain the propagation of edge assignment between those elements. Supplemental materials for this article may be downloaded from http://app.psychonomic-journals.org/content/supplemental.
Lee, Jeannette Y.; Moore, Page; Kusek, John; Barry, Michael
2014-01-01
Objectives: This report assesses participant perception of treatment assignment in a randomized, double-blind, placebo-controlled trial of saw palmetto for the treatment of benign prostatic hyperplasia (BPH).
Semi-infinite assignment and transportation games
Timmer, Judith B.; Sánchez-Soriano, Joaqu´ın; Llorca, Navidad; Tijs, Stef; Goberna, Miguel A.; López, Marco A.
2001-01-01
Games corresponding to semi-infinite transportation and related assignment situations are studied. In a semi-infinite transportation situation, one aims at maximizing the profit from the transportation of a certain good from a finite number of suppliers to an infinite number of demanders. An
Capacity constrained assignment in spatial databases
DEFF Research Database (Denmark)
U, Leong Hou; Yiu, Man Lung; Mouratidis, Kyriakos
2008-01-01
large to fit in main memory. Motivated by this fact, we propose efficient algorithms for optimal assignment that employ novel edge-pruning strategies, based on the spatial properties of the problem. Additionally, we develop approximate (i.e., suboptimal) CCA solutions that provide a trade-off between...
Statistical aspects of optimal treatment assignment
van der Linden, Willem J.
The issue of treatment assignment is ordinarily dealt with within the framework of testing the aptitude-treatment interaction (ATI) hypothesis. ATI research mostly uses linear regression techniques, and an ATI exists when the aptitude-treatment (AT) regression lines cross each other within the relevant
Tabu search for target-radar assignment
DEFF Research Database (Denmark)
Hindsberger, Magnus; Vidal, Rene Victor Valqui
2000-01-01
In the paper the problem of assigning air-defense illumination radars to enemy targets is presented. A tabu search metaheuristic solution is described and the results achieved are compared to those of other heuristic approaches, implementation and experimental aspects are discussed. It is argued ...
Strategy-Proof Assignment Of Multiple Resources
DEFF Research Database (Denmark)
Erlanson, Albin; Szwagrzak, Karol
2015-01-01
We examine the strategy-proof allocation of multiple resources; an application is the assignment of packages of tasks, workloads, and compensations among the members of an organization. In the domain of multidimensional single-peaked preferences, we find that any allocation mechanism obtained by ...
Optimal Processor Assignment for Pipeline Computations
1991-10-01
the use of ratios: initially each task is assigned a processor; the remaining processors are distributed in proportion to the quantities fi(1), 1 < i...algorithms. IEEE Trans. on Parallel and Distributed Systems, 1(4):470-499, October 1990. [26] P. M. Kogge. The Architecture of Pipelined Computers
Incentivized optimal advert assignment via utility decomposition
Kelly, F.; Key, P.; Walton, N.
2014-01-01
We consider a large-scale Ad-auction where adverts are assigned over a potentially infinite number of searches. We capture the intrinsic asymmetries in information between advertisers, the advert platform and the space of searches: advertisers know and can optimize the average performance of their
A game theoretic approach to assignment problems
Klijn, F.
2000-01-01
Game theory deals with the mathematical modeling and analysis of conflict and cooperation in the interaction of multiple decision makers. This thesis adopts two game theoretic methods to analyze a range of assignment problems that arise in various economic situations. The first method has as
Generalised Assignment Matrix Methodology in Linear Programming
Jerome, Lawrence
2012-01-01
Discrete Mathematics instructors and students have long been struggling with various labelling and scanning algorithms for solving many important problems. This paper shows how to solve a wide variety of Discrete Mathematics and OR problems using assignment matrices and linear programming, specifically using Excel Solvers although the same…
7 CFR 1437.104 - Assigned production.
2010-01-01
...) Irrigation equipment is not capable of supplying adequate water to sustain the expected production of a... practice is not used. (7) For normal irrigated annual and biennial crops, the supply of available water at... determining losses under this section, assigned production will be used to offset the loss of production when...
Accounting for Sustainability: An Active Learning Assignment
Gusc, Joanna; van Veen-Dirks, Paula
2017-01-01
Purpose: Sustainability is one of the newer topics in the accounting courses taught in university teaching programs. The active learning assignment as described in this paper was developed for use in an accounting course in an undergraduate program. The aim was to enhance teaching about sustainability within such a course. The purpose of this…
Energy Technology Data Exchange (ETDEWEB)
Harsch, Tobias; Schneider, Philipp; Kieninger, Bärbel; Donaubauer, Harald; Kalbitzer, Hans Robert, E-mail: hans-robert.kalbitzer@biologie.uni-regensburg.de [University of Regensburg, Institute of Biophysics and Physical Biochemistry and Centre of Magnetic Resonance in Chemistry and Biomedicine (Germany)
2017-02-15
Side chain amide protons of asparagine and glutamine residues in random-coil peptides are characterized by large chemical shift differences and can be stereospecifically assigned on the basis of their chemical shift values only. The bimodal chemical shift distributions stored in the biological magnetic resonance data bank (BMRB) do not allow such an assignment. However, an analysis of the BMRB shows that a substantial part of all stored stereospecific assignments is not correct. We show here that in most cases stereospecific assignment can also be done for folded proteins using an unbiased artificial chemical shift data base (UACSB). For a separation of the chemical shifts of the two amide resonance lines with differences ≥0.40 ppm for asparagine and differences ≥0.42 ppm for glutamine, the downfield-shifted resonance lines can be assigned to Hδ21 and Hε21, respectively, at a confidence level >95%. A classifier derived from UACSB can also be used to correct the BMRB data. The program tool AssignmentChecker implemented in AUREMOL calculates the Bayesian probability for a given stereospecific assignment and automatically corrects the assignments for a given list of chemical shifts.
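The separation-threshold rule reported in this abstract lends itself to a few lines of code. This is a hedged sketch of the decision rule only, not the AssignmentChecker tool (which computes full Bayesian probabilities); the function and label names are illustrative.

```python
# Sketch of the chemical-shift decision rule quoted above: when the two
# side-chain amide lines are separated by at least 0.40 ppm (Asn) or
# 0.42 ppm (Gln), the downfield line is assigned to Hd21/He21 at a
# confidence level above 95%. Names are assumptions of this sketch.
THRESHOLD_PPM = {"ASN": 0.40, "GLN": 0.42}

def assign_stereo(residue, shift_a, shift_b):
    """Return {label: shift in ppm} for a confident call, or None."""
    if abs(shift_a - shift_b) < THRESHOLD_PPM[residue]:
        return None  # separation too small for a confident assignment
    downfield, upfield = max(shift_a, shift_b), min(shift_a, shift_b)
    base = "Hd2" if residue == "ASN" else "He2"
    return {base + "1": downfield, base + "2": upfield}

print(assign_stereo("ASN", 7.60, 7.05))  # {'Hd21': 7.6, 'Hd22': 7.05}
print(assign_stereo("GLN", 7.30, 7.10))  # None: below the 0.42 ppm threshold
```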
Cypress, Brigitte S
The persistent concern with achieving rigor in qualitative research still raises issues, even now in the 21st century. There is also a continuing debate about the analogous terms reliability and validity in naturalistic inquiries as opposed to quantitative investigations. This article presents the concept of rigor in qualitative research using a phenomenological study as an exemplar to further illustrate the process. Elaborating on epistemological and theoretical conceptualizations by Lincoln and Guba, strategies congruent with the qualitative perspective for ensuring validity to establish the credibility of the study are described. A synthesis of the historical development of validity criteria evident in the literature over the years is explored. Recommendations are made to use the term rigor instead of trustworthiness, to reconceptualize and renew the use of the concepts of reliability and validity in qualitative research, to build strategies for ensuring rigor into the qualitative research process rather than evaluating them only after the inquiry, and for qualitative researchers and students alike to be proactive and take responsibility for ensuring the rigor of a research study. The insights garnered here will move novice researchers and doctoral students to a better conceptual grasp of the complexity of reliability and validity and its ramifications for qualitative inquiry.
Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.
Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P
2018-03-03
Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.
The Researchers' View of Scientific Rigor-Survey on the Conduct and Reporting of In Vivo Research.
Reichlin, Thomas S; Vogt, Lucile; Würbel, Hanno
2016-01-01
Reproducibility in animal research is alarmingly low, and a lack of scientific rigor has been proposed as a major cause. Systematic reviews found low reporting rates of measures against risks of bias (e.g., randomization, blinding), and a correlation between low reporting rates and overstated treatment effects. Reporting rates of measures against bias are thus used as a proxy measure for scientific rigor, and reporting guidelines (e.g., ARRIVE) have become a major weapon in the fight against risks of bias in animal research. Surprisingly, animal scientists have never been asked about their use of measures against risks of bias and how they report these in publications. Whether poor reporting reflects poor use of such measures, and whether reporting guidelines may effectively reduce risks of bias has therefore remained elusive. To address these questions, we asked in vivo researchers about their use and reporting of measures against risks of bias and examined how self-reports relate to reporting rates obtained through systematic reviews. An online survey was sent out to all registered in vivo researchers in Switzerland (N = 1891) and was complemented by personal interviews with five representative in vivo researchers to facilitate interpretation of the survey results. Return rate was 28% (N = 530), of which 302 participants (16%) returned fully completed questionnaires that were used for further analysis. According to the researchers' self-report, they use measures against risks of bias to a much greater extent than suggested by reporting rates obtained through systematic reviews. However, the researchers' self-reports are likely biased to some extent. Thus, although they claimed to be reporting measures against risks of bias at much lower rates than they claimed to be using these measures, the self-reported reporting rates were considerably higher than reporting rates found by systematic reviews. Furthermore, participants performed rather poorly when asked to
The effect of temperature on the mechanical aspects of rigor mortis in a liquid paraffin model.
Ozawa, Masayoshi; Iwadate, Kimiharu; Matsumoto, Sari; Asakura, Kumiko; Ochiai, Eriko; Maebashi, Kyoko
2013-11-01
Rigor mortis is an important phenomenon to estimate the postmortem interval in forensic medicine. Rigor mortis is affected by temperature. We measured stiffness of rat muscles using a liquid paraffin model to monitor the mechanical aspects of rigor mortis at five temperatures (37, 25, 10, 5 and 0°C). At 37, 25 and 10°C, the progression of stiffness was slower in cooler conditions. At 5 and 0°C, the muscle stiffness increased immediately after the muscles were soaked in cooled liquid paraffin and then muscles gradually became rigid without going through a relaxed state. This phenomenon suggests that it is important to be careful when estimating the postmortem interval in cold seasons. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Sikes, Anita L; Mawson, Raymond; Stark, Janet; Warner, Robyn
2014-11-01
The delivery of a consistent quality product to the consumer is vitally important for the food industry. The aim of this study was to investigate the potential for using high frequency ultrasound applied to pre- and post-rigor beef muscle on the metabolism and subsequent quality. High frequency ultrasound (600 kHz at 48 kPa and 65 kPa acoustic pressure) applied to post-rigor beef striploin steaks resulted in no significant effect on the texture (peak force value) of cooked steaks as measured by a Tenderometer. There was no added benefit of ultrasound treatment above that of the normal ageing process after ageing of the steaks for 7 days at 4°C. Ultrasound treatment of post-rigor beef steaks resulted in a darkening of fresh steaks but after ageing for 7 days at 4°C, the ultrasound-treated steaks were similar in colour to that of the aged, untreated steaks. High frequency ultrasound (2 MHz at 48 kPa acoustic pressure) applied to pre-rigor beef neck muscle had no effect on the pH, but the calculated exhaustion factor suggested that there was some effect on metabolism and actin-myosin interaction. However, the resultant texture of cooked, ultrasound-treated muscle was lower in tenderness compared to the control sample. After ageing for 3 weeks at 0°C, the ultrasound-treated samples had the same peak force value as the control. High frequency ultrasound had no significant effect on the colour parameters of pre-rigor beef neck muscle. This proof-of-concept study showed no effect of ultrasound on quality but did indicate that the application of high frequency ultrasound to pre-rigor beef muscle shows potential for modifying ATP turnover and further investigation is warranted. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
Acosta, Joie D.; Chinman, Matthew; Ebener, Patricia; Phillips, Andrea; Xenakis, Lea; Malone, Patrick S.
2016-01-01
Restorative practices in schools lack rigorous evaluation studies. As an example of rigorous school-based research, this article describes the first randomized control trial of restorative practices to date, the Study of Restorative Practices. It is a 5-year, cluster-randomized controlled trial (RCT) of the Restorative Practices Intervention (RPI)…
Careerism, Committee Assignments and the Electoral Connection
Katz, Jonathan N.; Sala, Brian R.
1996-01-01
Most scholars agree that members of Congress are strongly motivated by their desire for reelection. This assumption implies that members of Congress adopt institutions, rules, and norms of behavior in part to serve their electoral interests. Direct tests of the electoral connection are rare, however, because significant, exogenous changes in the electoral environment are difficult to identify. We develop and test an electoral rationale for the norm of committee assignment "property rights...
An Ultimatum Game Approach to Billet Assignments
2015-09-01
...treatments are needed for this investigation. To conserve the subject pool and meet the budget, we elected to pursue treatments that covered salient...across billets can be partially offset through compensating wages (hedonic wages) and/or the potential of future superior assignments. In the
Protein secondary structure: category assignment and predictability
DEFF Research Database (Denmark)
Andersen, Claus A.; Bohr, Henrik; Brunak, Søren
2001-01-01
In the last decade, the prediction of protein secondary structure has been optimized using essentially one and the same assignment scheme known as DSSP. We present here a different scheme, which is more predictable. This scheme predicts directly the hydrogen bonds, which stabilize the secondary structures ... feed-forward neural network with one hidden layer on a data set identical to the one used in earlier work.
Disciplining Bioethics: Towards a Standard of Methodological Rigor in Bioethics Research
Adler, Daniel; Shaul, Randi Zlotnik
2012-01-01
Contemporary bioethics research is often described as multi- or interdisciplinary. Disciplines are characterized, in part, by their methods. Thus, when bioethics research draws on a variety of methods, it crosses disciplinary boundaries. Yet each discipline has its own standard of rigor—so when multiple disciplinary perspectives are considered, what constitutes rigor? This question has received inadequate attention, as there is considerable disagreement regarding the disciplinary status of bioethics. This disagreement has presented five challenges to bioethics research. Addressing them requires consideration of the main types of cross-disciplinary research, and consideration of proposals aiming to ensure rigor in bioethics research. PMID:22686634
Mathematical framework for fast and rigorous track fit for the ZEUS detector
Energy Technology Data Exchange (ETDEWEB)
Spiridonov, Alexander
2008-12-15
In this note we present a mathematical framework for a rigorous approach to a common track fit for trackers located in the inner region of the ZEUS detector. The approach makes use of the Kalman filter and offers a rigorous treatment of magnetic field inhomogeneity, multiple scattering and energy loss. We describe mathematical details of the implementation of the Kalman filter technique with a reduced amount of computations for a cylindrical drift chamber, barrel and forward silicon strip detectors and a forward straw drift chamber. Options with homogeneous and inhomogeneous fields are discussed. The fitting of tracks in one ZEUS event takes about 20 ms on a standard PC. (orig.)
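The predict/update recursion at the heart of a Kalman-filter track fit can be illustrated on a toy straight-line track. This sketch is not the ZEUS implementation, which uses a helix parametrization and models field inhomogeneity, multiple scattering and energy loss; here the state [slope, intercept] is constant along the track, so there is no process noise.

```python
# Toy one-dimensional Kalman filter illustrating the recursion a track
# fit builds on: each hit (z, y) updates the state estimate and its
# covariance. All names and the measurement noise are assumptions.
def kalman_line_fit(hits, sigma=0.1):
    """Fit y = a*z + b from hits [(z, y), ...], one measurement at a time."""
    a, b = 0.0, 0.0                       # state estimate [slope, intercept]
    P = [[1e3, 0.0], [0.0, 1e3]]          # covariance: large initial uncertainty
    R = sigma ** 2                        # measurement variance
    for z, y in hits:
        # Measurement model: y_pred = H x with H = [z, 1]
        residual = y - (a * z + b)
        S = z * z * P[0][0] + 2 * z * P[0][1] + P[1][1] + R  # innovation variance
        K0 = (z * P[0][0] + P[0][1]) / S                     # Kalman gain
        K1 = (z * P[1][0] + P[1][1]) / S
        a += K0 * residual
        b += K1 * residual
        # Covariance update: P <- (I - K H) P
        HP0 = z * P[0][0] + P[1][0]
        HP1 = z * P[0][1] + P[1][1]
        P = [[P[0][0] - K0 * HP0, P[0][1] - K0 * HP1],
             [P[1][0] - K1 * HP0, P[1][1] - K1 * HP1]]
    return a, b

hits = [(0.0, 1.0), (1.0, 1.5), (2.0, 2.0), (3.0, 2.5)]  # exact: y = 0.5 z + 1
a, b = kalman_line_fit(hits)
print(round(a, 3), round(b, 3))
```
With noiseless hits and a weak prior, the filtered estimate converges to the least-squares line.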
Electrocardiogram artifact caused by rigors mimicking narrow complex tachycardia: a case report.
Matthias, Anne Thushara; Indrakumar, Jegarajah
2014-02-04
The electrocardiogram (ECG) is useful in the diagnosis of cardiac and non-cardiac conditions. Rigors due to shivering can cause electrocardiogram artifacts mimicking various cardiac rhythm abnormalities. We describe an 80-year-old Sri Lankan man with an abnormal electrocardiogram mimicking narrow complex tachycardia during the immediate post-operative period. Electrocardiogram changes caused by muscle tremor during rigors could mimic a narrow complex tachycardia. Identification of muscle tremor as a cause of electrocardiogram artifact can avoid unnecessary pharmacological and non-pharmacological intervention to prevent arrhythmias.
Increased scientific rigor will improve reliability of research and effectiveness of management
Sells, Sarah N.; Bassing, Sarah B.; Barker, Kristin J.; Forshee, Shannon C.; Keever, Allison; Goerz, James W.; Mitchell, Michael S.
2018-01-01
Rigorous science that produces reliable knowledge is critical to wildlife management because it increases accurate understanding of the natural world and informs management decisions effectively. Application of a rigorous scientific method based on hypothesis testing minimizes unreliable knowledge produced by research. To evaluate the prevalence of scientific rigor in wildlife research, we examined 24 issues of the Journal of Wildlife Management from August 2013 through July 2016. We found 43.9% of studies did not state or imply a priori hypotheses, which are necessary to produce reliable knowledge. We posit that this is due, at least in part, to a lack of common understanding of what rigorous science entails, how it produces more reliable knowledge than other forms of interpreting observations, and how research should be designed to maximize inferential strength and usefulness of application. Current primary literature does not provide succinct explanations of the logic behind a rigorous scientific method or readily applicable guidance for employing it, particularly in wildlife biology; we therefore synthesized an overview of the history, philosophy, and logic that define scientific rigor for biological studies. A rigorous scientific method includes 1) generating a research question from theory and prior observations, 2) developing hypotheses (i.e., plausible biological answers to the question), 3) formulating predictions (i.e., facts that must be true if the hypothesis is true), 4) designing and implementing research to collect data potentially consistent with predictions, 5) evaluating whether predictions are consistent with collected data, and 6) drawing inferences based on the evaluation. Explicitly testing a priori hypotheses reduces overall uncertainty by reducing the number of plausible biological explanations to only those that are logically well supported. Such research also draws inferences that are robust to idiosyncratic observations and
Assignment and Correspondence Tracking System - Tactical / Operational Reporting
Social Security Administration — Reporting data store for the Assignment and Correspondence Tracking System (ACT). ACT automates the assignment and tracking of correspondence processing within the...
Krompecher, T
1994-10-21
The development of the intensity of rigor mortis was monitored in nine groups of rats. The measurements were initiated after 2, 4, 5, 6, 8, 12, 15, 24, and 48 h post mortem (p.m.) and lasted 5-9 h, which ideally should correspond to the usual procedure after the discovery of a corpse. The experiments were carried out at an ambient temperature of 24 degrees C. Measurements initiated early after death resulted in curves with a rising portion, a plateau, and a descending slope. Delaying the initial measurement translated into shorter rising portions, and curves initiated 8 h p.m. or later are comprised of a plateau and/or a downward slope only. Three different phases were observed suggesting simple rules that can help estimate the time since death: (1) if an increase in intensity was found, the initial measurements were conducted not later than 5 h p.m.; (2) if only a decrease in intensity was observed, the initial measurements were conducted not earlier than 7 h p.m.; and (3) at 24 h p.m., the resolution is complete, and no further changes in intensity should occur. Our results clearly demonstrate that repeated measurements of the intensity of rigor mortis allow a more accurate estimation of the time since death of the experimental animals than the single measurement method used earlier. A critical review of the literature on the estimation of time since death on the basis of objective measurements of the intensity of rigor mortis is also presented.
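The three rules quoted in the abstract can be written out as a tiny decision helper. This is an illustrative sketch under the paper's stated conditions (rats, ambient temperature 24 degrees C); the function name and the (lower, upper) bound convention in hours, with None meaning unbounded, are assumptions.

```python
# The abstract's three rules for bounding the time of the FIRST
# measurement after death, given what repeated intensity measurements
# of rigor mortis showed. Purely a sketch of the stated rules.
def pmi_bounds(increase_seen, decrease_seen, resolution_complete=False):
    """Return (lower, upper) bounds in hours postmortem; None = no bound."""
    if resolution_complete:
        return (24.0, None)   # rule 3: resolution is complete at 24 h p.m.
    if increase_seen:
        return (None, 5.0)    # rule 1: initial measurement was <= 5 h p.m.
    if decrease_seen:
        return (7.0, None)    # rule 2: initial measurement was >= 7 h p.m.
    return (None, None)       # plateau only: these rules give no bound

print(pmi_bounds(increase_seen=True, decrease_seen=True))   # (None, 5.0)
print(pmi_bounds(increase_seen=False, decrease_seen=True))  # (7.0, None)
```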
Algorithms for selecting informative marker panels for population assignment.
Rosenberg, Noah A
2005-11-01
Given a set of potential source populations, genotypes of an individual of unknown origin at a collection of markers can be used to predict the correct source population of the individual. For improved efficiency, informative markers can be chosen from a larger set of markers to maximize the accuracy of this prediction. However, selecting the loci that are individually most informative does not necessarily produce the optimal panel. Here, using genotypes from eight species--carp, cat, chicken, dog, fly, grayling, human, and maize--this univariate accumulation procedure is compared to new multivariate "greedy" and "maximin" algorithms for choosing marker panels. The procedures generally suggest similar panels, although the greedy method often recommends inclusion of loci that are not chosen by the other algorithms. In seven of the eight species, when applied to five or more markers, all methods achieve at least 94% assignment accuracy on simulated individuals, with one species--dog--producing this level of accuracy with only three markers, and the eighth species--human--requiring approximately 13-16 markers. The new algorithms produce substantial improvements over use of randomly selected markers; where differences among the methods are noticeable, the greedy algorithm leads to slightly higher probabilities of correct assignment. Although none of the approaches necessarily chooses the panel with optimal performance, the algorithms all likely select panels with performance near enough to the maximum that they all are suitable for practical use.
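The contrast between univariate accumulation and the multivariate greedy procedure can be illustrated with a generic greedy selection loop. The scoring function below is a toy stand-in; the study scores panels by the accuracy of assigning simulated individuals to their source populations.

```python
# Generic greedy marker-panel selection of the kind compared in the
# abstract: at each step, add the marker that most improves the panel's
# accuracy. The accuracy function and weights are toy assumptions.
def greedy_panel(markers, accuracy, panel_size):
    """accuracy(panel) -> float; returns the greedily chosen marker list."""
    panel, remaining = [], set(markers)
    while remaining and len(panel) < panel_size:
        best = max(remaining, key=lambda m: accuracy(panel + [m]))
        panel.append(best)
        remaining.remove(best)
    return panel

# Toy score: per-marker informativeness weights with diminishing
# returns for the panel as a whole (an assumption of this sketch).
weights = {"m1": 0.50, "m2": 0.30, "m3": 0.45, "m4": 0.10}
accuracy = lambda panel: 1.0 - 0.95 ** sum(100 * weights[m] for m in panel)
print(greedy_panel(weights, accuracy, panel_size=2))  # ['m1', 'm3']
```
A univariate procedure would rank markers individually and take the top ones; the greedy loop instead re-scores each candidate in the context of the panel chosen so far, which is why the two can recommend different panels.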
Authorization of Animal Experiments Is Based on Confidence Rather than Evidence of Scientific Rigor
Nathues, Christina; Würbel, Hanno
2016-01-01
Accumulating evidence indicates high risk of bias in preclinical animal research, questioning the scientific validity and reproducibility of published research findings. Systematic reviews found low rates of reporting of measures against risks of bias in the published literature (e.g., randomization, blinding, sample size calculation) and a correlation between low reporting rates and inflated treatment effects. That most animal research undergoes peer review or ethical review would offer the possibility to detect risks of bias at an earlier stage, before the research has been conducted. For example, in Switzerland, animal experiments are licensed based on a detailed description of the study protocol and a harm–benefit analysis. We therefore screened applications for animal experiments submitted to Swiss authorities (n = 1,277) for the rates at which the use of seven basic measures against bias (allocation concealment, blinding, randomization, sample size calculation, inclusion/exclusion criteria, primary outcome variable, and statistical analysis plan) were described and compared them with the reporting rates of the same measures in a representative sub-sample of publications (n = 50) resulting from studies described in these applications. Measures against bias were described at very low rates, ranging on average from 2.4% for statistical analysis plan to 19% for primary outcome variable in applications for animal experiments, and from 0.0% for sample size calculation to 34% for statistical analysis plan in publications from these experiments. Calculating an internal validity score (IVS) based on the proportion of the seven measures against bias, we found a weak positive correlation between the IVS of applications and that of publications (Spearman’s rho = 0.34, p = 0.014), indicating that the rates of description of these measures in applications partly predict their rates of reporting in publications. These results indicate that the authorities licensing
Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I
2015-01-01
High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone meat was significantly more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®
Rigorous lower bound on the dynamic critical exponent of some multilevel Swendsen-Wang algorithms
International Nuclear Information System (INIS)
Li, X.; Sokal, A.D.
1991-01-01
We prove the rigorous lower bound z_exp ≥ α/ν for the dynamic critical exponent of a broad class of multilevel (or "multigrid") variants of the Swendsen-Wang algorithm. This proves that such algorithms do suffer from critical slowing down. We conjecture that such algorithms in fact lie in the same dynamic universality class as the standard Swendsen-Wang algorithm.
International Nuclear Information System (INIS)
García-Valenzuela, A; Contreras-Tello, H; Márquez-Islas, R; Sánchez-Pérez, C
2013-01-01
We derive an optical model for the light intensity distribution around the critical angle in a standard Abbe refractometer when used on absorbing homogeneous fluids. The model is developed using rigorous electromagnetic optics. The obtained formula is very simple and can be used suitably in the analysis and design of optical sensors relying on Abbe-type refractometry.
International Nuclear Information System (INIS)
Galatolo, Stefano; Monge, Maurizio; Nisoli, Isaia
2016-01-01
We study the problem of the rigorous computation of the stationary measure and of the rate of convergence to equilibrium of an iterated function system described by a stochastic mixture of two or more dynamical systems that are either all uniformly expanding on the interval or all contracting. In the expanding case, the associated transfer operators satisfy a Lasota–Yorke inequality, and we show how to compute a rigorous approximation of the stationary measure in the L^1 norm and an estimate for the rate of convergence. The rigorous computation requires a computer-aided proof of the contraction of the transfer operators for the maps, and we show that this property propagates to the transfer operators of the IFS. In the contracting case we perform a rigorous approximation of the stationary measure in the Wasserstein–Kantorovich distance, with a rate of convergence, using the same functional analytic approach. We show that a finite computation can produce a realistic computation of all contraction rates for the whole parameter space. We conclude with a description of the implementation and numerical experiments. (paper)
Muroya, Susumu; Ohnishi-Kameyama, Mayumi; Oe, Mika; Nakajima, Ikuyo; Shibata, Masahiro; Chikuni, Koichi
2007-05-16
To investigate changes in myosin light chains (MyLCs) during postmortem aging of the bovine longissimus muscle, we performed two-dimensional gel electrophoresis followed by identification with matrix-assisted laser desorption ionization time-of-flight mass spectrometry. The results of fluorescent differential gel electrophoresis showed that two spots of the myosin regulatory light chain (MyLC2) at pI values of 4.6 and 4.7 shifted toward those at pI values of 4.5 and 4.6, respectively, by 24 h postmortem when rigor mortis was completed. Meanwhile, the MyLC1 and MyLC3 spots did not change during the 14 days postmortem. Phosphoprotein-specific staining of the gels demonstrated that the MyLC2 proteins at pI values of 4.5 and 4.6 were phosphorylated. Furthermore, possible N-terminal region peptides containing one and two phosphoserine residues were detected in each mass spectrum of the MyLC2 spots at pI values of 4.5 and 4.6, respectively. These results demonstrated that MyLC2 became doubly phosphorylated during rigor formation of the bovine longissimus, suggesting involvement of the MyLC2 phosphorylation in the progress of beef rigor mortis. Bovine; myosin regulatory light chain (RLC, MyLC2); phosphorylation; rigor mortis; skeletal muscle.
Some comments on rigorous quantum field path integrals in the analytical regularization scheme
Energy Technology Data Exchange (ETDEWEB)
Botelho, Luiz C.L. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil). Dept. de Matematica Aplicada]. E-mail: botelho.luiz@superig.com.br
2008-07-01
Through the systematic use of the Minlos theorem on the support of cylindrical measures on R^∞, we produce several mathematically rigorous path integrals in interacting euclidean quantum fields with Gaussian free measures defined by generalized powers of the Laplacian operator. (author)
Some comments on rigorous quantum field path integrals in the analytical regularization scheme
International Nuclear Information System (INIS)
Botelho, Luiz C.L.
2008-01-01
Through the systematic use of the Minlos theorem on the support of cylindrical measures on R ∞ , we produce several mathematically rigorous path integrals in interacting euclidean quantum fields with Gaussian free measures defined by generalized powers of the Laplacian operator. (author)
Community historians and the dilemma of rigor vs relevance : A comment on Danziger and van Rappard
Dehue, Trudy
1998-01-01
Since the transition from finalism to contextualism, the history of science seems to be caught up in a basic dilemma. Many historians fear that with the new contextualist standards of rigorous historiography, historical research can no longer be relevant to working scientists themselves. The present
Goodman, Lisa A.; Epstein, Deborah; Sullivan, Cris M.
2018-01-01
Programs for domestic violence (DV) victims and their families have grown exponentially over the last four decades. The evidence demonstrating the extent of their effectiveness, however, often has been criticized as stemming from studies lacking scientific rigor. A core reason for this critique is the widespread belief that credible evidence can…
A plea for rigorous conceptual analysis as central method in transnational law design
Rijgersberg, R.; van der Kaaij, H.
2013-01-01
Although shared problems are generally easily identified in transnational law design, it is considerably more difficult to design frameworks that transcend the peculiarities of local law in a univocal fashion. The following exposition is a plea for giving more prominence to rigorous conceptual
College Readiness in California: A Look at Rigorous High School Course-Taking
Gao, Niu
2016-01-01
Recognizing the educational and economic benefits of a college degree, education policymakers at the federal, state, and local levels have made college preparation a priority. There are many ways to measure college readiness, but one key component is rigorous high school coursework. California has not yet adopted a statewide college readiness…
Cavitt, L C; Sams, A R
2003-07-01
Studies were conducted to develop a non-destructive method for monitoring the rate of rigor mortis development in poultry and to evaluate the effectiveness of electrical stimulation (ES). In the first study, 36 male broilers in each of two trials were processed at 7 wk of age. After being bled, half of the birds received electrical stimulation (400 to 450 V, 400 to 450 mA, for seven pulses of 2 s on and 1 s off), and the other half were designated as controls. At 0.25 and 1.5 h postmortem (PM), carcasses were evaluated for the angles of the shoulder, elbow, and wing tip and the distance between the elbows. Breast fillets were harvested at 1.5 h PM (after chilling) from all carcasses. Fillet samples were excised and frozen for later measurement of pH and R-value, and the remainder of each fillet was held on ice until 24 h postmortem. Shear value and pH means were significantly lower, and R-value means significantly higher, in stimulated carcasses, indicating acceleration of rigor mortis by ES. The physical dimensions of the shoulder and elbow changed significantly with rigor mortis development and with ES. These results indicate that physical measurements of the wings may be useful as a nondestructive indicator of rigor development and for monitoring the effectiveness of ES. In the second study, 60 male broilers in each of two trials were processed at 7 wk of age. At 0.25, 1.5, 3.0, and 6.0 h PM, carcasses were evaluated for the distance between the elbows. At each time point, breast fillets were harvested from each carcass. Fillet samples were excised and frozen for later measurement of pH and sarcomere length, whereas the remainder of each fillet was held on ice until 24 h PM. Shear value and pH means changed significantly with rigor mortis development. Elbow distance decreased significantly with rigor development and was significantly correlated with rigor mortis development in broiler carcasses.
Symmetric Logic Synthesis with Phase Assignment
Benschop, N. F.
2001-01-01
Decomposition of any Boolean Function BF_n of n binary inputs into an optimal inverter coupled network of Symmetric Boolean functions SF_k (k \leq n) is described. Each SF component is implemented by Threshold Logic Cells, forming a complete and compact T-Cell Library. Optimal phase assignment of input polarities maximizes local symmetries. The "rank spectrum" is a new BF_n description independent of input ordering, obtained by mapping its minterms onto an orthogonal n \times n grid of (transi...
48 CFR 42.602 - Assignment and location.
2010-10-01
48 Federal Acquisition Regulations System: Assignment and location. (a) A CACO may be assigned only when (1) the contractor has at least two locations..., or a full-time CACO may be assigned. In determining the location of the CACO, the responsible agency...
Testing the Effectiveness of Online Assignments in Theory of Finance
Batu, Michael; Bower, Nancy; Lun, Esmond; Sadanand, Asha
2018-01-01
The authors investigated the effectiveness of online versus paper assignments using final examination scores in three cohorts of theory of finance. In particular, two cohorts were exposed to online assignments while another cohort was exposed to traditional assignments. The central result is that exposure to online assignments robustly leads to…
Assigned value improves memory of proper names.
Festini, Sara B; Hartley, Alan A; Tauber, Sarah K; Rhodes, Matthew G
2013-01-01
Names are more difficult to remember than other personal information such as occupations. The current research examined the influence of assigned point value on memory and metamemory judgements for names and occupations to determine whether incentive can improve recall of proper names. In Experiment 1 participants studied face-name and face-occupation pairs assigned 1 or 10 points, made judgements of learning, and were given a cued recall test. High-value names were recalled more often than low-value names. However, recall of occupations was not influenced by value. In Experiment 2 meaningless nonwords were used for both names and occupations. The name difficulty disappeared, and value influenced recall of both names and occupations. Thus value similarly influenced names and occupations when meaningfulness was held constant. In Experiment 3 participants were required to use overt rote rehearsal for all items. Value did not boost recall of high-value names, suggesting that differential processing could not be implemented to improve memory. Thus incentives may improve memory for proper names by motivating people to engage in selective rehearsal and effortful elaborative processing.
Forster, B; Ropohl, D; Raule, P
1977-07-05
The manual examination of rigor mortis as currently used, with its often subjective evaluation, frequently produces highly incorrect deductions. It is therefore desirable that such inaccuracies be replaced by objective measurement of rigor mortis at the extremities. To that end a method is described which can also be applied in on-the-spot investigations, and a new formula for the determination of rigor mortis indices (FRR) is introduced.
METHOD FOR SOLVING FUZZY ASSIGNMENT PROBLEM USING MAGNITUDE RANKING TECHNIQUE
D. Selvi; R. Queen Mary; G. Velammal
2017-01-01
Assignment problems have various applications in the real world because of their wide applicability in industry, commerce, management science, etc. Traditional classical assignment problems cannot be successfully used for real-life problems, hence the use of fuzzy assignment problems is more appropriate. In this paper, the fuzzy assignment problem is formulated as a crisp assignment problem using the Magnitude Ranking technique, and the Hungarian method has been applied to find an optimal solution. The N...
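The two-step pipeline described in the abstract, defuzzify the costs with a ranking function and then solve the resulting crisp assignment problem, can be sketched as follows. The graded-mean formula (a + 2b + c)/4 and the brute-force solver are stand-ins: the paper's exact Magnitude Ranking formula may differ, and the Hungarian method would replace the O(n!) search for larger instances.

```python
from itertools import permutations

# Triangular fuzzy costs (a, b, c) for 3 workers x 3 tasks.
fuzzy_costs = [
    [(1, 2, 3), (4, 5, 6), (7, 8, 9)],
    [(2, 4, 6), (1, 3, 5), (3, 5, 7)],
    [(6, 7, 8), (2, 3, 4), (1, 2, 3)],
]

def magnitude(t):
    # One common graded-mean ranking of a triangular fuzzy number;
    # an assumption here, not necessarily the paper's formula.
    a, b, c = t
    return (a + 2 * b + c) / 4.0

crisp = [[magnitude(t) for t in row] for row in fuzzy_costs]

def solve_assignment(cost):
    """Brute-force optimal assignment (stand-in for the Hungarian
    method; fine for small n, O(n!) in general)."""
    n = len(cost)
    best = min(permutations(range(n)),
               key=lambda p: sum(cost[i][p[i]] for i in range(n)))
    return list(best), sum(cost[i][best[i]] for i in range(n))

assign, total = solve_assignment(crisp)
print(assign, total)  # assign[i] is the task given to worker i
```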
Warriss, P D; Brown, S N; Knowles, T G
2003-12-13
The degree of development of rigor mortis in the carcases of slaughter pigs was assessed subjectively on a three-point scale 35 minutes after they were exsanguinated, and related to the levels of cortisol, lactate and creatine kinase in blood collected at exsanguination. Earlier rigor development was associated with higher concentrations of these stress indicators in the blood. This relationship suggests that the mean rigor score, and the frequency distribution of carcases that had or had not entered rigor, could be used as an index of the degree of stress to which the pigs had been subjected.
On stability of Random Riccati equations
Institute of Scientific and Technical Information of China (English)
王远; 郭雷
1999-01-01
Random Riccati equations (RRE) arise frequently in filtering, estimation and control, but their stability properties are rarely rigorously explored in the literature. First, a suitable stochastic observability (or excitation) condition is introduced to guarantee both the L_r and exponential stability of RRE. Then the stability of the Kalman filter with random coefficients is analyzed, and the L_r boundedness of filtering errors is established.
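A scalar toy version of such a recursion, the Kalman filter covariance update with a random state coefficient, illustrates how a persistent measurement keeps the Riccati iterates bounded even when some draws of the coefficient are unstable. The coefficient range and noise parameters below are illustrative assumptions, not taken from the paper.

```python
import random

# Scalar random Riccati recursion from a Kalman filter with random
# state coefficient a_k (measurement gain c, noise variances q, r):
#   P_{k+1} = a_k^2 * (P_k - c^2 P_k^2 / (c^2 P_k + r)) + q
def riccati_step(P, a, c, q, r):
    return a * a * (P - (c * P) ** 2 / (c * c * P + r)) + q

random.seed(0)
c, q, r = 1.0, 0.5, 1.0
P = 10.0                           # large initial covariance
history = []
for _ in range(200):
    a = random.uniform(0.5, 1.5)   # random coefficient, sometimes |a| > 1
    P = riccati_step(P, a, c, q, r)
    history.append(P)

# With c != 0 the correction term caps P - c^2 P^2/(c^2 P + r) at r,
# so every iterate after the first satisfies P <= a_max^2 * r + q.
print(max(history))
```

The bound printed stays below a_max^2 * r + q = 2.75 despite the occasionally unstable draws, which is the qualitative content of the excitation condition.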
Structural Encoding of Static Single Assignment Form
DEFF Research Database (Denmark)
Gal, Andreas; Probst, Christian; Franz, Michael
2005-01-01
Static Single Assignment (SSA) form is often used as an intermediate representation during code optimization in Java Virtual Machines. Recently, SSA has successfully been used for bytecode verification. However, constructing SSA at the code consumer is costly. SSA-based mobile code transport formats have been shown to eliminate this cost by shifting SSA creation to the code producer. These new formats, however, are not backward compatible with the established Java class-file format. We propose a novel approach to transport SSA information implicitly through structural code properties of standard Java bytecode. While the resulting bytecode sequence can still be directly executed by traditional Virtual Machines, our novel VM can infer SSA form and confirm its safety with virtually no overhead.
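For straight-line code (no control flow, hence no phi functions), SSA construction reduces to renaming: each assignment creates a fresh version of its target, and every use refers to the latest version. A minimal sketch with hypothetical statement tuples, not the paper's bytecode-level encoding:

```python
# Minimal SSA renaming for straight-line code. Statements are
# (target, rhs_tokens) tuples; literals and operators pass through.
def to_ssa(stmts):
    version = {}                      # variable -> current version number
    out = []

    def use(tok):
        # Rename a use to its latest version; leave non-variables alone.
        return f"{tok}{version[tok]}" if tok in version else tok

    for target, rhs in stmts:
        renamed = tuple(use(t) for t in rhs)   # rename uses first
        version[target] = version.get(target, 0) + 1
        out.append((f"{target}{version[target]}", renamed))
    return out

code = [("x", ("1",)), ("x", ("x", "+", "2")), ("y", ("x", "*", "x"))]
for tgt, rhs in to_ssa(code):
    print(tgt, "=", " ".join(rhs))
```

Each variable is now assigned exactly once (x1, x2, y1), which is the property the verifier exploits.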
Rationalization of some genetic anticodonic assignments
Lacey, J. C., Jr.; Hall, L. M.; Mullins, D. W., Jr.
1985-01-01
The hydrophobicity of most amino acids correlates well with that of their anticodon nucleotides, with Trp, Tyr, Ile, and Ser being the exceptions to this rule. Using previous data on hydrophobicity and binding constants, and new data on rates of esterification of polyadenylic acid with several N-acetylaminoacyl imidazolides, several of the anticodon assignments are rationalized. Chemical arguments are presented supporting the idea that Ile was included in the catalog of biological amino acids late in evolution, through a mutation of an existing tRNA and its aminoacyl-tRNA synthetase. It was found that an addition of hexane increases the incorporation of hydrophobic Ac-Phe into poly-A, in support of the Fox (1965) and Oparin (1965) emphasis on the biogenetic importance of phase-separated systems.
Assignment of uncertainties to scientific data
International Nuclear Information System (INIS)
Froehner, F.H.
1994-01-01
Long-standing problems of uncertainty assignment to scientific data came into sharp focus in recent years when uncertainty information ('covariance files') had to be added to application-oriented large libraries of evaluated nuclear data such as ENDF and JEF. Questions arose about the best way to express uncertainties, the meaning of statistical and systematic errors, the origin of correlations and the construction of covariance matrices, the combination of uncertain data from different sources, the general usefulness of results that are strictly valid only for Gaussian or only for linear statistical models, etc. Conventional statistical theory is often unable to give unambiguous answers, and tends to fail when statistics are poor so that prior information becomes crucial. Modern probability theory, on the other hand, incorporating decision-theoretic and group-theoretic results, is shown to provide straight and unique answers to such questions, and to deal easily with prior information and small samples. (author). 10 refs
Solving multiconstraint assignment problems using learning automata.
Horn, Geir; Oommen, B John
2010-02-01
This paper considers the NP-hard problem of object assignment with respect to multiple constraints: assigning a set of elements (or objects) into mutually exclusive classes (or groups), where the elements which are "similar" to each other are hopefully located in the same class. The literature reports solutions in which the similarity constraint consists of a single index that is inappropriate for the type of multiconstraint problems considered here and where the constraints could simultaneously be contradictory. This feature, where we permit possibly contradictory constraints, distinguishes this paper from the state of the art. Indeed, we are aware of no learning automata (or other heuristic) solutions which solve this problem in its most general setting. Such a scenario is illustrated with the static mapping problem, which consists of distributing the processes of a parallel application onto a set of computing nodes. This is a classical and yet very important problem within the areas of parallel computing, grid computing, and cloud computing. We have developed four learning-automata (LA)-based algorithms to solve this problem: First, a fixed-structure stochastic automata algorithm is presented, where the processes try to form pairs to go onto the same node. This algorithm solves the problem, although it requires some centralized coordination. As it is desirable to avoid centralized control, we subsequently present three different variable-structure stochastic automata (VSSA) algorithms, which have superior partitioning properties in certain settings, although they forfeit some of the scalability features of the fixed-structure algorithm. All three VSSA algorithms model the processes as automata having first the hosting nodes as possible actions; second, the processes as possible actions; and, third, attempting to estimate the process communication digraph prior to probabilistically mapping the processes. This paper, which, we believe, comprehensively reports the
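The variable-structure automata mentioned above maintain a probability vector over actions and update it from stochastic environment feedback. Below is a minimal sketch of one classical VSSA update rule, linear reward-inaction (L_RI), applied to a two-node placement choice; the reward probabilities are toy stand-ins for the mapping-quality feedback of the actual problem, not the paper's algorithms.

```python
import random

# Linear reward-inaction (L_RI) learning automaton: on reward, shift
# probability mass toward the chosen action; on penalty, do nothing.
# Action i = "place the process on node i".
def l_ri(reward_prob, steps=20000, lr=0.01, seed=1):
    rng = random.Random(seed)
    p = [1.0 / len(reward_prob)] * len(reward_prob)
    for _ in range(steps):
        # Sample an action from the current probability vector.
        x, action, acc = rng.random(), 0, 0.0
        for i, pi in enumerate(p):
            acc += pi
            if x < acc:
                action = i
                break
        if rng.random() < reward_prob[action]:   # environment rewards
            for i in range(len(p)):
                p[i] += lr * ((1.0 if i == action else 0.0) - p[i])
    return p

p = l_ri([0.8, 0.3])   # node 0 yields better placements more often
print(p)               # mass concentrates on the better action
```

The reward-inaction rule is absorbing, which is what gives these schemes their convergence guarantees; the reward-penalty variants trade that for ergodicity.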
International Nuclear Information System (INIS)
Das, Sonjoy; Goswami, Kundan; Datta, Biswa N.
2014-01-01
Failure of structural systems under dynamic loading can be prevented via active vibration control which shifts the damped natural frequencies of the systems away from the dominant range of loading spectrum. The damped natural frequencies and the dynamic load typically show significant variations in practice. A computationally efficient methodology based on quadratic partial eigenvalue assignment technique and optimization under uncertainty has been formulated in the present work that will rigorously account for these variations and result in an economic and resilient design of structures. A novel scheme based on hierarchical clustering and importance sampling is also developed in this work for accurate and efficient estimation of probability of failure to guarantee the desired resilience level of the designed system. Numerical examples are presented to illustrate the proposed methodology
Energy Technology Data Exchange (ETDEWEB)
Das, Sonjoy; Goswami, Kundan [University at Buffalo, NY (United States); Datta, Biswa N. [Northern Illinois University, IL (United States)
2014-12-10
Failure of structural systems under dynamic loading can be prevented via active vibration control which shifts the damped natural frequencies of the systems away from the dominant range of loading spectrum. The damped natural frequencies and the dynamic load typically show significant variations in practice. A computationally efficient methodology based on quadratic partial eigenvalue assignment technique and optimization under uncertainty has been formulated in the present work that will rigorously account for these variations and result in an economic and resilient design of structures. A novel scheme based on hierarchical clustering and importance sampling is also developed in this work for accurate and efficient estimation of probability of failure to guarantee the desired resilience level of the designed system. Numerical examples are presented to illustrate the proposed methodology.
Diouf, Boucar; Rioux, Pierre
1999-01-01
Presents the rigor mortis process in brook charr (Salvelinus fontinalis) as a tool for better understanding skeletal muscle metabolism. Describes an activity that demonstrates how rigor mortis is related to the post-mortem decrease of muscular glycogen and ATP, how glycogen degradation produces lactic acid that lowers muscle pH, and how…
Randomized Controlled Trials: The Most Powerful Tool In Modern ...
African Journals Online (AJOL)
Randomized controlled trial (RCT) can be said to be one of the simplest but most powerful tool of research. It is the most rigorous way of determining whether a cause-effect relation exists between treatment and outcome and for assessing the cost effectiveness of a treatment. Through the randomization, bias will be avoided ...
A Review on asymptotic normality of sums of associated random ...
African Journals Online (AJOL)
Association between random variables is a generalization of independence of these random variables. This concept is more and more commonly used in current research in Statistics. In this paper, we give a simple, clear and rigorous introduction to it. We will present the fundamental asymptotic ...
A rigorous pole representation of multilevel cross sections and its practical applications
International Nuclear Information System (INIS)
Hwang, R.N.
1987-01-01
In this article a rigorous method for representing the multilevel cross sections and its practical applications are described. It is a generalization of the rationale suggested by de Saussure and Perez for the s-wave resonances. A computer code WHOPPER has been developed to convert the Reich-Moore parameters into the pole and residue parameters in momentum space. Sample calculations have been carried out to illustrate that the proposed method preserves the rigor of the Reich-Moore cross sections exactly. An analytical method has been developed to evaluate the pertinent Doppler-broadened line shape functions. A discussion is presented on how to minimize the number of pole parameters so that the existing reactor codes can be best utilized
Rigorous simulations of a helical core fiber by the use of transformation optics formalism.
Napiorkowski, Maciej; Urbanczyk, Waclaw
2014-09-22
We report for the first time on rigorous numerical simulations of a helical-core fiber using a full vectorial method based on the transformation optics formalism. We modeled the dependence of the circular birefringence of the fundamental mode on the helix pitch and analyzed the birefringence increase caused by the mode displacement induced by the core twist. Furthermore, we analyzed the complex field evolution versus the helix pitch in the first-order modes, including polarization and intensity distribution. Finally, we show that the rigorous vectorial method predicts the confinement loss of the guided modes better than approximate methods based on equivalent in-plane bending models.
Evans, Mark I; Krantz, David A; Hallahan, Terrence; Sherwin, John; Britt, David W
2013-01-01
To determine whether nuchal translucency (NT) quality correlates with the extent to which clinics vary in rigor and quality control. We correlated NT performance quality (bias and precision) of 246,000 patients with two alternative measures of clinic culture: the percentage of cases for whom nasal bone (NB) measurements were performed and the percentage of requisitions correctly filled for race-ethnicity and weight. Centers with frequent requisition errors showed NT curves lowered to 0.93 MoM, whereas centers measuring NB in more than 90% of cases achieved 0.99 MoM. Program-wide quality exists independent of individual variation in NT quality, and two divergent indices of program rigor are associated with NT quality. Quality control must be program wide, and to effect continued improvement in the quality of NT results across time, the cultures of clinics must become a target for intervention. Copyright © 2013 S. Karger AG, Basel.
A Framework for Rigorously Identifying Research Gaps in Qualitative Literature Reviews
DEFF Research Database (Denmark)
Müller-Bloch, Christoph; Kranz, Johann
2015-01-01
Identifying research gaps is a fundamental goal of literature reviewing. While it is widely acknowledged that literature reviews should identify research gaps, there are no methodological guidelines for how to identify research gaps in qualitative literature reviews ensuring rigor and replicability. Our study addresses this gap and proposes a framework that should help scholars in this endeavor without stifling creativity. To develop the framework we thoroughly analyze the state-of-the-art procedure of identifying research gaps in 40 recent literature reviews using a grounded theory approach. Based on the data, we subsequently derive a framework for identifying research gaps in qualitative literature reviews and demonstrate its application with an example. Our results provide a modus operandi for identifying research gaps, thus enabling scholars to conduct literature reviews more rigorously.
Directory of Open Access Journals (Sweden)
Wang Yanqing
2016-03-01
Full Text Available A good assignment of code reviewers can effectively utilize intellectual resources, assure code quality and improve programmers' skills in software development. However, little research on reviewer assignment for code review has been published. In this study, a code reviewer assignment model is created based on participants' preferences for review assignments. With a constraint on the smallest size of a review group, the model is optimized to maximize review outcomes and avoid the negative impact of a "mutual admiration society". This study shows that reviewer assignment strategies incorporating either the reviewers' preferences or the authors' preferences improve substantially on random assignment. The strategy incorporating authors' preferences yields a greater improvement than the one incorporating reviewers' preferences. However, when the reviewers' and authors' preference matrices are merged, the improvement becomes moderate. The study indicates that the majority of participants have a strong wish to work with the reviewers and authors of highest competence. If we want to satisfy the preferences of both reviewers and authors at the same time, the overall improvement of learning outcomes may not be the best.
Bacteria as bullies: effects of linguistic agency assignment in health message.
Bell, Robert A; McGlone, Matthew S; Dragojevic, Marko
2014-01-01
When describing health threats, communicators can assign agency to the threat (e.g., "Hepatitis C has infected 4 million Americans") or to humans (e.g., "Four million Americans have contracted hepatitis C"). In an online experiment, the authors explored how assignment of agency affects perceptions of susceptibility and severity of a health threat, response efficacy, self-efficacy, fear arousal, and intentions to adopt health-protective recommendations. Participants were 719 individuals recruited through Mechanical Turk ( www.mturk.com ), a crowdsource labor market run by Amazon ( www.amazon.com ). The participants were assigned randomly to read 1 of 8 flyers defined by a 2×4 (Agency Assignment×Topic) factorial design. Each flyer examined 1 health threat (E. coli, necrotizing fasciitis, salmonella, or Carbapenem-resistant Klebsiella pneumoniae) and was written in language that emphasized bacterial or human agency. Perceived susceptibility and severity were highest when bacterial agency language was used. Response efficacy, self-efficacy, and fear arousal were not significantly affected by agency assignment. Participants reported stronger intentions to adopt recommendations when bacteria agency language was used, but this effect did not reach conventional standards of significance (p < .051). The authors concluded that health communicators can increase target audiences' perceptions of susceptibility and severity by assigning agency to the threat in question when devising health messages.
Parental knowledge of child development and the assignment of tractor work to children.
Pickett, William; Marlenga, Barbara; Berg, Richard L
2003-07-01
Many childhood farm tractor injuries occur during the performance of work that was assigned by parents, and some tractor work is beyond the developmental capabilities of children. This has been highlighted recently by a policy statement authored by the American Academy of Pediatrics. The objective of this study was 1) to assess child development knowledge of farm parents who received a new resource, the North American Guidelines for Children's Agricultural Tasks (NAGCAT), and 2) to determine whether this knowledge was associated with use of NAGCAT in the assignment of tractor jobs and with compliance with 2 aspects of the NAGCAT tractor guideline. Secondary analysis of data collected during a randomized controlled trial that involved 450 farms in the United States and Canada was conducted. Variables assessed included 1) parental knowledge of child development across several age groups and 3 domains of child development (physical, cognitive, and psychosocial), 2) documentation of the most common tractor jobs assigned to each child, and 3) a report of whether NAGCAT was used in assigning these tractor jobs. High parental knowledge of child development was associated with enhanced use of NAGCAT and fewer violations when assigning tractor work to children. However, even in the presence of high knowledge, some farm parents still assigned to their children work that was in violation of NAGCAT. Educational interventions by themselves are not sufficient to remove many farm children from known occupational hazards. These findings are discussed in light of the recent policy statement on agricultural injuries from the American Academy of Pediatrics.
An Accurate and Impartial Expert Assignment Method for Scientific Project Review
Directory of Open Access Journals (Sweden)
Mingliang Yue
2017-12-01
Full Text Available Purpose: This paper proposes an expert assignment method for scientific project review that considers both accuracy and impartiality. As impartial and accurate peer review is extremely important to ensure the quality and feasibility of scientific projects, enhanced methods for managing the process are needed. Design/methodology/approach: To ensure both accuracy and impartiality, we design four criteria, the reviewers’ fitness degree, research intensity, academic association, and potential conflict of interest, to express the characteristics of an appropriate peer review expert. We first formalize the expert assignment problem as an optimization problem based on the designed criteria, and then propose a randomized algorithm to solve the expert assignment problem of identifying reviewer adequacy. Findings: Simulation results show that the proposed method is quite accurate and impartial during expert assignment. Research limitations: Although the criteria used in this paper can properly show the characteristics of a good and appropriate peer review expert, more criteria/conditions can be included in the proposed scheme to further enhance accuracy and impartiality of the expert assignment. Practical implications: The proposed method can help project funding agencies (e.g. the National Natural Science Foundation of China) find better experts for project peer review. Originality/value: To the authors’ knowledge, this is the first publication that proposes an algorithm that applies an impartial approach to the project review expert assignment process. The simulation results show the effectiveness of the proposed method.
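The paper's actual randomized algorithm is not reproduced in the abstract, but combining the four named criteria into a score and then sampling among the best-scoring eligible experts can be sketched. The weights, numeric values, and expert names below are all invented.

```python
import random

# Hypothetical candidate pool scored on the paper's four criteria;
# the numeric values and weights below are invented for illustration.
candidates = {
    "expert_a": dict(fitness=0.90, intensity=0.70, association=0.20, conflict=0.0),
    "expert_b": dict(fitness=0.80, intensity=0.90, association=0.10, conflict=0.0),
    "expert_c": dict(fitness=0.95, intensity=0.80, association=0.60, conflict=1.0),
}

def suitability(c):
    """Reward fitness degree and research intensity; penalise close
    academic association; exclude any conflict of interest outright."""
    if c["conflict"]:
        return float("-inf")
    return 0.5 * c["fitness"] + 0.3 * c["intensity"] - 0.2 * c["association"]

def pick_reviewers(candidates, k=2, rng=None):
    rng = rng or random.Random(0)
    eligible = [n for n in candidates if suitability(candidates[n]) > float("-inf")]
    eligible.sort(key=lambda n: suitability(candidates[n]), reverse=True)
    # Randomly sample among the best-scoring eligible experts: the
    # scores keep the choice accurate, the randomisation keeps it impartial.
    pool = eligible[: k + 1]
    return rng.sample(pool, min(k, len(pool)))

print(pick_reviewers(candidates))
```

Here the conflicted expert is excluded as a hard constraint, which is one plausible reading of "potential conflict of interest"; a softer penalty would also fit the described framework.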
International Nuclear Information System (INIS)
Kinoshita, Masahiro; Naruse, Yuji
1981-08-01
The basic equations are derived for rigorous dynamic simulation of cryogenic distillation columns for hydrogen isotope separation. The model accounts for such factors as differences in latent heat of vaporization among the six isotopic species of molecular hydrogen, decay heat of tritium, heat transfer through the column wall and nonideality of the solutions. Provision is also made for simulation of columns with multiple feeds and multiple sidestreams. (author)
Rigor mortis development in turkey breast muscle and the effect of electrical stunning.
Alvarado, C Z; Sams, A R
2000-11-01
Rigor mortis development in turkey breast muscle and the effect of electrical stunning on this process are not well characterized. Some electrical stunning procedures have been known to inhibit postmortem (PM) biochemical reactions, thereby delaying the onset of rigor mortis in broilers. Therefore, this study was designed to characterize rigor mortis development in stunned and unstunned turkeys. A total of 154 turkey toms in two trials were conventionally processed at 20 to 22 wk of age. Turkeys were either stunned with a pulsed direct current (500 Hz, 50% duty cycle) at 35 mA (40 V) in a saline bath for 12 seconds or left unstunned as controls. At 15 min and 1, 2, 4, 8, 12, and 24 h PM, pectoralis samples were collected to determine pH, R-value, L* value, sarcomere length, and shear value. In Trial 1, the samples obtained for pH, R-value, and sarcomere length were divided into surface and interior samples. There were no significant differences between the surface and interior samples among any parameters measured. Muscle pH significantly decreased over time in stunned and unstunned birds through 2 h PM. The R-values increased to 8 h PM in unstunned birds and 24 h PM in stunned birds. The L* values increased over time, with no significant differences after 1 h PM for the controls and 2 h PM for the stunned birds. Sarcomere length increased through 2 h PM in the controls and 12 h PM in the stunned fillets. Cooked meat shear values decreased through the 1 h PM deboning time in the control fillets and 2 h PM in the stunned fillets. These results suggest that stunning delayed the development of rigor mortis through 2 h PM, but had no significant effect on the measured parameters at later time points, and that deboning turkey breasts at 2 h PM or later will not significantly impair meat tenderness.
Learning from Science and Sport - How we, Safety, "Engage with Rigor"
Herd, A.
2012-01-01
As the world of spaceflight safety is relatively small and potentially inward-looking, we need to be aware of the "outside world". We should then try to remind ourselves to be open to the possibility that data, knowledge or experience from outside of the spaceflight community may provide some constructive alternate perspectives. This paper will assess aspects from two seemingly tangential fields, science and sport, and align these with the world of safety. In doing so, it offers useful insights into the challenges we face and may suggest solutions relevant to our everyday work of safety engineering. Sport, particularly a contact sport such as rugby union, requires direct interaction between members of two (opposing) teams: professional, accurately timed and positioned interaction for a desired outcome. These interactions, whilst an essential part of the game, are however not without their constraints. The rugby scrum has constraints as to the formation and engagement of the two teams. The controlled engagement provides for an interaction between the two teams in a safe manner. These constraints arise from the reality that an incorrect engagement could cause serious injury to members of either team. In academia, scientific rigor is applied to assure that the arguments provided and the conclusions drawn in academic papers presented for publication are valid, legitimate and credible. The scientific goal of the need for rigor may be expressed in the example of achieving a statistically relevant sample size, n, in order to assure analysis validity of the data pool. A failure to apply rigor could then place the entire study at risk of failing to have the respective paper published. This paper will consider the merits of these two different aspects, scientific rigor and sports engagement, and offer a reflective look at how this may provide a "modus operandi" for safety engineers at any level whether at their desks (creating or reviewing safety assessments) or in a
Rigor force responses of permeabilized fibres from fast and slow skeletal muscles of aged rats.
Plant, D R; Lynch, G S
2001-09-01
1. Ageing is generally associated with a decline in skeletal muscle mass and strength and a slowing of muscle contraction, factors that impact upon the quality of life for the elderly. The mechanisms underlying this age-related muscle weakness have not been fully resolved. The purpose of the present study was to determine whether the decrease in muscle force as a consequence of age could be attributed partly to a decrease in the number of cross-bridges participating during contraction. 2. Given that the rigor force is proportional to the approximate total number of interacting sites between the actin and myosin filaments, we tested the null hypothesis that the rigor force of permeabilized muscle fibres from young and old rats would not be different. 3. Permeabilized fibres from the extensor digitorum longus (fast-twitch; EDL) and soleus (predominantly slow-twitch) muscles of young (6 months of age) and old (27 months of age) male F344 rats were activated in Ca2+-buffered solutions to determine force-pCa characteristics (where pCa = -log(10)[Ca2+]) and then in solutions lacking ATP and Ca2+ to determine rigor force levels. 4. The rigor forces for EDL and soleus muscle fibres were not different between young and old rats, indicating that the approximate total number of cross-bridges that can be formed between filaments did not decline with age. We conclude that the age-related decrease in force output is more likely attributed to a decrease in the force per cross-bridge and/or decreases in the efficiency of excitation-contraction coupling.
Rigorous Integration of Non-Linear Ordinary Differential Equations in Chebyshev Basis
Czech Academy of Sciences Publication Activity Database
Dzetkulič, Tomáš
2015-01-01
Vol. 69, No. 1 (2015), pp. 183-205 ISSN 1017-1398 R&D Projects: GA MŠk OC10048; GA ČR GD201/09/H057 Institutional research plan: CEZ:AV0Z10300504 Keywords: Initial value problem * Rigorous integration * Taylor model * Chebyshev basis Subject RIV: IN - Informatics, Computer Science Impact factor: 1.366, year: 2015
A rigorous proof of the Landau-Peierls formula and much more
DEFF Research Database (Denmark)
Briet, Philippe; Cornean, Horia; Savoie, Baptiste
2012-01-01
We present a rigorous mathematical treatment of the zero-field orbital magnetic susceptibility of a non-interacting Bloch electron gas, at fixed temperature and density, for both metals and semiconductors/insulators. In particular, we obtain the Landau-Peierls formula in the low temperature and density limit, as conjectured by Kjeldaas and Kohn (Phys Rev 105:806–813, 1957).
International Nuclear Information System (INIS)
Cuttino, Laurie W.; Heffernan, Jill; Vera, Robyn; Rosu, Mihaela; Ramakrishnan, V. Ramesh; Arthur, Douglas W.
2011-01-01
Purpose: Multiple investigations have used the skin distance as a surrogate for the skin dose and have shown that distances 4.05 Gy/fraction. Conclusion: The initial skin dose recommendations have been based on safe use and the avoidance of significant toxicity. The results from the present study have suggested that patients might further benefit if more rigorous constraints were applied and if the skin dose were limited to 120% of the prescription dose.
Re-establishment of rigor mortis: evidence for a considerably longer post-mortem time span.
Crostack, Chiara; Sehner, Susanne; Raupach, Tobias; Anders, Sven
2017-07-01
Re-establishment of rigor mortis following mechanical loosening is used as part of the complex method for the forensic estimation of the time since death in human bodies and has formerly been reported to occur up to 8-12 h post-mortem (hpm). We recently described our observation of the phenomenon in up to 19 hpm in cases with in-hospital death. Due to the case selection (preceding illness, immobilisation), transfer of these results to forensic cases might be limited. We therefore examined 67 out-of-hospital cases of sudden death with known time points of death. Re-establishment of rigor mortis was positive in 52.2% of cases and was observed up to 20 hpm. In contrast to the current doctrine that a recurrence of rigor mortis is always of a lesser degree than its first manifestation in a given patient, muscular rigidity at re-establishment equalled or even exceeded the degree observed before dissolving in 21 joints. Furthermore, this is the first study to describe that the phenomenon appears to be independent of body or ambient temperature.
The MIXED framework: A novel approach to evaluating mixed-methods rigor.
Eckhardt, Ann L; DeVon, Holli A
2017-10-01
Evaluation of rigor in mixed-methods (MM) research is a persistent challenge due to the combination of inconsistent philosophical paradigms, the use of multiple research methods which require different skill sets, and the need to combine research at different points in the research process. Researchers have proposed a variety of ways to thoroughly evaluate MM research, but each method fails to provide a framework that is useful for the consumer of research. In contrast, the MIXED framework is meant to bridge the gap between an academic exercise and practical assessment of a published work. The MIXED framework (methods, inference, expertise, evaluation, and design) borrows from previously published frameworks to create a useful tool for the evaluation of a published study. The MIXED framework uses an experimental eight-item scale that allows for comprehensive integrated assessment of MM rigor in published manuscripts. Mixed methods are becoming increasingly prevalent in nursing and healthcare research requiring researchers and consumers to address issues unique to MM such as evaluation of rigor. © 2017 John Wiley & Sons Ltd.
Application of the rigorous method to x-ray and neutron beam scattering on rough surfaces
International Nuclear Information System (INIS)
Goray, Leonid I.
2010-01-01
The paper presents a comprehensive numerical analysis of x-ray and neutron scattering from finite-conducting rough surfaces which is performed in the frame of the boundary integral equation method in a rigorous formulation for high ratios of characteristic dimension to wavelength. The single integral equation obtained involves boundary integrals of the single and double layer potentials. A more general treatment of the energy conservation law applicable to absorption gratings and rough mirrors is considered. In order to compute the scattering intensity of rough surfaces using the forward electromagnetic solver, Monte Carlo simulation is employed to average the deterministic diffraction grating efficiency due to individual surfaces over an ensemble of realizations. Some rules appropriate for numerical implementation of the theory at small wavelength-to-period ratios are presented. The difference between the rigorous approach and approximations can be clearly seen in specular reflectances of Au mirrors with different roughness parameters at wavelengths where grazing incidence occurs at close to or larger than the critical angle. This difference may give rise to wrong estimates of rms roughness and correlation length if they are obtained by comparing experimental data with calculations. Besides, the rigorous approach permits taking into account any known roughness statistics and allows exact computation of diffuse scattering.
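The Monte Carlo averaging step described above can be illustrated independently of the boundary-integral solver. In the sketch below, a simple Debye-Waller-style damping factor stands in for the rigorous per-surface computation, and the roughness statistics and parameter values are invented; only the ensemble-averaging pattern is the point.

```python
import math
import random

def toy_reflectance(rms_roughness_nm, wavelength_nm, angle_from_normal_rad):
    # Stand-in for the rigorous solver: a Debye-Waller-style damping
    # factor for specular reflectance off a rough surface.
    q = 4.0 * math.pi * math.cos(angle_from_normal_rad) / wavelength_nm
    return math.exp(-(q * rms_roughness_nm) ** 2)

def ensemble_average(n_realizations, wavelength_nm, angle_rad, rng):
    # Average the deterministic per-surface result over an ensemble of
    # randomly drawn roughness realizations, as in the Monte Carlo step.
    total = 0.0
    for _ in range(n_realizations):
        sigma = abs(rng.gauss(1.0, 0.2))   # rms roughness in nm, invented stats
        total += toy_reflectance(sigma, wavelength_nm, angle_rad)
    return total / n_realizations

print(ensemble_average(1000, 0.154, math.radians(89.5), random.Random(1)))
```

In the paper, each realization would instead be a full deterministic diffraction-efficiency computation for one generated surface profile; the averaging loop is unchanged.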
Whitley, Meredith A.
2014-01-01
While the quality and quantity of research on service-learning has increased considerably over the past 20 years, researchers as well as governmental and funding agencies have called for more rigor in service-learning research. One key variable in improving rigor is using relevant existing theories to improve the research. The purpose of this…
Bombaerts, G.; Nickel, P.J.
2017-01-01
We inquire how peer and tutor feedback influences students' optimal rigor, basic needs and motivation. We analyze questionnaires from two courses in two subsequent years. We conclude that feedback in blended learning can contribute to rigor and basic needs, but it is not clear from our data what
Optimal correction and design parameter search by modern methods of rigorous global optimization
International Nuclear Information System (INIS)
Makino, K.; Berz, M.
2011-01-01
Frequently the design of schemes for correction of aberrations or the determination of possible operating ranges for beamlines and cells in synchrotrons exhibits multitudes of possibilities for their correction, usually appearing in disconnected regions of parameter space which cannot be directly qualified by analytical means. In such cases, frequently an abundance of optimization runs is carried out, each of which determines a local minimum depending on the specific chosen initial conditions. Practical solutions are then obtained through an often extended interplay of experienced manual adjustment of certain suitable parameters and local searches by varying other parameters. However, in a formal sense this problem can be viewed as a global optimization problem, i.e. the determination of all solutions within a certain range of parameters that lead to a specific optimum. For example, it may be of interest to find all possible settings of multiple quadrupoles that can achieve imaging; or to find ahead of time all possible settings that achieve a particular tune; or to find all possible manners to adjust nonlinear parameters to achieve correction of high order aberrations. These tasks can easily be phrased in terms of such an optimization problem; but while mathematically this formulation is often straightforward, it has been common belief that it is of limited practical value since the resulting optimization problem cannot usually be solved. However, recent significant advances in modern methods of rigorous global optimization make these methods feasible for optics design for the first time. The key ideas of the method lie in an interplay of rigorous local underestimators of the objective functions, and by using the underestimators to rigorously iteratively eliminate regions that lie above already known upper bounds of the minima, in what is commonly known as a branch-and-bound approach. Recent enhancements of the Differential Algebraic methods used in particle
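The branch-and-bound idea sketched in this abstract (rigorous lower bounds eliminating regions that lie above known upper bounds) can be illustrated on a one-dimensional toy objective. The interval arithmetic below is rigorous only up to floating-point rounding; a production implementation would use outward rounding, and the objective is invented for illustration.

```python
def isq(lo, hi):
    """Interval extension of x**2 over [lo, hi]."""
    if lo <= 0.0 <= hi:
        return 0.0, max(lo * lo, hi * hi)
    a, b = lo * lo, hi * hi
    return min(a, b), max(a, b)

def f(x):
    return (x * x - 2.0) ** 2          # minima at x = +/- sqrt(2)

def f_interval(lo, hi):
    """Natural interval extension of f: an enclosure of f's range
    over [lo, hi] (rigorous up to floating-point rounding)."""
    slo, shi = isq(lo, hi)
    return isq(slo - 2.0, shi - 2.0)

def branch_and_bound(lo, hi, tol=1e-6):
    """Keep subdividing boxes; discard any box whose rigorous lower
    bound exceeds the best upper bound found so far."""
    best_ub = f((lo + hi) / 2.0)       # any sample point gives an upper bound
    boxes, minima = [(lo, hi)], []
    while boxes:
        a, b = boxes.pop()
        lb, _ = f_interval(a, b)
        if lb > best_ub:
            continue                   # region rigorously eliminated
        mid = (a + b) / 2.0
        best_ub = min(best_ub, f(mid))
        if b - a < tol:
            minima.append((a, b))      # candidate region, width below tol
        else:
            boxes += [(a, mid), (mid, b)]
    return best_ub, minima

ub, regions = branch_and_bound(-3.0, 3.0)
print(ub, len(regions))
```

The surviving boxes cluster around both global minimizers, illustrating how the approach finds all optima in a range rather than one local minimum per run.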
Dual earners’ willingness to accept an international assignment.
van der Velde, E.G.; Bossink, C.J.H.; Jansen, P.G.W.
2005-01-01
Multinational organisations experience difficulties in finding managers willing to accept international assignments. This study has therefore focused on factors that can predict males' and females' willingness to accept international assignments, or to follow their partners on international
Efficient Mechanisms to Allocate Assignment Incentives in the Navy
National Research Council Canada - National Science Library
Nimon, R. W; Hall, Ricky D; Zaki, Hossam
2005-01-01
.... All assignments, however, may not necessarily be voluntary. These assignments (jobs) have been labeled as "hard to fill" by Navy leadership, and the Navy has implemented market-based, cash stipends to attract Sailors to these jobs...
Edgington, Eugene
2007-01-01
Contents: Statistical Tests That Do Not Require Random Sampling; Randomization Tests; Numerical Examples; Randomization Tests and Nonrandom Samples; The Prevalence of Nonrandom Samples in Experiments; The Irrelevance of Random Samples for the Typical Experiment; Generalizing from Nonrandom Samples; Intelligibility; Respect for the Validity of Randomization Tests; Versatility; Practicality; Precursors of Randomization Tests; Other Applications of Permutation Tests; Questions and Exercises; Notes; References; Randomized Experiments; Unique Benefits of Experiments; Experimentation without Mani
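The randomization-test idea this book covers can be sketched as a small exact two-sample permutation test; the data values below are invented. Every way the pooled observations could have been split into groups of the observed sizes is enumerated, and the p-value is the fraction of splits at least as extreme as the observed one.

```python
from itertools import combinations

def permutation_test(x, y):
    """Exact two-sample randomization test for a difference in means:
    enumerate every split of the pooled data into groups of the observed
    sizes and count splits whose absolute mean difference is at least
    as large as the observed one."""
    pooled = x + y
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    n, k = len(pooled), len(x)
    count, total = 0, 0
    for idx in combinations(range(n), k):
        chosen = set(idx)
        g1 = [pooled[i] for i in idx]
        g2 = [pooled[i] for i in range(n) if i not in chosen]
        diff = abs(sum(g1) / k - sum(g2) / (n - k))
        total += 1
        if diff >= observed - 1e-12:   # tolerance for float round-off
            count += 1
    return count / total

x = [12.1, 11.4, 13.0, 12.7]   # invented treatment scores
y = [10.2, 10.9, 9.8, 11.1]    # invented control scores
print(permutation_test(x, y))
```

Note that the justification rests on the random assignment of units to groups, not on random sampling from a population, which is the book's central point.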
Comparing Examples: WebAssign versus Textbook
Richards, Evan; Polak, Jeff; Hardin, Ashley; Risley, John, , Dr.
2005-11-01
Research shows students can learn from worked examples.^1 This pilot study compared two groups of students' performance (10 each) in solving physics problems. One group had access to interactive examples^2 released in WebAssign^3, while the other group had access to the counterpart textbook examples. Verbal data from students in problem solving sessions was collected using a think aloud protocol^4 and the data was analyzed using Chi's procedures.^5 An explanation of the methodology and results will be presented. Future phases of this pilot study based upon these results will also be discussed. ^1Atkinson, R.K., Derry, S.J., Renkl A., Wortham, D. (2000). ``Learning from Examples: Instructional Principles from the Worked Examples Research'', Review of Educational Research, vol. 70, n. 2, pp. 181-214. ^2Serway, R.A. & Faughn, J.S. (2006). College Physics (7^th ed.). Belmont, CA: Thomson Brooks/Cole. ^3 see www.webassign.net ^4 Ericsson, K.A. & Simon, H.A. (1984). Protocol Analysis: Verbal Reports as Data. Cambridge, Massachusetts: The MIT Press. ^5 Chi, Michelene T.H. (1997). ``Quantifying Qualitative Analyses of Verbal Data: A Practical Guide,'' The Journal of the Learning Sciences, vol. 6, n. 3, pp. 271-315.
Optimal assignment of incoming flights to baggage carousels at airports
DEFF Research Database (Denmark)
Barth, Torben C.
The problem considered in this report is an assignment problem occurring at airports. This problem concerns the assignment of baggage carousels in baggage claim halls to arriving aircraft (baggage carousel assignment problem). This is a highly dynamic problem since disruptions frequently occur du...... and in general is a substantial support in decision making....
Computational Aspects of Assigning Agents to a Line
DEFF Research Database (Denmark)
Aziz, Haris; Hougaard, Jens Leth; Moreno-Ternero, Juan D.
2017-01-01
-egalitarian assignments. The approach relies on an algorithm which is shown to be faster than general purpose algorithms for the assignment problem. We also extend the approach to probabilistic assignments and explore the computational features of existing, as well as new, methods for this setting....
The Presentation Assignment: Creating Learning Opportunities for Diverse Student Populations.
Spencer, Brenda H.; Bartle-Angus, Kathryn
2000-01-01
Finds the presentation assignment to be an effective method of providing students with the opportunity to apply the literacy skills they are learning in ways that are personally meaningful. Describes the presentation assignment framework and provides an example of an assignment that required students to analyze and interpret works of literature…
Assignment Procedures in the Air Force Procurement Management Information System.
Ward, Joe H., Jr.; And Others
An overview is presented of the procedure for offering jobs in the Air Force Procurement Management Information System (PROMIS), an assignment system which makes possible the use of human resources research findings to improve individual personnel assignments. A general framework for viewing personnel assignment systems is presented; then job…
75 FR 55354 - Delegation of Authority and Assignment of Responsibilities
2010-09-10
... DEPARTMENT OF LABOR Office of the Secretary Delegation of Authority and Assignment of Responsibilities Secretary's Order 3-2010 Subject: Delegation of Authority and Assignment of Responsibilities to... Secretary to enforce sections 18A and 18B of the FLSA. 4. Delegation of Authority and Assignment of...
75 FR 55355 - Delegation of Authority and Assignment of Responsibility
2010-09-10
... DEPARTMENT OF LABOR Office of the Secretary Delegation of Authority and Assignment of Responsibility Secretary's Order 4-2010 Subject: Delegation of Authority and Assignment of Responsibility to the... delegations and assignments in full force and effect, except as expressly modified herein. 4. Delegation of...
7 CFR 900.106 - Assignment of mediator.
2010-01-01
... 7 Agriculture 8 2010-01-01 2010-01-01 false Assignment of mediator. 900.106 Section 900.106 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing... Assignment of mediator. The Director of the Division shall assign a mediator, from the group designated by...
A property of assignment type mixed integer linear programming problems
Benders, J.F.; van Nunen, J.A.E.E.
1982-01-01
In this paper we will prove that rather tight upper bounds can be given for the number of non-unique assignments that are achieved after solving the linear programming relaxation of some types of mixed integer linear assignment problems. Since in these cases the number of split assignments is
One of My Favorite Assignments: Automated Teller Machine Simulation.
Oberman, Paul S.
2001-01-01
Describes an assignment for an introductory computer science class that requires the student to write a software program that simulates an automated teller machine. Highlights include an algorithm for the assignment; sample file contents; language features used; assignment variations; and discussion points. (LRW)
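A minimal version of the kind of program this assignment asks for might look like the sketch below. The actual course requirements (file contents, PIN handling, language features) are not reproduced; this is just the core account logic with an insufficient-funds check.

```python
class ATM:
    """Minimal teller-machine model: balance inquiry, deposit, and
    withdrawal with a simple insufficient-funds check (file I/O and
    PIN validation, which the assignment likely requires, are omitted)."""

    def __init__(self, balance=0.0):
        self.balance = balance

    def deposit(self, amount):
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.balance += amount
        return self.balance

    def withdraw(self, amount):
        if amount <= 0:
            raise ValueError("withdrawal must be positive")
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        return self.balance

atm = ATM(100.0)
atm.deposit(50.0)
atm.withdraw(30.0)
print(atm.balance)   # 120.0
```

Assignment variations mentioned in the article (e.g. persisting accounts to a file) would extend this class rather than change its interface.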
Student generated assignments about electrical circuits in a computer simulation
Vreman-de Olde, Cornelise; de Jong, Anthonius J.M.
2004-01-01
In this study we investigated the design of assignments by students as a knowledge-generating activity. Students were required to design assignments for 'other students' in a computer simulation environment about electrical circuits. Assignments consisted of a question, alternatives, and feedback on
Effect of muscle restraint on sheep meat tenderness with rigor mortis at 18°C.
Devine, Carrick E; Payne, Steven R; Wells, Robyn W
2002-02-01
The effect on shear force of skeletal restraint and removing muscles from lamb m. longissimus thoracis et lumborum (LT) immediately after slaughter and electrical stimulation was undertaken at a rigor temperature of 18°C (n=15). The temperature of 18°C was achieved through chilling of electrically stimulated sheep carcasses in air at 12°C, air flow 1-1.5 ms(-2). In other groups, the muscle was removed at 2.5 h post-mortem and either wrapped or left non-wrapped before being placed back on the carcass to follow carcass cooling regimes. Following rigor mortis, the meat was aged for 0, 16, 40 and 65 h at 15°C and frozen. For the non-stimulated samples, the meat was aged for 0, 12, 36 and 60 h before being frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values obtained from a 1 × 1 cm cross-section. Commencement of ageing was considered to take place at rigor mortis and this was taken as zero aged meat. There were no significant differences in the rate of tenderisation and initial shear force for all treatments. The 23% cook loss was similar for all wrapped and non-wrapped situations and the values decreased slightly with longer ageing durations. Wrapping was shown to mimic meat left intact on the carcass, as it prevented significant prerigor shortening. Such techniques allows muscles to be removed and placed in a controlled temperature environment to enable precise studies of ageing processes.
Biomedical text mining for research rigor and integrity: tasks, challenges, directions.
Kilicoglu, Halil
2017-06-13
An estimated quarter of a trillion US dollars is invested in the biomedical research enterprise annually. There is growing alarm that a significant portion of this investment is wasted because of problems in reproducibility of research findings and in the rigor and integrity of research conduct and reporting. Recent years have seen a flurry of activities focusing on standardization and guideline development to enhance the reproducibility and rigor of biomedical research. Research activity is primarily communicated via textual artifacts, ranging from grant applications to journal publications. These artifacts can be both the source and the manifestation of practices leading to research waste. For example, an article may describe a poorly designed experiment, or the authors may reach conclusions not supported by the evidence presented. In this article, we pose the question of whether biomedical text mining techniques can assist the stakeholders in the biomedical research enterprise in doing their part toward enhancing research integrity and rigor. In particular, we identify four key areas in which text mining techniques can make a significant contribution: plagiarism/fraud detection, ensuring adherence to reporting guidelines, managing information overload and accurate citation/enhanced bibliometrics. We review the existing methods and tools for specific tasks, if they exist, or discuss relevant research that can provide guidance for future work. With the exponential increase in biomedical research output and the ability of text mining approaches to perform automatic tasks at large scale, we propose that such approaches can support tools that promote responsible research practices, providing significant benefits for the biomedical research enterprise. Published by Oxford University Press 2017. This work is written by a US Government employee and is in the public domain in the US.
Rigorous spin-spin correlation function of Ising model on a special kind of Sierpinski Carpets
International Nuclear Information System (INIS)
Yang, Z.R.
1993-10-01
We have exactly calculated the rigorous spin-spin correlation function of the Ising model on a special kind of Sierpinski Carpets (SC's) by means of graph expansion and a combinatorial approach, and investigated the asymptotic behaviour in the limit of long distance. The results show that there is no long-range correlation between spins at any finite temperature, which indicates that no phase transition exists and thus finally confirms the conclusion produced by the renormalization group method and other physical arguments. (author). 7 refs, 6 figs
DEFF Research Database (Denmark)
Gaspar, Jozsef; Ritschel, Tobias Kasper Skovborg; Jørgensen, John Bagterp
2017-01-01
-linear model based control to achieve optimal techno-economic performance. Accordingly, this work presents a computationally efficient and novel approach for solving a tray-by-tray equilibrium model and its implementation for open-loop optimal control of a cryogenic distillation column. Here, the optimisation objective is to reduce the cost of compression in a volatile electricity market while meeting the production requirements, i.e. product flow rate and purity. This model is implemented in Matlab and uses the ThermoLib rigorous thermodynamic library. The present work represents a first step towards plant...
A study into first-year engineering education success using a rigorous mixed methods approach
DEFF Research Database (Denmark)
van den Bogaard, M.E.D.; de Graaff, Erik; Verbraek, Alexander
2015-01-01
The aim of this paper is to combine qualitative and quantitative research methods into rigorous research on student success. Research methods have weaknesses that can be overcome by clever combinations. In this paper we use a situated study into student success as an example of how methods...... using statistical techniques. The main elements of the model were student behaviour and student disposition, which were influenced by the students’ perceptions of the education environment. The outcomes of the qualitative studies were useful in interpreting the outcomes of the structural equation
A Rigorous Treatment of Energy Extraction from a Rotating Black Hole
Finster, F.; Kamran, N.; Smoller, J.; Yau, S.-T.
2009-05-01
The Cauchy problem is considered for the scalar wave equation in the Kerr geometry. We prove that by choosing a suitable wave packet as initial data, one can extract energy from the black hole, thereby putting superradiance, the wave analogue of the Penrose process, into a rigorous mathematical framework. We quantify the maximal energy gain. We also compute the infinitesimal change of mass and angular momentum of the black hole, in agreement with Christodoulou’s result for the Penrose process. The main mathematical tool is our previously derived integral representation of the wave propagator.
Automated sequence-specific protein NMR assignment using the memetic algorithm MATCH
International Nuclear Information System (INIS)
Volk, Jochen; Herrmann, Torsten; Wuethrich, Kurt
2008-01-01
MATCH (Memetic Algorithm and Combinatorial Optimization Heuristics) is a new memetic algorithm for automated sequence-specific polypeptide backbone NMR assignment of proteins. MATCH employs local optimization for tracing partial sequence-specific assignments within a global, population-based search environment, where the simultaneous application of local and global optimization heuristics guarantees high efficiency and robustness. MATCH thus makes combined use of the two predominant concepts in use for automated NMR assignment of proteins. Dynamic transition and inherent mutation are new techniques that enable automatic adaptation to variable quality of the experimental input data. The concept of dynamic transition is incorporated in all major building blocks of the algorithm, where it enables switching between local and global optimization heuristics at any time during the assignment process. Inherent mutation restricts the intrinsically required randomness of the evolutionary algorithm to those regions of the conformation space that are compatible with the experimental input data. Using intact and artificially deteriorated APSY-NMR input data of proteins, MATCH performed sequence-specific resonance assignment with high efficiency and robustness.
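The interplay described above, a population-based global search wrapped around local optimization of every candidate, can be sketched on a toy bitstring objective. This is a hypothetical stand-in: MATCH's actual scoring of APSY-NMR assignments is far richer than the `fitness` used here.

```python
import random

def fitness(bits):
    """Toy objective (stand-in for an assignment score): count of 1-bits."""
    return sum(bits)

def local_search(bits):
    """Hill climbing: keep single-bit flips that improve fitness."""
    bits = bits[:]
    best = fitness(bits)
    improved = True
    while improved:
        improved = False
        for i in range(len(bits)):
            bits[i] ^= 1
            f = fitness(bits)
            if f > best:
                best, improved = f, True
            else:
                bits[i] ^= 1  # revert the flip
    return bits

def memetic_search(n=20, pop_size=8, generations=50, seed=1):
    """Global search (crossover + mutation) with local refinement of every
    new candidate: the defining memetic combination."""
    rng = random.Random(seed)
    pop = [local_search([rng.randint(0, 1) for _ in range(n)])
           for _ in range(pop_size)]
    for _ in range(generations):
        a, b = rng.sample(pop, 2)
        cut = rng.randrange(1, n)
        child = a[:cut] + b[cut:]      # global step: one-point crossover
        child[rng.randrange(n)] ^= 1   # global step: point mutation
        child = local_search(child)    # local step: hill climbing
        worst = min(range(pop_size), key=lambda k: fitness(pop[k]))
        if fitness(child) >= fitness(pop[worst]):
            pop[worst] = child
    return max(pop, key=fitness)
```

On this trivial objective the local search alone already finds the optimum; the point of the sketch is only the control flow in which local and global heuristics alternate.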
Directory of Open Access Journals (Sweden)
Rachel L. Rosenthal
2014-10-01
Full Text Available The objective of the present study is to determine the likelihood of injured or poisoned patients in special populations, such as patients who are elderly or self-injurious, being seen within an emergency department’s triage-nurse-assigned urgency. Data from the National Hospital Ambulatory Medical Care Survey (2007) were utilized in this study. Multi-level models and multivariate linear regression models were used; patient age, sex, reported pain levels, wait time, and injury type were examined as potential predictors of being seen within assigned urgency. From a random sample across all US emergency departments, 5616 patients nested in 312 hospital emergency departments were included in the study. Typically, approximately 1 in 5 emergency department patients were not seen within their triage-nurse-assigned urgencies. The typical patient in the average hospital had an 81% likelihood of being seen within their assigned urgency. Patients who were oldest [odds ratio (OR) = 0.0990] and had self-inflicted injuries (vs assault, OR = 1.246 and OR = 1.596) had the least likelihood of being seen within their assigned urgencies. As actual wait time increased for patients, they were less likely to be seen within their assigned urgencies. The most powerful predictors of the study’s outcome were injury type and age, indicating that patients from special populations such as the elderly or those with injuries resulting from deliberate self-harm are less likely to be treated as actual priority patients independent of triage-nurse-assigned urgencies.
Devine, Carrick; Wells, Robyn; Lowe, Tim; Waller, John
2014-01-01
The M. longissimus from lambs electrically stimulated at 15 min post-mortem was removed after grading, wrapped in polythene film, and held at 4 (n=6), 7 (n=6), 15 (n=6, n=8), or 35°C (n=6) until rigor mortis, then aged at 15°C for 0, 4, 24, and 72 h post-rigor. Centrifuged free water increased exponentially, and bound water, dry matter, and shear force decreased exponentially over time. Decreases in shear force and increases in free water were closely related (r² = 0.52) and were unaffected by pre-rigor temperatures. © 2013.
Tailored Random Graph Ensembles
International Nuclear Information System (INIS)
Roberts, E S; Annibale, A; Coolen, A C C
2013-01-01
Tailored graph ensembles are a developing bridge between biological networks and statistical mechanics. The aim is to use this concept to generate a suite of rigorous tools that can be used to quantify and compare the topology of cellular signalling networks, such as protein-protein interaction networks and gene regulation networks. We calculate exact and explicit formulae for the leading orders in the system size of the Shannon entropies of random graph ensembles constrained with degree distribution and degree-degree correlation. We also construct an ergodic detailed balance Markov chain with non-trivial acceptance probabilities which converges to a strictly uniform measure and is based on edge swaps that conserve all degrees. The acceptance probabilities can be generalized to define Markov chains that target any alternative desired measure on the space of directed or undirected graphs, in order to generate graphs with more sophisticated topological features.
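The degree-conserving edge-swap move at the heart of such Markov chains can be sketched as follows. Note that this minimal version simply rejects invalid swaps and omits the paper's non-trivial acceptance probabilities, which are what guarantee convergence to a strictly uniform (or otherwise targeted) measure.

```python
import random

def degree_sequence(edges, n):
    """Degree of each of the n nodes in an undirected edge list."""
    deg = [0] * n
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return deg

def edge_swap_chain(edges, steps, seed=0):
    """Randomize an undirected simple graph by degree-preserving double edge
    swaps: (a-b, c-d) -> (a-d, c-b), rejecting self-loops and parallel edges."""
    rng = random.Random(seed)
    edges = [tuple(e) for e in edges]
    edge_set = set(frozenset(e) for e in edges)
    for _ in range(steps):
        i, j = rng.sample(range(len(edges)), 2)
        a, b = edges[i]
        c, d = edges[j]
        if rng.random() < 0.5:   # random orientation of the second edge
            c, d = d, c
        if len({a, b, c, d}) < 4:
            continue             # would create a self-loop
        if frozenset((a, d)) in edge_set or frozenset((c, b)) in edge_set:
            continue             # would create a parallel edge
        edge_set -= {frozenset((a, b)), frozenset((c, d))}
        edge_set |= {frozenset((a, d)), frozenset((c, b))}
        edges[i], edges[j] = (a, d), (c, b)
    return edges
```

Each accepted swap preserves every node's degree, so the chain explores exactly the set of simple graphs with the prescribed degree sequence.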
Parent Management Training-Oregon Model: Adapting Intervention with Rigorous Research.
Forgatch, Marion S; Kjøbli, John
2016-09-01
Parent Management Training-Oregon Model (PMTO®) is a set of theory-based parenting programs with status as evidence-based treatments. PMTO has been rigorously tested in efficacy and effectiveness trials in different contexts, cultures, and formats. Parents, the presumed agents of change, learn core parenting practices, specifically skill encouragement, limit setting, monitoring/supervision, interpersonal problem solving, and positive involvement. The intervention effectively prevents and ameliorates children's behavior problems by replacing coercive interactions with positive parenting practices. Delivery format includes sessions with individual families in agencies or families' homes, parent groups, and web-based and telehealth communication. Mediational models have tested parenting practices as mechanisms of change for children's behavior and found support for the theory underlying PMTO programs. Moderating effects include children's age, maternal depression, and social disadvantage. The Norwegian PMTO implementation is presented as an example of how PMTO has been tailored to reach diverse populations as delivered by multiple systems of care throughout the nation. An implementation and research center in Oslo provides infrastructure and promotes collaboration between practitioners and researchers to conduct rigorous intervention research. Although evidence-based and tested within a wide array of contexts and populations, PMTO must continue to adapt to an ever-changing world. © 2016 Family Process Institute.
Efficiency versus speed in quantum heat engines: Rigorous constraint from Lieb-Robinson bound
Shiraishi, Naoto; Tajima, Hiroyasu
2017-08-01
The long-standing open problem of whether a heat engine with finite power can achieve the Carnot efficiency is investigated. We rigorously prove a general trade-off inequality between the thermodynamic efficiency and the time interval of a cyclic process in quantum heat engines. As a first step, employing the Lieb-Robinson bound, we establish an inequality on the change in a local observable caused by an operation far from the support of that observable. This inequality provides a rigorous characterization of the intuitive picture that most of the energy emitted from the engine to the cold bath remains near the engine when the cyclic process is finished. Using this characterization, we prove an upper bound on efficiency with the aid of quantum information geometry. Our result generally excludes the possibility of a process with finite speed at the Carnot efficiency in quantum heat engines. In particular, the obtained constraint covers engines evolving with non-Markovian dynamics, which almost all previous studies on this topic fail to address.
Rigorous RG Algorithms and Area Laws for Low Energy Eigenstates in 1D
Arad, Itai; Landau, Zeph; Vazirani, Umesh; Vidick, Thomas
2017-11-01
One of the central challenges in the study of quantum many-body systems is the complexity of simulating them on a classical computer. A recent advance (Landau et al. in Nat Phys, 2015) gave a polynomial time algorithm to compute a succinct classical description for unique ground states of gapped 1D quantum systems. Despite this progress many questions remained unsolved, including whether there exist efficient algorithms when the ground space is degenerate (and of polynomial dimension in the system size), or for the polynomially many lowest energy states, or even whether such states admit succinct classical descriptions or area laws. In this paper we give a new algorithm, based on a rigorously justified RG type transformation, for finding low energy states for 1D Hamiltonians acting on a chain of n particles. In the process we resolve some of the aforementioned open questions, including giving a polynomial time algorithm for poly(n) degenerate ground spaces and an n^{O(log n)} algorithm for the poly(n) lowest energy states (under a mild density condition). For these classes of systems the existence of a succinct classical description and area laws were not rigorously proved before this work. The algorithms are natural and efficient, and for the case of finding unique ground states for frustration-free Hamiltonians the running time is Õ(n·M(n)), where M(n) is the time required to multiply two n × n matrices.
Nugraheni, Z.; Budiyono, B.; Slamet, I.
2018-03-01
To reach higher-order thinking skills, conceptual understanding and strategic competence need to be mastered, as they are two basic components of higher-order thinking skills (HOTS). RMT is a unique realization of the cognitive conceptual construction approach based on Feuerstein's theory of Mediated Learning Experience (MLE) and Vygotsky’s sociocultural theory. This was a quasi-experimental study comparing an experimental class given Rigorous Mathematical Thinking (RMT) as the learning method and a control class given Direct Learning (DL) as the conventional learning activity. The study examined whether the two learning models had different effects on the conceptual understanding and strategic competence of junior high school students. The data were analyzed using Multivariate Analysis of Variance (MANOVA), which showed a significant difference between the experimental and control classes when mathematical conceptual understanding and strategic competence were considered jointly (Wilks’ Λ = 0.84). Further, independent t-tests showed a significant difference between the two classes on both mathematical conceptual understanding and strategic competence. These results indicate that Rigorous Mathematical Thinking (RMT) had a positive impact on mathematical conceptual understanding and strategic competence.
"Snow White" Coating Protects SpaceX Dragon's Trunk Against Rigors of Space
McMahan, Tracy
2013-01-01
He described it as "snow white." But NASA astronaut Don Pettit was not referring to the popular children's fairy tale. Rather, he was talking about the white coating of the Space Exploration Technologies Corp. (SpaceX) Dragon spacecraft, which reflected the International Space Station's lights. As it approached the station for the first time in May 2012, the Dragon's trunk might have been described as the "fairest of them all" for its pristine coating, allowing Pettit to see clearly enough to maneuver the robotic arm to grab the Dragon for a successful nighttime berthing. This protective thermal control coating, developed by Alion Science and Technology Corp., based in McLean, Va., made its bright appearance again with the March 1 launch of SpaceX's second commercial resupply mission. Named Z-93C55, the coating was applied to the cargo portion of the Dragon to protect it from the rigors of space. "For decades, Alion has produced coatings to protect against the rigors of space," said Michael Kenny, senior chemist with Alion. "As space missions evolved, there was a growing need to dissipate electrical charges that build up on the exteriors of spacecraft, or there could be damage to the spacecraft's electronics. Alion's research led us to develop materials that would meet this goal while also providing thermal controls. The outcome of this research was Alion's proprietary Z-93C55 coating."
Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II
Energy Technology Data Exchange (ETDEWEB)
George J. Koperna Jr.; Vello A. Kuuskraa; David E. Riestenberg; Aiysha Sultana; Tyler Van Leeuwen
2009-06-01
This report serves as the final technical report and users' manual for the 'Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II' SBIR project. Advanced Resources International has developed a screening tool with which users can technically screen, assess the storage capacity of, and quantify the costs of CO2 storage in four types of CO2 storage reservoirs. These include CO2-enhanced oil recovery reservoirs, depleted oil and gas fields (non-enhanced oil recovery candidates), deep coal seams that are amenable to CO2-enhanced methane recovery, and saline reservoirs. The screening function assesses whether the reservoir could likely serve as a safe, long-term CO2 storage reservoir. The storage capacity assessment uses rigorous reservoir simulation models to determine the timing, ultimate storage capacity, and potential for enhanced hydrocarbon recovery. Finally, the economic assessment function determines both the field-level and pipeline (transportation) costs for CO2 sequestration in a given reservoir. The screening tool was peer reviewed at an Electric Power Research Institute (EPRI) technical meeting in March 2009. A number of useful observations and recommendations emerged from the workshop on the costs of CO2 transport and storage that could be readily incorporated into a commercial version of the screening tool in a Phase III SBIR.
Dimitrakopoulos, Panagiotis
2018-03-01
The calculation of polytropic efficiencies is a very important task, especially during the development of new compression units, such as compressor impellers, stages, and stage groups. Such calculations are also crucial for the determination of the performance of a whole compressor. As processors and computational capacity have improved substantially in recent years, the need emerged for a new, rigorous, robust, accurate, and at the same time standardized method for computing polytropic efficiencies, especially one based on the thermodynamics of real gases. The proposed method is based on the rigorous definition of the polytropic efficiency. The input consists of pressure and temperature values at the end points of the compression path (suction and discharge), for a given working fluid. The average relative error for the studied cases was 0.536%. Thus, this high-accuracy method is proposed for efficiency calculations related to turbocompressors and their compression units, especially when they are operating at high power levels, for example in jet engines and high-power plants.
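For orientation, in the ideal-gas limit with a constant isentropic exponent the polytropic efficiency reduces to a closed form in the suction and discharge states. The paper's real-gas treatment is more involved; this sketch only illustrates the input-output structure (end-point pressures and temperatures in, efficiency out), with an assumed constant kappa.

```python
import math

def polytropic_efficiency(p1, t1, p2, t2, kappa=1.4):
    """Ideal-gas polytropic efficiency of a compression from suction (p1, T1)
    to discharge (p2, T2), assuming a constant isentropic exponent kappa:
        eta_p = ((kappa - 1) / kappa) * ln(p2/p1) / ln(T2/T1)
    Temperatures are absolute (K); pressures in any consistent unit."""
    return ((kappa - 1.0) / kappa) * math.log(p2 / p1) / math.log(t2 / t1)
```

For an isentropic compression, where T2 = T1 * (p2/p1)^((kappa-1)/kappa), the formula returns exactly 1; any extra discharge temperature rise lowers the efficiency below 1.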
Differential algebras with remainder and rigorous proofs of long-term stability
International Nuclear Information System (INIS)
Berz, Martin
1997-01-01
It is shown how, in addition to determining Taylor maps of general optical systems, it is possible to obtain rigorous interval bounds for the remainder term of the n-th order Taylor expansion. To this end, the three elementary operations of addition, multiplication, and differentiation in the Differential Algebraic approach are augmented by suitable interval operations in such a way that a remainder bound of the sum, product, and derivative is obtained from the Taylor polynomial and remainder bound of the operands. The method can be used to obtain bounds for the accuracy with which a Taylor map represents the true map of the particle optical system. In a more general sense, it is also useful for a variety of other numerical problems, including rigorous global optimization of highly complex functions. Combined with methods to obtain pseudo-invariants of repetitive motion and extensions of the Lyapunov and Nekhoroshev stability theories, the latter can be used to guarantee stability for storage rings and other weakly nonlinear systems.
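The augmentation of elementary operations with interval counterparts can be illustrated with a toy interval type. Unlike a verified implementation, this sketch does not perform outward (directed) rounding, so it is not rigorous in floating point; it only shows how sums and products of bounds propagate.

```python
class Interval:
    """Toy closed interval [lo, hi]. A verified implementation would round
    lo down and hi up at every operation; this sketch does not."""

    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # [a,b] + [c,d] = [a+c, b+d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # [a,b] * [c,d]: take min/max over all endpoint products
        ps = (self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi)
        return Interval(min(ps), max(ps))

    def contains(self, x):
        return self.lo <= x <= self.hi
```

In a Taylor-model setting, such an interval rides along with the polynomial part and absorbs the truncation error of every operation.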
How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.
Gray, Kurt
2017-09-01
Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org). Theory mapping provides both precision and synthesis, which helps to resolve arguments, prevent redundancies, assess the theoretical contribution of papers, and evaluate the likelihood of surprising effects.
Rigorous Combination of GNSS and VLBI: How it Improves Earth Orientation and Reference Frames
Lambert, S. B.; Richard, J. Y.; Bizouard, C.; Becker, O.
2017-12-01
Current reference series (C04) of the International Earth Rotation and Reference Systems Service (IERS) are produced by a weighted combination of Earth orientation parameter (EOP) time series built by the combination centers of each technique (VLBI, GNSS, laser ranging, DORIS). In the future, we plan to derive EOP from a rigorous combination of the normal equation systems of the four techniques. We present here the results of a rigorous combination of VLBI and GNSS pre-reduced, constraint-free normal equations with the DYNAMO geodetic analysis software package developed and maintained by the French GRGS (Groupe de Recherche en Géodésie Spatiale). The normal equations used are those produced separately by the IVS and IGS combination centers, to which we apply our own minimal constraints. We address the usefulness of this method with respect to the classical, a posteriori, combination method, and we show whether EOP determinations are improved. In particular, we implement external validations of the EOP series based on comparison with geophysical excitation and examination of the covariance matrices. Finally, we address the potential of the technique for the next-generation celestial reference frames, which are currently determined by VLBI only.
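Combining pre-reduced, constraint-free normal equations amounts to summing the technique-specific systems before solving, rather than averaging the solved parameters afterwards. The sketch below illustrates only that stacking step with a small dense solver; DYNAMO's actual handling of constraints, datum definition, and weighting is far more elaborate, and the weights here are hypothetical.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting (dense, pure Python)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def combine_normal_equations(systems, weights):
    """Stack per-technique normal equations N_k x = b_k into one system:
    N = sum w_k N_k, b = sum w_k b_k, then solve for the common parameters."""
    n = len(systems[0][0])
    N = [[0.0] * n for _ in range(n)]
    b = [0.0] * n
    for (Nk, bk), w in zip(systems, weights):
        for i in range(n):
            b[i] += w * bk[i]
            for j in range(n):
                N[i][j] += w * Nk[i][j]
    return solve(N, b)
```

Because the information (normal) matrices add, parameters common to both techniques are estimated from all observations at once, which is the point of the rigorous combination.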
Generating random networks and graphs
Coolen, Ton; Roberts, Ekaterina
2017-01-01
This book supports researchers who need to generate random networks, or who are interested in the theoretical study of random graphs. The coverage includes exponential random graphs (where the targeted probability of each network appearing in the ensemble is specified), growth algorithms (i.e. preferential attachment and the stub-joining configuration model), special constructions (e.g. geometric graphs and Watts-Strogatz models) and graphs on structured spaces (e.g. multiplex networks). The presentation aims to be a complete starting point, including details of both theory and implementation, as well as discussions of the main strengths and weaknesses of each approach. It includes extensive references for readers wishing to go further. The material is carefully structured to be accessible to researchers from all disciplines while also containing rigorous mathematical analysis (largely based on the techniques of statistical mechanics) to support those wishing to further develop or implement the theory of rand…
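One of the growth constructions mentioned, the stub-joining configuration model, fits in a few lines. This is an illustrative sketch, not code from the book: each node contributes as many stubs as its target degree, and a random pairing of stubs yields the edges (possibly including self-loops and multi-edges, which are commonly discarded or the pairing retried).

```python
import random

def configuration_model(degrees, seed=0):
    """Stub-joining configuration model: node i contributes degrees[i] stubs;
    the shuffled stub list is paired off to form edges."""
    if sum(degrees) % 2:
        raise ValueError("degree sum must be even")
    rng = random.Random(seed)
    stubs = [i for i, d in enumerate(degrees) for _ in range(d)]
    rng.shuffle(stubs)
    return [(stubs[k], stubs[k + 1]) for k in range(0, len(stubs), 2)]
```

By construction the realized degree sequence (counting a self-loop twice) always matches the prescribed one.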
Rigorous high-precision enclosures of fixed points and their invariant manifolds
Wittig, Alexander N.
The well-established concept of Taylor Models is introduced, which offer highly accurate C0 enclosures of functional dependencies, combining high-order polynomial approximation of functions and rigorous estimates of the truncation error, performed using verified arithmetic. The focus of this work is on the application of Taylor Models in algorithms for strongly non-linear dynamical systems. A method is proposed to extend the existing implementation of Taylor Models in COSY INFINITY from double precision coefficients to arbitrary precision coefficients. Great care is taken to maintain the highest efficiency possible by adaptively adjusting the precision of higher order coefficients in the polynomial expansion. High precision operations are based on clever combinations of elementary floating point operations yielding exact values for round-off errors. An experimental high precision interval data type is developed and implemented. Algorithms for the verified computation of intrinsic functions based on the high precision interval data type are developed and described in detail. The application of these operations in the implementation of high precision Taylor Models is discussed. An application of Taylor Model methods to the verification of fixed points is presented by verifying the existence of a period 15 fixed point in a near-standard Hénon map. Verification is performed using different verified methods such as double precision Taylor Models, high precision intervals, and high precision Taylor Models. Results and performance of each method are compared. An automated rigorous fixed point finder is implemented, allowing the fully automated search for all fixed points of a function within a given domain. It returns a list of verified enclosures of each fixed point, optionally verifying uniqueness within these enclosures. An application of the fixed point finder to the rigorous analysis of beam transfer maps in accelerator physics is presented. Previous work done by
Delay functions in trip assignment for transport planning process
Leong, Lee Vien
2017-10-01
In the transportation planning process, volume-delay and turn-penalty functions are needed in traffic assignment to determine travel times on road network links. The volume-delay function describes the speed-flow relationship, while the turn-penalty function describes the delay associated with making a turn at an intersection. The volume-delay function used in this study is the revised Bureau of Public Roads (BPR) function with constant parameters α and β of 0.8298 and 3.361, while the turn-penalty functions for signalized intersections were developed based on uniform, random, and overflow delay models. Parameters such as green time, cycle time, and saturation flow were used in the development of the turn-penalty functions. In order to assess the accuracy of the delay functions, the road network in the areas of Nibong Tebal, Penang and Parit Buntar, Perak was developed and modelled using transportation demand forecasting software. In order to calibrate the models, phase times and traffic volumes at fourteen signalized intersections within the study area were collected during morning and evening peak hours. The prediction of assigned volumes using the revised BPR function and the developed turn-penalty functions shows close agreement with actual recorded traffic volumes, with the lowest percentage of accuracy being 80.08% and the highest 93.04% for the morning peak model. For the evening peak model, they were 75.59% and 95.33%, respectively. As for the yield left-turn lanes, the lowest percentages of accuracy obtained for the morning and evening peak models were 60.94% and 69.74% respectively, while the highest percentage obtained for both models was 100%. Therefore, it can be concluded that the development and utilisation of delay functions based on local road conditions are important, as localised delay functions can produce better estimates of link travel times and hence better planning for the future.
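The revised BPR function with the quoted constants, together with a textbook uniform-delay term of the kind used as one of the three turn-penalty components, can be sketched as follows. The study's full turn-penalty functions also include random and overflow components, which are omitted here, and the numeric inputs in the usage are hypothetical.

```python
def bpr_travel_time(t0, volume, capacity, alpha=0.8298, beta=3.361):
    """Revised BPR volume-delay function with the study's calibrated constants:
    t = t0 * (1 + alpha * (v/c)**beta), where t0 is free-flow travel time."""
    return t0 * (1.0 + alpha * (volume / capacity) ** beta)

def uniform_turn_delay(cycle, green, degree_of_saturation):
    """Textbook uniform-delay term for a signalized approach:
    d = 0.5 * C * (1 - g/C)^2 / (1 - min(x, 1) * g/C),
    in the same time units as the cycle length C."""
    g_ratio = green / cycle
    x = min(degree_of_saturation, 1.0)
    return 0.5 * cycle * (1.0 - g_ratio) ** 2 / (1.0 - x * g_ratio)
```

At zero volume the BPR function returns the free-flow time, and at v = c the link travel time is inflated by the factor 1 + α ≈ 1.83.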
WebAssign: Assessing Your Students' Understanding Continuously
Risley, John S.
1999-11-01
Motivating students to learn is a constant challenge for faculty. Technology can play a significant role. One such solution is WebAssign — a web-based homework system that offers new teaching and learning opportunities for educators and their students. WebAssign delivers, collects, grades, and records customized homework assignments over the Internet. Students get immediate feedback with credit and instructors can implement "Just-in-Time" teaching. In this talk, I will describe how assignments can be generated with different numerical values for each question, giving each student a unique problem to solve. This feature encourages independent thinking with the benefit of collaborative learning. Example assignments taken from textbook questions and intellectually engaging Java applet simulations will be shown. Studies and first-hand experience on the educational impact of using WebAssign will also be discussed.
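The per-student randomization described above can be imitated by seeding a generator with the student and assignment identifiers, so each student sees stable but distinct numbers. This is an illustrative sketch, not WebAssign's actual mechanism; the parameter names and ranges are hypothetical.

```python
import hashlib
import random

def student_parameters(student_id, assignment_id, ranges):
    """Derive stable per-student integer values for a templated question.
    ranges maps a parameter name to an inclusive (lo, hi) pair."""
    # Hash the identifiers so the seed is stable across sessions.
    seed = hashlib.sha256(f"{student_id}:{assignment_id}".encode()).hexdigest()
    rng = random.Random(seed)
    # Sort names so the draw order (and thus the values) is deterministic.
    return {name: rng.randint(lo, hi) for name, (lo, hi) in sorted(ranges.items())}
```

Because the seed depends only on the identifiers, regrading or reloading the page reproduces the same values, while different students almost surely get different ones.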
Random Generators and Normal Numbers
Bailey, David H.; Crandall, Richard E.
2002-01-01
Pursuant to the authors' previous chaotic-dynamical model for random digits of fundamental constants, we investigate a complementary, statistical picture in which pseudorandom number generators (PRNGs) are central. Some rigorous results are achieved: We establish b-normality for constants of the form $\\sum_i 1/(b^{m_i} c^{n_i})$ for certain sequences $(m_i), (n_i)$ of integers. This work unifies and extends previously known classes of explicit normals. We prove that for coprime $b,c>1$ the...
A probabilistic approach for validating protein NMR chemical shift assignments
International Nuclear Information System (INIS)
Wang Bowei; Wang, Yunjun; Wishart, David S.
2010-01-01
It has been estimated that more than 20% of the proteins in the BMRB are improperly referenced and that about 1% of all chemical shift assignments are mis-assigned. These statistics also reflect the likelihood that any newly assigned protein will have shift assignment or shift referencing errors. The relatively high frequency of these errors continues to be a concern for the biomolecular NMR community. While several programs do exist to detect and/or correct chemical shift mis-referencing or chemical shift mis-assignments, most can only do one or the other. The one program (SHIFTCOR) that is capable of handling both chemical shift mis-referencing and mis-assignments requires the 3D structure coordinates of the target protein. Given that chemical shift mis-assignments and chemical shift re-referencing issues should ideally be addressed prior to 3D structure determination, there is a clear need to develop a structure-independent approach. Here, we present a new structure-independent protocol, which is based on using residue-specific and secondary-structure-specific chemical shift distributions calculated over small (3-6 residue) fragments to identify mis-assigned resonances. The method is also able to identify and re-reference mis-referenced chemical shift assignments. Comparisons against existing re-referencing or mis-assignment detection programs show that the method is as good as or superior to existing approaches. The protocol described here has been implemented into a freely available Java program called 'Probabilistic Approach for protein Nmr Assignment Validation (PANAV)' and as a web server (http://redpoll.pharmacy.ualberta.ca/PANAV) which can be used to validate and/or correct as well as re-reference assigned protein chemical shifts.
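A structure-independent check of the kind described can be caricatured as z-scoring each observed shift against a residue- and atom-specific distribution. The means and standard deviations below are illustrative, not BMRB-derived, and PANAV's fragment-based, secondary-structure-aware protocol is considerably more sophisticated than this sketch.

```python
def flag_misassignments(shifts, expected, z_cut=3.0):
    """Flag chemical shifts lying more than z_cut standard deviations from
    the residue/atom-specific mean.

    shifts:   iterable of (residue, atom, value_ppm)
    expected: dict mapping (residue, atom) -> (mean_ppm, sd_ppm)
    """
    flagged = []
    for residue, atom, value in shifts:
        mean, sd = expected[(residue, atom)]
        z = abs(value - mean) / sd
        if z > z_cut:
            flagged.append((residue, atom, value, round(z, 2)))
    return flagged
```

A systematic offset of all z-scores in one direction would instead suggest a referencing error rather than individual mis-assignments, which is the distinction the protocol exploits.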
Dosimetric effects of edema in permanent prostate seed implants: a rigorous solution
International Nuclear Information System (INIS)
Chen Zhe; Yue Ning; Wang Xiaohong; Roberts, Kenneth B.; Peschel, Richard; Nath, Ravinder
2000-01-01
Purpose: To derive a rigorous analytic solution to the dosimetric effects of prostate edema so that its impact on the conventional pre-implant and post-implant dosimetry can be studied for any given radioactive isotope and edema characteristics. Methods and Materials: The edema characteristics observed by Waterman et al. (Int. J. Radiat. Oncol. Biol. Phys. 41:1069-1077; 1998) were used to model the time evolution of the prostate and the seed locations. The total dose to any part of the prostate tissue from a seed implant was calculated analytically by parameterizing the dose fall-off from a radioactive seed as a single inverse power function of distance, with proper account of the edema-induced time evolution. The dosimetric impact of prostate edema was determined by comparing the dose calculated with full consideration of prostate edema to that calculated with the conventional dosimetry approach, where the seed locations and the target volume are assumed to be stationary. Results: A rigorous analytic solution on the relative dosimetric effects of prostate edema was obtained. This solution proved explicitly that the relative dosimetric effects of edema, as found in the previous numerical studies by Yue et al. (Int. J. Radiat. Oncol. Biol. Phys. 43, 447-454, 1999), are independent of the size and the shape of the implant target volume and are independent of the number and the locations of the seeds implanted. It also showed that the magnitude of the relative dosimetric effects is independent of the location of the dose evaluation point within the edematous target volume. This implies that the relative dosimetric effects of prostate edema are universal with respect to a given isotope and edema characteristic. A set of master tables for the relative dosimetric effects of edema was obtained for a wide range of edema characteristics for both 125I and 103Pd prostate seed implants. Conclusions: A rigorous analytic solution of the relative dosimetric effects of prostate edema has been obtained.
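The universality claim, that the relative dosimetric effect does not depend on the seed-to-point geometry, can be checked numerically under a simple model with an inverse-power dose fall-off and an exponentially resolving edema. All constants and the r(t) form below are illustrative assumptions, not the paper's fitted parameters.

```python
import math

def total_dose(r0, seed_half_life_d, edema_magnitude, edema_half_life_d,
               n=2.0, dt=0.01, t_max_d=600.0):
    """Accumulate dose from a decaying seed whose distance to the calculation
    point relaxes as r(t) = r0 * (1 + M * exp(-mu * t)), with an inverse-power
    fall-off 1/r^n. Units and constants are arbitrary/illustrative."""
    lam = math.log(2.0) / seed_half_life_d   # seed decay constant
    mu = math.log(2.0) / edema_half_life_d   # edema resolution constant
    dose, t = 0.0, 0.0
    while t < t_max_d:
        r = r0 * (1.0 + edema_magnitude * math.exp(-mu * t))
        dose += math.exp(-lam * t) / r ** n * dt
        t += dt
    return dose
```

Because the static distance r0 factors out of a pure inverse-power law, the ratio of the edema dose to the no-edema dose is the same at every evaluation distance, which is the relative-effect universality the analytic solution proves.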
Theory of Randomized Search Heuristics in Combinatorial Optimization
DEFF Research Database (Denmark)
The rigorous mathematical analysis of randomized search heuristics (RSHs) with respect to their expected runtime is a growing research area where many results have been obtained in recent years. This class of heuristics includes well-known approaches such as Randomized Local Search (RLS) and the Metr… analysis of randomized algorithms to RSHs. Mostly, the expected runtime of RSHs on selected problems is analyzed. Thereby, we understand why and when RSHs are efficient optimizers and, conversely, when they cannot be efficient. The tutorial will give an overview on the analysis of RSHs for solving…
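The simplest heuristic mentioned, Randomized Local Search, is easy to state concretely. On the classic OneMax benchmark (maximize the number of 1-bits), its expected runtime is Θ(n log n); the sketch below uses a non-strict acceptance rule, one common RLS variant.

```python
import random

def rls_onemax(n, seed=0, max_steps=100000):
    """Randomized Local Search on OneMax: flip one uniformly random bit and
    keep the offspring iff its fitness is at least as good. Returns the final
    bitstring and the number of steps used."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    steps = 0
    while sum(x) < n and steps < max_steps:
        y = x[:]
        y[rng.randrange(n)] ^= 1      # mutate exactly one random bit
        if sum(y) >= sum(x):          # accept if not worse
            x = y
        steps += 1
    return x, steps
```

The coupon-collector flavor of the last few missing 1-bits is what produces the n log n factor in the runtime bound.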
Hilário, M.; Hollander, den W.Th.F.; Sidoravicius, V.; Soares dos Santos, R.; Teixeira, A.
2014-01-01
In this paper we study a random walk in a one-dimensional dynamic random environment consisting of a collection of independent particles performing simple symmetric random walks in a Poisson equilibrium with density ρ ∈ (0,∞). At each step the random walk performs a nearest-neighbour jump, moving to
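A toy simulation of such a walk, with a jump bias that depends on whether the walker's current site is occupied by an environment particle, might look as follows. The finite particle cloud, the box size, and the bias parameters are all hypothetical simplifications of the Poisson equilibrium in the paper.

```python
import random

def simulate_rwdre(steps, num_particles, box, p_right_on, p_right_off, seed=0):
    """Walk on Z whose probability of jumping right is p_right_on when the
    current site holds an environment particle and p_right_off otherwise.
    The environment is a cloud of independent simple symmetric random walks
    started uniformly in [-box, box]."""
    rng = random.Random(seed)
    env = [rng.randint(-box, box) for _ in range(num_particles)]
    x = 0
    for _ in range(steps):
        p = p_right_on if x in env else p_right_off
        x += 1 if rng.random() < p else -1
        # every environment particle makes its own symmetric jump
        env = [e + (1 if rng.random() < 0.5 else -1) for e in env]
    return x
```

Such simulations are a quick way to build intuition about ballistic versus diffusive behaviour before attempting the rigorous analysis.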
Birkeland, S; Akse, L
2010-01-01
Improved slaughtering procedures in the salmon industry have caused a delayed onset of rigor mortis and, thus, a potential for pre-rigor secondary processing. The aim of this study was to investigate the effect of rigor status at the time of processing on quality traits (color, texture, sensory, and microbiological) in injection-salted and cold-smoked Atlantic salmon (Salmo salar). Injection of pre-rigor fillets caused a significant (P < 0.05) difference compared with post-rigor processed fillets; post-rigor fillets (1477 ± 38 g) had a significantly (P < 0.05) higher fracturability than pre-rigor fillets (1369 ± 71 g). Pre-rigor fillets differed significantly (P < 0.05) from post-rigor fillets (37.8 ± 0.8) and had significantly lower (P < 0.05) values than post-rigor processed fillets. This study showed that similar quality characteristics can be obtained in cold-smoked products processed either pre- or post-rigor when using suitable injection-salting protocols and smoking techniques. © 2010 Institute of Food Technologists®
Paulo Leminski: a study of rigor and relaxation in his poetry
Dhynarte de Borba e Albuquerque
2005-01-01
This work examines the trajectory of Paulo Leminski's poetry, seeking to establish the terms of its humor, its metalinguistic inquiry, and its lyric self, while it still displays traces of the marginal poetry of the 1970s. Leminski was an author who pursued concretist rigor through the procedures of more or less relaxed everyday speech. The poetic effort of the Curitiba-born Leminski is a "line that never ends": he wrote poems, novels, advertising pieces, song lyrics, and translations. In…
Rigorous decoupling between edge states in frustrated spin chains and ladders
Chepiga, Natalia; Mila, Frédéric
2018-05-01
We investigate the occurrence of exact zero modes in one-dimensional quantum magnets of finite length that possess edge states. Building on conclusions first reached in the context of the spin-1/2 XY chain in a field and then for the spin-1 J1-J2 Heisenberg model, we show that the development of incommensurate correlations in the bulk invariably leads to oscillations in the sign of the coupling between edge states, and hence to exact zero energy modes at the crossing points where the coupling between the edge states rigorously vanishes. This is true regardless of the origin of the frustration (e.g., next-nearest-neighbor coupling or biquadratic coupling for the spin-1 chain), of the value of the bulk spin (we report on spin-1/2, spin-1, and spin-2 examples), and of the value of the edge-state emergent spin (spin-1/2 or spin-1).
Using Project Complexity Determinations to Establish Required Levels of Project Rigor
Energy Technology Data Exchange (ETDEWEB)
Andrews, Thomas D.
2015-10-01
This presentation discusses the project complexity determination process that was developed by National Security Technologies, LLC, for the U.S. Department of Energy, National Nuclear Security Administration Nevada Field Office for implementation at the Nevada National Security Site (NNSS). The complexity determination process was developed to address the diversity of NNSS project types, sizes, and complexity; to fill the need for a single procedure with provision for tailoring the level of rigor to the project type, size, and complexity; to provide consistent, repeatable, effective application of project management processes across the enterprise; and to achieve higher levels of efficiency in project delivery. These needs are illustrated by the wide diversity of NNSS projects: Defense Experimentation, Global Security, weapons tests, military training areas, sensor development and testing, training in realistic environments, intelligence community support, environmental restoration/waste management, and disposal of radioactive waste, among others.
Arriaza, Pablo; Nedjat-Haiem, Frances; Lee, Hee Yun; Martin, Shadi S
2015-01-01
The purpose of this article is to synthesize and chronicle the authors' experiences as four bilingual and bicultural researchers, each experienced in conducting cross-cultural/cross-language qualitative research. Through narrative descriptions of experiences with Latinos, Iranians, and Hmong refugees, the authors discuss their rewards, challenges, and methods of enhancing rigor, trustworthiness, and transparency when conducting cross-cultural/cross-language research. The authors discuss and explore how to effectively manage cross-cultural qualitative data, how to effectively use interpreters and translators, how to identify best methods of transcribing data, and the role of creating strong community relationships. The authors provide guidelines for health care professionals to consider when engaging in cross-cultural qualitative research.
Release of major ions during rigor mortis development in kid Longissimus dorsi muscle.
Feidt, C; Brun-Bellut, J
1999-01-01
Ionic strength plays an important role in post mortem muscle changes; its increase is due to ion release during the development of rigor mortis. Twelve alpine kids were used to study the effects of chilling and meat pH on ion release. Free ions were measured in Longissimus dorsi muscle by capillary electrophoresis after water extraction. All free ion concentrations increased after death, but there were differences between ions. Temperature was not a factor affecting ion release, in contrast to the ultimate pH value. Three release mechanisms are believed to coexist: passive binding to proteins, which stops as pH decreases; active segregation, which stops as ATP disappears; and the production of metabolites by anaerobic glycolysis.
Manuel, Sharrón L; Johnson, Brian W; Frevert, Charles W; Duncan, Francesca E
2018-04-21
Immunohistochemistry (IHC) is a robust scientific tool whereby cellular components are visualized within a tissue, and this method has been and continues to be a mainstay for many reproductive biologists. IHC is highly informative if performed and interpreted correctly, but studies have shown that the general use and reporting of appropriate controls in IHC experiments is low. This omission of the scientific method can result in data that lacks rigor and reproducibility. In this editorial, we highlight key concepts in IHC controls and describe an opportunity for our field to partner with the Histochemical Society to adopt their IHC guidelines broadly as researchers, authors, ad hoc reviewers, editorial board members, and editors-in-chief. Such cross-professional society interactions will ensure that we produce the highest quality data as new technologies emerge that still rely upon the foundations of classic histological and immunohistochemical principles.
Direct integration of the S-matrix applied to rigorous diffraction
International Nuclear Information System (INIS)
Iff, W; Lindlein, N; Tishchenko, A V
2014-01-01
A novel Fourier method for rigorous diffraction computation at periodic structures is presented. The procedure is based on a differential equation for the S-matrix, which allows direct integration of the S-matrix blocks. This results in a new method in Fourier space, which can be considered as a numerically stable and well-parallelizable alternative to the conventional differential method based on T-matrix integration and subsequent conversions from the T-matrices to S-matrix blocks. Integration of the novel differential equation in an implicit manner is expounded. The applicability of the new method is shown on the basis of 1D periodic structures. It is clear, however, that the new technique can also be applied to arbitrary 2D periodic or periodized structures. The complexity of the new method is O(N³), similar to that of the conventional differential method, with N being the number of diffraction orders. (fast track communication)
Rigorous description of holograms of particles illuminated by an astigmatic elliptical Gaussian beam
Energy Technology Data Exchange (ETDEWEB)
Yuan, Y J; Ren, K F; Coetmellec, S; Lebrun, D, E-mail: fang.ren@coria.f [UMR 6614/CORIA, CNRS and Universite et INSA de Rouen Avenue de l' Universite BP 12, 76801 Saint Etienne du Rouvray (France)
2009-02-01
Digital holography is a non-intrusive optical metrology technique well adapted to measuring the size and velocity field of particles in a fluid spray. The simplified model of an opaque disk is often used in the treatment of the holograms, so the refraction by the particle and the diffraction in the third dimension are not taken into account. We present in this paper a rigorous description of the holographic patterns and evaluate the effects of the refraction and the third-dimension diffraction by comparison with the opaque disk model. It is found that the effects are important when the real part of the refractive index is near unity or when the imaginary part is non-zero but small.
A new method for deriving rigorous results on ππ scattering
International Nuclear Information System (INIS)
Caprini, I.; Dita, P.
1979-06-01
We develop a new approach to the problem of constraining the ππ scattering amplitudes by means of the axiomatically proved properties of unitarity, analyticity and crossing symmetry. The method is based on the solution of an extremal problem on a convex set of analytic functions and provides a global description of the domain of values taken by any finite number of partial waves at an arbitrary set of unphysical energies, compatible with unitarity, the bounds at complex energies derived from generalized dispersion relations and the crossing integral relations. From this domain we obtain new absolute bounds for the amplitudes as well as rigorous correlations between the values of various partial waves. (author)
Rigorous Numerics for ill-posed PDEs: Periodic Orbits in the Boussinesq Equation
Castelli, Roberto; Gameiro, Marcio; Lessard, Jean-Philippe
2018-04-01
In this paper, we develop computer-assisted techniques for the analysis of periodic orbits of ill-posed partial differential equations. As a case study, our proposed method is applied to the Boussinesq equation, which has been investigated extensively because of its role in the theory of shallow water waves. The idea is to use the symmetry of the solutions and a Newton-Kantorovich type argument (the radii polynomial approach) to obtain rigorous proofs of existence of the periodic orbits in a weighted ℓ1 Banach space of space-time Fourier coefficients with exponential decay. We present several computer-assisted proofs of the existence of periodic orbits at different parameter values.
Increasing rigor in NMR-based metabolomics through validated and open source tools.
Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L
2017-02-01
The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism's phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. Copyright © 2016. Published by Elsevier Ltd.
A rigorous phenomenological analysis of the ππ scattering lengths
International Nuclear Information System (INIS)
Caprini, I.; Dita, P.; Sararu, M.
1979-11-01
The constraining power of the present experimental data, combined with the general theoretical knowledge about ππ scattering, upon the scattering lengths of this process, is investigated by means of a rigorous functional method. We take as input the experimental phase shifts and make no hypotheses about the high-energy behaviour of the amplitudes, using only absolute bounds derived from axiomatic field theory and exact consequences of crossing symmetry. In the simplest application of the method, involving only the π⁰π⁰ S-wave, we explored numerically a number of values proposed by various authors for the scattering lengths a₀ and a₂ and found that none appears to be especially favoured. (author)
Guilak, Farshid
2017-03-21
We are currently in one of the most exciting times for science and engineering as we witness unprecedented growth in our computational and experimental capabilities to generate new data and models. To facilitate data and model sharing, and to enhance reproducibility and rigor in biomechanics research, the Journal of Biomechanics has introduced a number of tools for Content Innovation to allow presentation, sharing, and archiving of methods, models, and data in our articles. The tools include an Interactive Plot Viewer, 3D Geometric Shape and Model Viewer, Virtual Microscope, Interactive MATLAB Figure Viewer, and Audioslides. Authors are highly encouraged to make use of these in upcoming journal submissions. Copyright © 2017 Elsevier Ltd. All rights reserved.
Liu, Tao; Zhu, Guanghu; He, Jianfeng; Song, Tie; Zhang, Meng; Lin, Hualiang; Xiao, Jianpeng; Zeng, Weilin; Li, Xing; Li, Zhihao; Xie, Runsheng; Zhong, Haojie; Wu, Xiaocheng; Hu, Wenbiao; Zhang, Yonghui; Ma, Wenjun
2017-08-02
Dengue fever is a severe public health challenge in south China. A dengue outbreak was reported in Chaozhou city, China in 2015. Intensified interventions were implemented by the government to control the epidemic. However, the degree to which the intensified control measures reduced the size of the epidemic, and when such measures should be initiated to reduce the risk of large dengue outbreaks developing, remain unknown. We selected Xiangqiao district as the study setting because the majority of the indigenous cases (90.6%) in Chaozhou city were from this district. The numbers of daily indigenous dengue cases in 2015 were collected through the national infectious diseases and vectors surveillance system, and daily Breteau Index (BI) data were reported by the local public health department. We used a compartmental dynamic SEIR (Susceptible, Exposed, Infected and Removed) model to assess the effectiveness of control interventions and to evaluate the effect of intervention timing on the dengue epidemic. A total of 1250 indigenous dengue cases was reported from Xiangqiao district. The results of SEIR modeling using BI as an indicator of actual control interventions showed a total of 1255 dengue cases, which is close to the reported number (n = 1250). The size and duration of the outbreak were highly sensitive to the intensity and timing of interventions: the more rigorous and the earlier the control interventions were implemented, the more effective they were. Even when the interventions were initiated several weeks after the onset of the dengue outbreak, they were shown to greatly reduce the prevalence and duration of the outbreak. This study suggests that early implementation of rigorous dengue interventions can effectively reduce the epidemic size and shorten the epidemic duration.
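A minimal SEIR sketch (our own forward-Euler discretization with illustrative parameters, not the fitted, BI-driven model in the study) reproduces the qualitative conclusion that earlier intervention yields a smaller epidemic:

```python
def seir_outbreak(beta0, beta_ctrl, t_ctrl, sigma=1/5, gamma=1/7,
                  N=100_000, E0=10, days=200, dt=0.1):
    """Cumulative infections when transmission drops from beta0 to beta_ctrl
    at day t_ctrl (all parameter values are hypothetical)."""
    S, E, I, R = N - E0, float(E0), 0.0, 0.0
    for step in range(int(days / dt)):
        t = step * dt
        beta = beta_ctrl if t >= t_ctrl else beta0  # intervention switch
        new_exposed = beta * S * I / N
        dS = -new_exposed
        dE = new_exposed - sigma * E   # 1/sigma = mean incubation period
        dI = sigma * E - gamma * I     # 1/gamma = mean infectious period
        dR = gamma * I
        S += dS * dt; E += dE * dt; I += dI * dt; R += dR * dt
    return N - S  # everyone who left the susceptible pool

# Same outbreak, intervention started at day 30 versus day 70.
early = seir_outbreak(beta0=0.5, beta_ctrl=0.1, t_ctrl=30)
late = seir_outbreak(beta0=0.5, beta_ctrl=0.1, t_ctrl=70)
```

With these placeholder parameters the basic reproduction number before control is beta0/gamma = 3.5 and after control is 0.7, so the epidemic dies out once the intervention starts; starting it 40 days later leaves many more cumulative infections.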
International Nuclear Information System (INIS)
Ahmadi, A.; Meyer, M.; Rouzineau, D.; Prevost, M.; Alix, P.; Laloue, N.
2010-01-01
This paper gives the first step of the development of a rigorous multicomponent reactive separation model. Such a model is essential to further the optimization of acid-gas removal plants (CO₂ capture, gas treating, etc.) in terms of size and energy consumption, since chemical solvents are conventionally used. Firstly, two main modelling approaches are presented: the equilibrium-based and the rate-based approaches. Secondly, an extended rate-based model with a rigorous modelling methodology for diffusion-reaction phenomena is proposed. The film theory and the generalized Maxwell-Stefan equations are used in order to characterize multicomponent interactions. The complete chain of chemical reactions is taken into account. The reactions can be kinetically controlled or at chemical equilibrium, and they are considered for both the liquid film and the liquid bulk. Thirdly, the method of numerical resolution is described. Coupling the generalized Maxwell-Stefan equations with chemical equilibrium equations leads to a highly non-linear differential-algebraic equation system known as DAE index 3. The set of equations is discretized with finite differences, as its integration by the Gear method is complex. The resulting algebraic system is solved by the Newton-Raphson method. Finally, the present model and the associated methods of numerical resolution are validated for the example of esterification of methanol. This archetypal non-electrolytic system permits an interesting analysis of the reaction impact on mass transfer, especially near the phase interface. The numerical resolution of the model by the Newton-Raphson method gives good results in terms of calculation time and convergence. The simulations show that the impact of reactions at chemical equilibrium and that of kinetically controlled reactions with high kinetics on mass transfer is relatively similar. Moreover, Fick's law is less adapted for multicomponent mixtures where some abnormalities such as counter
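The resolution step named in the abstract, discretize and then solve the resulting nonlinear algebraic system by Newton-Raphson, can be sketched generically. The toy two-equation system below merely stands in for the discretized residuals; it is not the authors' model, and the Jacobian is supplied analytically as in their approach.

```python
def newton_raphson(F, J, x0, tol=1e-12, max_iter=50):
    """Solve F(x, y) = (0, 0) using an analytic 2x2 Jacobian J(x, y)."""
    x, y = x0
    for _ in range(max_iter):
        f1, f2 = F(x, y)
        if max(abs(f1), abs(f2)) < tol:
            break
        a, b, c, d = J(x, y)              # Jacobian rows: [a, b], [c, d]
        det = a * d - b * c
        # Solve J @ [dx, dy] = -[f1, f2] by Cramer's rule
        dx = (-f1 * d + f2 * b) / det
        dy = (-f2 * a + f1 * c) / det
        x, y = x + dx, y + dy
    return x, y

# Hypothetical stand-in residuals: x*y = 1 (an equilibrium-type relation)
# and x + y = 3 (a balance-type relation).
F = lambda x, y: (x * y - 1.0, x + y - 3.0)
J = lambda x, y: (y, x, 1.0, 1.0)
x, y = newton_raphson(F, J, (2.0, 1.0))
```

Newton-Raphson converges quadratically near the root, which is why it handles the stiff, highly nonlinear systems produced by such discretizations well when a good initial guess is available.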
Rigorous upper bounds for transport due to passive advection by inhomogeneous turbulence
International Nuclear Information System (INIS)
Krommes, J.A.; Smith, R.A.
1987-05-01
A variational procedure, due originally to Howard and explored by Busse and others for self-consistent turbulence problems, is employed to determine rigorous upper bounds for the advection of a passive scalar through an inhomogeneous turbulent slab with arbitrary generalized Reynolds number R and Kubo number K. In the basic version of the method, the steady-state energy balance is used as a constraint; the resulting bound, though rigorous, is independent of K. A pedagogical reference model (one dimension, K = ∞) is described in detail; the bound compares favorably with the exact solution. The direct-interaction approximation is also worked out for this model; it is somewhat more accurate than the bound, but requires considerably more labor to solve. For the basic bound, a general formalism is presented for several dimensions, finite correlation length, and reasonably general boundary conditions. Part of the general method, in which a Green's function technique is employed, applies to self-consistent as well as to passive problems, and thereby generalizes previous results in the fluid literature. The formalism is extended for the first time to include time-dependent constraints, and a bound is deduced which explicitly depends on K and has the correct physical scalings in all regimes of R and K. Two applications from the theory of turbulent plasmas are described: flux in velocity space, and test particle transport in stochastic magnetic fields. For the velocity space problem the simplest bound reproduces Dupree's original scaling for the strong turbulence diffusion coefficient. For the case of stochastic magnetic fields, the scaling of the bounds is described for the magnetic diffusion coefficient as well as for the particle diffusion coefficient in the so-called collisionless, fluid, and double-streaming regimes.
McKee, S R; Sams, A R
1998-01-01
Development of rigor mortis at elevated post-mortem temperatures may contribute to turkey meat characteristics that are similar to those found in pale, soft, exudative pork. To evaluate this effect, 36 Nicholas tom turkeys were processed at 19 wk of age and placed in water at 40, 20, or 0 C immediately after evisceration. Pectoralis muscle samples were taken at 15 min, 30 min, 1 h, 2 h, and 4 h post-mortem and analyzed for R-value (an indirect measure of adenosine triphosphate), glycogen, pH, color, and sarcomere length. At 4 h, the remaining intact Pectoralis muscle was harvested, aged on ice for 23 h, and analyzed for drip loss, cook loss, shear values, and sarcomere length. By 15 min post-mortem, the 40 C treatment had higher R-values, which persisted through 4 h. By 1 h, the 40 C treatment pH and glycogen levels were lower than those of the 0 C treatment; however, they did not differ from those of the 20 C treatment. Increased L* values indicated that color became more pale by 2 h post-mortem in the 40 C treatment when compared to the 20 and 0 C treatments. Drip loss, cook loss, and shear value were increased, whereas sarcomere lengths were decreased, as a result of the 40 C treatment. These findings suggested that elevated post-mortem temperatures during processing resulted in acceleration of rigor mortis and biochemical changes in the muscle that produced pale, exudative meat characteristics in turkey.
Directory of Open Access Journals (Sweden)
Tao Liu
2017-08-01
Full Text Available Abstract Background Dengue fever is a severe public health challenge in south China. A dengue outbreak was reported in Chaozhou city, China in 2015. Intensified interventions were implemented by the government to control the epidemic. However, the degree to which the intensified control measures reduced the size of the epidemic, and when such measures should be initiated to reduce the risk of large dengue outbreaks developing, remain unknown. Methods We selected Xiangqiao district as the study setting because the majority of the indigenous cases (90.6%) in Chaozhou city were from this district. The numbers of daily indigenous dengue cases in 2015 were collected through the national infectious diseases and vectors surveillance system, and daily Breteau Index (BI) data were reported by the local public health department. We used a compartmental dynamic SEIR (Susceptible, Exposed, Infected and Removed) model to assess the effectiveness of control interventions and to evaluate the effect of intervention timing on the dengue epidemic. Results A total of 1250 indigenous dengue cases was reported from Xiangqiao district. The results of SEIR modeling using BI as an indicator of actual control interventions showed a total of 1255 dengue cases, which is close to the reported number (n = 1250). The size and duration of the outbreak were highly sensitive to the intensity and timing of interventions: the more rigorous and the earlier the control interventions were implemented, the more effective they were. Even when the interventions were initiated several weeks after the onset of the dengue outbreak, they were shown to greatly reduce the prevalence and duration of the outbreak. Conclusions This study suggests that early implementation of rigorous dengue interventions can effectively reduce the epidemic size and shorten the epidemic duration.
Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J
2017-01-01
Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical
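A probabilistic sampling design of the kind the IMBCR program advocates can be sketched in miniature. The strata and proportional allocation below are hypothetical placeholders, not the program's actual design, which is hierarchical and spans bioregional scales.

```python
import random

def stratified_sample(units_by_stratum, total_n, seed=17):
    """Proportional-allocation stratified random sample (illustrative only).

    Each stratum receives a share of the total sample proportional to its
    number of units, with at least one unit per stratum, and units within a
    stratum are selected uniformly at random.
    """
    rng = random.Random(seed)  # fixed seed for a reproducible design
    total_units = sum(len(units) for units in units_by_stratum.values())
    sample = {}
    for stratum, units in units_by_stratum.items():
        k = max(1, round(total_n * len(units) / total_units))
        sample[stratum] = rng.sample(units, k)
    return sample

# Hypothetical survey frame: 1000 plots split across three habitat strata.
frame = {"grassland": list(range(100)),
         "sagebrush": list(range(100, 400)),
         "badlands": list(range(400, 1000))}
design = stratified_sample(frame, total_n=50)
```

Because every unit has a known, nonzero inclusion probability, estimates from such a design are statistically defensible in the sense the abstract describes, unlike convenience sampling.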
Directory of Open Access Journals (Sweden)
David C Pavlacky
Full Text Available Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous
Nevid, Jeffrey S.; Ambrose, Michael A.; Pyun, Yea Seul
2017-01-01
Our study examined whether brief writing-to-learn assignments linked to lower and higher levels in Bloom's taxonomy affected performance differentially on examination performance in assessing these skill levels. Using a quasi-random design, 91 undergraduate students in an introductory psychology class completed eight lower level and eight higher…
Fadıloğlu, Eylem Ezgi; Serdaroğlu, Meltem
2018-01-01
Abstract This study was conducted to evaluate the effects of pre- and post-rigor marinade injections on some quality parameters of Longissimus dorsi (LD) muscles. Three marinade formulations were prepared with 2% NaCl, 2% NaCl+0.5 M lactic acid, and 2% NaCl+0.5 M sodium lactate. In this study marinade uptake, pH, free water, cooking loss, drip loss, and color properties were analyzed. Injection time had a significant effect on marinade uptake levels of samples: regardless of marinade formulation, marinade uptake of pre-rigor samples injected with marinade solutions was higher than that of post-rigor samples. Injection of sodium lactate increased the pH values of samples, whereas lactic acid injection decreased pH. Marinade treatment and storage period had significant effects on cooking loss. At each evaluation period, the interaction between marinade treatment and injection time showed a different effect on free water content. Storage period and marinade application had significant effects on drip loss values, and drip loss in all samples increased during storage. During all storage days, the lowest CIE L* value was found in pre-rigor samples injected with sodium lactate. Lactic acid injection caused color fading in pre-rigor and post-rigor samples. The interaction between marinade treatment and storage period was statistically significant (p<0.05). At days 0 and 3, the lowest CIE b* values were obtained in pre-rigor samples injected with sodium lactate, and no differences were found among the other samples. At day 6, no significant differences were found in the CIE b* values of any samples. PMID:29805282
The spectral dimension of random trees
International Nuclear Information System (INIS)
Destri, Claudio; Donetti, Luca
2002-01-01
We present a simple yet rigorous approach to the determination of the spectral dimension of random trees, based on the study of the massless limit of the Gaussian model on such trees. As a by-product, we obtain evidence in favour of a new scaling hypothesis for the Gaussian model on generic bounded graphs and in favour of a previously conjectured exact relation between spectral and connectivity dimensions on more general tree-like structures
Weak limits for quantum random walks
International Nuclear Information System (INIS)
Grimmett, Geoffrey; Janson, Svante; Scudo, Petra F.
2004-01-01
We formulate and prove a general weak limit theorem for quantum random walks in one and more dimensions. With X_n denoting position at time n, we show that X_n/n converges weakly as n → ∞ to a certain distribution which is absolutely continuous and of bounded support. The proof is rigorous and makes use of Fourier transform methods. This approach simplifies and extends certain preceding derivations valid in one dimension that make use of combinatorial and path integral methods.
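The ballistic scaling behind such weak limits can be checked by direct simulation. The one-dimensional Hadamard walk below is a standard textbook example (not necessarily the walk analyzed in the paper); after n steps the position distribution spreads linearly in n, and nearly all mass of X_n/n lies in a bounded interval, in contrast to the diffusive √n spread of a classical walk.

```python
import math

def hadamard_walk_distribution(n):
    """Position distribution of a 1D Hadamard quantum walk after n steps."""
    h = 1.0 / math.sqrt(2.0)
    # state[x] = (aL, aR): amplitudes of the left- and right-moving coin
    # components at position x; the initial coin (|L> + i|R>)/sqrt(2)
    # produces a symmetric distribution.
    state = {0: (h + 0j, h * 1j)}
    for _ in range(n):
        new = {}
        for x, (aL, aR) in state.items():
            bL = h * (aL + aR)   # Hadamard coin, then shift left
            bR = h * (aL - aR)   # Hadamard coin, then shift right
            pL = new.get(x - 1, (0j, 0j)); new[x - 1] = (pL[0] + bL, pL[1])
            pR = new.get(x + 1, (0j, 0j)); new[x + 1] = (pR[0], pR[1] + bR)
        state = new
    return {x: abs(aL) ** 2 + abs(aR) ** 2 for x, (aL, aR) in state.items()}

n = 100
probs = hadamard_walk_distribution(n)
total = sum(probs.values())                       # should be 1 (unitarity)
mean = sum(x * p for x, p in probs.items())       # ~0 by symmetry
std = math.sqrt(sum(x * x * p for x, p in probs.items()) - mean ** 2)
tail = sum(p for x, p in probs.items() if abs(x) > 0.8 * n)
```

The standard deviation grows like a constant times n (ballistic), and the probability mass outside a fixed fraction of n is negligible, consistent with a limit distribution of bounded support for X_n/n.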
Negotiating Languages and Cultures: Enacting Translingualism through a Translation Assignment
Kiernan, Julia; Meier, Joyce; Wang, Xiqiao
2016-01-01
This collaborative project explores the affordances of a translation assignment in the context of a learner-centered pedagogy that places composition students' movement among languages and cultures as both a site for inquiry and subject of analysis. The translation assignment asks students to translate scholarly articles or culture stories from…
On some special cases of the restricted assignment problem
Wang, C. (Chao); R.A. Sitters (René)
2016-01-01
We consider some special cases of the restricted assignment problem. In this scheduling problem on parallel machines, any job j can only be assigned to one of the machines in its given subset M_j of machines. We give an LP formulation for the problem with two job sizes and show that it
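To make the problem statement concrete, here is a minimal brute-force sketch of restricted assignment (our own illustration, not the paper's LP): each job j may only be placed on a machine in its set M_j, and the objective is to minimize the makespan, i.e. the maximum machine load.

```python
from itertools import product

def restricted_assignment_makespan(sizes, allowed):
    """Minimum makespan when job j may only go on machines in allowed[j].

    Brute force over all feasible assignments; only viable for tiny instances,
    but it makes the problem definition concrete. The general problem is
    NP-hard, which is why LP-based techniques are studied.
    """
    m = 1 + max(mach for machines in allowed for mach in machines)
    best = float("inf")
    for choice in product(*allowed):          # one allowed machine per job
        loads = [0] * m
        for j, mach in enumerate(choice):
            loads[mach] += sizes[j]
        best = min(best, max(loads))          # makespan of this assignment
    return best

# Three jobs, two machines; job 0 is restricted to machine 0, job 2 to machine 1.
opt = restricted_assignment_makespan(sizes=[3, 2, 2],
                                     allowed=[(0,), (0, 1), (1,)])
```

In this instance only job 1 has a choice: placing it on machine 1 gives loads (3, 4) and makespan 4, while machine 0 gives loads (5, 2) and makespan 5.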
25 CFR 225.33 - Assignment of minerals agreements.
2010-04-01
... 25 Indians 1 2010-04-01 2010-04-01 false Assignment of minerals agreements. 225.33 Section 225.33 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR ENERGY AND MINERALS OIL AND GAS, GEOTHERMAL, AND SOLID MINERALS AGREEMENTS Minerals Agreements § 225.33 Assignment of minerals agreements. An...
On the Use of Writing Assignments in Intermediate Microeconomic Theory
O'Neill, Patrick B.
2009-01-01
A typical writing assignment in upper level required courses is a term paper. However many economics majors, particularly those in business schools, need to develop skill at writing shorter pieces. In this paper I describe numerous examples of shorter writing assignments that I have incorporated into an Intermediate Microeconomic Theory course.…
75 FR 55352 - Delegation of Authorities and Assignment of Responsibilities
2010-09-10
... DEPARTMENT OF LABOR Office of the Secretary Delegation of Authorities and Assignment of Responsibilities Secretary's Order 5-2010 Subject: Delegation of Authorities and Assignment of Responsibilities to... rather than the Administrator, WHD (see also Secretary's Order 3-2010). 5. Delegations of Authority and...
7 CFR 1900.5 - Assignment of cases.
2010-01-01
... 7 Agriculture 12 2010-01-01 2010-01-01 false Assignment of cases. 1900.5 Section 1900.5 Agriculture Regulations of the Department of Agriculture (Continued) RURAL HOUSING SERVICE, RURAL BUSINESS... REGULATIONS GENERAL Delegations of Authority § 1900.5 Assignment of cases. The State Director may, in writing...
Students' Evaluation of Writing Assignments in an Abnormal Psychology Course.
Procidano, Mary E.
1991-01-01
Presents a study in which students in an abnormal psychology class rated the usefulness of drafts for two writing assignments. Reports that a research proposal was more effective than a case study in generating interest in psychology and opportunity for creativity. Concludes that writing assignments should reflect important aspects of a…
14 CFR 1245.109 - Assignment of title to NASA.
2010-01-01
... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Assignment of title to NASA. 1245.109... INTELLECTUAL PROPERTY RIGHTS Patent Waiver Regulations § 1245.109 Assignment of title to NASA. (a) The instrument of waiver set forth in § 1245.115(c) shall be voided by NASA with respect to the domestic title to...
32 CFR 644.396 - Assignment of personnel to administer.
2010-07-01
... 32 National Defense 4 2010-07-01 2010-07-01 true Assignment of personnel to administer. 644.396... PROPERTY REAL ESTATE HANDBOOK Disposal Predisposal Action § 644.396 Assignment of personnel to administer... responsible representative to each installation, or group of installations, to act under his staff supervision...
Exogenous spatial attention influences figure-ground assignment.
Vecera, Shaun P; Flevaris, Anastasia V; Filapek, Joseph C
2004-01-01
In a hierarchical stage account of vision, figure-ground assignment is thought to be completed before the operation of focal spatial attention. Results of previous studies have supported this account by showing that unpredictive, exogenous spatial precues do not influence figure-ground assignment, although voluntary attention can influence figure-ground assignment. However, in these studies, attention was not summoned directly to a region in a figure-ground display. In three experiments, we addressed the relationship between figure-ground assignment and visuospatial attention. In Experiment 1, we replicated the finding that exogenous precues do not influence figure-ground assignment when they direct attention outside of a figure-ground stimulus. In Experiment 2, we demonstrated that exogenous attention can influence figure-ground assignment if it is directed to one of the regions in a figure-ground stimulus. In Experiment 3, we demonstrated that exogenous attention can influence figure-ground assignment in displays that contain a Gestalt figure-ground cue; this result suggests that figure-ground processes are not entirely completed prior to the operation of focal spatial attention. Exogenous spatial attention acts as a cue for figure-ground assignment and can affect the outcome of figure-ground processes.
Parentage assignment of progeny in mixed milt fertilization of ...
African Journals Online (AJOL)
Administrator
2011-06-13
Jun 13, 2011 ... individuals. Overall, 98.8% of progeny were assigned to their parents using the Family Assignment Program (FAP). Selection of hyper-variable microsatellites in Caspian brown trout to identify unique alleles was effective for unambiguous parentage determination and estimation of genetic diversity in this study.
Genetics of traffic assignment models for strategic transport planning
Bliemer, M.C.J.; Raadsen, M.P.H.; Brederode, L.J.N.; Bell, M.G.H.; Wismans, Luc Johannes Josephus; Smith, M.J.
2016-01-01
This paper presents a review and classification of traffic assignment models for strategic transport planning purposes by using concepts analogous to genetics in biology. Traffic assignment models share the same theoretical framework (DNA), but differ in capability (genes). We argue that all traffic
Scaffolding Assignments and Activities for Undergraduate Research Methods
Fisher, Sarah; Justwan, Florian
2018-01-01
This article details assignments and lessons created for and tested in research methods courses at two different universities, a large state school and a small liberal arts college. Each assignment or activity utilized scaffolding. Students were asked to push beyond their comfort zone while utilizing concrete and/or creative examples,…
A Poster Assignment Connects Information Literacy and Writing Skills
Waters, Natalie
2015-01-01
This paper describes the implementation of a poster assignment in a writing and information literacy course required for undergraduate Life Sciences and Environmental Biology majors with the Faculty of Agricultural and Environmental Sciences at McGill University. The assignment was introduced in response to weaknesses identified through course…
Personnel shift assignment: Existence conditions and network models
van den Berg, Jeroen P.; van den Berg, J.P.; Panton, David M.
1994-01-01
The personnel scheduling problem is known to be a five-stage process in which the final stage involves the assignment of shifts to the days worked in the schedule. This paper discusses the existence conditions for both continuous and forward rotating shift assignments and heuristic network
28 CFR 545.23 - Inmate work/program assignment.
2010-07-01
... community living area, unless the pretrial inmate has signed a waiver of his or her right not to work (see... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Inmate work/program assignment. 545.23... WORK AND COMPENSATION Inmate Work and Performance Pay Program § 545.23 Inmate work/program assignment...
Automating Formative and Summative Feedback for Individualised Assignments
Hamilton, Ian Robert
2009-01-01
Purpose: The purpose of this paper is to report on the rationale behind the use of a unique paper-based individualised accounting assignment, which automated the provision to students of immediate formative and timely summative feedback. Design/methodology/approach: As students worked towards completing their assignment, the package provided…
submission of art studio-based assignments: students experience
African Journals Online (AJOL)
PUBLICATIONS1
are reluctant to complete their studio assignments on time are critically ... quantitative and qualitative data, derived from survey and interviews, were used to ... is therefore exploratory and studio based. It ... homogenous group of students who report pro- ... Assignment management .... The analyses in this study are based on data.
13 CFR 500.210 - Assignment or transfer of loans.
2010-01-01
... has the effect of distributing the risks of the credit among other Lenders if: (i) Neither the loan... be modified, assigned, conveyed, sold or otherwise transferred by the Lender, in whole or in part... assignment or transfer of less than 100 percent of a Lender's interest in the Loan Documents and Guarantee...
Designing Internet research assignments: building a framework for instructor collaboration
Directory of Open Access Journals (Sweden)
David Ward
2000-01-01
Full Text Available Internet knowledge is increasing steadily among instructors in the academic world. As courses incorporate more instructional technology, traditional undergraduate research assignments are adapting to reflect the changing world of information and information access. New library assignments reflect this shift as well, with term papers and research projects asking students to use Web sites as an information resource, in addition to the standard literature of periodicals and monographs. But many of the pitfalls the library profession encountered during its own metamorphosis over the past decade are often repeated in these newer course assignments. The authors in this paper present a framework for librarians to interact with instructors to incorporate Internet resources into traditional term paper and research assignments. They suggest a framework for creating sample assignments librarians can take to campus instructional units, to show the teaching community at large what the library profession has learned from first-hand experience.
GENERAL ISSUES CONCERNING THE ASSIGNMENT OF SOCIAL PARTS
Directory of Open Access Journals (Sweden)
Stela Mihăilescu
2012-11-01
Full Text Available By means of the present study, we try to offer a comprehensive picture and analysis of the mode of assignment of social parts within a limited liability company. The assignment of social parts is free and unrestricted except for the cases provided by article 202, paragraph 2 of Law no. 31/1990, the law of commercial companies, with further modifications and completions, and those provided in OUG no. 54/2010 concerning some measures for fighting fiscal evasion. Through the assignment operation, a transfer is effected by a contract of assignment of social parts to one or more persons already associated in the company, or to other natural or legal persons who will thereby acquire the status of associate. The principle governing any assignment is the free circulation of goods, a freedom restricted only by public order and imperative judicial norms.
International Nuclear Information System (INIS)
Tahir-Kheli, R.A.
1975-01-01
A few simple problems relating to random magnetic systems are presented. Translational symmetry, only on the macroscopic scale, is assumed for these systems. A random set of parameters on the microscopic scale, for the various regions of these systems, is also assumed, obeying a probability distribution. Knowledge of the form of these probability distributions is assumed in all cases.
Nevalainen, T J; Gavin, J B; Seelye, R N; Whitehouse, S; Donnell, M
1978-07-01
The effect of normal and artificially induced rigor mortis on the vascular passage of erythrocytes and fluid through isolated dog hearts was studied. Increased rigidity of 6-mm thick transmural sections through the centre of the posterior papillary muscle was used as an indication of rigor. The perfusibility of the myocardium was tested by injecting 10 ml of 1% sodium fluorescein in Hanks solution into the circumflex branch of the left coronary artery. In prerigor hearts (20 minute incubation) fluorescein perfused the myocardium evenly whether or not it was preceded by an injection of 10 ml of heparinized dog blood. Rigor mortis developed in all hearts after 90 minutes incubation or within 20 minutes of perfusing the heart with 50 ml of 5 mM iodoacetate in Hanks solution. Fluorescein injected into hearts in rigor did not enter the posterior papillary muscle and adjacent subendocardium whether or not it was preceded by heparinized blood. Thus the vascular occlusion caused by rigor in the dog heart appears to be so effective that it prevents flow into the subendocardium of small soluble ions such as fluorescein.
Mummah, Sarah; Robinson, Thomas N; Mathur, Maya; Farzinkhou, Sarah; Sutton, Stephen; Gardner, Christopher D
2017-09-15
Mobile applications (apps) have been heralded as transformative tools to deliver behavioral health interventions at scale, but few have been tested in rigorous randomized controlled trials. We tested the effect of a mobile app to increase vegetable consumption among overweight adults attempting weight loss maintenance. Overweight adults (n=135) aged 18-50 years with BMI 28-40 kg/m² near Stanford, CA were recruited from an ongoing 12-month weight loss trial (parent trial) and randomly assigned to either the stand-alone, theory-based Vegethon mobile app (enabling goal setting, self-monitoring, and feedback and using "process motivators" including fun, surprise, choice, control, social comparison, and competition) or a wait-listed control condition. The primary outcome was daily vegetable servings, measured by an adapted Harvard food frequency questionnaire (FFQ) 8 weeks post-randomization. Daily vegetable servings from 24-hour dietary recalls, administered by trained, certified, and blinded interviewers 5 weeks post-randomization, was included as a secondary outcome. All analyses were conducted according to principles of intention-to-treat. Daily vegetable consumption was significantly greater in the intervention versus control condition for both measures (adjusted mean difference: 2.0 servings; 95% CI: 0.1, 3.8; p=0.04 for FFQ; and 1.0 servings; 95% CI: 0.2, 1.9; p=0.02 for 24-hour recalls). Baseline vegetable consumption was a significant moderator of intervention effects (p=0.002) in which effects increased as baseline consumption increased. These results demonstrate the efficacy of a mobile app to increase vegetable consumption among overweight adults. Theory-based mobile interventions may present a low-cost, scalable, and effective approach to improving dietary behaviors and preventing associated chronic diseases. ClinicalTrials.gov NCT01826591. Registered 27 March 2013.
2012-02-06
... evaluation, the evaluation team, led by researchers from Mathematica and its subcontractor MDRC, submitted... anyone to services to which they would not have referred them in the absence of the study. The study...
2010-11-10
... study, a consent form, a baseline information form (BIF), and a contact information form (CIF). WIA adult and dislocated worker applicants will be asked to sign a consent form to confirm that they have... be collected through cost collection forms and interviews with program staff during the second site...
DEFF Research Database (Denmark)
Damm, Anna Piil
2012-01-01
of men living in the neighborhood, but positively affected by the employment rate of non-Western immigrant men and co-national men living in the neighborhood. This is strong evidence that immigrants find jobs in part through their employed immigrant and co-ethnic contacts in the neighborhood of residence...... successfully addresses the methodological problem of endogenous neighborhood selection. Taking account of location sorting, living in a socially deprived neighborhood does not affect labor market outcomes of refugee men. Furthermore, their labor market outcomes are not affected by the overall employment rate...
Parise, Leigh M.; Corrin, William; Granito, Kelly; Haider, Zeest; Somers, Marie-Andrée; Cerna, Oscar
2017-01-01
While high school graduation rates are on the rise nationwide, too many students still never reach that milestone, with 7,000 on average dropping out every day. Recognizing that many students need additional support to succeed in school, Communities In Schools (CIS) works to provide and connect students with integrated support services to keep…
Weiss, Michael J.; Mayer, Alexander K.; Cullinan, Dan; Ratledge, Alyssa; Sommo, Colleen; Diamond, John
2015-01-01
Community colleges play a vital role in higher education, enrolling more than one in every three postsecondary students. While their market share has grown over the past 50 years, students' success rates remain low. Consequently, community college stakeholders are searching with mounting urgency for strategies that increase rates of success. We…
2011-07-21
... prior employment and training service delivery systems. The recent recession, high unemployment rate and... Evaluate Workforce Investment Act Adult and Dislocated Worker Programs; Request for Comment AGENCY... estimates of the net impacts of intensive services and training provided under the Workforce Investment Act...
Weiss, Michael J.; Mayer, Alexander; Cullinan, Dan; Ratledge, Alyssa; Sommo, Colleen; Diamond, John
2014-01-01
Empirical evidence confirms that increased education is positively associated with higher earnings across a wide spectrum of fields and student demographics (Barrow & Rouse, 2005; Card, 2001; Carneiro, Heckman, & Vytlacil, 2011; Dadgar & Weiss, 2012; Dynarski, 2008; Jacobson & Mokher, 2009; Jepsen, Troske, & Coomes, 2009; Kane…
Directory of Open Access Journals (Sweden)
Jin-Chao Liu
2017-12-01
Full Text Available AIM: To compare visual prognoses and postoperative adverse events of congenital cataract surgery performed at different times and using different surgical approaches. METHODS: In this prospective, randomized controlled trial, we recruited congenital cataract patients aged 3mo or younger before cataract surgery. Sixty-one eligible patients were randomly assigned to two groups according to surgical timing: a 3-month-old group and a 6-month-old group. Each eye underwent one of three randomly assigned surgical procedures, as follows: surgery A, lens aspiration (I/A); surgery B, lens aspiration with posterior continuous curvilinear capsulorhexis (I/A+PCCC); and surgery C, lens aspiration with posterior continuous curvilinear capsulorhexis and anterior vitrectomy (I/A+PCCC+A-Vit). The long-term best-corrected visual acuity (BCVA) and the incidence of complications in the different groups were compared and analyzed. RESULTS: A total of 57 participants (114 eyes) with a mean follow-up period of 48.7mo were included in the final analysis. The overall logMAR BCVA in the 6-month-old group was better than that in the 3-month-old group (0.81±0.28 vs 0.96±0.30; P=0.02). The overall logMAR BCVA scores in the surgery B group were lower than the scores in the A and C groups (A: 0.80±0.29, B: 1.02±0.28, and C: 0.84±0.28; P=0.007). A multivariate linear regression revealed no significant relationships between the incidence of complications and long-term BCVA. CONCLUSION: It might be safer and more beneficial for bilateral total congenital cataract patients to undergo surgery at 6mo of age than 3mo. Moreover, with rigorous follow-up and timely intervention, the postoperative complications in these patients are treatable and do not compromise visual outcomes.
Layout optimization of DRAM cells using rigorous simulation model for NTD
Jeon, Jinhyuck; Kim, Shinyoung; Park, Chanha; Yang, Hyunjo; Yim, Donggyu; Kuechler, Bernd; Zimmermann, Rainer; Muelders, Thomas; Klostermann, Ulrich; Schmoeller, Thomas; Do, Mun-hoe; Choi, Jung-Hoe
2014-03-01
scanning electron microscope (SEM) measurements. High resist impact and difficult model data acquisition demand a simulation model that is capable of extrapolating reliably beyond its calibration dataset. We use rigorous simulation models to provide that predictive performance. We have discussed the need for a rigorous mask optimization process for DRAM contact cell layout, yielding mask layouts that are optimal in process performance, mask manufacturability, and accuracy. In this paper, we have shown the step-by-step process from analytical illumination source derivation, through NTD- and application-tailored model calibration, to layout optimization such as OPC and SRAF placement. Finally, the work has been verified with simulation and experimental results on wafer.
Heuristic for Task-Worker Assignment with Varying Learning Slopes
Directory of Open Access Journals (Sweden)
Wipawee Tharmmaphornphilas
2010-04-01
Full Text Available The fashion industry produces a wide variety of products, so multi-skilled workers are required to improve flexibility in production and assignment. Generally, the supervisor assigns tasks to workers based on their skills and skill levels. Because new product styles are launched frequently in the fashion industry and order sizes tend to be small, workers are continually learning as raw materials and production processes change. Consequently, they require less time to produce succeeding units of a task, according to their learning ability. Since the workforce includes both experienced and inexperienced workers, each worker has a different skill level and learning ability, and an assignment that assumes constant skill levels is therefore inappropriate. This paper proposes a task-worker assignment that considers worker skill levels and learning abilities. The processing time of each worker changes over the production period due to the worker's learning ability. We focus on task-worker assignment in a fashion industry where tasks are ordered in series and the number of tasks is greater than the number of workers. Workers can therefore perform multiple assignments, subject to precedence restrictions, as in an assembly line balancing problem. The problem is formulated as an integer linear programming model with the objective of minimizing makespan. A heuristic is proposed to determine the lower bound (LB) and the upper bound (UB) of the problem, and the best assignment is determined. The performance of the heuristic method is tested by comparing solution quality and computational time against optimal solutions.
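The core ingredients of such a model can be sketched in a few lines. The code below is a simplified illustration, not the authors' ILP or heuristic: it assumes Wright's learning curve for unit times (the learning-rate parameterization is an assumption) and solves the contiguous, precedence-respecting task-to-worker partition by dynamic programming to minimize makespan.

```python
import math
from functools import lru_cache

def block_time(tasks, units, base_rate, learning_rate):
    """Total time for one worker to process `units` repetitions of each task
    in `tasks`, under Wright's learning curve: the n-th repetition takes
    base * n**log2(r), where r in (0, 1] is the learning rate."""
    b = math.log2(learning_rate)
    curve = sum(n ** b for n in range(1, units + 1))
    return sum(base_rate * t for t in tasks) * curve

def min_makespan(task_base_times, workers, units):
    """Assign serially ordered tasks to workers as contiguous blocks
    (as in assembly line balancing), minimizing makespan.
    `workers` is a list of (base_rate_multiplier, learning_rate) pairs."""
    n = len(task_base_times)

    @lru_cache(maxsize=None)
    def f(i, w):
        # minimal makespan for tasks i.. handled by workers w..
        if i == n:
            return 0.0
        rate, lr = workers[w]
        if w == len(workers) - 1:      # last worker takes everything left
            return block_time(task_base_times[i:], units, rate, lr)
        best = float("inf")
        for j in range(i + 1, n + 1):  # worker w takes tasks i..j-1
            cost = block_time(task_base_times[i:j], units, rate, lr)
            best = min(best, max(cost, f(j, w + 1)))
        return best

    return f(0, 0)
```

With identical workers, no learning (r = 1), and four unit-time tasks, the optimal split gives a makespan of 2.0; with a 90% learning curve the second repetition of a task costs 0.9 of the first.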
Staff assignment practices in nursing homes: review of the literature.
Rahman, Anna; Straker, Jane K; Manning, Lydia
2009-01-01
Consistent assignment, whereby nursing home staff members, particularly certified nurse aides, are assigned to the same residents on most shifts, is increasingly viewed as a cornerstone of culture change in nursing homes. It has been advocated as a best-care model that increases residents' quality of life while contributing to a more stable frontline staff. Given these potential benefits, consistent assignment is now widely viewed as superior to rotating assignment, an alternative staffing model that aims to distribute care burden more fairly among staff and ensure that workers are familiar with most residents. Despite favorable anecdotal reports about the benefits of consistent assignment, the research literature reports mixed and sometimes contradictory findings for this staffing practice. This article reviews the research pertaining to staff assignment practices in nursing homes. Reviewed here are 13 reports on experimental trials (6 reports), evaluation research (4 reports), and nursing home surveys (3 reports). The review reveals broad diversity in staffing practices and raises questions that challenge popular assumptions about consistent assignment. The article closes with a discussion of the research, policy, and practice implications of the research findings.
International Nuclear Information System (INIS)
Yang, Yu; Fritzsching, Keith J.; Hong, Mei
2013-01-01
A multi-objective genetic algorithm is introduced to predict the assignment of protein solid-state NMR (SSNMR) spectra with partial resonance overlap and missing peaks due to broad linewidths, molecular motion, and low sensitivity. This non-dominated sorting genetic algorithm II (NSGA-II) aims to identify all possible assignments that are consistent with the spectra and to compare the relative merit of these assignments. Our approach is modeled after the recently introduced Monte-Carlo simulated-annealing (MC/SA) protocol, with the key difference that NSGA-II simultaneously optimizes multiple assignment objectives instead of searching for possible assignments based on a single composite score. The multiple objectives include maximizing the number of consistently assigned peaks between multiple spectra (“good connections”), maximizing the number of used peaks, minimizing the number of inconsistently assigned peaks between spectra (“bad connections”), and minimizing the number of assigned peaks that have no matching peaks in the other spectra (“edges”). Using six SSNMR protein chemical shift datasets with varying levels of imperfection that was introduced by peak deletion, random chemical shift changes, and manual peak picking of spectra with moderately broad linewidths, we show that the NSGA-II algorithm produces a large number of valid and good assignments rapidly. For high-quality chemical shift peak lists, NSGA-II and MC/SA perform similarly well. However, when the peak lists contain many missing peaks that are uncorrelated between different spectra and have chemical shift deviations between spectra, the modified NSGA-II produces a larger number of valid solutions than MC/SA, and is more effective at distinguishing good from mediocre assignments by avoiding the hazard of suboptimal weighting factors for the various objectives. These two advantages, namely diversity and better evaluation, lead to a higher probability of predicting the correct
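The Pareto ranking at the heart of NSGA-II can be illustrated compactly. The sketch below (a generic non-dominated sort, not the authors' full algorithm) scores each candidate assignment on the four objectives, with the two "minimize" counts negated so that every objective is maximized; the scores are hypothetical.

```python
def dominates(a, b):
    """a Pareto-dominates b: at least as good in every objective, better in one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def non_dominated_fronts(scores):
    """Sort candidate solutions into successive Pareto fronts (NSGA-II style)."""
    remaining = list(range(len(scores)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(scores[j], scores[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# hypothetical candidates: (good_connections, used_peaks, -bad_connections, -edges)
scores = [(40, 95, -2, -3), (38, 90, -1, -1), (35, 99, -5, -2), (30, 80, -6, -8)]
```

Here the first three candidates trade off the objectives against each other and form the first front, while the last is dominated by all of them; ranking whole fronts, rather than a single weighted score, is what lets the algorithm keep a diverse set of plausible assignments.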
Assigning breed origin to alleles in crossbred animals.
Vandenplas, Jérémie; Calus, Mario P L; Sevillano, Claudia A; Windig, Jack J; Bastiaansen, John W M
2016-08-22
For some species, animal production systems are based on the use of crossbreeding to take advantage of the increased performance of crossbred compared to purebred animals. Effects of single nucleotide polymorphisms (SNPs) may differ between purebred and crossbred animals for several reasons: (1) differences in linkage disequilibrium between SNP alleles and a quantitative trait locus; (2) differences in genetic backgrounds (e.g., dominance and epistatic interactions); and (3) differences in environmental conditions, which result in genotype-by-environment interactions. Thus, SNP effects may be breed-specific, which has led to the development of genomic evaluations for crossbred performance that take such effects into account. However, to estimate breed-specific effects, it is necessary to know breed origin of alleles in crossbred animals. Therefore, our aim was to develop an approach for assigning breed origin to alleles of crossbred animals (termed BOA) without information on pedigree and to study its accuracy by considering various factors, including distance between breeds. The BOA approach consists of: (1) phasing genotypes of purebred and crossbred animals; (2) assigning breed origin to phased haplotypes; and (3) assigning breed origin to alleles of crossbred animals based on a library of assigned haplotypes, the breed composition of crossbred animals, and their SNP genotypes. The accuracy of allele assignments was determined for simulated datasets that include crosses between closely-related, distantly-related and unrelated breeds. Across these scenarios, the percentage of alleles of a crossbred animal that were correctly assigned to their breed origin was greater than 90 %, and increased with increasing distance between breeds, while the percentage of incorrectly assigned alleles was always less than 2 %. For the remaining alleles, i.e. 0 to 10 % of all alleles of a crossbred animal, breed origin could not be assigned. The BOA approach accurately assigns
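Step (3) of the BOA approach can be caricatured as a library lookup. The sketch below uses hypothetical data structures and ignores the partial-match handling and breed-composition information the real approach relies on: an origin is assigned only when a phased haplotype matches exactly one breed in the library, otherwise it is left unassigned.

```python
def assign_breed_origin(phased_haplotypes, library):
    """Assign a breed label to each phased haplotype of a crossbred animal
    by exact lookup in a library mapping haplotype -> set of breeds.
    Returns one label per haplotype: the breed if unambiguous, else None."""
    origins = []
    for hap in phased_haplotypes:
        breeds = library.get(hap, set())
        origins.append(next(iter(breeds)) if len(breeds) == 1 else None)
    return origins

# toy library of 4-SNP haplotypes observed in purebred animals of breeds A and B
library = {"0101": {"A"}, "1100": {"B"}, "0110": {"A", "B"}}
```

A haplotype seen in both breeds ("0110") and one absent from the library ("1111") both stay unassigned, mirroring the 0-10% of alleles the abstract reports as unassignable.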
Energy Technology Data Exchange (ETDEWEB)
Zhaoyuan Liu; Kord Smith; Benoit Forget; Javier Ortensi
2016-05-01
A new method for computing homogenized assembly neutron transport cross sections and diffusion coefficients that is both rigorous and computationally efficient is proposed in this paper. In the limit of a homogeneous hydrogen slab, the new method is equivalent to the long-used, and only-recently-published, CASMO transport method. The rigorous method is used to demonstrate the sources of inaccuracy in the commonly applied “out-scatter” transport correction. It is also demonstrated that the newly developed method is directly applicable to lattice calculations performed by Monte Carlo and is capable of computing rigorous homogenized transport cross sections for arbitrarily heterogeneous lattices. Comparisons of several common transport cross section approximations are presented for a simple problem of an infinite medium of hydrogen. The new method has also been applied in computing 2-group diffusion data for an actual PWR lattice from the BEAVRS benchmark.
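For reference, the “out-scatter” transport correction whose inaccuracy the paper examines is conventionally written as follows (standard textbook notation; this is the approximation being criticized, not the paper's rigorous method):

```latex
% Out-scatter approximation: the transport cross section subtracts the
% mean-cosine-weighted scattering, and the diffusion coefficient follows.
\Sigma_{tr} = \Sigma_t - \bar{\mu}_0 \, \Sigma_s,
\qquad
D = \frac{1}{3\,\Sigma_{tr}}
```

Here \(\Sigma_t\) and \(\Sigma_s\) are the total and scattering cross sections and \(\bar{\mu}_0\) is the average cosine of the scattering angle; the approximation neglects the energy dependence of in-scatter, which is especially inaccurate for hydrogen.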
Burrus, Barri B.; Scott, Alicia Richmond
2012-01-01
Adolescent parents and their children are at increased risk for adverse short- and long-term health and social outcomes. Effective interventions are needed to support these young families. We studied the evidence base and found a dearth of rigorously evaluated programs. Strategies from successful interventions are needed to inform both intervention design and policies affecting these adolescents. The lack of rigorous evaluations may be attributable to inadequate emphasis on and sufficient funding for evaluation, as well as to challenges encountered by program evaluators working with this population. More rigorous program evaluations are urgently needed to provide scientifically sound guidance for programming and policy decisions. Evaluation lessons learned have implications for other vulnerable populations. PMID:22897541
On the entropy of random surfaces with arbitrary genus
International Nuclear Information System (INIS)
Kostov, I.K.; Krzywicki, A.
1987-01-01
We calculate the susceptibility critical exponent γ for Polyakov random surfaces with arbitrary genus, using the Liouville theory to one-loop order. Some rigorous results obtained for special dimensionalities in a discrete version of the model are also noted. In all cases γ grows linearly with the genus of the surface. (orig.)
Randomized random walk on a random walk
International Nuclear Information System (INIS)
Lee, P.A.
1983-06-01
This paper discusses generalizations of the model introduced by Kehr and Kunter of the random walk of a particle on a one-dimensional chain which in turn has been constructed by a random walk procedure. The superimposed random walk is randomised in time according to the occurrences of a stochastic point process. The probability of finding the particle in a particular position at a certain instant is obtained explicitly in the transform domain. It is found that the asymptotic behaviour for large time of the mean-square displacement of the particle depends critically on the assumed structure of the basic random walk, giving a diffusion-like term for an asymmetric walk or a square root law if the walk is symmetric. Many results are obtained in closed form for the Poisson process case, and these agree with those given previously by Kehr and Kunter. (author)
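The model lends itself to a direct Monte Carlo check. The sketch below is a simplified simulation under stated assumptions (a finite chain with reflecting ends, which the analytical model does not have): the chain is itself built by a ±1 random walk, and the particle performs a symmetric nearest-neighbour walk on the chain indices at the epochs of a Poisson process, as in the randomized model described above.

```python
import random

def simulate_msd(t_max, rate, n_chain, n_trials, seed=0):
    """Monte Carlo estimate of the particle's mean-square displacement at time
    t_max for a symmetric walk on a chain built by a +/-1 random walk; the
    particle steps at the event times of a Poisson process of the given rate."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        # build the disordered chain: site positions from a +/-1 random walk
        pos = [0]
        for _ in range(2 * n_chain):
            pos.append(pos[-1] + rng.choice((-1, 1)))
        # particle walks on chain indices; step times are Poisson events
        i, t = n_chain, 0.0            # start in the middle of the chain
        while True:
            t += rng.expovariate(rate)
            if t > t_max:
                break
            i += rng.choice((-1, 1))
            i = max(0, min(2 * n_chain, i))   # reflect at the chain ends
        total += (pos[i] - pos[n_chain]) ** 2
    return total / n_trials
```

For a symmetric superimposed walk the abstract predicts square-root growth of the mean-square displacement; the simulation can be used to eyeball that behaviour by sweeping t_max, provided the chain is long enough that the ends are rarely reached.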
Directory of Open Access Journals (Sweden)
Ming-Wen An
2015-01-01
Background. A phase II design with an option for direct assignment (stop randomization and assign all patients to the experimental treatment) based on an interim analysis (IA) for a predefined subgroup was previously proposed. Here, we illustrate the modularity of the direct assignment option by applying it to the setting of two predefined subgroups and testing for separate subgroup main effects. Methods. We power the 2-subgroup direct assignment option design with 1 IA (DAD-1) to test for separate subgroup main effects, with assessment of power to detect an interaction in a post-hoc test. Simulations assessed the statistical properties of this design compared to the 2-subgroup balanced randomized design with 1 IA (BRD-1). Different response rates for treatment/control in subgroup 1 (0.4/0.2) and in subgroup 2 (0.1/0.2, 0.4/0.2) were considered. Results. The 2-subgroup DAD-1 preserves power and type I error rate compared to the 2-subgroup BRD-1, while exhibiting reasonable power in a post-hoc test for interaction. Conclusion. The direct assignment option is a flexible design component that can be incorporated into broader design frameworks while maintaining desirable statistical properties, clinical appeal, and logistical simplicity.
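The power and type I error figures above come from simulation. As a rough, hypothetical sketch of how one subgroup's power under balanced randomization can be estimated (this is not the authors' DAD-1/BRD-1 code; the per-arm sample size, one-sided pooled z-test, and alpha level are assumptions of the sketch):

```python
import random

def simulate_power(p_trt, p_ctl, n_per_arm=50, n_sims=2000, z_crit=1.645, seed=0):
    """Empirical power of a one-sided two-proportion z-test under balanced
    randomization -- a simplified stand-in for one subgroup of a BRD-type
    design without an interim analysis."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        x_t = sum(rng.random() < p_trt for _ in range(n_per_arm))
        x_c = sum(rng.random() < p_ctl for _ in range(n_per_arm))
        p1, p0 = x_t / n_per_arm, x_c / n_per_arm
        p_pool = (x_t + x_c) / (2 * n_per_arm)
        se = (2 * p_pool * (1 - p_pool) / n_per_arm) ** 0.5
        if se > 0 and (p1 - p0) / se > z_crit:
            hits += 1
    return hits / n_sims
```

Running this with the abstract's subgroup 1 rates (0.4 treatment vs. 0.2 control) gives substantial power, while equal rates recover roughly the nominal type I error rate.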
Rigorous classification and carbon accounting principles for low and Zero Carbon Cities
International Nuclear Information System (INIS)
Kennedy, Scott; Sgouridis, Sgouris
2011-01-01
A large number of communities, new developments, and regions aim to lower their carbon footprint and aspire to become 'Zero Carbon' or 'Carbon Neutral'. Yet there are neither clear definitions for the scope of emissions that such a label would address on an urban scale, nor is there a process for qualifying the carbon reduction claims. This paper addresses the question of how to define a Zero Carbon, Low Carbon, or Carbon Neutral urban development by proposing hierarchical emissions categories with three levels: internal emissions within the geographical boundary, external emissions directly caused by core municipal activities, and internal or external emissions due to non-core activities. Each level implies a different carbon management strategy (eliminating, balancing, and minimizing, respectively) needed to meet a Net Zero Carbon designation. The trade-offs, implications, and difficulties of implementing carbon debt accounting based upon these definitions are further analyzed. - Highlights: → A gap exists in comprehensive and standardized accounting methods for urban carbon emissions. → We propose a comprehensive and rigorous City Framework for Carbon Accounting (CiFCA). → CiFCA classifies emissions hierarchically with corresponding carbon management strategies. → Adoption of CiFCA allows for meaningful comparisons of claimed performance of eco-cities.
Revisiting the constant growth angle: Estimation and verification via rigorous thermal modeling
Virozub, Alexander; Rasin, Igal G.; Brandon, Simon
2008-12-01
Methods for estimating growth angle (θ_gr) values based on a posteriori analysis of directionally solidified material (e.g. drops) often involve assumptions of negligible gravitational effects as well as a planar solid/liquid interface during solidification. We relax both of these assumptions when using experimental drop shapes from the literature to estimate the relevant growth angles at the initial stages of solidification. Assuming these values to be constant, we use them as input into a rigorous heat transfer and solidification model of the growth process. This model, which is shown to reproduce the experimental shape of a solidified sessile water drop using the literature value of θ_gr = 0°, yields excellent agreement with experimental profiles using our estimated values for silicon (θ_gr = 10°) and germanium (θ_gr = 14.3°) solidifying on an isotropic crystalline surface. The effect of gravity on the solidified drop shape is found to be significant in the case of germanium, suggesting that gravity should either be included in the analysis or that care should be taken that the relevant Bond number is truly small enough in each measurement. The planar solidification interface assumption is found to be unjustified. Although this issue is important when simulating the inflection point in the profile of the solidified water drop, there are indications that solidified drop shapes (at least in the case of silicon) may be fairly insensitive to the shape of this interface.
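The Bond number mentioned above, Bo = ρgL²/γ, measures the ratio of gravitational to capillary forces; gravity can safely be neglected only when Bo ≪ 1. A minimal sketch follows; the germanium melt property values are approximate literature numbers used purely for illustration, not data from this study:

```python
def bond_number(density, gravity, length, surface_tension):
    """Bo = rho * g * L^2 / gamma: ratio of gravitational to
    capillary (surface tension) forces for a drop of size L (SI units)."""
    return density * gravity * length ** 2 / surface_tension

# Approximate values for molten germanium (assumed, for illustration):
# density ~5600 kg/m^3, surface tension ~0.6 N/m.
bo_large = bond_number(5600, 9.81, 5e-3, 0.6)   # 5 mm drop: Bo > 1, gravity matters
bo_small = bond_number(5600, 9.81, 5e-4, 0.6)   # 0.5 mm drop: Bo << 1
```

A millimetre-scale germanium drop thus sits near Bo ≈ 1, consistent with the finding that gravity noticeably distorts the solidified shape in that case.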
Manley, J.; Chegwidden, D.; Mote, A. S.; Ledley, T. S.; Lynds, S. E.; Haddad, N.; Ellins, K.
2016-02-01
EarthLabs, envisioned as a national model for high school Earth or Environmental Science lab courses, is adaptable for both undergraduate and middle school students. The collection includes ten online modules that combine to feature a global view of our planet as a dynamic, interconnected system, engaging learners in extended investigations. EarthLabs modules support state and national guidelines, including the NGSS, for science content. Four modules directly guide students to discover vital aspects of the oceans, while five other modules incorporate ocean sciences in order to complete an understanding of Earth's climate system. By interacting with scientific research data, satellite imagery, numerical data, computer visualizations, experiments, and video tutorials, students gain a broad perspective on the key role oceans play in the fishing industry, droughts, coral reefs, hurricanes, the carbon cycle, and life on land and in the seas, as well as in driving our changing climate. Students explore Earth system processes and build quantitative skills that enable them to objectively evaluate scientific findings for themselves as they move through ordered sequences that guide the learning. As a robust collection, EarthLabs modules engage students in extended, rigorous investigations allowing a deeper understanding of the ocean, climate and weather. This presentation provides an overview of the ten curriculum modules that comprise the EarthLabs collection developed by TERC and found at http://serc.carleton.edu/earthlabs/index.html. Evaluation data on the effectiveness and use in secondary education classrooms will be summarized.
A Rigorous Investigation on the Ground State of the Penson-Kolb Model
Yang, Kai-Hua; Tian, Guang-Shan; Han, Ru-Qi
2003-05-01
By using either numerical calculations or analytical methods, such as the bosonization technique, the ground state of the Penson-Kolb model has been previously studied by several groups. Some physicists argued that, as far as the existence of superconductivity in this model is concerned, it is canonically equivalent to the negative-U Hubbard model. However, others did not agree. In the present paper, we investigate this model by an independent and rigorous approach. We show that the ground state of the Penson-Kolb model is nondegenerate and has a nonvanishing overlap with the ground state of the negative-U Hubbard model. Furthermore, we also show that the ground states of both models have the same good quantum numbers and may have superconducting long-range order at the same momentum q = 0. Our results support the equivalence between these models. The project was partially supported by the Special Funds for Major State Basic Research Projects (G20000365) and the National Natural Science Foundation of China under Grant No. 10174002
Directory of Open Access Journals (Sweden)
Surendra P. Singh
2015-11-01
The Himalaya range encompasses enormous variation in elevation, precipitation, biodiversity, and patterns of human livelihoods. These mountains modify the regional climate in complex ways; the ecosystem services they provide influence the lives of almost 1 billion people in 8 countries. However, our understanding of these ecosystems remains rudimentary. The 2007 Intergovernmental Panel on Climate Change report that erroneously predicted a date for widespread glacier loss exposed how little was known of Himalayan glaciers. Recent research shows how variably glaciers respond to climate change in different Himalayan regions. Alarmist theories are not new. In the 1980s, the Theory of Himalayan Degradation warned of complete forest loss and devastation of downstream areas, an eventuality that never occurred. More recently, the debate on hydroelectric construction appears driven by passions rather than science. Poor data, hasty conclusions, and bad science plague Himalayan research. Rigorous sampling, involvement of civil society in data collection, and long-term collaborative research involving institutions from across the Himalaya are essential to improve knowledge of this region.
Hidayat, D.; Nurlaelah, E.; Dahlan, J. A.
2017-09-01
Mathematical creative thinking and critical thinking are two abilities that need to be developed in the learning of mathematics. Therefore, efforts need to be made in the design of learning that is capable of developing both capabilities. The purpose of this research is to examine the mathematical creative and critical thinking ability of students who receive the rigorous mathematical thinking (RMT) approach and students who receive the expository approach. This research was a quasi-experiment with a control group pretest-posttest design. The population was all 11th-grade students in one of the senior high schools in Bandung. The results showed that the achievement of mathematical creative and critical thinking abilities of students who received RMT was better than that of students who received the expository approach. The use of psychological tools and mediation, with the criteria of intentionality, reciprocity, and mediation of meaning in RMT, helps students develop the conditions for critical and creative processes. This achievement contributes to the development of integrated learning design for students' critical and creative thinking processes.
A Rigorous Theory of Many-Body Prethermalization for Periodically Driven and Closed Quantum Systems
Abanin, Dmitry; De Roeck, Wojciech; Ho, Wen Wei; Huveneers, François
2017-09-01
Prethermalization refers to the transient phenomenon where a system thermalizes according to a Hamiltonian that is not the generator of its evolution. We provide here a rigorous framework for quantum spin systems where prethermalization is exhibited for very long times. First, we consider quantum spin systems under periodic driving at high frequency ν. We prove that up to a quasi-exponential time τ* ~ e^{cν/log³ν}, the system barely absorbs energy. Instead, there is an effective local Hamiltonian D̂ that governs the time evolution up to τ*, and hence this effective Hamiltonian is a conserved quantity up to τ*. Next, we consider systems without driving, but with a separation of energy scales in the Hamiltonian. A prime example is the Fermi-Hubbard model where the interaction U is much larger than the hopping J. Also here we prove the emergence of an effective conserved quantity, different from the Hamiltonian, up to a time τ* that is (almost) exponential in U/J.
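The quasi-exponential bound τ* ~ e^{cν/log³ν} can be tabulated directly. A small sketch follows; the constant c and the frequency values are arbitrary placeholders, since the paper fixes only the functional form:

```python
import math

def heating_time(nu, c=1.0):
    """Quasi-exponential prethermalization timescale
    tau_* ~ exp(c * nu / log(nu)^3), an asymptotic form valid
    for large driving frequency nu (nu > 1 required here)."""
    return math.exp(c * nu / math.log(nu) ** 3)
```

For large ν the bound grows faster than any power of ν, which is why energy absorption is so strongly suppressed under high-frequency driving.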
Coupling of Rigor Mortis and Intestinal Necrosis during C. elegans Organismal Death
Directory of Open Access Journals (Sweden)
Evgeniy R. Galimov
2018-03-01
Organismal death is a process of systemic collapse whose mechanisms are less well understood than those of cell death. We previously reported that death in C. elegans is accompanied by a calcium-propagated wave of intestinal necrosis, marked by a wave of blue autofluorescence (death fluorescence). Here, we describe another feature of organismal death, a wave of body wall muscle contraction, or death contraction (DC). This phenomenon is accompanied by a wave of intramuscular Ca2+ release and, subsequently, of intestinal necrosis. Correlation of directions of the DC and intestinal necrosis waves implies coupling of these death processes. Long-lived insulin/IGF-1-signaling mutants show reduced DC and delayed intestinal necrosis, suggesting possible resistance to organismal death. DC resembles mammalian rigor mortis, a postmortem necrosis-related process in which Ca2+ influx promotes muscle hyper-contraction. In contrast to mammals, DC is an early rather than a late event in C. elegans organismal death.
Inosine-5'-monophosphate is a candidate agent to resolve rigor mortis of skeletal muscle.
Matsuishi, Masanori; Tsuji, Mariko; Yamaguchi, Megumi; Kitamura, Natsumi; Tanaka, Sachi; Nakamura, Yukinobu; Okitani, Akihiro
2016-11-01
The objective of the present study was to reveal the action of inosine-5'-monophosphate (IMP) toward myofibrils in postmortem muscles. IMP solubilized isolated actomyosin within a narrow range of KCl concentration, 0.19-0.20 mol/L, because of the dissociation of actomyosin into actin and myosin, but it did not solubilize the proteins in myofibrils with 0.2 mol/L KCl. However, IMP could solubilize both proteins in myofibrils with 0.2 mol/L KCl in the presence of 1 mmol/L pyrophosphate or 1.0-3.3 mmol/L adenosine-5'-diphosphate (ADP). Thus, we presumed that pyrophosphate and ADP released thin filaments composed of actin, and thick filaments composed of myosin, from the restraints of myofibrils, and that both filaments were then solubilized through the IMP-induced dissociation of actomyosin. We therefore concluded that IMP is a candidate agent to resolve rigor mortis because of its ability to break the association between thick and thin filaments. © 2016 Japanese Society of Animal Science.
Alternative pre-rigor foreshank positioning can improve beef shoulder muscle tenderness.
Grayson, A L; Lawrence, T E
2013-09-01
Thirty beef carcasses were harvested and the foreshank of each side was independently positioned (cranial, natural, parallel, or caudal) 1 h post-mortem to determine the effect of foreshank angle at rigor mortis on the sarcomere length and tenderness of six beef shoulder muscles. The infraspinatus (IS), pectoralis profundus (PP), serratus ventralis (SV), supraspinatus (SS), teres major (TM) and triceps brachii (TB) were excised 48 h post-mortem for Warner-Bratzler shear force (WBSF) and sarcomere length evaluations. All muscles except the SS had altered (P<0.05) sarcomere lengths between positions; the cranial position resulted in the longest sarcomeres for the SV and TB muscles whilst the natural position had longer sarcomeres for the PP and TM muscles. The SV from the cranial position had lower (P<0.05) shear than the caudal position and TB from the natural position had lower (P<0.05) shear than the parallel or caudal positions. Sarcomere length was moderately correlated (r=-0.63; P<0.01) to shear force. Copyright © 2013 Elsevier Ltd. All rights reserved.
Rigor mortis and the epileptology of Charles Bland Radcliffe (1822-1889).
Eadie, M J
2007-03-01
Charles Bland Radcliffe (1822-1889) was one of the physicians who made major contributions to the literature on epilepsy in the mid-19th century, when the modern understanding of the disorder was beginning to emerge, particularly in England. His experimental work was concerned with the electrical properties of frog muscle and nerve. Early in his career he related his experimental findings to the phenomenon of rigor mortis and concluded that, contrary to the general belief of the time, muscle contraction depended on the cessation of nerve input, and muscle relaxation on its presence. He adhered to this counter-intuitive interpretation throughout his life and, based on it, produced an epileptology that was very different from those of his contemporaries and successors. His interpretations were ultimately without any direct influence on the advance of knowledge. However, his idea that withdrawal of an inhibitory process released previously suppressed muscular contractile powers, when applied to the brain rather than the periphery of the nervous system, permitted Hughlings Jackson to explain certain psychological phenomena that accompany or follow some epileptic events. As well, Radcliffe was one of the chief early advocates for potassium bromide, the first effective anticonvulsant.
Hamid, H.
2018-01-01
The purpose of this study is to analyze the improvement of students' mathematical critical thinking (CT) ability in a Real Analysis course using the Rigorous Teaching and Learning (RTL) model with informal argument. In addition, this research also attempted to understand students' CT in relation to their initial mathematical ability (IMA). This study was conducted at a private university in the academic year 2015/2016. The study employed the quasi-experimental method with a pretest-posttest control group design. The participants were 83 students, of whom 43 were in the experimental group and 40 in the control group. The findings showed that students in the experimental group outperformed students in the control group in mathematical CT ability at each IMA level (high, medium, low) in learning Real Analysis. In addition, among students with medium IMA, the improvement in mathematical CT ability of those exposed to the RTL model with informal argument was greater than that of students exposed to conventional instruction (CI). There was no interaction effect between the learning model (RTL vs. CI) and IMA level (high, medium, low) on the improvement of mathematical CT ability. Finally, at each IMA level (high, medium, low) there was a significantly greater improvement in the achievement of all indicators of mathematical CT ability among students exposed to the RTL model with informal argument than among students exposed to CI.
Control group design: enhancing rigor in research of mind-body therapies for depression.
Kinser, Patricia Anne; Robins, Jo Lynne
2013-01-01
Although a growing body of research suggests that mind-body therapies may be appropriate to integrate into the treatment of depression, studies consistently lack methodological sophistication particularly in the area of control groups. In order to better understand the relationship between control group selection and methodological rigor, we provide a brief review of the literature on control group design in yoga and tai chi studies for depression, and we discuss challenges we have faced in the design of control groups for our recent clinical trials of these mind-body complementary therapies for women with depression. To address the multiple challenges of research about mind-body therapies, we suggest that researchers should consider 4 key questions: whether the study design matches the research question; whether the control group addresses performance, expectation, and detection bias; whether the control group is ethical, feasible, and attractive; and whether the control group is designed to adequately control for nonspecific intervention effects. Based on these questions, we provide specific recommendations about control group design with the goal of minimizing bias and maximizing validity in future research.
Directory of Open Access Journals (Sweden)
Henrik von Wehrden
2017-02-01
Sustainability science encompasses a unique field that is defined through its purpose, the problem it addresses, and its solution-oriented agenda. However, this orientation creates significant methodological challenges. In this discussion paper, we conceptualize sustainability problems as wicked problems to tease out the key challenges that sustainability science is facing if scientists intend to deliver on its solution-oriented agenda. Building on the available literature, we discuss three aspects that demand increased attention for advancing sustainability science: (1) methods with higher diversity and complementarity are needed to increase the chance of deriving solutions to the unique aspects of wicked problems; for instance, mixed methods approaches are potentially better suited to allow for an approximation of solutions, since they cover wider arrays of knowledge; (2) methodologies capable of dealing with wicked problems demand strict procedural and ethical guidelines, in order to ensure their integration potential; for example, learning from solution implementation in different contexts requires increased comparability between research approaches while carefully addressing issues of legitimacy and credibility; and (3) approaches are needed that allow for longitudinal research, since wicked problems are continuous and solutions can only be diagnosed in retrospect; for example, complex dynamics of wicked problems play out across temporal patterns that are not necessarily aligned with the common timeframe of participatory sustainability research. Taken together, we call for plurality in methodologies, emphasizing procedural rigor and the necessity of continuous research to effectively address wicked problems as well as methodological challenges in sustainability science.
Directory of Open Access Journals (Sweden)
MIHAI NOVAC
2012-05-01
According to many of its traditional critics, the main weakness in Kantian moral-political philosophy resides in its impossibility of admitting exceptions. In nuce, all these critical positions have converged, despite their reciprocal heterogeneity, in the so-called accusation of moral rigorism (unjustly, I would say) directed against Kant's moral and political perspective. As such, I will seek to defend Kant against this type of criticism by showing that any perspective attempting to evaluate Kant's ethics on the grounds of its capacity or incapacity to admit exceptions is a priori doomed to lack of sense, in its two logical alternatives, i.e. either as nonsense (predicating about empty notions) or as tautology (formulating ad hoc definitions and criteria with respect to Kant's system and then claiming that it does not hold with respect to them). Essentially, I will try to show that Kantian ethics can organically immunize itself epistemologically against any such so-called antirigorist criticism.
Rigorous Training of Dogs Leads to High Accuracy in Human Scent Matching-To-Sample Performance.
Directory of Open Access Journals (Sweden)
Sophie Marchal
Human scent identification is based on a matching-to-sample task in which trained dogs are required to compare a scent sample collected from an object found at a crime scene to that of a suspect. Based on dogs' greater olfactory ability to detect and process odours, this method has been used in forensic investigations to identify the odour of a suspect at a crime scene. The excellent reliability and reproducibility of the method largely depend on rigor in dog training. The present study describes the various steps of training that lead to high sensitivity scores, with dogs matching samples with 90% efficiency when the complexity of the scents presented in the sample is similar to that presented in the lineups, and specificity reaching a ceiling, with no false alarms in human scent matching-to-sample tasks. This high level of accuracy ensures reliable results in judicial human scent identification tests. Our data should also convince law enforcement authorities to use these results as official forensic evidence when dogs are trained appropriately.
Applying rigorous decision analysis methodology to optimization of a tertiary recovery project
International Nuclear Information System (INIS)
Wackowski, R.K.; Stevens, C.E.; Masoner, L.O.; Attanucci, V.; Larson, J.L.; Aslesen, K.S.
1992-01-01
This paper reports that the intent of this study was to rigorously examine all of the possible expansion, investment, operational, and CO2 purchase/recompression scenarios (over 2500) to yield a strategy that would maximize the net present value of the CO2 project at the Rangely Weber Sand Unit. Traditional methods of project management, which involve analyzing large numbers of single-case economic evaluations, were found to be too cumbersome and inaccurate for an analysis of this scope. The decision analysis methodology utilized a statistical approach which resulted in a range of economic outcomes. Advantages of the decision analysis methodology included: a more organized approach to classification of decisions and uncertainties; a clear sensitivity method to identify the key uncertainties; an application of probabilistic analysis through the decision tree; and a comprehensive display of the range of possible outcomes for communication to decision makers. This range made it possible to consider the upside and downside potential of the options and to weigh these against the Unit's strategies. Savings in time and manpower required to complete the study were also realized.
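The roll-back logic of the decision tree described above, with chance nodes averaged by probability and decision nodes maximized, can be sketched in a few lines. The tree below is a toy hypothetical with made-up numbers, not data from the Rangely study:

```python
def expected_npv(node):
    """Roll a decision tree back to its root expected NPV.
    A node is either a numeric leaf (an NPV outcome), a chance node
    (probability-weighted average of branches), or a decision node
    (choose the branch with the highest expected value)."""
    if isinstance(node, (int, float)):
        return node
    if node["type"] == "chance":
        return sum(p * expected_npv(child) for p, child in node["branches"])
    if node["type"] == "decision":
        return max(expected_npv(child) for _, child in node["branches"])
    raise ValueError("unknown node type")

# Toy example (hypothetical figures, in $MM): expand now, or keep status quo?
tree = {"type": "decision", "branches": [
    ("expand", {"type": "chance", "branches": [(0.6, 120.0), (0.4, -30.0)]}),
    ("status quo", 40.0),
]}
```

Here the expansion branch has expected NPV 0.6·120 − 0.4·30 = 60, so the roll-back prefers it over the 40 of the status quo; enumerating thousands of scenarios amounts to evaluating a much larger tree of this shape.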
Energy Technology Data Exchange (ETDEWEB)
Maksymov, Ivan S., E-mail: ivan.maksymov@uwa.edu.au [School of Physics M013, The University of Western Australia, 35 Stirling Highway, Crawley, WA 6009 (Australia); ARC Centre of Excellence for Nanoscale BioPhotonics, School of Applied Sciences, RMIT University, Melbourne, VIC 3001 (Australia); Hutomo, Jessica; Nam, Donghee; Kostylev, Mikhail [School of Physics M013, The University of Western Australia, 35 Stirling Highway, Crawley, WA 6009 (Australia)
2015-05-21
We demonstrate theoretically a ∼350-fold local enhancement of the intensity of the in-plane microwave magnetic field in multilayered structures made from a magneto-insulating yttrium iron garnet (YIG) layer sandwiched between two non-magnetic layers with a high dielectric constant matching that of YIG. The enhancement is predicted for the excitation regime when the microwave magnetic field is induced inside the multilayer by the transducer of a stripline Broadband Ferromagnetic Resonance (BFMR) setup. By means of a rigorous numerical solution of the Landau-Lifshitz-Gilbert equation consistently with Maxwell's equations, we investigate the magnetisation dynamics in the multilayer. We reveal a strong photon-magnon coupling, which manifests itself as anti-crossing of the ferromagnetic resonance magnon mode supported by the YIG layer and the electromagnetic resonance mode supported by the whole multilayered structure. The frequency of the magnon mode depends on the external static magnetic field, which in our case is applied tangentially to the multilayer in the direction perpendicular to the microwave magnetic field induced by the stripline of the BFMR setup. The frequency of the electromagnetic mode is independent of the static magnetic field. Consequently, the predicted photon-magnon coupling is sensitive to the applied magnetic field and thus can be used in magnetically tuneable metamaterials based on simultaneously negative permittivity and permeability achievable thanks to the YIG layer. We also suggest that the predicted photon-magnon coupling may find applications in microwave quantum information systems.
Chen, Xinzhong; Lo, Chiu Fan Bowen; Zheng, William; Hu, Hai; Dai, Qing; Liu, Mengkun
2017-11-01
Over the last decade, scattering-type scanning near-field optical microscopy and spectroscopy have been widely used in nano-photonics and material research due to their fine spatial resolution and broad spectral range. A number of simplified analytical models have been proposed to quantitatively understand the tip-scattered near-field signal. However, a rigorous interpretation of the experimental results is still lacking at this stage. Numerical modeling, on the other hand, is mostly done by simulating the local electric field slightly above the sample surface, which only qualitatively represents the near-field signal rendered by the tip-sample interaction. In this work, we performed a more comprehensive numerical simulation which is based on realistic experimental parameters and signal extraction procedures. By directly comparing to the experiments as well as other simulation efforts, our method offers a more accurate quantitative description of the near-field signal, paving the way for future studies of complex systems at the nanoscale.
Rigorous constraints on the matrix elements of the energy–momentum tensor
Directory of Open Access Journals (Sweden)
Peter Lowdon
2017-11-01
The structure of the matrix elements of the energy–momentum tensor plays an important role in determining the properties of the form factors A(q²), B(q²) and C(q²) which appear in the Lorentz covariant decomposition of the matrix elements. In this paper we apply a rigorous frame-independent distributional-matching approach to the matrix elements of the Poincaré generators in order to derive constraints on these form factors as q→0. In contrast to the literature, we explicitly demonstrate that the vanishing of the anomalous gravitomagnetic moment B(0) and the condition A(0)=1 are independent of one another, and that these constraints are not related to the specific properties or conservation of the individual Poincaré generators themselves, but are in fact a consequence of the physical on-shell requirement of the states in the matrix elements and the manner in which these states transform under Poincaré transformations.
Kornhaber, Rachel Anne; de Jong, A E E; McLean, L
2015-12-01
Qualitative methods are progressively being implemented by researchers for exploration within healthcare. However, there has been a longstanding and wide-ranging debate concerning the relative merits of qualitative research within the health care literature. This integrative review aimed to examine the contribution of qualitative research to burns care and subsequent rehabilitation. Studies were identified using an electronic search strategy covering the databases PubMed, Cumulative Index of Nursing and Allied Health Literature (CINAHL), Excerpta Medica database (EMBASE) and Scopus for peer-reviewed primary research in English published between 2009 and April 2014, using Whittemore and Knafl's integrative review method as a guide for analysis. From the 298 papers identified, 26 research papers met the inclusion criteria. Across all studies there was an average of 22 participants per study, with a range of 6-53 participants, conducted across 12 nations, focussing on burns prevention, paediatric burns, appropriate acquisition and delivery of burns care, and the pain and psychosocial implications of burns trauma. Careful and rigorous application of qualitative methodologies promotes and enriches the development of burns knowledge. In particular, the key elements of the qualitative methodological process and its publication are critical in disseminating credible and methodologically sound qualitative research. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.
International Nuclear Information System (INIS)
Calahorra, Yonatan; Mendels, Dan; Epstein, Ariel
2014-01-01
Bounded geometries introduce a fundamental problem in calculating the image force barrier lowering of metal-wrapped semiconductor systems. In bounded geometries, the derivation of the barrier lowering requires calculating the reference energy of the system, when the charge is at the geometry center. In the following, we formulate and rigorously solve this problem; this allows combining the image force electrostatic potential with the band diagram of the bounded geometry. The suggested approach is applied to spheres as well as cylinders. Furthermore, although the expressions governing cylindrical systems are complex and can only be evaluated numerically, we present analytical approximations for the solution, which allow easy implementation in calculated band diagrams. The results are further used to calculate the image force barrier lowering of metal-wrapped cylindrical nanowires; calculations show that although the image force potential is stronger than that of planar systems, taking the complete band-structure into account results in a weaker effect of barrier lowering. Moreover, when considering small diameter nanowires, we find that the electrostatic effects of the image force exceed the barrier region, and influence the electronic properties of the nanowire core. This study is of interest to the nanowire community, and in particular for the analysis of nanowire I−V measurements where wrapped or omega-shaped metallic contacts are used. (paper)
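As a point of reference for the planar comparison made above, the textbook image-force (Schottky) barrier lowering for a flat metal-semiconductor interface can be evaluated directly. The field strength and relative permittivity below are illustrative values, and the cylindrical and spherical corrections derived in the paper are not reproduced here:

```python
import math

# Textbook planar Schottky image-force barrier lowering, used here only as
# the flat-interface baseline the abstract compares against. The cylindrical
# and spherical reference-energy corrections of the paper are not included.
E_CHARGE = 1.602176634e-19      # elementary charge, C
EPS0 = 8.8541878128e-12         # vacuum permittivity, F/m

def planar_barrier_lowering_eV(field_V_per_m, eps_r):
    """Delta_phi = sqrt(q E / (4 pi eps)) for a flat metal-semiconductor interface."""
    dphi_J = math.sqrt(E_CHARGE**3 * field_V_per_m / (4 * math.pi * EPS0 * eps_r))
    return dphi_J / E_CHARGE    # convert J -> eV

# Example: 1e7 V/m field in a semiconductor with eps_r = 12 (illustrative)
print(round(planar_barrier_lowering_eV(1e7, 12), 4))  # prints 0.0346
```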
Rigorous Statistical Bounds in Uncertainty Quantification for One-Layer Turbulent Geophysical Flows
Qi, Di; Majda, Andrew J.
2018-04-01
Statistical bounds controlling the total fluctuations in mean and variance about a basic steady-state solution are developed for the truncated barotropic flow over topography. Statistical ensemble prediction is an important topic in weather and climate research. Here, the evolution of an ensemble of trajectories is considered using statistical instability analysis and is compared and contrasted with the classical deterministic instability for the growth of perturbations in one pointwise trajectory. The maximum growth of the total statistics in fluctuations is derived relying on the statistical conservation principle of the pseudo-energy. The saturation bound of the statistical mean fluctuation and variance in the unstable regimes with non-positive-definite pseudo-energy is achieved by linking with a class of stable reference states and minimizing the stable statistical energy. Two cases with dependence on initial statistical uncertainty and on external forcing and dissipation are compared and unified under a consistent statistical stability framework. The flow structures and statistical stability bounds are illustrated and verified by numerical simulations among a wide range of dynamical regimes, where subtle transient statistical instability exists in general with positive short-time exponential growth in the covariance even when the pseudo-energy is positive-definite. Among the various scenarios in this paper, there exist strong forward and backward energy exchanges between different scales which are estimated by the rigorous statistical bounds.
Diffraction-based overlay measurement on dedicated mark using rigorous modeling method
Lu, Hailiang; Wang, Fan; Zhang, Qingyun; Chen, Yonghui; Zhou, Chang
2012-03-01
Diffraction Based Overlay (DBO) has been widely evaluated by numerous authors, and results show that DBO can provide better performance than Imaging Based Overlay (IBO). However, DBO has its own problems. As is well known, modeling-based DBO (mDBO) faces challenges of low measurement sensitivity and crosstalk between various structure parameters, which may result in poor accuracy and precision. Meanwhile, the main obstacle encountered by empirical DBO (eDBO) is that several pads must be employed to gain sufficient information on overlay-induced diffraction signature variations, which consumes more wafer space and measuring time. Also, eDBO may suffer from mark profile asymmetry caused by processes. In this paper, we propose an alternative DBO technology that employs a dedicated overlay mark and takes a rigorous modeling approach. This technology needs only two or three pads for each direction, which is economical and time saving. While reducing the overlay measurement error induced by mark profile asymmetry, this technology is expected to be as accurate and precise as scatterometry technologies.
Estimation of the convergence order of rigorous coupled-wave analysis for OCD metrology
Ma, Yuan; Liu, Shiyuan; Chen, Xiuguo; Zhang, Chuanwei
2011-12-01
In most cases of optical critical dimension (OCD) metrology, when applying rigorous coupled-wave analysis (RCWA) to optical modeling, a high order of Fourier harmonics is usually chosen to guarantee the convergence of the final results. However, the total number of floating point operations grows dramatically as the truncation order increases. Therefore, it is critical to choose an appropriate order to obtain high computational efficiency without losing much accuracy. In this paper, the convergence order associated with the structural and optical parameters has been estimated through simulation. The results indicate that, with the other parameters fixed, the convergence order is linear in the period of the sample, both for planar diffraction and for conical diffraction. The illuminating wavelength also affects the convergence of the final result. Further investigation of the ratio of illuminating wavelength to period shows that the convergence order decreases as the ratio grows; when the ratio is fixed, the convergence order varies only slightly, especially within a specific wavelength range. This characteristic can be applied to estimate the optimum convergence order for given samples and thereby obtain high computational efficiency.
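The order-selection strategy described above can be sketched generically: increase the truncation order until successive results agree within a tolerance. The `solve` callable below is a stand-in toy model, not an actual RCWA solver:

```python
# Generic sketch of picking a truncation order for a harmonic expansion such
# as RCWA: raise the order until successive results change by less than a
# tolerance. `solve` is a stand-in for a real RCWA reflectance computation.
def estimate_convergence_order(solve, tol=1e-4, max_order=101):
    prev = solve(1)
    for order in range(3, max_order + 1, 2):   # odd orders keep symmetry
        cur = solve(order)
        if abs(cur - prev) < tol:
            return order, cur
        prev = cur
    return max_order, prev

# Toy "solver" whose result settles like 1/order^2 (illustration only).
order, value = estimate_convergence_order(lambda n: 0.3 + 1.0 / n**2)
print(order)  # prints 37
```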
Nyitrai, M; Hild, G; Lukács, A; Bódis, E; Somogyi, B
2000-01-28
Cyclic conformational changes in the myosin head are considered essential for muscle contraction. We hereby show that the extension of the fluorescence resonance energy transfer method described originally by Taylor et al. (Taylor, D. L., Reidler, J., Spudich, J. A., and Stryer, L. (1981) J. Cell Biol. 89, 362-367) allows determination of the position of a labeled point outside the actin filament in supramolecular complexes and also characterization of the conformational heterogeneity of an actin-binding protein while considering donor-acceptor distance distributions. Using this method we analyzed proximity relationships between two labeled points of S1 and the actin filament in the acto-S1 rigor complex. The donor (N-[[(iodoacetyl)amino]ethyl]-5-naphthylamine-1-sulfonate) was attached to either the catalytic domain (Cys-707) or the essential light chain (Cys-177) of S1, whereas the acceptor (5-(iodoacetamido)fluorescein) was attached to the actin filament (Cys-374). In contrast to the narrow positional distribution (assumed as being Gaussian) of Cys-707 (5 ± 3 Å), the positional distribution of Cys-177 was found to be broad (102 ± 4 Å). Such a broad positional distribution of the label on the essential light chain of S1 may be important in accommodating the helically arranged acto-myosin binding relative to the filament axis.
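The distance analysis above relies on Förster resonance energy transfer, whose efficiency falls off with the sixth power of the donor-acceptor distance. A minimal sketch, with an illustrative Förster radius rather than the value for the actual donor/acceptor pair used in the study:

```python
# Foerster resonance energy transfer efficiency: E = R0^6 / (R0^6 + r^6).
# R0 here is an illustrative Foerster radius in angstroms, not the value
# for the IAEDANS/IAF pair of the study.
def fret_efficiency(r_angstrom, r0_angstrom=50.0):
    return r0_angstrom**6 / (r0_angstrom**6 + r_angstrom**6)

print(round(fret_efficiency(50.0), 2))   # at r = R0, efficiency is 0.5
```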
K. Di; Y. Liu; B. Liu; M. Peng
2012-01-01
Chang'E-1 (CE-1) and Chang'E-2 (CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of landing and surface operation of the Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of the CE-1 and CE-2 CCD cameras based on the push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D c...
Imprint of DESI fiber assignment on the anisotropic power spectrum of emission line galaxies
Energy Technology Data Exchange (ETDEWEB)
Pinol, Lucas [Département de Physique, École Normale Supérieure, Paris (France); Cahn, Robert N. [Lawrence Berkeley National Laboratory, Berkeley, California (United States); Hand, Nick [Department of Astronomy, University of California, Berkeley, California (United States); Seljak, Uroš; White, Martin, E-mail: lucas.pinol@ens.fr, E-mail: rncahn@lbl.gov, E-mail: nhand@berkeley.edu, E-mail: useljak@berkeley.edu, E-mail: mwhite@berkeley.edu [Department of Physics, University of California, Berkeley, California (United States)
2017-04-01
The Dark Energy Spectroscopic Instrument (DESI), a multiplexed fiber-fed spectrograph, is a Stage-IV ground-based dark energy experiment aiming to measure redshifts for 29 million Emission-Line Galaxies (ELG), 4 million Luminous Red Galaxies (LRG), and 2 million Quasi-Stellar Objects (QSO). The survey design includes a pattern of tiling on the sky, the locations of the fiber positioners in the focal plane of the telescope, and an observation strategy determined by a fiber assignment algorithm that optimizes the allocation of fibers to targets. This strategy allows a given region to be covered on average five times for a five-year survey, with a typical variation of about 1.5 around the mean, which imprints a spatially-dependent pattern on the galaxy clustering. We investigate the systematic effects of the fiber assignment coverage on the anisotropic galaxy clustering of ELGs and show that, in the absence of any corrections, it leads to discrepancies of order ten percent on large scales for the power spectrum multipoles. We introduce a method where objects in a random catalog are assigned a coverage, and the mean density is computed separately for each coverage factor. We show that this method reduces, but does not eliminate, the effect. We next investigate the angular dependence of the contaminated signal, arguing that it is mostly localized to purely transverse modes. We demonstrate that the cleanest way to remove the contaminating signal is to perform an analysis of the anisotropic power spectrum P(k, μ) and remove the lowest μ bin, leaving the μ > 0 modes accurate at the few-percent level. Here, μ is the cosine of the angle between the line of sight and the direction of the wavevector k. We also investigate two alternative definitions of the random catalog and show that they are comparable to, but less effective than, the coverage randoms method.
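The mitigation step described above, binning the power spectrum in μ and discarding the lowest bin, can be sketched with mock data (the power values below are synthetic placeholders, not DESI measurements):

```python
import numpy as np

# Sketch of the cleanup step: estimate P(k, mu) on a (k, mu) grid and drop
# the lowest-mu bin, since mu = 0 corresponds to the purely transverse modes
# that carry the fiber-assignment contamination. Power values are synthetic.
rng = np.random.default_rng(0)
n_k, n_mu = 20, 5
mu_edges = np.linspace(0.0, 1.0, n_mu + 1)
pk_mu = rng.uniform(1e3, 1e4, size=(n_k, n_mu))   # mock P(k, mu) estimates

# keep only mu above the first bin edge, i.e. discard the most transverse bin
pk_clean = pk_mu[:, 1:]
mu_centers_kept = 0.5 * (mu_edges[1:-1] + mu_edges[2:])

print(pk_clean.shape, mu_centers_kept)
```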
Calibrated peer review assignments for the earth sciences
Rudd, J.A.; Wang, V.Z.; Cervato, C.; Ridky, R.W.
2009-01-01
Calibrated Peer Review (CPR), a web-based instructional tool developed as part of the National Science Foundation reform initiatives in undergraduate science education, allows instructors to incorporate multiple writing assignments in large courses without overwhelming the instructor. This study reports successful implementation of CPR in a large, introductory geology course and student learning of geoscience content. For each CPR assignment in this study, students studied web-based and paper resources, wrote an essay, and reviewed seven essays (three from the instructor, three from peers, and their own) on the topic. Although many students expressed negative attitudes and concerns, particularly about the peer review process of this innovative instructional approach, they also recognized the learning potential of completing CPR assignments. Comparing instruction on earthquakes and plate boundaries using a CPR assignment vs. an instructional video lecture and homework essay with extensive instructor feedback, students mastered more content via CPR instruction.
Characterizing Sailor and Command Enlisted Placement and Assignment Preferences
National Research Council Canada - National Science Library
Butler, Virginia
2002-01-01
.... DON currently matches sailors to billets using a labor-intensive detailing process. With evolving information technology, the assignment process could be accomplished using intelligent agents and web-based markets...
A Qualitative Analysis of the Turkish Gendarmerie Assignment Process
National Research Council Canada - National Science Library
Soylemez, Kadir
2005-01-01
...; this number increases to 43 million (65% of the population) in the summer months. This study is an organizational analysis of the current assignment process of the Turkish General Command of the Gendarmerie...
46 CFR 42.05-10 - Assigning authority.
2010-10-01
... Definition of Terms Used in This Subchapter § 42.05-10 Assigning authority. This term means the “American Bureau of Shipping” or such other recognized classification society which the Commandant may approve as...
ZAP: a distributed channel assignment algorithm for cognitive radio networks
Directory of Open Access Journals (Sweden)
Munaretto Anelise
2011-01-01
Full Text Available We propose ZAP, an algorithm for distributed channel assignment in cognitive radio (CR) networks. CRs are capable of identifying underutilized licensed bands of the spectrum, allowing their reuse by secondary users without interfering with primary users. In this context, efficient channel assignment is challenging as ideally it must be simple, incur acceptable communication overhead, provide timely response, and be adaptive to accommodate frequent changes in the network. Another challenge is the optimization of network capacity through interference minimization. In contrast to related work, ZAP addresses these challenges with a fully distributed approach based only on local (neighborhood) knowledge, while significantly reducing computational costs and the number of messages required for channel assignment. Simulations confirm the efficiency of ZAP in terms of (i) the performance tradeoff between different metrics and (ii) the fast achievement of a suitable assignment solution regardless of network size and density.
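The flavor of a local-knowledge channel assignment can be illustrated with a minimal greedy scheme in which each node repeatedly moves to the channel least used in its neighborhood; this is a generic sketch of the idea, not the ZAP protocol itself:

```python
import random

# Minimal greedy sketch of local-knowledge channel assignment: every node
# repeatedly switches to the channel least used among its neighbors. This
# illustrates the general idea only; it is not the ZAP protocol.
def assign_channels(adjacency, n_channels, rounds=10, seed=1):
    random.seed(seed)
    channel = {v: random.randrange(n_channels) for v in adjacency}
    for _ in range(rounds):
        for v in adjacency:                     # sequential local updates
            counts = [0] * n_channels
            for nb in adjacency[v]:
                counts[channel[nb]] += 1
            channel[v] = counts.index(min(counts))  # least-interfered channel
    return channel

# Tiny 4-node ring: with 2 channels a proper 2-coloring is interference-free.
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
ch = assign_channels(ring, 2)
conflicts = sum(ch[v] == ch[nb] for v in ring for nb in ring[v]) // 2
print(conflicts)  # prints 0
```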
Dynamic Passenger Assignment during Disruptions in Railway Systems
Zhu, Y.; Goverde, R.M.P.
2017-01-01
Passenger-oriented rescheduling problems receive increasing attention. However, the passenger assignment models used for evaluating the rescheduling solutions are usually simplified by many assumptions. To estimate passenger inconvenience more accurately, this paper establishes a dynamic passenger
A singular value sensitivity approach to robust eigenstructure assignment
DEFF Research Database (Denmark)
Søgaard-Andersen, Per; Trostmann, Erik; Conrad, Finn
1986-01-01
A design technique for improving the feedback properties of multivariable state feedback systems designed using eigenstructure assignment is presented. Based on a singular value analysis of the feedback properties a design parameter adjustment procedure is outlined. This procedure allows...
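For a concrete instance of eigenvalue assignment by state feedback, a hypothetical single-input system in controllable companion form makes the gain computation explicit; the singular values of the closed-loop matrix are then available for the kind of robustness analysis described above. All numbers are illustrative:

```python
import numpy as np

# Hedged sketch of state-feedback eigenvalue (pole) assignment for a
# single-input system in controllable companion form, followed by a
# singular-value computation on the closed-loop matrix in the spirit of
# the robustness analysis above. Values are illustrative only.
A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # open-loop: s^2 + 3s + 2
B = np.array([[0.0], [1.0]])

# Desired closed-loop polynomial (s+4)(s+5) = s^2 + 9s + 20.
# In companion form, feedback simply shifts the last-row coefficients:
K = np.array([[20.0 - 2.0, 9.0 - 3.0]])    # K = [18, 6]
A_cl = A - B @ K

eigs = np.sort(np.linalg.eigvals(A_cl).real)
sv = np.linalg.svd(A_cl, compute_uv=False)  # singular values for robustness study
print(eigs)
```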
Subcarrier Group Assignment for MC-CDMA Wireless Networks
Directory of Open Access Journals (Sweden)
Le-Ngoc Tho
2007-01-01
Full Text Available Two interference-based subcarrier group assignment strategies in dynamic resource allocation are proposed for MC-CDMA wireless systems to achieve high throughput in a multicell environment. Least interfered group assignment (LIGA) selects for each session the subcarrier group on which the user receives the minimum interference, while best channel ratio group assignment (BCRGA) chooses the subcarrier group with the largest channel response-to-interference ratio. Both an analytical framework and a simulation model are developed for evaluation of the throughput distribution of the proposed schemes. An iterative approach is devised to handle the complex interdependency between multicell interference profiles in the throughput analysis. Illustrative results show significant throughput improvement offered by the interference-based assignment schemes for MC-CDMA multicell wireless systems. In particular, under low loading conditions, LIGA renders the best performance. However, as the load increases, BCRGA tends to offer superior performance.
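The two selection rules can be illustrated in a few lines; the interference and channel-gain figures below are made up for the sketch:

```python
import numpy as np

# Toy illustration of the two rules: LIGA picks the subcarrier group with
# minimum measured interference, BCRGA the group with the largest channel
# response-to-interference ratio. Numbers are invented for the sketch.
interference = np.array([0.8, 0.2, 0.5, 0.3])   # per-group interference
channel_gain = np.array([0.9, 0.1, 0.7, 0.6])   # per-group channel response

liga_choice = int(np.argmin(interference))                   # least interfered
bcrga_choice = int(np.argmax(channel_gain / interference))   # best ratio

print(liga_choice, bcrga_choice)  # prints 1 3
```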
ZAP: a distributed channel assignment algorithm for cognitive radio networks
Junior, Paulo Roberto; Fonseca, Mauro; Munaretto, Anelise; Viana, Aline; Ziviani, Artur
2011-01-01
Abstract We propose ZAP, an algorithm for the distributed channel assignment in cognitive radio (CR) networks. CRs are capable of identifying underutilized licensed bands of the spectrum, allowing their reuse by secondary users without interfering with primary users. In this context, efficient channel assignment is challenging as ideally it must be simple, incur acceptable communication overhead, provide timely response, and be adaptive to accommodate frequent changes in the network. Another ...
A parametric visualization software for the assignment problem
Directory of Open Access Journals (Sweden)
Papamanthou Charalampos
2005-01-01
Full Text Available In this paper we present a parametric visualization software package used to assist the teaching of the Network Primal Simplex Algorithm for the assignment problem (AP). The assignment problem is a special case of the balanced transportation problem. The main functions of the algorithm and the design techniques are also presented. Through this process, we aim to underline the importance and necessity of using such educational methods in order to improve the teaching of computer algorithms.
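For classroom-sized instances like those the software visualizes, the assignment problem can even be solved by brute force (the network primal simplex method is what scales to larger ones); a minimal sketch with a hypothetical 3x3 cost matrix:

```python
from itertools import permutations

# Tiny brute-force solver for the balanced assignment problem: try every
# worker-to-job permutation and keep the cheapest. Fine for teaching-sized
# instances; the network primal simplex method handles larger ones.
def solve_assignment(cost):
    n = len(cost)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):        # worker i -> job perm[i]
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best_cost:
            best_perm, best_cost = perm, total
    return best_perm, best_cost

cost = [[4, 1, 3],     # hypothetical cost matrix
        [2, 0, 5],
        [3, 2, 2]]
print(solve_assignment(cost))  # prints ((1, 0, 2), 5)
```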
A demand assignment control in international business satellite communications network
Nohara, Mitsuo; Takeuchi, Yoshio; Takahata, Fumio; Hirata, Yasuo
An experimental system is being developed for use in an international business satellite (IBS) communications network based on demand-assignment (DA) and TDMA techniques. This paper discusses its system design, in particular from the viewpoints of a network configuration, a DA control, and a satellite channel-assignment algorithm. A satellite channel configuration is also presented along with a tradeoff study on transmission rate, HPA output power, satellite resource efficiency, service quality, and so on.
Interactive visual exploration and refinement of cluster assignments.
Kern, Michael; Lex, Alexander; Gehlenborg, Nils; Johnson, Chris R
2017-09-12
With ever-increasing amounts of data produced in biology research, scientists are in need of efficient data analysis methods. Cluster analysis, combined with visualization of the results, is one such method that can be used to make sense of large data volumes. At the same time, cluster analysis is known to be imperfect and depends on the choice of algorithms, parameters, and distance measures. Most clustering algorithms don't properly account for ambiguity in the source data, as records are often assigned to discrete clusters, even if an assignment is unclear. While there are metrics and visualization techniques that allow analysts to compare clusterings or to judge cluster quality, there is no comprehensive method that allows analysts to evaluate, compare, and refine cluster assignments based on the source data, derived scores, and contextual data. In this paper, we introduce a method that explicitly visualizes the quality of cluster assignments, allows comparisons of clustering results and enables analysts to manually curate and refine cluster assignments. Our methods are applicable to matrix data clustered with partitional, hierarchical, and fuzzy clustering algorithms. Furthermore, we enable analysts to explore clustering results in context of other data, for example, to observe whether a clustering of genomic data results in a meaningful differentiation in phenotypes. Our methods are integrated into Caleydo StratomeX, a popular, web-based, disease subtype analysis tool. We show in a usage scenario that our approach can reveal ambiguities in cluster assignments and produce improved clusterings that better differentiate genotypes and phenotypes.
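One simple way to quantify the assignment ambiguity discussed above, assuming fuzzy membership scores are available, is the gap between each record's best and second-best membership; the membership matrix below is synthetic:

```python
import numpy as np

# Sketch of surfacing cluster-assignment ambiguity: for each record, the gap
# between its best and second-best fuzzy membership. A small gap flags a
# record worth manual review and refinement. Membership values are synthetic.
memberships = np.array([
    [0.90, 0.05, 0.05],   # confidently in cluster 0
    [0.40, 0.38, 0.22],   # ambiguous between clusters 0 and 1
    [0.10, 0.15, 0.75],   # confidently in cluster 2
])
top2 = np.sort(memberships, axis=1)[:, -2:]     # two largest per row
ambiguity = 1.0 - (top2[:, 1] - top2[:, 0])     # 1 = fully ambiguous
flagged = np.where(ambiguity > 0.9)[0]          # records needing review
print(flagged)
```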
PARETO OPTIMAL SOLUTIONS FOR MULTI-OBJECTIVE GENERALIZED ASSIGNMENT PROBLEM
Directory of Open Access Journals (Sweden)
S. Prakash
2012-01-01
Full Text Available
ENGLISH ABSTRACT: The Multi-Objective Generalized Assignment Problem (MGAP with two objectives, where one objective is linear and the other one is non-linear, has been considered, with the constraints that a job is assigned to only one worker – though he may be assigned more than one job, depending upon the time available to him. An algorithm is proposed to find the set of Pareto optimal solutions of the problem, determining assignments of jobs to workers with two objectives without setting priorities for them. The two objectives are to minimise the total cost of the assignment and to reduce the time taken to complete all the jobs.
AFRIKAANSE OPSOMMING: A multi-objective generalised assignment problem (MGAP) with two objectives, one linear and the other non-linear, is studied, with the constraint that a job is assigned to only one worker, although more than one job may be assigned to him should the time be available. An algorithm is proposed to find the set of Pareto-optimal solutions that makes the job-to-worker assignments subject to the two objectives without assigning priorities. The two objectives are to minimise the total cost of the assignment and to reduce the time needed to complete all the jobs.
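Finding the Pareto set can be sketched by enumerating candidate assignments and keeping the non-dominated (cost, time) pairs; the objective values below are toy numbers, not the MGAP formulation itself:

```python
# Sketch of Pareto filtering for a two-objective minimization: keep the
# (cost, time) pairs not dominated by any other pair. The candidate values
# are toy numbers standing in for evaluated assignments, not the MGAP model.
def pareto_front(points):
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

candidates = [(10, 7), (8, 9), (12, 5), (11, 8), (12, 9)]
print(sorted(pareto_front(candidates)))  # prints [(8, 9), (10, 7), (12, 5)]
```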
Energy Technology Data Exchange (ETDEWEB)
Ahn, Hye-Kyung; Kim, Byoung Chan; Jun, Seung-Hyun; Chang, Mun Seock; Lopez-Ferrer, Daniel; Smith, Richard D.; Gu, Man Bock; Lee, Sang-Won; Kim, Beom S.; Kim, Jungbae
2010-12-15
An efficient protein digestion in proteomic analysis requires the stabilization of proteases such as trypsin. In the present work, trypsin was stabilized in the form of an enzyme coating on electrospun polymer nanofibers (EC-TR), which crosslinks additional trypsin molecules onto covalently-attached trypsin (CA-TR). EC-TR showed better stability than CA-TR under rigorous conditions, such as at high temperatures of 40 °C and 50 °C, in the presence of organic co-solvents, and at various pH values. For example, the half-lives of CA-TR and EC-TR were 0.24 and 163.20 hours at 40 °C, respectively. The improved stability of EC-TR can be explained by covalent linkages on the surface of trypsin molecules, which effectively inhibit the denaturation, autolysis, and leaching of trypsin. Protein digestion was performed at 40 °C using both CA-TR and EC-TR to digest a model protein, enolase. EC-TR showed better performance and stability than CA-TR, maintaining good enolase digestion performance over recycled use for a period of one week. Under the same conditions, CA-TR showed poor performance from the beginning and could not be used for digestion at all after a few uses. The enzyme coating approach is anticipated to be successfully employed not only for protein digestion in proteomic analysis, but also in various other fields where poor enzyme stability presently hampers the practical applications of enzymes.
Rigorous construction and Hadamard property of the Unruh state in Schwarzschild spacetime
International Nuclear Information System (INIS)
Dappiaggi, Claudio; Pinamonti, Nicola
2009-07-01
The discovery of the radiation properties of black holes prompted the search for a natural candidate quantum ground state for a massless scalar field theory on Schwarzschild spacetime, here considered in the Eddington-Finkelstein representation. Among the several proposals available in the literature, an important physical role is played by the so-called Unruh state, which is supposed to be appropriate to capture the physics of a black hole formed by spherically symmetric collapsing matter. In this respect, we shall consider a massless Klein-Gordon field and shall rigorously and globally construct such a state, that is, on the algebra of Weyl observables localised in the union of the static external region, the future event horizon and the non-static black hole region. Eventually, through a careful use of microlocal techniques, we prove that the constructed state fulfils, where defined, the so-called Hadamard condition; hence, it is perturbatively stable, in other words realizing the natural candidate with which one could study purely quantum phenomena such as the role of the back reaction of Hawking's radiation. From a geometrical point of view, we shall make profitable use of a bulk-to-boundary reconstruction technique which carefully exploits the Killing horizon structure as well as the conformal asymptotic behaviour of the underlying background. From an analytical point of view, our tools will range from Hörmander's theorem on the propagation of singularities, through results on the role of passive states, to a detailed use of the recently discovered peeling behaviour of the solutions of the wave equation in Schwarzschild spacetime. (orig.)
Rigor mortis at the myocardium investigated by post-mortem magnetic resonance imaging.
Bonzon, Jérôme; Schön, Corinna A; Schwendener, Nicole; Zech, Wolf-Dieter; Kara, Levent; Persson, Anders; Jackowski, Christian
2015-12-01
Post-mortem cardiac MR exams present with different contraction appearances of the left ventricle in cardiac short axis images. It was hypothesized that the grade of post-mortem contraction may be related to the post-mortem interval (PMI) or cause of death, and that it is a phenomenon caused by internal rigor mortis that may give further insights into the circumstances of death. The cardiac contraction grade was investigated in 71 post-mortem cardiac MR exams (mean age at death 52 y, range 12-89 y; 48 males, 23 females). In cardiac short axis images the left ventricular lumen volume as well as the left ventricular myocardial volume were assessed by manual segmentation. The quotient of both (LVQ) represents the grade of myocardial contraction. LVQ was correlated to the PMI, sex, age, cardiac weight, body mass and height, cause of death and pericardial tamponade when present. In cardiac causes of death a separate correlation was investigated for acute myocardial infarction cases and arrhythmic deaths. LVQ values ranged from 1.99 (maximum dilatation) to 42.91 (maximum contraction) with a mean of 15.13. LVQ decreased slightly with increasing PMI, though without significant correlation. Pericardial tamponade positively correlated with higher LVQ values. Variables such as sex, age, body mass and height, cardiac weight and cause of death did not correlate with LVQ values. There was no difference in LVQ values between myocardial infarction without tamponade and arrhythmic deaths. Based on the observations in our investigated cases, the phenomenon of post-mortem myocardial contraction cannot be explained by the influence of the investigated variables, except for pericardial tamponade. Further research addressing post-mortem myocardial contraction has to focus on other, less obvious factors, which may influence the early post-mortem phase too. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
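The LVQ computation is a simple quotient. The abstract does not state the order of division; since higher LVQ indicates stronger contraction (a smaller lumen), the sketch below assumes LVQ = myocardial volume / lumen volume, with illustrative volumes rather than patient data:

```python
# Sketch of the contraction grade described above. The abstract calls LVQ
# "the quotient of both" volumes without giving the order; we ASSUME
# LVQ = myocardial volume / lumen volume, since larger LVQ is reported for
# stronger contraction (smaller lumen). Volumes (ml) are illustrative only.
def lvq(myocardial_volume_ml, lumen_volume_ml):
    return myocardial_volume_ml / lumen_volume_ml

dilated = lvq(120.0, 60.0)      # large residual lumen -> low LVQ
contracted = lvq(120.0, 4.0)    # near-collapsed lumen -> high LVQ
print(round(dilated, 2), round(contracted, 2))  # prints 2.0 30.0
```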
A Generic Model for Relative Adjustment Between Optical Sensors Using Rigorous Orbit Mechanics
Directory of Open Access Journals (Sweden)
B. Islam
2008-06-01
Full Text Available The classical calibration or space resection is a fundamental task in photogrammetry. A lack of sufficient knowledge of the interior and exterior orientation parameters leads to unreliable results in the photogrammetric process. One of the earliest approaches used in photogrammetry was the plumb line calibration method. This method is suitable for recovering the radial and decentering lens distortion coefficients, while the remaining interior (focal length and principal point coordinates) and exterior orientation parameters have to be determined by a complementary method. As the lens distortion remains very small, it is not considered among the interior orientation parameters in the present rigorous sensor model. There are several other available methods based on the photogrammetric collinearity equations, which consider the determination of exterior orientation parameters, with no mention of the simultaneous determination of interior orientation parameters. Normal space resection methods solve the problem using control points whose coordinates are known both in the image and object reference systems. The non-linearity of the model, the problems in point location in digital images, and the difficulty of identifying enough GPS-measured control points are the main drawbacks of the classical approaches. This paper addresses a mathematical model based on the fundamental assumption of collinearity of three points of two along-track stereo imagery sensors and an independent object point. Assuming this condition, it is possible to extract the exterior orientation (EO) parameters for a long strip and a single image together, both without and with the use of control points. Moreover, after extracting the EO parameters, the accuracy of the satellite data products is compared using a single control point and using no control points.
Rigorous Performance Evaluation of Smartphone GNSS/IMU Sensors for ITS Applications
Directory of Open Access Journals (Sweden)
Vassilis Gikas
2016-08-01
Full Text Available With the rapid growth in smartphone technologies and improvement in their navigation sensors, an increasing amount of location information is now available, opening the road to the provision of new Intelligent Transportation System (ITS) services. Current smartphone devices embody miniaturized Global Navigation Satellite System (GNSS), Inertial Measurement Unit (IMU) and other sensors capable of providing user position, velocity and attitude. However, it is hard to characterize their actual positioning and navigation performance capabilities due to the disparate sensor and software technologies adopted among manufacturers and the high influence of environmental conditions, and therefore a unified certification process is missing. This paper presents the analysis results obtained from the assessment of two modern smartphones regarding their positioning accuracy (i.e., precision and trueness) capabilities (i.e., potential and limitations), based on a practical but rigorous methodological approach. Our investigation relies on the results of several vehicle tracking (i.e., cruising and maneuvering) tests realized by comparing smartphone-obtained trajectories and kinematic parameters to those derived using a high-end GNSS/IMU system and advanced filtering techniques. Performance testing is undertaken for the HTC One S (Android) and iPhone 5s (iOS). Our findings indicate that the deviation of the smartphone locations from ground truth (trueness) deteriorates by a factor of two in obscured environments compared to those derived in open-sky conditions. Moreover, it appears that the iPhone 5s produces relatively smaller and less dispersed error values compared to those computed for the HTC One S. Also, the navigation solution of the HTC One S appears to adapt faster to changes in environmental conditions, suggesting a somewhat different data filtering approach for the iPhone 5s. Testing the accuracy of the accelerometer and gyroscope sensors for a number of
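The trueness/precision split used above can be sketched as the systematic offset of measured positions from a reference trajectory versus the spread about that offset; the coordinates below are synthetic:

```python
import numpy as np

# Sketch of the accuracy decomposition: trueness as the mean offset of
# device positions from a reference trajectory, precision as the spread
# about that offset. Coordinates are synthetic 2D positions in metres,
# not data from the smartphones tested in the paper.
ref = np.zeros((6, 2))                               # reference trajectory
phone = np.array([[1.9, 0.1], [2.1, -0.2], [2.0, 0.0],
                  [1.8, 0.2], [2.2, -0.1], [2.0, 0.0]])

errors = phone - ref
bias = errors.mean(axis=0)
trueness = np.linalg.norm(bias)                                  # systematic offset
precision = np.sqrt(((errors - bias) ** 2).sum(axis=1).mean())   # spread about it
print(round(float(trueness), 3))  # prints 2.0
```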
Bounding Averages Rigorously Using Semidefinite Programming: Mean Moments of the Lorenz System
Goluskin, David
2018-04-01
We describe methods for proving bounds on infinite-time averages in differential dynamical systems. The methods rely on the construction of nonnegative polynomials with certain properties, similarly to the way nonlinear stability can be proved using Lyapunov functions. Nonnegativity is enforced by requiring the polynomials to be sums of squares, a condition which is then formulated as a semidefinite program (SDP) that can be solved computationally. Although such computations are subject to numerical error, we demonstrate two ways to obtain rigorous results: using interval arithmetic to control the error of an approximate SDP solution, and finding exact analytical solutions to relatively small SDPs. Previous formulations are extended to allow for bounds depending analytically on parametric variables. These methods are illustrated using the Lorenz equations, a system with three state variables (x, y, z) and three parameters (β, σ, r). Bounds are reported for infinite-time averages of all eighteen moments x^l y^m z^n up to quartic degree that are symmetric under (x, y) ↦ (−x, −y). These bounds apply to all solutions regardless of stability, including chaotic trajectories, periodic orbits, and equilibrium points. The analytical approach yields two novel bounds that are sharp: the mean of z^3 can be no larger than its value of (r−1)^3 at the nonzero equilibria, and the mean of xy^3 must be nonnegative. The interval arithmetic approach is applied at the standard chaotic parameters to bound eleven average moments that all appear to be maximized on the shortest periodic orbit. Our best upper bound on each such average exceeds its value on the maximizing orbit by less than 1%. Many bounds reported here are much tighter than would be possible without computer assistance.
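The sum-of-squares certificates described here reduce polynomial nonnegativity to semidefiniteness of a Gram matrix. A minimal hand-worked sketch of that mechanism (the polynomial is an illustrative example, not one arising from the Lorenz analysis):

```python
import numpy as np

# p(x) = x^4 - 2x^2 + 1 = (x^2 - 1)^2, written as z^T Q z over the
# monomial basis z = [1, x, x^2] with the Gram matrix:
Q = np.array([[ 1.0, 0.0, -1.0],
              [ 0.0, 0.0,  0.0],
              [-1.0, 0.0,  1.0]])

# Q positive semidefinite certifies p(x) >= 0 for all x; an SDP solver
# searches for such a Q (plus side constraints) automatically.
eigenvalues = np.linalg.eigvalsh(Q)
assert eigenvalues.min() >= -1e-12

# Sanity check: z^T Q z reproduces p at a sample point.
x = 1.7
z = np.array([1.0, x, x ** 2])
assert abs(z @ Q @ z - (x ** 4 - 2 * x ** 2 + 1)) < 1e-9
```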
Miyazaki, Hideki T; Miyazaki, Hiroshi; Miyano, Kenjiro
2003-09-01
We have recently identified the resonant scattering from dielectric bispheres in the specular direction, which has long been known as the specular resonance, to be a type of rainbow (a caustic) and a general phenomenon for bispheres. We discuss the details of the specular resonance on the basis of systematic calculations. In addition to the rigorous theory, which precisely describes the scattering even in the resonance regime, the ray-tracing method, which gives the scattering in the geometrical-optics limit, is used. Specular resonance is explicitly defined as strong scattering in the direction of the specular reflection from the symmetrical axis of the bisphere whose intensity exceeds that of the scattering from noninteracting bispheres. Then the range of parameters for computing a particular specular resonance is specified. This resonance becomes prominent in a wide range of refractive indices (from 1.2 to 2.2) in a wide range of size parameters (from five to infinity) and for an arbitrarily polarized light incident within an angle of 40 degrees to the symmetrical axis. This particular scattering can stay evident even when the spheres are not in contact or the sizes of the spheres are different. Thus specular resonance is a common and robust phenomenon in dielectric bispheres. Furthermore, we demonstrate that various characteristic features in the scattering from bispheres can be explained successfully by using intuitive and simple representations. Most of the significant scatterings other than the specular resonance are also understandable as caustics in geometrical-optics theory. The specular resonance becomes striking at the smallest size parameter among these caustics because its optical trajectory is composed of only the refractions at the surfaces and has an exceptionally large intensity. However, some characteristics are not accounted for by geometrical optics. In particular, the oscillatory behaviors of their scattering intensity are well described by
Directory of Open Access Journals (Sweden)
Andrea Sebastiano Staiti
2012-01-01
Full Text Available In this paper I present and assess Husserl's arguments against epistemological and psychological naturalism in his essay Philosophy as a Rigorous Science. I show that his critique is directed against positions that are generally more extreme than most currently debated variants of naturalism. However, Husserl has interesting thoughts to contribute to philosophy today. First, he shows that there is an important connection between naturalism in epistemology (which in his view amounts to the position that the validity of logic can be reduced to the validity of natural laws of thinking) and naturalism in psychology (which in his view amounts to the position that all psychic occurrences are merely parallel accompaniments of physiological occurrences). Second, he shows that a strong version of epistemological naturalism is self-undermining and fails to translate the cogency of logic into psychological terms. Third, and most importantly for current debates, he attacks Cartesianism as a form of psychological naturalism because of its construal of the psyche as a substance. Against this position, Husserl defends the necessity of formulating new epistemic aims for the investigation of consciousness. He contends that what is most interesting about consciousness is not its empirical fact but its transcendental function of granting cognitive access to all kinds of objects (both empirical and ideal). The study of this function requires a specific method (eidetics) that cannot be conflated with empirical methods. I conclude that Husserl's analyses offer much-needed insight into the fabric of consciousness and compelling arguments against unwarranted metaphysical speculations about the relationship between mind and body.
A Development of Advanced Rigorous 2 Step System for the High Resolution Residual Dose Evaluation
Energy Technology Data Exchange (ETDEWEB)
Kim, Do Hyun; Kim, Jong Woo; Kim, Jea Hyun; Lee, Jae Yong; Shin, Chang Ho [Hanyang Univ., Seoul (Korea, Republic of); Kim, Song Hyun [Kyoto University, Sennan (Japan)
2016-10-15
Nowadays, activation problems such as residual radiation are an important issue: activated devices and structures emit residual radiation, so activation should be properly analyzed when planning the design, operation, and decontamination of nuclear facilities. For activation calculation, the Rigorous 2 Step (R2S) method proceeds as follows: (1) a particle transport calculation is performed on the object geometry to obtain particle spectra and total fluxes; (2) the inventory of each cell is calculated from this flux information according to the irradiation and decay history; (3) if needed, the residual gamma distribution is evaluated with a transport code. This scheme is based on a cell-wise calculation of the geometry used. In this method, the particle spectra and total fluxes are obtained by mesh tally for the activation calculation, which is useful for reducing the effects of flux gradients. Nevertheless, several limitations are known: first, the relative error of the spectra becomes high when many meshes are used; second, void regions in a mesh tally contribute flux information that differs from the true spectrum. To calculate high-resolution residual dose, several methods have been developed, such as R2Smesh and MCR2S with unstructured mesh. The R2Smesh method obtains neutron spectra more efficiently by using fine/coarse meshes, and MCR2S with unstructured mesh can effectively separate the void spectrum. In this study, the AR2S system was developed to combine the features of these mesh-based R2S methods. To validate the AR2S system, a simple activation problem was evaluated and compared with the R2S method using the same division; the results agree within 0.83 %. It is therefore expected that the AR2S system can properly estimate activation problems.
Rigorous construction and Hadamard property of the Unruh state in Schwarzschild spacetime
Energy Technology Data Exchange (ETDEWEB)
Dappiaggi, Claudio; Pinamonti, Nicola [Hamburg Univ. (Germany). II. Inst. fuer Theoretische Physik; Moretti, Valter [Trento Univ., Povo (Italy). Dipt. di Matematica; Istituto Nazionale di Fisica Nucleare, Povo (Italy); Istituto Nazionale di Alta Matematica 'F. Severi', GNFM, Sesto Fiorentino (Italy)
2009-07-15
The discovery of the radiation properties of black holes prompted the search for a natural candidate quantum ground state for a massless scalar field theory on Schwarzschild spacetime, here considered in the Eddington-Finkelstein representation. Among the several available proposals in the literature, an important physical role is played by the so-called Unruh state, which is supposed to capture the physics of a black hole formed by spherically symmetric collapsing matter. In this respect, we consider a massless Klein-Gordon field and rigorously and globally construct such a state, that is, on the algebra of Weyl observables localised in the union of the static external region, the future event horizon and the non-static black hole region. Eventually, through a careful use of microlocal techniques, we prove that the constructed state fulfils, where defined, the so-called Hadamard condition; hence it is perturbatively stable, realizing the natural candidate with which one could study purely quantum phenomena such as the role of the back reaction of Hawking's radiation. From a geometrical point of view, we make profitable use of a bulk-to-boundary reconstruction technique which carefully exploits the Killing horizon structure as well as the conformal asymptotic behaviour of the underlying background. From an analytical point of view, our tools range from Hoermander's theorem on propagation of singularities, to results on the role of passive states, to a detailed use of the recently discovered peeling behaviour of the solutions of the wave equation in Schwarzschild spacetime. (orig.)
Erikson, U; Misimi, E
2008-03-01
The changes in skin and fillet color of anesthetized and exhausted Atlantic salmon were determined immediately after killing, during rigor mortis, and after ice storage for 7 d. Skin color (CIE L*, a*, b*, and related values) was determined by a Minolta Chroma Meter. Roche SalmoFan Lineal and Roche Color Card values were determined by a computer vision method and a sensory panel. Before color assessment, the stress levels of the 2 fish groups were characterized in terms of white muscle parameters (pH, rigor mortis, and core temperature). The results showed that perimortem handling stress initially significantly affected several color parameters of skin and fillets. Significant transient fillet color changes also occurred in the prerigor phase and during the development of rigor mortis. Our results suggested that fillet color was affected by postmortem glycolysis (pH drop, particularly in anesthetized fillets), then by onset and development of rigor mortis. The color change patterns during storage were different for the 2 groups of fish. The computer vision method was considered suitable for automated (online) quality control and grading of salmonid fillets according to color.
Geldenhuys, Greta; Muller, Nina; Frylinck, Lorinda; Hoffman, Louwrens C
2016-01-15
Baseline research on the toughness of Egyptian goose meat is required. This study therefore investigates the post mortem pH and temperature decline (15 min-4 h 15 min post mortem) in the pectoralis muscle (breast portion) of this gamebird species. It also explores the enzyme activity of the Ca(2+)-dependent protease (calpain system) and the lysosomal cathepsins during the rigor mortis period. No differences were found for any of the variables between genders. The pH decline in the pectoralis muscle occurs quite rapidly (c = -0.806; ultimate pH ∼ 5.86) compared with other species and it is speculated that the high rigor temperature (>20 °C) may contribute to the increased toughness. No calpain I was found in Egyptian goose meat and the µ/m-calpain activity remained constant during the rigor period, while a decrease in calpastatin activity was observed. The cathepsin B, B & L and H activity increased over the rigor period. Further research into the connective tissue content and myofibrillar breakdown during aging is required in order to know if the proteolytic enzymes do in actual fact contribute to tenderisation. © 2015 Society of Chemical Industry.
Warren, Mark R.; Calderón, José; Kupscznk, Luke Aubry; Squires, Gregory; Su, Celina
2018-01-01
Contrary to the charge that advocacy-oriented research cannot meet social science research standards because it is inherently biased, the authors of this article argue that collaborative, community-engaged scholarship (CCES) must meet high standards of rigor if it is to be useful to support equity-oriented, social justice agendas. In fact, they…
Directory of Open Access Journals (Sweden)
Kun Hu
2016-09-01
Full Text Available High-precision geometric rectification of High Resolution Satellite Imagery (HRSI) is the basis of digital mapping and Three-Dimensional (3D) modeling. By taking advantage of line features as basic geometric control conditions instead of control points, the Line-Based Transformation Model (LBTM) provides a practical and efficient way of image rectification. It can accurately establish the mathematical relationship between image space and the corresponding object space, while dramatically reducing the workloads of ground control and feature recognition. Based on generalization and analysis of existing LBTMs, a novel rigorous LBTM is proposed in this paper, which can further eliminate the geometric deformation caused by sensor inclination and terrain variation. This improved nonlinear LBTM is constructed based on a generalized point strategy and resolved by least squares overall adjustment. Geo-positioning accuracy experiments with IKONOS, GeoEye-1 and ZiYuan-3 satellite imagery are performed to compare the rigorous LBTM with other relevant line-based and point-based transformation models. Both theoretical analysis and experimental results demonstrate that the rigorous LBTM is more accurate and reliable without adding extra ground control. The geo-positioning accuracy of satellite imagery rectified by the rigorous LBTM can reach about one pixel with eight control lines and can be further improved by optimizing the horizontal and vertical distribution of the control lines.
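The least-squares overall adjustment step can be illustrated with a much simpler cousin of the LBTM: fitting a 2D affine transform to point correspondences. The coordinates below are made up, and a real LBTM uses line features and a nonlinear model rather than points and an affine map:

```python
import numpy as np

# Hypothetical image-space points and their object-space counterparts,
# generated by X = 2x + 2, Y = 2y + 1 (illustrative, not satellite data).
img = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 5]], dtype=float)
obj = np.array([[2, 1], [22, 1], [2, 21], [22, 21], [12, 11]], dtype=float)

# Solve obj ≈ [x, y, 1] @ A for the 3x2 parameter matrix A by least squares.
design = np.hstack([img, np.ones((len(img), 1))])
A, _, _, _ = np.linalg.lstsq(design, obj, rcond=None)

# Residual RMS measures how well the model explains the control data.
rms = np.sqrt(np.mean((design @ A - obj) ** 2))
```

With redundant control (here five points for six parameters), the residual RMS plays the same diagnostic role as the geo-positioning accuracy figures quoted above.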
Vaade, Elizabeth; McCready, Bo
2012-01-01
Traditionally, researchers, policymakers, and practitioners have perceived a tension between rigor and accessibility in quantitative research and evaluation in postsecondary education. However, this study indicates that both producers and consumers of these studies value high-quality work and clear findings that can reach multiple audiences. The…
Directory of Open Access Journals (Sweden)
van Mechelen Willem
2010-11-01
Full Text Available Abstract Background Preliminary studies suggest that physical exercise interventions can improve physical fitness, fatigue and quality of life in cancer patients after completion of chemotherapy. Additional research is needed to rigorously test the effects of exercise programmes among cancer patients and to determine optimal training intensity accordingly. The present paper presents the design of a randomized controlled trial evaluating the effectiveness and cost-effectiveness of a high intensity exercise programme compared to a low-to-moderate intensity exercise programme and a waiting list control group on physical fitness and fatigue as primary outcomes. Methods After baseline measurements, cancer patients who completed chemotherapy are randomly assigned to either a 12-week high intensity exercise programme or a low-to-moderate intensity exercise programme. Next, patients from both groups are randomly assigned to immediate training or a waiting list (i.e., the waiting list control group). After 12 weeks, patients of the waiting list control group start with the exercise programme they have been allocated to. Both interventions consist of equal bouts of resistance and endurance interval exercises with the same frequency and duration, but differ in training intensity. Additionally, patients of both exercise programmes are counselled to improve compliance and achieve and maintain an active lifestyle, tailored to their individual preferences and capabilities. Measurements will be performed at baseline (t = 0), 12 weeks after randomization (t = 1), and 64 weeks after randomization (t = 2). The primary outcome measures are cardiorespiratory fitness and muscle strength assessed by means of objective performance indicators, and self-reported fatigue. Secondary outcome measures include health-related quality of life, self-reported physical activity, daily functioning, body composition, mood and sleep disturbances, and return to work. In addition, compliance
Directory of Open Access Journals (Sweden)
Jae-Young Shin
2015-01-01
Full Text Available Purpose. This trial was performed to investigate the efficacy of laser acupuncture for the alleviation of lower back pain. Methods. This was a randomized, placebo-controlled, double-blind trial. Fifty-six participants were randomly assigned to either the laser acupuncture group (n=28) or the sham laser acupuncture group (n=28). Participants in both groups received three treatment sessions over the course of one week. Thirteen acupuncture points were selected. The visual analogue scale for pain, pressure pain threshold, Patient Global Impression of Change, and Euro-Quality-of-Life Five Dimensions questionnaire (Korean version) were used to evaluate the effect of laser acupuncture treatment on lower back pain. Results. There were no significant differences in any outcome between the two groups, although the participants in both groups showed a significant improvement in each assessed parameter relative to the baseline values. Conclusion. Although there was no significant difference in outcomes between the two groups, the results suggest that laser acupuncture can provide effective pain alleviation and can be considered an option for relief from lower back pain. Further studies using long-term intervention, a larger sample size, and rigorous methodology are required to clarify the effect of laser acupuncture on lower back pain.
Directory of Open Access Journals (Sweden)
Aleksandra E. Zgierska
2017-01-01
Full Text Available Background. Treatment fidelity is essential to methodological rigor of clinical trials evaluating behavioral interventions such as Mindfulness Meditation (MM). However, procedures for monitoring and maintenance of treatment fidelity are inconsistently applied, limiting the strength of such research. Objective. To describe the implementation and findings related to fidelity monitoring of the Mindfulness-Based Relapse Prevention for Alcohol Dependence (MBRP-A) intervention in a 26-week randomized controlled trial. Methods. 123 alcohol dependent adults were randomly assigned to MM (MBRP-A and home practice, adjunctive to usual care; N=64) or control (usual care alone; N=59). Treatment fidelity assessment strategies recommended by the National Institutes of Health Behavior Change Consortium for study/intervention design, therapist training, intervention delivery, and treatment receipt and enactment were applied. Results. Ten 8-session interventions were delivered. Therapist adherence and competence, assessed using the modified MBRP Adherence and Competence Scale, were high. Among the MM group participants, 46 attended ≥4 sessions; over 90% reported at-home MM practice at 8 weeks and 72% at 26 weeks. They also reported satisfaction with and usefulness of MM for maintaining sobriety. No adverse events were reported. Conclusions. A systematic approach to assessment of treatment fidelity in behavioral clinical trials allows determination of the degree of consistency between intended and actual delivery and receipt of intervention.
Vanmarcke, Erik
1983-03-01
Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.
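The book's device of local averages can be demonstrated in a few lines under the simplest possible assumption, an uncorrelated (white-noise) one-dimensional field, where averaging over a window of T samples reduces the point variance by the factor 1/T:

```python
import numpy as np

rng = np.random.default_rng(0)
field = rng.normal(size=100_000)  # white-noise "random field", unit variance

# Average over non-overlapping windows of length T. For uncorrelated samples
# the variance of the local average is 1/T (the variance function gamma(T)
# specializes to 1/T in this degenerate case; correlated fields decay slower).
T = 10
local_avg = field[: len(field) // T * T].reshape(-1, T).mean(axis=1)
# local_avg.var() comes out close to 1/T = 0.1
```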
Comparison of participatively set and assigned goals in the reduction of alcohol use.
Lozano, Brian E; Stephens, Robert S
2010-12-01
The effects of setting goals on goal commitment, self-efficacy for goal achievement, and goal achievement in the context of an alcohol use intervention were examined using an experimental design in which participants were randomized to participatively set goals, assigned goals, and no goal conditions. One hundred and twenty-six heavy-drinking college students received a single cognitive-behavioral assessment/intervention session and completed measures of goal commitment, self-efficacy for goal achievement, and alcohol use. Results were consistent with, and expanded upon, previous research by demonstrating that having a goal for limiting alcohol consumption was predictive of lower quantity and frequency of alcohol use relative to not having a goal. Participation in goal setting yielded greater goal commitment and self-efficacy for goal achievement than assigned goals, but did not result in significantly greater reductions in alcohol use relative to assigned goals. Goal commitment and self-efficacy explained unique variance in the prediction of alcohol use at follow-up. Findings support the importance of goal setting in alcohol interventions and suggest areas for further research. (PsycINFO Database Record (c) 2010 APA, all rights reserved).
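The three-condition randomization described here (participatively set, assigned, no goal) can be sketched with a balanced assignment helper; `assign_balanced` is a hypothetical illustration, not the study's actual procedure:

```python
import random

def assign_balanced(participants, conditions, seed=0):
    """Randomly assign participants to conditions with (near-)equal group sizes."""
    rng = random.Random(seed)
    # Repeat the condition labels to cover everyone, then shuffle the slots.
    slots = conditions * (len(participants) // len(conditions) + 1)
    slots = slots[: len(participants)]
    rng.shuffle(slots)
    return dict(zip(participants, slots))

# 126 participants split evenly across the three goal conditions.
groups = assign_balanced([f"P{i:03d}" for i in range(126)],
                         ["participative", "assigned", "no_goal"])
```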
Effect on injuries of assigning shoes based on foot shape in air force basic training.
Knapik, Joseph J; Brosch, Lorie C; Venuto, Margaret; Swedler, David I; Bullock, Steven H; Gaines, Lorraine S; Murphy, Ryan J; Tchandja, Juste; Jones, Bruce H
2010-01-01
This study examined whether assigning running shoes based on the shape of the bottom of the foot (plantar surface) influenced injury risk in Air Force Basic Military Training (BMT) and examined risk factors for injury in BMT. Data were collected from BMT recruits during 2007; analysis took place during 2008. After foot examinations, recruits were randomly assigned to either an experimental group (E, n=1042 men, 375 women) or a control group (C, n=913 men, 346 women). Experimental group recruits were assigned motion control, stability, or cushioned shoes for plantar shapes indicative of low, medium, or high arches, respectively. Control group recruits received a stability shoe regardless of plantar shape. Injuries during BMT were determined from outpatient visits provided by the Defense Medical Surveillance System. Other injury risk factors (fitness, smoking, physical activity, prior injury, menstrual history, and demographics) were obtained from a questionnaire, existing databases, or BMT units. Multivariate Cox regression controlling for other risk factors showed little difference in injury risk between the groups among men (hazard ratio [E/C]=1.11, 95% CI=0.89-1.38) or women (hazard ratio [E/C]=1.20, 95% CI=0.90-1.60). Independent injury risk factors among both men and women included low aerobic fitness and cigarette smoking. This prospective study demonstrated that assigning running shoes based on the shape of the plantar surface had little influence on injury risk in BMT even after controlling for other injury risk factors. Published by Elsevier Inc.
USER S GUIDE FOR THE RANDOM DRUG SCREENING SYSTEM
Energy Technology Data Exchange (ETDEWEB)
McNeany, Karen I [ORNL
2013-12-01
The Random Drug Screening System (RDSS) is a desktop computing application designed to assign nongameable drug testing dates to each member in a population of employees, within a specific timeline. The program includes reporting capabilities, test form generation, unique test ID number assignment, and the ability to flag high-risk employees for a higher frequency of drug testing than the general population.
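The date-assignment idea can be sketched as follows (a hypothetical helper, not the actual RDSS code): each employee independently receives a uniformly random date within the screening window, so upcoming tests cannot be predicted or gamed:

```python
import random
from datetime import date, timedelta

def assign_test_dates(employees, start, end, seed=None):
    """Draw an independent, uniform random test date in [start, end] per employee."""
    rng = random.Random(seed)
    span = (end - start).days
    return {emp: start + timedelta(days=rng.randint(0, span)) for emp in employees}

dates = assign_test_dates(["E001", "E002", "E003"],
                          date(2024, 1, 1), date(2024, 3, 31), seed=42)
```

Flagging high-risk employees for more frequent testing, as the RDSS does, could be layered on by drawing several dates per flagged employee.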
MSOAR 2.0: Incorporating tandem duplications into ortholog assignment based on genome rearrangement
Directory of Open Access Journals (Sweden)
Zhang Liqing
2010-01-01
Full Text Available Abstract Background Ortholog assignment is a critical and fundamental problem in comparative genomics, since orthologs are considered to be functional counterparts in different species and can be used to infer molecular functions of one species from those of other species. MSOAR is a recently developed high-throughput system for assigning one-to-one orthologs between closely related species on a genome scale. It attempts to reconstruct the evolutionary history of input genomes in terms of genome rearrangement and gene duplication events. It assumes that a gene duplication event inserts a duplicated gene into the genome of interest at a random location (i.e., the random duplication model). However, in practice, biologists believe that genes are often duplicated by tandem duplications, where a duplicated gene is located next to the original copy (i.e., the tandem duplication model). Results In this paper, we develop MSOAR 2.0, an improved system for one-to-one ortholog assignment. For a pair of input genomes, the system first focuses on the tandemly duplicated genes of each genome and tries to identify among them those that were duplicated after the speciation (i.e., the so-called inparalogs), using a simple phylogenetic tree reconciliation method. For each such set of tandemly duplicated inparalogs, all but one gene will be deleted from the concerned genome (because they cannot possibly appear in any one-to-one ortholog pairs), and MSOAR is invoked. Using both simulated and real data experiments, we show that MSOAR 2.0 is able to achieve a better sensitivity and specificity than MSOAR. In comparison with the well-known genome-scale ortholog assignment tool InParanoid, Ensembl ortholog database, and the orthology information extracted from the well-known whole-genome multiple alignment program MultiZ, MSOAR 2.0 shows the highest sensitivity. Although the specificity of MSOAR 2.0 is slightly worse than that of InParanoid in the real data experiments
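The tandem-inparalog preprocessing step ("delete all but one") can be sketched as a single scan over an ordered gene list. The data model below is hypothetical, and MSOAR 2.0 additionally uses phylogenetic tree reconciliation to decide which tandem copies are genuine inparalogs:

```python
def collapse_tandem_duplicates(genome):
    """Keep one representative from each maximal run of adjacent genes that
    belong to the same family, dropping the other tandem copies."""
    kept = []
    for gene, family in genome:
        if kept and kept[-1][1] == family:
            continue  # tandem copy of the previous gene's family: drop it
        kept.append((gene, family))
    return kept

# Toy genome as an ordered list of (gene_id, family) pairs.
genome = [("g1", "A"), ("g2", "A"), ("g3", "B"), ("g4", "A"), ("g5", "B"), ("g6", "B")]
reduced = collapse_tandem_duplicates(genome)
```

Here `g2` and `g6` are dropped as tandem copies, after which a rearrangement-based assignment like MSOAR can run on the reduced genomes.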
Socially-assigned race, healthcare discrimination and preventive healthcare services.
Directory of Open Access Journals (Sweden)
Tracy Macintosh
Full Text Available Race and ethnicity, typically defined as how individuals self-identify, are complex social constructs. Self-identified racial/ethnic minorities are less likely to receive preventive care and more likely to report healthcare discrimination than self-identified non-Hispanic whites. However, beyond self-identification, these outcomes may vary depending on whether racial/ethnic minorities are perceived by others as being minority or white; this perception is referred to as socially-assigned race.To examine the associations between socially-assigned race and healthcare discrimination and receipt of selected preventive services.Cross-sectional analysis of the 2004 Behavioral Risk Factor Surveillance System "Reactions to Race" module. Respondents from seven states and the District of Columbia were categorized into 3 groups, defined by a composite of self-identified race/socially-assigned race: Minority/Minority (M/M, n = 6,837), Minority/White (M/W, n = 929), and White/White (W/W, n = 25,913). Respondents were 18 years or older, with 61.7% under age 60; 51.8% of respondents were female. Measures included reported healthcare discrimination and receipt of vaccinations and cancer screenings.Racial/ethnic minorities who reported being socially-assigned as minority (M/M) were more likely to report healthcare discrimination compared with those who reported being socially-assigned as white (M/W) (8.9% vs. 5.0%, p = 0.002). Those reporting being socially-assigned as white (M/W and W/W) had similar rates for past-year influenza (73.1% vs. 74.3%) and pneumococcal (69.3% vs. 58.6%) vaccinations; however, rates were significantly lower among M/M respondents (56.2% and 47.6%, respectively; p-values < 0.05). There were no significant differences between the M/M and M/W groups in the receipt of cancer screenings.Racial/ethnic minorities who reported being socially-assigned as white are more likely to receive preventive vaccinations and less likely to report
RNA-PAIRS: RNA probabilistic assignment of imino resonance shifts
International Nuclear Information System (INIS)
Bahrami, Arash; Clos, Lawrence J.; Markley, John L.; Butcher, Samuel E.; Eghbalnia, Hamid R.
2012-01-01
The significant biological role of RNA has further highlighted the need for improving the accuracy, efficiency and the reach of methods for investigating RNA structure and function. Nuclear magnetic resonance (NMR) spectroscopy is vital to furthering the goals of RNA structural biology because of its distinctive capabilities. However, the dispersion pattern in the NMR spectra of RNA makes automated resonance assignment, a key step in NMR investigation of biomolecules, remarkably challenging. Herein we present RNA Probabilistic Assignment of Imino Resonance Shifts (RNA-PAIRS), a method for the automated assignment of RNA imino resonances with synchronized verification and correction of predicted secondary structure. RNA-PAIRS represents an advance in modeling the assignment paradigm because it seeds the probabilistic network for assignment with experimental NMR data and predicted RNA secondary structure, simultaneously and from the start. Subsequently, RNA-PAIRS sets in motion a dynamic network that reverberates between predictions and experimental evidence in order to reconcile and rectify resonance assignments and secondary structure information. The procedure is halted when assignments and base pairings are deemed to be most consistent with observed crosspeaks. The current implementation of RNA-PAIRS uses an initial peak list derived from proton-nitrogen heteronuclear multiple quantum correlation (¹H–¹⁵N 2D HMQC) and proton-proton nuclear Overhauser enhancement spectroscopy (¹H–¹H 2D NOESY) experiments. We have evaluated the performance of RNA-PAIRS by using it to analyze NMR datasets from 26 previously studied RNAs, including a 111-nucleotide complex. For moderately sized RNA molecules, and over a range of comparatively complex structural motifs, the average assignment accuracy exceeds 90%, while the average base pair prediction accuracy exceeded 93%. RNA-PAIRS yielded accurate assignments and base pairings consistent with imino resonances for a
RNA-PAIRS: RNA probabilistic assignment of imino resonance shifts
Energy Technology Data Exchange (ETDEWEB)
Bahrami, Arash; Clos, Lawrence J.; Markley, John L.; Butcher, Samuel E. [National Magnetic Resonance Facility at Madison (United States); Eghbalnia, Hamid R., E-mail: eghbalhd@uc.edu [University of Cincinnati, Department of Molecular and Cellular Physiology (United States)
2012-04-15
The significant biological role of RNA has further highlighted the need for improving the accuracy, efficiency, and reach of methods for investigating RNA structure and function. Nuclear magnetic resonance (NMR) spectroscopy is vital to furthering the goals of RNA structural biology because of its distinctive capabilities. However, the dispersion pattern in the NMR spectra of RNA makes automated resonance assignment, a key step in NMR investigation of biomolecules, remarkably challenging. Herein we present RNA Probabilistic Assignment of Imino Resonance Shifts (RNA-PAIRS), a method for the automated assignment of RNA imino resonances with synchronized verification and correction of predicted secondary structure. RNA-PAIRS represents an advance in modeling the assignment paradigm because it seeds the probabilistic network for assignment with experimental NMR data and predicted RNA secondary structure, simultaneously and from the start. Subsequently, RNA-PAIRS sets in motion a dynamic network that reverberates between predictions and experimental evidence in order to reconcile and rectify resonance assignments and secondary structure information. The procedure is halted when assignments and base pairings are deemed to be most consistent with observed crosspeaks. The current implementation of RNA-PAIRS uses an initial peak list derived from proton-nitrogen heteronuclear multiple quantum correlation (¹H-¹⁵N 2D HMQC) and proton-proton nuclear Overhauser enhancement spectroscopy (¹H-¹H 2D NOESY) experiments. We have evaluated the performance of RNA-PAIRS by using it to analyze NMR datasets from 26 previously studied RNAs, including a 111-nucleotide complex. For moderately sized RNA molecules, and over a range of comparatively complex structural motifs, the average assignment accuracy exceeds 90% and the average base pair prediction accuracy exceeds 93%. RNA-PAIRS yielded accurate assignments and base pairings consistent with imino resonances for a
Effect of an educational toolkit on quality of care: a pragmatic cluster randomized trial.
Shah, Baiju R; Bhattacharyya, Onil; Yu, Catherine H Y; Mamdani, Muhammad M; Parsons, Janet A; Straus, Sharon E; Zwarenstein, Merrick
2014-02-01
Printed educational materials for clinician education are one of the most commonly used approaches for quality improvement. The objective of this pragmatic cluster randomized trial was to evaluate the effectiveness of an educational toolkit focusing on cardiovascular disease screening and risk reduction in people with diabetes. All 933,789 people aged ≥40 years with diagnosed diabetes in Ontario, Canada were studied using population-level administrative databases, with additional clinical outcome data collected from a random sample of 1,592 high risk patients. Family practices were randomly assigned to receive the educational toolkit in June 2009 (intervention group) or May 2010 (control group). The primary outcome in the administrative data study, death or non-fatal myocardial infarction, occurred in 11,736 (2.5%) patients in the intervention group and 11,536 (2.5%) in the control group (p = 0.77). The primary outcome in the clinical data study, use of a statin, occurred in 700 (88.1%) patients in the intervention group and 725 (90.1%) in the control group (p = 0.26). Pre-specified secondary outcomes, including other clinical events, processes of care, and measures of risk factor control, were also not improved by the intervention. A limitation is the high baseline rate of statin prescribing in this population. The educational toolkit did not improve quality of care or cardiovascular outcomes in a population with diabetes. Despite being relatively easy and inexpensive to implement, printed educational materials were not effective. The study highlights the need for a rigorous and scientifically based approach to the development, dissemination, and evaluation of quality improvement interventions. http://www.ClinicalTrials.gov NCT01411865 and NCT01026688.
Triage level assignment and nurse characteristics and experience.
Gómez-Angelats, Elisenda; Miró, Òscar; Bragulat Baur, Ernesto; Antolín Santaliestra, Alberto; Sánchez Sánchez, Miquel
2018-06-01
To study the relation between nursing staff demographics and experience and their assignment of triage level in the emergency department. One-year retrospective observational study in the triage area of a tertiary care urban university hospital that applies the Andorran-Spanish triage model. Variables studied were age, gender, nursing experience, triage experience, shift, usual level of emergency work the nurse undertakes, number of triage decisions made, and percentage of patients assigned to each level. Fifty nurses (5 men, 45 women) with a mean (SD) age of 45 (9) years triaged 67 803 patients during the year. Nurses classified more patients in level 5 on the morning shift (7.9%) than on the afternoon shift (5.5%) (P=.003). The difference in the rate of level-5 triage classification became significant when nurses were older (β = 0.092, P=.037) and experience was greater (β = 0.103, P=.017). The number of triages recorded by a nurse was significantly and directly related to the percentage of patients assigned to level 3 (β = 0.003, P=.006) and inversely related to the percentages assigned to level 4 (β = -0.002, P=.008) and level 5 (β = -0.001, P=.017). We found that triage level assignments were related to age, experience, shift, and total number of patients triaged by a nurse.
Assigning spectra of chaotic molecules with diabatic correlation diagrams
International Nuclear Information System (INIS)
Rose, J.P.; Kellman, M.E.
1996-01-01
An approach for classifying and organizing spectra of highly excited vibrational states of molecules is investigated. As a specific example, we analyze the spectrum of an effective spectroscopic fitting Hamiltonian for H2O. In highly excited spectra, multiple resonance couplings and anharmonicity interact to give branching of the N original normal modes into new anharmonic modes, accompanied by the onset of widespread chaos. The anharmonic modes are identified by means of a bifurcation analysis of the spectroscopic Hamiltonian. A diabatic correlation diagram technique is developed to assign the levels with approximate "dynamical" quantum numbers corresponding to the dynamics determined from the bifurcation analysis. The resulting assignment shows significant disturbance from the conventional spectral pattern organization into sequences and progressions. The "dynamical" assignment is then converted into an assignment in terms of "nominal" quantum numbers that function like the N normal mode quantum numbers at low energy. The nominal assignments are used to reconstruct, as much as possible, an organization of the spectrum resembling the usual separation into sequences and progressions. Copyright 1996 American Institute of Physics
Wildlife forensic science: A review of genetic geographic origin assignment.
Ogden, Rob; Linacre, Adrian
2015-09-01
Wildlife forensic science has become a key means of enforcing legislation surrounding the illegal trade in protected and endangered species. A relatively new dimension to this area of forensic science is to determine the geographic origin of a seized sample. This review focuses on DNA testing, which relies on assignment of an unknown sample to its genetic population of origin. Key examples of this are the trade in timber, fish and ivory and these are used only to illustrate the large number of species for which this type of testing is potentially available. The role of mitochondrial and nuclear DNA markers is discussed, alongside a comparison of neutral markers with those exhibiting signatures of selection, which potentially offer much higher levels of assignment power to address specific questions. A review of assignment tests is presented along with detailed methods for evaluating error rates and considerations for marker selection. The availability and quality of reference data are of paramount importance to support assignment applications and ensure reliability of any conclusions drawn. The genetic methods discussed have been developed initially as investigative tools but comment is made regarding their use in courts. The potential to complement DNA markers with elemental assays for greater assignment power is considered and finally recommendations are made for the future of this type of testing. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Reflective practice: assessment of assignments in English for Specific Purposes
Directory of Open Access Journals (Sweden)
Galina Kavaliauskiené
2007-10-01
Full Text Available The construct of alternative assessment has been widely used in higher education. It is often defined as any type of assessment in which learners provide a response to an assignment. The key features of alternative assessment are the active participation of learners in self-evaluation of their performance and the development of reflective thinking (Schön, 1983). The success of alternative assessment in language teaching is predetermined by the student's performance and demonstrates the learner's language proficiency in contemporary communicative classrooms. This paper aims at researching the influence of students' evaluations of various assignments on their linguistic development in English for Specific Purposes (ESP). The study uses learners' assessment of different assignments and learners' in-course and post-course written reflections on benefits to language mastery. Learners' assignments included contributions to portfolios (dossiers), such as essays and summaries, oral presentations, short impromptu talks, creative tasks, tests, and self-assessment notes (reflections on activities in learning ESP). Findings were obtained for two streams of the project participants. Results showed that self-assessment was beneficial for learners' linguistic development. The content of learners' reflections reveals that attitudes to various assignments are affected by success or failure in students' performance. Reflective practice might help teachers develop ways of dealing with previously identified difficulties and improve the quality of teaching.
Visual words assignment via information-theoretic manifold embedding.
Deng, Yue; Li, Yipeng; Qian, Yanjun; Ji, Xiangyang; Dai, Qionghai
2014-10-01
Codebook-based learning provides a flexible way to extract the contents of an image in a data-driven manner for visual recognition. One central task in such frameworks is codeword assignment, which allocates local image descriptors to the most similar codewords in the dictionary to generate a histogram for categorization. Nevertheless, existing assignment approaches, e.g., the nearest neighbors strategy (hard assignment) and Gaussian similarity (soft assignment), suffer from two problems: 1) a too-strong Euclidean assumption and 2) neglecting the label information of the local descriptors. To address these two challenges, we propose a graph assignment method with maximal mutual information (GAMI) regularization. GAMI takes the power of manifold structure to better reveal the relationship of a massive number of local features by a nonlinear graph metric. Meanwhile, the mutual information of descriptor-label pairs is ultimately optimized in the embedding space for the sake of enhancing the discriminant property of the selected codewords. According to this objective, two optimization models, i.e., inexact-GAMI and exact-GAMI, are respectively proposed in this paper. The inexact model can be efficiently solved with a closed-form solution. The stricter exact-GAMI nonparametrically estimates the entropy of descriptor-label pairs in the embedding space and thus leads to a relatively complicated but still tractable optimization. The effectiveness of the GAMI models is verified on both public and our own datasets.
Directory of Open Access Journals (Sweden)
Stout Lydia
2007-12-01
Full Text Available Abstract Background The treatment of painful osteoporotic vertebral compression fractures has historically been limited to several weeks of bed rest, anti-inflammatory and analgesic medications, calcitonin injections, or external bracing. Percutaneous vertebroplasty (the injection of bone cement into the fractured vertebral body) is a relatively new procedure used to treat these fractures. There is increasing interest to examine the efficacy and safety of percutaneous vertebroplasty and to study the possibility of a placebo effect or whether the pain relief is from local anesthetics placed directly on the bone during the vertebroplasty procedure. Methods/Design Our goal is to test the hypothesis that patients with painful osteoporotic vertebral compression fractures who undergo vertebroplasty have less disability and pain at 1 month than patients who undergo a control intervention. The control intervention is placement of local anesthesia near the fracture, without placement of cement. One hundred sixty-six patients with painful osteoporotic vertebral compression fractures will be recruited over 5 years from US and foreign sites performing the vertebroplasty procedure. We will exclude patients with malignant tumor deposit (multiple myeloma, tumor mass, or tumor extension into the epidural space) at the level of the fracture. We will randomly assign participants to receive either vertebroplasty or the control intervention. Subjects will complete a battery of validated, standardized measures of pain, functional disability, and health related quality of life at baseline and at post-randomization time points (days 1, 2, 3, and 14, and months 1, 3, 6, and 12). Both subjects and research interviewers performing the follow-up assessments will be blinded to the randomization assignment. Subjects will have a clinic visit at months 1 and 12. Spine X-rays will be obtained at the end of the study (month 12) to determine subsequent fracture rates. Our co
Causality as a Rigorous Notion and Quantitative Causality Analysis with Time Series
Liang, X. S.
2017-12-01
Given two time series, can one faithfully tell, in a rigorous and quantitative way, the cause and effect between them? Here we show that this important and challenging question (one of the major challenges in the science of big data), which is of interest in a wide variety of disciplines, has a positive answer. Particularly, for linear systems, the maximal likelihood estimator of the causality from a series X2 to another series X1, written T2→1, turns out to be concise in form: T2→1 = (C11 C12 C2,d1 − C12² C1,d1) / (C11² C22 − C11 C12²), where Cij (i,j = 1,2) is the sample covariance between Xi and Xj, and Ci,dj the covariance between Xi and ΔXj/Δt, the difference approximation of dXj/dt using the Euler forward scheme. An immediate corollary is that causation implies correlation, but not vice versa, resolving the long-standing debate over causation versus correlation. The above formula has been validated with touchstone series purportedly generated with one-way causality that evades the classical approaches such as the Granger causality test and transfer entropy analysis. It has also been applied successfully to the investigation of many real problems. Through a simple analysis with the stock series of IBM and GE, an unusually strong one-way causality is identified from the former to the latter in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a "Giant" for the computer market. Another example presented here regards the cause-effect relation between the two climate modes, El Niño and the Indian Ocean Dipole (IOD). In general, these modes are mutually causal, but the causality is asymmetric. To El Niño, the information flowing from the IOD manifests itself as a propagation of uncertainty from the Indian Ocean. In the third example, an unambiguous one-way causality is found between CO2 and the global mean temperature anomaly. While it is confirmed that CO2 indeed drives the recent global warming
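The linear-system estimator quoted above is compact enough to sketch directly from sample covariances. The following Python sketch is a hedged illustration only: the function name, the Euler-forward step, and the synthetic one-way-coupled test system are our assumptions, not the paper's code.

```python
import numpy as np

def liang_information_flow(x1, x2, dt=1.0):
    """Estimate the information flow T_{2->1} from series x2 to x1 for a
    linear system, using the covariance formula quoted in the abstract.
    Sign and normalization conventions may differ between references."""
    dx1 = (x1[1:] - x1[:-1]) / dt            # Euler-forward difference dX1/dt
    C = np.cov(np.vstack([x1[:-1], x2[:-1], dx1]))
    C11, C12, C22 = C[0, 0], C[0, 1], C[1, 1]
    C1d1, C2d1 = C[0, 2], C[1, 2]            # covariances with dX1/dt
    num = C11 * C12 * C2d1 - C12**2 * C1d1
    den = C11**2 * C22 - C11 * C12**2
    return num / den
```

On synthetic series where x2 drives x1 one-way, |T2→1| should clearly exceed |T1→2|, matching the abstract's point that the estimator separates the two directions.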
Lewis, Cara C; Stanick, Cameo F; Martinez, Ruben G; Weiner, Bryan J; Kim, Mimi; Barwick, Melanie; Comtois, Katherine A
2015-01-08
science. Despite numerous constructs having greater than 20 available instruments, which implies saturation, preliminary results suggest that few instruments stem from gold standard development procedures. We anticipate identifying few high-quality, psychometrically sound instruments once our evidence-based assessment rating criteria have been applied. The results of this methodology may enhance the rigor of implementation science evaluations by systematically facilitating access to psychometrically validated instruments and identifying where further instrument development is needed.
Rigorous study of the gap equation for an inhomogeneous superconducting state near T/sub c/
International Nuclear Information System (INIS)
Hu, C.
1975-01-01
A rigorous analytic study of the self-consistent gap equation (symbolically Δ = F_T Δ), for an inhomogeneous superconducting state, is presented in the Bogoliubov formulation. The gap function Δ(r) is taken to simulate a planar normal-superconducting phase boundary: Δ(r) = Δ∞ tanh(αΔ∞ z/v_F) Θ(z), where Δ∞(T) is the equilibrium gap, v_F is the Fermi velocity, and Θ(z) is a unit step function. First a special space integral of the gap equation, ∝ ∫₀⁺^∞ (F_T − Δ)(dΔ/dz) dz, is evaluated essentially exactly, except for a nonperturbative WKBJ approximation used in solving the Bogoliubov-de Gennes equations. It is then expanded near the transition temperature T_c in powers of Δ∞ ∝ (1 − T/T_c)^(1/2), demonstrating an exact cancellation of a subseries of "anomalous-order" terms. The leading surviving term is found to agree in order, but not in magnitude, with the Ginzburg-Landau-Gor'kov (GLG) approximation. The discrepancy is found to be linked to the slope discontinuity in our chosen Δ. A contour-integral technique in a complex-energy plane is then devised to evaluate the local value of F_T − Δ exactly. Our result reveals that near T_c this method can reproduce the GLG result essentially everywhere, except within a BCS coherence length ∼ξ(T) from a singularity in Δ, where F_T − Δ can have a singular contribution with an "anomalous" local magnitude, not expected from the GLG approach. This anomalous term precisely accounts for the discrepancy found in the special integral of the gap equation as mentioned above, and likely explains the ultimate origin of the anomalous terms found in the free energy of an isolated vortex line by Cleary
Ar-Ar_Redux: rigorous error propagation of 40Ar/39Ar data, including covariances
Vermeesch, P.
2015-12-01
Rigorous data reduction and error propagation algorithms are needed to realise Earthtime's objective to improve the interlaboratory accuracy of 40Ar/39Ar dating to better than 1% and thereby facilitate the comparison and combination of the K-Ar and U-Pb chronometers. Ar-Ar_Redux is a new data reduction protocol and software program for 40Ar/39Ar geochronology which takes into account two previously underappreciated aspects of the method: 1. 40Ar/39Ar measurements are compositional data. In its simplest form, the 40Ar/39Ar age equation can be written as: t = log(1 + J[40Ar/39Ar − 298.56 × 36Ar/39Ar])/λ = log(1 + JR)/λ, where λ is the 40K decay constant and J is the irradiation parameter. The age t does not depend on the absolute abundances of the three argon isotopes but only on their relative ratios. Thus, the 36Ar, 39Ar and 40Ar abundances can be normalised to unity and plotted on a ternary diagram or 'simplex'. Argon isotopic data are therefore subject to the peculiar mathematics of 'compositional data', sensu Aitchison (1986, The Statistical Analysis of Compositional Data, Chapman & Hall). 2. Correlated errors are pervasive throughout the 40Ar/39Ar method. Current data reduction protocols for 40Ar/39Ar geochronology propagate the age uncertainty as follows: σ²(t) = [J² σ²(R) + R² σ²(J)] / [λ² (1 + J R)²], which implies zero covariance between R and J. In reality, however, significant error correlations are found in every step of the 40Ar/39Ar data acquisition and processing, in both single and multi collector instruments, during blank, interference and decay corrections, age calculation etc. Ar-Ar_Redux revisits every aspect of the 40Ar/39Ar method by casting the raw mass spectrometer data into a contingency table of logratios, which automatically keeps track of all covariances in a compositional context. Application of the method to real data reveals strong correlations (r² of up to 0.9) between age measurements within a single irradiation batch. Properly taking
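The age equation and its first-order error propagation, now keeping the R-J covariance term, can be sketched in a few lines. This is an illustrative sketch only: the decay-constant value and the sample numbers are our assumptions, not taken from Ar-Ar_Redux.

```python
import numpy as np

LAMBDA = 5.543e-10  # total 40K decay constant in 1/yr (a commonly used value)

def arar_age(R, J):
    """Age from t = ln(1 + J*R)/lambda, with R the radiogenic 40Ar*/39Ar ratio."""
    return np.log(1.0 + J * R) / LAMBDA

def arar_age_sigma(R, J, sR, sJ, cov_RJ=0.0):
    """First-order error propagation on t, including the R-J covariance term
    that the conventional zero-covariance formula omits."""
    dtdR = J / (LAMBDA * (1.0 + J * R))   # partial derivative dt/dR
    dtdJ = R / (LAMBDA * (1.0 + J * R))   # partial derivative dt/dJ
    var = dtdR**2 * sR**2 + dtdJ**2 * sJ**2 + 2.0 * dtdR * dtdJ * cov_RJ
    return np.sqrt(var)
```

With positively correlated R and J, the covariance term inflates the age uncertainty relative to the naive formula, which is exactly why ignoring it can be non-conservative.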
Peridynamics as a rigorous coarse-graining of atomistics for multiscale materials design
International Nuclear Information System (INIS)
Lehoucq, Richard B.; Aidun, John Bahram; Silling, Stewart Andrew; Sears, Mark P.; Kamm, James R.; Parks, Michael L.
2010-01-01
This report summarizes activities undertaken during FY08-FY10 for the LDRD Peridynamics as a Rigorous Coarse-Graining of Atomistics for Multiscale Materials Design. The goal of our project was to develop a coarse-graining of finite temperature molecular dynamics (MD) that successfully transitions from statistical mechanics to continuum mechanics. Our coarse-graining overcomes the intrinsic limitation of coupling atomistics with classical continuum mechanics via the FEM (finite element method), SPH (smoothed particle hydrodynamics), or MPM (material point method); namely, that classical continuum mechanics assumes a local force interaction that is incompatible with the nonlocal force model of atomistic methods. Therefore FEM, SPH, and MPM inherit this limitation. This seemingly innocuous dichotomy has far-reaching consequences; for example, classical continuum mechanics cannot resolve the short wavelength behavior associated with atomistics. Other consequences include spurious forces, invalid phonon dispersion relationships, and irreconcilable descriptions/treatments of temperature. We propose a statistically based coarse-graining of atomistics via peridynamics and so develop a first-of-a-kind mesoscopic capability to enable consistent, thermodynamically sound, atomistic-to-continuum (AtC) multiscale material simulation. Peridynamics (PD) is a microcontinuum theory that assumes nonlocal forces for describing long-range material interaction. The force interactions occurring at finite distances are naturally accounted for in PD. Moreover, PD's nonlocal force model is entirely consistent with those used by atomistic methods, in stark contrast to classical continuum mechanics. Hence, PD can be employed for mesoscopic phenomena that are beyond the realms of classical continuum mechanics and
Directory of Open Access Journals (Sweden)
Yue-Ying Pan
2015-01-01
Conclusions: Cognition partially improved in patients with OSAS after CPAP treatment. The only domain with significant improvement was vigilance. Rigorous randomized controlled trials need to be performed to obtain clear results.
DNATCO: assignment of DNA conformers at dnatco.org.
Černý, Jiří; Božíková, Paulína; Schneider, Bohdan
2016-07-08
The web service DNATCO (dnatco.org) classifies local conformations of DNA molecules beyond their traditional sorting into the A, B and Z DNA forms. DNATCO provides an interface to robust algorithms that assign conformation classes, called NtC, to dinucleotides extracted from DNA-containing structures uploaded in PDB format version 3.1 or above. The assigned dinucleotide NtC classes are further grouped into the DNA structural alphabet NtA, to the best of our knowledge the first DNA structural alphabet. The results are presented at two levels: in the form of user-friendly visualization and analysis of the assignment, and in the form of a downloadable, more detailed table for further analysis offline. The website is free and open to all users and there is no login requirement. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
The Utility of Writing Assignments in Undergraduate Bioscience
Libarkin, Julie; Ording, Gabriel
2012-01-01
We tested the hypothesis that engagement in a few, brief writing assignments in a nonmajors science course can improve student ability to convey critical thought about science. A sample of three papers written by students (n = 30) was coded for presence and accuracy of elements related to scientific writing. Scores for different aspects of scientific writing were significantly correlated, suggesting that students recognized relationships between components of scientific thought. We found that students' ability to write about science topics and state conclusions based on data improved over the course of three writing assignments, while the abilities to state a hypothesis and draw clear connections between human activities and environmental impacts did not improve. Three writing assignments generated significant change in student ability to write scientifically, although our results suggest that three is an insufficient number to generate complete development of scientific writing skills. PMID:22383616
Integrated project scheduling and staff assignment with controllable processing times.
Fernandez-Viagas, Victor; Framinan, Jose M
2014-01-01
This paper addresses a decision problem related to simultaneously scheduling the tasks in a project and assigning the staff to these tasks, taking into account that a task can be performed only by employees with certain skills, and that the length of each task depends on the number of employees assigned. This type of problems usually appears in service companies, where both tasks scheduling and staff assignment are closely related. An integer programming model for the problem is proposed, together with some extensions to cope with different situations. Additionally, the advantages of the controllable processing times approach are compared with the fixed processing times. Due to the complexity of the integrated model, a simple GRASP algorithm is implemented in order to obtain good, approximate solutions in short computation times.
Integrated Project Scheduling and Staff Assignment with Controllable Processing Times
Directory of Open Access Journals (Sweden)
Victor Fernandez-Viagas
2014-01-01
Full Text Available This paper addresses a decision problem related to simultaneously scheduling the tasks in a project and assigning the staff to these tasks, taking into account that a task can be performed only by employees with certain skills, and that the length of each task depends on the number of employees assigned. This type of problems usually appears in service companies, where both tasks scheduling and staff assignment are closely related. An integer programming model for the problem is proposed, together with some extensions to cope with different situations. Additionally, the advantages of the controllable processing times approach are compared with the fixed processing times. Due to the complexity of the integrated model, a simple GRASP algorithm is implemented in order to obtain good, approximate solutions in short computation times.
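The GRASP metaheuristic mentioned in both records follows a standard two-phase pattern: a greedy randomized construction followed by local search. The sketch below illustrates that pattern on a generic square task-to-staff cost matrix; the cost model, the restricted-candidate-list size, and the swap neighborhood are our illustrative assumptions, not the authors' formulation.

```python
import random

def grasp_assignment(cost, iters=100, rcl_size=3, seed=0):
    """GRASP sketch for a square assignment problem: repeatedly build a
    randomized greedy solution, then improve it by pairwise swaps."""
    rng = random.Random(seed)
    n = len(cost)
    best, best_cost = None, float("inf")
    for _ in range(iters):
        # Construction: each task picks randomly among the rcl_size
        # cheapest still-unassigned staff members.
        free = set(range(n))
        sol = []
        for task in range(n):
            ranked = sorted(free, key=lambda s: cost[task][s])
            pick = rng.choice(ranked[:rcl_size])
            free.discard(pick)
            sol.append(pick)
        # Local search: swap two tasks' assignments while it lowers cost.
        improved = True
        while improved:
            improved = False
            for i in range(n):
                for j in range(i + 1, n):
                    delta = (cost[i][sol[j]] + cost[j][sol[i]]
                             - cost[i][sol[i]] - cost[j][sol[j]])
                    if delta < 0:
                        sol[i], sol[j] = sol[j], sol[i]
                        improved = True
        c = sum(cost[t][sol[t]] for t in range(n))
        if c < best_cost:
            best, best_cost = sol[:], c
    return best, best_cost
```

The randomized construction gives GRASP the diversity that a pure greedy heuristic lacks, which is why even this simple version finds good approximate solutions quickly on small instances.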
A search asymmetry reversed by figure-ground assignment.
Humphreys, G W; Müller, H
2000-05-01
We report evidence demonstrating that a search asymmetry favoring concave over convex targets can be reversed by altering the figure-ground assignment of edges in shapes. Visual search for a concave target among convex distractors is faster than search for a convex target among concave distractors (a search asymmetry). By using shapes with ambiguous local figure-ground relations, we demonstrated that search can be efficient (with search slopes around 10 ms/item) or inefficient (with search slopes around 30-40 ms/item) with the same stimuli, depending on whether edges are assigned to concave or convex "figures." This assignment process can operate in a top-down manner, according to the task set. The results suggest that attention is allocated to spatial regions following the computation of figure-ground relations in parallel across the elements present. This computation can also be modulated by top-down processes.
Lower region: a new cue for figure-ground assignment.
Vecera, Shaun P; Vogel, Edward K; Woodman, Geoffrey F
2002-06-01
Figure-ground assignment is an important visual process; humans recognize, attend to, and act on figures, not backgrounds. There are many visual cues for figure-ground assignment. A new cue to figure-ground assignment, called lower region, is presented: Regions in the lower portion of a stimulus array appear more figurelike than regions in the upper portion of the display. This phenomenon was explored, and it was demonstrated that the lower-region preference is not influenced by contrast, eye movements, or voluntary spatial attention. It was found that the lower region is defined relative to the stimulus display, linking the lower-region preference to pictorial depth perception cues. The results are discussed in terms of the environmental regularities that this new figure-ground cue may reflect.
A Stone Resource Assignment Model under the Fuzzy Environment
Directory of Open Access Journals (Sweden)
Liming Yao
2012-01-01
to tackle a stone resource assignment problem with the aim of decreasing dust and waste water emissions. On the upper level, the local government wants to assign a reasonable exploitation amount to each stone plant so as to minimize total emissions and maximize employment and economic profit. On the lower level, stone plants must reasonably assign stone resources to produce different stone products under the exploitation constraint. To deal with inherent uncertainties, the objective functions and constraints are defuzzified using a possibility measure. A fuzzy simulation-based improved simulated annealing algorithm (FS-ISA) is designed to search for the Pareto optimal solutions. Finally, a case study is presented to demonstrate the practicality and efficiency of the model. Results and a comparison analysis are presented to highlight the performance of the optimization method, which proves to be very efficient compared with other algorithms.
Quality Control Test for Sequence-Phenotype Assignments
Ortiz, Maria Teresa Lara; Rosario, Pablo Benjamín Leon; Luna-Nevarez, Pablo; Gamez, Alba Savin; Martínez-del Campo, Ana; Del Rio, Gabriel
2015-01-01
Relating a gene mutation to a phenotype is a common task in different disciplines such as protein biochemistry. In this endeavour, it is common to find false relationships arising from mutations introduced by cells, which may be filtered out using a phenotypic assay; yet, such phenotypic assays may introduce additional false relationships arising from experimental errors. Here we introduce the use of high-throughput DNA sequencers and statistical analysis aimed at identifying incorrect DNA sequence-phenotype assignments, and we observed that 10-20% of these false assignments are expected in large screenings aimed at identifying critical residues for protein function. We further show that this level of incorrect DNA sequence-phenotype assignments may significantly alter our understanding of the structure-function relationship of proteins. We have made available an implementation of our method at http://bis.ifc.unam.mx/en/software/chispas. PMID:25700273
The ICAP (Interactive Course Assignment Pages) Publishing System
Directory of Open Access Journals (Sweden)
Kim Griggs
2008-03-01
Full Text Available The ICAP publishing system is an open source custom content management system that enables librarians to easily and quickly create and manage library help pages for course assignments (ICAPs, without requiring knowledge of HTML or other web technologies. The system's unique features include an emphasis on collaboration and content reuse, and an easy-to-use interface that includes in-line help, simple forms, and drag-and-drop functionality. The system generates dynamic, attractive course assignment pages that blend Web 2.0 features with traditional library resources, and makes the pages easier to find by providing a central web page for the course assignment pages. As of December 2007, the code is available as free, open-source software under the GNU General Public License.
Analyzing the multiple-target-multiple-agent scenario using optimal assignment algorithms
Kwok, Kwan S.; Driessen, Brian J.; Phillips, Cynthia A.; Tovey, Craig A.
1997-09-01
This work considers the problem of maximum utilization of a set of mobile robots with limited sensor-range capabilities and limited travel distances. The robots are initially in random positions. A set of robots properly guards or covers a region if every point within the region is within the effective sensor range of at least one vehicle. We wish to move the vehicles into surveillance positions so as to guard or cover a region, while minimizing the maximum distance traveled by any vehicle. This problem can be formulated as an assignment problem, in which we must optimally decide which robot to assign to which slot of a desired matrix of grid points. The cost function is the maximum distance traveled by any robot. Assignment problems can be solved very efficiently. Solution times for one hundred robots took only seconds on a Silicon Graphics Crimson workstation. The initial positions of all the robots can be sampled by a central base station and their newly assigned positions communicated back to the robots. Alternatively, the robots can establish their own coordinate system with the origin fixed at one of the robots and orientation determined by the compass bearing of another robot relative to this robot. This paper presents example solutions to the multiple-target-multiple-agent scenario using a matching algorithm. Two separate cases with one hundred agents in each were analyzed using this method. We have found these mobile robot problems to be a very interesting application of network optimization methods, and we expect this to be a fruitful area for future research.
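The min-max (bottleneck) objective in the abstract can be shown on a tiny invented instance: choose the robot-to-slot assignment whose worst single travel distance is smallest. Brute force over permutations is fine at this scale; the paper uses efficient assignment algorithms instead.

```python
from itertools import permutations
import math

# Invented positions for three robots and three grid slots.
robots = [(0.0, 0.0), (5.0, 1.0), (2.0, 4.0)]
slots = [(1.0, 1.0), (4.0, 4.0), (0.0, 3.0)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

best_perm, best_max = None, float("inf")
for perm in permutations(range(len(slots))):
    # perm[i] is the slot assigned to robot i; score by the worst distance.
    worst = max(dist(robots[i], slots[j]) for i, j in enumerate(perm))
    if worst < best_max:
        best_perm, best_max = perm, worst

print(best_perm, round(best_max, 3))
```

Note that minimizing the maximum distance can pick a different assignment than minimizing the total distance, which is why a bottleneck variant of the assignment problem is needed here.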
Analyzing the multiple-target-multiple-agent scenario using optimal assignment algorithms
International Nuclear Information System (INIS)
Kwok, K.S.; Driessen, B.J.; Phillips, C.A.; Tovey, C.A.
1997-01-01
This work considers the problem of maximum utilization of a set of mobile robots with limited sensor-range capabilities and limited travel distances. The robots are initially in random positions. A set of robots properly guards or covers a region if every point within the region is within the effective sensor range of at least one vehicle. The authors wish to move the vehicles into surveillance positions so as to guard or cover a region, while minimizing the maximum distance traveled by any vehicle. This problem can be formulated as an assignment problem, in which they must optimally decide which robot to assign to which slot of a desired matrix of grid points. The cost function is the maximum distance traveled by any robot. Assignment problems can be solved very efficiently. Solution times for one hundred robots took only seconds on a Silicon Graphics Crimson workstation. The initial positions of all the robots can be sampled by a central base station and their newly assigned positions communicated back to the robots. Alternatively, the robots can establish their own coordinate system with the origin fixed at one of the robots and orientation determined by the compass bearing of another robot relative to this robot. This paper presents example solutions to the multiple-target-multiple-agent scenario using a matching algorithm. Two separate cases with one hundred agents in each were analyzed using this method. They have found these mobile robot problems to be a very interesting application of network optimization methods, and they expect this to be a fruitful area for future research
Gomar, Jesús J; Valls, Elia; Radua, Joaquim; Mareca, Celia; Tristany, Josep; del Olmo, Francisco; Rebolleda-Gil, Carlos; Jañez-Álvarez, María; de Álvaro, Francisco J; Ovejero, María R; Llorente, Ana; Teixidó, Cristina; Donaire, Ana M; García-Laredo, Eduardo; Lazcanoiturburu, Andrea; Granell, Luis; Mozo, Cristina de Pablo; Pérez-Hernández, Mónica; Moreno-Alcázar, Ana; Pomarol-Clotet, Edith; McKenna, Peter J
2015-11-01
The effectiveness of cognitive remediation therapy (CRT) for the neuropsychological deficits seen in schizophrenia is supported by meta-analysis. However, a recent methodologically rigorous trial had negative findings. In this study, 130 chronic schizophrenic patients were randomly assigned to computerized CRT, an active computerized control condition (CC) or treatment as usual (TAU). Primary outcome measures were 2 ecologically valid batteries of executive function and memory, rated under blind conditions; other executive and memory tests and a measure of overall cognitive function were also employed. Carer ratings of executive and memory failures in daily life were obtained before and after treatment. Computerized CRT was found to produce improvement on the training tasks, but this did not transfer to gains on the primary outcome measures and most other neuropsychological tests in comparison to either CC or TAU conditions. Nor did the intervention result in benefits on carer ratings of daily life cognitive failures. According to this study, computerized CRT is not effective in schizophrenia. The use of both active and passive CCs suggests that the nature of the control group is not an important factor influencing results. © The Author 2015. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center.
Nurse leader mindfulness meditation program for stress management: a randomized controlled trial.
Pipe, Teri Britt; Bortz, Jennifer J; Dueck, Amylou; Pendergast, Debra; Buchda, Vicki; Summers, Jay
2009-03-01
The aim of this study was to rigorously evaluate a brief stress management intervention for nurse leaders. Despite the nursing shortage, evidence-based workplace approaches addressing nurse stress have not been well studied. Nurse leaders (n = 33) were randomly assigned to brief mindfulness meditation course (MMC) or leadership course (control). Self-report measures of stress were administered at baseline and within 1 week of course completion. Among MMC participants, change scores (from baseline to postintervention) on several subscales of the Symptom Checklist 90-Revised showed significantly more improvement in self-reported stress symptoms relative to controls. Mindfulness meditation course participants had significantly more improvement in Positive Symptom Distress Index (P = 0.010; confidence interval [CI] = -0.483 to -0.073) and Global Severity Index (P = 0.019; CI = -0.475 to -0.046) and nearly significantly more improvement in Positive Symptom Total (P = 0.066; CI = -16.66 to 0.581) compared with controls. Results support preliminary effectiveness of a 4-week MMC in reducing self-reported stress symptoms among nursing leaders.
Assignment of stock keeping units to parallel unidirectional picking lines
Directory of Open Access Journals (Sweden)
Matthews, Jason
2015-05-01
Full Text Available An order picking system consisting of a number of parallel unidirectional picking lines is investigated. Stock keeping units (SKUs) that are grouped by product type into distributions (DBNs) are assigned daily to available picking lines. A mathematical programming formulation and its relaxations are presented. A greedy insertion and a greedy phased insertion are further introduced to obtain feasible results within usable computation times for all test cases. The walking distance of the pickers was shown to decrease by about 22 per cent compared with the current assignment approach. However, product handling and operational risk increase.
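The greedy insertion idea can be sketched as a simple load-balancing pass (a hypothetical simplification of the paper's heuristics): DBNs are sorted by workload and each is placed on the picking line that currently carries the smallest total load. The pick counts and line count below are invented.

```python
# Picks per distribution (DBN); values are illustrative only.
dbns = {"A": 40, "B": 25, "C": 30, "D": 15, "E": 20}
n_lines = 2
lines = [[] for _ in range(n_lines)]
loads = [0] * n_lines

# Largest DBNs first, each to the currently least-loaded picking line.
for name, picks in sorted(dbns.items(), key=lambda kv: -kv[1]):
    i = loads.index(min(loads))
    lines[i].append(name)
    loads[i] += picks

print(lines, loads)
```

The real problem also accounts for walking distance along a unidirectional line, so line load is only a proxy objective in this sketch.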
Statistical methods of spin assignment in compound nuclear reactions
International Nuclear Information System (INIS)
Mach, H.; Johns, M.W.
1984-01-01
Spin assignment to nuclear levels can be obtained from standard in-beam gamma-ray spectroscopy techniques and in the case of compound nuclear reactions can be complemented by statistical methods. These are based on a correlation pattern between level spin and gamma-ray intensities feeding low-lying levels. Three types of intensity and level spin correlations are found suitable for spin assignment: shapes of the excitation functions, ratio of intensity at two beam energies or populated in two different reactions, and feeding distributions. Various empirical attempts are examined and the range of applicability of these methods as well as the limitations associated with them are given. 12 references
Statistical methods of spin assignment in compound nuclear reactions
International Nuclear Information System (INIS)
Mach, H.; Johns, M.W.
1985-01-01
Spin assignment to nuclear levels can be obtained from standard in-beam gamma-ray spectroscopy techniques and in the case of compound nuclear reactions can be complemented by statistical methods. These are based on a correlation pattern between level spin and gamma-ray intensities feeding low-lying levels. Three types of intensity and level spin correlations are found suitable for spin assignment: shapes of the excitation functions, ratio of intensity at two beam energies or populated in two different reactions, and feeding distributions. Various empirical attempts are examined and the range of applicability of these methods as well as the limitations associated with them are given
Automated Negotiation for Resource Assignment in Wireless Surveillance Sensor Networks
Directory of Open Access Journals (Sweden)
Enrique de la Hoz
2015-11-01
Full Text Available Due to the low cost of CMOS IP-based cameras, wireless surveillance sensor networks have emerged as a new application of sensor networks able to monitor public or private areas or even country borders. Since these networks are bandwidth intensive and the radioelectric spectrum is limited, especially in unlicensed bands, it is mandatory to assign frequency channels in a smart manner. In this work, we propose the application of automated negotiation techniques for frequency assignment. Results show that these techniques are well suited to the problem, obtaining the best solutions among all the techniques compared.
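Frequency assignment of this kind is essentially graph coloring: cameras whose transmissions interfere must use different channels. A minimal greedy-coloring sketch (not the paper's negotiation protocol, and with an invented interference topology) shows the constraint being enforced:

```python
# Interference graph: cameras that share an edge must not reuse a channel.
interference = {
    "cam1": {"cam2", "cam3"},
    "cam2": {"cam1", "cam3"},
    "cam3": {"cam1", "cam2", "cam4"},
    "cam4": {"cam3"},
}

channel = {}
# Color highest-degree (most constrained) cameras first.
for cam in sorted(interference, key=lambda c: -len(interference[c])):
    used = {channel[n] for n in interference[cam] if n in channel}
    channel[cam] = next(ch for ch in range(len(interference)) if ch not in used)

print(channel)
```

A negotiation-based approach lets the cameras reach such an assignment in a decentralized way, which matters when no central controller can see the whole interference graph.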
Probabilistic validation of protein NMR chemical shift assignments
International Nuclear Information System (INIS)
Dashti, Hesam; Tonelli, Marco; Lee, Woonghee; Westler, William M.; Cornilescu, Gabriel; Ulrich, Eldon L.; Markley, John L.
2016-01-01
Data validation plays an important role in ensuring the reliability and reproducibility of studies. NMR investigations of the functional properties, dynamics, chemical kinetics, and structures of proteins depend critically on the correctness of chemical shift assignments. We present a novel probabilistic method named ARECA for validating chemical shift assignments that relies on the nuclear Overhauser effect data. ARECA has been evaluated through its application to 26 case studies and has been shown to be complementary to, and usually more reliable than, approaches based on chemical shift databases. ARECA is available online at http://areca.nmrfam.wisc.edu/
Single machine scheduling with slack due dates assignment
Liu, Weiguo; Hu, Xiangpei; Wang, Xuyin
2017-04-01
This paper considers a single machine scheduling problem in which each job is assigned an individual due date based on a common flow allowance (i.e. all jobs have slack due dates). The goal is to find a sequence for the jobs, together with a due date assignment, that minimizes a non-regular criterion comprising the total weighted absolute lateness value and common flow allowance cost, where the weight is a position-dependent weight. In order to solve this problem, an ? time algorithm is proposed. Some extensions of the problem are also shown.
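The slack (SLK) due date rule the abstract refers to gives every job the due date d_j = p_j + q for a common flow allowance q. A small sketch (with invented processing times, weights, and q) evaluates the total weighted absolute lateness for one fixed sequence:

```python
p = [3, 1, 4]          # processing times, in sequence order (invented)
w = [1.0, 2.0, 1.5]    # position-dependent weights (invented)
q = 2                  # common flow allowance (invented)

# Completion times accumulate along the single machine.
completion, t = [], 0
for pj in p:
    t += pj
    completion.append(t)

due = [pj + q for pj in p]                       # slack due dates d_j = p_j + q
lateness = sum(wi * abs(c - d) for wi, c, d in zip(w, completion, due))
print(lateness)
```

The optimization problem is then to pick both the sequence and q so that this quantity plus the flow allowance cost is minimized, which the paper solves with a polynomial-time algorithm.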
Probabilistic validation of protein NMR chemical shift assignments
Energy Technology Data Exchange (ETDEWEB)
Dashti, Hesam [University of Wisconsin-Madison, Graduate Program in Biophysics, Biochemistry Department (United States); Tonelli, Marco; Lee, Woonghee; Westler, William M.; Cornilescu, Gabriel [University of Wisconsin-Madison, Biochemistry Department, National Magnetic Resonance Facility at Madison (United States); Ulrich, Eldon L. [University of Wisconsin-Madison, BioMagResBank, Biochemistry Department (United States); Markley, John L., E-mail: markley@nmrfam.wisc.edu, E-mail: jmarkley@wisc.edu [University of Wisconsin-Madison, Biochemistry Department, National Magnetic Resonance Facility at Madison (United States)
2016-01-15
Data validation plays an important role in ensuring the reliability and reproducibility of studies. NMR investigations of the functional properties, dynamics, chemical kinetics, and structures of proteins depend critically on the correctness of chemical shift assignments. We present a novel probabilistic method named ARECA for validating chemical shift assignments that relies on the nuclear Overhauser effect data. ARECA has been evaluated through its application to 26 case studies and has been shown to be complementary to, and usually more reliable than, approaches based on chemical shift databases. ARECA is available online at http://areca.nmrfam.wisc.edu/.
Evoked Feelings, Assigned Meanings and Constructed Knowledge Based on Mistakes
Directory of Open Access Journals (Sweden)
Luciane Guimarães Batistella Bianchini
2017-09-01
Full Text Available By means of Piaget's critical clinical method, the study investigated the meanings assigned to mistakes by four students in different activities and interactive situations. The research also analyzed the results of using self-regulatory situations in understanding mistakes initially committed by the students. Data collection instruments consisted of games, video recordings, diaries and interviews. Following intervention, the students were able to recognize their competence, establish positive interactions within the group, and avoid viewing mistakes as obstacles to learning. We concluded that the meanings assigned to mistakes depend on certain variables, among them feelings nurtured by the individuals about themselves, the other, and the object of knowledge.
Asymmetry in some common assignment algorithms: the dispersion factor solution
T de la Barra; B Pérez
1986-01-01
Many common assignment algorithms are based on Dial's original design to determine the paths that trip makers will follow from a given origin to destination centroids. The purpose of this paper is to show that the rules that have to be applied result in two unwanted properties. The first is that trips assigned from an origin centroid i to a destination j can be dramatically different from those resulting from centroid j to centroid i , even if the number of trips is the same and the network is ...
Vada-Kovács, M
1996-01-01
Porcine biceps femoris strips of 10 cm original length were stretched by 50% and fixed within 1 hr post mortem, then subjected to temperatures of 4, 15 or 36 °C until they attained their ultimate pH. Unrestrained control muscle strips, which were left to shorten freely, were similarly treated. Post-mortem metabolism (pH, R-value) and shortening were recorded; thereafter ultimate meat quality traits (pH, lightness, extraction and swelling of myofibrils) were determined. The rate of pH fall at 36 °C, as well as ATP breakdown at 36 and 4 °C, were significantly reduced by pre-rigor stretch. The relationship between R-value and pH indicated cold shortening at 4 °C. Myofibrils isolated from pre-rigor stretched muscle strips kept at 36 °C showed the most severe reduction of hydration capacity, while paleness remained below extreme values. However, pre-rigor stretched myofibrils stored at 4 °C proved to be superior to shortened ones in their extractability and swelling.