WorldWideScience

Sample records for rigorously analyzed verified

  1. A Rigorous Methodology for Analyzing and Designing Plug-Ins

    DEFF Research Database (Denmark)

    Fasie, Marieta V.; Haxthausen, Anne Elisabeth; Kiniry, Joseph

    2013-01-01

    … This paper addresses these problems by describing a rigorous methodology for analyzing and designing plug-ins. The methodology is grounded in the Extended Business Object Notation (EBON) and covers informal analysis and design of features, GUI, actions, and scenarios, formal architecture design, including … behavioral semantics, and validation. The methodology is illustrated via a case study whose focus is an Eclipse environment for the RAISE formal method's tool suite …

  2. Experimental evaluation of rigor mortis. VI. Effect of various causes of death on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T; Bergerioux, C; Brandt-Casadevall, C; Gujer, H R

    1983-07-01

    The evolution of rigor mortis was studied in cases of nitrogen asphyxia, drowning and strangulation, as well as in fatal intoxications due to strychnine, carbon monoxide and curariform drugs, using a modified method of measurement. Our experiments demonstrated that: (1) Strychnine intoxication hastens the onset and passing of rigor mortis. (2) CO intoxication delays the resolution of rigor mortis. (3) The intensity of rigor may vary depending upon the cause of death. (4) If the stage of rigidity is to be used to estimate the time of death, it is necessary: (a) to perform a succession of objective measurements of rigor mortis intensity; and (b) to verify the eventual presence of factors that could play a role in the modification of its development.

  3. Developing an Approach for Analyzing and Verifying System Communication

    Science.gov (United States)

    Stratton, William C.; Lindvall, Mikael; Ackermann, Chris; Sibol, Deane E.; Godfrey, Sally

    2009-01-01

    This slide presentation reviews a project for developing an approach for analyzing and verifying inter-system communication. The motivation for the study was that software systems in the aerospace domain are inherently complex and operate under tight resource constraints, so systems of systems must communicate with each other to fulfill their tasks, and that communication must be reliable. The technical approach was to develop a system, DynSAVE, that detects communication problems among the systems. The project enhanced the proven Software Architecture Visualization and Evaluation (SAVE) tool to create Dynamic SAVE (DynSAVE). The approach monitors and records low-level network traffic, converts the low-level traffic into meaningful messages, and displays the messages so that issues can be detected.
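
    A hypothetical sketch of this monitor-decode-display idea may help make the approach concrete. The frame layout below (sender id, sequence number, payload length) and the sequence-gap check are illustrative assumptions, not DynSAVE's actual protocol or interface.

```python
# Illustrative sketch (assumed message format, not DynSAVE's protocol):
# low-level frames are decoded into meaningful messages so that
# communication problems -- here, gaps in sequence numbers -- show up.
import struct

def decode_frame(raw: bytes) -> dict:
    """Convert one low-level frame into a meaningful message."""
    sender, seq, length = struct.unpack("!BIH", raw[:7])
    return {"sender": sender, "seq": seq, "payload": raw[7:7 + length]}

def detect_sequence_gaps(frames):
    """Yield human-readable issues found in the decoded message stream."""
    last_seq = {}
    for raw in frames:
        msg = decode_frame(raw)
        prev = last_seq.get(msg["sender"])
        if prev is not None and msg["seq"] != prev + 1:
            yield f"sender {msg['sender']}: gap between seq {prev} and {msg['seq']}"
        last_seq[msg["sender"]] = msg["seq"]

# Two recorded frames from sender 1 with one message missing in between.
frames = [
    struct.pack("!BIH", 1, 100, 2) + b"ok",
    struct.pack("!BIH", 1, 102, 2) + b"!!",
]
for issue in detect_sequence_gaps(frames):
    print(issue)  # -> sender 1: gap between seq 100 and 102
```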

  4. Rigorous high-precision enclosures of fixed points and their invariant manifolds

    Science.gov (United States)

    Wittig, Alexander N.

    The well established concept of Taylor Models is introduced, which offer highly accurate C0 enclosures of functional dependencies, combining high-order polynomial approximation of functions and rigorous estimates of the truncation error, performed using verified arithmetic. The focus of this work is on the application of Taylor Models in algorithms for strongly non-linear dynamical systems. A method is proposed to extend the existing implementation of Taylor Models in COSY INFINITY from double precision coefficients to arbitrary precision coefficients. Great care is taken to maintain the highest efficiency possible by adaptively adjusting the precision of higher order coefficients in the polynomial expansion. High precision operations are based on clever combinations of elementary floating point operations yielding exact values for round-off errors. An experimental high precision interval data type is developed and implemented. Algorithms for the verified computation of intrinsic functions based on the high precision interval data type are developed and described in detail. The application of these operations in the implementation of high precision Taylor Models is discussed. An application of Taylor Model methods to the verification of fixed points is presented by verifying the existence of a period-15 fixed point in a near-standard Hénon map. Verification is performed using different verified methods such as double precision Taylor Models, high precision intervals and high precision Taylor Models. Results and performance of each method are compared. An automated rigorous fixed point finder is implemented, allowing the fully automated search for all fixed points of a function within a given domain. It returns a list of verified enclosures of each fixed point, optionally verifying uniqueness within these enclosures. An application of the fixed point finder to the rigorous analysis of beam transfer maps in accelerator physics is presented. Previous work done by
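
    The essential trick of verified arithmetic, rounding every intermediate bound outward so that the computed interval provably encloses the true result, can be shown in a few lines. The sketch below (Python 3.9+ for math.nextafter) illustrates only this principle; it is not COSY INFINITY's Taylor Model or high precision interval implementation.

```python
# Minimal outward-rounded interval arithmetic: every operation widens its
# result by one floating-point step in each direction, so the enclosure
# is guaranteed even in the presence of round-off error.
import math

class Interval:
    def __init__(self, lo: float, hi: float):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other: "Interval") -> "Interval":
        # Round the lower bound down and the upper bound up.
        return Interval(math.nextafter(self.lo + other.lo, -math.inf),
                        math.nextafter(self.hi + other.hi, math.inf))

    def __contains__(self, x: float) -> bool:
        return self.lo <= x <= self.hi

    def __repr__(self) -> str:
        return f"[{self.lo!r}, {self.hi!r}]"

a = Interval(0.1, 0.1)  # 0.1 is not exactly representable in binary
b = Interval(0.2, 0.2)
print(a + b, 0.3 in (a + b))  # enclosure provably contains the real 0.3
```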

  5. Analyzing Interaction Patterns to Verify a Simulation/Game Model

    Science.gov (United States)

    Myers, Rodney Dean

    2012-01-01

    In order for simulations and games to be effective for learning, instructional designers must verify that the underlying computational models being used have an appropriate degree of fidelity to the conceptual models of their real-world counterparts. A simulation/game that provides incorrect feedback is likely to promote misunderstanding and…

  6. Monitoring muscle optical scattering properties during rigor mortis

    Science.gov (United States)

    Xia, J.; Ranasinghesagara, J.; Ku, C. W.; Yao, G.

    2007-09-01

    The sarcomere is the fundamental functional unit for force generation in skeletal muscle. In addition, sarcomere structure is an important factor affecting the eating quality of muscle food, the meat. The sarcomere structure is altered significantly during rigor mortis, which is the critical stage in transforming muscle to meat. In this paper, we investigated optical scattering changes during the rigor process in Sternomandibularis muscles. The measured optical scattering parameters were analyzed along with simultaneously measured passive tension and pH value, and with histological analysis. We found that the temporal changes of optical scattering, passive tension, pH value and fiber microstructure were closely correlated during the rigor process. These results suggest that sarcomere structure changes during rigor mortis can be monitored and characterized by optical scattering, which may find practical applications in predicting meat quality.

  7. Putrefactive rigor: apparent rigor mortis due to gas distension.

    Science.gov (United States)

    Gill, James R; Landi, Kristen

    2011-09-01

    Artifacts due to decomposition may cause confusion for the initial death investigator, leading to an incorrect suspicion of foul play. Putrefaction is a microorganism-driven process that results in foul odor, skin discoloration, purge, and bloating. Various decompositional gases including methane, hydrogen sulfide, carbon dioxide, and hydrogen will cause the body to bloat. We describe 3 instances of putrefactive gas distension (bloating) that produced the appearance of inappropriate rigor, so-called putrefactive rigor. These gases may distend the body to an extent that the extremities extend and lose contact with their underlying support surface. The medicolegal investigator must recognize that this is not true rigor mortis and the body was not necessarily moved after death for this gravity-defying position to occur.

  8. Unconditionally verifiable blind quantum computation

    Science.gov (United States)

    Fitzsimons, Joseph F.; Kashefi, Elham

    2017-07-01

    Blind quantum computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client's input, output, and computation remain private. A desirable property for any BQC protocol is verification, whereby the client can verify with high probability whether the server has followed the instructions of the protocol or if there has been some deviation resulting in a corrupted output state. A verifiable BQC protocol can be viewed as an interactive proof system leading to consequences for complexity theory. We previously proposed [A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science, Atlanta, 2009 (IEEE, Piscataway, 2009), p. 517] a universal and unconditionally secure BQC scheme where the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. In this paper we extend that protocol with additional functionality allowing blind computational basis measurements, which we use to construct another verifiable BQC protocol based on a different class of resource states. We rigorously prove that the probability of failing to detect an incorrect output is exponentially small in a security parameter, while resource overhead remains polynomial in this parameter. This resource state allows entangling gates to be performed between arbitrary pairs of logical qubits with only constant overhead. This is a significant improvement on the original scheme, which required that all computations to be performed first be put into a nearest-neighbor form, incurring linear overhead in the number of qubits. Such an improvement has important consequences for efficiency and fault-tolerance thresholds.

  9. Studies on the estimation of the postmortem interval. 3. Rigor mortis (author's transl).

    Science.gov (United States)

    Suzutani, T; Ishibashi, H; Takatori, T

    1978-11-01

    The authors have devised a method for classifying rigor mortis into 10 types based on its appearance and strength in various parts of a cadaver. By applying the method to the findings of 436 cadavers which were subjected to medico-legal autopsies in our laboratory during the last 10 years, it has been demonstrated that the classifying method is effective for analyzing the phenomenon of onset, persistence and disappearance of rigor mortis statistically. The investigation of the relationship between each type of rigor mortis and the postmortem interval has demonstrated that rigor mortis may be utilized as a basis for estimating the postmortem interval but the values have greater deviation than those described in current textbooks.

  10. Experimental evaluation of rigor mortis. V. Effect of various temperatures on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T

    1981-01-01

    Objective measurements were carried out to study the evolution of rigor mortis on rats at various temperatures. Our experiments showed that: (1) at 6 degrees C rigor mortis reaches full development between 48 and 60 hours post mortem, and is resolved at 168 hours post mortem; (2) at 24 degrees C rigor mortis reaches full development at 5 hours post mortem, and is resolved at 16 hours post mortem; (3) at 37 degrees C rigor mortis reaches full development at 3 hours post mortem, and is resolved at 6 hours post mortem; (4) the intensity of rigor mortis grows with increase in temperature (difference between values obtained at 24 degrees C and 37 degrees C); and (5) at 6 degrees C a "cold rigidity" was found, in addition to and independent of rigor mortis.

  11. The Rigor Mortis of Education: Rigor Is Required in a Dying Educational System

    Science.gov (United States)

    Mixon, Jason; Stuart, Jerry

    2009-01-01

    In an effort to answer the "Educational Call to Arms", our national public schools have turned to Advanced Placement (AP) courses as the predominate vehicle used to address the lack of academic rigor in our public high schools. Advanced Placement is believed by many to provide students with the rigor and work ethic necessary to…

  12. Realizing rigor in the mathematics classroom

    CERN Document Server

    Hull, Ted H (Henry); Balka, Don S

    2014-01-01

    Rigor put within reach! Rigor: The Common Core has made it policy, and this first-of-its-kind guide takes math teachers and leaders through the process of making it reality. Using the Proficiency Matrix as a framework, the authors offer proven strategies and practical tools for successful implementation of the CCSS mathematical practices, with rigor as a central objective. You'll learn how to: define rigor in the context of each mathematical practice; identify and overcome potential issues, including differentiating instruction and using data

  13. ATP, IMP, and glycogen in cod muscle at onset and during development of rigor mortis depend on the sampling location

    DEFF Research Database (Denmark)

    Cappeln, Gertrud; Jessen, Flemming

    2002-01-01

    Variation in glycogen, ATP, and IMP contents within individual cod muscles was studied in ice-stored fish during the progress of rigor mortis. Rigor index was determined before muscle samples for chemical analyses were taken at 16 different positions on the fish. During development of rigor …, the contents of glycogen and ATP decreased differently in relation to rigor index depending on sampling location. Although fish were considered to be in strong rigor according to the rigor index method, parts of the muscle were not in rigor, as high ATP concentrations were found in dorsal and tail muscle …

  14. Rigorous simulations of a helical core fiber by the use of transformation optics formalism.

    Science.gov (United States)

    Napiorkowski, Maciej; Urbanczyk, Waclaw

    2014-09-22

    We report for the first time on rigorous numerical simulations of a helical-core fiber using a full vectorial method based on the transformation optics formalism. We modeled the dependence of the circular birefringence of the fundamental mode on the helix pitch and analyzed the birefringence increase caused by the mode displacement induced by the core twist. Furthermore, we analyzed the complex field evolution versus helix pitch in the first order modes, including polarization and intensity distribution. Finally, we show that the rigorous vectorial method predicts the confinement loss of the guided modes better than approximate methods based on equivalent in-plane bending models.

  15. Effects of rigor status during high-pressure processing on the physical qualities of farm-raised abalone (Haliotis rufescens).

    Science.gov (United States)

    Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I

    2015-01-01

    High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated, and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone meat was significantly more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®

  16. Rigorous Science: a How-To Guide

    Directory of Open Access Journals (Sweden)

    Arturo Casadevall

    2016-11-01

    Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word “rigor” is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education.

  17. Feedback for relatedness and competence : Can feedback in blended learning contribute to optimal rigor, basic needs, and motivation?

    NARCIS (Netherlands)

    Bombaerts, G.; Nickel, P.J.

    2017-01-01

    We inquire how peer and tutor feedback influences students' optimal rigor, basic needs, and motivation. We analyze questionnaires from two courses in two consecutive years. We conclude that feedback in blended learning can contribute to rigor and basic needs, but it is not clear from our data what

  18. Experimental evaluation of rigor mortis. VII. Effect of ante- and post-mortem electrocution on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T; Bergerioux, C

    1988-01-01

    The influence of electrocution on the evolution of rigor mortis was studied in rats. Our experiments showed that: (1) Electrocution hastens the onset of rigor mortis. After an electrocution of 90 s, complete rigor develops as early as 1 h post-mortem (p.m.), compared to 5 h p.m. for the controls. (2) Electrocution hastens the passing of rigor mortis. After an electrocution of 90 s, the first significant decrease occurs at 3 h p.m. (8 h p.m. in the controls). (3) These modifications in rigor mortis evolution are less pronounced in the limbs not directly touched by the electric current. (4) In cases of post-mortem electrocution, the changes are slightly less pronounced, the resistance is higher and the absorbed energy is lower as compared with the ante-mortem electrocution cases. The results are complemented by two practical observations on human electrocution cases.

  19. Development of rigor mortis is not affected by muscle volume.

    Science.gov (United States)

    Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H

    2001-04-01

    There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.

  20. A Framework for Rigorously Identifying Research Gaps in Qualitative Literature Reviews

    DEFF Research Database (Denmark)

    Müller-Bloch, Christoph; Kranz, Johann

    2015-01-01

    Identifying research gaps is a fundamental goal of literature reviewing. While it is widely acknowledged that literature reviews should identify research gaps, there are no methodological guidelines for how to identify research gaps in qualitative literature reviews ensuring rigor and replicability. … Our study addresses this gap and proposes a framework that should help scholars in this endeavor without stifling creativity. To develop the framework we thoroughly analyze the state-of-the-art procedure of identifying research gaps in 40 recent literature reviews using a grounded theory approach. … Based on the data, we subsequently derive a framework for identifying research gaps in qualitative literature reviews and demonstrate its application with an example. Our results provide a modus operandi for identifying research gaps, thus enabling scholars to conduct literature reviews more rigorously …

  1. Long persistence of rigor mortis at constant low temperature.

    Science.gov (United States)

    Varetto, Lorenzo; Curto, Ombretta

    2005-01-06

    We studied the persistence of rigor mortis by using physical manipulation. We tested the mobility of the knee on 146 corpses kept under refrigeration at Torino's city mortuary at a constant temperature of +4 degrees C. We found a persistence of complete rigor lasting for 10 days in all the cadavers we kept under observation; in one case, rigor lasted for 16 days. Between the 11th and the 17th days, a progressively increasing number of corpses showed a change from complete into partial rigor (characterized by partial bending of the articulation). After the 17th day, all the remaining corpses showed partial rigor, and in the two cadavers that were kept under observation "à outrance" the absolute resolution of rigor mortis occurred on the 28th day. Our results prove that it is possible to find a persistence of rigor mortis much longer than expected when environmental conditions resemble average outdoor winter temperatures in temperate zones. Therefore, this finding must be considered when a corpse is found in such environmental conditions, so that the long persistence of rigor mortis does not mislead the estimation of the time of death.

  2. A case of instantaneous rigor?

    Science.gov (United States)

    Pirch, J; Schulz, Y; Klintschar, M

    2013-09-01

    The question of whether instantaneous rigor mortis (IR), the hypothetical sudden occurrence of stiffening of the muscles upon death, actually exists has been controversially debated over the last 150 years. While modern German forensic literature rejects this concept, the contemporary British literature is more willing to embrace it. We present the case of a young woman who suffered from diabetes and who was found dead in an upright standing position with back and shoulders leaning against a punchbag and a cupboard. Rigor mortis was fully established, and livor mortis was strong and consistent with the position in which the body was found. After autopsy and toxicological analysis, it was concluded that death most probably occurred due to a ketoacidotic coma, with markedly increased values of glucose and lactate in the cerebrospinal fluid as well as acetone in blood and urine. Whereas the position of the body is most unusual, a detailed analysis revealed that it is a stable position even without rigor mortis. Therefore, this case does not further support the controversial concept of IR.

  3. Mathematical Rigor in Introductory Physics

    Science.gov (United States)

    Vandyke, Michael; Bassichis, William

    2011-10-01

    Calculus-based introductory physics courses intended for future engineers and physicists are often designed and taught in the same fashion as those intended for students of other disciplines. A more mathematically rigorous curriculum should be more appropriate and, ultimately, more beneficial for the student in his or her future coursework. This work investigates the effects of mathematical rigor on student understanding of introductory mechanics. Using a series of diagnostic tools in conjunction with individual student course performance, a statistical analysis will be performed to examine student learning of introductory mechanics and its relation to student understanding of the underlying calculus.

  4. "Rigor mortis" in a live patient.

    Science.gov (United States)

    Chakravarthy, Murali

    2010-03-01

    Rigor mortis is conventionally a postmortem change. Its occurrence suggests that death occurred at least a few hours earlier. The authors report a case of "rigor mortis" in a live patient after cardiac surgery. The factors that likely predisposed the reported patient to such premortem muscle stiffening are an intensely low cardiac output state, the use of unusually high doses of inotropic and vasopressor agents, and probable sepsis. Such an event may be of importance when determining the time of death in individuals such as the one described in this report. It also suggests that patients with muscle stiffening require careful examination before death is declared. This report is being published to point out the controversies that might arise from muscle stiffening, which should not always be termed rigor mortis and/or postmortem.

  5. Classroom Talk for Rigorous Reading Comprehension Instruction

    Science.gov (United States)

    Wolf, Mikyung Kim; Crosson, Amy C.; Resnick, Lauren B.

    2004-01-01

    This study examined the quality of classroom talk and its relation to academic rigor in reading-comprehension lessons. Additionally, the study aimed to characterize effective questions to support rigorous reading comprehension lessons. The data for this study included 21 reading-comprehension lessons in several elementary and middle schools from…

  6. Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories

    Directory of Open Access Journals (Sweden)

    Augusto Beléndez

    2012-08-01

    There is no doubt that the concept of volume holography has led to an incredibly great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories, and in particular, the use of a photosensitive medium like a photopolymeric material to record information in all its volume. In this work we analyze the applicability of Kogelnik’s Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik’s theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik’s theory cannot be applied, it is necessary to use a more general Coupled Wave theory (CW) or the Rigorous Coupled Wave theory (RCW). The RCW does not incorporate any approximation and thus, since it is rigorous, permits judging the accuracy of the approximations included in Kogelnik’s and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We have demonstrated the agreement in the predictions of CW and RCW and the validity of Kogelnik’s theory only for gratings with spatial frequencies higher than 500 lines/mm for the usual values of the refractive index modulations obtained in photopolymers.

  7. Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories

    Science.gov (United States)

    Gallego, Sergi; Neipp, Cristian; Estepa, Luis A.; Ortuño, Manuel; Márquez, Andrés; Francés, Jorge; Pascual, Inmaculada; Beléndez, Augusto

    2012-01-01

    There is no doubt that the concept of volume holography has led to an incredibly great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories, and in particular, the use of a photosensitive medium like a photopolymeric material to record information in all its volume. In this work we analyze the applicability of Kogelnik’s Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik’s theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik’s theory cannot be applied, it is necessary to use a more general Coupled Wave theory (CW) or the Rigorous Coupled Wave theory (RCW). The RCW does not incorporate any approximation and thus, since it is rigorous, permits judging the accuracy of the approximations included in Kogelnik’s and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We have demonstrated the agreement in the predictions of CW and RCW and the validity of Kogelnik’s theory only for gratings with spatial frequencies higher than 500 lines/mm for the usual values of the refractive index modulations obtained in photopolymers.
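
    For context on what the analytical theory predicts, Kogelnik's Coupled Wave theory gives a closed-form diffraction efficiency for a lossless volume phase transmission grating replayed at the Bragg angle. The expression below is the standard textbook form, quoted from the general literature rather than from this article:

```latex
% Kogelnik's diffraction efficiency at Bragg incidence for a lossless
% volume phase transmission grating (standard form):
%   \Delta n  -- refractive index modulation
%   d         -- grating thickness
%   \lambda   -- free-space wavelength
%   \theta_B  -- Bragg angle inside the medium
\eta = \sin^{2}\!\left( \frac{\pi\, \Delta n\, d}{\lambda \cos\theta_B} \right)
```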

  8. Effects of Pre and Post-Rigor Marinade Injection on Some Quality Parameters of Longissimus Dorsi Muscles

    Science.gov (United States)

    Fadıloğlu, Eylem Ezgi; Serdaroğlu, Meltem

    2018-01-01

    This study was conducted to evaluate the effects of pre- and post-rigor marinade injections on some quality parameters of Longissimus dorsi (LD) muscles. Three marinade formulations were prepared with 2% NaCl, 2% NaCl+0.5 M lactic acid, and 2% NaCl+0.5 M sodium lactate, and marinade uptake, pH, free water, cooking loss, drip loss, and color properties were analyzed. Injection time had a significant effect on the marinade uptake of the samples. Regardless of marinade formulation, marinade uptake of pre-rigor injected samples was higher than that of post-rigor samples. Injection of sodium lactate increased the pH values of samples, whereas lactic acid injection decreased pH. Marinade treatment and storage period had a significant effect on cooking loss. At each evaluation period, the interaction between marinade treatment and injection time had a different effect on free water content. Storage period and marinade application had a significant effect on drip loss values, and drip loss in all samples increased during storage. During all storage days, the lowest CIE L* value was found in pre-rigor samples injected with sodium lactate. Lactic acid injection caused color fade in both pre-rigor and post-rigor samples. The interaction between marinade treatment and storage period was statistically significant (p<0.05). At days 0 and 3, the lowest CIE b* values were obtained in pre-rigor samples injected with sodium lactate, with no differences among the other samples; at day 6, no significant differences were found in the CIE b* values of any samples. PMID:29805282

  9. Experimental evaluation of rigor mortis. III. Comparative study of the evolution of rigor mortis in different sized muscle groups in rats.

    Science.gov (United States)

    Krompecher, T; Fryc, O

    1978-01-01

    The use of new methods and an appropriate apparatus has allowed us to make successive measurements of rigor mortis and a study of its evolution in the rat. By a comparative examination on the front and hind limbs, we have determined the following: (1) The muscular mass of the hind limbs is 2.89 times greater than that of the front limbs. (2) In the initial phase rigor mortis is more pronounced in the front limbs. (3) The front and hind limbs reach maximum rigor mortis at the same time and this state is maintained for 2 hours. (4) Resolution of rigor mortis is accelerated in the front limbs during the initial phase, but both front and hind limbs reach complete resolution at the same time.

  10. [Experimental study of restiffening of the rigor mortis].

    Science.gov (United States)

    Wang, X; Li, M; Liao, Z G; Yi, X F; Peng, X M

    2001-11-01

    To observe changes in sarcomere length in rats upon restiffening of rigor mortis, we measured the sarcomere length of the quadriceps in 40 rats under different conditions by scanning electron microscopy. The sarcomere length in unbroken rigor mortis is clearly shorter than that after restiffening, and sarcomere length is negatively correlated with the intensity of rigor mortis. Measuring sarcomere length can thus determine the intensity of rigor mortis and provide evidence for estimating the time since death.

  11. [Rigor mortis -- a definite sign of death?].

    Science.gov (United States)

    Heller, A R; Müller, M P; Frank, M D; Dressler, J

    2005-04-01

    In the past years an ongoing controversial debate has existed in Germany regarding the quality of the coroner's inquest and the declaration of death by physicians. We report the case of a 90-year-old female who was found an unknown time after a suicide attempt with benzodiazepine. Examination of the patient showed livores (mortis?) on the left forearm and left lower leg. Moreover, rigor (mortis?) of the left arm was apparent, preventing arm flexion and extension. The hypothermic patient with insufficient respiration was intubated and mechanically ventilated. Chest compressions were not performed, because central pulses were (barely) palpable and a sinus bradycardia of 45/min (second-degree AV block and isolated premature ventricular complexes) was present. After placement of an intravenous line (17 G, external jugular vein), the hemodynamic situation was stabilized with intermittent boli of epinephrine and with sodium bicarbonate. With improved circulation, the livores and rigor disappeared. In the present case a minimal central circulation was noted, which could be stabilized despite the presence of supposedly certain signs of death (livores and rigor mortis). Considering the finding of abrogated peripheral perfusion (livores), we postulate a centripetal collapse of the glycogen and ATP supply in the patient's left arm (rigor), which was restored after resuscitation and reperfusion. Thus, it appears that livores and rigor are not sensitive enough to exclude a vita minima, in particular in hypothermic patients with intoxications. Consequently, a careful ABC check should be performed even in the presence of apparently certain signs of death, to avoid underdiagnosing a vita minima. Additional ECG monitoring is required to reduce the rate of false positive declarations of death. To what extent paramedics should commence basic life support when rigor and livores are present, pending a physician's DNR order, deserves further discussion.

  12. Dynamic Symmetric Key Mobile Commerce Scheme Based on Self-Verified Mechanism

    Directory of Open Access Journals (Sweden)

    Jiachen Yang

    2014-01-01

    In terms of the security and efficiency of mobile e-commerce, the authors summarized the advantages and disadvantages of several related schemes, especially the self-verified mobile payment scheme based on the elliptic curve cryptosystem (ECC), and then proposed a new type of dynamic symmetric key mobile commerce scheme based on a self-verified mechanism. The authors analyzed the basic algorithm based on self-verified mechanisms and detailed the complete transaction process of the proposed scheme, and analyzed the payment scheme with respect to security and efficiency. The analysis shows that the proposed scheme not only meets the efficiency requirements of mobile electronic payment but also takes security into account. The user confirmation mechanism at the end of the proposed scheme further strengthens its security. In brief, the proposed scheme is more efficient and practical than most of the existing schemes.

  13. Evaluating Rigor in Qualitative Methodology and Research Dissemination

    Science.gov (United States)

    Trainor, Audrey A.; Graue, Elizabeth

    2014-01-01

    Despite previous and successful attempts to outline general criteria for rigor, researchers in special education have debated the application of rigor criteria, the significance or importance of small n research, the purpose of interpretivist approaches, and the generalizability of qualitative empirical results. Adding to these complications, the…

  14. An ultramicroscopic study on rigor mortis.

    Science.gov (United States)

    Suzuki, T

    1976-01-01

    Gastrocnemius muscles, taken from decapitated mice at various intervals after death and from mice killed by 2,4-dinitrophenol or mono-iodoacetic acid injection to induce rigor mortis soon after death, were observed by electron microscopy. The prominent appearance of many fine cross striations in the myofibrils (occurring about every 400 Å) was considered to be characteristic of rigor mortis. These striations were caused by minute granules studded along the surfaces of both thick and thin filaments; they appeared to be the bridges connecting the two kinds of filaments and accounted for the hardness and rigidity of the muscle.

  15. Enabling food security by verifying agricultural carbon

    DEFF Research Database (Denmark)

    Kahiluoto, H; Smith, P; Moran, D

    2014-01-01

    Rewarding smallholders for sequestering carbon in agricultural land can improve food security while mitigating climate change. Verification of carbon offsets in food-insecure regions is possible and achievable through rigorously controlled monitoring …

  16. Tenderness of pre- and post rigor lamb longissimus muscle.

    Science.gov (United States)

    Geesink, Geert; Sujang, Sadi; Koohmaraie, Mohammad

    2011-08-01

    Lamb longissimus muscle (n=6) sections were cooked at different times post mortem (pre-rigor, at rigor, 1 day p.m., and 7 days p.m.) using two cooking methods. Using a boiling waterbath, samples were either cooked to a core temperature of 70 °C or boiled for 3 h. The latter method was meant to reflect the traditional cooking method employed in countries where preparation of pre-rigor meat is practiced. The time postmortem at which the meat was prepared had a large effect on the tenderness (shear force) of the meat. Cooking pre-rigor and at-rigor meat to 70 °C resulted in higher shear force values than their post-rigor counterparts at 1 and 7 days p.m. (9.4 and 9.6 vs. 7.2 and 3.7 kg, respectively). The differences in tenderness between the treatment groups could be largely explained by a difference in contraction status of the meat after cooking and the effect of ageing on tenderness. Cooking pre- and at-rigor meat resulted in severe muscle contraction, as evidenced by the differences in sarcomere length of the cooked samples. Mean sarcomere lengths in the pre- and at-rigor samples ranged from 1.05 to 1.20 μm, while the mean sarcomere length in the post-rigor samples was 1.44 μm. Cooking for 3 h at 100 °C did improve the tenderness of pre- and at-rigor prepared meat as compared to cooking to 70 °C, but not to the extent that ageing did. It is concluded that additional intervention methods are needed to improve the tenderness of pre-rigor cooked meat. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. Externally Verifiable Oblivious RAM

    Directory of Open Access Journals (Sweden)

    Gancher Joshua

    2017-04-01

    We present the idea of externally verifiable oblivious RAM (ORAM). Our goal is to allow a client and server carrying out an ORAM protocol to have disputes adjudicated by a third party, allowing for the enforcement of penalties against an unreliable or malicious server. We give a security definition that guarantees protection not only against a malicious server but also against a client making false accusations. We then give modifications of the Path ORAM [15] and Ring ORAM [9] protocols that meet this security definition. These protocols both have the same asymptotic runtimes as the semi-honest original versions and require the external verifier to be involved only when the client or server deviates from the protocol. Finally, we implement externally verified ORAM, along with an automated cryptocurrency contract to use as the external verifier.
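
    As a generic illustration of external adjudication (not the paper's construction, which modifies Path and Ring ORAM), one simple pattern is to have both parties authenticate every message so a third party can later check a claimed transcript and attribute any deviation. The keys, message format, and MAC-based authentication below are illustrative assumptions; a real scheme would use digital signatures so the verifier itself cannot forge either party.

```python
# Toy external adjudication over an authenticated transcript (sketch only).
import hashlib
import hmac

def tag(key: bytes, message: bytes) -> bytes:
    return hmac.new(key, message, hashlib.sha256).digest()

def adjudicate(transcript, client_key: bytes, server_key: bytes):
    """Return the first party whose message fails authentication, or None."""
    for sender, message, mac in transcript:
        key = client_key if sender == "client" else server_key
        if not hmac.compare_digest(tag(key, message), mac):
            return sender
    return None

ck, sk = b"client-key", b"server-key"
transcript = [
    ("client", b"read block 7", tag(ck, b"read block 7")),
    ("server", b"block 7 contents", b"\x00" * 32),  # invalid/forged tag
]
print(adjudicate(transcript, ck, sk))  # -> server
```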

  18. Scientific rigor through videogames.

    Science.gov (United States)

    Treuille, Adrien; Das, Rhiju

    2014-11-01

    Hypothesis-driven experimentation - the scientific method - can be subverted by fraud, irreproducibility, and lack of rigorous predictive tests. A robust solution to these problems may be the 'massive open laboratory' model, recently embodied in the internet-scale videogame EteRNA. Deploying similar platforms throughout biology could enforce the scientific method more broadly. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Emergency cricothyrotomy for trismus caused by instantaneous rigor in cardiac arrest patients.

    Science.gov (United States)

    Lee, Jae Hee; Jung, Koo Young

    2012-07-01

    Instantaneous rigor, muscle stiffening occurring at the moment of death (or cardiac arrest), can be confused with rigor mortis. If trismus is caused by instantaneous rigor, orotracheal intubation is impossible and a surgical airway should be secured. Here, we report 2 patients who had emergency cricothyrotomy for trismus caused by instantaneous rigor. This case report aims to help physicians understand instantaneous rigor and to emphasize the importance of securing a surgical airway quickly on the occurrence of trismus. Copyright © 2012 Elsevier Inc. All rights reserved.

  20. Auto-identification fiberoptical seal verifier

    International Nuclear Information System (INIS)

    Yamamoto, Yoichi; Mukaiyama, Takehiro

    1998-08-01

    An automatic COBRA seal verifier was developed by the Japan Atomic Energy Research Institute (JAERI) to provide more efficient and simpler inspection measures for IAEA safeguards. The verifier is designed to provide a means of simple, quantitative and objective judgment on in-situ verification of the COBRA seal. The equipment is a portable, hand-held unit that can be operated on battery or AC power. The verifier reads a COBRA seal signature using a built-in CCD camera and carries out the signature comparison procedure automatically on a digital basis. The result of the signature comparison is given as a YES/NO answer. The production model of the verifier was completed in July 1996. The development was carried out in collaboration with Mitsubishi Heavy Industries, Ltd. This report describes the design and functions of the COBRA seal verifier and the results of environmental and functional tests. The development of the COBRA seal verifier was carried out in the framework of the Japan Support Programme for Agency Safeguards (JASPAS) as project JD-4, begun in 1981. (author)

  1. Rigor or mortis: best practices for preclinical research in neuroscience.

    Science.gov (United States)

    Steward, Oswald; Balice-Gordon, Rita

    2014-11-05

    Numerous recent reports document a lack of reproducibility of preclinical studies, raising concerns about potential lack of rigor. Examples of lack of rigor have been extensively documented and proposals for practices to improve rigor are appearing. Here, we discuss some of the details and implications of previously proposed best practices and consider some new ones, focusing on preclinical studies relevant to human neurological and psychiatric disorders. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.
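
    As a concrete illustration of the sampling idea, the sketch below propagates input variability through a stand-in model by Monte Carlo sampling. The model, distributions, and parameter values are assumptions made up for illustration; they are not material from the presentation.

```python
# Monte Carlo propagation of aleatoric (random) input variability through
# a model, yielding a statistical summary of the prediction.
import random
import statistics

def simulation(youngs_modulus: float, load: float) -> float:
    """Stand-in for an expensive numerical model: deflection ~ load / E."""
    return load / youngs_modulus

random.seed(0)
samples = []
for _ in range(10_000):
    e = random.gauss(200e9, 10e9)   # assumed spread in material stiffness (Pa)
    f = random.gauss(1.0e4, 5.0e2)  # assumed spread in applied load (N)
    samples.append(simulation(e, f))

print(f"prediction: {statistics.mean(samples):.3e}"
      f" +/- {statistics.stdev(samples):.3e}")
```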

  3. Statistical mechanics rigorous results

    CERN Document Server

    Ruelle, David

    1999-01-01

    This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.

  4. High and low rigor temperature effects on sheep meat tenderness and ageing.

    Science.gov (United States)

    Devine, Carrick E; Payne, Steven R; Peachey, Bridget M; Lowe, Timothy E; Ingram, John R; Cook, Christian J

    2002-02-01

    Immediately after electrical stimulation, the paired m. longissimus thoracis et lumborum (LT) of 40 sheep were boned out and wrapped tightly with a polyethylene cling film. One of the paired LTs was chilled in 15°C air to reach a rigor mortis (rigor) temperature of 18°C and the other side was placed in a water bath at 35°C and achieved rigor at this temperature. Wrapping reduced rigor shortening and mimicked meat left on the carcass. After rigor, the meat was aged at 15°C for 0, 8, 26 and 72 h and then frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values obtained from a 1×1 cm cross-section. The shear force values of meat for 18 and 35°C rigor were similar at zero ageing, but as ageing progressed, the 18°C rigor meat aged faster and became more tender than meat that went into rigor at 35°C; shear force values for the two rigor temperatures differed significantly at each ageing time, and after full ageing those for 35°C rigor were still significantly greater. Thus the toughness of 35°C meat was not a consequence of muscle shortening and appears to be due to the meat at the lower temperature both tenderising at a faster rate and tenderising to a greater extent. The cook loss at 35°C rigor (30.5%) was greater than that at 18°C rigor (28.4%) (P<0.01) and the colour Hunter L values were higher at 35°C (P<0.01) compared with 18°C, but there were no significant differences in a or b values.

  5. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    Science.gov (United States)

    Gray, Kurt

    2017-09-01

    Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org). Theory mapping provides both precision and synthesis, which helps to resolve arguments, prevent redundancies, assess the theoretical contribution of papers, and evaluate the likelihood of surprising effects.

  6. Physiological studies of muscle rigor mortis in the fowl

    International Nuclear Information System (INIS)

    Nakahira, S.; Kaneko, K.; Tanaka, K.

    1990-01-01

    A simple system was developed for continuous measurement of muscle contraction during rigor mortis. Longitudinal muscle strips dissected from the Peroneus Longus were suspended in a plastic tube containing liquid paraffin. Mechanical activity was transmitted to a strain-gauge transducer connected to a potentiometric pen-recorder. At the onset of measurement a 1.2 g load was applied to the muscle strip. This model was used to study the muscle response to various treatments during rigor mortis. All measurements were carried out under anaerobic conditions at 17°C, unless otherwise stated. 1. The present system was found to be quite useful for continuous measurement of the course of muscle rigor. 2. Muscle contraction under anaerobic conditions at 17°C reached a peak about 2 hours after the onset of measurement and thereafter relaxed at a slow rate. In contrast, aerobic conditions under high humidity resulted in a strong rigor, about three times stronger than that under anaerobic conditions. 3. Ultrasonic treatment (37,000-47,000 Hz) at 25°C for 10 minutes resulted in a moderate muscle rigor. 4. Treatment of the muscle strip with 2 mM EGTA at 30°C for 30 minutes led to relaxation of the muscle. 5. Muscle from birds killed under pentobarbital sodium anesthesia showed a slow rate of rigor, whereas birds killed one day after hypophysectomy showed a quick muscle rigor, as seen in intact controls. 6. A slight muscle rigor was observed when the muscle strip was kept in a refrigerator at 0°C for 18.5 hours and the temperature was thereafter kept at 17°C. (author)

  7. Estimation of the breaking of rigor mortis by myotonometry.

    Science.gov (United States)

    Vain, A; Kauppila, R; Vuori, E

    1996-05-31

    Myotonometry was used to detect the breaking of rigor mortis. The myotonometer is a new instrument which measures the decaying oscillations of a muscle after a brief mechanical impact. The method gives two numerical parameters for rigor mortis, namely the period and the decrement of the oscillations, both of which depend on the time elapsed after death. When rigor mortis was broken by muscle lengthening, both the oscillation period and the decrement decreased, whereas shortening the muscle caused the opposite changes. Fourteen hours after breaking, the stiffness characteristics of the right and left m. biceps brachii, i.e. the oscillation periods, had equalized. However, the values for the decrement of the muscle, reflecting the dissipation of mechanical energy, maintained their differences.
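
    The two reported parameters correspond to standard quantities of a damped oscillation. For reference (textbook definitions, not quoted from the paper), if the decaying displacement has successive peaks a_n and a_{n+1}:

```latex
% Damped-oscillation quantities matching the myotonometer's two outputs:
%   T      -- oscillation period
%   \delta -- logarithmic decrement (energy dissipated per cycle)
x(t) = a_0\, e^{-\beta t} \cos(\omega t), \qquad
T = \frac{2\pi}{\omega}, \qquad
\delta = \ln\frac{a_n}{a_{n+1}} = \beta T
```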

  8. Rigorous simulation: a tool to enhance decision making

    Energy Technology Data Exchange (ETDEWEB)

    Neiva, Raquel; Larson, Mel; Baks, Arjan [KBC Advanced Technologies plc, Surrey (United Kingdom)

    2012-07-01

    The world refining industries continue to be challenged by population growth (increased demand), regional market changes and the pressure of regulatory requirements to operate a 'green' refinery. Environmental regulations are reducing the value and use of heavy fuel oils, leading refiners to convert more of the heavier products, or even heavier crudes, into lighter products while meeting increasingly stringent transportation fuel specifications. As a result, actions are required to establish a sustainable advantage for future success. Rigorous simulation provides a key advantage, improving the timing and efficient use of capital investment and maximizing profitability. Sustainably maximizing profit through rigorous modeling is achieved through enhanced performance monitoring and improved Linear Programme (LP) model accuracy. This paper contains examples of these two items. The combination of both increases overall rates of return. As refiners consider optimizing existing assets and expanding projects, the process agreed upon to achieve these goals is key to successful profit improvement. Rigorous kinetic simulation with detailed fractionation allows optimization of existing asset utilization while focusing capital investment on the new unit(s), thereby optimizing the overall strategic plan and return on investment. Monitoring of individual process units works as a mechanism for validating and optimizing plant performance, and is important for rectifying poor performance and increasing profitability. The quality of an LP relies upon the accuracy of the data used to generate the LP sub-models. The value of rigorous unit monitoring is that the results are consistently heat- and mass-balanced and are unique to a refiner's unit/refinery. With the improved match of the refinery operation, the rigorous simulation models will capture more accurately the non-linearity of those process units and therefore provide correct

  9. Rigorous solution to Bargmann-Wigner equation for integer spin

    CERN Document Server

    Huang Shi Zhong; Wu Ning; Zheng Zhi Peng

    2002-01-01

    A rigorous method is developed to solve the Bargmann-Wigner equation for arbitrary integer spin in coordinate representation in a step-by-step way. The Bargmann-Wigner equation is first transformed to a form easier to solve; the new equations are then solved rigorously in coordinate representation, and closed-form wave functions are thus derived.
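
    For reference, the Bargmann-Wigner equations for integer spin s impose a free Dirac equation on each index of a totally symmetric multispinor with 2s spinor indices. The form below is the standard one from the general literature; the notation is assumed rather than taken from this paper:

```latex
% Bargmann-Wigner equations for a spin-s field: the Dirac operator acts
% on each of the 2s indices of the symmetric multispinor \Psi in turn.
\left( i\gamma^{\mu}\partial_{\mu} - m \right)_{\alpha_k}{}^{\beta}\,
\Psi_{\alpha_1 \cdots \alpha_{k-1}\,\beta\,\alpha_{k+1} \cdots \alpha_{2s}}(x) = 0,
\qquad k = 1, \dots, 2s
```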

  10. RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT.

    Science.gov (United States)

    Meltzer, S J; Auer, J

    1908-01-01

    Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions (nearly equimolecular to "physiological" solutions of sodium chloride) are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case, as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor; only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium also hastens the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle.

  11. USCIS E-Verify Program Reports

    Data.gov (United States)

    Department of Homeland Security — The report builds on the last comprehensive evaluation of the E-Verify Program and demonstrates that E-Verify produces accurate results and that accuracy rates have...

  12. Verifier Theory and Unverifiability

    OpenAIRE

    Yampolskiy, Roman V.

    2016-01-01

    Despite significant developments in Proof Theory, surprisingly little attention has been devoted to the concept of proof verifier. In particular, the mathematical community may be interested in studying different types of proof verifiers (people, programs, oracles, communities, superintelligences) as mathematical objects. Such an effort could reveal their properties, their powers and limitations (particularly in human mathematicians), minimum and maximum complexity, as well as self-verificati...

  13. Verified OS Interface Code Synthesis

    Science.gov (United States)

    2016-12-01

    ... results into the larger proof framework of the seL4 microkernel to be directly usable in practice. Beyond the stated project goals, the proof tools that were used for the verified ML compiler CakeML can now also be used in the Isabelle/HOL system that was used for the verified seL4 microkernel. This combination increases proof productivity...

  14. Study Design Rigor in Animal-Experimental Research Published in Anesthesia Journals.

    Science.gov (United States)

    Hoerauf, Janine M; Moss, Angela F; Fernandez-Bustamante, Ana; Bartels, Karsten

    2018-01-01

    Lack of reproducibility of preclinical studies has been identified as an impediment to the translation of basic mechanistic research into effective clinical therapies. Indeed, the National Institutes of Health has revised its grant application process to require more rigorous study design, including sample size calculations, blinding procedures, and randomization steps. We hypothesized that the reporting of such metrics of study design rigor has increased over time for animal-experimental research published in anesthesia journals. PubMed was searched for animal-experimental studies published in 2005, 2010, and 2015 in primarily English-language anesthesia journals. A total of 1466 publications were graded on the performance of sample size estimation, randomization, and blinding. The Cochran-Armitage test was used to assess linear trends over time for the primary outcome of whether or not a metric was reported. Interrater agreement for each of the 3 metrics (power, randomization, and blinding) was assessed using the weighted κ coefficient in a 10% random sample of articles rerated by a second investigator blinded to the ratings of the first investigator. A total of 1466 manuscripts were analyzed. Reporting for all 3 metrics of experimental design rigor increased over time (2005 to 2010 to 2015): for power analysis, from 5% (27/516), to 12% (59/485), to 17% (77/465); for randomization, from 41% (213/516), to 50% (243/485), to 54% (253/465); and for blinding, from 26% (135/516), to 38% (186/485), to 47% (217/465). The weighted κ coefficients and 98.3% confidence intervals indicate almost perfect agreement between the 2 raters beyond that which occurs by chance alone (power, 0.93 [0.85, 1.0]; randomization, 0.91 [0.85, 0.98]; and blinding, 0.90 [0.84, 0.96]). Our hypothesis that reported metrics of rigor in animal-experimental studies in anesthesia journals have increased during the past decade was confirmed. More consistent reporting, or explicit justification for absence
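
    For readers who want to reproduce the trend analysis, below is a minimal Python sketch of the Cochran-Armitage test for trend applied to the reported power-analysis counts (27/516, 59/485, 77/465); the implementation and the equally spaced year scores are our choices, not the authors' code.

        import numpy as np
        from scipy.stats import norm

        def cochran_armitage(successes, totals, scores):
            """Two-sided Cochran-Armitage test for a linear trend in proportions."""
            r = np.asarray(successes, float)
            n = np.asarray(totals, float)
            s = np.asarray(scores, float)
            N, R = n.sum(), r.sum()
            p = R / N
            t = np.sum(s * (r - n * p))  # score statistic
            var = p * (1 - p) * (np.sum(s**2 * n) - np.sum(s * n)**2 / N)
            z = t / np.sqrt(var)
            return z, 2 * norm.sf(abs(z))

        # Reported power-analysis counts for 2005, 2010, 2015
        z, pval = cochran_armitage([27, 59, 77], [516, 485, 465], scores=[0, 1, 2])
        print(f"z = {z:.2f}, p = {pval:.2g}")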

  15. Rigorous bounds on the free energy of electron-phonon models

    NARCIS (Netherlands)

    Raedt, Hans De; Michielsen, Kristel

    1997-01-01

    We present a collection of rigorous upper and lower bounds to the free energy of electron-phonon models with linear electron-phonon interaction. These bounds are used to compare different variational approaches. It is shown rigorously that the ground states corresponding to the sharpest bounds do

  16. Accelerating Biomedical Discoveries through Rigor and Transparency.

    Science.gov (United States)

    Hewitt, Judith A; Brown, Liliana L; Murphy, Stephanie J; Grieder, Franziska; Silberberg, Shai D

    2017-07-01

    Difficulties in reproducing published research findings have garnered a lot of press in recent years. As a funder of biomedical research, the National Institutes of Health (NIH) has taken measures to address underlying causes of low reproducibility. Extensive deliberations resulted in a policy, released in 2015, to enhance reproducibility through rigor and transparency. We briefly explain what led to the policy, describe its elements, provide examples and resources for the biomedical research community, and discuss the potential impact of the policy on translatability with a focus on research using animal models. Importantly, while increased attention to rigor and transparency may lead to an increase in the number of laboratory animals used in the near term, it will lead to more efficient and productive use of such resources in the long run. The translational value of animal studies will be improved through more rigorous assessment of experimental variables and data, leading to better assessments of the translational potential of animal models, for the benefit of the research community and society. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  17. Literally better : Analyzing and improving the quality of literals

    NARCIS (Netherlands)

    Beek, Wouter; Ilievski, Filip; Debattista, Jeremy; Schlobach, Stefan; Wielemaker, Jan

    2018-01-01

    Quality is a complicated and multifarious topic in contemporary Linked Data research. The aspect of literal quality in particular has not yet been rigorously studied. Nevertheless, analyzing and improving the quality of literals is important since literals form a substantial (one in seven

  18. Recent Development in Rigorous Computational Methods in Dynamical Systems

    OpenAIRE

    Arai, Zin; Kokubu, Hiroshi; Pilarczyk, Paweł

    2009-01-01

    We highlight selected results of recent development in the area of rigorous computations which use interval arithmetic to analyse dynamical systems. We describe general ideas and selected details of different ways of approach and we provide specific sample applications to illustrate the effectiveness of these methods. The emphasis is put on a topological approach, which combined with rigorous calculations provides a broad range of new methods that yield mathematically rel...

  19. Trends: Rigor Mortis in the Arts.

    Science.gov (United States)

    Blodget, Alden S.

    1991-01-01

    Outlines how past art education provided a refuge for students from the rigors of other academic subjects. Observes that in recent years art education has become "discipline based." Argues that art educators need to reaffirm their commitment to a humanistic way of knowing. (KM)

  20. Rigor mortis development at elevated temperatures induces pale exudative turkey meat characteristics.

    Science.gov (United States)

    McKee, S R; Sams, A R

    1998-01-01

    Development of rigor mortis at elevated post-mortem temperatures may contribute to turkey meat characteristics that are similar to those found in pale, soft, exudative pork. To evaluate this effect, 36 Nicholas tom turkeys were processed at 19 wk of age and placed in water at 40, 20, or 0 C immediately after evisceration. Pectoralis muscle samples were taken at 15 min, 30 min, 1 h, 2 h, and 4 h post-mortem and analyzed for R-value (an indirect measure of adenosine triphosphate), glycogen, pH, color, and sarcomere length. At 4 h, the remaining intact Pectoralis muscle was harvested, aged on ice for 23 h, and analyzed for drip loss, cook loss, shear values, and sarcomere length. By 15 min post-mortem, the 40 C treatment had higher R-values, which persisted through 4 h. By 1 h, pH and glycogen levels in the 40 C treatment were lower than in the 0 C treatment; however, they did not differ from those of the 20 C treatment. Increased L* values indicated that color became more pale by 2 h post-mortem in the 40 C treatment when compared to the 20 and 0 C treatments. Drip loss, cook loss, and shear values were increased, whereas sarcomere lengths were decreased, as a result of the 40 C treatment. These findings suggested that elevated post-mortem temperatures during processing resulted in acceleration of rigor mortis and biochemical changes in the muscle that produced pale, exudative meat characteristics in turkey.

  1. Using grounded theory as a method for rigorously reviewing literature

    NARCIS (Netherlands)

    Wolfswinkel, J.; Furtmueller-Ettinger, Elfriede; Wilderom, Celeste P.M.

    2013-01-01

    This paper offers guidance to conducting a rigorous literature review. We present this in the form of a five-stage process in which we use Grounded Theory as a method. We first probe the guidelines explicated by Webster and Watson, and then we show the added value of Grounded Theory for rigorously

  2. Rigor, vigor, and the study of health disparities.

    Science.gov (United States)

    Adler, Nancy; Bush, Nicole R; Pantell, Matthew S

    2012-10-16

    Health disparities research spans multiple fields and methods and documents strong links between social disadvantage and poor health. Associations between socioeconomic status (SES) and health are often taken as evidence for the causal impact of SES on health, but alternative explanations, including the impact of health on SES, are plausible. Studies showing the influence of parents' SES on their children's health provide evidence for a causal pathway from SES to health, but have limitations. Health disparities researchers face tradeoffs between "rigor" and "vigor" in designing studies that demonstrate how social disadvantage becomes biologically embedded and results in poorer health. Rigorous designs aim to maximize precision in the measurement of SES and health outcomes through methods that provide the greatest control over temporal ordering and causal direction. To achieve precision, many studies use a single SES predictor and single disease. However, doing so oversimplifies the multifaceted, entwined nature of social disadvantage and may overestimate the impact of that one variable and underestimate the true impact of social disadvantage on health. In addition, SES effects on overall health and functioning are likely to be greater than effects on any one disease. Vigorous designs aim to capture this complexity and maximize ecological validity through more complete assessment of social disadvantage and health status, but may provide less-compelling evidence of causality. Newer approaches to both measurement and analysis may enable enhanced vigor as well as rigor. Incorporating both rigor and vigor into studies will provide a fuller understanding of the causes of health disparities.

  3. Regressive transgressive cycle of Devonian sea in Uruguay verified by Palynology

    International Nuclear Information System (INIS)

    Da Silva, J.

    1990-01-01

    This work presents the results and conclusions of a study of palynomorph populations carried out in Devonian formations in central Uruguay. The existence of a regressive-transgressive cycle is verified by analyzing the vertical distribution of palynomorphs; the presence of chitinozoans (hoesphaeridium, Cyathochitina kinds) is also mentioned for the section studied

  4. Status of personnel identity verifiers

    International Nuclear Information System (INIS)

    Maxwell, R.L.

    1985-01-01

    Identity verification devices based on the interrogation of six different human biometric features or actions now exist and in general have been in development for about ten years. The capability of these devices to meet the cost and operational requirements of speed, accuracy, ease of use and reliability has generally increased although the verifier industry is still immature. Sandia Laboratories makes a continuing effort to stay abreast of identity verifier developments and to assess the capabilities and improvements of each device. Operating environment and procedures more typical of field use can often reveal performance results substantially different from laboratory tests. An evaluation of several recently available verifiers is herein reported

  5. How Individual Scholars Can Reduce the Rigor-Relevance Gap in Management Research

    OpenAIRE

    Wolf, Joachim; Rosenberg, Timo

    2012-01-01

    This paper discusses a number of avenues management scholars could follow to reduce the existing gap between scientific rigor and practical relevance without relativizing the importance of the first goal dimension. Such changes are necessary because many management studies do not fully exploit the possibilities to increase their practical relevance while maintaining scientific rigor. We argue that this rigor-relevance gap is not only the consequence of the currently prevailing institutional c...

  6. Onset of rigor mortis is earlier in red muscle than in white muscle.

    Science.gov (United States)

    Kobayashi, M; Takatori, T; Nakajima, M; Sakurada, K; Hatanaka, K; Ikegaya, H; Matsuda, Y; Iwase, H

    2000-01-01

    Rigor mortis is thought to be related to falling ATP levels in muscles postmortem. We measured rigor mortis as tension determined isometrically in three rat leg muscles in liquid paraffin kept at 37 degrees C or 25 degrees C--two red muscles, red gastrocnemius (RG) and soleus (SO) and one white muscle, white gastrocnemius (WG). Onset, half and full rigor mortis occurred earlier in RG and SO than in WG both at 37 degrees C and at 25 degrees C even though RG and WG were portions of the same muscle. This suggests that rigor mortis directly reflects the postmortem intramuscular ATP level, which decreases more rapidly in red muscle than in white muscle after death. Rigor mortis was more retarded at 25 degrees C than at 37 degrees C in each type of muscle.

  7. Photoconductivity of amorphous silicon-rigorous modelling

    International Nuclear Information System (INIS)

    Brada, P.; Schauer, F.

    1991-01-01

    It is our great pleasure to express our gratitude to Prof. Grigorovici, the pioneer of the exciting field of the amorphous state, with our modest contribution to this area. In this paper we present an outline of a rigorous modelling program for the steady-state photoconductivity in amorphous silicon and related materials. (Author)

  8. Verifiably Truthful Mechanisms

    DEFF Research Database (Denmark)

    Branzei, Simina; Procaccia, Ariel D.

    2015-01-01

    the computational sense). Our approach involves three steps: (i) specifying the structure of mechanisms, (ii) constructing a verification algorithm, and (iii) measuring the quality of verifiably truthful mechanisms. We demonstrate this approach using a case study: approximate mechanism design without money...

  9. Reframing Rigor: A Modern Look at Challenge and Support in Higher Education

    Science.gov (United States)

    Campbell, Corbin M.; Dortch, Deniece; Burt, Brian A.

    2018-01-01

    This chapter describes the limitations of the traditional notions of academic rigor in higher education, and brings forth a new form of rigor that has the potential to support student success and equity.

  10. Upgrading geometry conceptual understanding and strategic competence through implementing rigorous mathematical thinking (RMT)

    Science.gov (United States)

    Nugraheni, Z.; Budiyono, B.; Slamet, I.

    2018-03-01

    To reach higher-order thinking skills, conceptual understanding and strategic competence need to be mastered, as they are two basic parts of higher-order thinking skills (HOTS). RMT is a unique realization of the cognitive conceptual construction approach based on Feuerstein's theory of Mediated Learning Experience (MLE) and Vygotsky's sociocultural theory. This was a quasi-experimental study comparing an experimental class taught with Rigorous Mathematical Thinking (RMT) as the learning method and a control class taught with Direct Learning (DL) as the conventional learning activity. The study examined whether the two learning models had different effects on the conceptual understanding and strategic competence of junior high school students. The data were analyzed using Multivariate Analysis of Variance (MANOVA), which showed a significant difference between the experimental and control classes when mathematics conceptual understanding and strategic competence were considered jointly (Wilks' Λ = 0.84). Further, independent t-tests showed a significant difference between the two classes on both mathematical conceptual understanding and strategic competence. These results indicate that Rigorous Mathematical Thinking (RMT) had a positive impact on mathematics conceptual understanding and strategic competence.
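
    As a sketch of the kind of analysis reported (two outcome variables, two groups, Wilks' lambda via MANOVA), the following Python example uses statsmodels on simulated scores; the data, group sizes, and variable names are our assumptions, not the study's.

        import numpy as np
        import pandas as pd
        from statsmodels.multivariate.manova import MANOVA

        rng = np.random.default_rng(0)
        n = 40  # students per class (simulated)

        # Simulated scores: experimental (RMT) class shifted upward on both outcomes
        df = pd.DataFrame({
            "group": ["RMT"] * n + ["DL"] * n,
            "conceptual": np.r_[rng.normal(75, 8, n), rng.normal(68, 8, n)],
            "strategic": np.r_[rng.normal(72, 9, n), rng.normal(66, 9, n)],
        })

        # MANOVA tests both outcomes jointly; Wilks' lambda appears in the output
        fit = MANOVA.from_formula("conceptual + strategic ~ group", data=df)
        print(fit.mv_test())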

  11. Efficient Verifiable Range and Closest Point Queries in Zero-Knowledge

    Directory of Open Access Journals (Sweden)

    Ghosh Esha

    2016-10-01

    We present an efficient method for answering one-dimensional range and closest-point queries in a verifiable and privacy-preserving manner. We consider a model where a data owner outsources a dataset of key-value pairs to a server, who answers range and closest-point queries issued by a client and provides proofs of the answers. The client verifies the correctness of the answers while learning nothing about the dataset besides the answers to the current and previous queries. Our work yields for the first time a zero-knowledge privacy assurance for authenticated range and closest-point queries. Previous work leaked the size of the dataset and used an inefficient proof protocol. Our construction is based on hierarchical identity-based encryption. We prove its security and analyze its efficiency both theoretically and with experiments on synthetic and real data (Enron email and Boston taxi datasets).

  12. 28 CFR 802.13 - Verifying your identity.

    Science.gov (United States)

    2010-07-01

    28 CFR 802.13 (Judicial Administration; District of Columbia; Disclosure of Records; Privacy Act), § 802.13 Verifying your identity. (a) Requests for your own records. When you make a request for access to records about yourself, you must verify your identity. You...

  13. 20 CFR 401.45 - Verifying your identity.

    Science.gov (United States)

    2010-04-01

    20 CFR 401.45 (Employees' Benefits; Privacy Act), § 401.45 Verifying your identity. (a) When required. Unless you are making a... representative, you must verify your identity in accordance with paragraph (b) of this section if: (1) You make a...

  14. Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.

    Science.gov (United States)

    Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P

    2018-03-03

    Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Rigor mortis in an unusual position: Forensic considerations.

    Science.gov (United States)

    D'Souza, Deepak H; Harish, S; Rajesh, M; Kiran, J

    2011-07-01

    We report a case in which the dead body was found with rigor mortis in an unusual position. The body was lying on its back with limbs raised, defying gravity. The direction of the salivary stains on the face also defied gravity. We opined that the scene of occurrence of the crime was unlikely to be the final place where the dead body was found. The clues pointed to a homicidal offence and an attempt to destroy the evidence. The forensic use of 'rigor mortis in an unusual position' lies in furthering the investigation and in the scientific confirmation of two facts: that the scene of death (occurrence) is different from the scene of disposal of the dead body, and that there was a time gap between the two.

  16. A control system verifier using automated reasoning software

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1985-08-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logical axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions of the combined system

  17. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles

    Science.gov (United States)

    Choi, Yun-Sang

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on the physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of a high ultimate pH in chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). Increasing the pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, the difference in NaCl concentration between 3% and 4% made no great difference in the physicochemical and textural properties due to pre-rigor salting effects (p>0.05). Therefore, our study confirmed the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that the 2% NaCl concentration is minimally required to ensure the definite pre-rigor salting effect on chicken breast muscle. PMID:26761884

  18. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles.

    Science.gov (United States)

    Kim, Hyun-Wook; Hwang, Ko-Eun; Song, Dong-Heon; Kim, Yong-Jae; Ham, Youn-Kyung; Yeo, Eui-Joo; Jeong, Tae-Jun; Choi, Yun-Sang; Kim, Cheon-Jei

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on the physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of a high ultimate pH in chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). Increasing the pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, the difference in NaCl concentration between 3% and 4% made no great difference in the physicochemical and textural properties due to pre-rigor salting effects (p>0.05). Therefore, our study confirmed the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that the 2% NaCl concentration is minimally required to ensure the definite pre-rigor salting effect on chicken breast muscle.

  19. Verifying FreeRTOS; a feasibility study

    NARCIS (Netherlands)

    Pronk, C.

    2010-01-01

    This paper presents a study on modeling and verifying the kernel of Real-Time Operating Systems (RTOS). The study will show advances in formally verifying such an RTOS both by refinement and by model checking approaches. This work fits in the context of Hoare’s verification challenge. Several

  20. The status of personnel identity verifiers

    International Nuclear Information System (INIS)

    Maxwell, R.L.

    1985-01-01

    Identity verification devices based on the interrogation of six different human biometric features or actions now exist and in general have been in development for about ten years. The capability of these devices to meet the cost and operational requirements of speed, accuracy, ease of use and reliability has generally increased although the verifier industry is still immature. Sandia Laboratories makes a continuing effort to stay abreast of identity verifier developments and to assess the capabilities and improvements of each device. Operating environment and procedures more typical of field use can often reveal performance results substantially different from laboratory tests. An evaluation of several recently available verifiers is herein reported

  1. Moving beyond Data Transcription: Rigor as Issue in Representation of Digital Literacies

    Science.gov (United States)

    Hagood, Margaret Carmody; Skinner, Emily Neil

    2015-01-01

    Rigor in qualitative research has been based upon criteria of credibility, dependability, confirmability, and transferability. Drawing upon articles published during our editorship of the "Journal of Adolescent & Adult Literacy," we illustrate how the use of digital data in research study reporting may enhance these areas of rigor,…

  2. Analyzing the Qualitative Data Analyst: A Naturalistic Investigation of Data Interpretation

    Directory of Open Access Journals (Sweden)

    Wolff-Michael Roth

    2015-07-01

    Much qualitative research involves the analysis of verbal data. Although the possibility of conducting qualitative research in a rigorous manner is sometimes contested in debates over qualitative/quantitative methods, there are scholarly communities within which qualitative research is indeed data driven and enacted in rigorous ways. How might one teach rigorous approaches to the analysis of verbal data? In this study, 20 sessions were recorded in introductory graduate classes on qualitative research methods. The social scientist thought aloud while analyzing transcriptions that were handed to her immediately prior to the sessions and for which she had no background information. The students then assessed, sometimes showing the original video, the degree to which the analyst had recovered (the structures of) the original events. This study provides answers to the broad question: "How does an analyst recover an original event with a high degree of accuracy?" Implications are discussed for teaching qualitative data analysis. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1503119

  3. pd Scattering Using a Rigorous Coulomb Treatment: Reliability of the Renormalization Method for Screened-Coulomb Potentials

    International Nuclear Information System (INIS)

    Hiratsuka, Y.; Oryu, S.; Gojuki, S.

    2011-01-01

    Reliability of the screened-Coulomb renormalization method, which was proposed in an elegant way by Alt-Sandhas-Zankel-Ziegelmann (ASZZ), is discussed on the basis of 'two-potential theory' for the three-body AGS equations with the Coulomb potential. In order to obtain ASZZ's formula, we define the on-shell Moller function and calculate it by using the Haeringen criterion, i.e., 'the half-shell Coulomb amplitude is zero'. By these two steps, we can finally obtain the ASZZ formula for a small Coulomb phase shift. Furthermore, the reliability of the Haeringen criterion is thoroughly checked by a numerically rigorous calculation for the Coulomb LS-type equation. We find that the Haeringen criterion is satisfied only in the higher energy region. We conclude that the ASZZ method is valid when the on-shell approximation to the Moller function is reasonable and the Haeringen criterion is reliable. (author)

  4. Increased scientific rigor will improve reliability of research and effectiveness of management

    Science.gov (United States)

    Sells, Sarah N.; Bassing, Sarah B.; Barker, Kristin J.; Forshee, Shannon C.; Keever, Allison; Goerz, James W.; Mitchell, Michael S.

    2018-01-01

    Rigorous science that produces reliable knowledge is critical to wildlife management because it increases accurate understanding of the natural world and informs management decisions effectively. Application of a rigorous scientific method based on hypothesis testing minimizes unreliable knowledge produced by research. To evaluate the prevalence of scientific rigor in wildlife research, we examined 24 issues of the Journal of Wildlife Management from August 2013 through July 2016. We found 43.9% of studies did not state or imply a priori hypotheses, which are necessary to produce reliable knowledge. We posit that this is due, at least in part, to a lack of common understanding of what rigorous science entails, how it produces more reliable knowledge than other forms of interpreting observations, and how research should be designed to maximize inferential strength and usefulness of application. Current primary literature does not provide succinct explanations of the logic behind a rigorous scientific method or readily applicable guidance for employing it, particularly in wildlife biology; we therefore synthesized an overview of the history, philosophy, and logic that define scientific rigor for biological studies. A rigorous scientific method includes 1) generating a research question from theory and prior observations, 2) developing hypotheses (i.e., plausible biological answers to the question), 3) formulating predictions (i.e., facts that must be true if the hypothesis is true), 4) designing and implementing research to collect data potentially consistent with predictions, 5) evaluating whether predictions are consistent with collected data, and 6) drawing inferences based on the evaluation. Explicitly testing a priori hypotheses reduces overall uncertainty by reducing the number of plausible biological explanations to only those that are logically well supported. Such research also draws inferences that are robust to idiosyncratic observations and

  5. Trends in Methodological Rigor in Intervention Research Published in School Psychology Journals

    Science.gov (United States)

    Burns, Matthew K.; Klingbeil, David A.; Ysseldyke, James E.; Petersen-Brown, Shawna

    2012-01-01

    Methodological rigor in intervention research is important for documenting evidence-based practices and has been a recent focus in legislation, including the No Child Left Behind Act. The current study examined the methodological rigor of intervention research in four school psychology journals since the 1960s. Intervention research has increased…

  6. Verified Interval Orbit Propagation in Satellite Collision Avoidance

    NARCIS (Netherlands)

    Römgens, B.A.; Mooij, E.; Naeije, M.C.

    2011-01-01

    Verified interval integration methods enclose a solution set corresponding to interval initial values and parameters, and bound integration and rounding errors. Verified methods suffer from overestimation of the solution, i.e., non-solutions are also included in the solution enclosure. Two verified

  7. An improved system to verify CANDU spent fuel elements in dry storage silos

    International Nuclear Information System (INIS)

    Almeida, Gevaldo L. de; Soares, Milton G.; Filho, Anizio M.; Martorelli, Daniel S.; Fonseca, Manoel

    2000-01-01

    An improved system to verify CANDU spent fuel elements stored in dry storage silos was developed. It consists of a mechanical device that moves a semiconductor detector along a vertical verification pipe incorporated into the silo, and a modified portable multi-channel analyzer. The mechanical device contains a winding drum accommodating a cable from which the detector hangs, in such a way that the drum rotates as the detector descends under its own weight. The detector is coupled to the multi-channel analyzer operating in multi-scaler mode, generating a spectrum of total counts against time. To assure a linear transformation of time into detector position, the mechanical device dictating the detector speed is controlled by the multi-channel analyzer. This control is performed via a clock-type escapement device activated by a solenoid. Whenever the multi-channel analyzer shifts to the next channel, the associated pulse is amplified, powering the solenoid and causing the drum to rotate through a fixed angle. Spectra taken in the laboratory using radioactive sources have shown good reproducibility. This qualifies the system for use as equipment to obtain a fingerprint of the overall distribution of the fuel elements along the silo axis and, hence, to verify possible diversion of the nuclear material by comparing spectra taken at consecutive safeguards inspections. The whole system is battery operated and is thus capable of operating in the field where no power supply is available. (author)
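
    To illustrate the fingerprint comparison described above, here is a hypothetical Python sketch (step length, tolerance, and all names are our assumptions): it maps multi-scaler channels to detector depth, normalizes two count profiles, and flags depths where consecutive inspections disagree.

        import numpy as np

        STEP_PER_CHANNEL = 0.02  # metres of detector travel per channel (assumed)

        def depth_axis(n_channels):
            """Channel index -> detector depth; valid because the escapement
            advances the drum a fixed angle per channel."""
            return np.arange(n_channels) * STEP_PER_CHANNEL

        def compare_fingerprints(counts_ref, counts_new, tol=0.25):
            """Flag depths where the normalized count profiles differ by > tol."""
            ref = counts_ref / counts_ref.sum()
            new = counts_new / counts_new.sum()
            rel_diff = np.abs(new - ref) / np.maximum(ref, 1e-12)
            return depth_axis(len(ref))[rel_diff > tol]

        rng = np.random.default_rng(1)
        baseline = rng.poisson(1000, size=200).astype(float)
        followup = baseline.copy()
        followup[120:130] *= 0.5  # simulate a missing fuel element
        print("suspect depths (m):", compare_fingerprints(baseline, followup))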

  8. An improved system to verify CANDU spent fuel elements in dry storage silos

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Gevaldo L. de; Soares, Milton G.; Filho, Anizio M.; Martorelli, Daniel S.; Fonseca, Manoel [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil)

    2000-07-01

    An improved system to verify CANDU spent fuel elements stored in dry storage silos was developed. It consists of a mechanical device that moves a semiconductor detector along a vertical verification pipe incorporated into the silo, and a modified portable multi-channel analyzer. The mechanical device contains a winding drum accommodating a cable from which the detector hangs, in such a way that the drum rotates as the detector descends under its own weight. The detector is coupled to the multi-channel analyzer operating in multi-scaler mode, generating a spectrum of total counts against time. To assure a linear transformation of time into detector position, the mechanical device dictating the detector speed is controlled by the multi-channel analyzer. This control is performed via a clock-type escapement device activated by a solenoid. Whenever the multi-channel analyzer shifts to the next channel, the associated pulse is amplified, powering the solenoid and causing the drum to rotate through a fixed angle. Spectra taken in the laboratory using radioactive sources have shown good reproducibility. This qualifies the system for use as equipment to obtain a fingerprint of the overall distribution of the fuel elements along the silo axis and, hence, to verify possible diversion of the nuclear material by comparing spectra taken at consecutive safeguards inspections. The whole system is battery operated and is thus capable of operating in the field where no power supply is available. (author)

  9. Verifying design patterns in Hoare Type Theory

    DEFF Research Database (Denmark)

    Svendsen, Kasper; Buisse, Alexandre; Birkedal, Lars

    In this technical report we document our experiments formally verifying three design patterns in Hoare Type Theory.......In this technical report we document our experiments formally verifying three design patterns in Hoare Type Theory....

  10. Using Concept Space to Verify Hyponymy in Building a Hyponymy Lexicon

    Science.gov (United States)

    Liu, Lei; Zhang, Sen; Diao, Lu Hong; Yan, Shu Ying; Cao, Cun Gen

    Verification of hyponymy relations is a basic problem in knowledge acquisition. We present a method of hyponymy verification based on concept space. Firstly, we define the concept space of a group of candidate hyponymy relations. Secondly, we analyze the concept space and define a set of hyponymy features based on the space structure. Then we use these features to verify the candidate hyponymy relations. Experimental results show that the method provides adequate verification of hyponymy.

  11. Psychological and social aspects verified after the Goiania's radioactive accident

    International Nuclear Information System (INIS)

    Helou, Suzana

    1995-01-01

    Psychological and social aspects verified after the radioactive accident that occurred in 1987 in Goiania, a Brazilian city, are discussed. To this end, a public opinion survey is presented in order to portray the residual psychological effects of the Goiania radioactive accident. Data obtained in 1,126 interviews were consolidated. Four groups with different levels of involvement in the accident are compared with regard to the event. The survey allowed the conclusion that the accident affected, to some degree, the whole population of Goiania psychologically. In addition, the survey allowed an analysis of the quality standard of the professionals' performance in handling the accident.

  12. New concepts in nuclear arms control: verified cutoff and verified disposal

    International Nuclear Information System (INIS)

    Donnelly, W.H.

    1990-01-01

    Limiting the numbers of nuclear warheads by reducing military production and stockpiles of fissionable materials has been a constant item on the nuclear arms control agenda for the last 45 years. It has become more salient recently, however, because of two events: the enforced closure for safety reasons of the current United States military plutonium production facilities; and the possibility that the US and USSR may soon conclude an agreement providing for the verified destruction of significant numbers of nuclear warheads and the recovery of the fissionable material they contain with the option of transferring these materials to peaceful uses. A study has been made of the practical problems of verifying the cutoff of fissionable material production for military purposes in the nuclear weapon states, as well as providing assurance that material recovered from warheads is not re-used for proscribed military purposes and facilitating its transfer to civil uses. Implementation of such measures would have important implications for non-proliferation. The resultant paper was presented to a meeting of the PPNN Core Group held in Baden, close to Vienna, over the weekend of 18/19th November 1989 and is reprinted in this booklet. (author)

  13. Rigor or Reliability and Validity in Qualitative Research: Perspectives, Strategies, Reconceptualization, and Recommendations.

    Science.gov (United States)

    Cypress, Brigitte S

    Issues are still raised even now in the 21st century by the persistent concern with achieving rigor in qualitative research. There is also a continuing debate about the analogous terms reliability and validity in naturalistic inquiries as opposed to quantitative investigations. This article presents the concept of rigor in qualitative research using a phenomenological study as an exemplar to further illustrate the process. Elaborating on epistemological and theoretical conceptualizations by Lincoln and Guba, strategies congruent with qualitative perspective for ensuring validity to establish the credibility of the study are described. A synthesis of the historical development of validity criteria evident in the literature during the years is explored. Recommendations are made for use of the term rigor instead of trustworthiness and the reconceptualization and renewed use of the concept of reliability and validity in qualitative research, that strategies for ensuring rigor must be built into the qualitative research process rather than evaluated only after the inquiry, and that qualitative researchers and students alike must be proactive and take responsibility in ensuring the rigor of a research study. The insights garnered here will move novice researchers and doctoral students to a better conceptual grasp of the complexity of reliability and validity and its ramifications for qualitative inquiry.

  14. Application of automated reasoning software: procedure generation system verifier

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1984-09-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logic axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions gathered by the combined system

  15. Some rigorous results concerning spectral theory for ideal MHD

    International Nuclear Information System (INIS)

    Laurence, P.

    1986-01-01

    Spectral theory for linear ideal MHD is laid on a firm foundation by defining appropriate function spaces for the operators associated with both the first- and second-order (in time and space) partial differential operators. Thus, it is rigorously established that a self-adjoint extension of F(xi) exists. It is shown that the operator L associated with the first-order formulation satisfies the conditions of the Hille--Yosida theorem. A foundation is laid thereby within which the domains associated with the first- and second-order formulations can be compared. This allows future work in a rigorous setting that will clarify the differences (in the two formulations) between the structure of the generalized eigenspaces corresponding to the marginal point of the spectrum ω = 0

  16. Some rigorous results concerning spectral theory for ideal MHD

    International Nuclear Information System (INIS)

    Laurence, P.

    1985-05-01

    Spectral theory for linear ideal MHD is laid on a firm foundation by defining appropriate function spaces for the operators associated with both the first and second order (in time and space) partial differential operators. Thus, it is rigorously established that a self-adjoint extension of F(xi) exists. It is shown that the operator L associated with the first order formulation satisfies the conditions of the Hille-Yosida theorem. A foundation is laid thereby within which the domains associated with the first and second order formulations can be compared. This allows future work in a rigorous setting that will clarify the differences (in the two formulations) between the structure of the generalized eigenspaces corresponding to the marginal point of the spectrum ω = 0

  17. Critical Analysis of Strategies for Determining Rigor in Qualitative Inquiry.

    Science.gov (United States)

    Morse, Janice M

    2015-09-01

    Criteria for determining the trustworthiness of qualitative research were introduced by Guba and Lincoln in the 1980s when they replaced terminology for achieving rigor, reliability, validity, and generalizability with dependability, credibility, and transferability. Strategies for achieving trustworthiness were also introduced. This landmark contribution to qualitative research remains in use today, with only minor modifications in format. Despite the significance of this contribution over the past four decades, the strategies recommended to achieve trustworthiness have not been critically examined. Recommendations for where, why, and how to use these strategies have not been developed, and how well they achieve their intended goal has not been examined. We do not know, for example, what impact these strategies have on the completed research. In this article, I critique these strategies. I recommend that qualitative researchers return to the terminology of social sciences, using rigor, reliability, validity, and generalizability. I then make recommendations for the appropriate use of the strategies recommended to achieve rigor: prolonged engagement, persistent observation, and thick, rich description; inter-rater reliability, negative case analysis; peer review or debriefing; clarifying researcher bias; member checking; external audits; and triangulation. © The Author(s) 2015.

  18. Appraising the value of independent EIA follow-up verifiers

    Energy Technology Data Exchange (ETDEWEB)

    Wessels, Jan-Albert, E-mail: janalbert.wessels@nwu.ac.za [School of Geo and Spatial Sciences, Department of Geography and Environmental Management, North-West University, C/O Hoffman and Borcherd Street, Potchefstroom, 2520 (South Africa); Retief, Francois, E-mail: francois.retief@nwu.ac.za [School of Geo and Spatial Sciences, Department of Geography and Environmental Management, North-West University, C/O Hoffman and Borcherd Street, Potchefstroom, 2520 (South Africa); Morrison-Saunders, Angus, E-mail: A.Morrison-Saunders@murdoch.edu.au [School of Geo and Spatial Sciences, Department of Geography and Environmental Management, North-West University, C/O Hoffman and Borcherd Street, Potchefstroom, 2520 (South Africa); Environmental Assessment, School of Environmental Science, Murdoch University, Australia. (Australia)

    2015-01-15

    Independent Environmental Impact Assessment (EIA) follow-up verifiers such as monitoring agencies, checkers, supervisors and control officers are active on various construction sites across the world. There are, however, differing views on the value that these verifiers add and very limited learning in EIA has been drawn from independent verifiers. This paper aims to appraise how and to what extent independent EIA follow-up verifiers add value in major construction projects in the developing country context of South Africa. A framework for appraising the role of independent verifiers was established and four South African case studies were examined through a mixture of site visits, project document analysis, and interviews. Appraisal results were documented in the performance areas of: planning, doing, checking, acting, public participating and integration with other programs. The results indicate that independent verifiers add most value to major construction projects when involved with screening EIA requirements of new projects, allocation of financial and human resources, checking legal compliance, influencing implementation, reporting conformance results, community and stakeholder engagement, integration with self-responsibility programs such as environmental management systems (EMS), and controlling records. It was apparent that verifiers could be more creatively utilized in pre-construction preparation, providing feedback of knowledge into assessment of new projects, giving input to the planning and design phase of projects, and performance evaluation. The study confirms the benefits of proponent and regulator follow-up, specifically in having independent verifiers that disclose information, facilitate discussion among stakeholders, are adaptable and proactive, aid in the integration of EIA with other programs, and instill trust in EIA enforcement by conformance evaluation. Overall, the study provides insight on how to harness the learning opportunities

  19. Appraising the value of independent EIA follow-up verifiers

    International Nuclear Information System (INIS)

    Wessels, Jan-Albert; Retief, Francois; Morrison-Saunders, Angus

    2015-01-01

    Independent Environmental Impact Assessment (EIA) follow-up verifiers such as monitoring agencies, checkers, supervisors and control officers are active on various construction sites across the world. There are, however, differing views on the value that these verifiers add and very limited learning in EIA has been drawn from independent verifiers. This paper aims to appraise how and to what extent independent EIA follow-up verifiers add value in major construction projects in the developing country context of South Africa. A framework for appraising the role of independent verifiers was established and four South African case studies were examined through a mixture of site visits, project document analysis, and interviews. Appraisal results were documented in the performance areas of: planning, doing, checking, acting, public participating and integration with other programs. The results indicate that independent verifiers add most value to major construction projects when involved with screening EIA requirements of new projects, allocation of financial and human resources, checking legal compliance, influencing implementation, reporting conformance results, community and stakeholder engagement, integration with self-responsibility programs such as environmental management systems (EMS), and controlling records. It was apparent that verifiers could be more creatively utilized in pre-construction preparation, providing feedback of knowledge into assessment of new projects, giving input to the planning and design phase of projects, and performance evaluation. The study confirms the benefits of proponent and regulator follow-up, specifically in having independent verifiers that disclose information, facilitate discussion among stakeholders, are adaptable and proactive, aid in the integration of EIA with other programs, and instill trust in EIA enforcement by conformance evaluation. Overall, the study provides insight on how to harness the learning opportunities

  20. Towards Verifying National CO2 Emissions

    Science.gov (United States)

    Fung, I. Y.; Wuerth, S. M.; Anderson, J. L.

    2017-12-01

    With the Paris Agreement, nations around the world have pledged their voluntary reductions in future CO2 emissions. Satellite observations of atmospheric CO2 have the potential to verify self-reported emission statistics around the globe. We present a carbon-weather data assimilation system, wherein raw weather observations together with satellite observations of the mixing ratio of column CO2 from the Orbiting Carbon Observatory-2 are assimilated every 6 hours into the NCAR carbon-climate model CAM5 coupled to the Ensemble Kalman Filter of DART. In an OSSE, we reduced the fossil fuel emissions from a country, and estimated the emissions innovations demanded by the atmospheric CO2 observations. The uncertainties in the innovation are analyzed with respect to the uncertainties in the meteorology to determine the significance of the result. The work follows from "On the use of incomplete historical data to infer the present state of the atmosphere" (Charney et al. 1969), which maps the path for continuous data assimilation for weather forecasting and the five decades of progress since.
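
    As background on the assimilation machinery mentioned above, here is a minimal, self-contained sketch of a stochastic ensemble Kalman filter analysis step in Python; it is a generic textbook EnKF, not the DART/CAM5 implementation, and all names, dimensions, and values are illustrative.

        import numpy as np

        def enkf_update(ensemble, obs, obs_err_var, H):
            """Stochastic EnKF analysis step.
            ensemble: (n_members, n_state) prior states
            obs:      (n_obs,) observed values
            H:        (n_obs, n_state) linear observation operator
            """
            n_members = ensemble.shape[0]
            X = ensemble - ensemble.mean(axis=0)          # anomalies
            P = X.T @ X / (n_members - 1)                 # sample covariance
            R = np.eye(len(obs)) * obs_err_var
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
            rng = np.random.default_rng(0)
            updated = np.empty_like(ensemble)
            for i, x in enumerate(ensemble):
                y_perturbed = obs + rng.normal(0, np.sqrt(obs_err_var), len(obs))
                updated[i] = x + K @ (y_perturbed - H @ x)
            return updated

        # Toy example: 3-variable state, one observed component (e.g., column CO2)
        prior = np.random.default_rng(1).normal(400.0, 2.0, size=(20, 3))
        H = np.array([[1.0, 0.0, 0.0]])
        posterior = enkf_update(prior, obs=np.array([403.0]), obs_err_var=0.25, H=H)
        print("prior mean:", prior.mean(axis=0).round(2))
        print("posterior mean:", posterior.mean(axis=0).round(2))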

  1. The Relationship between Project-Based Learning and Rigor in STEM-Focused High Schools

    Science.gov (United States)

    Edmunds, Julie; Arshavsky, Nina; Glennie, Elizabeth; Charles, Karen; Rice, Olivia

    2016-01-01

    Project-based learning (PjBL) is an approach often favored in STEM classrooms, yet some studies have shown that teachers struggle to implement it with academic rigor. This paper explores the relationship between PjBL and rigor in the classrooms of ten STEM-oriented high schools. Utilizing three different data sources reflecting three different…

  2. Rigorous approximation of stationary measures and convergence to equilibrium for iterated function systems

    International Nuclear Information System (INIS)

    Galatolo, Stefano; Monge, Maurizio; Nisoli, Isaia

    2016-01-01

    We study the problem of the rigorous computation of the stationary measure and of the rate of convergence to equilibrium of an iterated function system described by a stochastic mixture of two or more dynamical systems that are either all uniformly expanding on the interval or all contracting. In the expanding case, the associated transfer operators satisfy a Lasota-Yorke inequality, and we show how to compute a rigorous approximation of the stationary measure in the L¹ norm and an estimate for the rate of convergence. The rigorous computation requires a computer-aided proof of the contraction of the transfer operators for the maps, and we show that this property propagates to the transfer operators of the IFS. In the contracting case we perform a rigorous approximation of the stationary measure in the Wasserstein-Kantorovich distance and rate of convergence, using the same functional analytic approach. We show that a finite computation can produce a realistic computation of all contraction rates for the whole parameter space. We conclude with a description of the implementation and numerical experiments. (paper)
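
    For intuition about approximating a stationary measure of an IFS, the following Python sketch applies Ulam's method to a stochastic mixture of two uniformly expanding maps (our toy example, not the paper's): discretize the transfer operator on a partition and take its leading left eigenvector. Unlike the paper's method, this sketch carries no rigorous error bounds.

        import numpy as np

        # Two uniformly expanding maps on [0,1), mixed with probabilities (0.5, 0.5)
        maps = [lambda x: (2.0 * x) % 1.0, lambda x: (3.0 * x) % 1.0]
        weights = [0.5, 0.5]

        def ulam_matrix(n_bins=200, samples_per_bin=500, seed=0):
            """Monte Carlo Ulam discretization of the IFS transfer operator."""
            rng = np.random.default_rng(seed)
            P = np.zeros((n_bins, n_bins))
            for i in range(n_bins):
                x = (i + rng.random(samples_per_bin)) / n_bins  # points in bin i
                for T, w in zip(maps, weights):
                    j = np.floor(T(x) * n_bins).astype(int)
                    np.add.at(P[i], j, w / samples_per_bin)
            return P

        P = ulam_matrix()
        # Stationary density: left fixed vector of P, found by power iteration
        v = np.full(P.shape[0], 1.0 / P.shape[0])
        for _ in range(1000):
            v = v @ P
            v /= v.sum()
        # For these two maps Lebesgue measure is stationary, so v should be flat
        print("max deviation from Lebesgue:", np.abs(v * P.shape[0] - 1.0).max())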

  3. Rigorous Statistical Bounds in Uncertainty Quantification for One-Layer Turbulent Geophysical Flows

    Science.gov (United States)

    Qi, Di; Majda, Andrew J.

    2018-04-01

    Statistical bounds controlling the total fluctuations in mean and variance about a basic steady-state solution are developed for the truncated barotropic flow over topography. Statistical ensemble prediction is an important topic in weather and climate research. Here, the evolution of an ensemble of trajectories is considered using statistical instability analysis and is compared and contrasted with the classical deterministic instability for the growth of perturbations in one pointwise trajectory. The maximum growth of the total statistics in fluctuations is derived relying on the statistical conservation principle of the pseudo-energy. The saturation bound of the statistical mean fluctuation and variance in the unstable regimes with non-positive-definite pseudo-energy is achieved by linking with a class of stable reference states and minimizing the stable statistical energy. Two cases with dependence on initial statistical uncertainty and on external forcing and dissipation are compared and unified under a consistent statistical stability framework. The flow structures and statistical stability bounds are illustrated and verified by numerical simulations among a wide range of dynamical regimes, where subtle transient statistical instability exists in general with positive short-time exponential growth in the covariance even when the pseudo-energy is positive-definite. Among the various scenarios in this paper, there exist strong forward and backward energy exchanges between different scales which are estimated by the rigorous statistical bounds.

  4. Privacy-Preserving Verifiability: A Case for an Electronic Exam Protocol

    DEFF Research Database (Denmark)

    Giustolisi, Rosario; Iovino, Vincenzo; Lenzini, Gabriele

    2017-01-01

    We introduce the notion of privacy-preserving verifiability for security protocols. It holds when a protocol admits a verifiability test that does not reveal, to the verifier that runs it, more pieces of information about the protocol’s execution than those required to run the test. Our definition...... of privacy-preserving verifiability is general and applies to cryptographic protocols as well as to human security protocols. In this paper we exemplify it in the domain of e-exams. We prove that the notion is meaningful by studying an existing exam protocol that is verifiable but whose verifiability tests...... are not privacy-preserving. We prove that the notion is applicable: we review the protocol using functional encryption so that it admits a verifiability test that preserves privacy according to our definition. We analyse, in ProVerif, that the verifiability holds despite malicious parties and that the new...

  5. USCIS E-Verify Self-Check

    Data.gov (United States)

    Department of Homeland Security — E-Verify is an internet based system that contains datasets to compare information from an employee's Form I-9, Employment Eligibility Verification, to data from the...

  6. Study of the quality characteristics in cold-smoked salmon (Salmo salar) originating from pre- or post-rigor raw material.

    Science.gov (United States)

    Birkeland, S; Akse, L

    2010-01-01

    Improved slaughtering procedures in the salmon industry have caused a delayed onset of rigor mortis and, thus, a potential for pre-rigor secondary processing. The aim of this study was to investigate the effect of rigor status at the time of processing on quality traits (color, texture, sensory, microbiological) in injection-salted, cold-smoked Atlantic salmon (Salmo salar). Injection of pre-rigor fillets caused a significant (P<0.05) … post-rigor processed fillets; however, post-rigor fillets (1477 ± 38 g) had significantly (P>0.05) higher fracturability than pre-rigor fillets (1369 ± 71 g). Pre-rigor fillets were significantly (P<0.05) … post-rigor fillets (37.8 ± 0.8) and had significantly lower (P<0.05) … post-rigor processed fillets. This study showed that similar quality characteristics can be obtained in cold-smoked products processed either pre- or post-rigor when using suitable injection salting protocols and smoking techniques. © 2010 Institute of Food Technologists®

  7. A Finite Equivalence of Verifiable Multi-secret Sharing

    Directory of Open Access Journals (Sweden)

    Hui Zhao

    2012-02-01

    Full Text Available We give an abstraction of verifiable multi-secret sharing schemes that is accessible to a fully mechanized analysis. This abstraction is formalized within the applied pi-calculus by using an equational theory which characterizes the cryptographic semantics of secret share. We also present an encoding from the equational theory into a convergent rewriting system, which is suitable for the automated protocol verifier ProVerif. Based on that, we verify the threshold certificate protocol in ProVerif.
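
    For readers who want a concrete picture of what such an equational theory abstracts, here is a minimal Feldman-style verifiable secret sharing sketch. The tiny parameters (p = 23, q = 11, g = 2) are insecure toy values chosen so the arithmetic stays visible, and the scheme shown is a generic textbook construction, not the one formalized in the paper.

```python
# Feldman verifiable secret sharing over a toy group:
# g = 2 has prime order q = 11 in Z_23^*; shares live in Z_q.
p, q, g = 23, 11, 2          # toy parameters (insecure, illustrative only)

secret = 7
coeffs = [secret, 3, 5]      # f(x) = 7 + 3x + 5x^2 over Z_q, threshold t = 3

def f(x):
    return sum(c * pow(x, j, q) for j, c in enumerate(coeffs)) % q

# Dealer publishes commitments C_j = g^{a_j} mod p to each coefficient.
commitments = [pow(g, c, p) for c in coeffs]

# Each party i receives share (i, f(i)) and can check it publicly:
# g^{f(i)} must equal the product of C_j^{i^j} mod p.
def verify_share(i, share):
    lhs = pow(g, share, p)
    rhs = 1
    for j, C in enumerate(commitments):
        rhs = rhs * pow(C, pow(i, j, q), p) % p
    return lhs == rhs

shares = [(i, f(i)) for i in range(1, 6)]
assert all(verify_share(i, s) for i, s in shares)
assert not verify_share(1, (shares[0][1] + 1) % q)   # tampered share fails
```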

  8. USCIS E-Verify Customer Satisfaction Survey, January 2013

    Data.gov (United States)

    Department of Homeland Security — This report focuses on the customer satisfaction of companies currently enrolled in the E-Verify program. Satisfaction with E-Verify remains high and follows up a...

  9. Experimental evaluation of rigor mortis. VIII. Estimation of time since death by repeated measurements of the intensity of rigor mortis on rats.

    Science.gov (United States)

    Krompecher, T

    1994-10-21

    The development of the intensity of rigor mortis was monitored in nine groups of rats. The measurements were initiated after 2, 4, 5, 6, 8, 12, 15, 24, and 48 h post mortem (p.m.) and lasted 5-9 h, which ideally should correspond to the usual procedure after the discovery of a corpse. The experiments were carried out at an ambient temperature of 24 degrees C. Measurements initiated early after death resulted in curves with a rising portion, a plateau, and a descending slope. Delaying the initial measurement translated into shorter rising portions, and curves initiated 8 h p.m. or later are comprised of a plateau and/or a downward slope only. Three different phases were observed suggesting simple rules that can help estimate the time since death: (1) if an increase in intensity was found, the initial measurements were conducted not later than 5 h p.m.; (2) if only a decrease in intensity was observed, the initial measurements were conducted not earlier than 7 h p.m.; and (3) at 24 h p.m., the resolution is complete, and no further changes in intensity should occur. Our results clearly demonstrate that repeated measurements of the intensity of rigor mortis allow a more accurate estimation of the time since death of the experimental animals than the single measurement method used earlier. A critical review of the literature on the estimation of time since death on the basis of objective measurements of the intensity of rigor mortis is also presented.
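
    The three rules above translate directly into a small decision procedure; the sketch below simply restates them (for rats at 24 °C ambient temperature, per the abstract) and is not a validated forensic tool.

```python
def interpret_rigor_series(intensities):
    """Apply the three rules from the abstract to a chronological list of
    repeated rigor-intensity measurements (rats, 24 deg C ambient)."""
    increased = any(b > a for a, b in zip(intensities, intensities[1:]))
    decreased = any(b < a for a, b in zip(intensities, intensities[1:]))
    if increased:
        return "first measurement taken no later than ~5 h post mortem"
    if decreased:
        return "first measurement taken no earlier than ~7 h post mortem"
    return "no change in intensity: resolution complete (~24 h post mortem or later)"

# Hypothetical intensity series: rises, plateaus, then declines.
print(interpret_rigor_series([2.0, 3.1, 3.2, 2.8]))
```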

  10. The MIXED framework: A novel approach to evaluating mixed-methods rigor.

    Science.gov (United States)

    Eckhardt, Ann L; DeVon, Holli A

    2017-10-01

    Evaluation of rigor in mixed-methods (MM) research is a persistent challenge due to the combination of inconsistent philosophical paradigms, the use of multiple research methods which require different skill sets, and the need to combine research at different points in the research process. Researchers have proposed a variety of ways to thoroughly evaluate MM research, but each method fails to provide a framework that is useful for the consumer of research. In contrast, the MIXED framework is meant to bridge the gap between an academic exercise and practical assessment of a published work. The MIXED framework (methods, inference, expertise, evaluation, and design) borrows from previously published frameworks to create a useful tool for the evaluation of a published study. The MIXED framework uses an experimental eight-item scale that allows for comprehensive integrated assessment of MM rigor in published manuscripts. Mixed methods are becoming increasingly prevalent in nursing and healthcare research requiring researchers and consumers to address issues unique to MM such as evaluation of rigor. © 2017 John Wiley & Sons Ltd.

  11. Disciplining Bioethics: Towards a Standard of Methodological Rigor in Bioethics Research

    Science.gov (United States)

    Adler, Daniel; Shaul, Randi Zlotnik

    2012-01-01

    Contemporary bioethics research is often described as multi- or interdisciplinary. Disciplines are characterized, in part, by their methods. Thus, when bioethics research draws on a variety of methods, it crosses disciplinary boundaries. Yet each discipline has its own standard of rigor—so when multiple disciplinary perspectives are considered, what constitutes rigor? This question has received inadequate attention, as there is considerable disagreement regarding the disciplinary status of bioethics. This disagreement has presented five challenges to bioethics research. Addressing them requires consideration of the main types of cross-disciplinary research, and consideration of proposals aiming to ensure rigor in bioethics research. PMID:22686634

  12. Verifying competence of operations personnel in nuclear power plants

    International Nuclear Information System (INIS)

    Farber, G.H.

    1986-01-01

    To ensure that only competent people are authorized to fill positions in a nuclear power plant, both the initial competence of personnel and the continuous maintenance of competence have to be verified. Two main methods are normally used for verifying competence, namely evaluation of a person's performance over a period of time, and evaluation of his knowledge and skills at a particular time by means of an examination. Both methods have limitations, and in practice they are often used together to give different and to some extent complementary evaluations of a person's competence. Verification of competence itself is a problem area, because objective judging of human competence is extremely difficult. Formal verification methods, such as tests and examinations, are particularly or exclusively applied for the direct operating personnel in the control room (very rarely for management personnel). Out of the many elements contributing to a person's competence, the knowledge which is needed and the intellectual skills are the main subjects of the formal verification methods. Therefore the presentation will concentrate on the proof of the technical qualification of operators by means of examinations. The examination process in the Federal Republic of Germany for the proof of knowledge and skills will serve as an example to describe and analyze the important aspects. From that recommendations are derived regarding standardization of the procedure as well as validation. (orig./GL)

  13. MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF INTELLIGENCE PRODUCTS

    Science.gov (United States)

    2016-04-01

    AU/ACSC/2016, Air Command and Staff College, Air University. ...establishing unit-level certified Masters of Analytic Tradecraft (MAT) analysts to be trained and entrusted to evaluate and rate the standards and... cues) ideally should meet or exceed effective rigor (based on analytical process). To accomplish this, decision makers should not be left to their...

  14. Analyzing public health policy: three approaches.

    Science.gov (United States)

    Coveney, John

    2010-07-01

    Policy is an important feature of public and private organizations. Within the field of health as a policy arena, public health has emerged in which policy is vital to decision making and the deployment of resources. Public health practitioners and students need to be able to analyze public health policy, yet many feel daunted by the subject's complexity. This article discusses three approaches that simplify policy analysis: Bacchi's "What's the problem?" approach examines the way that policy represents problems. Colebatch's governmentality approach provides a way of analyzing the implementation of policy. Bridgman and Davis's policy cycle allows for an appraisal of public policy development. Each approach provides an analytical framework from which to rigorously study policy. Practitioners and students of public health gain much in engaging with the politicized nature of policy, and a simple approach to policy analysis can greatly assist one's understanding and involvement in policy work.

  15. Electrocardiogram artifact caused by rigors mimicking narrow complex tachycardia: a case report.

    Science.gov (United States)

    Matthias, Anne Thushara; Indrakumar, Jegarajah

    2014-02-04

    The electrocardiogram (ECG) is useful in the diagnosis of cardiac and non-cardiac conditions. Rigors due to shivering can cause electrocardiogram artifacts mimicking various cardiac rhythm abnormalities. We describe an 80-year-old Sri Lankan man with an abnormal electrocardiogram mimicking narrow complex tachycardia during the immediate post-operative period. Electrocardiogram changes caused by muscle tremor during rigors could mimic a narrow complex tachycardia. Identification of muscle tremor as a cause of electrocardiogram artifact can avoid unnecessary pharmacological and non-pharmacological intervention to prevent arrhythmias.

  16. Use of the Rigor Mortis Process as a Tool for Better Understanding of Skeletal Muscle Physiology: Effect of the Ante-Mortem Stress on the Progression of Rigor Mortis in Brook Charr (Salvelinus fontinalis).

    Science.gov (United States)

    Diouf, Boucar; Rioux, Pierre

    1999-01-01

    Presents the rigor mortis process in brook charr (Salvelinus fontinalis) as a tool for better understanding skeletal muscle metabolism. Describes an activity that demonstrates how rigor mortis is related to the post-mortem decrease of muscular glycogen and ATP, how glycogen degradation produces lactic acid that lowers muscle pH, and how…

  17. Measurements of the degree of development of rigor mortis as an indicator of stress in slaughtered pigs.

    Science.gov (United States)

    Warriss, P D; Brown, S N; Knowles, T G

    2003-12-13

    The degree of development of rigor mortis in the carcases of slaughter pigs was assessed subjectively on a three-point scale 35 minutes after they were exsanguinated, and related to the levels of cortisol, lactate and creatine kinase in blood collected at exsanguination. Earlier rigor development was associated with higher concentrations of these stress indicators in the blood. This relationship suggests that the mean rigor score, and the frequency distribution of carcases that had or had not entered rigor, could be used as an index of the degree of stress to which the pigs had been subjected.
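
    The proposed index, the mean rigor score together with the proportion of carcases already in rigor, is straightforward to compute; a minimal sketch with made-up example data (not the study's measurements) follows.

```python
import numpy as np

# Hypothetical example data: rigor score (three-point scale, 35 min post
# mortem) and blood lactate at exsanguination for ten carcases.
rigor = np.array([1, 2, 3, 2, 1, 3, 3, 2, 1, 2])
lactate = np.array([4.1, 6.0, 9.2, 6.8, 3.9, 8.7, 9.9, 5.5, 4.4, 6.1])

mean_score = rigor.mean()
frac_in_rigor = (rigor >= 2).mean()        # carcases that have entered rigor
r = np.corrcoef(rigor, lactate)[0, 1]      # association with the stress marker

print(f"mean rigor score: {mean_score:.2f}")
print(f"fraction in rigor: {frac_in_rigor:.0%}")
print(f"correlation with lactate: {r:.2f}")
```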

  18. Einstein's Theory A Rigorous Introduction for the Mathematically Untrained

    CERN Document Server

    Grøn, Øyvind

    2011-01-01

    This book provides an introduction to the theory of relativity and the mathematics used in its processes. Three elements of the book make it stand apart from previously published books on the theory of relativity. First, the book starts at a lower mathematical level than standard books, with tensor calculus of sufficient maturity to make it possible to give detailed calculations of relativistic predictions of practical experiments. Self-contained introductions are given, for example to vector calculus, differential calculus and integration. Second, in-between calculations have been included, making it possible for the non-technical reader to follow step-by-step calculations. Third, the conceptual development is gradual and rigorous in order to provide the inexperienced reader with a philosophically satisfying understanding of the theory. Einstein's Theory: A Rigorous Introduction for the Mathematically Untrained aims to provide the reader with a sound conceptual understanding of both the special and genera...

  19. Sonoelasticity to monitor mechanical changes during rigor and ageing.

    Science.gov (United States)

    Ayadi, A; Culioli, J; Abouelkaram, S

    2007-06-01

    We propose the use of sonoelasticity as a non-destructive method to monitor changes in the resistance of muscle fibres, unaffected by connective tissue. Vibrations were applied at low frequency to induce oscillations in soft tissues and an ultrasound transducer was used to detect the motions. The experiments were carried out on the M. biceps femoris muscles of three beef cattle. In addition to the sonoelasticity measurements, the changes in meat during rigor and ageing were followed by measurements of both the mechanical resistance of myofibres and pH. The variations of mechanical resistance and pH were compared to those of the sonoelastic variables (velocity and attenuation) at two frequencies. The relationships between pH and velocity or attenuation, and between the velocity or attenuation and the stress at 20% deformation, were highly correlated. We concluded that sonoelasticity is a non-destructive method that can be used to monitor mechanical changes in muscle fibres during rigor mortis and ageing.

  20. Impact of post-rigor high pressure processing on the physicochemical and microbial shelf-life of cultured red abalone (Haliotis rufescens).

    Science.gov (United States)

    Hughes, Brianna H; Perkins, L Brian; Yang, Tom C; Skonberg, Denise I

    2016-03-01

    High pressure processing (HPP) of post-rigor abalone at 300 MPa for 10 min extended the refrigerated shelf-life to four times that of unprocessed controls. Shucked abalone meats were processed at 100 or 300 MPa for 5 or 10 min, and stored at 2°C for 35 days. Treatments were analyzed for aerobic plate count (APC), total volatile base nitrogen (TVBN), K-value, biogenic amines, color, and texture. APC did not exceed 10⁶ and TVBN levels remained below 35 mg/100 g for 35 days for the 300 MPa treatments. No biogenic amines were detected in the 300 MPa treatments, but putrescine and cadaverine were detected in the control and 100 MPa treatments. Color and texture were not affected by HPP or storage time. These results indicate that post-rigor processing at 300 MPa for 10 min can significantly increase the refrigerated shelf-life of abalone without affecting chemical or physical quality characteristics important to consumers. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. A rigorous proof for the Landauer-Büttiker formula

    DEFF Research Database (Denmark)

    Cornean, Horia Decebal; Jensen, Arne; Moldoveanu, V.

    Recently, Avron et al. shed new light on the question of quantum transport in mesoscopic samples coupled to particle reservoirs by semi-infinite leads. They rigorously treat the case when the sample undergoes an adiabatic evolution, thus generating a current through the leads, and prove the so-call...

  2. Verifying Architectural Design Rules of the Flight Software Product Line

    Science.gov (United States)

    Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen

    2009-01-01

    This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps a) identifying architecturally significant deviations that were eluded during code reviews, b) clarifying the design rules to the team, and c) assessing the overall implementation quality. Furthermore, it helps connecting business goals to architectural principles, and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.
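
    Consistency checking of this kind, comparing extracted implementation dependencies against architecturally allowed ones, can be sketched in a few lines. The rule set and module names below are hypothetical, not the CFS product-line rules.

```python
# Hypothetical layered-architecture rule: a module may depend only on the
# modules listed for it. Deviations are reported for architectural review.
ALLOWED = {
    "app": {"services", "core"},
    "services": {"core"},
    "core": set(),
}

# Extracted dependencies (in practice recovered from the code base).
ACTUAL = [
    ("app", "services"),
    ("services", "core"),
    ("core", "app"),        # an architecturally significant deviation
]

def check(allowed, actual):
    """Return every dependency not permitted by the design rules."""
    return [(src, dst) for src, dst in actual
            if dst not in allowed.get(src, set())]

for src, dst in check(ALLOWED, ACTUAL):
    print(f"violation: {src} -> {dst} is not permitted by the design rules")
```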

  3. Characterization of rigor mortis of longissimus dorsi and triceps ...

    African Journals Online (AJOL)

    24 h) of the longissimus dorsi (LD) and triceps brachii (TB) muscles as well as the shear force (meat tenderness) and colour were evaluated, aiming at characterizing the rigor mortis in the meat during industrial processing. Data statistic treatment demonstrated that carcass temperature and pH decreased gradually during ...

  4. Reasoning about knowledge: Children's evaluations of generality and verifiability.

    Science.gov (United States)

    Koenig, Melissa A; Cole, Caitlin A; Meyer, Meredith; Ridge, Katherine E; Kushnir, Tamar; Gelman, Susan A

    2015-12-01

    In a series of experiments, we examined 3- to 8-year-old children's (N=223) and adults' (N=32) use of two properties of testimony to estimate a speaker's knowledge: generality and verifiability. Participants were presented with a "Generic speaker" who made a series of 4 general claims about "pangolins" (a novel animal kind), and a "Specific speaker" who made a series of 4 specific claims about "this pangolin" as an individual. To investigate the role of verifiability, we systematically varied whether the claim referred to a perceptually-obvious feature visible in a picture (e.g., "has a pointy nose") or a non-evident feature that was not visible (e.g., "sleeps in a hollow tree"). Three main findings emerged: (1) young children showed a pronounced reliance on verifiability that decreased with age. Three-year-old children were especially prone to credit knowledge to speakers who made verifiable claims, whereas 7- to 8-year-olds and adults credited knowledge to generic speakers regardless of whether the claims were verifiable; (2) children's attributions of knowledge to generic speakers was not detectable until age 5, and only when those claims were also verifiable; (3) children often generalized speakers' knowledge outside of the pangolin domain, indicating a belief that a person's knowledge about pangolins likely extends to new facts. Findings indicate that young children may be inclined to doubt speakers who make claims they cannot verify themselves, as well as a developmentally increasing appreciation for speakers who make general claims. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. A Framework for Modeling and Analyzing Complex Distributed Systems

    National Research Council Canada - National Science Library

    Lynch, Nancy A; Shvartsman, Alex Allister

    2005-01-01

    Report developed under STTR contract for topic AF04-T023. This Phase I project developed a modeling language and laid a foundation for computational support tools for specifying, analyzing, and verifying complex distributed system designs...

  6. Layout optimization of DRAM cells using rigorous simulation model for NTD

    Science.gov (United States)

    Jeon, Jinhyuck; Kim, Shinyoung; Park, Chanha; Yang, Hyunjo; Yim, Donggyu; Kuechler, Bernd; Zimmermann, Rainer; Muelders, Thomas; Klostermann, Ulrich; Schmoeller, Thomas; Do, Mun-hoe; Choi, Jung-Hoe

    2014-03-01

    scanning electron microscope (SEM) measurements. High resist impact and difficult model data acquisition demand a simulation model that is capable of extrapolating reliably beyond its calibration dataset. We use rigorous simulation models to provide that predictive performance. We have discussed the need for a rigorous mask optimization process for DRAM contact cell layouts, yielding mask layouts that are optimal in process performance, mask manufacturability and accuracy. In this paper, we have shown the step-by-step process from analytical illumination source derivation, through NTD- and application-tailored model calibration, to layout optimization such as OPC and SRAF placement. Finally, the work has been verified with simulation and experimental results on wafers.

  7. Software metrics a rigorous and practical approach

    CERN Document Server

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and ProcessesReflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems.New to the Third EditionThis edition contains new material relevant

  8. [A new formula for the measurement of rigor mortis: the determination of the FRR-index (author's transl)].

    Science.gov (United States)

    Forster, B; Ropohl, D; Raule, P

    1977-07-05

    The manual examination of rigor mortis as currently used, with its often subjective evaluation, frequently produces highly incorrect deductions. It is therefore desirable that such inaccuracies be replaced by objective measurement of rigor mortis at the extremities. To that purpose, a method is described which can also be applied in on-the-spot investigations, and a new formula for the determination of rigor mortis indices (FRR) is introduced.

  9. Reconciling the Rigor-Relevance Dilemma in Intellectual Capital Research

    Science.gov (United States)

    Andriessen, Daniel

    2004-01-01

    This paper raises the issue of research methodology for intellectual capital and other types of management research by focusing on the dilemma of rigour versus relevance. The more traditional explanatory approach to research often leads to rigorous results that are not of much help to solve practical problems. This paper describes an alternative…

  10. Incentivizing Verifiable Privacy-Protection Mechanisms for Offline Crowdsensing Applications.

    Science.gov (United States)

    Sun, Jiajun; Liu, Ningzhong

    2017-09-04

    Incentive mechanisms of crowdsensing have recently been intensively explored. Most of these mechanisms mainly focus on the standard economical goals like truthfulness and utility maximization. However, enormous privacy and security challenges need to be faced directly in real-life environments, such as cost privacies. In this paper, we investigate offline verifiable privacy-protection crowdsensing issues. We firstly present a general verifiable privacy-protection incentive mechanism for the offline homogeneous and heterogeneous sensing job model. In addition, we also propose a more complex verifiable privacy-protection incentive mechanism for the offline submodular sensing job model. The two mechanisms not only explore the private protection issues of users and platform, but also ensure the verifiable correctness of payments between platform and users. Finally, we demonstrate that the two mechanisms satisfy privacy-protection, verifiable correctness of payments and the same revenue as the generic one without privacy protection. Our experiments also validate that the two mechanisms are both scalable and efficient, and applicable for mobile devices in crowdsensing applications based on auctions, where the main incentive for the user is the remuneration.
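
    One simple way to make payments between platform and users publicly checkable, in the spirit described, is to commit to sealed cost bids and reveal them after the auction. The hash-commitment scheme and the Vickrey-style second-price rule below are illustrative assumptions, not the paper's mechanisms.

```python
import hashlib, secrets

def commit(bid: int, nonce: bytes) -> str:
    """Binding, hiding commitment to a bid (hash of bid plus nonce)."""
    return hashlib.sha256(f"{bid}:".encode() + nonce).hexdigest()

# Sensing-cost bids are committed before the auction closes.
bids = {"u1": 40, "u2": 55, "u3": 62}
nonces = {u: secrets.token_bytes(16) for u in bids}
commitments = {u: commit(b, nonces[u]) for u, b in bids.items()}

# Reverse auction: the lowest cost wins and is paid the second-lowest cost
# (a Vickrey-style rule, so truthful bidding is a dominant strategy).
winner = min(bids, key=bids.get)
payment = sorted(bids.values())[1]

# After reveal, anyone can verify both the commitments and the payment.
def verify(commitments, revealed_bids, revealed_nonces, winner, payment):
    ok = all(commit(revealed_bids[u], revealed_nonces[u]) == commitments[u]
             for u in commitments)
    ok &= winner == min(revealed_bids, key=revealed_bids.get)
    ok &= payment == sorted(revealed_bids.values())[1]
    return ok

assert verify(commitments, bids, nonces, winner, payment)
```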

  11. 31 CFR 363.14 - How will you verify my identity?

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false How will you verify my identity? 363... you verify my identity? (a) Individual. When you establish an account, we may use a verification service to verify your identity using information you provide about yourself on the online application. At...

  12. Some Proxy Signature and Designated verifier Signature Schemes over Braid Groups

    OpenAIRE

    Lal, Sunder; Verma, Vandani

    2009-01-01

    Braid groups provide an alternative to number-theoretic public-key cryptography and can be implemented quite efficiently. The paper proposes five signature schemes based on braid groups: Proxy Signature, Designated Verifier, Bi-Designated Verifier, Designated Verifier Proxy Signature and Bi-Designated Verifier Proxy Signature. We also discuss the security aspects of each of the proposed schemes.

  13. Quality properties of pre- and post-rigor beef muscle after interventions with high frequency ultrasound.

    Science.gov (United States)

    Sikes, Anita L; Mawson, Raymond; Stark, Janet; Warner, Robyn

    2014-11-01

    The delivery of a consistent quality product to the consumer is vitally important for the food industry. The aim of this study was to investigate the potential for using high frequency ultrasound applied to pre- and post-rigor beef muscle on the metabolism and subsequent quality. High frequency ultrasound (600 kHz at 48 kPa and 65 kPa acoustic pressure) applied to post-rigor beef striploin steaks resulted in no significant effect on the texture (peak force value) of cooked steaks as measured by a Tenderometer. There was no added benefit of ultrasound treatment above that of the normal ageing process after ageing of the steaks for 7 days at 4°C. Ultrasound treatment of post-rigor beef steaks resulted in a darkening of fresh steaks, but after ageing for 7 days at 4°C, the ultrasound-treated steaks were similar in colour to the aged, untreated steaks. High frequency ultrasound (2 MHz at 48 kPa acoustic pressure) applied to pre-rigor beef neck muscle had no effect on the pH, but the calculated exhaustion factor suggested that there was some effect on metabolism and actin-myosin interaction. However, the resultant texture of cooked, ultrasound-treated muscle was lower in tenderness compared to the control sample. After ageing for 3 weeks at 0°C, the ultrasound-treated samples had the same peak force value as the control. High frequency ultrasound had no significant effect on the colour parameters of pre-rigor beef neck muscle. This proof-of-concept study showed no effect of ultrasound on quality but did indicate that the application of high frequency ultrasound to pre-rigor beef muscle shows potential for modifying ATP turnover, and further investigation is warranted. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.

  14. Rigorous derivation from Landau-de Gennes theory to Ericksen-Leslie theory

    OpenAIRE

    Wang, Wei; Zhang, Pingwen; Zhang, Zhifei

    2013-01-01

    Starting from the Beris-Edwards system for liquid crystals, we present a rigorous derivation of the Ericksen-Leslie system with general Ericksen stress and Leslie stress by using the Hilbert expansion method.

  15. Striation Patterns of Ox Muscle in Rigor Mortis

    Science.gov (United States)

    Locker, Ronald H.

    1959-01-01

    Ox muscle in rigor mortis offers a selection of myofibrils fixed at varying degrees of contraction from sarcomere lengths of 3.7 to 0.7 µ. A study of this material by phase contrast and electron microscopy has revealed four distinct successive patterns of contraction, including besides the familiar relaxed and contracture patterns, two intermediate types (2.4 to 1.9 µ, 1.8 to 1.5 µ) not previously well described. PMID:14417790

  16. Assessment of the Methodological Rigor of Case Studies in the Field of Management Accounting Published in Journals in Brazil

    Directory of Open Access Journals (Sweden)

    Kelly Cristina Mucio Marques

    2015-04-01

    Full Text Available This study aims to assess the methodological rigor of case studies in management accounting published in Brazilian journals. The study is descriptive. The data were collected using documentary research and content analysis, and 180 papers published from 2008 to 2012 in accounting journals rated as A2, B1, and B2 that were classified as case studies were selected. Based on the literature, we established a set of 15 criteria that we expected to be identified (either explicitly or implicitly) in the case studies to classify those case studies as appropriate from the standpoint of methodological rigor. These criteria were partially met by the papers analyzed. The aspects less aligned with those proposed in the literature were the following: little emphasis on justifying the need to understand phenomena in context; lack of explanation of the reason for choosing the case study strategy; the predominant use of questions that do not enable deeper analysis; many studies based on only one source of evidence; little use of data and information triangulation; little emphasis on the data collection method; a high number of cases in which confusion between case study as a research strategy and as a data collection method was detected; a low number of papers reporting the method of data analysis; few reports on a study's contributions; and a minority highlighting the issues requiring further research. In conclusion, the method used to apply case studies to management accounting must be improved because few studies showed rigorous application of the procedures that this strategy requires.

  17. Mathematical framework for fast and rigorous track fit for the ZEUS detector

    Energy Technology Data Exchange (ETDEWEB)

    Spiridonov, Alexander

    2008-12-15

    In this note we present a mathematical framework for a rigorous approach to a common track fit for trackers located in the inner region of the ZEUS detector. The approach makes use of the Kalman filter and offers a rigorous treatment of magnetic field inhomogeneity, multiple scattering and energy loss. We describe mathematical details of the implementation of the Kalman filter technique with a reduced amount of computations for a cylindrical drift chamber, barrel and forward silicon strip detectors and a forward straw drift chamber. Options with homogeneous and inhomogeneous fields are discussed. The fitting of tracks in one ZEUS event takes about 20 ms on a standard PC. (orig.)
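
    For readers unfamiliar with the technique at the heart of the framework, a one-dimensional Kalman filter shows the predict/update cycle in its simplest textbook form. The ZEUS fit adds magnetic field inhomogeneity, multiple scattering and energy loss, none of which is modelled here.

```python
import numpy as np

def kalman_1d(measurements, q=1e-4, r=0.05**2):
    """Estimate a (nearly) constant state from noisy measurements.
    q: process noise variance, r: measurement noise variance."""
    x, P = 0.0, 1.0                 # initial state estimate and its variance
    estimates = []
    for z in measurements:
        P = P + q                   # predict (state model: x_k = x_{k-1})
        K = P / (P + r)             # Kalman gain
        x = x + K * (z - x)         # update with the measurement residual
        P = (1 - K) * P
        estimates.append(x)
    return estimates

rng = np.random.default_rng(1)
truth = 0.37
zs = truth + 0.05 * rng.standard_normal(50)
print(f"final estimate: {kalman_1d(zs)[-1]:.4f} (truth {truth})")
```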

  18. Learning from Science and Sport - How we, Safety, "Engage with Rigor"

    Science.gov (United States)

    Herd, A.

    2012-01-01

    As the world of spaceflight safety is relatively small and potentially inward-looking, we need to be aware of the "outside world". We should then try to remind ourselves to be open to the possibility that data, knowledge or experience from outside of the spaceflight community may provide some constructive alternate perspectives. This paper will assess aspects from two seemingly tangential fields, science and sport, and align these with the world of safety. In doing so, some useful insights will be given into the challenges we face, and solutions relevant to our everyday work of safety engineering may emerge. Sport, particularly a contact sport such as rugby union, requires direct interaction between members of two (opposing) teams: professional, accurately timed and positioned interaction for a desired outcome. These interactions, whilst an essential part of the game, are however not without their constraints. The rugby scrum has constraints as to the formation and engagement of the two teams. The controlled engagement provides for an interaction between the two teams in a safe manner, the constraints arising from the reality that an incorrect engagement could cause serious injury to members of either team. In academia, scientific rigor is applied to assure that the arguments provided and the conclusions drawn in academic papers presented for publication are valid, legitimate and credible. The need for rigor may be illustrated by the example of achieving a statistically relevant sample size, n, in order to assure the validity of analysis of the data pool. A failure to apply rigor could then place the entire study at risk of failing to have the respective paper published. This paper will consider the merits of these two different aspects, scientific rigor and sports engagement, and offer a reflective look at how this may provide a "modus operandi" for safety engineers at any level, whether at their desks (creating or reviewing safety assessments) or in a

  19. Differential rigor development in red and white muscle revealed by simultaneous measurement of tension and stiffness.

    Science.gov (United States)

    Kobayashi, Masahiko; Takemori, Shigeru; Yamaguchi, Maki

    2004-02-10

    Based on the molecular mechanism of rigor mortis, we have proposed that stiffness (elastic modulus evaluated with tension response against minute length perturbations) can be a suitable index of post-mortem rigidity in skeletal muscle. To trace the developmental process of rigor mortis, we measured stiffness and tension in both red and white rat skeletal muscle kept in liquid paraffin at 37 and 25 degrees C. White muscle (in which type IIB fibres predominate) developed stiffness and tension significantly more slowly than red muscle, except for soleus red muscle at 25 degrees C, which showed disproportionately slow rigor development. In each of the examined muscles, stiffness and tension developed more slowly at 25 degrees C than at 37 degrees C. In each specimen, tension always reached its maximum level earlier than stiffness, and then decreased more rapidly and markedly than stiffness. These phenomena may account for the sequential progress of rigor mortis in human cadavers.
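
    Stiffness as defined above, the elastic modulus recovered from the tension response to minute length perturbations, reduces to a slope estimate; the synthetic signal and all parameter values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic record: a sinusoidal length perturbation and a noisy,
# proportional tension response (the proportionality is the stiffness).
t = np.linspace(0, 1, 1000)
length = 1e-3 * np.sin(2 * np.pi * 10 * t)          # minute perturbation
true_stiffness = 250.0
tension = true_stiffness * length + 0.01 * rng.standard_normal(t.size)

# Least-squares slope of tension against length gives the stiffness estimate.
stiffness = np.polyfit(length, tension, 1)[0]
print(f"estimated stiffness: {stiffness:.1f} (true {true_stiffness})")
```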

  20. Evaluation of physical dimension changes as nondestructive measurements for monitoring rigor mortis development in broiler muscles.

    Science.gov (United States)

    Cavitt, L C; Sams, A R

    2003-07-01

    Studies were conducted to develop a non-destructive method for monitoring the rate of rigor mortis development in poultry and to evaluate the effectiveness of electrical stimulation (ES). In the first study, 36 male broilers in each of two trials were processed at 7 wk of age. After being bled, half of the birds received electrical stimulation (400 to 450 V, 400 to 450 mA, for seven pulses of 2 s on and 1 s off), and the other half were designated as controls. At 0.25 and 1.5 h postmortem (PM), carcasses were evaluated for the angles of the shoulder, elbow, and wing tip and the distance between the elbows. Breast fillets were harvested at 1.5 h PM (after chilling) from all carcasses. Fillet samples were excised and frozen for later measurement of pH and R-value, and the remainder of each fillet was held on ice until 24 h postmortem. Shear value and pH means were significantly lower, but R-value means were higher (P < 0.05), indicating an acceleration of rigor mortis by ES. The physical dimensions of the shoulder and elbow changed (P < 0.05) with rigor mortis development and with ES. These results indicate that physical measurements of the wings may be useful as a nondestructive indicator of rigor development and for monitoring the effectiveness of ES. In the second study, 60 male broilers in each of two trials were processed at 7 wk of age. At 0.25, 1.5, 3.0, and 6.0 h PM, carcasses were evaluated for the distance between the elbows. At each time point, breast fillets were harvested from each carcass. Fillet samples were excised and frozen for later measurement of pH and sarcomere length, whereas the remainder of each fillet was held on ice until 24 h PM. Shear value and pH means decreased (P < 0.05) during rigor mortis development. Elbow distance decreased (P < 0.05) with rigor development and was correlated (P < 0.05) with rigor mortis development in broiler carcasses.

  1. An IBM 370 assembly language program verifier

    Science.gov (United States)

    Maurer, W. D.

    1977-01-01

    The paper describes a program written in SNOBOL which verifies the correctness of programs written in assembly language for the IBM 360 and 370 series of computers. The motivation for using assembly language as a source language for a program verifier was the realization that many errors in programs are caused by misunderstanding or ignorance of the characteristics of specific computers. The proof of correctness of a program written in assembly language must take these characteristics into account. The program has been compiled and is currently running at the Center for Academic and Administrative Computing of The George Washington University.

  2. A rigorous test for a new conceptual model for collisions

    International Nuclear Information System (INIS)

    Peixoto, E.M.A.; Mu-Tao, L.

    1979-01-01

    A rigorous theoretical foundation for the previously proposed model is formulated and applied to electron scattering by H₂ in the gas phase. A rigorous treatment of the interaction potential between the incident electron and the hydrogen molecule is carried out to calculate differential cross sections for 1 keV electrons, using Glauber's approximation and Wang's molecular wave function for the ground electronic state of H₂. Moreover, it is shown for the first time that, when adequately done, the omission of two-center terms does not adversely influence the results of molecular calculations. It is shown that the new model is far superior to the Independent Atom Model (or Independent Particle Model). The accuracy and simplicity of the new model suggest that it may be fruitfully applied to the description of other collision phenomena (e.g., in molecular beam experiments and nuclear physics). A new technique is presented for calculations involving two-center integrals within the framework of the Glauber approximation for scattering. (Author)

  3. Rigorous Analysis of a Randomised Number Field Sieve

    OpenAIRE

    Lee, Jonathan; Venkatesan, Ramarathnam

    2018-01-01

    Factorisation of integers $n$ is of number theoretic and cryptographic significance. The Number Field Sieve (NFS) introduced circa 1990, is still the state of the art algorithm, but no rigorous proof that it halts or generates relationships is known. We propose and analyse an explicitly randomised variant. For each $n$, we show that these randomised variants of the NFS and Coppersmith's multiple polynomial sieve find congruences of squares in expected times matching the best-known heuristic e...

  4. Verifying different-modality properties for concepts produces switching costs.

    Science.gov (United States)

    Pecher, Diane; Zeelenberg, René; Barsalou, Lawrence W

    2003-03-01

    According to perceptual symbol systems, sensorimotor simulations underlie the representation of concepts. It follows that sensorimotor phenomena should arise in conceptual processing. Previous studies have shown that switching from one modality to another during perceptual processing incurs a processing cost. If perceptual simulation underlies conceptual processing, then verifying the properties of concepts should exhibit a switching cost as well. For example, verifying a property in the auditory modality (e.g., BLENDER-loud) should be slower after verifying a property in a different modality (e.g., CRANBERRIES-tart) than after verifying a property in the same modality (e.g., LEAVES-rustling). Only words were presented to subjects, and there were no instructions to use imagery. Nevertheless, switching modalities incurred a cost, analogous to the cost of switching modalities in perception. A second experiment showed that this effect was not due to associative priming between properties in the same modality. These results support the hypothesis that perceptual simulation underlies conceptual processing.

  5. A rigorous pole representation of multilevel cross sections and its practical applications

    International Nuclear Information System (INIS)

    Hwang, R.N.

    1987-01-01

    In this article a rigorous method for representing the multilevel cross sections and its practical applications are described. It is a generalization of the rationale suggested by de Saussure and Perez for the s-wave resonances. A computer code WHOPPER has been developed to convert the Reich-Moore parameters into the pole and residue parameters in momentum space. Sample calculations have been carried out to illustrate that the proposed method preserves the rigor of the Reich-Moore cross sections exactly. An analytical method has been developed to evaluate the pertinent Doppler-broadened line shape functions. A discussion is presented on how to minimize the number of pole parameters so that the existing reactor codes can be best utilized

  6. Effect of pre-rigor stretch and various constant temperatures on the rate of post-mortem pH fall, rigor mortis and some quality traits of excised porcine biceps femoris muscle strips.

    Science.gov (United States)

    Vada-Kovács, M

    1996-01-01

    Porcine biceps femoris strips of 10 cm original length were stretched by 50% and fixed within 1 hr post mortem, then subjected to temperatures of 4°, 15° or 36°C until they attained their ultimate pH. Unrestrained control muscle strips, which were left to shorten freely, were similarly treated. Post-mortem metabolism (pH, R-value) and shortening were recorded; thereafter ultimate meat quality traits (pH, lightness, extraction and swelling of myofibrils) were determined. The rate of pH fall at 36°C, as well as ATP breakdown at 36 and 4°C, were significantly reduced by pre-rigor stretch. The relationship between R-value and pH indicated cold shortening at 4°C. Myofibrils isolated from pre-rigor stretched muscle strips kept at 36°C showed the most severe reduction of hydration capacity, while paleness remained below extreme values. However, pre-rigor stretched myofibrils - when stored at 4°C - proved to be superior to shortened ones in their extractability and swelling.

  7. Experimental evaluation of rigor mortis IX. The influence of the breaking (mechanical solution) on the development of rigor mortis.

    Science.gov (United States)

    Krompecher, Thomas; Gilles, André; Brandt-Casadevall, Conception; Mangin, Patrice

    2008-04-07

    Objective measurements were carried out to study the possible re-establishment of rigor mortis in rats after "breaking" (mechanical solution). Our experiments showed that: (1) cadaveric rigidity can re-establish after breaking; (2) a significant rigidity can reappear if the breaking occurs before the process is complete; (3) rigidity will be considerably weaker after the breaking; and (4) the time course of the intensity does not change in comparison to the controls: the re-establishment begins immediately after the breaking, maximal values are reached at the same time as in the controls, and the course of the resolution is the same as in the controls.

  8. The jABC Approach to Rigorous Collaborative Development of SCM Applications

    Science.gov (United States)

    Hörmann, Martina; Margaria, Tiziana; Mender, Thomas; Nagel, Ralf; Steffen, Bernhard; Trinh, Hong

    Our approach to the model-driven collaborative design of IKEA's P3 Delivery Management Process uses the jABC [9] for model driven mediation and choreography to complement a RUP-based (Rational Unified Process) development process. jABC is a framework for service development based on Lightweight Process Coordination. Users (product developers and system/software designers) easily develop services and applications by composing reusable building-blocks into (flow-) graph structures that can be animated, analyzed, simulated, verified, executed, and compiled. This way of handling the collaborative design of complex embedded systems has proven to be effective and adequate for the cooperation of non-programmers and non-technical people, which is the focus of this contribution, and it is now being rolled out in the operative practice.

  9. Reasoning about knowledge: Children’s evaluations of generality and verifiability

    Science.gov (United States)

    Koenig, Melissa A.; Cole, Caitlin A.; Meyer, Meredith; Ridge, Katherine E.; Kushnir, Tamar; Gelman, Susan A.

    2015-01-01

    In a series of experiments, we examined 3- to 8-year-old children’s (N = 223) and adults’ (N = 32) use of two properties of testimony to estimate a speaker’s knowledge: generality and verifiability. Participants were presented with a “Generic speaker” who made a series of 4 general claims about “pangolins” (a novel animal kind), and a “Specific speaker” who made a series of 4 specific claims about “this pangolin” as an individual. To investigate the role of verifiability, we systematically varied whether the claim referred to a perceptually-obvious feature visible in a picture (e.g., “has a pointy nose”) or a non-evident feature that was not visible (e.g., “sleeps in a hollow tree”). Three main findings emerged: (1) Young children showed a pronounced reliance on verifiability that decreased with age. Three-year-old children were especially prone to credit knowledge to speakers who made verifiable claims, whereas 7- to 8-year-olds and adults credited knowledge to generic speakers regardless of whether the claims were verifiable; (2) Children’s attributions of knowledge to generic speakers was not detectable until age 5, and only when those claims were also verifiable; (3) Children often generalized speakers’ knowledge outside of the pangolin domain, indicating a belief that a person’s knowledge about pangolins likely extends to new facts. Findings indicate that young children may be inclined to doubt speakers who make claims they cannot verify themselves, as well as a developmentally increasing appreciation for speakers who make general claims. PMID:26451884

  10. Evaluation of the three-nucleon analyzing power puzzle

    International Nuclear Information System (INIS)

    Tornow, W.; Witala, H.

    1998-01-01

    The current status of the three-nucleon analyzing power puzzle is reviewed. Applying tight constraints on the allowed deviations between calculated predictions and accepted values for relevant nucleon-nucleon observables reveals that energy-independent correction factors applied to the ³P_j nucleon-nucleon interactions cannot solve the puzzle. Furthermore, using the same constraints, charge-independence breaking in the ³P_j nucleon-nucleon interactions can be ruled out as a possible tool to improve the agreement between three-nucleon calculations and data. The study of the energy dependence of the three-nucleon analyzing power puzzle gives clear evidence that the ³P_j nucleon-nucleon interactions obtained from phase-shift analyses and used in potential models are correct above about 25 MeV, i.e., the ³P_j nucleon-nucleon interactions have to be modified only at lower energies in order to solve the three-nucleon analyzing power puzzle, unless new three-nucleon forces can be found that account for the three-nucleon analyzing power puzzle without destroying the beautiful agreement between rigorous three-nucleon calculations and a large body of accurate three-nucleon data. (orig.)

  11. Evaluation of the three-nucleon analyzing power puzzle

    Energy Technology Data Exchange (ETDEWEB)

    Tornow, W. [Duke Univ., Durham, NC (United States). Dept. of Physics; Triangle Univ. Nuclear Lab., Durham, NC (United States)]; Witala, H. [Uniwersytet Jagiellonski, Cracow (Poland). Inst. Fizyki]

    1998-07-20

    The current status of the three-nucleon analyzing power puzzle is reviewed. Applying tight constraints on the allowed deviations between calculated predictions and accepted values for relevant nucleon-nucleon observables reveals that energy-independent correction factors applied to the ³P_j nucleon-nucleon interactions cannot solve the puzzle. Furthermore, using the same constraints, charge-independence breaking in the ³P_j nucleon-nucleon interactions can be ruled out as a possible tool to improve the agreement between three-nucleon calculations and data. The study of the energy dependence of the three-nucleon analyzing power puzzle gives clear evidence that the ³P_j nucleon-nucleon interactions obtained from phase-shift analyses and used in potential models are correct above about 25 MeV, i.e., the ³P_j nucleon-nucleon interactions have to be modified only at lower energies in order to solve the three-nucleon analyzing power puzzle, unless new three-nucleon forces can be found that account for the three-nucleon analyzing power puzzle without destroying the beautiful agreement between rigorous three-nucleon calculations and a large body of accurate three-nucleon data. (orig.) 18 refs.

  12. Effects of post mortem temperature on rigor tension, shortening and ...

    African Journals Online (AJOL)

    Fully developed rigor mortis in muscle is characterised by maximum loss of extensibility. The course of post mortem changes in ostrich muscle was studied by following isometric tension, shortening and change in pH during the first 24 h post mortem within muscle strips from the muscularis gastrocnemius, pars interna at ...

  13. Pre-rigor temperature and the relationship between lamb tenderisation, free water production, bound water and dry matter.

    Science.gov (United States)

    Devine, Carrick; Wells, Robyn; Lowe, Tim; Waller, John

    2014-01-01

    The M. longissimus muscles from lambs electrically stimulated at 15 min post-mortem were removed after grading, wrapped in polythene film and held at 4 (n=6), 7 (n=6), 15 (n=6, n=8) or 35°C (n=6) until rigor mortis, then aged at 15°C for 0, 4, 24 and 72 h post-rigor. Centrifuged free water increased exponentially, while bound water, dry matter and shear force decreased exponentially over time. Decreases in shear force and increases in free water were closely related (r² = 0.52) and were unaffected by pre-rigor temperatures. © 2013.
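
    The reported time courses (free water increasing, shear force decreasing exponentially) correspond to a single-exponential ageing model; a curve-fitting sketch with invented data points, not the study's measurements, follows.

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_decay(t, a, k, c):
    """Shear force model: a * exp(-k * t) + c."""
    return a * np.exp(-k * t) + c

# Hypothetical ageing times (h post-rigor) and shear force values (N).
t = np.array([0.0, 4.0, 24.0, 72.0])
shear = np.array([95.0, 80.0, 55.0, 42.0])

(a, k, c), _ = curve_fit(exp_decay, t, shear, p0=(60.0, 0.05, 40.0))
print(f"rate constant k = {k:.3f} per hour, asymptote c = {c:.1f} N")
```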

  14. New rigorous asymptotic theorems for inverse scattering amplitudes

    International Nuclear Information System (INIS)

    Lomsadze, Sh.Yu.; Lomsadze, Yu.M.

    1984-01-01

    The rigorous asymptotic theorems, both of integral and local types, obtained earlier and establishing logarithmic and in some cases even power correlations between the real and imaginary parts of scattering amplitudes F±, are extended to the inverse amplitudes 1/F±. One also succeeds in establishing power correlations of a new type between the real and imaginary parts, both for the amplitudes themselves and for the inverse ones. All the obtained assertions are convenient to test in high energy experiments when the amplitudes show asymptotic behaviour.

  15. Applying rigorous decision analysis methodology to optimization of a tertiary recovery project

    International Nuclear Information System (INIS)

    Wackowski, R.K.; Stevens, C.E.; Masoner, L.O.; Attanucci, V.; Larson, J.L.; Aslesen, K.S.

    1992-01-01

    This paper reports that the intent of this study was to rigorously examine all of the possible expansion, investment, operational, and CO₂ purchase/recompression scenarios (over 2500) to yield a strategy that would maximize the net present value of the CO₂ project at the Rangely Weber Sand Unit. Traditional methods of project management, which involve analyzing large numbers of single-case economic evaluations, were found to be too cumbersome and inaccurate for an analysis of this scope. The decision analysis methodology utilized a statistical approach which resulted in a range of economic outcomes. Advantages of the decision analysis methodology included: a more organized approach to classification of decisions and uncertainties; a clear sensitivity method to identify the key uncertainties; an application of probabilistic analysis through the decision tree; and a comprehensive display of the range of possible outcomes for communication to decision makers. This range made it possible to consider the upside and downside potential of the options and to weigh these against the Unit's strategies. Savings in time and manpower required to complete the study were also realized.

  16. Double phosphorylation of the myosin regulatory light chain during rigor mortis of bovine Longissimus muscle.

    Science.gov (United States)

    Muroya, Susumu; Ohnishi-Kameyama, Mayumi; Oe, Mika; Nakajima, Ikuyo; Shibata, Masahiro; Chikuni, Koichi

    2007-05-16

    To investigate changes in myosin light chains (MyLCs) during postmortem aging of the bovine longissimus muscle, we performed two-dimensional gel electrophoresis followed by identification with matrix-assisted laser desorption ionization time-of-flight mass spectrometry. The results of fluorescent differential gel electrophoresis showed that two spots of the myosin regulatory light chain (MyLC2) at pI values of 4.6 and 4.7 shifted toward those at pI values of 4.5 and 4.6, respectively, by 24 h postmortem, when rigor mortis was complete. Meanwhile, the MyLC1 and MyLC3 spots did not change during the 14 days postmortem. Phosphoprotein-specific staining of the gels demonstrated that the MyLC2 proteins at pI values of 4.5 and 4.6 were phosphorylated. Furthermore, possible N-terminal region peptides containing one and two phosphoserine residues were detected in the mass spectra of the MyLC2 spots at pI values of 4.5 and 4.6, respectively. These results demonstrated that MyLC2 became doubly phosphorylated during rigor formation of the bovine longissimus, suggesting involvement of MyLC2 phosphorylation in the progression of beef rigor mortis. Keywords: bovine; myosin regulatory light chain (RLC, MyLC2); phosphorylation; rigor mortis; skeletal muscle.

  17. A rigorous derivation of gravitational self-force

    International Nuclear Information System (INIS)

    Gralla, Samuel E; Wald, Robert M

    2008-01-01

    There is general agreement that the MiSaTaQuWa equations should describe the motion of a 'small body' in general relativity, taking into account the leading order self-force effects. However, previous derivations of these equations have made a number of ad hoc assumptions and/or contain a number of unsatisfactory features. For example, all previous derivations have invoked, without proper justification, the step of 'Lorenz gauge relaxation', wherein the linearized Einstein equation is written in the form appropriate to the Lorenz gauge, but the Lorenz gauge condition is then not imposed-thereby making the resulting equations for the metric perturbation inequivalent to the linearized Einstein equations. (Such a 'relaxation' of the linearized Einstein equations is essential in order to avoid the conclusion that 'point particles' move on geodesics.) In this paper, we analyze the issue of 'particle motion' in general relativity in a systematic and rigorous way by considering a one-parameter family of metrics, g ab (λ), corresponding to having a body (or black hole) that is 'scaled down' to zero size and mass in an appropriate manner. We prove that the limiting worldline of such a one-parameter family must be a geodesic of the background metric, g ab (λ = 0). Gravitational self-force-as well as the force due to coupling of the spin of the body to curvature-then arises as a first-order perturbative correction in λ to this worldline. No assumptions are made in our analysis apart from the smoothness and limit properties of the one-parameter family of metrics, g ab (λ). Our approach should provide a framework for systematically calculating higher order corrections to gravitational self-force, including higher multipole effects, although we do not attempt to go beyond first-order calculations here. The status of the MiSaTaQuWa equations is explained
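
    The scaling construction can be summarized schematically (notation follows the abstract; the display is an indicative restatement, not the paper's precise statements):

```latex
g_{ab}(\lambda) = g_{ab}(0) + \lambda\, h_{ab} + O(\lambda^{2}),
\qquad
u^{b} \nabla_{b} u^{a} = 0 \quad \text{on the limiting worldline},
```

    so that the limiting worldline is a geodesic of the background metric g_{ab}(0), with the gravitational self-force and the spin-curvature force entering as the first-order correction in λ to this worldline.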

  18. The effect of temperature on the mechanical aspects of rigor mortis in a liquid paraffin model.

    Science.gov (United States)

    Ozawa, Masayoshi; Iwadate, Kimiharu; Matsumoto, Sari; Asakura, Kumiko; Ochiai, Eriko; Maebashi, Kyoko

    2013-11-01

    Rigor mortis is an important phenomenon to estimate the postmortem interval in forensic medicine. Rigor mortis is affected by temperature. We measured stiffness of rat muscles using a liquid paraffin model to monitor the mechanical aspects of rigor mortis at five temperatures (37, 25, 10, 5 and 0°C). At 37, 25 and 10°C, the progression of stiffness was slower in cooler conditions. At 5 and 0°C, the muscle stiffness increased immediately after the muscles were soaked in cooled liquid paraffin and then muscles gradually became rigid without going through a relaxed state. This phenomenon suggests that it is important to be careful when estimating the postmortem interval in cold seasons. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  19. A rigorous treatment of uncertainty quantification for Silicon damage metrics

    International Nuclear Information System (INIS)

    Griffin, P.

    2016-01-01

    This report summarizes the contributions made by Sandia National Laboratories in support of the International Atomic Energy Agency (IAEA) Nuclear Data Section (NDS) Technical Meeting (TM) on Nuclear Reaction Data and Uncertainties for Radiation Damage. This work focused on a rigorous treatment of the uncertainties affecting the characterization of the displacement damage seen in silicon semiconductors. (author)

  20. Paper 3: Content and Rigor of Algebra Credit Recovery Courses

    Science.gov (United States)

    Walters, Kirk; Stachel, Suzanne

    2014-01-01

    This paper describes the content, organization and rigor of the face-to-face (f2f) and online summer algebra courses that were delivered in summers 2011 and 2012. Examining the content of both types of courses is important because research suggests that algebra courses with certain features may be better than others in promoting success for struggling students.…

  1. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

    Full Text Available Today it is important for software companies to build software systems within short time frames, to reduce costs, and to maintain a good market position. Well-organized, systematic development approaches are therefore required. Reusing well-tested software components can be a good way to develop software applications effectively: reuse is less expensive and less time consuming than development from scratch. But it is dangerous to assume that software components can be combined without any problems. Individual components may be well tested, yet problems still occur when they are composed, and most of these problems stem from interaction and communication. To avoid such errors, a framework has to be developed for analysing software components and determining their compatibility. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment is designed that checks the compatibility of black-box software components. This article addresses the question of how coupled software components can be verified using an analyzer framework, and describes the role of the ASLT. Black-box software components and the Abstract Syntax Language Tree form the basis of the proposed framework and are discussed here as background. The practical implementation of the framework is discussed, and results are shown using a test environment.
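
    A minimal, hedged sketch of the interface-checking idea follows: walk the syntax trees of two components and flag calls whose signatures do not match. Python's ast module stands in here for the ASLT, and the component sources and function names are hypothetical.

```python
import ast

# Hypothetical provider and consumer component sources.
provider_src = """
def read_sensor(channel): ...
def calibrate(channel, offset): ...
"""

consumer_src = """
def poll(ch):
    return read_sensor(ch)        # matches the provider's signature
def setup(ch):
    return calibrate(ch)          # BUG: provider expects two arguments
"""

def exported_arities(src):
    """Map each function a component defines to its number of parameters."""
    return {n.name: len(n.args.args)
            for n in ast.walk(ast.parse(src)) if isinstance(n, ast.FunctionDef)}

def check_calls(consumer, provided):
    """Flag calls into the provider whose argument count does not match."""
    problems = []
    for node in ast.walk(ast.parse(consumer)):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            name = node.func.id
            if name in provided and len(node.args) != provided[name]:
                problems.append(f"{name}: expected {provided[name]} args, "
                                f"got {len(node.args)}")
    return problems

print(check_calls(consumer_src, exported_arities(provider_src)))
# -> ['calibrate: expected 2 args, got 1']
```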

  2. Re-establishment of rigor mortis: evidence for a considerably longer post-mortem time span.

    Science.gov (United States)

    Crostack, Chiara; Sehner, Susanne; Raupach, Tobias; Anders, Sven

    2017-07-01

    Re-establishment of rigor mortis following mechanical loosening is used as part of the complex method for the forensic estimation of the time since death in human bodies and has formerly been reported to occur up to 8-12 h post-mortem (hpm). We recently described our observation of the phenomenon at up to 19 hpm in cases of in-hospital death. Due to the case selection (preceding illness, immobilisation), transfer of these results to forensic cases might be limited. We therefore examined 67 out-of-hospital cases of sudden death with known time points of death. Re-establishment of rigor mortis was positive in 52.2% of cases and was observed up to 20 hpm. In contrast to the current doctrine that a recurrence of rigor mortis is always of a lesser degree than its first manifestation in a given patient, muscular rigidity at re-establishment equalled or even exceeded the degree observed before dissolving in 21 joints. Furthermore, this is the first study to describe that the phenomenon appears to be independent of body or ambient temperature.

  3. Evaluation of verifiability in HAL/S. [programming language for aerospace computers

    Science.gov (United States)

    Young, W. D.; Tripathi, A. R.; Good, D. I.; Browne, J. C.

    1979-01-01

    HAL/S provides limited support for writing verifiable programs, a characteristic highly desirable in aerospace applications, since many of its features do not lend themselves to existing verification techniques. The methods of language evaluation are described, along with the means by which language features are evaluated for verifiability. These methods are applied in this study to various features of HAL/S to identify specific areas in which the language falls short with respect to verifiability. Some conclusions are drawn for the design of programming languages for aerospace applications, and ongoing work to identify a verifiable subset of HAL/S is described.

  4. Differential algebras with remainder and rigorous proofs of long-term stability

    International Nuclear Information System (INIS)

    Berz, Martin

    1997-01-01

    It is shown how in addition to determining Taylor maps of general optical systems, it is possible to obtain rigorous interval bounds for the remainder term of the n-th order Taylor expansion. To this end, the three elementary operations of addition, multiplication, and differentiation in the Differential Algebraic approach are augmented by suitable interval operations in such a way that a remainder bound of the sum, product, and derivative is obtained from the Taylor polynomial and remainder bound of the operands. The method can be used to obtain bounds for the accuracy with which a Taylor map represents the true map of the particle optical system. In a more general sense, it is also useful for a variety of other numerical problems, including rigorous global optimization of highly complex functions. Combined with methods to obtain pseudo-invariants of repetitive motion and extensions of the Lyapunov- and Nekhoroshev stability theory, the latter can be used to guarantee stability for storage rings and other weakly nonlinear systems
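
    As a rough illustration of the augmented arithmetic described above, the sketch below carries a truncated Taylor polynomial together with an interval enclosing everything the polynomial cannot represent, and propagates both parts through sums and products. The class and the crude bounding are illustrative assumptions, not the Differential Algebraic implementation in COSY INFINITY; a real implementation would also round every floating-point operation outward.

```python
from dataclasses import dataclass
from typing import List, Tuple

Interval = Tuple[float, float]

def iadd(a: Interval, b: Interval) -> Interval:
    return (a[0] + b[0], a[1] + b[1])

def imul(a: Interval, b: Interval) -> Interval:
    ps = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(ps), max(ps))

def bound_poly(coeffs: List[float]) -> Interval:
    # Crude enclosure of sum c_k x^k over x in [-1, 1]: |p(x) - c_0| <= sum |c_k|.
    s = sum(abs(c) for c in coeffs[1:])
    return (coeffs[0] - s, coeffs[0] + s) if coeffs else (0.0, 0.0)

@dataclass
class TaylorModel:
    coeffs: List[float]           # P(x) = sum coeffs[k] * x**k
    rem: Interval = (0.0, 0.0)    # rigorous enclosure of the truncation error
    order: int = 5

    def __add__(self, other: "TaylorModel") -> "TaylorModel":
        n = max(len(self.coeffs), len(other.coeffs))
        c = [(self.coeffs[k] if k < len(self.coeffs) else 0.0) +
             (other.coeffs[k] if k < len(other.coeffs) else 0.0)
             for k in range(n)]
        return TaylorModel(c, iadd(self.rem, other.rem), self.order)

    def __mul__(self, other: "TaylorModel") -> "TaylorModel":
        prod = [0.0] * (len(self.coeffs) + len(other.coeffs) - 1)
        for i, a in enumerate(self.coeffs):
            for j, b in enumerate(other.coeffs):
                prod[i + j] += a * b
        keep, spill = prod[: self.order + 1], prod[self.order + 1 :]
        # Remainder: truncated high-order terms plus every cross term
        # involving the operands' remainder intervals.
        rem = bound_poly([0.0] * (self.order + 1) + spill) if spill else (0.0, 0.0)
        rem = iadd(rem, imul(self.rem, bound_poly(other.coeffs)))
        rem = iadd(rem, imul(other.rem, bound_poly(self.coeffs)))
        rem = iadd(rem, imul(self.rem, other.rem))
        return TaylorModel(keep, rem, self.order)

x = TaylorModel([0.0, 1.0])        # the monomial x on [-1, 1]
f = x * x + x                      # low order, so the remainder stays (0, 0)
print(f.coeffs, f.rem)             # [0.0, 1.0, 1.0] (0.0, 0.0)
```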

  5. A Draft Conceptual Framework of Relevant Theories to Inform Future Rigorous Research on Student Service-Learning Outcomes

    Science.gov (United States)

    Whitley, Meredith A.

    2014-01-01

    While the quality and quantity of research on service-learning has increased considerably over the past 20 years, researchers as well as governmental and funding agencies have called for more rigor in service-learning research. One key variable in improving rigor is using relevant existing theories to improve the research. The purpose of this…

  6. Building an Evidence Base to Inform Interventions for Pregnant and Parenting Adolescents: A Call for Rigorous Evaluation

    Science.gov (United States)

    Burrus, Barri B.; Scott, Alicia Richmond

    2012-01-01

    Adolescent parents and their children are at increased risk for adverse short- and long-term health and social outcomes. Effective interventions are needed to support these young families. We studied the evidence base and found a dearth of rigorously evaluated programs. Strategies from successful interventions are needed to inform both intervention design and policies affecting these adolescents. The lack of rigorous evaluations may be attributable to inadequate emphasis on and sufficient funding for evaluation, as well as to challenges encountered by program evaluators working with this population. More rigorous program evaluations are urgently needed to provide scientifically sound guidance for programming and policy decisions. Evaluation lessons learned have implications for other vulnerable populations. PMID:22897541

  7. A performance evaluation of personnel identity verifiers

    International Nuclear Information System (INIS)

    Maxwell, R.L.; Wright, L.J.

    1987-01-01

    Personnel identity verification devices, which are based on the examination and assessment of a body feature or a unique repeatable personal action, are steadily improving. These biometric devices are becoming more practical with respect to accuracy, speed, user compatibility, reliability and cost, but more development is necessary to satisfy the varied and sometimes ill-defined future requirements of the security industry. In an attempt to maintain an awareness of the availability and the capabilities of identity verifiers for the DOE security community, Sandia Laboratories continues to comparatively evaluate the capabilities and improvements of developing devices. An evaluation of several recently available verifiers is discussed in this paper. Operating environments and procedures more typical of physical access control use can reveal performance substantially different from the basic laboratory tests

  8. Verifiable Distribution of Material Goods Based on Cryptology

    Directory of Open Access Journals (Sweden)

    Radomír Palovský

    2015-12-01

    Full Text Available Counterfeiting of material goods is a general problem. In this paper an architecture for verifiable distribution of material goods is presented. The distribution is based on printing a QR code on the goods that contains a digitally signed serial number of the product, so that the validity of the digital signature can be verified by a customer. An extension that adds digital signatures to revenue stamps used for state-controlled goods is also presented. A discussion of the possibilities for making copies leads to the conclusion that cryptographic security needs to be complemented by technical barriers to copying.
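
    A hedged sketch of the sign-and-verify flow (not the authors' exact scheme): the producer signs each serial number, the QR code carries the serial plus the signature, and a customer application checks the signature against the producer's public key. The payload format and the use of Ed25519 via the Python cryptography package are assumptions.

```python
import base64
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

producer_key = Ed25519PrivateKey.generate()   # kept secret by the producer
public_key = producer_key.public_key()        # distributed to customer apps

def make_qr_payload(serial: str) -> str:
    """Text to encode into the QR code: serial plus base64 signature."""
    sig = producer_key.sign(serial.encode())
    return serial + "|" + base64.b64encode(sig).decode()

def verify_qr_payload(payload: str) -> bool:
    """Customer-side check of a scanned QR payload."""
    serial, sig_b64 = payload.rsplit("|", 1)
    try:
        public_key.verify(base64.b64decode(sig_b64), serial.encode())
        return True
    except InvalidSignature:
        return False

payload = make_qr_payload("PRODUCT-000123")
assert verify_qr_payload(payload)
# A signature transplanted onto a different serial fails verification.
assert not verify_qr_payload("PRODUCT-999999|" + payload.rsplit("|", 1)[1])
```

    As the abstract itself notes, a verified signature proves the serial number is genuine but not that the label is original: an exact copy of a valid QR code still verifies, which is why technical barriers to copying remain necessary.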

  9. New approach to analyzing vulnerability

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.; Carlson, R.L.; Riedeman, G.W.

    1986-01-01

    The Westinghouse Hanford Company (WHC) has recently completed construction of the Fuel Cycle Plant (FCP) at Richland, Washington. At start-up the facility will fabricate driver fuel for the Fast Flux Test Facility in the Secure Automated Fabrication line. After construction completion, but before facility certification, the Department of Energy (DOE) Richland Operation Office requested that a vulnerability analysis be performed which assumed multiple insiders as a threat to the security system. A unique method of analyzing facility vulnerabilities was developed at the Security Applications Center (SAC), which is managed by WHC for DOE. The method that was developed verifies a previous vulnerability assessment, as well as introducing a modeling technique which analyzes security alarms in relation to delaying factors and possible insider activities. With this information it is possible to assess the relative strength or weakness of various possible routes to and from a target within a facility

  10. Verified compilation of Concurrent Managed Languages

    Science.gov (United States)

    2017-11-01

    Final technical report on the verified compilation of concurrent managed languages, Purdue University, November 2017; approved for public release.

  11. A new twist to the long-standing three-nucleon analyzing power puzzle

    Energy Technology Data Exchange (ETDEWEB)

    Neidel, E.M.; Tornow, W.; Gonzalez Trotter, D.E.; Howell, C.R.; Crowell, A.S.; Macri, R.A.; Walter, R.L.; Weisel, G.J.; Esterline, J.; Witala, H.; Crowe, B.J.; Pedroni, R.S.; Markoff, D.M

    2003-01-16

    New results for the neutron-deuteron analyzing power A_y(θ) at E_n = 1.2 and 1.9 MeV and their comparison to proton-deuteron data reveal a sizeable and unexpected difference which increases with decreasing center-of-mass energy. This finding calls for the theoretical treatment of a subtle electromagnetic effect presently not incorporated in rigorous three-nucleon scattering calculations, before it is justified to invoke charge-dependent three-nucleon forces and/or other new physics.

  12. Reciprocity relations in transmission electron microscopy: A rigorous derivation.

    Science.gov (United States)

    Krause, Florian F; Rosenauer, Andreas

    2017-01-01

    A concise derivation of the principle of reciprocity applied to realistic transmission electron microscopy setups is presented, making use of the multislice formalism. The equivalence of images acquired in conventional and scanning mode is thereby rigorously shown. The conditions for the applicability of the reciprocity relations found are discussed. Furthermore, the positions of apertures in relation to the corresponding lenses are considered, a subject that has scarcely been addressed in previous publications. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. The effect of rigor mortis on the passage of erythrocytes and fluid through the myocardium of isolated dog hearts.

    Science.gov (United States)

    Nevalainen, T J; Gavin, J B; Seelye, R N; Whitehouse, S; Donnell, M

    1978-07-01

    The effect of normal and artificially induced rigor mortis on the vascular passage of erythrocytes and fluid through isolated dog hearts was studied. Increased rigidity of 6-mm thick transmural sections through the centre of the posterior papillary muscle was used as an indication of rigor. The perfusibility of the myocardium was tested by injecting 10 ml of 1% sodium fluorescein in Hanks solution into the circumflex branch of the left coronary artery. In prerigor hearts (20 minute incubation) fluorescein perfused the myocardium evenly whether or not it was preceded by an injection of 10 ml of heparinized dog blood. Rigor mortis developed in all hearts after 90 minutes incubation or within 20 minutes of perfusing the heart with 50 ml of 5 mM iodoacetate in Hanks solution. Fluorescein injected into hearts in rigor did not enter the posterior papillary muscle and adjacent subendocardium whether or not it was preceded by heparinized blood. Thus the vascular occlusion caused by rigor in the dog heart appears to be so effective that it prevents flow into the subendocardium of small soluble ions such as fluorescein.

  14. Statistics for mathematicians a rigorous first course

    CERN Document Server

    Panaretos, Victor M

    2016-01-01

    This textbook provides a coherent introduction to the main concepts and methods of one-parameter statistical inference. Intended for students of Mathematics taking their first course in Statistics, the focus is on Statistics for Mathematicians rather than on Mathematical Statistics. The goal is not to focus on the mathematical/theoretical aspects of the subject, but rather to provide an introduction to the subject tailored to the mindset and tastes of Mathematics students, who are sometimes turned off by the informal nature of Statistics courses. This book can be used as the basis for an elementary semester-long first course on Statistics with a firm sense of direction that does not sacrifice rigor. The deeper goal of the text is to attract the attention of promising Mathematics students.

  15. Classroom Experiment to Verify the Lorentz Force

    Indian Academy of Sciences (India)

    Somnath Basu, Anindita Bose, Sumit Kumar Sinha, Pankaj Vishe and S. Chatterjee. Resonance – Journal of Science Education, Volume 8, Issue 3, March 2003, pp. 81-86.

  16. Atlantic salmon skin and fillet color changes effected by perimortem handling stress, rigor mortis, and ice storage.

    Science.gov (United States)

    Erikson, U; Misimi, E

    2008-03-01

    The changes in skin and fillet color of anesthetized and exhausted Atlantic salmon were determined immediately after killing, during rigor mortis, and after ice storage for 7 d. Skin color (CIE L*, a*, b*, and related values) was determined by a Minolta Chroma Meter. Roche SalmoFan Lineal and Roche Color Card values were determined by a computer vision method and a sensory panel. Before color assessment, the stress levels of the 2 fish groups were characterized in terms of white muscle parameters (pH, rigor mortis, and core temperature). The results showed that perimortem handling stress initially significantly affected several color parameters of skin and fillets. Significant transient fillet color changes also occurred in the prerigor phase and during the development of rigor mortis. Our results suggested that fillet color was affected by postmortem glycolysis (pH drop, particularly in anesthetized fillets), then by onset and development of rigor mortis. The color change patterns during storage were different for the 2 groups of fish. The computer vision method was considered suitable for automated (online) quality control and grading of salmonid fillets according to color.

  17. Biomedical text mining for research rigor and integrity: tasks, challenges, directions.

    Science.gov (United States)

    Kilicoglu, Halil

    2017-06-13

    An estimated quarter of a trillion US dollars is invested in the biomedical research enterprise annually. There is growing alarm that a significant portion of this investment is wasted because of problems in reproducibility of research findings and in the rigor and integrity of research conduct and reporting. Recent years have seen a flurry of activities focusing on standardization and guideline development to enhance the reproducibility and rigor of biomedical research. Research activity is primarily communicated via textual artifacts, ranging from grant applications to journal publications. These artifacts can be both the source and the manifestation of practices leading to research waste. For example, an article may describe a poorly designed experiment, or the authors may reach conclusions not supported by the evidence presented. In this article, we pose the question of whether biomedical text mining techniques can assist the stakeholders in the biomedical research enterprise in doing their part toward enhancing research integrity and rigor. In particular, we identify four key areas in which text mining techniques can make a significant contribution: plagiarism/fraud detection, ensuring adherence to reporting guidelines, managing information overload and accurate citation/enhanced bibliometrics. We review the existing methods and tools for specific tasks, if they exist, or discuss relevant research that can provide guidance for future work. With the exponential increase in biomedical research output and the ability of text mining approaches to perform automatic tasks at large scale, we propose that such approaches can support tools that promote responsible research practices, providing significant benefits for the biomedical research enterprise. Published by Oxford University Press 2017. This work is written by a US Government employee and is in the public domain in the US.

  18. Rigorous results on measuring the quark charge below color threshold

    International Nuclear Information System (INIS)

    Lipkin, H.J.

    1979-01-01

    Rigorous theorems are presented showing that contributions from a color nonsinglet component of the current to matrix elements of a second order electromagnetic transition are suppressed by factors inversely proportional to the energy of the color threshold. Parton models which obtain matrix elements proportional to the color average of the square of the quark charge are shown to neglect terms of the same order of magnitude as terms kept. (author)

  19. On alternative approach for verifiable secret sharing

    OpenAIRE

    Kulesza, Kamil; Kotulski, Zbigniew; Pieprzyk, Joseph

    2002-01-01

    Secret sharing allows split/distributed control over a secret (e.g. a master key). Verifiable secret sharing (VSS) extends secret sharing with the capacity to verify shares. Usually this verification comes at a price. We propose a "free lunch", an approach that overcomes this inconvenience.
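
    For background, here is a minimal sketch of classical Feldman-style verifiable secret sharing, the kind of construction whose extra verification cost motivates the authors' proposal. The group parameters are toy-sized for readability; real deployments use large primes or elliptic curves.

```python
import random

q = 1019            # prime order of the subgroup (shares live in Z_q)
p = 2 * q + 1       # safe prime 2039
g = 4               # generator of the order-q subgroup mod p

def share(secret, t, n):
    """Split 'secret' into n Shamir shares, any t of which reconstruct it."""
    coeffs = [secret % q] + [random.randrange(q) for _ in range(t - 1)]
    commitments = [pow(g, a, p) for a in coeffs]          # published
    shares = [(i, sum(a * pow(i, k, q) for k, a in enumerate(coeffs)) % q)
              for i in range(1, n + 1)]
    return shares, commitments

def verify_share(i, s, commitments):
    """Check g^s == prod_k c_k^(i^k) mod p, without learning the secret."""
    lhs = pow(g, s, p)
    rhs = 1
    for k, c in enumerate(commitments):
        rhs = rhs * pow(c, pow(i, k, q), p) % p
    return lhs == rhs

shares, comms = share(secret=123, t=3, n=5)
assert all(verify_share(i, s, comms) for i, s in shares)
assert not verify_share(1, (shares[0][1] + 1) % q, comms)   # tampered share
```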

  20. A Verifiable Secret Shuffle of Homomorphic Encryptions

    DEFF Research Database (Denmark)

    Groth, Jens

    2003-01-01

    We show how to prove in honest verifier zero-knowledge the correctness of a shuffle of homomorphic encryptions (or homomorphic commitments). A shuffle consists of a rearrangement and re-encryption of the input ciphertexts so that the permutation is not revealed.…
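
    To make the object of the proof concrete, the sketch below shuffles ElGamal ciphertexts by permuting and re-encrypting them; the honest-verifier zero-knowledge proof that this was done correctly is exactly the paper's contribution and is not reproduced here. Parameters are toy-sized.

```python
import random

q = 1019; p = 2 * q + 1; g = 4          # toy order-q subgroup mod p
x = random.randrange(1, q)              # secret key
h = pow(g, x, p)                        # public key

def enc(m):
    """ElGamal encryption; m must lie in the order-q subgroup."""
    r = random.randrange(q)
    return (pow(g, r, p), m * pow(h, r, p) % p)

def reenc(c):
    """Fresh randomness, same plaintext: homomorphic re-encryption."""
    s = random.randrange(q)
    return (c[0] * pow(g, s, p) % p, c[1] * pow(h, s, p) % p)

def dec(c):
    return c[1] * pow(c[0], q - x, p) % p   # c1^(q-x) = c1^(-x), order q

msgs = [pow(g, m, p) for m in (11, 22, 33)]    # encode into the subgroup
cts = [enc(m) for m in msgs]
shuffled = [reenc(c) for c in random.sample(cts, len(cts))]
# The multiset of plaintexts is preserved, but the ciphertexts are
# unlinkable to the inputs without the permutation and randomness.
assert sorted(dec(c) for c in shuffled) == sorted(msgs)
```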

  1. Rigor mortis development in turkey breast muscle and the effect of electrical stunning.

    Science.gov (United States)

    Alvarado, C Z; Sams, A R

    2000-11-01

    Rigor mortis development in turkey breast muscle and the effect of electrical stunning on this process are not well characterized. Some electrical stunning procedures have been known to inhibit postmortem (PM) biochemical reactions, thereby delaying the onset of rigor mortis in broilers. Therefore, this study was designed to characterize rigor mortis development in stunned and unstunned turkeys. A total of 154 turkey toms in two trials were conventionally processed at 20 to 22 wk of age. Turkeys were either stunned with a pulsed direct current (500 Hz, 50% duty cycle) at 35 mA (40 V) in a saline bath for 12 seconds or left unstunned as controls. At 15 min and 1, 2, 4, 8, 12, and 24 h PM, pectoralis samples were collected to determine pH, R-value, L* value, sarcomere length, and shear value. In Trial 1, the samples obtained for pH, R-value, and sarcomere length were divided into surface and interior samples. There were no significant differences between the surface and interior samples among any parameters measured. Muscle pH significantly decreased over time in stunned and unstunned birds through 2 h PM. The R-values increased to 8 h PM in unstunned birds and 24 h PM in stunned birds. The L* values increased over time, with no significant differences after 1 h PM for the controls and 2 h PM for the stunned birds. Sarcomere length increased through 2 h PM in the controls and 12 h PM in the stunned fillets. Cooked meat shear values decreased through the 1 h PM deboning time in the control fillets and 2 h PM in the stunned fillets. These results suggest that stunning delayed the development of rigor mortis through 2 h PM, but had no significant effect on the measured parameters at later time points, and that deboning turkey breasts at 2 h PM or later will not significantly impair meat tenderness.

  2. Rigorous Integration of Non-Linear Ordinary Differential Equations in Chebyshev Basis

    Czech Academy of Sciences Publication Activity Database

    Dzetkulič, Tomáš

    2015-01-01

    Vol. 69, No. 1 (2015), pp. 183-205. ISSN 1017-1398. R&D Projects: GA MŠk OC10048; GA ČR GD201/09/H057. Institutional research plan: CEZ:AV0Z10300504. Keywords: initial value problem; rigorous integration; Taylor model; Chebyshev basis. Subject RIV: IN - Informatics, Computer Science. Impact factor: 1.366, year: 2015

  3. Rigorous quantum limits on monitoring free masses and harmonic oscillators

    Science.gov (United States)

    Roy, S. M.

    2018-03-01

    There are heuristic arguments proposing that the accuracy of monitoring the position of a free mass m is limited by the standard quantum limit (SQL): σ²(X(t)) ≥ σ²(X(0)) + (t²/m²)σ²(P(0)) ≥ ℏt/m, where σ²(X(t)) and σ²(P(t)) denote variances of the Heisenberg representation position and momentum operators. Yuen [Phys. Rev. Lett. 51, 719 (1983), 10.1103/PhysRevLett.51.719] discovered that there are contractive states for which this result is incorrect. Here I prove universally valid rigorous quantum limits (RQL), viz. rigorous upper and lower bounds on σ²(X(t)) in terms of σ²(X(0)) and σ²(P(0)), given by Eq. (12) for a free mass and by Eq. (36) for an oscillator. I also obtain the maximally contractive and maximally expanding states which saturate the RQL, and use the contractive states to set up an Ozawa-type measurement theory with accuracies respecting the RQL but beating the standard quantum limit. The contractive states for oscillators improve on the Schrödinger coherent states of constant variance and may be useful for gravitational wave detection and optical communication.
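
    For context, the textbook heuristic behind the SQL, and the loophole the contractive states exploit, fits in a few lines (standard reasoning, not the paper's RQL derivation):

```latex
% Free evolution of a free mass:  X(t) = X(0) + (t/m) P(0).
% Assuming X(0) and P(0) are uncorrelated,
\sigma^2\bigl(X(t)\bigr)
  = \sigma^2\bigl(X(0)\bigr) + \frac{t^2}{m^2}\,\sigma^2\bigl(P(0)\bigr)
  \;\ge\; \frac{2t}{m}\,\sigma\bigl(X(0)\bigr)\,\sigma\bigl(P(0)\bigr)
  \;\ge\; \frac{\hbar t}{m},
% by the AM--GM inequality and \sigma(X(0))\,\sigma(P(0)) \ge \hbar/2.
% A contractive state has a negative X--P correlation, which adds the
% negative cross term 2(t/m)\,\mathrm{Cov}(X(0),P(0)) omitted above,
% so \sigma^2(X(t)) can dip below \hbar t/m for a finite time.
```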

  4. Parent Management Training-Oregon Model: Adapting Intervention with Rigorous Research.

    Science.gov (United States)

    Forgatch, Marion S; Kjøbli, John

    2016-09-01

    Parent Management Training-Oregon Model (PMTO®) is a set of theory-based parenting programs with status as evidence-based treatments. PMTO has been rigorously tested in efficacy and effectiveness trials in different contexts, cultures, and formats. Parents, the presumed agents of change, learn core parenting practices, specifically skill encouragement, limit setting, monitoring/supervision, interpersonal problem solving, and positive involvement. The intervention effectively prevents and ameliorates children's behavior problems by replacing coercive interactions with positive parenting practices. Delivery format includes sessions with individual families in agencies or families' homes, parent groups, and web-based and telehealth communication. Mediational models have tested parenting practices as mechanisms of change for children's behavior and found support for the theory underlying PMTO programs. Moderating effects include children's age, maternal depression, and social disadvantage. The Norwegian PMTO implementation is presented as an example of how PMTO has been tailored to reach diverse populations as delivered by multiple systems of care throughout the nation. An implementation and research center in Oslo provides infrastructure and promotes collaboration between practitioners and researchers to conduct rigorous intervention research. Although evidence-based and tested within a wide array of contexts and populations, PMTO must continue to adapt to an ever-changing world. © 2016 Family Process Institute.

  5. Rigor force responses of permeabilized fibres from fast and slow skeletal muscles of aged rats.

    Science.gov (United States)

    Plant, D R; Lynch, G S

    2001-09-01

    1. Ageing is generally associated with a decline in skeletal muscle mass and strength and a slowing of muscle contraction, factors that impact upon the quality of life for the elderly. The mechanisms underlying this age-related muscle weakness have not been fully resolved. The purpose of the present study was to determine whether the decrease in muscle force as a consequence of age could be attributed partly to a decrease in the number of cross-bridges participating during contraction. 2. Given that the rigor force is proportional to the approximate total number of interacting sites between the actin and myosin filaments, we tested the null hypothesis that the rigor force of permeabilized muscle fibres from young and old rats would not be different. 3. Permeabilized fibres from the extensor digitorum longus (fast-twitch; EDL) and soleus (predominantly slow-twitch) muscles of young (6 months of age) and old (27 months of age) male F344 rats were activated in Ca2+-buffered solutions to determine force-pCa characteristics (where pCa = -log(10)[Ca2+]) and then in solutions lacking ATP and Ca2+ to determine rigor force levels. 4. The rigor forces for EDL and soleus muscle fibres were not different between young and old rats, indicating that the approximate total number of cross-bridges that can be formed between filaments did not decline with age. We conclude that the age-related decrease in force output is more likely attributed to a decrease in the force per cross-bridge and/or decreases in the efficiency of excitation-contraction coupling.

  6. Verified scientific findings

    International Nuclear Information System (INIS)

    Bullinger, M.G.

    1982-01-01

    In this essay, the author attempts to enlighten the reader as to the meaning of the term "verified scientific findings" in section 13, sub-section 1, sentence 2 of the new Chemicals Control Law. The examples given here are the generally accepted regulations with regard to technology (sections 7a and 18b of the WHG (law on water economy), section 3, sub-section 1 of the machine and engine protection laws), to the status of technology (section 3, sub-section 6 of the BImSchG (Federal law on prevention of air-borne pollution)), and to the status of science (section 5, sub-section 2 of the AMG (drug legislation)). The "status of science and technology" as defined in sections 4 ff of the Atomic Energy Law (AtomG) and in sections 3, 4, 12, 2) of the First Radiation Protection Ordinance (1.StrlSch.VO) is also discussed. The author defines this, in his opinion "dynamic", term as the generally recognized result of scientific research and the respective possibilities of practical utilization of technology. (orig.)

  7. An experiment designed to verify the general theory of relativity; Une experience destinee a verifier la theorie de la relativite generalisee

    Energy Technology Data Exchange (ETDEWEB)

    Surdin, Maurice [Commissariat a l' energie atomique et aux energies alternatives - CEA (France)

    1960-07-01

    A project for an experiment that uses the effect of gravitation on maser-type clocks placed on the ground at two different altitudes, designed to verify the general theory of relativity. Reprint of a paper published in Comptes rendus des seances de l'Academie des Sciences, t. 250, p. 299-301, session of 11 January 1960.

  8. Optimised resource construction for verifiable quantum computation

    International Nuclear Information System (INIS)

    Kashefi, Elham; Wallden, Petros

    2017-01-01

    Recent developments have brought the possibility of achieving scalable quantum networks and quantum devices closer. From the computational point of view these emerging technologies become relevant when they are no longer classically simulatable. Hence a pressing challenge is the construction of practical methods to verify the correctness of the outcome produced by universal or non-universal quantum devices. A promising approach that has been extensively explored is the scheme of verification via encryption through blind quantum computation. We present here a new construction that simplifies the required resources for any such verifiable protocol. We obtain an overhead that is linear in the size of the input (computation), while the security parameter remains independent of the size of the computation and can be made exponentially small (with a small extra cost). Furthermore our construction is generic and could be applied to any universal or non-universal scheme with a given underlying graph. (paper)

  9. Verifying versus falsifying banknotes

    Science.gov (United States)

    van Renesse, Rudolf L.

    1998-04-01

    A series of counterfeit Dutch, German, English, and U.S. banknotes was examined with respect to the various modi operandi to imitate paper based, printed and post-printed security features. These features provide positive evidence (verifiability) as well as negative evidence (falsifiability). It appears that the positive evidence provided in most cases is insufficiently convincing: banknote inspection mainly rests on negative evidence. The act of falsifying (to prove to be false), however, is an inefficacious procedure. Ergonomic verificatory security features are demanded. This demand is increasingly met by security features based on nano- technology. The potential of nano-security has a twofold base: (1) the unique optical effects displayed allow simple, fast and unambiguous inspection, and (2) the nano-technology they are based on, makes successful counterfeit or simulation extremely improbable.

  10. Rigorous Performance Evaluation of Smartphone GNSS/IMU Sensors for ITS Applications

    Directory of Open Access Journals (Sweden)

    Vassilis Gikas

    2016-08-01

    Full Text Available With the rapid growth in smartphone technologies and improvement in their navigation sensors, an increasing amount of location information is now available, opening the road to the provision of new Intelligent Transportation System (ITS) services. Current smartphone devices embody miniaturized Global Navigation Satellite System (GNSS), Inertial Measurement Unit (IMU) and other sensors capable of providing user position, velocity and attitude. However, it is hard to characterize their actual positioning and navigation performance capabilities due to the disparate sensor and software technologies adopted among manufacturers and the high influence of environmental conditions, and therefore a unified certification process is missing. This paper presents the analysis results obtained from the assessment of two modern smartphones regarding their positioning accuracy (i.e., precision and trueness) capabilities (i.e., potential and limitations), based on a practical but rigorous methodological approach. Our investigation relies on the results of several vehicle tracking (i.e., cruising and maneuvering) tests, realized by comparing smartphone-obtained trajectories and kinematic parameters to those derived using a high-end GNSS/IMU system and advanced filtering techniques. Performance testing is undertaken for the HTC One S (Android) and iPhone 5s (iOS). Our findings indicate that the deviation of the smartphone locations from ground truth (trueness) deteriorates by a factor of two in obscured environments compared to those derived in open sky conditions. Moreover, it appears that the iPhone 5s produces relatively smaller and less dispersed error values compared to those computed for the HTC One S. Also, the navigation solution of the HTC One S appears to adapt faster to changes in environmental conditions, suggesting a somewhat different data filtering approach for the iPhone 5s. Testing the accuracy of the accelerometer and gyroscope sensors for a number of…

  11. A rigorous proof of the Landau-Peierls formula and much more

    DEFF Research Database (Denmark)

    Briet, Philippe; Cornean, Horia; Savoie, Baptiste

    2012-01-01

    We present a rigorous mathematical treatment of the zero-field orbital magnetic susceptibility of a non-interacting Bloch electron gas, at fixed temperature and density, for both metals and semiconductors/insulators. In particular, we obtain the Landau-Peierls formula in the low temperature and density limit, as conjectured by Kjeldaas and Kohn (Phys Rev 105:806–813, 1957).

  12. Unmet Need: Improving mHealth Evaluation Rigor to Build the Evidence Base.

    Science.gov (United States)

    Mookherji, Sangeeta; Mehl, Garrett; Kaonga, Nadi; Mechael, Patricia

    2015-01-01

    mHealth, the use of mobile technologies for health, is a growing element of health system activity globally, but evaluation of those activities remains quite scant and is an important knowledge gap for advancing mHealth activities. In 2010, the World Health Organization and Columbia University implemented a small-scale survey to generate preliminary data on evaluation activities used by mHealth initiatives. The authors describe self-reported data from 69 projects in 29 countries. The majority (74%) reported some sort of evaluation activity, primarily nonexperimental in design (62%). The authors developed a 6-point scale of evaluation rigor comprising information on use of comparison groups, sample size calculation, data collection timing, and randomization. The mean score was low (2.4); half (47%) were conducting evaluations with a minimum threshold (4+) of rigor, indicating use of a comparison group, while less than 20% had randomized the mHealth intervention. The authors were unable to assess whether the rigor score was appropriate for the type of mHealth activity being evaluated. What was clear was that although most data came from mHealth pilot projects aimed at scale-up, few had designed evaluations that would support crucial decisions on whether to scale up and how. Whether the mHealth activity is a strategy to improve health or a tool for achieving intermediate outcomes that should lead to better health, mHealth evaluations must be improved to generate robust evidence for cost-effectiveness assessment and to allow for accurate identification of the contribution of mHealth initiatives to health systems strengthening and the impact on actual health outcomes.

  13. Industrial applications of formal methods to model, design and analyze computer systems

    CERN Document Server

    Craigen, Dan

    1995-01-01

    Formal methods are mathematically-based techniques, often supported by reasoning tools, that can offer a rigorous and effective way to model, design and analyze computer systems. The purpose of this study is to evaluate international industrial experience in using formal methods. The cases selected are representative of industrial-grade projects and span a variety of application domains. The study had three main objectives: · To better inform deliberations within industry and government on standards and regulations; · To provide an authoritative record on the practical experience of formal methods.

  14. Neutron-deuteron analyzing power data at En = 21 MeV and the energy dependence of the three-nucleon analyzing power puzzle

    Science.gov (United States)

    Weisel, G. J.; Tornow, W.; Esterline, J. H.

    2015-08-01

    We present measurements of n-d analyzing power, A_y(θ), at E_n = 21.0 MeV. The experiment produces neutrons via the ²H(d,n)³He reaction and uses a deuterated liquid-scintillator center detector and six pairs of liquid-scintillator neutron side detectors. Elastic neutron scattering events are identified by using time-of-flight techniques and by setting a gate in the center-detector pulse-height spectrum. Beam polarization is monitored by using a high-pressure helium gas scintillator. The n-d A_y(θ) data at 21.0 MeV show a significant discrepancy with the results of rigorous three-body calculations and are consistent with data taken previously by us at 19.0 and 22.5 MeV. We review the overall energy dependence of the three-nucleon analyzing power puzzle in neutron-deuteron elastic scattering, using the best data available. We find that the relative difference between calculations and data is nearly constant at 25% up to E_n = 22.5 MeV.

  15. Estimation of the time since death--reconsidering the re-establishment of rigor mortis.

    Science.gov (United States)

    Anders, Sven; Kunz, Michaela; Gehl, Axel; Sehner, Susanne; Raupach, Tobias; Beck-Bornholdt, Hans-Peter

    2013-01-01

    In forensic medicine, there is an undefined data background for the phenomenon of re-establishment of rigor mortis after mechanical loosening, a method used in establishing time since death in forensic casework that is thought to occur up to 8 h post-mortem. Nevertheless, the method is widely described in textbooks on forensic medicine. We examined 314 joints (elbow and knee) of 79 deceased at defined time points up to 21 h post-mortem (hpm). Data were analysed using a random intercept model. Here, we show that re-establishment occurred in 38.5% of joints at 7.5 to 19 hpm. Therefore, the maximum time span for the re-establishment of rigor mortis appears to be 2.5-fold longer than thought so far. These findings have major impact on the estimation of time since death in forensic casework.

  16. Robustness and device independence of verifiable blind quantum computing

    International Nuclear Information System (INIS)

    Gheorghiu, Alexandru; Kashefi, Elham; Wallden, Petros

    2015-01-01

    Recent advances in theoretical and experimental quantum computing bring us closer to scalable quantum computing devices. This makes the need for protocols that verify the correct functionality of quantum operations timely and has led to the field of quantum verification. In this paper we address key challenges to make quantum verification protocols applicable to experimental implementations. We prove the robustness of the single server verifiable universal blind quantum computing protocol of Fitzsimons and Kashefi (2012 arXiv:1203.5217) in the most general scenario. This includes the case where the purification of the deviated input state is in the hands of an adversarial server. The proved robustness property allows the composition of this protocol with a device-independent state tomography protocol that we give, which is based on the rigidity of CHSH games as proposed by Reichardt et al (2013 Nature 496 456–60). The resulting composite protocol has lower round complexity for the verification of entangled quantum servers with a classical verifier and, as we show, can be made fault tolerant. (paper)

  17. NOS CO-OPS Water Level Data, Verified, Hourly

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset has verified (quality-controlled), hourly, water level (tide) data from NOAA NOS Center for Operational Oceanographic Products and Services (CO-OPS)....

  18. College Readiness in California: A Look at Rigorous High School Course-Taking

    Science.gov (United States)

    Gao, Niu

    2016-01-01

    Recognizing the educational and economic benefits of a college degree, education policymakers at the federal, state, and local levels have made college preparation a priority. There are many ways to measure college readiness, but one key component is rigorous high school coursework. California has not yet adopted a statewide college readiness…

  19. Rigor in Qualitative Supply Chain Management Research

    DEFF Research Database (Denmark)

    Goffin, Keith; Raja, Jawwad; Claes, Björn

    2012-01-01

    Purpose – The purpose of this paper is to share the authors' experiences of using the repertory grid technique in two supply chain management studies. The paper aims to demonstrate how the two studies provided insights into how qualitative techniques such as the repertory grid can be made more rigorous than in the past, and how results can be generated that are inaccessible using quantitative methods. Design/methodology/approach – This paper presents two studies undertaken using the repertory grid technique to illustrate its application in supply chain management research. Findings – The paper … reliability, and theoretical saturation. Originality/value – It is the authors' contention that the addition of the repertory grid technique to the toolset of methods used by logistics and supply chain management researchers can only enhance insights and the building of robust theories. Qualitative studies…

  20. Optimal correction and design parameter search by modern methods of rigorous global optimization

    International Nuclear Information System (INIS)

    Makino, K.; Berz, M.

    2011-01-01

    Frequently the design of schemes for correction of aberrations or the determination of possible operating ranges for beamlines and cells in synchrotrons exhibits multitudes of possibilities for their correction, usually appearing in disconnected regions of parameter space which cannot be directly qualified by analytical means. In such cases, frequently an abundance of optimization runs are carried out, each of which determines a local minimum depending on the specific chosen initial conditions. Practical solutions are then obtained through an often extended interplay of experienced manual adjustment of certain suitable parameters and local searches by varying other parameters. However, in a formal sense this problem can be viewed as a global optimization problem, i.e. the determination of all solutions within a certain range of parameters that lead to a specific optimum. For example, it may be of interest to find all possible settings of multiple quadrupoles that can achieve imaging; or to find ahead of time all possible settings that achieve a particular tune; or to find all possible manners to adjust nonlinear parameters to achieve correction of high order aberrations. These tasks can easily be phrased in terms of such an optimization problem; but while mathematically this formulation is often straightforward, it has been common belief that it is of limited practical value since the resulting optimization problem cannot usually be solved. However, recent significant advances in modern methods of rigorous global optimization make these methods feasible for optics design for the first time. The key ideas of the method lie in an interplay of rigorous local underestimators of the objective functions, and in using the underestimators to rigorously and iteratively eliminate regions that lie above already known upper bounds of the minima, in what is commonly known as a branch-and-bound approach. Recent enhancements of the Differential Algebraic methods used in particle…
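
    The following toy sketch shows the branch-and-bound skeleton described above in one variable, with naive interval arithmetic supplying the rigorous lower bounds used to discard regions. It is illustrative only; the Taylor-model-based underestimators discussed in the paper are far sharper, and outward rounding is omitted.

```python
def f_interval(lo, hi):
    """Interval enclosure of f(x) = x**4 - 3*x**2 + 1 over [lo, hi]."""
    def ipow(lo, hi, n):                    # enclosure of x**n
        a, b = lo ** n, hi ** n
        if n % 2 == 0 and lo < 0 < hi:      # even powers reach 0 inside
            return (0.0, max(a, b))
        return (min(a, b), max(a, b))
    p4, p2 = ipow(lo, hi, 4), ipow(lo, hi, 2)
    return (p4[0] - 3 * p2[1] + 1, p4[1] - 3 * p2[0] + 1)

def minimize(lo, hi, tol=1e-6):
    """Branch-and-bound: returns a bound converging to the global minimum."""
    upper = min(f_interval(x, x)[1] for x in (lo, hi, (lo + hi) / 2))
    boxes = [(lo, hi)]
    while boxes:
        a, b = boxes.pop()
        if f_interval(a, b)[0] > upper:     # box lies provably above the
            continue                        # best known value: eliminated
        mid = (a + b) / 2
        upper = min(upper, f_interval(mid, mid)[1])   # point sample
        if b - a > tol:
            boxes += [(a, mid), (mid, b)]
    return upper

print(minimize(-2.0, 2.0))   # close to the true minimum -1.25 at x = ±(3/2)**0.5
```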

  1. The rigorous bound on the transmission probability for massless scalar field of non-negative-angular-momentum mode emitted from a Myers-Perry black hole

    Energy Technology Data Exchange (ETDEWEB)

    Ngampitipan, Tritos, E-mail: tritos.ngampitipan@gmail.com [Faculty of Science, Chandrakasem Rajabhat University, Ratchadaphisek Road, Chatuchak, Bangkok 10900 (Thailand); Particle Physics Research Laboratory, Department of Physics, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330 (Thailand); Boonserm, Petarpa, E-mail: petarpa.boonserm@gmail.com [Department of Mathematics and Computer Science, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330 (Thailand); Chatrabhuti, Auttakit, E-mail: dma3ac2@gmail.com [Particle Physics Research Laboratory, Department of Physics, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330 (Thailand); Visser, Matt, E-mail: matt.visser@msor.vuw.ac.nz [School of Mathematics, Statistics, and Operations Research, Victoria University of Wellington, PO Box 600, Wellington 6140 (New Zealand)

    2016-06-02

    Hawking radiation is evidence for the existence of black holes. What an observer can measure through Hawking radiation is the transmission probability. In the laboratory, miniature black holes can successfully be generated; the generated black holes are, most commonly, Myers-Perry black holes. In this paper, we derive rigorous bounds on the transmission probabilities for massless scalar fields of non-negative-angular-momentum modes emitted from a generated Myers-Perry black hole in six, seven, and eight dimensions. The results show that for low energy, the rigorous bounds increase with the energy of the emitted particles, whereas for high energy they decrease with increasing energy. When the black holes spin faster, the rigorous bounds decrease. The rigorous bounds also decrease with the number of extra dimensions. Furthermore, comparison with the approximate transmission probability shows the rigorous bound to be useful.

  2. Rigorous classification and carbon accounting principles for low and Zero Carbon Cities

    International Nuclear Information System (INIS)

    Kennedy, Scott; Sgouridis, Sgouris

    2011-01-01

    A large number of communities, new developments, and regions aim to lower their carbon footprint and aspire to become 'zero carbon' or 'Carbon Neutral.' Yet there are neither clear definitions for the scope of emissions that such a label would address on an urban scale, nor is there a process for qualifying the carbon reduction claims. This paper addresses the question of how to define a zero carbon, Low Carbon, or Carbon Neutral urban development by proposing hierarchical emissions categories with three levels: Internal Emissions based on the geographical boundary, external emissions directly caused by core municipal activities, and internal or external emissions due to non-core activities. Each level implies a different carbon management strategy (eliminating, balancing, and minimizing, respectively) needed to meet a Net Zero Carbon designation. The trade-offs, implications, and difficulties of implementing carbon debt accounting based upon these definitions are further analyzed. - Highlights: → A gap exists in comprehensive and standardized accounting methods for urban carbon emissions. → We propose a comprehensive and rigorous City Framework for Carbon Accounting (CiFCA). → CiFCA classifies emissions hierarchically with corresponding carbon management strategies. → Adoption of CiFCA allows for meaningful comparisons of claimed performance of eco-cities.

  3. Some comments on rigorous quantum field path integrals in the analytical regularization scheme

    Energy Technology Data Exchange (ETDEWEB)

    Botelho, Luiz C.L. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil). Dept. de Matematica Aplicada]. E-mail: botelho.luiz@superig.com.br

    2008-07-01

    Through the systematic use of the Minlos theorem on the support of cylindrical measures on R^∞, we produce several mathematically rigorous path integrals in interacting euclidean quantum fields with Gaussian free measures defined by generalized powers of the Laplacian operator. (author)

  4. A plea for rigorous conceptual analysis as central method in transnational law design

    NARCIS (Netherlands)

    Rijgersberg, R.; van der Kaaij, H.

    2013-01-01

    Although shared problems are generally easily identified in transnational law design, it is considerably more difficult to design frameworks that transcend the peculiarities of local law in a univocal fashion. The following exposition is a plea for giving more prominence to rigorous conceptual analysis.

  5. Unary self-verifying symmetric difference automata

    CSIR Research Space (South Africa)

    Marais, Laurette

    2016-07-01

    Full Text Available Unary self-verifying symmetric difference automata. Laurette Marais and Lynette van Zijl, Department of Computer Science, Stellenbosch University. Abstract presented at the 18th International Workshop on Descriptional Complexity of Formal Systems, 5-8 July 2016, Bucharest, Romania.

  6. Automated measurement and control of concrete properties in a ready mix truck with VERIFI.

    Science.gov (United States)

    2014-02-01

    In this research, twenty batches of concrete with six different mixture proportions were tested with VERIFI to evaluate 1) accuracy and repeatability of VERIFI measurements, and 2) the ability of VERIFI to adjust slump automatically with water and admixture…

  7. Business rescue decision making through verifier determinants – ask the specialists

    Directory of Open Access Journals (Sweden)

    Marius Pretorius

    2013-11-01

    Full Text Available Orientation: Business rescue has become a critical part of business strategy decision making, especially during economic downturns and recessions. Past legislation has generally supported creditor-friendly regimes, and its mind-set still applies which increases the difficulty of such turnarounds. There are many questions and critical issues faced by those involved in rescue. Despite extensive theory in the literature on failure, there is a void regarding practical verifiers of the signs and causes of venture decline, as specialists are not forthcoming about what they regard as their “competitive advantage”. Research purpose: This article introduces the concept and role of “verifier determinants” of early warning signs, as a tool to confirm the causes of decline in order to direct rescue strategies and, most importantly, reduce time between the first observation and the implementation of the rescue. Motivation for the study: Knowing how specialist practitioners confirm causes of business decline could assist in deciding on strategies for the rescue earlier than can be done using traditional due diligence which is time consuming. Reducing time is a crucial element of a successful rescue. Research design and approach: The researchers interviewed specialist practitioners with extensive experience in rescue and turnaround. An experimental design was used to ensure the specialists evaluated the same real cases to extract their experiences and base their decisions on. Main findings: The specialists confirmed the use of verifier determinants and identified such determinants as they personally used them to confirm causes of decline. These verifier determinants were classified into five categories; namely, management, finance, strategic, banking and operations and marketing of the ventures under investigation. The verifier determinants and their use often depend heavily on subconscious (non-factual information based on previous experiences

  8. A cascading failure model for analyzing railway accident causation

    Science.gov (United States)

    Liu, Jin-Tao; Li, Ke-Ping

    2018-01-01

    In this paper, a new cascading failure model is proposed for quantitatively analyzing railway accident causation. In the model, the loads of nodes are redistributed according to the strength of the causal relationships between the nodes. By analyzing the actual situation of existing prevention measures, a critical threshold for the load parameter in the model is obtained. To verify the effectiveness of the proposed cascading model, simulation experiments of a train collision accident are performed. The results show that the cascading failure model describes the cascading process of a railway accident more accurately than previous models, and can quantitatively analyze the sensitivities and influence of the causes. In conclusion, this model can help reveal the latent rules of accident causation and so reduce the occurrence of railway accidents.
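
    A hedged sketch of how such a load-redistribution cascade can be simulated: a failed node sheds its load to its successors in proportion to the causal strength of each edge, and any node pushed past the critical threshold fails in turn. The graph, weights, and redistribution rule here are illustrative assumptions, not the paper's calibrated model.

```python
from collections import defaultdict

edges = {            # cause -> [(effect, causal strength)], hypothetical data
    "rail_defect": [("derailment", 0.7), ("delay", 0.3)],
    "overspeed":   [("derailment", 0.9)],
    "derailment":  [("collision", 0.6)],
}

def cascade(initial_loads, threshold):
    """Propagate failures until no node exceeds the critical threshold."""
    loads = defaultdict(float, initial_loads)
    failed = set()
    frontier = [n for n, l in loads.items() if l > threshold]
    while frontier:
        node = frontier.pop()
        if node in failed:
            continue
        failed.add(node)
        out = edges.get(node, [])
        total = sum(w for _, w in out)
        for succ, w in out:              # redistribute by relative strength
            loads[succ] += loads[node] * w / total if total else 0.0
            if loads[succ] > threshold and succ not in failed:
                frontier.append(succ)
    return failed

print(cascade({"rail_defect": 1.0}, threshold=0.5))
# e.g. {'rail_defect', 'derailment', 'collision'}; 'delay' stays below threshold
```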

  9. Demonstration of analyzers for multimode photonic time-bin qubits

    Science.gov (United States)

    Jin, Jeongwan; Agne, Sascha; Bourgoin, Jean-Philippe; Zhang, Yanbao; Lütkenhaus, Norbert; Jennewein, Thomas

    2018-04-01

    We demonstrate two approaches for unbalanced interferometers as time-bin qubit analyzers for quantum communication, robust against mode distortions and polarization effects as expected from free-space quantum communication systems, including wavefront deformations, path fluctuations, pointing errors, and optical elements. Despite strong spatial and temporal distortions of the optical mode of a time-bin qubit, entangled with a separate polarization qubit, we verify entanglement using the Negative Partial Transpose, with a measured visibility of up to 0.85 ± 0.01. The robustness of the analyzers is further demonstrated for various angles of incidence up to 0.2°. The output of the interferometers is coupled into multimode fiber, yielding a high system throughput of 0.74. Therefore, these analyzers are suitable and efficient for quantum communication over multimode optical channels.

  13. Lightweight ECC based RFID authentication integrated with an ID verifier transfer protocol.

    Science.gov (United States)

    He, Debiao; Kumar, Neeraj; Chilamkurti, Naveen; Lee, Jong-Hyouk

    2014-10-01

    The radio frequency identification (RFID) technology has been widely adopted and is being deployed as a dominant identification technology in the health care domain, for applications such as medical information authentication, patient tracking, and blood transfusion medicine. As security and privacy requirements for RFID-based authentication schemes become more stringent, elliptic curve cryptography (ECC) based RFID authentication schemes have been proposed to meet them. However, many recently published ECC-based RFID authentication schemes have serious security weaknesses. In this paper, we propose a new ECC-based RFID authentication scheme integrated with an ID verifier transfer protocol that overcomes the weaknesses of the existing schemes. A comprehensive security analysis shows the strong security properties provided by the proposed authentication scheme. Moreover, the performance of the proposed scheme is analyzed in terms of computational cost, communication cost, and storage requirements.
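
    To make the ECC building block concrete, here is a self-contained sketch of point addition and double-and-add scalar multiplication on a tiny textbook curve, plus a toy challenge-response that relies on the commutativity of scalar multiplication. The curve, the challenge flow, and all parameters are illustrative only; this is not the He et al. protocol, which additionally transfers an ID verifier.

    ```python
    # Toy curve y^2 = x^3 + 2x + 2 over F_17 (textbook example, group order 19).
    p, a = 17, 2
    G = (5, 1)                               # base point on the curve

    def inv(x):
        return pow(x, p - 2, p)              # modular inverse via Fermat

    def add(P, Q):
        """Elliptic-curve point addition (None = point at infinity)."""
        if P is None: return Q
        if Q is None: return P
        (x1, y1), (x2, y2) = P, Q
        if x1 == x2 and (y1 + y2) % p == 0:
            return None
        if P == Q:
            m = (3 * x1 * x1 + a) * inv(2 * y1) % p
        else:
            m = (y2 - y1) * inv(x2 - x1) % p
        x3 = (m * m - x1 - x2) % p
        return (x3, (m * (x1 - x3) - y1) % p)

    def mul(k, P):
        """Double-and-add scalar multiplication kP."""
        R = None
        while k:
            if k & 1: R = add(R, P)
            P, k = add(P, P), k >> 1
        return R

    # Toy challenge-response: the tag holds secret k, the reader knows K = kG.
    # For a fresh challenge c, the tag's k*(cG) must equal the reader's c*K.
    k = 7
    K = mul(k, G)
    c = 11                                   # reader's random challenge
    assert mul(k, mul(c, G)) == mul(c, K)
    print("toy ECC challenge-response verified")
    ```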

  14. A Practical Voter-Verifiable Election Scheme.

    OpenAIRE

    Chaum, D; Ryan, PYA; Schneider, SA

    2005-01-01

    We present an election scheme designed to allow voters to verify that their vote is accurately included in the count. The scheme provides a high degree of transparency whilst ensuring the secrecy of votes. Assurance is derived from close auditing of all the steps of the vote recording and counting process with minimal dependence on the system components. Thus, assurance arises from verification of the election rather than having to place trust in the correct behaviour of components of the voting system.

  15. Acceptance Test Report for the 241-AZ-101 Ultrasonic Interface Level Analyzer

    International Nuclear Information System (INIS)

    ANDREWS, J.E.

    2000-01-01

    This document comprises the Acceptance Test Report for the 241-AZ-101 Ultrasonic Interface Level Analyzer. This document presents the results of Acceptance Testing of the 241-AZ-101 Ultrasonic Interface Level Analyzers (URSILLAs). Testing of the URSILLAs was performed in accordance with ATP-260-001, ''URSILLA Pre-installation Acceptance Test Procedure''. The objective of the testing was to verify that all equipment and components function in accordance with design specifications and original equipment manufacturer's specifications

  16. Supersymmetry and the Parisi-Sourlas dimensional reduction: A rigorous proof

    International Nuclear Information System (INIS)

    Klein, A.; Landau, L.J.; Perez, J.F.

    1984-01-01

    Functional integrals that are formally related to the average correlation functions of a classical field theory in the presence of random external sources are given a rigorous meaning. Their dimensional reduction to the Schwinger functions of the corresponding quantum field theory in two fewer dimensions is proven. This is done by reexpressing those functional integrals as expectations of a supersymmetric field theory. The Parisi-Sourlas dimensional reduction of a supersymmetric field theory to a usual quantum field theory in two fewer dimensions is proven. (orig.)

  17. Application of the rigorous method to x-ray and neutron beam scattering on rough surfaces

    International Nuclear Information System (INIS)

    Goray, Leonid I.

    2010-01-01

    The paper presents a comprehensive numerical analysis of x-ray and neutron scattering from finite-conducting rough surfaces, performed within the framework of the boundary integral equation method in a rigorous formulation for high ratios of characteristic dimension to wavelength. The single integral equation obtained involves boundary integrals of the single and double layer potentials. A more general treatment of the energy conservation law applicable to absorption gratings and rough mirrors is considered. In order to compute the scattering intensity of rough surfaces using the forward electromagnetic solver, Monte Carlo simulation is employed to average the deterministic diffraction grating efficiency due to individual surfaces over an ensemble of realizations. Some rules appropriate for numerical implementation of the theory at small wavelength-to-period ratios are presented. The difference between the rigorous approach and approximations can be clearly seen in the specular reflectances of Au mirrors with different roughness parameters at wavelengths where grazing incidence occurs close to or beyond the critical angle. This difference may give rise to wrong estimates of rms roughness and correlation length if they are obtained by comparing experimental data with calculations. Moreover, the rigorous approach permits taking into account any known roughness statistics and allows exact computation of diffuse scattering.

  18. Efficiency versus speed in quantum heat engines: Rigorous constraint from Lieb-Robinson bound

    Science.gov (United States)

    Shiraishi, Naoto; Tajima, Hiroyasu

    2017-08-01

    The long-standing open problem of whether a heat engine with finite power can achieve the Carnot efficiency is investigated. We rigorously prove a general trade-off inequality between the thermodynamic efficiency and the time interval of a cyclic process in quantum heat engines. As a first step, employing the Lieb-Robinson bound we establish an inequality on the change in a local observable caused by an operation far from the support of that observable. This inequality provides a rigorous characterization of the intuitive picture that most of the energy emitted from the engine to the cold bath remains near the engine when the cyclic process is finished. Using this description, we prove an upper bound on efficiency with the aid of quantum information geometry. Our result generally excludes the possibility of a process with finite speed at the Carnot efficiency in quantum heat engines. In particular, the obtained constraint covers engines evolving with non-Markovian dynamics, which almost all previous studies on this topic fail to address.
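
    For context, one common textbook form of a Lieb-Robinson bound (the paper derives its own variant; the constants and exact shape below are not the authors') reads, for observables A and B supported on disjoint regions X and Y,

    \[
    \left\| [A(t),\, B] \right\| \;\le\; C\, \|A\|\, \|B\|\, e^{-\left(d(X,Y) - v\,|t|\right)/\xi},
    \]

    where A(t) is the Heisenberg-evolved observable, d(X,Y) is the distance between the supports, v is the Lieb-Robinson velocity, and C and ξ are model-dependent constants. Operations therefore have an exponentially small effect outside an effective light cone, which is what confines the emitted energy near the engine at the end of the cycle.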

  19. Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II

    Energy Technology Data Exchange (ETDEWEB)

    George J. Koperna Jr.; Vello A. Kuuskraa; David E. Riestenberg; Aiysha Sultana; Tyler Van Leeuwen

    2009-06-01

    This report serves as the final technical report and users manual for the 'Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II' SBIR project. Advanced Resources International has developed a screening tool with which users can technically screen, assess the storage capacity of, and quantify the costs of CO2 storage in four types of CO2 storage reservoirs. These include CO2-enhanced oil recovery reservoirs, depleted oil and gas fields (non-enhanced oil recovery candidates), deep coal seams that are amenable to CO2-enhanced methane recovery, and saline reservoirs. The screening function assesses whether the reservoir could likely serve as a safe, long-term CO2 storage reservoir. The storage capacity assessment uses rigorous reservoir simulation models to determine the timing, ultimate storage capacity, and potential for enhanced hydrocarbon recovery. Finally, the economic assessment function determines both the field-level and pipeline (transportation) costs for CO2 sequestration in a given reservoir. The screening tool has been peer reviewed at an Electric Power Research Institute (EPRI) technical meeting in March 2009. A number of useful observations and recommendations emerged from the Workshop on the costs of CO2 transport and storage that could be readily incorporated into a commercial version of the Screening Tool in a Phase III SBIR.

  20. Post mortem rigor development in the Egyptian goose (Alopochen aegyptiacus) breast muscle (pectoralis): factors which may affect the tenderness.

    Science.gov (United States)

    Geldenhuys, Greta; Muller, Nina; Frylinck, Lorinda; Hoffman, Louwrens C

    2016-01-15

    Baseline research on the toughness of Egyptian goose meat is required. This study therefore investigates the post mortem pH and temperature decline (15 min-4 h 15 min post mortem) in the pectoralis muscle (breast portion) of this gamebird species. It also explores the enzyme activity of the Ca²⁺-dependent protease (calpain system) and the lysosomal cathepsins during the rigor mortis period. No differences were found for any of the variables between genders. The pH decline in the pectoralis muscle occurs quite rapidly (c = -0.806; ultimate pH ∼ 5.86) compared with other species and it is speculated that the high rigor temperature (>20 °C) may contribute to the increased toughness. No calpain I was found in Egyptian goose meat and the µ/m-calpain activity remained constant during the rigor period, while a decrease in calpastatin activity was observed. The cathepsin B, B & L and H activity increased over the rigor period. Further research into the connective tissue content and myofibrillar breakdown during aging is required in order to know if the proteolytic enzymes do in fact contribute to tenderisation. © 2015 Society of Chemical Industry.

  1. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, which is a mathematical issue aimed at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, aimed at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
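
    To make the solution-verification step concrete, here is a minimal sketch of Richardson extrapolation on three systematically refined grids, assuming a constant refinement ratio and smooth convergence; the solution values are synthetic, not output from GBS.

    ```python
    import numpy as np

    # Three grid solutions (fine, medium, coarse) of a quantity converging
    # at 2nd order toward an exact value of 1.0; refinement ratio r = 2.
    r = 2.0
    u_h, u_2h, u_4h = 1.0001, 1.0004, 1.0016

    # Observed order of accuracy from the ratio of successive differences.
    p = np.log((u_4h - u_2h) / (u_2h - u_h)) / np.log(r)

    # Richardson-extrapolated estimate of the grid-converged value, giving
    # an estimate of the numerical error on the finest grid.
    u_star = u_h + (u_h - u_2h) / (r**p - 1)
    err_h = abs(u_h - u_star)

    print(f"observed order p = {p:.2f}")
    print(f"extrapolated value = {u_star:.6f}, error estimate = {err_h:.1e}")
    ```

    With these numbers the observed order comes out as p = 2 and the extrapolated value as 1.000000, matching the construction; in practice the same arithmetic quantifies the numerical error of an actual simulation.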

  2. RIGOROUS PHOTOGRAMMETRIC PROCESSING OF CHANG'E-1 AND CHANG'E-2 STEREO IMAGERY FOR LUNAR TOPOGRAPHIC MAPPING

    OpenAIRE

    K. Di; Y. Liu; B. Liu; M. Peng

    2012-01-01

    Chang'E-1 (CE-1) and Chang'E-2 (CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of the landing and surface operation of the Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of the CE-1 and CE-2 CCD cameras based on the push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D c...

  3. The Method of a Standalone Functional Verifying Operability of Sonar Control Systems

    Directory of Open Access Journals (Sweden)

    A. A. Sotnikov

    2014-01-01

    Full Text Available This article describes a method for standalone verification of a sonar control system based on functional checking of the control system's operability. The main features of the realized method are a valid mathematical model for simulating sonar signals at the hydroacoustic antenna and a valid representation of the sonar control system modes as a discrete Markov model, providing functional verification of the object in real time. Ways are proposed to control computational complexity when the simulation equipment has insufficient computing resources, namely reduction of model functionality and reduction of adequacy. Experiments were made using testing equipment developed by a department of the Research Institute of Information Control Systems at Bauman Moscow State Technical University to verify the technical validity of industrial sonar complexes. During the verification process, the on-board software was artificially altered to create malfunctions in the sonar control systems in order to estimate the performance of the verifying system. The method's efficiency was demonstrated by theory and experimental results in comparison with the basic methodology for verifying technical systems. The method can also be used in debugging the on-board software of sonar complexes and in developing new promising sonar signal processing algorithms.
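
    A minimal sketch of representing control-system modes as a discrete Markov model; the mode set and transition probabilities below are hypothetical (the abstract does not give the actual model), but the structure shows how a verifier could compare observed mode sequences of the real system against a reference chain.

    ```python
    import random

    # Hypothetical sonar control modes and transition probabilities.
    P = {
        "idle":     {"idle": 0.6, "search": 0.4},
        "search":   {"search": 0.5, "track": 0.3, "idle": 0.2},
        "track":    {"track": 0.7, "classify": 0.2, "search": 0.1},
        "classify": {"idle": 0.5, "track": 0.5},
    }

    def simulate(steps, start="idle", seed=1):
        """Walk the mode chain to produce a reference mode sequence."""
        rng = random.Random(seed)
        mode, trace = start, [start]
        for _ in range(steps):
            nxt, probs = zip(*P[mode].items())
            mode = rng.choices(nxt, weights=probs)[0]
            trace.append(mode)
        return trace

    print(simulate(10))
    ```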

  4. Acceptance Test Report for the 241-AZ-101 Ultrasonic Interface Level Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    ANDREWS, J.E.

    2000-01-27

    This document comprises the Acceptance Test Report for the 241-AZ-101 Ultrasonic Interface Level Analyzer. This document presents the results of Acceptance Testing of the 241-AZ-101 Ultrasonic Interface Level Analyzers (URSILLAs). Testing of the URSILLAs was performed in accordance with ATP-260-001, ''URSILLA Pre-installation Acceptance Test Procedure''. The objective of the testing was to verify that all equipment and components function in accordance with design specifications and original equipment manufacturer's specifications.

  5. A Trustworthy Internet Auction Model with Verifiable Fairness.

    Science.gov (United States)

    Liao, Gen-Yih; Hwang, Jing-Jang

    2001-01-01

    Describes an Internet auction model achieving verifiable fairness, a requirement aimed at enhancing the trust of bidders in auctioneers. Analysis results demonstrate that the proposed model satisfies various requirements regarding fairness and privacy. Moreover, in the proposed model, the losing bids remain sealed. (Author/AEF)

  6. 75 FR 29732 - Career and Technical Education Program-Promoting Rigorous Career and Technical Education Programs...

    Science.gov (United States)

    2010-05-27

    ... rigorous knowledge and skills in English-language arts and mathematics that employers and colleges expect... specialists and to access the student outcome data needed to meet annual evaluation and reporting requirements...

  7. Rigorous Line-Based Transformation Model Using the Generalized Point Strategy for the Rectification of High Resolution Satellite Imagery

    Directory of Open Access Journals (Sweden)

    Kun Hu

    2016-09-01

    Full Text Available High precision geometric rectification of High Resolution Satellite Imagery (HRSI) is the basis of digital mapping and Three-Dimensional (3D) modeling. Taking advantage of line features as basic geometric control conditions instead of control points, the Line-Based Transformation Model (LBTM) provides a practical and efficient way of image rectification. It can accurately build the mathematical relationship between image space and the corresponding object space, while dramatically reducing the workload of ground control and feature recognition. Based on generalization and analysis of existing LBTMs, a novel rigorous LBTM is proposed in this paper, which can further eliminate the geometric deformation caused by sensor inclination and terrain variation. This improved nonlinear LBTM is constructed based on a generalized point strategy and resolved by least squares overall adjustment. Geo-positioning accuracy experiments with IKONOS, GeoEye-1 and ZiYuan-3 satellite imagery are performed to compare the rigorous LBTM with other relevant line-based and point-based transformation models. Both theoretical analysis and experimental results demonstrate that the rigorous LBTM is more accurate and reliable without adding extra ground control. The geo-positioning accuracy of satellite imagery rectified by the rigorous LBTM can reach about one pixel with eight control lines and can be further improved by optimizing the horizontal and vertical distribution of the control lines.
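
    A toy illustration of the generalized point idea: points sampled along corresponding control lines are fed into a least-squares adjustment. To keep the sketch short, a simple linear affine transformation is fitted here instead of the paper's rigorous nonlinear LBTM, and the data are synthetic and noise-free.

    ```python
    import numpy as np

    # Points sampled along control lines in image space and their
    # object-space correspondences (synthetic data for illustration).
    src = np.array([[0., 0.], [1., 0.], [2., 0.], [0., 1.], [1., 2.]])
    true_A = np.array([[1.02, 0.05], [-0.03, 0.98]])   # made-up transform
    true_t = np.array([10.0, -5.0])
    dst = src @ true_A.T + true_t

    # Design matrix for x' = a11 x + a12 y + tx,  y' = a21 x + a22 y + ty.
    n = len(src)
    M = np.zeros((2 * n, 6))
    M[0::2, 0:2] = src; M[0::2, 2] = 1.0
    M[1::2, 3:5] = src; M[1::2, 5] = 1.0
    b = dst.reshape(-1)

    # Least-squares adjustment recovers the six parameters.
    params, *_ = np.linalg.lstsq(M, b, rcond=None)
    print("recovered affine parameters:", params.round(3))
    ```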

  8. An adaptive finite element method for simulating surface tension with the gradient theory of fluid interfaces

    KAUST Repository

    Kou, Jisheng; Sun, Shuyu

    2014-01-01

    The gradient theory for the surface tension of simple fluids and mixtures is rigorously analyzed based on mathematical theory. The finite element approximation of surface tension is developed and analyzed, and moreover, an adaptive finite element method based on a physical-based estimator is proposed and it can be coupled efficiently with Newton's method as well. The numerical tests are carried out both to verify the proposed theory and to demonstrate the efficiency of the proposed method. © 2013 Elsevier B.V. All rights reserved.

  9. An adaptive finite element method for simulating surface tension with the gradient theory of fluid interfaces

    KAUST Repository

    Kou, Jisheng

    2014-01-01

    The gradient theory for the surface tension of simple fluids and mixtures is rigorously analyzed based on mathematical theory. The finite element approximation of surface tension is developed and analyzed, and moreover, an adaptive finite element method based on a physical-based estimator is proposed and it can be coupled efficiently with Newton's method as well. The numerical tests are carried out both to verify the proposed theory and to demonstrate the efficiency of the proposed method. © 2013 Elsevier B.V. All rights reserved.
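
    A minimal solve-estimate-mark-refine loop in one dimension, with interpolation of a steep profile standing in for the PDE solve and a simple midpoint-deviation indicator standing in for the paper's physics-based estimator; purely illustrative of the adaptive skeleton.

    ```python
    import numpy as np

    # Steep profile standing in for an interface (e.g. a density transition).
    f = lambda x: np.tanh(50 * (x - 0.5))

    def estimate(xs):
        """Element-wise error indicator: deviation of the linear
        interpolant from f at element midpoints."""
        mid = 0.5 * (xs[:-1] + xs[1:])
        return np.abs(f(mid) - 0.5 * (f(xs[:-1]) + f(xs[1:])))

    xs = np.linspace(0.0, 1.0, 11)
    for it in range(20):                    # SOLVE -> ESTIMATE -> MARK -> REFINE
        eta = estimate(xs)
        if eta.max() < 1e-3:
            break
        marked = np.where(eta > 0.5 * eta.max())[0]     # bulk-style marking
        new = 0.5 * (xs[marked] + xs[marked + 1])       # bisect marked cells
        xs = np.sort(np.concatenate([xs, new]))
    print(len(xs), "nodes after adaptive refinement")
    ```

    The mesh ends up concentrated around x = 0.5 where the profile is steep, which is the qualitative behavior an estimator-driven adaptive method targets near fluid interfaces.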

  10. A record and verify system for radiotherapy treatment

    International Nuclear Information System (INIS)

    Koens, M.L.; Vroome, H. de

    1984-01-01

    The Record and Verify system developed for the radiotherapy department of the Leiden University Hospital is described. The system has been in use since 1980 and will now be installed in at least four of the Dutch University Hospitals. The system provides the radiographer with a powerful tool for checking the set-up of the linear accelerator preceding the irradiation of a field. After the irradiation of a field the machine settings are registered in the computer system together with the newly calculated cumulative dose. These registrations are used by the system to produce a daily report which provides the management of the department with insight into the established differences between treatment and treatment planning. Buying a record and verify system from the manufacturer of the linear accelerator is not an optimal solution, especially for a department with more than one accelerator from different manufacturers. Integration in a Hospital Information System (HIS) has important advantages over the development of a dedicated departmental system. (author)

  11. Rigorous patient-prosthesis matching of Perimount Magna aortic bioprosthesis.

    Science.gov (United States)

    Nakamura, Hiromasa; Yamaguchi, Hiroki; Takagaki, Masami; Kadowaki, Tasuku; Nakao, Tatsuya; Amano, Atsushi

    2015-03-01

    Severe patient-prosthesis mismatch, defined as effective orifice area index ≤0.65 cm² m⁻², has demonstrated poor long-term survival after aortic valve replacement. Reported rates of severe mismatch involving the Perimount Magna aortic bioprosthesis range from 4% to 20% in patients with a small annulus. Between June 2008 and August 2011, 251 patients (mean age 70.5 ± 10.2 years; mean body surface area 1.55 ± 0.19 m²) underwent aortic valve replacement with a Perimount Magna bioprosthesis, with or without concomitant procedures. We performed our procedure with rigorous patient-prosthesis matching to implant a valve appropriately sized to each patient, and carried out annular enlargement when a 19-mm valve did not fit. The bioprosthetic performance was evaluated by transthoracic echocardiography predischarge and at 1 and 2 years after surgery. Overall hospital mortality was 1.6%. Only 5 (2.0%) patients required annular enlargement. The mean follow-up period was 19.1 ± 10.7 months with a 98.4% completion rate. Predischarge data showed a mean effective orifice area index of 1.21 ± 0.20 cm² m⁻². Moderate mismatch, defined as effective orifice area index ≤0.85 cm² m⁻², developed in 4 (1.6%) patients. None developed severe mismatch. Data at 1 and 2 years showed only two cases of moderate mismatch; neither was severe. Rigorous patient-prosthesis matching maximized the performance of the Perimount Magna, and no severe mismatch resulted in this Japanese population of aortic valve replacement patients.

  12. Verifiable Outsourced Decryption of Attribute-Based Encryption with Constant Ciphertext Length

    Directory of Open Access Journals (Sweden)

    Jiguo Li

    2017-01-01

    Full Text Available An outsourced-decryption ABE system largely reduces the computation cost for users who intend to access encrypted files stored in the cloud. However, the correctness of the transformed ciphertext cannot be guaranteed because the user does not have the original ciphertext. Lai et al. provided an ABE scheme with verifiable outsourced decryption that helps the user check whether the transformation done by the cloud is correct. In order to improve computation performance and reduce communication overhead, we propose a new verifiable outsourcing scheme with constant ciphertext length. Specifically, our scheme achieves the following goals. (1) It is verifiable, ensuring that the user can efficiently check whether the transformation was done correctly by the cloud service provider (CSP). (2) The size of the ciphertext and the number of expensive pairing operations are constant and do not grow with the complexity of the access structure. (3) The access structure is AND gates on multivalued attributes, and we prove the scheme verifiable and secure against selective chosen-plaintext attack in the standard model. (4) A performance analysis indicates that the scheme is suitable for bandwidth-limited and computation-constrained devices, such as mobile phones.
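
    The verifiability idea can be illustrated with a drastically simplified analogy: a commitment stored at encryption time lets the user check the cloud's transformation without redoing it. The XOR "transformation" and hash commitment below are toy stand-ins, not the pairing-based ABE construction of the paper.

    ```python
    import hashlib, os

    def H(data):
        """Hash commitment (stand-in for the scheme's verification tag)."""
        return hashlib.sha256(data).digest()

    msg = b"patient record"
    key = os.urandom(32)
    commit = H(msg)                      # stored alongside the ciphertext
    ct = bytes(a ^ b for a, b in zip(msg.ljust(32, b"\0"), key))

    def outsource_transform(ct, key):
        """Done by the cloud (CSP); here just the inverse XOR."""
        return bytes(a ^ b for a, b in zip(ct, key))

    # User verifies the transformation against the commitment instead of
    # trusting the CSP blindly.
    pt = outsource_transform(ct, key).rstrip(b"\0")
    assert H(pt) == commit
    print("outsourced transformation verified")
    ```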

  13. Building Program Verifiers from Compilers and Theorem Provers

    Science.gov (United States)

    2015-05-14

    Slide excerpt ("Building Verifiers from Compilers and SMT", Gurfinkel, 2015): UFO is an LLVM-based front-end (partially reused in SeaHorn) that combines abstract interpretation with interpolation-based model checking to check assertions with SMT. Noted challenges: counter-examples are long, and it is hard to determine (from main) what is relevant to an assertion.

  14. Association between cotinine-verified smoking status and hypertension in 167,868 Korean adults.

    Science.gov (United States)

    Kim, Byung Jin; Han, Ji Min; Kang, Jung Gyu; Kim, Bum Soo; Kang, Jin Ho

    2017-10-01

    Previous studies showed inconsistent results concerning the relationship between chronic smoking and blood pressure. Most of the studies involved self-reported smoking status. This study was performed to evaluate the association of urinary cotinine or self-reported smoking status with hypertension and blood pressure in Korean adults. Among individuals enrolled in the Kangbuk Samsung Health Study and Kangbuk Samsung Cohort Study, 167,868 participants (men, 55.7%; age, 37.5 ± 6.9 years) between 2011 and 2013 who had urinary cotinine measurements were included. Individuals with urinary cotinine levels ≥50 ng/mL were defined as cotinine-verified current smokers. The prevalence of hypertension and cotinine-verified current smokers in the overall population was 6.8% and 22.7%, respectively (10.0% in men and 2.8% in women for hypertension; 37.7% in men and 3.9% in women for cotinine-verified current smokers). In a multivariate regression analysis adjusted for age, sex, body mass index, waist circumference, alcohol drinking, vigorous exercise, and diabetes, cotinine-verified current smoking was associated with lower prevalence of hypertension compared with cotinine-verified never smoking (OR [95% CI], 0.79 [0.75, 0.84]). Log-transformed cotinine levels and unobserved smoking were each negatively associated with hypertension (0.96 [0.96, 0.97] and 0.55 [0.39, 0.79], respectively). In a multivariate linear regression analysis, cotinine-verified current smoking was inversely associated with systolic and diastolic blood pressure (BP) (regression coefficient [95% CI], -1.23 [-1.39, -1.07] for systolic BP and -0.71 [-0.84, -0.58] for diastolic BP). In subgroup analyses according to sex, the inverse associations between cotinine-verified current smoking and hypertension were observed only in men. This large observational study showed that cotinine-verified current smoking and unobserved smoking were inversely associated with hypertension in Korean adults, especially in men.

  15. NOS CO-OPS Water Level Data, Verified, High Low

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset has verified (quality-controlled), daily, high low water level (tide) data from NOAA NOS Center for Operational Oceanographic Products and Services...

  16. An alternative test for verifying electronic balance linearity

    International Nuclear Information System (INIS)

    Thomas, I.R.

    1998-02-01

    This paper presents an alternative method for verifying electronic balance linearity and accuracy. This method is being developed for safeguards weighings (weighings for the control and accountability of nuclear material) at the Idaho National Engineering and Environmental Laboratory (INEEL). With regard to balance linearity and accuracy, DOE Order 5633.3B, Control and Accountability of Nuclear Materials, Paragraph 2, 4, e, (1), (a) Scales and Balances Program, states: ''All scales and balances used for accountability purposes shall be maintained in good working condition, recalibrated according to an established schedule, and checked for accuracy and linearity on each day that the scale or balance is used for accountability purposes.'' Various tests have been proposed for testing accuracy and linearity. In the 1991 Measurement Science Conference, Dr. Walter E. Kupper presented a paper entitled: ''Validation of High Accuracy Weighing Equipment.'' Dr. Kupper emphasized that tolerance checks for calibrated, state-of-the-art electronic equipment need not be complicated, and he presented four easy steps for verifying that a calibrated balance is operating correctly. These tests evaluate the standard deviation of successive weighings (of the same load), the off-center error, the calibration error, and the error due to nonlinearity. This method of balance validation is undoubtedly an authoritative means of ensuring balance operability, yet it could have two drawbacks: one, the test for linearity is not intuitively obvious, especially from a statistical viewpoint; and two, there is an absence of definitively defined testing limits. Hence, this paper describes an alternative means of verifying electronic balance linearity and accuracy that is being developed for safeguards measurements at the INEEL
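
    A sketch of how such daily checks might be reduced to numbers, assuming a 100 g check standard, a set of span standards, and a hypothetical accountability tolerance; the readings and limits are illustrative, not the INEEL values, and the off-center check is omitted for brevity.

    ```python
    import statistics

    # Ten successive weighings of the same 100 g check standard (grams).
    readings = [100.0002, 100.0001, 99.9999, 100.0003, 100.0000,
                100.0002, 99.9998, 100.0001, 100.0000, 100.0002]
    repeatability = statistics.stdev(readings)          # successive-weighing spread

    nominal = 100.0
    calibration_error = statistics.mean(readings) - nominal

    # Linearity: standards spanning the range; errors should stay in tolerance
    # and show no trend across the span.
    span = {20.0: 20.0001, 50.0: 50.0003, 100.0: 100.0002, 200.0: 200.0008}
    linearity_errors = {m: obs - m for m, obs in span.items()}

    tolerance = 0.001   # grams, hypothetical accountability limit
    ok = (repeatability < tolerance
          and abs(calibration_error) < tolerance
          and all(abs(e) < tolerance for e in linearity_errors.values()))
    print("balance within tolerance:", ok)
    ```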

  17. Fast and Rigorous Assignment Algorithm Multiple Preference and Calculation

    Directory of Open Access Journals (Sweden)

    Ümit Çiftçi

    2010-03-01

    Full Text Available The goal of this paper is to develop an algorithm that evaluates students and then places them according to their desired choices and dependent preferences. The developed algorithm is also used to implement software. The success and accuracy of the software, as well as of the algorithm, were tested by applying it to the ability test at Beykent University. This ability test is repeated several times in order to fill all available places in the Fine Arts Faculty departments each academic year. Application in the 2008-2009 and 2009-2010 academic years showed that the algorithm is very fast and rigorous. Key words: assignment algorithm, student placement, ability test
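
    A minimal sketch of score-ordered placement with ordered preferences and department capacities, assuming a serial-dictatorship rule (the abstract does not spell out the exact algorithm); the names, scores, and quotas are made up.

    ```python
    # Students ranked by ability-test score choose in order; each receives
    # the highest preference with a seat remaining.
    capacity = {"painting": 2, "sculpture": 1, "graphics": 2}
    students = [  # (name, score, ordered preferences)
        ("ayse", 91, ["painting", "graphics"]),
        ("mehmet", 88, ["painting", "sculpture"]),
        ("elif", 86, ["painting", "sculpture", "graphics"]),
        ("can", 80, ["sculpture", "graphics"]),
    ]

    placement = {}
    for name, _, prefs in sorted(students, key=lambda s: -s[1]):
        for dept in prefs:
            if capacity[dept] > 0:
                capacity[dept] -= 1
                placement[name] = dept
                break
        else:
            placement[name] = None   # unplaced; may retake the ability test
    print(placement)
    ```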

  18. Bringing scientific rigor to community-developed programs in Hong Kong

    Directory of Open Access Journals (Sweden)

    Fabrizio Cecilia S

    2012-12-01

    Full Text Available Background: This paper describes efforts to generate evidence for community-developed programs to enhance family relationships in the Chinese culture of Hong Kong, within the framework of community-based participatory research (CBPR). Methods: The CBPR framework was applied to help maximize the development of the intervention and the public health impact of the studies, while enhancing the capabilities of the social service sector partners. Results: Four academic-community research teams explored the process of designing and implementing randomized controlled trials in the community. In addition to the expected cultural barriers between teams of academics and community practitioners, with their different outlooks, concerns and languages, the team navigated issues in utilizing the principles of CBPR unique to this Chinese culture. Eventually the team developed tools for adaptation, such as an emphasis on building the relationship while respecting role delineation and an iterative process of defining the non-negotiable parameters of research design while maintaining scientific rigor. Lessons learned include the risk of underemphasizing the size of the operational and skills shift between usual agency practices and research studies, the importance of minimizing non-negotiable parameters in implementing rigorous research designs in the community, and the need to view community capacity enhancement as a long term process. Conclusions: The four pilot studies under the FAMILY Project demonstrated that nuanced design adaptations, such as wait list controls and shorter assessments, better served the needs of the community and led to the successful development and vigorous evaluation of a series of preventive, family-oriented interventions in the Chinese culture of Hong Kong.

  19. Constitutional, legal and jurisprudential development of the principle of subsidiary rigor

    Directory of Open Access Journals (Sweden)

    Germán Eduardo Cifuentes Sandoval

    2013-09-01

    Full Text Available In Colombia, state administration of the environment is the responsibility of the National Environmental System (SINA). SINA is made up of state entities that coexist under a mixed scheme of centralization and decentralization. SINA's decentralization expresses itself at the administrative and territorial levels, and the entities that function under this structure are expected to act in a coordinated way in order to reach the objectives set out in national environmental policy. To achieve coordinated environmental administration among the entities that make up SINA, Colombian environmental legislation includes three basic principles: (1) the principle of regional harmony ("armonía regional"); (2) the principle of normative gradation ("gradación normativa"); and (3) the principle of subsidiary rigor ("rigor subsidiario"). These principles belong to Article 63 of Law 99 of 1993, and while equivalents of the first two can be found in other norms of the Colombian legal system, this is not the case for subsidiary rigor, whose elements are unique to environmental law and are not the same as those of the principle of subsidiarity in Article 288 of the Political Constitution. Subsidiary rigor gives decentralized entities a special power to make the current environmental legislation more demanding in order to defend the local ecological patrimony. It is an administrative power, founded on decentralized autonomy, that allows them to act in place of the legislature's regulatory power on the condition that the new rules are stricter than those issued at the central level.

  20. Bringing scientific rigor to community-developed programs in Hong Kong.

    Science.gov (United States)

    Fabrizio, Cecilia S; Hirschmann, Malia R; Lam, Tai Hing; Cheung, Teresa; Pang, Irene; Chan, Sophia; Stewart, Sunita M

    2012-12-31

    This paper describes efforts to generate evidence for community-developed programs to enhance family relationships in the Chinese culture of Hong Kong, within the framework of community-based participatory research (CBPR). The CBPR framework was applied to help maximize the development of the intervention and the public health impact of the studies, while enhancing the capabilities of the social service sector partners. Four academic-community research teams explored the process of designing and implementing randomized controlled trials in the community. In addition to the expected cultural barriers between teams of academics and community practitioners, with their different outlooks, concerns and languages, the team navigated issues in utilizing the principles of CBPR unique to this Chinese culture. Eventually the team developed tools for adaptation, such as an emphasis on building the relationship while respecting role delineation and an iterative process of defining the non-negotiable parameters of research design while maintaining scientific rigor. Lessons learned include the risk of underemphasizing the size of the operational and skills shift between usual agency practices and research studies, the importance of minimizing non-negotiable parameters in implementing rigorous research designs in the community, and the need to view community capacity enhancement as a long term process. The four pilot studies under the FAMILY Project demonstrated that nuanced design adaptations, such as wait list controls and shorter assessments, better served the needs of the community and led to the successful development and vigorous evaluation of a series of preventive, family-oriented interventions in the Chinese culture of Hong Kong.

  1. NOS CO-OPS Water Level Data, Verified, 6-Minute

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset has verified (quality-controlled), 6-minute, water level (tide) data from NOAA NOS Center for Operational Oceanographic Products and Services (CO-OPS)....

  2. Verified Subtyping with Traits and Mixins

    Directory of Open Access Journals (Sweden)

    Asankhaya Sharma

    2014-07-01

    Full Text Available Traits allow decomposing programs into smaller parts and mixins are a form of composition that resemble multiple inheritance. Unfortunately, in the presence of traits, programming languages like Scala give up on subtyping relation between objects. In this paper, we present a method to check subtyping between objects based on entailment in separation logic. We implement our method as a domain specific language in Scala and apply it on the Scala standard library. We have verified that 67% of mixins used in the Scala standard library do indeed conform to subtyping between the traits that are used to build them.

  3. Evolution of optically nondestructive and data-non-intrusive credit card verifiers

    Science.gov (United States)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2010-04-01

    Since the deployment of the credit card, the number of credit card fraud cases has grown rapidly with a huge amount of loss in millions of US dollars. Instead of asking more information from the credit card's holder or taking risk through payment approval, a nondestructive and data-non-intrusive credit card verifier is highly desirable before transaction begins. In this paper, we review optical techniques that have been proposed and invented in order to make the genuine credit card more distinguishable than the counterfeit credit card. Several optical approaches for the implementation of credit card verifiers are also included. In particular, we highlight our invention on a hyperspectral-imaging based portable credit card verifier structure that offers a very low false error rate of 0.79%. Other key features include low cost, simplicity in design and implementation, no moving part, no need of an additional decoding key, and adaptive learning.

  4. A CUMULATIVE MIGRATION METHOD FOR COMPUTING RIGOROUS TRANSPORT CROSS SECTIONS AND DIFFUSION COEFFICIENTS FOR LWR LATTICES WITH MONTE CARLO

    Energy Technology Data Exchange (ETDEWEB)

    Zhaoyuan Liu; Kord Smith; Benoit Forget; Javier Ortensi

    2016-05-01

    A new method for computing homogenized assembly neutron transport cross sections and diffusion coefficients that is both rigorous and computationally efficient is proposed in this paper. In the limit of a homogeneous hydrogen slab, the new method is equivalent to the long-used, and only-recently-published, CASMO transport method. The rigorous method is used to demonstrate the sources of inaccuracy in the commonly applied “out-scatter” transport correction. It is also demonstrated that the newly developed method is directly applicable to lattice calculations performed by Monte Carlo and is capable of computing rigorous homogenized transport cross sections for arbitrarily heterogeneous lattices. Comparisons of several common transport cross section approximations are presented for a simple problem of an infinite medium of hydrogen. The new method has also been applied in computing 2-group diffusion data for an actual PWR lattice from the BEAVRS benchmark.
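
    For reference, the commonly applied "out-scatter" transport correction that the paper critiques takes the standard form

    \[
    \Sigma_{tr} \;=\; \Sigma_t \;-\; \bar{\mu}_0\, \Sigma_s, \qquad D \;=\; \frac{1}{3\,\Sigma_{tr}},
    \]

    where \(\Sigma_t\) and \(\Sigma_s\) are the total and scattering cross sections and \(\bar{\mu}_0\) is the mean scattering cosine; the cumulative migration method replaces this approximation with rigorously computed transport cross sections and diffusion coefficients.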

  5. The influence of low temperature, type of muscle and electrical stimulation on the course of rigor mortis, ageing and tenderness of beef muscles.

    Science.gov (United States)

    Olsson, U; Hertzman, C; Tornberg, E

    1994-01-01

    The course of rigor mortis, ageing and tenderness have been evaluated for two beef muscles, M. semimembranosus (SM) and M. longissimus dorsi (LD), when entering rigor at constant temperatures in the cold-shortening region (1, 4, 7 and 10°C). The influence of electrical stimulation (ES) was also examined. Post-mortem changes were registered by shortening and isometric tension and by following the decline of pH, ATP and creatine phosphate. The effect of ageing on tenderness was recorded by measuring shear-force (2, 8 and 15 days post mortem) and the sensory properties were assessed 15 days post mortem. It was found that shortening increased with decreasing temperature, resulting in decreased tenderness. Tenderness for LD, but not for SM, was improved by ES at 1 and 4°C, whereas ES did not give rise to any decrease in the degree of shortening during rigor mortis development. This suggests that ES influences tenderization more than it prevents cold-shortening. The samples with a pre-rigor mortis temperature of 1°C could not be tenderized, when stored up to 15 days, whereas this was the case for the muscles entering rigor mortis at the other higher temperatures. The results show that under the conditions used in this study, the course of rigor mortis is more important for the ultimate tenderness than the course of ageing. Copyright © 1994. Published by Elsevier Ltd.

  6. Verifying Safety Messages Using Relative-Time and Zone Priority in Vehicular Ad Hoc Networks

    Science.gov (United States)

    Banani, Sam; Thiemjarus, Surapa; Kittipiyakul, Somsak

    2018-01-01

    In high-density road networks, with each vehicle broadcasting multiple messages per second, the arrival rate of safety messages can easily exceed the rate at which digital signatures can be verified. Since not all messages can be verified, algorithms for selecting which messages to verify are required to ensure that each vehicle receives appropriate awareness about neighbouring vehicles. This paper presents a novel scheme to select important safety messages for verification in vehicular ad hoc networks (VANETs). The proposed scheme uses location and direction of the sender, as well as proximity and relative-time between vehicles, to reduce the number of irrelevant messages verified (i.e., messages from vehicles that are unlikely to cause an accident). Compared with other existing schemes, the analysis results show that the proposed scheme can verify messages from nearby vehicles with lower inter-message delay and reduced packet loss and thus provides high level of awareness of the nearby vehicles. PMID:29652840
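
    A toy prioritization pass capturing the idea of ranking messages by proximity and relative time before spending the signature-verification budget; the scoring function and record fields below are invented for illustration and are not the paper's scheme.

    ```python
    import math

    # Hypothetical neighbour records: (distance_m, closing_speed_mps, same_zone)
    msgs = [(120, 2.0, False), (35, 8.0, True), (60, -1.0, True), (15, 3.5, True)]

    def priority(d, v, same_zone):
        """Smaller time-to-closest-approach and shared zone raise priority."""
        ttc = d / v if v > 0 else math.inf   # receding vehicles never converge
        zone = 2.0 if same_zone else 1.0
        return zone / (1.0 + ttc)

    budget = 2   # signatures verifiable in this interval
    ranked = sorted(msgs, key=lambda m: priority(*m), reverse=True)
    to_verify = ranked[:budget]
    print("verify first:", to_verify)
    ```

    Under these numbers the nearest, fastest-closing in-zone vehicles are verified first, while a distant or receding sender is deferred, which is the intended effect of relevance-based selection.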

  7. What are the ultimate limits to computational techniques: verifier theory and unverifiability

    International Nuclear Information System (INIS)

    Yampolskiy, Roman V

    2017-01-01

    Despite significant developments in proof theory, surprisingly little attention has been devoted to the concept of proof verifiers. In particular, the mathematical community may be interested in studying different types of proof verifiers (people, programs, oracles, communities, superintelligences) as mathematical objects. Such an effort could reveal their properties, their powers and limitations (particularly in human mathematicians), minimum and maximum complexity, as well as self-verification and self-reference issues. We propose an initial classification system for verifiers and provide some rudimentary analysis of solved and open problems in this important domain. Our main contribution is a formal introduction of the notion of unverifiability, for which the paper could serve as a general citation in domains of theorem proving, as well as software and AI verification. (invited comment)

  8. What are the ultimate limits to computational techniques: verifier theory and unverifiability

    Science.gov (United States)

    Yampolskiy, Roman V.

    2017-09-01

    Despite significant developments in proof theory, surprisingly little attention has been devoted to the concept of proof verifiers. In particular, the mathematical community may be interested in studying different types of proof verifiers (people, programs, oracles, communities, superintelligences) as mathematical objects. Such an effort could reveal their properties, their powers and limitations (particularly in human mathematicians), minimum and maximum complexity, as well as self-verification and self-reference issues. We propose an initial classification system for verifiers and provide some rudimentary analysis of solved and open problems in this important domain. Our main contribution is a formal introduction of the notion of unverifiability, for which the paper could serve as a general citation in domains of theorem proving, as well as software and AI verification.

  9. Rigor index, fillet yield and proximate composition of cultured striped catfish (Pangasianodon hypophthalmus for its suitability in processing industries in Bangladesh

    Directory of Open Access Journals (Sweden)

    Salma Noor-E Islami

    2014-12-01

    Full Text Available The rigor index of market-size striped catfish (Pangasianodon hypophthalmus), locally called Thai-Pangas, was determined to assess fillet yield for the production of value-added products. In whole fish, rigor started within 1 hr after death under both iced and room-temperature conditions, while the rigor index reached a maximum of 72.23% within 8 hr at room temperature and 85.5% within 5 hr under iced conditions; rigor was fully relaxed after 22 hr under both storage conditions. Post-mortem muscle pH decreased to 6.8 after 2 hr and 6.2 after 8 hr, then increased sharply to 6.9 after 9 hr. There was a positive correlation between rigor progress and pH shift in fish fillets. Hand filleting was done post-rigor, and the fillet yield experiment showed that 50.4 ± 2.1% fillet, 8.0 ± 0.2% viscera, 8.0 ± 1.3% skin and 32.0 ± 3.2% carcass could be obtained from Thai-Pangas. Proximate composition analysis of four regions of Thai-Pangas, viz. the head region, middle region, tail region and viscera, revealed moisture of 78.36%, 81.14%, 81.45% and 57.33%; protein of 15.83%, 15.97%, 16.14% and 17.20%; lipid of 4.61%, 1.82%, 1.32% and 24.31%; and ash of 1.09%, 0.96%, 0.95% and 0.86%, respectively, indicating the suitability of Thai-Pangas for the production of value-added products such as fish fillets.

  10. Influence of transport stress and slaughter method on the rigor mortis of tambaqui (Colossoma macropomum)

    Directory of Open Access Journals (Sweden)

    Joana Maia Mendes

    2015-06-01

    Full Text Available Abstract: This study evaluated the influence of pre-slaughter stress and slaughter method on the rigor mortis of tambaqui during storage in ice. Physiological stress responses of tambaqui were studied during the pre-slaughter period, which was divided into four stages: harvest, transport, recovery for 24 h, and recovery for 48 h. At the end of each stage, fish were sampled to characterize pre-slaughter stress through analyses of the plasma parameters glucose, lactate and ammonia, and were then slaughtered by hypothermia or by asphyxia with carbon dioxide for the study of rigor mortis. The physiological stress state of the fish was most acute immediately after transport, resulting in a faster onset of rigor mortis: 60 minutes for tambaqui slaughtered by hypothermia and 120 minutes for tambaqui slaughtered by carbon dioxide asphyxia. In the ponds, fish slaughtered immediately after harvest showed an intermediate stress state, with no difference in the time of onset of rigor mortis between slaughter methods (135 minutes). Fish allowed to recover from transport stress under simulated industry conditions entered rigor mortis later: at 225 minutes (with 24 h of recovery) and 255 minutes (with 48 h of recovery), likewise with no difference between the slaughter methods tested. The resolution of rigor mortis was fastest, at 12 days, in fish slaughtered after transport. In fish slaughtered immediately after harvest, resolution occurred at 16 days, and in fish slaughtered after recovery, at 20 days for 24 h of recovery from pre-slaughter stress and 24 days for 48 h of recovery, with no influence of the slaughter method on the resolution of rigor mortis. Thus, it is desirable that tambaqui destined for industry be slaughtered after a period of recovery from stress, with a view to increasing its

  11. Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing.

    Science.gov (United States)

    Hayashi, Masahito; Morimae, Tomoyuki

    2015-11-27

    We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends Alice each qubit of them one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding the security, whatever Bob does, Bob cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, malicious Bob does not necessarily send the copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.
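
    Stabilizer testing can be illustrated numerically: a graph state |G⟩ satisfies K_i|G⟩ = |G⟩ for K_i = X_i ∏_{j∈N(i)} Z_j. The sketch below builds a three-qubit path-graph state and checks its stabilizers; this is a toy classical simulation of the property Alice tests, not the verification protocol itself.

    ```python
    import numpy as np
    from functools import reduce

    I2 = np.eye(2)
    X = np.array([[0., 1.], [1., 0.]])
    Z = np.diag([1., -1.])
    plus = np.full(2, 1 / np.sqrt(2))

    n, edges = 3, [(0, 1), (1, 2)]            # path graph 0-1-2

    def kron_all(ops):
        return reduce(np.kron, ops)

    # |G> = prod_{(a,b) in E} CZ_ab |+>^n  (qubit 0 is the most significant)
    state = kron_all([plus] * n)
    for a, b in edges:
        diag = np.ones(2 ** n)
        for idx in range(2 ** n):
            if (idx >> (n - 1 - a)) & 1 and (idx >> (n - 1 - b)) & 1:
                diag[idx] = -1.0              # CZ phase on |..1..1..>
        state = diag * state

    # Check each stabilizer K_i = X_i prod_{j in N(i)} Z_j.
    for i in range(n):
        ops = [I2] * n
        ops[i] = X
        for a, b in edges:
            if i == a: ops[b] = Z
            if i == b: ops[a] = Z
        assert np.allclose(kron_all(ops) @ state, state)
    print("all stabilizers verified")
    ```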

  12. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-01-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including costs estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a ''living document'' that will be modified over the course of the execution of this work

  13. Characterizing Verified Head Impacts in High School Girls' Lacrosse.

    Science.gov (United States)

    Caswell, Shane V; Lincoln, Andrew E; Stone, Hannah; Kelshaw, Patricia; Putukian, Margot; Hepburn, Lisa; Higgins, Michael; Cortes, Nelson

    2017-12-01

    Girls' high school lacrosse players have higher rates of head and facial injuries than boys. Research indicates that these injuries are caused by stick, player, and ball contacts. Yet, no studies have characterized head impacts in girls' high school lacrosse. To characterize girls' high school lacrosse game-related impacts by frequency, magnitude, mechanism, player position, and game situation. Descriptive epidemiology study. Thirty-five female participants (mean age, 16.2 ± 1.2 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) volunteered during 28 games in the 2014 and 2015 lacrosse seasons. Participants wore impact sensors affixed to the right mastoid process before each game. All game-related impacts recorded by the sensors were verified using game video. Data were summarized for all verified impacts in terms of frequency, peak linear acceleration (PLA), and peak rotational acceleration (PRA). Descriptive statistics and impact rates were calculated. Fifty-eight verified game-related impacts ≥20 g were recorded (median PLA, 33.8 g; median PRA, 6151.1 rad/s²) during 467 player-games. The impact rate for all game-related verified impacts was 0.12 per athlete-exposure (AE) (95% CI, 0.09-0.16), equivalent to 2.1 impacts per team game, indicating that each athlete suffered fewer than 2 head impacts per season ≥20 g. Of these impacts, 28 (48.3%) were confirmed to directly strike the head, corresponding with an impact rate of 0.05 per AE (95% CI, 0.00-0.10). Overall, midfielders (n = 28, 48.3%) sustained the most impacts, followed by defenders (n = 12, 20.7%), attackers (n = 11, 19.0%), and goalies (n = 7, 12.1%). Goalies demonstrated the highest median PLA and PRA (38.8 g and 8535.0 rad/s², respectively). The most common impact mechanisms were contact with a stick (n = 25, 43.1%) and a player (n = 17, 29.3%), followed by the ball (n = 7, 12.1%) and the ground (n = 7, 12.1%). One hundred percent of ball impacts occurred to goalies. Most impacts

  14. Methods for verifying compliance with low-level radioactive waste acceptance criteria

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-09-01

    This report summarizes the methods that are currently employed and those that can be used to verify compliance with low-level radioactive waste (LLW) disposal facility waste acceptance criteria (WAC). This report presents the applicable regulations representing the Federal, State, and site-specific criteria for accepting LLW. Typical LLW generators are summarized, along with descriptions of their waste streams and final waste forms. General procedures and methods used by the LLW generators to verify compliance with the disposal facility WAC are presented. The report was written to provide an understanding of how a regulator could verify compliance with a LLW disposal facility's WAC. A comprehensive study of the methodology used to verify waste generator compliance with the disposal facility WAC is presented in this report. The study involved compiling the relevant regulations to define the WAC, reviewing regulatory agency inspection programs, and summarizing waste verification technology and equipment. The results of the study indicate that waste generators conduct verification programs that include packaging, classification, characterization, and stabilization elements. The current LLW disposal facilities perform waste verification steps on incoming shipments. A model inspection and verification program, which includes an emphasis on the generator's waste application documentation of their waste verification program, is recommended. The disposal facility verification procedures primarily involve the use of portable radiological survey instrumentation. The actual verification of generator compliance to the LLW disposal facility WAC is performed through a combination of incoming shipment checks and generator site audits.

  15. Methods for verifying compliance with low-level radioactive waste acceptance criteria

    International Nuclear Information System (INIS)

    1993-09-01

    This report summarizes the methods that are currently employed and those that can be used to verify compliance with low-level radioactive waste (LLW) disposal facility waste acceptance criteria (WAC). This report presents the applicable regulations representing the Federal, State, and site-specific criteria for accepting LLW. Typical LLW generators are summarized, along with descriptions of their waste streams and final waste forms. General procedures and methods used by the LLW generators to verify compliance with the disposal facility WAC are presented. The report was written to provide an understanding of how a regulator could verify compliance with a LLW disposal facility's WAC. A comprehensive study of the methodology used to verify waste generator compliance with the disposal facility WAC is presented in this report. The study involved compiling the relevant regulations to define the WAC, reviewing regulatory agency inspection programs, and summarizing waste verification technology and equipment. The results of the study indicate that waste generators conduct verification programs that include packaging, classification, characterization, and stabilization elements. The current LLW disposal facilities perform waste verification steps on incoming shipments. A model inspection and verification program, which includes an emphasis on the generator's waste application documentation of their waste verification program, is recommended. The disposal facility verification procedures primarily involve the use of portable radiological survey instrumentation. The actual verification of generator compliance to the LLW disposal facility WAC is performed through a combination of incoming shipment checks and generator site audits

  16. A Rigorous Treatment of Energy Extraction from a Rotating Black Hole

    Science.gov (United States)

    Finster, F.; Kamran, N.; Smoller, J.; Yau, S.-T.

    2009-05-01

    The Cauchy problem is considered for the scalar wave equation in the Kerr geometry. We prove that by choosing a suitable wave packet as initial data, one can extract energy from the black hole, thereby putting superradiance, the wave analogue of the Penrose process, into a rigorous mathematical framework. We quantify the maximal energy gain. We also compute the infinitesimal change of mass and angular momentum of the black hole, in agreement with Christodoulou’s result for the Penrose process. The main mathematical tool is our previously derived integral representation of the wave propagator.
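
    For orientation, the superradiant regime that makes such energy extraction possible is conventionally stated (in units G = c = 1) as

    \[
    0 \;<\; \omega \;<\; m\,\Omega_H, \qquad \Omega_H = \frac{a}{r_+^2 + a^2}, \quad r_+ = M + \sqrt{M^2 - a^2},
    \]

    for a mode of frequency ω and azimuthal number m, where Ω_H is the angular velocity of the horizon of a Kerr black hole with mass M and specific angular momentum a. The paper's contribution is a rigorous treatment of the corresponding energy gain for wave packets rather than single modes.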

  17. Analyzing the simplicial decomposition of spatial protein structures

    Directory of Open Access Journals (Sweden)

    Szabadka Zoltán

    2008-02-01

    Full Text Available Background: The fast-growing Protein Data Bank today contains the three-dimensional description of more than 45,000 protein and nucleic-acid structures. The large majority of the data in the PDB are measured by X-ray crystallography by thousands of researchers in millions of work-hours. Unfortunately, many structural errors, bad labels, missing atoms, and falsely identified chains and groups make the automated processing of this treasury of structural biological data difficult. Results: After performing a rigorous re-structuring of the whole PDB on a graph-theoretical basis, we created the RS-PDB (Rich-Structure PDB) database. Using this cleaned and repaired database, we defined simplicial complexes on the heavy atoms of the PDB and analyzed the tetrahedra for geometric properties. Conclusion: We have found surprisingly characteristic differences between simplices with atomic vertices of different types, and between the atomic neighborhoods (also described by simplices) of different ligand atoms in proteins.

  18. The influence of postmortem electrical stimulation on rigor mortis development, calpastatin activity, and tenderness in broiler and duck pectoralis.

    Science.gov (United States)

    Alvarado, C Z; Sams, A R

    2000-09-01

    This study was conducted to evaluate the effects of electrical stimulation (ES) on rigor mortis development, calpastatin activity, and tenderness in anatomically similar avian muscles composed primarily of either red or white muscle fibers. A total of 72 broilers and 72 White Pekin ducks were either treated with postmortem (PM) ES (450 mA) at the neck in a 1% NaCl solution for 2 s on and 1 s off for a total of 15 s or were used as nonstimulated controls. Both pectoralis muscles were harvested from the carcasses after 0.25, 1.25, and 24 h PM and analyzed for pH, inosine:adenosine ratio (R-value), sarcomere length, gravimetric fragmentation index, calpastatin activity, shear value, and cook loss. All data were analyzed within species for the effects of ES. Electrically stimulated ducks had a lower muscle pH at 0.25 and 1.25 h PM and higher R-values at 0.25 h PM compared with controls. Electrically stimulated broilers had a lower muscle pH at 1.25 h and higher R-values at 0.25 and 1.25 h PM compared with controls. Muscles of electrically stimulated broilers exhibited increased myofibrillar fragmentation at 0.25 and 1.25 h PM, whereas there was no such difference over PM time in the duck muscle. Electrical stimulation did not affect calpastatin activity in either broilers or ducks; however, the calpastatin activity of the broilers did decrease over the aging time period, whereas that of the ducks did not. Electrical stimulation decreased shear values in broilers at 1.25 h PM compared with controls; however, there was no difference in shear values of duck muscle due to ES at any sampling time. Cook loss was lower for electrically stimulated broilers at 0.25 and 1.25 h PM compared with the controls, but had no effect in the ducks. These results suggest that the red fibers of the duck pectoralis have less potential for rigor mortis acceleration and tenderization due to ES than do the white fibers of the broiler pectoralis.

  19. Seizing the Future: How Ohio's Career-Technical Education Programs Fuse Academic Rigor and Real-World Experiences to Prepare Students for College and Careers

    Science.gov (United States)

    Guarino, Heidi; Yoder, Shaun

    2015-01-01

    "Seizing the Future: How Ohio's Career and Technical Education Programs Fuse Academic Rigor and Real-World Experiences to Prepare Students for College and Work," demonstrates Ohio's progress in developing strong policies for career and technical education (CTE) programs to promote rigor, including college- and career-ready graduation…

  20. The Challenge of Timely, Responsive and Rigorous Ethics Review of Disaster Research: Views of Research Ethics Committee Members.

    Directory of Open Access Journals (Sweden)

    Matthew Hunt

    Full Text Available Research conducted following natural disasters such as earthquakes, floods or hurricanes is crucial for improving relief interventions. Such research, however, poses ethical, methodological and logistical challenges for researchers. Oversight of disaster research also poses challenges for research ethics committees (RECs), in part due to the rapid turnaround needed to initiate research after a disaster. Currently, there is limited knowledge available about how RECs respond to and appraise disaster research. To address this knowledge gap, we investigated the experiences of REC members who had reviewed disaster research conducted in low- or middle-income countries. We used interpretive description methodology and conducted in-depth interviews with 15 respondents. Respondents were chairs, members, advisors, or coordinators from 13 RECs, including RECs affiliated with universities, governments, international organizations, a for-profit REC, and an ad hoc committee established during a disaster. Interviews were analyzed inductively using constant comparative techniques. Through this process, three elements were identified as characterizing effective and high-quality review: timeliness, responsiveness and rigorousness. To ensure timeliness, many RECs rely on adaptations of review procedures for urgent protocols. Respondents emphasized that responsive review requires awareness of and sensitivity to the particularities of disaster settings and disaster research. Rigorous review was linked with providing careful assessment of ethical considerations related to the research, as well as ensuring independence of the review process. Both the frequency of disasters and the conduct of disaster research are on the rise. Ensuring effective and high quality review of disaster research is crucial, yet challenges, including time pressures for urgent protocols, exist for achieving this goal. Adapting standard REC procedures may be necessary. However, steps should be

  1. Making Digital Artifacts on the Web Verifiable and Reliable

    NARCIS (Netherlands)

    Kuhn, T.; Dumontier, M.

    2015-01-01

    The current Web has no general mechanisms to make digital artifacts - such as datasets, code, texts, and images - verifiable and permanent. For digital artifacts that are supposed to be immutable, there is moreover no commonly accepted method to enforce this immutability. These shortcomings have a

  2. Effect of muscle restraint on sheep meat tenderness with rigor mortis at 18°C.

    Science.gov (United States)

    Devine, Carrick E; Payne, Steven R; Wells, Robyn W

    2002-02-01

    The effect of skeletal restraint, and of removing muscles from lamb m. longissimus thoracis et lumborum (LT) immediately after slaughter and electrical stimulation, on shear force was investigated at a rigor temperature of 18°C (n=15). The temperature of 18°C was achieved by chilling electrically stimulated sheep carcasses in air at 12°C, with an air flow of 1-1.5 m/s. In other groups, the muscle was removed at 2.5 h post-mortem and either wrapped or left non-wrapped before being placed back on the carcass to follow carcass cooling regimes. Following rigor mortis, the meat was aged for 0, 16, 40 and 65 h at 15°C and frozen. For the non-stimulated samples, the meat was aged for 0, 12, 36 and 60 h before being frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values were obtained from a 1 × 1 cm cross-section. Commencement of ageing was considered to take place at rigor mortis, and this was taken as zero-aged meat. There were no significant differences in the rate of tenderisation and initial shear force for all treatments. The 23% cook loss was similar for all wrapped and non-wrapped situations and the values decreased slightly with longer ageing durations. Wrapping was shown to mimic meat left intact on the carcass, as it prevented significant prerigor shortening. Such techniques allow muscles to be removed and placed in a controlled temperature environment to enable precise studies of ageing processes.

  3. Rigorous time slicing approach to Feynman path integrals

    CERN Document Server

    Fujiwara, Daisuke

    2017-01-01

    This book proves that Feynman's original definition of the path integral actually converges to the fundamental solution of the Schrödinger equation, at least for short times, if the potential is differentiable sufficiently many times and its derivatives of order two and higher are bounded. The semi-classical asymptotic formula up to the second term of the fundamental solution is also proved, by a method different from that of Birkhoff, together with a bound on the remainder term. The Feynman path integral is a method of quantization using the Lagrangian function, whereas Schrödinger's quantization uses the Hamiltonian function. These two methods are believed to be equivalent, but the equivalence has not been fully proved mathematically because, compared with Schrödinger's method, there is still much to be done concerning the rigorous mathematical treatment of Feynman's method. Feynman himself defined a path integral as the limit of a sequence of integrals over finite-dimensional spaces which is obtained by...

  4. 77 FR 70484 - Preoperational Testing of Onsite Electric Power Systems To Verify Proper Load Group Assignments...

    Science.gov (United States)

    2012-11-26

    ...-1294, ``Preoperational Testing of On-Site Electric Power Systems to Verify Proper Load Group Assignments, Electrical Separation, and Redundancy...''

  5. Ghosts for Lists: A Critical Module of Contiki Verified in Frama-C

    OpenAIRE

    Blanchard , Allan; Kosmatov , Nikolai; Loulergue , Frédéric

    2018-01-01

    Internet of Things (IoT) applications are becoming increasingly critical and require rigorous formal verification. In this paper we target Contiki, a widely used open-source OS for IoT, and present a verification case study of one of its most critical modules: that of linked lists. Its API and list representation differ from the classical linked list implementations, and are particularly challenging for deductive verification. The proposed verification technique relies...

  6. TrustGuard: A Containment Architecture with Verified Output

    Science.gov (United States)

    2017-01-01

    ...that the TrustGuard system has minimal performance decline, despite restrictions such as high communication latency and limited available bandwidth... Key assumptions of the design are the availability of high bandwidth and low delays between the host and the monitoring chip; 3-D integration provides an alternate way of... (From ``TrustGuard: A Containment Architecture with Verified Output,'' a dissertation presented to the faculty of Princeton University by Soumyadeep Ghosh.)

  7. Verifying a smart design of TCAP : a synergetic experience

    NARCIS (Netherlands)

    T. Arts; I.A. van Langevelde

    1999-01-01

    An optimisation of the SS No. 7 Transport Capabilities Procedures is verified by specifying both the original and the optimised TCAP in µCRL, generating transition systems for both using the µCRL tool set, and checking weak bisimulation

  8. Rigorous modelling of light's intensity angular-profile in Abbe refractometers with absorbing homogeneous fluids

    International Nuclear Information System (INIS)

    García-Valenzuela, A; Contreras-Tello, H; Márquez-Islas, R; Sánchez-Pérez, C

    2013-01-01

    We derive an optical model for the light intensity distribution around the critical angle in a standard Abbe refractometer when used on absorbing homogeneous fluids. The model is developed using rigorous electromagnetic optics. The obtained formula is very simple and can be used suitably in the analysis and design of optical sensors relying on Abbe-type refractometry.

  9. A rigorous phenomenological analysis of the ππ scattering lengths

    International Nuclear Information System (INIS)

    Caprini, I.; Dita, P.; Sararu, M.

    1979-11-01

    The constraining power that the present experimental data, combined with general theoretical knowledge about ππ scattering, exert upon the scattering lengths of this process is investigated by means of a rigorous functional method. We take as input the experimental phase shifts and make no hypotheses about the high-energy behaviour of the amplitudes, using only absolute bounds derived from axiomatic field theory and exact consequences of crossing symmetry. In the simplest application of the method, involving only the π⁰π⁰ S-wave, we explored numerically a number of values proposed by various authors for the scattering lengths a₀ and a₂ and found that none appears to be especially favoured. (author)

  10. A Generalized Method for the Comparable and Rigorous Calculation of the Polytropic Efficiencies of Turbocompressors

    Science.gov (United States)

    Dimitrakopoulos, Panagiotis

    2018-03-01

    The calculation of polytropic efficiencies is a very important task, especially during the development of new compression units, such as compressor impellers, stages and stage groups. Such calculations are also crucial for determining the performance of a whole compressor. As processors and computational capacities have improved substantially in recent years, the need emerged for a new, rigorous, robust, accurate and at the same time standardized method for computing polytropic efficiencies, especially one based on the thermodynamics of real gases. The proposed method is based on the rigorous definition of the polytropic efficiency. The input consists of pressure and temperature values at the end points of the compression path (suction and discharge) for a given working fluid. The average relative error for the studied cases was 0.536%. This high-accuracy method is therefore proposed for efficiency calculations related to turbocompressors and their compression units, especially when they operate at high power levels, for example in jet engines and high-power plants.
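
    The abstract states the inputs (suction and discharge pressure and temperature for a given working fluid) but not the algorithm itself. Below is a minimal sketch of one common stepwise real-gas scheme, in Python with the CoolProp property library: the compression path is divided into small pressure steps, each step's isentropic enthalpy rise is charged with a trial polytropic efficiency, and the efficiency is bisected until the measured discharge state is reproduced. The fluid, state values and step count are illustrative assumptions, not values from the paper.

        # Stepwise real-gas polytropic efficiency -- illustrative sketch only.
        from CoolProp.CoolProp import PropsSI

        def discharge_enthalpy(eta_p, p1, T1, p2, fluid, steps=200):
            """March along the compression path in small pressure steps,
            charging each isentropic enthalpy rise with efficiency eta_p."""
            p = p1
            h = PropsSI('H', 'P', p1, 'T', T1, fluid)
            dp = (p2 - p1) / steps
            for _ in range(steps):
                s = PropsSI('S', 'P', p, 'H', h, fluid)          # current entropy
                h_is = PropsSI('H', 'P', p + dp, 'S', s, fluid)  # isentropic step
                h += (h_is - h) / eta_p                          # actual enthalpy rise
                p += dp
            return h

        def polytropic_efficiency(p1, T1, p2, T2, fluid='Air'):
            """Bisect eta_p until the stepwise path hits the measured discharge."""
            h2_meas = PropsSI('H', 'P', p2, 'T', T2, fluid)
            lo, hi = 0.5, 1.0
            for _ in range(60):
                mid = 0.5 * (lo + hi)
                # too low an efficiency guess overshoots the discharge enthalpy
                if discharge_enthalpy(mid, p1, T1, p2, fluid) > h2_meas:
                    lo = mid
                else:
                    hi = mid
            return 0.5 * (lo + hi)

        print(polytropic_efficiency(1e5, 293.15, 8e5, 573.15))  # example states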

  11. Quality of nuchal translucency measurements correlates with broader aspects of program rigor and culture of excellence.

    Science.gov (United States)

    Evans, Mark I; Krantz, David A; Hallahan, Terrence; Sherwin, John; Britt, David W

    2013-01-01

    To determine if nuchal translucency (NT) quality correlates with the extent to which clinics vary in rigor and quality control. We correlated NT performance quality (bias and precision) of 246,000 patients with two alternative measures of clinic culture: the % of cases for whom nasal bone (NB) measurements were performed and the % of requisitions correctly filled for race-ethnicity and weight. When requisition errors occurred in 5% (33%), the curve lowered to 0.93 MoM (p ...); when NB measurements were performed in > 90%, the MoM was 0.99 compared to those ... A culture of quality exists independent of individual variation in NT quality, and two divergent indices of program rigor are associated with NT quality. Quality control must be program wide, and to effect continued improvement in the quality of NT results across time, the cultures of clinics must become a target for intervention. Copyright © 2013 S. Karger AG, Basel.

  12. Comparison of rigorous modelling of different structure profiles on photomasks for quantitative linewidth measurements by means of UV- or DUV-optical microscopy

    Science.gov (United States)

    Ehret, Gerd; Bodermann, Bernd; Woehler, Martin

    2007-06-01

    Optical microscopy is an important instrument for the dimensional characterisation or calibration of micro- and nanostructures, e.g. chrome structures on photomasks. In comparison to scanning electron microscopy (possible contamination of the sample) and atomic force microscopy (slow, risk of damage), optical microscopy is a fast and non-destructive metrology method. The precise quantitative determination of the linewidth from the microscope image is, however, only possible with knowledge of the geometry of the structures and its consideration in the optical modelling. We compared two different rigorous model approaches, the Rigorous Coupled Wave Analysis (RCWA) and the Finite Element Method (FEM), for modelling structures with different edge angles, linewidths, line-to-space ratios and polarisations. The RCWA method can represent inclined edge profiles only by a staircase approximation, leading to increased modelling errors. Even today's sophisticated rigorous methods still show problems with TM polarisation, so both rigorous methods are compared in terms of their convergence for TE and TM polarisation. Beyond that, the influence of typical illumination wavelengths (365 nm, 248 nm and 193 nm) on the microscope images and their contribution to the measurement uncertainty budget is also discussed.

  13. Verifying Correct Usage of Atomic Blocks and Typestate: Technical Companion

    National Research Council Canada - National Science Library

    Beckman, Nels E; Aldrich, Jonathan

    2008-01-01

    In this technical report, we present a static and dynamic semantics as well as a proof of soundness for a programming language presented in the paper entitled, 'Verifying Correct Usage of Atomic Blocks and Typestate...

  14. From everyday communicative figurations to rigorous audience news repertoires

    DEFF Research Database (Denmark)

    Kobbernagel, Christian; Schrøder, Kim Christian

    2016-01-01

    In the last couple of decades there has been an unprecedented explosion of news media platforms and formats, as a succession of digital and social media have joined the ranks of legacy media. We live in a ‘hybrid media system’ (Chadwick, 2013), in which people build their cross-media news...... repertoires from the ensemble of old and new media available. This article presents an innovative mixed-method approach with considerable explanatory power to the exploration of patterns of news media consumption. This approach tailors Q-methodology in the direction of a qualitative study of news consumption......, in which a card sorting exercise serves to translate the participants’ news media preferences into a form that enables the researcher to undertake a rigorous factor-analytical construction of their news consumption repertoires. This interpretive, factor-analytical procedure, which results in the building...

  15. Development of material measures for performance verifying surface topography measuring instruments

    International Nuclear Information System (INIS)

    Leach, Richard; Giusca, Claudiu; Rickens, Kai; Riemer, Oltmann; Rubert, Paul

    2014-01-01

    The development of two irregular-geometry material measures for performance verifying surface topography measuring instruments is described. The material measures are designed to be used to performance verify tactile and optical areal surface topography measuring instruments. The manufacture of the material measures using diamond turning followed by nickel electroforming is described in detail. Measurement results are then obtained using a traceable stylus instrument and a commercial coherence scanning interferometer, and the results are shown to agree to within the measurement uncertainties. The material measures are now commercially available as part of a suite of material measures aimed at the calibration and performance verification of areal surface topography measuring instruments

  16. The AutoProof Verifier: Usability by Non-Experts and on Standard Code

    Directory of Open Access Journals (Sweden)

    Carlo A. Furia

    2015-08-01

    Full Text Available Formal verification tools are often developed by experts for experts; as a result, their usability by programmers with little formal methods experience may be severely limited. In this paper, we discuss this general phenomenon with reference to AutoProof: a tool that can verify the full functional correctness of object-oriented software. In particular, we present our experiences of using AutoProof in two contrasting contexts representative of non-expert usage. First, we discuss its usability by students in a graduate course on software verification, who were tasked with verifying implementations of various sorting algorithms. Second, we evaluate its usability in verifying code developed for programming assignments of an undergraduate course. The first scenario represents usability by serious non-experts; the second represents usability on "standard code", developed without full functional verification in mind. We report our experiences and lessons learnt, from which we derive some general suggestions for furthering the development of verification tools with respect to improving their usability.

  17. Rigorous lower bound on the dynamic critical exponent of some multilevel Swendsen-Wang algorithms

    International Nuclear Information System (INIS)

    Li, X.; Sokal, A.D.

    1991-01-01

    We prove the rigorous lower bound z_exp ≥ α/ν for the dynamic critical exponent of a broad class of multilevel (or "multigrid") variants of the Swendsen-Wang algorithm. This proves that such algorithms do suffer from critical slowing down. We conjecture that such algorithms in fact lie in the same dynamic universality class as the standard Swendsen-Wang algorithm.

  18. Beyond the RCT: Integrating Rigor and Relevance to Evaluate the Outcomes of Domestic Violence Programs

    Science.gov (United States)

    Goodman, Lisa A.; Epstein, Deborah; Sullivan, Cris M.

    2018-01-01

    Programs for domestic violence (DV) victims and their families have grown exponentially over the last four decades. The evidence demonstrating the extent of their effectiveness, however, often has been criticized as stemming from studies lacking scientific rigor. A core reason for this critique is the widespread belief that credible evidence can…

  19. Verifying Temporal Properties of Reactive Systems by Transformation

    OpenAIRE

    Hamilton, Geoff

    2015-01-01

    We show how program transformation techniques can be used for the verification of both safety and liveness properties of reactive systems. In particular, we show how the program transformation technique distillation can be used to transform reactive systems specified in a functional language into a simplified form that can subsequently be analysed to verify temporal properties of the systems. Example systems which are intended to model mutual exclusion are analysed using these techniques with...

  20. Finite-Time Synchronization of Chaotic Systems with Different Dimension and Secure Communication

    Directory of Open Access Journals (Sweden)

    Shouquan Pang

    2016-01-01

    Full Text Available Finite-time synchronization of chaotic systems with different dimensions, and its application to secure communication, are investigated. It is rigorously proven that global finite-time synchronization can be achieved between the three-dimensional Lorenz chaotic system and the four-dimensional Lorenz hyperchaotic system, whether their parameters are certain or uncertain. Electronic circuits for the finite-time synchronization are designed in Multisim 12 to verify our conclusion, and the application to secure communications is also analyzed and discussed.
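
    As a rough illustration of the kind of master-slave synchronization being verified, the Python sketch below couples two identical three-dimensional Lorenz systems through a simple linear error feedback and shows the synchronization error decaying. It is not the paper's finite-time controller and does not treat systems of different dimension; the gain k = 20 is an arbitrary assumption that is large enough in practice.

        # Drive-response Lorenz synchronization via linear error feedback (sketch).
        import numpy as np
        from scipy.integrate import solve_ivp

        sigma, rho, beta, k = 10.0, 28.0, 8.0 / 3.0, 20.0

        def lorenz(u):
            x, y, z = u
            return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

        def coupled(t, state):
            drive, response = state[:3], state[3:]
            return np.concatenate([lorenz(drive),
                                   lorenz(response) - k * (response - drive)])

        sol = solve_ivp(coupled, (0.0, 10.0), [1, 1, 1, -5, 7, 20], dense_output=True)
        t = np.linspace(0.0, 10.0, 1000)
        err = np.linalg.norm(sol.sol(t)[3:] - sol.sol(t)[:3], axis=0)
        print("final synchronization error:", err[-1])  # decays toward zero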

  1. Application of Asymptotic and Rigorous Techniques for the Characterization of Interferences Caused by a Wind Turbine in Its Neighborhood

    Directory of Open Access Journals (Sweden)

    Maria Jesús Algar

    2013-01-01

    Full Text Available This paper presents a complete assessment of the interference caused in nearby radio systems by wind turbines. Three different parameters have been considered: the scattered field of a wind turbine, its radar cross-section (RCS), and the Doppler shift generated by the rotating movement of the blades. These predictions are very useful for studying the influence of wind farms on radio systems. To achieve this, both high-frequency techniques, such as the Geometrical Theory of Diffraction/Uniform Theory of Diffraction (GTD/UTD) and Physical Optics (PO), and rigorous techniques, like the Method of Moments (MoM), have been used. In the analysis of the scattered field, both conducting and dielectric models of the wind turbine have been analyzed, so that realistic results can be obtained. For all cases under analysis, the wind turbine has been modeled with NURBS (Non-Uniform Rational B-Spline) surfaces, since they allow the real shape of the object to be accurately replicated with very little information.

  2. Dosimetric effects of edema in permanent prostate seed implants: a rigorous solution

    International Nuclear Information System (INIS)

    Chen Zhe; Yue Ning; Wang Xiaohong; Roberts, Kenneth B.; Peschel, Richard; Nath, Ravinder

    2000-01-01

    Purpose: To derive a rigorous analytic solution for the dosimetric effects of prostate edema so that its impact on conventional pre-implant and post-implant dosimetry can be studied for any given radioactive isotope and edema characteristics. Methods and Materials: The edema characteristics observed by Waterman et al (Int. J. Rad. Onc. Biol. Phys, 41:1069-1077; 1998) were used to model the time evolution of the prostate and the seed locations. The total dose to any part of the prostate tissue from a seed implant was calculated analytically by parameterizing the dose fall-off from a radioactive seed as a single inverse power function of distance, with proper account of the edema-induced time evolution. The dosimetric impact of prostate edema was determined by comparing the dose calculated with full consideration of prostate edema to that calculated with the conventional dosimetry approach, where the seed locations and the target volume are assumed to be stationary. Results: A rigorous analytic solution for the relative dosimetric effects of prostate edema was obtained. This solution proved explicitly that the relative dosimetric effects of edema, as found in the previous numerical studies by Yue et al. (Int. J. Radiat. Oncol. Biol. Phys. 43, 447-454, 1999), are independent of the size and the shape of the implant target volume and of the number and locations of the seeds implanted. It also showed that the magnitude of the relative dosimetric effects is independent of the location of the dose evaluation point within the edematous target volume. This implies that the relative dosimetric effects of prostate edema are universal with respect to a given isotope and edema characteristic. A set of master tables of the relative dosimetric effects of edema was obtained for a wide range of edema characteristics for both ¹²⁵I and ¹⁰³Pd prostate seed implants. Conclusions: A rigorous analytic solution of the relative dosimetric effects of prostate edema has been

  3. "Snow White" Coating Protects SpaceX Dragon's Trunk Against Rigors of Space

    Science.gov (United States)

    McMahan, Tracy

    2013-01-01

    He described it as "snow white." But NASA astronaut Don Pettit was not referring to the popular children's fairy tale. Rather, he was talking about the white coating of the Space Exploration Technologies Corp. (SpaceX) Dragon spacecraft, which reflected the International Space Station's light. As it approached the station for the first time in May 2012, the Dragon's trunk might have been described as the "fairest of them all" for its pristine coating, allowing Pettit to clearly see to maneuver the robotic arm to grab the Dragon for a successful nighttime berthing. This protective thermal control coating, developed by Alion Science and Technology Corp., based in McLean, Va., made its bright appearance again with the March 1 launch of SpaceX's second commercial resupply mission. Named Z-93C55, the coating was applied to the cargo portion of the Dragon to protect it from the rigors of space. "For decades, Alion has produced coatings to protect against the rigors of space," said Michael Kenny, senior chemist with Alion. "As space missions evolved, there was a growing need to dissipate electrical charges that build up on the exteriors of spacecraft, or there could be damage to the spacecraft's electronics. Alion's research led us to develop materials that would meet this goal while also providing thermal controls. The outcome of this research was Alion's proprietary Z-93C55 coating."

  4. Study design elements for rigorous quasi-experimental comparative effectiveness research.

    Science.gov (United States)

    Maciejewski, Matthew L; Curtis, Lesley H; Dowd, Bryan

    2013-03-01

    Quasi-experiments are likely to be the workhorse study design used to generate evidence about the comparative effectiveness of alternative treatments, because of their feasibility, timeliness, affordability and external validity compared with randomized trials. In this review, we outline potential sources of discordance in results between quasi-experiments and experiments, review study design choices that can improve the internal validity of quasi-experiments, and outline innovative data linkage strategies that may be particularly useful in quasi-experimental comparative effectiveness research. There is an urgent need to resolve the debate about the evidentiary value of quasi-experiments since equal consideration of rigorous quasi-experiments will broaden the base of evidence that can be brought to bear in clinical decision-making and governmental policy-making.

  5. Effects of well-boat transportation on the muscle pH and onset of rigor mortis in Atlantic salmon.

    Science.gov (United States)

    Gatica, M C; Monti, G; Gallo, C; Knowles, T G; Warriss, P D

    2008-07-26

    During the transport of salmon (Salmo salar) in a well-boat, 10 fish were sampled at each of six stages: in cages after crowding at the farm (stage 1), in the well-boat after loading (stage 2), in the well-boat after eight hours transport and before unloading (stage 3), in the resting cages immediately after unloading (stage 4), after 24 hours resting in cages (stage 5), and in the processing plant after pumping from the resting cages (stage 6). The water in the well-boat was at ambient temperature with recirculation to the sea. At each stage the fish were stunned percussively and bled by gill cutting. Immediately after death, and then every three hours for 18 hours, the muscle pH and rigor index of the fish were measured. At successive stages the initial muscle pH of the fish decreased, except for a slight gain at stage 5, after they had been rested for 24 hours. The lowest initial muscle pH was observed at stage 6. The fishes' rigor index showed that rigor developed more quickly at each successive stage, except for a slight decrease in rate at stage 5, attributable to the recovery of muscle reserves.

  6. Conformational distributions and proximity relationships in the rigor complex of actin and myosin subfragment-1.

    Science.gov (United States)

    Nyitrai, M; Hild, G; Lukács, A; Bódis, E; Somogyi, B

    2000-01-28

    Cyclic conformational changes in the myosin head are considered essential for muscle contraction. We hereby show that the extension of the fluorescence resonance energy transfer method described originally by Taylor et al. (Taylor, D. L., Reidler, J., Spudich, J. A., and Stryer, L. (1981) J. Cell Biol. 89, 362-367) allows determination of the position of a labeled point outside the actin filament in supramolecular complexes and also characterization of the conformational heterogeneity of an actin-binding protein while considering donor-acceptor distance distributions. Using this method we analyzed proximity relationships between two labeled points of S1 and the actin filament in the acto-S1 rigor complex. The donor (N-[[(iodoacetyl)amino]ethyl]-5-naphthylamine-1-sulfonate) was attached to either the catalytic domain (Cys-707) or the essential light chain (Cys-177) of S1, whereas the acceptor (5-(iodoacetamido)fluorescein) was attached to the actin filament (Cys-374). In contrast to the narrow positional distribution (assumed as being Gaussian) of Cys-707 (5 +/- 3 A), the positional distribution of Cys-177 was found to be broad (102 +/- 4 A). Such a broad positional distribution of the label on the essential light chain of S1 may be important in accommodating the helically arranged acto-myosin binding relative to the filament axis.
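
    The quantitative backbone of such distance measurements is the standard Förster relation (general FRET theory, not a formula quoted from this paper): for a single donor-acceptor distance r the transfer efficiency is E(r), and for a distance distribution p(r) the measured efficiency is its average,

        E(r) = \frac{1}{1 + (r/R_0)^6}, \qquad
        \langle E \rangle = \int_0^\infty p(r)\,\frac{R_0^6}{R_0^6 + r^6}\,dr,

    where R₀ is the Förster radius of the dye pair; fitting ⟨E⟩ with a parameterized p(r), e.g. a Gaussian of given mean and width, is what yields positional distributions of the kind quoted above.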

  7. Rigorous Combination of GNSS and VLBI: How it Improves Earth Orientation and Reference Frames

    Science.gov (United States)

    Lambert, S. B.; Richard, J. Y.; Bizouard, C.; Becker, O.

    2017-12-01

    Current reference series (C04) of the International Earth Rotation and Reference Systems Service (IERS) are produced by a weighted combination of Earth orientation parameter (EOP) time series built up by combination centers of each technique (VLBI, GNSS, laser ranging, DORIS). In the future, we plan to derive EOP from a rigorous combination of the normal equation systems of the four techniques. We present here the results of a rigorous combination of VLBI and GNSS pre-reduced, constraint-free normal equations with the DYNAMO geodetic analysis software package developed and maintained by the French GRGS (Groupe de Recherche en Géodésie Spatiale). The normal equations used are those produced separately by the IVS and IGS combination centers, to which we apply our own minimal constraints. We address the usefulness of such a method with respect to the classical, a posteriori combination method, and we show whether EOP determinations are improved. In particular, we implement external validations of the EOP series based on comparison with geophysical excitation and examination of the covariance matrices. Finally, we address the potential of the technique for the next generation of celestial reference frames, which are currently determined by VLBI only.

  8. Rigorous vector wave propagation for arbitrary flat media

    Science.gov (United States)

    Bos, Steven P.; Haffert, Sebastiaan Y.; Keller, Christoph U.

    2017-08-01

    Precise modelling of the (off-axis) point spread function (PSF) to identify geometrical and polarization aberrations is important for many optical systems. In order to characterise the PSF of the system in all Stokes parameters, an end-to-end simulation of the system has to be performed in which Maxwell's equations are rigorously solved. We present the first results of a Python code that we are developing to perform multiscale end-to-end wave propagation simulations that include all relevant physics. Currently we can handle plane-parallel near- and far-field vector diffraction effects of propagating waves in homogeneous isotropic and anisotropic materials, refraction and reflection at flat parallel surfaces, interference effects in thin films, and unpolarized light. We show that the code has a numerical precision on the order of 10⁻¹⁶ for non-absorbing isotropic and anisotropic materials. For absorbing materials the precision is on the order of 10⁻⁸. The capabilities of the code are demonstrated by simulating a converging beam reflecting from a flat aluminium mirror at normal incidence.

  9. Dynamics of harmonically-confined systems: Some rigorous results

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Zhigang, E-mail: zwu@physics.queensu.ca; Zaremba, Eugene, E-mail: zaremba@sparky.phy.queensu.ca

    2014-03-15

    In this paper we consider the dynamics of harmonically-confined atomic gases. We present various general results which are independent of particle statistics, interatomic interactions and dimensionality. Of particular interest is the response of the system to external perturbations which can be either static or dynamic in nature. We prove an extended Harmonic Potential Theorem which is useful in determining the damping of the centre of mass motion when the system is prepared initially in a highly nonequilibrium state. We also study the response of the gas to a dynamic external potential whose position is made to oscillate sinusoidally in a given direction. We show in this case that either the energy absorption rate or the centre of mass dynamics can serve as a probe of the optical conductivity of the system. -- Highlights: •We derive various rigorous results on the dynamics of harmonically-confined atomic gases. •We derive an extension of the Harmonic Potential Theorem. •We demonstrate the link between the energy absorption rate in a harmonically-confined system and the optical conductivity.
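
    The classical statement underlying the Harmonic Potential Theorem, given here in its standard Kohn-theorem form rather than as the paper's extension, is that for interactions depending only on relative coordinates the centre of mass decouples and obeys the bare trap equation of motion:

        H = \sum_i \frac{p_i^2}{2m} + \sum_i \frac{1}{2}\,m\omega^2 r_i^2 + \sum_{i<j} v(\mathbf r_i - \mathbf r_j)
        \;\;\Longrightarrow\;\; \ddot{\mathbf R} = -\omega^2 \mathbf R,

    so the centre of mass oscillates at the trap frequency ω regardless of particle statistics, interactions or dimensionality.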

  10. Rigorous theory of molecular orientational nonlinear optics

    International Nuclear Information System (INIS)

    Kwak, Chong Hoon; Kim, Gun Yeup

    2015-01-01

    Classical statistical mechanics of the molecular optics theory proposed by Buckingham [A. D. Buckingham and J. A. Pople, Proc. Phys. Soc. A 68, 905 (1955)] has been extended to describe the field induced molecular orientational polarization effects on nonlinear optics. In this paper, we present the generalized molecular orientational nonlinear optical processes (MONLO) through the calculation of the classical orientational averaging using the Boltzmann type time-averaged orientational interaction energy in the randomly oriented molecular system under the influence of applied electric fields. The focal points of the calculation are (1) the derivation of rigorous tensorial components of the effective molecular hyperpolarizabilities, (2) the molecular orientational polarizations and the electronic polarizations including the well-known third-order dc polarization, dc electric field induced Kerr effect (dc Kerr effect), optical Kerr effect (OKE), dc electric field induced second harmonic generation (EFISH), degenerate four wave mixing (DFWM) and third harmonic generation (THG). We also present some of the new predictive MONLO processes. For second-order MONLO, second-order optical rectification (SOR), Pockels effect and difference frequency generation (DFG) are described in terms of the anisotropic coefficients of first hyperpolarizability. And, for third-order MONLO, third-order optical rectification (TOR), dc electric field induced difference frequency generation (EFIDFG) and pump-probe transmission are presented

  11. A new look at the statistical assessment of approximate and rigorous methods for the estimation of stabilized formation temperatures in geothermal and petroleum wells

    International Nuclear Information System (INIS)

    Espinoza-Ojeda, O M; Santoyo, E; Andaverde, J

    2011-01-01

    Approximate and rigorous solutions of seven heat transfer models were statistically examined, for the first time, to estimate stabilized formation temperatures (SFT) of geothermal and petroleum boreholes. Constant linear and cylindrical heat source models were used to describe the heat flow (either conductive or conductive/convective) involved during a borehole drilling. A comprehensive statistical assessment of the major error sources associated with the use of these models was carried out. The mathematical methods (based on approximate and rigorous solutions of heat transfer models) were thoroughly examined by using four statistical analyses: (i) the use of linear and quadratic regression models to infer the SFT; (ii) the application of statistical tests of linearity to evaluate the actual relationship between bottom-hole temperatures and time function data for each selected method; (iii) the comparative analysis of SFT estimates between the approximate and rigorous predictions of each analytical method using a β ratio parameter to evaluate the similarity of both solutions, and (iv) the evaluation of accuracy in each method using statistical tests of significance, and deviation percentages between 'true' formation temperatures and SFT estimates (predicted from approximate and rigorous solutions). The present study also enabled us to determine the sensitivity parameters that should be considered for a reliable calculation of SFT, as well as to define the main physical and mathematical constraints where the approximate and rigorous methods could provide consistent SFT estimates

  12. Community historians and the dilemma of rigor vs relevance : A comment on Danziger and van Rappard

    NARCIS (Netherlands)

    Dehue, Trudy

    1998-01-01

    Since the transition from finalism to contextualism, the history of science seems to be caught up in a basic dilemma. Many historians fear that with the new contextualist standards of rigorous historiography, historical research can no longer be relevant to working scientists themselves. The present

  13. Student’s rigorous mathematical thinking based on cognitive style

    Science.gov (United States)

    Fitriyani, H.; Khasanah, U.

    2017-12-01

    The purpose of this research was to determine the rigorous mathematical thinking (RMT) of mathematics education students solving math problems, in terms of reflective and impulsive cognitive styles. The research used a descriptive qualitative approach. The subjects were 4 students, a male and a female for each of the reflective and impulsive cognitive styles. Data were collected through a problem-solving test and interviews, and analyzed using the Miles and Huberman model: data reduction, data presentation, and drawing conclusions. The results showed that the impulsive male subject used all three levels of the cognitive functions required for RMT (qualitative thinking, quantitative thinking with precision, and relational thinking), while the other three subjects were only able to use cognitive functions at the qualitative thinking level of RMT. The impulsive male subject therefore has a better RMT ability than the other three research subjects.

  14. A Novel Simple Phantom for Verifying the Dose of Radiation Therapy

    Directory of Open Access Journals (Sweden)

    J. H. Lee

    2015-01-01

    Full Text Available A standard protocol of dosimetric measurements is used by the organizations responsible for verifying that the doses delivered in radiation-therapy institutions are within authorized limits. This study evaluated a self-designed simple auditing phantom for verifying the dose of radiation therapy; the phantom design, dose audit system, and clinical tests are described. Thermoluminescent dosimeters (TLDs) were used as postal dosimeters, and mailable phantoms were produced for use in postal audits. Correction factors are important for converting TLD readout values from phantoms into the absorbed dose in water. The phantom scatter correction factor was used to quantify the difference in the scattered dose between a solid water phantom and the homemade phantoms; its value ranged from 1.031 to 1.084. The energy-dependence correction factor was used to compare the TLD readout for a unit dose irradiated at the audit beam energies with that for ⁶⁰Co in the solid water phantom; its value was 0.99 to 1.01. The setup-condition factor was used to correct for differences in dose-output calibration conditions. Clinical tests of the device calibrating the dose output revealed that the dose deviation was within 3%. Therefore, our homemade phantoms and dosimetric system can be applied for accurately verifying the doses applied in radiation-therapy institutions.
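
    The conversion chain described above is simple multiplicative arithmetic. A minimal sketch in Python, with example factor values chosen from the quoted ranges rather than from any particular audit:

        # Converting a mailed-TLD readout into absorbed dose to water (sketch).
        def absorbed_dose(tld_reading_gy, f_phantom=1.05, f_energy=1.00, f_setup=1.00):
            """Chain the phantom-scatter, energy-dependence and setup-condition
            correction factors; the defaults are illustrative values taken from
            the ranges reported in the abstract (1.031-1.084 and 0.99-1.01)."""
            return tld_reading_gy * f_phantom * f_energy * f_setup

        print(f"{absorbed_dose(1.96):.2f} Gy")  # example readout of 1.96 Gy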

  15. Verifiable process monitoring through enhanced data authentication

    International Nuclear Information System (INIS)

    Goncalves, Joao G.M.; Schwalbach, Peter; Schoeneman, Barry Dale; Ross, Troy D.; Baldwin, George Thomas

    2010-01-01

    To ensure the peaceful intent for production and processing of nuclear fuel, verifiable process monitoring of the fuel production cycle is required. As part of a U.S. Department of Energy (DOE)-EURATOM collaboration in the field of international nuclear safeguards, the DOE Sandia National Laboratories (SNL), the European Commission Joint Research Centre (JRC) and Directorate General-Energy (DG-ENER) developed and demonstrated a new concept in process monitoring, enabling the use of operator process information by branching a second, authenticated data stream to the Safeguards inspectorate. This information would be complementary to independent safeguards data, improving the understanding of the plant's operation. The concept is called the Enhanced Data Authentication System (EDAS). EDAS transparently captures, authenticates, and encrypts communication data that is transmitted between operator control computers and connected analytical equipment utilized in nuclear processes controls. The intent is to capture information as close to the sensor point as possible to assure the highest possible confidence in the branched data. Data must be collected transparently by the EDAS: Operator processes should not be altered or disrupted by the insertion of the EDAS as a monitoring system for safeguards. EDAS employs public key authentication providing 'jointly verifiable' data and private key encryption for confidentiality. Timestamps and data source are also added to the collected data for analysis. The core of the system hardware is in a security enclosure with both active and passive tamper indication. Further, the system has the ability to monitor seals or other security devices in close proximity. This paper will discuss the EDAS concept, recent technical developments, intended application philosophy and the planned future progression of this system.
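
    The record describes the flow conceptually (capture near the sensor, add timestamp and data source, sign for joint verifiability, encrypt for confidentiality) without code-level detail. A minimal Python sketch of that flow using off-the-shelf primitives from the cryptography package is shown below; the record layout, key handling and field names are assumptions for illustration, not the actual EDAS design.

        # Authenticate-then-encrypt a branched process-monitoring record (sketch).
        import json, time
        from cryptography.fernet import Fernet
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

        signing_key = Ed25519PrivateKey.generate()   # operator-side signing key
        channel_key = Fernet.generate_key()          # shared confidentiality key

        def branch_record(sensor_bytes, source_id):
            record = json.dumps({
                "source": source_id,                 # data source tag
                "timestamp": time.time(),            # collection time
                "payload": sensor_bytes.hex(),
            }).encode()
            signature = signing_key.sign(record)     # public-key authentication
            return Fernet(channel_key).encrypt(record + b"|" + signature.hex().encode())

        token = branch_record(b"flow=42.7", "analyzer-07")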

  16. Reference Material Properties and Standard Problems to Verify the Fuel Performance Models Ver 1.0

    International Nuclear Information System (INIS)

    Yang, Yong Sik; Kim, Jae Yong; Koo, Yang Hyun

    2010-12-01

    All fuel performance models must be validated by in-pile and out-of-pile tests. However, model validation requires much effort and time to confirm a model's exactness. In many fields, new performance models and codes are confirmed by a code-to-code benchmarking process using simplified standard problems. At present, the development of DUOS, a steady-state fuel performance analysis code for dual-cooled annular fuel, is in progress, and a new FEM module has been developed to analyze fuel performance during transients. In addition, a verification process is planned to examine the correctness of the new models and module by comparison with commercial finite element analysis codes such as ADINA, ABAQUS and ANSYS. This report contains the results of unifying the material properties and establishing standard problems to verify the newly developed models against commercial FEM codes.

  17. Analyzing task-based user study data to determine colormap efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Ashton, Zoe Charon Maria [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wendelberger, Joanne Roth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ticknor, Lawrence O. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Turton, Terece [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Samsel, Francesca [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-07-23

    Domain scientists need colormaps to visualize their data; colormaps are especially useful for identifying areas of interest, for example for identifying eddies or characterizing currents in ocean data. However, the traditional rainbow colormap performs poorly for understanding details because of its small perceptual range. In order to assist domain scientists in recognizing and identifying important details in their data, different colormaps need to be applied that allow higher perceptual definition. Visual artist Francesca Samsel used her understanding of color theory to create new colormaps with improved perception. While domain scientists find the new colormaps useful, we implemented a rigorous and quantitative study to determine whether or not the new colormaps have perceptually more colors. Color count data from one of these studies is analyzed in depth in order to determine whether the new colormaps have more perceivable colors and what affects the number of perceivable colors.
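
    One way to make "perceptual range" concrete (a sketch of the general idea, not the study's protocol) is to accumulate CIELAB colour difference along a colormap and divide by a just-noticeable-difference threshold; the ΔE ≈ 2.3 JND value used below is a common rule of thumb and an assumption here.

        # Estimate the number of just-noticeably-different colors in a colormap.
        import numpy as np
        from matplotlib import cm
        from skimage.color import rgb2lab

        def perceptual_color_count(cmap_name, samples=256, jnd=2.3):
            rgb = cm.get_cmap(cmap_name)(np.linspace(0, 1, samples))[:, :3]
            lab = rgb2lab(rgb.reshape(1, samples, 3))[0]       # CIELAB coordinates
            arc = np.linalg.norm(np.diff(lab, axis=0), axis=1).sum()
            return arc / jnd                                   # arc length in JND units

        for name in ('jet', 'viridis'):
            print(name, round(perceptual_color_count(name)))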

  18. Rigorous Multicomponent Reactive Separations Modelling: Complete Consideration of Reaction-Diffusion Phenomena

    International Nuclear Information System (INIS)

    Ahmadi, A.; Meyer, M.; Rouzineau, D.; Prevost, M.; Alix, P.; Laloue, N.

    2010-01-01

    This paper presents the first step in the development of a rigorous multicomponent reactive separation model. Such a model is essential for furthering the optimization of acid-gas removal plants (CO₂ capture, gas treating, etc.) in terms of size and energy consumption, since chemical solvents are conventionally used. Firstly, two main modelling approaches are presented: the equilibrium-based and the rate-based approaches. Secondly, an extended rate-based model with a rigorous modelling methodology for diffusion-reaction phenomena is proposed. The film theory and the generalized Maxwell-Stefan equations are used in order to characterize multicomponent interactions. The complete chain of chemical reactions is taken into account. The reactions can be kinetically controlled or at chemical equilibrium, and they are considered for both the liquid film and the liquid bulk. Thirdly, the method of numerical resolution is described. Coupling the generalized Maxwell-Stefan equations with chemical equilibrium equations leads to a highly non-linear differential-algebraic equation system of index 3. The set of equations is discretized with finite differences, as its integration by the Gear method is complex. The resulting algebraic system is solved by the Newton-Raphson method. Finally, the present model and the associated methods of numerical resolution are validated for the example of the esterification of methanol. This archetypal non-electrolytic system permits an interesting analysis of the impact of reaction on mass transfer, especially near the phase interface. The numerical resolution of the model by the Newton-Raphson method gives good results in terms of calculation time and convergence. The simulations show that the impact on mass transfer of reactions at chemical equilibrium and that of kinetically controlled reactions with fast kinetics is relatively similar. Moreover, Fick's law is less adapted for multicomponent mixtures where some abnormalities such as counter
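
    For reference, the generalized Maxwell-Stefan equations used to characterize the multicomponent interactions take the standard film-theory form (the usual textbook formulation, not copied from the paper):

        -\frac{x_i}{RT}\,\nabla_{T,p}\,\mu_i \;=\; \sum_{j \neq i} \frac{x_j N_i - x_i N_j}{c_t\,D_{ij}}, \qquad i = 1,\dots,n-1,

    where x_i are mole fractions, μ_i chemical potentials, N_i molar fluxes, c_t the total molar concentration and D_{ij} the Maxwell-Stefan diffusivities; sign conventions vary between texts.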

  19. Memory sparing, fast scattering formalism for rigorous diffraction modeling

    Science.gov (United States)

    Iff, W.; Kämpfe, T.; Jourlin, Y.; Tishchenko, A. V.

    2017-07-01

    The basics and algorithmic steps of a novel scattering formalism suited for memory sparing and fast electromagnetic calculations are presented. The formalism, called ‘S-vector algorithm’ (by analogy with the known scattering-matrix algorithm), allows the calculation of the collective scattering spectra of individual layered micro-structured scattering objects. A rigorous method of linear complexity is applied to model the scattering at individual layers; here the generalized source method (GSM) resorting to Fourier harmonics as basis functions is used as one possible method of linear complexity. The concatenation of the individual scattering events can be achieved sequentially or in parallel, both having pros and cons. The present development will largely concentrate on a consecutive approach based on the multiple reflection series. The latter will be reformulated into an implicit formalism which will be associated with an iterative solver, resulting in improved convergence. The examples will first refer to 1D grating diffraction for the sake of simplicity and intelligibility, with a final 2D application example.
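
    The multiple reflection series referred to here can be written, for two stacked layers with reflection and transmission operators R₁, T₁ (primes denoting illumination from the other side) and R₂, in the standard S-matrix composition form (given for orientation, not quoted from the paper):

        R_{12} = R_1 + T_1'\,R_2\,\bigl(I - R_1' R_2\bigr)^{-1} T_1, \qquad
        \bigl(I - R_1' R_2\bigr)^{-1} = \sum_{k \ge 0} \bigl(R_1' R_2\bigr)^{k},

    and an implicit formalism replaces the explicit series by iteratively solving the linear system with matrix (I - R₁′R₂), which is what can improve convergence when the series converges slowly.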

  20. Rigorous Quantum Field Theory A Festschrift for Jacques Bros

    CERN Document Server

    Monvel, Anne Boutet; Iagolnitzer, Daniel; Moschella, Ugo

    2007-01-01

    Jacques Bros has greatly advanced our present understanding of rigorous quantum field theory through numerous fundamental contributions. This book arose from an international symposium held in honour of Jacques Bros on the occasion of his 70th birthday, at the Department of Theoretical Physics of the CEA in Saclay, France. The impact of the work of Jacques Bros is evident in several articles in this book. Quantum fields are regarded as genuine mathematical objects, whose various properties and relevant physical interpretations must be studied in a well-defined mathematical framework. The key topics in this volume include analytic structures of Quantum Field Theory (QFT), renormalization group methods, gauge QFT, stability properties and extension of the axiomatic framework, QFT on models of curved spacetimes, QFT on noncommutative Minkowski spacetime. Contributors: D. Bahns, M. Bertola, R. Brunetti, D. Buchholz, A. Connes, F. Corbetta, S. Doplicher, M. Dubois-Violette, M. Dütsch, H. Epstein, C.J. Fewster, K....

  1. POSSIBILITIES TO EVALUATE THE QUALITY OF EDUCATION BY VERIFYING THE DISTRIBUTION OF MARKS

    Directory of Open Access Journals (Sweden)

    Alexandru BOROIU

    2015-05-01

    Full Text Available In higher education, it is of great interest to evaluate the education process using numeric indicators obtained from the database of final results achieved by students in an examination session. The following numeric indicators can be used for this purpose: the proportion of students absent from the final evaluation, the proportion of non-promoted students, and the degree of normality of the passing-marks distribution. To this end we built an Excel calculation program that can be applied to each discipline. The inputs are concrete (total students, students present at the final evaluation, absolute frequencies of marks) and the outputs for the three indicators are binary (competent or non-competent), the verdict in the latter case being: "Give explanations. Propose an action plan, with actions, responsible parties and terms." To verify the required degree of normality we elaborated a calculation program based on the Kolmogorov-Smirnov concordance test. In this way the objectivity of the analysis was increased and the opportunity was created to apply corrective measures to improve the education process.
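
    A minimal sketch of the kind of normality check described, in Python rather than Excel (the marks and the 0.05 threshold are illustrative; strictly speaking, using parameters estimated from the sample calls for the Lilliefors variant of the test):

        # Kolmogorov-Smirnov concordance test of passing marks vs. a fitted normal.
        import numpy as np
        from scipy import stats

        marks = np.array([5, 6, 6, 7, 7, 7, 8, 8, 9, 10, 6, 7, 8, 5, 9])  # passing marks
        stat, p_value = stats.kstest(marks, 'norm',
                                     args=(marks.mean(), marks.std(ddof=1)))
        verdict = "competent" if p_value > 0.05 else "non-competent: give explanations"
        print(f"D={stat:.3f}, p={p_value:.3f} -> {verdict}")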

  2. People consider reliability and cost when verifying their autobiographical memories.

    Science.gov (United States)

    Wade, Kimberley A; Nash, Robert A; Garry, Maryanne

    2014-02-01

    Because memories are not always accurate, people rely on a variety of strategies to verify whether the events that they remember really did occur. Several studies have examined which strategies people tend to use, but none to date has asked why people opt for certain strategies over others. Here we examined the extent to which people's beliefs about the reliability and the cost of different strategies would determine their strategy selection. Subjects described a childhood memory and then suggested strategies they might use to verify the accuracy of that memory. Next, they rated the reliability and cost of each strategy, and the likelihood that they might use it. Reliability and cost each predicted strategy selection, but a combination of the two ratings provided even greater predictive value. Cost was significantly more influential than reliability, which suggests that a tendency to seek and to value "cheap" information more than reliable information could underlie many real-world memory errors. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Verifying the integrity of hardcopy document using OCR

    CSIR Research Space (South Africa)

    Mthethwa, Sthembile

    2018-03-01

    Full Text Available Verifying the Integrity... of the document to be defined. Each text in the meta-template is labelled with a unique identifier, which makes the process of validation easier. The meta-template consists of two types of text: normal text and validation text (important text that must...
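
    A hedged sketch of the general idea in Python: OCR the scanned page, pull out the fields the meta-template marks as validation text, and compare their digest with the one stored at issuance. Keyword-based field extraction here is a stand-in for the paper's meta-template, and tesseract must be installed for pytesseract to work.

        # Verify hardcopy integrity by hashing OCR-extracted validation text (sketch).
        import hashlib
        from PIL import Image
        import pytesseract

        def extract_validation_text(image_path, keywords=("Name:", "Amount:")):
            text = pytesseract.image_to_string(Image.open(image_path))
            lines = (ln.strip() for ln in text.splitlines())
            return "|".join(ln for ln in lines if ln.startswith(keywords))

        def verify(image_path, expected_digest):
            digest = hashlib.sha256(
                extract_validation_text(image_path).encode()).hexdigest()
            return digest == expected_digest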

  4. Biochemically verified smoking cessation and vaping beliefs among vape store customers.

    Science.gov (United States)

    Tackett, Alayna P; Lechner, William V; Meier, Ellen; Grant, DeMond M; Driskill, Leslie M; Tahirkheli, Noor N; Wagener, Theodore L

    2015-05-01

    To evaluate biochemically verified smoking status and electronic nicotine delivery systems (ENDS) use behaviors and beliefs among a sample of customers from vapor stores (stores specializing in ENDS). A cross-sectional survey of 215 adult vapor store customers at four retail locations in the Midwestern United States; a subset of participants (n = 181) also completed exhaled carbon monoxide (CO) testing to verify smoking status. Outcomes evaluated included ENDS preferences, harm beliefs, use behaviors, smoking history and current biochemically verified smoking status. Most customers reported starting ENDS as a means of smoking cessation (86%), using newer-generation devices (89%), vaping non-tobacco/non-menthol flavors (72%) and using e-liquid with nicotine strengths of ≤20 mg/ml (72%). There was a high rate of switching (91.4%) to newer-generation ENDS among those who started with a first-generation product. Exhaled CO readings confirmed that 66% of the tested sample had quit smoking. Among those who continued to smoke, mean cigarettes per day decreased from 22.1 to 7.5 (P ...). Among vapor store customers in the United States who use electronic nicotine delivery devices to stop smoking, vaping longer, using newer-generation devices and using non-tobacco and non-menthol flavored e-liquid appear to be associated with higher rates of smoking cessation. © 2015 Society for the Study of Addiction.

  5. Large test rigs verify Clinch River control rod reliability

    International Nuclear Information System (INIS)

    Michael, H.D.; Smith, G.G.

    1983-01-01

    The purpose of the Clinch River control test programme was to use multiple full-scale prototypic control rod systems for verifying the system's ability to perform reliably during simulated reactor power control and emergency shutdown operations. Two major facilities, the Shutdown Control Rod and Maintenance (Scram) facility and the Dynamic and Seismic Test (Dast) facility, were constructed. The test programme of each facility is described. (UK)

  6. A new method for deriving rigorous results on ππ scattering

    International Nuclear Information System (INIS)

    Caprini, I.; Dita, P.

    1979-06-01

    We develop a new approach to the problem of constraining the ππ scattering amplitudes by means of the axiomatically proved properties of unitarity, analyticity and crossing symmetry. The method is based on the solution of an extremal problem on a convex set of analytic functions and provides a global description of the domain of values taken by any finite number of partial waves at an arbitrary set of unphysical energies, compatible with unitarity, the bounds at complex energies derived from generalized dispersion relations and the crossing integral relations. From this domain we obtain new absolute bounds for the amplitudes as well as rigorous correlations between the values of various partial waves. (author)

  7. Analyzing negative ties in social networks

    Directory of Open Access Journals (Sweden)

    Mankirat Kaur

    2016-03-01

    Full Text Available Online social networks are a source of sharing information and maintaining personal contacts with other people through social interactions, thus forming virtual communities online. Social networks are crowded with positive and negative relations. Positive relations are formed by support, endorsement and friendship and thus create a network of well-connected users, whereas negative relations are a result of opposition, distrust and avoidance, creating disconnected networks. Due to the increase in illegal activities such as masquerading, conspiring and creating fake profiles on online social networks, exploring and analyzing these negative activities has become the need of the hour. Negative ties are usually treated in the same way as positive ties in many theories, such as balance theory and blockmodeling analysis, but the standard concepts of social network analysis do not yield the same results for each type of tie. This paper presents a survey on analyzing negative ties in social networks through various types of network analysis techniques that are used for examining ties, such as status, centrality and power measures. Due to the difference in the characteristics of flow in positive- and negative-tie networks, some of these measures are not applicable to negative ties. This paper also discusses new methods that have been developed specifically for analyzing negative ties, such as negative degree and the h∗ measure, along with measures based on a mixture of positive and negative ties. The different types of social network analysis approaches have been reviewed and compared to determine the best approach that can appropriately identify the negative ties in online networks. It has been found that only a few measures, such as degree and PN centrality, are applicable for identifying outsiders in a network. For applicability in online networks, the performance of the PN measure needs to be verified, and further, new measures should be developed based upon the negative clique concept.
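
    To make the PN centrality mentioned above concrete, here is a minimal sketch following the commonly cited formulation PN = (I - A/(2n-2))^-1 * 1 with A = P - 2N, where P and N are the positive- and negative-tie adjacency matrices; the toy network is invented for illustration.

        import numpy as np

        def pn_centrality(P, N):
            """PN centrality for a signed network: solve (I - A/(2n-2)) x = 1
            with A = P - 2N (negative ties weighted twice, with opposite sign)."""
            n = P.shape[0]
            A = P - 2 * N
            return np.linalg.solve(np.eye(n) - A / (2 * n - 2), np.ones(n))

        # Toy 4-node network: node 3 receives mostly negative ties (an "outsider").
        P = np.array([[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 0], [0, 0, 0, 0]], float)
        N = np.array([[0, 0, 0, 1], [0, 0, 0, 1], [0, 0, 0, 1], [1, 1, 1, 0]], float)
        print(pn_centrality(P, N))   # node 3 gets the lowest score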

  8. Optical vector network analyzer with improved accuracy based on polarization modulation and polarization pulling.

    Science.gov (United States)

    Li, Wei; Liu, Jian Guo; Zhu, Ning Hua

    2015-04-15

    We report a novel optical vector network analyzer (OVNA) with improved accuracy based on polarization modulation and stimulated Brillouin scattering (SBS) assisted polarization pulling. The beating between adjacent higher-order optical sidebands which are generated because of the nonlinearity of an electro-optic modulator (EOM) introduces considerable error to the OVNA. In our scheme, the measurement error is significantly reduced by removing the even-order optical sidebands using polarization discrimination. The proposed approach is theoretically analyzed and experimentally verified. The experimental results show that the accuracy of the OVNA is greatly improved compared to a conventional OVNA.

  9. Descriptional complexity of non-unary self-verifying symmetric difference automata

    CSIR Research Space (South Africa)

    Marais, Laurette

    2017-09-01

    Full Text Available Previously, self-verifying symmetric difference automata were defined and a tight bound of 2^n-1-1 was shown for state complexity in the unary case. We now consider the non-unary case and show that, for every n at least 2, there is a regular...

  10. Parametric spectro-temporal analyzer (PASTA) for real-time optical spectrum observation

    Science.gov (United States)

    Zhang, Chi; Xu, Jianbing; Chui, P. C.; Wong, Kenneth K. Y.

    2013-06-01

    Real-time optical spectrum analysis is an essential tool in observing ultrafast phenomena, such as the dynamic monitoring of spectrum evolution. However, conventional methods such as optical spectrum analyzers disperse the spectrum in space and allocate it in time sequence by mechanical rotation of a grating, so they are incapable of operating at high speed. A more recent method all-optically stretches the spectrum in the time domain, but it is limited by the allowable input conditions. In view of these constraints, here we present a real-time spectrum analyzer called the parametric spectro-temporal analyzer (PASTA), which is based on the time-lens focusing mechanism. It achieves a frame rate as high as 100 MHz and accommodates various input conditions. As a proof of concept, and for the first time, we verify its applications in observing the dynamic spectrum of a Fourier domain mode-locked laser and the spectrum evolution of a laser cavity during its stabilizing process.
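
    An idealized numerical sketch of the time-lens focusing mechanism behind PASTA follows: a quadratic temporal phase (the "lens") followed by matched dispersion maps the input spectrum onto the output pulse shape. Parameters are arbitrary, in normalized units, and FFT boundary effects are ignored; this is not the paper's parametric implementation.

        import numpy as np

        n, dt = 4096, 0.01                        # grid size and time step (a.u.)
        t = (np.arange(n) - n / 2) * dt
        w = 2 * np.pi * np.fft.fftfreq(n, dt)     # angular-frequency grid

        # Input field with two spectral lines to resolve.
        E = np.exp(-t**2 / 0.5) * (np.cos(10 * t) + np.cos(14 * t))

        D = 0.5                                   # focal group-delay dispersion (a.u.)
        E_lens = E * np.exp(-1j * t**2 / (2 * D))                        # time lens
        E_out = np.fft.ifft(np.fft.fft(E_lens) * np.exp(-1j * D * w**2 / 2))

        # |E_out(t)|^2 approximates the input spectrum at omega = t / D:
        # peaks appear near t = +/-5 and +/-7 (i.e., omega = +/-10, +/-14).
        mapped = np.abs(E_out)**2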

  11. Analyzing power Ay(θ) of n-3He elastic scattering between 1.60 and 5.54 MeV.

    Science.gov (United States)

    Esterline, J; Tornow, W; Deltuva, A; Fonseca, A C

    2013-04-12

    Comprehensive and high-accuracy n-3He elastic scattering analyzing power Ay(θ) angular distributions were obtained at five incident neutron energies between 1.60 and 5.54 MeV. The data are compared to rigorous four-nucleon calculations using high-precision nucleon-nucleon potential models; three-nucleon force effects are found to be very small. The agreement between data and calculations is fair at the lower energies and becomes less satisfactory with increasing neutron energy. Comparison to p-3He scattering over the same energy range exhibits unexpectedly large isospin effects.

  12. Detecting Android Malwares with High-Efficient Hybrid Analyzing Methods

    Directory of Open Access Journals (Sweden)

    Yu Liu

    2018-01-01

    Full Text Available In order to tackle the security issues caused by malwares of Android OS, we propose a highly efficient hybrid detection scheme for Android malwares. Our scheme employs different analyzing methods (static and dynamic) to construct a flexible detection pipeline. In this paper, we propose detection techniques such as the Com+ feature, built on traditional Permission and API call features, to improve the performance of static detection. The collapsing issue of traditional function-call-graph-based malware detection is also avoided, as we adopt feature selection and clustering to unify function call graph features of various dimensions into the same dimension. In order to verify the performance of our scheme, we built an open-access malware dataset for our experiments. The experimental results show that the suggested scheme achieves high malware-detecting accuracy, and the scheme could be used to establish Android malware-detecting cloud services, which can automatically adopt high-efficiency analyzing methods according to the properties of the Android applications.
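
    As a hedged illustration of the dimension-unification idea described above (not the paper's exact pipeline), the sketch below maps variable-size call-graph node features to fixed-length histograms with a k-means codebook and feeds them to a standard classifier; all data are synthetic.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)

        def graph_to_histogram(node_features, kmeans):
            """Map a (num_nodes, d) feature matrix to a fixed-size histogram."""
            labels = kmeans.predict(node_features)
            return np.bincount(labels, minlength=kmeans.n_clusters)

        # Toy data: each "app" has a call graph with a varying number of nodes.
        apps = [rng.normal(size=(rng.integers(5, 50), 8)) for _ in range(40)]
        labels = rng.integers(0, 2, size=40)          # 0 = benign, 1 = malware

        kmeans = KMeans(n_clusters=16, n_init=10).fit(np.vstack(apps))
        X = np.array([graph_to_histogram(a, kmeans) for a in apps])

        clf = RandomForestClassifier().fit(X, labels)  # static-feature classifier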

  13. Rigorous Photogrammetric Processing of CHANG'E-1 and CHANG'E-2 Stereo Imagery for Lunar Topographic Mapping

    Science.gov (United States)

    Di, K.; Liu, Y.; Liu, B.; Peng, M.

    2012-07-01

    Chang'E-1 (CE-1) and Chang'E-2 (CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of landing and surface operation of the Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of the CE-1 and CE-2 CCD cameras based on the push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D coordinate of a ground point in the lunar body-fixed (LBF) coordinate system can be calculated by space intersection from the image coordinates of conjugate points in stereo images, and the image coordinates can be calculated from 3D coordinates by back-projection. Due to uncertainties of the orbit and the camera, the back-projected image points are different from the measured points. In order to reduce these inconsistencies and improve precision, we proposed two methods to refine the rigorous sensor model: 1) refining EOPs by correcting the attitude angle bias, 2) refining the interior orientation model by calibration of the relative position of the two linear CCD arrays. Experimental results show that the mean back-projection residuals of CE-1 images are reduced to better than 1/100 pixel by method 1 and the mean back-projection residuals of CE-2 images are reduced from over 20 pixels to 0.02 pixel by method 2. Consequently, high precision DEM (Digital Elevation Model) and DOM (Digital Ortho Map) are automatically generated.
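
    As a minimal illustration of the space-intersection step, the sketch below triangulates a ground point from two rays (camera centers plus look directions); the geometry is invented, and a real photogrammetric solution would use the full push-broom sensor model in the LBF frame.

        import numpy as np

        def space_intersection(p1, d1, p2, d2):
            """Least-squares intersection of two image rays p_i + s * d_i."""
            A = np.column_stack((d1, -d2))
            s, t = np.linalg.lstsq(A, p2 - p1, rcond=None)[0]
            # Midpoint of the closest approach of the two (generally skew) rays.
            return 0.5 * ((p1 + s * d1) + (p2 + t * d2))

        # Two hypothetical camera positions and unit look directions.
        p1, d1 = np.array([0.0, 0.0, 100.0]), np.array([0.0, 0.0, -1.0])
        p2, d2 = np.array([30.0, 0.0, 100.0]), np.array([-0.287, 0.0, -0.958])
        print(space_intersection(p1, d1, p2, d2))   # ground point near the origin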

  14. Verifying a nuclear weapon's response to radiation environments

    Energy Technology Data Exchange (ETDEWEB)

    Dean, F.F.; Barrett, W.H.

    1998-05-01

    The process described in the paper is being applied as part of the design verification of a replacement component designed for a nuclear weapon currently in the active stockpile. This process is an adaptation of the process successfully used in nuclear weapon development programs. The verification process concentrates on evaluating system response to radiation environments, verifying system performance during and after exposure to radiation environments, and assessing system survivability.

  15. Verifying the gravitational shift due to the earth's rotation

    International Nuclear Information System (INIS)

    Briatore, L.; Leschiutta, S.

    1976-01-01

    Data on various independent time scales kept in different laboratories are elaborated in order to verify the gravitational shift due to the earth's rotation. It is shown that the state of the art in the measurement of time now makes it possible to measure Δt/t ≈ 10^-13. Moreover, experimental evidence of the relativistic effects of the earth's rotation is shown

  16. Rigorous force field optimization principles based on statistical distance minimization

    Energy Technology Data Exchange (ETDEWEB)

    Vlcek, Lukas, E-mail: vlcekl1@ornl.gov [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States); Joint Institute for Computational Sciences, University of Tennessee, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6173 (United States); Chialvo, Ariel A. [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States)

    2015-10-14

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. We exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
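
    The following toy sketch conveys the optimization principle (minimize a statistical distance between model and target distributions); it is not the paper's force-field machinery. The statistical distance is taken as the arccosine of the Bhattacharyya coefficient, and a single parameter of a toy Boltzmann model is fitted.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def statistical_distance(p, q):
            """Arccos of the Bhattacharyya coefficient between two discrete
            probability distributions."""
            return np.arccos(np.clip(np.sum(np.sqrt(p * q)), 0.0, 1.0))

        states = np.arange(10)
        def boltzmann(beta_eps):            # toy model: p_i proportional to exp(-x*i)
            w = np.exp(-beta_eps * states)
            return w / w.sum()

        target = boltzmann(0.7)             # stand-in for "experimental" data
        res = minimize_scalar(lambda x: statistical_distance(boltzmann(x), target),
                              bounds=(0.01, 5.0), method="bounded")
        print(res.x)                        # recovers ~0.7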

  17. Improving students’ mathematical critical thinking through rigorous teaching and learning model with informal argument

    Science.gov (United States)

    Hamid, H.

    2018-01-01

    The purpose of this study is to analyze the improvement of students' mathematical critical thinking (CT) ability in a Real Analysis course taught using the Rigorous Teaching and Learning (RTL) model with informal argument. In addition, this research also attempted to understand students' CT in relation to their initial mathematical ability (IMA). This study was conducted at a private university in academic year 2015/2016. The study employed the quasi-experimental method with a pretest-posttest control group design. The participants of the study were 83 students, of whom 43 were in the experimental group and 40 in the control group. The findings show that students in the experimental group outperformed students in the control group on mathematical CT ability across IMA levels (high, medium, low) in learning Real Analysis. In addition, for medium IMA, the improvement in mathematical CT ability of students exposed to the RTL model with informal argument was greater than that of students exposed to conventional instruction (CI). There was also no interaction effect between the learning model (RTL or CI) and IMA level (high, medium, low) on the improvement of mathematical CT ability. Finally, at each IMA level (high, medium, low) there was a significantly greater improvement in all indicators of mathematical CT ability for students exposed to the RTL model with informal argument than for students exposed to CI.

  18. Rigorous RG Algorithms and Area Laws for Low Energy Eigenstates in 1D

    Science.gov (United States)

    Arad, Itai; Landau, Zeph; Vazirani, Umesh; Vidick, Thomas

    2017-11-01

    One of the central challenges in the study of quantum many-body systems is the complexity of simulating them on a classical computer. A recent advance (Landau et al. in Nat Phys, 2015) gave a polynomial time algorithm to compute a succinct classical description for unique ground states of gapped 1D quantum systems. Despite this progress many questions remained unsolved, including whether there exist efficient algorithms when the ground space is degenerate (and of polynomial dimension in the system size), or for the polynomially many lowest energy states, or even whether such states admit succinct classical descriptions or area laws. In this paper we give a new algorithm, based on a rigorously justified RG type transformation, for finding low energy states for 1D Hamiltonians acting on a chain of n particles. In the process we resolve some of the aforementioned open questions, including giving a polynomial time algorithm for poly(n) degenerate ground spaces and an n^{O(log n)} algorithm for the poly(n) lowest energy states (under a mild density condition). For these classes of systems the existence of a succinct classical description and area laws were not rigorously proved before this work. The algorithms are natural and efficient, and for the case of finding unique ground states for frustration-free Hamiltonians the running time is Õ(n·M(n)), where M(n) is the time required to multiply two n × n matrices.

  19. An approach for verifying biogenic greenhouse gas emissions inventories with atmospheric CO2 concentration data

    Science.gov (United States)

    Stephen M Ogle; Kenneth Davis; Thomas Lauvaux; Andrew Schuh; Dan Cooley; Tristram O West; Linda S Heath; Natasha L Miles; Scott Richardson; F Jay Breidt; James E Smith; Jessica L McCarty; Kevin R Gurney; Pieter Tans; A Scott. Denning

    2015-01-01

    Verifying national greenhouse gas (GHG) emissions inventories is a critical step to ensure that reported emissions data to the United Nations Framework Convention on Climate Change (UNFCCC) are accurate and representative of a country's contribution to GHG concentrations in the atmosphere. Furthermore, verifying biogenic fluxes provides a check on estimated...

  20. The Guided System Development Framework: Modeling and Verifying Communication Systems

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.; Nielson, Flemming

    2014-01-01

    the verified specification. The refinement process thus carries security properties from the model to the implementation. Our approach also supports verification of systems previously developed and deployed. Internally, the reasoning in our framework is based on the Beliefs and Knowledge tool, a verification tool based on belief logics and explicit attacker knowledge.

  1. The Researchers' View of Scientific Rigor-Survey on the Conduct and Reporting of In Vivo Research.

    Science.gov (United States)

    Reichlin, Thomas S; Vogt, Lucile; Würbel, Hanno

    2016-01-01

    Reproducibility in animal research is alarmingly low, and a lack of scientific rigor has been proposed as a major cause. Systematic reviews found low reporting rates of measures against risks of bias (e.g., randomization, blinding), and a correlation between low reporting rates and overstated treatment effects. Reporting rates of measures against bias are thus used as a proxy measure for scientific rigor, and reporting guidelines (e.g., ARRIVE) have become a major weapon in the fight against risks of bias in animal research. Surprisingly, animal scientists have never been asked about their use of measures against risks of bias and how they report these in publications. Whether poor reporting reflects poor use of such measures, and whether reporting guidelines may effectively reduce risks of bias has therefore remained elusive. To address these questions, we asked in vivo researchers about their use and reporting of measures against risks of bias and examined how self-reports relate to reporting rates obtained through systematic reviews. An online survey was sent out to all registered in vivo researchers in Switzerland (N = 1891) and was complemented by personal interviews with five representative in vivo researchers to facilitate interpretation of the survey results. Return rate was 28% (N = 530), of which 302 participants (16%) returned fully completed questionnaires that were used for further analysis. According to the researchers' self-report, they use measures against risks of bias to a much greater extent than suggested by reporting rates obtained through systematic reviews. However, the researchers' self-reports are likely biased to some extent. Thus, although they claimed to be reporting measures against risks of bias at much lower rates than they claimed to be using these measures, the self-reported reporting rates were considerably higher than reporting rates found by systematic reviews. Furthermore, participants performed rather poorly when asked to

  2. Smoothing of Transport Plans with Fixed Marginals and Rigorous Semiclassical Limit of the Hohenberg-Kohn Functional

    Science.gov (United States)

    Cotar, Codina; Friesecke, Gero; Klüppelberg, Claudia

    2018-06-01

    We prove rigorously that the exact N-electron Hohenberg-Kohn density functional converges in the strongly interacting limit to the strictly correlated electrons (SCE) functional, and that the absolute value squared of the associated constrained search wavefunction tends weakly in the sense of probability measures to a minimizer of the multi-marginal optimal transport problem with Coulomb cost associated to the SCE functional. This extends our previous work for N = 2 (Cotar et al. in Commun Pure Appl Math 66:548-599, 2013). The correct limit problem has been derived in the physics literature by Seidl (Phys Rev A 60:4387-4395, 1999) and Seidl, Gori-Giorgi and Savin (Phys Rev A 75:042511 1-12, 2007); in these papers the lack of a rigorous proof was pointed out. We also give a mathematical counterexample to this type of result, by replacing the constraint of given one-body density (an infinite-dimensional quadratic expression in the wavefunction) by an infinite-dimensional quadratic expression in the wavefunction and its gradient. Connections with the Lavrentiev phenomenon in the calculus of variations are indicated.

  3. Muscle pH, rigor mortis and blood variables in Atlantic salmon transported in two types of well-boat.

    Science.gov (United States)

    Gatica, M C; Monti, G E; Knowles, T G; Gallo, C B

    2010-01-09

    Two systems for transporting live salmon (Salmo salar) were compared in terms of their effects on blood variables, muscle pH and rigor index: an 'open system' well-boat with recirculated sea water at 13.5 degrees C and a stocking density of 107 kg/m3 during an eight-hour journey, and a 'closed system' well-boat with water chilled from 16.7 to 2.1 degrees C and a stocking density of 243.7 kg/m3 during a seven-hour journey. Groups of 10 fish were sampled at each of four stages: in cages at the farm, in the well-boat after loading, in the well-boat after the journey and before unloading, and in the processing plant after they were pumped from the resting cages. At each sampling, the fish were stunned and bled by gill cutting. Blood samples were taken to measure lactate, osmolality, chloride, sodium, cortisol and glucose, and their muscle pH and rigor index were measured at death and three hours later. In the open system well-boat, the initial muscle pH of the fish decreased at each successive stage, and at the final stage they had a significantly lower initial muscle pH and more rapid onset of rigor than the fish transported on the closed system well-boat. At the final stage all the blood variables except glucose were significantly affected in the fish transported on both types of well-boat.

  4. A two-dimensional deformable phantom for quantitatively verifying deformation algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Kirby, Neil; Chuang, Cynthia; Pouliot, Jean [Department of Radiation Oncology, University of California San Francisco, San Francisco, California 94143-1708 (United States)

    2011-08-15

    Purpose: The incorporation of deformable image registration into the treatment planning process is rapidly advancing. For this reason, the methods used to verify the underlying deformation algorithms must evolve equally fast. This manuscript proposes a two-dimensional deformable phantom, which can objectively verify the accuracy of deformation algorithms, as the next step for improving these techniques. Methods: The phantom represents a single plane of the anatomy for a head and neck patient. Inflation of a balloon catheter inside the phantom simulates tumor growth. CT and camera images of the phantom are acquired before and after its deformation. Nonradiopaque markers reside on the surface of the deformable anatomy and are visible through an acrylic plate, which enables an optical camera to measure their positions; thus, establishing the ground-truth deformation. This measured deformation is directly compared to the predictions of deformation algorithms, using several similarity metrics. The ratio of the number of points with more than a 3 mm deformation error over the number that are deformed by more than 3 mm is used for an error metric to evaluate algorithm accuracy. Results: An optical method of characterizing deformation has been successfully demonstrated. For the tests of this method, the balloon catheter deforms 32 out of the 54 surface markers by more than 3 mm. Different deformation errors result from the different similarity metrics. The most accurate deformation predictions had an error of 75%. Conclusions: The results presented here demonstrate the utility of the phantom for objectively verifying deformation algorithms and determining which is the most accurate. They also indicate that the phantom would benefit from more electron density heterogeneity. The reduction of the deformable anatomy to a two-dimensional system allows for the use of nonradiopaque markers, which do not influence deformation algorithms. This is the fundamental advantage of this

  5. The verification, refinement and application of lexicographic rulers ...

    African Journals Online (AJOL)

    Lexicographic rulers for Afrikaans and the African languages are a decade old and are widely used in the compilation of dictionaries. Until now, the compilers have not considered it necessary to verify or refine these rulers. Criticism has, however, been expressed of the compilation of the Afrikaans ruler, and it is ...

  6. Rigorous Numerics for ill-posed PDEs: Periodic Orbits in the Boussinesq Equation

    Science.gov (United States)

    Castelli, Roberto; Gameiro, Marcio; Lessard, Jean-Philippe

    2018-04-01

    In this paper, we develop computer-assisted techniques for the analysis of periodic orbits of ill-posed partial differential equations. As a case study, our proposed method is applied to the Boussinesq equation, which has been investigated extensively because of its role in the theory of shallow water waves. The idea is to use the symmetry of the solutions and a Newton-Kantorovich type argument (the radii polynomial approach) to obtain rigorous proofs of existence of the periodic orbits in a weighted ℓ1 Banach space of space-time Fourier coefficients with exponential decay. We present several computer-assisted proofs of the existence of periodic orbits at different parameter values.
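
    To give a flavor of the radii polynomial approach mentioned above, here is a scalar caricature; the paper works in a weighted ℓ1 space of Fourier coefficients and controls rounding with interval arithmetic, while plain floats are used here purely for illustration.

        # Scalar caricature of a radii-polynomial (Newton-Kantorovich) check:
        # prove f(x) = x^2 - 2 has a unique zero within r of xbar by verifying
        # p(r) = Z(r) * r + Y - r < 0 for some r > 0.
        xbar = 1.4142                      # approximate (numerical) solution
        A = 1.0 / (2.0 * xbar)             # approximate inverse of Df(xbar)
        Y = abs(A * (xbar**2 - 2.0))       # defect bound |A f(xbar)|
        # For |x - xbar| <= r:  |1 - A Df(x)| = |1 - x / xbar| <= r / xbar =: Z(r)
        r = 1e-3
        assert (r / xbar) * r + Y - r < 0  # contraction => unique zero in the ball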

  7. The rigorous stochastic matrix multiplication scheme for the calculations of reduced equilibrium density matrices of open multilevel quantum systems

    International Nuclear Information System (INIS)

    Chen, Xin

    2014-01-01

    Understanding the roles of the temporal and spatial structures of quantum functional noise in open multilevel quantum molecular systems attracts a lot of theoretical interest. I want to establish a rigorous and general framework for functional quantum noises from the constructive and computational perspectives, i.e., how to generate the random trajectories to reproduce the kernel and path ordering of the influence functional with effective Monte Carlo methods for arbitrary spectral densities. This construction approach aims to unify the existing stochastic models to rigorously describe the temporal and spatial structure of Gaussian quantum noises. In this paper, I review the Euclidean imaginary time influence functional and propose the stochastic matrix multiplication scheme to calculate reduced equilibrium density matrices (REDM). In addition, I review and discuss the Feynman-Vernon influence functional according to the Gaussian quadratic integral, particularly its imaginary part, which is critical to the rigorous description of the quantum detailed balance. As a result, I establish the conditions under which the influence functional can be interpreted as the average of an exponential functional operator over real-valued Gaussian processes for open multilevel quantum systems. I also show the difference between local and nonlocal phonons within this framework. With the stochastic matrix multiplication scheme, I compare the normalized REDM with the Boltzmann equilibrium distribution for open multilevel quantum systems

  8. Verifying real-time systems against scenario-based requirements

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Li, Shuhao; Nielsen, Brian

    2009-01-01

    We propose an approach to automatic verification of real-time systems against scenario-based requirements. A real-time system is modeled as a network of Timed Automata (TA), and a scenario-based requirement is specified as a Live Sequence Chart (LSC). We define a trace-based semantics for a kernel...... subset of the LSC language. By equivalently translating an LSC chart into an observer TA and then non-intrusively composing this observer with the original system model, the problem of verifying a real-time system against a scenario-based requirement reduces to a classical real-time model checking...

  9. From Operating-System Correctness to Pervasively Verified Applications

    Science.gov (United States)

    Daum, Matthias; Schirmer, Norbert W.; Schmidt, Mareike

    Though program verification is known and has been used for decades, the verification of a complete computer system still remains a grand challenge. Part of this challenge is the interaction of application programs with the operating system, which is usually entrusted with retrieving input data from and transferring output data to peripheral devices. In this scenario, the correct operation of the applications inherently relies on operating-system correctness. Based on the formal correctness of our real-time operating system Olos, this paper describes an approach to pervasively verify applications running on top of the operating system.

  11. Direct integration of the S-matrix applied to rigorous diffraction

    International Nuclear Information System (INIS)

    Iff, W; Lindlein, N; Tishchenko, A V

    2014-01-01

    A novel Fourier method for rigorous diffraction computation at periodic structures is presented. The procedure is based on a differential equation for the S-matrix, which allows direct integration of the S-matrix blocks. This results in a new method in Fourier space, which can be considered as a numerically stable and well-parallelizable alternative to the conventional differential method based on T-matrix integration and subsequent conversions from the T-matrices to S-matrix blocks. Integration of the novel differential equation in an implicit manner is expounded. The applicability of the new method is shown on the basis of 1D periodic structures. It is clear, however, that the new technique can also be applied to arbitrary 2D periodic or periodized structures. The complexity of the new method is O(N^3), similar to the conventional differential method, with N being the number of diffraction orders. (fast track communication)

  12. Building and Verifying a Predictive Model of Interruption Resumption

    Science.gov (United States)

    2012-03-01

    Invited paper. Help from a robot allows a human storyteller to continue after an interruption. A listener who expects the gardener to remember those plants (and whether they need to be removed) will not commit resources to remember that information. When such help was available (e.g., via a camera), the storyteller needed help much less often. This result suggests that when there is no one to help them remember the last thing they said...

  13. Rigorous derivation of porous-media phase-field equations

    Science.gov (United States)

    Schmuck, Markus; Kalliadasis, Serafim

    2017-11-01

    The evolution of interfaces in Complex heterogeneous Multiphase Systems (CheMSs) plays a fundamental role in a wide range of scientific fields such as thermodynamic modelling of phase transitions and materials science, or as a computational tool for interfacial flow studies or material design. Here, we focus on phase-field equations in CheMSs such as porous media. To the best of our knowledge, we present the first rigorous derivation of error estimates for fourth order, upscaled, and nonlinear evolution equations. For CheMSs with heterogeneity ε, we obtain the convergence rate ε^{1/4}, which governs the error between the solution of the new upscaled formulation and the solution of the microscopic phase-field problem. This error behaviour has recently been validated computationally. Due to the wide range of application of phase-field equations, we expect this upscaled formulation to allow for new modelling, analytic, and computational perspectives for interfacial transport and phase transformations in CheMSs. This work was supported by EPSRC, UK, through Grant Nos. EP/H034587/1, EP/L027186/1, EP/L025159/1, EP/L020564/1, EP/K008595/1, and EP/P011713/1 and from ERC via Advanced Grant No. 247031.

  14. Technical assessment of air quality measuring analyzers; Evaluation technique des analyseurs de mesure de la qualite de l'air

    Energy Technology Data Exchange (ETDEWEB)

    Tatry, V. [Laboratoire de mesures a l'air ambiant, Dept. Mesures et Analyses, INERIS (France)

    1996-12-31

    Air quality measuring analyzers are assessed in order to verify their measuring performance and to examine their aptitude for field measurements. For such assessments, the INERIS institute (France) has at its disposal three climatic enclosures, gas mixture emission systems and data acquisition systems. The assessment methodology is presented together with the various possible results: response time, linearity and limit determination, calibration studies, thresholds, drifts, hysteresis, physical detrimental effects, etc. Applications such as analyzers for one or more pollutants in ambient air and at the emission source (portable multi-gas analyzers) are presented, together with their results

  15. 40 CFR 8.9 - Measures to assess and verify environmental impacts.

    Science.gov (United States)

    2010-07-01

    ... environmental impacts. 8.9 Section 8.9 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL ENVIRONMENTAL IMPACT ASSESSMENT OF NONGOVERNMENTAL ACTIVITIES IN ANTARCTICA § 8.9 Measures to assess and verify environmental impacts. (a) The operator shall conduct appropriate monitoring of key environmental indicators as...

  16. Eddy-Current Testing of Welded Stainless Steel Storage Containers to Verify Integrity and Identity

    International Nuclear Information System (INIS)

    Tolk, Keith M.; Stoker, Gerald C.

    1999-01-01

    An eddy-current scanning system is being developed to allow the International Atomic Energy Agency (IAEA) to verify the integrity of nuclear material storage containers. Such a system is necessary to detect attempts to remove material from the containers in facilities where continuous surveillance of the containers is not practical. Initial tests have shown that the eddy-current system is also capable of verifying the identity of each container using the electromagnetic signature of its welds. The DOE-3013 containers proposed for use in some US facilities are made of an austenitic stainless steel alloy, which is nonmagnetic in its normal condition. When the material is cold worked by forming or by local stresses experienced in welding, it loses its austenitic grain structure and its magnetic permeability increases. This change in magnetic permeability can be measured using an eddy-current probe specifically designed for this purpose. Initial tests have shown that variations of magnetic permeability and material conductivity in and around welds can be detected, and form a pattern unique to the container. The changes in conductivity that are present around a mechanically inserted plug can also be detected. Further development of the system is currently underway to adapt the system to verifying the integrity and identity of sealable, tamper-indicating enclosures designed to prevent unauthorized access to measurement equipment used to verify international agreements

  17. An experiment designed to verify the general theory of relativity

    International Nuclear Information System (INIS)

    Surdin, Maurice

    1960-01-01

    The paper presents the project for an experiment which uses the effect of gravitation on Maser-type clocks placed on the ground at two different heights, and which is designed to verify the general theory of relativity. Reprint of a paper published in Comptes rendus des seances de l'Academie des Sciences, t. 250, p. 299-301, sitting of 11 January 1960 [fr]

  18. Elements of a system for verifying a Comprehensive Test Ban

    International Nuclear Information System (INIS)

    Hannon, W.J.

    1987-01-01

    The paper discusses the goals of a monitoring system for a CTB, its functions, the challenges to verification, discrimination techniques, and some recent developments. It is concluded that technical, military and political efforts are required to establish and verify test ban treaties which will contribute to stability in the long term. It currently appears there will be a significant number of unidentified events

  19. Rigorous upper bounds for transport due to passive advection by inhomogeneous turbulence

    International Nuclear Information System (INIS)

    Krommes, J.A.; Smith, R.A.

    1987-05-01

    A variational procedure, due originally to Howard and explored by Busse and others for self-consistent turbulence problems, is employed to determine rigorous upper bounds for the advection of a passive scalar through an inhomogeneous turbulent slab with arbitrary generalized Reynolds number R and Kubo number K. In the basic version of the method, the steady-state energy balance is used as a constraint; the resulting bound, though rigorous, is independent of K. A pedagogical reference model (one dimension, K = ∞) is described in detail; the bound compares favorably with the exact solution. The direct-interaction approximation is also worked out for this model; it is somewhat more accurate than the bound, but requires considerably more labor to solve. For the basic bound, a general formalism is presented for several dimensions, finite correlation length, and reasonably general boundary conditions. Part of the general method, in which a Green's function technique is employed, applies to self-consistent as well as to passive problems, and thereby generalizes previous results in the fluid literature. The formalism is extended for the first time to include time-dependent constraints, and a bound is deduced which explicitly depends on K and has the correct physical scalings in all regimes of R and K. Two applications from the theory of turbulent plasmas are described: flux in velocity space, and test particle transport in stochastic magnetic fields. For the velocity space problem the simplest bound reproduces Dupree's original scaling for the strong turbulence diffusion coefficient. For the case of stochastic magnetic fields, the scaling of the bounds is described for the magnetic diffusion coefficient as well as for the particle diffusion coefficient in the so-called collisionless, fluid, and double-streaming regimes

  20. Standards and Methodological Rigor in Pulmonary Arterial Hypertension Preclinical and Translational Research.

    Science.gov (United States)

    Provencher, Steeve; Archer, Stephen L; Ramirez, F Daniel; Hibbert, Benjamin; Paulin, Roxane; Boucherat, Olivier; Lacasse, Yves; Bonnet, Sébastien

    2018-03-30

    Despite advances in our understanding of the pathophysiology and the management of pulmonary arterial hypertension (PAH), significant therapeutic gaps remain for this devastating disease. Yet, few innovative therapies beyond the traditional pathways of endothelial dysfunction have reached clinical trial phases in PAH. Although there are inherent limitations of the currently available models of PAH, the leaky pipeline of innovative therapies relates, in part, to flawed preclinical research methodology, including lack of rigor in trial design, incomplete invasive hemodynamic assessment, and lack of careful translational studies that replicate randomized controlled trials in humans with attention to adverse effects and benefits. Rigorous methodology should include the use of prespecified eligibility criteria, sample sizes that permit valid statistical analysis, randomization, blinded assessment of standardized outcomes, and transparent reporting of results. Better design and implementation of preclinical studies can minimize inherent flaws in the models of PAH, reduce the risk of bias, and enhance external validity and our ability to distinguish truly promising therapies from many false-positive or overstated leads. Ideally, preclinical studies should use advanced imaging, study several preclinical pulmonary hypertension models, or correlate rodent and human findings and consider the fate of the right ventricle, which is the major determinant of prognosis in human PAH. Although these principles are widely endorsed, empirical evidence suggests that such rigor is often lacking in pulmonary hypertension preclinical research. The present article discusses the pitfalls in the design of preclinical pulmonary hypertension trials and discusses opportunities to create preclinical trials with improved predictive value in guiding early-phase drug development in patients with PAH, which will need support not only from researchers, peer reviewers, and editors but also from

  1. Music-therapy analyzed through conceptual mapping

    Science.gov (United States)

    Martinez, Rodolfo; de la Fuente, Rebeca

    2002-11-01

    Conceptual maps have been employed lately as a learning tool, as a modern study technique, and as a new way to understand intelligence, which allows for the development of a strong theoretical reference, in order to prove the research hypothesis. This paper presents a music-therapy analysis based on this tool to produce a conceptual mapping network, which ranges from magic through the rigor of the hard sciences.

  2. Is Collaborative, Community-Engaged Scholarship More Rigorous than Traditional Scholarship? On Advocacy, Bias, and Social Science Research

    Science.gov (United States)

    Warren, Mark R.; Calderón, José; Kupscznk, Luke Aubry; Squires, Gregory; Su, Celina

    2018-01-01

    Contrary to the charge that advocacy-oriented research cannot meet social science research standards because it is inherently biased, the authors of this article argue that collaborative, community-engaged scholarship (CCES) must meet high standards of rigor if it is to be useful to support equity-oriented, social justice agendas. In fact, they…

  3. Guidelines for conducting rigorous health care psychosocial cross-cultural/language qualitative research.

    Science.gov (United States)

    Arriaza, Pablo; Nedjat-Haiem, Frances; Lee, Hee Yun; Martin, Shadi S

    2015-01-01

    The purpose of this article is to synthesize and chronicle the authors' experiences as four bilingual and bicultural researchers, each experienced in conducting cross-cultural/cross-language qualitative research. Through narrative descriptions of experiences with Latinos, Iranians, and Hmong refugees, the authors discuss their rewards, challenges, and methods of enhancing rigor, trustworthiness, and transparency when conducting cross-cultural/cross-language research. The authors discuss and explore how to effectively manage cross-cultural qualitative data, how to effectively use interpreters and translators, how to identify best methods of transcribing data, and the role of creating strong community relationships. The authors provide guidelines for health care professionals to consider when engaging in cross-cultural qualitative research.

  4. Analyzing the Impact of Brand Equity and Advertisement on Customers’ Loyalty in Isfahan City

    OpenAIRE

    Mohammad Hossein Moshref Javadi; Sayyed Mohsen Allameh; Amir Poursaaedi

    2014-01-01

    The objective of this study was to analyze the impact of advertisement and brand equity on customers' loyalty in Isfahan city. Based on a literature review on advertising, brand equity and customer loyalty, a research model was presented. A standard questionnaire was used as the data collection instrument. To measure SNOWA Corporation's brand equity, Keller's brand equity model was used, with its six dimensions of brand salience, performance, image, judgments, feelings and resonance. Face validity was used to verify t...

  5. Reconsideration of the sequence of rigor mortis through postmortem changes in adenosine nucleotides and lactic acid in different rat muscles.

    Science.gov (United States)

    Kobayashi, M; Takatori, T; Iwadate, K; Nakajima, M

    1996-10-25

    We examined the changes in adenosine triphosphate (ATP), lactic acid, adenosine diphosphate (ADP) and adenosine monophosphate (AMP) in five different rat muscles after death. Rigor mortis has been thought to occur simultaneously in dead muscles and hence to start in small muscles sooner than in large muscles. In this study we found that the rate of decrease in ATP was significantly different in each muscle. The greatest drop in ATP was observed in the masseter muscle. These findings contradict the conventional theory of rigor mortis. Similarly, the rates of change in ADP and lactic acid, which are thought to be related to the consumption or production of ATP, were different in each muscle. However, the rate of change of AMP was the same in each muscle.

  6. Modeling and Analyzing the Slipping of the Ball Screw

    Directory of Open Access Journals (Sweden)

    Nannan Xu

    Full Text Available This paper aims to set up a systematic model of ball slipping and to analyze the slipping characteristics caused by different factors for a ball screw operating at high speeds. To investigate the ball screw slipping mechanism, a transformed coordinate system is established first. It is then used to set up mathematical models of the ball slipping caused by the three main factors, and the slipping speeds can be calculated. Next, the influence of the contact angle, helix angle and screw diameter on ball screw slipping is analyzed according to the slipping model and the slipping-speed equations. Finally, the slipping-analysis curve is compared with the mechanical-efficiency curve of the ball screw obtained by Lin, which indirectly verifies the correctness of the slipping model. The slipping model and the slipping analysis established in this paper provide a theoretical basis for reducing slipping and improving the mechanical efficiency of a ball screw operating at high speeds.

  7. Rigorous bounds on survival times in circular accelerators and efficient computation of fringe-field transfer maps

    International Nuclear Information System (INIS)

    Hoffstaetter, G.H.

    1994-12-01

    Analyzing stability of particle motion in storage rings contributes to the general field of stability analysis in weakly nonlinear motion. A method which we call pseudo invariant estimation (PIE) is used to compute lower bounds on the survival time in circular accelerators. The pseudo invariants needed for this approach are computed via nonlinear perturbative normal form theory, and the required global maxima of the highly complicated multivariate functions could only be rigorously bounded with an extension of interval arithmetic. The bounds on the survival times are large enough to be relevant; the same is true for the lower bounds on dynamical apertures, which can be computed. The PIE method can lead to novel design criteria with the objective of maximizing the survival time. A major effort in the direction of rigorous predictions only makes sense if accurate models of accelerators are available. Fringe fields often have a significant influence on optical properties, but the computation of fringe-field maps by DA based integration is slower by several orders of magnitude than DA evaluation of the propagator for main-field maps. A novel computation of fringe-field effects called symplectic scaling (SYSCA) is introduced. It exploits the advantages of Lie transformations, generating functions, and scaling properties and is extremely accurate. The computation of fringe-field maps is typically made nearly two orders of magnitude faster. (orig.)
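
    The interval-arithmetic idea used above to bound global maxima can be conveyed by a toy sketch; real implementations also round interval endpoints outward, which is omitted here for brevity.

        # Enclose the range of a function over a box, hence an upper bound
        # on its global maximum there (toy interval arithmetic).
        class Interval:
            def __init__(self, lo, hi):
                self.lo, self.hi = lo, hi
            def __add__(self, other):
                return Interval(self.lo + other.lo, self.hi + other.hi)
            def __mul__(self, other):
                ps = [self.lo * other.lo, self.lo * other.hi,
                      self.hi * other.lo, self.hi * other.hi]
                return Interval(min(ps), max(ps))

        def f(x, y):                  # any expression built from + and *
            return x * x + x * y      # evaluates naturally over intervals

        box = (Interval(-1.0, 1.0), Interval(0.0, 0.5))
        enc = f(*box)
        print(enc.lo, enc.hi)         # f's max over the box is <= enc.hi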

  8. Useful, Used, and Peer Approved: The Importance of Rigor and Accessibility in Postsecondary Research and Evaluation. WISCAPE Viewpoints

    Science.gov (United States)

    Vaade, Elizabeth; McCready, Bo

    2012-01-01

    Traditionally, researchers, policymakers, and practitioners have perceived a tension between rigor and accessibility in quantitative research and evaluation in postsecondary education. However, this study indicates that both producers and consumers of these studies value high-quality work and clear findings that can reach multiple audiences. The…

  9. Improved rigorous upper bounds for transport due to passive advection described by simple models of bounded systems

    International Nuclear Information System (INIS)

    Kim, Chang-Bae; Krommes, J.A.

    1988-08-01

    The work of Krommes and Smith on rigorous upper bounds for the turbulent transport of a passively advected scalar [Ann. Phys. 177:246 (1987)] is extended in two directions: (1) For their ''reference model,'' improved upper bounds are obtained by utilizing more sophisticated two-time constraints which include the effects of cross-correlations up to fourth order. Numerical solutions of the model stochastic differential equation are also obtained; they show that the new bounds compare quite favorably with the exact results, even at large Reynolds and Kubo numbers. (2) The theory is extended to take account of a finite spatial autocorrelation length L_c. As a reasonably generic example, the problem of particle transport due to statistically specified stochastic magnetic fields in a collisionless turbulent plasma is revisited. A bound is obtained which reduces for small L_c to the quasilinear limit and for large L_c to the strong turbulence limit, and which provides a reasonable and rigorous interpolation for intermediate values of L_c. 18 refs., 6 figs

  10. Teaching practice and experiences of verifying the three laws of genetics based on the SSLP marker analysis.

    Science.gov (United States)

    Huang, Xue-Ying; Fan, Kai; Ye, Yan-Fang; Wang, Bin; Wu, Wei-Ren; Lan, Tao

    2017-09-20

    We explored the practical effect of the genetic analysis of simple sequence length polymorphism (SSLP) molecular markers in rice in the genetics lab course. Two parents and their F2 population were analyzed with three SSLP molecular markers located on two chromosomes of the rice genome. The markers' genotype data were used to verify the three laws of genetics: segregation, independent assortment, and linkage and crossing-over. Our practice has proved not only beneficial in deepening students' understanding of the three laws of genetics, but also conducive to cultivating students' interest in research and innovation and improving their skills and comprehensive analysis abilities. At the same time, the application scope of the experiment was discussed. This comprehensive experiment is also useful for the transformation of scientific research achievements into undergraduate experimental teaching.
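
    A natural way to verify the law of segregation with such marker data is a chi-square goodness-of-fit test; the sketch below uses hypothetical counts, not the course's data.

        from scipy.stats import chisquare

        # Hypothetical F2 genotype counts for one codominant SSLP marker
        # (AA : Aa : aa); the law of segregation predicts a 1:2:1 ratio.
        observed = [48, 105, 47]
        n = sum(observed)
        chi2, p = chisquare(observed, f_exp=[0.25 * n, 0.5 * n, 0.25 * n])
        print(chi2, p)   # a large p-value is consistent with Mendelian segregation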

  11. Study on diagnostic plant analyzer method for support of emergency operation

    International Nuclear Information System (INIS)

    Yoshikawa, H.; Gofuku, A.; Itoh, K.; Wakabayashi, J.

    1986-01-01

    Methods for a time-critical diagnostic plant analyzer are investigated which would serve as support for the emergency operation of a nuclear power plant. A faster-than-real-time simulator code, TOKRAC, is developed for analyzing PWR primary loop thermo-hydraulics in a small-break LOCA, and it is applied to a numerical experiment on the initial phase of the TMI-2 accident. TOKRAC showed good agreement with a RELAP4/MOD6 calculation and the plant record while running in as little as one-tenth of real time. A real-time estimator of the SG heat transfer rate based on a Kalman filter is proposed, and its applicability is verified using LOFT ATWS experimental data. With regard to integrating these methods into the software system of an operation support center, a new concept of a module-based simulation system is proposed, which aims at offering a flexible and human-cognition-oriented environment for the development of various analytical tools
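
    As a generic illustration of the kind of Kalman-filter estimator mentioned above (the actual steam-generator heat-transfer model is not given in the abstract), a minimal scalar filter looks like this; all parameters are made up.

        import numpy as np

        def kalman_filter(z, F=1.0, H=1.0, Q=1e-4, R=1e-2, x0=0.0, P0=1.0):
            """Scalar Kalman filter: state transition F, observation H,
            process noise Q, measurement noise R."""
            x, P, estimates = x0, P0, []
            for zk in z:
                x, P = F * x, F * P * F + Q                      # predict
                K = P * H / (H * P * H + R)                      # gain
                x, P = x + K * (zk - H * x), (1.0 - K * H) * P   # update
                estimates.append(x)
            return np.array(estimates)

        rng = np.random.default_rng(1)
        measurements = 5.0 + 0.1 * rng.normal(size=200)  # noisy constant signal
        print(kalman_filter(measurements)[-1])           # approaches 5.0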

  12. A study into first-year engineering education success using a rigorous mixed methods approach

    DEFF Research Database (Denmark)

    van den Bogaard, M.E.D.; de Graaff, Erik; Verbraek, Alexander

    2015-01-01

    The aim of this paper is to combine qualitative and quantitative research methods into rigorous research into student success. Research methods have weaknesses that can be overcome by clever combinations. In this paper we use a situated study into student success as an example of how methods...... using statistical techniques. The main elements of the model were student behaviour and student disposition, which were influenced by the students’ perceptions of the education environment. The outcomes of the qualitative studies were useful in interpreting the outcomes of the structural equation...

  13. Verifying mapping, monitoring and modeling of fine sediment pollution sources in West Maui, Hawai'i, USA

    Science.gov (United States)

    Cerovski-Darriau, C.; Stock, J. D.

    2017-12-01

    Coral reef ecosystems, and the fishing and tourism industries they support, depend on clean waters. Fine sediment pollution from nearshore watersheds threatens these enterprises in West Maui, Hawai'i. To effectively mitigate sediment pollution, we first have to know where the sediment is coming from, and how fast it erodes. In West Maui, we know that nearshore sediment plumes originate from erosion of fine sand- to silt-sized air fall deposits where they are exposed by grazing, agriculture, or other disturbances. We identified and located these sediment sources by mapping watershed geomorphological processes using field traverses, historic air photos, and modern orthophotos. We estimated bank lowering rates using erosion pins, and other surface erosion rates were extrapolated from data collected elsewhere on the Hawaiian Islands. These measurements and mapping led to a reconnaissance sediment budget which showed that annual loads are dominated by bank erosion of legacy terraces. Field observations during small storms confirm that nearshore sediment plumes are sourced from bank erosion of in-stream, legacy agricultural deposits. To further verify this sediment budget, we used geochemical fingerprinting to uniquely identify each potential source (e.g. stream banks, agricultural fields, roads, other human modified soils, and hillslopes) from the Wahikuli watershed (10 km2) and analyzed the fine fraction using ICP-MS for elemental geochemistry. We propose to apply the fingerprinting results to nearshore suspended sediment samples taken during storms to identify the proportion of sediment coming from each source. By combining traditional geomorphic mapping, monitoring and geochemistry, we hope to provide a powerful tool to verify the primary source of sediment reaching the nearshore.

  14. Concrete ensemble Kalman filters with rigorous catastrophic filter divergence.

    Science.gov (United States)

    Kelly, David; Majda, Andrew J; Tong, Xin T

    2015-08-25

    The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature.
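
    For orientation, a minimal perturbed-observation ensemble Kalman filter analysis step looks roughly like this (a generic textbook sketch, not the paper's forecast model; in catastrophic divergence the ensemble values blow up over repeated forecast/analysis cycles):

      import numpy as np

      def enkf_analysis(ens, y_obs, H, R, rng):
          """Perturbed-observation EnKF analysis.
          ens: (n_state, n_members), H: (n_obs, n_state), R: (n_obs, n_obs)."""
          n_obs, n_mem = H.shape[0], ens.shape[1]
          X = ens - ens.mean(axis=1, keepdims=True)
          P = X @ X.T / (n_mem - 1)                     # sample forecast covariance
          K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
          # each member assimilates its own perturbed copy of the observation
          Y = y_obs[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_mem).T
          return ens + K @ (Y - H @ ens)

      rng = np.random.default_rng(1)
      ens = rng.normal(0.0, 1.0, (2, 20))               # 2 states, 20 members
      H = np.array([[1.0, 0.0]])                        # observe the first state
      R = np.array([[0.1]])
      ens = enkf_analysis(ens, np.array([0.5]), H, R, rng)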

  15. PRO development: rigorous qualitative research as the crucial foundation.

    Science.gov (United States)

    Lasch, Kathryn Eilene; Marquis, Patrick; Vigneux, Marc; Abetz, Linda; Arnould, Benoit; Bayliss, Martha; Crawford, Bruce; Rosa, Kathleen

    2010-10-01

    Recently published articles have described criteria to assess qualitative research in the health field in general, but very few articles have delineated qualitative methods to be used in the development of Patient-Reported Outcomes (PROs). In fact, how PROs are developed with subject input through focus groups and interviews has been given relatively short shrift in the PRO literature when compared to the plethora of quantitative articles on the psychometric properties of PROs. If documented at all, most PRO validation articles offer the reader little basis for evaluating the content validity of the measures or the credibility and trustworthiness of the methods used to develop them. Increasingly, however, scientists and authorities want to be assured that PRO items and scales have meaning and relevance to subjects. This article was developed by an international, interdisciplinary group of psychologists, psychometricians, regulatory experts, a physician, and a sociologist. It presents rigorous and appropriate qualitative research methods for developing PROs with content validity. The approach described combines an overarching phenomenological theoretical framework with grounded theory data collection and analysis methods to yield PRO items and scales that have content validity.

  16. Analytical verification and quality assessment of the Tosoh HLC-723GX HbA1c analyzer

    Directory of Open Access Journals (Sweden)

    Marko Ris

    2017-04-01

    Objectives: Ion-exchange high-performance liquid chromatography (IE-HPLC) has long been used as a reproducible and versatile analytical tool for HbA1c measurement. In this study, we performed analytical verification and quality assessment of the recently introduced small IE-HPLC Tosoh HLC-723GX HbA1c analyzer, and compared its results to immunoassay (IA) and capillary electrophoresis (CE). Design and methods: The total imprecision of the Tosoh HLC-723GX was verified according to the CLSI EP15-A2 protocol using commercial control materials (C-QC) and pooled human whole-blood samples (HWB). The Sigma metric was used for the evaluation of quality targets. HbA1c results were compared to automated CE (MiniCap Flex Piercing, Sebia, France) and IA (Tina-quant HbA1c Gen 2, Cobas Integra 400+, Roche Diagnostics, USA) procedures. Results: The total imprecision of the Tosoh HLC-723GX HbA1c in IFCC (mmol/mol) and NGSP (%) units was 1.91%/1.25% (HbA1c = 31 mmol/mol / 5.0%) and 0.51%/0.63% (HbA1c = 84 mmol/mol / 9.8%) for C-QC, and 0.39%/0.2% (HbA1c = 47 mmol/mol / 6.5%) and 0.77%/0.46% (HbA1c = 94 mmol/mol / 10.8%) in HWB samples, respectively. Bland-Altman analysis did not reveal any deviation of the results between the Tosoh HLC-723GX and CE: mean difference 0.0% (95% CI: −0.02927 to 0.02653%), while the mean HbA1c difference against IA was −0.07% (95% CI: −0.1039 to −0.02765%). At the selected HbA1c clinical decision level (48 mmol/mol / 6.5%), six sigma analysis gave a σ value of 3.91, within a desirable classification of performance. Conclusion: The analytical performance of the Tosoh HLC-723GX complies with the rigorous quality criteria for clinical use of HbA1c, with results comparable to the CE procedure. The Tosoh HLC-723GX provides a plausible analytical choice for reliable HbA1c measurement in low-volume laboratories. Keywords: HbA1c, Quality targets, Six sigma, Tosoh HLC-723GX analyzer
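
    The σ figure quoted above follows the standard laboratory formula sigma = (TEa − |bias|) / CV; the allowable total error (TEa) and bias used below are illustrative assumptions, not values taken from the study:

      def sigma_metric(tea_pct, bias_pct, cv_pct):
          """Laboratory Sigma metric, all arguments in percent."""
          return (tea_pct - abs(bias_pct)) / cv_pct

      # e.g. with an assumed TEa of 6% and bias of 1% at CV = 1.28%:
      print(round(sigma_metric(6.0, 1.0, 1.28), 2))   # -> 3.91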

  17. Verifying atom entanglement schemes by testing Bell's inequality

    International Nuclear Information System (INIS)

    Angelakis, D.G.; Knight, P.L.; Tregenna, B.; Munro, W.J.

    2001-01-01

    Recent experiments testing Bell's inequality with entangled photons and ions were aimed at testing basic quantum mechanical principles. Interesting results have been obtained and many loopholes could be closed. In this paper we point out that tests of Bell's inequality also play an important role in verifying atom entanglement schemes. As examples, we describe a scheme to prepare arbitrary entangled states of N two-level atoms using a leaky optical cavity and a scheme to entangle atoms inside a photonic crystal. During the state preparation no photons are emitted, and observing a violation of Bell's inequality is the only way to test with high precision whether a scheme works. (orig.)
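
    As a textbook illustration of the kind of test meant here, the CHSH form of Bell's inequality bounds any local realistic model by |S| ≤ 2, while ideal quantum correlations reach 2√2 (this generic polarization-pair sketch is not specific to the atom schemes of the paper):

      import numpy as np

      def E(t1, t2):
          """Ideal quantum correlation for polarization-entangled photons."""
          return np.cos(2.0 * (t1 - t2))

      a, a2 = 0.0, np.pi / 4            # Alice's analyzer settings
      b, b2 = np.pi / 8, 3 * np.pi / 8  # Bob's analyzer settings
      S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
      print(S)   # ~2.828 > 2: the classical CHSH bound is violated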

  18. Paulo Leminski: a study of rigor and relaxation in his poetry

    OpenAIRE

    Dhynarte de Borba e Albuquerque

    2005-01-01

    This work examines the trajectory of Paulo Leminski's poetry, seeking to establish the terms of its humor, its metalinguistic inquiry, and its lyric self, while it still exhibits traces of the marginal poetry of the 1970s. Leminski was an author who pursued concretist rigor through the procedures of more or less relaxed everyday speech. The poetic effort of the Curitiba-born Leminski is a "line that never ends": he wrote poetry, novels, advertising pieces, song lyrics, and translations. Em t...

  19. Derivation of basic equations for rigorous dynamic simulation of cryogenic distillation column for hydrogen isotope separation

    International Nuclear Information System (INIS)

    Kinoshita, Masahiro; Naruse, Yuji

    1981-08-01

    The basic equations are derived for rigorous dynamic simulation of cryogenic distillation columns for hydrogen isotope separation. The model accounts for such factors as differences in latent heat of vaporization among the six isotopic species of molecular hydrogen, decay heat of tritium, heat transfer through the column wall and nonideality of the solutions. Provision is also made for simulation of columns with multiple feeds and multiple sidestreams. (author)

  20. The low-energy neutron-deuteron analyzing power and the ³P₀,₁,₂ interactions of nucleon-nucleon potentials

    Energy Technology Data Exchange (ETDEWEB)

    Tornow, W.; Howell, C.R.; Alohali, M.; Chen, Z.P.; Felsher, P.D.; Hanly, J.M.; Walter, R.L.; Weisel, G. (Duke Univ., Durham, NC (USA). Dept. of Physics Triangle Universities Nuclear Lab., Durham, NC (USA)); Mertens, G. (Tuebingen Univ. (Germany, F.R.). Physikalisches Inst.); Slaus, I. (Institut Rudjer Boskovic, Zagreb (Yugoslavia)); Witala, H.; Gloeckle, W. (Bochum Univ. (Germany, F.R.). Inst. fuer Theoretische Physik 2)

    1991-03-28

    Data for the analyzing power A_y(θ) for the elastic scattering of neutrons from deuterons have been measured at 5.0, 6.5 and 8.5 MeV to an accuracy of ±0.0035. Surprisingly large differences have been observed at these low energies between the data and rigorous Faddeev calculations using the Paris and Bonn B nucleon-nucleon potentials. The A_y(θ) data provide a stringent test of our present understanding of the on-shell and off-shell ³P₀,₁,₂ nucleon-nucleon interactions. (orig.)

  1. An approach for verifying biogenic greenhouse gas emissions inventories with atmospheric CO2 concentration data

    International Nuclear Information System (INIS)

    Ogle, Stephen M; Davis, Kenneth; Lauvaux, Thomas; Miles, Natasha L; Richardson, Scott; Schuh, Andrew; Cooley, Dan; Breidt, F Jay; West, Tristram O; Heath, Linda S; Smith, James E; McCarty, Jessica L; Gurney, Kevin R; Tans, Pieter; Denning, A Scott

    2015-01-01

    Verifying national greenhouse gas (GHG) emissions inventories is a critical step to ensure that reported emissions data to the United Nations Framework Convention on Climate Change (UNFCCC) are accurate and representative of a country's contribution to GHG concentrations in the atmosphere. Furthermore, verifying biogenic fluxes provides a check on estimated emissions associated with managing lands for carbon sequestration and other activities, which often have large uncertainties. We report here on the challenges and results associated with a case study using atmospheric measurements of CO₂ concentrations and inverse modeling to verify nationally-reported biogenic CO₂ emissions. The biogenic CO₂ emissions inventory was compiled for the Mid-Continent region of the United States based on methods and data used by the US government for reporting to the UNFCCC, along with additional sources and sinks to produce a full carbon balance. The biogenic emissions inventory produced an estimated flux of −408 ± 136 Tg CO₂ for the entire study region, which was not statistically different from the biogenic flux of −478 ± 146 Tg CO₂ that was estimated using the atmospheric CO₂ concentration data. At sub-regional scales, the spatial density of atmospheric observations did not appear sufficient to verify emissions in general. However, a difference between the inventory and inversion results was found in one isolated area of West-central Wisconsin. This part of the region is dominated by forestlands, suggesting that further investigation may be warranted into the forest C stock or harvested wood product data from this portion of the study area. The results suggest that observations of atmospheric CO₂ concentration data and inverse modeling could be used to verify biogenic emissions, and provide more confidence in biogenic GHG emissions reporting to the UNFCCC. (letter)
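
    The "not statistically different" statement can be illustrated by propagating the two reported uncertainty half-widths onto the difference of the estimates (treating them, as an assumption for illustration, as 95% intervals on independent estimates):

      import math

      inv_flux, inv_u = -408.0, 136.0   # inventory estimate, Tg CO2
      atm_flux, atm_u = -478.0, 146.0   # atmospheric-inversion estimate, Tg CO2

      diff = inv_flux - atm_flux                # 70 Tg CO2
      diff_u = math.hypot(inv_u, atm_u)         # ~199.5 Tg CO2 combined half-width
      print(abs(diff) < diff_u)                 # True: the interval spans zero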

  2. Verifying the model of predicting entrepreneurial intention among students of business and non-business orientation

    Directory of Open Access Journals (Sweden)

    Zoran Sušanj

    2015-01-01

    This study aims to verify whether certain entrepreneurial characteristics, like entrepreneurial potential and entrepreneurial propensity, affect the level of entrepreneurial self-efficacy and desirability of entrepreneurship, and further have direct and indirect effects on entrepreneurial intentions. Furthermore, this study seeks to compare the strength of the relationship between these variables among groups of students who receive some entrepreneurship education and students outside the business sphere. Data were collected from a sample of undergraduate students of business and non-business orientation and analyzed with multi-group analysis within SEM. Results of the multi-group analysis indicate that, indeed, the strength of the relationship among the tested variables is more pronounced for business students. That is, the mediating effect of perceived entrepreneurial self-efficacy and desirability of entrepreneurship in the relationship between entrepreneurial characteristics and intent is significantly stronger for the business-oriented group than for the non-business group. The amount of explained variance of all constructs (except entrepreneurial propensity) is also larger for business students than for non-business students. Educational implications of the obtained results are discussed.

  3. Data-Driven Approach for Analyzing Hydrogeology and Groundwater Quality Across Multiple Scales.

    Science.gov (United States)

    Curtis, Zachary K; Li, Shu-Guang; Liao, Hua-Sheng; Lusch, David

    2017-08-29

    Recent trends of assimilating water well records into statewide databases provide a new opportunity for evaluating spatial dynamics of groundwater quality and quantity. However, these datasets are rarely analyzed rigorously to address larger scientific problems because they are massive and of relatively low quality. We develop an approach for utilizing well databases to analyze physical and geochemical aspects of groundwater systems, and apply it to a multiscale investigation of the sources and dynamics of chloride (Cl⁻) in the near-surface groundwater of the Lower Peninsula of Michigan. Nearly 500,000 static water levels (SWLs) were critically evaluated, extracted, and analyzed to delineate long-term, average groundwater flow patterns using a nonstationary kriging technique at the basin scale (i.e., across the entire peninsula). Two regions identified as major basin-scale discharge zones, the Michigan and Saginaw Lowlands, were further analyzed with regional- and local-scale SWL models. Groundwater valleys ("discharge" zones) and mounds ("recharge" zones) were identified for all models, and the proportions of wells with elevated Cl⁻ concentrations in each zone were calculated, visualized, and compared. Concentrations in discharge zones, where groundwater is expected to flow primarily upwards, are consistently and significantly higher than those in recharge zones. A synoptic sampling campaign in the Michigan Lowlands revealed that concentrations generally increase with depth, a trend noted in previous studies of the Saginaw Lowlands. These strong, consistent SWL and Cl⁻ distribution patterns across multiple scales suggest that a deep source (i.e., Michigan brines) is the primary cause of the elevated chloride concentrations observed in discharge areas across the peninsula. © 2017, National Ground Water Association.

  4. Circular polarization analyzer with polarization tunable focusing of surface plasmon polaritons

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Sen; Zhang, Yan, E-mail: yzhang@mail.cnu.edu.cn [Department of Physics, Harbin Institute of Technology, Harbin 150001 (China); Beijing Key Laboratory for Metamaterials and Devices, and Key Laboratory of Terahertz Optoelectronics, Ministry of Education, Department of Physics, Capital Normal University, Beijing 100048 (China); Wang, Xinke [Beijing Key Laboratory for Metamaterials and Devices, and Key Laboratory of Terahertz Optoelectronics, Ministry of Education, Department of Physics, Capital Normal University, Beijing 100048 (China); Kan, Qiang [State Key Laboratory for Integrated Optoelectronics, Institute of Semiconductors, Chinese Academy of Sciences, Beijing 100083 (China); Qu, Shiliang [Optoelectronics Department, Harbin Institute of Technology at Weihai, Weihai 264209 (China)

    2015-12-14

    A practical circular polarization analyzer (CPA) that can selectively focus surface plasmon polaritons (SPPs) at two separate locations, according to the helicity of the circularly polarized light, is designed and experimentally verified in the terahertz frequency range. The CPA consists of fishbone-slit units and is designed using the simulated annealing algorithm. By differentially detecting the intensities of the two SPP foci, the helicity of the incident circularly polarized light can be obtained, and the CPA is less vulnerable to the noise of the incident light. The proposed device may also have wide potential applications in chiral SPP photonics and the analysis of chiral molecules in biology.

  5. Using Project Complexity Determinations to Establish Required Levels of Project Rigor

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Thomas D.

    2015-10-01

    This presentation discusses the project complexity determination process that was developed by National Security Technologies, LLC, for the U.S. Department of Energy, National Nuclear Security Administration Nevada Field Office for implementation at the Nevada National Security Site (NNSS). The complexity determination process was developed to address the diversity of NNSS project types, sizes, and complexities; to fill the need for a single procedure with provision for tailoring the level of rigor to the project type, size, and complexity; to provide consistent, repeatable, effective application of project management processes across the enterprise; and to achieve higher levels of efficiency in project delivery. These needs are illustrated by the wide diversity of NNSS projects: Defense Experimentation, Global Security, weapons tests, military training areas, sensor development and testing, training in realistic environments, intelligence community support, environmental restoration/waste management, and disposal of radioactive waste, among others.

  6. Design of a verifiable subset for HAL/S

    Science.gov (United States)

    Browne, J. C.; Good, D. I.; Tripathi, A. R.; Young, W. D.

    1979-01-01

    An attempt to evaluate the applicability of program verification techniques to the existing programming language HAL/S is discussed. HAL/S is a general-purpose high-level language designed to accommodate the software needs of the NASA Space Shuttle project. Its diverse features for scientific computing, concurrent and real-time programming, and error handling are discussed. The criteria by which features were evaluated for inclusion in the verifiable subset are described. Individual features of HAL/S are examined with respect to these criteria, and justification for the omission of various features from the subset is provided. Conclusions drawn from the research are presented along with recommendations for the use of HAL/S with respect to program verification.

  7. How to Verify and Manage the Translational Plagiarism?

    Science.gov (United States)

    Wiwanitkit, Viroj

    2016-01-01

    The use of Google Translate as a tool for detecting translational plagiarism is a big challenge. As noted, plagiarism of original papers written in Macedonian and translated into other languages can be verified after computerized translation into those languages. Attempts to screen for translational plagiarism should be supported, and the Google Translate tool might be helpful. Special focus should be placed on any non-English reference that might be the source of plagiarized material, and on any non-English article that might be translated from an original English article, neither of which can be detected by a simple plagiarism screening tool. Detecting complex translational plagiarism is a hard job for any journal, but the harder job might be managing such cases effectively. PMID:27703588

  8. Association Between Maximal Skin Dose and Breast Brachytherapy Outcome: A Proposal for More Rigorous Dosimetric Constraints

    International Nuclear Information System (INIS)

    Cuttino, Laurie W.; Heffernan, Jill; Vera, Robyn; Rosu, Mihaela; Ramakrishnan, V. Ramesh; Arthur, Douglas W.

    2011-01-01

    Purpose: Multiple investigations have used the skin distance as a surrogate for the skin dose and have shown that distances 4.05 Gy/fraction. Conclusion: The initial skin dose recommendations have been based on safe use and the avoidance of significant toxicity. The results from the present study have suggested that patients might further benefit if more rigorous constraints were applied and if the skin dose were limited to 120% of the prescription dose.

  9. 13 CFR 127.403 - What happens if SBA verifies the concern's eligibility?

    Science.gov (United States)

    2010-01-01

    Title 13, Business Credit and Assistance, Volume 1 (revised as of 2010-01-01). Section 127.403: What happens if SBA verifies the concern's eligibility? Small Business Administration, Women-Owned Small Business Federal Contract Assistance Procedures, Eligibility Examinations, § 127...

  10. Rigorous spin-spin correlation function of Ising model on a special kind of Sierpinski Carpets

    International Nuclear Information System (INIS)

    Yang, Z.R.

    1993-10-01

    We have exactly calculated the rigorous spin-spin correlation function of the Ising model on a special kind of Sierpinski carpets (SCs) by means of graph expansion and a combinatorial approach, and investigated the asymptotic behaviour in the limit of long distance. The results show that there is no long-range correlation between spins at any finite temperature, which indicates that no phase transition exists and thus finally confirms the conclusion reached by the renormalization group method and other physical arguments. (author). 7 refs, 6 figs

  11. Increasing rigor in NMR-based metabolomics through validated and open source tools.

    Science.gov (United States)

    Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L

    2017-02-01

    The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism's phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. Copyright © 2016. Published by Elsevier Ltd.

  12. Release of major ions during rigor mortis development in kid Longissimus dorsi muscle.

    Science.gov (United States)

    Feidt, C; Brun-Bellut, J

    1999-01-01

    Ionic strength plays an important role in post-mortem muscle changes. Its increase is due to ion release during the development of rigor mortis. Twelve Alpine kids were used to study the effects of chilling and meat pH on ion release. Free ions were measured in the Longissimus dorsi muscle by capillary electrophoresis after water extraction. All free ion concentrations increased after death, but there were differences between ions. In contrast to the ultimate pH value, temperature was not a factor affecting ion release. Three release mechanisms are believed to coexist: passive binding to proteins, which stops as pH decreases; active segregation, which stops as ATP disappears; and the production of metabolites by anaerobic glycolysis.

  13. Development of a theoretical framework for analyzing cerebrospinal fluid dynamics

    DEFF Research Database (Denmark)

    Cohen, Benjamin; Voorhees, Abram; Vedel, Søren

    2009-01-01

    Background: To date hydrocephalus researchers acknowledge the need for rigorous but utilitarian fluid mechanics understanding and methodologies in studying normal and hydrocephalic intracranial dynamics. Pressure volume models and electric circuit analogs introduced pressure into volume conservat...

  14. A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing.

    Science.gov (United States)

    Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang

    2017-07-24

    With the rapid development of big data and the Internet of Things (IoT), the number of networked devices and the data volume are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network, can effectively solve the bottleneck problems of data transmission and data storage. However, security and privacy challenges are also arising in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices, and the computation results can be verified by using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method for it. Finally, analysis and simulation results show that our scheme is both secure and highly efficient.

  15. A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing

    Science.gov (United States)

    Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang

    2017-01-01

    With the rapid development of big data and the Internet of Things (IoT), the number of networked devices and the data volume are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network, can effectively solve the bottleneck problems of data transmission and data storage. However, security and privacy challenges are also arising in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices, and the computation results can be verified by using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method for it. Finally, analysis and simulation results show that our scheme is both secure and highly efficient. PMID:28737733

  16. Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol

    Science.gov (United States)

    Huang, Xiaowan; Singh, Anu; Smolka, Scott A.

    2010-01-01

    We use the UPPAAL model checker for Timed Automata to verify the Timing-Sync time-synchronization protocol for sensor networks (TPSN). The TPSN protocol seeks to provide network-wide synchronization of the distributed clocks in a sensor network. Clock-synchronization algorithms for sensor networks such as TPSN must be able to perform arithmetic on clock values to calculate clock drift and network propagation delays. They must be able to read the value of a local clock and assign it to another local clock. Such operations are not directly supported by the theory of Timed Automata. To overcome this formal-modeling obstacle, we augment the UPPAAL specification language with the integer clock derived type. Integer clocks, which are essentially integer variables that are periodically incremented by a global pulse generator, greatly facilitate the encoding of the operations required to synchronize clocks as in the TPSN protocol. With this integer-clock-based model of TPSN in hand, we use UPPAAL to verify that the protocol achieves network-wide time synchronization and is devoid of deadlock. We also use the UPPAAL Tracer tool to illustrate how integer clocks can be used to capture clock drift and resynchronization during protocol execution.
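
    A toy rendering of the integer-clock idea (an ordinary-Python sketch of the concept only, not the UPPAAL model): integer variables bumped on every global pulse, one node running fast, and a periodic TPSN-style offset exchange pulling it back:

      class IntegerClock:
          """Integer variable incremented on every global pulse."""
          def __init__(self, skew_ppm=0.0):
              self.value, self._acc, self.skew = 0, 0.0, skew_ppm / 1e6

          def pulse(self):
              self.value += 1
              self._acc += self.skew          # accumulated fractional drift
              if self._acc >= 1.0:            # a fast node occasionally gains a tick
                  self.value += 1
                  self._acc -= 1.0

      ref, node = IntegerClock(), IntegerClock(skew_ppm=50_000)
      for t in range(1, 1001):
          ref.pulse(); node.pulse()
          if t % 200 == 0:                    # periodic exchange (delay ignored)
              node.value -= node.value - ref.value
      print(ref.value, node.value)            # resynchronized at the last exchange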

  17. Rigorous derivation of the perimeter generating functions for the mean-squared radius of gyration of rectangular, Ferrers and pyramid polygons

    International Nuclear Information System (INIS)

    Lin, Keh Ying

    2006-01-01

    We have rigorously derived the perimeter generating functions for the mean-squared radius of gyration of rectangular, Ferrers and pyramid polygons. These functions were recently found by Jensen; his nonrigorous results are based on the analysis of long series expansions. (comment)

  18. Rigorous decoupling between edge states in frustrated spin chains and ladders

    Science.gov (United States)

    Chepiga, Natalia; Mila, Frédéric

    2018-05-01

    We investigate the occurrence of exact zero modes in one-dimensional quantum magnets of finite length that possess edge states. Building on conclusions first reached in the context of the spin-1/2 XY chain in a field and then for the spin-1 J1-J2 Heisenberg model, we show that the development of incommensurate correlations in the bulk invariably leads to oscillations in the sign of the coupling between edge states, and hence to exact zero energy modes at the crossing points where the coupling between the edge states rigorously vanishes. This is true regardless of the origin of the frustration (e.g., next-nearest-neighbor coupling or biquadratic coupling for the spin-1 chain), of the value of the bulk spin (we report on spin-1/2, spin-1, and spin-2 examples), and of the value of the edge-state emergent spin (spin-1/2 or spin-1).

  19. Rigorous constraints on the matrix elements of the energy–momentum tensor

    Directory of Open Access Journals (Sweden)

    Peter Lowdon

    2017-11-01

    The structure of the matrix elements of the energy–momentum tensor plays an important role in determining the properties of the form factors A(q²), B(q²) and C(q²) which appear in the Lorentz covariant decomposition of the matrix elements. In this paper we apply a rigorous frame-independent distributional-matching approach to the matrix elements of the Poincaré generators in order to derive constraints on these form factors as q→0. In contrast to the literature, we explicitly demonstrate that the vanishing of the anomalous gravitomagnetic moment B(0) and the condition A(0)=1 are independent of one another, and that these constraints are not related to the specific properties or conservation of the individual Poincaré generators themselves, but are in fact a consequence of the physical on-shell requirement of the states in the matrix elements and the manner in which these states transform under Poincaré transformations.

  20. Diffraction-based overlay measurement on dedicated mark using rigorous modeling method

    Science.gov (United States)

    Lu, Hailiang; Wang, Fan; Zhang, Qingyun; Chen, Yonghui; Zhou, Chang

    2012-03-01

    Diffraction-Based Overlay (DBO) has been widely evaluated by numerous authors; results show DBO can provide better performance than Imaging-Based Overlay (IBO). However, DBO has its own problems. As is well known, modeling-based DBO (mDBO) faces challenges of low measurement sensitivity and crosstalk between various structure parameters, which may result in poor accuracy and precision. Meanwhile, the main obstacle encountered by empirical DBO (eDBO) is that a few pads must be employed to gain sufficient information on overlay-induced diffraction signature variations, which consumes more wafer space and measuring time. eDBO may also suffer from mark profile asymmetry caused by processes. In this paper, we propose an alternative DBO technology that employs a dedicated overlay mark and takes a rigorous modeling approach. This technology needs only two or three pads for each direction, which is economical and time-saving. While reducing the overlay measurement error induced by mark profile asymmetry, this technology is expected to be as accurate and precise as scatterometry technologies.

  1. An efficient and rigorous thermodynamic library and optimal-control of a cryogenic air separation unit

    DEFF Research Database (Denmark)

    Gaspar, Jozsef; Ritschel, Tobias Kasper Skovborg; Jørgensen, John Bagterp

    2017-01-01

    -linear model based control to achieve optimal techno-economic performance. Accordingly, this work presents a computationally efficient and novel approach for solving a tray-by-tray equilibrium model and its implementation for open-loop optimal-control of a cryogenic distillation column. Here, the optimisation...... objective is to reduce the cost of compression in a volatile electricity market while meeting the production requirements, i.e. product flow rate and purity. This model is implemented in Matlab and uses the ThermoLib rigorous thermodynamic library. The present work represents a first step towards plant...

  2. Verifying Galileo's discoveries: telescope-making at the Collegio Romano

    Science.gov (United States)

    Reeves, Eileen; van Helden, Albert

    The Jesuits of the Collegio Romano in Rome, especially the mathematicians Clavius and Grienberger, were very interested in Galileo's discoveries. After they had failed to observe the celestial phenomena with telescopes of their own construction, they expressed serious doubts. But from November 1610 onward, after they had built a better telescope and obtained another one from Venice, and could verify Galileo's observations, they completely accepted them. Clavius, who stuck to the Ptolemaic system until his death in 1612, even pointed out these facts in his last edition of Sacrobosco's Sphaera. He, as well as his confreres, however, avoided any conclusions with respect to the planetary system.

  3. ASTUS system for verifying the transport seal TITUS 1

    International Nuclear Information System (INIS)

    Barillaux; Monteil, D.; Destain, G.D.

    1991-01-01

    ASTUS, a system for acquiring and processing ultrasonic signatures of TITUS 1 seals, has been developed. TITUS seals are used to verify the integrity of the sealing of fissile material containers after transport. An autonomous portable reading case allows seal signatures to be taken at the starting point and transmitted as reference signatures to a central safeguards computer by telephone modem. Then, at the terminal point, with a similar reading case, an authority takes the seal signatures again and immediately transmits them to the central safeguards computer. The central computer processes the data in real time by autocorrelation and returns its verdict to the terminal point.

  4. Early rigorous control interventions can largely reduce dengue outbreak magnitude: experience from Chaozhou, China.

    Science.gov (United States)

    Liu, Tao; Zhu, Guanghu; He, Jianfeng; Song, Tie; Zhang, Meng; Lin, Hualiang; Xiao, Jianpeng; Zeng, Weilin; Li, Xing; Li, Zhihao; Xie, Runsheng; Zhong, Haojie; Wu, Xiaocheng; Hu, Wenbiao; Zhang, Yonghui; Ma, Wenjun

    2017-08-02

    Dengue fever is a severe public health challenge in south China. A dengue outbreak was reported in Chaozhou city, China in 2015. Intensified interventions were implemented by the government to control the epidemic. However, the degree to which the intensified control measures reduced the size of the epidemic, and when such measures should be initiated to reduce the risk of large dengue outbreaks developing, remain unknown. We selected Xiangqiao district as the study setting because the majority of the indigenous cases (90.6%) in Chaozhou city were from this district. The numbers of daily indigenous dengue cases in 2015 were collected through the national infectious diseases and vectors surveillance system, and daily Breteau Index (BI) data were reported by the local public health department. We used a compartmental dynamic SEIR (Susceptible, Exposed, Infected and Removed) model to assess the effectiveness of control interventions and to evaluate the effect of intervention timing on the dengue epidemic. A total of 1250 indigenous dengue cases were reported from Xiangqiao district. SEIR modeling using BI as an indicator of actual control interventions yielded a total of 1255 dengue cases, close to the reported number (n = 1250). The size and duration of the outbreak were highly sensitive to the intensity and timing of interventions: the earlier and more rigorously the control interventions were implemented, the more effective they were. Even when the interventions were initiated several weeks after the onset of the dengue outbreak, they were shown to greatly reduce the prevalence and duration of the outbreak. This study suggests that early implementation of rigorous dengue interventions can effectively reduce the epidemic size and shorten the epidemic duration.
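
    A minimal SEIR sketch of the timing experiment (all parameters are illustrative human-to-human values; the study's model additionally used the Breteau Index to represent vector control, which is omitted here):

      import numpy as np
      from scipy.integrate import odeint

      def seir(y, t, beta_fn, sigma, gamma, N):
          S, E, I, R = y
          lam = beta_fn(t) * S * I / N
          return [-lam, lam - sigma * E, sigma * E - gamma * I, gamma * I]

      N, y0 = 500_000, [500_000 - 1, 0.0, 1.0, 0.0]
      t = np.linspace(0.0, 180.0, 181)                 # days
      sigma, gamma = 1 / 6.0, 1 / 5.0                  # assumed rates

      early = lambda t: 0.6 if t < 30 else 0.15        # intervention at day 30
      late  = lambda t: 0.6 if t < 60 else 0.15        # intervention at day 60
      n_early = odeint(seir, y0, t, args=(early, sigma, gamma, N))[-1, 3]
      n_late  = odeint(seir, y0, t, args=(late,  sigma, gamma, N))[-1, 3]
      print(int(n_early), int(n_late))   # earlier action -> smaller outbreak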

  5. Chlorophyll fluorescence is a rigorous, high throughput tool to analyze the impacts of genotype, species, and stress on plant and ecosystem productivity

    Science.gov (United States)

    Ewers, B. E.; Pleban, J. R.; Aston, T.; Beverly, D.; Speckman, H. N.; Hosseini, A.; Bretfeld, M.; Edwards, C.; Yarkhunova, Y.; Weinig, C.; Mackay, D. S.

    2017-12-01

    Abiotic and biotic stresses reduce plant productivity, yet high-throughput characterization of plant responses across genotypes, species and stress conditions is limited by both instrumentation and data analysis techniques. Recent developments in chlorophyll a fluorescence measurement at leaf to landscape scales could improve our predictive understanding of plant responses to stressors. We analyzed the interaction of species and stress across two crop types; five gymnosperm and two angiosperm tree species from boreal and montane forests; grasses, forbs and shrubs from sagebrush steppe; and 30 tree species from seasonally wet tropical forest. We also analyzed chlorophyll fluorescence and gas exchange data from twelve Brassica rapa crop accessions and 120 recombinant inbred lines to investigate phenotypic responses to drought. These data represent more than 10,000 measurements of fluorescence and allow us to answer two questions: (1) are the measurements from high-throughput, handheld and drone-mounted instruments quantitatively similar to those from lower-throughput camera-mounted and gas-exchange-coupled instruments, and (2) do the measurements detect differences in genotype, species and environmental stress effects on plants? We found through regression that the high- and low-throughput instruments agreed across both individual chlorophyll fluorescence components and calculated ratios, and were not different from a 1:1 relationship, with correlation greater than 0.9. We used hierarchical Bayesian modeling to test the second question. We found a linear relationship between the fluorescence-derived quantum yield of PSII and the quantum yield of CO2 assimilation from gas exchange, with a slope of ca. 0.1, indicating that the efficiency of the entire photosynthetic process was about 10% of that of PSII across genotypes, species and drought stress. Posterior estimates of quantum yield revealed that drought-treatment, genotype and species differences were preserved when accounting for measurement uncertainty.

  6. Rigorous description of holograms of particles illuminated by an astigmatic elliptical Gaussian beam

    Energy Technology Data Exchange (ETDEWEB)

    Yuan, Y J; Ren, K F; Coetmellec, S; Lebrun, D, E-mail: fang.ren@coria.f [UMR 6614/CORIA, CNRS and Universite et INSA de Rouen Avenue de l' Universite BP 12, 76801 Saint Etienne du Rouvray (France)

    2009-02-01

    Digital holography is a non-intrusive optical metrology technique well adapted to measuring the size and velocity field of particles in a fluid spray. The simplified model of an opaque disk is often used in the treatment of the holograms, and therefore the refraction and the third-dimension diffraction of the particle are not taken into account. In this paper we present a rigorous description of the holographic diagrams and evaluate the effects of the refraction and the third-dimension diffraction by comparison to the opaque disk model. It is found that the effects are important when the real part of the refractive index is near unity or the imaginary part is non-zero but small.

  7. Integrated Process Design and Control of Reactive Distillation Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil; Sales-Cruz, Mauricio; Huusom, Jakob Kjøbsted

    2015-01-01

    on the element concept, which is used to translate a system of compounds into elements. The operation of the reactive distillation column at the highest driving force and other candidate points is analyzed through analytical solution as well as rigorous open-loop and closed-loop simulations. By application...... of this approach, it is shown that designing the reactive distillation process at the maximum driving force results in an optimal design in terms of controllability and operability. It is verified that the reactive distillation design option is less sensitive to the disturbances in the feed at the highest driving...

  8. Election Verifiability: Cryptographic Definitions and an Analysis of Helios and JCJ

    Science.gov (United States)

    2015-08-06

  9. Analyzing temporal changes in maximum runoff volume series of the Danube River

    International Nuclear Information System (INIS)

    Halmova, Dana; Pekarova, Pavla; Onderka, Milan; Pekar, Jan

    2008-01-01

    Several hypotheses claim that more extremes in climatic and hydrologic phenomena are to be anticipated. To verify such hypotheses, it is essential to examine past periods by thoroughly analyzing historical data. In the present study, the annual maximum runoff volumes with t-day durations were calculated for a 130-year series of mean daily discharge of the Danube River at the Bratislava gauge (Slovakia). Statistical methods were used to clarify how the maximum runoff volumes of the Danube River changed over two historical periods (1876-1940 and 1941-2005). The conclusion is that the runoff volume regime during floods has not changed significantly during the last 130 years.
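
    The core quantity of the study, the annual maximum runoff volume of t-day duration, can be computed from a mean-daily-discharge series as below (synthetic data stand in for the Danube record, which is not reproduced here):

      import numpy as np
      import pandas as pd

      def annual_max_volumes(q, t_days):
          """Annual maxima of t-day runoff volumes from mean daily discharge.
          q: pandas Series of m3/s on a daily DatetimeIndex."""
          vol = q.rolling(t_days).sum() * 86400.0      # t-day volume in m3
          return vol.groupby(vol.index.year).max()

      idx = pd.date_range("1876-01-01", "2005-12-31", freq="D")
      rng = np.random.default_rng(2)
      q = pd.Series(1500.0 + rng.gamma(2.0, 400.0, len(idx)), index=idx)
      v5 = annual_max_volumes(q, t_days=5)             # one value per year
      print(v5.loc[1876], v5.loc[2005])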

  10. Fast rigorous numerical method for the solution of the anisotropic neutron transport problem and the NITRAN system for fusion neutronics application. Pt. 1

    International Nuclear Information System (INIS)

    Takahashi, A.; Rusch, D.

    1979-07-01

    Some recent neutronics experiments for fusion reactor blankets show that precise treatment of anisotropic secondary emissions for all types of neutron scattering is needed in neutron transport calculations. In the present work new rigorous methods, i.e. methods based on non-approximative microscopic neutron balance equations, are applied to treat the anisotropic collision source term in the transport equations. The collision source calculation is free from approximations except for the discretization of the energy, angle and space variables, and it includes the rigorous treatment of nonelastic collisions, as far as nuclear data are given. Two methods are presented: first the Ii-method, which relies on existing nuclear data files, and then, as an ultimate goal, the I*-method, which aims at the use of future double-differential cross-section data but is also applicable to the present single-differential data basis, allowing a smooth transition to the new data type. An application of the Ii-method is given in the code system NITRAN, which employs the S_N method to solve the transport equations. Both rigorous methods, the Ii- and the I*-method, are applicable to all radiation transport problems, and they can also be used in the Monte Carlo method to solve the transport problem. (orig./RW) [de]

  11. Spin temperature concept verified by optical magnetometry of nuclear spins

    Science.gov (United States)

    Vladimirova, M.; Cronenberger, S.; Scalbert, D.; Ryzhov, I. I.; Zapasskii, V. S.; Kozlov, G. G.; Lemaître, A.; Kavokin, K. V.

    2018-01-01

    We develop a method of nonperturbative optical control over adiabatic remagnetization of the nuclear spin system and apply it to verify the spin temperature concept in GaAs microcavities. The nuclear spin system is shown to exactly follow the predictions of the spin temperature theory, despite the quadrupole interaction that was earlier reported to disrupt nuclear spin thermalization. These findings open a way for the deep cooling of nuclear spins in semiconductor structures, with the prospect of realizing nuclear spin-ordered states for high-fidelity spin-photon interfaces.

  12. Why so many "rigorous" evaluations fail to identify unintended consequences of development programs: How mixed methods can contribute.

    Science.gov (United States)

    Bamberger, Michael; Tarsilla, Michele; Hesse-Biber, Sharlene

    2016-04-01

    Many widely-used impact evaluation designs, including randomized control trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences (UCs) of development programs. This seems surprising, as experienced planners and evaluators are well aware that unintended consequences frequently occur. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based or narrative) that programs have achieved their intended objectives, and the logic of many evaluation designs, even those considered the most "rigorous," does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs, as they are considered by many to be the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an "RCT+" design) can strengthen the credibility of these designs and capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies nine ways in which UCs can occur, and we apply this framework to two of the case studies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Independent technique of verifying high-dose rate (HDR) brachytherapy treatment plans

    International Nuclear Information System (INIS)

    Saw, Cheng B.; Korb, Leroy J.; Darnell, Brenda; Krishna, K. V.; Ulewicz, Dennis

    1998-01-01

    Purpose: An independent technique for verifying high-dose-rate (HDR) brachytherapy treatment plans has been formulated and validated clinically. Methods and Materials: In HDR brachytherapy, dwell times at the respective dwell positions are computed using an optimization algorithm in an HDR treatment-planning system to deliver a specified dose to many target points simultaneously. Because of the variability of dwell times, concerns have been expressed regarding the ability of the algorithm to compute the correct dose. To address this concern, a commercially available low-dose-rate (LDR) algorithm was used to compute the doses at defined distances, based on the dwell times obtained from the HDR treatment plans. The percent deviations between doses computed using the HDR and LDR algorithms were reviewed for HDR procedures performed over the last year. Results: In this retrospective study, the difference between computed doses using the HDR and LDR algorithms was within 5% for about 80% of the HDR procedures. All of the reviewed procedures had dose differences of less than 10%. Conclusion: An independent technique for verifying HDR brachytherapy treatment plans has been validated based on clinical data. Provided both systems are available, this technique is universal in its application and not limited to a particular implant applicator, implant site, or implant type.
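
    The verification reduces to a percent-deviation check between the two independently computed doses; a sketch with hypothetical dose values and an assumed 5% action level:

      def percent_deviation(d_hdr, d_ldr):
          """Deviation of the independent LDR-based dose from the HDR plan dose."""
          return 100.0 * (d_ldr - d_hdr) / d_hdr

      points = [(5.00, 5.12), (4.00, 4.31)]   # (HDR plan, independent check), Gy
      for d_hdr, d_ldr in points:
          dev = percent_deviation(d_hdr, d_ldr)
          status = "agree" if abs(dev) <= 5.0 else "review"
          print(f"{dev:+.1f}% {status}")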

  14. Machine learning to analyze images of shocked materials for precise and accurate measurements

    Energy Technology Data Exchange (ETDEWEB)

    Dresselhaus-Cooper, Leora; Howard, Marylesa; Hock, Margaret C.; Meehan, B. T.; Ramos, Kyle J.; Bolme, Cindy A.; Sandberg, Richard L.; Nelson, Keith A.

    2017-09-14

    A supervised machine learning algorithm, called locally adaptive discriminant analysis (LADA), has been developed to locate boundaries between identifiable image features that have varying intensities. LADA is an adaptation of image segmentation, which includes techniques that find the positions of image features (classes) using statistical intensity distributions for each class in the image. In order to place a pixel in the proper class, LADA considers the intensity at that pixel and the distribution of intensities in local (nearby) pixels. This paper presents the use of LADA to provide, with statistical uncertainties, the positions and shapes of features within ultrafast images of shock waves. We demonstrate the ability to locate image features including crystals, density changes associated with shock waves, and material jetting caused by shock waves. This algorithm can analyze images that exhibit a wide range of physical phenomena because it does not rely on comparison to a model. LADA enables analysis of images from shock physics with statistical rigor independent of underlying models or simulations.
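
    A toy two-class version of the idea, scoring each pixel against per-class Gaussian intensity models and smoothing the scores over a local window so nearby pixels influence the label (a conceptual sketch, not the published LADA algorithm):

      import numpy as np
      from scipy.ndimage import uniform_filter

      def locally_adaptive_segment(img, means, stds, win=7):
          """Label each pixel by locally averaged per-class log-likelihood."""
          scores = np.stack([
              -0.5 * ((img - m) / s) ** 2 - np.log(s)   # Gaussian log-likelihood
              for m, s in zip(means, stds)
          ])
          smoothed = np.stack([uniform_filter(s, size=win) for s in scores])
          return smoothed.argmax(axis=0)

      rng = np.random.default_rng(3)
      img = np.full((100, 100), 0.3); img[:, 50:] = 0.7     # two-intensity scene
      img += rng.normal(0.0, 0.05, img.shape)
      labels = locally_adaptive_segment(img, [0.3, 0.7], [0.05, 0.05])
      print((labels[:, :50] == 0).mean(), (labels[:, 50:] == 1).mean())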

  15. Fractional Stochastic Differential Equations Satisfying Fluctuation-Dissipation Theorem

    Science.gov (United States)

    Li, Lei; Liu, Jian-Guo; Lu, Jianfeng

    2017-10-01

    We propose in this work a fractional stochastic differential equation (FSDE) model consistent with the over-damped limit of the generalized Langevin equation model. As a result of the "fluctuation-dissipation theorem", differential equations driven by fractional Brownian noise to model memory effects should be paired with Caputo derivatives, and this FSDE model should be understood in an integral form. We establish the existence of strong solutions for such equations and discuss the ergodicity and convergence to the Gibbs measure. In the linear forcing regime, we show rigorously the algebraic convergence to the Gibbs measure when the fluctuation-dissipation theorem is satisfied, and this verifies that satisfying the fluctuation-dissipation theorem indeed leads to the correct physical behavior. We further discuss possible approaches to analyzing the ergodicity and convergence to the Gibbs measure in the nonlinear forcing regime, while leaving the rigorous analysis for future work. The FSDE model proposed is suitable for systems in contact with a heat bath with a power-law kernel and subdiffusive behavior.

  16. Multilingual Validation of the Questionnaire for Verifying Stroke-Free Status in West Africa.

    Science.gov (United States)

    Sarfo, Fred; Gebregziabher, Mulugeta; Ovbiagele, Bruce; Akinyemi, Rufus; Owolabi, Lukman; Obiako, Reginald; Akpa, Onoja; Armstrong, Kevin; Akpalu, Albert; Adamu, Sheila; Obese, Vida; Boa-Antwi, Nana; Appiah, Lambert; Arulogun, Oyedunni; Mensah, Yaw; Adeoye, Abiodun; Tosin, Aridegbe; Adeleye, Osimhiarherhuo; Tabi-Ajayi, Eric; Phillip, Ibinaiye; Sani, Abubakar; Isah, Suleiman; Tabari, Nasir; Mande, Aliyu; Agunloye, Atinuke; Ogbole, Godwin; Akinyemi, Joshua; Laryea, Ruth; Melikam, Sylvia; Uvere, Ezinne; Adekunle, Gregory; Kehinde, Salaam; Azuh, Paschal; Dambatta, Abdul; Ishaq, Naser; Saulson, Raelle; Arnett, Donna; Tiwari, Hemnant; Jenkins, Carolyn; Lackland, Dan; Owolabi, Mayowa

    2016-01-01

    The Questionnaire for Verifying Stroke-Free Status (QVSFS), a method for verifying stroke-free status in participants of clinical, epidemiological, and genetic studies, has not been validated in low-income settings where populations have limited knowledge of stroke symptoms. We aimed to validate the QVSFS in 3 languages (Yoruba, Hausa and Akan) for ascertainment of stroke-free status of control subjects enrolled in an ongoing stroke epidemiological study in West Africa. Data were collected using a cross-sectional study design in which 384 participants were consecutively recruited from neurology and general medicine clinics of 5 tertiary referral hospitals in Nigeria and Ghana. Ascertainment of stroke status was by neurologists using structured neurological examination, review of case records, and neuroimaging (gold standard). Relative performance of the QVSFS without and with pictures of stroke symptoms (pictograms) was assessed using sensitivity, specificity, positive predictive value, and negative predictive value. The overall median age of the study participants was 54 years, and 48.4% were males. Of 165 stroke cases identified by the gold standard, 98% were determined to have had a stroke, whereas of 219 participants without stroke, 87% were determined to be stroke-free by the QVSFS. The negative predictive value of the QVSFS across the 3 languages was 0.97 (range, 0.93-1.00); sensitivity, specificity, and positive predictive value were 0.98, 0.82, and 0.80, respectively. Agreement between the questionnaire with and without the pictogram was excellent/strong, with Cohen's κ=0.92. The QVSFS is a valid tool for verifying stroke-free status across culturally diverse populations in West Africa. © 2015 American Heart Association, Inc.

  17. Study designs for identifying risk compensation behavior among users of biomedical HIV prevention technologies: balancing methodological rigor and research ethics.

    Science.gov (United States)

    Underhill, Kristen

    2013-10-01

    The growing evidence base for biomedical HIV prevention interventions - such as oral pre-exposure prophylaxis, microbicides, male circumcision, treatment as prevention, and eventually prevention vaccines - has given rise to concerns about the ways in which users of these biomedical products may adjust their HIV risk behaviors based on the perception that they are prevented from infection. Known as risk compensation, this behavioral adjustment draws on the theory of "risk homeostasis," which has previously been applied to phenomena as diverse as Lyme disease vaccination, insurance mandates, and automobile safety. Little rigorous evidence exists to answer risk compensation concerns in the biomedical HIV prevention literature, in part because the field has not systematically evaluated the study designs available for testing these behaviors. The goals of this Commentary are to explain the origins of risk compensation behavior in risk homeostasis theory, to reframe risk compensation as a testable response to the perception of reduced risk, and to assess the methodological rigor and ethical justification of study designs aiming to isolate risk compensation responses. Although the most rigorous methodological designs for assessing risk compensation behavior may be unavailable due to ethical flaws, several strategies can help investigators identify potential risk compensation behavior during Phase II, Phase III, and Phase IV testing of new technologies. Where concerns arise regarding risk compensation behavior, empirical evidence about the incidence, types, and extent of these behavioral changes can illuminate opportunities to better support the users of new HIV prevention strategies. This Commentary concludes by suggesting a new way to conceptualize risk compensation behavior in the HIV prevention context. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Early rigorous control interventions can largely reduce dengue outbreak magnitude: experience from Chaozhou, China

    Directory of Open Access Journals (Sweden)

    Tao Liu

    2017-08-01

    Full Text Available Abstract Background Dengue fever is a severe public heath challenge in south China. A dengue outbreak was reported in Chaozhou city, China in 2015. Intensified interventions were implemented by the government to control the epidemic. However, it is still unknown the degree to which intensified control measures reduced the size of the epidemics, and when should such measures be initiated to reduce the risk of large dengue outbreaks developing? Methods We selected Xiangqiao district as study setting because the majority of the indigenous cases (90.6% in Chaozhou city were from this district. The numbers of daily indigenous dengue cases in 2015 were collected through the national infectious diseases and vectors surveillance system, and daily Breteau Index (BI data were reported by local public health department. We used a compartmental dynamic SEIR (Susceptible, Exposed, Infected and Removed model to assess the effectiveness of control interventions, and evaluate the control effect of intervention timing on dengue epidemic. Results A total of 1250 indigenous dengue cases was reported from Xiangqiao district. The results of SEIR modeling using BI as an indicator of actual control interventions showed a total of 1255 dengue cases, which is close to the reported number (n = 1250. The size and duration of the outbreak were highly sensitive to the intensity and timing of interventions. The more rigorous and earlier the control interventions implemented, the more effective it yielded. Even if the interventions were initiated several weeks after the onset of the dengue outbreak, the interventions were shown to greatly impact the prevalence and duration of dengue outbreak. Conclusions This study suggests that early implementation of rigorous dengue interventions can effectively reduce the epidemic size and shorten the epidemic duration.

  19. Rigorous lower bounds on the imaginary parts of the scattering amplitudes and the positions of their zeros

    CERN Document Server

    Uchiyama, T

    1974-01-01

    Rigorous lower bounds are derived from axiomatic field theory, by invoking analyticity and unitarity of the S-matrix. The bounds are expressed in terms of the total cross section and the slope parameter, and are found to be compatible with CERN experimental pp scattering data. It is also shown that the calculated lower-bound values imply non-existence of zeros for -t

  20. Rigorous approach to the comparison between experiment and theory in Casimir force measurements

    International Nuclear Information System (INIS)

    Klimchitskaya, G L; Chen, F; Decca, R S; Fischbach, E; Krause, D E; Lopez, D; Mohideen, U; Mostepanenko, V M

    2006-01-01

    In most experiments on the Casimir force the comparison between measurement data and theory was done using the concept of the root-mean-square deviation, a procedure that has been criticized in the literature. Here we propose a special statistical analysis which should be performed separately for the experimental data and for the results of the theoretical computations. In so doing, the random, systematic and total experimental errors are found as functions of separation, taking into account the distribution laws for each error at 95% confidence. Independently, all theoretical errors are combined to obtain the total theoretical error at the same confidence. Finally, the confidence interval for the differences between theoretical and experimental values is obtained as a function of separation. This rigorous approach is applied to two recent experiments on the Casimir effect
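
    As a rough illustration of this separation-by-separation comparison, the sketch below combines experimental and theoretical errors and checks whether the theory-experiment difference stays inside the 95% confidence band. It assumes, for simplicity, that errors combine in quadrature; the paper's actual treatment of the distribution laws of the individual errors is more elaborate, and all numbers here are invented.

```python
import numpy as np

sep = np.array([200.0, 300.0, 400.0])          # separations (nm), invented
F_expt = np.array([-12.00, -3.60, -1.50])      # measured force (arb. units)
F_theo = np.array([-11.85, -3.68, -1.46])      # computed force (arb. units)

d_rand = np.array([0.15, 0.08, 0.05])          # random error at 95% confidence
d_syst = np.array([0.10, 0.06, 0.04])          # systematic error at 95% confidence
d_expt = np.sqrt(d_rand**2 + d_syst**2)        # total experimental error
d_theo = np.array([0.20, 0.10, 0.06])          # total theoretical error at 95%

xi = np.sqrt(d_expt**2 + d_theo**2)            # 95% band for the difference
for a, diff, bound in zip(sep, F_theo - F_expt, xi):
    verdict = "consistent" if abs(diff) <= bound else "discrepant"
    print(f"a = {a:4.0f} nm: |difference| = {abs(diff):.2f} vs {bound:.2f} -> {verdict}")
```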

  1. Alternative pre-rigor foreshank positioning can improve beef shoulder muscle tenderness.

    Science.gov (United States)

    Grayson, A L; Lawrence, T E

    2013-09-01

    Thirty beef carcasses were harvested and the foreshank of each side was independently positioned (cranial, natural, parallel, or caudal) 1h post-mortem to determine the effect of foreshank angle at rigor mortis on the sarcomere length and tenderness of six beef shoulder muscles. The infraspinatus (IS), pectoralis profundus (PP), serratus ventralis (SV), supraspinatus (SS), teres major (TM) and triceps brachii (TB) were excised 48 h post-mortem for Warner-Bratzler shear force (WBSF) and sarcomere length evaluations. All muscles except the SS had altered (P<0.05) sarcomere lengths between positions; the cranial position resulted in the longest sarcomeres for the SV and TB muscles whilst the natural position had longer sarcomeres for the PP and TM muscles. The SV from the cranial position had lower (P<0.05) shear than the caudal position and TB from the natural position had lower (P<0.05) shear than the parallel or caudal positions. Sarcomere length was moderately correlated (r=-0.63; P<0.01) to shear force. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. A method of verifying period signals based on a data acquisition card

    International Nuclear Information System (INIS)

    Zeng Shaoli

    2005-01-01

This paper introduces a method for verifying the index voltage of a period signal generator using a data acquisition card; the measurement error is less than 0.5%. A corresponding Win32 program has been developed for the Windows platform; it uses a custom-developed VxD driver to control the data acquisition card through direct I/O and uses multi-threading to obtain the best time-scale precision. The program collects the index voltage data in real time and automatically measures the period. (authors)

  3. Stochastic Geometry and Quantum Gravity: Some Rigorous Results

    Science.gov (United States)

    Zessin, H.

The aim of these lectures is a short introduction to some recent developments in stochastic geometry, which have one of their origins in simplicial gravity theory (see Regge Nuovo Cimento 19: 558-571, 1961). The aim is to define and construct rigorously point processes on spaces of Euclidean simplices in such a way that the configurations of these simplices are simplicial complexes. The main interest then is concentrated on their curvature properties. We illustrate certain basic ideas from a mathematical point of view. An excellent presentation of this area can be found in Schneider and Weil (Stochastic and Integral Geometry, Springer, Berlin, 2008. German edition: Stochastische Geometrie, Teubner, 2000). In Ambjørn et al. (Quantum Geometry, Cambridge University Press, Cambridge, 1997) you find a beautiful account from the physical point of view. More recent developments in this direction can be found in Ambjørn et al. ("Quantum gravity as sum over spacetimes", Lect. Notes Phys. 807. Springer, Heidelberg, 2010). After an informal axiomatic introduction into the conceptual foundations of Regge's approach, the first lecture recalls the concepts and notations used. It presents the fundamental zero-infinity law of stochastic geometry and the construction of cluster processes based on it. The second lecture presents the main mathematical object, i.e. Poisson-Delaunay surfaces possessing an intrinsic random metric structure. The third and fourth lectures discuss their ergodic behaviour and present the two-dimensional Regge model of pure simplicial quantum gravity. We conclude with the formulation of basic open problems. Proofs are given in detail only in a few cases. In general the main ideas are developed. Sufficiently complete references are given.

  4. Flux wire measurements in Cavalier for verifying computer code applications

    International Nuclear Information System (INIS)

    Fehr, M.; Stubbs, J.; Hosticka, B.

    1988-01-01

    The Cavalier and UVAR research reactors are to be converted from high-enrichment uranium (HEU) to low-enrichment uranium (LEU) fuel. As a first step, an extensive set of gold wire activation measurements has been taken on the Cavalier reactor. Axial traverses show internal consistency to the order of ±5%, while horizontal traverses show somewhat larger deviations. The activation measurements will be converted to flux measurements via the Thermos code and will then be used to verify the Leopard-2DB codes. The codes will ultimately be used to design an upgraded LEU core for the UVAR

  5. Verifiable Fuel Cycle Simulation Model (VISION): A Tool for Analyzing Nuclear Fuel Cycle Futures

    International Nuclear Information System (INIS)

    Jacobson, Jacob J.; Piet, Steven J.; Matthern, Gretchen E.; Shropshire, David E.; Jeffers, Robert F.; Yacout, A.M.; Schweitzer, Tyler

    2010-01-01

    The nuclear fuel cycle consists of a set of complex components that are intended to work together. To support the nuclear renaissance, it is necessary to understand the impacts of changes and timing of events in any part of the fuel cycle system such as how the system would respond to each technological change, a series of which moves the fuel cycle from where it is to a postulated future state. The system analysis working group of the United States research program on advanced fuel cycles (formerly called the Advanced Fuel Cycle Initiative) is developing a dynamic simulation model, VISION, to capture the relationships, timing, and changes in and among the fuel cycle components to help develop an understanding of how the overall fuel cycle works. This paper is an overview of the philosophy and development strategy behind VISION. The paper includes some descriptions of the model components and some examples of how to use VISION. For example, VISION users can now change yearly the selection of separation or reactor technologies, the performance characteristics of those technologies, and/or the routing of material among separation and reactor types - with the model still operating on a PC in <5 min.

  6. A detailed and verified wind resource atlas for Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Mortensen, N G; Landberg, L; Rathmann, O; Nielsen, M N [Risoe National Lab., Roskilde (Denmark); Nielsen, P [Energy and Environmental Data, Aalberg (Denmark)

    1999-03-01

    A detailed and reliable wind resource atlas covering the entire land area of Denmark has been established. Key words of the methodology are wind atlas analysis, interpolation of wind atlas data sets, automated generation of digital terrain descriptions and modelling of local wind climates. The atlas contains wind speed and direction distributions, as well as mean energy densities of the wind, for 12 sectors and four heights above ground level: 25, 45, 70 and 100 m. The spatial resolution is 200 meters in the horizontal. The atlas has been verified by comparison with actual wind turbine power productions from over 1200 turbines. More than 80% of these turbines were predicted to within 10%. The atlas will become available on CD-ROM and on the Internet. (au)

  7. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Full Text Available Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  8. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
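
    The proposed procedures are demonstrated in R in the paper; the sketch below is an illustrative Python translation of the same split/analyze/meta-analyze idea: split a large dataset into chunks, estimate the same effect (here a correlation) in each chunk, and pool the estimates with a fixed-effect inverse-variance meta-analysis on the Fisher-z scale.

```python
import numpy as np

rng = np.random.default_rng(0)
n_total, n_chunks = 1_000_000, 100
x = rng.normal(size=n_total)
y = 0.3 * x + rng.normal(size=n_total)          # true correlation about 0.29

zs, ws = [], []
for chunk in np.array_split(np.arange(n_total), n_chunks):   # split
    r = np.corrcoef(x[chunk], y[chunk])[0, 1]   # analyze: effect per chunk
    zs.append(np.arctanh(r))                    # Fisher z-transform
    ws.append(len(chunk) - 3)                   # inverse variance of z is n - 3

z_pooled = np.average(zs, weights=ws)           # meta-analyze: pool estimates
se = 1.0 / np.sqrt(np.sum(ws))
lo, hi = np.tanh(z_pooled - 1.96 * se), np.tanh(z_pooled + 1.96 * se)
print(f"pooled r = {np.tanh(z_pooled):.4f} (95% CI {lo:.4f} to {hi:.4f})")
```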

  9. Space-charge effect in electron time-of-flight analyzer for high-energy photoemission spectroscopy

    International Nuclear Information System (INIS)

    Greco, G.; Verna, A.; Offi, F.; Stefani, G.

    2016-01-01

Highlights: • Two methods for the simulation of space-charge effect in time-resolved PES. • Reliability and advantages in the use of the SIMION® software. • Simulation of the space-charge effect in an electron TOF analyzer. • Feasibility of a TOF analyzer in time-resolved high-energy PES experiments at FEL. - Abstract: The space-charge effect, due to the instantaneous emission of many electrons after the absorption of a single photon pulse, causes distortion in the photoelectron energy spectrum. Two calculation methods have been applied to simulate the free-flight expansion of clouds of mono- and bi-energetic electrons generated by a high-energy pulse of light, and their results have been compared. The accuracy of a widely used tool, SIMION®, in predicting the energy distortion caused by the space charge has been tested and the reliability of its results verified. Finally, we used SIMION® to take into account the space-charge effects in the simulation of simple photoemission experiments with a time-of-flight analyzer.

  10. Verifying 4D gated radiotherapy using time-integrated electronic portal imaging: a phantom and clinical study

    Directory of Open Access Journals (Sweden)

    Slotman Ben J

    2007-08-01

Full Text Available Abstract Background Respiration-gated radiotherapy (RGRT) can decrease treatment toxicity by allowing for smaller treatment volumes for mobile tumors. RGRT is commonly performed using external surrogates of tumor motion. We describe the use of time-integrated electronic portal imaging (TI-EPI) to verify the position of internal structures during RGRT delivery. Methods TI-EPI portals were generated by continuously collecting exit dose data (aSi500 EPID, Portal Vision, Varian Medical Systems) when a respiratory motion phantom was irradiated during expiration, inspiration and free breathing phases. RGRT was delivered using the Varian RPM system, and grey value profile plots over a fixed trajectory were used to study object positions. Time-related positional information was derived by subtracting grey values from TI-EPI portals sharing the pixel matrix. TI-EPI portals were also collected in 2 patients undergoing RPM-triggered RGRT for a lung and hepatic tumor (with fiducial markers), and corresponding planning 4-dimensional CT (4DCT) scans were analyzed for motion amplitude. Results Integral grey values of phantom TI-EPI portals correlated well with mean object position in all respiratory phases. Cranio-caudal motion of internal structures ranged from 17.5–20.0 mm on planning 4DCT scans. TI-EPI of bronchial images reproduced with a mean value of 5.3 mm (1 SD 3.0 mm) located cranial to planned position. Mean hepatic fiducial markers reproduced with 3.2 mm (SD 2.2 mm) caudal to planned position. After bony alignment to exclude set-up errors, mean displacement in the two structures was 2.8 mm and 1.4 mm, respectively, and corresponding reproducibility in anatomy improved to 1.6 mm (1 SD). Conclusion TI-EPI appears to be a promising method for verifying delivery of RGRT. The RPM system was a good indirect surrogate of internal anatomy, but use of TI-EPI allowed for a direct link between anatomy and breathing patterns.

  11. Verifying pronunciation dictionaries using conflict analysis

    CSIR Research Space (South Africa)

    Davel, MH

    2010-09-01

Full Text Available The authors describe a new language-independent technique for automatically identifying errors in an electronic pronunciation dictionary by analyzing the source of conflicting patterns directly. They evaluate the effectiveness of the technique in two...

  12. Comprehensive laboratory and field testing of cavity ring-down spectroscopy analyzers measuring H2O, CO2, CH4 and CO

    Science.gov (United States)

    Yver Kwok, C.; Laurent, O.; Guemri, A.; Philippon, C.; Wastine, B.; Rella, C. W.; Vuillemin, C.; Truong, F.; Delmotte, M.; Kazan, V.; Darding, M.; Lebègue, B.; Kaiser, C.; Xueref-Rémy, I.; Ramonet, M.

    2015-09-01

    To develop an accurate measurement network of greenhouse gases, instruments in the field need to be stable and precise and thus require infrequent calibrations and a low consumption of consumables. For about 10 years, cavity ring-down spectroscopy (CRDS) analyzers have been available that meet these stringent requirements for precision and stability. Here, we present the results of tests of CRDS instruments in the laboratory (47 instruments) and in the field (15 instruments). The precision and stability of the measurements are studied. We demonstrate that, thanks to rigorous testing, newer models generally perform better than older models, especially in terms of reproducibility between instruments. In the field, we see the importance of individual diagnostics during the installation phase, and we show the value of calibration and target gases that assess the quality of the data. Finally, we formulate recommendations for use of these analyzers in the field.

  13. Comprehensive laboratory and field testing of cavity ring-down spectroscopy analyzers measuring H2O, CO2, CH4 and CO

    Directory of Open Access Journals (Sweden)

    C. Yver Kwok

    2015-09-01

Full Text Available To develop an accurate measurement network of greenhouse gases, instruments in the field need to be stable and precise and thus require infrequent calibrations and a low consumption of consumables. For about 10 years, cavity ring-down spectroscopy (CRDS) analyzers have been available that meet these stringent requirements for precision and stability. Here, we present the results of tests of CRDS instruments in the laboratory (47 instruments) and in the field (15 instruments). The precision and stability of the measurements are studied. We demonstrate that, thanks to rigorous testing, newer models generally perform better than older models, especially in terms of reproducibility between instruments. In the field, we see the importance of individual diagnostics during the installation phase, and we show the value of calibration and target gases that assess the quality of the data. Finally, we formulate recommendations for use of these analyzers in the field.

  14. Leveraging Parallel Data Processing Frameworks with Verified Lifting

    Directory of Open Access Journals (Sweden)

    Maaz Bin Safeer Ahmad

    2016-11-01

Full Text Available Many parallel data frameworks have been proposed in recent years that let sequential programs access parallel processing. To capitalize on the benefits of such frameworks, existing code must often be rewritten to the domain-specific languages that each framework supports. This rewriting, which is tedious and error-prone, also requires developers to choose the framework that best optimizes performance given a specific workload. This paper describes Casper, a novel compiler that automatically retargets sequential Java code for execution on Hadoop, a parallel data processing framework that implements the MapReduce paradigm. Given a sequential code fragment, Casper uses verified lifting to infer a high-level summary expressed in our program specification language that is then compiled for execution on Hadoop. We demonstrate that Casper automatically translates Java benchmarks into Hadoop. The translated results execute on average 3.3x faster than the sequential implementations and also scale better to larger datasets.
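
    As a toy illustration of what "lifting" a sequential fragment to the MapReduce paradigm means (in Python rather than Java/Hadoop, and without the automatic inference that Casper performs), compare a sequential word count with its map/reduce formulation:

```python
from functools import reduce

docs = ["a rose is a rose", "a daisy is not a rose"]

# Sequential fragment: count word occurrences with an explicit loop.
def word_count_seq(docs):
    counts = {}
    for doc in docs:
        for w in doc.split():
            counts[w] = counts.get(w, 0) + 1
    return counts

# Lifted summary: the same computation expressed as map and reduce steps,
# which a framework like Hadoop can execute in parallel across machines.
def word_count_mr(docs):
    mapped = [(w, 1) for doc in docs for w in doc.split()]        # map
    def combine(acc, kv):
        k, v = kv
        acc[k] = acc.get(k, 0) + v
        return acc
    return reduce(combine, mapped, {})                            # reduce

assert word_count_seq(docs) == word_count_mr(docs)
print(word_count_mr(docs))
```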

  15. Rigorous Results for the Distribution of Money on Connected Graphs

    Science.gov (United States)

    Lanchier, Nicolas; Reed, Stephanie

    2018-05-01

    This paper is concerned with general spatially explicit versions of three stochastic models for the dynamics of money that have been introduced and studied numerically by statistical physicists: the uniform reshuffling model, the immediate exchange model and the model with saving propensity. All three models consist of systems of economical agents that consecutively engage in pairwise monetary transactions. Computer simulations performed in the physics literature suggest that, when the number of agents and the average amount of money per agent are large, the limiting distribution of money as time goes to infinity approaches the exponential distribution for the first model, the gamma distribution with shape parameter two for the second model and a distribution similar but not exactly equal to a gamma distribution whose shape parameter depends on the saving propensity for the third model. The main objective of this paper is to give rigorous proofs of these conjectures and also extend these conjectures to generalizations of the first two models and a variant of the third model that include local rather than global interactions, i.e., instead of choosing the two interacting agents uniformly at random from the system, the agents are located on the vertex set of a general connected graph and can only interact with their neighbors.
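
    The sketch below simulates the simplest of the three models, uniform reshuffling with global interactions, and compares the empirical distribution of money with the exponential limit that the paper proves rigorously; the graph-restricted variants studied in the paper only change how the interacting pair is chosen.

```python
import numpy as np

rng = np.random.default_rng(42)
n_agents, avg_money = 5_000, 10.0
money = np.full(n_agents, avg_money)

for _ in range(1_000_000):                       # pairwise transactions
    i, j = rng.integers(n_agents, size=2)
    if i != j:
        pot = money[i] + money[j]
        u = rng.random()
        money[i], money[j] = u * pot, (1 - u) * pot   # uniform reshuffling

hist, edges = np.histogram(money, bins=6, range=(0, 30), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
expo = np.exp(-centers / avg_money) / avg_money       # exponential prediction
for c, h, e in zip(centers, hist, expo):
    print(f"m = {c:5.1f}: empirical {h:.4f} vs exponential {e:.4f}")
```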

  16. Spatially resolvable optical emission spectrometer for analyzing density uniformity of semiconductor process plasma

    International Nuclear Information System (INIS)

    Oh, Changhoon; Ryoo, Hoonchul; Lee, Hyungwoo; Hahn, Jae W.; Kim, Se-Yeon; Yi, Hun-Jung

    2010-01-01

We proposed a spatially resolved optical emission spectrometer (SROES) for analyzing the uniformity of plasma density for semiconductor processes. To enhance the spatial resolution of the SROES, we constructed a SROES system using a series of lenses, apertures, and pinholes. We calculated the spatial resolution of the SROES for the variation of pinhole size, and our calculated results were in good agreement with the measured spatial variation of the constructed SROES. The performance of the SROES was also verified by detecting the correlation between the distribution of a fluorine radical in an inductively coupled plasma etch process and the etch rate of a SiO2 film on a silicon wafer.

  17. Which Interventions Have the Greatest Effect on Student Learning in Sub-Saharan Africa? "A Meta-Analysis of Rigorous Impact Evaluations"

    Science.gov (United States)

    Conn, Katharine

    2014-01-01

    In the last three decades, there has been a large increase in the number of rigorous experimental and quasi-experimental evaluations of education programs in developing countries. These impact evaluations have taken place all over the globe, including a large number in Sub-Saharan Africa (SSA). The fact that the developing world is socially and…

  18. A Development of Advanced Rigorous 2 Step System for the High Resolution Residual Dose Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Do Hyun; Kim, Jong Woo; Kim, Jea Hyun; Lee, Jae Yong; Shin, Chang Ho [Hanyang Univ., Seoul (Korea, Republic of); Kim, Song Hyun [Kyoto University, Sennan (Japan)

    2016-10-15

In recent years, activation problems such as residual radiation have become important issues, since activated devices and structures can emit residual radiation. The activation therefore needs to be properly analyzed when planning the design, operation, and decontamination of nuclear facilities. For activation calculations, the Rigorous 2 Step (R2S) method follows this strategy: (1) a particle transport calculation is performed for the object geometry to obtain particle spectra and total fluxes; (2) the inventories of each cell are calculated from the flux information according to the irradiation and decay history; (3) the residual gamma distribution is evaluated with a transport code, if needed. This scheme is based on cell-wise calculation over the geometry used. In this method, the particle spectra and total fluxes for the activation calculation are obtained with mesh tallies, which is useful for reducing the effects of flux gradients. Nevertheless, several limitations are known: first, the spectra have high relative errors when many meshes are used; second, the void regions of a mesh tally carry flux information that differs from the true void spectrum. To calculate high-resolution residual doses, several methods have been developed, such as R2Smesh and MCR2S with unstructured mesh. The R2Smesh method obtains neutron spectra more efficiently by using fine/coarse meshes, and MCR2S with unstructured mesh can effectively separate the void spectrum. In this study, the AR2S system was developed to combine the features of these mesh-based R2S methods. To validate the AR2S system, a simple activation problem was evaluated and compared with the R2S method using the same division; the results agree within 0.83%. It is therefore expected that the AR2S system can properly estimate activation problems.

  19. Building a Laboratory-Scale Biogas Plant and Verifying its Functionality

    Science.gov (United States)

    Boleman, Tomáš; Fiala, Jozef; Blinová, Lenka; Gerulová, Kristína

    2011-01-01

The paper deals with the process of building a laboratory-scale biogas plant and verifying its functionality. The laboratory-scale prototype was constructed in the Department of Safety and Environmental Engineering at the Faculty of Materials Science and Technology in Trnava, of the Slovak University of Technology. The Department had already built a solar laboratory to promote and utilise solar energy, and designed the SETUR hydro engine. The biogas plant is the next step in the Department's activities in the field of renewable energy sources and biomass. The Department is also involved in a European Union project whose goal is to upgrade all existing renewable energy sources used in the Department.

  20. A new (k,n) verifiable secret image sharing scheme (VSISS)

    Directory of Open Access Journals (Sweden)

    Amitava Nag

    2014-11-01

Full Text Available In this paper, a new (k,n) verifiable secret image sharing scheme (VSISS) is proposed in which a third-order LFSR (linear-feedback shift register)-based public key cryptosystem is applied for cheating prevention and preview before decryption. In the proposed scheme the secret image is first partitioned into several non-overlapping blocks of k pixels. Each block of k pixels is then used to form m=⌈k/4⌉+1 pixels of one encrypted share. The original secret image can be reconstructed by gathering any k or more encrypted shared images. The experimental results show that the proposed VSISS is an efficient and safe method.
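
    The share-size relation quoted above is easy to check numerically; the sketch below tabulates m = ⌈k/4⌉ + 1 for a few block sizes, showing that each share shrinks toward a quarter of the secret image as k grows.

```python
# Numeric illustration of the share-size relation m = ceil(k/4) + 1 from the
# scheme described above; the block sizes chosen here are arbitrary examples.
from math import ceil

for k in (4, 8, 12, 16, 64):
    m = ceil(k / 4) + 1
    print(f"k = {k:3d} secret pixels -> m = {m:3d} share pixels "
          f"(expansion ratio m/k = {m / k:.3f})")
```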

  1. Interval-parameter chance-constraint programming model for end-of-life vehicles management under rigorous environmental regulations.

    Science.gov (United States)

    Simic, Vladimir

    2016-06-01

As the number of end-of-life vehicles (ELVs) is estimated to increase to 79.3 million units per year by 2020 (e.g., 40 million units were generated in 2010), there is strong motivation to effectively manage this fast-growing waste flow. Intensive work on management of ELVs is necessary in order to more successfully tackle this important environmental challenge. This paper proposes an interval-parameter chance-constraint programming model for end-of-life vehicles management under rigorous environmental regulations. The proposed model can incorporate various types of uncertainty information in the modeling process. The complex relationships between different ELV management sub-systems are successfully addressed. Particularly, the formulated model can help identify optimal patterns of procurement from multiple sources of ELV supply, production and inventory planning in multiple vehicle recycling factories, and allocation of sorted material flows to multiple final destinations under rigorous environmental regulations. A case study is conducted in order to demonstrate the potentials and applicability of the proposed model. Various constraint-violation probability levels are examined in detail. Influences of parameter uncertainty on model solutions are thoroughly investigated. Useful solutions for the management of ELVs are obtained under different probabilities of violating system constraints. The formulated model is able to tackle a hard ELV management problem in which uncertainty exists. The presented model has advantages in providing bases for determining long-term ELV management plans with desired compromises between the economic efficiency of the vehicle recycling system and system-reliability considerations. The results are helpful for supporting generation and improvement of ELV management plans. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Statistically rigorous calculations do not support common input and long-term synchronization of motor-unit firings

    Science.gov (United States)

    Kline, Joshua C.

    2014-01-01

    Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles—a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficient statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. PMID:25210152
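
    The exact SigMax procedure is specified in the paper; the sketch below only illustrates the general form of such a statistically based test: histogram the firing-time differences between two motor units and declare synchronization only when a bin exceeds a chance-level threshold (here a Poisson bound with a Bonferroni correction, which is an assumption of this sketch, not the paper's criterion).

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)
t1 = np.sort(rng.uniform(0, 100, 800))         # firing times of unit 1 (s)
t2 = np.sort(rng.uniform(0, 100, 800))         # firing times of unit 2 (s)

lat = (t2[None, :] - t1[:, None]).ravel()      # all pairwise latencies
lat = lat[np.abs(lat) < 0.05]                  # keep a +/- 50 ms window

counts, _ = np.histogram(lat, bins=100)        # 1 ms bins
expected = counts.mean()                       # flat (chance) level
# Poisson bound with a Bonferroni correction across the 100 bins (assumption)
threshold = poisson.ppf(1 - 0.05 / counts.size, expected)
peaks = np.flatnonzero(counts > threshold)
print("synchronized bins:", peaks if peaks.size else "none detected")
```

    For the independent firing trains generated here, the test should (correctly) detect no synchronization; a genuine common input would concentrate latencies near zero and push a bin above the threshold.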

  3. Plasma diagnostics with a retarding potential analyzer

    International Nuclear Information System (INIS)

    Jack, T.M.

    1996-01-01

The plasma rocket is located at NASA Johnson Space Center. To produce a thrust in space, an inert gas is ionized into a plasma and heated in the linear section of a tokamak fusion device. The magnetic field used to contain the plasma has a magnitude of 2-10 kGauss. The plasma plume has a variable thrust and specific impulse. A high temperature retarding potential analyzer (RPA) is being developed to characterize the plasma in the plume and at the edge of the magnetically contained plasma. The RPA measures the energy and density of ions or electrons entering into its solid angle of collection. An oscilloscope displays the ion flux versus the collected current. All measurements are made relative to the facility ground. Testing of this device involves the determination of its output parameters, sensitivity, and responses to a wide range of energies and densities. Each grid will be tested individually by changing only its voltage and observing the output from the RPA. To verify that the RPA is providing proper output, it is compared to the output from a Langmuir or Faraday probe

  4. Using Model Checking for Analyzing Distributed Power Control Problems

    Directory of Open Access Journals (Sweden)

    Thomas Brihaye

    2010-01-01

Full Text Available Model checking (MC) is a formal verification technique which is well known and continues to enjoy resounding success in the computer science community. Realizing that the distributed power control (PC) problem can be modeled by a timed game between a given transmitter and its environment, the authors wanted to know whether this approach can be applied to distributed PC. It turns out that it can be applied successfully and allows one to analyze realistic scenarios, including the case of discrete transmit powers and games with incomplete information. The proposed methodology is as follows. We state some objectives a transmitter-receiver pair would like to reach. The network is modeled by a game where transmitters are considered as timed automata interacting with each other. The objectives are then translated into timed alternating-time temporal logic formulae, and MC is exploited to know whether the desired properties are verified and to determine a winning strategy.

  5. Fault Diagnosis of Motor Bearing by Analyzing a Video Clip

    Directory of Open Access Journals (Sweden)

    Siliang Lu

    2016-01-01

Full Text Available Conventional bearing fault diagnosis methods require specialized instruments to acquire signals that can reflect the health condition of the bearing. For instance, an accelerometer is used to acquire vibration signals, whereas an encoder is used to measure motor shaft speed. This study proposes a new method for simplifying the instruments for motor bearing fault diagnosis. Specifically, a video clip recording of a running bearing system is captured using a cellphone that is equipped with a camera and a microphone. The recorded video is subsequently analyzed to obtain the instantaneous frequency of rotation (IFR). The instantaneous fault characteristic frequency (IFCF) of the defective bearing is obtained by analyzing the sound signal that is recorded by the microphone. The fault characteristic order is calculated by dividing IFCF by IFR to identify the fault type of the bearing. The effectiveness and robustness of the proposed method are verified by a series of experiments. This study provides a simple, flexible, and effective solution for motor bearing fault diagnosis. Given that the signals are gathered using an affordable and accessible cellphone, the proposed method is proven suitable for diagnosing the health conditions of bearing systems that are located in remote areas where specialized instruments are unavailable or limited.
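
    A minimal sketch of the final order-matching step is shown below; the IFR, IFCF, and bearing characteristic orders are invented values, and the estimation of the two frequencies from the video and audio tracks is not reproduced here.

```python
# Order matching: divide the instantaneous fault characteristic frequency
# (IFCF, from the audio track) by the instantaneous frequency of rotation
# (IFR, from the video frames) and compare with the bearing's theoretical
# fault orders. All numbers below are illustrative assumptions.
ifcf = 107.3          # Hz, assumed estimate from the microphone signal
ifr = 29.8            # Hz, assumed estimate from the video frames

fault_orders = {"outer race (BPFO)": 3.58,
                "inner race (BPFI)": 5.42,
                "rolling element (BSF)": 2.38}

order = ifcf / ifr
best = min(fault_orders, key=lambda k: abs(fault_orders[k] - order))
print(f"measured order {order:.2f} -> closest match: {best}")
```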

  6. Rigor mortis and the epileptology of Charles Bland Radcliffe (1822-1889).

    Science.gov (United States)

    Eadie, M J

    2007-03-01

    Charles Bland Radcliffe (1822-1889) was one of the physicians who made major contributions to the literature on epilepsy in the mid-19th century, when the modern understanding of the disorder was beginning to emerge, particularly in England. His experimental work was concerned with the electrical properties of frog muscle and nerve. Early in his career he related his experimental findings to the phenomenon of rigor mortis and concluded that, contrary to the general belief of the time, muscle contraction depended on the cessation of nerve input, and muscle relaxation on its presence. He adhered to this counter-intuitive interpretation throughout his life and, based on it, produced an epileptology that was very different from those of his contemporaries and successors. His interpretations were ultimately without any direct influence on the advance of knowledge. However, his idea that withdrawal of an inhibitory process released previously suppressed muscular contractile powers, when applied to the brain rather than the periphery of the nervous system, permitted Hughlings Jackson to explain certain psychological phenomena that accompany or follow some epileptic events. As well, Radcliffe was one of the chief early advocates for potassium bromide, the first effective anticonvulsant.

  7. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Science.gov (United States)

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical

  8. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Directory of Open Access Journals (Sweden)

    David C Pavlacky

Full Text Available Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous

  9. ATHENA [Advanced Thermal Hydraulic Energy Network Analyzer] solutions to developmental assessment problems

    International Nuclear Information System (INIS)

    Carlson, K.E.; Ransom, V.H.; Roth, P.A.

    1987-03-01

    The ATHENA (Advanced Thermal Hydraulic Energy Network Analyzer) code has been developed to perform transient simulation of the thermal hydraulic systems that may be found in fusion reactors, space reactors, and other advanced systems. As an assessment of current capability the code was applied to a number of physical problems, both conceptual and actual experiments. Results indicate that the numerical solution to the basic conservation equations is technically sound, and that generally good agreement can be obtained when modeling relevant hydrodynamic experiments. The assessment also demonstrates basic fusion system modeling capability and verifies compatibility of the code with both CDC and CRAY mainframes. Areas where improvements could be made include constitutive modeling, which describes the interfacial exchange term. 13 refs., 84 figs

  10. New tools for Content Innovation and data sharing: Enhancing reproducibility and rigor in biomechanics research.

    Science.gov (United States)

    Guilak, Farshid

    2017-03-21

    We are currently in one of the most exciting times for science and engineering as we witness unprecedented growth in our computational and experimental capabilities to generate new data and models. To facilitate data and model sharing, and to enhance reproducibility and rigor in biomechanics research, the Journal of Biomechanics has introduced a number of tools for Content Innovation to allow presentation, sharing, and archiving of methods, models, and data in our articles. The tools include an Interactive Plot Viewer, 3D Geometric Shape and Model Viewer, Virtual Microscope, Interactive MATLAB Figure Viewer, and Audioslides. Authors are highly encouraged to make use of these in upcoming journal submissions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Developing a flexible and verifiable integrated dose assessment capability

    International Nuclear Information System (INIS)

    Parzyck, D.C.; Rhea, T.A.; Copenhaver, E.D.; Bogard, J.S.

    1987-01-01

    A flexible yet verifiable system of computing and recording personnel doses is needed. Recent directions in statutes establish the trend of combining internal and external doses. We are developing a Health Physics Information Management System (HPIMS) that will centralize dosimetry calculations and data storage; integrate health physics records with other health-related disciplines, such as industrial hygiene, medicine, and safety; provide a more auditable system with published algorithms and clearly defined flowcharts of system operation; readily facilitate future changes dictated by new regulations, new dosimetric models, and new systems of units; and address ad-hoc inquiries regarding worker/workplace interactions, including potential synergisms with non-radiation exposures. The system is modular and provides a high degree of isolation from low-level detail, allowing flexibility for changes without adversely affecting other parts of the system. 10 refs., 3 figs

  12. Changes in the contractile state, fine structure and metabolism of cardiac muscle cells during the development of rigor mortis.

    Science.gov (United States)

    Vanderwee, M A; Humphrey, S M; Gavin, J B; Armiger, L C

    1981-01-01

    Transmural slices from the left anterior papillary muscle of dog hearts were maintained for 120 min in a moist atmosphere at 37 degrees C. At 15-min intervals tissue samples were taken for estimation of adenosine triphosphate (ATP) and glucose-6-phosphate (G6P) and for electron microscopic examination. At the same time the deformability under standard load of comparable regions of an adjacent slice of tissue was measured. ATP levels fell rapidly during the first 45 to 75 min after excision of the heart. During a subsequent further decline in ATP, the mean deformability of myocardium fell from 30 to 12% indicating the development of rigor mortis. Conversely, G6P levels increased during the first decline in adenosine triphosphate but remained relatively steady thereafter. Whereas many of the myocardial cells fixed after 5 min contracted on contact with glutaraldehyde, all cells examined after 15 to 40 min were relaxed. A progressive increase in the proportion of contracted cells was observed during the rapid increase in myocardial rigidity. During this late contraction the cells showed morphological evidence of irreversible injury. These findings suggest that ischaemic myocytes contract just before actin and myosin become strongly linked to maintain the state of rigor mortis.

  13. A Rigorous Investigation on the Ground State of the Penson-Kolb Model

    Science.gov (United States)

    Yang, Kai-Hua; Tian, Guang-Shan; Han, Ru-Qi

    2003-05-01

By using either numerical calculations or analytical methods, such as the bosonization technique, the ground state of the Penson-Kolb model has been previously studied by several groups. Some physicists argued that, as far as the existence of superconductivity in this model is concerned, it is canonically equivalent to the negative-U Hubbard model. However, others did not agree. In the present paper, we shall investigate this model by an independent and rigorous approach. We show that the ground state of the Penson-Kolb model is nondegenerate and has a nonvanishing overlap with the ground state of the negative-U Hubbard model. Furthermore, we also show that the ground states of both the models have the same good quantum numbers and may have superconducting long-range order at the same momentum q = 0. Our results support the equivalence between these models. The project was partially supported by the Special Funds for Major State Basic Research Projects (G20000365) and the National Natural Science Foundation of China under Grant No. 10174002

  14. Prevalence of Ex Vivo High On-treatment Platelet Reactivity on Antiplatelet Therapy after Transient Ischemic Attack or Ischemic Stroke on the PFA-100® and VerifyNow®.

    LENUS (Irish Health Repository)

    Kinsella, Justin A

    2012-09-12

    BACKGROUND: The prevalence of ex vivo high on-treatment platelet reactivity (HTPR) to commonly prescribed antiplatelet regimens after transient ischemic attack (TIA) or ischemic stroke is uncertain. METHODS: Platelet function inhibition was simultaneously assessed with modified light transmission aggregometry (VerifyNow; Accumetrics Inc, San Diego, CA) and with a moderately high shear stress platelet function analyzer (PFA-100; Siemens Medical Solutions USA, Inc, Malvern, PA) in a pilot, cross-sectional study of TIA or ischemic stroke patients. Patients were assessed on aspirin-dipyridamole combination therapy (n = 51) or clopidogrel monotherapy (n = 25). RESULTS: On the VerifyNow, HTPR on aspirin was identified in 4 of 51 patients (8%) on aspirin-dipyridamole combination therapy (≥550 aspirin reaction units on the aspirin cartridge). Eleven of 25 (44%) patients had HTPR on clopidogrel (≥194 P2Y12 reaction units on the P2Y12 cartridge). On the PFA-100, 21 of 51 patients (41%) on aspirin-dipyridamole combination therapy had HTPR on the collagen-epinephrine (C-EPI) cartridge. Twenty-three of 25 patients (92%) on clopidogrel had HTPR on the collagen-adenosine diphosphate (C-ADP) cartridge. The proportion of patients with antiplatelet HTPR was lower on the VerifyNow than PFA-100 in patients on both regimens (P < .001). CONCLUSIONS: The prevalence of ex vivo antiplatelet HTPR after TIA or ischemic stroke is markedly influenced by the method used to assess platelet reactivity. The PFA-100 C-ADP cartridge is not sensitive at detecting the antiplatelet effects of clopidogrel ex vivo. Larger prospective studies with the VerifyNow and with the PFA-100 C-EPI and recently released Innovance PFA P2Y cartridges (Siemens Medical Solutions USA, Inc) in addition to newer tests of platelet function are warranted to assess whether platelet function monitoring predicts clinical outcome in ischemic cerebrovascular disease.

  15. Rigorous assessment of patterning solution of metal layer in 7 nm technology node

    Science.gov (United States)

    Gao, Weimin; Ciofi, Ivan; Saad, Yves; Matagne, Philippe; Bachmann, Michael; Gillijns, Werner; Lucas, Kevin; Demmerle, Wolfgang; Schmoeller, Thomas

    2016-01-01

In a 7 nm node (N7), the logic design requires a critical poly pitch of 42 to 45 nm and a metal 1 (M1) pitch of 28 to 32 nm. Such high-pattern density pushes the 193 nm immersion lithography solution toward its limit and also brings extremely complex patterning scenarios. The N7 M1 layer may require a self-aligned quadruple patterning (SAQP) with a triple litho-etch (LE3) block process. Therefore, the whole patterning process flow requires multiple exposure+etch+deposition processes and each step introduces a particular impact on the pattern profiles and the topography. In this study, we have successfully integrated a simulation tool that enables emulation of the whole patterning flow with realistic process-dependent three-dimensional (3-D) profile and topology. We use this tool to study the patterning process variations of the N7 M1 layer including the overlay control, the critical dimension uniformity budget, and the lithographic process window (PW). The resulting 3-D pattern structure can be used to optimize the process flow, verify design rules, extract parasitics, and most importantly, simulate the electric field, and identify hot spots for dielectric reliability. As an example application, the maximum electric field at M1 tip-to-tip, which is one of the most critical patterning locations, has been simulated and extracted. The approach helps to investigate the impact of process variations on dielectric reliability. We have also assessed the alternative M1 patterning flow with a single exposure block using extreme ultraviolet lithography (EUVL) and analyzed its advantages compared to the LE3 block approach.

  16. Calling Out Cheaters: Covert Security with Public Verifiability

    DEFF Research Database (Denmark)

    Asharov, Gilad; Orlandi, Claudio

    2012-01-01

We introduce the notion of covert security with public verifiability, building on the covert security model introduced by Aumann and Lindell (TCC 2007). Protocols that satisfy covert security guarantee that the honest parties involved in the protocol will notice any cheating attempt with some constant probability ε. The idea behind the model is that the fear of being caught cheating will be enough of a deterrent to prevent any cheating attempt. However, in the basic covert security model, the honest parties are not able to persuade any third party (say, a judge) that a cheating occurred. We propose (and formally define) an extension of the model where, when an honest party detects cheating, it also receives a certificate that can be published and used to persuade other parties, without revealing any information about the honest party's input. In addition, malicious parties cannot create fake...

  17. A Formally Verified Conflict Detection Algorithm for Polynomial Trajectories

    Science.gov (United States)

    Narkawicz, Anthony; Munoz, Cesar

    2015-01-01

    In air traffic management, conflict detection algorithms are used to determine whether or not aircraft are predicted to lose horizontal and vertical separation minima within a time interval assuming a trajectory model. In the case of linear trajectories, conflict detection algorithms have been proposed that are both sound, i.e., they detect all conflicts, and complete, i.e., they do not present false alarms. In general, for arbitrary nonlinear trajectory models, it is possible to define detection algorithms that are either sound or complete, but not both. This paper considers the case of nonlinear aircraft trajectory models based on polynomial functions. In particular, it proposes a conflict detection algorithm that precisely determines whether, given a lookahead time, two aircraft flying polynomial trajectories are in conflict. That is, it has been formally verified that, assuming that the aircraft trajectories are modeled as polynomial functions, the proposed algorithm is both sound and complete.
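
    The sketch below illustrates the core idea for the horizontal case only: the squared horizontal separation minus D² is itself a polynomial in time, so a conflict exists within the lookahead time exactly when that polynomial is negative somewhere on [0, T]. The paper's formally verified algorithm decides this symbolically; this numerical version leans on floating-point root finding, and the trajectories are invented.

```python
import numpy as np

# each aircraft's x(t), y(t) as polynomial coefficients, highest degree first
ax, ay = np.poly1d([0.0, 1.0, 0.0]), np.poly1d([0.1, -1.0, 5.0])
bx, by = np.poly1d([0.0, -1.0, 10.0]), np.poly1d([0.0, 1.0, 0.0])

D, T = 5.0, 10.0                       # separation minimum and lookahead time

def in_conflict(ax, ay, bx, by, D, T):
    p = (ax - bx) ** 2 + (ay - by) ** 2 - D ** 2   # negative <=> conflict
    candidates = [0.0, T]
    candidates += [r.real for r in p.roots
                   if abs(r.imag) < 1e-9 and 0 < r.real < T]
    # sample midpoints between sign-change candidates to catch intervals
    ts = sorted(candidates)
    ts += [(u + v) / 2 for u, v in zip(ts, ts[1:])]
    return any(p(t) < 0 for t in ts)

print(in_conflict(ax, ay, bx, by, D, T))   # True: loss of separation near t = 5
```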

  18. How to analyze the supply and demand balance to predict future volatility

    International Nuclear Information System (INIS)

    Willis, K.

    2002-01-01

    The impact of electricity price volatility with a focus on supply and demand is discussed. Factors that influence volatility and the impact of volatility on market participants are also highlighted. It is this author's view that despite initial difficulties and customary inefficiencies (not unusual in a new market), the market performed as might have been expected and government interference in the market was not necessary. As far as market participants are concerned, they must be vigilant at all times because of the millisecond time horizons; an understanding of market volatility and its key drivers are absolutely essential to proper management of risks inherent in the market. An understanding of the relationship between supply and demand and volatility is important. Just as important, or even more so, is an appreciation of the delicate nature of this relationship and the dynamics behind it. Consequently, one must apply the same rigor that is used to analyze supply and demand in the electricity market to other market fundamentals such as the natural gas market, interconnections, generation fuel sources, regulatory actions and market rules. A three-step process, involving a review of history to develop the relationships, supply and demand forecasts and scenario planning, is proposed for use in analyzing supply and demand to predict future volatility. Irrespective of the sophistication of analytical tools, the most important determinant to success in the electricity market is for market players to perceive the earliest signs of a changing market landscape and respond to them with lightning speed

  19. Coupling of Rigor Mortis and Intestinal Necrosis during C. elegans Organismal Death

    Directory of Open Access Journals (Sweden)

    Evgeniy R. Galimov

    2018-03-01

Full Text Available Organismal death is a process of systemic collapse whose mechanisms are less well understood than those of cell death. We previously reported that death in C. elegans is accompanied by a calcium-propagated wave of intestinal necrosis, marked by a wave of blue autofluorescence (death fluorescence). Here, we describe another feature of organismal death, a wave of body wall muscle contraction, or death contraction (DC). This phenomenon is accompanied by a wave of intramuscular Ca2+ release and, subsequently, of intestinal necrosis. Correlation of directions of the DC and intestinal necrosis waves implies coupling of these death processes. Long-lived insulin/IGF-1-signaling mutants show reduced DC and delayed intestinal necrosis, suggesting possible resistance to organismal death. DC resembles mammalian rigor mortis, a postmortem necrosis-related process in which Ca2+ influx promotes muscle hyper-contraction. In contrast to mammals, DC is an early rather than a late event in C. elegans organismal death.

  20. Rigorous numerical modeling of scattering-type scanning near-field optical microscopy and spectroscopy

    Science.gov (United States)

    Chen, Xinzhong; Lo, Chiu Fan Bowen; Zheng, William; Hu, Hai; Dai, Qing; Liu, Mengkun

    2017-11-01

Over the last decade, scattering-type scanning near-field optical microscopy and spectroscopy have been widely used in nano-photonics and material research due to their fine spatial resolution and broad spectral range. A number of simplified analytical models have been proposed to quantitatively understand the tip-scattered near-field signal. However, a rigorous interpretation of the experimental results is still lacking at this stage. Numerical modeling, on the other hand, is mostly done by simulating the local electric field slightly above the sample surface, which only qualitatively represents the near-field signal rendered by the tip-sample interaction. In this work, we performed a more comprehensive numerical simulation which is based on realistic experimental parameters and signal extraction procedures. By directly comparing to the experiments as well as other simulation efforts, our methods offer a more accurate quantitative description of the near-field signal, paving the way for future studies of complex systems at the nanoscale.
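
    For contrast with the rigorous full-wave treatment described above, the sketch below evaluates the classic simplified point-dipole model of the tip-sample interaction (tip as a polarizable sphere over a half-space) and demodulates the effective polarizability at harmonics of the tapping frequency, as a lock-in would; all parameter values are illustrative assumptions.

```python
import numpy as np

a = 25e-9                   # tip radius (m), illustrative
A, z0 = 40e-9, 1e-9         # tapping amplitude and minimum gap (m)
eps_tip, eps_smp = -50 + 10j, 2.25 + 0j    # illustrative permittivities

alpha = 4 * np.pi * a**3 * (eps_tip - 1) / (eps_tip + 2)  # sphere polarizability
beta = (eps_smp - 1) / (eps_smp + 1)                      # surface response factor

t = np.linspace(0, 2 * np.pi, 4096, endpoint=False)
z = z0 + A * (1 + np.cos(t))                              # tip height over one cycle
alpha_eff = alpha * (1 + beta) / (1 - alpha * beta / (16 * np.pi * (z + a)**3))

spectrum = np.fft.fft(alpha_eff) / t.size                 # lock-in style demodulation
for n in (1, 2, 3):
    print(f"harmonic {n}: amplitude = {2 * abs(spectrum[n]):.3e}")
```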

  1. Noninteractive Verifiable Outsourcing Algorithm for Bilinear Pairing with Improved Checkability

    Directory of Open Access Journals (Sweden)

    Yanli Ren

    2017-01-01

    Full Text Available It is well known that the computation of bilinear pairing is the most expensive operation in pairing-based cryptography. In this paper, we propose a noninteractive verifiable outsourcing algorithm of bilinear pairing based on two servers in the one-malicious model. The outsourcer need not execute any expensive operation, such as scalar multiplication and modular exponentiation. Moreover, the outsourcer could detect any failure with a probability close to 1 if one of the servers misbehaves. Therefore, the proposed algorithm improves checkability and decreases communication cost compared with the previous ones. Finally, we utilize the proposed algorithm as a subroutine to achieve an anonymous identity-based encryption (AIBE scheme with outsourced decryption and an identity-based signature (IBS scheme with outsourced verification.

  2. Verifying reciprocal relations for experimental diffusion coefficients in multicomponent mixtures

    DEFF Research Database (Denmark)

    Medvedev, Oleg; Shapiro, Alexander

    2003-01-01

    The goal of the present study is to verify the agreement of the available data on diffusion in ternary mixtures with the theoretical requirement of linear non-equilibrium thermodynamics consisting in symmetry of the matrix of the phenomenological coefficients. A common set of measured diffusion coefficients for a three-component mixture consists of four Fickian diffusion coefficients, each being reported separately. However, the Onsager theory predicts the existence of only three independent coefficients, as one of them disappears due to the symmetry requirement. Re-calculation of the Fickian … extended sets of experimental data and reliable thermodynamic models were available. The sensitivity of the symmetry property to different thermodynamic parameters of the models was also checked. (C) 2003 Elsevier Science B.V. All rights reserved.
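
    In the standard formulation (a sketch of the usual relations, with symbols as commonly defined rather than taken from the paper), the measured Fickian matrix D is linked to the phenomenological matrix L through the matrix of thermodynamic factors Γ, and Onsager symmetry removes one degree of freedom:

        \[ \mathbf{D} = \mathbf{L}\,\boldsymbol{\Gamma}, \qquad L_{12} = L_{21}, \]

    so the four measured coefficients D11, D12, D21, D22 collapse to three independent ones once a reliable thermodynamic model supplies Γ.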

  3. Predictive value of soluble haemoglobin scavenger receptor CD163 serum levels for survival in verified tuberculosis patients

    DEFF Research Database (Denmark)

    Knudsen, T.B.; Gustafson, P.; Kronborg, G.

    2005-01-01

    Pre-treatment serum levels of sCD163 were measured in a cohort of 236 suspected tuberculosis (TB) cases from Guinea-Bissau, with a median follow-up period of 3.3 years (range 0-6.4 years). In 113 cases, the diagnosis of TB was verified by positive sputum microscopy and/or culture. Among the verified …

  4. Verifying Real-Time Systems using Explicit-time Description Methods

    Directory of Open Access Journals (Sweden)

    Hao Wang

    2009-12-01

    Full Text Available Timed model checking has been extensively researched in recent years. Many new formalisms with time extensions and tools based on them have been presented. On the other hand, Explicit-Time Description Methods aim to verify real-time systems with general untimed model checkers. Lamport presented an explicit-time description method using a clock-ticking process (Tick) to simulate the passage of time, together with a group of global variables for time requirements. This paper proposes a new explicit-time description method with no reliance on global variables. Instead, it uses rendezvous synchronization steps between the Tick process and each system process to simulate time. This new method achieves better modularity and facilitates the use of more complex timing constraints. The two explicit-time description methods are implemented in DIVINE, a well-known distributed-memory model checker. Preliminary experimental results show that our new method, with better modularity, is comparable to Lamport's method with respect to time and memory efficiency.
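
    The rendezvous idea can be pictured with a small Python sketch (an illustrative toy, not the DIVINE models from the paper): the Tick process advances global time only by synchronizing pairwise with every system process, so no shared global time variable is needed.

        # Each system process is a generator; `yield` is its rendezvous point.
        def worker(name, deadline):
            t = 0
            while t < deadline:
                t = yield                  # wait for the next tick rendezvous
                print(f"{name}: time is now {t}")

        def run(workers, horizon):
            procs = list(workers)
            for p in procs:
                next(p)                    # advance each process to its first rendezvous
            for t in range(1, horizon + 1):   # the Tick process
                for p in list(procs):
                    try:
                        p.send(t)          # pairwise rendezvous: Tick <-> process
                    except StopIteration:
                        procs.remove(p)    # process finished before this tick

        run([worker("A", 3), worker("B", 2)], horizon=4)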

  5. A scalable, self-analyzing digital locking system for use on quantum optics experiments.

    Science.gov (United States)

    Sparkes, B M; Chrzanowski, H M; Parrain, D P; Buchler, B C; Lam, P K; Symul, T

    2011-07-01

    Digital control of optics experiments has many advantages over analog control systems, specifically in terms of scalability, cost, flexibility, and the integration of system information into one location. We present a digital control system, freely available for download online, specifically designed for quantum optics experiments that allows for automatic and sequential re-locking of optical components. We show how the built-in locking analysis tools, including a white-noise network analyzer, can be used to help optimize individual locks, and we verify the long-term stability of the digital system. Finally, we present an example of the benefits of digital locking for quantum optics by applying the code to a specific experiment used to characterize optical Schrödinger cat states.
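
    A white-noise network analyzer of this kind estimates a loop's transfer function by injecting broadband noise and taking the ratio of cross- to auto-spectral density. A minimal sketch follows; the second-order low-pass filter stands in for a real optical lock, and all values are assumptions:

        import numpy as np
        from scipy import signal

        fs = 10_000                               # sample rate [Hz]
        x = np.random.randn(2**16)                # injected white-noise excitation
        b, a = signal.butter(2, 200, fs=fs)       # hypothetical plant dynamics
        y = signal.lfilter(b, a, x)               # measured loop response

        f, Pxy = signal.csd(x, y, fs=fs, nperseg=4096)
        _, Pxx = signal.welch(x, fs=fs, nperseg=4096)
        H = Pxy / Pxx                             # H1 transfer-function estimate
        print(np.abs(H[:5]))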

  6. Biclustering via optimal re-ordering of data matrices in systems biology: rigorous methods and comparative studies

    Directory of Open Access Journals (Sweden)

    Feng Xiao-Jiang

    2008-10-01

    Full Text Available Abstract Background The analysis of large-scale data sets via clustering techniques is utilized in a number of applications. Biclustering in particular has emerged as an important problem in the analysis of gene expression data since genes may only jointly respond over a subset of conditions. Biclustering algorithms also have important applications in sample classification where, for instance, tissue samples can be classified as cancerous or normal. Many of the methods for biclustering, and clustering algorithms in general, utilize simplified models or heuristic strategies for identifying the "best" grouping of elements according to some metric and cluster definition and thus result in suboptimal clusters. Results In this article, we present a rigorous approach to biclustering, OREO, which is based on the Optimal RE-Ordering of the rows and columns of a data matrix so as to globally minimize the dissimilarity metric. The physical permutations of the rows and columns of the data matrix can be modeled as either a network flow problem or a traveling salesman problem. Cluster boundaries in one dimension are used to partition and re-order the other dimensions of the corresponding submatrices to generate biclusters. The performance of OREO is tested on (a) metabolite concentration data, (b) an image reconstruction matrix, (c) synthetic data with implanted biclusters, and gene expression data for (d) colon cancer data, (e) breast cancer data, as well as (f) yeast segregant data to validate the ability of the proposed method and compare it to existing biclustering and clustering methods. Conclusion We demonstrate that this rigorous global optimization method for biclustering produces clusters with more insightful groupings of similar entities, such as genes or metabolites sharing common functions, than other clustering and biclustering algorithms and can reconstruct underlying fundamental patterns in the data for several distinct sets of data matrices arising …
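
    For intuition only, the re-ordering step can be mimicked with a greedy nearest-neighbour heuristic. OREO itself solves the ordering globally via network-flow or traveling-salesman formulations, so this toy sketch is not the paper's algorithm:

        import numpy as np

        def greedy_order(X):
            """Order rows so adjacent rows have small Euclidean dissimilarity."""
            remaining = set(range(1, len(X)))
            order = [0]
            while remaining:
                last = X[order[-1]]
                nxt = min(remaining, key=lambda j: np.linalg.norm(X[j] - last))
                order.append(nxt)
                remaining.remove(nxt)
            return order

        X = np.random.rand(8, 5)      # stand-in data matrix (e.g., expression values)
        print(greedy_order(X))        # the same idea is applied to the columns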

  7. Analyzing energy consumption of wireless networks. A model-based approach

    Energy Technology Data Exchange (ETDEWEB)

    Yue, Haidi

    2013-03-04

    During the last decades, wireless networking has continuously been a hot topic both in academia and in industry. Many different wireless networks have been introduced, such as wireless local area networks, wireless personal networks, wireless ad hoc networks, and wireless sensor networks. For these networks to remain usable over the long term, the power consumed by the wireless devices in each of them needs to be managed efficiently. Hence, a lot of effort has been devoted to the analysis and improvement of energy efficiency, either for a specific network layer (protocol) or through new cross-layer designs. In this thesis, we apply a model-based approach to the analysis of the energy consumption of different wireless protocols. The protocols under consideration are: one leader election protocol, one routing protocol, and two medium access control protocols. By a model-based approach we mean that all four protocols are formalized as formal models, more precisely, as discrete-time Markov chains (DTMCs), Markov decision processes (MDPs), or stochastic timed automata (STA). For the first two formalisms, DTMCs and MDPs, we model the protocols in PRISM, a prominent tool for probabilistic model checking, and apply model-checking techniques to analyze them. Model checking belongs to the family of formal methods. It exhaustively discovers all possible (reachable) states of a model and checks whether the model meets a given specification. Specifications are system properties that we want to study, usually expressed in some logic, for instance, probabilistic computation tree logic (PCTL). However, while model checking relies on rigorous mathematical foundations and automatically explores the entire state space of a model, its applicability is also limited by the so-called state space explosion problem: even systems of moderate size often yield models with an exponentially larger state space that thwarts their analysis. Hence, for the STA models in this thesis, since there …
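
    For readers unfamiliar with PCTL, a typical property of the kind checked in PRISM reads as follows (an illustrative formula, not one taken from the thesis):

        \[ P_{\ge 0.95}\bigl[\, \mathrm{F}^{\le 100}\ \mathit{delivered} \,\bigr] \]

    i.e., "with probability at least 0.95, a message is delivered within 100 time steps"; the model checker decides such formulas by exhaustive exploration of the DTMC or MDP state space.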

  8. 78 FR 69871 - Agency Information Collection Activities: myE-Verify, Revision of a Currently Approved Collection

    Science.gov (United States)

    2013-11-21

    ... Collection (1) Type of Information Collection: Revision of a Currently Approved Collection. (2) Title of the... respond: E-Verify Self Check--Identity Authentication 2,900,000 responses at 0.0833 hours (5 minutes) per...

  9. Transient analyzer

    International Nuclear Information System (INIS)

    Muir, M.D.

    1975-01-01

    The design and design philosophy of a high-performance, extremely versatile transient analyzer is described. This sub-system was designed to be controlled through the data acquisition computer system, which allows hands-off operation. Thus it may be placed on the experiment side of the high-voltage safety break between the experimental device and the control room. This analyzer provides control features which are extremely useful for data acquisition from PPPL diagnostics. These include dynamic sample-rate changing, which may be intermixed with multiple post-trigger operations with variable-length blocks using normal, peak-to-peak, or integrate modes. Included in the discussion are general remarks on the advantages of adding intelligence to transient analyzers, a detailed description of the characteristics of the PPPL transient analyzer, a description of the hardware, firmware, control language, and operation of the PPPL transient analyzer, and general remarks on future trends in this type of instrumentation, both at PPPL and in general.

  10. Revisiting the scientific method to improve rigor and reproducibility of immunohistochemistry in reproductive science.

    Science.gov (United States)

    Manuel, Sharrón L; Johnson, Brian W; Frevert, Charles W; Duncan, Francesca E

    2018-04-21

    Immunohistochemistry (IHC) is a robust scientific tool whereby cellular components are visualized within a tissue, and this method has been and continues to be a mainstay for many reproductive biologists. IHC is highly informative if performed and interpreted correctly, but studies have shown that the general use and reporting of appropriate controls in IHC experiments is low. This omission of the scientific method can result in data that lacks rigor and reproducibility. In this editorial, we highlight key concepts in IHC controls and describe an opportunity for our field to partner with the Histochemical Society to adopt their IHC guidelines broadly as researchers, authors, ad hoc reviewers, editorial board members, and editors-in-chief. Such cross-professional society interactions will ensure that we produce the highest quality data as new technologies emerge that still rely upon the foundations of classic histological and immunohistochemical principles.

  11. A Benchmark for Comparing Different Approaches for Specifying and Verifying Real-Time Systems

    Science.gov (United States)

    1993-01-01

    To be considered correct or useful, real-time systems must deliver results within specified time intervals, either without exception or with high probability. Recently, a large number of formal methods have been invented for specifying and verifying real-time systems. It has been suggested that these formal methods need to be tested out on actual real-time systems. Such testing will allow the scalability of the methods to be assessed and also …

  12. Analytic solution to verify code predictions of two-phase flow in a boiling water reactor core channel

    International Nuclear Information System (INIS)

    Chen, K.F.; Olson, C.A.

    1983-01-01

    One reliable method that can be used to verify the solution scheme of a computer code is to compare the code prediction to a simplified problem for which an analytic solution can be derived. An analytic solution for the axial pressure drop as a function of the flow was obtained for the simplified problem of homogeneous equilibrium two-phase flow in a vertical, heated channel with a cosine axial heat flux shape. This analytic solution was then used to verify the predictions of the CONDOR computer code, which is used to evaluate the thermal-hydraulic performance of boiling water reactors. The results show excellent agreement between the analytic solution and CONDOR prediction
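
    In the homogeneous equilibrium model, the axial pressure gradient that such a solution integrates is commonly decomposed as follows (a generic textbook form with assumed symbols, not the paper's exact closed-form result):

        \[ -\frac{dp}{dz} = \underbrace{\frac{f\,G^{2}}{2\,D_h\,\rho_m}}_{\text{friction}}
           + \underbrace{G^{2}\,\frac{d}{dz}\!\left(\frac{1}{\rho_m}\right)}_{\text{acceleration}}
           + \underbrace{\rho_m\,g}_{\text{gravity}}, \]

    where G is the mass flux, ρ_m the mixture density, D_h the hydraulic diameter, and f the friction factor; the cosine heat-flux shape enters through the axial variation of ρ_m.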

  13. Orientation of English Medieval Parish Churches

    Science.gov (United States)

    Hoare, Peter G.

    Our understanding of the alignment of English medieval parish churches, after more than three centuries of research, is far from complete. The arrangement of relatively few structures has been explained beyond reasonable doubt, and tests of the overwhelmingly popular festival orientation theory are often insufficiently rigorous to provide convincing answers. Much work remains to be done, including verifying and analyzing some of the existing raw data, determining whether the present church was dedicated at the time of construction, examining wills for evidence of early dedications, measuring the effect of eastern horizons on sunrise azimuths, and consulting excavation reports to assess whether earlier buildings may have influenced the arrangement of those churches that replaced them.
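
    The astronomical test behind the festival orientation theory rests on a standard relation: for a level horizon, the sunrise azimuth A (measured from north) at latitude φ for solar declination δ satisfies

        \[ \cos A = \frac{\sin\delta}{\cos\phi}, \]

    and a raised eastern horizon of altitude h shifts it to cos A = (sin δ − sin h sin φ)/(cos h cos φ), which is why measuring horizon profiles matters for rigorous tests.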

  14. 13 CFR 127.404 - What happens if SBA is unable to verify a concern's eligibility?

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false What happens if SBA is unable to verify a concern's eligibility? 127.404 Section 127.404 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION WOMEN-OWNED SMALL BUSINESS FEDERAL CONTRACT ASSISTANCE PROCEDURES Eligibility Examinations § 127...

  15. Verifying operator fitness - an imperative not an option

    International Nuclear Information System (INIS)

    Scott, A.B. Jr.

    1987-01-01

    In the early morning hours of April 26, 1986, whatever credence those who operate nuclear power plants around the world could then muster suffered a jarring reversal. Through an incredible series of personal errors, the operators at what was later termed one of the best-operated plants in the USSR systematically stripped away the physical and procedural safeguards inherent to their installation and precipitated the worst reactor accident the world has yet seen. This challenge to the adequacy of nuclear operators comes at a time when many companies throughout the world, not only those involved in nuclear power, are grappling with the problem of how to assure the fitness for duty of those in their employ, specifically users of substances that impair the ability to function safely and productively in the workplace. In actuality, operator fitness for duty is far more than the absence of impairment from substance abuse, which many today consider it to be. Full fitness for duty implies mental and moral fitness as well, and physical fitness in a more general sense. If we are to earn the confidence of the public, credible ways to verify total fitness on an operator-by-operator basis must be considered.

  16. [The development and evaluation of software to verify diagnostic accuracy].

    Science.gov (United States)

    Jensen, Rodrigo; de Moraes Lopes, Maria Helena Baena; Silveira, Paulo Sérgio Panse; Ortega, Neli Regina Siqueira

    2012-02-01

    This article describes the development and evaluation of software that verifies the accuracy of diagnoses made by nursing students. The software was based on a model that uses fuzzy logic concepts and was implemented with Perl and a MySQL database for Internet accessibility, using the NANDA-I 2007-2008 classification system. The software was evaluated in terms of its technical quality and usability through specific instruments. The activity proposed in the software involves four stages in which students establish the relationship values between nursing diagnoses, defining characteristics/risk factors, and clinical cases. The relationship values determined by students are compared to those of specialists, generating performance scores for the students. In the evaluation, the software demonstrated satisfactory outcomes regarding technical quality and, according to the students, helped in their learning and may become an educational tool to teach the process of nursing diagnosis.
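
    The scoring step can be pictured with a tiny sketch (a hypothetical scoring rule; the paper's fuzzy model is more elaborate): a student's relationship values are compared with the specialists' values, and closeness maps to a 0-1 performance score.

        def performance_score(student, expert):
            """Score 1.0 means perfect agreement with the specialists."""
            diffs = [abs(s - e) for s, e in zip(student, expert)]
            return 1 - sum(diffs) / len(diffs)

        student = [0.8, 0.2, 0.6]    # student's relationship values (illustrative)
        expert = [0.9, 0.1, 0.7]     # specialists' consensus values (illustrative)
        print(f"score = {performance_score(student, expert):.2f}")   # score = 0.90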

  17. Modelling and Verifying Communication Failure of Hybrid Systems in HCSP

    DEFF Research Database (Denmark)

    Wang, Shuling; Nielson, Flemming; Nielson, Hanne Riis

    2016-01-01

    Hybrid systems are dynamic systems with interacting discrete computation and continuous physical processes. They have become ubiquitous in our daily life, e.g. automotive, aerospace and medical systems, and in particular, many of them are safety-critical. For a safety-critical hybrid system, in the presence of communication failure, the expected control from the controller will get lost and as a consequence the physical process cannot behave as expected. In this paper, we mainly consider the communication failure caused by the non-engagement of one party in communication action, i.e. the communication itself fails to occur. To address this issue, this paper proposes a formal framework by extending HCSP, a formal modeling language for hybrid systems, for modeling and verifying hybrid systems in the absence of receiving messages due to communication failure. We present two inference systems …

  18. Rigorous Mathematical Thinking Approach to Enhance Students’ Mathematical Creative and Critical Thinking Abilities

    Science.gov (United States)

    Hidayat, D.; Nurlaelah, E.; Dahlan, J. A.

    2017-09-01

    Mathematical creative and critical thinking are two abilities that need to be developed in the learning of mathematics; therefore, effort is needed to design learning capable of developing both. The purpose of this research is to examine the mathematical creative and critical thinking abilities of students taught with the rigorous mathematical thinking (RMT) approach and of students taught with an expository approach. This research was a quasi-experiment with a control-group pretest-posttest design. The population was all 11th-grade students in one senior high school in Bandung. The results showed that the achievement of mathematical creative and critical thinking abilities was better for students who received RMT than for students who received the expository approach. The use of psychological tools and mediation, with the criteria of intentionality, reciprocity, and mediation of meaning, in RMT helps students develop the conditions for critical and creative processes. This achievement contributes to the development of an integrated learning design for students' critical and creative thinking processes.

  19. Coupling of Rigor Mortis and Intestinal Necrosis during C. elegans Organismal Death.

    Science.gov (United States)

    Galimov, Evgeniy R; Pryor, Rosina E; Poole, Sarah E; Benedetto, Alexandre; Pincus, Zachary; Gems, David

    2018-03-06

    Organismal death is a process of systemic collapse whose mechanisms are less well understood than those of cell death. We previously reported that death in C. elegans is accompanied by a calcium-propagated wave of intestinal necrosis, marked by a wave of blue autofluorescence (death fluorescence). Here, we describe another feature of organismal death, a wave of body wall muscle contraction, or death contraction (DC). This phenomenon is accompanied by a wave of intramuscular Ca2+ release and, subsequently, of intestinal necrosis. Correlation of directions of the DC and intestinal necrosis waves implies coupling of these death processes. Long-lived insulin/IGF-1-signaling mutants show reduced DC and delayed intestinal necrosis, suggesting possible resistance to organismal death. DC resembles mammalian rigor mortis, a postmortem necrosis-related process in which Ca2+ influx promotes muscle hyper-contraction. In contrast to mammals, DC is an early rather than a late event in C. elegans organismal death. VIDEO ABSTRACT. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.

  20. Control group design: enhancing rigor in research of mind-body therapies for depression.

    Science.gov (United States)

    Kinser, Patricia Anne; Robins, Jo Lynne

    2013-01-01

    Although a growing body of research suggests that mind-body therapies may be appropriate to integrate into the treatment of depression, studies consistently lack methodological sophistication particularly in the area of control groups. In order to better understand the relationship between control group selection and methodological rigor, we provide a brief review of the literature on control group design in yoga and tai chi studies for depression, and we discuss challenges we have faced in the design of control groups for our recent clinical trials of these mind-body complementary therapies for women with depression. To address the multiple challenges of research about mind-body therapies, we suggest that researchers should consider 4 key questions: whether the study design matches the research question; whether the control group addresses performance, expectation, and detection bias; whether the control group is ethical, feasible, and attractive; and whether the control group is designed to adequately control for nonspecific intervention effects. Based on these questions, we provide specific recommendations about control group design with the goal of minimizing bias and maximizing validity in future research.

  1. Measurement and characterization of slippage and slip-law using a rigorous analysis in dynamics of oscillating rheometer: Newtonian fluid

    Science.gov (United States)

    Azese, Martin Ndi

    2018-02-01

    This article presents a rigorous calculation involving velocity slip of a Newtonian fluid, in which we analyze and solve the unsteady Navier-Stokes equation with emphasis on its rheological implications. The goal is to model a simple yet effective non-invasive way of quantifying and characterizing slippage, in contrast with previous techniques whose inherent limitation is that injecting foreign objects usually alters the flow. The problem is built on a Couette rheological flow system in which micro-Newton forces and micro-stresses are captured and processed to obtain wall slip. Our model leads to a linear partial differential equation, and enforcing linear Navier slip boundary conditions (BCs) yields inhomogeneous and unsteady "Robin-type" BCs. A dimensional analysis reveals the salient dimensionless parameters, the Roshko, Strouhal, and Reynolds numbers, while highlighting the slip-numbers arising from the BCs. We also solve the slip-free case to corroborate and validate our results. Several graphs are generated showing slip effects, in particular how the slip-numbers, a key input, differentiate themselves in the outputs. We confirm this graphically by presenting the flow profile across the channel width and the velocity and stress at both walls. A perturbation scheme is introduced to calculate the long-time behavior when the system runs for a long time. Finally, we justify the existence of a reverse mechanism, in which an inverse transformation such as the Fourier transform uses the output data to retrieve the slip-numbers and the slip law, thus quantifying and characterizing slip. We thereby substantiate our analysis and demonstrate the realizability of the proposed measurement and characterization.
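
    The linear Navier slip condition referred to here takes the standard form (symbols assumed: b is the slip length, y the wall-normal coordinate):

        \[ u_{\text{fluid}} - u_{\text{wall}} = b\,\left.\frac{\partial u}{\partial y}\right|_{\text{wall}}, \]

    and non-dimensionalizing b by the gap width yields the slip-numbers that the analysis tracks from input to output.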

  2. Verifying SeVeCom Using Set-based Abstraction

    DEFF Research Database (Denmark)

    Mödersheim, Sebastian Alexander; Modesti, Paolo

    We formally analyze the Secure Vehicle Communication system developed by the EU-project SeVeCom, using the AIF framework which is based on a novel set-abstraction technique. The model involves the hardware security modules (HSMs) of a number of cars, a certification authority, and the protocols executed between them. Each participant stores a database of keys that can be added or deleted depending on the different operations. The AIF-framework allows us to model and automatically analyze such databases without bounding the number of steps that the system can make and, in contrast to previous …

  3. Verifying SeVeCom Using Set-based Abstraction

    DEFF Research Database (Denmark)

    Mödersheim, Sebastian Alexander; Modesti, Paolo

    2011-01-01

    We formally analyze the Secure Vehicle Communication system developed by the EU-project SeVeCom, using the AIF framework which is based on a novel set-abstraction technique. The model involves the hardware security modules (HSMs) of a number of cars, a certification authority, and the protocols executed between them. Each participant stores a database of keys that can be added or deleted depending on the different operations. The AIF-framework allows us to model and automatically analyze such databases without bounding the number of steps that the system can make and, in contrast to previous …

  4. AUTOMATIC ESTIMATION OF SIZE PARAMETERS USING VERIFIED COMPUTERIZED STEREOANALYSIS

    Directory of Open Access Journals (Sweden)

    Peter R Mouton

    2011-05-01

    Full Text Available State-of-the-art computerized stereology systems combine high-resolution video microscopy and hardware-software integration with stereological methods to assist users in quantifying multidimensional parameters of importance to biomedical research, including volume, surface area, length, number, their variation and spatial distribution. The requirement for constant interactions between a trained, non-expert user and the targeted features of interest currently limits the throughput efficiency of these systems. To address this issue we developed a novel approach for automatic stereological analysis of 2-D images, Verified Computerized Stereoanalysis (VCS). The VCS approach minimizes the need for user interactions with high-contrast [high signal-to-noise ratio (S:N)] biological objects of interest. Performance testing of the VCS approach confirmed dramatic increases in the efficiency of total object volume (size) estimation, without a loss of accuracy or precision compared to conventional computerized stereology. The broad application of high-efficiency VCS to high-contrast biological objects on tissue sections could reduce labor costs, enhance hypothesis testing, and accelerate the progress of biomedical research focused on improvements in health and the management of disease.

  5. A Rigorous Theory of Many-Body Prethermalization for Periodically Driven and Closed Quantum Systems

    Science.gov (United States)

    Abanin, Dmitry; De Roeck, Wojciech; Ho, Wen Wei; Huveneers, François

    2017-09-01

    Prethermalization refers to the transient phenomenon where a system thermalizes according to a Hamiltonian that is not the generator of its evolution. We provide here a rigorous framework for quantum spin systems where prethermalization is exhibited for very long times. First, we consider quantum spin systems under periodic driving at high frequency $\nu$. We prove that up to a quasi-exponential time $\tau_* \sim e^{c\,\nu/\log^3 \nu}$, the system barely absorbs energy. Instead, there is an effective local Hamiltonian $\widehat{D}$ that governs the time evolution up to $\tau_*$, and hence this effective Hamiltonian is a conserved quantity up to $\tau_*$. Next, we consider systems without driving, but with a separation of energy scales in the Hamiltonian. A prime example is the Fermi-Hubbard model where the interaction $U$ is much larger than the hopping $J$. Also here we prove the emergence of an effective conserved quantity, different from the Hamiltonian, up to a time $\tau_*$ that is (almost) exponential in $U/J$.

  6. Effects of Chilling and Partial Freezing on Rigor Mortis Changes of Bighead Carp (Aristichthys nobilis) Fillets: Cathepsin Activity, Protein Degradation and Microstructure of Myofibrils.

    Science.gov (United States)

    Lu, Han; Liu, Xiaochang; Zhang, Yuemei; Wang, Hang; Luo, Yongkang

    2015-12-01

    To investigate the effects of chilling and partial freezing on rigor mortis changes in bighead carp (Aristichthys nobilis), pH, cathepsin B, cathepsin B+L activities, SDS-PAGE of sarcoplasmic and myofibrillar proteins, texture, and changes in microstructure of fillets at 4 °C and -3 °C were determined at 0, 2, 4, 8, 12, 24, 48, and 72 h after slaughter. The results indicated that pH of fillets (6.50 to 6.80) was appropriate for cathepsin function during the rigor mortis. For fillets that were chilled and partially frozen, the cathepsin activity in lysosome increased consistently during the first 12 h, followed by a decrease from the 12 to 24 h, which paralleled an increase in activity in heavy mitochondria, myofibrils and sarcoplasm. There was no significant difference in cathepsin activity in lysosomes between fillets at 4 °C and -3 °C (P > 0.05). Partially frozen fillets had greater cathepsin activity in heavy mitochondria than chilled samples from the 48 to 72 h. In addition, partially frozen fillets showed higher cathepsin activity in sarcoplasm and lower cathepsin activity in myofibrils compared with chilled fillets. Correspondingly, we observed degradation of α-actinin (105 kDa) by cathepsin L in chilled fillets and degradation of creatine kinase (41 kDa) by cathepsin B in partially frozen fillets during the rigor mortis. The decline of hardness for both fillets might be attributed to the accumulation of cathepsin in myofibrils from the 8 to 24 h. The lower cathepsin activity in myofibrils for fillets that were partially frozen might induce a more intact cytoskeletal structure than fillets that were chilled. © 2015 Institute of Food Technologists®

  7. Limitation of degree information for analyzing the interaction evolution in online social networks

    Science.gov (United States)

    Shang, Ke-Ke; Yan, Wei-Sheng; Xu, Xiao-Ke

    2014-04-01

    Previously, many studies on online social networks simply analyzed the static topology, in which a friend relationship, once established, never disappears; this kind of static topology may not accurately reflect temporal interactions on online social services. In this study, we define four types of users and interactions in the interaction (dynamic) network. We found that active, disappeared, new, and super nodes (users) have obviously different strength distribution properties, and this result can also be revealed by the degree characteristics of the unweighted interaction and friendship (static) networks. However, the active, disappeared, new, and super links (interactions) can only be reflected by the strength distribution in the weighted interaction network. This result indicates the limitation of static topology data for analyzing social network evolution. In addition, our study uncovers approximately stable statistics for the dynamic social network, in which there is large variation in users and interaction intensity. Our findings not only verify the correctness of our definitions, but also help to study customer churn and to evaluate the commercial value of valuable customers in online social networks.

  8. Preparation of Input Deck to analyze the Nuclear Power Plant for the Use of Regulatory Verification

    International Nuclear Information System (INIS)

    Kang, Doo Hyuk; Kim, Hyung Seok; Suh, Jae Seung; Ahn, Seung Hoon

    2009-01-01

    The objectives of this paper are to prepare an input deck for analyzing a nuclear power plant for regulatory verification and to produce its calculation note. We have maintained the input decks of the thermal-hydraulic safety codes used in existing domestic reactors to ensure independent and accurate regulatory verification of the thermal-hydraulic safety analyses of domestic NPPs. This work consists of two main steps: the first is to compare the existing input deck with the calculation note in order to verify their consistency; the next is to model the reactor pressure vessel in three dimensions using a MULTID component instead of the 1D representation in the existing input deck.

  9. Medline Plus

    Full Text Available A.D.A.M., Inc. is accredited by URAC … an independent audit to verify that A.D.A.M. follows rigorous standards of quality and …

  10. Vapor-Phase Infrared Absorptivity Coefficient of Cyclohexyl Isothiocyanate

    National Research Council Canada - National Science Library

    Samuels, Alan C; Miles, Jr., Ronald W; Williams, Barry R; Hulet, Melissa S

    2008-01-01

    ...)) at a spectral resolution of 0.125 cm(-1). The chemical used in the feedstock was subjected to a rigorous analysis by gas chromatography-mass spectrometry, nuclear magnetic resonance, and Karl Fischer titration to verify its purity...

  11. Meat quality and rigor mortis development in broiler chickens with gas-induced anoxia and postmortem electrical stimulation.

    Science.gov (United States)

    Sams, A R; Dzuik, C S

    1999-10-01

    This study was conducted to evaluate the combined rigor-accelerating effects of postmortem electrical stimulation (ES) and argon-induced anoxia (Ar) in broiler chickens. One hundred broilers were processed in the following treatments: untreated controls, ES, Ar, or Ar with ES (Ar + ES). Breast fillets were harvested at 1 h postmortem for all treatments or at 1 and 6 h postmortem for the control carcasses. Fillets were sampled for pH and ratio of inosine to adenosine (R-value) and were then individually quick frozen (IQF) or aged on ice (AOI) until 24 h postmortem. Color was measured in the AOI fillets at 24 h postmortem. All fillets were then cooked and evaluated for Allo-Kramer shear value. The Ar treatment accelerated the normal pH decline, whereas the ES and Ar + ES treatments yielded even lower pH values at 1 h postmortem. The Ar + ES treatment had a greater R-value than the ES treatment, which was greater than either the Ar or 1-h controls, which, in turn, were not different from each other. The ES treatment had the lowest L* value, and ES, Ar, and Ar + ES produced significantly higher a* values than the 1-h controls. For the IQF fillets, the ES and Ar + ES treatments were not different in shear value but were lower than Ar, which was lower than the 1-h controls. The same was true for the AOI fillets except that the ES and Ar treatments were not different. These results indicated that although ES and Ar had rigor-accelerating and tenderizing effects, ES seemed to be more effective than Ar; there was little enhancement when Ar was added to the ES treatment and fillets were deboned at 1 h postmortem.

  12. Verified Representations of Landau's "Grundlagen" in the lambda-delta Family and in the Calculus of Constructions

    Directory of Open Access Journals (Sweden)

    Ferruccio Guidi

    2016-01-01

    Full Text Available Landau's "Grundlagen der Analysis", formalized in the language Aut-QE, represents an early milestone in computer-checked mathematics and is the only non-trivial development finalized in the languages of the Automath family. Here we discuss an implemented procedure producing a faithful representation of the Grundlagen in the Calculus of Constructions, verified by the proof assistant Coq 8.4.3. The point at issue is distinguishing lambda-abstractions from pi-abstractions where the original text uses Automath unified binders, taking care of the cases in which a binder corresponds to both abstractions at one time. It is a fact that some binders can be disambiguated only by verifying the Grundlagen in a calculus accepting both Aut-QE and the Calculus of Constructions. To this end, we rely on lambda-delta version 3, a system that the author proposes here for the first time.

  13. Verifying detailed fluctuation relations for discrete feedback-controlled quantum dynamics

    Science.gov (United States)

    Camati, Patrice A.; Serra, Roberto M.

    2018-04-01

    Discrete quantum feedback control consists of a managed dynamics according to the information acquired by a previous measurement. Energy fluctuations along such dynamics satisfy generalized fluctuation relations, which are useful tools to study the thermodynamics of systems far away from equilibrium. Due to the practical challenge to assess energy fluctuations in the quantum scenario, the experimental verification of detailed fluctuation relations in the presence of feedback control remains elusive. We present a feasible method to experimentally verify detailed fluctuation relations for discrete feedback control quantum dynamics. Two detailed fluctuation relations are developed and employed. The method is based on a quantum interferometric strategy that allows the verification of fluctuation relations in the presence of feedback control. An analytical example to illustrate the applicability of the method is discussed. The comprehensive technique introduced here can be experimentally implemented at a microscale with the current technology in a variety of experimental platforms.

  14. Analyzing three-dimensional position of region of interest using an image of contrast media using unilateral X-ray exposure

    International Nuclear Information System (INIS)

    Harauchi, Hajime; Gotou, Hiroshi; Tanooka, Masao

    1994-01-01

    Analyzing the three-dimensional internal structure of an object in an X-ray study is usually performed using two or more incident X-ray directions. In this report, we analyzed the three-dimensional position of tubes in a phantom using contrast media and imaging from a single direction in the X-ray study. The iodine concentration in the contrast media can be determined from the log-subtraction image of a single incident X-ray direction, and the diameter of a tube filled with contrast media can be calculated from the iodine concentration. We can therefore determine the three-dimensional position of the tubes geometrically from the tube diameter and the values measured on the film. We verified this method by an experiment consistent with the theory. (author)
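
    The underlying attenuation step is of Beer-Lambert type (a schematic relation with assumed symbols, not the authors' full geometric derivation): the log-subtraction signal gives the iodine path length,

        \[ t_I = \frac{1}{\mu_I}\,\ln\frac{I_0}{I}, \]

    where μ_I is the effective attenuation coefficient of the contrast medium; equating t_I with the true tube diameter then fixes the tube's magnification on the film, and hence its depth along the beam axis.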

  15. Modeling and Analyzing Real-Time Multiprocessor Systems

    NARCIS (Netherlands)

    Wiggers, M.H.; Thiele, Lothar; Lee, Edward A.; Schlieker, Simon; Bekooij, Marco Jan Gerrit

    2010-01-01

    Researchers have proposed approaches to verify that real-time multiprocessor systems meet their timeliness constraints. These approaches make assumptions on the model of computation, the load placed on the multiprocessor system, and the faults that can arise. This heterogeneous set of assumptions …

  16. Characterization of the rigor mortis process in horse muscles and meat tenderness

    Directory of Open Access Journals (Sweden)

    Tatiana Pacheco Rodrigues

    2004-08-01

    Full Text Available This work studied 12 horses of different ages slaughtered in a slaughterhouse in Minas Gerais State, Brazil (SIF 1803), and evaluated temperature, pH, and sarcomere length at different times after slaughter (1 h, 5 h, 8 h, 10 h, 12 h, 15 h, and 24 h), as well as the shear force (meat tenderness) of the Longissimus dorsi and Semitendinosus muscles, aiming at characterizing the onset of rigor mortis in the meat during industrial processing. The chill-room temperature varied from 10.2°C to 4.0°C, and the mean initial carcass temperature was 35.32°C and the final one 4.15°C. The mean initial pH of the Longissimus dorsi was 6.49 and the final one 5.63; the mean initial pH of the Semitendinosus was 6.44 and the final one 5.70. The smallest sarcomere length observed in both muscles occurred at 15 hours postmortem, 1.44 µm and 1.41 µm, respectively. The meat from adult horses was tougher than that from young ones (p < 0.05), and the Semitendinosus muscle was tougher than the Longissimus dorsi muscle.

  17. The effect of different methods and image analyzers on the results of the in vivo comet assay.

    Science.gov (United States)

    Kyoya, Takahiro; Iwamoto, Rika; Shimanura, Yuko; Terada, Megumi; Masuda, Shuichi

    2018-01-01

    The in vivo comet assay is a widely used genotoxicity test that can detect DNA damage in a range of organs. It is included in the Organisation for Economic Co-operation and Development Guidelines for the Testing of Chemicals. However, various protocols are still used for this assay, and several different image analyzers are used routinely to evaluate the results. Here, we verified a protocol that largely contributes to the equivalence of results, and we assessed the effect on the results when slides made from the same sample were analyzed using two different image analyzers (Comet Assay IV vs Comet Analyzer). Standardizing the agarose concentrations and DNA unwinding and electrophoresis times had a large impact on the equivalence of the results between the different methods used for the in vivo comet assay. In addition, there was some variation in the sensitivity of the two different image analyzers tested; however this variation was considered to be minor and became negligible when the test conditions were standardized between the two different methods. By standardizing the concentrations of low melting agarose and DNA unwinding and electrophoresis times between both methods used in the current study, the sensitivity to detect the genotoxicity of a positive control substance in the in vivo comet assay became generally comparable, independently of the image analyzer used. However, there may still be the possibility that other conditions, except for the three described here, could affect the reproducibility of the in vivo comet assay.

  18. Importance of All-in-one (MCNPX2.7.0+CINDER2008) Code for Rigorous Transmutation Study

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Oyeon [Institute for Modeling and Simulation Convergence, Daegu (Korea, Republic of); Kim, Kwanghyun [RadTek Co. Ltd., Daejeon (Korea, Republic of)

    2015-10-15

    Transmutation can be utilized as a possible mechanism for reducing the volume and hazard of radioactive waste by transforming hazardous radioactive elements with long half-lives into less hazardous elements with short half-lives. Thus, an understanding of the transmutation mechanism and of beneficial machinery design technologies is important and useful. Although the term transmutation is rooted in alchemy, which sought to transform base metals into gold in the Middle Ages, Rutherford and Soddy were the first to observe it, discovering natural transmutation as a part of alpha-type radioactive decay in the early 20th century. Along with the development of computing technology, analysis software such as CINDER was developed for rigorous atomic transmutation studies. The code has a long history of development from the original work of T. England at Bettis Atomic Power Laboratory (BAPL) in the early 1960s. It has been used to calculate the inventory of nuclides in an irradiated material. CINDER'90, released more recently, involved an upgrade of the code to allow the spontaneous tracking of chains based upon the significant density or pass-by of a nuclide, where pass-by represents the density of a nuclide transforming to other nuclides. The nuclear transmutation process is governed by highly non-linear differential equations. The chaotic nature of non-linear equations bespeaks the importance of accurate input data (i.e., the number of significant digits). Thus, reducing human interrogation is very important for rigorous transmutation studies, and an 'all-in-one' code structure is desired. Note that the non-linear characteristic of the transmutation equation caused by flux changes due to number-density changes during a given time interval (an intrinsic physical phenomenon) is not considered in this study; we only emphasize the effects of human interrogation in the computing process of solving non-linear differential equations, as shown in …
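
    The system such inventory codes solve is the coupled depletion (Bateman-type) equations (a generic form with assumed symbols):

        \[ \frac{dN_i}{dt} = \sum_{j \ne i}\bigl(\lambda_{j\to i} + \sigma_{j\to i}\,\phi\bigr)N_j
           - \bigl(\lambda_i + \sigma_{a,i}\,\phi\bigr)N_i, \]

    where N_i are nuclide densities, λ decay constants, σ cross sections, and φ the flux. The non-linearity mentioned above arises when φ itself depends on the evolving N_i, and small round-off in the input propagates strongly through the stiff, chained solution, which is the argument for an all-in-one code.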

  19. RIGOROUS GEOREFERENCING OF ALSAT-2A PANCHROMATIC AND MULTISPECTRAL IMAGERY

    Directory of Open Access Journals (Sweden)

    I. Boukerch

    2013-04-01

    Full Text Available The exploitation of the full geometric capabilities of high-resolution satellite imagery (HRSI) requires the development of an appropriate sensor orientation model. Several authors have studied this problem; generally there are two categories of geometric models: physical and empirical. Based on the analysis of the metadata provided with ALSAT-2A, a rigorous pushbroom camera model can be developed. This type of model has been successfully applied to many very-high-resolution imagery systems. The relation between image and ground coordinates, given by the time-dependent collinearity condition involving several coordinate systems, has been tested. The interior orientation parameters must be integrated into the model; these parameters can be estimated from the viewing angles corresponding to the pointing directions of any detector, values derived from cubic polynomials provided in the metadata. The developed model integrates all the necessary elements, with 33 unknowns. Approximate values for all 33 unknown parameters may be derived from the information contained in the metadata files provided with the imagery technical specifications, or they are simply fixed to zero; the condition equation is then linearized and solved using SVD in a least-squares sense in order to correct the initial values, using a suitable number of well-distributed GCPs. Using ALSAT-2A images over the town of Toulouse in the southwest of France, three experiments were carried out. The first concerns 2D accuracy analysis using several sets of parameters; the second concerns GCP number and distribution; the third concerns georeferencing the multispectral image by applying the model calculated from the panchromatic image.
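
    Schematically, the time-dependent collinearity condition at the heart of such a pushbroom model can be written as follows (a generic form; the actual ALSAT-2A parameterization follows the metadata conventions):

        \[ \mathbf{r}_G = \mathbf{r}_S(t) + \mu\,\mathbf{R}(t)\,\mathbf{u}(\psi_x,\psi_y), \]

    where r_G is the ground point, r_S(t) the satellite position at the line acquisition time, R(t) the combined orbital/attitude rotation, u the look direction built from the detector viewing angles ψ_x and ψ_y, and μ an unknown scale; linearizing this condition over the GCPs yields the least-squares system solved by SVD.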

  20. Rigorous derivation of the mean-field green functions of the two-band Hubbard model of superconductivity

    International Nuclear Information System (INIS)

    Adam, G.; Adam, S.

    2007-01-01

    The Green function (GF) equation of motion technique for solving the effective two-band Hubbard model of high-Tc superconductivity in cuprates rests on the Hubbard operator (HO) algebra. We show that, if we take into account the invariance to translations and spin reversal, the HO algebra results in invariance properties of several specific correlation functions. The use of these properties allows rigorous derivation and simplification of the expressions of the frequency matrix (FM) and of the generalized mean-field approximation (GMFA) Green functions (GFs) of the model. For the normal singlet hopping and anomalous exchange pairing correlation functions which enter the FM and GMFA-GFs, the use of spectral representations allows the identification and elimination of exponentially small quantities. This procedure secures the reduction of the correlation order to the GMFA-GF expressions.

  1. Use of a theoretical equation of state to interpret time-dependent free volume in polymer glasses

    International Nuclear Information System (INIS)

    Curro, J.G.; Lagasse, R.R.; Simha, R.

    1981-01-01

    Many physical properties of polymer glasses change spontaneously during isothermal aging by a process commonly modeled as collapse of free volume. The model has not been verified rigorously because free volume cannot be unambiguously measured. In the present investigation we tentatively identify the free-volume fraction with the fraction of empty sites in the equation of state of Simha and Somcynsky. With this theory, volume recovery measurements can be analyzed to yield directly the time-dependent free-volume fraction. Using this approach, recent volume measurements on poly(methyl methacrylate) are analyzed. The resulting free-volume fractions are then used in the Doolittle equation to predict the shift in stress relaxation curves at 23 °C. These predicted shift factors agree with the experimental measurements of Cizmecioglu et al. In addition, it is shown that previous assumptions concerning the temperature dependence of free volume are inconsistent with the theory.

  2. Standard test method for verifying the alignment of X-Ray diffraction instrumentation for residual stress measurement

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This test method covers the preparation and use of a flat stress-free test specimen for the purpose of checking the systematic error caused by instrument misalignment or sample positioning in X-ray diffraction residual stress measurement, or both. 1.2 This test method is applicable to apparatus intended for X-ray diffraction macroscopic residual stress measurement in polycrystalline samples employing measurement of a diffraction peak position in the high-back reflection region, and in which the θ, 2θ, and ψ rotation axes can be made to coincide (see Fig. 1). 1.3 This test method describes the use of iron powder which has been investigated in round-robin studies for the purpose of verifying the alignment of instrumentation intended for stress measurement in ferritic or martensitic steels. To verify instrument alignment prior to stress measurement in other metallic alloys and ceramics, powder having the same or lower diffraction angle as the material to be measured should be prepared in similar fashion...
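
    The alignment check exploits the sin²ψ relation of X-ray stress analysis (standard form, symbols assumed):

        \[ \varepsilon_{\phi\psi} = \frac{1+\nu}{E}\,\sigma_{\phi}\,\sin^{2}\psi
           - \frac{\nu}{E}\,(\sigma_{1}+\sigma_{2}), \]

    so a stress-free powder must give a flat d-spacing versus sin²ψ plot; any residual slope measured on the reference specimen quantifies the systematic error introduced by misalignment or sample positioning.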

  3. Rigorous control of logarithmic corrections in four-dimensional phi4 spin systems. II. Critical behavior of susceptibility and correlation length

    International Nuclear Information System (INIS)

    Hara, T.; Tasaki, H.

    1987-01-01

    Continuing the analysis started in Part I of this work, they investigate critical phenomena in weakly coupled $\varphi^4$ spin systems in four dimensions. Concerning the critical behavior of the susceptibility and the correlation length (in the high-temperature phase), the existence of logarithmic corrections to their mean-field-type behavior is rigorously shown (i.e., they prove $\chi(t) \sim t^{-1}|\ln t|^{1/3}$, $\xi(t) \sim t^{-1/2}|\ln t|^{1/6}$).

  4. Construct a procedure to verify radiation protection for apparatus of industrial gamma radiography

    International Nuclear Information System (INIS)

    Nghiem Xuan Long; Trinh Dinh Truong; Dinh Chi Hung; Le Ngoc Hieu

    2013-01-01

    Apparatus for industrial gamma radiography includes an exposure container, a source guide tube, a remote-control hand-crank assembly, and other attached equipment. It is widely used in the inspection and evaluation of construction projects. In Vietnam, there are now more than 50 companies in the radiography field, and more than 100 apparatus are in use on site. Therefore, their verification and evaluation are necessary and important. This project constructs a procedure to verify radiation protection for apparatus in industrial gamma radiography for application in Vietnam. (author)

  5. Verifying Elimination Programs with a Special Emphasis on Cysticercosis Endpoints and Postelimination Surveillance

    Directory of Open Access Journals (Sweden)

    Sukwan Handali

    2012-01-01

    Full Text Available Methods are needed for determining program endpoints or postprogram surveillance for any elimination program. Cysticercosis has the necessary effective strategies and diagnostic tools for establishing an elimination program; however, tools to verify program endpoints have not been determined. Using a statistical approach, the present study proposed that taeniasis and porcine cysticercosis antibody assays could be used to determine with high statistical confidence whether an area is free of disease. Confidence would be improved by using secondary tests such as the taeniasis coproantigen assay and necropsy of sentinel pigs.
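
    One standard formulation of this statistical argument (a textbook freedom-from-disease relation with assumed symbols, not necessarily the paper's exact model) gives the sample size n needed to declare an area free with confidence C when the design prevalence is p and the test sensitivity is Se:

        \[ n \ge \frac{\ln(1-C)}{\ln(1 - p\,\mathrm{Se})}. \]

    For example, C = 0.95, p = 0.01, and Se = 0.9 give n ≈ 332 animals.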

  6. Estimation of the convergence order of rigorous coupled-wave analysis for OCD metrology

    Science.gov (United States)

    Ma, Yuan; Liu, Shiyuan; Chen, Xiuguo; Zhang, Chuanwei

    2011-12-01

    In most cases of optical critical dimension (OCD) metrology, when applying rigorous coupled-wave analysis (RCWA) to optical modeling, a high order of Fourier harmonics is usually set up to guarantee the convergence of the final results. However, the total number of floating point operations grows dramatically as the truncation order increases. Therefore, it is critical to choose an appropriate order to obtain high computational efficiency without losing much accuracy in the meantime. In this paper, the convergence order associated with the structural and optical parameters has been estimated through simulation. The results indicate that the convergence order is linear with the period of the sample when fixing the other parameters, both for planar diffraction and conical diffraction. The illuminated wavelength also affects the convergence of a final result. With further investigations concentrated on the ratio of illuminated wavelength to period, it is discovered that the convergence order decreases with the growth of the ratio, and when the ratio is fixed, convergence order jumps slightly, especially in a specific range of wavelength. This characteristic could be applied to estimate the optimum convergence order of given samples to obtain high computational efficiency.
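
    The practical upshot is an adaptive order search; a minimal sketch follows, where simulate() is a hypothetical stand-in for a real RCWA solver:

        def simulate(order):
            """Monotone toy stand-in for the zeroth-order diffraction efficiency."""
            return 0.42 + 0.05 / order**2

        def convergence_order(tol=1e-4, max_order=60):
            prev = simulate(1)
            for m in range(2, max_order + 1):
                cur = simulate(m)
                if abs(cur - prev) < tol:   # change below tolerance: converged
                    return m
                prev = cur
            return max_order

        print(convergence_order())   # prints 11 for this toy model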

  7. Computer vision-based evaluation of pre- and postrigor changes in size and shape of Atlantic cod (Gadus morhua) and Atlantic salmon (Salmo salar) fillets during rigor mortis and ice storage: effects of perimortem handling stress.

    Science.gov (United States)

    Misimi, E; Erikson, U; Digre, H; Skavhaug, A; Mathiassen, J R

    2008-03-01

    The present study describes the possibilities for using computer vision-based methods for the detection and monitoring of transient 2D and 3D changes in the geometry of a given product. The rigor contractions of unstressed and stressed fillets of Atlantic salmon (Salmo salar) and Atlantic cod (Gadus morhua) were used as a model system. Gradual changes in fillet shape and size (area, length, width, and roundness) were recorded for 7 and 3 d, respectively. Also, changes in fillet area and height (cross-section profiles) were tracked using a laser beam and a 3D digital camera. Another goal was to compare rigor developments of the 2 species of farmed fish, and whether perimortem stress affected the appearance of the fillets. Some significant changes in fillet size and shape were found (length, width, area, roundness, height) between unstressed and stressed fish during the course of rigor mortis as well as after ice storage (postrigor). However, the observed irreversible stress-related changes were small and would hardly mean anything for postrigor fish processors or consumers. The cod were less stressed (as defined by muscle biochemistry) than the salmon after the 2 species had been subjected to similar stress bouts. Consequently, the difference between the rigor courses of unstressed and stressed fish was more extreme in the case of salmon. However, the maximal whole fish rigor strength was judged to be about the same for both species. Moreover, the reductions in fillet area and length, as well as the increases in width, were basically of similar magnitude for both species. In fact, the increases in fillet roundness and cross-section height were larger for the cod. We conclude that the computer vision method can be used effectively for automated monitoring of changes in 2D and 3D shape and size of fish fillets during rigor mortis and ice storage. In addition, it can be used for grading of fillets according to uniformity in size and shape, as well as measurement of …
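
    The shape descriptors tracked in such a pipeline are easy to compute once the fillet has been segmented; the toy Python sketch below (a stand-in silhouette, not the study's imaging pipeline) illustrates area, bounding-box length/width, and roundness:

        import numpy as np

        mask = np.zeros((60, 120), dtype=bool)
        mask[15:45, 20:100] = True                   # stand-in fillet silhouette

        area = mask.sum()                             # pixel area
        rows, cols = np.nonzero(mask)
        length = cols.max() - cols.min() + 1          # bounding-box length
        width = rows.max() - rows.min() + 1           # bounding-box width

        # Crude 4-connected perimeter: object pixels with a background neighbour.
        padded = np.pad(mask, 1)
        interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                    padded[1:-1, :-2] & padded[1:-1, 2:])
        perimeter = (mask & ~interior).sum()

        roundness = 4 * np.pi * area / perimeter**2   # approaches 1 for a circle
        print(area, length, width, round(roundness, 2))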

  8. Verifying the agreed framework between the United States and North Korea

    International Nuclear Information System (INIS)

    May, M.M.

    2001-01-01

    Under the 1994 Agreed Framework (AF) between the United States and the Democratic People's Republic of Korea (DPRK), the US and its allies will provide two nuclear-power reactors and other benefits to the DPRK in exchange for an agreement by the DPRK to declare how much nuclear-weapon material it has produced; to identify, freeze, and eventually dismantle specified facilities for producing this material; and to remain a party to the nuclear Non-Proliferation Treaty (NPT) and allow the implementation of its safeguards agreement. This study assesses the verifiability of these provisions. The study concludes that verification can be accomplished, given cooperation and openness from the DPRK. Special effort will be needed from the IAEA, as well as support from the US and the Republic of Korea. (author)

  9. Forced oral opening for cadavers with rigor mortis: two approaches for the myotomy on the temporal muscles.

    Science.gov (United States)

    Nakayama, Y; Aoki, Y; Niitsu, H; Saigusa, K

    2001-04-15

    Forensic dentistry plays an essential role in personal identification procedures. An adequate interincisal space is required in cadavers with rigor mortis to obtain detailed dental findings. We have developed two approaches for myotomy of the temporal muscles: an intraoral approach and a two-directional approach. The intraoral approach, in which the temporalis is dissected with scissors inserted via an intraoral incision, was adopted for elderly cadavers, females, and emaciated or exhausted bodies, and has the merit of requiring no incision on the face. The two-directional approach, in which the myotomy is performed with a thread-wire saw from behind and with scissors via the intraoral incision, was designed for muscular young males. Both approaches were effective in obtaining the desired degree of interincisal opening without facial damage.

  10. 24 CFR 5.218 - Penalties for failing to disclose and verify Social Security and Employer Identification Numbers.

    Science.gov (United States)

    2010-04-01

    ... and verify Social Security and Employer Identification Numbers. 5.218 Section 5.218 Housing and Urban... REQUIREMENTS; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers; Procedures for Obtaining Income Information Disclosure and Verification of Social Security...

  11. Alternate approaches to verifying the structural adequacy of the Defense High Level Waste Shipping Cask

    International Nuclear Information System (INIS)

    Zimmer, A.; Koploy, M.

    1991-12-01

    In the early 1980s, the US Department of Energy/Defense Programs (DOE/DP) initiated a project to develop a safe and efficient transportation system for defense high level waste (DHLW). A long-standing objective of the DHLW transportation project is to develop a truck cask that represents the leading edge of cask technology and that fully complies with all applicable DOE, Nuclear Regulatory Commission (NRC), and Department of Transportation (DOT) regulations. General Atomics (GA) designed the DHLW Truck Shipping Cask using state-of-the-art analytical techniques verified by model testing performed by Sandia National Laboratories (SNL). The analytical techniques comprise two approaches: inelastic analysis and elastic analysis. This topical report presents the results of the two analytical approaches and of the model testing. The purpose of this work is to show that there are two viable analytical alternatives for verifying the structural adequacy of a Type B package and obtaining an NRC license. In addition, these data will help support the future acceptance by the NRC of inelastic analysis as a tool in packaging design and licensing.

  12. Evaluating MC and A effectiveness to verify the presence of nuclear materials

    International Nuclear Information System (INIS)

    Dawson, P.G.; Morzinski, J.A.; Ostenak, Carl A.; Longmire, V.L.; Jewell, D.; Williams, J.D.

    2001-01-01

    Traditional materials accounting focuses exclusively on the material balance area (MBA) and involves periodically closing a material balance based on accountability measurements conducted during a physical inventory. In contrast, the physical inventory for Los Alamos National Laboratory's near-real-time accounting system is established around processes and looks more like an item inventory. That is, the intent is not to measure material for accounting purposes, since materials have already been measured in the normal course of daily operations. A given unit process operates many times over the course of a material balance period, and its product may move for processing to another unit process in the same MBA or may be transferred out of the MBA. Since few materials are unmeasured, the intent of the physical inventory is to locate the materials on the books and verify the information about those materials contained in the books. Closing a materials balance for such an area is a matter of summing the individual mass balances for all batches processed by all unit processes in the MBA, as sketched below. Additionally, performance parameters are established to measure the program's effectiveness: program effectiveness for verifying the presence of nuclear material must be equal to or greater than a prescribed performance level, process measurements must be within established precision and accuracy values, physical inventory results must meet or exceed performance requirements, and inventory differences must be less than a target/goal quantity. This approach exceeds DOE-established accounting and physical inventory program requirements. Hence, LANL is committed to this approach and to seeking opportunities for further improvement through integrated technologies. This paper provides a detailed description of this evaluation process.
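
    To illustrate the bookkeeping, the following minimal Python sketch sums individual batch balances over the unit processes of an MBA and compares the resulting inventory difference against a target/goal quantity. The batch records, field names, and threshold are illustrative assumptions, not LANL's actual accounting schema.

```python
from dataclasses import dataclass

@dataclass
class Batch:
    unit_process: str
    input_kg: float    # measured material entering the unit process
    output_kg: float   # measured material leaving (products + transfers)
    waste_kg: float    # measured discards / holdup adjustments

def inventory_difference(batches: list[Batch]) -> float:
    """Sum the individual batch balances over all unit processes in the MBA."""
    return sum(b.input_kg - b.output_kg - b.waste_kg for b in batches)

batches = [
    Batch("dissolution", 12.40, 12.31, 0.07),
    Batch("purification", 12.31, 12.25, 0.05),
]
idiff = inventory_difference(batches)
goal = 0.10  # illustrative target/goal quantity, kg
print(f"inventory difference = {idiff:.3f} kg; "
      f"{'within' if abs(idiff) < goal else 'exceeds'} target/goal")
```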

  13. Evaluation of wastewater contaminant transport in surface waters using verified Lagrangian sampling

    Science.gov (United States)

    Antweiler, Ronald C.; Writer, Jeffrey H.; Murphy, Sheila F.

    2014-01-01

    Contaminants released from wastewater treatment plants can persist in surface waters for substantial distances. Much research has gone into evaluating the fate and transport of these contaminants, but this work has often assumed constant flow from wastewater treatment plants. However, effluent discharge commonly varies widely over a 24-hour period, and this variation controls contaminant loading and can profoundly influence interpretations of environmental data. We show that methodologies relying on the normalization of downstream data to conservative elements can give spurious results, and should not be used unless it can be verified that the same parcel of water was sampled. Lagrangian sampling, which in theory samples the same water parcel as it moves downstream (the Lagrangian parcel), links hydrologic and chemical transformation processes so that the in-stream fate of wastewater contaminants can be quantitatively evaluated. However, precise Lagrangian sampling is difficult, and small deviations – such as missing the Lagrangian parcel by less than 1 h – can cause large differences in measured concentrations of all dissolved compounds at downstream sites, leading to erroneous conclusions regarding in-stream processes controlling the fate and transport of wastewater contaminants. Therefore, we have developed a method termed “verified Lagrangian” sampling, which can be used to determine if the Lagrangian parcel was actually sampled, and if it was not, a means for correcting the data to reflect the concentrations which would have been obtained had the Lagrangian parcel been sampled. To apply the method, it is necessary to have concentration data for a number of conservative constituents from the upstream, effluent, and downstream sites, along with upstream and effluent concentrations that are constant over the short-term (typically 2–4 h). These corrections can subsequently be applied to all data, including non-conservative constituents. Finally, we
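
    The following minimal Python sketch illustrates the flavor of such a correction under strong simplifying assumptions: each conservative constituent yields the ratio of its flow-weighted mixing prediction to the measured downstream value, and the mean ratio is applied as a multiplicative correction to all constituents. The published procedure may well differ in detail; all names and data here are illustrative.

```python
import statistics

def predicted_mix(c_up: float, q_up: float, c_eff: float, q_eff: float) -> float:
    """Flow-weighted mixing of upstream water and effluent."""
    return (c_up * q_up + c_eff * q_eff) / (q_up + q_eff)

q_up, q_eff = 2.0, 0.5  # discharges, m^3/s (illustrative)

# Conservative constituents: (upstream, effluent, measured downstream), mg/L
conservative = {
    "chloride": (12.0, 180.0, 44.0),
    "boron":    (0.04, 0.31, 0.095),
}

ratios = [predicted_mix(cu, q_up, ce, q_eff) / c_down
          for cu, ce, c_down in conservative.values()]
correction = statistics.mean(ratios)

# Apply the same correction to a non-conservative contaminant.
measured_contaminant = 0.85  # mg/L downstream
print(f"correction factor = {correction:.3f}")
print(f"corrected concentration = {measured_contaminant * correction:.3f} mg/L")
```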

  14. Evaluation of wastewater contaminant transport in surface waters using verified Lagrangian sampling.

    Science.gov (United States)

    Antweiler, Ronald C; Writer, Jeffrey H; Murphy, Sheila F

    2014-02-01

    Contaminants released from wastewater treatment plants can persist in surface waters for substantial distances. Much research has gone into evaluating the fate and transport of these contaminants, but this work has often assumed constant flow from wastewater treatment plants. However, effluent discharge commonly varies widely over a 24-hour period, and this variation controls contaminant loading and can profoundly influence interpretations of environmental data. We show that methodologies relying on the normalization of downstream data to conservative elements can give spurious results, and should not be used unless it can be verified that the same parcel of water was sampled. Lagrangian sampling, which in theory samples the same water parcel as it moves downstream (the Lagrangian parcel), links hydrologic and chemical transformation processes so that the in-stream fate of wastewater contaminants can be quantitatively evaluated. However, precise Lagrangian sampling is difficult, and small deviations - such as missing the Lagrangian parcel by less than 1 h - can cause large differences in measured concentrations of all dissolved compounds at downstream sites, leading to erroneous conclusions regarding in-stream processes controlling the fate and transport of wastewater contaminants. Therefore, we have developed a method termed "verified Lagrangian" sampling, which can be used to determine if the Lagrangian parcel was actually sampled, and if it was not, a means for correcting the data to reflect the concentrations which would have been obtained had the Lagrangian parcel been sampled. To apply the method, it is necessary to have concentration data for a number of conservative constituents from the upstream, effluent, and downstream sites, along with upstream and effluent concentrations that are constant over the short-term (typically 2-4 h). These corrections can subsequently be applied to all data, including non-conservative constituents. Finally, we show how data

  15. Verifying large modular systems using iterative abstraction refinement

    International Nuclear Information System (INIS)

    Lahtinen, Jussi; Kuismin, Tuomas; Heljanko, Keijo

    2015-01-01

    Digital instrumentation and control (I&C) systems are increasingly used in the nuclear engineering domain. The exhaustive verification of these systems is challenging, and the usual verification methods such as testing and simulation are typically insufficient. Model checking is a formal method that is able to exhaustively analyse the behaviour of a model against a formally written specification. If the model checking tool detects a violation of the specification, it produces a counterexample that demonstrates how the specification is violated in the system. Unfortunately, real-life system designs are sometimes too large to be analysed directly by traditional model checking techniques. We have developed an iterative technique for model checking large modular systems. The technique uses abstraction-based over-approximations of the model behaviour, combined with iterative refinement. The main contribution of the work is the concrete abstraction refinement technique, based on the modular structure of the model, the dependency graph of the model, and a refinement sampling heuristic similar to delta debugging. The technique is geared towards proving properties, and it outperforms BDD-based model checking, the k-induction technique, and the property directed reachability algorithm (PDR) in our experiments. - Highlights: • We have developed an iterative technique for model checking large modular systems. • The technique uses BDD-based model checking, k-induction, and PDR in parallel. • We have tested our algorithm by verifying two models with it. • The technique outperforms classical model checking methods in our experiments.
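
    The following minimal Python sketch shows a generic counterexample-guided abstraction refinement (CEGAR) loop of the kind described above, not the paper's exact algorithm. The checker and spuriousness test are stand-ins for calls to a real model checker; refinement adds modules along the dependency graph, loosely mirroring the modular strategy. All module names are illustrative.

```python
def check(abstraction: set[str]) -> str | None:
    """Stand-in model-check of the over-approximation.

    Returns None if the property holds, else an identifier of the
    counterexample trace. Replace with a real model checker call.
    """
    return None if {"pump", "valve", "controller"} <= abstraction else "trace"

def spurious(counterexample: str, abstraction: set[str]) -> bool:
    """Stand-in test of whether the counterexample is realizable."""
    return True  # in this toy, traces on partial models are always spurious

def refine(abstraction: set[str], dependencies: dict) -> set[str]:
    """Add one not-yet-included dependency of the abstracted modules."""
    for m in sorted(abstraction):
        for dep in dependencies.get(m, []):
            if dep not in abstraction:
                return abstraction | {dep}
    raise RuntimeError("nothing left to refine: counterexample is real")

dependencies = {"controller": ["pump", "valve"], "pump": [], "valve": []}
abstraction = {"controller"}  # start from the modules the property mentions

while True:
    cex = check(abstraction)
    if cex is None:
        print(f"property proved on abstraction {sorted(abstraction)}")
        break
    if not spurious(cex, abstraction):
        print("real counterexample found")
        break
    abstraction = refine(abstraction, dependencies)
```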

  16. Inosine-5'-monophosphate is a candidate agent to resolve rigor mortis of skeletal muscle.

    Science.gov (United States)

    Matsuishi, Masanori; Tsuji, Mariko; Yamaguchi, Megumi; Kitamura, Natsumi; Tanaka, Sachi; Nakamura, Yukinobu; Okitani, Akihiro

    2016-11-01

    The objective of the present study was to reveal the action of inosine-5'-monophosphate (IMP) on myofibrils in postmortem muscles. IMP solubilized isolated actomyosin within a narrow range of KCl concentration (0.19-0.20 mol/L), owing to the dissociation of actomyosin into actin and myosin, but it did not solubilize the proteins in myofibrils at 0.2 mol/L KCl. However, IMP could solubilize both proteins in myofibrils at 0.2 mol/L KCl in the presence of 1 mmol/L pyrophosphate or 1.0-3.3 mmol/L adenosine-5'-diphosphate (ADP). We therefore presumed that pyrophosphate and ADP release the thin filaments, composed of actin, and the thick filaments, composed of myosin, from the restraints of the myofibril, after which both filaments are solubilized through the IMP-induced dissociation of actomyosin. We conclude that IMP is a candidate agent for resolving rigor mortis because of its ability to break the association between thick and thin filaments. © 2016 Japanese Society of Animal Science.

  17. Chromatin immunoprecipitation to analyze DNA binding sites of HMGA2.

    Directory of Open Access Journals (Sweden)

    Nina Winter

    BACKGROUND: HMGA2 is an architectural transcription factor abundantly expressed during embryonic and fetal development, and it is associated with the progression of malignant tumors. The protein harbours three basic DNA-binding domains and an acidic, protein-binding C-terminal domain. DNA binding induces changes in DNA conformation and hence results in global changes in gene expression patterns. Recently, two consensus sequences for HMGA2 binding have been identified using a PCR-based SELEX (Systematic Evolution of Ligands by Exponential Enrichment) procedure. METHODOLOGY/PRINCIPAL FINDINGS: In this investigation, chromatin immunoprecipitation (ChIP) experiments and bioinformatic methods were used to analyze whether these binding sequences can also be verified on the chromatin of living cells. CONCLUSION: After quantification of the HMGA2 protein in different cell lines, the colon cancer-derived cell line HCT116 was chosen for further ChIP experiments because of its 3.4-fold higher HMGA2 protein level. 49 DNA fragments were obtained by ChIP. These fragments, containing HMGA2 binding sites, were analyzed for their AT-content, their location in the human genome, and their similarity to sequences generated by the SELEX study. The sequences show a significantly higher AT-content than the average of the human genome. The artificially generated SELEX sequences and short BLAST alignments (11 and 12 bp) of the ChIP fragments from living cells show similarities in their organization: the flanking regions are AT-rich, whereas conservation is lower in the center of the sequences.
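
    A minimal Python sketch of the AT-content comparison described above follows. The fragment sequences are illustrative placeholders, and 0.59 stands in for the approximate average AT fraction of the human genome; neither is taken from the study's data.

```python
def at_content(seq: str) -> float:
    """Fraction of A/T bases in a DNA sequence."""
    seq = seq.upper()
    return sum(seq.count(base) for base in "AT") / len(seq)

# Illustrative placeholder fragments, not the study's ChIP sequences.
chip_fragments = [
    "ATTTAAGCATTAATTTGAAT",
    "TTAAATCGATATAATTGCAT",
]

genome_average = 0.59  # approximate human genome AT fraction (assumption)
for frag in chip_fragments:
    at = at_content(frag)
    flag = "above" if at > genome_average else "below"
    print(f"{frag}: AT = {at:.2f} ({flag} genome average)")
```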

  18. Rigorous covariance propagation of geoid errors to geodetic MDT estimates

    Science.gov (United States)

    Pail, R.; Albertella, A.; Fecher, T.; Savcenko, R.

    2012-04-01

    The mean dynamic topography (MDT) is defined as the difference between the mean sea surface (MSS) derived from satellite altimetry, averaged over several years, and the static geoid. Assuming geostrophic conditions, the ocean surface velocities, an important component of the global ocean circulation, can be derived from the MDT. Thanks to the availability of GOCE gravity field models, the MDT can now, for the very first time, be derived solely from satellite observations (altimetry and gravity) down to spatial length-scales of 100 km and even below. Global gravity field models, parameterized in terms of spherical harmonic coefficients, are complemented by the full variance-covariance matrix (VCM). Therefore, a realistic statistical error estimate is available for the geoid component, while the error description of the altimetric component is still an open issue and is, if at all, attacked empirically. In this study we attempt to perform rigorous error propagation, based on the full gravity VCM, to the derived geostrophic surface velocities, thus also considering all correlations. For the definition of the static geoid we use the third release of the time-wise GOCE model, as well as the satellite-only combination model GOCO03S. In detail, we investigate the velocity errors resulting from the geoid component as a function of the harmonic degree, and the impact of using or not using covariances on the MDT errors and their correlations. When an MDT is derived, it is spectrally filtered to a certain maximum degree, usually driven by the signal content of the geoid model, by applying isotropic or non-isotropic filters. Since this filtering also acts on the geoid component, the filter process shall be integrated consistently into the covariance propagation, and its impact shall be quantified. The study is performed for MDT estimates in specific test areas of particular oceanographic interest.
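
    For a linear functional v = Jx of the spherical harmonic coefficients x, rigorous propagation of the full VCM Sigma_x takes the standard form Sigma_v = J Sigma_x J^T. The following numpy sketch illustrates this; the dimensions and matrices are small illustrative stand-ins for a real GOCE VCM and a velocity (or filtered-MDT) Jacobian.

```python
import numpy as np

rng = np.random.default_rng(0)

n_coeff, n_points = 8, 3

# Illustrative symmetric positive-definite VCM of the coefficients.
A = rng.normal(size=(n_coeff, n_coeff))
sigma_x = A @ A.T + n_coeff * np.eye(n_coeff)

# Jacobian of the derived quantities w.r.t. the coefficients
# (in practice: partials of velocities at grid points, incl. filtering).
J = rng.normal(size=(n_points, n_coeff))

sigma_v = J @ sigma_x @ J.T            # full error propagation
std = np.sqrt(np.diag(sigma_v))        # standard deviations
corr = sigma_v / np.outer(std, std)    # correlations between points

print("std devs:", np.round(std, 3))
print("correlation matrix:\n", np.round(corr, 3))
```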

  19. Scattering of atoms by a stationary sinusoidal hard wall: Rigorous treatment in (n+1) dimensions and comparison with the Rayleigh method

    International Nuclear Information System (INIS)

    Goodman, F.O.

    1977-01-01

    A rigorous treatment of the scattering of atoms by a stationary sinusoidal hard wall in (n+1) dimensions is presented, a previous treatment by Masel, Merrill, and Miller for n=1 being contained as a special case. Numerical comparisons are made with the GR method of Garcia, which incorporates the Rayleigh hypothesis. Advantages and disadvantages of both methods are discussed, and it is concluded that the Rayleigh GR method, if handled properly, will probably work satisfactorily in physically realistic cases.

  20. Methods to verify absorbed dose of irradiated containers and evaluation of dosimeters

    International Nuclear Information System (INIS)

    Gao Meixu; Wang Chuanyao; Tang Zhangxong; Li Shurong

    2001-01-01

    Research on the dose distribution in irradiated food containers and an evaluation of several methods to verify absorbed dose were carried out. Among the five treated orange containers, the minimum absorbed dose occurred at the top of the highest container or at the bottom of the lowest one. The Dmax/Dmin ratio in this study was 1.45 for irradiation in a commercial 60Co facility. The density of the orange containers was about 0.391 g/cm3. The evaluation of dosimeters showed that the PMMA-YL and clear PMMA dosimeters exhibit a linear dose response, and the word NOT in the STERIN-125 and STERIN-300 indicators was covered completely at doses of 125 and 300 Gy, respectively. (author)
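
    A dose-mapping uniformity check of the kind reported above reduces to locating the extreme dosimeter readings and forming Dmax/Dmin, as in this minimal Python sketch; the readings and positions are illustrative, not the study's measured values.

```python
# Illustrative dosimeter readings by position within the container stack.
readings_gy = {
    "top, highest container": 1.00,
    "centre, middle container": 1.42,
    "bottom, lowest container": 0.98,
}

d_max = max(readings_gy.values())
d_min = min(readings_gy.values())
print(f"minimum dose at: {min(readings_gy, key=readings_gy.get)}")
print(f"uniformity ratio Dmax/Dmin = {d_max / d_min:.2f}")
```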