WorldWideScience

Sample records for rigorous application process

  1. Effects of rigor status during high-pressure processing on the physical qualities of farm-raised abalone (Haliotis rufescens).

    Science.gov (United States)

    Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I

    2015-01-01

    High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated, and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone meat was significantly (P < 0.05) more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®

  2. Monitoring muscle optical scattering properties during rigor mortis

    Science.gov (United States)

    Xia, J.; Ranasinghesagara, J.; Ku, C. W.; Yao, G.

    2007-09-01

    The sarcomere is the fundamental functional unit for force generation in skeletal muscle. Sarcomere structure is also an important factor affecting the eating quality of muscle food, meat. The sarcomere structure is altered significantly during rigor mortis, which is the critical stage in transforming muscle to meat. In this paper, we investigated optical scattering changes during the rigor process in Sternomandibularis muscles. The measured optical scattering parameters were analyzed along with simultaneously measured passive tension, pH, and histology. We found that the temporal changes of optical scattering, passive tension, pH and fiber microstructure were closely correlated during the rigor process. These results suggest that sarcomere structure changes during rigor mortis can be monitored and characterized by optical scattering, which may find practical applications in predicting meat quality.

  3. Putrefactive rigor: apparent rigor mortis due to gas distension.

    Science.gov (United States)

    Gill, James R; Landi, Kristen

    2011-09-01

    Artifacts due to decomposition may cause confusion for the initial death investigator, leading to an incorrect suspicion of foul play. Putrefaction is a microorganism-driven process that results in foul odor, skin discoloration, purge, and bloating. Various decompositional gases including methane, hydrogen sulfide, carbon dioxide, and hydrogen will cause the body to bloat. We describe 3 instances of putrefactive gas distension (bloating) that produced the appearance of inappropriate rigor, so-called putrefactive rigor. These gases may distend the body to an extent that the extremities extend and lose contact with their underlying support surface. The medicolegal investigator must recognize that this is not true rigor mortis and the body was not necessarily moved after death for this gravity-defying position to occur.

  4. A rigorous pole representation of multilevel cross sections and its practical applications

    International Nuclear Information System (INIS)

    Hwang, R.N.

    1987-01-01

    In this article a rigorous method for representing multilevel cross sections and its practical applications are described. It is a generalization of the rationale suggested by de Saussure and Perez for the s-wave resonances. A computer code, WHOPPER, has been developed to convert the Reich-Moore parameters into pole and residue parameters in momentum space. Sample calculations have been carried out to illustrate that the proposed method preserves the rigor of the Reich-Moore cross sections exactly. An analytical method has been developed to evaluate the pertinent Doppler-broadened line shape functions. A discussion is presented on how to minimize the number of pole parameters so that the existing reactor codes can be best utilized.
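    The analytical core of such a pole representation is that Doppler broadening, a Gaussian convolution in the momentum variable, maps each simple pole onto the Faddeeva function w(z) in closed form. The following is a schematic identity in our own notation (not the paper's), assuming a pole p_k lying in the lower half of the complex momentum plane and a Doppler width Δ:

        \frac{1}{\Delta\sqrt{\pi}} \int_{-\infty}^{\infty} \frac{e^{-(x-y)^2/\Delta^2}}{y - p_k}\,\mathrm{d}y \;=\; -\,\frac{i\sqrt{\pi}}{\Delta}\, w\!\left(\frac{x - p_k}{\Delta}\right), \qquad w(z) = e^{-z^2}\operatorname{erfc}(-iz)

    Each term Re[r_k/(√E − p_k)] of a momentum-space pole expansion therefore broadens into a single Faddeeva evaluation instead of requiring a numerical convolution, which is presumably what makes the line-shape evaluation "analytical" in the abstract's sense.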

  5. Realizing rigor in the mathematics classroom

    CERN Document Server

    Hull, Ted H (Henry); Balka, Don S

    2014-01-01

    Rigor put within reach! Rigor: the Common Core has made it policy, and this first-of-its-kind guide takes math teachers and leaders through the process of making it reality. Using the Proficiency Matrix as a framework, the authors offer proven strategies and practical tools for successful implementation of the CCSS mathematical practices, with rigor as a central objective. You'll learn how to: define rigor in the context of each mathematical practice; identify and overcome potential issues, including differentiating instruction and using data…

  6. Evaluating Rigor in Qualitative Methodology and Research Dissemination

    Science.gov (United States)

    Trainor, Audrey A.; Graue, Elizabeth

    2014-01-01

    Despite previous and successful attempts to outline general criteria for rigor, researchers in special education have debated the application of rigor criteria, the significance or importance of small n research, the purpose of interpretivist approaches, and the generalizability of qualitative empirical results. Adding to these complications, the…

  7. Use of the Rigor Mortis Process as a Tool for Better Understanding of Skeletal Muscle Physiology: Effect of the Ante-Mortem Stress on the Progression of Rigor Mortis in Brook Charr (Salvelinus fontinalis).

    Science.gov (United States)

    Diouf, Boucar; Rioux, Pierre

    1999-01-01

    Presents the rigor mortis process in brook charr (Salvelinus fontinalis) as a tool for better understanding skeletal muscle metabolism. Describes an activity that demonstrates how rigor mortis is related to the post-mortem decrease of muscular glycogen and ATP, how glycogen degradation produces lactic acid that lowers muscle pH, and how…

  8. Hidden Markov processes theory and applications to biology

    CERN Document Server

    Vidyasagar, M

    2014-01-01

    This book explores important aspects of Markov and hidden Markov processes and the applications of these ideas to various problems in computational biology. The book starts from first principles, so that no previous knowledge of probability is necessary. However, the work is rigorous and mathematical, making it useful to engineers and mathematicians, even those not interested in biological applications. A range of exercises is provided, including drills to familiarize the reader with concepts and more advanced problems that require deep thinking about the theory. Biological applications are…

  9. Application of the rigorous method to x-ray and neutron beam scattering on rough surfaces

    International Nuclear Information System (INIS)

    Goray, Leonid I.

    2010-01-01

    The paper presents a comprehensive numerical analysis of x-ray and neutron scattering from finite-conducting rough surfaces, performed within the framework of the boundary integral equation method in a rigorous formulation for high ratios of characteristic dimension to wavelength. The single integral equation obtained involves boundary integrals of the single- and double-layer potentials. A more general treatment of the energy conservation law applicable to absorption gratings and rough mirrors is considered. In order to compute the scattering intensity of rough surfaces using the forward electromagnetic solver, Monte Carlo simulation is employed to average the deterministic diffraction grating efficiency of individual surfaces over an ensemble of realizations. Some rules appropriate for numerical implementation of the theory at small wavelength-to-period ratios are presented. The difference between the rigorous approach and approximations can be clearly seen in the specular reflectances of Au mirrors with different roughness parameters at wavelengths where grazing incidence occurs close to or beyond the critical angle. This difference may give rise to wrong estimates of rms roughness and correlation length if they are obtained by comparing experimental data with calculations. In addition, the rigorous approach permits taking into account any known roughness statistics and allows exact computation of diffuse scattering.
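    The ensemble-averaging step described above is straightforward to sketch. The Python fragment below is our illustration only: the function names and parameters are ours, and a Kirchhoff-style factor stands in for the paper's rigorous boundary-integral solver. It generates Gaussian-correlated rough surfaces and Monte Carlo averages a deterministic specular efficiency over realizations:

      import numpy as np

      rng = np.random.default_rng(0)

      def rough_surface(n, dx, sigma, corr_len, rng):
          # 1-D Gaussian random surface: white noise spectrally filtered to a
          # Gaussian correlation function, rescaled to rms height sigma.
          k = 2 * np.pi * np.fft.rfftfreq(n, dx)
          psd = np.exp(-(k * corr_len) ** 2 / 4.0)
          h = np.fft.irfft(np.fft.rfft(rng.normal(size=n)) * np.sqrt(psd), n)
          return h * (sigma / h.std())

      def specular_efficiency(h, wavelength, grazing):
          # Kirchhoff-approximation stand-in for the rigorous solver:
          # coherent reflection ~ |<exp(i*q*h)>|^2 over the illuminated surface.
          q = 4 * np.pi * np.sin(grazing) / wavelength
          return np.abs(np.exp(1j * q * h).mean()) ** 2

      # Monte Carlo: average the deterministic result over an ensemble of
      # surfaces (lengths in nm; 0.154 nm ~ Cu K-alpha, 0.2 deg grazing).
      effs = [specular_efficiency(rough_surface(4096, 1.0, 0.5, 40.0, rng),
                                  0.154, np.radians(0.2))
              for _ in range(200)]
      print(f"mean specular efficiency: {np.mean(effs):.4f} +/- {np.std(effs):.4f}")

    In the paper, the per-realization efficiency comes from the rigorous integral-equation solver rather than the Kirchhoff factor used here; the averaging loop is the part the abstract describes.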

  10. Application of the ALARA process in the regulation of nuclear activities

    International Nuclear Information System (INIS)

    1991-05-01

    In this report the historical and conceptual basis of the ALARA process has been reviewed. The application solely of a prescriptive approach, particularly a rigorous quantitative approach to the decision-making process, has been questioned. While the Committees recognize the value of quantitative techniques, they strongly emphasize that application of the ALARA concept is a much broader process for the determination of acceptable levels of protection. An ALARA process should take into account social and economic factors that are not quantifiable and involve representation of all those having a legitimate interest in the results of the process.

  11. Fluctuations of Lévy processes with applications introductory lectures

    CERN Document Server

    Kyprianou, Andreas E

    2014-01-01

    Lévy processes are the natural continuous-time analogue of random walks and form a rich class of stochastic processes around which a robust mathematical theory exists. Their application appears in the theory of many areas of classical and modern stochastic processes including storage models, renewal processes, insurance risk models, optimal stopping problems, mathematical finance, continuous-state branching processes and positive self-similar Markov processes. This textbook is based on a series of graduate courses concerning the theory and application of Lévy processes from the perspective of their path fluctuations. Central to the presentation is the decomposition of paths in terms of excursions from the running maximum as well as an understanding of short- and long-term behaviour. The book aims to be mathematically rigorous while still providing an intuitive feel for underlying principles. The results and applications often focus on the case of Lévy processes with jumps in only one direction, for which r...

  12. Rigorous simulation: a tool to enhance decision making

    Energy Technology Data Exchange (ETDEWEB)

    Neiva, Raquel; Larson, Mel; Baks, Arjan [KBC Advanced Technologies plc, Surrey (United Kingdom)

    2012-07-01

    The world refining industries continue to be challenged by population growth (increased demand), regional market changes and the pressure of regulatory requirements to operate a 'green' refinery. Environmental regulations are reducing the value and use of heavy fuel oils, driving refiners to convert more of the heavier products, or even heavier crude, into lighter products while meeting increasingly stringent transportation fuel specifications. As a result, actions are required to establish a sustainable advantage for future success. Rigorous simulation provides a key advantage, improving the timing and efficient use of capital investment and maximizing profitability. Sustainably maximizing profit through rigorous modeling is achieved through enhanced performance monitoring and improved Linear Programme (LP) model accuracy; this paper contains examples of both. The combination of the two increases overall rates of return. As refiners consider optimizing existing assets and expanding projects, the process agreed upon to achieve these goals is key to successful profit improvement. Rigorous kinetic simulation with detailed fractionation allows existing asset utilization to be optimized while focusing capital investment on the new unit(s), thereby optimizing the overall strategic plan and return on investment. Monitoring of individual process units works as a mechanism for validating and optimizing plant performance; unit monitoring is important to rectify poor performance and increase profitability. A good LP relies upon the accuracy of the data used to generate the LP sub-models. The value of rigorous unit monitoring is that the results are consistently heat- and mass-balanced and are unique to a refiner's unit or refinery. With the improved match of the refinery operation, the rigorous simulation models will capture more accurately the non-linearity of those process units and therefore provide correct…
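    As a toy illustration of the kind of LP sub-model mentioned above (entirely invented numbers and yields, not KBC's models), a small refinery planning LP can be set up and solved with scipy; in practice the yield coefficients would be regressed from rigorous, heat- and mass-balanced unit simulations:

      from scipy.optimize import linprog

      # Decision variables: barrels/day of crude A and crude B to run.
      # Objective: maximize 10*xA + 8*xB profit (linprog minimizes, so negate).
      profit = [-10.0, -8.0]
      A_ub = [[1.0, 1.0],        # crude distillation capacity: xA + xB <= 100
              [-0.45, -0.35],    # gasoline yield: 0.45*xA + 0.35*xB >= 30
              [-0.30, -0.40]]    # diesel yield:   0.30*xA + 0.40*xB >= 25
      b_ub = [100.0, -30.0, -25.0]
      res = linprog(profit, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
      print(res.x, -res.fun)     # optimal crude slate and daily profit

    The accuracy of the yield coefficients (here hard-coded) is exactly the "LP sub-model data" the abstract argues should come from rigorous simulation.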

  13. Recent Development in Rigorous Computational Methods in Dynamical Systems

    OpenAIRE

    Arai, Zin; Kokubu, Hiroshi; Pilarczyk, Paweł

    2009-01-01

    We highlight selected results of recent development in the area of rigorous computations which use interval arithmetic to analyse dynamical systems. We describe general ideas and selected details of different ways of approach and we provide specific sample applications to illustrate the effectiveness of these methods. The emphasis is put on a topological approach, which combined with rigorous calculations provides a broad range of new methods that yield mathematically rel...

  14. Discrete stochastic processes and applications

    CERN Document Server

    Collet, Jean-François

    2018-01-01

    This unique text for beginning graduate students gives a self-contained introduction to the mathematical properties of stochastic processes and presents their applications to Markov processes, coding theory, population dynamics, and search engine design. The book is ideal for a newly designed course in an introduction to probability and information theory. Prerequisites include working knowledge of linear algebra, calculus, and probability theory. The first part of the text focuses on the rigorous theory of Markov processes on countable spaces (Markov chains) and provides the basis to developing solid probabilistic intuition without the need for a course in measure theory. The approach taken is gradual, beginning with the case of discrete time and moving on to that of continuous time. The second part of this text is more applied; its core introduces various uses of convexity in probability and presents a nice treatment of entropy.

  15. Using grounded theory as a method for rigorously reviewing literature

    NARCIS (Netherlands)

    Wolfswinkel, J.; Furtmueller-Ettinger, Elfriede; Wilderom, Celeste P.M.

    2013-01-01

    This paper offers guidance for conducting a rigorous literature review. We present this in the form of a five-stage process in which we use Grounded Theory as a method. We first probe the guidelines explicated by Webster and Watson, and then we show the added value of Grounded Theory for rigorously reviewing literature.

  16. Fast rigorous numerical method for the solution of the anisotropic neutron transport problem and the NITRAN system for fusion neutronics application. Pt. 1

    International Nuclear Information System (INIS)

    Takahashi, A.; Rusch, D.

    1979-07-01

    Some recent neutronics experiments for fusion reactor blankets show that the precise treatment of anisotropic secondary emissions for all types of neutron scattering is needed for neutron transport calculations. In the present work new rigorous methods, i.e. based on non-approximative microscopic neutron balance equations, are applied to treat the anisotropic collision source term in transport equations. The collision source calculation is free from approximations except for the discretization of energy, angle and space variables, and includes the rigorous treatment of nonelastic collisions, as far as nuclear data are given. Two methods are presented: first the I_i-method, which relies on existing nuclear data files, and then, as an ultimate goal, the I*-method, which aims at the use of future double-differential cross section data but is also applicable to the present single-differential data basis to allow a smooth transition to the new data type. An application of the I_i-method is given in the code system NITRAN, which employs the S_N-method to solve the transport equations. Both rigorous methods, the I_i- and the I*-method, are applicable to all radiation transport problems, and they can also be used in the Monte Carlo method to solve the transport problem. (orig./RW) [de]

  17. Experimental evaluation of rigor mortis. V. Effect of various temperatures on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T

    1981-01-01

    Objective measurements were carried out to study the evolution of rigor mortis in rats at various temperatures. Our experiments showed that: (1) at 6 degrees C rigor mortis reaches full development between 48 and 60 hours post mortem, and is resolved at 168 hours post mortem; (2) at 24 degrees C rigor mortis reaches full development at 5 hours post mortem, and is resolved at 16 hours post mortem; (3) at 37 degrees C rigor mortis reaches full development at 3 hours post mortem, and is resolved at 6 hours post mortem; (4) the intensity of rigor mortis grows with increasing temperature (difference between values obtained at 24 degrees C and 37 degrees C); and (5) at 6 degrees C a "cold rigidity" was found, in addition to and independent of rigor mortis.

  18. Impact of post-rigor high pressure processing on the physicochemical and microbial shelf-life of cultured red abalone (Haliotis rufescens).

    Science.gov (United States)

    Hughes, Brianna H; Perkins, L Brian; Yang, Tom C; Skonberg, Denise I

    2016-03-01

    High pressure processing (HPP) of post-rigor abalone at 300 MPa for 10 min extended the refrigerated shelf-life to four times that of unprocessed controls. Shucked abalone meats were processed at 100 or 300 MPa for 5 or 10 min, and stored at 2 °C for 35 days. Treatments were analyzed for aerobic plate count (APC), total volatile base nitrogen (TVBN), K-value, biogenic amines, color, and texture. APC did not exceed 10^6 and TVBN levels remained below 35 mg/100 g for 35 days for the 300 MPa treatments. No biogenic amines were detected in the 300 MPa treatments, but putrescine and cadaverine were detected in the control and 100 MPa treatments. Color and texture were not affected by HPP or storage time. These results indicate that post-rigor processing at 300 MPa for 10 min can significantly increase refrigerated shelf-life of abalone without affecting chemical or physical quality characteristics important to consumers. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Study of the quality characteristics in cold-smoked salmon (Salmo salar) originating from pre- or post-rigor raw material.

    Science.gov (United States)

    Birkeland, S; Akse, L

    2010-01-01

    Improved slaughtering procedures in the salmon industry have caused a delayed onset of rigor mortis and, thus, a potential for pre-rigor secondary processing. The aim of this study was to investigate the effect of rigor status at the time of processing on quality traits (color, texture, sensory and microbiological) in injection-salted, cold-smoked Atlantic salmon (Salmo salar). Injection of pre-rigor fillets caused a significant (P < 0.05) … compared with post-rigor processed fillets; however, post-rigor fillets (1477 ± 38 g) had a significantly (P < 0.05) higher fracturability than pre-rigor fillets (1369 ± 71 g). Pre-rigor fillets were significantly (P < 0.05) … than post-rigor fillets (37.8 ± 0.8) and had significantly lower (P < 0.05) … than post-rigor processed fillets. This study showed that similar quality characteristics can be obtained in cold-smoked products processed either pre- or post-rigor when using suitable injection-salting protocols and smoking techniques. © 2010 Institute of Food Technologists®

  20. [Sustainable process improvement with application of 'lean philosophy'].

    Science.gov (United States)

    Rouppe van der Voort, Marc B V; van Merode, G G Frits; Veraart, Henricus G N

    2013-01-01

    Process improvement is increasingly being implemented, particularly with the aid of 'lean philosophy'. This management philosophy aims to improve quality by reducing 'wastage'. Local improvements can produce negative effects elsewhere due to interdependence of processes. An 'integrated system approach' is required to prevent this. Some hospitals claim that this has been successful. Research into process improvement with the application of lean philosophy has reported many positive effects, defined as improved safety, quality and efficiency. Due to methodological shortcomings and lack of rigorous evaluations it is, however, not yet possible to determine the impact of this approach. It is, however, obvious that the investigated applications are fragmentary, with a dominant focus on the instrumental aspect of the philosophy and a lack of integration in a total system, and with insufficient attention to human aspects. Process improvement is required to achieve better and more goal-oriented healthcare. To achieve this, hospitals must develop integrated system approaches that combine methods for process design with continuous improvement of processes and with personnel management. It is crucial that doctors take the initiative to guide and improve processes in an integral manner.

  1. Increased scientific rigor will improve reliability of research and effectiveness of management

    Science.gov (United States)

    Sells, Sarah N.; Bassing, Sarah B.; Barker, Kristin J.; Forshee, Shannon C.; Keever, Allison; Goerz, James W.; Mitchell, Michael S.

    2018-01-01

    Rigorous science that produces reliable knowledge is critical to wildlife management because it increases accurate understanding of the natural world and informs management decisions effectively. Application of a rigorous scientific method based on hypothesis testing minimizes unreliable knowledge produced by research. To evaluate the prevalence of scientific rigor in wildlife research, we examined 24 issues of the Journal of Wildlife Management from August 2013 through July 2016. We found 43.9% of studies did not state or imply a priori hypotheses, which are necessary to produce reliable knowledge. We posit that this is due, at least in part, to a lack of common understanding of what rigorous science entails, how it produces more reliable knowledge than other forms of interpreting observations, and how research should be designed to maximize inferential strength and usefulness of application. Current primary literature does not provide succinct explanations of the logic behind a rigorous scientific method or readily applicable guidance for employing it, particularly in wildlife biology; we therefore synthesized an overview of the history, philosophy, and logic that define scientific rigor for biological studies. A rigorous scientific method includes 1) generating a research question from theory and prior observations, 2) developing hypotheses (i.e., plausible biological answers to the question), 3) formulating predictions (i.e., facts that must be true if the hypothesis is true), 4) designing and implementing research to collect data potentially consistent with predictions, 5) evaluating whether predictions are consistent with collected data, and 6) drawing inferences based on the evaluation. Explicitly testing a priori hypotheses reduces overall uncertainty by reducing the number of plausible biological explanations to only those that are logically well supported. Such research also draws inferences that are robust to idiosyncratic observations and…

  2. The Rigor Mortis of Education: Rigor Is Required in a Dying Educational System

    Science.gov (United States)

    Mixon, Jason; Stuart, Jerry

    2009-01-01

    In an effort to answer the "Educational Call to Arms", our national public schools have turned to Advanced Placement (AP) courses as the predominant vehicle used to address the lack of academic rigor in our public high schools. Advanced Placement is believed by many to provide students with the rigor and work ethic necessary to…

  3. A rigorous phenomenological analysis of the ππ scattering lengths

    International Nuclear Information System (INIS)

    Caprini, I.; Dita, P.; Sararu, M.

    1979-11-01

    The constraining power of the present experimental data, combined with general theoretical knowledge about ππ scattering, upon the scattering lengths of this process is investigated by means of a rigorous functional method. We take as input the experimental phase shifts and make no hypotheses about the high-energy behaviour of the amplitudes, using only absolute bounds derived from axiomatic field theory and exact consequences of crossing symmetry. In the simplest application of the method, involving only the π⁰π⁰ S-wave, we explored numerically a number of values proposed by various authors for the scattering lengths a₀ and a₂ and found that none appears to be especially favoured. (author)

  4. Quality properties of pre- and post-rigor beef muscle after interventions with high frequency ultrasound.

    Science.gov (United States)

    Sikes, Anita L; Mawson, Raymond; Stark, Janet; Warner, Robyn

    2014-11-01

    The delivery of a consistent quality product to the consumer is vitally important for the food industry. The aim of this study was to investigate the potential for using high frequency ultrasound applied to pre- and post-rigor beef muscle on the metabolism and subsequent quality. High frequency ultrasound (600 kHz at 48 kPa and 65 kPa acoustic pressure) applied to post-rigor beef striploin steaks resulted in no significant effect on the texture (peak force value) of cooked steaks as measured by a Tenderometer. There was no added benefit of ultrasound treatment above that of the normal ageing process after ageing of the steaks for 7 days at 4 °C. Ultrasound treatment of post-rigor beef steaks resulted in a darkening of fresh steaks but after ageing for 7 days at 4 °C, the ultrasound-treated steaks were similar in colour to that of the aged, untreated steaks. High frequency ultrasound (2 MHz at 48 kPa acoustic pressure) applied to pre-rigor beef neck muscle had no effect on the pH, but the calculated exhaustion factor suggested that there was some effect on metabolism and actin-myosin interaction. However, the resultant texture of cooked, ultrasound-treated muscle was lower in tenderness compared to the control sample. After ageing for 3 weeks at 0 °C, the ultrasound-treated samples had the same peak force value as the control. High frequency ultrasound had no significant effect on the colour parameters of pre-rigor beef neck muscle. This proof-of-concept study showed no effect of ultrasound on quality but did indicate that the application of high frequency ultrasound to pre-rigor beef muscle shows potential for modifying ATP turnover and further investigation is warranted. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.

  5. Using Project Complexity Determinations to Establish Required Levels of Project Rigor

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Thomas D.

    2015-10-01

    This presentation discusses the project complexity determination process that was developed by National Security Technologies, LLC, for the U.S. Department of Energy, National Nuclear Security Administration Nevada Field Office for implementation at the Nevada National Security Site (NNSS). The complexity determination process was developed to address the diversity of NNSS project types, sizes, and complexity; to fill the need for one procedure with provision for tailoring the level of rigor to the project type, size, and complexity; to provide consistent, repeatable, effective application of project management processes across the enterprise; and to achieve higher levels of efficiency in project delivery. These needs are illustrated by the wide diversity of NNSS projects: Defense Experimentation, Global Security, weapons tests, military training areas, sensor development and testing, training in realistic environments, intelligence community support, environmental restoration/waste management, and disposal of radioactive waste, among others.

  6. Experimental evaluation of rigor mortis. VI. Effect of various causes of death on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T; Bergerioux, C; Brandt-Casadevall, C; Gujer, H R

    1983-07-01

    The evolution of rigor mortis was studied in cases of nitrogen asphyxia, drowning and strangulation, as well as in fatal intoxications due to strychnine, carbon monoxide and curariform drugs, using a modified method of measurement. Our experiments demonstrated that: (1) Strychnine intoxication hastens the onset and passing of rigor mortis. (2) CO intoxication delays the resolution of rigor mortis. (3) The intensity of rigor may vary depending upon the cause of death. (4) If the stage of rigidity is to be used to estimate the time of death, it is necessary: (a) to perform a succession of objective measurements of rigor mortis intensity; and (b) to check for the possible presence of factors that could have modified its development.

  7. Rigorous Performance Evaluation of Smartphone GNSS/IMU Sensors for ITS Applications

    Directory of Open Access Journals (Sweden)

    Vassilis Gikas

    2016-08-01

    With the rapid growth in smartphone technologies and improvement in their navigation sensors, an increasing amount of location information is now available, opening the road to the provision of new Intelligent Transportation System (ITS) services. Current smartphone devices embody miniaturized Global Navigation Satellite System (GNSS), Inertial Measurement Unit (IMU) and other sensors capable of providing user position, velocity and attitude. However, it is hard to characterize their actual positioning and navigation performance capabilities due to the disparate sensor and software technologies adopted among manufacturers and the high influence of environmental conditions, and therefore a unified certification process is missing. This paper presents the analysis results obtained from the assessment of two modern smartphones regarding their positioning accuracy (i.e., precision and trueness) capabilities (i.e., potential and limitations) based on a practical but rigorous methodological approach. Our investigation relies on the results of several vehicle tracking (i.e., cruising and maneuvering) tests realized through comparing smartphone-obtained trajectories and kinematic parameters to those derived using a high-end GNSS/IMU system and advanced filtering techniques. Performance testing is undertaken for the HTC One S (Android) and iPhone 5s (iOS). Our findings indicate that the deviation of the smartphone locations from ground truth (trueness) deteriorates by a factor of two in obscured environments compared to those derived in open-sky conditions. Moreover, it appears that the iPhone 5s produces relatively smaller and less dispersed error values compared to those computed for the HTC One S. Also, the navigation solution of the HTC One S appears to adapt faster to changes in environmental conditions, suggesting a somewhat different data filtering approach for the iPhone 5s. Testing the accuracy of the accelerometer and gyroscope sensors for a number of…
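    The trueness/precision split used above follows the usual decomposition of accuracy into systematic offset and dispersion. A minimal sketch of how one might compute both for a time-matched track against a GNSS/IMU reference (our code and naming, not the authors' pipeline; interpolation to common epochs is omitted for brevity):

      import numpy as np

      def trueness_and_precision(est, truth):
          # est, truth: (N, 2) east/north coordinates in metres, time-matched.
          err = est - truth                        # per-epoch error vectors
          bias = err.mean(axis=0)
          trueness = np.linalg.norm(bias)          # systematic deviation
          precision = np.linalg.norm(err - bias, axis=1).std()  # dispersion
          return trueness, precision

      # Synthetic check: a biased, noisy track against its ground truth.
      rng = np.random.default_rng(1)
      truth = np.cumsum(rng.normal(size=(100, 2)), axis=0)
      est = truth + [1.0, -0.5] + rng.normal(scale=2.0, size=(100, 2))
      print(trueness_and_precision(est, truth))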

  8. An introduction to continuous-time stochastic processes theory, models, and applications to finance, biology, and medicine

    CERN Document Server

    Capasso, Vincenzo

    2015-01-01

    This textbook, now in its third edition, offers a rigorous and self-contained introduction to the theory of continuous-time stochastic processes, stochastic integrals, and stochastic differential equations. Expertly balancing theory and applications, the work features concrete examples of modeling real-world problems from biology, medicine, industrial applications, finance, and insurance using stochastic methods. No previous knowledge of stochastic processes is required. Key topics include: Markov processes; stochastic differential equations; arbitrage-free markets and financial derivatives; insurance risk; population dynamics and epidemics; agent-based models. New to the third edition: infinitely divisible distributions; random measures; Lévy processes; fractional Brownian motion; ergodic theory; Karhunen-Loève expansion; additional applications; additional exercises; the Smoluchowski approximation of Langevin systems. An Introduction to Continuous-Time Stochastic Processes, Third Editio...

  9. Elementary process theory axiomatic introduction and applications

    CERN Document Server

    Cabbolet, Marcoen J T F

    2011-01-01

    Modern physics lacks a unitary theory that applies to all four fundamental interactions. This PhD thesis is a proposal for a single, complete, and coherent scheme of mathematically formulated elementary laws of nature. While the first chapter presents the general background, the second chapter addresses the method by which the main result has been developed. The next three chapters rigorously introduce the Elementary Process Theory, its mathematical foundations, and its applications to physics, cosmology and philosophy of mind. The final two chapters discuss the results and present the conclusions. Summarizing, the Elementary Process Theory is a scheme of seven well-formed closed expressions, written in the mathematical language of set matrix theory – a generalization of Zermelo-Fraenkel set theory. In the physical world, these seven expressions can be interpreted as elementary principles governing the universe at supersmall scale. The author critically confronts the theory with Quantum Mechanics and General Relativity…

  10. Rigor or Reliability and Validity in Qualitative Research: Perspectives, Strategies, Reconceptualization, and Recommendations.

    Science.gov (United States)

    Cypress, Brigitte S

    Issues are still raised, even now in the 21st century, by the persistent concern with achieving rigor in qualitative research. There is also a continuing debate about the analogous terms reliability and validity in naturalistic inquiries as opposed to quantitative investigations. This article presents the concept of rigor in qualitative research using a phenomenological study as an exemplar to further illustrate the process. Elaborating on epistemological and theoretical conceptualizations by Lincoln and Guba, strategies congruent with the qualitative perspective for ensuring validity to establish the credibility of the study are described. A synthesis of the historical development of validity criteria evident in the literature over the years is explored. Recommendations are made for use of the term rigor instead of trustworthiness and for the reconceptualization and renewed use of the concepts of reliability and validity in qualitative research; strategies for ensuring rigor must be built into the qualitative research process rather than evaluated only after the inquiry, and qualitative researchers and students alike must be proactive and take responsibility for ensuring the rigor of a research study. The insights garnered here will move novice researchers and doctoral students to a better conceptual grasp of the complexity of reliability and validity and its ramifications for qualitative inquiry.

  11. Rigorous Science: a How-To Guide

    Directory of Open Access Journals (Sweden)

    Arturo Casadevall

    2016-11-01

    Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word “rigor” is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education.

  12. Experimental evaluation of rigor mortis. VII. Effect of ante- and post-mortem electrocution on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T; Bergerioux, C

    1988-01-01

    The influence of electrocution on the evolution of rigor mortis was studied in rats. Our experiments showed that: (1) Electrocution hastens the onset of rigor mortis. After an electrocution of 90 s, complete rigor develops as early as 1 h post-mortem (p.m.), compared to 5 h p.m. in the controls. (2) Electrocution hastens the passing of rigor mortis. After an electrocution of 90 s, the first significant decrease occurs at 3 h p.m. (8 h p.m. in the controls). (3) These modifications in the evolution of rigor mortis are less pronounced in the limbs not directly touched by the electric current. (4) In cases of post-mortem electrocution, the changes are slightly less pronounced, the resistance is higher and the absorbed energy is lower as compared with the ante-mortem electrocution cases. The results are completed by two practical observations on human electrocution cases.

  13. MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF INTELLIGENCE PRODUCTS

    Science.gov (United States)

    2016-04-01

    AU/ACSC/2016, Air Command and Staff College, Air University. … establishing unit-level certified Masters of Analytic Tradecraft (MAT) analysts to be trained and entrusted to evaluate and rate the standards and … cues) ideally should meet or exceed effective rigor (based on analytical process). To accomplish this, decision makers should not be left to their…

  14. Development of rigor mortis is not affected by muscle volume.

    Science.gov (United States)

    Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H

    2001-04-01

    There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.

  15. Characterization of rigor mortis of longissimus dorsi and triceps ...

    African Journals Online (AJOL)

    … 24 h) of the longissimus dorsi (LD) and triceps brachii (TB) muscles, as well as the shear force (meat tenderness) and colour, were evaluated, aiming at characterizing the rigor mortis in the meat during industrial processing. Statistical treatment of the data demonstrated that carcass temperature and pH decreased gradually during ...

  16. Study Design Rigor in Animal-Experimental Research Published in Anesthesia Journals.

    Science.gov (United States)

    Hoerauf, Janine M; Moss, Angela F; Fernandez-Bustamante, Ana; Bartels, Karsten

    2018-01-01

    Lack of reproducibility of preclinical studies has been identified as an impediment for translation of basic mechanistic research into effective clinical therapies. Indeed, the National Institutes of Health has revised its grant application process to require more rigorous study design, including sample size calculations, blinding procedures, and randomization steps. We hypothesized that the reporting of such metrics of study design rigor has increased over time for animal-experimental research published in anesthesia journals. PubMed was searched for animal-experimental studies published in 2005, 2010, and 2015 in primarily English-language anesthesia journals. A total of 1466 publications were graded on the performance of sample size estimation, randomization, and blinding. The Cochran-Armitage test was used to assess linear trends over time for the primary outcome of whether or not a metric was reported. Interrater agreement for each of the 3 metrics (power, randomization, and blinding) was assessed using the weighted κ coefficient in a 10% random sample of articles rerated by a second investigator blinded to the ratings of the first investigator. A total of 1466 manuscripts were analyzed. Reporting for all 3 metrics of experimental design rigor increased over time (2005 to 2010 to 2015): for power analysis, from 5% (27/516), to 12% (59/485), to 17% (77/465); for randomization, from 41% (213/516), to 50% (243/485), to 54% (253/465); and for blinding, from 26% (135/516), to 38% (186/485), to 47% (217/465). The weighted κ coefficients and 98.3% confidence intervals indicate almost perfect agreement between the 2 raters beyond that which occurs by chance alone (power, 0.93 [0.85, 1.0]; randomization, 0.91 [0.85, 0.98]; and blinding, 0.90 [0.84, 0.96]). Our hypothesis that reported metrics of rigor in animal-experimental studies in anesthesia journals have increased during the past decade was confirmed. More consistent reporting, or explicit justification for absence…
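    The trend test named above is simple enough to reproduce. A sketch implementing the textbook Cochran-Armitage statistic, without continuity correction (our implementation, not the authors' code), applied to the power-analysis counts reported in the abstract:

      import numpy as np
      from scipy.stats import norm

      def cochran_armitage(successes, totals, scores):
          # Cochran-Armitage test for a linear trend in proportions.
          r = np.asarray(successes, float)   # e.g. studies reporting power analysis
          n = np.asarray(totals, float)      # studies graded per year
          s = np.asarray(scores, float)      # ordinal scores for the years
          N, R = n.sum(), r.sum()
          t = (s * (r - n * R / N)).sum()
          var = R / N * (1 - R / N) * ((s**2 * n).sum() - (s * n).sum()**2 / N)
          z = t / np.sqrt(var)
          return z, 2 * norm.sf(abs(z))      # two-sided p-value

      # Power-analysis reporting from the abstract:
      # 27/516 (2005), 59/485 (2010), 77/465 (2015)
      z, p = cochran_armitage([27, 59, 77], [516, 485, 465], [0, 1, 2])
      print(f"z = {z:.2f}, p = {p:.2g}")

    The weighted κ agreement statistic mentioned in the abstract is likewise available off the shelf, e.g. sklearn.metrics.cohen_kappa_score with a weights argument.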

  17. Long persistence of rigor mortis at constant low temperature.

    Science.gov (United States)

    Varetto, Lorenzo; Curto, Ombretta

    2005-01-06

    We studied the persistence of rigor mortis by using physical manipulation. We tested the mobility of the knee on 146 corpses kept under refrigeration at Torino's city mortuary at a constant temperature of +4 degrees C. We found a persistence of complete rigor lasting for 10 days in all the cadavers we kept under observation; in one case, rigor lasted for 16 days. Between the 11th and the 17th days, a progressively increasing number of corpses showed a change from complete into partial rigor (characterized by partial bending of the articulation). After the 17th day, all the remaining corpses showed partial rigor, and in the two cadavers that were kept under observation "à outrance" we found that absolute resolution of rigor mortis occurred on the 28th day. Our results prove that it is possible to find a persistence of rigor mortis much longer than expected when environmental conditions resemble average outdoor winter temperatures in temperate zones. Therefore, this datum must be considered when a corpse is found in such environmental conditions, so that when estimating the time of death we are not misled by the long persistence of rigor mortis.

  18. Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories

    Directory of Open Access Journals (Sweden)

    Augusto Beléndez

    2012-08-01

    There is no doubt that the concept of volume holography has led to an incredibly great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories, and in particular, the use of a photosensitive medium like a photopolymeric material to record information in all its volume. In this work we analyze the applicability of Kogelnik’s Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik’s theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik’s theory cannot be applied, it is necessary to use a more general Coupled Wave theory (CW) or the Rigorous Coupled Wave theory (RCW). The RCW does not incorporate any approximation and thus, since it is rigorous, permits judging the accuracy of the approximations included in Kogelnik’s and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We have demonstrated the agreement between the predictions of CW and RCW and the validity of Kogelnik’s theory only for gratings with spatial frequencies higher than 500 lines/mm for the usual values of the refractive index modulations obtained in photopolymers.

  19. Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories

    Science.gov (United States)

    Gallego, Sergi; Neipp, Cristian; Estepa, Luis A.; Ortuño, Manuel; Márquez, Andrés; Francés, Jorge; Pascual, Inmaculada; Beléndez, Augusto

    2012-01-01

    There is no doubt that the concept of volume holography has led to an incredibly great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories, and in particular, the use of a photosensitive medium like a photopolymeric material to record information in all its volume. In this work we analyze the applicability of Kogelnik’s Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik’s theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik’s theory cannot be applied, it is necessary to use a more general Coupled Wave theory (CW) or the Rigorous Coupled Wave theory (RCW). The RCW does not incorporate any approximation and thus, since it is rigorous, permits judging the accuracy of the approximations included in Kogelnik’s and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We have demonstrated the agreement between the predictions of CW and RCW and the validity of Kogelnik’s theory only for gratings with spatial frequencies higher than 500 lines/mm for the usual values of the refractive index modulations obtained in photopolymers.
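    For reference, the closed-form result at the heart of Kogelnik's theory, written in our notation: for a lossless, unslanted volume phase transmission grating illuminated at the Bragg angle, the first-order diffraction efficiency is

        \eta = \sin^2 \nu, \qquad \nu = \frac{\pi\, \Delta n\, d}{\lambda \cos\theta_B}

    where Δn is the refractive-index modulation, d the grating thickness, λ the free-space wavelength, and θ_B the Bragg angle inside the medium. The CW and RCW theories recover this limit at high spatial frequencies but remain valid in the low-frequency regime where the comparison above shows Kogelnik's approximation breaking down.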

  20. Rigorous high-precision enclosures of fixed points and their invariant manifolds

    Science.gov (United States)

    Wittig, Alexander N.

    The well-established concept of Taylor Models is introduced, which offers highly accurate C^0 enclosures of functional dependencies, combining high-order polynomial approximation of functions with rigorous estimates of the truncation error, performed using verified arithmetic. The focus of this work is on the application of Taylor Models in algorithms for strongly non-linear dynamical systems. A method is proposed to extend the existing implementation of Taylor Models in COSY INFINITY from double precision coefficients to arbitrary precision coefficients. Great care is taken to maintain the highest efficiency possible by adaptively adjusting the precision of higher order coefficients in the polynomial expansion. High precision operations are based on clever combinations of elementary floating point operations yielding exact values for round-off errors. An experimental high precision interval data type is developed and implemented. Algorithms for the verified computation of intrinsic functions based on the High Precision Interval datatype are developed and described in detail. The application of these operations in the implementation of High Precision Taylor Models is discussed. An application of Taylor Model methods to the verification of fixed points is presented by verifying the existence of a period-15 fixed point in a near-standard Hénon map. Verification is performed using different verified methods such as double precision Taylor Models, High Precision intervals and High Precision Taylor Models. Results and performance of each method are compared. An automated rigorous fixed point finder is implemented, allowing the fully automated search for all fixed points of a function within a given domain. It returns a list of verified enclosures of each fixed point, optionally verifying uniqueness within these enclosures. An application of the fixed point finder to the rigorous analysis of beam transfer maps in accelerator physics is presented. Previous work done by…
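    The flavor of such verified fixed-point checks can be conveyed with ordinary interval arithmetic (a toy 1-D illustration with outward rounding; the dissertation's Taylor Models are far more sophisticated). If an interval maps into itself under a continuous f, Brouwer's fixed-point theorem guarantees a fixed point inside it:

      import math

      class Interval:
          # Toy interval type with outward rounding (illustration only).
          def __init__(self, lo, hi):
              self.lo, self.hi = lo, hi
          def __add__(self, o):
              return Interval(math.nextafter(self.lo + o.lo, -math.inf),
                              math.nextafter(self.hi + o.hi, math.inf))
          def __mul__(self, o):
              p = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
              return Interval(math.nextafter(min(p), -math.inf),
                              math.nextafter(max(p), math.inf))
          def subset(self, o):
              return o.lo <= self.lo and self.hi <= o.hi

      # f(x) = 0.5*x^2 + 0.25 has a fixed point at 1 - sqrt(2)/2 ~ 0.29289.
      X = Interval(0.29, 0.30)
      fX = Interval(0.5, 0.5) * X * X + Interval(0.25, 0.25)
      print((fX.lo, fX.hi), fX.subset(X))   # True => a fixed point exists in X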

  1. A case of instantaneous rigor?

    Science.gov (United States)

    Pirch, J; Schulz, Y; Klintschar, M

    2013-09-01

    The question of whether instantaneous rigor mortis (IR), the hypothetical sudden occurrence of stiffening of the muscles upon death, actually exists has been controversially debated over the last 150 years. While modern German forensic literature rejects this concept, the contemporary British literature is more willing to embrace it. We present the case of a young woman who suffered from diabetes and who was found dead in an upright standing position with back and shoulders leaning against a punchbag and a cupboard. Rigor mortis was fully established; livor mortis was strong and distributed according to the position in which the body was found. After autopsy and toxicological analysis, it was concluded that death most probably occurred due to a ketoacidotic coma, with markedly increased values of glucose and lactate in the cerebrospinal fluid as well as acetone in blood and urine. Whereas the position of the body is most unusual, a detailed analysis revealed that it is a stable position even without rigor mortis. Therefore, this case does not further support the controversial concept of IR.

  2. RIGOROUS PHOTOGRAMMETRIC PROCESSING OF CHANG'E-1 AND CHANG'E-2 STEREO IMAGERY FOR LUNAR TOPOGRAPHIC MAPPING

    OpenAIRE

    K. Di; Y. Liu; B. Liu; M. Peng

    2012-01-01

    Chang'E-1 (CE-1) and Chang'E-2 (CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of the landing and surface operation of the Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of the CE-1 and CE-2 CCD cameras based on the push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D c...
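    For reference, a generic push-broom sensor model in one common convention (our notation; the actual CE-1/CE-2 models add interior-orientation terms and specific time parameterizations). Each scan line exposed at time t has its own exterior orientation, and a ground point (X, Y, Z) satisfies

        \begin{pmatrix} U \\ V \\ W \end{pmatrix} = M(t) \begin{pmatrix} X - X_S(t) \\ Y - Y_S(t) \\ Z - Z_S(t) \end{pmatrix}, \qquad x = -f\,\frac{U}{W} = 0, \qquad y = -f\,\frac{V}{W}

    where M(t) is the camera rotation, (X_S, Y_S, Z_S) the perspective center, and f the focal length; the along-track condition x = 0 implicitly determines the imaging time t of each point.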

  3. Mathematical Rigor in Introductory Physics

    Science.gov (United States)

    Vandyke, Michael; Bassichis, William

    2011-10-01

    Calculus-based introductory physics courses intended for future engineers and physicists are often designed and taught in the same fashion as those intended for students of other disciplines. A more mathematically rigorous curriculum should be more appropriate and, ultimately, more beneficial for the student in his or her future coursework. This work investigates the effects of mathematical rigor on student understanding of introductory mechanics. Using a series of diagnostic tools in conjunction with individual student course performance, a statistical analysis will be performed to examine student learning of introductory mechanics and its relation to student understanding of the underlying calculus.

  4. A rigorous approach to investigating common assumptions about disease transmission: Process algebra as an emerging modelling methodology for epidemiology.

    Science.gov (United States)

    McCaig, Chris; Begon, Mike; Norman, Rachel; Shankland, Carron

    2011-03-01

    Changing scale, for example, the ability to move seamlessly from an individual-based model to a population-based model, is an important problem in many fields. In this paper, we introduce process algebra as a novel solution to this problem in the context of models of infectious disease spread. Process algebra allows us to describe a system in terms of the stochastic behaviour of individuals, and is a technique from computer science. We review the use of process algebra in biological systems, and the variety of quantitative and qualitative analysis techniques available. The analysis illustrated here solves the changing scale problem: from the individual behaviour we can rigorously derive equations to describe the mean behaviour of the system at the level of the population. The biological problem investigated is the transmission of infection, and how this relates to individual interactions.
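    The scale change can be made concrete with a generic SIR example (our illustration; the paper derives such mean-field equations formally from process-algebra terms rather than by inspection). The individual-level stochastic model and the population-level ODEs it induces share the same rate constants:

      import numpy as np

      rng = np.random.default_rng(2)
      beta, gamma, N = 0.3, 0.1, 1000   # contact rate, recovery rate, population

      def gillespie_sir(s, i, t_end):
          # Exact individual-level stochastic simulation; returns peak infected.
          t, peak = 0.0, i
          while i > 0 and t < t_end:
              inf, rec = beta * s * i / N, gamma * i
              t += rng.exponential(1.0 / (inf + rec))
              if rng.random() < inf / (inf + rec):
                  s, i = s - 1, i + 1
              else:
                  i -= 1
              peak = max(peak, i)
          return peak

      def mean_field_peak(s, i, t_end, dt=0.01):
          # Population-level ODEs derived from the same individual rates:
          # ds/dt = -beta*s*i/N,  di/dt = beta*s*i/N - gamma*i  (Euler steps).
          peak = i
          for _ in range(int(t_end / dt)):
              new = beta * s * i / N * dt
              s, i = s - new, i + new - gamma * i * dt
              peak = max(peak, i)
          return peak

      print(gillespie_sir(N - 10, 10, 200.0),
            round(mean_field_peak(N - 10.0, 10.0, 200.0)))

    For large populations the two peak sizes agree closely, which is the individual-to-population consistency that the process-algebra derivation establishes rigorously.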

  5. A Framework for Rigorously Identifying Research Gaps in Qualitative Literature Reviews

    DEFF Research Database (Denmark)

    Müller-Bloch, Christoph; Kranz, Johann

    2015-01-01

    Identifying research gaps is a fundamental goal of literature reviewing. While it is widely acknowledged that literature reviews should identify research gaps, there are no methodological guidelines for how to identify research gaps in qualitative literature reviews ensuring rigor and replicability. Our study addresses this gap and proposes a framework that should help scholars in this endeavor without stifling creativity. To develop the framework we thoroughly analyze the state-of-the-art procedure of identifying research gaps in 40 recent literature reviews using a grounded theory approach. Based on the data, we subsequently derive a framework for identifying research gaps in qualitative literature reviews and demonstrate its application with an example. Our results provide a modus operandi for identifying research gaps, thus enabling scholars to conduct literature reviews more rigorously…

  6. Software metrics a rigorous and practical approach

    CERN Document Server

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and Processes. Reflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems. New to the Third Edition: this edition contains new material relevant…

  7. "Rigor mortis" in a live patient.

    Science.gov (United States)

    Chakravarthy, Murali

    2010-03-01

    Rigor mortis is conventionally a postmortem change. Its occurrence suggests that death occurred at least a few hours earlier. The authors report a case of "rigor mortis" in a live patient after cardiac surgery. The likely factors that may have predisposed to such premortem muscle stiffening in the reported patient are an intense low cardiac output state, the use of unusually high doses of inotropic and vasopressor agents, and likely sepsis. Such an event may be of importance while determining the time of death in individuals such as the one described in this report. It may also suggest the need for careful examination of patients with muscle stiffening prior to declaration of death. This report is being published to point out the controversies that might arise from muscle stiffening, which should not always be termed rigor mortis and/or postmortem.

  8. Classroom Talk for Rigorous Reading Comprehension Instruction

    Science.gov (United States)

    Wolf, Mikyung Kim; Crosson, Amy C.; Resnick, Lauren B.

    2004-01-01

    This study examined the quality of classroom talk and its relation to academic rigor in reading-comprehension lessons. Additionally, the study aimed to characterize effective questions to support rigorous reading comprehension lessons. The data for this study included 21 reading-comprehension lessons in several elementary and middle schools from…

  9. Increasing rigor in NMR-based metabolomics through validated and open source tools.

    Science.gov (United States)

    Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L

    2017-02-01

    The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism's phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. Copyright © 2016. Published by Elsevier Ltd.

  10. Reciprocity relations in transmission electron microscopy: A rigorous derivation.

    Science.gov (United States)

    Krause, Florian F; Rosenauer, Andreas

    2017-01-01

    A concise derivation of the principle of reciprocity applied to realistic transmission electron microscopy setups is presented, making use of the multislice formalism. The equivalence of images acquired in conventional and scanning mode is thereby rigorously shown. The conditions for the applicability of the derived reciprocity relations are discussed. Furthermore, the positions of apertures in relation to the corresponding lenses are considered, a subject which has scarcely been addressed in previous publications. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Experimental evaluation of rigor mortis. III. Comparative study of the evolution of rigor mortis in different sized muscle groups in rats.

    Science.gov (United States)

    Krompecher, T; Fryc, O

    1978-01-01

    The use of new methods and an appropriate apparatus has allowed us to make successive measurements of rigor mortis and to study its evolution in the rat. By comparative examination of the front and hind limbs, we determined the following: (1) The muscular mass of the hind limbs is 2.89 times greater than that of the front limbs. (2) In the initial phase, rigor mortis is more pronounced in the front limbs. (3) The front and hind limbs reach maximum rigor mortis at the same time, and this state is maintained for 2 hours. (4) Resolution of rigor mortis is accelerated in the front limbs during the initial phase, but both front and hind limbs reach complete resolution at the same time.

  12. [Experimental study of restiffening of the rigor mortis].

    Science.gov (United States)

    Wang, X; Li, M; Liao, Z G; Yi, X F; Peng, X M

    2001-11-01

    Objective: to observe changes in the sarcomere length of the rat during restiffening. We measured the sarcomere length of the quadriceps in 40 rats under different conditions by scanning electron microscopy. The sarcomere length in undisturbed rigor mortis is clearly shorter than that after restiffening, and sarcomere length is negatively correlated with the intensity of rigor mortis. Measuring sarcomere length can therefore determine the intensity of rigor mortis and provide evidence for the estimation of time since death.

  13. [Rigor mortis -- a definite sign of death?].

    Science.gov (United States)

    Heller, A R; Müller, M P; Frank, M D; Dressler, J

    2005-04-01

    In recent years there has been an ongoing controversial debate in Germany regarding the quality of the coroner's inquest and the declaration of death by physicians. We report the case of a 90-year-old female who was found an unknown time after a suicide attempt with benzodiazepines. Examination of the patient showed livores (mortis?) on the left forearm and left lower leg. Moreover, rigor (mortis?) of the left arm was apparent, which prevented arm flexion and extension. The hypothermic patient with insufficient respiration was intubated and mechanically ventilated. Chest compressions were not performed because central pulses were (barely) palpable and a sinus bradycardia of 45/min (second-degree AV block and isolated premature ventricular complexes) was present. After placement of an intravenous line (17 G, external jugular vein), the hemodynamic situation was stabilized with intermittent boli of epinephrine and with sodium bicarbonate. With improved circulation, livores and rigor disappeared. In the present case a minimal central circulation was noted, which could be stabilized despite the presence of apparently certain signs of death (livores and rigor mortis). Considering the finding of abrogated peripheral perfusion (livores), we postulate a centripetal collapse of glycogen and ATP supply in the patient's left arm (rigor), which was restored after resuscitation and reperfusion. Thus, it appears that livores and rigor are not sensitive enough to exclude a vita minima, particularly in hypothermic patients with intoxications. Consequently, a careful ABC check should be performed even in the presence of apparently certain signs of death, to avoid underdiagnosing a vita minima. Additional ECG monitoring is required to reduce the rate of false-positive declarations of death. To what extent basic life support by paramedics should commence when rigor and livores are present, pending a physician's DNR order, deserves further discussion.

  14. The MIXED framework: A novel approach to evaluating mixed-methods rigor.

    Science.gov (United States)

    Eckhardt, Ann L; DeVon, Holli A

    2017-10-01

    Evaluation of rigor in mixed-methods (MM) research is a persistent challenge due to the combination of inconsistent philosophical paradigms, the use of multiple research methods that require different skill sets, and the need to combine research at different points in the research process. Researchers have proposed a variety of ways to thoroughly evaluate MM research, but each method fails to provide a framework that is useful for the consumer of research. In contrast, the MIXED framework is meant to bridge the gap between an academic exercise and the practical assessment of a published work. The MIXED framework (methods, inference, expertise, evaluation, and design) borrows from previously published frameworks to create a useful tool for the evaluation of a published study. The MIXED framework uses an experimental eight-item scale that allows for a comprehensive, integrated assessment of MM rigor in published manuscripts. Mixed methods are becoming increasingly prevalent in nursing and healthcare research, requiring researchers and consumers to address issues unique to MM, such as the evaluation of rigor. © 2017 John Wiley & Sons Ltd.

  15. Probability, random variables, and random processes theory and signal processing applications

    CERN Document Server

    Shynk, John J

    2012-01-01

    Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily with random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features: Several app…

  16. Effects of Pre and Post-Rigor Marinade Injection on Some Quality Parameters of Longissimus Dorsi Muscles

    Science.gov (United States)

    Fadıloğlu, Eylem Ezgi; Serdaroğlu, Meltem

    2018-01-01

    This study was conducted to evaluate the effects of pre- and post-rigor marinade injections on some quality parameters of Longissimus dorsi (LD) muscles. Three marinade formulations were prepared with 2% NaCl, 2% NaCl+0.5 M lactic acid, and 2% NaCl+0.5 M sodium lactate. Marinade uptake, pH, free water, cooking loss, drip loss, and color properties were analyzed. Injection time had a significant effect on the marinade uptake of the samples: regardless of marinade formulation, the uptake of pre-rigor injected samples was higher than that of post-rigor samples. Injection of sodium lactate increased the pH values of the samples, whereas lactic acid injection decreased pH. Marinade treatment and storage period had significant effects on cooking loss. At each evaluation period, the interaction between marinade treatment and injection time had a different effect on free water content. Storage period and marinade application had significant effects on drip loss values, and drip loss in all samples increased during storage. On all storage days, the lowest CIE L* value was found in pre-rigor samples injected with sodium lactate. Lactic acid injection caused color fade in both pre-rigor and post-rigor samples. The interaction between marinade treatment and storage period was statistically significant (p<0.05). At days 0 and 3, the lowest CIE b* values were obtained from pre-rigor samples injected with sodium lactate, and no differences were found among the other samples. At day 6, no significant differences were found in the CIE b* values of any samples. PMID:29805282

  17. Einstein's Theory A Rigorous Introduction for the Mathematically Untrained

    CERN Document Server

    Grøn, Øyvind

    2011-01-01

    This book provides an introduction to the theory of relativity and the mathematics used in its processes. Three elements of the book make it stand apart from previously published books on the theory of relativity. First, the book starts at a lower mathematical level than standard books, with tensor calculus of sufficient maturity to make it possible to give detailed calculations of relativistic predictions of practical experiments. Self-contained introductions are given, for example, to vector calculus, differential calculus, and integration. Second, in-between calculations have been included, making it possible for the non-technical reader to follow step-by-step calculations. Thirdly, the conceptual development is gradual and rigorous in order to provide the inexperienced reader with a philosophically satisfying understanding of the theory. Einstein's Theory: A Rigorous Introduction for the Mathematically Untrained aims to provide the reader with a sound conceptual understanding of both the special and genera…

  18. Experimental evaluation of rigor mortis IX. The influence of the breaking (mechanical solution) on the development of rigor mortis.

    Science.gov (United States)

    Krompecher, Thomas; Gilles, André; Brandt-Casadevall, Conception; Mangin, Patrice

    2008-04-07

    Objective measurements were carried out to study the possible re-establishment of rigor mortis in rats after "breaking" (mechanical solution). Our experiments showed the following: (1) cadaveric rigidity can re-establish after breaking; (2) significant rigidity can reappear if the breaking occurs before the process is complete; (3) rigidity is considerably weaker after the breaking; (4) the time course of the intensity does not change in comparison to the controls: re-establishment begins immediately after the breaking, maximal values are reached at the same time as in the controls, and the course of the resolution is the same as in the controls.

  19. The jABC Approach to Rigorous Collaborative Development of SCM Applications

    Science.gov (United States)

    Hörmann, Martina; Margaria, Tiziana; Mender, Thomas; Nagel, Ralf; Steffen, Bernhard; Trinh, Hong

    Our approach to the model-driven collaborative design of IKEA's P3 Delivery Management Process uses the jABC [9] for model driven mediation and choreography to complement a RUP-based (Rational Unified Process) development process. jABC is a framework for service development based on Lightweight Process Coordination. Users (product developers and system/software designers) easily develop services and applications by composing reusable building-blocks into (flow-) graph structures that can be animated, analyzed, simulated, verified, executed, and compiled. This way of handling the collaborative design of complex embedded systems has proven to be effective and adequate for the cooperation of non-programmers and non-technical people, which is the focus of this contribution, and it is now being rolled out in the operative practice.

  20. Efficiency versus speed in quantum heat engines: Rigorous constraint from Lieb-Robinson bound

    Science.gov (United States)

    Shiraishi, Naoto; Tajima, Hiroyasu

    2017-08-01

    The long-standing open problem of whether a heat engine with finite power can achieve the Carnot efficiency is investigated. We rigorously prove a general trade-off inequality between the thermodynamic efficiency and the time interval of a cyclic process in quantum heat engines. As a first step, employing the Lieb-Robinson bound, we establish an inequality on the change in a local observable caused by an operation far from the support of that observable. This inequality provides a rigorous characterization of the intuitive picture that most of the energy emitted from the engine to the cold bath remains near the engine when the cyclic process is finished. Using this description, we prove an upper bound on efficiency with the aid of quantum information geometry. Our result generally excludes the possibility of a process with finite speed at the Carnot efficiency in quantum heat engines. In particular, the obtained constraint covers engines evolving with non-Markovian dynamics, which almost all previous studies on this topic fail to address.
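
    For orientation (a standard textbook statement, not the specific inequality derived in the paper), a Lieb-Robinson bound for observables A_X, B_Y supported on regions X, Y reads

    $$\big\|[A_X(t),\,B_Y]\big\| \;\le\; C\,\|A_X\|\,\|B_Y\|\,e^{-\left(d(X,Y)-v|t|\right)/\xi},$$

    where $d(X,Y)$ is the distance between the supports, $v$ is the Lieb-Robinson velocity, and $C$, $\xi$ are model-dependent constants. Outside the effective light cone, $d(X,Y) > v|t|$, commutators are exponentially small, which is the mechanism that confines the emitted energy near the engine.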

  1. An ultramicroscopic study on rigor mortis.

    Science.gov (United States)

    Suzuki, T

    1976-01-01

    Gastrocnemius muscles, taken from decapitated mice at various intervals after death and from mice killed by 2,4-dinitrophenol or mono-iodoacetic acid injection to induce rigor mortis soon after death, were observed by electron microscopy. The prominent appearance of many fine cross striations in the myofibrils (occurring about every 400 Å) was considered to be characteristic of rigor mortis. These striations were caused by minute granules studded along the surfaces of both thick and thin filaments; they appeared to be bridges connecting the two kinds of filaments and to account for the hardness and rigidity of the muscle.

  2. Tenderness of pre- and post rigor lamb longissimus muscle.

    Science.gov (United States)

    Geesink, Geert; Sujang, Sadi; Koohmaraie, Mohammad

    2011-08-01

    Lamb longissimus muscle (n=6) sections were cooked at different times post mortem (prerigor, at rigor, 1 day p.m., and 7 days p.m.) using two cooking methods. Using a boiling water bath, samples were either cooked to a core temperature of 70 °C or boiled for 3 h. The latter method was meant to reflect the traditional cooking method employed in countries where preparation of prerigor meat is practiced. The time postmortem at which the meat was prepared had a large effect on the tenderness (shear force) of the meat. Cooking prerigor and at-rigor meat to 70 °C resulted in higher shear force values than their post-rigor counterparts at 1 and 7 days p.m. (9.4 and 9.6 vs. 7.2 and 3.7 kg, respectively). The differences in tenderness between the treatment groups could be largely explained by a difference in the contraction status of the meat after cooking and the effect of ageing on tenderness. Cooking pre- and at-rigor meat resulted in severe muscle contraction, as evidenced by the differences in sarcomere length of the cooked samples. Mean sarcomere lengths in the pre- and at-rigor samples ranged from 1.05 to 1.20 μm; the mean sarcomere length in the post-rigor samples was 1.44 μm. Cooking for 3 h at 100 °C did improve the tenderness of pre- and at-rigor prepared meat as compared to cooking to 70 °C, but not to the extent that ageing did. It is concluded that additional intervention methods are needed to improve the tenderness of prerigor cooked meat. Copyright © 2011 Elsevier B.V. All rights reserved.

  3. Evaluation of physical dimension changes as nondestructive measurements for monitoring rigor mortis development in broiler muscles.

    Science.gov (United States)

    Cavitt, L C; Sams, A R

    2003-07-01

    Studies were conducted to develop a non-destructive method for monitoring the rate of rigor mortis development in poultry and to evaluate the effectiveness of electrical stimulation (ES). In the first study, 36 male broilers in each of two trials were processed at 7 wk of age. After being bled, half of the birds received electrical stimulation (400 to 450 V, 400 to 450 mA, for seven pulses of 2 s on and 1 s off), and the other half were designated as controls. At 0.25 and 1.5 h postmortem (PM), carcasses were evaluated for the angles of the shoulder, elbow, and wing tip and the distance between the elbows. Breast fillets were harvested at 1.5 h PM (after chilling) from all carcasses. Fillet samples were excised and frozen for later measurement of pH and R-value, and the remainder of each fillet was held on ice until 24 h postmortem. Shear value and pH means were significantly lower, and R-value means higher, in stimulated carcasses, indicating accelerated rigor mortis development with ES. The physical dimensions of the shoulder and elbow changed significantly with rigor mortis development and with ES. These results indicate that physical measurements of the wings may be useful as a nondestructive indicator of rigor development and for monitoring the effectiveness of ES. In the second study, 60 male broilers in each of two trials were processed at 7 wk of age. At 0.25, 1.5, 3.0, and 6.0 h PM, carcasses were evaluated for the distance between the elbows. At each time point, breast fillets were harvested from each carcass. Fillet samples were excised and frozen for later measurement of pH and sarcomere length, whereas the remainder of each fillet was held on ice until 24 h PM. Shear value and pH means decreased with rigor mortis development. Elbow distance decreased during rigor development and was correlated with rigor mortis development in broiler carcasses.
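
    As background (not stated in the record itself): the R-value used in such poultry rigor studies is commonly the UV absorbance ratio of a muscle extract,

    $$R = \frac{A_{250}}{A_{260}},$$

    which rises as ATP (absorbing near 260 nm) is degraded to inosine compounds (absorbing near 250 nm), making it a convenient chemical proxy for rigor progression.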

  4. A proposed acceptance process for commercial off-the-shelf (COTS) software in reactor applications

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Scott, J.A.

    1996-03-01

    This paper proposes a process for acceptance of commercial off-the-shelf (COTS) software products for use in reactor systems important to safety. An initial set of four criteria establishes COTS software product identification and its safety category. Based on safety category, three sets of additional criteria, graded in rigor, are applied to approve/disapprove the product. These criteria fall roughly into three areas: product assurance, verification of safety function and safety impact, and examination of usage experience of the COTS product in circumstances similar to the proposed application. A report addressing the testing of existing software is included as an appendix

  5. Rigor index, fillet yield and proximate composition of cultured striped catfish (Pangasianodon hypophthalmus) for its suitability in processing industries in Bangladesh

    Directory of Open Access Journals (Sweden)

    Salma Noor-E Islami

    2014-12-01

    The rigor index of market-size striped catfish (Pangasianodon hypophthalmus), locally called Thai-Pangas, was determined to assess fillet yield for the production of value-added products. In whole fish, rigor started within 1 hr after death under both iced and room-temperature conditions, while the rigor index reached a maximum of 72.23% within 8 hr at room temperature and 85.5% within 5 hr in iced condition; rigor was fully resolved after 22 hr under both storage conditions. Post-mortem muscle pH decreased to 6.8 after 2 hr and 6.2 after 8 hr, then increased sharply to 6.9 after 9 hr. There was a positive correlation between rigor progress and pH shift in fish fillets. Hand filleting was done post-rigor, and the fillet yield experiment showed that 50.4±2.1% fillet, 8.0±0.2% viscera, 8.0±1.3% skin, and 32.0±3.2% carcass could be obtained from Thai-Pangas. Proximate composition analysis of four regions of Thai-Pangas, viz. the head region, middle region, tail region, and viscera, revealed moisture of 78.36%, 81.14%, 81.45%, and 57.33%; protein of 15.83%, 15.97%, 16.14%, and 17.20%; lipid of 4.61%, 1.82%, 1.32%, and 24.31%; and ash of 1.09%, 0.96%, 0.95%, and 0.86%, respectively, indicating the suitability of Thai-Pangas for the production of value-added products such as fish fillets.
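
    As a side note (not from the record), the rigor index referred to here is conventionally computed from the tail drop of a fish supported horizontally at half its body length (the Cutting/Bito tail-drop method). A minimal sketch, with the function name and example values illustrative:

      def rigor_index(drop_before_rigor_cm: float, drop_at_time_t_cm: float) -> float:
          """Rigor index (%) by the tail-drop method.

          L0 is the vertical tail drop measured immediately after death
          (pre-rigor), Lt the drop at time t post mortem; full rigor
          gives an index near 100%.
          """
          l0, lt = drop_before_rigor_cm, drop_at_time_t_cm
          return (l0 - lt) / l0 * 100.0

      # Example: a 6.0 cm pre-rigor tail drop that shrinks to 1.0 cm
      # post mortem corresponds to a rigor index of about 83%.
      print(f"{rigor_index(6.0, 1.0):.1f} %")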

  6. Differential rigor development in red and white muscle revealed by simultaneous measurement of tension and stiffness.

    Science.gov (United States)

    Kobayashi, Masahiko; Takemori, Shigeru; Yamaguchi, Maki

    2004-02-10

    Based on the molecular mechanism of rigor mortis, we have proposed that stiffness (elastic modulus evaluated with tension response against minute length perturbations) can be a suitable index of post-mortem rigidity in skeletal muscle. To trace the developmental process of rigor mortis, we measured stiffness and tension in both red and white rat skeletal muscle kept in liquid paraffin at 37 and 25 degrees C. White muscle (in which type IIB fibres predominate) developed stiffness and tension significantly more slowly than red muscle, except for soleus red muscle at 25 degrees C, which showed disproportionately slow rigor development. In each of the examined muscles, stiffness and tension developed more slowly at 25 degrees C than at 37 degrees C. In each specimen, tension always reached its maximum level earlier than stiffness, and then decreased more rapidly and markedly than stiffness. These phenomena may account for the sequential progress of rigor mortis in human cadavers.

  7. Scientific rigor through videogames.

    Science.gov (United States)

    Treuille, Adrien; Das, Rhiju

    2014-11-01

    Hypothesis-driven experimentation - the scientific method - can be subverted by fraud, irreproducibility, and lack of rigorous predictive tests. A robust solution to these problems may be the 'massive open laboratory' model, recently embodied in the internet-scale videogame EteRNA. Deploying similar platforms throughout biology could enforce the scientific method more broadly. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Fractional calculus with applications in mechanics vibrations and diffusion processes

    CERN Document Server

    Atanackovic, T; Stankovic, Bogoljub; Zorica, Dusan

    2014-01-01

    This book contains mathematical preliminaries in which basic definitions of fractional derivatives and spaces are presented. The central part of the book contains various applications in classical mechanics including fields such as: viscoelasticity, heat conduction, wave propagation and variational Hamilton-type principles. Mathematical rigor will be observed in the applications. The authors provide some problems formulated in the classical setting and some in the distributional setting. The solutions to these problems are presented in analytical form and these solutions are then analyzed num

  9. Emergency cricothyrotomy for trismus caused by instantaneous rigor in cardiac arrest patients.

    Science.gov (United States)

    Lee, Jae Hee; Jung, Koo Young

    2012-07-01

    Instantaneous rigor, muscle stiffening occurring at the moment of death (or cardiac arrest), can be confused with rigor mortis. If trismus is caused by instantaneous rigor, orotracheal intubation is impossible and a surgical airway must be secured. Here, we report two patients who underwent emergency cricothyrotomy for trismus caused by instantaneous rigor. This case report aims to help physicians understand instantaneous rigor and to emphasize the importance of quickly securing a surgical airway when trismus occurs. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Rigor or mortis: best practices for preclinical research in neuroscience.

    Science.gov (United States)

    Steward, Oswald; Balice-Gordon, Rita

    2014-11-05

    Numerous recent reports document a lack of reproducibility of preclinical studies, raising concerns about potential lack of rigor. Examples of lack of rigor have been extensively documented and proposals for practices to improve rigor are appearing. Here, we discuss some of the details and implications of previously proposed best practices and consider some new ones, focusing on preclinical studies relevant to human neurological and psychiatric disorders. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Rigor in Qualitative Supply Chain Management Research

    DEFF Research Database (Denmark)

    Goffin, Keith; Raja, Jawwad; Claes, Björn

    2012-01-01

    Purpose – The purpose of this paper is to share the authors' experiences of using the repertory grid technique in two supply chain management studies. The paper aims to demonstrate how the two studies provided insights into how qualitative techniques such as the repertory grid can be made more rigorous than in the past, and how results can be generated that are inaccessible using quantitative methods. Design/methodology/approach – This paper presents two studies undertaken using the repertory grid technique to illustrate its application in supply chain management research. Findings – The paper … reliability, and theoretical saturation. Originality/value – It is the authors' contention that the addition of the repertory grid technique to the toolset of methods used by logistics and supply chain management researchers can only enhance insights and the building of robust theories. Qualitative studies…

  12. Rigorous Photogrammetric Processing of CHANG'E-1 and CHANG'E-2 Stereo Imagery for Lunar Topographic Mapping

    Science.gov (United States)

    Di, K.; Liu, Y.; Liu, B.; Peng, M.

    2012-07-01

    Chang'E-1 (CE-1) and Chang'E-2 (CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of the landing and surface operation of the Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of the CE-1 and CE-2 CCD cameras based on the push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D coordinates of a ground point in the lunar body-fixed (LBF) coordinate system can be calculated by space intersection from the image coordinates of conjugate points in stereo images, and the image coordinates can be calculated from 3D coordinates by back-projection. Due to uncertainties of the orbit and the camera, the back-projected image points differ from the measured points. In order to reduce these inconsistencies and improve precision, we proposed two methods to refine the rigorous sensor model: 1) refining the EOPs by correcting the attitude angle bias, and 2) refining the interior orientation model by calibration of the relative position of the two linear CCD arrays. Experimental results show that the mean back-projection residuals of CE-1 images are reduced to better than 1/100 pixel by method 1 and the mean back-projection residuals of CE-2 images are reduced from over 20 pixels to 0.02 pixel by method 2. Consequently, high-precision DEMs (Digital Elevation Models) and DOMs (Digital Ortho Maps) are automatically generated.
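
    To make the space-intersection step concrete, here is a minimal, self-contained sketch (not the authors' code; the pushbroom camera is reduced to two ideal viewing rays) that triangulates a ground point from stereo rays by linear least squares:

      import numpy as np

      def triangulate(origins, directions):
          """Least-squares intersection of viewing rays.

          origins: (n, 3) sensor positions; directions: (n, 3) ray
          directions. Minimizes the sum of squared distances from the
          point x to each ray: sum ||(I - d d^T)(x - o)||^2.
          """
          A = np.zeros((3, 3))
          b = np.zeros(3)
          for o, d in zip(origins, directions):
              d = d / np.linalg.norm(d)
              P = np.eye(3) - np.outer(d, d)  # projector orthogonal to the ray
              A += P
              b += P @ o
          return np.linalg.solve(A, b)

      # Two rays from slightly different orbit positions looking at the
      # same ground point (synthetic example).
      origins = np.array([[0.0, 0.0, 100.0], [10.0, 0.0, 100.0]])
      target = np.array([5.0, 2.0, 0.0])
      directions = target - origins
      print(triangulate(origins, directions))  # ~ [5.0, 2.0, 0.0]

    In a real pipeline the residual between such back-projected points and the measured image points is what drives the model refinement described above.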

  13. Statistical mechanics rigorous results

    CERN Document Server

    Ruelle, David

    1999-01-01

    This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.

  14. A Rigorous Treatment of Energy Extraction from a Rotating Black Hole

    Science.gov (United States)

    Finster, F.; Kamran, N.; Smoller, J.; Yau, S.-T.

    2009-05-01

    The Cauchy problem is considered for the scalar wave equation in the Kerr geometry. We prove that by choosing a suitable wave packet as initial data, one can extract energy from the black hole, thereby putting superradiance, the wave analogue of the Penrose process, into a rigorous mathematical framework. We quantify the maximal energy gain. We also compute the infinitesimal change of mass and angular momentum of the black hole, in agreement with Christodoulou's result for the Penrose process. The main mathematical tool is our previously derived integral representation of the wave propagator.

  15. High and low rigor temperature effects on sheep meat tenderness and ageing.

    Science.gov (United States)

    Devine, Carrick E; Payne, Steven R; Peachey, Bridget M; Lowe, Timothy E; Ingram, John R; Cook, Christian J

    2002-02-01

    Immediately after electrical stimulation, the paired m. longissimus thoracis et lumborum (LT) of 40 sheep were boned out and wrapped tightly with polyethylene cling film. One of the paired LTs was chilled in 15°C air to reach a rigor mortis (rigor) temperature of 18°C, and the other side was placed in a water bath at 35°C and achieved rigor at this temperature. Wrapping reduced rigor shortening and mimicked meat left on the carcass. After rigor, the meat was aged at 15°C for 0, 8, 26 and 72 h and then frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values were obtained from a 1×1 cm cross-section. The shear force values of meat for 18 and 35°C rigor were similar at zero ageing, but as ageing progressed, the 18°C rigor meat aged faster and became more tender than meat that went into rigor at 35°C: shear force values at each ageing time were significantly different, and those for 35°C rigor remained significantly greater. Thus the toughness of 35°C meat was not a consequence of muscle shortening and appears to be due to both a faster rate of tenderisation and the meat tenderising to a greater extent at the lower temperature. The cook loss at 35°C rigor (30.5%) was greater than that at 18°C rigor (28.4%) (P<0.01) and the colour Hunter L values were higher at 35°C (P<0.01) compared with 18°C, but there were no significant differences in a or b values.

  16. Layout optimization of DRAM cells using rigorous simulation model for NTD

    Science.gov (United States)

    Jeon, Jinhyuck; Kim, Shinyoung; Park, Chanha; Yang, Hyunjo; Yim, Donggyu; Kuechler, Bernd; Zimmermann, Rainer; Muelders, Thomas; Klostermann, Ulrich; Schmoeller, Thomas; Do, Mun-hoe; Choi, Jung-Hoe

    2014-03-01

    … scanning electron microscope (SEM) measurements. The high resist impact and difficult model data acquisition demand a simulation model that is capable of extrapolating reliably beyond its calibration dataset. We use rigorous simulation models to provide that predictive performance. We have discussed the need for a rigorous mask optimization process for DRAM contact cell layouts, yielding mask layouts that are optimal in process performance, mask manufacturability, and accuracy. In this paper, we have shown the step-by-step process from analytical illumination source derivation, through NTD- and application-tailored model calibration, to layout optimization such as OPC and SRAF placement. Finally, the work has been verified with simulation and experimental results on wafer.

  17. Physiological studies of muscle rigor mortis in the fowl

    International Nuclear Information System (INIS)

    Nakahira, S.; Kaneko, K.; Tanaka, K.

    1990-01-01

    A simple system was developed for continuous measurement of muscle contraction during rigor mortis. Longitudinal muscle strips dissected from the Peroneus Longus were suspended in a plastic tube containing liquid paraffin. Mechanical activity was transmitted to a strain-gauge transducer connected to a potentiometric pen-recorder. At the onset of measurement, 1.2 g was loaded on the muscle strip. This model was used to study the muscle response to various treatments during rigor mortis. All measurements were carried out under anaerobic conditions at 17°C, except where otherwise stated. 1. The present system was found to be quite useful for continuous measurement of the course of muscle rigor. 2. Muscle contraction under the anaerobic condition at 17°C reached a peak about 2 hours after the onset of measurement and thereafter relaxed at a slow rate. In contrast, the aerobic condition under high humidity resulted in a strong rigor, about three times stronger than that in the anaerobic condition. 3. Ultrasonic treatment (37,000-47,000 Hz) at 25°C for 10 minutes resulted in a moderate muscle rigor. 4. Treatment of the muscle strip with 2 mM EGTA at 30°C for 30 minutes led to relaxation of the muscle. 5. The muscle from birds killed during anesthesia with pentobarbital sodium showed a slow rate of rigor, whereas birds killed one day after hypophysectomy showed a quick muscle rigor, as seen in intact controls. 6. A slight muscle rigor was observed when the muscle strip was placed in a refrigerator at 0°C for 18.5 hours and thereafter the temperature was kept at 17°C. (author)

  18. Estimation of the breaking of rigor mortis by myotonometry.

    Science.gov (United States)

    Vain, A; Kauppila, R; Vuori, E

    1996-05-31

    Myotonometry was used to detect the breaking of rigor mortis. The myotonometer is a new instrument which measures the decaying oscillations of a muscle after a brief mechanical impact. The method gives two numerical parameters for rigor mortis, namely the period and the decrement of the oscillations, both of which depend on the time elapsed after death. When rigor mortis was broken by muscle lengthening, both the oscillation period and the decrement decreased, whereas shortening the muscle caused the opposite changes. Fourteen hours after breaking, the stiffness characteristics (oscillation periods) of the right and left m. biceps brachii had become similar. However, the values for the decrement of the muscle, reflecting the dissipation of mechanical energy, maintained their differences.
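
    For illustration (a sketch of the standard signal analysis, not the myotonometer's actual algorithm), the two parameters can be extracted from a decaying oscillation as the mean peak-to-peak interval and the logarithmic decrement:

      import numpy as np

      def period_and_decrement(signal, dt):
          """Estimate the oscillation period and logarithmic decrement.

          signal: sampled tissue displacement after a mechanical impact;
          dt: sampling interval in seconds. Peaks are located as strict
          local maxima; the decrement is the mean of ln(A_n / A_{n+1}).
          """
          s = np.asarray(signal, dtype=float)
          peaks = np.where((s[1:-1] > s[:-2]) & (s[1:-1] > s[2:]))[0] + 1
          period = float(np.mean(np.diff(peaks))) * dt
          amps = s[peaks]
          decrement = float(np.mean(np.log(amps[:-1] / amps[1:])))
          return period, decrement

      # Synthetic damped oscillation: 20 Hz with decay constant 5 s^-1,
      # so the expected period is 0.05 s and the expected decrement
      # is 5 * 0.05 = 0.25.
      t = np.arange(0.0, 0.5, 1e-4)
      x = np.exp(-5.0 * t) * np.cos(2.0 * np.pi * 20.0 * t)
      print(period_and_decrement(x, 1e-4))

    A stiffer muscle oscillates faster (shorter period), while a higher decrement means the tissue dissipates more of the impact energy per cycle.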

  19. The MINERVA Software Development Process

    Science.gov (United States)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
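
    As a toy illustration of step (4) (hypothetical code, not the MINERVA tool chain or the actual geo-containment algorithms): an even-odd ray-casting point-in-polygon implementation is evaluated numerically against an independently written winding-number "specification" on shared test points:

      import math
      import random

      def inside_impl(px, py, poly):
          """Implementation under test: even-odd ray casting."""
          inside = False
          n = len(poly)
          for i in range(n):
              x1, y1 = poly[i]
              x2, y2 = poly[(i + 1) % n]
              if (y1 > py) != (y2 > py):
                  x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
                  if px < x_cross:
                      inside = not inside
          return inside

      def inside_spec(px, py, poly):
          """Independent 'specification': the total signed angle subtended
          by the polygon is +/-2*pi for interior points, 0 for exterior."""
          total = 0.0
          n = len(poly)
          for i in range(n):
              ax, ay = poly[i][0] - px, poly[i][1] - py
              bx, by = poly[(i + 1) % n][0] - px, poly[(i + 1) % n][1] - py
              total += math.atan2(ax * by - ay * bx, ax * bx + ay * by)
          return abs(total) > math.pi

      # Numerical evaluation: both versions must agree on shared test cases.
      square = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]
      random.seed(0)
      for _ in range(1000):
          x, y = random.uniform(-1, 5), random.uniform(-1, 5)
          assert inside_impl(x, y, square) == inside_spec(x, y, square)
      print("implementation agrees with specification on 1000 test points")

    In MINERVA the "specification" side is a formally verified algorithm rather than a second hand-written function, but the mirrored-evaluation idea is the same.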

  20. Rigorous solution to Bargmann-Wigner equation for integer spin

    CERN Document Server

    Huang Shi Zhong; Wu Ning; Zheng Zhi Peng

    2002-01-01

    A rigorous method is developed to solve the Bargmann-Wigner equation for arbitrary integer spin in the coordinate representation in a step-by-step way. The Bargmann-Wigner equation is first transformed to a form that is easier to solve; the new equations are then solved rigorously in the coordinate representation, and the wave functions are thus derived in closed form.

  1. RIGOROUS PHOTOGRAMMETRIC PROCESSING OF CHANG'E-1 AND CHANG'E-2 STEREO IMAGERY FOR LUNAR TOPOGRAPHIC MAPPING

    Directory of Open Access Journals (Sweden)

    K. Di

    2012-07-01

    Chang'E-1 (CE-1) and Chang'E-2 (CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of the landing and surface operation of the Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of the CE-1 and CE-2 CCD cameras based on the push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D coordinates of a ground point in the lunar body-fixed (LBF) coordinate system can be calculated by space intersection from the image coordinates of conjugate points in stereo images, and the image coordinates can be calculated from 3D coordinates by back-projection. Due to uncertainties of the orbit and the camera, the back-projected image points differ from the measured points. In order to reduce these inconsistencies and improve precision, we proposed two methods to refine the rigorous sensor model: 1) refining the EOPs by correcting the attitude angle bias, and 2) refining the interior orientation model by calibration of the relative position of the two linear CCD arrays. Experimental results show that the mean back-projection residuals of CE-1 images are reduced to better than 1/100 pixel by method 1 and the mean back-projection residuals of CE-2 images are reduced from over 20 pixels to 0.02 pixel by method 2. Consequently, high-precision DEMs (Digital Elevation Models) and DOMs (Digital Ortho Maps) are automatically generated.

  2. RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT.

    Science.gov (United States)

    Meltzer, S J; Auer, J

    1908-01-01

    Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions (nearly equimolecular to "physiological" solutions of sodium chloride) are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case, as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor; only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium hastens also the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle.

  3. Fast and Rigorous Assignment Algorithm Multiple Preference and Calculation

    Directory of Open Access Journals (Sweden)

    Ümit Çiftçi

    2010-03-01

    The goal of this paper is to develop an algorithm that evaluates students and then places them according to their desired choices and dependent preferences. The developed algorithm is also used to implement software. The success and accuracy of the software, as well as the algorithm, were tested by applying them to the ability test at Beykent University. This ability test is repeated several times in order to fill all available places in the Fine Arts Faculty departments each academic year. It has been shown that this algorithm is very fast and rigorous, following its application in the 2008-2009 and 2009-2010 academic years. Key words: assignment algorithm, student placement, ability test
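
    The record does not spell the algorithm out; as a hypothetical sketch of score-based placement with ranked choices (all names and numbers illustrative), a single serial-dictatorship pass over students sorted by score looks like this:

      from typing import Dict, List

      def assign(scores: Dict[str, float],
                 prefs: Dict[str, List[str]],
                 quotas: Dict[str, int]) -> Dict[str, str]:
          """Place students into departments by descending score.

          Each student receives the highest-ranked department on their
          preference list that still has a free seat.
          """
          seats = dict(quotas)
          placement = {}
          for student in sorted(scores, key=scores.get, reverse=True):
              for dept in prefs[student]:
                  if seats.get(dept, 0) > 0:
                      seats[dept] -= 1
                      placement[student] = dept
                      break
          return placement

      scores = {"ayse": 91.5, "mehmet": 88.0, "zeynep": 95.2}
      prefs = {
          "ayse": ["painting", "sculpture"],
          "mehmet": ["painting", "graphics"],
          "zeynep": ["painting", "graphics"],
      }
      quotas = {"painting": 1, "sculpture": 1, "graphics": 1}
      print(assign(scores, prefs, quotas))
      # {'zeynep': 'painting', 'ayse': 'sculpture', 'mehmet': 'graphics'}

    Repeating such a pass after each test sitting, with remaining quotas carried over, matches the repeated-rounds procedure the abstract describes.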

  4. Bringing scientific rigor to community-developed programs in Hong Kong

    Directory of Open Access Journals (Sweden)

    Fabrizio Cecilia S

    2012-12-01

    Background: This paper describes efforts to generate evidence for community-developed programs to enhance family relationships in the Chinese culture of Hong Kong, within the framework of community-based participatory research (CBPR). Methods: The CBPR framework was applied to help maximize the development of the intervention and the public health impact of the studies, while enhancing the capabilities of the social service sector partners. Results: Four academic-community research teams explored the process of designing and implementing randomized controlled trials in the community. In addition to the expected cultural barriers between teams of academics and community practitioners, with their different outlooks, concerns and languages, the team navigated issues in utilizing the principles of CBPR unique to this Chinese culture. Eventually the team developed tools for adaptation, such as an emphasis on building the relationship while respecting role delineation and an iterative process of defining the non-negotiable parameters of research design while maintaining scientific rigor. Lessons learned include the risk of underemphasizing the size of the operational and skills shift between usual agency practices and research studies, the importance of minimizing non-negotiable parameters in implementing rigorous research designs in the community, and the need to view community capacity enhancement as a long-term process. Conclusions: The four pilot studies under the FAMILY Project demonstrated that nuanced design adaptations, such as wait-list controls and shorter assessments, better served the needs of the community and led to the successful development and vigorous evaluation of a series of preventive, family-oriented interventions in the Chinese culture of Hong Kong.

  5. Bringing scientific rigor to community-developed programs in Hong Kong.

    Science.gov (United States)

    Fabrizio, Cecilia S; Hirschmann, Malia R; Lam, Tai Hing; Cheung, Teresa; Pang, Irene; Chan, Sophia; Stewart, Sunita M

    2012-12-31

    This paper describes efforts to generate evidence for community-developed programs to enhance family relationships in the Chinese culture of Hong Kong, within the framework of community-based participatory research (CBPR). The CBPR framework was applied to help maximize the development of the intervention and the public health impact of the studies, while enhancing the capabilities of the social service sector partners. Four academic-community research teams explored the process of designing and implementing randomized controlled trials in the community. In addition to the expected cultural barriers between teams of academics and community practitioners, with their different outlooks, concerns and languages, the team navigated issues in utilizing the principles of CBPR unique to this Chinese culture. Eventually the team developed tools for adaptation, such as an emphasis on building the relationship while respecting role delineation and an iterative process of defining the non-negotiable parameters of research design while maintaining scientific rigor. Lessons learned include the risk of underemphasizing the size of the operational and skills shift between usual agency practices and research studies, the importance of minimizing non-negotiable parameters in implementing rigorous research designs in the community, and the need to view community capacity enhancement as a long term process. The four pilot studies under the FAMILY Project demonstrated that nuanced design adaptations, such as wait list controls and shorter assessments, better served the needs of the community and led to the successful development and vigorous evaluation of a series of preventive, family-oriented interventions in the Chinese culture of Hong Kong.

  6. Application of Statistical Model in Wastewater Treatment Process Modeling Using Data Analysis

    Directory of Open Access Journals (Sweden)

    Alireza Raygan Shirazinezhad

    2015-06-01

    Background: Wastewater treatment involves very complex and interrelated physical, chemical, and biological processes which, using data analysis techniques, can be rigorously modeled by non-complex mathematical models. Materials and Methods: In this study, data on wastewater treatment processes from the water and wastewater company of Kohgiluyeh and Boyer-Ahmad were used. A total of 3306 data points for COD, TSS, pH, and turbidity were collected and then analyzed with SPSS 16 (descriptive statistics) and IBM SPSS Modeler 14.2 (data analysis) using nine algorithms. Results: The logistic regression, neural network, Bayesian network, discriminant analysis, decision tree C5, C&R tree, CHAID, QUEST, and SVM algorithms had accuracies of 90.16, 94.17, 81.37, 70.48, 97.89, 96.56, 96.46, 96.84, and 88.92, respectively. Discussion and conclusion: The C5 algorithm, with an accuracy of 97.89, was chosen as the best and most applicable algorithm for modeling wastewater treatment processes; the most influential variables in this model were pH, COD, TSS, and turbidity.
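
    A hedged sketch of this kind of model comparison (synthetic stand-in data; scikit-learn stands in for SPSS Modeler, and CART stands in for C5, which scikit-learn does not provide):

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.linear_model import LogisticRegression
      from sklearn.neural_network import MLPClassifier
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      # Stand-in for the plant data: columns COD, TSS, pH, turbidity;
      # label = whether the effluent meets the discharge standard.
      X = rng.normal(size=(500, 4))
      y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 0).astype(int)

      models = {
          "logistic regression": LogisticRegression(),
          "neural network": MLPClassifier(max_iter=2000, random_state=0),
          "decision tree (CART)": DecisionTreeClassifier(random_state=0),
          "SVM": SVC(),
      }
      for name, model in models.items():
          acc = cross_val_score(model, X, y, cv=5).mean()
          print(f"{name:22s} accuracy = {acc:.3f}")

    Cross-validated accuracy, as used here, is a more defensible basis for picking the best algorithm than a single train/test split.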

  7. Rigor mortis development in turkey breast muscle and the effect of electrical stunning.

    Science.gov (United States)

    Alvarado, C Z; Sams, A R

    2000-11-01

    Rigor mortis development in turkey breast muscle and the effect of electrical stunning on this process are not well characterized. Some electrical stunning procedures have been known to inhibit postmortem (PM) biochemical reactions, thereby delaying the onset of rigor mortis in broilers. Therefore, this study was designed to characterize rigor mortis development in stunned and unstunned turkeys. A total of 154 turkey toms in two trials were conventionally processed at 20 to 22 wk of age. Turkeys were either stunned with a pulsed direct current (500 Hz, 50% duty cycle) at 35 mA (40 V) in a saline bath for 12 seconds or left unstunned as controls. At 15 min and 1, 2, 4, 8, 12, and 24 h PM, pectoralis samples were collected to determine pH, R-value, L* value, sarcomere length, and shear value. In Trial 1, the samples obtained for pH, R-value, and sarcomere length were divided into surface and interior samples. There were no significant differences between the surface and interior samples for any parameter measured. Muscle pH significantly decreased over time in stunned and unstunned birds through 2 h PM. The R-values increased to 8 h PM in unstunned birds and 24 h PM in stunned birds. The L* values increased over time, with no significant differences after 1 h PM for the controls and 2 h PM for the stunned birds. Sarcomere length increased through 2 h PM in the controls and 12 h PM in the stunned fillets. Cooked meat shear values decreased through the 1 h PM deboning time in the control fillets and 2 h PM in the stunned fillets. These results suggest that stunning delayed the development of rigor mortis through 2 h PM, but had no significant effect on the measured parameters at later time points, and that deboning turkey breasts at 2 h PM or later will not significantly impair meat tenderness.

  8. Dynamic modeling of ultrafiltration membranes for whey separation processes

    NARCIS (Netherlands)

    Saltik, M.B.; Ozkan, L.; Jacobs, M.; van der Padt, A.

    2017-01-01

    In this paper, we present a control-relevant rigorous dynamic model for an ultrafiltration membrane unit in a whey separation process. The model consists of a set of differential algebraic equations and is developed for online model-based applications such as model-based control and process…
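
    For flavor only, here is a minimal resistance-in-series flux sketch (a textbook ultrafiltration model, not the paper's DAE model; all symbols and parameter values are illustrative):

      import numpy as np

      def simulate_flux(tmp_pa, mu_pa_s, r_membrane, k_fouling, t_end_s, dt_s):
          """Darcy / resistance-in-series ultrafiltration sketch.

          Flux J = TMP / (mu * (Rm + Rc(t))), with the cake resistance
          Rc growing in proportion to the permeate volume per unit area.
          """
          v = 0.0  # cumulative permeate volume per unit area, m
          times, fluxes = [], []
          for t in np.arange(0.0, t_end_s, dt_s):
              rc = k_fouling * v                          # cake resistance, 1/m
              j = tmp_pa / (mu_pa_s * (r_membrane + rc))  # flux, m/s
              v += j * dt_s
              times.append(t)
              fluxes.append(j)
          return np.array(times), np.array(fluxes)

      t, j = simulate_flux(tmp_pa=2e5, mu_pa_s=1e-3, r_membrane=1e12,
                           k_fouling=1e14, t_end_s=3600.0, dt_s=1.0)
      print(f"initial flux {j[0]:.2e} m/s, after 1 h {j[-1]:.2e} m/s")

    A control-relevant model such as the one in the paper adds algebraic constraints (e.g., concentration polarization and component balances) on top of this kind of flux dynamics.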

  9. Rigorous bounds on the free energy of electron-phonon models

    NARCIS (Netherlands)

    Raedt, Hans De; Michielsen, Kristel

    1997-01-01

    We present a collection of rigorous upper and lower bounds to the free energy of electron-phonon models with linear electron-phonon interaction. These bounds are used to compare different variational approaches. It is shown rigorously that the ground states corresponding to the sharpest bounds do

  10. Accelerating Biomedical Discoveries through Rigor and Transparency.

    Science.gov (United States)

    Hewitt, Judith A; Brown, Liliana L; Murphy, Stephanie J; Grieder, Franziska; Silberberg, Shai D

    2017-07-01

    Difficulties in reproducing published research findings have garnered a lot of press in recent years. As a funder of biomedical research, the National Institutes of Health (NIH) has taken measures to address underlying causes of low reproducibility. Extensive deliberations resulted in a policy, released in 2015, to enhance reproducibility through rigor and transparency. We briefly explain what led to the policy, describe its elements, provide examples and resources for the biomedical research community, and discuss the potential impact of the policy on translatability with a focus on research using animal models. Importantly, while increased attention to rigor and transparency may lead to an increase in the number of laboratory animals used in the near term, it will lead to more efficient and productive use of such resources in the long run. The translational value of animal studies will be improved through more rigorous assessment of experimental variables and data, leading to better assessments of the translational potential of animal models, for the benefit of the research community and society. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  11. Trends: Rigor Mortis in the Arts.

    Science.gov (United States)

    Blodget, Alden S.

    1991-01-01

    Outlines how past art education provided a refuge for students from the rigors of other academic subjects. Observes that in recent years art education has become "discipline based." Argues that art educators need to reaffirm their commitment to a humanistic way of knowing. (KM)

  12. Rigor, vigor, and the study of health disparities.

    Science.gov (United States)

    Adler, Nancy; Bush, Nicole R; Pantell, Matthew S

    2012-10-16

    Health disparities research spans multiple fields and methods and documents strong links between social disadvantage and poor health. Associations between socioeconomic status (SES) and health are often taken as evidence for the causal impact of SES on health, but alternative explanations, including the impact of health on SES, are plausible. Studies showing the influence of parents' SES on their children's health provide evidence for a causal pathway from SES to health, but have limitations. Health disparities researchers face tradeoffs between "rigor" and "vigor" in designing studies that demonstrate how social disadvantage becomes biologically embedded and results in poorer health. Rigorous designs aim to maximize precision in the measurement of SES and health outcomes through methods that provide the greatest control over temporal ordering and causal direction. To achieve precision, many studies use a single SES predictor and single disease. However, doing so oversimplifies the multifaceted, entwined nature of social disadvantage and may overestimate the impact of that one variable and underestimate the true impact of social disadvantage on health. In addition, SES effects on overall health and functioning are likely to be greater than effects on any one disease. Vigorous designs aim to capture this complexity and maximize ecological validity through more complete assessment of social disadvantage and health status, but may provide less-compelling evidence of causality. Newer approaches to both measurement and analysis may enable enhanced vigor as well as rigor. Incorporating both rigor and vigor into studies will provide a fuller understanding of the causes of health disparities.

  13. Semi-physical Simulation of the Airborne InSAR based on Rigorous Geometric Model and Real Navigation Data

    Science.gov (United States)

    Changyong, Dou; Huadong, Guo; Chunming, Han; yuquan, Liu; Xijuan, Yue; Yinghui, Zhao

    2014-03-01

    Raw signal simulation is a useful tool for the system design, mission planning, processing algorithm testing, and inversion algorithm design of Synthetic Aperture Radar (SAR). Due to the wide and frequent variation of an aircraft's trajectory and attitude, and the low accuracy of the Position and Orientation System (POS) recording data, it is difficult to quantitatively study the sensitivity of the key parameters of an airborne Interferometric SAR (InSAR) system, i.e., the baseline length and inclination, the absolute phase, and the orientation of the antennas, resulting in challenges for its applications. Furthermore, the imprecise estimation of the installation offset between the Global Positioning System (GPS), Inertial Measurement Unit (IMU), and the InSAR antennas compounds the issue. An airborne InSAR simulation based on a rigorous geometric model and real navigation data is proposed in this paper, providing a way to quantitatively study the key parameters and to evaluate their effect on the applications of airborne InSAR, such as photogrammetric mapping, high-resolution Digital Elevation Model (DEM) generation, and surface deformation mapping by Differential InSAR technology. The simulation can also provide a reference for the optimal design of the InSAR system and the improvement of InSAR data processing technologies such as motion compensation, imaging, image co-registration, and application parameter retrieval.
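
    For reference, the standard across-track InSAR relations (textbook forms quoted for orientation, not taken from the paper; sign conventions vary by author, and $p = 2$ corresponds to a repeat-pass or ping-pong configuration) connect exactly the parameters the simulation is designed to probe:

    $$\phi = -\frac{2\pi p}{\lambda}\,(R_1 - R_2) \approx -\frac{2\pi p}{\lambda}\,B\sin(\theta-\alpha), \qquad \frac{\partial\phi}{\partial h} = -\frac{2\pi p}{\lambda}\,\frac{B\cos(\theta-\alpha)}{R\sin\theta},$$

    where $B$ is the baseline length, $\alpha$ its inclination, $\theta$ the look angle, $R$ the slant range, and $\lambda$ the wavelength; errors in $B$, $\alpha$, or the absolute phase therefore map directly into DEM height errors.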

  14. Semi-physical Simulation of the Airborne InSAR based on Rigorous Geometric Model and Real Navigation Data

    International Nuclear Information System (INIS)

    Changyong, Dou; Huadong, Guo; Chunming, Han; Yuquan, Liu; Xijuan, Yue; Yinghui, Zhao

    2014-01-01

    Raw signal simulation is a useful tool for the system design, mission planning, processing algorithm testing, and inversion algorithm design of Synthetic Aperture Radar (SAR). Due to the wide and frequent variation of an aircraft's trajectory and attitude, and the low accuracy of the Position and Orientation System (POS) recording data, it is difficult to quantitatively study the sensitivity of the key parameters of an airborne Interferometric SAR (InSAR) system, i.e., the baseline length and inclination, the absolute phase, and the orientation of the antennas, resulting in challenges for its applications. Furthermore, the imprecise estimation of the installation offset between the Global Positioning System (GPS), Inertial Measurement Unit (IMU), and the InSAR antennas compounds the issue. An airborne InSAR simulation based on a rigorous geometric model and real navigation data is proposed in this paper, providing a way to quantitatively study the key parameters and to evaluate their effect on the applications of airborne InSAR, such as photogrammetric mapping, high-resolution Digital Elevation Model (DEM) generation, and surface deformation mapping by Differential InSAR technology. The simulation can also provide a reference for the optimal design of the InSAR system and the improvement of InSAR data processing technologies such as motion compensation, imaging, image co-registration, and application parameter retrieval.

  15. A CUMULATIVE MIGRATION METHOD FOR COMPUTING RIGOROUS TRANSPORT CROSS SECTIONS AND DIFFUSION COEFFICIENTS FOR LWR LATTICES WITH MONTE CARLO

    Energy Technology Data Exchange (ETDEWEB)

    Zhaoyuan Liu; Kord Smith; Benoit Forget; Javier Ortensi

    2016-05-01

    A new method for computing homogenized assembly neutron transport cross sections and diffusion coefficients that is both rigorous and computationally efficient is proposed in this paper. In the limit of a homogeneous hydrogen slab, the new method is equivalent to the long-used, and only-recently-published, CASMO transport method. The rigorous method is used to demonstrate the sources of inaccuracy in the commonly applied "out-scatter" transport correction. It is also demonstrated that the newly developed method is directly applicable to lattice calculations performed by Monte Carlo and is capable of computing rigorous homogenized transport cross sections for arbitrarily heterogeneous lattices. Comparisons of several common transport cross section approximations are presented for a simple problem of an infinite medium of hydrogen. The new method has also been applied in computing 2-group diffusion data for an actual PWR lattice from the BEAVRS benchmark.
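
    For context, the commonly applied "out-scatter" transport correction mentioned above takes the standard textbook form (stated here for orientation, not the paper's new method):

    $$\Sigma_{tr} = \Sigma_t - \bar{\mu}_0\,\Sigma_s, \qquad D = \frac{1}{3\,\Sigma_{tr}},$$

    where $\bar{\mu}_0$ is the mean cosine of the scattering angle. This credits forward-peaked scattering by subtracting $\bar{\mu}_0\,\Sigma_s$ from the total cross section, but it neglects the angular and energy distribution of the in-scatter source, which is the inaccuracy the rigorous method is used to expose (most visibly for hydrogen).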

  16. Biomedical text mining for research rigor and integrity: tasks, challenges, directions.

    Science.gov (United States)

    Kilicoglu, Halil

    2017-06-13

    An estimated quarter of a trillion US dollars is invested in the biomedical research enterprise annually. There is growing alarm that a significant portion of this investment is wasted because of problems in reproducibility of research findings and in the rigor and integrity of research conduct and reporting. Recent years have seen a flurry of activities focusing on standardization and guideline development to enhance the reproducibility and rigor of biomedical research. Research activity is primarily communicated via textual artifacts, ranging from grant applications to journal publications. These artifacts can be both the source and the manifestation of practices leading to research waste. For example, an article may describe a poorly designed experiment, or the authors may reach conclusions not supported by the evidence presented. In this article, we pose the question of whether biomedical text mining techniques can assist the stakeholders in the biomedical research enterprise in doing their part toward enhancing research integrity and rigor. In particular, we identify four key areas in which text mining techniques can make a significant contribution: plagiarism/fraud detection, ensuring adherence to reporting guidelines, managing information overload and accurate citation/enhanced bibliometrics. We review the existing methods and tools for specific tasks, if they exist, or discuss relevant research that can provide guidance for future work. With the exponential increase in biomedical research output and the ability of text mining approaches to perform automatic tasks at large scale, we propose that such approaches can support tools that promote responsible research practices, providing significant benefits for the biomedical research enterprise. Published by Oxford University Press 2017. This work is written by a US Government employee and is in the public domain in the US.

  17. How Individual Scholars Can Reduce the Rigor-Relevance Gap in Management Research

    OpenAIRE

    Wolf, Joachim; Rosenberg, Timo

    2012-01-01

    This paper discusses a number of avenues management scholars could follow to reduce the existing gap between scientific rigor and practical relevance without relativizing the importance of the first goal dimension. Such changes are necessary because many management studies do not fully exploit the possibilities to increase their practical relevance while maintaining scientific rigor. We argue that this rigor-relevance gap is not only the consequence of the currently prevailing institutional c...

  18. Onset of rigor mortis is earlier in red muscle than in white muscle.

    Science.gov (United States)

    Kobayashi, M; Takatori, T; Nakajima, M; Sakurada, K; Hatanaka, K; Ikegaya, H; Matsuda, Y; Iwase, H

    2000-01-01

    Rigor mortis is thought to be related to falling ATP levels in muscles postmortem. We measured rigor mortis as tension determined isometrically in three rat leg muscles in liquid paraffin kept at 37 degrees C or 25 degrees C--two red muscles, red gastrocnemius (RG) and soleus (SO) and one white muscle, white gastrocnemius (WG). Onset, half and full rigor mortis occurred earlier in RG and SO than in WG both at 37 degrees C and at 25 degrees C even though RG and WG were portions of the same muscle. This suggests that rigor mortis directly reflects the postmortem intramuscular ATP level, which decreases more rapidly in red muscle than in white muscle after death. Rigor mortis was more retarded at 25 degrees C than at 37 degrees C in each type of muscle.

  19. Genomic signal processing

    CERN Document Server

    Shmulevich, Ilya

    2007-01-01

    Genomic signal processing (GSP) can be defined as the analysis, processing, and use of genomic signals to gain biological knowledge, and the translation of that knowledge into systems-based applications that can be used to diagnose and treat genetic diseases. Situated at the crossroads of engineering, biology, mathematics, statistics, and computer science, GSP requires the development of both nonlinear dynamical models that adequately represent genomic regulation, and diagnostic and therapeutic tools based on these models. This book facilitates these developments by providing rigorous mathema

  20. Photoconductivity of amorphous silicon-rigorous modelling

    International Nuclear Information System (INIS)

    Brada, P.; Schauer, F.

    1991-01-01

    It is our great pleasure to express our gratitude to Prof. Grigorovici, the pioneer of the exciting field of the amorphous state, with this modest contribution to the area. In this paper, the outline of a rigorous modelling program for the steady-state photoconductivity in amorphous silicon and related materials is presented. (Author)

  1. A Thermodynamic Library for Simulation and Optimization of Dynamic Processes

    DEFF Research Database (Denmark)

    Ritschel, Tobias Kasper Skovborg; Gaspar, Jozsef; Jørgensen, John Bagterp

    2017-01-01

    Process system tools, such as simulation and optimization of dynamic systems, are widely used in the process industries for development of operational strategies and control for process systems. These tools rely on thermodynamic models and many thermodynamic models have been developed for different...... compounds and mixtures. However, rigorous thermodynamic models are generally computationally intensive and not available as open-source libraries for process simulation and optimization. In this paper, we describe the application of a novel open-source rigorous thermodynamic library, ThermoLib, which...... is designed for dynamic simulation and optimization of vapor-liquid processes. ThermoLib is implemented in Matlab and C and uses cubic equations of state to compute vapor and liquid phase thermodynamic properties. The novelty of ThermoLib is that it provides analytical first and second order derivatives...
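
    ThermoLib itself is implemented in Matlab and C; as a rough sketch of the kind of cubic-equation-of-state calculation such a library performs, the following Python fragment solves the Peng-Robinson equation for the compressibility factor of pure CO2. The critical constants are standard table values; the function name and interface are invented for this example.

```python
import numpy as np

R = 8.314462618  # universal gas constant [J/(mol K)]

def pr_compressibility(T, P, Tc=304.13, Pc=7.3773e6, omega=0.22394):
    """Vapor- and liquid-like compressibility factors of pure CO2 at T [K],
    P [Pa] from the Peng-Robinson equation of state (illustrative only)."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc))) ** 2
    a = 0.45724 * R**2 * Tc**2 * alpha / Pc
    b = 0.07780 * R * Tc / Pc
    A = a * P / (R * T) ** 2
    B = b * P / (R * T)
    # cubic in Z: Z^3 - (1-B) Z^2 + (A - 3B^2 - 2B) Z - (AB - B^2 - B^3) = 0
    roots = np.roots([1.0, B - 1.0, A - 3.0 * B**2 - 2.0 * B,
                      -(A * B - B**2 - B**3)])
    real = np.sort(roots[np.isreal(roots)].real)
    return real[-1], real[0]   # largest = vapor-like, smallest = liquid-like

Zv, Zl = pr_compressibility(280.0, 4.0e6)
print(f"Z_vapor = {Zv:.4f}, Z_liquid = {Zl:.4f}")
```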

  2. Reframing Rigor: A Modern Look at Challenge and Support in Higher Education

    Science.gov (United States)

    Campbell, Corbin M.; Dortch, Deniece; Burt, Brian A.

    2018-01-01

    This chapter describes the limitations of the traditional notions of academic rigor in higher education, and brings forth a new form of rigor that has the potential to support student success and equity.

  3. Parent Management Training-Oregon Model: Adapting Intervention with Rigorous Research.

    Science.gov (United States)

    Forgatch, Marion S; Kjøbli, John

    2016-09-01

    Parent Management Training-Oregon Model (PMTO®) is a set of theory-based parenting programs with status as evidence-based treatments. PMTO has been rigorously tested in efficacy and effectiveness trials in different contexts, cultures, and formats. Parents, the presumed agents of change, learn core parenting practices, specifically skill encouragement, limit setting, monitoring/supervision, interpersonal problem solving, and positive involvement. The intervention effectively prevents and ameliorates children's behavior problems by replacing coercive interactions with positive parenting practices. Delivery format includes sessions with individual families in agencies or families' homes, parent groups, and web-based and telehealth communication. Mediational models have tested parenting practices as mechanisms of change for children's behavior and found support for the theory underlying PMTO programs. Moderating effects include children's age, maternal depression, and social disadvantage. The Norwegian PMTO implementation is presented as an example of how PMTO has been tailored to reach diverse populations as delivered by multiple systems of care throughout the nation. An implementation and research center in Oslo provides infrastructure and promotes collaboration between practitioners and researchers to conduct rigorous intervention research. Although evidence-based and tested within a wide array of contexts and populations, PMTO must continue to adapt to an ever-changing world. © 2016 Family Process Institute.

  4. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature of a viable high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements, and verifying the software design and code against those requirements using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also fosters a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)

  5. Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.

    Science.gov (United States)

    Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P

    2018-03-03

    Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Studies on the estimation of the postmortem interval. 3. Rigor mortis (author's transl).

    Science.gov (United States)

    Suzutani, T; Ishibashi, H; Takatori, T

    1978-11-01

    The authors have devised a method for classifying rigor mortis into 10 types based on its appearance and strength in various parts of a cadaver. By applying the method to the findings of 436 cadavers which were subjected to medico-legal autopsies in our laboratory during the last 10 years, it has been demonstrated that the classifying method is effective for analyzing the phenomenon of onset, persistence and disappearance of rigor mortis statistically. The investigation of the relationship between each type of rigor mortis and the postmortem interval has demonstrated that rigor mortis may be utilized as a basis for estimating the postmortem interval but the values have greater deviation than those described in current textbooks.

  7. Rigor mortis in an unusual position: Forensic considerations.

    Science.gov (United States)

    D'Souza, Deepak H; Harish, S; Rajesh, M; Kiran, J

    2011-07-01

    We report a case in which the dead body was found with rigor mortis in an unusual position. The body was lying on its back with limbs raised, defying gravity. The direction of the salivary stains on the face also defied gravity. We opined that the scene of occurrence of the crime was unlikely to be the final place where the body was found. The clues pointed to a homicidal offence and an attempt to destroy the evidence. The forensic use of 'rigor mortis in an unusual position' lies in furthering the investigation and in the scientific confirmation of two facts: that the scene of death (occurrence) is different from the scene of disposal of the body, and the time gap between the two places.

  8. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles

    Science.gov (United States)

    Choi, Yun-Sang

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of a high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). An increase in pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, the difference in NaCl concentration between 3% and 4% produced no great differences in the physicochemical and textural properties attributable to pre-rigor salting (p>0.05). Therefore, our study confirmed the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that a 2% NaCl concentration is the minimum required to ensure a definite pre-rigor salting effect on chicken breast muscle. PMID:26761884

  9. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles.

    Science.gov (United States)

    Kim, Hyun-Wook; Hwang, Ko-Eun; Song, Dong-Heon; Kim, Yong-Jae; Ham, Youn-Kyung; Yeo, Eui-Joo; Jeong, Tae-Jun; Choi, Yun-Sang; Kim, Cheon-Jei

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of a high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). An increase in pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, the difference in NaCl concentration between 3% and 4% produced no great differences in the physicochemical and textural properties attributable to pre-rigor salting (p>0.05). Therefore, our study confirmed the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that a 2% NaCl concentration is the minimum required to ensure a definite pre-rigor salting effect on chicken breast muscle.

  10. A novel construction of complex-valued Gaussian processes with arbitrary spectral densities and its application to excitation energy transfer.

    Science.gov (United States)

    Chen, Xin; Cao, Jianshu; Silbey, Robert J

    2013-06-14

    The recent experimental discoveries about excitation energy transfer (EET) in light harvesting antennae (LHA) have attracted considerable interest. As an open non-equilibrium quantum system, the EET demands a more rigorous theoretical framework to understand the interaction between the system and its environment and, therein, the evolution of the reduced density matrix. A phonon bath is often used to model the fluctuating environment and convolutes the reduced quantum system temporally. In this paper, we propose a novel way to construct complex-valued Gaussian processes that describe the thermal quantum phonon bath exactly, by converting the convolution of the influence functional into the time correlation of a complex Gaussian random field. Based on this construction, we propose a rigorous and efficient computational method, the covariance decomposition and conditional propagation scheme, to simulate the temporally entangled reduced system. The new method allows us to study the non-Markovian effect without perturbation under the influence of different spectral densities of the linear system-phonon coupling coefficients. Its application in the study of EET in the Fenna-Matthews-Olson model Hamiltonian under four different spectral densities is discussed. Since the scaling of our algorithm is linear due to its Monte Carlo nature, the future application of the method to large LHA systems is attractive. In addition, this method can be used to study the effect of correlated initial conditions on the reduced dynamics in the future.
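
    The paper's covariance decomposition and conditional propagation scheme is not reproduced here, but the basic idea of realizing a complex-valued Gaussian process with a prescribed spectral density can be sketched by superposing independent Fourier modes. The Debye density and all parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_gaussian_process(J, T=100.0, n_modes=4096, n_t=512):
    """One realization of a stationary complex Gaussian process whose
    correlation is the Fourier transform of J (up to normalization)."""
    dw = 2.0 * np.pi / T
    w = dw * np.arange(1, n_modes + 1)
    # independent complex Gaussian amplitude per mode, total variance J(w) dw
    a = np.sqrt(J(w) * dw / 2.0) * (rng.standard_normal(n_modes)
                                    + 1j * rng.standard_normal(n_modes))
    t = np.linspace(0.0, T, n_t)
    return t, np.exp(-1j * np.outer(t, w)) @ a   # superpose the modes

# Debye (overdamped) spectral density, a common choice in EET studies
debye = lambda w, lam=35.0, wc=1.0: 2.0 * lam * wc * w / (w**2 + wc**2)
t, xi = sample_gaussian_process(debye)
print(xi[:3])
```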

  11. Interval-parameter chance-constraint programming model for end-of-life vehicles management under rigorous environmental regulations.

    Science.gov (United States)

    Simic, Vladimir

    2016-06-01

    As the number of end-of-life vehicles (ELVs) is estimated to increase to 79.3 million units per year by 2020 (e.g., 40 million units were generated in 2010), there is strong motivation to effectively manage this fast-growing waste flow. Intensive work on the management of ELVs is necessary in order to more successfully tackle this important environmental challenge. This paper proposes an interval-parameter chance-constraint programming model for end-of-life vehicle management under rigorous environmental regulations. The proposed model can incorporate various kinds of uncertainty information in the modeling process. The complex relationships between different ELV management sub-systems are successfully addressed. In particular, the formulated model can help identify optimal patterns of procurement from multiple sources of ELV supply, production and inventory planning in multiple vehicle recycling factories, and allocation of sorted material flows to multiple final destinations under rigorous environmental regulations. A case study is conducted in order to demonstrate the potential and applicability of the proposed model. Various constraint-violation probability levels are examined in detail. Influences of parameter uncertainty on model solutions are thoroughly investigated. Useful solutions for the management of ELVs are obtained under different probabilities of violating the system constraints. The formulated model is able to tackle a hard ELV management problem involving uncertainty. The model has advantages in providing a basis for determining long-term ELV management plans with desired compromises between the economic efficiency of the vehicle recycling system and system-reliability considerations. The results are helpful for supporting the generation and improvement of ELV management plans. Copyright © 2016 Elsevier Ltd. All rights reserved.
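
    Independent of this particular ELV model, the core move in chance-constraint programming is converting a probabilistic constraint into a deterministic equivalent. A minimal sketch with an invented, normally distributed capacity parameter:

```python
from scipy.stats import norm

# P(x <= b) >= alpha with uncertain b ~ N(mu, sigma^2) reduces to the
# deterministic bound  x <= mu + sigma * Phi^{-1}(1 - alpha).
mu, sigma = 100.0, 10.0            # invented capacity parameter, e.g. kt/year
for alpha in (0.99, 0.95, 0.90):   # required probability of satisfaction
    print(f"alpha={alpha:.2f}: x <= {mu + sigma * norm.ppf(1 - alpha):.1f}")
```

    As the loop shows, demanding a lower constraint-violation probability (higher alpha) tightens the usable capacity, which is exactly the trade-off between economic efficiency and system reliability the abstract describes.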

  12. Rigorous theory of molecular orientational nonlinear optics

    International Nuclear Information System (INIS)

    Kwak, Chong Hoon; Kim, Gun Yeup

    2015-01-01

    Classical statistical mechanics of the molecular optics theory proposed by Buckingham [A. D. Buckingham and J. A. Pople, Proc. Phys. Soc. A 68, 905 (1955)] has been extended to describe the field induced molecular orientational polarization effects on nonlinear optics. In this paper, we present the generalized molecular orientational nonlinear optical processes (MONLO) through the calculation of the classical orientational averaging using the Boltzmann type time-averaged orientational interaction energy in the randomly oriented molecular system under the influence of applied electric fields. The focal points of the calculation are (1) the derivation of rigorous tensorial components of the effective molecular hyperpolarizabilities, (2) the molecular orientational polarizations and the electronic polarizations including the well-known third-order dc polarization, dc electric field induced Kerr effect (dc Kerr effect), optical Kerr effect (OKE), dc electric field induced second harmonic generation (EFISH), degenerate four wave mixing (DFWM) and third harmonic generation (THG). We also present some of the new predictive MONLO processes. For second-order MONLO, second-order optical rectification (SOR), Pockels effect and difference frequency generation (DFG) are described in terms of the anisotropic coefficients of first hyperpolarizability. And, for third-order MONLO, third-order optical rectification (TOR), dc electric field induced difference frequency generation (EFIDFG) and pump-probe transmission are presented

  13. Likelihood-based inference for discretely observed birth-death-shift processes, with applications to evolution of mobile genetic elements.

    Science.gov (United States)

    Xu, Jason; Guttorp, Peter; Kato-Maeda, Midori; Minin, Vladimir N

    2015-12-01

    Continuous-time birth-death-shift (BDS) processes are frequently used in stochastic modeling, with many applications in ecology and epidemiology. In particular, such processes can model the evolutionary dynamics of transposable elements, important genetic markers in molecular epidemiology. Estimation of the effects of individual covariates on the birth, death, and shift rates of the process can be accomplished by analyzing patient data, but inferring these rates in a discretely and unevenly observed setting presents computational challenges. We propose a multi-type branching process approximation to BDS processes and develop a corresponding expectation maximization algorithm, where we use spectral techniques to reduce the calculation of expected sufficient statistics to low-dimensional integration. These techniques yield an efficient and robust optimization routine for inferring the rates of the BDS process, and apply broadly to multi-type branching processes whose rates can depend on many covariates. After rigorously testing our methodology in simulation studies, we apply our method to study the intrapatient time evolution of the IS6110 transposable element, a genetic marker frequently used during estimation of epidemiological clusters of Mycobacterium tuberculosis infections. © 2015, The International Biometric Society.

  14. Moving beyond Data Transcription: Rigor as Issue in Representation of Digital Literacies

    Science.gov (United States)

    Hagood, Margaret Carmody; Skinner, Emily Neil

    2015-01-01

    Rigor in qualitative research has been based upon criteria of credibility, dependability, confirmability, and transferability. Drawing upon articles published during our editorship of the "Journal of Adolescent & Adult Literacy," we illustrate how the use of digital data in research study reporting may enhance these areas of rigor,…

  15. Application of a systematic methodology for sustainable carbon dioxide utilization process design

    DEFF Research Database (Denmark)

    Plaza, Cristina Calvera; Frauzem, Rebecca; Gani, Rafiqul

    than carbon capture and storage. To achieve this, a methodology is developed to design sustainable carbon dioxide utilization processes. First, the information on the possible utilization alternatives is collected, including the economic potential of the process and the carbon dioxide emissions...... emission are desired in order to reduce the carbon dioxide emissions. Using this estimated preliminary evaluation, the top processes with the most negative carbon dioxide emissions are investigated by rigorous, detailed simulation to evaluate the net carbon dioxide emissions. Once the base case design

  16. Trends in Methodological Rigor in Intervention Research Published in School Psychology Journals

    Science.gov (United States)

    Burns, Matthew K.; Klingbeil, David A.; Ysseldyke, James E.; Petersen-Brown, Shawna

    2012-01-01

    Methodological rigor in intervention research is important for documenting evidence-based practices and has been a recent focus in legislation, including the No Child Left Behind Act. The current study examined the methodological rigor of intervention research in four school psychology journals since the 1960s. Intervention research has increased…

  17. Memory sparing, fast scattering formalism for rigorous diffraction modeling

    Science.gov (United States)

    Iff, W.; Kämpfe, T.; Jourlin, Y.; Tishchenko, A. V.

    2017-07-01

    The basics and algorithmic steps of a novel scattering formalism suited for memory sparing and fast electromagnetic calculations are presented. The formalism, called ‘S-vector algorithm’ (by analogy with the known scattering-matrix algorithm), allows the calculation of the collective scattering spectra of individual layered micro-structured scattering objects. A rigorous method of linear complexity is applied to model the scattering at individual layers; here the generalized source method (GSM) resorting to Fourier harmonics as basis functions is used as one possible method of linear complexity. The concatenation of the individual scattering events can be achieved sequentially or in parallel, both having pros and cons. The present development will largely concentrate on a consecutive approach based on the multiple reflection series. The latter will be reformulated into an implicit formalism which will be associated with an iterative solver, resulting in improved convergence. The examples will first refer to 1D grating diffraction for the sake of simplicity and intelligibility, with a final 2D application example.
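
    The S-vector algorithm itself is not reproduced here, but the multiple reflection series it reformulates already appears in the classical Redheffer star product for stacking two scattering layers, where the sum over repeated inter-layer reflections is captured by a matrix inverse. A generic sketch follows; the block naming convention is this example's, not the paper's.

```python
import numpy as np

def star(S1, S2):
    """Redheffer star product: combine two layer S-matrices into one.
    Blocks: tf/rf = transmission/reflection for forward incidence,
            tb/rb = the same for backward incidence (square ndarrays)."""
    I = np.eye(S1["tf"].shape[0])
    # M and N sum the multiple reflection series sum_k (R1 R2)^k in closed form
    M = np.linalg.inv(I - S1["rb"] @ S2["rf"])
    N = np.linalg.inv(I - S2["rf"] @ S1["rb"])
    return {
        "tf": S2["tf"] @ M @ S1["tf"],
        "rf": S1["rf"] + S1["tb"] @ N @ S2["rf"] @ S1["tf"],
        "rb": S2["rb"] + S2["tf"] @ M @ S1["rb"] @ S2["tb"],
        "tb": S1["tb"] @ N @ S2["tb"],
    }
```

    Concatenating a stack then reduces to folding `star` over the per-layer S-matrices, which stays numerically stable where naive T-matrix products overflow.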

  18. Radiation processes in astrophysics

    CERN Document Server

    Tucker, Wallace H

    1975-01-01

    The purpose of this book is twofold: to provide a brief, simple introduction to the theory of radiation and its application in astrophysics and to serve as a reference manual for researchers. The first part of the book consists of a discussion of the basic formulas and concepts that underlie the classical and quantum descriptions of radiation processes. The rest of the book is concerned with applications. The spirit of the discussion is to present simple derivations that will provide some insight into the basic physics involved and then to state the exact results in a form useful for applications. The reader is referred to the original literature and to reviews for rigorous derivations.The wide range of topics covered is illustrated by the following table of contents: Basic Formulas for Classical Radiation Processes; Basic Formulas for Quantum Radiation Processes; Cyclotron and Synchrotron Radiation; Electron Scattering; Bremsstrahlung and Collision Losses; Radiative Recombination; The Photoelectric Effect; a...

  19. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Science.gov (United States)

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical

  20. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Directory of Open Access Journals (Sweden)

    David C Pavlacky

    Full Text Available Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1 coordination across organizations and regions, 2 meaningful management and conservation objectives, and 3 rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17. We provide two examples for the Brewer's sparrow (Spizella breweri in BCR 17 demonstrating the ability of the design to 1 determine hierarchical population responses to landscape change and 2 estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous

  1. Some rigorous results concerning spectral theory for ideal MHD

    International Nuclear Information System (INIS)

    Laurence, P.

    1986-01-01

    Spectral theory for linear ideal MHD is laid on a firm foundation by defining appropriate function spaces for the operators associated with both the first- and second-order (in time and space) partial differential operators. Thus, it is rigorously established that a self-adjoint extension of F(xi) exists. It is shown that the operator L associated with the first-order formulation satisfies the conditions of the Hille--Yosida theorem. A foundation is laid thereby within which the domains associated with the first- and second-order formulations can be compared. This allows future work in a rigorous setting that will clarify the differences (in the two formulations) between the structure of the generalized eigenspaces corresponding to the marginal point of the spectrum ω = 0

  2. Some rigorous results concerning spectral theory for ideal MHD

    International Nuclear Information System (INIS)

    Laurence, P.

    1985-05-01

    Spectral theory for linear ideal MHD is laid on a firm foundation by defining appropriate function spaces for the operators associated with both the first and second order (in time and space) partial differential operators. Thus, it is rigorously established that a self-adjoint extension of F(xi) exists. It is shown that the operator L associated with the first order formulation satisfies the conditions of the Hille-Yosida theorem. A foundation is laid thereby within which the domains associated with the first and second order formulations can be compared. This allows future work in a rigorous setting that will clarify the differences (in the two formulations) between the structure of the generalized eigenspaces corresponding to the marginal point of the spectrum ω = 0

  3. Rigor mortis development at elevated temperatures induces pale exudative turkey meat characteristics.

    Science.gov (United States)

    McKee, S R; Sams, A R

    1998-01-01

    Development of rigor mortis at elevated post-mortem temperatures may contribute to turkey meat characteristics that are similar to those found in pale, soft, exudative pork. To evaluate this effect, 36 Nicholas tom turkeys were processed at 19 wk of age and placed in water at 40, 20, and 0 C immediately after evisceration. Pectoralis muscle samples were taken at 15 min, 30 min, 1 h, 2 h, and 4 h post-mortem and analyzed for R-value (an indirect measure of adenosine triphosphate), glycogen, pH, color, and sarcomere length. At 4 h, the remaining intact Pectoralis muscle was harvested, aged on ice for 23 h, and analyzed for drip loss, cook loss, shear values, and sarcomere length. By 15 min post-mortem, the 40 C treatment had higher R-values, which persisted through 4 h. By 1 h, the 40 C treatment pH and glycogen levels were lower than those of the 0 C treatment; however, they did not differ from those of the 20 C treatment. Increased L* values indicated that color became more pale by 2 h post-mortem in the 40 C treatment when compared to the 20 and 0 C treatments. Drip loss, cook loss, and shear values were increased, whereas sarcomere lengths were decreased, as a result of the 40 C treatment. These findings suggested that elevated post-mortem temperatures during processing resulted in an acceleration of rigor mortis and biochemical changes in the muscle that produced pale, exudative meat characteristics in turkey.

  4. Caracterização do processo de rigor mortis em músculos de eqüinos e maciez da carne / Characterization of the rigor mortis process in horse muscles and meat tenderness

    Directory of Open Access Journals (Sweden)

    Tatiana Pacheco Rodrigues

    2004-08-01

    Full Text Available This work studied 12 horses of different ages slaughtered in a slaughterhouse in Minas Gerais State, Brazil (SIF 1803), and evaluated temperature, pH, and sarcomere length at different times after slaughter (1 h, 5 h, 8 h, 10 h, 12 h, 15 h, and 24 h), as well as the shear force (meat tenderness) of the Longissimus dorsi and Semitendinosus muscles, aiming at characterizing the development of rigor mortis during industrial processing. The chill-room temperature varied from 10.2°C to 4.0°C, and the mean carcass temperature was initially 35.32°C and finally 4.15°C. The mean initial pH of the Longissimus dorsi was 6.49 and the final pH was 5.63; for the Semitendinosus, the mean initial pH was 6.44 and the final pH was 5.70. The smallest sarcomere length in both muscles was observed at the 15th hour after slaughter, at 1.44 µm and 1.41 µm, respectively. The meat from adult horses was tougher than that from young ones (p < 0.05), and the Semitendinosus muscle was tougher than the Longissimus dorsi muscle.

  5. Direct integration of the S-matrix applied to rigorous diffraction

    International Nuclear Information System (INIS)

    Iff, W; Lindlein, N; Tishchenko, A V

    2014-01-01

    A novel Fourier method for rigorous diffraction computation at periodic structures is presented. The procedure is based on a differential equation for the S-matrix, which allows direct integration of the S-matrix blocks. This results in a new method in Fourier space, which can be considered a numerically stable and well-parallelizable alternative to the conventional differential method based on T-matrix integration and subsequent conversion from the T-matrices to S-matrix blocks. Integration of the novel differential equation in an implicit manner is expounded. The applicability of the new method is shown on the basis of 1D periodic structures. It is clear, however, that the new technique can also be applied to arbitrary 2D periodic or periodized structures. The complexity of the new method is O(N^3), similar to the conventional differential method, with N being the number of diffraction orders. (fast track communication)

  6. Validation of a functional model for integration of safety into process system design

    DEFF Research Database (Denmark)

    Wu, J.; Lind, M.; Zhang, X.

    2015-01-01

    with the process system functionalities as required for the intended safety applications. To provide the scientific rigor and facilitate the acceptance of qualitative modelling, this contribution focuses on developing a scientifically based validation method for functional models. The Multilevel Flow Modeling (MFM...

  7. A rigorous semantics for BPMN 2.0 process diagrams

    CERN Document Server

    Kossak, Felix; Geist, Verena; Kubovy, Jan; Natschläger, Christine; Ziebermayr, Thomas; Kopetzky, Theodorich; Freudenthaler, Bernhard; Schewe, Klaus-Dieter

    2015-01-01

    This book provides the most complete formal specification of the semantics of the Business Process Model and Notation 2.0 standard (BPMN) available to date, in a style that is easily understandable for a wide range of readers - not only for experts in formal methods, but e.g. also for developers of modeling tools, software architects, or graduate students specializing in business process management. BPMN - issued by the Object Management Group - is a widely used standard for business process modeling. However, major drawbacks of BPMN include its limited support for organizational modeling, i

  8. Critical Analysis of Strategies for Determining Rigor in Qualitative Inquiry.

    Science.gov (United States)

    Morse, Janice M

    2015-09-01

    Criteria for determining the trustworthiness of qualitative research were introduced by Guba and Lincoln in the 1980s when they replaced terminology for achieving rigor, reliability, validity, and generalizability with dependability, credibility, and transferability. Strategies for achieving trustworthiness were also introduced. This landmark contribution to qualitative research remains in use today, with only minor modifications in format. Despite the significance of this contribution over the past four decades, the strategies recommended to achieve trustworthiness have not been critically examined. Recommendations for where, why, and how to use these strategies have not been developed, and how well they achieve their intended goal has not been examined. We do not know, for example, what impact these strategies have on the completed research. In this article, I critique these strategies. I recommend that qualitative researchers return to the terminology of social sciences, using rigor, reliability, validity, and generalizability. I then make recommendations for the appropriate use of the strategies recommended to achieve rigor: prolonged engagement, persistent observation, and thick, rich description; inter-rater reliability, negative case analysis; peer review or debriefing; clarifying researcher bias; member checking; external audits; and triangulation. © The Author(s) 2015.

  9. A new look at the statistical assessment of approximate and rigorous methods for the estimation of stabilized formation temperatures in geothermal and petroleum wells

    International Nuclear Information System (INIS)

    Espinoza-Ojeda, O M; Santoyo, E; Andaverde, J

    2011-01-01

    Approximate and rigorous solutions of seven heat transfer models were statistically examined, for the first time, to estimate stabilized formation temperatures (SFT) of geothermal and petroleum boreholes. Constant linear and cylindrical heat source models were used to describe the heat flow (either conductive or conductive/convective) involved during a borehole drilling. A comprehensive statistical assessment of the major error sources associated with the use of these models was carried out. The mathematical methods (based on approximate and rigorous solutions of heat transfer models) were thoroughly examined by using four statistical analyses: (i) the use of linear and quadratic regression models to infer the SFT; (ii) the application of statistical tests of linearity to evaluate the actual relationship between bottom-hole temperatures and time function data for each selected method; (iii) the comparative analysis of SFT estimates between the approximate and rigorous predictions of each analytical method using a β ratio parameter to evaluate the similarity of both solutions, and (iv) the evaluation of accuracy in each method using statistical tests of significance, and deviation percentages between 'true' formation temperatures and SFT estimates (predicted from approximate and rigorous solutions). The present study also enabled us to determine the sensitivity parameters that should be considered for a reliable calculation of SFT, as well as to define the main physical and mathematical constraints where the approximate and rigorous methods could provide consistent SFT estimates
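
    As a concrete instance of the regression route to SFT mentioned in (i), the sketch below fits bottom-hole temperatures against the classic Horner time function and reads the stabilized temperature off the intercept. The circulation time, shut-in times, and temperatures are invented illustrative data, not values from the study.

```python
import numpy as np

# Horner-type SFT estimate (illustrative data): T(f) = SFT + m * f, where the
# time function f -> 0 as shut-in time grows, so the intercept is the SFT.
tc = 4.0                                      # circulation time [h] (assumed)
dt = np.array([6.0, 12.0, 18.0, 24.0])        # shut-in times [h]
Tbh = np.array([98.0, 105.0, 109.0, 111.5])   # bottom-hole temperatures [C]

f = np.log((dt + tc) / dt)                    # Horner time function
slope, intercept = np.polyfit(f, Tbh, 1)      # linear regression of T on f
print(f"estimated SFT ~ {intercept:.1f} C")
```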

  10. The Relationship between Project-Based Learning and Rigor in STEM-Focused High Schools

    Science.gov (United States)

    Edmunds, Julie; Arshavsky, Nina; Glennie, Elizabeth; Charles, Karen; Rice, Olivia

    2016-01-01

    Project-based learning (PjBL) is an approach often favored in STEM classrooms, yet some studies have shown that teachers struggle to implement it with academic rigor. This paper explores the relationship between PjBL and rigor in the classrooms of ten STEM-oriented high schools. Utilizing three different data sources reflecting three different…

  11. Rigorous approximation of stationary measures and convergence to equilibrium for iterated function systems

    International Nuclear Information System (INIS)

    Galatolo, Stefano; Monge, Maurizio; Nisoli, Isaia

    2016-01-01

    We study the problem of the rigorous computation of the stationary measure and of the rate of convergence to equilibrium of an iterated function system described by a stochastic mixture of two or more dynamical systems that are either all uniformly expanding on the interval or all contracting. In the expanding case, the associated transfer operators satisfy a Lasota-Yorke inequality, and we show how to compute a rigorous approximation of the stationary measure in the L^1 norm and an estimate for the rate of convergence. The rigorous computation requires a computer-aided proof of the contraction of the transfer operators for the maps, and we show that this property propagates to the transfer operators of the IFS. In the contracting case we perform a rigorous approximation of the stationary measure in the Wasserstein-Kantorovich distance and of the rate of convergence, using the same functional analytic approach. We show that a finite computation can produce a realistic computation of all contraction rates for the whole parameter space. We conclude with a description of the implementation and numerical experiments. (paper)
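
    Without the paper's rigorous error bounds, the underlying computation can still be illustrated: discretize the transfer operator of a contracting two-map IFS on a partition of [0,1] (here the classical Cantor IFS, chosen for this example) and iterate to an approximate fixed point.

```python
import numpy as np

n = 200                                     # number of partition cells on [0, 1]
maps = [(0.5, lambda x: x / 3.0),           # (probability, map) pairs: Cantor IFS
        (0.5, lambda x: x / 3.0 + 2.0 / 3.0)]

# Ulam matrix P[i, j] ~ Prob(point in cell i lands in cell j after one step)
rng = np.random.default_rng(1)
P = np.zeros((n, n))
for i in range(n):
    xs = rng.uniform(i / n, (i + 1) / n, 500)   # sample points in cell i
    for p, T in maps:
        js = np.minimum((T(xs) * n).astype(int), n - 1)
        np.add.at(P[i], js, p / len(xs))

# stationary cell masses = fixed point of the (row-stochastic) matrix P
m = np.full(n, 1.0 / n)
for _ in range(300):
    m = m @ P
print("total mass:", round(m.sum(), 6),
      "mass in [0, 1/3]:", round(m[: n // 3].sum(), 3))   # ~0.5 for Cantor measure
```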

  12. Experimental evaluation of rigor mortis. VIII. Estimation of time since death by repeated measurements of the intensity of rigor mortis on rats.

    Science.gov (United States)

    Krompecher, T

    1994-10-21

    The development of the intensity of rigor mortis was monitored in nine groups of rats. The measurements were initiated after 2, 4, 5, 6, 8, 12, 15, 24, and 48 h post mortem (p.m.) and lasted 5-9 h, which ideally should correspond to the usual procedure after the discovery of a corpse. The experiments were carried out at an ambient temperature of 24 degrees C. Measurements initiated early after death resulted in curves with a rising portion, a plateau, and a descending slope. Delaying the initial measurement translated into shorter rising portions, and curves initiated 8 h p.m. or later are comprised of a plateau and/or a downward slope only. Three different phases were observed suggesting simple rules that can help estimate the time since death: (1) if an increase in intensity was found, the initial measurements were conducted not later than 5 h p.m.; (2) if only a decrease in intensity was observed, the initial measurements were conducted not earlier than 7 h p.m.; and (3) at 24 h p.m., the resolution is complete, and no further changes in intensity should occur. Our results clearly demonstrate that repeated measurements of the intensity of rigor mortis allow a more accurate estimation of the time since death of the experimental animals than the single measurement method used earlier. A critical review of the literature on the estimation of time since death on the basis of objective measurements of the intensity of rigor mortis is also presented.
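
    The three rules quoted above can be restated as a trivially small decision procedure. The sketch below merely encodes those findings (obtained in rats at an ambient temperature of 24 degrees C); it is not a validated forensic tool.

```python
def interpret_rigor_series(intensities):
    """Apply the three rules to repeated rigor-intensity readings taken at
    regular intervals after discovery (rat data, 24 C ambient)."""
    pairs = list(zip(intensities, intensities[1:]))
    if any(b > a for a, b in pairs):
        return "increase seen: first reading made no later than ~5 h post mortem"
    if any(b < a for a, b in pairs):
        return "only decrease seen: first reading made no earlier than ~7 h post mortem"
    return "no change: resolution complete, consistent with >= 24 h post mortem"

print(interpret_rigor_series([3, 5, 7, 7, 6]))  # rising portion present
```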

  13. ATP, IMP, and glycogen in cod muscle at onset and during development of rigor mortis depend on the sampling location

    DEFF Research Database (Denmark)

    Cappeln, Gertrud; Jessen, Flemming

    2002-01-01

    Variation in glycogen, ATP, and IMP contents within individual cod muscles was studied in ice-stored fish during the progress of rigor mortis. The rigor index was determined before muscle samples for chemical analyses were taken at 16 different positions on the fish. During the development of rigor, the contents of glycogen and ATP decreased differently in relation to the rigor index, depending on sampling location. Although fish were considered to be in strong rigor according to the rigor index method, parts of the muscle were not in rigor, as high ATP concentrations were found in the dorsal and tail muscle.

  14. Integrated Process Design and Control of Reactive Distillation Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil; Sales-Cruz, Mauricio; Huusom, Jakob Kjøbsted

    2015-01-01

    on the element concept, which is used to translate a system of compounds into elements. The operation of the reactive distillation column at the highest driving force and other candidate points is analyzed through analytical solution as well as rigorous open-loop and closed-loop simulations. By application...... of this approach, it is shown that designing the reactive distillation process at the maximum driving force results in an optimal design in terms of controllability and operability. It is verified that the reactive distillation design option is less sensitive to the disturbances in the feed at the highest driving...

  15. Disciplining Bioethics: Towards a Standard of Methodological Rigor in Bioethics Research

    Science.gov (United States)

    Adler, Daniel; Shaul, Randi Zlotnik

    2012-01-01

    Contemporary bioethics research is often described as multi- or interdisciplinary. Disciplines are characterized, in part, by their methods. Thus, when bioethics research draws on a variety of methods, it crosses disciplinary boundaries. Yet each discipline has its own standard of rigor—so when multiple disciplinary perspectives are considered, what constitutes rigor? This question has received inadequate attention, as there is considerable disagreement regarding the disciplinary status of bioethics. This disagreement has presented five challenges to bioethics research. Addressing them requires consideration of the main types of cross-disciplinary research, and consideration of proposals aiming to ensure rigor in bioethics research. PMID:22686634

  16. School Psychology as a Relational Enterprise: The Role and Process of Qualitative Methodology

    Science.gov (United States)

    Newman, Daniel S.; Clare, Mary M.

    2016-01-01

    The purpose of this article is to explore the application of qualitative research to establishing a more complete understanding of relational processes inherent in school psychology practice. We identify the building blocks of rigorous qualitative research design through a conceptual overview of qualitative paradigms, methodologies, methods (i.e.,…

  17. 49 CFR 262.11 - Application process.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Application process. 262.11 Section 262.11 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... IMPROVEMENT PROJECTS § 262.11 Application process. (a) All grant applications for opportunities funded under...

  18. Electrocardiogram artifact caused by rigors mimicking narrow complex tachycardia: a case report.

    Science.gov (United States)

    Matthias, Anne Thushara; Indrakumar, Jegarajah

    2014-02-04

    The electrocardiogram (ECG) is useful in the diagnosis of cardiac and non-cardiac conditions. Rigors due to shivering can cause electrocardiogram artifacts mimicking various cardiac rhythm abnormalities. We describe an 80-year-old Sri Lankan man with an abnormal electrocardiogram mimicking narrow complex tachycardia during the immediate post-operative period. Electrocardiogram changes caused by muscle tremor during rigors could mimic a narrow complex tachycardia. Identification of muscle tremor as a cause of electrocardiogram artifact can avoid unnecessary pharmacological and non-pharmacological intervention to prevent arrhythmias.

  19. Effect of muscle restraint on sheep meat tenderness with rigor mortis at 18°C.

    Science.gov (United States)

    Devine, Carrick E; Payne, Steven R; Wells, Robyn W

    2002-02-01

    The effect on shear force of skeletal restraint and of removing muscles from lamb m. longissimus thoracis et lumborum (LT) immediately after slaughter and electrical stimulation was studied at a rigor temperature of 18°C (n=15). The temperature of 18°C was achieved by chilling electrically stimulated sheep carcasses in air at 12°C, with an air flow of 1-1.5 m s(-1). In other groups, the muscle was removed at 2.5 h post-mortem and either wrapped or left non-wrapped before being placed back on the carcass to follow the carcass cooling regime. Following rigor mortis, the meat was aged for 0, 16, 40 and 65 h at 15°C and frozen. For the non-stimulated samples, the meat was aged for 0, 12, 36 and 60 h before being frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values were obtained from a 1 × 1 cm cross-section. Commencement of ageing was considered to take place at rigor mortis, and this meat was taken as zero-aged. There were no significant differences in the rate of tenderisation or initial shear force between treatments. The cook loss of about 23% was similar for all wrapped and non-wrapped treatments, and the values decreased slightly with longer ageing durations. Wrapping was shown to mimic meat left intact on the carcass, as it prevented significant pre-rigor shortening. Such techniques allow muscles to be removed and placed in a controlled temperature environment, enabling precise studies of ageing processes.

  20. Biclustering via optimal re-ordering of data matrices in systems biology: rigorous methods and comparative studies

    Directory of Open Access Journals (Sweden)

    Feng Xiao-Jiang

    2008-10-01

    Full Text Available Abstract Background The analysis of large-scale data sets via clustering techniques is utilized in a number of applications. Biclustering in particular has emerged as an important problem in the analysis of gene expression data since genes may only jointly respond over a subset of conditions. Biclustering algorithms also have important applications in sample classification where, for instance, tissue samples can be classified as cancerous or normal. Many of the methods for biclustering, and clustering algorithms in general, utilize simplified models or heuristic strategies for identifying the "best" grouping of elements according to some metric and cluster definition and thus result in suboptimal clusters. Results In this article, we present a rigorous approach to biclustering, OREO, which is based on the Optimal RE-Ordering of the rows and columns of a data matrix so as to globally minimize the dissimilarity metric. The physical permutations of the rows and columns of the data matrix can be modeled as either a network flow problem or a traveling salesman problem. Cluster boundaries in one dimension are used to partition and re-order the other dimensions of the corresponding submatrices to generate biclusters. The performance of OREO is tested on (a) metabolite concentration data, (b) an image reconstruction matrix, (c) synthetic data with implanted biclusters, and gene expression data for (d) colon cancer data, (e) breast cancer data, as well as (f) yeast segregant data to validate the ability of the proposed method and compare it to existing biclustering and clustering methods. Conclusion We demonstrate that this rigorous global optimization method for biclustering produces clusters with more insightful groupings of similar entities, such as genes or metabolites sharing common functions, than other clustering and biclustering algorithms and can reconstruct underlying fundamental patterns in the data for several distinct sets of data matrices arising
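
    OREO solves the re-ordering globally as a network flow or traveling salesman problem; as a much weaker but readable stand-in, a nearest-neighbour heuristic conveys what optimal re-ordering of rows means in practice. The data and function name are invented for this example.

```python
import numpy as np

# Greedy stand-in for optimal row re-ordering (not OREO's global formulation):
# order rows so that each row is followed by its most similar remaining row.
def reorder_rows(X):
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # pairwise distances
    order, left = [0], set(range(1, len(X)))
    while left:
        nxt = min(left, key=lambda j: d[order[-1], j])  # closest remaining row
        order.append(nxt)
        left.remove(nxt)
    return order

X = np.array([[0.0, 0.1], [5.0, 5.2], [0.2, 0.0], [5.1, 4.9]])
print(reorder_rows(X))   # similar rows become adjacent: [0, 2, 3, 1]
```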

  1. 7 CFR 1416.703 - Application process.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Application process. 1416.703 Section 1416.703 Agriculture Regulations of the Department of Agriculture (Continued) COMMODITY CREDIT CORPORATION, DEPARTMENT... PROGRAMS 2005 Hurricane Tree Assistance Program § 1416.703 Application process. (a) A complete application...

  2. 7 CFR 1416.103 - Application process.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Application process. 1416.103 Section 1416.103 Agriculture Regulations of the Department of Agriculture (Continued) COMMODITY CREDIT CORPORATION, DEPARTMENT... PROGRAMS Livestock Compensation Program § 1416.103 Application process. (a) Applicants must submit to CCC...

  3. Process Fragment Libraries for Easier and Faster Development of Process-based Applications

    Directory of Open Access Journals (Sweden)

    David Schumm

    2011-01-01

    Full Text Available The term “process fragment” is recently gaining momentum in business process management research. We understand a process fragment as a connected and reusable process structure, which has relaxed completeness and consistency criteria compared to executable processes. We claim that process fragments allow for an easier and faster development of process-based applications. As evidence to this claim we present a process fragment concept and show a sample collection of concrete, real-world process fragments. We present advanced application scenarios for using such fragments in development of process-based applications. Process fragments are typically managed in a repository, forming a process fragment library. On top of a process fragment library from previous work, we discuss the potential impact of using process fragment libraries in cross-enterprise collaboration and application integration.

  4. Applicability of statistical process control techniques

    NARCIS (Netherlands)

    Schippers, W.A.J.

    1998-01-01

    This paper concerns the application of Process Control Techniques (PCTs) for the improvement of the technical performance of discrete production processes. Successful applications of these techniques, such as Statistical Process Control Techniques (SPC), can be found in the literature. However, some

  5. Measurements of the degree of development of rigor mortis as an indicator of stress in slaughtered pigs.

    Science.gov (United States)

    Warriss, P D; Brown, S N; Knowles, T G

    2003-12-13

    The degree of development of rigor mortis in the carcases of slaughter pigs was assessed subjectively on a three-point scale 35 minutes after they were exsanguinated, and related to the levels of cortisol, lactate and creatine kinase in blood collected at exsanguination. Earlier rigor development was associated with higher concentrations of these stress indicators in the blood. This relationship suggests that the mean rigor score, and the frequency distribution of carcases that had or had not entered rigor, could be used as an index of the degree of stress to which the pigs had been subjected.

  6. Rigorous derivation of porous-media phase-field equations

    Science.gov (United States)

    Schmuck, Markus; Kalliadasis, Serafim

    2017-11-01

    The evolution of interfaces in Complex heterogeneous Multiphase Systems (CheMSs) plays a fundamental role in a wide range of scientific fields such as thermodynamic modelling of phase transitions, materials science, or as a computational tool for interfacial flow studies or material design. Here, we focus on phase-field equations in CheMSs such as porous media. To the best of our knowledge, we present the first rigorous derivation of error estimates for fourth order, upscaled, and nonlinear evolution equations. For CheMSs with heterogeneity ε, we obtain the convergence rate ε^(1/4), which governs the error between the solution of the new upscaled formulation and the solution of the microscopic phase-field problem. This error behaviour has recently been validated computationally. Due to the wide range of application of phase-field equations, we expect this upscaled formulation to allow for new modelling, analytic, and computational perspectives for interfacial transport and phase transformations in CheMSs. This work was supported by EPSRC, UK, through Grant Nos. EP/H034587/1, EP/L027186/1, EP/L025159/1, EP/L020564/1, EP/K008595/1, and EP/P011713/1 and from ERC via Advanced Grant No. 247031.
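    Stated as a worked estimate, the quoted rate can be read in the following schematic form; the norm, the constant C, and the function spaces are illustrative assumptions, not the paper's precise statement.

```latex
% Schematic form of the quoted convergence result.
\[
  \bigl\| \phi_\varepsilon - \phi_\varepsilon^{\mathrm{upscaled}} \bigr\|
  \;\le\; C\,\varepsilon^{1/4}
\]
```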

  7. 7 CFR 1416.204 - Application process.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Application process. 1416.204 Section 1416.204 Agriculture Regulations of the Department of Agriculture (Continued) COMMODITY CREDIT CORPORATION, DEPARTMENT... PROGRAMS Livestock Indemnity Program II § 1416.204 Application process. (a) Applicants must submit to CCC a...

  8. 7 CFR 1709.114 - Application process.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Application process. 1709.114 Section 1709.114 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF... Application process. The RUS will request applications for high energy cost grants on a competitive basis by...

  9. 7 CFR 760.1105 - Application process.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Application process. 760.1105 Section 760.1105 Agriculture Regulations of the Department of Agriculture (Continued) FARM SERVICE AGENCY, DEPARTMENT OF... Application process. (a) Participants must submit to FSA: (1) A completed application in accordance with § 760...

  10. 7 CFR 760.907 - Application process.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Application process. 760.907 Section 760.907 Agriculture Regulations of the Department of Agriculture (Continued) FARM SERVICE AGENCY, DEPARTMENT OF... Application process. (a) To apply for 2005-2007 LIP, submit a completed application to the administrative...

  11. 49 CFR 80.7 - Application process.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false Application process. 80.7 Section 80.7 Transportation Office of the Secretary of Transportation CREDIT ASSISTANCE FOR SURFACE TRANSPORTATION PROJECTS § 80.7 Application process. (a) Public and private applicants for credit assistance under this part...

  12. Effects of well-boat transportation on the muscle pH and onset of rigor mortis in Atlantic salmon.

    Science.gov (United States)

    Gatica, M C; Monti, G; Gallo, C; Knowles, T G; Warriss, P D

    2008-07-26

    During the transport of salmon (Salmo salar) in a well-boat, 10 fish were sampled at each of six stages: in cages after crowding at the farm (stage 1), in the well-boat after loading (stage 2), in the well-boat after eight hours of transport and before unloading (stage 3), in the resting cages immediately after unloading (stage 4), after 24 hours of resting in cages (stage 5), and in the processing plant after pumping from the resting cages (stage 6). The water in the well-boat was at ambient temperature with recirculation to the sea. At each stage the fish were stunned percussively and bled by gill cutting. Immediately after death, and then every three hours for 18 hours, the muscle pH and rigor index of the fish were measured. At successive stages the initial muscle pH of the fish decreased, except for a slight gain at stage 5, after they had been rested for 24 hours. The lowest initial muscle pH was observed at stage 6. The fishes' rigor index showed that rigor developed more quickly at each successive stage, except for a slight decrease in rate at stage 5, attributable to the recovery of muscle reserves.
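    The abstract does not state which rigor index formula was used; a widely used definition for fish (after Bito et al.) measures the droop of the tail of a fish suspended half off a horizontal table. A minimal sketch under that assumption:

```python
# Common fish rigor index (assumed definition, after Bito et al.):
# tail droop at death versus droop at time t for a half-suspended fish.
def rigor_index(droop_at_death_cm, droop_at_time_t_cm):
    """0% = fully flexible (at death), 100% = fully in rigor."""
    return 100.0 * (droop_at_death_cm - droop_at_time_t_cm) / droop_at_death_cm

print(rigor_index(10.0, 2.5))   # 75.0 -> well into rigor
```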

  13. Sonoelasticity to monitor mechanical changes during rigor and ageing.

    Science.gov (United States)

    Ayadi, A; Culioli, J; Abouelkaram, S

    2007-06-01

    We propose the use of sonoelasticity as a non-destructive method to monitor changes in the resistance of muscle fibres, unaffected by connective tissue. Vibrations were applied at low frequency to induce oscillations in soft tissues and an ultrasound transducer was used to detect the motions. The experiments were carried out on the M. biceps femoris muscles of three beef cattle. In addition to the sonoelasticity measurements, the changes in meat during rigor and ageing were followed by measurements of both the mechanical resistance of myofibres and pH. The variations of mechanical resistance and pH were compared to those of the sonoelastic variables (velocity and attenuation) at two frequencies. The relationships between pH and velocity or attenuation and between the velocity or attenuation and the stress at 20% deformation were highly correlated. We concluded that sonoelasticity is a non-destructive method that can be used to monitor mechanical changes in muscle fibers during rigor-mortis and ageing.

  14. Applicant Perspectives on the Otolaryngology Residency Application Process.

    Science.gov (United States)

    Ward, Matthew; Pingree, Christian; Laury, Adrienne M; Bowe, Sarah N

    2017-08-01

    It has been nearly 25 years since medical students were queried regarding their perspectives on otolaryngology-head and neck surgery (OHNS) residency selection. Understanding this viewpoint is critical to improving the current application process. To evaluate the perceptions of 2016 OHNS residency applicants regarding the application process and offer suggestions for reform. In this cross-sectional study of anonymous online survey data, a 14-question survey was designed based on resources obtained from a computerized PubMed, Ovid, and Google Scholar database search of the English-language literature from January 1, 1990, through December 31, 2015, using the following search terms: (medical student OR applicant) AND (application OR match) AND otolaryngology. The survey was administered to 2016 OHNS residency applicants to examine 4 primary areas: current attitudes toward the match, the effect of the new Otolaryngology Program Directors Organization personal statement mandate, sources of advice and information, and suggestions for improvement. In January 2016, an email was sent to 100 program directors asking them to distribute the survey to current OHNS applicants at their institution. One follow-up reminder email was sent in February 2016. A link to the survey was posted on the Otomatch.com homepage on January 28, 2016, with the last response received on March 28, 2016. Survey responses regarding the residency application process. A total of 150 of 370 residency applicants (40.5%) responded to the survey. Of these, 125 respondents (90.6%) noted applying to programs in which they had no specific interest simply to improve their chances of matching. Applicants intended to apply to more programs than they actually did (63.6 vs 60.8; r = 0.19; 95% CI, -0.03 to 0.40). Program directors advised fewer applications than other sources; however, 58 respondents (38.7%) did not receive advice from a program director. A total of 121 respondents (80.7%) found online program

  15. 8 CFR 1240.63 - Application process.

    Science.gov (United States)

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Application process. 1240.63 Section 1240.63 Aliens and Nationality EXECUTIVE OFFICE FOR IMMIGRATION REVIEW, DEPARTMENT OF JUSTICE IMMIGRATION... Application process. (a) Form and fees. Except as provided in paragraph (b) of this section, the application...

  16. Model-Based Integrated Process Design and Controller Design of Chemical Processes

    DEFF Research Database (Denmark)

    Abd Hamid, Mohd Kamaruddin Bin

    This thesis describes the development and application of a new systematic model-based methodology for performing integrated process design and controller design (IPDC) of chemical processes. The new methodology is simple to apply, easy to visualize and efficient to solve. Here, the IPDC problem, which is typically formulated as a mathematical programming (optimization with constraints) problem, is solved by the so-called reverse approach by decomposing it into four sequential hierarchical sub-problems: (i) pre-analysis, (ii) design analysis, (iii) controller design analysis, and (iv) final selection, in which candidate designs are ordered according to the defined performance criteria (objective function). The final selected design is then verified through rigorous simulation. In the pre-analysis sub-problem, the concepts of attainable region and driving force are used to locate the optimal process-controller design solution

  17. A rigorous proof for the Landauer-Büttiker formula

    DEFF Research Database (Denmark)

    Cornean, Horia Decebal; Jensen, Arne; Moldoveanu, V.

    Recently, Avron et al. shed new light on the question of quantum transport in mesoscopic samples coupled to particle reservoirs by semi-infinite leads. They rigorously treat the case when the sample undergoes an adiabatic evolution, thus generating a current through the leads, and prove the so-called...
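    For orientation, the formula named in the title is, in its textbook two-terminal form (up to spin degeneracy; this is the standard version, not the paper's precise hypotheses):

```latex
% Textbook two-terminal Landauer-Büttiker formula.
\[
  I \;=\; \frac{e}{h} \int \mathrm{d}E\; T(E)\,\bigl[ f_L(E) - f_R(E) \bigr]
\]
```

Here T(E) is the transmission probability through the sample and f_L, f_R are the Fermi functions of the two leads.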

  18. 47 CFR 76.41 - Franchise application process.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Franchise application process. 76.41 Section 76... MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Cable Franchise Applications § 76.41 Franchise application process. (a) Definition. Competitive franchise applicant. For the purpose of this section, an applicant...

  19. Application of wavelets in speech processing

    CERN Document Server

    Farouk, Mohamed Hesham

    2014-01-01

    This book provides a survey of the widespread employment of wavelet analysis in different applications of speech processing. The author examines development and research in different applications of speech processing. The book also summarizes the state of the art research on wavelets in speech processing.

  20. Coupling of Rigor Mortis and Intestinal Necrosis during C. elegans Organismal Death

    Directory of Open Access Journals (Sweden)

    Evgeniy R. Galimov

    2018-03-01

    Full Text Available Organismal death is a process of systemic collapse whose mechanisms are less well understood than those of cell death. We previously reported that death in C. elegans is accompanied by a calcium-propagated wave of intestinal necrosis, marked by a wave of blue autofluorescence (death fluorescence. Here, we describe another feature of organismal death, a wave of body wall muscle contraction, or death contraction (DC. This phenomenon is accompanied by a wave of intramuscular Ca2+ release and, subsequently, of intestinal necrosis. Correlation of directions of the DC and intestinal necrosis waves implies coupling of these death processes. Long-lived insulin/IGF-1-signaling mutants show reduced DC and delayed intestinal necrosis, suggesting possible resistance to organismal death. DC resembles mammalian rigor mortis, a postmortem necrosis-related process in which Ca2+ influx promotes muscle hyper-contraction. In contrast to mammals, DC is an early rather than a late event in C. elegans organismal death.

  1. Process-driven applications with BPMN

    CERN Document Server

    Stiehl, Volker

    2014-01-01

    How can we optimize differentiating business processes and exploit their full potential? Here Volker Stiehl provides answers, utilizing the various options that the BPMN (Business Process Model and Notation) standard offers for planning, implementing and monitoring processes. The book presents an approach for implementing an architecture for applications that strives to find a balance between development and maintenance costs, sustainability, scalability and fault tolerance; that meets flexibility requirements without becoming inordinately complex itself; and that keeps the end application a

  2. [A new formula for the measurement of rigor mortis: the determination of the FRR-index (author's transl)].

    Science.gov (United States)

    Forster, B; Ropohl, D; Raule, P

    1977-07-05

    The manual examination of rigor mortis as currently used, with its often subjective evaluation, frequently produces highly incorrect deductions. It is therefore desirable that such inaccuracies be replaced by the objective measurement of rigor mortis at the extremities. To that purpose, a method is described which can also be applied in on-the-spot investigations, and a new formula for the determination of rigor mortis indices (FRR) is introduced.

  3. A rigorous mechanistic model for predicting gas hydrate formation kinetics: The case of CO2 recovery and sequestration

    International Nuclear Information System (INIS)

    ZareNezhad, Bahman; Mottahedin, Mona

    2012-01-01

    Highlights: ► A mechanistic model for predicting gas hydrate formation kinetics is presented. ► A secondary nucleation rate model is proposed for the first time. ► Crystal–crystal collisions and crystal–impeller collisions are distinguished. ► Simultaneous determination of nucleation and growth kinetics is established. ► Important for design of gas hydrate based energy storage and CO2 recovery systems. - Abstract: A rigorous mechanistic model for predicting gas hydrate formation crystallization kinetics is presented, and the special case of CO2 gas hydrate formation regarding CO2 recovery and sequestration processes has been investigated by using the proposed model. A physical model for prediction of the secondary nucleation rate is proposed for the first time, and the formation rates of secondary nuclei by crystal–crystal collisions and crystal–impeller collisions are formulated. The objective functions for simultaneous determination of nucleation and growth kinetics are presented and a theoretical framework for predicting the dynamic behavior of gas hydrate formation is presented. Predicted time variations of CO2 content, total number and surface area of produced hydrate crystals are in good agreement with the available experimental data. The proposed approach can have considerable application for design of gas hydrate converters regarding energy storage and CO2 recovery processes.
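    The abstract does not reproduce the model's equations. Generic textbook power-law forms for secondary nucleation and growth, of the kind such mechanistic models refine, look as follows; all symbols and exponents here are illustrative assumptions, not the paper's correlations.

```latex
% Generic power-law crystallization kinetics (illustrative forms only).
\[
  B_{\mathrm{sec}} \;=\; k_b\, M_T^{\,j}\, N_I^{\,h},
  \qquad
  G \;=\; k_g\, (\Delta c)^{\,g}
\]
```

Here M_T is the suspension (magma) density, N_I the impeller speed and Δc the supersaturation driving force; the paper's refinement is to resolve the secondary nucleation rate into separate crystal–crystal and crystal–impeller collision terms.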

  4. Plasma processing: Technologies and applications

    International Nuclear Information System (INIS)

    Naddaf, M.; Saloum, S.

    2005-01-01

    This study aims to present the fundamentals of the physics of plasmas, methods of generation, diagnostics, and applications for the processing of materials. The first chapter defines plasma in general as well as its main parameters and the most important differential equations in plasma physics, and classifies the types of plasmas. The various methods and techniques to create and sustain plasma are presented in the second chapter. Chapter 3 focuses on plasma diagnostic methods and tools. Chapter 4 deals with applications of plasma processing such as surface modification of materials, plasma ashing and etching, plasma cutting, and the environmental applications of plasma. Plasma polymerization and its various applications are presented in more detail in the last chapter. (Author)

  5. Reconciling the Rigor-Relevance Dilemma in Intellectual Capital Research

    Science.gov (United States)

    Andriessen, Daniel

    2004-01-01

    This paper raises the issue of research methodology for intellectual capital and other types of management research by focusing on the dilemma of rigour versus relevance. The more traditional explanatory approach to research often leads to rigorous results that are not of much help to solve practical problems. This paper describes an alternative…

  6. Branching process models of cancer

    CERN Document Server

    Durrett, Richard

    2015-01-01

    This volume develops results on continuous time branching processes and applies them to study the rate of tumor growth, extending classic work on the Luria-Delbruck distribution. As a consequence, the authors calculate the probability that mutations that confer resistance to treatment are present at detection and quantify the extent of tumor heterogeneity. As applications, the authors evaluate ovarian cancer screening strategies and give rigorous proofs for results of Haeno and Michor concerning tumor metastasis. These notes should be accessible to students who are familiar with Poisson processes and continuous time Markov chains. Richard Durrett is a mathematics professor at Duke University, USA. He is the author of 8 books, over 200 journal articles, and has supervised more than 40 Ph.D. students. Most of his current research concerns the applications of probability to biology: ecology, genetics, and most recently cancer.
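    A continuous-time branching process of the kind the book analyzes can be simulated in a few lines with the Gillespie algorithm; the rates, initial size, and horizon below are arbitrary illustrations, not parameters from the book.

```python
# Minimal continuous-time binary branching (birth-death) process.
import random

def simulate_branching(birth=1.1, death=1.0, z0=10, t_max=10.0):
    t, z = 0.0, z0
    while 0 < z and t < t_max:
        total_rate = (birth + death) * z     # each cell divides or dies
        t += random.expovariate(total_rate)  # waiting time to next event
        z += 1 if random.random() < birth / (birth + death) else -1
    return z

sizes = [simulate_branching() for _ in range(1000)]
print(sum(s == 0 for s in sizes) / 1000)     # observed extinction frequency
```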

  7. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems.

    Science.gov (United States)

    Araújo, Luciano V; Malkowski, Simon; Braghetto, Kelly R; Passos-Bueno, Maria R; Zatz, Mayana; Pu, Calton; Ferreira, João E

    2011-12-22

    Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability, achieved through a relational database implementation, and 2) correctness of processes, using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end user interfaces.
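    As a flavor of the process-algebra layer, a hypothetical testing workflow might be written as an ACP-style term like the following; all task names are invented for illustration and are not the CEGH specification.

```latex
% Hypothetical genetic-testing workflow as an ACP-style process term.
\[
  \mathit{Test} \;=\; \mathit{receiveSample} \cdot
  (\mathit{extractDNA} \parallel \mathit{verifyConsent}) \cdot
  (\mathit{sequence} + \mathit{genotype}) \cdot \mathit{report}
\]
```

Here "·" is sequential composition, "∥" the parallel merge, and "+" alternative choice.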

  8. Rigorous, robust and systematic: Qualitative research and its contribution to burn care. An integrative review.

    Science.gov (United States)

    Kornhaber, Rachel Anne; de Jong, A E E; McLean, L

    2015-12-01

    Qualitative methods are progressively being implemented by researchers for exploration within healthcare. However, there has been a longstanding and wide-ranging debate concerning the relative merits of qualitative research within the health care literature. This integrative review aimed to examine the contribution of qualitative research in burns care and subsequent rehabilitation. Studies were identified using an electronic search strategy using the databases PubMed, Cumulative Index of Nursing and Allied Health Literature (CINAHL), Excerpta Medica database (EMBASE) and Scopus of peer reviewed primary research in English from 2009 to April 2014, using Whittemore and Knafl's integrative review method as a guide for analysis. From the 298 papers identified, 26 research papers met the inclusion criteria. Across all studies there was an average of 22 participants involved in each study, with a range of 6-53 participants, conducted across 12 nations, focussing on burns prevention, paediatric burns, appropriate acquisition and delivery of burns care, pain and psychosocial implications of burns trauma. Careful and rigorous application of qualitative methodologies promotes and enriches the development of burns knowledge. In particular, the key elements in the qualitative methodological process and its publication are critical in disseminating credible and methodologically sound qualitative research. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.

  9. Rigorous derivation from Landau-de Gennes theory to Ericksen-Leslie theory

    OpenAIRE

    Wang, Wei; Zhang, Pingwen; Zhang, Zhifei

    2013-01-01

    Starting from Beris-Edwards system for the liquid crystal, we present a rigorous derivation of Ericksen-Leslie system with general Ericksen stress and Leslie stress by using the Hilbert expansion method.

  10. Striation Patterns of Ox Muscle in Rigor Mortis

    Science.gov (United States)

    Locker, Ronald H.

    1959-01-01

    Ox muscle in rigor mortis offers a selection of myofibrils fixed at varying degrees of contraction from sarcomere lengths of 3.7 to 0.7 µ. A study of this material by phase contrast and electron microscopy has revealed four distinct successive patterns of contraction, including besides the familiar relaxed and contracture patterns, two intermediate types (2.4 to 1.9 µ, 1.8 to 1.5 µ) not previously well described. PMID:14417790

  11. Application of parallel computing to seismic damage process simulation of an arch dam

    International Nuclear Information System (INIS)

    Zhong Hong; Lin Gao; Li Jianbo

    2010-01-01

    The simulation of the damage process of a high arch dam subjected to strong earthquake shocks is significant to the evaluation of its performance and seismic safety, considering the catastrophic effect of dam failure. However, such numerical simulation requires rigorous computational capacity. Conventional serial computing falls short of that, and parallel computing is a fairly promising solution to this problem. The parallel finite element code PDPAD was developed for the damage prediction of arch dams, utilizing a damage model that accounts for the heterogeneity of concrete. Developed with the programming language Fortran, the code uses a master/slave mode for programming, the domain decomposition method for allocation of tasks, MPI (Message Passing Interface) for communication, and solvers from the AZTEC library for the solution of large-scale equations. A speedup test showed that the performance of PDPAD was quite satisfactory. The code was employed to study the damage process of a being-built arch dam on a 4-node PC cluster, with more than one million degrees of freedom considered. The obtained damage mode was quite similar to that of a shaking table test, indicating that the proposed procedure and parallel code PDPAD have good potential in simulating the seismic damage mode of arch dams. With the rapidly growing need for massive computation emerging from engineering problems, parallel computing will find more and more applications in pertinent areas.
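    The domain-decomposition pattern described above can be sketched schematically with mpi4py; PDPAD itself is Fortran with AZTEC solvers, so the array sizes and the placeholder update below are purely illustrative assumptions.

```python
# Schematic MPI domain decomposition: each rank owns a block of the
# global degrees of freedom and global quantities are reduced.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_dof = 1_000_000                      # global degrees of freedom
lo = rank * n_dof // size              # this rank's subdomain bounds
hi = (rank + 1) * n_dof // size
damage = np.zeros(hi - lo)             # local damage state

damage += 0.01                         # placeholder per-step damage update
total = comm.allreduce(damage.sum(), op=MPI.SUM)
if rank == 0:
    print(f"global damage measure: {total:.1f}")
```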

  12. Radar micro-doppler signatures processing and applications

    CERN Document Server

    Chen, Victor C; Miceli, William J

    2014-01-01

    Radar Micro-Doppler Signatures: Processing and applications concentrates on the processing and application of radar micro-Doppler signatures in real world situations, providing readers with a good working knowledge on a variety of applications of radar micro-Doppler signatures.

  13. Mathematical framework for fast and rigorous track fit for the ZEUS detector

    Energy Technology Data Exchange (ETDEWEB)

    Spiridonov, Alexander

    2008-12-15

    In this note we present a mathematical framework for a rigorous approach to a common track fit for trackers located in the inner region of the ZEUS detector. The approach makes use of the Kalman filter and offers a rigorous treatment of magnetic field inhomogeneity, multiple scattering and energy loss. We describe mathematical details of the implementation of the Kalman filter technique with a reduced amount of computation for a cylindrical drift chamber, barrel and forward silicon strip detectors and a forward straw drift chamber. Options with homogeneous and inhomogeneous field are discussed. The fitting of tracks in one ZEUS event takes about 20 ms on a standard PC. (orig.)
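    The recursion at the core of such a fit is the standard Kalman predict/update cycle, shown here in its generic textbook form (not the ZEUS-specific parametrization); in a track fit the transport matrix F_k propagates the state between detector layers through the field map, and the process noise Q_k models multiple scattering and energy loss.

```latex
% Generic Kalman filter predict/update cycle.
\[
\begin{aligned}
  x_{k|k-1} &= F_k\, x_{k-1|k-1}, &
  P_{k|k-1} &= F_k\, P_{k-1|k-1}\, F_k^{\top} + Q_k, \\
  K_k &= P_{k|k-1} H_k^{\top} \bigl( H_k P_{k|k-1} H_k^{\top} + R_k \bigr)^{-1}, \\
  x_{k|k} &= x_{k|k-1} + K_k \bigl( z_k - H_k\, x_{k|k-1} \bigr), &
  P_{k|k} &= (I - K_k H_k)\, P_{k|k-1}.
\end{aligned}
\]
```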

  14. 7 CFR 1416.602 - Application process.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Application process. 1416.602 Section 1416.602 Agriculture Regulations of the Department of Agriculture (Continued) COMMODITY CREDIT CORPORATION, DEPARTMENT... PROGRAMS Nursery Disaster Program § 1416.602 Application process. (a) Producers wishing to receive benefits...

  15. 7 CFR 1416.403 - Application process.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Application process. 1416.403 Section 1416.403 Agriculture Regulations of the Department of Agriculture (Continued) COMMODITY CREDIT CORPORATION, DEPARTMENT... PROGRAMS Fruit and Vegetable Disaster Program § 1416.403 Application process. (a) Producers wishing to...

  16. 7 CFR 1416.303 - Application process.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Application process. 1416.303 Section 1416.303 Agriculture Regulations of the Department of Agriculture (Continued) COMMODITY CREDIT CORPORATION, DEPARTMENT... PROGRAMS Citrus Disaster Program § 1416.303 Application process. (a) Producers wishing to receive benefits...

  17. 7 CFR 1416.503 - Application process.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Application process. 1416.503 Section 1416.503 Agriculture Regulations of the Department of Agriculture (Continued) COMMODITY CREDIT CORPORATION, DEPARTMENT... PROGRAMS Tropical Fruit Disaster Program § 1416.503 Application process. (a) Producers wishing to receive...

  18. 7 CFR 1717.161 - Application process.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Application process. 1717.161 Section 1717.161 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF... Consolidations of Electric Borrowers § 1717.161 Application process. (a) Borrowers are responsible for ensuring...

  19. 7 CFR 774.19 - Processing applications.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Processing applications. 774.19 Section 774.19 Agriculture Regulations of the Department of Agriculture (Continued) FARM SERVICE AGENCY, DEPARTMENT OF AGRICULTURE SPECIAL PROGRAMS EMERGENCY LOAN FOR SEED PRODUCERS PROGRAM § 774.19 Processing applications...

  20. Laser applications in materials processing

    International Nuclear Information System (INIS)

    Ready, J.F.

    1980-01-01

    The seminar focused on laser annealing of semiconductors, laser processing of semiconductor devices and formation of coatings and powders, surface modification with lasers, and specialized laser processing methods. Papers were presented on the theoretical analysis of thermal and mass transport during laser annealing, applications of scanning continuous-wave and pulsed lasers in silicon technology, laser techniques in photovoltaic applications, and the synthesis of ceramic powders from laser-heated gas-phase reactants. Other papers included: reflectance changes of metals during laser irradiation, surface-alloying using high-power continuous lasers, laser growth of silicon ribbon, and commercial laser-shock processes

  1. Learning from Science and Sport - How we, Safety, "Engage with Rigor"

    Science.gov (United States)

    Herd, A.

    2012-01-01

    As the world of spaceflight safety is relatively small and potentially inward-looking, we need to be aware of the "outside world". We should then try to remind ourselves to be open to the possibility that data, knowledge or experience from outside of the spaceflight community may provide some constructive alternate perspectives. This paper will assess aspects from two seemingly tangential fields, science and sport, and align these with the world of safety. In doing so, some useful insights will be given into the challenges we face, and solutions relevant to our everyday work of safety engineering may emerge. Sport, particularly a contact sport such as rugby union, requires direct interaction between members of two opposing teams: professional, accurately timed and positioned interaction for a desired outcome. These interactions, whilst an essential part of the game, are however not without their constraints. The rugby scrum has constraints as to the formation and engagement of the two teams. The controlled engagement provides for an interaction between the two teams in a safe manner, these constraints arising from the reality that an incorrect engagement could cause serious injury to members of either team. In academia, scientific rigor is applied to assure that the arguments provided and the conclusions drawn in academic papers presented for publication are valid, legitimate and credible. The scientific goal of the need for rigor may be expressed in the example of achieving a statistically relevant sample size, n, in order to assure analysis validity of the data pool. A failure to apply rigor could then place the entire study at risk of failing to have the respective paper published. This paper will consider the merits of these two different aspects, scientific rigor and sports engagement, and offer a reflective look at how this may provide a "modus operandi" for safety engineers at any level whether at their desks (creating or reviewing safety assessments) or in a

  2. Stochastic processes

    CERN Document Server

    Borodin, Andrei N

    2017-01-01

    This book provides a rigorous yet accessible introduction to the theory of stochastic processes. A significant part of the book is devoted to the classic theory of stochastic processes. In turn, it also presents proofs of well-known results, sometimes together with new approaches. Moreover, the book explores topics not previously covered elsewhere, such as distributions of functionals of diffusions stopped at different random times, the Brownian local time, diffusions with jumps, and an invariance principle for random walks and local times. Supported by carefully selected material, the book showcases a wealth of examples that demonstrate how to solve concrete problems by applying theoretical results. It addresses a broad range of applications, focusing on concrete computational techniques rather than on abstract theory. The content presented here is largely self-contained, making it suitable for researchers and graduate students alike.

  3. Characterization of the rigor mortis process in muscles of lambs of the Santa Inês breed and F1 Santa Inês x Dorper

    Directory of Open Access Journals (Sweden)

    Rafael dos Santos Costa

    2011-01-01

    The development of the rigor mortis process in butcher animal carcasses directly influences meat quality. The characteristics of the rigor mortis process in ovine carcasses during industrial chilling to obtain chilled carcasses have been studied in other countries and, in Brazil, in Santa Inês sheep, but have not yet been established for F1 Dorper x Santa Inês. Thus, this research was designed to characterize the rigor mortis process of the Semitendinosus and Triceps brachii muscles during industrial chilling, and the meat tenderness, in 10 ovine carcasses. Ten intact male ovines were randomly assembled, six of the Santa Inês breed and four F1 Dorper x Santa Inês, slaughtered at the Campos Slaughterhouse - Campos dos Goytacazes, Rio de Janeiro. After exsanguination, temperature, pH and sarcomere length were measured at different times (4h, 6h, 8h, 10h, 12h and 24h), and shear force or tenderness (48h) of the Semitendinosus muscle was determined. In parallel, a sensory analysis was carried out and related to the instrumental values for this muscle. The chilling room temperature varied between 12.2°C (4h) and -0.5°C (24h), and the mean temperature of the carcasses was 26.80°C and -0.20°C, respectively. The mean initial pH of the Semitendinosus was 6.62 and the final pH 5.64; for the Triceps brachii the values were 6.50 (4h) and 5.68 (24h). The maximum contraction of the sarcomere of the Semitendinosus occurred at the 12th hour (1.50 µm) after exsanguination, whereas for the Triceps brachii it occurred in the range of the 10th to 24th hours (1.53 to 1.57 µm). Semitendinosus muscle shear force and tenderness were similar in lambs of the Santa Inês breed and F1 Dorper x Santa Inês, demonstrating that the genetic group did not affect meat tenderness. The sensory panel confirmed the results obtained in the instrumental analysis. When the different genetic groups were compared, a good inverse correlation was found between the instrumental (shear force) and sensory results (r = -0.87).

  4. A rigorous test for a new conceptual model for collisions

    International Nuclear Information System (INIS)

    Peixoto, E.M.A.; Mu-Tao, L.

    1979-01-01

    A rigorous theoretical foundation for the previously proposed model is formulated and applied to electron scattering by H2 in the gas phase. A rigorous treatment of the interaction potential between the incident electron and the hydrogen molecule is carried out to calculate differential cross sections for 1 keV electrons, using Glauber's approximation and Wang's molecular wave function for the ground electronic state of H2. Moreover, it is shown for the first time that, when adequately done, the omission of two-center terms does not adversely influence the results of molecular calculations. It is shown that the new model is far superior to the Independent Atom Model (or Independent Particle Model). The accuracy and simplicity of the new model suggest that it may be fruitfully applied to the description of other collision phenomena (e.g., in molecular beam experiments and nuclear physics). A new technique is presented for calculations involving two-center integrals within the framework of the Glauber approximation for scattering. (Author)

  5. Rigorous Analysis of a Randomised Number Field Sieve

    OpenAIRE

    Lee, Jonathan; Venkatesan, Ramarathnam

    2018-01-01

    Factorisation of integers $n$ is of number theoretic and cryptographic significance. The Number Field Sieve (NFS) introduced circa 1990, is still the state of the art algorithm, but no rigorous proof that it halts or generates relationships is known. We propose and analyse an explicitly randomised variant. For each $n$, we show that these randomised variants of the NFS and Coppersmith's multiple polynomial sieve find congruences of squares in expected times matching the best-known heuristic e...
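    The end stage of sieve algorithms such as the NFS and Coppersmith's multiple polynomial sieve is a congruence of squares, which yields a factor via a gcd, as in this toy sketch (numbers chosen purely for illustration):

```python
# A congruence of squares x^2 ≡ y^2 (mod n), with x not ≡ ±y (mod n),
# splits n: gcd(x - y, n) is a nontrivial factor.
from math import gcd

n, x, y = 91, 10, 3            # 10^2 = 100 ≡ 9 = 3^2 (mod 91)
assert (x * x - y * y) % n == 0
p = gcd(x - y, n)
print(p, n // p)               # 7 13
```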

  6. Coupling of Rigor Mortis and Intestinal Necrosis during C. elegans Organismal Death.

    Science.gov (United States)

    Galimov, Evgeniy R; Pryor, Rosina E; Poole, Sarah E; Benedetto, Alexandre; Pincus, Zachary; Gems, David

    2018-03-06

    Organismal death is a process of systemic collapse whose mechanisms are less well understood than those of cell death. We previously reported that death in C. elegans is accompanied by a calcium-propagated wave of intestinal necrosis, marked by a wave of blue autofluorescence (death fluorescence). Here, we describe another feature of organismal death, a wave of body wall muscle contraction, or death contraction (DC). This phenomenon is accompanied by a wave of intramuscular Ca2+ release and, subsequently, of intestinal necrosis. Correlation of directions of the DC and intestinal necrosis waves implies coupling of these death processes. Long-lived insulin/IGF-1-signaling mutants show reduced DC and delayed intestinal necrosis, suggesting possible resistance to organismal death. DC resembles mammalian rigor mortis, a postmortem necrosis-related process in which Ca2+ influx promotes muscle hyper-contraction. In contrast to mammals, DC is an early rather than a late event in C. elegans organismal death. VIDEO ABSTRACT. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.

  7. Rigorous upper bounds for transport due to passive advection by inhomogeneous turbulence

    International Nuclear Information System (INIS)

    Krommes, J.A.; Smith, R.A.

    1987-05-01

    A variational procedure, due originally to Howard and explored by Busse and others for self-consistent turbulence problems, is employed to determine rigorous upper bounds for the advection of a passive scalar through an inhomogeneous turbulent slab with arbitrary generalized Reynolds number R and Kubo number K. In the basic version of the method, the steady-state energy balance is used as a constraint; the resulting bound, though rigorous, is independent of K. A pedagogical reference model (one dimension, K = ∞) is described in detail; the bound compares favorably with the exact solution. The direct-interaction approximation is also worked out for this model; it is somewhat more accurate than the bound, but requires considerably more labor to solve. For the basic bound, a general formalism is presented for several dimensions, finite correlation length, and reasonably general boundary conditions. Part of the general method, in which a Green's function technique is employed, applies to self-consistent as well as to passive problems, and thereby generalizes previous results in the fluid literature. The formalism is extended for the first time to include time-dependent constraints, and a bound is deduced which explicitly depends on K and has the correct physical scalings in all regimes of R and K. Two applications from the theory of turbulent plasmas are described: flux in velocity space, and test particle transport in stochastic magnetic fields. For the velocity space problem the simplest bound reproduces Dupree's original scaling for the strong turbulence diffusion coefficient. For the case of stochastic magnetic fields, the scaling of the bounds is described for the magnetic diffusion coefficient as well as for the particle diffusion coefficient in the so-called collisionless, fluid, and double-streaming regimes.

  8. 8 CFR 240.63 - Application process.

    Science.gov (United States)

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Application process. 240.63 Section 240.63 Aliens and Nationality DEPARTMENT OF HOMELAND SECURITY IMMIGRATION REGULATIONS PROCEEDINGS TO DETERMINE... Cancellation of Removal Under Section 203 of Pub. L. 105-100 § 240.63 Application process. (a) Form and fees...

  9. Rigorous RG Algorithms and Area Laws for Low Energy Eigenstates in 1D

    Science.gov (United States)

    Arad, Itai; Landau, Zeph; Vazirani, Umesh; Vidick, Thomas

    2017-11-01

    One of the central challenges in the study of quantum many-body systems is the complexity of simulating them on a classical computer. A recent advance (Landau et al. in Nat Phys, 2015) gave a polynomial time algorithm to compute a succinct classical description for unique ground states of gapped 1D quantum systems. Despite this progress many questions remained unsolved, including whether there exist efficient algorithms when the ground space is degenerate (and of polynomial dimension in the system size), or for the polynomially many lowest energy states, or even whether such states admit succinct classical descriptions or area laws. In this paper we give a new algorithm, based on a rigorously justified RG type transformation, for finding low energy states for 1D Hamiltonians acting on a chain of n particles. In the process we resolve some of the aforementioned open questions, including giving a polynomial time algorithm for poly(n) degenerate ground spaces and an n^{O(log n)} algorithm for the poly(n) lowest energy states (under a mild density condition). For these classes of systems the existence of a succinct classical description and area laws were not rigorously proved before this work. The algorithms are natural and efficient, and for the case of finding unique ground states for frustration-free Hamiltonians the running time is Õ(n·M(n)), where M(n) is the time required to multiply two n × n matrices.

  10. Effect of pre-rigor stretch and various constant temperatures on the rate of post-mortem pH fall, rigor mortis and some quality traits of excised porcine biceps femoris muscle strips.

    Science.gov (United States)

    Vada-Kovács, M

    1996-01-01

    Porcine biceps femoris strips of 10 cm original length were stretched by 50% and fixed within 1 h post mortem, then subjected to temperatures of 4, 15 or 36 °C until they attained their ultimate pH. Unrestrained control muscle strips, which were left to shorten freely, were similarly treated. Post-mortem metabolism (pH, R-value) and shortening were recorded; thereafter ultimate meat quality traits (pH, lightness, extraction and swelling of myofibrils) were determined. The rate of pH fall at 36 °C, as well as ATP breakdown at 36 and 4 °C, were significantly reduced by pre-rigor stretch. The relationship between R-value and pH indicated cold shortening at 4 °C. Myofibrils isolated from pre-rigor stretched muscle strips kept at 36 °C showed the most severe reduction of hydration capacity, while paleness remained below extreme values. However, pre-rigor stretched myofibrils - when stored at 4 °C - proved to be superior to shortened ones in their extractability and swelling.

  11. The rigorous stochastic matrix multiplication scheme for the calculations of reduced equilibrium density matrices of open multilevel quantum systems

    International Nuclear Information System (INIS)

    Chen, Xin

    2014-01-01

    Understanding the roles of the temporal and spatial structures of quantum functional noise in open multilevel quantum molecular systems attracts a lot of theoretical interest. I want to establish a rigorous and general framework for functional quantum noises from the constructive and computational perspectives, i.e., how to generate the random trajectories to reproduce the kernel and path ordering of the influence functional with effective Monte Carlo methods for arbitrary spectral densities. This construction approach aims to unify the existing stochastic models to rigorously describe the temporal and spatial structure of Gaussian quantum noises. In this paper, I review the Euclidean imaginary time influence functional and propose the stochastic matrix multiplication scheme to calculate reduced equilibrium density matrices (REDM). In addition, I review and discuss the Feynman-Vernon influence functional according to the Gaussian quadratic integral, particularly its imaginary part, which is critical to the rigorous description of the quantum detailed balance. As a result, I establish the conditions under which the influence functional can be interpreted as the average of an exponential functional operator over real-valued Gaussian processes for open multilevel quantum systems. I also show the difference between the local and nonlocal phonons within this framework. With the stochastic matrix multiplication scheme, I compare the normalized REDM with the Boltzmann equilibrium distribution for open multilevel quantum systems.
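    Schematically, a stochastic matrix-multiplication scheme estimates an average of short imaginary-time propagation matrices over Gaussian noise samples. The toy two-level Hamiltonian, noise coupling, and first-order propagator below are assumptions for illustration, not the paper's model.

```python
# Schematic estimate of E[ prod_k M(xi_k) ] over Gaussian samples xi_k.
import numpy as np

rng = np.random.default_rng(0)
H = np.array([[0.0, 0.1],
              [0.1, 1.0]])                     # toy system Hamiltonian
beta, n_slices, n_samples = 1.0, 100, 2000
tau = beta / n_slices                          # imaginary-time step

acc = np.zeros((2, 2))
for _ in range(n_samples):
    M = np.eye(2)
    for _ in range(n_slices):
        xi = rng.normal(0.0, 0.1)              # Gaussian noise sample
        H_k = H + np.diag([xi, -xi])           # noise shifts the levels
        M = M @ (np.eye(2) - tau * H_k)        # 1st-order exp(-tau * H_k)
    acc += M
rho = acc / n_samples                          # averaged propagator
rho /= np.trace(rho)                           # normalized REDM estimate
print(rho)
```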

  12. A Review of LIDAR Radiometric Processing: From Ad Hoc Intensity Correction to Rigorous Radiometric Calibration

    Directory of Open Access Journals (Sweden)

    Alireza G. Kashani

    2015-11-01

    Full Text Available In addition to precise 3D coordinates, most light detection and ranging (LIDAR systems also record “intensity”, loosely defined as the strength of the backscattered echo for each measured point. To date, LIDAR intensity data have proven beneficial in a wide range of applications because they are related to surface parameters, such as reflectance. While numerous procedures have been introduced in the scientific literature, and even commercial software, to enhance the utility of intensity data through a variety of “normalization”, “correction”, or “calibration” techniques, the current situation is complicated by a lack of standardization, as well as confusing, inconsistent use of terminology. In this paper, we first provide an overview of basic principles of LIDAR intensity measurements and applications utilizing intensity information from terrestrial, airborne topographic, and airborne bathymetric LIDAR. Next, we review effective parameters on intensity measurements, basic theory, and current intensity processing methods. We define terminology adopted from the most commonly-used conventions based on a review of current literature. Finally, we identify topics in need of further research. Ultimately, the presented information helps lay the foundation for future standards and specifications for LIDAR radiometric calibration.
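    A typical example of the "ad hoc" corrections the review surveys is a radar-equation-style normalization for range and incidence angle; the 1/R² model, the cosine term, and the reference range below are assumptions for illustration, not a standard endorsed by the paper.

```python
# Range and incidence-angle intensity normalization (assumed model form).
import numpy as np

def correct_intensity(i_raw, r, incidence_rad, r_ref=100.0):
    """I_corrected = I_raw * (R / R_ref)^2 / cos(theta)."""
    return i_raw * (r / r_ref) ** 2 / np.cos(incidence_rad)

print(correct_intensity(1200.0, 250.0, np.deg2rad(30.0)))  # ~8660.3
```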

  13. A Review of LIDAR Radiometric Processing: From Ad Hoc Intensity Correction to Rigorous Radiometric Calibration.

    Science.gov (United States)

    Kashani, Alireza G; Olsen, Michael J; Parrish, Christopher E; Wilson, Nicholas

    2015-11-06

    In addition to precise 3D coordinates, most light detection and ranging (LIDAR) systems also record "intensity", loosely defined as the strength of the backscattered echo for each measured point. To date, LIDAR intensity data have proven beneficial in a wide range of applications because they are related to surface parameters, such as reflectance. While numerous procedures have been introduced in the scientific literature, and even commercial software, to enhance the utility of intensity data through a variety of "normalization", "correction", or "calibration" techniques, the current situation is complicated by a lack of standardization, as well as confusing, inconsistent use of terminology. In this paper, we first provide an overview of basic principles of LIDAR intensity measurements and applications utilizing intensity information from terrestrial, airborne topographic, and airborne bathymetric LIDAR. Next, we review effective parameters on intensity measurements, basic theory, and current intensity processing methods. We define terminology adopted from the most commonly-used conventions based on a review of current literature. Finally, we identify topics in need of further research. Ultimately, the presented information helps lay the foundation for future standards and specifications for LIDAR radiometric calibration.

  14. Correct software in web applications and web services

    CERN Document Server

    Thalheim, Bernhard; Prinz, Andreas; Buchberger, Bruno

    2015-01-01

    The papers in this volume aim at obtaining a common understanding of the challenging research questions in web applications comprising web information systems, web services, and web interoperability; obtaining a common understanding of verification needs in web applications; achieving a common understanding of the available rigorous approaches to system development, and the cases in which they have succeeded; identifying how rigorous software engineering methods can be exploited to develop suitable web applications; and at developing a European-scale research agenda combining theory, methods a

  15. Muscle pH, rigor mortis and blood variables in Atlantic salmon transported in two types of well-boat.

    Science.gov (United States)

    Gatica, M C; Monti, G E; Knowles, T G; Gallo, C B

    2010-01-09

    Two systems for transporting live salmon (Salmo salar) were compared in terms of their effects on blood variables, muscle pH and rigor index: an 'open system' well-boat with recirculated sea water at 13.5 degrees C and a stocking density of 107 kg/m3 during an eight-hour journey, and a 'closed system' well-boat with water chilled from 16.7 to 2.1 degrees C and a stocking density of 243.7 kg/m3 during a seven-hour journey. Groups of 10 fish were sampled at each of four stages: in cages at the farm, in the well-boat after loading, in the well-boat after the journey and before unloading, and in the processing plant after they were pumped from the resting cages. At each sampling, the fish were stunned and bled by gill cutting. Blood samples were taken to measure lactate, osmolality, chloride, sodium, cortisol and glucose, and their muscle pH and rigor index were measured at death and three hours later. In the open system well-boat, the initial muscle pH of the fish decreased at each successive stage, and at the final stage they had a significantly lower initial muscle pH and more rapid onset of rigor than the fish transported on the closed system well-boat. At the final stage all the blood variables except glucose were significantly affected in the fish transported on both types of well-boat.

  16. Fundamentals of adaptive signal processing

    CERN Document Server

    Uncini, Aurelio

    2015-01-01

    This book is an accessible guide to adaptive signal processing methods that equips the reader with advanced theoretical and practical tools for the study and development of circuit structures and provides robust algorithms relevant to a wide variety of application scenarios. Examples include multimodal and multimedia communications, the biological and biomedical fields, economic models, environmental sciences, acoustics, telecommunications, remote sensing, monitoring, and, in general, the modeling and prediction of complex physical phenomena. The reader will learn not only how to design and implement the algorithms but also how to evaluate their performance for specific applications utilizing the tools provided. While using a simple mathematical language, the employed approach is very rigorous. The text will be of value both for research purposes and for courses of study.
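    The least-mean-squares (LMS) filter is the canonical example of the adaptive algorithms such a text covers; this sketch is illustrative and is not code from the book.

```python
# Classic LMS adaptive filter identifying an unknown FIR system by
# stochastic-gradient descent on the instantaneous squared error.
import numpy as np

def lms(x, d, n_taps=4, mu=0.05):
    w = np.zeros(n_taps)                        # adaptive filter weights
    e = np.zeros(len(x))                        # error history
    for n in range(n_taps - 1, len(x)):
        window = x[n - n_taps + 1:n + 1][::-1]  # newest sample first
        e[n] = d[n] - w @ window                # instantaneous error
        w += 2 * mu * e[n] * window             # gradient update
    return w, e

rng = np.random.default_rng(1)
x = rng.normal(size=2000)
d = np.convolve(x, [0.5, -0.3, 0.2])[:len(x)]   # "unknown" system output
w, e = lms(x, d)
print(w)                                        # ~ [0.5, -0.3, 0.2, 0.0]
print("residual error power:", np.mean(e[-200:] ** 2))
```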

  17. Assessment of the Methodological Rigor of Case Studies in the Field of Management Accounting Published in Journals in Brazil

    Directory of Open Access Journals (Sweden)

    Kelly Cristina Mucio Marques

    2015-04-01

    Full Text Available This study aims to assess the methodological rigor of case studies in management accounting published in Brazilian journals. The study is descriptive. The data were collected using documentary research and content analysis, and 180 papers published from 2008 to 2012 in accounting journals rated as A2, B1, and B2 that were classified as case studies were selected. Based on the literature, we established a set of 15 criteria that we expected to be identified (either explicitly or implicitly in the case studies to classify those case studies as appropriate from the standpoint of methodological rigor. These criteria were partially met by the papers analyzed. The aspects less aligned with those proposed in the literature were the following: little emphasis on justifying the need to understand phenomena in context; lack of explanation of the reason for choosing the case study strategy; the predominant use of questions that do not enable deeper analysis; many studies based on only one source of evidence; little use of data and information triangulation; little emphasis on the data collection method; a high number of cases in which confusion between case study as a research strategy and as data collection method were detected; a low number of papers reporting the method of data analysis; few reports on a study's contributions; and a minority highlighting the issues requiring further research. In conclusion, the method used to apply case studies to management accounting must be improved because few studies showed rigorous application of the procedures that this strategy requires.

  18. Supercritical fluid technologies for ceramic-processing applications

    International Nuclear Information System (INIS)

    Matson, D.W.; Smith, R.D.

    1989-01-01

    This paper reports on the applications of supercritical fluid technologies for ceramic processing. The physical and chemical properties of these densified gases are summarized and related to their use as solvents and processing media. Several areas are identified in which specific ceramic processes benefit from the unique properties of supercritical fluids. The rapid expansion of supercritical fluid solutions provides a technique for producing fine uniform powders and thin films of widely varying materials. Supercritical drying technologies allow the formation of highly porous aerogel products with potentially wide application. Hydrothermal processes leading to the formation of large single crystals and microcrystalline powders can also be extended into the supercritical regime of water. Additional applications and potential applications are identified in the areas of extraction of binders and other additives from ceramic compacts, densification of porous ceramics, the formation of powders in supercritical micro-emulsions, and in preceramic polymer processing

  19. Effects of post mortem temperature on rigor tension, shortening and ...

    African Journals Online (AJOL)

    Fully developed rigor mortis in muscle is characterised by maximum loss of extensibility. The course of post mortem changes in ostrich muscle was studied by following isometric tension, shortening and change in pH during the first 24 h post mortem within muscle strips from the muscularis gastrocnemius, pars interna at ...

  20. Pre-rigor temperature and the relationship between lamb tenderisation, free water production, bound water and dry matter.

    Science.gov (United States)

    Devine, Carrick; Wells, Robyn; Lowe, Tim; Waller, John

    2014-01-01

    The M. longissimus from lambs electrically stimulated at 15 min post-mortem were removed after grading, wrapped in polythene film and held at 4 (n=6), 7 (n=6), 15 (n=6, n=8) and 35°C (n=6) until rigor mortis, then aged at 15°C for 0, 4, 24 and 72 h post-rigor. Centrifuged free water increased exponentially, and bound water, dry matter and shear force decreased exponentially over time. Decreases in shear force and increases in free water were closely related (r² = 0.52) and were unaffected by pre-rigor temperatures. © 2013.
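    The exponential time courses reported above are the kind of relationship one would fit with nonlinear least squares; only the model form y = a·e^(-kt) + c follows the abstract, while the data points below are invented placeholders.

```python
# Nonlinear least-squares fit of an exponential decline over ageing time.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, a, k, c):
    return a * np.exp(-k * t) + c

t_hours = np.array([0.0, 4.0, 24.0, 72.0])     # ageing times post-rigor
shear = np.array([95.0, 78.0, 52.0, 41.0])     # hypothetical shear values
(a, k, c), _ = curve_fit(decay, t_hours, shear, p0=(60.0, 0.05, 40.0))
print(f"a = {a:.1f}, k = {k:.3f} per h, c = {c:.1f}")
```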

  1. Application of Java technology in radiation image processing

    International Nuclear Information System (INIS)

    Cheng Weifeng; Li Zheng; Chen Zhiqiang; Zhang Li; Gao Wenhuan

    2002-01-01

    The acquisition and processing of radiation images play an important role in modern applications of civil nuclear technology. The author analyzes the rationale of Java image processing technology, which includes Java AWT, Java 2D and JAI. In order to demonstrate the applicability of Java technology in the field of image processing, examples of the application of JAI technology to the processing of radiation images of large containers are given.

  2. New rigorous asymptotic theorems for inverse scattering amplitudes

    International Nuclear Information System (INIS)

    Lomsadze, Sh.Yu.; Lomsadze, Yu.M.

    1984-01-01

    The rigorous asymptotic theorems, both of integral and local types, obtained earlier and establishing logarithmic and in some cases even power correlations between the real and imaginary parts of scattering amplitudes F± are extended to the inverse amplitudes 1/F±. One also succeeds in establishing power correlations of a new type between the real and imaginary parts, both for the amplitudes themselves and for the inverse ones. All the obtained assertions are convenient to test in high energy experiments when the amplitudes show asymptotic behaviour.

  3. Rigorous analysis of image force barrier lowering in bounded geometries: application to semiconducting nanowires

    International Nuclear Information System (INIS)

    Calahorra, Yonatan; Mendels, Dan; Epstein, Ariel

    2014-01-01

    Bounded geometries introduce a fundamental problem in calculating the image force barrier lowering of metal-wrapped semiconductor systems. In bounded geometries, the derivation of the barrier lowering requires calculating the reference energy of the system, when the charge is at the geometry center. In the following, we formulate and rigorously solve this problem; this allows combining the image force electrostatic potential with the band diagram of the bounded geometry. The suggested approach is applied to spheres as well as cylinders. Furthermore, although the expressions governing cylindrical systems are complex and can only be evaluated numerically, we present analytical approximations for the solution, which allow easy implementation in calculated band diagrams. The results are further used to calculate the image force barrier lowering of metal-wrapped cylindrical nanowires; calculations show that although the image force potential is stronger than that of planar systems, taking the complete band-structure into account results in a weaker effect of barrier lowering. Moreover, when considering small diameter nanowires, we find that the electrostatic effects of the image force exceed the barrier region, and influence the electronic properties of the nanowire core. This study is of interest to the nanowire community, and in particular for the analysis of nanowire I−V measurements where wrapped or omega-shaped metallic contacts are used. (paper)

  4. Parallelism and Scalability in an Image Processing Application

    DEFF Research Database (Denmark)

    Rasmussen, Morten Sleth; Stuart, Matthias Bo; Karlsson, Sven

    2008-01-01

    The recent trends in processor architecture show that parallel processing is moving into new areas of computing in the form of many-core desktop processors and multi-processor system-on-chip. This means that parallel processing is required in application areas that traditionally have not used parallel programs. This paper investigates parallelism and scalability of an embedded image processing application. The major challenges faced when parallelizing the application were to extract enough parallelism from the application and to reduce load imbalance. The application has limited immediately…

  5. Parallelism and Scalability in an Image Processing Application

    DEFF Research Database (Denmark)

    Rasmussen, Morten Sleth; Stuart, Matthias Bo; Karlsson, Sven

    2009-01-01

    The recent trends in processor architecture show that parallel processing is moving into new areas of computing in the form of many-core desktop processors and multi-processor system-on-chips. This means that parallel processing is required in application areas that traditionally have not used parallel programs. This paper investigates parallelism and scalability of an embedded image processing application. The major challenges faced when parallelizing the application were to extract enough parallelism from the application and to reduce load imbalance. The application has limited immediately…

  6. Double phosphorylation of the myosin regulatory light chain during rigor mortis of bovine Longissimus muscle.

    Science.gov (United States)

    Muroya, Susumu; Ohnishi-Kameyama, Mayumi; Oe, Mika; Nakajima, Ikuyo; Shibata, Masahiro; Chikuni, Koichi

    2007-05-16

    To investigate changes in myosin light chains (MyLCs) during postmortem aging of the bovine longissimus muscle, we performed two-dimensional gel electrophoresis followed by identification with matrix-assisted laser desorption ionization time-of-flight mass spectrometry. The results of fluorescent differential gel electrophoresis showed that two spots of the myosin regulatory light chain (MyLC2) at pI values of 4.6 and 4.7 shifted toward those at pI values of 4.5 and 4.6, respectively, by 24 h postmortem when rigor mortis was completed. Meanwhile, the MyLC1 and MyLC3 spots did not change during the 14 days postmortem. Phosphoprotein-specific staining of the gels demonstrated that the MyLC2 proteins at pI values of 4.5 and 4.6 were phosphorylated. Furthermore, possible N-terminal region peptides containing one and two phosphoserine residues were detected in each mass spectrum of the MyLC2 spots at pI values of 4.5 and 4.6, respectively. These results demonstrated that MyLC2 became doubly phosphorylated during rigor formation of the bovine longissimus, suggesting involvement of the MyLC2 phosphorylation in the progress of beef rigor mortis. Keywords: bovine; myosin regulatory light chain (RLC, MyLC2); phosphorylation; rigor mortis; skeletal muscle.

  7. A Rigorous Methodology for Analyzing and Designing Plug-Ins

    DEFF Research Database (Denmark)

    Fasie, Marieta V.; Haxthausen, Anne Elisabeth; Kiniry, Joseph

    2013-01-01

    …This paper addresses these problems by describing a rigorous methodology for analyzing and designing plug-ins. The methodology is grounded in the Extended Business Object Notation (EBON) and covers informal analysis and design of features, GUI, actions, and scenarios, formal architecture design, including behavioral semantics, and validation. The methodology is illustrated via a case study whose focus is an Eclipse environment for the RAISE formal method's tool suite…

  8. Stochastic Geometry and Quantum Gravity: Some Rigorous Results

    Science.gov (United States)

    Zessin, H.

    These lectures give a short introduction to some recent developments in stochastic geometry which have one of their origins in simplicial gravity theory (see Regge, Nuovo Cimento 19: 558-571, 1961). The aim is to define and rigorously construct point processes on spaces of Euclidean simplices in such a way that the configurations of these simplices are simplicial complexes. The main interest is then concentrated on their curvature properties. We illustrate certain basic ideas from a mathematical point of view. An excellent presentation of this area can be found in Schneider and Weil (Stochastic and Integral Geometry, Springer, Berlin, 2008. German edition: Stochastische Geometrie, Teubner, 2000). In Ambjørn et al. (Quantum Geometry, Cambridge University Press, Cambridge, 1997) you find a beautiful account from the physical point of view. More recent developments in this direction can be found in Ambjørn et al. ("Quantum gravity as sum over spacetimes", Lect. Notes Phys. 807, Springer, Heidelberg, 2010). After an informal axiomatic introduction into the conceptual foundations of Regge's approach, the first lecture recalls the concepts and notations used. It presents the fundamental zero-infinity law of stochastic geometry and the construction of cluster processes based on it. The second lecture presents the main mathematical object, i.e. Poisson-Delaunay surfaces possessing an intrinsic random metric structure. The third and fourth lectures discuss their ergodic behaviour and present the two-dimensional Regge model of pure simplicial quantum gravity. We conclude with the formulation of basic open problems. Proofs are given in detail only in a few cases; in general the main ideas are developed. Sufficiently complete references are given.
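
    To make the central object of these lectures concrete, the sketch below samples a homogeneous Poisson point process in a planar window and builds its Delaunay triangulation with SciPy; the intensity, the window, and the angle-deficit curvature proxy are illustrative choices, not constructions from the lectures themselves.

    ```python
    # Minimal numerical illustration of a Poisson-Delaunay complex:
    # sample a homogeneous Poisson point process in a square window,
    # then build the Delaunay triangulation of the sampled points.
    import numpy as np
    from scipy.spatial import Delaunay

    rng = np.random.default_rng(0)
    lam, side = 100.0, 1.0           # intensity and window side (illustrative)

    # Poisson process: Poisson-distributed count, then uniform locations.
    n = rng.poisson(lam * side**2)
    points = rng.uniform(0.0, side, size=(n, 2))

    tri = Delaunay(points)           # the Delaunay simplicial complex
    print(f"{n} points, {len(tri.simplices)} triangles")

    # A discrete curvature proxy at a vertex: 2*pi minus the sum of
    # incident triangle angles (zero for a flat interior vertex).
    def angle_deficit(v):
        deficit = 2 * np.pi
        for simplex in tri.simplices:
            if v in simplex:
                a, b = [p for p in simplex if p != v]
                u1 = points[a] - points[v]
                u2 = points[b] - points[v]
                cosang = u1 @ u2 / (np.linalg.norm(u1) * np.linalg.norm(u2))
                deficit -= np.arccos(np.clip(cosang, -1.0, 1.0))
        return deficit

    print("deficit at vertex 0:", angle_deficit(0))
    ```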

  9. The effect of temperature on the mechanical aspects of rigor mortis in a liquid paraffin model.

    Science.gov (United States)

    Ozawa, Masayoshi; Iwadate, Kimiharu; Matsumoto, Sari; Asakura, Kumiko; Ochiai, Eriko; Maebashi, Kyoko

    2013-11-01

    Rigor mortis is an important phenomenon to estimate the postmortem interval in forensic medicine. Rigor mortis is affected by temperature. We measured stiffness of rat muscles using a liquid paraffin model to monitor the mechanical aspects of rigor mortis at five temperatures (37, 25, 10, 5 and 0°C). At 37, 25 and 10°C, the progression of stiffness was slower in cooler conditions. At 5 and 0°C, the muscle stiffness increased immediately after the muscles were soaked in cooled liquid paraffin and then muscles gradually became rigid without going through a relaxed state. This phenomenon suggests that it is important to be careful when estimating the postmortem interval in cold seasons. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  10. Radar signal processing and its applications

    CERN Document Server

    Hummel, Robert; Stoica, Petre; Zelnio, Edmund

    2003-01-01

    Radar Signal Processing and Its Applications brings together in one place important contributions and up-to-date research results in this fast-moving area. In twelve selected chapters, it describes the latest advances in architectures, design methods, and applications of radar signal processing. The contributors to this work were selected from the leading researchers and practitioners in the field. This work, originally published as Volume 14, Numbers 1-3 of the journal, Multidimensional Systems and Signal Processing, will be valuable to anyone working or researching in the field of radar signal processing. It serves as an excellent reference, providing insight into some of the most challenging issues being examined today.

  11. A rigorous treatment of uncertainty quantification for Silicon damage metrics

    International Nuclear Information System (INIS)

    Griffin, P.

    2016-01-01

    This report summarizes the contributions made by Sandia National Laboratories in support of the International Atomic Energy Agency (IAEA) Nuclear Data Section (NDS) Technical Meeting (TM) on Nuclear Reaction Data and Uncertainties for Radiation Damage. This work focused on a rigorous treatment of the uncertainties affecting the characterization of the displacement damage seen in silicon semiconductors. (author)

  12. Paper 3: Content and Rigor of Algebra Credit Recovery Courses

    Science.gov (United States)

    Walters, Kirk; Stachel, Suzanne

    2014-01-01

    This paper describes the content, organization and rigor of the face-to-face (f2f) and online summer algebra courses that were delivered in summers 2011 and 2012. Examining the content of both types of courses is important because research suggests that algebra courses with certain features may be better than others in promoting success for struggling students…

  13. Diffraction-based overlay measurement on dedicated mark using rigorous modeling method

    Science.gov (United States)

    Lu, Hailiang; Wang, Fan; Zhang, Qingyun; Chen, Yonghui; Zhou, Chang

    2012-03-01

    Diffraction Based Overlay (DBO) has been widely evaluated by numerous authors, and results show that DBO can provide better performance than Imaging Based Overlay (IBO). However, DBO has its own problems. As is well known, modeling-based DBO (mDBO) faces challenges of low measurement sensitivity and crosstalk between various structure parameters, which may result in poor accuracy and precision. Meanwhile, the main obstacle encountered by empirical DBO (eDBO) is that several pads must be employed to gain sufficient information on overlay-induced diffraction signature variations, which consumes more wafer space and costs more measuring time. Also, eDBO may suffer from mark profile asymmetry caused by processes. In this paper, we propose an alternative DBO technology that employs a dedicated overlay mark and takes a rigorous modeling approach. This technology needs only two or three pads for each direction, which is economical and time saving. While reducing the overlay measurement error induced by mark profile asymmetry, this technology is expected to be as accurate and precise as scatterometry technologies.

  14. Injection-salting and cold-smoking of farmed atlantic cod (Gadus morhua L.) and Atlantic salmon (Salmo salar L.) at different stages of Rigor Mortis: effect on physical properties.

    Science.gov (United States)

    Akse, L; Birkeland, S; Tobiassen, T; Joensen, S; Larsen, R

    2008-10-01

    Processing of fish is generally conducted postrigor, but prerigor processing is associated with some potential advantages. The aim of this study was to determine how 5 processing regimes of cold-smoked cod and salmon conducted at different stages of rigor influenced yield, fillet shrinkage, and gaping. Farmed cod and salmon were filleted, salted by brine injection of 25% NaCl, and smoked for 2 h at different stages of rigor. Filleting and salting prerigor resulted in increased fillet shrinkage and a smaller increase in weight during brine injection, which in turn was correlated with the salt content of the fillet. These effects were more pronounced in cod fillets than in salmon. Early processing reduced fillet gaping, and fillets were evaluated as having a firmer texture. In a follow-up trial with cod, shrinkage and weight gain during injection were studied as a function of processing time postmortem. No changes in weight gain were observed for fillets salted within the first 24 h postmortem; however, by delaying the processing 12 h postmortem, the high and rapid shrinking of cod fillets during brine injection was halved.

  15. Applications of Parallel Processing in Mobile Banking

    Directory of Open Access Journals (Sweden)

    2007-01-01

    The future of mobile banking will be represented by applications that support mobile, Internet banking and EFT (Electronic Funds Transfer) transactions in a single user interface. In this way, mobile banking will be able to cover all the types of applications demanded at the market level. The parallel processing of credit card bank transactions could be performed with the help of a grid network. Excluding some limitations, grid processing offers huge opportunities to exploit parallelism. For this reason, many applications of waiting queues in grid processing have been developed in recent years. Grid networks represent a distinctive and very modern field of parallel and distributed processing.

  16. Re-establishment of rigor mortis: evidence for a considerably longer post-mortem time span.

    Science.gov (United States)

    Crostack, Chiara; Sehner, Susanne; Raupach, Tobias; Anders, Sven

    2017-07-01

    Re-establishment of rigor mortis following mechanical loosening is used as part of the complex method for the forensic estimation of the time since death in human bodies and has formerly been reported to occur up to 8-12 h post-mortem (hpm). We recently described our observation of the phenomenon at up to 19 hpm in cases of in-hospital death. Due to the case selection (preceding illness, immobilisation), transfer of these results to forensic cases might be limited. We therefore examined 67 out-of-hospital cases of sudden death with known time points of death. Re-establishment of rigor mortis was positive in 52.2% of cases and was observed up to 20 hpm. In contrast to the current doctrine that a recurrence of rigor mortis is always of a lesser degree than its first manifestation in a given patient, muscular rigidity at re-establishment equalled or even exceeded the degree observed before dissolving in 21 joints. Furthermore, this is the first study to describe that the phenomenon appears to be independent of body or ambient temperature.

  17. Death to perturbative QCD in exclusive processes?

    Energy Technology Data Exchange (ETDEWEB)

    Eckardt, R.; Hansper, J.; Gari, M.F. [Institut fuer Theoretische Physik, Bochum (Germany)

    1994-04-01

    The authors discuss the question of whether perturbative QCD is applicable in calculations of exclusive processes at available momentum transfers. They show that the currently used method of determining hadronic quark distribution amplitudes from QCD sum rules yields wave functions which are completely undetermined because the polynomial expansion diverges. Because of the indeterminacy of the wave functions no statement can be made at present as to whether perturbative QCD is valid. The authors emphasize the necessity of a rigorous discussion of the subject and the importance of experimental data in the range of interest.

  18. Precision microwave applicators and systems for plasma and materials processing

    International Nuclear Information System (INIS)

    Asmussen, J.; Garard, R.

    1988-01-01

    Modern applications of microwave energy have imposed new requirements upon microwave processing systems. Interest in energy efficiency, processing uniformity and control of process cycles has placed new design conditions upon microwave power oscillators, microwave systems and microwave applicator design. One approach to meeting new application requirements is the use of single-mode or controlled multimode applicators. The use of a single-mode applicator for plasma generation and materials processing will be presented. Descriptions of actual applicator designs for heating, curing, and processing of solid materials and for the generation of high- and low-pressure discharges will be given. The impact of these applicators on the total microwave system, including the microwave power source, will be described. Specific examples of applicators and associated microwave systems will be detailed for the applications of (1) plasma thin-film deposition and (2) the precision processing and diagnosis of materials. Methods of process control and diagnosis, control of process uniformity and process scale-up are discussed.

  19. Differential algebras with remainder and rigorous proofs of long-term stability

    International Nuclear Information System (INIS)

    Berz, Martin

    1997-01-01

    It is shown how in addition to determining Taylor maps of general optical systems, it is possible to obtain rigorous interval bounds for the remainder term of the n-th order Taylor expansion. To this end, the three elementary operations of addition, multiplication, and differentiation in the Differential Algebraic approach are augmented by suitable interval operations in such a way that a remainder bound of the sum, product, and derivative is obtained from the Taylor polynomial and remainder bound of the operands. The method can be used to obtain bounds for the accuracy with which a Taylor map represents the true map of the particle optical system. In a more general sense, it is also useful for a variety of other numerical problems, including rigorous global optimization of highly complex functions. Combined with methods to obtain pseudo-invariants of repetitive motion and extensions of the Lyapunov- and Nekhoroshev stability theory, the latter can be used to guarantee stability for storage rings and other weakly nonlinear systems
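
    The remainder-bound bookkeeping described above can be illustrated with a deliberately simplified, first-order, one-variable "Taylor model": a polynomial part plus a guaranteed interval enclosing everything that was truncated. The class below is a toy sketch under that simplification (the Differential Algebraic approach itself is n-th order and multivariate, and also includes differentiation); all names and the domain half-width are invented for illustration.

    ```python
    # Sketch of interval-augmented ("Taylor model") arithmetic in one
    # variable, truncated at order 1 on the domain x in [-H, H].
    # The paper's version is n-th order and multivariate; this is a toy.
    from dataclasses import dataclass

    H = 0.1  # half-width of the domain, illustrative

    @dataclass
    class TM1:
        c0: float; c1: float          # polynomial part c0 + c1*x
        lo: float; hi: float          # guaranteed remainder interval

        def poly_bound(self):
            # Interval enclosure of c0 + c1*x over [-H, H].
            r = abs(self.c1) * H
            return (self.c0 - r, self.c0 + r)

        def __add__(self, other):
            # (p + I) + (q + J) = (p + q) + (I + J)
            return TM1(self.c0 + other.c0, self.c1 + other.c1,
                       self.lo + other.lo, self.hi + other.hi)

        def __mul__(self, other):
            # (p + I)(q + J) = pq + pJ + qI + IJ.  The order-2 term of
            # pq and all cross terms are swept into the remainder.
            c0 = self.c0 * other.c0
            c1 = self.c0 * other.c1 + self.c1 * other.c0
            t2 = abs(self.c1 * other.c1) * H * H   # truncated x^2 term
            lo, hi = -t2, t2
            for (a, b), (jlo, jhi) in ((self.poly_bound(), (other.lo, other.hi)),
                                       (other.poly_bound(), (self.lo, self.hi))):
                prods = [a * jlo, a * jhi, b * jlo, b * jhi]
                lo += min(prods); hi += max(prods)
            prods = [self.lo * other.lo, self.lo * other.hi,
                     self.hi * other.lo, self.hi * other.hi]
            lo += min(prods); hi += max(prods)
            return TM1(c0, c1, lo, hi)

    x = TM1(0.0, 1.0, 0.0, 0.0)              # x itself, zero remainder
    f = (x + TM1(1.0, 0.0, 0.0, 0.0)) * x    # f(x) = x + x^2
    print(f)  # polynomial part x; remainder encloses x^2 on [-H, H]
    ```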

  20. A Draft Conceptual Framework of Relevant Theories to Inform Future Rigorous Research on Student Service-Learning Outcomes

    Science.gov (United States)

    Whitley, Meredith A.

    2014-01-01

    While the quality and quantity of research on service-learning has increased considerably over the past 20 years, researchers as well as governmental and funding agencies have called for more rigor in service-learning research. One key variable in improving rigor is using relevant existing theories to improve the research. The purpose of this…

  1. Building an Evidence Base to Inform Interventions for Pregnant and Parenting Adolescents: A Call for Rigorous Evaluation

    Science.gov (United States)

    Burrus, Barri B.; Scott, Alicia Richmond

    2012-01-01

    Adolescent parents and their children are at increased risk for adverse short- and long-term health and social outcomes. Effective interventions are needed to support these young families. We studied the evidence base and found a dearth of rigorously evaluated programs. Strategies from successful interventions are needed to inform both intervention design and policies affecting these adolescents. The lack of rigorous evaluations may be attributable to inadequate emphasis on and sufficient funding for evaluation, as well as to challenges encountered by program evaluators working with this population. More rigorous program evaluations are urgently needed to provide scientifically sound guidance for programming and policy decisions. Evaluation lessons learned have implications for other vulnerable populations. PMID:22897541

  2. 48 CFR 719.273-6 - Application process.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Application process. 719.273-6 Section 719.273-6 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT...égé Program 719.273-6 Application process. Entities interested in becoming a Mentor firm must...

  3. Rigorous Mathematical Thinking Approach to Enhance Students’ Mathematical Creative and Critical Thinking Abilities

    Science.gov (United States)

    Hidayat, D.; Nurlaelah, E.; Dahlan, J. A.

    2017-09-01

    Mathematical creative thinking and mathematical critical thinking are two abilities that need to be developed in the learning of mathematics. Therefore, efforts need to be made to design learning that is capable of developing both abilities. The purpose of this research is to examine the mathematical creative and critical thinking abilities of students who receive the rigorous mathematical thinking (RMT) approach and students who receive an expository approach. This research was a quasi-experiment with a control-group pretest-posttest design. The population was all 11th-grade students in one senior high school in Bandung. The results showed that the achievement of mathematical creative and critical thinking abilities of students who received RMT was better than that of students who received the expository approach. The use of psychological tools and mediation, with the criteria of intentionality, reciprocity, and mediation of meaning, in RMT helps students develop the conditions for critical and creative processes. This achievement contributes to the development of an integrated learning design for students' critical and creative thinking processes.

  4. Applying rigorous decision analysis methodology to optimization of a tertiary recovery project

    International Nuclear Information System (INIS)

    Wackowski, R.K.; Stevens, C.E.; Masoner, L.O.; Attanucci, V.; Larson, J.L.; Aslesen, K.S.

    1992-01-01

    The intent of this study was to rigorously examine all of the possible expansion, investment, operational, and CO2 purchase/recompression scenarios (over 2500) to yield a strategy that would maximize the net present value of the CO2 project at the Rangely Weber Sand Unit. Traditional methods of project management, which involve analyzing large numbers of single-case economic evaluations, were found to be too cumbersome and inaccurate for an analysis of this scope. The decision analysis methodology utilized a statistical approach which resulted in a range of economic outcomes. Advantages of the decision analysis methodology included: a more organized approach to the classification of decisions and uncertainties; a clear sensitivity method to identify the key uncertainties; an application of probabilistic analysis through the decision tree; and a comprehensive display of the range of possible outcomes for communication to decision makers. This range made it possible to consider the upside and downside potential of the options and to weigh these against the Unit's strategies. Savings in time and manpower required to complete the study were also realized.
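
    A minimal sketch of the probabilistic decision-tree idea follows: each candidate strategy is scored by its probability-weighted NPV over all combinations of discrete uncertainty outcomes, and the best-scoring strategy is kept. The decisions, probabilities, and dollar figures below are invented placeholders, not data from the Rangely study.

    ```python
    # Toy probabilistic decision tree: score each decision by its
    # expected NPV over probability-weighted scenario combinations.
    # All alternatives, probabilities, and NPVs are illustrative.
    from itertools import product

    decisions = {
        "expand_pattern_A": {"capex": 40.0},
        "expand_pattern_B": {"capex": 65.0},
    }

    # Uncertainties with discrete outcomes: (label, probability, NPV delta).
    uncertainties = {
        "co2_price":    [("low", 0.3, 25.0), ("high", 0.7, -10.0)],
        "oil_response": [("poor", 0.4, 30.0), ("good", 0.6, 120.0)],
    }

    def expected_npv(decision):
        total = 0.0
        outcomes = [uncertainties[u] for u in uncertainties]
        for combo in product(*outcomes):
            prob = 1.0
            npv = -decisions[decision]["capex"]
            for _, p, delta in combo:
                prob *= p
                npv += delta
            total += prob * npv
        return total

    for d in decisions:
        print(f"{d}: E[NPV] = {expected_npv(d):.1f}")
    print("preferred strategy:", max(decisions, key=expected_npv))
    ```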

  5. Advances in heuristic signal processing and applications

    CERN Document Server

    Chatterjee, Amitava; Siarry, Patrick

    2013-01-01

    There have been significant developments in the design and application of algorithms for both one-dimensional signal processing and multidimensional signal processing, namely image and video processing, with the recent focus changing from a step-by-step procedure of designing the algorithm first and following up with in-depth analysis and performance improvement, to instead applying heuristic-based methods to solve signal-processing problems. In this book the contributing authors demonstrate both general-purpose algorithms and those aimed at solving specialized application problems, with a spec…

  6. Fuzzy image processing and applications with Matlab

    CERN Document Server

    Chaira, Tamalika

    2009-01-01

    In contrast to classical image analysis methods that employ "crisp" mathematics, fuzzy set techniques provide an elegant foundation and a set of rich methodologies for diverse image-processing tasks. However, a solid understanding of fuzzy processing requires a firm grasp of essential principles and background knowledge. Fuzzy Image Processing and Applications with MATLAB® presents the integral science and essential mathematics behind this exciting and dynamic branch of image processing, which is becoming increasingly important to applications in areas such as remote sensing and medical imaging…

  7. Mathematical principles of signal processing Fourier and wavelet analysis

    CERN Document Server

    Brémaud, Pierre

    2002-01-01

    Fourier analysis is one of the most useful tools in many applied sciences. The recent developments of wavelet analysis indicate that in spite of its long history and well-established applications, the field is still one of active research. This text bridges the gap between engineering and mathematics, providing a mathematically rigorous introduction to Fourier analysis, wavelet analysis and related mathematical methods, while emphasizing their uses in signal processing and other applications in communications engineering. The interplay between Fourier series and Fourier transforms is at the heart of signal processing, which is couched most naturally in terms of the Dirac delta function and Lebesgue integrals. The exposition is organized into four parts. The first is a discussion of one-dimensional Fourier theory, including the classical results on convergence and the Poisson sum formula. The second part is devoted to the mathematical foundations of signal processing - sampling, filtering, digital signal proc…

  8. Parallel processing for fluid dynamics applications

    International Nuclear Information System (INIS)

    Johnson, G.M.

    1989-01-01

    The impact of parallel processing on computational science and, in particular, on computational fluid dynamics is growing rapidly. In this paper, particular emphasis is given to developments which have occurred within the past two years. Parallel processing is defined and the reasons for its importance in high-performance computing are reviewed. Parallel computer architectures are classified according to the number and power of their processing units, their memory, and the nature of their connection scheme. Architectures which show promise for fluid dynamics applications are emphasized. Fluid dynamics problems are examined for parallelism inherent at the physical level. CFD algorithms and their mappings onto parallel architectures are discussed. Several examples are presented to document the performance of fluid dynamics applications on present-generation parallel processing devices.

  9. HTR's role in process heat applications

    International Nuclear Information System (INIS)

    Kuhr, Reiner

    2008-01-01

    Advanced high-temperature nuclear reactors create a number of new opportunities for nuclear process heat applications. These opportunities are based on the high-temperature heat available, smaller reactor sizes, and enhanced safety features that allow siting close to process plants. Major sources of value include the displacement of premium fuels, the elimination of CO2 emissions from the combustion of conventional fuels, and their use to produce hydrogen. High-value applications include steam production and cogeneration, steam methane reforming, and water splitting. Market entry by advanced high-temperature reactor technology is challenged by the evolution of nuclear licensing requirements in countries targeted for early applications, by the development of a customer base not familiar with nuclear technology and related issues, by the convergence of oil industry and nuclear industry risk management, by the development of public and government policy support, by the resolution of nuclear waste and proliferation concerns, and by the development of new business entities and business models to support commercialization. New HTR designs may see a larger opportunity in process heat niche applications than in power, given competition from larger advanced light water reactors. Technology development is required in many areas to enable these new applications, including the commercialization of new heat exchangers capable of operating at high temperatures and pressures, convective process reactors and suitable catalysts, water-splitting system and component designs, and other process-side requirements. Key forces that will shape these markets include future fuel availability and pricing, implementation and monetization of CO2 emission limits, and the formation of international energy and environmental policy that will support initiatives to provide the nuclear licensing frameworks and risk distribution needed to support private investment. This paper was developed based on a plenary…

  10. Consumer Attitudes to the Higher Education Application Process.

    Science.gov (United States)

    Clarke, Geraldine; Brown, M. A.

    1998-01-01

    A survey investigated the feelings and beliefs of British university applicants at one point in the application process: after the application is acknowledged but before offers or examination results are available. Results suggest many see the process as straightforward, helpful, friendly, but a sizeable group consider it slow and complex. Many…

  11. Feedback for relatedness and competence : Can feedback in blended learning contribute to optimal rigor, basic needs, and motivation?

    NARCIS (Netherlands)

    Bombaerts, G.; Nickel, P.J.

    2017-01-01

    We inquire how peer and tutor feedback influences students' optimal rigor, basic needs and motivation. We analyze questionnaires from two courses in two subsequent years. We conclude that feedback in blended learning can contribute to rigor and basic needs, but it is not clear from our data what…

  12. Technical review of process heat applications using the HTGR

    International Nuclear Information System (INIS)

    Brierley, G.

    1976-06-01

    The demand for process heat applications is surveyed. Those applications which can be served only by the high temperature gas-cooled reactor (HTGR) are identified and the status of process heat applications in Europe, USA, and Japan in December 1975 is discussed. Technical problems associated with the HTGR for process heat applications are outlined together with an appraisal of the safety considerations involved. (author)

  13. The effect of rigor mortis on the passage of erythrocytes and fluid through the myocardium of isolated dog hearts.

    Science.gov (United States)

    Nevalainen, T J; Gavin, J B; Seelye, R N; Whitehouse, S; Donnell, M

    1978-07-01

    The effect of normal and artificially induced rigor mortis on the vascular passage of erythrocytes and fluid through isolated dog hearts was studied. Increased rigidity of 6-mm thick transmural sections through the centre of the posterior papillary muscle was used as an indication of rigor. The perfusibility of the myocardium was tested by injecting 10 ml of 1% sodium fluorescein in Hanks solution into the circumflex branch of the left coronary artery. In prerigor hearts (20 minute incubation) fluorescein perfused the myocardium evenly whether or not it was preceded by an injection of 10 ml of heparinized dog blood. Rigor mortis developed in all hearts after 90 minutes incubation or within 20 minutes of perfusing the heart with 50 ml of 5 mM iodoacetate in Hanks solution. Fluorescein injected into hearts in rigor did not enter the posterior papillary muscle and adjacent subendocardium whether or not it was preceded by heparinized blood. Thus the vascular occlusion caused by rigor in the dog heart appears to be so effective that it prevents flow into the subendocardium of small soluble ions such as fluorescein.

  14. Statistics for mathematicians a rigorous first course

    CERN Document Server

    Panaretos, Victor M

    2016-01-01

    This textbook provides a coherent introduction to the main concepts and methods of one-parameter statistical inference. Intended for students of Mathematics taking their first course in Statistics, the focus is on Statistics for Mathematicians rather than on Mathematical Statistics. The goal is not to focus on the mathematical/theoretical aspects of the subject, but rather to provide an introduction to the subject tailored to the mindset and tastes of Mathematics students, who are sometimes turned off by the informal nature of Statistics courses. This book can be used as the basis for an elementary semester-long first course on Statistics with a firm sense of direction that does not sacrifice rigor. The deeper goal of the text is to attract the attention of promising Mathematics students.

  15. Application of membrane technologies for liquid radioactive waste processing

    International Nuclear Information System (INIS)

    2004-01-01

    Membrane separation processes have made impressive progress since the first synthesis of membranes almost 40 years ago. This progress was driven by strong technological needs and commercial expectations. As a result, the range of successful applications of membranes and membrane processes is continuously broadening. In addition, the growing application of membrane processes and technologies reflects the increasing variety of the nature and characteristics of commercial membranes and membrane apparatus. The objective of this report is to review the information on the application of membrane technologies in the processing of liquid radioactive waste. The report covers the various types of membranes, equipment design, range of applications, operational experience, and the performance characteristics of different membrane processes. The report aims to provide Member States with basic information on the applicability and limitations of membrane separation technologies for processing liquid radioactive waste streams.

  16. Radio-Frequency Applications for Food Processing and Safety.

    Science.gov (United States)

    Jiao, Yang; Tang, Juming; Wang, Yifen; Koral, Tony L

    2018-03-25

    Radio-frequency (RF) heating, as a thermal-processing technology, has been extending its applications in the food industry. Although RF has shown some unique advantages over conventional methods in industrial drying and frozen food thawing, more research is needed to make it applicable for food safety applications because of its complex heating mechanism. This review provides comprehensive information regarding RF-heating history, mechanism, fundamentals, and applications that have already been fully developed or are still under research. The application of mathematical modeling as a useful tool in RF food processing is also reviewed in detail. At the end of the review, we summarize the active research groups in the RF food thermal-processing field, and address the current problems that still need to be overcome.
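
    As an illustration of the kind of mathematical modeling the review discusses, a minimal 1-D conduction model with a uniform volumetric RF power term can be written as an explicit finite-difference loop; all material and process parameters below are invented, food-like values, not results from the review.

    ```python
    # Minimal 1-D heat conduction with a uniform volumetric RF power
    # term, solved by explicit finite differences. All parameters are
    # invented, illustrative values (not from the review).
    import numpy as np

    L, nx = 0.02, 41                  # slab thickness (m), grid points
    k, rho, cp = 0.5, 1000.0, 3500.0  # W/mK, kg/m3, J/kgK (food-like)
    q_rf = 5e5                        # volumetric RF power, W/m3
    dx = L / (nx - 1)
    alpha = k / (rho * cp)
    dt = 0.4 * dx**2 / alpha          # stable explicit time step

    T = np.full(nx, 20.0)             # initial temperature, deg C
    for _ in range(int(60.0 / dt)):   # simulate one minute
        Tn = T.copy()
        T[1:-1] = (Tn[1:-1]
                   + alpha * dt / dx**2 * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2])
                   + q_rf * dt / (rho * cp))
        T[0] = T[-1] = 20.0           # surfaces held at ambient
    print(f"centre temperature after 60 s: {T[nx // 2]:.1f} C")
    ```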

  17. Application of risk analysis and quality control methods for improvement of lead molding process

    Directory of Open Access Journals (Sweden)

    H. Gołaś

    2016-10-01

    The aim of the paper is to highlight the significance of applying risk analysis and quality control methods for the improvement of the parameters of a lead molding process. For this reason, a Failure Mode and Effect Analysis (FMEA) was developed in the conceptual stage of a new product, TC-G100-NR. However, the final product was faulty (a complete lack of adhesion of the brass insert to the lead) regardless of the previously defined potential problem and its preventive action. This contributed to the recognition of root causes and corrective actions and to a change of production parameters. It showed how these methods, the level of their organization, and systematic and rigorous study affect molding process parameters.
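
    The core FMEA bookkeeping is simple enough to sketch: each failure mode gets severity, occurrence, and detection ratings, and modes are ranked by the risk priority number RPN = S x O x D. The entries below are invented for illustration and are not taken from the TC-G100-NR analysis.

    ```python
    # Minimal FMEA table: risk priority number RPN = S * O * D,
    # with severity, occurrence, and detection each rated 1-10.
    # Entries are invented for illustration, not from the paper.
    failure_modes = [
        # (failure mode,             S, O, D)
        ("no adhesion of insert",    8, 4, 5),
        ("incomplete mold filling",  6, 3, 4),
        ("porosity in cast lead",    5, 5, 6),
    ]

    ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3],
                    reverse=True)
    for mode, s, o, d in ranked:
        print(f"RPN {s * o * d:3d}  {mode}")
    ```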

  18. Atlantic salmon skin and fillet color changes effected by perimortem handling stress, rigor mortis, and ice storage.

    Science.gov (United States)

    Erikson, U; Misimi, E

    2008-03-01

    The changes in skin and fillet color of anesthetized and exhausted Atlantic salmon were determined immediately after killing, during rigor mortis, and after ice storage for 7 d. Skin color (CIE L*, a*, b*, and related values) was determined by a Minolta Chroma Meter. Roche SalmoFan Lineal and Roche Color Card values were determined by a computer vision method and a sensory panel. Before color assessment, the stress levels of the 2 fish groups were characterized in terms of white muscle parameters (pH, rigor mortis, and core temperature). The results showed that perimortem handling stress initially significantly affected several color parameters of skin and fillets. Significant transient fillet color changes also occurred in the prerigor phase and during the development of rigor mortis. Our results suggested that fillet color was affected by postmortem glycolysis (pH drop, particularly in anesthetized fillets), then by onset and development of rigor mortis. The color change patterns during storage were different for the 2 groups of fish. The computer vision method was considered suitable for automated (online) quality control and grading of salmonid fillets according to color.

  19. Rigorous results on measuring the quark charge below color threshold

    International Nuclear Information System (INIS)

    Lipkin, H.J.

    1979-01-01

    Rigorous theorems are presented showing that contributions from a color nonsinglet component of the current to matrix elements of a second order electromagnetic transition are suppressed by factors inversely proportional to the energy of the color threshold. Parton models which obtain matrix elements proportional to the color average of the square of the quark charge are shown to neglect terms of the same order of magnitude as terms kept. (author)

  20. Rigorous Integration of Non-Linear Ordinary Differential Equations in Chebyshev Basis

    Czech Academy of Sciences Publication Activity Database

    Dzetkulič, Tomáš

    2015-01-01

    Roč. 69, č. 1 (2015), s. 183-205 ISSN 1017-1398 R&D Projects: GA MŠk OC10048; GA ČR GD201/09/H057 Institutional research plan: CEZ:AV0Z10300504 Keywords : Initial value problem * Rigorous integration * Taylor model * Chebyshev basis Subject RIV: IN - Informatics, Computer Science Impact factor: 1.366, year: 2015

  1. Industrial Applications of Image Processing

    Science.gov (United States)

    Ciora, Radu Adrian; Simion, Carmen Mihaela

    2014-11-01

    The recent advances in sensor quality and processing power provide us with excellent tools for designing more complex image processing and pattern recognition tasks. In this paper we review the existing applications of image processing and pattern recognition in industrial engineering. First, we define the role of vision in an industrial setting. Then an overview of some image processing techniques is presented: feature extraction, object recognition and industrial robotic guidance. Moreover, examples of implementations of such techniques in industry are presented. Such implementations include automated visual inspection, process control, part identification and robot control. Finally, we present some conclusions regarding the investigated topics and directions for future investigation.

  2. Parallel and distributed processing: applications to power systems

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Felix; Murphy, Liam [California Univ., Berkeley, CA (United States). Dept. of Electrical Engineering and Computer Sciences

    1994-12-31

    Applications of parallel and distributed processing to power systems problems are still in the early stages. Rapid progress in computing and communications promises a revolutionary increase in the capacity of distributed processing systems. In this paper, the state-of-the art in distributed processing technology and applications is reviewed and future trends are discussed. (author) 14 refs.,1 tab.

  3. Reviving Markov processes and applications

    International Nuclear Information System (INIS)

    Cai, H.

    1988-01-01

    In this dissertation we study a procedure which restarts a Markov process when the process is killed by some arbitrary multiplicative functional. The regenerative nature of this revival procedure is characterized through a Markov renewal equation. An interesting duality between the revival procedure and the classical killing operation is found. Under the condition that the multiplicative functional possesses an intensity, the generators of the revival process can be written down explicitly. An intimate connection is also found between the perturbation of the sample path of a Markov process and the perturbation of a generator (in Kato's sense). Applications of the theory include the study of processes such as the piecewise-deterministic Markov process, the virtual waiting time process and the first entrance decomposition (taboo probability).

  4. Rigorous quantum limits on monitoring free masses and harmonic oscillators

    Science.gov (United States)

    Roy, S. M.

    2018-03-01

    There are heuristic arguments proposing that the accuracy of monitoring the position of a free mass m is limited by the standard quantum limit (SQL): σ²(X(t)) ≥ σ²(X(0)) + (t²/m²)σ²(P(0)) ≥ ħt/m, where σ²(X(t)) and σ²(P(t)) denote variances of the Heisenberg representation position and momentum operators. Yuen [Phys. Rev. Lett. 51, 719 (1983)] discovered that there are contractive states for which this result is incorrect. Here I prove universally valid rigorous quantum limits (RQL), viz. rigorous upper and lower bounds on σ²(X(t)) in terms of σ²(X(0)) and σ²(P(0)), given by Eq. (12) for a free mass and by Eq. (36) for an oscillator. I also obtain the maximally contractive and maximally expanding states which saturate the RQL, and use the contractive states to set up an Ozawa-type measurement theory with accuracies respecting the RQL but beating the standard quantum limit. The contractive states for oscillators improve on the Schrödinger coherent states of constant variance and may be useful for gravitational wave detection and optical communication.
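
    The origin of the heuristic SQL, and the loophole that contractive states exploit, can be seen from the free-particle Heisenberg evolution; the following standard derivation is added for orientation and is not taken from the paper.

    ```latex
    % Free evolution X(t) = X(0) + (t/m) P(0) gives
    \sigma^2\big(X(t)\big) = \sigma^2\big(X(0)\big)
      + \frac{t^2}{m^2}\,\sigma^2\big(P(0)\big)
      + \frac{t}{m}\,\big\langle \{\Delta X(0), \Delta P(0)\} \big\rangle .
    % The heuristic SQL ignores the last (symmetrized covariance) term;
    % Yuen's contractive states make it negative, so \sigma^2(X(t)) can
    % temporarily fall below \hbar t/m, which is why rigorous upper and
    % lower bounds (the RQL) are needed.
    ```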

  5. Rigor force responses of permeabilized fibres from fast and slow skeletal muscles of aged rats.

    Science.gov (United States)

    Plant, D R; Lynch, G S

    2001-09-01

    1. Ageing is generally associated with a decline in skeletal muscle mass and strength and a slowing of muscle contraction, factors that impact upon the quality of life for the elderly. The mechanisms underlying this age-related muscle weakness have not been fully resolved. The purpose of the present study was to determine whether the decrease in muscle force as a consequence of age could be attributed partly to a decrease in the number of cross-bridges participating during contraction. 2. Given that the rigor force is proportional to the approximate total number of interacting sites between the actin and myosin filaments, we tested the null hypothesis that the rigor force of permeabilized muscle fibres from young and old rats would not be different. 3. Permeabilized fibres from the extensor digitorum longus (fast-twitch; EDL) and soleus (predominantly slow-twitch) muscles of young (6 months of age) and old (27 months of age) male F344 rats were activated in Ca²⁺-buffered solutions to determine force-pCa characteristics (where pCa = −log₁₀[Ca²⁺]) and then in solutions lacking ATP and Ca²⁺ to determine rigor force levels. 4. The rigor forces for EDL and soleus muscle fibres were not different between young and old rats, indicating that the approximate total number of cross-bridges that can be formed between filaments did not decline with age. We conclude that the age-related decrease in force output is more likely attributed to a decrease in the force per cross-bridge and/or decreases in the efficiency of excitation-contraction coupling.
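
    Force-pCa characteristics of the kind measured here are conventionally summarized by a Hill fit; the sketch below evaluates such a curve, with the half-activation point pCa50 and Hill coefficient nH set to invented, illustrative values rather than fitted parameters from this study.

    ```python
    # Generic Hill-equation description of a force-pCa curve, as is
    # conventional for permeabilized-fibre data. pCa50 and the Hill
    # coefficient nH below are invented, illustrative values.
    def relative_force(pCa, pCa50=5.8, nH=3.0):
        ca = 10.0 ** (-pCa)           # pCa = -log10[Ca2+]
        k = 10.0 ** (-pCa50)
        return ca**nH / (k**nH + ca**nH)

    for pCa in (7.0, 6.0, 5.8, 5.0, 4.0):
        print(f"pCa {pCa}: F/Fmax = {relative_force(pCa):.3f}")
    # At pCa = pCa50 the curve passes through F/Fmax = 0.5 by design.
    ```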

  6. Application of High Pressure in Food Processing

    Directory of Open Access Journals (Sweden)

    Herceg, Z.

    2011-01-01

    In high pressure processing, foods are subjected to pressures generally in the range of 100-800 MPa (up to 1200 MPa). The processing temperature during pressure treatments can be adjusted from below 0 °C to above 100 °C, with exposure times ranging from a few seconds to 20 minutes and even longer, depending on process conditions. The effects of high pressure are system volume reduction and acceleration of reactions that lead to volume reduction. The main areas of interest regarding high-pressure processing of food include: inactivation of microorganisms, modification of biopolymers, quality retention (especially in terms of flavour and colour), and changes in product functionality. Food components responsible for the nutritive value and sensory properties of food remain unaffected by high pressure. Based on the theoretical background of high-pressure processing and taking into account its advantages and limitations, this paper aims to show its possible application in food processing. The paper gives an outline of the special equipment used in high-pressure processing. Typical high-pressure equipment, in which pressure can be generated either by direct or indirect compression, is presented together with three major types of high-pressure food processing: the conventional (batch) system, semicontinuous and continuous systems. In addition to looking at this technology's ability to inactivate microorganisms at room temperature, which makes it the ultimate alternative to thermal treatments, this paper also explores its application in dairy, meat, fruit and vegetable processing. Presented here are the effects of high-pressure treatment in milk and dairy processing on the inactivation of microorganisms and the modification of milk protein, which has a major impact on the rennet coagulation and curd formation properties of treated milk. The possible application of this treatment in controlling cheese manufacture, ripening and safety is discussed. The opportunities…

  7. Rigorous simulations of a helical core fiber by the use of transformation optics formalism.

    Science.gov (United States)

    Napiorkowski, Maciej; Urbanczyk, Waclaw

    2014-09-22

    We report for the first time on rigorous numerical simulations of a helical-core fiber using a full vectorial method based on the transformation optics formalism. We modeled the dependence of the circular birefringence of the fundamental mode on the helix pitch and analyzed the effect of a birefringence increase caused by the mode displacement induced by the core twist. Furthermore, we analyzed the complex field evolution versus the helix pitch in the first-order modes, including polarization and intensity distribution. Finally, we show that the use of the rigorous vectorial method allows better prediction of the confinement loss of the guided modes compared to approximate methods based on equivalent in-plane bending models.

  8. Advances in Nuclear Power Process Heat Applications

    International Nuclear Information System (INIS)

    2012-05-01

    Following an IAEA coordinated research project, this publication compiles the findings of research and development activities related to practical nuclear process heat applications. An overview of current progress on high temperature gas cooled reactors coupling schemes for different process heat applications, such as hydrogen production and desalination is included. The associated safety aspects are also highlighted. The summary report documents the results and conclusions of the project.

  9. Authorization of Animal Experiments Is Based on Confidence Rather than Evidence of Scientific Rigor

    Science.gov (United States)

    Nathues, Christina; Würbel, Hanno

    2016-01-01

    animal experiments are lacking important information about experimental conduct that determines the scientific validity of the findings, which may be critical for the weight attributed to the benefit of the research in the harm–benefit analysis. Similar to manuscripts getting accepted for publication despite poor reporting of measures against bias, applications for animal experiments may often be approved based on implicit confidence rather than explicit evidence of scientific rigor. Our findings shed serious doubt on the current authorization procedure for animal experiments, as well as the peer-review process for scientific publications, which in the long run may undermine the credibility of research. Developing existing authorization procedures that are already in place in many countries towards a preregistration system for animal research is one promising way to reform the system. This would not only benefit the scientific validity of findings from animal experiments but also help to avoid unnecessary harm to animals for inconclusive research. PMID:27911892

  10. Application of biohydrometallurgy to uranium ore processing

    International Nuclear Information System (INIS)

    Zhang Jiantang

    1989-01-01

    The development of applications of biohydrometallurgy to uranium ore processing is briefly introduced. The device designed for oxidizing ferrous ions in solution by using a biomembrane, several bacterial leaching methods and the experimental results are given in this paper. The presented biohydrometallurgical process for recovering uranium includes bacterial leaching followed by adsorption using tertiary amine resin 351 and oxidation of ferrous ions in the device with biomembranes. This process brings greater economic benefits for treating silicate-type original ores. The prospects for the application of biohydrometallurgy to solution mining are also discussed.

  11. Computer modeling of lung cancer diagnosis-to-treatment process.

    Science.gov (United States)

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U; Yu, Xinhua; Faris, Nick; Li, Jingshan

    2015-08-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care-delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the necessary data and procedures to develop a DES model of the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their applications in healthcare are introduced, and the approach to deriving a lung cancer diagnosis process model is presented. Similarly, the procedure to derive closed formulas evaluating the diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed.
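
    A toy version of the analytical (Markov chain) modeling mentioned above: treat the care pathway as an absorbing chain and read expected times to treatment off the fundamental matrix. The states and weekly transition probabilities below are invented for illustration, not estimates from the paper.

    ```python
    # Toy absorbing Markov chain for a diagnosis-to-treatment pathway.
    # States and weekly transition probabilities are invented; the
    # expected time to absorption comes from the fundamental matrix
    # N = (I - Q)^-1 of the absorbing chain.
    import numpy as np

    states = ["referral", "imaging", "biopsy/staging", "treatment"]
    P = np.array([            # "treatment" (last state) is absorbing
        [0.2, 0.8, 0.0, 0.0],
        [0.0, 0.3, 0.7, 0.0],
        [0.0, 0.0, 0.4, 0.6],
        [0.0, 0.0, 0.0, 1.0],
    ])

    Q = P[:3, :3]                     # transitions among transient states
    N = np.linalg.inv(np.eye(3) - Q)  # fundamental matrix
    steps = N.sum(axis=1)             # expected steps to treatment
    for s, t in zip(states[:3], steps):
        print(f"from {s}: {t:.2f} weeks on average")
    ```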

  12. 7 CFR 1942.2 - Processing applications.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 13 2010-01-01 2009-01-01 true Processing applications. 1942.2 Section 1942.2 Agriculture Regulations of the Department of Agriculture (Continued) RURAL HOUSING SERVICE, RURAL BUSINESS... agency under Public Law 103-354 may be requested by the applicant under subpart B of part 1900 of this...

  13. Image processing applications: From particle physics to society

    International Nuclear Information System (INIS)

    Sotiropoulou, C.-L.; Citraro, S.; Dell'Orso, M.; Luciano, P.; Gkaitatzis, S.; Giannetti, P.

    2017-01-01

    We present an embedded system for extremely efficient real-time pattern recognition, enabling technological advancements with both scientific and social impact. It is a compact, fast, low-consumption processing unit (PU) based on a combination of Field Programmable Gate Arrays (FPGAs) and the full-custom associative memory chip. The PU has been developed for real-time tracking in particle physics experiments, but delivers flexible features for potential application in a wide range of fields. It has been proposed for accelerated pattern matching in Magnetic Resonance Fingerprinting (biomedical applications), for real-time detection of space debris trails in astronomical images (space applications), and for brain emulation in image processing (cognitive image processing). We illustrate the potential of the PU for these new applications.
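
    The associative-memory principle behind such a PU, comparing an input against a stored pattern bank and returning every fully matched pattern, can be mimicked in a few lines of software (the hardware does the comparison for all patterns in parallel); the pattern bank and event below are invented for illustration.

    ```python
    # Software illustration of associative-memory pattern matching:
    # each stored pattern is a tuple of coarse detector-layer hits and
    # matches when every hit is present in the event (hardware checks
    # all patterns in parallel; here we simply scan the bank).
    pattern_bank = [
        (3, 7, 12, 18),
        (3, 8, 12, 19),
        (4, 8, 13, 19),
    ]

    def match(event_hits):
        """Return indices of bank patterns fully contained in the event."""
        return [i for i, pat in enumerate(pattern_bank)
                if all(h in event_hits for h in pat)]

    event = {3, 7, 8, 12, 18, 19}
    print("matched patterns:", match(event))  # -> [0, 1]
    ```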

  14. THE APPLICATION PROCESS OF HAMBURG RULES, GIVEN THE CONTEXT OF THE EMERGENCE AND ENTRY INTO FORCE OF THE NEW ROMANIAN CIVIL CODE

    Directory of Open Access Journals (Sweden)

    Adriana Elena Belu

    2013-11-01

    Full Text Available The paper aims to conduct a comparative analysis and tries to offer an objective point of view regarding a number of questions arisen in practice, related to the applicability of the 1978 Hamburg Rules and keeping public order of Romanian private international law, such as those that aim at: agreeing upon the applicability of the foreign law by the Romanian parties; applicability of the Hamburg Rules; public nuisance of the Romanian private international law; character of public policy rule of the Hamburg Rules. In the application process of the Hamburg Rules, given the context of the emergence and entry into force of the New Civil Code, obviously, the provisions of the Romanian Civil Code shall apply in addition, where the international convention lacks. Therefore, in order to apply the logic of the provisions of the Civil Code in full compliance with the international standards, though giving priority to the latter rules, a rigorous analysis is required, analysis which becomes more complex given the fact that, in accordance with Art. 230 of Law no. 71/2011 to implement Law no. 287/2009 on the Civil Code, Book II "About Maritime Trade and Sailing" of the Commercial Code, will be abolished upon the entry into force of the Maritime Code, as those provisions remain in force, being applied with priority to the rules of the Civil Code.

  15. A rigorous proof of the Landau-Peierls formula and much more

    DEFF Research Database (Denmark)

    Briet, Philippe; Cornean, Horia; Savoie, Baptiste

    2012-01-01

    We present a rigorous mathematical treatment of the zero-field orbital magnetic susceptibility of a non-interacting Bloch electron gas, at fixed temperature and density, for both metals and semiconductors/insulators. In particular, we obtain the Landau-Peierls formula in the low temperature and density limit as conjectured by Kjeldaas and Kohn (Phys Rev 105:806–813, 1957).

  16. Unmet Need: Improving mHealth Evaluation Rigor to Build the Evidence Base.

    Science.gov (United States)

    Mookherji, Sangeeta; Mehl, Garrett; Kaonga, Nadi; Mechael, Patricia

    2015-01-01

    mHealth, the use of mobile technologies for health, is a growing element of health system activity globally, but evaluation of those activities remains quite scant and constitutes an important knowledge gap for advancing mHealth activities. In 2010, the World Health Organization and Columbia University implemented a small-scale survey to generate preliminary data on evaluation activities used by mHealth initiatives. The authors describe self-reported data from 69 projects in 29 countries. The majority (74%) reported some sort of evaluation activity, primarily nonexperimental in design (62%). The authors developed a 6-point scale of evaluation rigor comprising information on the use of comparison groups, sample size calculation, data collection timing, and randomization. The mean score was low (2.4); half (47%) were conducting evaluations with a minimum threshold (4+) of rigor, indicating use of a comparison group, while less than 20% had randomized the mHealth intervention. The authors were unable to assess whether the rigor score was appropriate for the type of mHealth activity being evaluated. What was clear was that although most data came from mHealth pilot projects aimed at scale-up, few had designed evaluations that would support crucial decisions on whether to scale up and how. Whether the mHealth activity is a strategy to improve health or a tool for achieving intermediate outcomes that should lead to better health, mHealth evaluations must be improved to generate robust evidence for cost-effectiveness assessment and to allow for accurate identification of the contribution of mHealth initiatives to health systems strengthening and the impact on actual health outcomes.

  17. Application engineering for process computer systems

    International Nuclear Information System (INIS)

    Mueller, K.

    1975-01-01

    The variety of tasks for process computers in nuclear power stations necessitates the centralization of all production stages from the planning stage to the delivery of the finished process computer system (PRA) to the user. This so-called 'application engineering' comprises all of the activities connected with the application of the PRA: a) establishment of the PRA concept, b) project counselling, c) handling of offers, d) handling of orders, e) internal handling of orders, f) technical counselling, g) establishing of parameters, h) monitoring deadlines, i) training of customers, j) compiling an operation manual. (orig./AK) [de

  18. Estimation of the time since death--reconsidering the re-establishment of rigor mortis.

    Science.gov (United States)

    Anders, Sven; Kunz, Michaela; Gehl, Axel; Sehner, Susanne; Raupach, Tobias; Beck-Bornholdt, Hans-Peter

    2013-01-01

    In forensic medicine, there is an undefined data background for the phenomenon of re-establishment of rigor mortis after mechanical loosening, a phenomenon used in establishing time since death in forensic casework and thought to occur up to 8 h post-mortem. Nevertheless, the method is widely described in textbooks on forensic medicine. We examined 314 joints (elbow and knee) of 79 deceased at defined time points up to 21 h post-mortem (hpm). Data were analysed using a random intercept model. Here, we show that re-establishment occurred in 38.5% of joints at 7.5 to 19 hpm. Therefore, the maximum time span for the re-establishment of rigor mortis appears to be 2.5-fold longer than previously thought. These findings have a major impact on the estimation of time since death in forensic casework.

  19. College Readiness in California: A Look at Rigorous High School Course-Taking

    Science.gov (United States)

    Gao, Niu

    2016-01-01

    Recognizing the educational and economic benefits of a college degree, education policymakers at the federal, state, and local levels have made college preparation a priority. There are many ways to measure college readiness, but one key component is rigorous high school coursework. California has not yet adopted a statewide college readiness…

  20. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    Science.gov (United States)

    Gray, Kurt

    2017-09-01

    Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org). Theory mapping provides both precision and synthesis, which helps to resolve arguments, prevent redundancies, assess the theoretical contribution of papers, and evaluate the likelihood of surprising effects.

  1. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring and analysis system. A software tool to achieve this has been developed. Two supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models), have been extended rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  2. Optimal correction and design parameter search by modern methods of rigorous global optimization

    International Nuclear Information System (INIS)

    Makino, K.; Berz, M.

    2011-01-01

    Frequently the design of schemes for correction of aberrations or the determination of possible operating ranges for beamlines and cells in synchrotrons exhibit multitudes of possibilities for their correction, usually appearing in disconnected regions of parameter space which cannot be directly qualified by analytical means. In such cases, frequently an abundance of optimization runs are carried out, each of which determines a local minimum depending on the specific chosen initial conditions. Practical solutions are then obtained through an often extended interplay of experienced manual adjustment of certain suitable parameters and local searches by varying other parameters. However, in a formal sense this problem can be viewed as a global optimization problem, i.e. the determination of all solutions within a certain range of parameters that lead to a specific optimum. For example, it may be of interest to find all possible settings of multiple quadrupoles that can achieve imaging; or to find ahead of time all possible settings that achieve a particular tune; or to find all possible manners to adjust nonlinear parameters to achieve correction of high order aberrations. These tasks can easily be phrased in terms of such an optimization problem; but while mathematically this formulation is often straightforward, it has been common belief that it is of limited practical value since the resulting optimization problem cannot usually be solved. However, recent significant advances in modern methods of rigorous global optimization make these methods feasible for optics design for the first time. The key ideas of the method lie in an interplay of rigorous local underestimators of the objective functions, and by using the underestimators to rigorously iteratively eliminate regions that lie above already known upper bounds of the minima, in what is commonly known as a branch-and-bound approach. Recent enhancements of the Differential Algebraic methods used in particle
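
    The branch-and-bound idea described above can be illustrated with a minimal sketch: rigorous lower bounds on subregions are used to discard regions whose best possible value lies above an already known upper bound. The crude interval bound below stands in for the rigorous Differential Algebraic underestimators of the actual method, and the objective function and tolerance are illustrative assumptions.

        # A crude interval branch-and-bound sketch of rigorous global
        # optimization: guaranteed lower bounds on subregions eliminate
        # regions that lie above a known upper bound on the minimum.

        def interval_bounds(lo, hi):
            """Rigorous enclosure of f(x) = x**4 - 3*x**2 + 1 on [lo, hi]."""
            x2_lo = 0.0 if lo <= 0.0 <= hi else min(lo * lo, hi * hi)
            x2_hi = max(lo * lo, hi * hi)
            return x2_lo**2 - 3 * x2_hi + 1, x2_hi**2 - 3 * x2_lo + 1

        def global_min(lo, hi, tol=1e-4):
            best_upper = float("inf")           # proven upper bound on the minimum
            work = [(lo, hi)]
            while work:
                a, b = work.pop()
                f_lo, f_hi = interval_bounds(a, b)
                if f_lo > best_upper:           # region cannot contain the minimum
                    continue
                best_upper = min(best_upper, f_hi)
                if b - a > tol:
                    m = 0.5 * (a + b)
                    work += [(a, m), (m, b)]    # branch and keep searching
            return best_upper

        print(global_min(-2.0, 2.0))            # close to the true minimum -1.25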

  3. 12 CFR 926.5 - Housing associate application process.

    Science.gov (United States)

    2010-01-01

    12 CFR 926.5 (2010), Banks and Banking: Federal Housing Finance Board, Federal Home Loan Bank Members and Housing Associates. § 926.5 Housing associate application process. (a...

  4. An introduction to queueing theory modeling and analysis in applications

    CERN Document Server

    Bhat, U Narayan

    2015-01-01

    This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory in more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing, and computer and communication systems. • A chapter on ...
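
    As a small worked example of the kind of model the book treats, the sketch below computes the closed-form steady-state metrics of the M/M/1 queue; the arrival and service rates are illustrative values, not taken from the text.

        # Closed-form steady-state metrics of the M/M/1 queue, a basic model
        # of the kind analyzed in the book. Rates are illustrative values.

        def mm1_metrics(lam, mu):
            """Steady-state M/M/1 metrics; requires lam < mu for stability."""
            rho = lam / mu                  # server utilization
            L = rho / (1 - rho)             # mean number in system
            W = 1 / (mu - lam)              # mean time in system (L = lam * W)
            Lq = rho**2 / (1 - rho)         # mean queue length
            Wq = rho / (mu - lam)           # mean waiting time in queue
            return {"rho": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

        print(mm1_metrics(lam=4.0, mu=5.0))  # rho=0.8, L=4.0, W=1.0, Lq=3.2, Wq=0.8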

  5. Applications of Radiation Processing in Industry

    International Nuclear Information System (INIS)

    Abad, Lucille V.

    2015-01-01

    Radiation processing has long been known as a commercially viable technology that can be beneficially used to enhance the characteristics of many materials. Several gamma irradiators and electron beam accelerators operating worldwide are utilized for various established industrial applications. These could be used for the following processes: a) radiation crosslinking, e.g. crosslinking of wires and cables, heat-shrinkable film and tube production, manufacture of plastic bags and tubings for medical products, pre-curing of automobile tire components, curing of polymeric coatings, etc.; b) radiation degradation, e.g. degradation of scrap Teflon (polytetrafluoroethylene) to form powders, disinfestation and pasteurization of agricultural products, sterilization of medical products, etc.; and c) radiation grafting, e.g. grafted non-woven fabrics for metal adsorbents. Emerging applications for radiation processing include grafted membranes for fuel cells, electrodes, cell sheets for tissue engineering, nanoparticle production, polymer composite synthesis, and fibrous catalysts for biodiesel production. Current research at the Philippine Nuclear Research Institute consists of crosslinking of natural and synthetic polymers for medical applications, e.g. wound dressings, hemostats, and bioimplants for vesicoureteral reflux (VUR); grafting of natural and synthetic fabrics for metal adsorbents; and radiation degradation of carrageenan as a plant growth promoter. (author)

  6. The rigorous bound on the transmission probability for massless scalar field of non-negative-angular-momentum mode emitted from a Myers-Perry black hole

    Energy Technology Data Exchange (ETDEWEB)

    Ngampitipan, Tritos, E-mail: tritos.ngampitipan@gmail.com [Faculty of Science, Chandrakasem Rajabhat University, Ratchadaphisek Road, Chatuchak, Bangkok 10900 (Thailand); Particle Physics Research Laboratory, Department of Physics, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330 (Thailand); Boonserm, Petarpa, E-mail: petarpa.boonserm@gmail.com [Department of Mathematics and Computer Science, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330 (Thailand); Chatrabhuti, Auttakit, E-mail: dma3ac2@gmail.com [Particle Physics Research Laboratory, Department of Physics, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330 (Thailand); Visser, Matt, E-mail: matt.visser@msor.vuw.ac.nz [School of Mathematics, Statistics, and Operations Research, Victoria University of Wellington, PO Box 600, Wellington 6140 (New Zealand)

    2016-06-02

    Hawking radiation is evidence for the existence of black holes. What an observer can measure through Hawking radiation is the transmission probability. In the laboratory, miniature black holes can successfully be generated; the generated black holes are, most commonly, Myers-Perry black holes. In this paper, we derive the rigorous bounds on the transmission probabilities for massless scalar fields of non-negative-angular-momentum modes emitted from a generated Myers-Perry black hole in six, seven, and eight dimensions. The results show that for low energy, the rigorous bounds increase with the energy of the emitted particles, whereas for high energy they decrease with increasing energy. When the black holes spin faster, the rigorous bounds decrease. The rigorous bounds also decrease with the number of extra dimensions. Furthermore, in comparison with the approximate transmission probability, the rigorous bound proves to be useful.

  7. The rigorous bound on the transmission probability for massless scalar field of non-negative-angular-momentum mode emitted from a Myers-Perry black hole

    International Nuclear Information System (INIS)

    Ngampitipan, Tritos; Boonserm, Petarpa; Chatrabhuti, Auttakit; Visser, Matt

    2016-01-01

    Hawking radiation is evidence for the existence of black holes. What an observer can measure through Hawking radiation is the transmission probability. In the laboratory, miniature black holes can successfully be generated; the generated black holes are, most commonly, Myers-Perry black holes. In this paper, we derive the rigorous bounds on the transmission probabilities for massless scalar fields of non-negative-angular-momentum modes emitted from a generated Myers-Perry black hole in six, seven, and eight dimensions. The results show that for low energy, the rigorous bounds increase with the energy of the emitted particles, whereas for high energy they decrease with increasing energy. When the black holes spin faster, the rigorous bounds decrease. The rigorous bounds also decrease with the number of extra dimensions. Furthermore, in comparison with the approximate transmission probability, the rigorous bound proves to be useful.
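
    For background, rigorous bounds of this family descend from a general result for one-dimensional scattering; a commonly quoted form (taken from the related bound literature as an assumption here, not from the papers above) is, for a potential V(x), asymptotic wavenumber k, and any positive auxiliary function h(x) with h(±∞) = k:

        \[
        T \;\ge\; \operatorname{sech}^{2}\!\left(
          \int_{-\infty}^{\infty}
          \frac{\sqrt{\left[h'(x)\right]^{2} + \left[k^{2} - V(x) - h^{2}(x)\right]^{2}}}{2\,h(x)}\,
          \mathrm{d}x
        \right)
        \]

    Choosing h(x) wisely tightens the bound; the works above apply bounds of this type to the radial mode equations of Myers-Perry backgrounds.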

  8. Use of neural networks in process engineering. Thermodynamics, diffusion, and process control and simulation applications

    International Nuclear Information System (INIS)

    Otero, F

    1998-01-01

    This article presents the current status of the use of Artificial Neural Networks (ANNs) in process engineering applications where common mathematical methods do not completely represent the behavior shown by experimental observations, results, and plant operating data. Three examples of the use of ANNs in typical process engineering applications are shown: prediction of activity in solvent-polymer binary systems, prediction of the surfactant self-diffusion coefficient of micellar systems, and process control and simulation. These examples are important for polymerization applications, enhanced oil recovery, and automatic process control

  9. Application of Hydroforming Process in Sheet Metal Formation

    OpenAIRE

    GRIZELJ, Branko; CUMIN, Josip; ERGIĆ, Todor

    2009-01-01

    This article deals with the theory and application of the hydroforming process. Nowadays automobile manufacturers use high-strength sheet metal plates. These high-strength steel sheet metal plates are strain hardened in the process of metal forming. With the use of high-strength steel, cars are made lightweight, which serves low fuel consumption in the face of high energy prices. Some examples of the application of the hydroforming process are simulated with FEM.

  10. Some comments on rigorous quantum field path integrals in the analytical regularization scheme

    Energy Technology Data Exchange (ETDEWEB)

    Botelho, Luiz C.L. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil). Dept. de Matematica Aplicada]. E-mail: botelho.luiz@superig.com.br

    2008-07-01

    Through the systematic use of the Minlos theorem on the support of cylindrical measures on R^∞, we produce several mathematically rigorous path integrals in interacting euclidean quantum fields with Gaussian free measures defined by generalized powers of the Laplacian operator. (author)

  11. Some comments on rigorous quantum field path integrals in the analytical regularization scheme

    International Nuclear Information System (INIS)

    Botelho, Luiz C.L.

    2008-01-01

    Through the systematic use of the Minlos theorem on the support of cylindrical measures on R^∞, we produce several mathematically rigorous path integrals in interacting euclidean quantum fields with Gaussian free measures defined by generalized powers of the Laplacian operator. (author)

  12. A plea for rigorous conceptual analysis as central method in transnational law design

    NARCIS (Netherlands)

    Rijgersberg, R.; van der Kaaij, H.

    2013-01-01

    Although shared problems are generally easily identified in transnational law design, it is considerably more difficult to design frameworks that transcend the peculiarities of local law in a univocal fashion. The following exposition is a plea for giving more prominence to rigorous conceptual analysis in transnational law design.

  13. Ion exchange process: History, evolution and applications

    International Nuclear Information System (INIS)

    Mazzoldi, P.; Carturan, S.; Sada, C.; Quaranta, A.; Sglavo, V.M.

    2013-01-01

    The aim of this paper is to present a review of some aspects and applications of the ion exchange process in glasses, ferroelectrics and polymers in the fields of optics, nanotechnology, gas sensors and chemical strengthening. The formation of nanoparticles in ion-exchanged glasses, as an effect of ion or laser irradiation, is discussed. A discussion of the potential of the ion exchange process in comparison to ion implantation in optical devices and nanotechnology is also introduced. Analytical techniques applied to the study of the ion exchange process are illustrated. Studies of the ion exchange process in natural materials, for applications in water cleaning, form the content of a specific paragraph. Some initial considerations on the "old age" of this technique are introduced.

  14. Assessment of very high-temperature reactors in process applications. Appendix II. VHTR process heat application studies

    International Nuclear Information System (INIS)

    Jones, J.E.; Gambill, W.R.; Cooper, R.H.; Fox, E.C.; Fuller, L.C.; Littlefield, C.C.; Silverman, M.D.

    1977-06-01

    A critical review is presented of the technology and economics for coupling a very high-temperature gas-cooled reactor to a variety of process applications. It is concluded that nuclear steam reforming of light hydrocarbons for coal conversion could be a near-term alternative and that direct nuclear coal gasification could be a future consideration. Thermochemical water splitting appears to be more costly and its availability farther in the future than the coal-conversion systems. Nuclear steelmaking is competitive with the direct reduction of iron ore from conventional coal-conversion processes but not competitive with the reforming of natural gas at present gas prices. Nuclear process heat for petroleum refining, even with the necessary backup systems, is competitive with fossil energy sources. The processing with nuclear heat of oil shale and tar sands is of marginal economic importance. An analysis of peaking power applications using nuclear heat was also made. It is concluded that steam reforming methane for energy storage and production of peaking power is not a viable economic alternative, but that energy storage with a high-temperature heat transfer salt (HTS) is competitive with conventional peaking systems. An examination of the materials required in process heat exchangers is made

  15. Empirical processes: theory and applications

    OpenAIRE

    Venturini Sergio

    2005-01-01

    Proceedings of the 2003 Summer School in Statistics and Probability in Torgnon (Aosta, Italy) held by Prof. Jon A. Wellner and Prof. M. Banerjee. The topic presented was the theory of empirical processes with applications to statistics (m-estimation, bootstrap, semiparametric theory).

  16. A NEW BENCHMARK FOR PLANTWIDE PROCESS CONTROL

    Directory of Open Access Journals (Sweden)

    N. Klafke

    Full Text Available The hydrodealkylation process of toluene (HDA) has been used as a case study in a large number of control studies. However, in terms of industrial application, this process has become obsolete and is nowadays superseded by new technologies capable of processing heavy aromatic compounds, which increase the added value of the raw materials, such as the processes of transalkylation and disproportionation of toluene (TADP). TADP also presents more complex feed and product streams and more challenging operational characteristics in both the reactor and separator sections than HDA. This work proposes the TADP process as a new benchmark for plantwide control studies in lieu of the HDA process. For this purpose, a nonlinear dynamic rigorous model for the TADP process was developed using Aspen Plus™ and Aspen Dynamics™ and industrial conditions. Plantwide control structures (control-oriented and process-oriented) were adapted and applied for the first time to this process. The results show that, even though both strategies are similar in terms of control performance, the optimization of economic factors must still be sought.

  17. Rigor mortis and the epileptology of Charles Bland Radcliffe (1822-1889).

    Science.gov (United States)

    Eadie, M J

    2007-03-01

    Charles Bland Radcliffe (1822-1889) was one of the physicians who made major contributions to the literature on epilepsy in the mid-19th century, when the modern understanding of the disorder was beginning to emerge, particularly in England. His experimental work was concerned with the electrical properties of frog muscle and nerve. Early in his career he related his experimental findings to the phenomenon of rigor mortis and concluded that, contrary to the general belief of the time, muscle contraction depended on the cessation of nerve input, and muscle relaxation on its presence. He adhered to this counter-intuitive interpretation throughout his life and, based on it, produced an epileptology that was very different from those of his contemporaries and successors. His interpretations were ultimately without any direct influence on the advance of knowledge. However, his idea that withdrawal of an inhibitory process released previously suppressed muscular contractile powers, when applied to the brain rather than the periphery of the nervous system, permitted Hughlings Jackson to explain certain psychological phenomena that accompany or follow some epileptic events. As well, Radcliffe was one of the chief early advocates for potassium bromide, the first effective anticonvulsant.

  18. Some applications on laser material processing

    International Nuclear Information System (INIS)

    Oros, C.

    2005-01-01

    An overview of the state-of-the-art in laser material processing is given for a broad range of laser types from the IR (CO2 laser, Nd:YAG laser) to the UV (excimer laser) and different kinds of materials (metals, dielectrics). Laser radiation has found a wide range of applications as a machining tool for various kinds of materials processing. The machining geometry, the workpiece geometry, the material properties and economic productivity call for customized systems with special designs for beam guiding, shaping and delivery in order to fully utilize the laser radiation for surface processing with optimum efficiency, maximum processing speed and high processing quality. The laser-material interaction involves complex processes of heating, melting, vaporization, ejection of atoms, ions, and molecules, shock waves, plasma initiation and plasma expansion. The interaction depends on the laser beam parameters (pulse duration, energy and wavelength), the solid target properties and the surrounding environment. Experimental results for laser surface melting and laser ablation are given. Also, assuming the applicability of a one-dimensional model for the short pulses used, and restricting conditions to single-pulse exposure, the temperature rise on the target was calculated taking account of the finite optical absorption depth and pulse duration of the laser.

  19. Supersymmetry and the Parisi-Sourlas dimensional reduction: A rigorous proof

    International Nuclear Information System (INIS)

    Klein, A.; Landau, L.J.; Perez, J.F.

    1984-01-01

    Functional integrals that are formally related to the average correlation functions of a classical field theory in the presence of random external sources are given a rigorous meaning. Their dimensional reduction to the Schwinger functions of the corresponding quantum field theory in two fewer dimensions is proven. This is done by reexpressing those functional integrals as expectations of a supersymmetric field theory. The Parisi-Sourlas dimensional reduction of a supersymmetric field theory to a usual quantum field theory in two fewer dimensions is proven. (orig.)

  20. Introduction to probability and stochastic processes with applications

    CERN Document Server

    Castañ, Blanco; Arunachalam, Viswanathan; Dharmaraja, Selvamuthu

    2012-01-01

    An easily accessible, real-world approach to probability and stochastic processes Introduction to Probability and Stochastic Processes with Applications presents a clear, easy-to-understand treatment of probability and stochastic processes, providing readers with a solid foundation they can build upon throughout their careers. With an emphasis on applications in engineering, applied sciences, business and finance, statistics, mathematics, and operations research, the book features numerous real-world examples that illustrate how random phenomena occur in nature and how to use probabilistic t

  1. Induction of depressed mood: a test of opponent-process theory.

    Science.gov (United States)

    Ranieri, D J; Zeiss, A M

    1984-12-01

    Solomon's (1980) opponent-process theory of acquired motivation has been used to explain many phenomena in which affective or hedonic contrasts appear to exist, but has not been applied to the induction of depressed mood. The purpose of this study, therefore, was to determine whether opponent-process theory can be applied to this area. Velten's (1968) mood-induction procedure was used and subjects were assigned either to a depression-induction condition or to one of two control groups. Self-report measures of depressed mood were taken before, during, and at several points after the mood induction. Results were not totally consistent with a rigorous set of criteria for supporting an opponent-process interpretation. This suggests that the opponent-process model may not be applicable to induced depressed mood. Possible weaknesses in the experimental design, along with implications for opponent-process theory, are discussed.

  2. Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II

    Energy Technology Data Exchange (ETDEWEB)

    George J. Koperna Jr.; Vello A. Kuuskraa; David E. Riestenberg; Aiysha Sultana; Tyler Van Leeuwen

    2009-06-01

    This report serves as the final technical report and users manual for the 'Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II' SBIR project. Advanced Resources International has developed a screening tool by which users can technically screen, assess the storage capacity of, and quantify the costs of CO2 storage in four types of CO2 storage reservoirs. These include CO2-enhanced oil recovery reservoirs, depleted oil and gas fields (non-enhanced oil recovery candidates), deep coal seams that are amenable to CO2-enhanced methane recovery, and saline reservoirs. The screening function assesses whether the reservoir could likely serve as a safe, long-term CO2 storage reservoir. The storage capacity assessment uses rigorous reservoir simulation models to determine the timing, ultimate storage capacity, and potential for enhanced hydrocarbon recovery. Finally, the economic assessment function determines both the field-level and pipeline (transportation) costs for CO2 sequestration in a given reservoir. The screening tool was peer reviewed at an Electric Power Research Institute (EPRI) technical meeting in March 2009. A number of useful observations and recommendations emerged from the workshop on the costs of CO2 transport and storage that could be readily incorporated into a commercial version of the screening tool in a Phase III SBIR.

  3. Post mortem rigor development in the Egyptian goose (Alopochen aegyptiacus) breast muscle (pectoralis): factors which may affect the tenderness.

    Science.gov (United States)

    Geldenhuys, Greta; Muller, Nina; Frylinck, Lorinda; Hoffman, Louwrens C

    2016-01-15

    Baseline research on the toughness of Egyptian goose meat is required. This study therefore investigates the post mortem pH and temperature decline (15 min to 4 h 15 min post mortem) in the pectoralis muscle (breast portion) of this gamebird species. It also explores the enzyme activity of the Ca(2+)-dependent protease (calpain system) and the lysosomal cathepsins during the rigor mortis period. No differences were found for any of the variables between genders. The pH decline in the pectoralis muscle occurs quite rapidly (c = -0.806; ultimate pH ∼ 5.86) compared with other species, and it is speculated that the high rigor temperature (>20 °C) may contribute to the increased toughness. No calpain I was found in Egyptian goose meat and the µ/m-calpain activity remained constant during the rigor period, while a decrease in calpastatin activity was observed. The cathepsin B, B & L and H activity increased over the rigor period. Further research into the connective tissue content and myofibrillar breakdown during aging is required in order to establish whether the proteolytic enzymes do in fact contribute to tenderisation. © 2015 Society of Chemical Industry.

  4. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: the verification, which is a mathematical issue targeted to assess that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed of the code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and the solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
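
    As an illustration of the solution verification step mentioned above, the sketch below applies Richardson extrapolation to a quantity computed on three systematically refined grids; the grid values are hypothetical, and the procedure shown is the textbook form rather than the GBS-specific implementation.

        # Solution verification by Richardson extrapolation: from results on
        # grids with spacings h, h/2 and h/4, estimate the observed order of
        # accuracy and the discretization error on the finest grid.
        import math

        def observed_order(f_h, f_h2, f_h4, refinement=2.0):
            """Observed convergence order p from three refined-grid results."""
            return math.log(abs(f_h - f_h2) / abs(f_h2 - f_h4)) / math.log(refinement)

        def richardson_estimate(f_h2, f_h4, p, refinement=2.0):
            """Extrapolated value and error estimate on the finest grid."""
            f_exact = f_h4 + (f_h4 - f_h2) / (refinement**p - 1)
            return f_exact, abs(f_exact - f_h4)

        f_h, f_h2, f_h4 = 1.0400, 1.0100, 1.0025   # hypothetical grid results
        p = observed_order(f_h, f_h2, f_h4)         # -> 2.0 (second-order scheme)
        print(p, richardson_estimate(f_h2, f_h4, p))  # -> (1.0000, 0.0025)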

  5. Prospects of HTGR process heat application and role of HTTR

    International Nuclear Information System (INIS)

    Shiozawa, S.; Miyamoto, Y.

    2000-01-01

    At the Japan Atomic Energy Research Institute, an effort to develop process heat applications with the high temperature gas cooled reactor (HTGR) has continued, to provide a future clean alternative to the burning of fossil energy for the production of industrial process heat. The project is named the 'HTTR Heat Utilization Project', which includes a demonstration of hydrogen production using the first Japanese HTGR, the High Temperature Engineering Test Reactor (HTTR). In the meantime, some countries, such as China, Indonesia, Russia and South Africa, are exploring HTGR process heat applications for industrial use. One of the key issues for this application is economy. It has long been recognized, and remains the case, that HTGR heat application systems are not economically competitive with current fossil ones because of the high cost of the HTGR itself. However, recent movement in HTGR development, as represented by the South Africa Pebble Bed Modular Reactor (SA-PBMR) Project, has revealed that HTGRs are economically competitive with fossil fuel energy supply in electricity production under certain conditions. This suggests that HTGR process heat applications may also become economical in the near future. In the present paper, following a brief introduction describing the necessity of HTGRs for future process heat applications, Japanese activities and prospects for the development of process heat applications with HTGRs are described in relation to the HTTR Project. In conclusion, the process heat application system with HTGRs is thought, technically and economically, to be one of the most promising applications to solve the global environmental issues and energy shortages which may arise in the future. However, the commercialization of the hydrogen production system from water, which is the final goal of HTGR process heat application, must await the technology development to be completed in the 2030's at the

  6. Toward a more rigorous application of margins and uncertainties within the nuclear weapons life cycle : a Sandia perspective

    International Nuclear Information System (INIS)

    Klenke, Scott Edward; Novotny, George Charles; Paulsen Robert A., Jr.; Diegert, Kathleen V.; Trucano, Timothy Guy; Pilch, Martin M.

    2007-01-01

    This paper presents the conceptual framework that is being used to define quantification of margins and uncertainties (QMU) for application in the nuclear weapons (NW) work conducted at Sandia National Laboratories. The conceptual framework addresses the margins and uncertainties throughout the NW life cycle and includes the definition of terms related to QMU and to figures of merit. Potential applications of QMU consist of analyses based on physical data and on modeling and simulation. Appendix A provides general guidelines for addressing cases in which significant and relevant physical data are available for QMU analysis. Appendix B gives the specific guidance that was used to conduct QMU analyses in cycle 12 of the annual assessment process. Appendix C offers general guidelines for addressing cases in which appropriate models are available for use in QMU analysis. Appendix D contains an example that highlights the consequences of different treatments of uncertainty in model-based QMU analyses

  7. Bayesian networks applied to process diagnostics. Applications in energy industry

    Energy Technology Data Exchange (ETDEWEB)

    Widarsson, Bjoern (ed.); Karlsson, Christer; Dahlquist, Erik [Maelardalen Univ., Vaesteraas (Sweden); Nielsen, Thomas D.; Jensen, Finn V. [Aalborg Univ. (Denmark)

    2004-10-01

    Uncertainty in process operation occurs frequently in the heat and power industry. This makes it hard to detect the occurrence of an abnormal process state from a number of process signals (measurements), or to find the correct cause of an abnormality. Among several other methods, Bayesian Networks (BN) is a method for building a model which can handle uncertainty in both the process signals and the process itself. The purpose of this project is to investigate the possibilities of using BNs for fault detection and diagnostics in combined heat and power industries through the execution of two different applications. Participants from Aalborg University contribute the knowledge of BNs, and participants from Maelardalen University have the experience of modelling heat and power applications. The co-operation also includes two energy companies, Elsam A/S (Nordjyllandsverket) and Maelarenergi AB (Vaesteraas CHP-plant), where the two applications were made with support from the plant personnel. The project resulted in two quite different applications. At Nordjyllandsverket, an application based (due to the lack of process knowledge) on pure operation data was built, with the capability to detect an abnormal process state in a coal mill. Detection is made through a conflict analysis when entering process signals into a model built by analysing the operation database. The application at Maelarenergi is built with a combination of process knowledge and operation data and can detect various faults caused by the fuel. The process knowledge is used to build a causal network structure, and the structure is then trained by data from the operation database. Both applications were made as off-line applications, but they are ready to be run on-line. The performance of fault detection and diagnostics is good, but a lack of abnormal process states with known cause reduces the evaluation possibilities. Advantages of combining expert knowledge of the process with operation data are the possibility to represent
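
    The conflict analysis used for detection can be sketched with Jensen's conflict measure: if observed evidence is jointly less probable under the normal-operation model than it would be if the observations were independent, the positive conflict value flags a possible abnormal state. The two-signal example and its probabilities below are illustrative assumptions, not values from the coal mill model.

        # Jensen's conflict measure for two observed process signals under a
        # normal-operation model; probabilities are illustrative assumptions.
        import math

        p_high_temp = 0.20        # P(temperature high) under normal operation
        p_low_flow = 0.25         # P(flow low) under normal operation
        p_joint = 0.01            # P(temperature high AND flow low), from the model

        # conf > 0: the observations conflict with the normal-operation model,
        # flagging a possible abnormal process state.
        conf = math.log((p_high_temp * p_low_flow) / p_joint)
        print(f"conflict measure = {conf:.2f}")    # 1.61 > 0 -> possible fault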

  8. Review of computational fluid dynamics applications in biotechnology processes.

    Science.gov (United States)

    Sharma, C; Malhotra, D; Rathore, A S

    2011-01-01

    Computational fluid dynamics (CFD) is well established as a tool of choice for solving problems that involve one or more of the following phenomena: flow of fluids, heat transfer, mass transfer, and chemical reaction. Unit operations that are commonly utilized in biotechnology processes are often complex and as such would greatly benefit from application of CFD. The thirst for deeper process and product understanding that has arisen out of initiatives such as quality by design provides further impetus toward the usefulness of CFD for problems that may otherwise require extensive experimentation. Not surprisingly, there has been increasing interest in applying CFD toward a variety of applications in biotechnology processing in the last decade. In this article, we will review applications in the major unit operations involved with processing of biotechnology products. These include fermentation, centrifugation, chromatography, ultrafiltration, microfiltration, and freeze drying. We feel that the future applications of CFD in biotechnology processing will focus on establishing CFD as a tool of choice for providing process understanding that can then be used to guide more efficient and effective experimentation. This article puts special emphasis on the work done in the last 10 years. © 2011 American Institute of Chemical Engineers

  9. Application of Asymptotic and Rigorous Techniques for the Characterization of Interferences Caused by a Wind Turbine in Its Neighborhood

    Directory of Open Access Journals (Sweden)

    Maria Jesús Algar

    2013-01-01

    Full Text Available This paper presents a complete assessment of the interference caused in nearby radio systems by wind turbines. Three different parameters have been considered: the scattered field of a wind turbine, its radar cross-section (RCS), and the Doppler shift generated by the rotating movement of the blades. These predictions are very useful for the study of the influence of wind farms on radio systems. To achieve this, both high-frequency techniques, such as the Geometrical Theory of Diffraction/Uniform Theory of Diffraction (GTD/UTD) and Physical Optics (PO), and rigorous techniques, like the Method of Moments (MoM), have been used. In the analysis of the scattered field, conductor and dielectric models of the wind turbine have been analyzed. In this way, realistic results can be obtained. For all cases under analysis, the wind turbine has been modeled with NURBS (Non-Uniform Rational B-Spline) surfaces since they allow the real shape of the object to be accurately replicated with very little information.
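
    As a worked example of the Doppler parameter considered above, a scatterer on a rotating blade shifts a backscattered carrier by f_d = 2*v_r/λ, where v_r is the radial velocity of the blade element; the turbine geometry and carrier frequency below are illustrative assumptions.

        # Worst-case blade-tip Doppler shift for a monostatic radar;
        # turbine geometry and carrier frequency are assumed values.
        import math

        c = 3.0e8                          # speed of light, m/s
        f_carrier = 1.3e9                  # radar carrier frequency, Hz (assumed)
        wavelength = c / f_carrier

        blade_length = 40.0                # blade tip radius, m (assumed)
        rpm = 15.0                         # rotor speed, rev/min (assumed)
        omega = 2 * math.pi * rpm / 60.0   # angular speed, rad/s

        v_tip = omega * blade_length                # tip speed (~63 m/s here)
        f_doppler_max = 2 * v_tip / wavelength      # tip moving radially toward radar
        print(f"max Doppler shift ~ {f_doppler_max:.0f} Hz")   # ~545 Hz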

  10. 75 FR 29732 - Career and Technical Education Program-Promoting Rigorous Career and Technical Education Programs...

    Science.gov (United States)

    2010-05-27

    ... rigorous knowledge and skills in English-language arts and mathematics that employers and colleges expect... specialists and to access the student outcome data needed to meet annual evaluation and reporting requirements...

  11. Rigorous Line-Based Transformation Model Using the Generalized Point Strategy for the Rectification of High Resolution Satellite Imagery

    Directory of Open Access Journals (Sweden)

    Kun Hu

    2016-09-01

    Full Text Available High precision geometric rectification of High Resolution Satellite Imagery (HRSI) is the basis of digital mapping and Three-Dimensional (3D) modeling. Taking advantage of line features as basic geometric control conditions instead of control points, the Line-Based Transformation Model (LBTM) provides a practical and efficient way of image rectification. It can accurately build the mathematical relationship between image space and the corresponding object space, while dramatically reducing the workload of ground control and feature recognition. Based on generalization and analysis of existing LBTMs, a novel rigorous LBTM is proposed in this paper, which can further eliminate the geometric deformation caused by sensor inclination and terrain variation. This improved nonlinear LBTM is constructed based on a generalized point strategy and resolved by least squares overall adjustment. Geo-positioning accuracy experiments with IKONOS, GeoEye-1 and ZiYuan-3 satellite imagery are performed to compare the rigorous LBTM with other relevant line-based and point-based transformation models. Both theoretical analysis and experimental results demonstrate that the rigorous LBTM is more accurate and reliable without adding extra ground control. The geo-positioning accuracy of satellite imagery rectified by the rigorous LBTM can reach about one pixel with eight control lines and can be further improved by optimizing the horizontal and vertical distribution of the control lines.

  12. Engineering application of anaerobic ammonium oxidation process in wastewater treatment.

    Science.gov (United States)

    Mao, Nianjia; Ren, Hongqiang; Geng, Jinju; Ding, Lili; Xu, Ke

    2017-08-01

    Anaerobic ammonium oxidation (Anammox), a promising biological nitrogen removal process, has been verified as an efficient, sustainable and cost-effective alternative to conventional nitrification and denitrification processes. To date, more than 110 full-scale anammox plants have been installed and are in operation, treating industrial NH4+-rich wastewater worldwide, and anammox-based technologies are flourishing. This review presents the current state of the art for engineering applications of the anammox process, including various anammox-based technologies, reactor selection and attempts to apply it at different wastewater plants. Process control and implementation for stable performance are discussed, and some remaining issues concerning engineering application are examined, including the start-up period, process disturbances, greenhouse gas emissions and, especially, mainstream anammox applications. Finally, further development of the anammox engineering application is proposed in this review.
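
    For reference, the catabolic core of the anammox process is the pairing of ammonium with nitrite as electron acceptor (a simplified stoichiometry; the full reaction including biomass synthesis is more involved):

        \[
        \mathrm{NH_4^+} + \mathrm{NO_2^-} \;\longrightarrow\; \mathrm{N_2} + 2\,\mathrm{H_2O}
        \]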

  13. Specification and Compilation of Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Geuns, S.J.

    2015-01-01

    This thesis is concerned with the specification, compilation and corresponding temporal analysis of real-time stream processing applications that are executed on embedded multiprocessor systems. Examples of such applications are software-defined radio applications. These applications typically

  14. The hand surgery fellowship application process: expectations, logistics, and costs.

    Science.gov (United States)

    Meals, Clifton; Osterman, Meredith

    2015-04-01

    To investigate expectations, logistics, and costs relevant to the hand surgery fellowship application process. We sought to discover (1) what both applicants and program directors are seeking, (2) what both parties have to offer, (3) how both parties collect information about each other, and (4) the costs incurred in arranging each match. We conducted on-line surveys of hand surgery fellowship applicants for appointment in 2015 and of current fellowship program directors. Sixty-two applicants and 41 program directors completed the survey. Results revealed applicants' demographic characteristics, qualifications, method of ranking hand fellowship programs, costs incurred (both monetary and opportunity) during the application process, ultimate match status, and suggestions for change. Results also revealed program directors' program demographics, rationale for offering interviews and favorably ranking applicants, application-related logistical details, costs incurred (both monetary and opportunity) during the application process, and suggestions for change. Applicants for hand surgery fellowship training are primarily interested in a potential program's academic reputation, emphasis on orthopedic surgery, and location. The typical, successfully matched applicant was a 30-year-old male orthopedic resident with 3 publications to his credit. Applicants rely on peers and Web sites for information about fellowships. Fellowship directors are primarily seeking applicants recommended by other experienced surgeons and with positive personality traits. The typical fellowship director offers a single year of orthopedic-based fellowship training to 2 fellows per year and relies on a common application and in-person interviews to collect information about applicants. Applicants appear to be more concerned than directors about the current state of the match process. Applicants and directors alike incur heavy costs, in both dollars and opportunity, to arrange each match. A nuanced

  15. Fractional Processes and Fractional-Order Signal Processing Techniques and Applications

    CERN Document Server

    Sheng, Hu; Qiu, TianShuang

    2012-01-01

    Fractional processes are widely found in science, technology and engineering systems. In Fractional Processes and Fractional-order Signal Processing, some complex random signals, characterized by the presence of a heavy-tailed distribution or non-negligible dependence between distant observations (local and long memory), are introduced and examined from the ‘fractional’ perspective using simulation, fractional-order modeling and filtering and realization of fractional-order systems. These fractional-order signal processing (FOSP) techniques are based on fractional calculus, the fractional Fourier transform and fractional lower-order moments. Fractional Processes and Fractional-order Signal Processing: • presents fractional processes of fixed, variable and distributed order studied as the output of fractional-order differential systems; • introduces FOSP techniques and the fractional signals and fractional systems point of view; • details real-world-application examples of FOSP techniques to demonstr...
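
    One standard FOSP building block referenced above is the fractional derivative; the sketch below approximates the Grünwald-Letnikov derivative with a truncated binomial-weighted difference. The order and test signal are illustrative assumptions.

        # Truncated Gruenwald-Letnikov fractional derivative of order alpha;
        # order and test signal are illustrative choices.
        import math

        def gl_derivative(x, alpha, h):
            """Gruenwald-Letnikov derivative of order alpha, grid step h."""
            w = [1.0]                        # w_k = (-1)^k * C(alpha, k), recursively
            for k in range(1, len(x)):
                w.append(w[-1] * (k - 1 - alpha) / k)
            return [sum(w[k] * x[i - k] for k in range(i + 1)) / h**alpha
                    for i in range(len(x))]

        h = 0.01
        t = [i * h for i in range(200)]
        x = [ti**2 for ti in t]              # test signal x(t) = t^2
        d_half = gl_derivative(x, 0.5, h)    # half-derivative of t^2
        # Analytic result: D^{1/2} t^2 = (2 / Gamma(2.5)) * t^{1.5}
        print(d_half[-1], 2 / math.gamma(2.5) * t[-1]**1.5)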

  16. Dynamic modeling of the isoamyl acetate reactive distillation process

    Directory of Open Access Journals (Sweden)

    Ali Syed Sadiq

    2017-03-01

    Full Text Available The cost-effectiveness of reactive distillation (RD) processes makes them highly attractive for industrial applications. However, their preliminary design and subsequent scale-up and operation are challenging. Specifically, the response of an RD system to fluctuations in process parameters is of paramount importance to ensure the stability of the whole process. Simulations carried out using Aspen Plus show that, under optimized process conditions, the RD process for isoamyl acetate production is much more economical than the conventional reactor-plus-distillation configuration, owing to lower utility consumption, higher conversion and smaller condenser and reboiler sizes. Rigorous dynamic modeling of the RD system was performed to evaluate its sensitivity to disturbances in critical process parameters; the product flow was quite sensitive to disturbances. Even more sensitive was the product composition when a disturbance in the condenser or reboiler heat duty led to a temperature decrease. A positive disturbance in the alcohol feed is of particular concern, as it clearly made the system unstable.
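
    For reference, the esterification carried out in the column is the classic acid-catalyzed equilibrium between acetic acid and isoamyl alcohol (simplified; catalyst and side reactions omitted):

        \[
        \mathrm{CH_3COOH} + \mathrm{C_5H_{11}OH} \;\rightleftharpoons\; \mathrm{CH_3COOC_5H_{11}} + \mathrm{H_2O}
        \]

    Reactive distillation shifts this equilibrium toward the ester by continuously removing water and product from the reaction zone.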

  17. A case study: application of statistical process control tool for determining process capability and sigma level.

    Science.gov (United States)

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory documents, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, the forecasting of critical quality attributes, sigma process capability, and process stability were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles for the achievement of six sigma-capable processes is possible. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical
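
    A minimal sketch of the capability analysis described above: process capability indices Cp and Cpk computed from specification limits and sample statistics, with the short-term sigma level taken as Z = 3·Cpk. The specification limits and sample data are illustrative assumptions, not values from the study.

        # Process capability indices and sigma level from sample statistics;
        # spec limits and the tablet-weight sample are illustrative values.
        import statistics

        usl, lsl = 105.0, 95.0          # spec limits, e.g. tablet weight (mg)
        weights = [99.2, 100.4, 101.1, 98.7, 100.0,
                   99.5, 100.9, 100.2, 99.8, 100.6]

        mu = statistics.mean(weights)
        sigma = statistics.stdev(weights)

        cp = (usl - lsl) / (6 * sigma)                 # potential capability
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)    # actual capability
        sigma_level = 3 * cpk                          # short-term sigma level
        print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  sigma level={sigma_level:.1f}")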

  18. Enterprise and system of systems capability development life-cycle processes.

    Energy Technology Data Exchange (ETDEWEB)

    Beck, David Franklin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-08-01

    This report and set of appendices are a collection of memoranda originally drafted circa 2007-2009 for the purpose of describing and detailing a models-based systems engineering approach for satisfying enterprise and system-of-systems life cycle process requirements. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. The main thrust of the material presents a rational exposé of a structured enterprise development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of standard systems engineering processes. While the approach described invokes application of the Department of Defense Architectural Framework (DoDAF), it is suitable for use with other architectural description frameworks.

  19. Rigorous patient-prosthesis matching of Perimount Magna aortic bioprosthesis.

    Science.gov (United States)

    Nakamura, Hiromasa; Yamaguchi, Hiroki; Takagaki, Masami; Kadowaki, Tasuku; Nakao, Tatsuya; Amano, Atsushi

    2015-03-01

    Severe patient-prosthesis mismatch, defined as effective orifice area index ≤0.65 cm(2) m(-2), has been associated with poor long-term survival after aortic valve replacement. Reported rates of severe mismatch involving the Perimount Magna aortic bioprosthesis range from 4% to 20% in patients with a small annulus. Between June 2008 and August 2011, 251 patients (mean age 70.5 ± 10.2 years; mean body surface area 1.55 ± 0.19 m(2)) underwent aortic valve replacement with a Perimount Magna bioprosthesis, with or without concomitant procedures. We performed our procedure with rigorous patient-prosthesis matching to implant a valve appropriately sized to each patient, and carried out annular enlargement when a 19-mm valve did not fit. The bioprosthetic performance was evaluated by transthoracic echocardiography predischarge and at 1 and 2 years after surgery. Overall hospital mortality was 1.6%. Only 5 (2.0%) patients required annular enlargement. The mean follow-up period was 19.1 ± 10.7 months with a 98.4% completion rate. Predischarge data showed a mean effective orifice area index of 1.21 ± 0.20 cm(2) m(-2). Moderate mismatch, defined as effective orifice area index ≤0.85 cm(2) m(-2), developed in 4 (1.6%) patients. None developed severe mismatch. Data at 1 and 2 years showed only two cases of moderate mismatch; neither was severe. Rigorous patient-prosthesis matching maximized the performance of the Perimount Magna, and no severe mismatch resulted in this Japanese population of aortic valve replacement patients. © The Author(s) 2014.
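
    As a small worked example of the mismatch criterion used above, the indexed effective orifice area is EOAi = EOA / BSA, with severe mismatch at EOAi ≤ 0.65 cm(2) m(-2) and moderate at EOAi ≤ 0.85 cm(2) m(-2); the valve EOA in the example is a hypothetical value.

        # Grade patient-prosthesis mismatch from EOA and body surface area,
        # using the thresholds stated in the abstract; example values assumed.

        def mismatch_grade(eoa_cm2, bsa_m2):
            eoai = eoa_cm2 / bsa_m2
            if eoai <= 0.65:
                return f"severe (EOAi={eoai:.2f})"
            if eoai <= 0.85:
                return f"moderate (EOAi={eoai:.2f})"
            return f"none (EOAi={eoai:.2f})"

        # Hypothetical 19-mm valve with EOA 1.3 cm^2 in a 1.55 m^2 patient:
        print(mismatch_grade(1.3, 1.55))   # EOAi ~ 0.84 -> moderate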

  20. Importance of All-in-one (MCNPX2.7.0+CINDER2008) Code for Rigorous Transmutation Study

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Oyeon [Institute for Modeling and Simulation Convergence, Daegu (Korea, Republic of); Kim, Kwanghyun [RadTek Co. Ltd., Daejeon (Korea, Republic of)

    2015-10-15

    Transmutation can be utilized as a possible mechanism for reducing the volume and hazard of radioactive waste by transforming hazardous radioactive elements with long half-lives into less hazardous elements with short half-lives. Thus, the understanding of the transmutation mechanism and beneficial machinery design technologies are important and useful. Although the term transmutation is rooted in alchemy, which sought to transform base metals into gold in the Middle Ages, Rutherford and Soddy were the first to observe natural transmutation, as part of alpha-type radioactive decay, in the early 20th century. Along with the development of computing technology, analysis software such as CINDER was developed for rigorous transmutation studies. The code has a long history of development from the original work of T. England at Bettis Atomic Power Laboratory (BAPL) in the early 1960s. It has been used to calculate the inventory of nuclides in an irradiated material. CINDER'90, released more recently, upgraded the code to allow spontaneous tracking of chains based upon the significant density or pass-by of a nuclide, where pass-by represents the density of a nuclide transforming into other nuclides. The nuclear transmutation process is governed by highly non-linear differential equations. The chaotic nature of the non-linear equations bespeaks the importance of accurate input data (i.e., the number of significant digits). Thus, reducing human interrogation is very important for rigorous transmutation study, and an 'all-in-one' code structure is desired. Note that the non-linear characteristic of the transmutation equation caused by flux changes due to number density changes during a given time interval (an intrinsic physical phenomenon) is not considered in this study. In this study, we only emphasized the effects of human interrogation in the computing process solving nonlinear differential equations, as shown in
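
    The bookkeeping that codes like CINDER automate can be sketched for the simplest linear case: at constant flux, the Bateman equations dN/dt = A·N are solved below with a matrix exponential for a hypothetical three-nuclide chain 1 → 2 → 3. The rates and densities are illustrative assumptions, and none of the non-linear flux feedback discussed above is modeled.

        # Constant-flux Bateman equations dN/dt = A N for a hypothetical
        # three-nuclide chain, solved with a matrix exponential.
        import numpy as np
        from scipy.linalg import expm

        lam = np.array([1e-3, 5e-4, 0.0])      # effective removal rates, 1/s (assumed)
        A = np.array([
            [-lam[0],     0.0,  0.0],
            [ lam[0], -lam[1],  0.0],          # nuclide 2 produced from 1
            [    0.0,  lam[1],  0.0],          # nuclide 3 produced from 2 (stable)
        ])

        n0 = np.array([1.0e20, 0.0, 0.0])      # initial number densities (assumed)
        t = 3600.0                             # one hour of irradiation/decay
        n_t = expm(A * t) @ n0
        print(n_t)                             # densities after t; total is conserved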

  1. Application of KWU antimony removal process at Gentilly-2

    International Nuclear Information System (INIS)

    Dundar, Y.; Odar, S.; Streit, K.; Allsop, H.; Guzonas, D.

    1996-09-01

    This paper describes the work performed to adapt the KWU PWR antimony removal process to CANDU plant conditions, and the application of the process at the Hydro Quebec unit, Gentilly-2. The results of the application will be presented and the 'lessons learned' will be discussed in detail. (author)

  2. The 1,5-H-shift in 1-butoxy: A case study in the rigorous implementation of transition state theory for a multirotamer system

    Science.gov (United States)

    Vereecken, Luc; Peeters, Jozef

    2003-09-01

    The rigorous implementation of transition state theory (TST) for a reaction system with multiple reactant rotamers and multiple transition state conformers is discussed by way of a statistical rate analysis of the 1,5-H-shift in 1-butoxy radicals, a prototype reaction for the important class of H-shift reactions in atmospheric chemistry. Several approaches for deriving a multirotamer TST expression are treated: oscillator versus (hindered) internal rotor models; distinguishable versus indistinguishable atoms; and direct count methods versus degeneracy factors calculated by (simplified) direct count methods or from symmetry numbers and numbers of enantiomers, where applicable. It is shown that the various treatments are fully consistent, even if the TST expressions themselves appear different. The 1-butoxy H-shift reaction is characterized quantum chemically using B3LYP-DFT; the performance of this level of theory is compared to other methods. Rigorous application of the multirotamer TST methodology in a harmonic oscillator approximation based on these data yields a rate coefficient of k(298 K, 1 atm) = 1.4×10^5 s^-1, and an Arrhenius expression k(T, 1 atm) = 1.43×10^11 exp(-8.17 kcal mol^-1/RT) s^-1, which both closely match the experimental recommendations in the literature. The T-dependence is substantially influenced by the multirotamer treatment, as well as by the tunneling and fall-off corrections. The present results are compared to those of simplified TST calculations based solely on the properties of the lowest-energy 1-butoxy rotamer.
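
    As a quick numerical sanity check (not from the paper), evaluating the reported Arrhenius expression at 298 K should reproduce the reported room-temperature rate coefficient:

```python
# Evaluating the reported expression k(T, 1 atm) = 1.43e11 * exp(-8.17/RT) s^-1
# at 298 K; it should reproduce the reported k(298 K, 1 atm) of ~1.4e5 s^-1.
import math

A = 1.43e11        # pre-exponential factor [s^-1], from the abstract
Ea = 8.17          # activation energy [kcal/mol], from the abstract
R = 1.987e-3       # gas constant [kcal/(mol K)]

def k_arrhenius(T):
    return A * math.exp(-Ea / (R * T))

print(f"k(298 K) = {k_arrhenius(298.0):.2e} s^-1")   # -> ~1.4e5 s^-1
```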

  3. The Enterprise Derivative Application: Flexible Software for Optimizing Manufacturing Processes

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Richard C [ORNL; Allgood, Glenn O [ORNL; Knox, John R [ORNL

    2008-11-01

    The Enterprise Derivative Application (EDA) implements the enterprise-derivative analysis for optimization of an industrial process (Allgood and Manges, 2001). It is a tool to help industry planners choose the most productive way of manufacturing their products while minimizing their cost. Developed in MS Access, the application allows users to input initial data ranging from raw material to variable costs and enables the tracking of specific information as material is passed from one process to another. Energy-derivative analysis is based on the calculation of sensitivity parameters. For the specific application to a steel production process these include: the cost-to-product sensitivity, the product-to-energy sensitivity, the energy-to-efficiency sensitivity, and the efficiency-to-cost sensitivity. Using the EDA, the user can display a particular sensitivity for all processes, or compare all sensitivities across all processes. Although energy-derivative analysis was originally designed for use by the steel industry, it is flexible enough to be applied to many other industrial processes. Examples of processes where energy-derivative analysis would prove useful are wireless monitoring of processes in the petroleum cracking industry and wireless monitoring of motor failure for determining the optimum time to replace motor parts. One advantage of the MS Access-based application is its flexibility in defining the process flow and establishing the relationships between parent and child processes and the products resulting from a process. Due to the general design of the program, a process can be anything that occurs over time with resulting output (products), so the application can easily be adapted to many different industrial and organizational environments. Another advantage is the flexibility of defining sensitivity parameters: sensitivities can be determined between all possible variables in the process flow as a function of time. Thus the dynamic development of the
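
    The sensitivities enumerated above are derivatives of one process quantity with respect to another. Since the EDA itself is an MS Access application, the following is only an illustrative sketch, with a hypothetical process model, of the kind of estimate involved:

```python
# Hypothetical process model: the EDA itself is an MS Access application,
# so this only illustrates the kind of sensitivity estimate described.
def energy_use(throughput_t):
    """Invented model: energy [GJ] as a function of product output [t]."""
    return 1.8 * throughput_t + 0.002 * throughput_t ** 2

def sensitivity(f, x, h=1e-3):
    """Central finite-difference estimate of df/dx."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Product-to-energy sensitivity at a throughput of 500 t:
print(sensitivity(energy_use, 500.0))    # ~3.8 GJ per tonne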

  4. Constitutional, legal and jurisprudential development of the principle of subsidiary rigor ('rigor subsidiario')

    Directory of Open Access Journals (Sweden)

    Germán Eduardo Cifuentes Sandoval

    2013-09-01

    Full Text Available In Colombia, state administration of the environment is the responsibility of the National Environmental System, SINA. SINA is made up of state entities that coexist under a mixed arrangement of centralization and decentralization. SINA's decentralization expresses itself at the administrative and territorial levels, and the entities functioning under this structure are expected to act in a coordinated way in order to reach the objectives set out in the national environmental policy. To achieve coordinated environmental administration among the entities that make up SINA, Colombian environmental legislation has included three basic principles: 1. the principle of regional harmony ('armonía regional'); 2. the principle of normative gradation ('gradación normativa'); 3. the principle of subsidiary rigor ('rigor subsidiario'). These principles belong to article 63 of Law 99 of 1993, and although equivalents of the first two can be found in other norms of the Colombian legal system, this is not the case for subsidiary rigor, whose elements are unique to environmental regulation and do not resemble those of the principle of subsidiarity found in article 288 of the Political Constitution. Subsidiary rigor gives decentralized entities a special power to modify the current environmental legislation in defense of the local ecological patrimony. It is an administrative power, founded on the autonomy of decentralization, that allows them to stand in for the regulatory role of the central legislative power, on the condition that the new regulation be more demanding than that of the central level.

  5. Quantization of noncompact coverings and its physical applications

    Science.gov (United States)

    Ivankov, Petr

    2018-02-01

    A rigorous algebraic definition of noncommutative coverings is developed. In the case of commutative algebras this definition is equivalent to the classical definition of topological coverings of locally compact spaces. The theory has the following nontrivial applications:
    • coverings of continuous trace algebras,
    • coverings of noncommutative tori,
    • coverings of the quantum SU(2) group,
    • coverings of foliations,
    • coverings of isospectral deformations of Spin manifolds.
    The theory also supplies a rigorous definition of noncommutative Wilson lines.

  6. Teaching Sustainable Process Design Using 12 Systematic Computer-Aided Tasks

    DEFF Research Database (Denmark)

    Babi, Deenesh K.

    2015-01-01

    In this paper a task-based approach for teaching (sustainable) process design to students pursuing a degree in chemical and biochemical engineering is presented. In tasks 1-3 the student makes design decisions for product and process selection, followed by simple and rigorous model simulations (tasks 4-7) and then sizing, costing and economic analysis of the designed process (tasks 8-9). This produces a base case design. In tasks 10-12, the student explores opportunities for heat and/or mass integration, followed by a sustainability analysis, in order to evaluate the base case design and set targets for further improvement. Finally, a process optimization problem is formulated and solved to obtain the more sustainable process design. The 12 tasks are explained in terms of the input and output of each task, and examples of the application of this approach in an MSc-level course are reported.

  7. Teaching Case: MiHotel--Applicant Processing System Design Case

    Science.gov (United States)

    Miller, Robert E.; Dunn, Paul

    2018-01-01

    This teaching case describes the functionality of an applicant processing system designed for a fictitious hotel chain. The system detailed in the case includes a webform where applicants complete and submit job applications. The system also includes a desktop application used by hotel managers and Human Resources to track applications and process…

  8. Process Development of Porcelain Ceramic Material with Binder Jetting Process for Dental Applications

    Science.gov (United States)

    Miyanaji, Hadi; Zhang, Shanshan; Lassell, Austin; Zandinejad, Amirali; Yang, Li

    2016-03-01

    Custom ceramic structures possess significant potential in many applications, such as dentistry and aerospace, where extreme environments are present. Specifically, highly customized geometries with adequate performance are needed for various dental prosthesis applications. This paper demonstrates the development of process and post-process parameters for a dental porcelain ceramic material using binder jetting additive manufacturing (AM). Various process parameters such as binder amount, drying power level, drying time and powder spread speed were studied experimentally for their effect on the geometrical and mechanical characteristics of green parts. In addition, the effects of sintering and printing parameters on the quality of the densified ceramic structures were also investigated experimentally. The results provide insights into the process-property relationships of the binder jetting AM process, and some of the challenges that need to be further characterized for the successful adoption of binder jetting technology in high-quality ceramic fabrication are discussed.

  9. Continuous-time Markov decision processes theory and applications

    CERN Document Server

    Guo, Xianping

    2009-01-01

    This volume provides the first book entirely devoted to recent developments on the theory and applications of continuous-time Markov decision processes (MDPs). The MDPs presented here include most of the cases that arise in applications.

  10. The Challenge of Timely, Responsive and Rigorous Ethics Review of Disaster Research: Views of Research Ethics Committee Members.

    Directory of Open Access Journals (Sweden)

    Matthew Hunt

    Full Text Available Research conducted following natural disasters such as earthquakes, floods or hurricanes is crucial for improving relief interventions. Such research, however, poses ethical, methodological and logistical challenges for researchers. Oversight of disaster research also poses challenges for research ethics committees (RECs), in part due to the rapid turnaround needed to initiate research after a disaster. Currently, there is limited knowledge available about how RECs respond to and appraise disaster research. To address this knowledge gap, we investigated the experiences of REC members who had reviewed disaster research conducted in low- or middle-income countries. We used interpretive description methodology and conducted in-depth interviews with 15 respondents. Respondents were chairs, members, advisors, or coordinators from 13 RECs, including RECs affiliated with universities, governments, international organizations, a for-profit REC, and an ad hoc committee established during a disaster. Interviews were analyzed inductively using constant comparative techniques. Through this process, three elements were identified as characterizing effective and high-quality review: timeliness, responsiveness and rigorousness. To ensure timeliness, many RECs rely on adaptations of review procedures for urgent protocols. Respondents emphasized that responsive review requires awareness of and sensitivity to the particularities of disaster settings and disaster research. Rigorous review was linked with providing careful assessment of ethical considerations related to the research, as well as ensuring independence of the review process. Both the frequency of disasters and the conduct of disaster research are on the rise. Ensuring effective and high-quality review of disaster research is crucial, yet challenges, including time pressures for urgent protocols, exist for achieving this goal. Adapting standard REC procedures may be necessary. However, steps should be

  11. Stochastic processes and applications diffusion processes, the Fokker-Planck and Langevin equations

    CERN Document Server

    Pavliotis, Grigorios A

    2014-01-01

    This book presents various results and techniques from the theory of stochastic processes that are useful in the study of stochastic problems in the natural sciences. The main focus is analytical methods, although numerical methods and statistical inference methodologies for studying diffusion processes are also presented. The goal is the development of techniques that are applicable to a wide variety of stochastic models that appear in physics, chemistry and other natural sciences. Applications such as stochastic resonance, Brownian motion in periodic potentials and Brownian motors are studied and the connection between diffusion processes and time-dependent statistical mechanics is elucidated. The book contains a large number of illustrations, examples, and exercises. It will be useful for graduate-level courses on stochastic processes for students in applied mathematics, physics and engineering. Many of the topics covered in this book (reversible diffusions, convergence to eq...
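
    One of the models mentioned, Brownian motion in a periodic potential, can be simulated in a few lines. A minimal sketch (not from the book), assuming overdamped Langevin dynamics dX = -V'(X) dt + sqrt(2/beta) dW with V(x) = cos(x):

```python
# Minimal sketch (not from the book): overdamped Langevin dynamics
# dX = -V'(X) dt + sqrt(2/beta) dW in the periodic potential V(x) = cos(x),
# integrated with the Euler-Maruyama scheme.
import numpy as np

rng = np.random.default_rng(0)
beta, dt, nsteps = 1.0, 1e-3, 100_000
x = 0.0
path = np.empty(nsteps)
for n in range(nsteps):
    x += np.sin(x) * dt + np.sqrt(2.0 * dt / beta) * rng.standard_normal()
    path[n] = x                      # drift -V'(x) = sin(x) for V(x) = cos(x)

print(path[-1], path.var())          # end point and sample variance
```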

  12. Scleroglucan: Fermentative Production, Downstream Processing and Applications

    Directory of Open Access Journals (Sweden)

    Shrikant A. Survase

    2007-01-01

    Full Text Available Exopolysaccharides produced by a variety of microorganisms find multifarious industrial applications in foods, pharmaceutical and other industries as emulsifiers, stabilizers, binders, gelling agents, lubricants, and thickening agents. One such exopolysaccharide is scleroglucan, produced by pure culture fermentation from filamentous fungi of genus Sclerotium. The review discusses the properties, fermentative production, downstream processing and applications of scleroglucan.

  13. The research on construction and application of machining process knowledge base

    Science.gov (United States)

    Zhao, Tan; Qiao, Lihong; Qie, Yifan; Guo, Kai

    2018-03-01

    In order to apply knowledge to machining process design, and from the perspective of knowledge use in computer-aided process planning (CAPP), a hierarchical knowledge classification structure is established according to the characteristics of the mechanical engineering field. Machining process knowledge is expressed by means of production rules and object-oriented methods. Three kinds of knowledge base models are constructed according to this representation of machining process knowledge. In this paper, the definition and classification of machining process knowledge, the knowledge model, and the application flow of knowledge-based process design are given, and the main steps of machine tool design decisions are carried out as an application using the knowledge base.
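
    A production-rule knowledge base of the kind described can be sketched as condition-action pairs. The rules and feature parameters below are hypothetical, for illustration only:

```python
# Toy production-rule base; rule conditions and feature parameters are
# hypothetical, for illustration only.
RULES = [
    # IF condition(feature) THEN recommended operation
    (lambda f: f["type"] == "hole" and f["diameter_mm"] < 10.0, "drilling"),
    (lambda f: f["type"] == "hole", "boring"),
    (lambda f: f["type"] == "plane" and f.get("roughness_um", 1.0) <= 0.8, "grinding"),
    (lambda f: f["type"] == "plane", "milling"),
]

def infer_operation(feature):
    """Forward-chain over the rule base; the first matching rule fires."""
    for condition, operation in RULES:
        if condition(feature):
            return operation
    return "no rule matched"

print(infer_operation({"type": "hole", "diameter_mm": 6.0}))       # drilling
print(infer_operation({"type": "plane", "roughness_um": 0.4}))     # grinding
```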

  14. Application of on-line analytical processing technique in accelerator

    International Nuclear Information System (INIS)

    Xie Dong; Li Weimin; He Duohui; Liu Gongfa; Xuan Ke

    2005-01-01

    A method for applying the on-line analytical processing technique to accelerator data is described, covering data pre-processing, the construction of a data warehouse, and on-line analytical processing. (authors)
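
    As a hypothetical illustration of the on-line analytical processing step, a roll-up of archived accelerator readings along two dimensions (all column names invented):

```python
# Invented beam-log data; one OLAP-style roll-up (mean current per
# shift x subsystem) of the kind a data warehouse would serve.
import pandas as pd

readings = pd.DataFrame({
    "shift":      ["day", "day", "night", "night", "day", "night"],
    "subsystem":  ["RF", "magnet", "RF", "magnet", "RF", "RF"],
    "current_mA": [300.1, 299.8, 301.2, 300.4, 299.5, 300.9],
})

cube = readings.pivot_table(values="current_mA", index="shift",
                            columns="subsystem", aggfunc="mean")
print(cube)
```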

  15. DATA AND PROCESSING IN DEVELOPING ECONOMIC APPLICATIONS

    Directory of Open Access Journals (Sweden)

    ADRIAN GHENCEA

    2011-04-01

    Full Text Available Economic informatics originates in the industrial economy and the electronic processing of information. A clear distinction is made between IT and economic informatics, and further between general and particular economic informatics (particular economic informatics covering administration, industrial informatics, etc.). Economic informatics is deemed to be an applied science concerned with the conception, working methods and representation of IT and communication systems oriented towards companies using electronic computers. This paper seeks to integrate applications, allowing information systems to interconnect at the informational level, by sharing information, and at the service level, considering the control of the related processes in real time.

  16. The influence of low temperature, type of muscle and electrical stimulation on the course of rigor mortis, ageing and tenderness of beef muscles.

    Science.gov (United States)

    Olsson, U; Hertzman, C; Tornberg, E

    1994-01-01

    The course of rigor mortis, ageing and tenderness have been evaluated for two beef muscles, M. semimembranosus (SM) and M. longissimus dorsi (LD), when entering rigor at constant temperatures in the cold-shortening region (1, 4, 7 and 10°C). The influence of electrical stimulation (ES) was also examined. Post-mortem changes were registered by shortening and isometric tension and by following the decline of pH, ATP and creatine phosphate. The effect of ageing on tenderness was recorded by measuring shear-force (2, 8 and 15 days post mortem) and the sensory properties were assessed 15 days post mortem. It was found that shortening increased with decreasing temperature, resulting in decreased tenderness. Tenderness for LD, but not for SM, was improved by ES at 1 and 4°C, whereas ES did not give rise to any decrease in the degree of shortening during rigor mortis development. This suggests that ES influences tenderization more than it prevents cold-shortening. The samples with a pre-rigor mortis temperature of 1°C could not be tenderized, when stored up to 15 days, whereas this was the case for the muscles entering rigor mortis at the other higher temperatures. The results show that under the conditions used in this study, the course of rigor mortis is more important for the ultimate tenderness than the course of ageing. Copyright © 1994. Published by Elsevier Ltd.

  17. A software application for the processing of students results | Ukem ...

    African Journals Online (AJOL)

    A software application for the processing of students' results. ... In this work, a computer software application was developed to facilitate the automated processing of the results.

  18. Robust Adaptive Modified Newton Algorithm for Generalized Eigendecomposition and Its Application

    Science.gov (United States)

    Yang, Jian; Yang, Feng; Xi, Hong-Sheng; Guo, Wei; Sheng, Yanmin

    2007-12-01

    We propose a robust adaptive algorithm for generalized eigendecomposition problems that arise in modern signal processing applications. To that end, the generalized eigendecomposition problem is reinterpreted as an unconstrained nonlinear optimization problem. Starting from the proposed cost function and making use of an approximation of the Hessian matrix, a robust modified Newton algorithm is derived. A rigorous analysis of its convergence properties is presented by using stochastic approximation theory. We also apply this theory to solve the signal reception problem of multicarrier DS-CDMA to illustrate its practical application. The simulation results show that the proposed algorithm has fast convergence and excellent tracking capability, which are important in a practical time-varying communication environment.
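
    The underlying problem, A v = λ B v, can be made concrete with a much simpler, non-adaptive generalized power iteration. The sketch below is a deliberately plain stand-in, not the paper's modified Newton algorithm, and the matrices are random:

```python
# Plain generalized power iteration for the principal solution of
# A v = lambda B v -- a deliberately simpler stand-in, not the paper's
# adaptive modified Newton algorithm. Matrices here are random SPD.
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5)); A = M @ M.T + 5.0 * np.eye(5)
N = rng.standard_normal((5, 5)); B = N @ N.T + 5.0 * np.eye(5)

v = rng.standard_normal(5)
for _ in range(200):
    v = np.linalg.solve(B, A @ v)        # iterate with B^-1 A
    v /= np.sqrt(v @ B @ v)              # B-normalize so that v^T B v = 1

lam = (v @ A @ v) / (v @ B @ v)          # generalized Rayleigh quotient
print(lam)                               # largest generalized eigenvalue
```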

  19. Robust Adaptive Modified Newton Algorithm for Generalized Eigendecomposition and Its Application

    Directory of Open Access Journals (Sweden)

    Yang Jian

    2007-01-01

    Full Text Available We propose a robust adaptive algorithm for generalized eigendecomposition problems that arise in modern signal processing applications. To that end, the generalized eigendecomposition problem is reinterpreted as an unconstrained nonlinear optimization problem. Starting from the proposed cost function and making use of an approximation of the Hessian matrix, a robust modified Newton algorithm is derived. A rigorous analysis of its convergence properties is presented by using stochastic approximation theory. We also apply this theory to solve the signal reception problem of multicarrier DS-CDMA to illustrate its practical application. The simulation results show that the proposed algorithm has fast convergence and excellent tracking capability, which are important in a practical time-varying communication environment.

  20. Screening of synfuel processes for HTGR application

    International Nuclear Information System (INIS)

    1981-02-01

    The aim of this study is to select for further study, the several synfuel processes which are the most attractive for application of HTGR heat and energy. In pursuing this objective, the Working Group identified 34 candidate synfuel processes, cut the number of processes to 16 in an initial screening, established 11 prime criteria with weighting factors for use in screening the remaining processes, developed a screening methodology and assumptions, collected process energy requirement information, and performed a comparative rating of the processes. As a result of this, three oil shale retorting processes, two coal liquefaction processes and one coal gasification process were selected as those of most interest for further study at this time

  1. Influence of stress caused by transport and of the slaughter method on rigor mortis of tambaqui (Colossoma macropomum)

    Directory of Open Access Journals (Sweden)

    Joana Maia Mendes

    2015-06-01

    Full Text Available Abstract: This study evaluated the influence of pre-slaughter stress and slaughter method on the rigor mortis of tambaqui during storage in ice. The physiological responses of tambaqui to stress were studied during the pre-slaughter period, which was divided into four stages: harvest, transport, and recovery for 24 h and for 48 h. At the end of each stage, fish were sampled to characterize pre-slaughter stress through analyses of plasma glucose, lactate and ammonia, and were then slaughtered by hypothermia or by asphyxia with carbon dioxide for the rigor mortis study. The physiological stress state of the fish was most acute immediately after transport, implying a faster onset of rigor mortis: 60 minutes for tambaqui slaughtered by hypothermia and 120 minutes for tambaqui slaughtered by carbon dioxide asphyxia. In the ponds, fish slaughtered immediately after harvest showed an intermediate stress state, with no difference in the time of onset of rigor mortis between slaughter methods (135 minutes). Fish allowed to recover from transport stress under simulated industry conditions entered rigor mortis later: at 225 minutes (24 h of recovery) and 255 minutes (48 h of recovery), likewise with no difference between the slaughter methods tested. The resolution of rigor mortis was fastest in fish slaughtered after transport, at 12 days. In fish slaughtered immediately after harvest, resolution occurred at 16 days, and in fish slaughtered after recovery, at 20 days for 24 h of recovery and 24 days for 48 h of recovery from pre-slaughter stress, with no influence of the slaughter method on the resolution of rigor mortis. It is therefore desirable that tambaqui destined for industry be slaughtered after a period of stress recovery, with a view to increasing its

  2. Sequential specification of time-aware stream processing applications

    NARCIS (Netherlands)

    Geuns, S.J.; Hausmans, J.P.H.M.; Bekooij, Marco Jan Gerrit

    Automatic parallelization of Nested Loop Programs (NLPs) is an attractive method to create embedded real-time stream processing applications for multi-core systems. However, the description and parallelization of applications with a time dependent functional behavior has not been considered in NLPs.

  3. Biomimetic architectures by plasma processing fabrication and applications

    CERN Document Server

    Chattopadhyay, Surojit

    2014-01-01

    Photonic structures in the animal kingdom: valuable inspirations for bio-mimetic applications. Moth eye-type anti-reflecting nanostructures by an electron cyclotron resonance plasma. Plasma-processed biomimetic nano/microstructures. Wetting properties of natural and plasma processed biomimetic surfaces. Biomimetic superhydrophobic surface by plasma processing. Biomimetic interfaces of plasma modified titanium alloy.

  4. Finite Markov processes and their applications

    CERN Document Server

    Iosifescu, Marius

    2007-01-01

    A self-contained treatment of finite Markov chains and processes, this text covers both theory and applications. Author Marius Iosifescu, vice president of the Romanian Academy and director of its Center for Mathematical Statistics, begins with a review of relevant aspects of probability theory and linear algebra. Experienced readers may start with the second chapter, a treatment of fundamental concepts of homogeneous finite Markov chain theory that offers examples of applicable models.The text advances to studies of two basic types of homogeneous finite Markov chains: absorbing and ergodic ch

  5. Computer performance optimization systems, applications, processes

    CERN Document Server

    Osterhage, Wolfgang W

    2013-01-01

    Computing power performance was important at times when hardware was still expensive, because hardware had to be put to the best use. Later on this criterion was no longer critical, since hardware had become inexpensive. Meanwhile, however, people have realized that performance again plays a significant role, because of the major drain on system resources involved in developing complex applications. This book distinguishes between three levels of performance optimization: the system level, application level and business processes level. On each, optimizations can be achieved and cost-cutting p

  6. Protein engineering of enzymes for process applications

    DEFF Research Database (Denmark)

    Woodley, John M

    2013-01-01

    This article discusses the challenges involved in enzyme modification focused on process requirements, such as the need to fulfill reaction thermodynamics, specific activity under the required conditions, kinetics at required concentrations, and stability; modification opportunities will be targeted on enabling process application. Finally, future research directions are discussed, including the integration of biocatalysis with neighboring chemical steps.

  7. Multiresolution approach to processing images for different applications interaction of lower processing with higher vision

    CERN Document Server

    Vujović, Igor

    2015-01-01

    This book presents theoretical and practical aspects of the interaction between low and high level image processing. Multiresolution analysis owes its popularity mostly to wavelets and is widely used in a variety of applications. Low level image processing is important for the performance of many high level applications. The book includes examples from different research fields, i.e. video surveillance; biomedical applications (EMG and X-ray); improved communication, namely teleoperation, telemedicine, animation, augmented/virtual reality and robot vision; monitoring of the condition of ship systems and image quality control.

  8. Applications of factorization embeddings for Lévy processes

    NARCIS (Netherlands)

    Dieker, A.B.

    2006-01-01

    We give three applications of the Pecherskii-Rogozin-Spitzer identity for Lévy processes. First, we find the joint distribution of the supremum and the epoch at which it is `attained' if a Lévy process has phase-type upward jumps. We also find the characteristics of the ladder process. Second, we

  9. Characterization of the rigor mortis process of the Ilio-ischiocaudalis muscle of the pantanal alligator (Caiman crocodilus yacare) and meat tenderness

    Directory of Open Access Journals (Sweden)

    Juliana Paulino Vieira

    2012-03-01

    Full Text Available This study used six pantanal alligator (Caiman crocodilus yacare) carcasses to characterize the rigor mortis process of the Ilio-ischiocaudalis muscle during industrial cooling and to evaluate meat tenderness. The alligators were chosen at random and slaughtered at the Cooperativa de Criadores do Jacaré do Pantanal (COOCRIJAPAN), Cáceres, Mato Grosso. After exsanguination, the temperatures of the chilling room and of the carcasses and the pH were measured, and samples were collected for determination of sarcomere length, shear force and cooking loss at different times (0.5, 3, 5, 7, 10, 12, 15, 24 and 36 h). The temperature of the chilling room varied from 2.6°C (0.5 h) to 0.9°C (36 h) and the mean carcass temperature from 21.0°C to 4.2°C, respectively. The mean initial pH of the muscle was 6.7 and the final pH was 5.6. Maximum sarcomere contraction of the Ilio-ischiocaudalis muscle occurred at the 15th hour after exsanguination (1.5 µm). This meat presented a shear force lower than 6.0 kg.

  10. A review on the processing accuracy of two-photon polymerization

    Directory of Open Access Journals (Sweden)

    Xiaoqin Zhou

    2015-03-01

    Full Text Available Two-photon polymerization (TPP) is a powerful and promising technology for fabricating true three-dimensional (3D) micro/nanostructures of various materials with subdiffraction-limit resolution, and it has been applied to microoptics, electronics, communications, biomedicine, microfluidic devices, MEMS and metamaterials. These applications, such as microoptics and photonic crystals, place rigorous requirements on the processing accuracy of TPP, including dimensional accuracy, shape accuracy and surface roughness; the processing accuracy influences their performance and can even invalidate them. In order to fabricate precise 3D micro/nanostructures, the factors influencing the processing accuracy need to be considered comprehensively and systematically. In this paper, we review the basis of TPP micro/nanofabrication, including the mechanism of TPP, experimental set-ups for TPP and scaling laws for the resolution of TPP. We then discuss the factors influencing the processing accuracy. Finally, we summarize recently reported methods for improving the processing accuracy, from improving the resolution to changing the spatial arrangement of voxels.

  11. A review on the processing accuracy of two-photon polymerization

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Xiaoqin; Hou, Yihong [School of Mechanical Science and Engineering, Jilin University, Changchun, 130022 (China); Lin, Jieqiong, E-mail: linjieqiong@mail.ccut.edu.cn [School of Electromechanical Engineering, Changchun University of Technology, Changchun, 130012 (China)

    2015-03-15

    Two-photon polymerization (TPP) is a powerful and promising technology for fabricating true three-dimensional (3D) micro/nanostructures of various materials with subdiffraction-limit resolution, and it has been applied to microoptics, electronics, communications, biomedicine, microfluidic devices, MEMS and metamaterials. These applications, such as microoptics and photonic crystals, place rigorous requirements on the processing accuracy of TPP, including dimensional accuracy, shape accuracy and surface roughness; the processing accuracy influences their performance and can even invalidate them. In order to fabricate precise 3D micro/nanostructures, the factors influencing the processing accuracy need to be considered comprehensively and systematically. In this paper, we review the basis of TPP micro/nanofabrication, including the mechanism of TPP, experimental set-ups for TPP and scaling laws for the resolution of TPP. We then discuss the factors influencing the processing accuracy. Finally, we summarize recently reported methods for improving the processing accuracy, from improving the resolution to changing the spatial arrangement of voxels.

  12. Abstractions for aperiodic multiprocessor scheduling of real-time stream processing applications

    NARCIS (Netherlands)

    Hausmans, J.P.H.M.

    2015-01-01

    Embedded multiprocessor systems are often used in the domain of real-time stream processing applications to keep up with increasing power and performance requirements. Examples of such real-time stream processing applications are digital radio baseband processing and WLAN transceivers. These stream

  13. Programmable lithography engine (ProLE) grid-type supercomputer and its applications

    Science.gov (United States)

    Petersen, John S.; Maslow, Mark J.; Gerold, David J.; Greenway, Robert T.

    2003-06-01

    There are many variables that can affect lithography-dependent device yield. Because of this, it is not enough to make optical proximity corrections (OPC) based on the mask type, wavelength, lens, illumination type and coherence. Resist chemistry and physics, along with substrate, exposure, and all post-exposure processing, must be considered too. Only a holistic approach to finding imaging solutions will accelerate yield and maximize performance. Since experiments are too costly in both time and money, accomplishing this takes massive amounts of accurate simulation capability. Our solution is to create a workbench that has a set of advanced user applications that utilize best-in-class simulator engines for solving litho-related DFM problems using distributive computing. Our product, ProLE (Programmable Lithography Engine), is an integrated system that combines Petersen Advanced Lithography Inc.'s (PAL's) proprietary applications and cluster management software wrapped around commercial software engines, along with optional commercial hardware and software. It uses the most rigorous lithography simulation engines to solve deep sub-wavelength imaging problems accurately and at speeds that are several orders of magnitude faster than current methods. Specifically, ProLE uses full vector thin-mask aerial image models or, when needed, full across-source 3D electromagnetic field simulation to make accurate aerial image predictions along with calibrated resist models. The ProLE workstation from Petersen Advanced Lithography, Inc., is the first commercial product that makes it possible to do these intensive calculations at a fraction of the time previously required, thus significantly reducing time to market for advanced technology devices. In this work, ProLE is introduced through model comparison to show why vector imaging and rigorous resist models work better than less rigorous models; then some applications that use our distributive computing solution are shown.

  14. High-flux solar photon processes: Opportunities for applications

    Energy Technology Data Exchange (ETDEWEB)

    Steinfeld, J.I.; Coy, S.L.; Herzog, H.; Shorter, J.A.; Schlamp, M.; Tester, J.W.; Peters, W.A. [Massachusetts Inst. of Tech., Cambridge, MA (United States)

    1992-06-01

    The overall goal of this study was to identify new high-flux solar photon (HFSP) processes that show promise of being feasible and in the national interest. Electric power generation and hazardous waste destruction were excluded from this study at sponsor request. Our overall conclusion is that there is promise for new applications of concentrated solar photons, especially in certain aspects of materials processing and premium materials synthesis. Evaluation of the full potential of these and other possible applications, including opportunities for commercialization, requires further research and testing. 100 refs.

  15. Upgrading geometry conceptual understanding and strategic competence through implementing rigorous mathematical thinking (RMT)

    Science.gov (United States)

    Nugraheni, Z.; Budiyono, B.; Slamet, I.

    2018-03-01

    To reach higher-order thinking skills (HOTS), conceptual understanding and strategic competence need to be mastered, as they are two basic components of HOTS. RMT is a unique realization of the cognitive conceptual construction approach based on Feuerstein's theory of Mediated Learning Experience (MLE) and Vygotsky's sociocultural theory. This was quasi-experimental research comparing an experimental class taught with Rigorous Mathematical Thinking (RMT) as the learning method and a control class taught with Direct Learning (DL) as the conventional learning activity. The study examined whether the two learning models had different effects on the conceptual understanding and strategic competence of junior high school students. The data were analyzed using Multivariate Analysis of Variance (MANOVA), which showed a significant difference between the experimental and control classes when mathematics conceptual understanding and strategic competence were considered jointly (Wilks' Λ = 0.84). Further, independent t-tests showed a significant difference between the two classes in both mathematical conceptual understanding and strategic competence. These results indicate that Rigorous Mathematical Thinking (RMT) had a positive impact on mathematics conceptual understanding and strategic competence.

  16. Seizing the Future: How Ohio's Career-Technical Education Programs Fuse Academic Rigor and Real-World Experiences to Prepare Students for College and Careers

    Science.gov (United States)

    Guarino, Heidi; Yoder, Shaun

    2015-01-01

    "Seizing the Future: How Ohio's Career and Technical Education Programs Fuse Academic Rigor and Real-World Experiences to Prepare Students for College and Work," demonstrates Ohio's progress in developing strong policies for career and technical education (CTE) programs to promote rigor, including college- and career-ready graduation…

  17. Cost reductions on a food processing plant identified by a process integration study at Cadbury Typhoo Ltd

    Energy Technology Data Exchange (ETDEWEB)

    Clayton, R.W. (Energy Technology Support Unit, Harwell (UK))

    1986-08-01

    The purpose of a process integration study is to determine the minimum practical amount of energy required to operate a process and to identify the minimum-cost schemes that maximise savings consistent with planning and operating criteria. The British-led development of Pinch Technology provides a more systematic approach. The technique involves the rigorous application of thermodynamic principles and cost accounting while taking account of practical engineering and operability constraints. Cadbury Typhoo Ltd is part of the Cadbury Schweppes Group. The company produces and markets hot and cold beverage products. A process integration study was carried out at the company's Knighton manufacturing site, where 'instant' powder beverage ingredients are made and packaged. The total energy bill at the site is in the region of Pound 0.5 million/year; the process energy bill is around Pound 300,000-350,000/year. The first significant opportunity involves changing the actual process by a different piping arrangement. For a capital expenditure of around Pound 1,000, savings worth Pound 25,000/year can be achieved, with a payback of approximately two weeks. (author).
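
    The minimum-energy targets such a study establishes come from the standard problem-table (pinch) cascade. A compact sketch with invented stream data, not taken from the Cadbury Typhoo study itself:

```python
# Problem-table (pinch) cascade with invented stream data; not taken from
# the Cadbury Typhoo study. Each stream: (supply T, target T, CP [kW/K]).
hot_streams  = [(180.0, 60.0, 2.0), (150.0, 30.0, 4.0)]   # cool down
cold_streams = [(30.0, 135.0, 3.0), (80.0, 140.0, 5.0)]   # heat up
dTmin = 10.0

# Shift hot temperatures down and cold temperatures up by dTmin/2.
shifted = [(s - dTmin/2, t - dTmin/2, cp, +1) for s, t, cp in hot_streams]
shifted += [(s + dTmin/2, t + dTmin/2, cp, -1) for s, t, cp in cold_streams]

bounds = sorted({t for s, e, cp, sign in shifted for t in (s, e)}, reverse=True)

# Cascade the net heat surplus of each shifted-temperature interval.
cascade, heat = [0.0], 0.0
for hi, lo in zip(bounds, bounds[1:]):
    net = sum(sign * cp * (hi - lo)
              for s, e, cp, sign in shifted
              if min(s, e) <= lo and max(s, e) >= hi)   # stream spans interval
    heat += net
    cascade.append(heat)

qh_min = max(0.0, -min(cascade))       # minimum hot utility target [kW]
qc_min = cascade[-1] + qh_min          # minimum cold utility target [kW]
print(f"Qh,min = {qh_min:.1f} kW, Qc,min = {qc_min:.1f} kW")
```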

  18. Calcium-dependence of Donnan potentials in glycerinated rabbit psoas muscle in rigor, at and beyond filament overlap; a role for titin in the contractile process

    DEFF Research Database (Denmark)

    Coomber, S J; Bartels, E M; Elliott, G F

    2011-01-01

    contracts and breaks the microelectrode. Therefore the rigor state was studied. There is no reason to suppose a priori that a similar voltage switch does not occur during contraction, however. Calcium dependence is still apparent in muscles stretched beyond overlap (sarcomere length>3.8 μm) and is also seen...... in the gap filaments between the A- and I-band ends; further stretching abolishes the dependence. These experiments strongly suggest that calcium dependence is controlled initially by the titin component, and that this control is lost when titin filaments break. We suppose that that effect is mediated...

  19. Application of Contact Mode AFM to Manufacturing Processes

    Science.gov (United States)

    Giordano, Michael A.; Schmid, Steven R.

    A review of the application of contact mode atomic force microscopy (AFM) to manufacturing processes is presented. A brief introduction to common experimental techniques including hardness, scratch, and wear testing is presented, with a discussion of challenges in the extension of manufacturing scale investigations to the AFM. Differences between the macro- and nanoscales tests are discussed, including indentation size effects and their importance in the simulation of processes such as grinding. The basics of lubrication theory are presented and friction force microscopy is introduced as a method of investigating metal forming lubrication on the nano- and microscales that directly simulates tooling/workpiece asperity interactions. These concepts are followed by a discussion of their application to macroscale industrial manufacturing processes and direct correlations are made.

  20. Rigorous time slicing approach to Feynman path integrals

    CERN Document Server

    Fujiwara, Daisuke

    2017-01-01

    This book proves that Feynman's original definition of the path integral actually converges to the fundamental solution of the Schrödinger equation, at least for short times, if the potential is differentiable sufficiently many times and its derivatives of order equal to or higher than two are bounded. The semi-classical asymptotic formula up to the second term of the fundamental solution is also proved, by a method different from that of Birkhoff. A bound on the remainder term is also proved. The Feynman path integral is a method of quantization using the Lagrangian function, whereas Schrödinger's quantization uses the Hamiltonian function. These two methods are believed to be equivalent. But equivalence has not been fully proved mathematically, because, compared with Schrödinger's method, there is still much to be done concerning the rigorous mathematical treatment of Feynman's method. Feynman himself defined a path integral as the limit of a sequence of integrals over finite-dimensional spaces which is obtained by...

  1. 7 CFR 1942.104 - Application processing.

    Science.gov (United States)

    2010-01-01

    ... 7 CFR 1942.104 (2010) — Application processing. Agriculture Regulations of the Department of Agriculture (Continued), RURAL HOUSING SERVICE, RURAL BUSINESS... with subpart B of part 1900 of this chapter. The following statement will also be made on all...

  2. Application of Six Sigma methodology to a diagnostic imaging process.

    Science.gov (United States)

    Taner, Mehmet Tolga; Sezen, Bulent; Atwat, Kamal M

    2012-01-01

    This paper aims to apply the Six Sigma methodology to improve workflow by eliminating the causes of failure in the medical imaging department of a private Turkish hospital. Implementation of the define, measure, analyse, improve and control (DMAIC) improvement cycle, workflow charts, fishbone diagrams and Pareto charts were employed, together with rigorous data collection in the department. The identification of root causes of repeat sessions and delays was followed by failure mode and effect analysis, hazard analysis and decision tree analysis. The most frequent causes of failure were malfunction of the RIS/PACS system and improper positioning of patients. Subsequent to extensive training of professionals, the sigma level was increased from 3.5 to 4.2. The data were collected over only four months. Six Sigma's data measurement and process improvement methodology is the impetus for health care organisations to rethink their workflow and reduce malpractice. It involves measuring, recording and reporting data on a regular basis. This enables the administration to monitor workflow continuously. The improvements in the workflow under study, made by determining the failures and potential risks associated with radiologic care, will have a positive impact on society in terms of patient safety. Having eliminated repeat examinations, the risk of being exposed to more radiation was also minimised. This paper supports the need to apply Six Sigma and presents an evaluation of the process in an imaging department.
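
    The sigma levels quoted follow the conventional short-term conversion from defects per million opportunities (DPMO), with the customary 1.5-sigma shift. A minimal sketch (not the paper's own calculation):

```python
# Conventional short-term sigma-level conversion with the customary
# 1.5-sigma shift; a sketch, not the paper's own calculation.
from scipy.stats import norm

def sigma_level(dpmo):
    """Sigma level implied by defects per million opportunities."""
    return norm.ppf(1.0 - dpmo / 1e6) + 1.5

def dpmo(sigma):
    """Inverse: DPMO implied by a given sigma level."""
    return 1e6 * norm.sf(sigma - 1.5)

print(round(dpmo(3.5)))   # ~22,750 defects per million opportunities
print(round(dpmo(4.2)))   # ~3,467  -- the reported improvement
```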

  3. Potential Applications of Zeolite Membranes in Reaction Coupling Separation Processes

    Directory of Open Access Journals (Sweden)

    Tunde V. Ojumu

    2012-10-01

    Full Text Available Future production of chemicals (e.g., fine and specialty chemicals in industry is faced with the challenge of limited material and energy resources. However, process intensification might play a significant role in alleviating this problem. A vision of process intensification through multifunctional reactors has stimulated research on membrane-based reactive separation processes, in which membrane separation and catalytic reaction occur simultaneously in one unit. These processes are rather attractive applications because they are potentially compact, less capital intensive, and have lower processing costs than traditional processes. Therefore this review discusses the progress and potential applications that have occurred in the field of zeolite membrane reactors during the last few years. The aim of this article is to update researchers in the field of process intensification and also provoke their thoughts on further research efforts to explore and exploit the potential applications of zeolite membrane reactors in industry. Further evaluation of this technology for industrial acceptability is essential in this regard. Therefore, studies such as techno-economical feasibility, optimization and scale-up are of the utmost importance.

  4. Applications of Friction Stir Processing during Engraving of Soft Materials

    Directory of Open Access Journals (Sweden)

    V. Kočović

    2015-12-01

    Full Text Available Friction stir processing has extensive application in many technological operations. Application area of friction stir processing can be extended to the processing of non-metallic materials, such as wood. The paper examines the friction stir processing contact between a specially designed hard and temperature-resistant rotating tool and workpiece which is made of wood. Interval of speed slip and temperature level under which the combustion occurs and carbonization layer of soft material was determined. The results of the research can be applied in technological process of wood engraving operations which may have significant technological and aesthetic effects.

  5. Rigorous classification and carbon accounting principles for low and Zero Carbon Cities

    International Nuclear Information System (INIS)

    Kennedy, Scott; Sgouridis, Sgouris

    2011-01-01

    A large number of communities, new developments, and regions aim to lower their carbon footprint and aspire to become 'zero carbon' or 'Carbon Neutral.' Yet there are neither clear definitions for the scope of emissions that such a label would address on an urban scale, nor is there a process for qualifying the carbon reduction claims. This paper addresses the question of how to define a zero carbon, Low Carbon, or Carbon Neutral urban development by proposing hierarchical emissions categories with three levels: Internal Emissions based on the geographical boundary, external emissions directly caused by core municipal activities, and internal or external emissions due to non-core activities. Each level implies a different carbon management strategy (eliminating, balancing, and minimizing, respectively) needed to meet a Net Zero Carbon designation. The trade-offs, implications, and difficulties of implementing carbon debt accounting based upon these definitions are further analyzed. - Highlights: → A gap exists in comprehensive and standardized accounting methods for urban carbon emissions. → We propose a comprehensive and rigorous City Framework for Carbon Accounting (CiFCA). → CiFCA classifies emissions hierarchically with corresponding carbon management strategies. → Adoption of CiFCA allows for meaningful comparisons of claimed performance of eco-cities.

  6. Real-time digital signal processing fundamentals, implementations and applications

    CERN Document Server

    Kuo, Sen M; Tian, Wenshun

    2013-01-01

    Combines both the DSP principles and real-time implementations and applications, and now updated with the new eZdsp USB Stick, which is very low cost, portable and widely employed at many DSP labs. Real-Time Digital Signal Processing introduces fundamental digital signal processing (DSP) principles and will be updated to include the latest DSP applications, introduce new software development tools and adjust the software design process to reflect the latest advances in the field. In the 3rd edition of the book, the key aspect of hands-on experiments will be enhanced to make the DSP principle

  7. Application of the geothermal energy in the industrial processes

    International Nuclear Information System (INIS)

    Popovska-Vasilevska, Sanja

    2001-01-01

    In worldwide practice, the application of geothermal energy as an alternative energy resource can be of great importance, especially in countries with exceptional natural geothermal potential. Besides its use for greenhouse heating and balneology, geothermal energy can be successfully implemented in heat-requiring industrial processes. This kind of use always provides a greater annual heat load factor, since industrial processes are not seasonal (or at least the greater part of them are not). The quality of the geothermal resources available in Europe dictates use within low-temperature technological processes. However, such processes are significantly represented in different branches of the processing industries. Despite this fact, the industrial application of geothermal energy in Europe is still at its beginning. (Original)

  8. 8th International Conference on Robotic, Vision, Signal Processing & Power Applications

    CERN Document Server

    Mustaffa, Mohd

    2014-01-01

    The proceeding is a collection of research papers presented, at the 8th International Conference on Robotics, Vision, Signal Processing and Power Applications (ROVISP 2013), by researchers, scientists, engineers, academicians as well as industrial professionals from all around the globe. The topics of interest are as follows but are not limited to: • Robotics, Control, Mechatronics and Automation • Vision, Image, and Signal Processing • Artificial Intelligence and Computer Applications • Electronic Design and Applications • Telecommunication Systems and Applications • Power System and Industrial Applications  

  9. Rigorous modelling of light's intensity angular-profile in Abbe refractometers with absorbing homogeneous fluids

    International Nuclear Information System (INIS)

    García-Valenzuela, A; Contreras-Tello, H; Márquez-Islas, R; Sánchez-Pérez, C

    2013-01-01

    We derive an optical model for the light intensity distribution around the critical angle in a standard Abbe refractometer when used on absorbing homogeneous fluids. The model is developed using rigorous electromagnetic optics. The obtained formula is very simple and can readily be used in the analysis and design of optical sensors relying on Abbe-type refractometry.
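
    As background (this is not the paper's intensity formula), an Abbe-type refractometer reads the sample index off the critical angle at the prism/sample interface; absorption blurs this edge, which is what the rigorous model describes. The prism index below is an assumed illustrative value:

```python
# Background sketch, not the paper's intensity formula: the refractometer
# reads the sample index off the critical angle at the prism interface.
import math

def critical_angle_deg(n_sample, n_prism=1.75):
    """Critical angle for total internal reflection (prism index assumed)."""
    return math.degrees(math.asin(n_sample / n_prism))

print(f"{critical_angle_deg(1.333):.2f} deg")   # water against an n = 1.75 prism
```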

  10. A Generalized Method for the Comparable and Rigorous Calculation of the Polytropic Efficiencies of Turbocompressors

    Science.gov (United States)

    Dimitrakopoulos, Panagiotis

    2018-03-01

    The calculation of polytropic efficiencies is a very important task, especially during the development of new compression units, such as compressor impellers, stages and stage groups. Such calculations are also crucial for determining the performance of a whole compressor. As processors and computational capacities have improved substantially in recent years, the need emerged for a new, rigorous, robust, accurate and at the same time standardized method for computing polytropic efficiencies, especially one based on the thermodynamics of real gases. The proposed method is based on the rigorous definition of the polytropic efficiency. The input consists of pressure and temperature values at the end points of the compression path (suction and discharge) for a given working fluid. The average relative error for the studied cases was 0.536%. This high-accuracy method is therefore proposed for efficiency calculations related to turbocompressors and their compression units, especially when they are operating at high power levels, for example in jet engines and high-power plants.
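
    For contrast with the rigorous real-gas treatment the paper proposes, the familiar ideal-gas end-point formula shows how suction and discharge (p, T) pairs determine a polytropic efficiency. The input values below are illustrative only:

```python
# Ideal-gas sketch only; the paper's point is that a rigorous real-gas
# treatment is needed. Input values are illustrative.
import math

def polytropic_efficiency(p1, T1, p2, T2, kappa=1.4):
    """eta_p = ((kappa-1)/kappa) * ln(p2/p1) / ln(T2/T1) for an ideal gas."""
    return (kappa - 1.0) / kappa * math.log(p2 / p1) / math.log(T2 / T1)

# Suction at 1 bar / 293 K, discharge at 4 bar / 470 K:
print(f"eta_p = {polytropic_efficiency(1.0, 293.0, 4.0, 470.0):.3f}")
```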

  11. CONCEPTUAL NUANCES OF AMORTIZATION PROCESS TERMINOLOGY

    Directory of Open Access Journals (Sweden)

    Marioara Avram

    2016-12-01

    Full Text Available Often, both in literature and practice in economics, it is found that a number of terms related to the amortization process, such as wear, depreciation, physical lifetime, economic lifetime, useful lifetime, normal operation lifetime are used a manner that distorts their content and, therefore, affect the quality of information provided through them. In this article we intend to give opinion to contribute to clarifying the meaning of the terms mentioned, including the relationships between them, in order to provide a more rigorous conceptual base regarding the professional reasoning applicable to amortization. Given that this topic is a focal point of the spheres of interest of several categories of specialists, we will expose our work by comparing the national tax accounting regulations and, international accounting standards and international valuation standards.

  12. REVIEW ARTICLE: Spectrophotometric applications of digital signal processing

    Science.gov (United States)

    Morawski, Roman Z.

    2006-09-01

    Spectrophotometry is more and more often the method of choice not only in analysis of (bio)chemical substances, but also in the identification of physical properties of various objects and their classification. The applications of spectrophotometry include such diversified tasks as monitoring of optical telecommunications links, assessment of eating quality of food, forensic classification of papers, biometric identification of individuals, detection of insect infestation of seeds and classification of textiles. In all those applications, large numbers of data, generated by spectrophotometers, are processed by various digital means in order to extract measurement information. The main objective of this paper is to review the state-of-the-art methodology for digital signal processing (DSP) when applied to data provided by spectrophotometric transducers and spectrophotometers. First, a general methodology of DSP applications in spectrophotometry, based on DSP-oriented models of spectrophotometric data, is outlined. Then, the most important classes of DSP methods for processing spectrophotometric data—the methods for DSP-aided calibration of spectrophotometric instrumentation, the methods for the estimation of spectra on the basis of spectrophotometric data, the methods for the estimation of spectrum-related measurands on the basis of spectrophotometric data—are presented. Finally, the methods for preprocessing and postprocessing of spectrophotometric data are overviewed. Throughout the review, the applications of DSP are illustrated with numerous examples related to broadly understood spectrophotometry.

  13. Quality of nuchal translucency measurements correlates with broader aspects of program rigor and culture of excellence.

    Science.gov (United States)

    Evans, Mark I; Krantz, David A; Hallahan, Terrence; Sherwin, John; Britt, David W

    2013-01-01

    To determine whether nuchal translucency (NT) quality correlates with the extent to which clinics vary in rigor and quality control. We correlated NT performance quality (bias and precision) for 246,000 patients with two alternative measures of clinic culture: the percentage of cases in which nasal bone (NB) measurements were performed and the percentage of requisitions correctly filled in for race-ethnicity and weight. In clinics where requisition errors occurred in more than 5% of cases (33% of clinics), the NT curve was lowered to 0.93 MoM, whereas in clinics measuring NB in more than 90% of cases the median was 0.99 MoM. A culture of quality exists independent of individual variation in NT quality, and two divergent indices of program rigor are associated with NT quality. Quality control must be program wide, and to effect continued improvement in the quality of NT results across time, the cultures of clinics must become a target for intervention. Copyright © 2013 S. Karger AG, Basel.

  14. Comparison of rigorous modelling of different structure profiles on photomasks for quantitative linewidth measurements by means of UV- or DUV-optical microscopy

    Science.gov (United States)

    Ehret, Gerd; Bodermann, Bernd; Woehler, Martin

    2007-06-01

    Optical microscopy is an important instrument for the dimensional characterisation or calibration of micro- and nanostructures, e.g. chrome structures on photomasks. In comparison to scanning electron microscopy (possible contamination of the sample) and atomic force microscopy (slow, risk of damage), optical microscopy is a fast and non-destructive metrology method. The precise quantitative determination of the linewidth from the microscope image is, however, only possible with knowledge of the geometry of the structures and its consideration in the optical modelling. We compared two different rigorous model approaches, the Rigorous Coupled Wave Analysis (RCWA) and the Finite Element Method (FEM), for modelling structures with different edge angles, linewidths, line-to-space ratios and polarisations. The RCWA method can represent inclined edge profiles only by a staircase approximation, leading to increased modelling errors. Even today's sophisticated rigorous methods still show problems with TM-polarisation; therefore both rigorous methods are compared in terms of their convergence for TE- and TM-polarisation. Beyond that, the influence of typical illumination wavelengths (365 nm, 248 nm and 193 nm) on the microscope images and their contribution to the measurement uncertainty budget is discussed.
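
    To make the staircase approximation concrete, the toy sketch below slices a trapezoidal line profile with a given edge angle into the stack of rectangular layers that RCWA requires; all dimensions are hypothetical.

        # Staircase (layer) approximation of an inclined edge profile.
        import math
        import numpy as np

        def staircase_layers(bottom_width, top_width, height, n_layers):
            """Return (thickness, linewidth) pairs, bottom to top."""
            dz = height / n_layers
            z_mid = (np.arange(n_layers) + 0.5) * dz        # slice midpoints
            widths = bottom_width + (top_width - bottom_width) * z_mid / height
            return [(dz, w) for w in widths]

        # 100 nm tall chrome line, 200 nm bottom CD, 80 degree edge angle
        bottom = 200.0
        top = bottom - 2.0 * 100.0 / math.tan(math.radians(80.0))
        for dz, w in staircase_layers(bottom, top, 100.0, 5):
            print(f"layer {dz:.0f} nm thick, width {w:.1f} nm")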

  15. From everyday communicative figurations to rigorous audience news repertoires

    DEFF Research Database (Denmark)

    Kobbernagel, Christian; Schrøder, Kim Christian

    2016-01-01

    In the last couple of decades there has been an unprecedented explosion of news media platforms and formats, as a succession of digital and social media have joined the ranks of legacy media. We live in a 'hybrid media system' (Chadwick, 2013), in which people build their cross-media news repertoires from the ensemble of old and new media available. This article presents an innovative mixed-method approach with considerable explanatory power for the exploration of patterns of news media consumption. This approach tailors Q-methodology in the direction of a qualitative study of news consumption, in which a card sorting exercise serves to translate the participants' news media preferences into a form that enables the researcher to undertake a rigorous factor-analytical construction of their news consumption repertoires. This interpretive, factor-analytical procedure results in the building of these repertoires.

  16. Using vignettes to study family consumption processes

    DEFF Research Database (Denmark)

    Grønhøj, Alice; Bech-Larsen, Tino

    2010-01-01

    The use of vignettes for qualitative consumer research is discussed in this article. More specifically, vignettes are proposed as a useful research technique for conducting systematic and rigorous studies of consumer interaction processes, in particular as these relate to family consumption issues. Following an overview of methodological and practical problems of studying consumption interaction processes in families, a discussion of how vignettes may be used to enhance knowledge of family decision-making processes in real-life contexts is presented. Design implications are discussed and strategies for applying the vignette method are outlined and illustrated by two recent studies of proenvironmental consumer behavior in a family context. The paper concludes with a discussion of the benefits and the possible pitfalls of using vignettes.

  17. Novel applications of ionic liquids in materials processing

    International Nuclear Information System (INIS)

    Reddy, Ramana G

    2009-01-01

    Ionic liquids are mixtures of organic and inorganic salts which are liquid at room temperature. Potential applications of ionic liquids in the field of materials processing include electrowinning and electrodeposition of metals and alloys, electrolysis of active metals at low temperature, and liquid-liquid extraction of metals. Results using 1-butyl-3-methylimidazolium chloride with AlCl3 at low temperatures yielded high-purity aluminium deposits (>99.9% pure) and current efficiencies >98%. Titanium and aluminium were co-deposited, with/without the addition of TiCl4, with up to 27 wt% Ti in the deposit and current efficiencies in the range of 78-85%. Certain ionic liquids are potential replacements for thermal oils and molten salts as heat transfer fluids in solar energy applications due to their high thermal stability, very low corrosivity and substantial sensible heat retention. The calculated storage densities for several chloride and fluoride ionic liquids are in the range of 160-210 MJ/m3. A 3-D mathematical model was developed to simulate the large-scale electrowinning of aluminium. Since ionic liquid processing offers low energy consumption and low pollutant emissions, many more materials processing applications are expected in the future.

  18. Advances in the Application of Image Processing Fruit Grading

    OpenAIRE

    Fang, Chengjun; Hua, Chunjian

    2013-01-01

    From the perspective of actual production, the paper presents advances in the application of image processing to fruit grading, considering aspects such as the processing precision and processing speed of image processing technology. Furthermore, effectively combining the different algorithms for detecting size, shape, color and defects, so as to reduce the complexity of each algorithm and achieve a balance between processing precision and processing speed, is key to automated fruit grading.

  19. New applications and novel processing of refractory metal alloys

    International Nuclear Information System (INIS)

    Briant, C.L.

    2001-01-01

    Refractory metals have often been limited in their application because of their propensity to oxidize and to undergo a loss of yield strength at elevated temperatures. However, recent developments in both processing and alloy composition have opened the possibility that these materials might be used in structural applications that were not considered possible in the past. At the same time, the use of refractory metals in the electronics industry is growing, particularly with the use of tantalum as a diffusion barrier for copper metallization. Finally, the application of grain boundary engineering to the problem of intergranular fracture in these materials may allow processes to be developed that will produce alloys with a greater resistance to fracture. (author)

  20. Rigorous lower bound on the dynamic critical exponent of some multilevel Swendsen-Wang algorithms

    International Nuclear Information System (INIS)

    Li, X.; Sokal, A.D.

    1991-01-01

    We prove the rigorous lower bound z_exp ≥ α/ν for the dynamic critical exponent of a broad class of multilevel (or ''multigrid'') variants of the Swendsen-Wang algorithm. This proves that such algorithms do suffer from critical slowing down. We conjecture that such algorithms in fact lie in the same dynamic universality class as the standard Swendsen-Wang algorithm.

  1. Beyond the RCT: Integrating Rigor and Relevance to Evaluate the Outcomes of Domestic Violence Programs

    Science.gov (United States)

    Goodman, Lisa A.; Epstein, Deborah; Sullivan, Cris M.

    2018-01-01

    Programs for domestic violence (DV) victims and their families have grown exponentially over the last four decades. The evidence demonstrating the extent of their effectiveness, however, often has been criticized as stemming from studies lacking scientific rigor. A core reason for this critique is the widespread belief that credible evidence can…

  2. 9th International Conference on Robotics, Vision, Signal Processing & Power Applications

    CERN Document Server

    Iqbal, Shahid; Teoh, Soo; Mustaffa, Mohd

    2017-01-01

    These proceedings collect the research papers presented at the 9th International Conference on Robotics, Vision, Signal Processing & Power Applications (ROVISP 2016) by researchers, scientists, engineers, academicians and industrial professionals from around the globe, reporting their research results and development activities in oral or poster presentations. The topics of interest include, but are not limited to:   • Robotics, Control, Mechatronics and Automation • Vision, Image, and Signal Processing • Artificial Intelligence and Computer Applications • Electronic Design and Applications • Telecommunication Systems and Applications • Power System and Industrial Applications • Engineering Education.

  3. Benefits of important industrial tracer applications in the GDR

    International Nuclear Information System (INIS)

    Leonhardt, J.W.; Goeldner, R.; Koennecke, H.G.; Kupsch, H.; Luther, D.; Otto, R.; Reinhardt, R.; Ulrich, H.

    1990-01-01

    Tracers can be used to label substances or objects in order to discriminate between them, to follow their movement, to record changes of concentration and distribution between phases, etc. The main advantages of tracer investigations are the contactless recording of signals without influencing the observed process (also under rigorous operating conditions), the high detection sensitivity, the large number of available tracer nuclides (problems in all branches of industry can be solved) and the fact that tracer investigations can be carried out on operating production units, so that they provide valuable checks of the validity of design and process data. The cost-to-benefit ratio can be as low as 1:50. In the following, some selected examples of tracer applications and their benefits are presented. (orig./BBR)

  4. Image processing using pulse-coupled neural networks applications in Python

    CERN Document Server

    Lindblad, Thomas

    2013-01-01

    Image processing algorithms based on the mammalian visual cortex are powerful tools for extracting information from and manipulating images. This book reviews the neural theory and translates it into digital models. Applications are given in the areas of image recognition, foveation, image fusion and information extraction. The third edition reflects renewed international interest in pulse image processing, with updated sections presenting several newly developed applications. This edition also introduces a suite of Python scripts that assist readers in replicating results presented in the text and in further developing their own applications.
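
    For readers unfamiliar with the model, a minimal PCNN iteration is sketched below, following the standard discrete feeding/linking/threshold equations; the parameter values and test image are illustrative and are not taken from the book.

        # Minimal pulse-coupled neural network (PCNN) in NumPy/SciPy.
        import numpy as np
        from scipy.signal import convolve2d

        def pcnn(stimulus, n_iter=10, beta=0.2, vF=0.01, vL=1.0, vT=20.0,
                 aF=0.1, aL=1.0, aT=0.5):
            kernel = np.array([[0.5, 1.0, 0.5],
                               [1.0, 0.0, 1.0],
                               [0.5, 1.0, 0.5]])
            F = np.zeros_like(stimulus); L = np.zeros_like(stimulus)
            Y = np.zeros_like(stimulus); theta = np.ones_like(stimulus)
            pulses = []
            for _ in range(n_iter):
                work = convolve2d(Y, kernel, mode="same")
                F = np.exp(-aF) * F + vF * work + stimulus   # feeding input
                L = np.exp(-aL) * L + vL * work              # linking input
                U = F * (1.0 + beta * L)                     # internal activity
                Y = (U > theta).astype(float)                # pulse output
                theta = np.exp(-aT) * theta + vT * Y         # dynamic threshold
                pulses.append(Y.copy())
            return pulses

        img = np.zeros((32, 32)); img[8:24, 8:24] = 1.0      # bright square test image
        waves = pcnn(img)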

  5. Dosimetric effects of edema in permanent prostate seed implants: a rigorous solution

    International Nuclear Information System (INIS)

    Chen Zhe; Yue Ning; Wang Xiaohong; Roberts, Kenneth B.; Peschel, Richard; Nath, Ravinder

    2000-01-01

    Purpose: To derive a rigorous analytic solution to the dosimetric effects of prostate edema so that its impact on conventional pre-implant and post-implant dosimetry can be studied for any given radioactive isotope and edema characteristics. Methods and Materials: The edema characteristics observed by Waterman et al. (Int. J. Radiat. Oncol. Biol. Phys. 41:1069-1077, 1998) were used to model the time evolution of the prostate and the seed locations. The total dose to any part of the prostate tissue from a seed implant was calculated analytically by parameterizing the dose fall-off from a radioactive seed as a single inverse power function of distance, with proper account of the edema-induced time evolution. The dosimetric impact of prostate edema was determined by comparing the dose calculated with full consideration of prostate edema to that calculated with the conventional dosimetry approach, where the seed locations and the target volume are assumed to be stationary. Results: A rigorous analytic solution for the relative dosimetric effects of prostate edema was obtained. This solution proved explicitly that the relative dosimetric effects of edema, as found in the previous numerical studies by Yue et al. (Int. J. Radiat. Oncol. Biol. Phys. 43:447-454, 1999), are independent of the size and shape of the implant target volume and of the number and locations of the seeds implanted. It also showed that the magnitude of the relative dosimetric effects is independent of the location of the dose evaluation point within the edematous target volume, implying that the relative dosimetric effects of prostate edema are universal for a given isotope and edema characteristic. A set of master tables of the relative dosimetric effects of edema was obtained for a wide range of edema characteristics for both 125I and 103Pd prostate seed implants. Conclusions: A rigorous analytic solution of the relative dosimetric effects of prostate edema has been obtained.
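
    The flavor of the underlying calculation can be conveyed by a schematic numerical version: integrate the dose rate of a decaying seed while the seed-to-point distance shrinks as the edema resolves exponentially. The inverse-power fall-off exponent and edema parameters below are placeholders, not the paper's fitted values.

        # Schematic edema dosimetry: relative total dose with and without edema.
        import numpy as np

        def total_dose(r0_cm, delta0=0.5, edema_half_life_d=9.0,
                       seed_half_life_d=59.4, n=2.0, days=400, steps=40000):
            """Integrate dose_rate(t) ~ exp(-lambda_s t) / r(t)**n."""
            t = np.linspace(0.0, days, steps)
            lam_s = np.log(2) / seed_half_life_d       # seed decay (125I-like)
            lam_e = np.log(2) / edema_half_life_d      # edema resolution
            # linear dimensions scale as the cube root of the swollen volume
            r = r0_cm * (1.0 + delta0 * np.exp(-lam_e * t)) ** (1.0 / 3.0)
            rate = np.exp(-lam_s * t) / r ** n         # relative dose rate
            return np.trapz(rate, t)

        # relative dosimetric effect of edema vs. the static assumption
        print(total_dose(1.0) / total_dose(1.0, delta0=0.0))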

  6. 3rd Workshop on Branching Processes and their Applications

    CERN Document Server

    González, Miguel; Gutiérrez, Cristina; Martínez, Rodrigo; Minuesa, Carmen; Molina, Manuel; Mota, Manuel; Ramos, Alfonso; WBPA15

    2016-01-01

    This volume gathers papers originally presented at the 3rd Workshop on Branching Processes and their Applications (WBPA15), which was held from 7 to 10 April 2015 in Badajoz, Spain (http://branching.unex.es/wbpa15/index.htm). The papers address a broad range of theoretical and practical aspects of branching process theory. Further, they amply demonstrate that the theoretical research in this area remains vital and topical, as well as the relevance of branching concepts in the development of theoretical approaches to solving new problems in applied fields such as Epidemiology, Biology, Genetics, and, of course, Population Dynamics. The topics covered can broadly be classified into the following areas: 1. Coalescent Branching Processes 2. Branching Random Walks 3. Population Growth Models in Varying and Random Environments 4. Size/Density/Resource-Dependent Branching Models 5. Age-Dependent Branching Models 6. Special Branching Models 7. Applications in Epidemiology 8. Applications in Biology and Genetics Offer...

  7. Advanced Process Control Application and Optimization in Industrial Facilities

    Directory of Open Access Journals (Sweden)

    Howes S.

    2015-01-01

    Full Text Available This paper describes the application of a new method and tool for system identification and PID tuning/advanced process control (APC) optimization using the new 3G (geometric, gradient, gravity) optimization method. It helps to design and implement control schemes directly inside the distributed control system (DCS) or programmable logic controller (PLC). The algorithm also identifies process dynamics in closed-loop mode, optimizes controller parameters, and supports the development of adaptive control and model-based control (MBC). The application of the new 3G algorithm for designing and implementing APC schemes is presented. Optimization of primary and advanced control schemes stabilizes the process and allows the plant to run closer to process, equipment and economic constraints. This increases production rates, minimizes operating costs and improves product quality.
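
    The 3G optimization itself is proprietary, but for context the sketch below shows a generic discrete PID controller with simple anti-windup, i.e. the kind of loop whose parameters such a tool would identify and tune; all gains and limits are placeholders.

        # Generic positional-form PID with conditional-integration anti-windup.
        class PID:
            def __init__(self, kp, ki, kd, dt, out_min=0.0, out_max=100.0):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.out_min, self.out_max = out_min, out_max
                self.integral = 0.0
                self.prev_error = 0.0

            def update(self, setpoint, measurement):
                error = setpoint - measurement
                derivative = (error - self.prev_error) / self.dt
                candidate = (self.kp * error
                             + self.ki * (self.integral + error * self.dt)
                             + self.kd * derivative)
                out = min(max(candidate, self.out_min), self.out_max)
                if out == candidate:                  # integrate only when the
                    self.integral += error * self.dt  # output is not saturated
                self.prev_error = error
                return out

        controller = PID(kp=2.0, ki=0.5, kd=0.1, dt=1.0)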

  8. High temperature heat exchange: nuclear process heat applications

    International Nuclear Information System (INIS)

    Vrable, D.L.

    1980-09-01

    The unique element of the HTGR system is the high-temperature operation and the need for heat exchanger equipment to transfer nuclear heat from the reactor to the process application. This paper discusses the potential applications of the HTGR in both synthetic fuel production and nuclear steel making and presents the design considerations for the high-temperature heat exchanger equipment

  9. "Snow White" Coating Protects SpaceX Dragon's Trunk Against Rigors of Space

    Science.gov (United States)

    McMahan, Tracy

    2013-01-01

    He described it as "snow white." But NASA astronaut Don Pettit was not referring to the popular children's fairy tale. Rather, he was talking about the white coating of the Space Exploration Technologies Corp. (SpaceX) Dragon spacecraft that reflected the International Space Station's light. As it approached the station for the first time in May 2012, the Dragon's trunk might have been described as the "fairest of them all" for its pristine coating, allowing Pettit to see clearly to maneuver the robotic arm to grab the Dragon for a successful nighttime berthing. This protective thermal control coating, developed by Alion Science and Technology Corp., based in McLean, Va., made its bright appearance again with the March 1 launch of SpaceX's second commercial resupply mission. Named Z-93C55, the coating was applied to the cargo portion of the Dragon to protect it from the rigors of space. "For decades, Alion has produced coatings to protect against the rigors of space," said Michael Kenny, senior chemist with Alion. "As space missions evolved, there was a growing need to dissipate electrical charges that build up on the exteriors of spacecraft, or there could be damage to the spacecraft's electronics. Alion's research led us to develop materials that would meet this goal while also providing thermal controls. The outcome of this research was Alion's proprietary Z-93C55 coating."

  10. Study design elements for rigorous quasi-experimental comparative effectiveness research.

    Science.gov (United States)

    Maciejewski, Matthew L; Curtis, Lesley H; Dowd, Bryan

    2013-03-01

    Quasi-experiments are likely to be the workhorse study design used to generate evidence about the comparative effectiveness of alternative treatments, because of their feasibility, timeliness, affordability and external validity compared with randomized trials. In this review, we outline potential sources of discordance in results between quasi-experiments and experiments, review study design choices that can improve the internal validity of quasi-experiments, and outline innovative data linkage strategies that may be particularly useful in quasi-experimental comparative effectiveness research. There is an urgent need to resolve the debate about the evidentiary value of quasi-experiments since equal consideration of rigorous quasi-experiments will broaden the base of evidence that can be brought to bear in clinical decision-making and governmental policy-making.

  11. Programming signal processing applications on heterogeneous wireless sensor platforms

    NARCIS (Netherlands)

    Buondonno, L.; Fortino, G.; Galzarano, S.; Giannantonio, R.; Giordano, A.; Gravina, R.; Guerrieri, A.

    2009-01-01

    This paper proposes the SPINE frameworks (SPINE1.x and SPINE2) for programming signal processing applications on heterogeneous wireless sensor platforms. In particular, two integrable approaches based on the proposed frameworks are described that allow developers to build such applications for heterogeneous wireless sensor platforms.

  12. 49 CFR 555.7 - Processing of applications.

    Science.gov (United States)

    2010-10-01

    National Highway Traffic Safety Administration, Department of Transportation: Temporary Exemption from Motor Vehicle Safety and Bumper Standards, General. § 555.7 Processing of applications. (a) The NHTSA publishes in the Federal Register, affording...

  13. Do problem-solving skills affect success in nursing process applications? An application among Turkish nursing students.

    Science.gov (United States)

    Bayindir Çevik, Ayfer; Olgun, Nermin

    2015-04-01

    This study aimed to determine the relationship between the problem-solving and nursing process application skills of nursing students. This is a longitudinal and correlational study. The sample included 71 students. An information form, the Problem-Solving Inventory, and the nursing processes the students presented at the end of clinical courses were used for data collection. Although there was no significant relationship between problem-solving skills and nursing process grades, improvements in problem-solving skills were accompanied by more successful grades. Problem-solving skills and nursing process skills can be increased concomitantly. Students were advised to use critical thinking, practical approaches, and care plans, as well as to revise nursing processes, in order to improve their problem-solving and nursing process application skills. © 2014 NANDA International, Inc.

  14. Rigorous Combination of GNSS and VLBI: How it Improves Earth Orientation and Reference Frames

    Science.gov (United States)

    Lambert, S. B.; Richard, J. Y.; Bizouard, C.; Becker, O.

    2017-12-01

    Current reference series (C04) of the International Earth Rotation and Reference Systems Service (IERS) are produced by a weighted combination of Earth orientation parameter (EOP) time series built up by the combination centers of each technique (VLBI, GNSS, laser ranging, DORIS). In the future, we plan to derive EOP from a rigorous combination of the normal equation systems of the four techniques. We present here the results of a rigorous combination of VLBI and GNSS pre-reduced, constraint-free normal equations with the DYNAMO geodetic analysis software package developed and maintained by the French GRGS (Groupe de Recherche en Géodésie Spatiale). The normal equations used are those produced separately by the IVS and IGS combination centers, to which we apply our own minimal constraints. We address the usefulness of such a method with respect to the classical, a posteriori combination method, and we show whether EOP determinations are improved. In particular, we implement external validations of the EOP series based on comparison with geophysical excitation and examination of the covariance matrices. Finally, we address the potential of the technique for the next-generation celestial reference frames, which are currently determined by VLBI only.
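
    The core of a rigorous combination can be illustrated in a few lines: per-technique normal equations over common parameters are stacked, a minimal constraint is added, and the combined system is solved once. The matrices below are random stand-ins, not IVS/IGS products, and the constraint is a crude regularization rather than a proper geodetic datum.

        # Toy stacking of pre-reduced normal equations from two techniques.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 6                                    # common parameters (e.g. EOP)
        A1 = rng.standard_normal((40, n)); y1 = rng.standard_normal(40)  # "VLBI"
        A2 = rng.standard_normal((60, n)); y2 = rng.standard_normal(60)  # "GNSS"

        N1, b1 = A1.T @ A1, A1.T @ y1            # per-technique normal equations
        N2, b2 = A2.T @ A2, A2.T @ y2

        C = 1e-6 * np.eye(n)                     # stand-in for minimal constraints
        x_combined = np.linalg.solve(N1 + N2 + C, b1 + b2)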

  15. Rigorous vector wave propagation for arbitrary flat media

    Science.gov (United States)

    Bos, Steven P.; Haffert, Sebastiaan Y.; Keller, Christoph U.

    2017-08-01

    Precise modelling of the (off-axis) point spread function (PSF) to identify geometrical and polarization aberrations is important for many optical systems. In order to characterise the PSF of the system in all Stokes parameters, an end-to-end simulation of the system has to be performed in which Maxwell's equations are rigorously solved. We present the first results of a Python code that we are developing to perform multiscale end-to-end wave propagation simulations that include all relevant physics. Currently we can handle plane-parallel near- and far-field vector diffraction effects of propagating waves in homogeneous isotropic and anisotropic materials, refraction and reflection at flat parallel surfaces, interference effects in thin films and unpolarized light. We show that the code has a numerical precision on the order of 10^-16 for non-absorbing isotropic and anisotropic materials. For absorbing materials the precision is on the order of 10^-8. The capabilities of the code are demonstrated by simulating a converging beam reflecting from a flat aluminium mirror at normal incidence.

  16. Dynamics of harmonically-confined systems: Some rigorous results

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Zhigang, E-mail: zwu@physics.queensu.ca; Zaremba, Eugene, E-mail: zaremba@sparky.phy.queensu.ca

    2014-03-15

    In this paper we consider the dynamics of harmonically-confined atomic gases. We present various general results which are independent of particle statistics, interatomic interactions and dimensionality. Of particular interest is the response of the system to external perturbations which can be either static or dynamic in nature. We prove an extended Harmonic Potential Theorem which is useful in determining the damping of the centre of mass motion when the system is prepared initially in a highly nonequilibrium state. We also study the response of the gas to a dynamic external potential whose position is made to oscillate sinusoidally in a given direction. We show in this case that either the energy absorption rate or the centre of mass dynamics can serve as a probe of the optical conductivity of the system. -- Highlights: •We derive various rigorous results on the dynamics of harmonically-confined atomic gases. •We derive an extension of the Harmonic Potential Theorem. •We demonstrate the link between the energy absorption rate in a harmonically-confined system and the optical conductivity.
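
    For orientation, the Kohn-mode result that underlies the (extended) Harmonic Potential Theorem can be stated compactly; the paraphrase below is ours, not the paper's more general statement: for N particles of mass m in an isotropic trap of frequency omega_0, the centre of mass R(t) obeys a single-particle equation of motion, independent of the interparticle interactions,

        % Centre-of-mass equation of motion in a harmonic trap (Kohn mode)
        \begin{equation}
          M\,\ddot{\mathbf{R}}(t) \;=\; -\,M\,\omega_0^{2}\,\mathbf{R}(t)
          \;+\; N\,\mathbf{F}_{\mathrm{ext}}(t),
          \qquad M = N m .
        \end{equation}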

  17. Effective application of statistical process control (SPC) on the lengthwise tonsure rolled plates process

    OpenAIRE

    Noskievičová, Darja; Kucharczyk, Radim

    2012-01-01

    This paper deals with the effective application of SPC to the lengthwise tonsure rolled plates process on double-side scissors. After an explanation of SPC fundamentals, goals and common mistakes during SPC implementation, a methodical framework for effective SPC application is defined. In the next part of the paper, the practical application of SPC is described and analysed from the point of view of this framework.

  18. 47 CFR 74.1233 - Processing FM translator and booster station applications.

    Science.gov (United States)

    2010-10-01

    Title 47 Telecommunication: FM Broadcast Translator Stations and FM Broadcast Booster Stations. § 74.1233 Processing FM translator and booster station applications. (a) Applications for FM translator and booster stations are...

  19. Digital Light Processing update: status and future applications

    Science.gov (United States)

    Hornbeck, Larry J.

    1999-05-01

    Digital Light Processing (DLP) projection displays based on the Digital Micromirror Device (DMD) were introduced to the market in 1996. Less than 3 years later, DLP-based projectors are found in such diverse applications as mobile, conference room, video wall, home theater, and large-venue. They provide high-quality, seamless, all-digital images that have exceptional stability as well as freedom from both flicker and image lag. Marked improvements have been made in the image quality of DLP-based projection displays, including brightness, resolution, contrast ratio, and border image. DLP-based mobile projectors that weighed about 27 pounds in 1996 now weigh only about 7 pounds. This weight reduction has been responsible for the definition of an entirely new projector class, the ultraportable. New applications are being developed for this important new projection display technology; these include digital photofinishing for high-process-speed minilab and maxilab applications and DLP Cinema for the digital delivery of films to audiences around the world. This paper describes the status of DLP-based projection display technology, including its manufacturing, performance improvements, and new applications, with emphasis on DLP Cinema.

  20. The application of radiation technology in industrial processes

    International Nuclear Information System (INIS)

    Silvermann, J.

    1974-01-01

    The author makes a general survey of current applications of radiation processing, such as sterilization of biological and medical supplies, crosslinking of polymers, production of durable-press fabrics, radiation-cured coatings, production of wood-plastic composites, radiation degradation and chemical synthesis. The adoption of radiation processing on a large scale by Western Electric is presented. Trends in costs and environmental problems have a profound effect on the future of radiation processing. (M.S.)

  1. Applications Of Laser Processing For Automotive Manufacturing In Japan

    Science.gov (United States)

    Ito, Masashi; Ueda, Katsuhiko; Takagi, Soya

    1986-11-01

    Recently in Japan, laser processing is increasingly being employed in production, so that laser cutting, laser welding and other laser material processing have begun to be used in various industries. As a result, the number of lasers sold in Japan has been increasing year by year. In the Japanese automotive industry, a number of applications have been introduced in laboratories and production lines. In this paper, several current instances of such laser applications are introduced. In the case of welding, studies have been conducted on applying laser welding to automatic transmission components in place of electron beam welding. Another example of application, the combination of lasers and robots to form highly flexible manufacturing systems, has been adopted for trimming steel panels and plastic components.

  2. Nuclear analytical techniques and applications to materials processing

    International Nuclear Information System (INIS)

    Blondiaux, G.; Debrun, J.L.

    1993-01-01

    This paper presents the application of Rutherford backscattering spectrometry to thin-film stoichiometry determination and its use in optimizing the film elaboration process in the case of dielectric films (Ge,Pb,O) and ionic conductor films (Na,Al,O). We then present the application of particle-induced gamma emission (PIGE) to the characterization of ternary compounds (B,Si,C) used as coatings to protect composite materials. The last part of this paper describes the determination of oxygen in the bulk of fluoride glasses with charged particle activation analysis. (orig.)

  3. Community historians and the dilemma of rigor vs relevance : A comment on Danziger and van Rappard

    NARCIS (Netherlands)

    Dehue, Trudy

    1998-01-01

    Since the transition from finalism to contextualism, the history of science seems to be caught up in a basic dilemma. Many historians fear that with the new contextualist standards of rigorous historiography, historical research can no longer be relevant to working scientists themselves. The present

  4. Student’s rigorous mathematical thinking based on cognitive style

    Science.gov (United States)

    Fitriyani, H.; Khasanah, U.

    2017-12-01

    The purpose of this research was to determine the rigorous mathematical thinking (RMT) of mathematics education students in solving math problems in terms of reflective and impulsive cognitive styles. The research used a descriptive qualitative approach. The subjects were 4 students of the reflective and impulsive cognitive styles, each style represented by a male and a female subject. Data collection techniques were a problem-solving test and interviews. Analysis of the research data used the Miles and Huberman model: data reduction, data presentation, and conclusion drawing. The results showed that the impulsive male subject used all three levels of the cognitive functions required for RMT, namely qualitative thinking, quantitative thinking with precision, and relational thinking, while the other three subjects were only able to use cognitive functions at the qualitative thinking level of RMT. Therefore the impulsive male subject has a better RMT ability than the other three research subjects.

  5. The application of mean control chart in managing industrial processes

    Directory of Open Access Journals (Sweden)

    Papić-Blagojević Nataša

    2013-01-01

    Full Text Available Along with the advent of mass production comes the problem of monitoring and maintaining product quality, which has stressed the need for the application of selected statistical and mathematical methods in the control process. The main objective of applying statistical control methods is continuous quality improvement through permanent monitoring of the process in order to discover the causes of errors. Shewhart charts are the most popular method of statistical process control; they separate controlled from uncontrolled variation and detect increased variation. This paper presents an example of the Shewhart mean control chart with an application to managing an industrial process.
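
    A minimal X-bar (mean) chart computation is sketched below for subgroups of size five, using the standard A2 factor; the measurement data are made up.

        # X-bar control chart limits from subgroup means and ranges.
        import numpy as np

        A2 = 0.577                                   # Shewhart factor for n = 5
        data = np.array([[10.2, 10.0,  9.9, 10.1, 10.3],
                         [10.1,  9.8, 10.0, 10.2, 10.0],
                         [ 9.9, 10.1, 10.2, 10.0,  9.8],
                         [10.0, 10.3, 10.1,  9.9, 10.2]])  # one row per subgroup

        xbar = data.mean(axis=1)                     # subgroup means
        rbar = (data.max(axis=1) - data.min(axis=1)).mean()  # mean subgroup range
        center = xbar.mean()                         # centre line (grand mean)
        ucl, lcl = center + A2 * rbar, center - A2 * rbar    # control limits
        out_of_control = (xbar > ucl) | (xbar < lcl)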

  6. Lessons learned from a rigorous peer-review process for building the Climate Literacy and Energy Awareness (CLEAN) collection of high-quality digital teaching materials

    Science.gov (United States)

    Gold, A. U.; Ledley, T. S.; McCaffrey, M. S.; Buhr, S. M.; Manduca, C. A.; Niepold, F.; Fox, S.; Howell, C. D.; Lynds, S. E.

    2010-12-01

    The topic of climate change permeates all aspects of our society: the news, household debates, scientific conferences, etc. To provide students with accurate information about climate science and energy awareness, educators require scientifically and pedagogically robust teaching materials. To address this need, the NSF-funded Climate Literacy & Energy Awareness Network (CLEAN) Pathway has assembled a new peer-reviewed digital collection as part of the National Science Digital Library (NSDL) featuring teaching materials centered on climate and energy science for grades 6 through 16. The scope and framework of the collection are defined by the Essential Principles of Climate Science (CCSP 2009) and a set of energy awareness principles developed in the project. The collection provides trustworthy teaching materials on these socially relevant topics and prepares students to become responsible decision-makers. While a peer-review process is desirable for curriculum developers as well as collection builders to ensure quality, its implementation is non-trivial. We have designed a rigorous and transparent peer-review process for the CLEAN collection, and our experiences provide general guidelines that can be used to judge the quality of digital teaching materials across disciplines. Our multi-stage review process ensures that only resources with teaching goals relevant to developing climate literacy and energy awareness are considered. Each relevant resource is reviewed by two individuals to assess its i) scientific accuracy, ii) pedagogic effectiveness, and iii) usability/technical quality. A science review by an expert ensures the scientific quality and accuracy. Resources that pass all review steps are forwarded to a review panel of educators and scientists who make a final decision regarding inclusion of the materials in the CLEAN collection. Results from the first panel review show that about 20% (~100) of the resources that were initially considered for inclusion

  7. Process monitoring for intelligent manufacturing processes - Methodology and application to Robot Assisted Polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas

    Process monitoring provides important information on the product, process and manufacturing system during part manufacturing. Such information can be used for process optimization and detection of undesired processing conditions to initiate timely actions for avoidance of defects, thereby improving quality assurance. This thesis is aimed at a systematic development of process monitoring solutions, constituting a key element of intelligent manufacturing systems towards zero-defect manufacturing. A methodological approach of general applicability is presented in this concern. The approach consists of six consecutive steps: identification of product Vital Quality Characteristics (VQCs) and Key Process Variables (KPVs), selection and characterization of sensors, optimization of sensor placement, validation of the monitoring solutions, and definition of the reference manufacturing performance

  8. Newly Developed Software Application for Multiple Access Process Planning

    Directory of Open Access Journals (Sweden)

    Katarina Monkova

    2014-11-01

    Full Text Available The purchase of a complex system for computer-aided process planning (CAPP) can be expensive for small and some medium-sized plants, sometimes an inaccessible investment with a long payback period. Based on this fact and the authors' experience with Eastern European plants, they decided to design a new database application suitable for holding, processing and exploiting production, stock and economic data within the manufacturing process. The application can also be used to prepare a process plan according to selected criteria, and for technological documentation and NC program creation. It is based on the theory of a multivariant approach to computer-aided process plan generation. Its fundamental features, internal mathematical structure and new code system of processed objects were prepared by the authors. Verification of the designed information system in real practice has shown that it enables about 30% cost and production time reduction and decreases input material assortment variability.

  9. Assessment of very high temperature reactors in process applications

    International Nuclear Information System (INIS)

    Jones, J.E. Jr.; Spiewak, I.; Gambill, W.R.

    1976-01-01

    In April 1974, the United States Energy Research and Development Administration (ERDA) authorized General Atomic Company, General Electric Company, and Westinghouse Astronuclear Laboratory to assess the available technology for producing process heat utilizing a very high temperature nuclear reactor (VHTR). The VHTR is defined as a gas-cooled graphite-moderated reactor. Oak Ridge National Laboratory has been given a lead role in evaluating the VHTR reactor studies and potential applications of the VHTR. Process temperatures up to the 760 to 871 °C range appear to be achievable with near-term technology. The major development considerations are high temperature materials, the safety questions (especially regarding the need for an intermediate heat exchanger) and the process heat exchanger. The potential advantages of the VHTR over competing fossil energy sources are conservation of fossil fuels and reduced atmospheric impacts. Costs are developed for nuclear process heat supplied from a 3000-MW(th) VHTR. The range of cost in process applications is competitive with current fossil fuel alternatives.

  10. Image processing in radiology. Current applications

    International Nuclear Information System (INIS)

    Neri, E.; Caramella, D.; Bartolozzi, C.

    2008-01-01

    Few fields have witnessed such impressive advances as image processing in radiology. The progress achieved has revolutionized diagnosis and greatly facilitated treatment selection and accurate planning of procedures. This book, written by leading experts from many countries, provides a comprehensive and up-to-date description of how to use 2D and 3D processing tools in clinical radiology. The first section covers a wide range of technical aspects in an informative way. This is followed by the main section, in which the principal clinical applications are described and discussed in depth. To complete the picture, a third section focuses on various special topics. The book will be invaluable to radiologists of any subspecialty who work with CT and MRI and would like to exploit the advantages of image processing techniques. It also addresses the needs of radiographers who cooperate with clinical radiologists and should improve their ability to generate the appropriate 2D and 3D processing. (orig.)

  11. Applications of sonochemistry in Russian food processing industry.

    Science.gov (United States)

    Krasulya, Olga; Shestakov, Sergey; Bogush, Vladimir; Potoroko, Irina; Cherepanov, Pavel; Krasulya, Boris

    2014-11-01

    In the food industry, conventional methodologies such as grinding, mixing, and heat treatment are used for food processing and preservation. These processes have been well studied for many centuries and used in the conversion of raw food materials to consumable food products. This report is dedicated to the application of a cost-efficient method of energy transfer based on acoustic cavitation effects in food processing, which has had a significant impact on the development of the relatively new area of food sonochemistry. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Applicability and trends of anaerobic granular sludge treatment processes

    International Nuclear Information System (INIS)

    Lim, Seung Joo; Kim, Tak-Hyun

    2014-01-01

    Anaerobic granular sludge treatment processes have been continuously developed, although the anaerobic sludge granulation process is not clearly understood. In this review, the upflow anaerobic sludge blanket (UASB), the expanded granular sludge blanket (EGSB), and the static granular bed reactor (SGBR) are introduced as representative anaerobic granular sludge treatment processes. The characteristics and application trends of each reactor are presented. The UASB reactor was developed in the late 1970s and its use spread rapidly due to its excellent performance. With active granules, this reactor is able to treat various high-strength wastewaters as well as municipal wastewater. Most soluble industrial wastewaters can be treated efficiently using a UASB. The EGSB reactor was developed to provide more contact between the wastewater and the granules. Dispersed sludge is separated from mature granules by the rapid upward velocity in this reactor. The EGSB reactor shows excellent performance in treating low-strength and/or high-strength wastewater, especially at low temperatures. The SGBR, developed at Iowa State University, is another anaerobic granular sludge treatment process. Although the configuration of the SGBR is very simple, its performance is similar to that of the UASB or EGSB reactor. Anaerobic sludge granulation processes have shown excellent performance for various wastewaters over a broad range of organic loading rates in lab- and pilot-scale tests. This has led to the erection of thousands of full-scale granular processes, which are widely operated around the world. -- Highlights: • Anaerobic sludge granulation is a key parameter for maintaining granular processes. • Anaerobic granular digestion processes are applicable to various wastewaters. • The UASB is an economic high-rate anaerobic granular process. • The EGSB can treat high-strength wastewater using expanding granules. • The SGBR is

  13. Plasma spray technology process parameters and applications

    International Nuclear Information System (INIS)

    Sreekumar, K.P.; Karthikeyan, J.; Ananthapadmanabhan, P.V.; Venkatramani, N.; Chatterjee, U.K.

    1991-01-01

    The current trend in structural design philosophy is based on the use of a substrate with the necessary mechanical properties and a thin coating to provide the surface properties. The plasma spray process is a versatile surface coating technique which finds extensive application in advanced technologies. This report describes the plasma spray technique and its use in developing coatings for various applications. The spray system is described in detail, including the different variables such as power input to the torch, gas flow rate, powder properties, powder injection, etc., and their interrelation in deciding the quality of the coating. A brief write-up on the various plasma spray coatings developed for different applications is also included. (author). 15 refs., 15 figs., 2 tabs

  14. Linear circuits, systems and signal processing: theory and application

    International Nuclear Information System (INIS)

    Byrnes, C.I.; Saeks, R.E.; Martin, C.F.

    1988-01-01

    In part because of its universal role as a first approximation of more complicated behaviour and in part because of the depth and breadth of its principal paradigms, the study of linear systems continues to play a central role in control theory and its applications. Enhancing more traditional applications to aerospace and electronics, application areas such as econometrics, finance, and speech and signal processing have contributed to a renaissance in areas such as realization theory and classical automatic feedback control. Thus, the last few years have witnessed a remarkable research effort expended in understanding both new algorithms and new paradigms for the modeling and realization of linear processes and in the analysis and design of robust control strategies. The papers in this volume reflect these trends in both the theory and applications of linear systems and were selected from the invited and contributed papers presented at the 8th International Symposium on the Mathematical Theory of Networks and Systems held in Phoenix on June 15-19, 1987.

  15. A methodology for fault diagnosis in large chemical processes and an application to a multistage flash desalination process: Part I

    International Nuclear Information System (INIS)

    Tarifa, Enrique E.; Scenna, Nicolas J.

    1998-01-01

    This work presents a new strategy for fault diagnosis in large chemical processes (E.E. Tarifa, Fault diagnosis in complex chemical plants: plants of large dimensions and batch processes, Ph.D. thesis, Universidad Nacional del Litoral, Santa Fe, 1995). A special decomposition of the plant into sectors is made; afterwards each sector is studied independently. These steps are carried out off-line and produce vital information for the diagnosis system. The system works on-line and is based on a two-tier strategy. When a fault occurs, the upper level identifies the faulty sector. Then the lower level carries out an in-depth study that focuses only on the critical sectors to identify the fault. The loss of information produced by the process partition may cause spurious diagnoses. This problem is overcome at the second level using qualitative simulation and fuzzy logic. In the second part of this work, the new methodology is tested to evaluate its performance in practical cases. A multistage flash desalination system (MSF) is chosen because it is a complex system, with many recycles and variables to be supervised. The steps of knowledge base generation and all the blocks included in the diagnosis system are analyzed. Evaluation of the diagnosis performance is carried out using a rigorous dynamic simulator.

  16. A flexible software architecture for scalable real-time image and video processing applications

    Science.gov (United States)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2012-06-01

    Real-time image and video processing applications require skilled architects, and recent trends in the hardware platform make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility because they are normally oriented towards particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty for reuse and inefficient execution on multicore processors. This paper presents a novel software architecture for real-time image and video processing applications which addresses these issues. The architecture is divided into three layers: the platform abstraction layer, the messaging layer, and the application layer. The platform abstraction layer provides a high level application programming interface for the rest of the architecture. The messaging layer provides a message passing interface based on a dynamic publish/subscribe pattern. A topic-based filtering in which messages are published to topics is used to route the messages from the publishers to the subscribers interested in a particular type of messages. The application layer provides a repository for reusable application modules designed for real-time image and video processing applications. These modules, which include acquisition, visualization, communication, user interface and data processing modules, take advantage of the power of other well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, we present different prototypes and applications to show the possibilities of the proposed architecture.
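
    The topic-based publish/subscribe idea at the heart of the messaging layer can be sketched in a few lines; the class and topic names below are our own and not the paper's actual API.

        # Minimal topic-based publish/subscribe dispatcher.
        from collections import defaultdict
        from typing import Any, Callable

        class MessageBus:
            def __init__(self):
                self._subscribers = defaultdict(list)

            def subscribe(self, topic: str, handler: Callable[[Any], None]):
                self._subscribers[topic].append(handler)

            def publish(self, topic: str, message: Any):
                for handler in self._subscribers[topic]:  # topic-based filtering
                    handler(message)

        bus = MessageBus()
        bus.subscribe("frames/raw", lambda frame: print("processing", frame))
        bus.publish("frames/raw", {"id": 1, "data": b"..."})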

  17. Meat quality and rigor mortis development in broiler chickens with gas-induced anoxia and postmortem electrical stimulation.

    Science.gov (United States)

    Sams, A R; Dzuik, C S

    1999-10-01

    This study was conducted to evaluate the combined rigor-accelerating effects of postmortem electrical stimulation (ES) and argon-induced anoxia (Ar) in broiler chickens. One hundred broilers were processed in the following treatments: untreated controls, ES, Ar, or Ar with ES (Ar + ES). Breast fillets were harvested at 1 h postmortem for all treatments or at 1 and 6 h postmortem for the control carcasses. Fillets were sampled for pH and the ratio of inosine to adenosine (R-value) and were then individually quick frozen (IQF) or aged on ice (AOI) until 24 h postmortem. Color was measured in the AOI fillets at 24 h postmortem. All fillets were then cooked and evaluated for Allo-Kramer shear value. The Ar treatment accelerated the normal pH decline, whereas the ES and Ar + ES treatments yielded even lower pH values at 1 h postmortem. The Ar + ES treatment had a greater R-value than the ES treatment, which was greater than either the Ar or 1-h controls, which, in turn, were not different from each other. The ES treatment had the lowest L* value, and ES, Ar, and Ar + ES produced significantly higher a* values than the 1-h controls. For the IQF fillets, the ES and Ar + ES treatments were not different in shear value but were lower than Ar, which was lower than the 1-h controls. The same was true for the AOI fillets except that the ES and the Ar treatments were not different. These results indicated that although ES and Ar had rigor-accelerating and tenderizing effects, ES seemed to be more effective than Ar; there was little enhancement when Ar was added to the ES treatment and fillets were deboned at 1 h postmortem.

  18. Rigorous Training of Dogs Leads to High Accuracy in Human Scent Matching-To-Sample Performance.

    Directory of Open Access Journals (Sweden)

    Sophie Marchal

    Full Text Available Human scent identification is based on a matching-to-sample task in which trained dogs are required to compare a scent sample collected from an object found at a crime scene to that of a suspect. Based on dogs' superior olfactory ability to detect and process odours, this method has been used in forensic investigations to identify the odour of a suspect at a crime scene. The excellent reliability and reproducibility of the method depend largely on rigor in dog training. The present study describes the various steps of training that lead to high sensitivity scores, with dogs matching samples with 90% efficiency when the complexity of the scents presented in the sample is similar to that presented in the lineups, and specificity reaching a ceiling, with no false alarms in human scent matching-to-sample tasks. This high level of accuracy ensures reliable results in judicial human scent identification tests. Our data should also convince law enforcement authorities to use these results as official forensic evidence when dogs are trained appropriately.

  19. Incorporation of environmental impact criteria in the design and operation of chemical processes

    Directory of Open Access Journals (Sweden)

    P.E. Bauer

    2004-09-01

    Full Text Available Environmental impact assessment is becoming indispensable for the design and operation of chemical plants. Structured and consistent methods for this purpose have undergone rapid development. The more rigorous and sophisticated these methods become, the greater the demand for convenient tools. On the other hand, despite the remarkable advances in process simulators, some aspects have still not been sufficiently covered. To date, applications of these programs to quantify environmental impacts have been restricted to straightforward examples of steady-state processes. In this work, a life-cycle assessment implementation aimed at process design is described, with a brief discussion of dynamic simulation for the analysis of transient-state operations, such as process start-up. A case study shows the importance of this analysis in making operation possible at a high performance level with reduced risks to the environment.

  20. 18 CFR 1304.4 - Application review and approval process.

    Science.gov (United States)

    2010-04-01

    ... hearing is requested by the USACE pursuant to the TVA/Corps joint processing Memorandum of Understanding... the application as appropriate. (b) If a hearing is held for any of the reasons described in paragraph... contained in the hearing notice. (c) Hearings concerning approval of applications are conducted (in...

  1. Application of CAD/CAE/CAM in forging process: a review

    International Nuclear Information System (INIS)

    Ahmad Baharuddin Abdullah; Hamouda, A.M.S.

    2005-01-01

    Forging can be described as a process in which metal is plastically deformed by the application of great pressure. The process not only changes the shape but also improves the properties of the forged parts due to grain size refinement. Conventionally, an empirical trial-and-error method has been applied, but recently various tools have been employed to improve product quality and the economics of the process. For example, Computer Aided Design (CAD) is widely used in modeling the process, while Computer Aided Engineering (CAE) tools are utilized in analyzing it. To physically realize the process, Computer Aided Manufacturing (CAM), such as CNC machines, has been exploited. In order to improve forging process efficiency, an integrated system that combines the advantages of CAD, CAM and CAE needs to be developed. This paper presents an overview of the application of CAD, CAE and CAM in the forging process. (Author)

  2. A systematic process for persuasive mobile healthcare applications

    Science.gov (United States)

    Qasim, Mustafa Moosa; Ahmad, Mazida; Omar, Mazni; Zulkifli, Abdul Nasir; Bakar, Juliana Aida Abu

    2017-10-01

    In recent years there has been an increased focus on the persuasive design of mobile applications in the healthcare domain. However, most studies have not followed systematic processes when analyzing and designing persuasive technology applications, and they have also failed to provide some of the relevant information needed to design persuasive applications. Added to this is the need for more guidance on how persuasive guidelines can be implemented, which also means that there is a need for a way to transform persuasive components into software requirements and functionalities. Therefore, this paper proposes a general systematic process, usable independently of the problem domain, for analyzing the customers' significant requirements. One such domain is the obesity problem among Malaysian children, for which the most significant treatment factor is parents' involvement. To this end, this paper applies the systematic process to the monitoring of children's obesity status by parents.

  3. Design and qualification of a RTOS for safety class nuclear C and I applications

    International Nuclear Information System (INIS)

    Wakankar, A.; Khan, Arindam; Kalra, Mohit; Mitra, Raka; Aravamuthan, G.; Bhattacharjee, A.K.; Vaidya, U.W.; Mayya, Anuradha

    2014-01-01

    Real Time Operating System (RTOS) is a critical component of embedded systems. International standards such as IEC 60880, used for the development of Instrumentation and Control (I and C) systems in nuclear power plants, require rigorous qualification of all software components. In this paper, we describe our experience in the design and qualification of ESOS, an in-house RTOS configured from a commodity RTOS available with source code. The qualification activities included static and dynamic analysis, timing analysis and rigorous program analysis. We discuss how rigorous program analysis was used to uncover a subtle bug in the implementation. We also discuss the applications where this qualified RTOS has been successfully used. (author)

  4. Perceptions of teachers of the application of science process skills in ...

    African Journals Online (AJOL)

    This article reports on teachers' perceptions of the application of science process skills in the teaching of Geography in secondary schools in the Free State province. A teachers' questionnaire on the application of the science process skills in the teaching of Geography was constructed and the questionnaire was content ...

  5. Cost reductions on a titanium dioxide plant identified by a process integration study at Tioxide UK Ltd

    Energy Technology Data Exchange (ETDEWEB)

    1985-08-01

    The purpose of a process integration study is to determine the minimum practical amount of energy required to operate a process and to identify the most appropriate investment strategy which will realise the maximum energy cost savings consistent with a particular company's financial and operating criteria. The process integration method involves the rigorous application of thermodynamics and cost accounting, tempered by practical plant engineering and operability considerations. Tioxide UK Ltd is part of Tioxide Group plc and operates two UK sites for the production of titanium dioxide pigment. The site in question, Greatham works near Hartlepool, produces pigment via the chloride route. The energy costs at Greatham works can amount to £5-6 million/year depending on production levels. (author).

  6. FineLine™ welding process for power plant applications

    Energy Technology Data Exchange (ETDEWEB)

    Offer, H.; Chapman, T.; Grycko, L.; Mahoney, P. [GE Nuclear Energy, San Jose, CA (United States)

    1996-12-31

    This paper discusses the technical development and current applications of the FineLine™ Welding (FLW) process. FineLine Welding is a mechanized, modified Gas Tungsten-Arc Welding process recently developed by GE Nuclear Energy for a thin-wall piping application. Based on its unique combination of high thermal and volumetric efficiencies, the FLW process offers significant technical and productivity improvements over both standard orbital V-groove and narrow-gap welding procedures. The FLW process is suitable for many common structural materials, and has been successfully applied to unstabilized and stabilized grades of austenitic stainless steels, martensitic stainless steels, nickel-base (Inconel) alloys, carbon steels, as well as ferritic higher-alloy steels. (orig./MM)

  7. Rigorous Multicomponent Reactive Separations Modelling: Complete Consideration of Reaction-Diffusion Phenomena

    International Nuclear Information System (INIS)

    Ahmadi, A.; Meyer, M.; Rouzineau, D.; Prevost, M.; Alix, P.; Laloue, N.

    2010-01-01

    This paper gives the first step in the development of a rigorous multicomponent reactive separation model. Such a model is essential for the further optimization of acid gas removal plants (CO₂ capture, gas treating, etc.) in terms of size and energy consumption, since chemical solvents are conventionally used. Firstly, the two main modelling approaches are presented: the equilibrium-based and the rate-based approaches. Secondly, an extended rate-based model with a rigorous modelling methodology for diffusion-reaction phenomena is proposed. The film theory and the generalized Maxwell-Stefan equations are used in order to characterize multicomponent interactions. The complete chain of chemical reactions is taken into account. The reactions can be kinetically controlled or at chemical equilibrium, and they are considered for both the liquid film and the liquid bulk. Thirdly, the method of numerical resolution is described. Coupling the generalized Maxwell-Stefan equations with chemical equilibrium equations leads to a highly non-linear differential-algebraic equation system known as an index-3 DAE. The set of equations is discretized with finite differences, as its integration by the Gear method is complex. The resulting algebraic system is solved by the Newton-Raphson method. Finally, the present model and the associated methods of numerical resolution are validated on the example of the esterification of methanol. This archetypal non-electrolytic system permits an interesting analysis of the impact of reaction on mass transfer, especially near the phase interface. The numerical resolution of the model by the Newton-Raphson method gives good results in terms of calculation time and convergence. The simulations show that the impact on mass transfer of reactions at chemical equilibrium and of kinetically controlled reactions with fast kinetics is relatively similar. Moreover, Fick's law is less suited to multicomponent mixtures, where anomalies such as counter-diffusion can occur.
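
    The numerical strategy summarized above (finite-difference discretization followed by Newton-Raphson iteration on the resulting nonlinear algebraic system) can be illustrated with a minimal sketch. The toy problem below is a hypothetical diffusion film with a second-order reaction, not the paper's full Maxwell-Stefan model; only the solver structure carries over, and all parameters are placeholders.

```python
import numpy as np

def newton(F, x0, tol=1e-10, max_iter=50):
    """Newton-Raphson with a finite-difference Jacobian (sketch)."""
    x = x0.copy()
    for _ in range(max_iter):
        f = F(x)
        if np.linalg.norm(f, np.inf) < tol:
            return x
        n, eps = x.size, 1e-8
        J = np.empty((n, n))
        for j in range(n):           # dense FD Jacobian; fine for a demo
            xp = x.copy()
            xp[j] += eps
            J[:, j] = (F(xp) - f) / eps
        x = x - np.linalg.solve(J, f)
    raise RuntimeError("Newton-Raphson did not converge")

# Toy liquid-film model: c'' = Da * c^2 on [0, 1], with c(0) = 1 at the
# interface and c(1) = 0.1 in the bulk (hypothetical numbers).
N, Da = 51, 5.0
h = 1.0 / (N - 1)

def residual(c):
    F = np.empty_like(c)
    F[0], F[-1] = c[0] - 1.0, c[-1] - 0.1          # boundary conditions
    F[1:-1] = (c[2:] - 2.0 * c[1:-1] + c[:-2]) / h**2 - Da * c[1:-1]**2
    return F

profile = newton(residual, np.linspace(1.0, 0.1, N))
print(profile[:5])   # concentration profile near the interface
```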

  8. Cost reductions of fuel cells for transport applications: fuel processing options

    Energy Technology Data Exchange (ETDEWEB)

    Teagan, W P; Bentley, J; Barnett, B [Arthur D. Little, Inc., Cambridge, MA (United States)

    1998-03-15

    The highly favorable efficiency/environmental characteristics of fuel cell technologies have now been verified by virtue of recent and ongoing field experience. The key issue regarding the timing and extent of fuel cell commercialization is the ability to reduce costs to acceptable levels in both stationary and transport applications. It is increasingly recognized that the fuel processing subsystem can have a major impact on overall system costs, particularly as ongoing R and D efforts result in reduction of the basic cost structure of stacks, which currently dominate system costs. The fuel processing subsystem for polymer electrolyte membrane fuel cell (PEMFC) technology, which is the focus of transport applications, includes the reformer, shift reactors, and means for CO reduction. In addition to low cost, transport applications require a fuel processor that is compact and can start rapidly. This paper describes the impact of factors such as fuel choice, operating temperature, material selection, catalyst requirements, and controls on the cost of fuel processing systems. There are fuel processor technology paths which manufacturing cost analyses indicate are consistent with fuel processor subsystem costs of under $150/kW in stationary applications and $30/kW in transport applications. As such, the costs of mature fuel processing subsystem technologies should be consistent with their use in commercially viable fuel cell systems in both application categories. (orig.)

  9. Rigorous Quantum Field Theory A Festschrift for Jacques Bros

    CERN Document Server

    Monvel, Anne Boutet; Iagolnitzer, Daniel; Moschella, Ugo

    2007-01-01

    Jacques Bros has greatly advanced our present understanding of rigorous quantum field theory through numerous fundamental contributions. This book arose from an international symposium held in honour of Jacques Bros on the occasion of his 70th birthday, at the Department of Theoretical Physics of the CEA in Saclay, France. The impact of the work of Jacques Bros is evident in several articles in this book. Quantum fields are regarded as genuine mathematical objects, whose various properties and relevant physical interpretations must be studied in a well-defined mathematical framework. The key topics in this volume include analytic structures of Quantum Field Theory (QFT), renormalization group methods, gauge QFT, stability properties and extension of the axiomatic framework, QFT on models of curved spacetimes, QFT on noncommutative Minkowski spacetime. Contributors: D. Bahns, M. Bertola, R. Brunetti, D. Buchholz, A. Connes, F. Corbetta, S. Doplicher, M. Dubois-Violette, M. Dütsch, H. Epstein, C.J. Fewster, K....

  10. Quantum Computation-Based Image Representation, Processing Operations and Their Applications

    Directory of Open Access Journals (Sweden)

    Fei Yan

    2014-10-01

    Full Text Available A flexible representation of quantum images (FRQI) was proposed to facilitate the extension of classical (non-quantum-like) image processing applications to the quantum computing domain. The representation encodes a quantum image in the form of a normalized state, which captures information about colors and their corresponding positions in the images. Since its conception, a handful of processing transformations have been formulated, among which are the geometric transformations on quantum images (GTQI) and the CTQI, which focus on the color information of the images. In addition, extensions and applications of the FRQI representation, such as a multi-channel representation for quantum images (MCQI), quantum image data searching, watermarking strategies for quantum images, a framework to produce movies on quantum computers and a blueprint for quantum video encryption and decryption, have also been suggested. These proposals extend classical-like image and video processing applications to the quantum computing domain and offer a significant speed-up with low computational resources in comparison to performing the same tasks on traditional computing devices. Each of the algorithms and the mathematical foundations for their execution were simulated using classical computing resources, and their results were analyzed alongside other classical computing equivalents. The work presented in this review is intended to serve as a summary of the advances made in FRQI quantum image processing over the past five years and to stimulate further interest geared towards the realization of secure and efficient image and video processing applications on quantum computers.
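
    The FRQI encoding itself is compact enough to sketch classically. The snippet below builds the normalized FRQI state vector for a toy grayscale image; the interleaved qubit ordering and the mapping of intensities in [0, 1] to angles in [0, pi/2] are conventions chosen for this sketch.

```python
import numpy as np

def frqi_state(image):
    """FRQI state vector of a 2^n x 2^n grayscale image (sketch).

    Each pixel value in [0, 1] becomes an angle theta in [0, pi/2];
    the color qubit carries (cos theta, sin theta) for each position
    basis state |i>, and the whole vector is normalized by 1/2^n.
    """
    pixels = np.asarray(image, dtype=float).ravel()
    theta = pixels * np.pi / 2.0
    state = np.zeros(2 * pixels.size)
    state[0::2] = np.cos(theta)      # color-qubit |0> amplitudes
    state[1::2] = np.sin(theta)      # color-qubit |1> amplitudes
    return state / np.sqrt(pixels.size)

psi = frqi_state([[0.0, 0.5], [0.25, 1.0]])   # toy 2x2 image
print(np.dot(psi, psi))                       # ~1.0: normalized state
```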

  11. Guar gum: processing, properties and food applications-A Review.

    Science.gov (United States)

    Mudgil, Deepak; Barak, Sheweta; Khatkar, Bhupendar Singh

    2014-03-01

    Guar gum is a novel agrochemical processed from the endosperm of the cluster bean. It is largely used in the form of guar gum powder as an additive in the food, pharmaceutical, paper, textile, explosives, oil-well drilling and cosmetics industries. Industrial applications of guar gum are possible because of its ability to form hydrogen bonds with water molecules. Thus, it is chiefly used as a thickener and stabilizer. It is also beneficial in the control of many health problems like diabetes, bowel movements, heart disease and colon cancer. This article focuses on the production, processing, composition, properties, food applications and health benefits of guar gum.

  12. Solution-Processed Carbon Nanotube True Random Number Generator.

    Science.gov (United States)

    Gaviria Rojas, William A; McMorrow, Julian J; Geier, Michael L; Tang, Qianying; Kim, Chris H; Marks, Tobin J; Hersam, Mark C

    2017-08-09

    With the growing adoption of interconnected electronic devices in consumer and industrial applications, there is an increasing demand for robust security protocols when transmitting and receiving sensitive data. Toward this end, hardware true random number generators (TRNGs), commonly used to create encryption keys, offer significant advantages over software pseudorandom number generators. However, the vast network of devices and sensors envisioned for the "Internet of Things" will require small, low-cost, and mechanically flexible TRNGs with low computational complexity. These rigorous constraints position solution-processed semiconducting single-walled carbon nanotubes (SWCNTs) as leading candidates for next-generation security devices. Here, we demonstrate the first TRNG using static random access memory (SRAM) cells based on solution-processed SWCNTs that digitize thermal noise to generate random bits. This bit generation strategy can be readily implemented in hardware with minimal transistor and computational overhead, resulting in an output stream that passes standardized statistical tests for randomness. By using solution-processed semiconducting SWCNTs in a low-power, complementary architecture to achieve TRNG, we demonstrate a promising approach for improving the security of printable and flexible electronics.
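
    The bit-generation principle (digitizing thermal noise with an SRAM cell, then light post-processing) can be mimicked in a few lines of software. This is a toy model under assumed parameters, not the authors' circuit; von Neumann debiasing is shown as one standard low-overhead correction, which the paper does not necessarily employ.

```python
import random

def sram_powerup_bit(mismatch=0.2, noise_sigma=1.0):
    """Toy model of a metastable SRAM power-up: thermal noise plus a
    small device mismatch decide which way the cell resolves."""
    return 1 if random.gauss(mismatch, noise_sigma) > 0 else 0

def von_neumann(bits):
    """Classic debiasing: 01 -> 0, 10 -> 1, discard 00 and 11."""
    return [a for a, b in zip(bits[0::2], bits[1::2]) if a != b]

raw = [sram_powerup_bit() for _ in range(100000)]    # slightly biased
fair = von_neumann(raw)
print(sum(raw) / len(raw), sum(fair) / len(fair))    # bias before/after
```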

  13. Improving Knowledge and Process for International Emergency Medicine Fellowship Applicants: A Call for a Uniform Application

    Directory of Open Access Journals (Sweden)

    Gabrielle A. Jacquet

    2013-01-01

    Full Text Available Background. There are currently 34 International Emergency Medicine (IEM) fellowship programs. Applicants and programs are increasing in number and diversity. Without a standardized application, applicants have difficulty approaching programs in an informed and organized way; a streamlined application system is necessary. Objectives. To measure fellows' knowledge of their programs' curricula prior to starting fellowship and to determine what percentage of fellows and program directors would support a universal application system. Methods. A focus group of program directors and recent and current fellows convened to determine the most important features of an IEM fellowship application process. A survey was administered electronically to a convenience sample of 78 participants from 34 programs. Respondents included fellowship directors, fellows, and recent graduates. Results. Most fellows (70%) did not know their program's curriculum prior to starting fellowship. The majority of program directors and fellows support a uniform application service (81% and 67%, resp.) and deadline (85% for both). A minority of program directors (35%) and fellows (30%) support a formal match. Conclusions. Program directors and fellows support a uniform application service and deadline, but not a formalized match. Forums for disseminating IEM fellowship information and for administering a uniform application service and deadline are currently in development to improve the process.

  14. A new method for deriving rigorous results on ππ scattering

    International Nuclear Information System (INIS)

    Caprini, I.; Dita, P.

    1979-06-01

    We develop a new approach to the problem of constraining the ππ scattering amplitudes by means of the axiomatically proven properties of unitarity, analyticity and crossing symmetry. The method is based on the solution of an extremal problem on a convex set of analytic functions and provides a global description of the domain of values taken by any finite number of partial waves at an arbitrary set of unphysical energies, compatible with unitarity, the bounds at complex energies derived from generalized dispersion relations, and the crossing integral relations. From this domain we obtain new absolute bounds for the amplitudes as well as rigorous correlations between the values of various partial waves. (author)

  15. A method of network topology optimization design considering application process characteristic

    Science.gov (United States)

    Wang, Chunlin; Huang, Ning; Bai, Yanan; Zhang, Shuo

    2018-03-01

    Communication networks are designed to meet the usage requirements of users for various network applications. Current studies of network topology optimization design have mainly considered network traffic, which is the result of network application operation, rather than a design element of communication networks. A network application is a procedure for the usage of services by users with certain demanded performance requirements, and it has an obvious process characteristic. In this paper, we first propose a method to optimize the design of communication network topology considering the application process characteristic. Taking the minimum network delay as the objective, and the cost of network design and network connection reliability as constraints, an optimization model of network topology design is formulated, and the optimal solution of the network topology design is searched by a Genetic Algorithm (GA). Furthermore, we investigate the influence of network topology parameters on network delay under the background of multiple process-oriented applications, which can guide the generation of the initial population and thus improve the efficiency of the GA. Numerical simulations show the effectiveness and validity of our proposed method. Network topology optimization design considering applications can improve the reliability of applications, and provide guidance for network builders in the early stage of network design, which is of great significance in engineering practice.
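
    A minimal sketch of the optimization loop described above: a genetic algorithm over candidate edge sets, with average hop count standing in for network delay and an edge budget standing in for design cost. Network size, budget, and GA parameters are placeholders, and the paper's connection-reliability constraint is reduced here to a plain connectivity check.

```python
import itertools
import random

N_NODES, BUDGET = 8, 12
EDGES = list(itertools.combinations(range(N_NODES), 2))
INF = 10**9

def fitness(bits):
    """Average shortest-path hop count (delay proxy); heavy penalty for
    disconnected topologies or for exceeding the edge (cost) budget."""
    d = [[0 if i == j else INF for j in range(N_NODES)] for i in range(N_NODES)]
    for bit, (i, j) in zip(bits, EDGES):
        if bit:
            d[i][j] = d[j][i] = 1
    for k in range(N_NODES):                     # Floyd-Warshall
        for i in range(N_NODES):
            for j in range(N_NODES):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    disconnected = any(d[i][j] >= INF for i in range(N_NODES)
                       for j in range(N_NODES) if i != j)
    return 1e6 if disconnected or sum(bits) > BUDGET else (
        sum(map(sum, d)) / (N_NODES * (N_NODES - 1)))

def evolve(pop_size=60, generations=200, p_mut=0.02):
    pop = [[random.randint(0, 1) for _ in EDGES] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]            # elitist truncation
        children = parents[:]
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = [random.choice(g) for g in zip(a, b)]   # uniform crossover
            children.append([g ^ (random.random() < p_mut) for g in child])
        pop = children
    return min(pop, key=fitness)

best = evolve()
print(f"avg hops = {fitness(best):.2f}, edges used = {sum(best)}")
```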

  16. Applications of radiation processing: SRI experiences

    International Nuclear Information System (INIS)

    Rajput, Sanjay

    2014-01-01

    Shriram Applied Radiation Centre (SARC) is a part of the Shriram Institute for Industrial Research (SRI), and was established in 1986 in collaboration with the Bhabha Atomic Research Centre (BARC), the Board of Radiation and Isotope Technology (BRIT), the Department of Atomic Energy (DAE) and the Atomic Energy Regulatory Board (AERB). SARC was established with the objective of popularizing radiation processing technology for various applications. SARC is a fully automatic, computerized plant set up as per the design and norms of BRIT/AERB for round-the-clock fail-safe operation. The SARC irradiator has a capacity of 800 kCi of Cobalt-60 source and can process up to 10,000 cubic meters of material (0.1 g/cc) at the 25 kGy level

  17. Gas processing at DOE nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Jacox, J.

    1995-02-01

    The term "Gas Processing" has many possible meanings and understandings. In this paper, and panel, we will be using it to mean, in general, the treatment of gas by methods other than those common to HVAC and Nuclear Air Treatment. This is only a working guideline, not a rigorous definition. Whether a rigorous definition is desirable, or even possible, is a question for some other forum. Here we will be discussing the practical aspects of what "Gas Processing" includes and how existing Codes, Standards and industry experience can, and should, apply to DOE and NRC Licensed facilities. A major impediment to the use of the best engineering and technology in many nuclear facilities is the administrative mandate that only systems and equipment that meet specified "nuclear" documents are permissible. This paper will highlight some of the limitations created by this approach.

  18. Real analysis and applications

    CERN Document Server

    Botelho, Fabio Silva

    2018-01-01

    This textbook introduces readers to real analysis in one and n dimensions. It is divided into two parts: Part I explores real analysis in one variable, starting with key concepts such as the construction of the real number system, metric spaces, and real sequences and series. In turn, Part II addresses the multi-variable aspects of real analysis. Further, the book presents detailed, rigorous proofs of the implicit function theorem for the vectorial case by applying the Banach fixed-point theorem, and applies the differential forms concept to surfaces in Rn. It also provides a brief introduction to Riemannian geometry. With its rigorous, elegant proofs, this self-contained work is easy to read, making it suitable for undergraduate and beginning graduate students seeking a deeper understanding of real analysis and applications, and for all those looking for a well-founded, detailed approach to real analysis.

  19. Atmospheric plasma processes for environmental applications

    OpenAIRE

    Shapoval, Volodymyr

    2012-01-01

    Plasma chemistry is a rapidly growing field which covers applications ranging from the technological processing of materials, including biological tissues, to environmental remediation and energy production. The so-called atmospheric plasma, produced by electric corona or dielectric barrier discharges in a gas at atmospheric pressure, is particularly attractive for the low costs and ease of operation and maintenance involved. The high concentrations of energetic and chemically active species (e.g...

  20. Potential applications of helium-cooled high-temperature reactors to process heat use

    International Nuclear Information System (INIS)

    Gambill, W.R.; Kasten, P.R.

    1981-01-01

    High-Temperature Gas-Cooled Reactors (HTRs) permit nuclear energy to be applied to a number of processes presently utilizing fossil fuels. Promising applications of HTRs involve cogeneration, thermal energy transport using molten salt systems, steam reforming of methane for production of chemicals, coal and oil shale liquefaction or gasification, and - in the longer term - energy transport using a chemical heat pipe. Further, HTRs might be used in the more distant future as the energy source for thermochemical hydrogen production from water. Preliminary results of ongoing studies indicate that the potential market for Process Heat HTRs by the year 2020 is about 150 to 250 GW(t) for process heat/cogeneration application, plus approximately 150 to 300 GW(t) for application to fossil conversion processes. HTR cogeneration plants appear attractive in the near term for new industrial plants using large amounts of process heat, possibly for present industrial plants in conjunction with molten-salt energy distribution systems, and also for some fossil conversion processes. HTR reformer systems will take longer to develop, but are applicable to chemicals production, a larger number of fossil conversion processes, and to chemical heat pipes

  1. Searching Process with Raita Algorithm and its Application

    Science.gov (United States)

    Rahim, Robbi; Saleh Ahmar, Ansari; Abdullah, Dahlan; Hartama, Dedy; Napitupulu, Darmawan; Putera Utama Siahaan, Andysah; Hasan Siregar, Muhammad Noor; Nasution, Nurliana; Sundari, Siti; Sriadhi, S.

    2018-04-01

    Searching is a common task performed by many computer users, and the Raita algorithm is one algorithm that can be used to match and find information according to the patterns entered. The Raita algorithm was applied to a file search application using the Java programming language, and testing showed that the file search is fast, returns accurate results, and supports many data types.
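
    For reference, the matcher named above is a Boyer-Moore-Horspool variant that probes the pattern's last, first, and middle characters before verifying the remainder. A sketch follows (in Python; the cited application used Java):

```python
def raita_search(text, pattern):
    """Raita string matching: Horspool bad-character shifts, probing
    the last, first, and middle pattern characters before a full check."""
    n, m = len(text), len(pattern)
    if m == 0 or m > n:
        return []
    shift = {c: m - i - 1 for i, c in enumerate(pattern[:-1])}
    first, mid, last = pattern[0], pattern[m // 2], pattern[-1]
    hits, i = [], 0
    while i <= n - m:
        if (text[i + m - 1] == last and text[i] == first
                and text[i + m // 2] == mid
                and text[i:i + m] == pattern):
            hits.append(i)
        i += shift.get(text[i + m - 1], m)       # bad-character shift
    return hits

print(raita_search("process the application process", "process"))  # [0, 24]
```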

  2. Plasma technologies: applications to waste processing

    International Nuclear Information System (INIS)

    Fauchais, P.

    2007-01-01

    Since the 1990s, plasma technologies have found applications in the processing of toxic wastes of military and industrial origin, such as the treatment of contaminated solids and low-level radioactive wastes, the decontamination of soils, etc. Since the 2000s, this development has become exponential, in particular for the processing of municipal wastes and the recovery of their synthesis gas. The advantages of thermal plasmas with respect to conventional combustion techniques are: a high temperature (more than 6000 K), a pyrolysis capability (CO formation instead of CO₂), about 90% of the energy available above 1500 K (compared with 23% for flames), a greater energy density, lower gas flow rates, and plasma start-up and shut-down times of only a few tenths of a second. This article presents: 1 - the present-day situation of thermal plasma development; 2 - some general considerations about plasma waste processing; 3 - the plasma processes: liquid toxic wastes, solid wastes (contaminated soils and low-level radioactive wastes, military wastes, vitrification of incinerator fly ash, municipal waste processing, treatment of asbestos fibers, treatment of chlorinated industrial wastes), metallurgy wastes (dusts, aluminium slags), medical and ship wastes, perspectives; 4 - conclusion. (J.S.)

  3. The application and interviewing process for surgical house officership.

    Science.gov (United States)

    Rutkow, I M; Imbembo, A L; Zuidema, G D

    1979-02-01

    The application and interviewing procedure for surgical house officership is an important process to both the medical student and the clinical department. Up-to-date, informative, and honest appraisals of the training programs under evaluation must be obtained. A survey was undertaken to compare and contrast students' and surgical department members' perceptions of nationwide surgical residency application procedures. It is concluded from this sampling that the majority of medical students applying to university-sponsored surgical training programs and the training institutions themselves generally are satisfied with the present application and interviewing experience. Certain areas in need of reform were elucidated, and the following recommendations are offered to aid in the development of a more effective process: (1) if possible, the descriptive information brochure should be updated on a yearly basis and must be comprehensive in scope; (2) when "en masse" interviewing is conducted, it should be held on a number of dates during the year, not just one, and a limited time for "walk-in" interviews should be allowed; (3) an opportunity should be available for the spouse or fiancé(e) to accompany the applicant; (4) an interviewer should prepare for an interview by having read the applicant's file beforehand; and (5) the interviewing schedule should be arranged so that the applicant is able to meet either the department chairperson and/or program director.

  4. Dynamic Complexity Study of Nuclear Reactor and Process Heat Application Integration

    International Nuclear Information System (INIS)

    Taylor, J'Tia Patrice; Shropshire, David E.

    2009-01-01

    This paper describes the key obstacles and challenges facing the integration of nuclear reactors with process heat applications as they relate to dynamic issues. The paper also presents capabilities of current modeling and analysis tools available to investigate these issues. A pragmatic approach to an analysis is developed with the ultimate objective of improving the viability of nuclear energy as a heat source for process industries. The extension of nuclear energy to process heat industries would improve energy security and aid in reduction of carbon emissions by reducing demands for foreign derived fossil fuels. The paper begins with an overview of nuclear reactors and process application for potential use in an integrated system. Reactors are evaluated against specific characteristics that determine their compatibility with process applications such as heat outlet temperature. The reactor system categories include light water, heavy water, small to medium, near term high-temperature, and far term high temperature reactors. Low temperature process systems include desalination, district heating, and tar sands and shale oil recovery. High temperature processes that support hydrogen production include steam reforming, steam cracking, hydrogen production by electrolysis, and far-term applications such as the sulfur iodine chemical process and high-temperature electrolysis. A simple static matching between complementary systems is performed; however, to gain a true appreciation for system integration complexity, time dependent dynamic analysis is required. The paper identifies critical issues arising from dynamic complexity associated with integration of systems. Operational issues include scheduling conflicts and resource allocation for heat and electricity. Additionally, economic and safety considerations that could impact the successful integration of these systems are considered. Economic issues include the cost differential arising due to an integrated system
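
    The "simple static matching" step mentioned above amounts to filtering reactor systems by outlet temperature against each process's heat requirement. A sketch with placeholder temperatures (typical textbook ranges, not figures from the study):

```python
# Outlet / requirement temperatures in deg C; illustrative values only.
reactors = {"LWR": 320, "HWR": 310, "near-term HTR": 750, "far-term HTR": 950}
processes = {"desalination": 120, "district heating": 150,
             "steam reforming": 800, "HT electrolysis": 850}

def static_match(reactors, processes, margin=50):
    """Pair each process with reactors whose outlet temperature exceeds
    the process requirement by a chosen safety margin."""
    return {p: [r for r, t_out in reactors.items() if t_out >= t_req + margin]
            for p, t_req in processes.items()}

for process, candidates in static_match(reactors, processes).items():
    print(f"{process}: {candidates or 'no compatible reactor'}")
```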

  5. An Application of Business Process Management to Health Care Facilities.

    Science.gov (United States)

    Hassan, Mohsen M D

    The purpose of this article is to help health care facility managers and personnel identify significant elements of their facilities to address, and steps and actions to follow, when applying business process management to them. The ABPMP (Association of Business Process Management Professionals) life-cycle model of business process management is adopted, and steps from Lean, business process reengineering, and Six Sigma, together with actions from operations management, are presented to implement it. Managers of health care facilities can find in business process management an approach to improving their facilities that is more comprehensive than Lean, Six Sigma, business process reengineering, and ad hoc approaches, and one that does not conflict with them, because many of their elements can be included under its umbrella. Furthermore, the suggested application of business process management can guide managers and relieve them of the need to select among these approaches, as well as provide them with specific steps and actions that they can follow. This article fills a gap in the literature by presenting a much-needed comprehensive application of business process management to health care facilities, with specific steps and actions for implementation.

  6. Documenting the Engineering Design Process

    Science.gov (United States)

    Hollers, Brent

    2017-01-01

    Documentation of ideas and the engineering design process is a critical, daily component of a professional engineer's job. While patent protection is often cited as the primary rationale for documentation, it can also benefit the engineer, the team, company, and stakeholders through creating a more rigorously designed and purposeful solution.…

  7. Rigorous force field optimization principles based on statistical distance minimization

    Energy Technology Data Exchange (ETDEWEB)

    Vlcek, Lukas, E-mail: vlcekl1@ornl.gov [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States); Joint Institute for Computational Sciences, University of Tennessee, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6173 (United States); Chialvo, Ariel A. [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States)

    2015-10-14

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. We exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
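
    As a toy illustration of the idea (not the paper's implementation), the sketch below fits a single model parameter by minimizing a statistical distance, here the Bhattacharyya angle between sampled histograms of an observable; the "target" system and the model family are simple Gaussians.

```python
import numpy as np

def statistical_distance(p, q):
    """Bhattacharyya-angle distance between discrete distributions."""
    return np.arccos(np.clip(np.sum(np.sqrt(p * q)), 0.0, 1.0))

rng = np.random.default_rng(0)
edges = np.linspace(-4.0, 4.0, 41)

def histogram(samples):
    h, _ = np.histogram(samples, bins=edges)
    return h / h.sum()

target = histogram(rng.normal(0.0, 1.0, 100000))     # "experimental" data

def model(sigma):                                    # tunable model family
    return histogram(rng.normal(0.0, sigma, 100000))

sigmas = np.linspace(0.5, 2.0, 31)
best = min(sigmas, key=lambda s: statistical_distance(target, model(s)))
print(f"optimal sigma ~ {best:.2f}")                 # recovers ~1.0
```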

  8. 49 CFR 107.709 - Processing of an application for approval, including an application for renewal or modification.

    Science.gov (United States)

    2010-10-01

    ... TRANSPORTATION HAZARDOUS MATERIALS AND OIL TRANSPORTATION HAZARDOUS MATERIALS PROGRAM PROCEDURES Approvals... before the disposition of an application. (b) At any time during the processing of an application, the...

  9. 40 CFR 408.330 - Applicability; description of the abalone processing subcategory.

    Science.gov (United States)

    2010-07-01

    ... abalone processing subcategory. 408.330 Section 408.330 Protection of Environment ENVIRONMENTAL PROTECTION... CATEGORY Abalone Processing Subcategory § 408.330 Applicability; description of the abalone processing... abalone in the contiguous states. ...

  10. THE WAY OF PROCESSING DATA IN APPROACHING ECONOMIC APPLICATIONS

    Directory of Open Access Journals (Sweden)

    ADRIAN GHENCEA

    2012-05-01

    Full Text Available Economic informatics originates in industrial economics and the electronic processing of information. A clear distinction is made between IT and economic informatics, and further between general and particular economic informatics (particular economic informatics meaning administration, industrial informatics, etc.). Economic informatics is deemed to be an applied science concerned with the conception, operation and representation of IT and communication systems oriented towards companies that use electronic computers. This paper seeks to integrate applications, allowing information systems to interconnect at the informational level, through information sharing, and at the service level, by controlling the related processes in real time.

  11. Aspects Concerning the Optimization of Authentication Process for Distributed Applications

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2008-06-01

    Full Text Available The types of distributed applications are presented, and the quality characteristics of distributed applications are analyzed. The ways of assigning access rights are established, and the categories of authentication are analyzed. We propose an algorithm for optimizing the authentication process, and apply it to the application "Evaluation of TIC projects".

  12. 30 CFR 285.907 - How will MMS process my decommissioning application?

    Science.gov (United States)

    2010-07-01

    ... application? 285.907 Section 285.907 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR OFFSHORE RENEWABLE ENERGY ALTERNATE USES OF EXISTING FACILITIES ON THE OUTER CONTINENTAL SHELF Decommissioning Decommissioning Applications § 285.907 How will MMS process my decommissioning application? (a...

  13. A trajectory description of quantum processes. II. Applications. A Bohmian perspective

    Energy Technology Data Exchange (ETDEWEB)

    Sanz, Angel S.; Miret-Artes, Salvador [CSIC, Madrid (Spain). Inst. de Fisica Fundamental (IFF-CSIC)

    2014-07-01

    Presents a thorough introduction to, and treatment of, trajectory-based quantum-mechanical calculations. Useful for a wide range of scattering problems. Presents the applications of the trajectory description of basic quantum processes. Trajectory-based formalisms are an intuitively appealing way of describing quantum processes because they allow the use of "classical" concepts. Beginning at an introductory level suitable for students, this two-volume monograph presents (1) the fundamentals and (2) the applications of the trajectory description of basic quantum processes. This second volume is focussed on simple and basic applications of quantum processes such as interference and diffraction of wave packets, tunneling, diffusion and bound-state and scattering problems. The corresponding analysis is carried out within the Bohmian framework. By stressing its interpretational aspects, the book leads the reader to an alternative and complementary way to better understand the underlying quantum dynamics.

  14. [Near infrared spectroscopy based process trajectory technology and its application in monitoring and controlling of traditional Chinese medicine manufacturing process].

    Science.gov (United States)

    Li, Wen-Long; Qu, Hai-Bin

    2016-10-01

    In this paper, the principle of NIRS (near infrared spectroscopy)-based process trajectory technology is introduced. The main steps of the technique include: (1) in-line collection of process spectra for different techniques; (2) unfolding of the 3-D process spectra; (3) determination of the process trajectories and their normal limits; and (4) monitoring of new batches with the established MSPC (multivariate statistical process control) models. Applications of the technology to chemical and biological medicines are reviewed briefly. Through a comprehensive introduction of our feasibility research on monitoring traditional Chinese medicine manufacturing processes using NIRS-based multivariate process trajectories, several important problems of practical application that need urgent solutions are identified, and the application prospects of the NIRS-based process trajectory technology are discussed. Copyright© by the Chinese Pharmaceutical Association.
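
    Steps (2)-(4) can be sketched with standard tools: batch-wise unfolding of the three-way spectral array, a PCA model built from normal batches, and a Hotelling T-squared statistic for new batches. The array shape and the number of retained components below are assumptions for illustration only.

```python
import numpy as np

# Hypothetical reference data: (batches, time points, wavelengths).
rng = np.random.default_rng(1)
X = rng.normal(size=(20, 50, 200)) * 0.01 + np.linspace(0, 1, 50)[None, :, None]

B, T, W = X.shape
Xu = X.reshape(B, T * W)                 # batch-wise unfolding of 3-D spectra
mu, sd = Xu.mean(axis=0), Xu.std(axis=0) + 1e-12
Z = (Xu - mu) / sd                       # autoscaling

U, S, Vt = np.linalg.svd(Z, full_matrices=False)   # PCA via SVD
k = 3                                    # retained components (assumption)
lam = S[:k] ** 2 / (B - 1)               # variances of the PCA scores

def hotelling_t2(batch):
    """T^2 of a new batch against the reference trajectory model;
    in practice this is compared against an F-distribution control limit."""
    t = Vt[:k] @ ((batch.reshape(-1) - mu) / sd)
    return float(np.sum(t ** 2 / lam))

print(hotelling_t2(X[0]))                # an in-model batch: small T^2
```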

  15. Software Process Improvement Journey: IBM Australia Application Management Services

    Science.gov (United States)

    2005-03-01

    Carnegie Mellon Software Engineering Institute report: Software Process Improvement Journey: IBM Australia Application Management Services, by Robyn Nichols. Topics covered include Client Relationship Management (CRM) processes (specifically, Solution Design and Solution Delivery), Worldwide Project Management, complex systems life-cycle management, rapid solutions development, custom development, package selection and implementation, and maintenance.

  16. The Researchers' View of Scientific Rigor-Survey on the Conduct and Reporting of In Vivo Research.

    Science.gov (United States)

    Reichlin, Thomas S; Vogt, Lucile; Würbel, Hanno

    2016-01-01

    Reproducibility in animal research is alarmingly low, and a lack of scientific rigor has been proposed as a major cause. Systematic reviews found low reporting rates of measures against risks of bias (e.g., randomization, blinding), and a correlation between low reporting rates and overstated treatment effects. Reporting rates of measures against bias are thus used as a proxy measure for scientific rigor, and reporting guidelines (e.g., ARRIVE) have become a major weapon in the fight against risks of bias in animal research. Surprisingly, animal scientists have never been asked about their use of measures against risks of bias and how they report these in publications. Whether poor reporting reflects poor use of such measures, and whether reporting guidelines may effectively reduce risks of bias has therefore remained elusive. To address these questions, we asked in vivo researchers about their use and reporting of measures against risks of bias and examined how self-reports relate to reporting rates obtained through systematic reviews. An online survey was sent out to all registered in vivo researchers in Switzerland (N = 1891) and was complemented by personal interviews with five representative in vivo researchers to facilitate interpretation of the survey results. Return rate was 28% (N = 530), of which 302 participants (16%) returned fully completed questionnaires that were used for further analysis. According to the researchers' self-report, they use measures against risks of bias to a much greater extent than suggested by reporting rates obtained through systematic reviews. However, the researchers' self-reports are likely biased to some extent. Thus, although they claimed to be reporting measures against risks of bias at much lower rates than they claimed to be using these measures, the self-reported reporting rates were considerably higher than reporting rates found by systematic reviews. Furthermore, participants performed rather poorly when asked to

  17. 40 CFR 74.17 - Application requirements for process sources. [Reserved

    Science.gov (United States)

    2010-07-01

    40 CFR 74.17 (Protection of Environment, Environmental Protection Agency): Application requirements for process sources. [Reserved]

  18. Improving Business Processes : Does Anybody have an Idea?

    NARCIS (Netherlands)

    Vanwersch, Rob; Vanderfeesten, Irene; Rietzschel, Eric; Reijers, Hajo; Motahari-Nezhad, Hamid; Recker, Jan; Weidlich, Matthias

    2015-01-01

    As part of process redesign initiatives, substantial time is spent on the systematic description and analysis of the as-is process. By contrast, to-be scenarios are often generated in a less rigorous way. Only one or a few workshops are organized for this purpose, which rely on the use of techniques

  20. Rigorous covariance propagation of geoid errors to geodetic MDT estimates

    Science.gov (United States)

    Pail, R.; Albertella, A.; Fecher, T.; Savcenko, R.

    2012-04-01

    The mean dynamic topography (MDT) is defined as the difference between the mean sea surface (MSS) derived from satellite altimetry, averaged over several years, and the static geoid. Assuming geostrophic conditions, ocean surface velocities, an important component of global ocean circulation, can be derived from the MDT. Due to the availability of GOCE gravity field models, for the very first time the MDT can now be derived solely from satellite observations (altimetry and gravity) down to spatial length-scales of 100 km and even below. Global gravity field models, parameterized in terms of spherical harmonic coefficients, are complemented by the full variance-covariance matrix (VCM). Therefore, a realistic statistical error estimate is available for the geoid component, while the error description of the altimetric component is still an open issue and is, if addressed at all, treated empirically. In this study we attempt to perform, based on the full gravity VCM, rigorous error propagation to the derived geostrophic surface velocities, thus also considering all correlations. For the definition of the static geoid we use the third release of the time-wise GOCE model, as well as the satellite-only combination model GOCO03S. In detail, we investigate the velocity errors resulting from the geoid component as a function of harmonic degree, and the impact of using or neglecting covariances on the MDT errors and their correlations. When an MDT is derived, it is spectrally filtered to a certain maximum degree, usually driven by the signal content of the geoid model, by applying isotropic or non-isotropic filters. Since this filtering also acts on the geoid component, the consistent integration of this filter process into the covariance propagation is performed, and its impact is quantified. The study is performed for MDT estimates in specific test areas of particular oceanographic interest.
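
    The heart of rigorous propagation, as opposed to variance-only propagation, is keeping the off-diagonal terms of the VCM when mapping coefficient errors onto a derived quantity. A toy numpy sketch (random stand-in matrices, not GOCE data) showing how the two treatments can diverge:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for a (small) spherical-harmonic coefficient covariance
# matrix with correlated errors.
n = 100
A = rng.normal(size=(n, n)) / np.sqrt(n)
Sigma = A @ A.T + 0.1 * np.eye(n)

# After linearization, a derived quantity (e.g. a geostrophic velocity
# component) is a linear functional y = J x of the coefficients.
J = rng.normal(size=(1, n))

var_full = float(J @ Sigma @ J.T)                     # full covariance
var_diag = float(J @ np.diag(np.diag(Sigma)) @ J.T)   # correlations dropped
print(var_full, var_diag)   # the two estimates can differ substantially
```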

  1. Applicability of Effective Medium Approximations to Modelling of Mesocrystal Optical Properties

    Directory of Open Access Journals (Sweden)

    Oleksandr Zhuromskyy

    2016-12-01

    Full Text Available The rigorous superposition T-matrix method is used to compute light interaction with mesocrystalline structures. The results are used to validate the applicability of effective medium theories for computing the effective optical constants of mesocrystal structures composed of optically isotropic materials. It is demonstrated that the Maxwell-Garnett theory can fit the rigorous simulation results with an average accuracy of 2%. The refractive indices thus obtained can be used with any electromagnetic simulation software to represent the response of mesocrystals composed of optically small primary particles arranged into a cubic-type lattice structure.
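
    For spherical inclusions, the Maxwell-Garnett estimate being validated here has a closed form. A sketch with illustrative material parameters (not values from the study):

```python
import numpy as np

def maxwell_garnett(eps_incl, eps_host, f):
    """Maxwell-Garnett effective permittivity for spherical inclusions
    of permittivity eps_incl at volume fraction f in a host medium."""
    beta = (eps_incl - eps_host) / (eps_incl + 2.0 * eps_host)
    return eps_host * (1.0 + 2.0 * f * beta) / (1.0 - f * beta)

# Example: primary particles with refractive index 2.0 in vacuum, at
# 40% volume filling (placeholder numbers).
eps_eff = maxwell_garnett(2.0 ** 2, 1.0, 0.40)
print(np.sqrt(eps_eff))   # effective refractive index of the mesocrystal
```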

  2. Applications of Biopolymers Modified by Radiation Processing. Chapter 12

    Energy Technology Data Exchange (ETDEWEB)

    Tamada, M. [Takasaki Advanced Radiation Research Institute, Japan Atomic Energy Agency, Takasaki (Japan)

    2014-07-15

    Radiation processing using quantum beams such as electron beams and gamma rays is a clean process. Using this process, biopolymers with a low environmental burden were modified for agricultural and environmental applications. High-performance materials such as a soil conditioner for arid areas, spray-coating Washi (Japanese paper), a biodegradable dummy lens, a chemically-induced biodegradable plastic, a biodiesel catalyst, and a plant growth promoter were developed by radiation-induced crosslinking, graft polymerization, and degradation. (author)

  3. Processing of complex shapes with single-mode resonant frequency microwave applicators

    International Nuclear Information System (INIS)

    Fellows, L.A.; Delgado, R.; Hawley, M.C.

    1994-01-01

    Microwave processing is an alternative to conventional composite processing techniques. Single-mode microwave applicators efficiently couple microwave energy into the composite. The application of the microwave energy is greatly affected by the geometry of the composite. In the single-mode microwave applicator, two types of modes are available. These modes are best suited to processing flat planar samples or cylindrical samples with geometries that align with the electric fields. Mode-switching means alternating between different electromagnetic modes, with intelligent selection of the modes to alleviate undesirable temperature profiles. This method has improved the microwave heating profiles of materials with complex shapes that do not align with either type of electric field. Parts with two different complex geometries were fabricated from a vinyl toluene/vinyl ester resin with a continuous glass fiber reinforcement by autoclaving and by microwave techniques. The flexural properties of the microwave-processed samples were compared to the flexural properties of autoclaved samples. The trends in the mechanical properties for the complex shapes were consistent with the results of experiments with flat panels. This demonstrates that mode-switching techniques are as applicable to complex shapes as they are to the simpler flat-panel geometry

  4. Using declarative workflow languages to develop process-centric web applications

    NARCIS (Netherlands)

    Bernardi, M.L.; Cimitile, M.; Di Lucca, G.A.; Maggi, F.M.

    2012-01-01

    Nowadays, process-centric Web Applications (WAs) are extensively used in contexts where multi-user, coordinated work is required. Recently, Model Driven Engineering (MDE) techniques have been investigated for the development of this kind of application. However, there are still some open issues.

  5. Sequential Specification of Time-aware Stream Processing Applications (Extended Abstract)

    NARCIS (Netherlands)

    Geuns, S.J.; Hausmans, J.P.H.M.; Bekooij, Marco Jan Gerrit

    2012-01-01

    Automatic parallelization of Nested Loop Programs (NLPs) is an attractive method to create embedded real-time stream processing applications for multi-core systems. However, the description and parallelization of applications with a time dependent functional behavior has not been considered in NLPs.

  6. Real-time radar signal processing using GPGPU (general-purpose graphic processing unit)

    Science.gov (United States)

    Kong, Fanxing; Zhang, Yan Rockee; Cai, Jingxiao; Palmer, Robert D.

    2016-05-01

    This study introduces a practical approach to developing a real-time signal processing chain for a general phased array radar on NVIDIA GPUs (Graphics Processing Units) using CUDA (Compute Unified Device Architecture) libraries such as cuBLAS and cuFFT, which are adopted from open source libraries and optimized for NVIDIA GPUs. The processed results are rigorously verified against those from the CPUs. Performance, benchmarked as computation time for various input data cube sizes, is compared across GPUs and CPUs. Through the analysis, it is demonstrated that real-time GPGPU (General Purpose GPU) processing of the array radar data is possible with relatively low-cost commercial GPUs.
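
    A minimal sketch of the approach, with CuPy standing in for the hand-tuned cuBLAS/cuFFT chain described in the paper: a matched-filter (pulse compression) step run on the GPU and checked against a CPU reference, mirroring the paper's verification workflow. Array sizes and the chirp are placeholders.

```python
import numpy as np
import cupy as cp   # GPU arrays; FFTs are backed by cuFFT

def pulse_compress_gpu(echoes, chirp):
    """Matched filtering of radar echoes via FFT on the GPU.
    `echoes` is (pulses, samples); per-pulse FFTs run on cuFFT."""
    x = cp.asarray(echoes)
    h = cp.conj(cp.fft.fft(cp.asarray(chirp), n=x.shape[1]))
    y = cp.fft.ifft(cp.fft.fft(x, axis=1) * h[None, :], axis=1)
    return cp.asnumpy(y)          # copy back for verification

def pulse_compress_cpu(echoes, chirp):
    """CPU reference used to rigorously verify the GPU results."""
    h = np.conj(np.fft.fft(chirp, n=echoes.shape[1]))
    return np.fft.ifft(np.fft.fft(echoes, axis=1) * h[None, :], axis=1)

rng = np.random.default_rng(3)
echoes = rng.normal(size=(64, 4096)) + 1j * rng.normal(size=(64, 4096))
chirp = np.exp(1j * np.pi * 1e-4 * np.arange(256) ** 2)   # toy LFM pulse
print(np.allclose(pulse_compress_gpu(echoes, chirp),
                  pulse_compress_cpu(echoes, chirp), atol=1e-8))
```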

  7. Smoothing of Transport Plans with Fixed Marginals and Rigorous Semiclassical Limit of the Hohenberg-Kohn Functional

    Science.gov (United States)

    Cotar, Codina; Friesecke, Gero; Klüppelberg, Claudia

    2018-06-01

    We prove rigorously that the exact N-electron Hohenberg-Kohn density functional converges in the strongly interacting limit to the strictly correlated electrons (SCE) functional, and that the absolute value squared of the associated constrained-search wavefunction tends weakly, in the sense of probability measures, to a minimizer of the multi-marginal optimal transport problem with Coulomb cost associated to the SCE functional. This extends our previous work for N = 2 (Cotar et al. in Commun Pure Appl Math 66:548-599, 2013). The correct limit problem had been derived in the physics literature by Seidl (Phys Rev A 60:4387-4395, 1999) and Seidl, Gori-Giorgi and Savin (Phys Rev A 75:042511, 2007); in these papers the lack of a rigorous proof was pointed out. We also give a mathematical counterexample to this type of result, by replacing the constraint of given one-body density (an infinite-dimensional quadratic expression in the wavefunction) with an infinite-dimensional quadratic expression in the wavefunction and its gradient. Connections with the Lavrentiev phenomenon in the calculus of variations are indicated.
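
    Schematically, the limiting SCE functional is the Coulomb-cost multi-marginal optimal transport problem. The LaTeX below paraphrases the standard form used in this literature and is not a verbatim equation from the paper; here Π_N(ρ) denotes the symmetric N-point probability measures whose one-body marginals all equal ρ/N.

```latex
% Schematic form of the strongly-interacting limit (standard notation
% from the SCE literature, not copied from the paper itself):
\[
  V_{ee}^{\mathrm{SCE}}[\rho]
    \;=\; \min_{\gamma \in \Pi_N(\rho)} \int_{\mathbb{R}^{3N}}
      \sum_{1 \le i < j \le N} \frac{1}{|\mathbf{r}_i - \mathbf{r}_j|}
      \,\mathrm{d}\gamma(\mathbf{r}_1, \ldots, \mathbf{r}_N).
\]
```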

  8. Revisiting the constant growth angle: Estimation and verification via rigorous thermal modeling

    Science.gov (United States)

    Virozub, Alexander; Rasin, Igal G.; Brandon, Simon

    2008-12-01

    Methods for estimating growth angle (θ_gr) values, based on the a posteriori analysis of directionally solidified material (e.g. drops), often involve assumptions of negligible gravitational effects as well as a planar solid/liquid interface during solidification. We relax both of these assumptions when using experimental drop shapes from the literature to estimate the relevant growth angles at the initial stages of solidification. Assumed to be constant, we use these values as input into a rigorous heat transfer and solidification model of the growth process. This model, which is shown to reproduce the experimental shape of a solidified sessile water drop using the literature value of θ_gr = 0°, yields excellent agreement with experimental profiles using our estimated values for silicon (θ_gr = 10°) and germanium (θ_gr = 14.3°) solidifying on an isotropic crystalline surface. The effect of gravity on the solidified drop shape is found to be significant in the case of germanium, suggesting that gravity should either be included in the analysis or that care should be taken that the relevant Bond number is truly small enough in each measurement. The planar solidification interface assumption is found to be unjustified. Although this issue is important when simulating the inflection point in the profile of the solidified water drop, there are indications that solidified drop shapes (at least in the case of silicon) may be fairly insensitive to the shape of this interface.

  9. Applying advanced digital signal processing techniques in industrial radioisotopes applications

    International Nuclear Information System (INIS)

    Mahmoud, H.K.A.E.

    2012-01-01

    Radioisotopes can be used to obtain signals or images in order to recognize the information inside industrial systems. The main problems of using these techniques are the difficulty of identifying the obtained signals or images and the requirement of skilled experts for the interpretation process. At present, the interpretation of the output data from these applications is performed mainly manually, depending heavily on the skills and experience of trained operators. This process is time-consuming and the results typically suffer from inconsistency and errors. The objective of the thesis is to apply advanced digital signal processing techniques to improve the treatment and interpretation of the output data from different Industrial Radioisotopes Applications (IRA). This thesis focuses on two IRAs: the Residence Time Distribution (RTD) measurement and the defect inspection of welded pipes using a gamma source (gamma radiography). In the RTD measurement application, this thesis presents methods for signal pre-processing and modeling of the RTD signals. Simulation results are presented for two case studies. The first case study is a laboratory experiment for measuring the RTD in a water flow rig. The second case study is an experiment for measuring the RTD in a phosphate production unit. The thesis proposes an approach for RTD signal identification in the presence of noise. In this approach, after signal processing, the Mel Frequency Cepstral Coefficients (MFCCs) and polynomial coefficients are extracted from the processed signal or from one of its transforms. The Discrete Wavelet Transform (DWT), Discrete Cosine Transform (DCT), and Discrete Sine Transform (DST) have been tested and compared for efficient feature extraction. Neural networks have been used for matching of the extracted features. Furthermore, the Power Density Spectrum (PDS) of the RTD signal has been also used instead of the discrete
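
    A hedged sketch of the feature-extraction stage described above: leading DCT coefficients plus low-order polynomial-fit coefficients of a normalized RTD signal, assembled into a compact feature vector for a classifier. The tanks-in-series test curve and the feature counts are illustrative assumptions.

```python
import numpy as np
from scipy.fft import dct

def rtd_features(signal, n_dct=12, n_poly=3):
    """Compact feature vector: leading DCT coefficients plus low-order
    polynomial coefficients of the standardized RTD signal."""
    s = (signal - signal.mean()) / (signal.std() + 1e-12)
    dct_part = dct(s, norm="ortho")[:n_dct]
    t = np.linspace(0.0, 1.0, s.size)
    poly_part = np.polyfit(t, s, n_poly)
    return np.concatenate([dct_part, poly_part])

# Toy RTD curve: a tanks-in-series response with measurement noise.
t = np.linspace(0.0, 10.0, 500)
rtd = t ** 2 * np.exp(-1.5 * t)
rtd += 0.01 * np.random.default_rng(4).normal(size=t.size)
print(rtd_features(rtd).shape)   # (16,) features to feed a classifier
```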

  10. Visas for Switzerland and France - Time needed to process applications

    CERN Multimedia

    2012-01-01

    Please note that any person required to be in possession of a visa in order to take up functions at CERN must start the application process sufficiently early to allow the visa to be issued in time.   The submission of an incomplete application, local circumstances and an increase in applications before the summer holiday period can all result in considerable variation in the time needed to process your application and issue the visa. You are therefore recommended to submit your visa application at least three months, and not later than 21 days, prior to your departure date. We would also like to remind you that the Swiss Consulate in Paris and the French Consulate in Geneva can issue visas exclusively to people resident within their respective spheres of competence (i.e. those who are holders of a French or Swiss residence permit respectively). You must therefore obtain all visas required for stays longer than three months in France or Switzerland from the visa-issuing authority competent for ...

  11. Rapid deposition process for zinc oxide film applications in pyroelectric devices

    International Nuclear Information System (INIS)

    Hsiao, Chun-Ching; Yu, Shih-Yuan

    2012-01-01

    Aerosol deposition (AD) is a rapid process for the deposition of films. Zinc oxide is a low toxicity and environmentally friendly material, and it possesses properties such as semiconductivity, pyroelectricity and piezoelectricity without the poling process. Therefore, AD is used to accelerate the manufacturing process for applications of ZnO films in pyroelectric devices. Increasing the temperature variation rate in pyroelectric films is a useful method for enhancing the responsivity of pyroelectric devices. In the present study, a porous ZnO film possessing the properties of large heat absorption and high temperature variation rate is successfully produced by the AD rapid process and laser annealing for application in pyroelectric devices. (paper)

  12. Application of parallel processing for automatic inspection of printed circuits

    International Nuclear Information System (INIS)

    Lougheed, R.M.

    1986-01-01

    Automated visual inspection of printed electronic circuits is a challenging application for image processing systems. Detailed inspection requires high speed analysis of gray scale imagery along with high quality optics, lighting, and sensing equipment. A prototype system has been developed and demonstrated at the Environmental Research Institute of Michigan (ERIM) for inspection of multilayer thick-film circuits. The central problem of real-time image processing is solved by a special-purpose parallel processor which includes a new high-speed Cytocomputer. In this chapter the inspection process and the algorithms used are summarized, along with the functional requirements of the machine vision system. Next, the parallel processor is described in detail and then performance on this application is given
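
    The gray-scale analysis behind such inspection is typically morphological. The sketch below shows one standard operation (a top-hat built from gray-scale opening, using scipy here rather than the Cytocomputer hardware) that isolates bright features smaller than the structuring element:

```python
import numpy as np
from scipy import ndimage

# Hypothetical gray-scale image: a wide conductor trace plus a small
# bright spot defect (placeholder geometry and intensities).
rng = np.random.default_rng(5)
img = rng.normal(0.2, 0.02, (64, 64))
img[20:44, 28:40] += 0.6          # printed trace (wider than the SE)
img[10, 10] += 0.8                # one-pixel spot defect

# Opening removes bright features smaller than the structuring element;
# the top-hat (image minus opening) therefore isolates them.
opened = ndimage.grey_opening(img, size=(5, 5))
tophat = img - opened
print(np.argwhere(tophat > 0.4))  # -> approximately [[10, 10]]
```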

  13. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors influencing verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  14. Rigorous Numerics for ill-posed PDEs: Periodic Orbits in the Boussinesq Equation

    Science.gov (United States)

    Castelli, Roberto; Gameiro, Marcio; Lessard, Jean-Philippe

    2018-04-01

    In this paper, we develop computer-assisted techniques for the analysis of periodic orbits of ill-posed partial differential equations. As a case study, our proposed method is applied to the Boussinesq equation, which has been investigated extensively because of its role in the theory of shallow water waves. The idea is to use the symmetry of the solutions and a Newton-Kantorovich type argument (the radii polynomial approach) to obtain rigorous proofs of existence of the periodic orbits in a weighted ℓ1 Banach space of space-time Fourier coefficients with exponential decay. We present several computer-assisted proofs of the existence of periodic orbits at different parameter values.
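
    For orientation, the radii polynomial approach can be stated in its generic form (the notation below is the standard one for this method, not copied from the paper): given a numerical approximation x̄ of a zero of F and a Newton-like operator T(x) = x − AF(x), one computes a bound Y₀ ≥ ‖T(x̄) − x̄‖ and bounds Z₀, Z₁, Z₂ controlling ‖DT‖ near x̄, and forms the radii polynomial

```latex
p(r) \;=\; Z_2\, r^{2} \;-\; \bigl(1 - Z_0 - Z_1\bigr)\, r \;+\; Y_0 .
```

    If p(r) < 0 for some r > 0, then T is a contraction on the closed ball of radius r about x̄, so F has a unique zero within distance r of the numerical approximation; verifying the sign of p with interval arithmetic is what yields the computer-assisted proof.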

  15. Automatic dataflow model extraction from modal real-time stream processing applications

    NARCIS (Netherlands)

    Geuns, S.J.; Hausmans, J.P.H.M.; Bekooij, Marco Jan Gerrit

    2013-01-01

    Many real-time stream processing applications are initially described as a sequential application containing while-loops, which execute for an unknown number of iterations. These modal applications have to be executed in parallel on an MPSoC system in order to meet their real-time throughput

  16. Application of Safeguards-by-Design to a Reactor Design Process

    International Nuclear Information System (INIS)

    Whitlock, J.J.

    2010-01-01

    The application of 'Safeguards-by-Design' (SBD) to a reactor design process is described. The SBD concept seeks to improve the efficiency and effectiveness of IAEA safeguards by incorporating the needs of safeguards at an early stage of reactor design. Understanding and accommodating safeguards in the design process requires a set of 'design requirements for safeguards'; however, such requirements (a) do not traditionally exist, and (b) must exist alongside other more traditional design requirements based upon compliance and operational goals. In the absence of design requirements, a 'Design Guide' for safeguards was created, consisting of recommendations based on best practices. To acquire an understanding of safeguards requirements at the design level, a systematic accounting of diversion pathways was required. However, because of the crowded field of other design requirements, this process needed a methodology that was also flexible in interpretation. The GenIV Proliferation Resistance and Physical Protection (PR and PP) methodology (Rev.5, 2005) was chosen for this exercise. The PR and PP methodology is a general approach and therefore it was necessary to restrict its application; in effect, turning 'off' various options so as to simplify the process. The results of this exercise were used to stimulate discussions with the design team and initiate changes that accommodate safeguards without negatively impacting other design requirements. The process yielded insights into the effective application of SBD, and highlighted issues that must be resolved for effective incorporation of an 'SBD culture' within the design process. (author)

  17. Viewpoints on Medical Image Processing: From Science to Application

    Science.gov (United States)

    Deserno (né Lehmann), Thomas M.; Handels, Heinz; Maier-Hein (né Fritzsche), Klaus H.; Mersmann, Sven; Palm, Christoph; Tolxdorff, Thomas; Wagenknecht, Gudrun; Wittenberg, Thomas

    2013-01-01

    Medical image processing provides core innovation for medical imaging. This paper focuses on recent developments from science to application, analyzing the past fifteen years of the proceedings of the German annual meeting on medical image processing (BVM). Furthermore, some members of the program committee present their personal points of view: (i) multi-modality for imaging and diagnosis, (ii) analysis of diffusion-weighted imaging, (iii) model-based image analysis, (iv) registration of section images, (v) from images to information in digital endoscopy, and (vi) virtual reality and robotics. Medical imaging and medical image computing is seen as a field of rapid development with clear trends to integrated applications in diagnostics, treatment planning and treatment. PMID:24078804

  18. Chitosan as a bioactive polymer: Processing, properties and applications.

    Science.gov (United States)

    Muxika, A; Etxabide, A; Uranga, J; Guerrero, P; de la Caba, K

    2017-12-01

    Chitin is one of the most abundant natural polysaccharides in the world and it is mainly used for the production of chitosan by a deacetylation process. Chitosan is a bioactive polymer with a wide variety of applications due to its functional properties such as antibacterial activity, non-toxicity, ease of modification, and biodegradability. This review summarizes the most common chitosan processing methods and highlights some applications of chitosan in various industrial and biomedical fields. Finally, environmental concerns of chitosan-based films, considering the stages from raw materials extraction up to the end of life after disposal, are also discussed with the aim of finding more eco-friendly alternatives.

  19. Viewpoints on Medical Image Processing: From Science to Application.

    Science.gov (United States)

    Deserno Né Lehmann, Thomas M; Handels, Heinz; Maier-Hein Né Fritzsche, Klaus H; Mersmann, Sven; Palm, Christoph; Tolxdorff, Thomas; Wagenknecht, Gudrun; Wittenberg, Thomas

    2013-05-01

    Medical image processing provides core innovation for medical imaging. This paper focuses on recent developments from science to application, analyzing the past fifteen years of the proceedings of the German annual meeting on medical image processing (BVM). Furthermore, some members of the program committee present their personal points of view: (i) multi-modality for imaging and diagnosis, (ii) analysis of diffusion-weighted imaging, (iii) model-based image analysis, (iv) registration of section images, (v) from images to information in digital endoscopy, and (vi) virtual reality and robotics. Medical imaging and medical image computing is seen as a field of rapid development with clear trends to integrated applications in diagnostics, treatment planning and treatment.

  20. Dynamic Complexity Study of Nuclear Reactor and Process Heat Application Integration

    Energy Technology Data Exchange (ETDEWEB)

    J'Tia Patrice Taylor; David E. Shropshire

    2009-09-01

    This paper describes the key obstacles and challenges facing the integration of nuclear reactors with process heat applications as they relate to dynamic issues. The paper also presents capabilities of current modeling and analysis tools available to investigate these issues. A pragmatic approach to an analysis is developed with the ultimate objective of improving the viability of nuclear energy as a heat source for process industries. The extension of nuclear energy to process heat industries would improve energy security and aid in the reduction of carbon emissions by reducing demand for foreign-derived fossil fuels. The paper begins with an overview of nuclear reactors and process applications for potential use in an integrated system. Reactors are evaluated against specific characteristics that determine their compatibility with process applications, such as heat outlet temperature. The reactor system categories include light water, heavy water, small to medium, near-term high-temperature, and far-term high-temperature reactors. Low-temperature process systems include desalination, district heating, and tar sands and shale oil recovery. High-temperature processes that support hydrogen production include steam reforming, steam cracking, hydrogen production by electrolysis, and far-term applications such as the sulfur-iodine chemical process and high-temperature electrolysis. A simple static matching between complementary systems is performed; however, to gain a true appreciation of system integration complexity, time-dependent dynamic analysis is required. The paper identifies critical issues arising from the dynamic complexity associated with integration of systems. Operational issues include scheduling conflicts and resource allocation for heat and electricity. Additionally, economic and safety considerations that could impact the successful integration of these systems are considered. Economic issues include the cost differential arising due to an integrated

  1. Solar feasibility study for site-specific industrial-process-heat applications. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Murray, O.L.

    1980-03-18

    This study addresses the technical feasibility of solar energy in industrial process heat (IPH) applications in Mid-America. The study was one of two contracted efforts covering the MASEC 12-state region comprising Illinois, Michigan, North Dakota, Indiana, Minnesota, Ohio, Iowa, Missouri, South Dakota, Kansas, Nebraska, and Wisconsin. The results of the study are encouraging with regard to the potential future role of solar energy in supplying process heat to a varied range of industries and applications. Twenty feasible solar IPH applications covering eight major SIC groups within the Mid-American region were identified and documented as Case Studies. The geographical distribution of these applications across the existing range of solar insolation levels is shown, and the characteristics of the applications are summarized. The results of the study include process identification, analysis of process heat requirements, selection of preliminary solar system characteristics, and estimation of system performance and cost; these are included in each of the 20 Case Studies. The body of the report is divided into two primary discussion sections dealing with the study methodology employed and the follow-on potential of the identified applications with regard to possible demonstration projects. The 20 applications are rated with respect to their relative overall viability, and procedures for embarking on possible demonstration projects are discussed. A possible extension of the present feasibility study to late-comer industrial firms expressing interest also appears worthy of consideration.

  2. Introduction to computational mass transfer with applications to chemical engineering

    CERN Document Server

    Yu, Kuo-Tsong

    2014-01-01

    This book presents a new computational methodology called Computational Mass Transfer (CMT). It offers an approach to rigorously simulating the mass, heat and momentum transfer under turbulent flow conditions with the help of two newly published models, namely the C’2—εC’ model and the Reynolds mass flux model, especially with regard to predictions of concentration, temperature and velocity distributions in chemical and related processes. The book will also allow readers to understand the interfacial phenomena accompanying the mass transfer process and methods for modeling the interfacial effect, such as the influences of Marangoni convection and Rayleigh convection. The CMT methodology is demonstrated by means of its applications to typical separation and chemical reaction processes and equipment, including distillation, absorption, adsorption and chemical reactors. Professor Kuo-Tsong Yu is a Member of the Chinese Academy of Sciences. Dr. Xigang Yuan is a Professor at the School of Chemical Engine...

  3. Process error rates in general research applications to the Human ...

    African Journals Online (AJOL)

    Objective. To examine process error rates in applications for ethics clearance of health research. Methods. Minutes of 586 general research applications made to a human health research ethics committee (HREC) from April 2008 to March 2009 were examined. Rates of approval were calculated and reasons for requiring ...

  4. An examination of professional and ethical issues in the fellowship application process in pathology.

    Science.gov (United States)

    Domen, Ronald E; Wehler, Amanda Brehm

    2008-04-01

    Approximately 34 medical specialty and subspecialty fellowship programs in the United States have formalized the application process through the National Resident Matching Program. This approach sets standards for the application process, offers a formalized match similar to that for residency programs, functions within a specific timeline, and establishes binding rules of behavior for both applicants and programs. For fellowship programs that operate outside the National Resident Matching Program, such as those in pathology, no published guidelines exist to help programs and applicants address the many questions and problems that can arise. As a result, programs are free to set their own timelines for interviews, application requirements, contract negotiations and finalizations, and other details. Consequently, applicants often feel pressured to apply earlier and earlier in their residency for competitive fellowship programs, are often required to fill out multiple unique applications, may feel no "loyalty" toward honoring an acceptance without a contract, and often feel disenfranchised by the whole process. This article addresses professional and ethical aspects of the current application process and offers possible solutions for improving it.

  5. Some computer applications and digital image processing in nuclear medicine

    International Nuclear Information System (INIS)

    Lowinger, T.

    1981-01-01

    Methods of digital image processing are applied to problems in nuclear medicine imaging. The symmetry properties of central nervous system lesions are exploited in an attempt to determine the three-dimensional radioisotope density distribution within the lesions. An algorithm developed by astronomers at the end of the 19th century to determine the distribution of matter in globular clusters is applied to tumors. This algorithm permits the emission-computed-tomographic reconstruction of spherical lesions from a single view. The three-dimensional radioisotope distribution derived by the application of the algorithm can be used to characterize the lesions. The applicability to nuclear medicine images of ten edge detection methods in general use in digital image processing was evaluated. A general model of image formation by scintillation cameras is developed. The model assumes that objects to be imaged are composed of a finite set of points. The validity of the model has been verified by its ability to duplicate experimental results. Practical applications of this work involve quantitative assessment of the distribution of radiopharmaceuticals under clinical situations and the study of image processing algorithms

  6. Licence renewal in the United States - enhancing the process through lessons learned

    International Nuclear Information System (INIS)

    Walters, D.J.

    2000-01-01

    The Nuclear Energy Institute (NEI) is the Washington-based policy organisation representing the broad and varied interests of the diverse nuclear energy industry. It comprises nearly 300 corporate members in 15 countries, with a budget last year of about USD 26.5 million. It has been working for 10 years with the Nuclear Regulatory Commission (NRC), colleagues in the industry and others to demonstrate that license renewal is a safe and workable process. The first renewed license was issued on 24 March to BGE for the Calvert Cliffs plant. One month later the NRC issued the renewed license for the Oconee plant. By 'enhancing the process through lessons learned', we mean reducing the uncertainty in the license renewal process. This is achieved through lessons learned from the next wave of applicants and the reviews of the Calvert Cliffs and Oconee applications. Three areas are covered: - Incentive for minimising uncertainty, as industry interest in license renewal is growing dramatically. - Rigorous reviews by the Nuclear Regulatory Commission assure continued safety: the process put in place by the NRC to assure safety throughout the license renewal term, specifically areas where the lessons learned suggest improvements can be made. - Lessons learned have identified enhancements to the process: numerous benefits associated with renewal of nuclear power plant licenses for consumers of electricity, the environment, the nuclear operating companies and the nation. (author)

  7. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    Science.gov (United States)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
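
    To make the idea concrete, the following is a minimal sketch of the process-capability indices at the heart of such an analysis, assuming normally distributed, in-control measurements; the function name, specification limits and data are illustrative, not from the case study.

```python
import numpy as np

def process_capability(samples, lsl, usl):
    """Return the Cp and Cpk indices for a set of in-control measurements
    against lower (lsl) and upper (usl) specification limits."""
    mu = np.mean(samples)
    sigma = np.std(samples, ddof=1)                 # sample standard deviation
    cp = (usl - lsl) / (6.0 * sigma)                # potential capability
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)   # actual (centered) capability
    return cp, cpk

# Example: hypothetical bondline-thickness measurements against assumed limits.
rng = np.random.default_rng(0)
measurements = rng.normal(loc=5.0, scale=0.1, size=50)
cp, cpk = process_capability(measurements, lsl=4.6, usl=5.4)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")  # values above ~1.33 are a common benchmark
```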

  8. Application of Bf-EVALPSN to Real-time Process Order Control

    International Nuclear Information System (INIS)

    Nakamatsu, Kazumi; Akama, Seiki; Abe, Jair M.

    2009-01-01

    We have already proposed a paraconsistent annotated logic program called EVALPSN. In this paper, EVALPSN is developed to deal with before-after relations between two processes (time intervals), and its application to real-time process order control based on logical safety verification is presented.

  9. Plasmid fermentation process for DNA immunization applications.

    Science.gov (United States)

    Carnes, Aaron E; Williams, James A

    2014-01-01

    Plasmid DNA for immunization applications must be of the highest purity and quality. The ability of downstream purification to efficiently produce a pure final product is directly influenced by the performance of the upstream fermentation process. While several clinical manufacturing facilities already have validated fermentation processes in place to manufacture plasmid DNA for use in humans, a simple and inexpensive laboratory-scale fermentation process can be valuable for in-house production of plasmid DNA for use in animal efficacy studies. This chapter describes a simple fed-batch fermentation process for producing bacterial cell paste enriched with high-quality plasmid DNA. A constant feeding strategy results in a medium cell density culture with continuously increasing plasmid amplification towards the end of the process. Cell banking and seed culture preparation protocols, which can dramatically influence final product yield and quality, are also described. These protocols are suitable for production of research-grade plasmid DNA at the 100 mg-to-1.5 g scale from a typical 10 L laboratory benchtop fermentor.

  10. Application of the wavelet transform for speech processing

    Science.gov (United States)

    Maes, Stephane

    1994-01-01

    Speaker identification and word spotting will shortly play a key role in space applications. An approach based on the wavelet transform is presented that, in the context of the 'modulation model,' enables extraction of speech features which are used as input for the classification process.
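
    To illustrate the kind of feature extraction involved, the sketch below computes sub-band log-energies from a wavelet decomposition of a speech frame. It assumes the PyWavelets package; the wavelet choice ('db4'), decomposition depth and feature definition are illustrative assumptions, not taken from the paper.

```python
import numpy as np
import pywt

def wavelet_band_energies(frame, wavelet='db4', level=4):
    """Decompose one speech frame and return the log-energy of each
    sub-band, a simple feature vector for speaker-ID or word-spotting."""
    coeffs = pywt.wavedec(frame, wavelet, level=level)
    return np.array([np.log(np.sum(c ** 2) + 1e-12) for c in coeffs])

# Usage: split the speech signal into short frames and stack the vectors.
frame = np.random.randn(512)           # stand-in for a real ~32 ms frame
features = wavelet_band_energies(frame)
```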

  11. 10 CFR 451.9 - Procedures for processing applications.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Procedures for processing applications. 451.9 Section 451.9 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION RENEWABLE ENERGY PRODUCTION INCENTIVES § 451.9... or operators of qualified renewable energy facilities using solar, wind, ocean, geothermal, and...

  12. Standards and Methodological Rigor in Pulmonary Arterial Hypertension Preclinical and Translational Research.

    Science.gov (United States)

    Provencher, Steeve; Archer, Stephen L; Ramirez, F Daniel; Hibbert, Benjamin; Paulin, Roxane; Boucherat, Olivier; Lacasse, Yves; Bonnet, Sébastien

    2018-03-30

    Despite advances in our understanding of the pathophysiology and the management of pulmonary arterial hypertension (PAH), significant therapeutic gaps remain for this devastating disease. Yet, few innovative therapies beyond the traditional pathways of endothelial dysfunction have reached clinical trial phases in PAH. Although there are inherent limitations of the currently available models of PAH, the leaky pipeline of innovative therapies relates, in part, to flawed preclinical research methodology, including lack of rigor in trial design, incomplete invasive hemodynamic assessment, and lack of careful translational studies that replicate randomized controlled trials in humans with attention to adverse effects and benefits. Rigorous methodology should include the use of prespecified eligibility criteria, sample sizes that permit valid statistical analysis, randomization, blinded assessment of standardized outcomes, and transparent reporting of results. Better design and implementation of preclinical studies can minimize inherent flaws in the models of PAH, reduce the risk of bias, and enhance external validity and our ability to distinguish truly promising therapies from many false-positive or overstated leads. Ideally, preclinical studies should use advanced imaging, study several preclinical pulmonary hypertension models, or correlate rodent and human findings and consider the fate of the right ventricle, which is the major determinant of prognosis in human PAH. Although these principles are widely endorsed, empirical evidence suggests that such rigor is often lacking in pulmonary hypertension preclinical research. The present article discusses the pitfalls in the design of preclinical pulmonary hypertension trials and discusses opportunities to create preclinical trials with improved predictive value in guiding early-phase drug development in patients with PAH, which will need support not only from researchers, peer reviewers, and editors but also from

  13. Rigorous numerical study of strong microwave photon-magnon coupling in all-dielectric magnetic multilayers

    Energy Technology Data Exchange (ETDEWEB)

    Maksymov, Ivan S., E-mail: ivan.maksymov@uwa.edu.au [School of Physics M013, The University of Western Australia, 35 Stirling Highway, Crawley, WA 6009 (Australia); ARC Centre of Excellence for Nanoscale BioPhotonics, School of Applied Sciences, RMIT University, Melbourne, VIC 3001 (Australia); Hutomo, Jessica; Nam, Donghee; Kostylev, Mikhail [School of Physics M013, The University of Western Australia, 35 Stirling Highway, Crawley, WA 6009 (Australia)

    2015-05-21

    We demonstrate theoretically a ∼350-fold local enhancement of the intensity of the in-plane microwave magnetic field in multilayered structures made from a magneto-insulating yttrium iron garnet (YIG) layer sandwiched between two non-magnetic layers with a high dielectric constant matching that of YIG. The enhancement is predicted for the excitation regime in which the microwave magnetic field is induced inside the multilayer by the transducer of a stripline Broadband Ferromagnetic Resonance (BFMR) setup. By means of a rigorous numerical solution of the Landau-Lifshitz-Gilbert equation consistently with Maxwell's equations, we investigate the magnetisation dynamics in the multilayer. We reveal a strong photon-magnon coupling, which manifests itself as anti-crossing of the ferromagnetic resonance magnon mode supported by the YIG layer and the electromagnetic resonance mode supported by the whole multilayered structure. The frequency of the magnon mode depends on the external static magnetic field, which in our case is applied tangentially to the multilayer in the direction perpendicular to the microwave magnetic field induced by the stripline of the BFMR setup. The frequency of the electromagnetic mode is independent of the static magnetic field. Consequently, the predicted photon-magnon coupling is sensitive to the applied magnetic field and can thus be used in magnetically tuneable metamaterials based on simultaneously negative permittivity and permeability achievable thanks to the YIG layer. We also suggest that the predicted photon-magnon coupling may find applications in microwave quantum information systems.
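
    For reference, the Landau-Lifshitz-Gilbert equation solved in such studies has the standard Gilbert form (symbols as conventionally defined; this is the textbook statement rather than a quotation from the paper):

```latex
\frac{\partial \mathbf{M}}{\partial t}
  \;=\; -\gamma\, \mathbf{M}\times\mathbf{H}_{\mathrm{eff}}
  \;+\; \frac{\alpha}{M_s}\, \mathbf{M}\times\frac{\partial \mathbf{M}}{\partial t},
```

    where γ is the gyromagnetic ratio, α the Gilbert damping constant, M_s the saturation magnetisation, and H_eff the effective field, which in this setting includes the microwave field obtained self-consistently from Maxwell's equations.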

  14. Application of pervaporation to IS process (Joint research)

    International Nuclear Information System (INIS)

    Kanagawa, Akihiro; Fukui, Hiroshi; Nishibayashi, Toshiki; Iwatsuki, Jin; Tanaka, Nobuyuki; Onuki, Kaoru

    2007-12-01

    Separation of hydrogen iodide from HIx solution (HI–I₂–H₂O mixture) is one of the technical issues in the development of the thermochemical IS process. Application of pervaporation (PV) to the concentration of HIx solution in the IS process pilot test plant was discussed from the viewpoints of process heat and mass balance, conceptual design of the apparatus, and the corrosion resistance of the membrane module. Compared with the electro-electrodialysis system, the PV system enables downsizing of the apparatus by using hollow fiber membranes, although it does not improve the thermal efficiency of the IS process. Immersion tests of a commercially available Nafion hollow fiber membrane module in the HIx solution at 100°C indicated the necessity of improving the corrosion resistance of the bundle materials. (author)

  15. Towards Process Support for Migrating Applications to Cloud Computing

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2012-01-01

    Cloud computing is an active area of research for industry and academia. There are a large number of organizations providing cloud computing infrastructure and services. In order to utilize these infrastructure resources and services, existing applications need to be migrated to clouds. However, … for supporting migration to cloud computing based on our experiences from migrating an Open Source System (OSS), Hackystat, to two different cloud computing platforms. We explain the process by performing a comparative analysis of our efforts to migrate Hackystat to Amazon Web Services and Google App Engine. We also report the potential challenges, suitable solutions, and lessons learned to support the presented process framework. We expect that the reported experiences can serve as guidelines for those who intend to migrate software applications to cloud computing.

  16. Low-Temperature Solution Processable Electrodes for Piezoelectric Sensors Applications

    Science.gov (United States)

    Tuukkanen, Sampo; Julin, Tuomas; Rantanen, Ville; Zakrzewski, Mari; Moilanen, Pasi; Lupo, Donald

    2013-05-01

    Piezoelectric thin-film sensors are suitable for a wide range of applications from physiological measurements to industrial monitoring systems. The use of flexible materials in combination with high-throughput printing technologies enables cost-effective manufacturing of custom-designed, highly integratable piezoelectric sensors. This type of sensor can, for instance, improve industrial process control or enable the embedding of ubiquitous sensors in our living environment to improve quality of life. Here, we discuss the benefits, challenges and potential applications of piezoelectric thin-film sensors. The piezoelectric sensor elements are fabricated by printing electrodes on both sides of unmetallized poly(vinylidene fluoride) film. We show that materials which are solution-processable at low temperatures, biocompatible and environmentally friendly are suitable for use as electrode materials in piezoelectric sensors.

  17. 24 CFR 55.11 - Applicability of subpart C decision making process.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Applicability of subpart C decision making process. 55.11 Section 55.11 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development FLOODPLAIN MANAGEMENT Application of Executive Order on Floodplain Management § 55.11 Applicability of subpart C...

  18. Licensing process for future applications of advanced-design nuclear reactors

    International Nuclear Information System (INIS)

    Miller, C.L.

    1990-01-01

    The existing 10CFR50 two-step licensing process in the Code of Federal Regulations can continue to be a viable licensing vehicle for future applications, at least for the near future. The US Nuclear Regulatory Commission (NRC) Commissioners and staff, the public, and the utilities (along with supporting architect/engineers and nuclear steam supply system vendors) have a vast body of experience and knowledge of the existing Part 50 licensing process. All these participants are familiar with their respective roles in this process, and history shows it to be a workable licensing vehicle. Nevertheless, the use of 10CFR52 should be encouraged for future applications. This proposed new rule is intended to achieve the early resolution of licensing issues, to reduce the complexity and uncertainty of the licensing process, and to enhance the safety and reliability of nuclear power plants. Part 52's overall purpose is to improve reactor safety and streamline the licensing process by encouraging the use of standard reactor designs and by allowing the early resolution of site environmental and reactor safety issues. The public should be afforded an earlier entry into the licensing process as a result of the design certification rulemaking process and combined construction permit/operating license hearings

  19. Analysing the Logic and Rigor in the Process of Verification of HACCP Plan%论验证HACCP计划过程中的逻辑性和严密性

    Institute of Scientific and Technical Information of China (English)

    秦红

    2013-01-01

    The HACCP system is a systematic, preventive food safety control system to which export food processing enterprises are paying increasing attention; it is widely used in many enterprises and its quality-improvement effect is evident. With the continuing development of HACCP, more and more enterprises are applying for HACCP verification or certification, whether or not domestic or foreign authorities require it. Enterprises currently draw up their HACCP plans largely following the model given in the HACCP training curriculum compiled by the US National Seafood HACCP Alliance for Training and Education, and the requirements of this model are very strict. This paper analyses the logic and rigor of the verification of HACCP plans under this model, with the aim of helping audit personnel improve the HACCP plan verification process and ensure the effective implementation of HACCP plans.

  20. Is Collaborative, Community-Engaged Scholarship More Rigorous than Traditional Scholarship? On Advocacy, Bias, and Social Science Research

    Science.gov (United States)

    Warren, Mark R.; Calderón, José; Kupscznk, Luke Aubry; Squires, Gregory; Su, Celina

    2018-01-01

    Contrary to the charge that advocacy-oriented research cannot meet social science research standards because it is inherently biased, the authors of this article argue that collaborative, community-engaged scholarship (CCES) must meet high standards of rigor if it is to be useful to support equity-oriented, social justice agendas. In fact, they…

  1. Text mining from ontology learning to automated text processing applications

    CERN Document Server

    Biemann, Chris

    2014-01-01

    This book comprises a set of articles that specify the methodology of text mining, describe the creation of lexical resources in the framework of text mining and use text mining for various tasks in natural language processing (NLP). The analysis of large amounts of textual data is a prerequisite for building lexical resources such as dictionaries and ontologies, and also has direct applications in automated text processing in fields such as history, healthcare and mobile applications, just to name a few. This volume gives an update in terms of the recent gains in text mining methods and reflects

  2. Guidelines for conducting rigorous health care psychosocial cross-cultural/language qualitative research.

    Science.gov (United States)

    Arriaza, Pablo; Nedjat-Haiem, Frances; Lee, Hee Yun; Martin, Shadi S

    2015-01-01

    The purpose of this article is to synthesize and chronicle the authors' experiences as four bilingual and bicultural researchers, each experienced in conducting cross-cultural/cross-language qualitative research. Through narrative descriptions of experiences with Latinos, Iranians, and Hmong refugees, the authors discuss their rewards, challenges, and methods of enhancing rigor, trustworthiness, and transparency when conducting cross-cultural/cross-language research. The authors discuss and explore how to effectively manage cross-cultural qualitative data, how to effectively use interpreters and translators, how to identify best methods of transcribing data, and the role of creating strong community relationships. The authors provide guidelines for health care professionals to consider when engaging in cross-cultural qualitative research.

  3. Treatment of waters before use. Processes and applications

    International Nuclear Information System (INIS)

    Mouchet, P.

    2006-01-01

    Some industrial processes require water that is free of suspended particulates and stable in several respects: no post-precipitation, no interference with storage and distribution equipment (corrosion or fouling), no development of bacteria, algae or other organisms (no chemical nutrients), etc. The water preparation process used will differ depending on the origin of the water (surface or underground). This article first describes the different types of treatment according to the origin of the water and the quality required (clear and stable water, drinkable water, specific complementary processes, different processing schemes). In a second part, the applications of these processes in selected industries are presented (beverage, food, textile, paper, steel-making, aerospace and automotive, petroleum, power plants, ultra-pure waters); in particular, the preparation of demineralized water for nuclear power plants is described. (J.S.)

  4. Reconsideration of the sequence of rigor mortis through postmortem changes in adenosine nucleotides and lactic acid in different rat muscles.

    Science.gov (United States)

    Kobayashi, M; Takatori, T; Iwadate, K; Nakajima, M

    1996-10-25

    We examined the changes in adenosine triphosphate (ATP), lactic acid, adenosine diphosphate (ADP) and adenosine monophosphate (AMP) in five different rat muscles after death. Rigor mortis has been thought to begin simultaneously in all muscles after death and hence to be completed sooner in small muscles than in large ones. In this study we found that the rate of decrease in ATP differed significantly between muscles, with the greatest drop observed in the masseter muscle. These findings contradict the conventional theory of rigor mortis. Similarly, the rates of change of ADP and lactic acid, which are thought to be related to the consumption or production of ATP, differed between muscles. However, the rate of change of AMP was the same in each muscle.

  5. 47 CFR 1.572 - Processing TV broadcast and translator station applications.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Processing TV broadcast and translator station applications. 1.572 Section 1.572 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND... and translator station applications. See § 73.3572. ...

  6. 47 CFR 1.573 - Processing FM broadcast and translator station applications.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Processing FM broadcast and translator station applications. 1.573 Section 1.573 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND... and translator station applications. See § 73.3573. ...

  7. Generic process model structures: towards a standard notation for abstract representations

    CSIR Research Space (South Africa)

    Van Der Merwe, A

    2007-10-01

    Full Text Available in the case of objects, or repositories in the case of process models. The creation of the MIT Process Handbook was a step in this direction. However, although the authors used object-oriented concepts in the abstract representations, they did not rigorously...

  8. Optimization algorithms and applications

    CERN Document Server

    Arora, Rajesh Kumar

    2015-01-01

    Choose the correct solution method for your optimization problem. Optimization: Algorithms and Applications presents a variety of solution techniques for optimization problems, emphasizing concepts rather than rigorous mathematical details and proofs. The book covers both gradient and stochastic methods as solution techniques for unconstrained and constrained optimization problems. It discusses the conjugate gradient method, the Broyden-Fletcher-Goldfarb-Shanno algorithm, the Powell method, penalty functions, the augmented Lagrange multiplier method, sequential quadratic programming, and the method of feasible directions
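
    As a taste of one listed technique, the quadratic penalty method replaces a constrained problem with a sequence of unconstrained ones solved for increasing penalty weights. The toy problem and penalty schedule below are illustrative assumptions, not taken from the book.

```python
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2   # objective
g = lambda x: x[0] + x[1] - 3.0                       # equality constraint g(x) = 0

x = np.zeros(2)
for mu in [1.0, 10.0, 100.0, 1000.0]:                 # increasing penalty weight
    penalized = lambda x, mu=mu: f(x) + mu * g(x) ** 2
    x = minimize(penalized, x, method='BFGS').x       # unconstrained subproblem
print(x)  # approaches the constrained optimum (0.75, 2.25) as mu grows
```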

  9. Safety applications of computer based systems for the process industry

    International Nuclear Information System (INIS)

    Bologna, Sandro; Picciolo, Giovanni; Taylor, Robert

    1997-11-01

    Computer based systems, generally referred to as Programmable Electronic Systems (PESs), are being increasingly used in the process industry, including to perform safety functions. The process industry, as understood in this document, includes, but is not limited to, chemicals, oil and gas production, oil refining and power generation. Starting in the early 1970s, the wide application possibilities and the related development problems of such systems were recognized. Since then, many guidelines and standards have been developed to direct and regulate the application of computers to perform safety functions (EWICS-TC7, IEC, ISA). Lessons learnt in the last twenty years can be summarised as follows: safety is a cultural issue; safety is a management issue; safety is an engineering issue. In particular, safety systems can only be properly addressed in the overall system context. No single method can be considered sufficient to achieve the safety features required in many safety applications. A good safety engineering approach has to address not only hardware and software problems in isolation but also their interfaces and man-machine interface problems. Finally, the economic and industrial aspects of safety applications and the development of PESs in process plants are discussed throughout the report. The scope of the report is to contribute to the development of an adequate awareness of these problems and to illustrate the technical solutions applied or being developed

  10. Modelling and application of stochastic processes

    CERN Document Server

    1986-01-01

    The subject of modelling and application of stochastic processes is too vast to be exhausted in a single volume. In this book, attention is focused on a small subset of this vast subject. The primary emphasis is on realization and approximation of stochastic systems. Recently there has been considerable interest in the stochastic realization problem, and hence, an attempt has been made here to collect in one place some of the more recent approaches and algorithms for solving the stochastic realization problem. Various different approaches for realizing linear minimum-phase systems, linear nonminimum-phase systems, and bilinear systems are presented. These approaches range from time-domain methods to spectral-domain methods. An overview of the chapter contents briefly describes these approaches. Also, in most of these chapters special attention is given to the problem of developing numerically efficient algorithms for obtaining reduced-order (approximate) stochastic realizations. On the application side,...

  11. Applications and fabrication processes of superconducting composite materials

    International Nuclear Information System (INIS)

    Gregory, E.

    1984-01-01

    This paper discusses the most recent applications and manufacturing considerations in the field of superconductivity. The constantly changing requirements of a growing number of users encourage development in fabrication and inspection techniques. For the first time, superconductors are being used commercially in large numbers and superconducting magnets are no longer just laboratory size. Although current demand for these conductors represents relatively small quantities of material, advances in the production of high-quality composites may accelerate technological growth into several new markets. Three large-scale application areas for superconductors are discussed: accelerator magnets for high-energy physics research, magnetic confinement for thermonuclear fusion, and magnetic resonance imaging for health care. Each application described is accompanied by a brief description of the conductors used and fabrication processes employed to make them

  12. Applications of evolutionary computation in image processing and pattern recognition

    CERN Document Server

    Cuevas, Erik; Perez-Cisneros, Marco

    2016-01-01

    This book presents the use of efficient Evolutionary Computation (EC) algorithms for solving diverse real-world image processing and pattern recognition problems. It provides an overview of the different aspects of evolutionary methods in order to enable the reader to reach a global understanding of the field and to conduct studies on specific evolutionary techniques that are related to applications in image processing and pattern recognition. It explains the basic ideas of the proposed applications in a way that can also be understood by readers outside of the field. Image processing and pattern recognition practitioners who are not evolutionary computation researchers will appreciate the discussed techniques beyond simple theoretical tools, since they have been adapted to solve significant problems that commonly arise in such areas. On the other hand, members of the evolutionary computation community can learn the way in which image processing and pattern recognition problems can be translated into an...

  13. Application of the JDL data fusion process model for cyber security

    Science.gov (United States)

    Giacobe, Nicklaus A.

    2010-04-01

    A number of cyber security technologies have proposed the use of data fusion to enhance the defensive capabilities of the network and aid in the development of situational awareness for the security analyst. While there have been advances in fusion technologies and the application of fusion in intrusion detection systems (IDSs), in particular, additional progress can be made by gaining a better understanding of a variety of data fusion processes and applying them to the cyber security application domain. This research explores the underlying processes identified in the Joint Directors of Laboratories (JDL) data fusion process model and further describes them in a cyber security context.
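
    As background, the JDL model decomposes fusion into numbered processing levels; the sketch below pairs each level with an illustrative cyber-security analogue. The level names follow common revisions of the JDL model, while the cyber examples are this sketch's assumptions, not quotations from the paper.

```python
# Illustrative mapping of JDL fusion levels to cyber-security analogues.
JDL_CYBER = {
    0: ("Source preprocessing", "normalize raw packets, logs and IDS alerts"),
    1: ("Object assessment", "correlate alerts into host/attacker entities"),
    2: ("Situation assessment", "relate entities: who is attacking what, and where"),
    3: ("Impact assessment", "estimate mission impact of ongoing intrusions"),
    4: ("Process refinement", "re-tune sensors and collection priorities"),
}

for level, (name, cyber_example) in JDL_CYBER.items():
    print(f"Level {level}: {name} -> {cyber_example}")
```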

  14. Useful, Used, and Peer Approved: The Importance of Rigor and Accessibility in Postsecondary Research and Evaluation. WISCAPE Viewpoints

    Science.gov (United States)

    Vaade, Elizabeth; McCready, Bo

    2012-01-01

    Traditionally, researchers, policymakers, and practitioners have perceived a tension between rigor and accessibility in quantitative research and evaluation in postsecondary education. However, this study indicates that both producers and consumers of these studies value high-quality work and clear findings that can reach multiple audiences. The…

  15. Improved rigorous upper bounds for transport due to passive advection described by simple models of bounded systems

    International Nuclear Information System (INIS)

    Kim, Chang-Bae; Krommes, J.A.

    1988-08-01

    The work of Krommes and Smith on rigorous upper bounds for the turbulent transport of a passively advected scalar [Ann. Phys. 177:246 (1987)] is extended in two directions: (1) For their 'reference model,' improved upper bounds are obtained by utilizing more sophisticated two-time constraints which include the effects of cross-correlations up to fourth order. Numerical solutions of the model stochastic differential equation are also obtained; they show that the new bounds compare quite favorably with the exact results, even at large Reynolds and Kubo numbers. (2) The theory is extended to take account of a finite spatial autocorrelation length L_c. As a reasonably generic example, the problem of particle transport due to statistically specified stochastic magnetic fields in a collisionless turbulent plasma is revisited. A bound is obtained which reduces for small L_c to the quasilinear limit and for large L_c to the strong turbulence limit, and which provides a reasonable and rigorous interpolation for intermediate values of L_c. 18 refs., 6 figs

  16. An introduction to stochastic processes with applications to biology

    CERN Document Server

    Allen, Linda J S

    2010-01-01

    An Introduction to Stochastic Processes with Applications to Biology, Second Edition presents the basic theory of stochastic processes necessary in understanding and applying stochastic methods to biological problems in areas such as population growth and extinction, drug kinetics, two-species competition and predation, the spread of epidemics, and the genetics of inbreeding. Because of their rich structure, the text focuses on discrete and continuous time Markov chains and continuous time and state Markov processes.New to the Second EditionA new chapter on stochastic differential equations th
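
    As a flavor of the material, a birth-death Markov chain of the kind used for population growth and extinction can be simulated with the Gillespie algorithm. This is a minimal sketch with illustrative rates and initial population, not an excerpt from the book.

```python
import numpy as np

def gillespie_birth_death(n0=20, birth=0.1, death=0.12, t_max=100.0, seed=1):
    """Simulate a continuous-time birth-death chain until extinction or t_max."""
    rng = np.random.default_rng(seed)
    t, n = 0.0, n0
    times, sizes = [t], [n]
    while t < t_max and n > 0:
        total_rate = (birth + death) * n
        t += rng.exponential(1.0 / total_rate)                 # time to next event
        n += 1 if rng.random() < birth / (birth + death) else -1  # birth or death
        times.append(t)
        sizes.append(n)
    return np.array(times), np.array(sizes)

times, sizes = gillespie_birth_death()
print("extinct" if sizes[-1] == 0 else f"size {sizes[-1]} at t = {times[-1]:.1f}")
```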

  17. Application of information and communication technology in process reengineering

    Directory of Open Access Journals (Sweden)

    Đurović Aleksandar M.

    2014-01-01

    Full Text Available This paper examines the role of information and communication technologies in process reengineering. A general analysis of a process shows that information and communication technologies improve its efficiency. A reengineering model based on the BPMN 2.0 standard is applied to the process by which students of the Faculty of Transport and Traffic Engineering seek internships/jobs. After defining the technical characteristics and required functionalities, a web/mobile application is proposed, giving traffic engineers better visibility to companies seeking that education profile.

  18. Ethnic and gender differences in applicants' decision-making processes: An application of the theory of reasoned action

    NARCIS (Netherlands)

    van Hooft, E.A.J.; Born, M.Ph.; Taris, T.W.; van der Flier, H.

    2006-01-01

    Although a growing proportion of the new entrants into the workforce consist of women and ethnic minorities, relatively little is known about the recruitment and job choice processes of these applicant groups. Therefore, this study investigated cultural and gender differences in job application

  19. Application of hydrometallurgy techniques in quartz processing and purification: a review

    Science.gov (United States)

    Lin, Min; Lei, Shaomin; Pei, Zhenyu; Liu, Yuanyuan; Xia, Zhangjie; Xie, Feixiang

    2018-04-01

    Although there have been numerous studies on the separation and purification of metallic minerals by hydrometallurgy techniques, applications of these chemical techniques to the separation and purification of non-metallic minerals are rarely reported. This paper reviews disparate areas of study into the processing and purification of quartz (a typical non-metallic ore) in an attempt to summarize current work, as well as to suggest potential for future consolidation in the field. The review encompasses the current status, progress, leaching mechanisms, scopes of application, and advantages and drawbacks of the chemical techniques of quartz processing: micro-bioleaching, high-temperature leaching, high-temperature pressure leaching and catalyzed high-temperature pressure leaching. Traditional leaching techniques, including micro-bioleaching and high-temperature leaching, cannot meet the modern glass industry's demand for quartz concentrate quality, because the quartz product has to be processed further. High-temperature pressure leaching and catalyzed high-temperature pressure leaching provide new ways to produce high-grade quartz sand in a single process with lower acid consumption. Furthermore, catalyzed high-temperature pressure leaching achieves effective purification of quartz with extremely low acid consumption (without using HF or any fluoride). It is proposed that integrating the different chemical processes of quartz processing and expounding their leaching mechanisms and scopes of application would benefit the research field, which currently operates as a monopolized industry.

  20. Review process for license renewal applications

    International Nuclear Information System (INIS)

    Craig, John W.; Kuo, P.T.

    1991-01-01

    In preparation for license renewal reviews, the Nuclear Regulatory Commission has recently published for public review and comment a proposed rule for license renewal and a draft Standard Review Plan, as well as a draft Regulatory Guide relating to the implementation of the proposed rule. In support of future license renewal applications, the nuclear industry has also submitted 11 industry reports for NRC review and approval. This paper briefly describes how these parallel regulatory and industry activities will be factored into the NRC review process for license renewal. (author)