WorldWideScience

Sample records for active length verification

  1. Inspector measurement verification activities

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  2. Application of gel dosimetry - A preliminary study on verification of uniformity of activity and length of source used in Beta-Cath system

    The intraluminal irradiation of coronary arteries following balloon angioplasty has recently been found to reduce proliferation of smooth muscle cells and restenosis. Among the isotopes used for intracoronary irradiation, 90Sr/Y appears to be ideal (H I Almos et al, 1996). In 1984, Gore et al proposed that radiation-induced changes in the well-established Fricke solution could be probed with Nuclear Magnetic Resonance (NMR) relaxation measurements rather than with conventional spectrophotometry measurements. This was a major step in the development of gel dosimetry, and since then gel dosimetry has been one of the major advances in the dosimetry of complex radiation fields. In this preliminary work on gel dosimetry we present the verification of the uniformity of activity along the length of the source train and the verification of the length of the source used in the Beta-Cath system for intracoronary brachytherapy with a ferrous gel dosimeter. The Beta-Cath system obtained from Novoste, Norcross, GA was used in this study. It consists of a source train of 16 90Sr/Y sources, each of length 2.5 mm; the total length of the source train is 40 mm. For preparation of the ferrous-gelatin gel, the recipe provided by the London Regional Cancer Center, London, Ontario, Canada was used. Stock solutions of 50 mM H2SO4, 0.3 mM ferrous ammonium sulphate and 0.05 mM xylenol orange were first prepared. The gel was prepared by mixing 4% gelatin with distilled water while stirring in a water bath at 40-42 deg. C. The acid solution, ferrous ammonium sulphate solution and xylenol orange were added and stirred in the water bath for about an hour to allow aeration. The mixture was poured into three 20 ml syringes to form the gel and stored in the refrigerator at 5 deg. C. For irradiation with Beta-Cath, the gel was prepared in three cylindrical 20 ml syringes. A nylon tube having the same dimension as that of the delivery catheter used in intra-coronary was placed at
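
    As an aside, the reagent masses implied by the quoted stock concentrations can be checked with a few lines of arithmetic. In the Python sketch below, the molar masses are assumptions (hexahydrate ferrous ammonium sulphate, a nominal value for xylenol orange) and should be checked against the actual reagents, which the abstract does not specify:

```python
# Reagent masses implied by the stock concentrations quoted above.
# Molar masses are assumptions; verify against the reagents actually used.
MOLAR_MASS = {                      # g/mol (assumed)
    "H2SO4": 98.08,
    "Fe(NH4)2(SO4)2.6H2O": 392.14,  # ferrous ammonium sulphate hexahydrate
    "xylenol orange": 760.6,
}
TARGET_MM = {                       # recipe concentrations, mmol/L
    "H2SO4": 50.0,
    "Fe(NH4)2(SO4)2.6H2O": 0.3,
    "xylenol orange": 0.05,
}

def reagent_grams(volume_ml):
    """Grams of each reagent needed for volume_ml of final solution."""
    litres = volume_ml / 1000.0
    return {name: TARGET_MM[name] * 1e-3 * litres * MOLAR_MASS[name]
            for name in TARGET_MM}

# three 20 ml syringes = 60 ml of gel
for name, grams in reagent_grams(60.0).items():
    print(f"{name}: {grams * 1000:.2f} mg")
```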

  3. Role of independent inspections in verification activities

    Verification activities have an important place in the activities associated with the implementation of a quality assurance programme and concern the compliance of all work with clearly defined requirements. In the first part of this paper the author reviews these requirements, classifying them into four groups: specific requirements of the product in question, particular requirements of the organization responsible for manufacturing the product, requirements of successive customers, and regulatory requirements. The second part of the paper examines the two approaches which can be adopted to establish verification systems outside the organizational structure. The first approach is to pool or organize the monitoring resources of different bodies with common requirements (electricity producer and its principal contractors); the second consists in using an external monitoring body. The third part of the paper describes the system used in France, which is the first of the methods described above. It requires constant co-operation between the different parties involved, and these have established two associations for the purpose of applying the system - AFCEN (nuclear) and AFCEC (conventional). The advantages and disadvantages of the two possible approaches to verification of activities must be assessed within their industrial and commercial regulatory context. In France the best method has proved to be the pooling of resources. This has led to a direct and fruitful dialogue between customers and suppliers aimed at defining common requirements (Design and Construction Regulations (RCC)) and monitoring their application. (author)

  4. Active Thermal Control Experiments for LISA Ground Verification Testing

    Higuchi, Sei; DeBra, Daniel B.

    2006-11-01

    The primary mission goal of LISA is detecting gravitational waves. LISA uses laser metrology to measure the distance between proof masses in three identical spacecraft. The total acceleration disturbance to each proof mass is required to be below 3 × 10⁻¹⁵ m/s²/√Hz. Optical path length variations on each optical bench must be kept below 40 pm/√Hz over 1 Hz to 0.1 mHz. Thermal variations due to, for example, solar radiation or temperature gradients across the proof mass housing will distort the spacecraft, causing changes in the mass attraction and sensor location. We have developed a thermal control system for LISA gravitational reference sensor (GRS) ground verification testing which provides thermal stability better than 1 mK/√Hz down to f < 1 mHz and which, by extension, is suitable for in-flight thermal control of the LISA spacecraft to compensate for solar irradiation. A thermally stable environment is essential for LISA performance verification. In a lab environment, the specifications can be met with a considerable amount of insulation and thermal mass. For a spacecraft, the very limited thermal mass calls for an active control system which can meet disturbance rejection and stability requirements simultaneously in the presence of long time delays. A simple proportional plus integral control law presently provides approximately 1 mK/√Hz of thermal stability for over 80 hours. Continuing development of a model predictive feed-forward algorithm will extend performance to below 1 mK/√Hz at f < 1 mHz and lower.
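
    To make the control law concrete, the sketch below simulates a proportional-plus-integral loop on a first-order thermal plant with a transport delay. Every number in it (time constant, delay, gains, disturbance, noise) is an illustrative assumption rather than a LISA GRS value; it only demonstrates a PI loop holding a setpoint in the presence of delay:

```python
import numpy as np

# Discrete-time PI temperature control of a first-order plant with a
# transport delay. All numbers are illustrative assumptions, not LISA values.
dt, n = 1.0, 20000            # 1 s steps, ~5.6 h of simulated time
tau, delay = 600.0, 30        # plant time constant [s], actuation delay [steps]
kp, ki = 0.5, 0.002           # PI gains (assumed)

T = np.zeros(n)               # temperature error from the setpoint [K]
u = np.zeros(n)               # heater command history (needed for the delay)
integ = 0.0
rng = np.random.default_rng(0)

for k in range(1, n):
    drift = 1e-4 * np.sin(2 * np.pi * k * dt / 3600.0)   # slow disturbance
    u_applied = u[max(k - delay, 0)]                     # delayed actuation
    # plant: dT/dt = (-T + u + drift) / tau, plus sensor noise
    T[k] = T[k - 1] + dt * (-T[k - 1] + u_applied + drift) / tau \
           + 1e-6 * rng.standard_normal()
    integ += -T[k] * dt
    u[k] = kp * (-T[k]) + ki * integ                     # PI control law

print(f"rms error over the last hour: {T[-3600:].std():.2e} K")
```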

  5. Remedial activities effectiveness verification in tailing areas

    The complex radiological study of the basin of sludge from the uranium ore mining and preprocessing was done. Air kerma rates (including its spectral analysis) at the reference height of 1 m above ground over the whole area were measured and radiation fields mapped during two measuring campaigns (years 2009 and 2014). K, U and Th concentrations in sludge and concentrations in depth profiles (including radon concentration and radon exhalation rates) in selected points were determined using gamma spectrometry for in situ as well as laboratory samples measurement. Results were used for the analysis, design evaluation and verification of the efficiency of the remediation measures. Efficiency of the sludge basin covering by the inert material was modelled using MicroShield code. (authors)

  6. Formal Verification of Effectiveness of Control Activities in Business Processes

    Arimoto, Yasuhito; Iida, Shusaku; Futatsugi, Kokichi

    It has been an important issue to deal with risks in business processes for achieving companies' goals. This paper introduces a method for applying a formal method to the analysis of risks and control activities in business processes, in order to evaluate control activities consistently and exhaustively and to open the results of the evaluation to scientific discussion. We focus on document flows in business activities, and on control activities and risks related to documents, because documents play important roles in business. In our method, document flows including control activities are modeled, and the OTS/CafeOBJ method is used to verify that risks of document falsification are avoided by the control activities in the model. The verification is done through interaction between humans and the CafeOBJ system with theorem proving, which makes the result open to scientific discussion because the interaction gives rigorous reasons why the result is derived from the verification.

  7. Telomerase activity and telomere length in human hepatocellular carcinoma.

    Huang, G T; Lee, H S; Chen, C H; Chiou, L L; Lin, Y W; Lee, C Z; Chen, D S; Sheu, J C

    1998-11-01

    Telomerase is activated and telomere length altered in various types of cancers, including hepatocellular carcinoma (HCC). A total of 39 HCC tissues and the corresponding non-tumour livers were analysed and correlated with clinical parameters. Telomere length was determined by terminal restriction fragment assay, and telomerase activity was assayed by the telomeric repeat amplification protocol. Telomerase activity was positive in 24 of the 39 tumour tissues (1.15-285.13 total product generated (TPG) units) and in six of the 39 non-tumour liver tissues (1.05-1.73 TPG units). In the 28 cases analysed for telomere length, telomere length was shortened in 11 cases, lengthened in six cases, and unaltered in 11 cases compared with non-tumour tissues. Neither telomere length nor telomerase activity was correlated with any clinical parameter. PMID:10023320

  8. The verification of neutron activation analysis support system (cooperative research)

    The neutron activation analysis support system is a system with which even a user who has little experience in neutron activation analysis can conveniently and accurately carry out multi-element analysis of a sample. In this verification test, the functions, usability, precision and accuracy of the analysis performed with the neutron activation analysis support system were confirmed. The verification test was carried out using the irradiation device, measuring device, automatic sample changer and analyzer of the JRR-3M PN-3 facility, together with the analysis software KAYZERO/SOLCOI based on the k0 method. With this equipment, calibration of the germanium detector, measurement of the parameters of the irradiation field and analysis of three kinds of environmental standard samples were carried out. The k0 method adopted in this system has recently been used primarily in Europe; it is an analysis method which can conveniently and accurately carry out multi-element analysis of a sample without requiring individual comparison standard samples. With this system, a total of 28 elements were determined quantitatively, and 16 elements with values guaranteed in the NIST (National Institute of Standards and Technology) environmental standard samples were analyzed with an accuracy within 15%. This report describes the content and verification results of the neutron activation analysis support system. (author)
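
    The k0 standardization mentioned in the abstract reduces, at its core, to comparing decay-corrected specific count rates against a comparator (conventionally Au) via tabulated k0 factors. The sketch below is a simplified, hedged outline of that calculation in Python; it omits the α-correction to the resonance integrals and true-coincidence corrections, and all inputs are placeholders rather than KAYZERO/SOLCOI internals:

```python
import math

def specific_count_rate(peak_area, live_time, t_irr, t_decay, half_life, mass_g):
    """Saturation-, decay- and counting-corrected specific count rate
    Asp = Np / (S * D * C * tm * w)."""
    lam = math.log(2.0) / half_life
    S = 1.0 - math.exp(-lam * t_irr)                             # saturation
    D = math.exp(-lam * t_decay)                                 # decay
    C = (1.0 - math.exp(-lam * live_time)) / (lam * live_time)   # counting
    return peak_area / (S * D * C * live_time * mass_g)

def k0_concentration(asp_sample, asp_au, k0, f, q0_au, q0_sample,
                     eff_au, eff_sample):
    """Element concentration relative to the Au comparator (simplified k0
    formula: flux-ratio and efficiency-ratio corrections, no alpha term)."""
    return ((asp_sample / asp_au) / k0
            * (f + q0_au) / (f + q0_sample)
            * (eff_au / eff_sample))
```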

  9. Prototype test article verification of the Space Station Freedom active thermal control system microgravity performance

    Chen, I. Y.; Ungar, E. K.; Lee, D. Y.; Beckstrom, P. S.

    1993-01-01

    To verify the on-orbit operation of the Space Station Freedom (SSF) two-phase external Active Thermal Control System (ATCS), a test and verification program will be performed prior to flight. The first system level test of the ATCS is the Prototype Test Article (PTA) test that will be performed in early 1994. All ATCS loops will be represented by prototypical components and the line sizes and lengths will be representative of the flight system. In this paper, the SSF ATCS and a portion of its verification process are described. The PTA design and the analytical methods that were used to quantify the gravity effects on PTA operation are detailed. Finally, the gravity effects are listed, and the applicability of the 1-g PTA test results to the validation of on-orbit ATCS operation is discussed.

  10. [Telomere length and telomerase activity in hepatocellular carcinoma].

    Nakashio, R; Kitamoto, M; Nakanishi, T; Takaishi, H; Takahashi, S; Kajiyama, G

    1998-05-01

    Telomerase activity and terminal restriction fragment (TRF) length were examined in hepatocellular carcinoma (HCC). Telomerase activity was assayed by the telomeric repeat amplification protocol (TRAP) combined with an internal telomerase assay standard (ITAS). The incidence of strong telomerase activity (a level markedly elevated compared with the activity of non-cancerous liver tissue) was 79% in well, 84% in moderately, and 100% in poorly differentiated HCC, while it was 0% in non-cancerous liver tissues. The incidence of TRF length alteration (reduction or elongation) was 53% in HCC. The incidence of TRF alteration was significantly higher in HCCs exceeding 3 cm in diameter and in those moderately or poorly differentiated in histology. Telomerase activity was not associated with TRF length alteration in HCC. In conclusion, strong telomerase activity and TRF length alteration increased with HCC tumor progression. PMID:9613130

  11. Investigation of an implantable dosimeter for single-point water equivalent path length verification in proton therapy

    Lu, Hsiao-Ming; Mann, Greg; Cascio, Ethan [Francis H. Burr Proton Therapy Center, Massachusetts General Hospital, Boston, Massachusetts 02114 (United States); Sicel Technologies, Inc., Morrisville, North Carolina 27560 (United States)]

    2010-11-15

    Purpose: In vivo range verification in proton therapy is highly desirable. A recent study suggested that it was feasible to use point dose measurement for in vivo beam range verification in proton therapy, provided that the spread-out Bragg peak (SOBP) dose distribution is delivered in a different and rather unconventional manner. In this work, the authors investigate the possibility of using a commercial implantable dosimeter with wireless reading for this particular application. Methods: The traditional proton treatment technique delivers all the Bragg peaks required for an SOBP field in a single sequence, producing a constant dose plateau across the target volume. As a result, a point dose measurement anywhere in the target volume will produce the same value, thus providing no information regarding the water equivalent path length to the point of measurement. However, the same constant dose distribution can be achieved by splitting the field into a complementary pair of subfields, producing two oppositely "sloped" depth-dose distributions. The ratio between the two distributions can be a sensitive function of depth, and measuring this ratio at a point inside the target volume can provide the water equivalent path length to the dosimeter location. Two types of field splits were used in the experiment, one achieved by the technique of beam current modulation and the other by manipulating the location and width of the beam pulse relative to the range modulator track. Eight MOSFET-based implantable dosimeters at four different depths in a water tank were used to measure the dose ratios for these field pairs. A method was developed to correct the effect of the well-known LET dependence of the MOSFET detectors on the depth-dose distributions using the columnar recombination model. The LET-corrected dose ratios were used to derive the water equivalent path lengths to the dosimeter locations, to be compared with physical measurements. Results
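
    A minimal numerical sketch of the ratio-to-depth inversion described in the Methods may help: with two complementary, linearly "sloped" subfields, the dose ratio is monotonic in depth and can be inverted by interpolation. The slopes below are invented for illustration and are not the clinical field shapes or the LET-corrected ratios of the paper:

```python
import numpy as np

# Complementary "sloped" subfields: their sum is flat, but their ratio is
# a monotonic function of depth, so one point measurement of the ratio
# yields the water-equivalent path length (WEPL) to the detector.
z = np.linspace(0.0, 10.0, 1001)     # depth within the SOBP plateau [cm]
d_up = 0.2 + 0.06 * z                # subfield 1: dose rises with depth
d_down = 1.0 - d_up                  # subfield 2: complement, flat total
ratio = d_up / d_down                # strictly increasing in z

def wepl_from_ratio(measured_ratio):
    """Invert the ratio-vs-depth curve by interpolation."""
    return float(np.interp(measured_ratio, ratio, z))

print(wepl_from_ratio(ratio[500]))   # recovers ~5.0 cm
```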

  12. Benchmark verification of a method for calculating leakage from partial-length shield assembly modified cores

    Over the past several years, plant-life extension programs have been implemented at many U.S. plants. One method of pressure vessel (PV) fluence rate reduction being used in several of the older reactors involves partial replacement of the oxide fuel with metallic rods in those peripheral assemblies located at critical azimuths. This substitution extends axially over a region that depends on the individual plant design, but covers the most critical PV weld and plate locations, which may be subject to pressurized thermal shock. In order to analyze the resulting PV dosimetry using these partial-length shield assemblies (PLSA), a relatively simple but accurate method needs to be formulated and qualified that treats the axially asymmetric core leakage. Accordingly, an experiment was devised and performed at the VENUS critical facility in Mol, Belgium. The success of the proposed method bodes well for the accuracy of future analyses of on-line plants using PLSAs

  13. Implementation Practices of Finland in Facilitating IAEA Verification Activities

    Member States provide information to the IAEA according to their Safeguards Agreements and Additional Protocols. The requirements to provide reports and declarations, for example, are very general, and there is no explanation of what the IAEA is looking for in that information. It is important for States to understand how their efforts to collect and provide information, and to facilitate IAEA verification activities, contribute to the achievement of safeguards objectives and, finally, to drawing conclusions on the exclusively peaceful use of nuclear materials in a State. The IAEA is producing a new series of guidance, the Safeguards Implementation Practices (SIP) guides, which shed light on the requirements and share the good practices of States. It is hoped that the SIP guides will create a better understanding of the needs of the IAEA and the important role of States and facility operators in achieving safeguards objectives. The guides are also a means for States to share their lessons learned and good practices for the benefit of other States that might be developing their capabilities or enhancing their processes and procedures. The path is long when a State decides to start up a new nuclear programme: first come legislation, a regulatory body, a contact point and international agreements, and finally the practical implementation of safeguards in the nuclear facilities. Many issues must be prepared in advance for the IAEA to implement its verification activities successfully, effectively and with good quality. Using the structure of the IAEA's draft SIP Guide on Facilitating Verification Activities as a framework, this paper describes the most relevant implementation practices and experiences in Finland. (author)

  14. Verification of Monte Carlo transport codes by activation experiments

    With the increasing energies and intensities of heavy-ion accelerator facilities, the problem of excessive activation of accelerator components caused by beam losses becomes more and more important. Numerical experiments using Monte Carlo transport codes are performed in order to assess the levels of activation. The heavy-ion versions of these codes were released only about a decade ago, so verification is needed to be sure that they give reasonable results. The present work is focused on obtaining experimental data on the activation of targets by heavy-ion beams. Several experiments were performed at GSI Helmholtzzentrum fuer Schwerionenforschung. The interaction of nitrogen, argon and uranium beams with aluminum targets, as well as the interaction of nitrogen and argon beams with copper targets, was studied. After irradiation of the targets by different ion beams from the SIS18 synchrotron at GSI, γ-spectroscopy analysis was performed: the γ-spectra of the residual activity were measured, the radioactive nuclides were identified, and their amounts and depth distributions were determined. The experimental results were compared with the results of Monte Carlo simulations using FLUKA, MARS and SHIELD. The discrepancies and agreements between experiment and simulations are pointed out, and the origin of the discrepancies is discussed. The results obtained allow for a better verification of the Monte Carlo transport codes and also provide information for their further development. The necessity of activation studies for accelerator applications is discussed. The limits of applicability of the heavy-ion beam-loss criteria were studied using the FLUKA code. FLUKA simulations were performed to determine the materials most preferable, from the radiation protection point of view, for use in accelerator components.
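
    For context, the basic step of such a γ-spectroscopy analysis is converting a net photopeak area into an activity at the end of irradiation. The sketch below shows that standard conversion under placeholder assumptions (detector efficiency, γ emission probability, counting and cooling times, an assumed half-life); it is not the analysis chain actually used in these experiments:

```python
import math

# Activity at the end of irradiation from a net photopeak area. Detector
# efficiency, emission probability, times and the example nuclide are
# placeholder assumptions.
def activity_bq(peak_counts, eff, i_gamma, t_live, t_cool, half_life_s):
    lam = math.log(2.0) / half_life_s
    mean_rate = peak_counts / (eff * i_gamma * t_live)  # mean activity while counting
    decay_during_count = (1.0 - math.exp(-lam * t_live)) / (lam * t_live)
    return mean_rate / (decay_during_count * math.exp(-lam * t_cool))

# e.g. a Na-24-like line: T1/2 ~ 15 h, 1% efficiency, I_gamma ~ 1.0,
# 1 h live time after 2 h of cooling
print(f"{activity_bq(5000, 0.01, 1.0, 3600, 7200, 15 * 3600):.1f} Bq")
```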

  15. Investigation of an implantable dosimeter for single-point water equivalent path length verification in proton therapy

    Lu, Hsiao-Ming; Mann, Greg; Cascio, Ethan

    2010-01-01

    Purpose: In vivo range verification in proton therapy is highly desirable. A recent study suggested that it was feasible to use point dose measurement for in vivo beam range verification in proton therapy, provided that the spread-out Bragg peak dose distribution is delivered in a different and rather unconventional manner. In this work, the authors investigate the possibility of using a commercial implantable dosimeter with wireless reading for this particular application.

  16. Activity of telomerase and telomeric length in Apis mellifera.

    Korandová, Michala; Frydrychová, Radmila Čapková

    2016-06-01

    Telomerase is an enzyme that adds repeats of DNA sequences to the ends of chromosomes, thereby preventing their shortening. Telomerase activity is associated with the proliferative status of cells, organismal development, and aging. We report an analysis of telomerase activity and telomere length in the honeybee, Apis mellifera. Telomerase activity was found to be regulated in a development- and caste-specific manner. During the development of somatic tissues of larval drones and workers, telomerase activity declined to 10 % of its level in embryos and remained low during pupal and adult stages, but was upregulated in testes of late pupae, where it reached 70 % of the embryo level. Upregulation of telomerase activity was observed in the ovaries of late pupal queens, reaching 160 % of the level in embryos. Compared to workers and drones, queens displayed higher levels of telomerase activity. In the third larval instar of queens, telomerase activity reached the embryo level, and an enormous increase was observed in the adult queen brain, which showed a 70-fold higher level than the brain of an adult worker. Southern hybridization of terminal TTAGG fragments revealed high variability of telomere length between individuals, although the same pattern of hybridization signals was observed in different tissues of each individual. PMID:26490169

  17. Verification of Minimum Detectable Activity for Radiological Threat Source Search

    Gardiner, Hannah; Myjak, Mitchell; Baciak, James; Detwiler, Rebecca; Seifert, Carolyn

    2015-10-01

    The Department of Homeland Security's Domestic Nuclear Detection Office is working to develop advanced technologies that will improve the ability to detect, localize, and identify radiological and nuclear sources from airborne platforms. The Airborne Radiological Enhanced-sensor System (ARES) program is developing advanced data fusion algorithms for analyzing data from a helicopter-mounted radiation detector. This detector platform provides a rapid, wide-area assessment of radiological conditions at ground level. The NSCRAD (Nuisance-rejection Spectral Comparison Ratios for Anomaly Detection) algorithm was developed to distinguish low-count sources of interest from benign naturally occurring radiation and irrelevant nuisance sources. It uses a number of broad, overlapping regions of interest to statistically compare each newly measured spectrum with the current estimate for the background to identify anomalies. We recently developed a method to estimate the minimum detectable activity (MDA) of NSCRAD in real time. We present this method here and report on the MDA verification using both laboratory measurements and simulated injects on measured backgrounds at or near the detection limits. This work is supported by the US Department of Homeland Security, Domestic Nuclear Detection Office, under competitively awarded contract/IAA HSHQDC-12-X-00376. This support does not constitute an express or implied endorsement on the part of the Gov't.
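
    As an illustration of the spectral-comparison idea, the sketch below computes a chi-square-like anomaly statistic from counts in broad regions of interest (ROIs), comparing measured ROI fractions against a background estimate with approximate Poisson error propagation. This is a hedged stand-in for NSCRAD: the actual ROI definitions, statistic, and real-time MDA estimation are not reproduced here:

```python
import numpy as np

# Chi-square-like anomaly statistic from ROI count fractions, compared
# against a background estimate (Poisson/binomial error propagation).
def scr_statistic(spectrum_rois, background_rois):
    s = np.asarray(spectrum_rois, dtype=float)
    b = np.asarray(background_rois, dtype=float)
    rs, rb = s / s.sum(), b / b.sum()    # ROI fractions, measured vs background
    var = rs * (1.0 - rs) / s.sum()      # approximate variance of each fraction
    z = (rs - rb) ** 2 / np.maximum(var, 1e-12)
    return float(z.sum())                # large values flag an anomaly

bkg = [4000, 2500, 1500, 900, 400]
print(scr_statistic([4100, 2450, 1480, 950, 390], bkg))   # background-like
print(scr_statistic([4100, 2450, 2400, 950, 390], bkg))   # spectral anomaly
```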

  18. First Exon Length Controls Active Chromatin Signatures and Transcription

    Nicole I. Bieberstein

    2012-07-01

    Here, we explore the role of splicing in transcription, employing both genome-wide analysis of human ChIP-seq data and experimental manipulation of exon-intron organization in transgenic cell lines. We show that the activating histone modifications H3K4me3 and H3K9ac map specifically to first exon-intron boundaries. This is surprising, because these marks help recruit general transcription factors (GTFs) to promoters. In genes with long first exons, promoter-proximal levels of H3K4me3 and H3K9ac are greatly reduced; consequently, GTFs and RNA polymerase II are low at transcription start sites (TSSs) and exhibit a second, promoter-distal peak from which transcription also initiates. In contrast, short first exons lead to increased H3K4me3 and H3K9ac at promoters, higher expression levels, accuracy in TSS usage, and a lower frequency of antisense transcription. Therefore, first exon length is predictive of gene activity. Finally, splicing inhibition and intron deletion reduce H3K4me3 levels and transcriptional output. Thus, gene architecture and splicing determine transcription quantity and quality as well as chromatin signatures.

  19. Length-scale dependent mechanical properties of Al-Cu eutectic alloy: Molecular dynamics based model and its experimental verification

    Tiwary, C. S., E-mail: cst.iisc@gmail.com; Chattopadhyay, K. [Department of Materials Engineering, Indian Institute of Science, Bangalore 560012 (India); Chakraborty, S.; Mahapatra, D. R. [Department of Aerospace Engineering, Indian Institute of Science, Bangalore 560012 (India)

    2014-05-28

    This paper attempts to gain an understanding of the effect of lamellar length scale on the mechanical properties of a two-phase metal-intermetallic eutectic structure. We first develop a molecular dynamics model for the in-situ grown eutectic interface, followed by a model of deformation of the Al-Al{sub 2}Cu lamellar eutectic. Leveraging the insights obtained from the simulation on the behaviour of dislocations at different length scales of the eutectic, we present and explain experimental results on Al-Al{sub 2}Cu eutectics with various lamellar spacings. The physics behind the mechanism is further quantified with the help of an atomic-level energy model at different length scales as well as different strains. An atomic-level energy partitioning of the lamellae and the interface regions reveals that energy accumulates in the lamella cores due to dislocations irrespective of the length scale, whereas energy accumulates at the interface due to dislocations when the length scale is smaller; the trend is reversed when the length scale grows beyond a critical size of about 80 nm.

  20. Length-scale dependent mechanical properties of Al-Cu eutectic alloy: Molecular dynamics based model and its experimental verification

    This paper attempts to gain an understanding of the effect of lamellar length scale on the mechanical properties of a two-phase metal-intermetallic eutectic structure. We first develop a molecular dynamics model for the in-situ grown eutectic interface, followed by a model of deformation of the Al-Al2Cu lamellar eutectic. Leveraging the insights obtained from the simulation on the behaviour of dislocations at different length scales of the eutectic, we present and explain experimental results on Al-Al2Cu eutectics with various lamellar spacings. The physics behind the mechanism is further quantified with the help of an atomic-level energy model at different length scales as well as different strains. An atomic-level energy partitioning of the lamellae and the interface regions reveals that energy accumulates in the lamella cores due to dislocations irrespective of the length scale, whereas energy accumulates at the interface due to dislocations when the length scale is smaller; the trend is reversed when the length scale grows beyond a critical size of about 80 nm.

  1. Single-Cell Telomere-Length Quantification Couples Telomere Length to Meristem Activity and Stem Cell Development in Arabidopsis

    Mary-Paz González-García

    2015-05-01

    Telomeres are specialized nucleoprotein caps that protect chromosome ends, assuring cell division. Single-cell telomere quantification in animals established a critical role for telomerase in stem cells, yet, in plants, telomere-length quantification has been reported only at the organ level. Here, a quantitative analysis of telomere length of single cells in the Arabidopsis root apex uncovered a heterogeneous telomere-length distribution of different cell lineages, showing the longest telomeres at the stem cells. The defects in meristem and stem cell renewal observed in tert mutants demonstrate that telomere lengthening by TERT sets a replicative limit in the root meristem. Conversely, the long telomeres of the columella cells and the premature stem cell differentiation in plt1,2 mutants suggest that differentiation can prevent telomere erosion. Overall, our results indicate that telomere dynamics are coupled to meristem activity and continuous growth, disclosing a critical association between telomere length, stem cell function, and the extended lifespan of plants.

  2. Single-cell telomere-length quantification couples telomere length to meristem activity and stem cell development in Arabidopsis.

    González-García, Mary-Paz; Pavelescu, Irina; Canela, Andrés; Sevillano, Xavier; Leehy, Katherine A; Nelson, Andrew D L; Ibañes, Marta; Shippen, Dorothy E; Blasco, Maria A; Caño-Delgado, Ana I

    2015-05-12

    Telomeres are specialized nucleoprotein caps that protect chromosome ends, assuring cell division. Single-cell telomere quantification in animals established a critical role for telomerase in stem cells, yet, in plants, telomere-length quantification has been reported only at the organ level. Here, a quantitative analysis of telomere length of single cells in the Arabidopsis root apex uncovered a heterogeneous telomere-length distribution of different cell lineages, showing the longest telomeres at the stem cells. The defects in meristem and stem cell renewal observed in tert mutants demonstrate that telomere lengthening by TERT sets a replicative limit in the root meristem. Conversely, the long telomeres of the columella cells and the premature stem cell differentiation in plt1,2 mutants suggest that differentiation can prevent telomere erosion. Overall, our results indicate that telomere dynamics are coupled to meristem activity and continuous growth, disclosing a critical association between telomere length, stem cell function, and the extended lifespan of plants. PMID:25937286

  3. Summary of LHC MD 398: Verification of the dependence of the BCTF measurements on beam position and bunch length

    Krupa, Michal; Gasior, Marek; Lefevre, Thibaut; Soby, Lars; CERN. Geneva. ATS Department

    2015-01-01

    The main aim of the MD was to study the dependence of bunch-by-bunch intensity measurements on beam position and bunch length variations. Large beam position offsets in IR4 and varying bunch lengths were introduced to compare the performance of the presently installed Fast Beam Current Transformers with the new Integrating Current Transformer and the new Wall Current Transformer. This note explains all the procedures of the LHC MD 398, which took place on 20/07/2015, and presents the obtained results.

  4. Active Learning of Markov Decision Processes for System Verification

    Chen, Yingke; Nielsen, Thomas Dyhre

    2012-01-01

    Formal model verification has proven a powerful tool for verifying and validating the properties of a system. Central to this class of techniques is the construction of an accurate formal model for the system being investigated. Unfortunately, manual construction of such models can be a resource-demanding process, and this shortcoming has motivated the development of algorithms for automatically learning system models from observed system behaviors. Recently, algorithms have been proposed for learning Markov decision process representations of reactive systems based on alternating sequences of input/output observations. While alleviating the problem of manually constructing a system model, the collection/generation of observed system behaviors can also prove demanding. Consequently we seek to minimize the amount of data required. In this paper we propose an algorithm for learning...
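
    The passive core of such learning algorithms is a frequency estimate of the transition probabilities from observed traces; the active part, choosing which inputs to try next, is the paper's contribution and is not reproduced here. A minimal sketch of the counting step, with invented state and action names:

```python
from collections import Counter, defaultdict

# Frequency estimate of P(next_state | state, action) from observed traces.
def estimate_mdp(traces):
    """traces: iterable of [(state, action, next_state), ...] sequences."""
    counts = defaultdict(Counter)
    for trace in traces:
        for s, a, s_next in trace:
            counts[(s, a)][s_next] += 1
    return {sa: {s2: n / sum(c.values()) for s2, n in c.items()}
            for sa, c in counts.items()}

traces = [[("idle", "start", "busy"), ("busy", "poll", "busy")],
          [("idle", "start", "idle"), ("idle", "start", "busy")]]
print(estimate_mdp(traces)[("idle", "start")])   # {'busy': 0.67, 'idle': 0.33}
```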

  5. 78 FR 6849 - Agency Information Collection (Verification of VA Benefits) Activity Under OMB Review

    2013-01-31

    ... AFFAIRS Agency Information Collection (Verification of VA Benefits) Activity Under OMB Review AGENCY... abstracted below to the Office of Management and Budget (OMB) for review and comment. The PRA submission... VA Benefits, VA Form 26-8937. OMB Control Number: 2900-0406. Type of Review: Extension of...

  6. 77 FR 20889 - Proposed Information Collection (Request One-VA Identification Verification Card) Activity...

    2012-04-06

    ... of Veterans Affairs, 810 Vermont Avenue NW., Washington, DC 20420 or email: john.hancock@va.gov... AFFAIRS Proposed Information Collection (Request One-VA Identification Verification Card) Activity... Veterans Affairs (VA), is announcing an opportunity for public comment on the proposed collection...

  7. 77 FR 38396 - Agency Information Collection (One-VA Identification Verification Card) Activities Under OMB Review

    2012-06-27

    ... information through www.Regulations.gov or to VA's OMB Desk Officer, Office of Information and Regulatory... 20420, (202) 632-7479, Fax (202) 632-7583 or email denise.mclamb@va.gov . Please refer to ``OMB Control... AFFAIRS Agency Information Collection (One-VA Identification Verification Card) Activities Under...

  8. Length and activation dependent variations in muscle shear wave speed

    Muscle stiffness is known to vary as a result of a variety of disease states, yet current clinical methods for quantifying muscle stiffness have limitations including cost and availability. We investigated the capability of shear wave elastography (SWE) to measure variations in gastrocnemius shear wave speed induced via active contraction and passive stretch. Ten healthy young adults were tested. Shear wave speeds were measured using a SWE transducer positioned over the medial gastrocnemius at ankle angles ranging from maximum dorsiflexion to maximum plantarflexion. Shear wave speeds were also measured during voluntary plantarflexor contractions at a fixed ankle angle. Average shear wave speed increased significantly from 2.6 to 5.6 m s⁻¹ with passive dorsiflexion with the knee in an extended posture, but did not vary with dorsiflexion when the gastrocnemius was shortened in a flexed knee posture. During active contractions, shear wave speed varied monotonically with the net ankle moment generated, reaching 8.3 m s⁻¹ in the maximally contracted condition. There was a linear correlation between shear wave speed and net ankle moment in both the active and passive conditions; however, the slope of this linear relationship was significantly steeper for the data collected during passive loading conditions. The results show that SWE is a promising approach for quantitatively assessing changes in mechanical muscle loading. However, the differential effect of active and passive loading on shear wave speed makes it important to carefully consider the relevant loading conditions in which to use SWE to characterize in vivo muscle properties. (paper)
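
    The reported linear correlation can be illustrated with an ordinary least-squares fit; the (moment, speed) pairs below are invented, anchored only to the 2.6 and 8.3 m s⁻¹ endpoints quoted above:

```python
import numpy as np

# Least-squares line through (net ankle moment, shear wave speed) pairs.
# The data points are made up for illustration; the study reports separate
# slopes for active and passive loading.
moment = np.array([0.0, 5.0, 10.0, 20.0, 30.0, 40.0])  # net ankle moment [N m]
speed = np.array([2.6, 3.4, 4.3, 5.8, 7.1, 8.3])       # shear wave speed [m/s]

slope, intercept = np.polyfit(moment, speed, 1)
r = np.corrcoef(moment, speed)[0, 1]
print(f"speed ~ {slope:.3f} * moment + {intercept:.2f}  (r = {r:.3f})")
```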

  9. Modelling and Verification of Web Services Business Activity Protocol

    Ravn, Anders Peter; Srba, Jiri; Vighio, Saleem

    2011-01-01

    WS-Business Activity specification defines two coordination protocols in order to ensure a consistent agreement on the outcome of long-running distributed applications. We use the model checker Uppaal to analyse the Business Agreement with Coordination Completion protocol type. Our analyses show ...

  10. Length adaptation of smooth muscle contractile filaments in response to sustained activation.

    Stålhand, Jonas; Holzapfel, Gerhard A

    2016-05-21

    Airway and bladder smooth muscles are known to undergo length adaptation under sustained contraction. This adaptation process entails a remodelling of the intracellular actin and myosin filaments which shifts the peak of the active force-length curve towards the current length. Smooth muscles are therefore able to generate the maximum force over a wide range of lengths. In contrast, length adaptation of vascular smooth muscle has attracted very little attention and only a handful of studies have been reported. Although their results are conflicting on the existence of a length adaptation process in vascular smooth muscle, it seems that, at least, peripheral arteries and arterioles undergo such adaptation. This is of interest since peripheral vessels are responsible for pressure regulation, and a length adaptation will affect the function of the cardiovascular system. It has, e.g., been suggested that the inward remodelling of resistance vessels associated with hypertension disorders may be related to smooth muscle adaptation. In this study we develop a continuum mechanical model for vascular smooth muscle length adaptation by assuming that the muscle cells remodel the actomyosin network such that the peak of the active stress-stretch curve is shifted towards the operating point. The model is specialised to hamster cheek pouch arterioles, and the response to stepwise length changes under contraction is simulated. The results show that the model is able to recover the salient features of length adaptation reported in the literature. PMID:26925813
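
    The core modelling assumption, that the peak of the active stress-stretch curve drifts toward the operating point, can be sketched as a first-order relaxation. The Gaussian stress curve, time constant, and widths below are illustrative assumptions, not the paper's calibrated arteriole model:

```python
import math

# The optimal stretch of a Gaussian active stress-stretch curve relaxes
# toward the held stretch with time constant tau (all values assumed).
def active_stress(stretch, opt, width=0.15, s_max=1.0):
    return s_max * math.exp(-(((stretch - opt) / width) ** 2))

def adapt(stretch_held, opt0=1.0, tau=3600.0, dt=10.0, t_end=4 * 3600.0):
    """Euler integration of d(opt)/dt = (stretch_held - opt) / tau."""
    opt, t = opt0, 0.0
    while t < t_end:
        opt += dt * (stretch_held - opt) / tau
        t += dt
    return opt

opt = adapt(stretch_held=1.2)          # hold a 20% stretch for four hours
print(f"adapted optimum: {opt:.3f}; "
      f"active stress at held length: {active_stress(1.2, opt):.2f}")
```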

  11. Examples of verification knowledge and testing of the secondary students through the worksheet. Suggestions for leisure time activities

    In this chapter, some examples of verifying and testing secondary students' knowledge through worksheets are presented, together with suggestions for leisure-time activities. Used and recommended literature is included.

  12. At tank Low Activity Feed Homogeneity Analysis Verification

    DOUGLAS, J.G.

    2000-09-28

    This report evaluates the merit of selecting sodium, aluminum, and cesium-137 as analytes to indicate homogeneity of soluble species in low-activity waste (LAW) feed and recommends possible analytes and physical properties that could serve as rapid screening indicators for LAW feed homogeneity. The three analytes are adequate as screening indicators of soluble species homogeneity for tank waste when a mixing pump is used to thoroughly mix the waste in the waste feed staging tank and when all dissolved species are present at concentrations well below their solubility limits. If either of these conditions is violated, then the three indicators may not be sufficiently chemically representative of other waste constituents to reliably indicate homogeneity in the feed supernatant. Additional homogeneity indicators that should be considered are anions such as fluoride, sulfate, and phosphate, total organic carbon/total inorganic carbon, and total alpha to estimate the transuranic species. Physical property measurements such as gamma profiling, conductivity, specific gravity, and total suspended solids are recommended as possible at-tank methods for indicating homogeneity. Indicators of LAW feed homogeneity are needed to reduce the U.S. Department of Energy, Office of River Protection (ORP) Program's contractual risk by assuring that the waste feed is within the contractual composition and can be supplied to the waste treatment plant within the schedule requirements.

  13. At-tank Low-Activity Feed Homogeneity Analysis Verification

    This report evaluates the merit of selecting sodium, aluminum, and cesium-137 as analytes to indicate homogeneity of soluble species in low-activity waste (LAW) feed and recommends possible analytes and physical properties that could serve as rapid screening indicators for LAW feed homogeneity. The three analytes are adequate as screening indicators of soluble species homogeneity for tank waste when a mixing pump is used to thoroughly mix the waste in the waste feed staging tank and when all dissolved species are present at concentrations well below their solubility limits. If either of these conditions is violated, then the three indicators may not be sufficiently chemically representative of other waste constituents to reliably indicate homogeneity in the feed supernatant. Additional homogeneity indicators that should be considered are anions such as fluoride, sulfate, and phosphate, total organic carbon/total inorganic carbon, and total alpha to estimate the transuranic species. Physical property measurements such as gamma profiling, conductivity, specific gravity, and total suspended solids are recommended as possible at-tank methods for indicating homogeneity. Indicators of LAW feed homogeneity are needed to reduce the U.S. Department of Energy, Office of River Protection (ORP) Program's contractual risk by assuring that the waste feed is within the contractual composition and can be supplied to the waste treatment plant within the schedule requirements

  14. Verification and disarmament

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  15. Verification and disarmament

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  16. Amplification of Frequency-Modulated Similariton Pulses in Length-Inhomogeneous Active Fibers

    I. O. Zolotovskii

    2012-01-01

    The possibility of effective gain of self-similar frequency-modulated (FM) wave packets is studied in length-inhomogeneous active fibers. The dynamics of parabolic pulses with constant chirp has been considered. The optimal profile of the group-velocity dispersion variation corresponding to optimal similariton pulse amplification has been obtained. It is shown that the use of FM pulses in active (gain) and length-inhomogeneous optical fibers with normal group-velocity dispersion can provide subpicosecond optical pulse amplification up to energies higher than 1 nJ.

  17. Association of day length and weather conditions with physical activity levels in older community dwelling people.

    Miles D Witham

    BACKGROUND: Weather is a potentially important determinant of physical activity. Little work has been done examining the relationship between weather and physical activity, and potential modifiers of any relationship, in older people. We therefore examined the relationship between weather and physical activity in a cohort of older community-dwelling people. METHODS: We analysed prospectively collected cross-sectional activity data from community-dwelling people aged 65 and over in the Physical Activity Cohort Scotland. We correlated seven-day triaxial accelerometry data with daily weather data (temperature, day length, sunshine, snow, rain), and a series of potential effect modifiers were tested in mixed models: environmental variables (urban vs rural dwelling, percentage of green space), psychological variables (anxiety, depression, perceived behavioural control), social variables (number of close contacts) and health status measured using the SF-36 questionnaire. RESULTS: 547 participants, mean age 78.5 years, were included in this analysis. Higher minimum daily temperature and longer day length were associated with higher activity levels; these associations remained robust to adjustment for other significant associates of activity: age, perceived behavioural control, number of social contacts and physical function. Of the potential effect modifier variables, only urban vs rural dwelling and the SF-36 measure of social functioning enhanced the association between day length and activity; no variable modified the association between minimum temperature and activity. CONCLUSIONS: In older community-dwelling people, minimum temperature and day length were associated with objectively measured activity. There was little evidence for moderation of these associations through potentially modifiable health, environmental, social or psychological variables.
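
    A mixed model of the kind described (fixed weather effects, a random intercept per participant) can be sketched with statsmodels; the data frame below is synthetic and the column names are assumptions, not the Physical Activity Cohort Scotland variables:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the cohort data: activity counts per subject with
# known temperature and day-length effects plus a random subject intercept.
rng = np.random.default_rng(0)
n, n_subj = 200, 20
df = pd.DataFrame({
    "subject": np.repeat([f"s{i}" for i in range(n_subj)], n // n_subj),
    "min_temp": rng.uniform(-2.0, 15.0, n),      # deg C
    "day_length": rng.uniform(7.0, 17.0, n),     # hours
})
subj_effect = dict(zip(df["subject"].unique(), rng.normal(0.0, 20.0, n_subj)))
df["counts"] = (200.0 + 3.0 * df["min_temp"] + 5.0 * df["day_length"]
                + df["subject"].map(subj_effect) + rng.normal(0.0, 10.0, n))

# random-intercept mixed model, mirroring the abstract's analysis strategy
model = smf.mixedlm("counts ~ min_temp + day_length", df, groups=df["subject"])
print(model.fit().summary())
```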

  18. SU-E-J-141: Activity-Equivalent Path Length Approach for the 3D PET-Based Dose Reconstruction in Proton Therapy

    Purpose: Ion beam therapy is sensitive to uncertainties from treatment planning and dose delivery. PET imaging of induced positron emitter distributions is a practical approach for in vivo, in situ verification of ion beam treatments. Treatment verification is usually done by comparing measured activity distributions with reference distributions evaluated in nominal conditions. Although such comparisons give valuable information on treatment quality, a proper clinical evaluation of the treatment ultimately relies on the knowledge of the actual delivered dose. Analytical deconvolution methods relating activity and dose have been studied in this context, but were not clinically applied. In this work we present a feasibility study of an alternative approach for dose reconstruction from activity data, which is based on relating variations in accumulated activity to tissue density variations. Methods: First, reference distributions of dose and activity were calculated from the treatment plan and CT data. Then, the actual measured activity data were cumulatively matched with the reference activity distributions to obtain a set of activity-equivalent path lengths (AEPLs) along the rays of the pencil beams. Finally, these AEPLs were used to deform the original dose distribution, yielding the actual delivered dose. The method was tested by simulating a proton therapy treatment plan delivering 2 Gy to a homogeneous water phantom (the reference), which was compared with the same plan delivered to a phantom containing inhomogeneities. Activity and dose distributions were calculated by means of the FLUKA Monte Carlo toolkit. Results: The main features of the observed dose distribution in the inhomogeneous situation were reproduced using the AEPL approach. Variations in particle range were reproduced and the positions where these deviations originated were properly identified. Conclusions: For a simple inhomogeneous phantom the 3D dose reconstruction from PET-activity
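
    The cumulative-matching step can be illustrated in one dimension: equal cumulative-activity fractions define a depth remapping (the AEPLs), which is then used to look up the planned dose. The profiles below are synthetic stand-ins for the 3D PET and FLUKA distributions:

```python
import numpy as np

# 1D stand-in for the 3D PET data: cumulatively match measured and
# reference activity profiles, then look the planned dose up at the
# activity-equivalent path lengths (AEPLs).
def aepl_map(z, act_ref, act_meas):
    """For each depth, the reference depth with equal cumulative activity."""
    c_ref = np.cumsum(act_ref) / np.sum(act_ref)
    c_meas = np.cumsum(act_meas) / np.sum(act_meas)
    return np.interp(c_meas, c_ref, z)

def reconstructed_dose(z, dose_ref, act_ref, act_meas):
    """Planned depth dose deformed by the AEPL remapping."""
    return np.interp(aepl_map(z, act_ref, act_meas), z, dose_ref)

z = np.linspace(0.0, 20.0, 401)                  # depth [cm]
act_ref = np.exp(-(((z - 10.0) / 3.0) ** 2))     # nominal activity profile
act_meas = np.exp(-(((z - 9.0) / 3.0) ** 2))     # ~1 cm range undershoot
dose_ref = np.where(z < 10.0, 2.0, 2.0 * np.exp(-(z - 10.0)))
print(reconstructed_dose(z, dose_ref, act_ref, act_meas)[200])  # dose at z = 10
```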

  19. AN APPROACH FOR ACTIVE SEGMENTATION OF UNCONSTRAINED HANDWRITTEN KOREAN STRINGS USING RUN-LENGTH CODE

    JeongSuk, J.; Kim, G.

    2004-01-01

    We propose an active handwritten Hangul segmentation method. A manageable structure based on run-length code is defined in order to apply it to preprocessing and segmentation. Also, three fundamental candidate estimation functions are introduced to detect clues on touching points, and the classifi
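
    For readers unfamiliar with the underlying data structure, a run-length code simply stores each row of a binary image as (start, length) runs of foreground pixels. A minimal sketch (the paper's candidate estimation functions and touching-point heuristics are not reproduced):

```python
# Run-length coding of a binary image row: the basic structure the
# segmentation method is built on.
def runs(row):
    """(start, length) of each run of foreground (1) pixels in a row."""
    out, start = [], None
    for i, px in enumerate(row):
        if px and start is None:
            start = i
        elif not px and start is not None:
            out.append((start, i - start))
            start = None
    if start is not None:
        out.append((start, len(row) - start))
    return out

print(runs([0, 1, 1, 1, 0, 0, 1, 1, 0]))   # [(1, 3), (6, 2)]
```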

  20. Day length and weather effects on children's physical activity and participation in play, sports, and active travel

    Goodman, A.; Paskins, J.; MacKett, R

    2012-01-01

    BACKGROUND: Children in primary school are more physically active in the spring/summer. Little is known about the relative contributions of day length and weather, however, or about the underlying behavioral mediators. METHODS: 325 British children aged 8 to 11 wore accelerometers as an objective measure of physical activity, measured in terms of mean activity counts. Children simultaneously completed diaries in which we identified episodes of out-of-home play, structured sports, and active travel...

  1. Software Verification and Validation Plan Activities, 2011, Project Number: N6423, SAPHIRE Version 8

    Kurt G. Vedros; Curtis L. Smith

    2011-11-01

    The SV&V Plan experienced changes over the past year to bring it into the operational software life cycle of SAPHIRE 8 and to maintain its sections on design features. Peer review of the SVVP with the former IV&V members identified the need for the operational use of metrics as a tool for quality maintenance and improvement. New tests were added to the SVVP to verify the operation of the new design features incorporated into SAPHIRE 8. Other additions to the SVVP were software metrics and the PDR and CDR processes. Audit support was provided for the NRC Technical Manager and Project Manager for the NRC OIG audit performed throughout 2011. The SVVP is considered to be an up-to-date reference and a useful roadmap of verification and validation activities going forward.

  2. Active Mirror Predictive and Requirements Verification Software (AMP-ReVS)

    Basinger, Scott A.

    2012-01-01

    This software is designed to predict large active mirror performance at various stages in the fabrication lifecycle of the mirror. It was developed for 1-meter class powered mirrors for astronomical purposes, but is extensible to other geometries. The package accepts finite element model (FEM) inputs and laboratory measured data for large optical-quality mirrors with active figure control. It computes phenomenological contributions to the surface figure error using several built-in optimization techniques. These phenomena include stresses induced in the mirror by the manufacturing process and the support structure, the test procedure, high spatial frequency errors introduced by the polishing process, and other process-dependent deleterious effects due to light-weighting of the mirror. Then, depending on the maturity of the mirror, it either predicts the best surface figure error that the mirror will attain, or it verifies that the requirements for the error sources have been met once the best surface figure error has been measured. The unique feature of this software is that it ties together physical phenomenology with wavefront sensing and control techniques and various optimization methods including convex optimization, Kalman filtering, and quadratic programming to both generate predictive models and to do requirements verification. This software combines three distinct disciplines: wavefront control, predictive models based on FEM, and requirements verification using measured data in a robust, reusable code that is applicable to any large optics for ground and space telescopes. The software also includes state-of-the-art wavefront control algorithms that allow closed-loop performance to be computed. It allows for quantitative trade studies to be performed for optical systems engineering, including computing the best surface figure error under various testing and operating conditions. After the mirror manufacturing process and testing have been completed, the

  3. ANATOMY OF SOLAR CYCLE LENGTH AND SUNSPOT NUMBER: DEPENDENCE OF AVERAGE GLOBAL TEMPERATURE ON SOLAR ACTIVITY

    A. B. BHATTACHARYA

    2011-11-01

    The paper examines thoroughly all the past 23 sunspot cycles and the associated 11 Hale cycles. It is noticed that solar cycle 23 had a deep minimum with the longest decline phase. When solar cycles 20 to 23 are compared with solar cycles 1 to 4, the forthcoming Dalton minimum can be expected. The predicted variation of sunspot number for the present solar cycle 24 is examined at length, and it appears that the peak monthly sunspot number of solar cycle 24 will be around 80. We have correlated solar cycle length and peak sunspot number, with priority given to solar cycle 24. From an elaborate analysis it appears that the most common cycle length is around 10.5 years, with few cycles in the range 11.5 to 12.5 years. Global temperature depends upon the total solar irradiance, which in turn depends on the duration of the solar cycle. Also, cloud cover directly depends on the solar irradiance. Our analysis supports that the global temperature is governed by the length of the predicted cycle. From the increased length of solar cycle 23, we have estimated the temperature variation of cycle 24. The predicted result reassures that average global temperature will decrease for the next few solar cycles due to typical solar activity. The results have been interpreted emphasizing the formation of type III solar radio bursts caused by plasma excitation.

  4. Verification of relationships between anthropometric variables among ureteral stents recipients and ureteric lengths: a challenge for Vitruvian-da Vinci theory

    Acelam PA

    2015-08-01

    Objective: To determine and verify how anthropometric variables correlate to ureteric lengths and how well statistical models approximate the actual ureteric lengths. Materials and methods: In this work, 129 charts of endourological patients (71 females and 58 males) were studied retrospectively. Data were gathered from various research centers in North and South America. Continuous data were studied using descriptive statistics. Anthropometric variables (age, body surface area, body weight, obesity, and stature) were utilized as predictors of ureteric lengths. Linear regressions and correlations were used for studying relationships between the predictors and the outcome variables (ureteric lengths); the P-value was set at 0.05. To assess how well statistical models were capable of predicting the actual ureteric lengths, percentages (or ratios) of matched to mismatched results were employed. Results: The results of the study show that anthropometric variables do not correlate well to ureteric lengths. Statistical models can partially estimate ureteric lengths. Out of the five anthropometric variables studied, three (body frame, stature, and weight), each with P<0.0001, were significant. Two of the variables, age (R2=0.01; P=0.20) and obesity (R2=0.03; P=0.06), were found to be poor estimators of ureteric lengths. None of the predictors reached the expected (match:above:below) ratio of 1:0:0 to qualify as reliable predictors of ureteric lengths. Conclusion: There is not sufficient evidence to conclude that anthropometric variables can reliably predict ureteric lengths. These variables appear to lack adequate specificity as they failed to reach the expected (match:above:below) ratio of 1:0:0. Consequently, selection of ureteral stents continues to remain a challenge. However, height (R2=0.68), with a (match:above:below) ratio of 3:3:4, appears suited for use as

  5. Relationship between metabolism and ovarian activity in dairy cows with different dry period lengths.

    Chen, J; Soede, N M; van Dorland, H A; Remmelink, G J; Bruckmaier, R M; Kemp, B; van Knegsel, A T M

    2015-11-01

    The objectives of the present study were to evaluate the effects of dry period length on ovarian activity in cows fed a lipogenic or a glucogenic diet within 100 days in milk (DIM) and to determine relationships between ovarian activity and energy balance and metabolic status in early lactation. Holstein-Friesian dairy cows (n = 167) were randomly assigned to one of three dry period lengths (0, 30, or 60 days) and one of two diets in early lactation (glucogenic or lipogenic diet) resulting in a 3 × 2 factorial design. Cows were monitored for body condition score, milk yield, dry matter intake, and energy balance from calving to week 8 postpartum, and blood was sampled weekly from 95 cows from calving to week 8 postpartum. Milk samples were collected three times a week until 100 DIM postpartum for determination of progesterone concentration. At least two succeeding milk samples with progesterone concentration of 2 ng/mL or greater were used to indicate the occurrence of luteal activity. Normal resumption of ovarian cyclicity was defined as the onset of luteal activity (OLA) occurring at 45 DIM or less, followed by regular ovarian cycles of 18 to 24 days in length. Within 100 DIM postpartum, cows with a 0-day dry period had greater incidence of normal resumption of ovarian cyclicity (53.2%; 25 out of 47 cows) compared with cows with a 60-day dry period (26.0%; 13 out of 50 cows, P = 0.02). Independent of dry period length or diet, cows with OLA at less than 21 DIM had a greater body condition score during weeks 1 and 2 (P = 0.01) and weeks 1 through 8 (P = 0.01) postpartum compared with cows with OLA at greater than 30 DIM. Cows with the first ovarian cycle of medium length (18-24 days) had greater energy balance (P = 0.03), plasma concentrations of insulin (P = 0.03), glucose (P = 0.04), and insulin-like growth factor I (P = 0.04) than cows with long ovarian cycle lengths (>24 days) but had lower plasma β-hydroxybutyrate (P cows with

  6. Optimization of Active Muscle Force-Length Models Using Least Squares Curve Fitting.

    Mohammed, Goran Abdulrahman; Hou, Ming

    2016-03-01

    The objective of this paper is to propose an asymmetric Gaussian function as an alternative to the existing active force-length models, and to optimize this model along with several other existing models by using the least squares curve fitting method. The minimal set of coefficients is identified for each of these models to facilitate the least squares curve fitting. Sarcomere simulated data and one set of rabbit extensor digitorum II experimental data are used to illustrate optimal curve fitting of the selected force-length functions. The results show that all the curves fit reasonably well with the simulated and experimental data, and that the Gordon-Huxley-Julian model and the asymmetric Gaussian function outperform the other functions on the statistical scores root mean squared error (RMSE) and R-squared. However, the differences in RMSE scores are insignificant for both simulated (0.3-6%) and experimental (0.2-5%) data. The proposed asymmetric Gaussian model and the method of parametrization of this and the other force-length models mentioned above can be used in studies on active force-length relationships of skeletal muscles that generate forces to cause movements of human and animal bodies. PMID:26276984
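    A hedged sketch of the curve-fitting step: least-squares fitting of an asymmetric Gaussian force-length function with scipy.optimize.curve_fit, scored by RMSE and R-squared. The parameter names and the synthetic sarcomere data are assumptions, not the authors' published values.

```python
import numpy as np
from scipy.optimize import curve_fit

def asym_gauss(l, f_max, l_opt, w_left, w_right):
    """Asymmetric Gaussian: peak f_max at l_opt, different widths per side."""
    w = np.where(l < l_opt, w_left, w_right)
    return f_max * np.exp(-((l - l_opt) ** 2) / (2.0 * w ** 2))

l = np.linspace(1.3, 3.6, 40)                        # sarcomere length, um
f_obs = asym_gauss(l, 1.0, 2.1, 0.35, 0.55)
f_obs = f_obs + np.random.default_rng(1).normal(0, 0.02, l.size)  # noise

popt, _ = curve_fit(asym_gauss, l, f_obs, p0=[1.0, 2.0, 0.3, 0.5])
resid = f_obs - asym_gauss(l, *popt)
rmse = np.sqrt(np.mean(resid ** 2))
r2 = 1.0 - resid.var() / f_obs.var()
print(f"fitted parameters: {popt.round(3)}, RMSE = {rmse:.4f}, R^2 = {r2:.3f}")
```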

  7. Length and activity of the root apical meristem revealed in vivo by infrared imaging.

    Bizet, François; Hummel, Irène; Bogeat-Triboulot, Marie-Béatrice

    2015-03-01

    Understanding how cell division and cell elongation influence organ growth and development is a long-standing issue in plant biology. In plant roots, most cell divisions occur in a short and specialized region, the root apical meristem (RAM). Although RAM activity has been suggested to be of high importance for understanding how roots grow and how the cell cycle is regulated, few experimental and numerical data are currently available. Characterization of the RAM is difficult and essentially based upon cell length measurements through destructive and time-consuming microscopy approaches. Here, a new non-invasive method is described that couples infrared light imaging and kinematic analyses and allows in vivo measurement of the RAM length. This study provides a detailed description of RAM activity, especially in terms of cell flux and cell division rate. We focused on roots of hydroponically grown poplars and confirmed our method on maize roots. How the RAM affects root growth rate is studied by taking advantage of the high inter-individual variability of poplar root growth. An osmotic stress was applied and did not significantly affect the RAM length, highlighting its homeostasis in short- to middle-term responses. The methodology described here greatly simplifies experimental procedures, increases the number of individuals that can be included in an experiment, and enables new experiments for temporal monitoring of the RAM length. PMID:25540436

  8. Statistical analysis and verification of 3-hourly geomagnetic activity probability predictions

    Wang, Jingjing; Zhong, Qiuzhen; Liu, Siqing; Miao, Juan; Liu, Fanghua; Li, Zhitao; Tang, Weiwei

    2015-12-01

    The Space Environment Prediction Center (SEPC) classifies geomagnetic activity into four Kp-based levels, from quiet to unsettled up to storm. The 3-hourly Kp index prediction product provided by the SEPC is updated half hourly. In this study, statistical conditional forecast models for the 3-hourly geomagnetic activity level were developed based on 10 years of data and applied to more than 3 years of data, using the previous Kp index, interplanetary magnetic field, and solar wind parameters measured by the Advanced Composition Explorer as conditional parameters. The quality of the forecast models was measured and compared through verification of accuracy, reliability, discrimination capability, and skill in predicting all geomagnetic activity levels, especially the probability of reaching the storm level given a previous "calm" (nonstorm level) or "storm" (storm level) condition. It was found that the conditional models that used the previous Kp index, the peak value of BtV (the product of the total interplanetary magnetic field and speed), the average value of Bz (the southward component of the interplanetary magnetic field), and BzV (the product of the southward component of the interplanetary magnetic field and speed) over the last 6 h as conditional parameters provide a relative operating characteristic area of 0.64 and can serve as an appropriate predictor for the probability forecast of the geomagnetic activity level.
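    The record's headline verification statistic is a relative operating characteristic (ROC) area. Below is a small sketch of how such an area can be computed for probabilistic storm forecasts; the forecast probabilities and outcomes are synthetic placeholders, not SEPC data.

```python
import numpy as np

rng = np.random.default_rng(2)
p_storm = rng.uniform(0, 1, 1000)                   # forecast storm probability
observed = rng.uniform(0, 1, 1000) < 0.8 * p_storm  # synthetic binary outcomes

# For continuous scores, the ROC area equals the probability that a randomly
# chosen storm case received a higher forecast probability than a non-storm case.
pos = p_storm[observed]
neg = p_storm[~observed]
roc_area = np.mean(pos[:, None] > neg[None, :])
print(f"ROC area = {roc_area:.2f}")  # the record reports 0.64 for its best model
```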

  9. Active aging as a way of keeping diseases at arm’s length

    Lassen, Aske Juul

    good for their quality of life, health, functionality and the economy (Sundhedsstyrelsen 2008, EC 2006, WHO 2002). At the same time active aging is inscribed into a general health care focus, which individualizes the responsibility for health and disease. This requires subjects ready to self-care, by...... paying attention to the signals of the body and leading healthy lives (Rose 2001). However, active aging seems to contain an ambiguity in this aspect, as the practice of active aging is often a way for the elderly to keep diseases at arm’s length, and not a way to sense the possible abnormalities in the body...

  10. Experimental verification of the effect of cable length on voltage distribution in stator winding of an induction motor under surge condition

    Oyegoke, B.S. [Helsinki Univ. of Technology, Otaniemi (Finland). Lab. of Electromechanics

    1997-12-31

    This paper presents the results of surge distribution tests performed on a stator of a 6 kV induction motor. The primary aim of these tests was to determine the wave propagation properties of the machine winding fed via cables of different lengths. Considering the measured results, conclusions are derived regarding the effect of cable length on the surge distribution within the stator winding of an ac motor. (orig.) 15 refs.

  11. Effect of grass silage chop length on chewing activity and digestibility

    Garmo, T.H.; Randby, Å.T.; Eknæs, M.;

    2008-01-01

    Round bale grass silage harvested early (D-value 757 g kg-1 DM) or at a normal (D-value 696 g kg-1 DM) time was used to study the effect of harvesting time, chop length and their interaction on chewing activity and digestibility by dairy cows. Six early lactating Norwegian Red cows were used in a 6 x 6 Latin square with 3-week periods. Chewing activity was measured using IGER Behaviour recorders, and digestibility was measured by total collection of faeces. The two silages were fed long (170 mm), coarsely chopped (55 mm), or finely chopped (24 mm median particle length). Cows were fed silage ad libitum and supplemented with 6 kg concentrate. Early harvested silage significantly decreased total ration eating (ET), rumination (RT) and chewing time (CT) per kg silage DM compared with normal harvested silage (CT = 38 vs. 46 min kg-1 DM). Chopping of silage reduced CT significantly, mainly...

  12. Correlation of Telomere Length and Telomerase Activity with Occult Ovarian Insufficiency

    Butts, Samantha; Riethman, Harold; Ratcliffe, Sarah; Shaunik, Alka; Coutifaris, Christos; Barnhart, Kurt

    2009-01-01

    Background: Occult ovarian insufficiency is associated with infertility, impaired response to ovarian stimulation, and reduced live birth rates in women treated with assisted reproductive technologies. Although a decline in ovarian follicle number is expected with age, the proximate causes of occult ovarian insufficiency in young women remain poorly understood. Abnormalities in telomere length and telomerase activity in human granulosa cells may serve as molecular markers for this condition.

  13. Characterizing proton-activated materials to develop PET-mediated proton range verification markers.

    Cho, Jongmin; Ibbott, Geoffrey S; Kerr, Matthew D; Amos, Richard A; Stingo, Francesco C; Marom, Edith M; Truong, Mylene T; Palacio, Diana M; Betancourt, Sonia L; Erasmus, Jeremy J; DeGroot, Patricia M; Carter, Brett W; Gladish, Gregory W; Sabloff, Bradley S; Benveniste, Marcelo F; Godoy, Myrna C; Patil, Shekhar; Sorensen, James; Mawlawi, Osama R

    2016-06-01

    Conventional proton beam range verification using positron emission tomography (PET) relies on tissue activation alone and therefore requires particle therapy PET whose installation can represent a large financial burden for many centers. Previously, we showed the feasibility of developing patient implantable markers using high proton cross-section materials ((18)O, Cu, and (68)Zn) for in vivo proton range verification using conventional PET scanners. In this technical note, we characterize those materials to test their usability in more clinically relevant conditions. Two phantoms made of low-density balsa wood (~0.1 g cm(-3)) and beef (~1.0 g cm(-3)) were embedded with Cu or (68)Zn foils of several volumes (10-50 mm(3)). The metal foils were positioned at several depths in the dose fall-off region, which had been determined from our previous study. The phantoms were then irradiated with different proton doses (1-5 Gy). After irradiation, the phantoms with the embedded foils were moved to a diagnostic PET scanner and imaged. The acquired data were reconstructed with 20-40 min of scan time using various delay times (30-150 min) to determine the maximum contrast-to-noise ratio. The resultant PET/computed tomography (CT) fusion images of the activated foils were then examined and the foils' PET signal strength/visibility was scored on a 5 point scale by 13 radiologists experienced in nuclear medicine. For both phantoms, the visibility of activated foils increased in proportion to the foil volume, dose, and PET scan time. A linear model was constructed with visibility scores as the response variable and all other factors (marker material, phantom material, dose, and PET scan time) as covariates. Using the linear model, volumes of foils that provided adequate visibility (score 3) were determined for each dose and PET scan time. The foil volumes that were determined will be used as a guideline in developing practical implantable markers. PMID:27203621

  14. Characterizing proton-activated materials to develop PET-mediated proton range verification markers

    Cho, Jongmin; Ibbott, Geoffrey S.; Kerr, Matthew D.; Amos, Richard A.; Stingo, Francesco C.; Marom, Edith M.; Truong, Mylene T.; Palacio, Diana M.; Betancourt, Sonia L.; Erasmus, Jeremy J.; DeGroot, Patricia M.; Carter, Brett W.; Gladish, Gregory W.; Sabloff, Bradley S.; Benveniste, Marcelo F.; Godoy, Myrna C.; Patil, Shekhar; Sorensen, James; Mawlawi, Osama R.

    2016-06-01

    Conventional proton beam range verification using positron emission tomography (PET) relies on tissue activation alone and therefore requires particle therapy PET whose installation can represent a large financial burden for many centers. Previously, we showed the feasibility of developing patient implantable markers using high proton cross-section materials (18O, Cu, and 68Zn) for in vivo proton range verification using conventional PET scanners. In this technical note, we characterize those materials to test their usability in more clinically relevant conditions. Two phantoms made of low-density balsa wood (~0.1 g cm‑3) and beef (~1.0 g cm‑3) were embedded with Cu or 68Zn foils of several volumes (10–50 mm3). The metal foils were positioned at several depths in the dose fall-off region, which had been determined from our previous study. The phantoms were then irradiated with different proton doses (1–5 Gy). After irradiation, the phantoms with the embedded foils were moved to a diagnostic PET scanner and imaged. The acquired data were reconstructed with 20–40 min of scan time using various delay times (30–150 min) to determine the maximum contrast-to-noise ratio. The resultant PET/computed tomography (CT) fusion images of the activated foils were then examined and the foils’ PET signal strength/visibility was scored on a 5 point scale by 13 radiologists experienced in nuclear medicine. For both phantoms, the visibility of activated foils increased in proportion to the foil volume, dose, and PET scan time. A linear model was constructed with visibility scores as the response variable and all other factors (marker material, phantom material, dose, and PET scan time) as covariates. Using the linear model, volumes of foils that provided adequate visibility (score 3) were determined for each dose and PET scan time. The foil volumes that were determined will be used as a guideline in developing practical implantable markers.
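    A hedged sketch of the analysis step both versions of this record describe: an ordinary least-squares linear model of visibility score on foil volume, dose, and PET scan time, inverted for the volume that reaches a target score. All coefficients and data below are synthetic, not the published fit, and the marker-material covariate is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
volume = rng.uniform(10, 50, n)      # foil volume, mm^3
dose = rng.uniform(1, 5, n)          # proton dose, Gy
scan = rng.uniform(20, 40, n)        # PET scan time, min
score = 0.04 * volume + 0.4 * dose + 0.03 * scan + rng.normal(0, 0.3, n)

# Least-squares fit of the linear model: score = b0 + bv*volume + bd*dose + bs*scan
X = np.column_stack([np.ones(n), volume, dose, scan])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)

def volume_for_score(target, dose, scan):
    """Invert b0 + bv*v + bd*dose + bs*scan = target for volume v."""
    b0, bv, bd, bs = beta
    return (target - b0 - bd * dose - bs * scan) / bv

print(f"volume for score 3 at 2 Gy, 30 min: {volume_for_score(3, 2, 30):.1f} mm^3")
```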

  15. Active Stream Length Dynamics in Headwater Catchments Spanning Physiographic Provinces in the Appalachian Highlands

    Jensen, C.; McGuire, K. J.

    2015-12-01

    One of the most basic descriptions of streams is the presence of channelized flow, yet this seemingly simple question goes unanswered for the majority of headwater networks, as stream length expands and contracts with the wetness of catchments seasonally, interannually, and in response to storm events. Although streams are known to grow and shrink, a lack of information on longitudinal dynamics across different geographic regions precludes effective management. Understanding the temporal variation in temporary network length over a broad range of settings is critical for policy decisions that impact aquatic ecosystem health. This project characterizes changes in active stream length for forested headwater catchments spanning four physiographic provinces of the Appalachian Highlands: the New England at Hubbard Brook Experimental Forest, New Hampshire; Valley and Ridge at Poverty Creek and the North Fork of Big Stony Creek in Jefferson National Forest, Virginia; Blue Ridge at Coweeta Hydrologic Laboratory, North Carolina; and Appalachian Plateau at Fernow Experimental Forest, West Virginia. Multivariate statistical analysis confirms these provinces exhibit characteristic topographies reflecting differences in climate, geology, and environmental history and, thus, merit separate consideration. The active streams of three watersheds (<45 ha) in each study area were mapped six times to capture a variety of moderate flow conditions that can be expected most of the time (i.e., exceedance probabilities between 25 and 75%). The geomorphic channel and channel heads were additionally mapped to determine how active stream length variability relates to the development of the geomorphic network. We found that drainage density can vary up to four-fold with discharge. Stream contraction primarily proceeds by increasing disconnection and disintegration into pools, while the number of flow origins remains constant except at high and low extremes of discharge. This work demonstrates

  16. Chain length dependence of non-surface activity and micellization behavior of cationic amphiphilic diblock copolymers.

    Ghosh, Arjun; Yusa, Shin-ichi; Matsuoka, Hideki; Saruwatari, Yoshiyuki

    2014-04-01

    The cationic and anionic amphiphilic diblock copolymers with a critical chain length and block ratio do not adsorb at the air/water interface but form micelles in solution, which is a phenomenon called "non-surface activity". This is primarily due to the high charge density of the block copolymer, which creates a strong image charge effect at the air/water interface preventing adsorption. Very stable micelle formation in bulk solution could also play an important role in the non-surface activity. To further confirm these unique properties, we studied the adsorption and micellization behavior of cationic amphiphilic diblock copolymers of poly(n-butyl acrylate)-b-poly(3-(methacryloyloxy)ethyl)trimethylammonium chloride) (PBA-b-PDMC) with different molecular weights of hydrophobic blocks but with the same ionic block length. These block copolymers were successfully prepared via consecutive reversible addition-fragmentation chain transfer (RAFT) polymerization. The block copolymer with the shortest hydrophobic block length was surface-active; the solution showed surface tension reduction and foam formation. However, above the critical block ratio, the surface tension of the solution did not decrease with increasing polymer concentration, and there was no foam formation, indicating lack of surface activity. After addition of 0.1 M NaCl, stable foam formation and slight reduction of surface tension were observed, which is reminiscent of the electrostatic nature of the non-surface activity. Fluorescence and dynamic and static light scattering measurements showed that the copolymer with the shortest hydrophobic block did not form micelles, while the block copolymers formed spherical micelles having radii of 25-30 nm. These observations indicate that micelle formation is also important for non-surface activity. Upon addition of NaCl, cmc did not decrease but rather increased as observed for non-surface-active block copolymers previously studied. The micelles formed were

  17. Implementation of the Additional Protocol: Verification activities at uranium mines and mills

    Full text: The mining and milling of uranium is the first in a long chain of processes required to produce nuclear materials in a form suitable for use in nuclear weapons. Misuse of a declared uranium mining/milling facility, in the form of understatement of production, would be hard to detect with the same high level of confidence as afforded by classical safeguards on other parts of the nuclear fuel cycle. For these reasons, it would not be cost-effective to apply verification techniques based on classical safeguards concepts to a mining/milling facility in order to derive assurance of the absence of misuse. Indeed, these observations have been recognised in the Model Protocol (INFCIRC/540): 'the Agency shall not mechanistically or systematically seek to verify' information provided to it by States (Article 4.a.). Nevertheless, complementary access to uranium mining/milling sites 'on a selective basis in order to assure the absence of undeclared nuclear material and activities' (Article 4.a.(i)) is provided for. On this basis, therefore, this paper will focus predominantly on options other than site access, which are available to the Agency for deriving assurance that declared mining/milling operations are not misused. Such options entail the interpretation and analysis of information provided to the Agency including, for example, from declarations, monitoring import/export data, open source reports, commercial satellite imagery, aerial photographs, and information provided by Member States. Uranium mining techniques are diverse, and the inventories, flows and uranium assays which arise at various points in the process will vary considerably between mines, and over the operating cycle of an individual mine. Thus it is essentially impossible to infer any information, which can be used precisely to confirm, or otherwise, declared production by measuring or estimating any of those parameters at points within the mining/milling process. The task of attempting to

  18. A Method Based on Active Appearance Model and Gradient Orientation Pyramid of Face Verification as People Age

    Ji-Xiang Du

    2014-01-01

    Full Text Available Face verification in the presence of age progression is an important problem that has not been widely addressed. In this paper, we propose to use the active appearance model (AAM) and gradient orientation pyramid (GOP) feature representation for this problem. First, we apply the AAM to the dataset and generate AAM images; we then compute the gradient orientation representation on a hierarchical model to obtain the GOP appearance. When combined with a support vector machine (SVM), experimental results show that our approach has excellent performance on two public domain face aging datasets: FGNET and MORPH. Second, we compare the performance of the proposed method with a number of related face verification methods; the results show that the new approach is more robust and performs better.

  19. Observing Evolution in the Supergranular Network Length Scale During Periods of Low Solar Activity

    McIntosh, Scott W.; Leamon, Robert J.; Hock, Rachel A.; Rast, Mark P.; Ulrich, Roger K.

    2011-03-01

    We present the initial results of an observational study into the variation of the dominant length scale of quiet solar emission: supergranulation. The distribution of magnetic elements in the lanes that form the network affects, and reflects, the radiative energy in the plasma of the upper solar chromosphere and transition region at the magnetic network boundaries forming as a result of the relentless interaction of magnetic fields and convective motions of the Sun's interior. We demonstrate that a net difference of ~0.5 Mm in the supergranular emission length scale occurs when comparing the cycle 22/23 and cycle 23/24 minima. This variation in scale is reproduced in the data sets of multiple space- and ground-based instruments and using different diagnostic measures. By means of extension, we consider the variation of the supergranular length scale over multiple solar minima by analyzing a subset of the Mount Wilson Solar Observatory Ca II K image record. The observations and analysis presented provide a tantalizing look at solar activity in the absence of large-scale flux emergence, offering insight into times of "extreme" solar minimum and general behavior such as the phasing and cross-dependence of different components of the spectral irradiance. Given that the modulation of the supergranular scale imprints itself in variations of the Sun's spectral irradiance, as well as in the mass and energy transport into the entire outer atmosphere, this preliminary investigation is an important step in understanding the impact of the quiet Sun on the heliospheric system.

  20. OBSERVING EVOLUTION IN THE SUPERGRANULAR NETWORK LENGTH SCALE DURING PERIODS OF LOW SOLAR ACTIVITY

    We present the initial results of an observational study into the variation of the dominant length scale of quiet solar emission: supergranulation. The distribution of magnetic elements in the lanes that form the network affects, and reflects, the radiative energy in the plasma of the upper solar chromosphere and transition region at the magnetic network boundaries forming as a result of the relentless interaction of magnetic fields and convective motions of the Sun's interior. We demonstrate that a net difference of ∼0.5 Mm in the supergranular emission length scale occurs when comparing the cycle 22/23 and cycle 23/24 minima. This variation in scale is reproduced in the data sets of multiple space- and ground-based instruments and using different diagnostic measures. By means of extension, we consider the variation of the supergranular length scale over multiple solar minima by analyzing a subset of the Mount Wilson Solar Observatory Ca II K image record. The observations and analysis presented provide a tantalizing look at solar activity in the absence of large-scale flux emergence, offering insight into times of 'extreme' solar minimum and general behavior such as the phasing and cross-dependence of different components of the spectral irradiance. Given that the modulation of the supergranular scale imprints itself in variations of the Sun's spectral irradiance, as well as in the mass and energy transport into the entire outer atmosphere, this preliminary investigation is an important step in understanding the impact of the quiet Sun on the heliospheric system.

  1. Leukocyte Telomere Length in Healthy Caucasian and African-American Adolescents : Relationships with Race, Sex, Adiposity, Adipokines, and Physical Activity

    Zhu, Haidong; Wang, Xiaoling; Gutin, Bernard; Davis, Catherine L.; Keeton, Daniel; Thomas, Jeffrey; Stallmann-Jorgensen, Inger; Mooken, Grace; Bundy, Vanessa; Snieder, Harold; van der Harst, Pim; Dong, Yanbin

    2011-01-01

    Objective To examine the relationships of race, sex, adiposity, adipokines, and physical activity to telomere length in adolescents. Study design Leukocyte telomere length (T/S ratio) was assessed cross-sectionally in 667 adolescents (aged 14-18 years; 48% African-Americans; 51% girls) using a quant
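    The leukocyte telomere length (T/S ratio) named in this record is conventionally obtained from quantitative PCR as a delta-Ct comparison of telomere-repeat signal to a single-copy gene. A minimal sketch of that calculation follows; the Ct values are invented for illustration and this study's exact protocol is not reproduced here.

```python
import numpy as np

ct_telomere = np.array([18.2, 19.0, 17.5])   # telomere-repeat qPCR Cts (invented)
ct_single = np.array([24.1, 24.3, 23.8])     # single-copy gene Cts (invented)
ct_tel_ref, ct_scg_ref = 18.5, 24.0          # reference-sample Cts (invented)

# T/S = 2^-((Ct_tel - Ct_scg)_sample - (Ct_tel - Ct_scg)_reference)
t_s = 2.0 ** -((ct_telomere - ct_single) - (ct_tel_ref - ct_scg_ref))
print(t_s.round(2))  # relative telomere length per sample
```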

  2. Design information verification (DIV) of operating geological repositories (SAGOR activity 3b)

    Following IAEA Advisory and Consultants Group meetings in September 1988 and in May 1991 respectively an IAEA multi-national Support Programme Task was initiated to consider the 'Development of Safeguards for Final Disposal of Spent Fuel in Geological Repositories' (SAGOR). A 'Technical Coordination Committee' (TCC) was set up with invited representatives from those Member State Support Programmes wishing to be involved. The joint programme, through the TCC, was given the task of studying the safeguards requirements in: conditioning plant (where the spent fuel is prepared for transfer to the repository); operating repositories (i.e. those in which the fuel is being emplaced); closed repositories. At the first meeting of the TCC in Washington in July 1994 the UK undertook to provide a study of the Design Information Verification (DIV) required in all three areas. For this activity the requirements, techniques and procedures for the Design Information Verification (DIV) of operating repositories have been considered. In completing the study the findings reported for activities 1b and 2b (descriptions of a Model Repository and Potential Diversion Paths, respectively) have been used in formulating any conclusions reached. As with any facility there are a number of stages in its lifetime. For the purposes of this report the operating life of a repository is deemed to extend from its inception to when it is finally closed and the ground surface returned to being a green field. Areas where repositories differ from other safeguarded activities are highlighted in the model facility described in SAGOR activity 1b/c. Their impact makes it inevitable that DIV will play a key role in safeguarding an operational repository. They include: continual expansion during its operational life (the only current possible exception is that being proposed in Finland), flexible design during construction as geological features may be exposed which require the declared layout to be

  3. Design information verification (DIV) of conditioning plant for geological repositories (SAGOR activity 3a)

    Following IAEA Advisory and Consultants Group meetings in September 1988 and in May 1991 respectively an IAEA multi-national Support Programme Task was initiated to consider the 'Development of Safeguards for Final Disposal of Spent Fuel in Geological Repositories' (SAGOR). A Technical Coordination Committee (TCC) was set up with invited representatives from those Member State Support Programmes wishing to be involved. The joint programme, through the TCC, was given the task of studying the safeguards requirements in: conditioning plant (where the spent fuel is prepared for transfer to the repository); operating repositories (i.e. those in which the fuel is being emplaced); closed repositories. At the first meeting of the TCC in Washington in July 1994 the UK undertook to provide a study of the Design Information Verification (DIV) required in all three areas. For this activity the requirements, techniques and procedures for the Design Information Verification (DIV) of conditioning plant have been considered. In completing the study the findings reported for Activities 1a and 2a (descriptions of a Model Facility and Potential Diversion Paths, respectively) have been used in formulating any conclusions reached. The findings of Activity 1a show that a conditioning plant will be far simpler in concept than is a reprocessing plant. From this it must be concluded that DIV is unlikely to present any significant problems; experience based on reprocessing plant shows that it can be done both efficiently and effectively using proven techniques. In section 3.2, however, a number of valuable techniques which are currently being developed are also described. These have potential for effecting better DIV and could provide further evidence that the declared designs are (or are not) being fully complied with. More general DIV requirements are described in section 4. Like the detailed considerations mentioned above, these are no different from those already being applied in

  4. 77 FR 28401 - Information Collection Activities: Legacy Data Verification Process (LDVP); Submitted for Office...

    2012-05-14

    ... Verification Process (LDVP); Submitted for Office of Management and Budget (OMB) Review; Comment Request ACTION... comments on a collection of information that we will submit to the Office of Management and Budget (OMB... 1014-0009 in your comment and include your name and return address. FOR FURTHER INFORMATION...

  5. Groundwater flow code verification ''benchmarking'' activity (COVE-2A): Analysis of participants' work

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project

  6. Groundwater flow code verification ``benchmarking`` activity (COVE-2A): Analysis of participants` work

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project.
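    The core of a benchmarking exercise like COVE-2A is comparing several codes' numerical solutions of the same problem. A toy sketch of one such comparison, a relative L2 difference between two solution profiles on a shared grid, is shown below; the "hydraulic head" arrays are placeholders, not participants' outputs.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 101)                   # shared spatial grid
head_a = 1.0 - x**2 + 1e-3 * np.sin(20 * x)      # code A's solution profile
head_b = 1.0 - x**2 - 2e-3 * np.sin(17 * x)      # code B's solution profile

# One common intercomparison metric: the relative L2 norm of the difference.
rel_diff = np.linalg.norm(head_a - head_b) / np.linalg.norm(head_a)
print(f"relative L2 difference = {rel_diff:.2e}")
```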

  7. Telomerase activity is increased and telomere length shortened in T cells from blood of patients with atopic dermatitis and psoriasis

    Wu, Kehuai; Higashi, N; Hansen, E R;

    2000-01-01

    We studied telomerase activity and telomere length in PBMC and purified CD4(+) and CD8(+) T cells from blood obtained from a total of 32 patients with atopic dermatitis, 16 patients with psoriasis, and 30 normal controls. The telomerase activity was significantly increased in PBMC from the patients......(+) T cell subsets from normal donors. In conclusion, the increased telomerase activity and shortened telomere length indicate that T lymphocytes in atopic dermatitis and psoriasis are chronically stimulated and have an increased cellular turnover in vivo....

  8. FMCT verification: Case studies

    for states that have traditionally had 'less transparency' in their military sectors. As case studies, first we investigate how to applied verification measures including remote sensing, off-site environmental sampling and on-site inspections to monitor the shutdown status of plutonium production facilities, and what measures could be taken to prevent the disclosure of sensitive information at the site. We find the most effective verification measure to monitor the status of the reprocessing plant would be on-site environmental sampling. Some countries may worry that sample analysis could disclose sensitive information about their past plutonium production activities. However, we find that sample analysis at the reprocessing site need not reveal such information. Sampling would not reveal such information as long as inspectors are not able to measure total quantities of Cs-137 and Sr-90 from HLW produced at former military plutonium production facilities. Secondly, we consider verification measures for shutdown gaseous diffusion uranium-enrichment plants (GDPs). The GDPs could be monitored effectively by satellite imagery, as one telltale operational signature of the GDP would be the water-vapor plume coming from the cooling tower, which should be easy to detect with satellite images. Furthermore, the hot roof of the enrichment building could be detectable using satellite thermal-infrared images. Finally, some on-site verification measures should be allowed, such as visual observation, surveillance and tamper-indicating seals. Finally, FMCT verification regime would have to be designed to detect undeclared fissile material production activities and facilities. These verification measures could include something like special or challenge inspections or complementary access. There would need to be provisions to prevent the abuse of such inspections, especially at sensitive and non-proscribed military and nuclear activities. In particular, to protect sensitive

  9. Physical Education: The Effect of Epoch Lengths on Children’s Physical Activity in a Structured Context

    Aibar, Alberto; Chanal, Julien

    2015-01-01

    Background Despite a consensus emerging that affirms that shorter epochs should be used in youth to correctly register physical activity levels in free-living conditions, little is known about its effect on children’s physical activity conducted in structured periods of time. This study analyzed the effect that epoch length (1, 2, 3, 5, 10, 15, 30 and 60s) may have on different physical activity intensities in physical education lessons. Methods A sample of 1912 individual measures of physica...
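    A minimal sketch of the epoch-length question this record studies: the same 1 Hz accelerometer count stream, re-integrated at different epoch lengths, yields different estimates of time in moderate-to-vigorous physical activity (MVPA). The counts and the cut-point are synthetic assumptions, not the study's instrument or calibration.

```python
import numpy as np

rng = np.random.default_rng(4)
counts_1s = rng.poisson(30, 40 * 60)   # one 40-min PE lesson sampled at 1 Hz
CUTPOINT_PER_MIN = 2296                # hypothetical MVPA cut-point (counts/min)

for epoch in (1, 5, 15, 30, 60):
    n = counts_1s.size // epoch * epoch
    per_epoch = counts_1s[:n].reshape(-1, epoch).sum(axis=1)   # re-integrate
    mvpa = per_epoch >= CUTPOINT_PER_MIN * epoch / 60          # scale cut-point
    print(f"{epoch:2d}s epochs: MVPA = {mvpa.mean() * 100:4.1f}% of lesson")
```

Because short epochs capture brief bursts of activity that longer epochs average away, the printed MVPA fraction shrinks as the epoch grows, which is the effect the record quantifies.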

  10. Status of the Agency's verification activities in Iraq as of 8 January 2003. Statement by the Director General. New York, 09 January 2003

    The following information is provided to update the Council on the activities of the IAEA pursuant to Security Council resolution 1441 (2002) and other relevant resolutions. It describes the verification activities performed thus far, next steps, and where we are at this stage

  11. Ground-based verification and data processing of Yutu rover Active Particle-induced X-ray Spectrometer

    Guo, Dong-Ya; Wang, Huan-Yu; Peng, Wen-Xi; Cui, Xing-Zhu; Zhang, Cheng-Mo; Liu, Ya-Qing; Liang, Xiao-Hua; Dong, Yi-Fan; Wang, Jin-Zhou; Gao, Min; Yang, Jia-Wei; Zhang, Jia-Yu; Li, Chun-Lai; Zou, Yong-Liao; Zhang, Guang-Liang; Zhang, Li-Yan; Fu, Xiao-Hui

    2015-07-01

    The Active Particle-induced X-ray Spectrometer (APXS) is one of the payloads on board the Yutu rover of the Chang'E-3 mission. In order to assess the instrumental performance of APXS, a ground verification test was performed for two unknown samples (basaltic rock, mixed powder sample). In this paper, the details of the experiment configurations and data analysis method are presented. The results show that the elemental abundance of major elements can be well determined by the APXS with relative deviations <15 wt.% (detection distance=30 mm, acquisition time=30 min). The derived detection limit of each major element is inversely proportional to acquisition time and directly proportional to detection distance, suggesting that the appropriate distance should be <50 mm. Supported by National Science and Technology Major Project (Chang'E-3 Active Particle-induced X-ray Spectrometer)
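    A toy illustration of the scaling both APXS records report, with detection limit proportional to detection distance and inversely proportional to acquisition time; the constant k is a made-up placeholder, not a calibrated value.

```python
def detection_limit_wt_pct(distance_mm, time_min, k=0.05):
    """Detection limit ~ k * distance / time; k is a placeholder constant."""
    return k * distance_mm / time_min

print(detection_limit_wt_pct(30, 30))   # the test geometry: 30 mm, 30 min
print(detection_limit_wt_pct(50, 30))   # larger stand-off worsens the limit
print(detection_limit_wt_pct(30, 60))   # longer acquisition improves it
```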

  12. Nuclear test ban verification

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  13. Telomere Lengths and Telomerase Activity in Dog Tissues: A Potential Model System to Study Human Telomere and Telomerase Biology

    Lubna Nasir

    2001-01-01

    Full Text Available Studies on telomere and telomerase biology are fundamental to the understanding of aging and age-related diseases such as cancer. However, human studies have been hindered by differences in telomere biology between humans and the classical murine animal model system. In this paper, we describe basic studies of telomere length and telomerase activity in canine normal and neoplastic tissues and propose the dog as an alternative model system. Briefly, telomere lengths were measured in normal canine peripheral blood mononuclear cells (PBMCs), a range of normal canine tissues, and in a panel of naturally occurring soft tissue tumours by terminal restriction fragment (TRF) analysis. Further, telomerase activity was measured in canine cell lines and multiple canine tissues using a combined polymerase chain reaction/enzyme-linked immunosorbent assay method. TRF analysis in canine PBMCs and tissues demonstrated mean TRF lengths to range between 12 and 23 kbp, with heterogeneity in telomere lengths being observed in a range of normal somatic tissues. In soft tissue sarcomas, two subgroups were identified with mean TRFs of 22.2 and 18.2 kbp. Telomerase activity in canine tissue was present in tumour tissue and testis with little or no activity in normal somatic tissues. These results suggest that dog telomere biology is similar to that of humans and that the dog may represent an alternative model system for studying telomere biology and telomerase-targeted anticancer therapies.

  14. Ground-based verification and data processing of Yutu rover Active Particle-induced X-ray Spectrometer

    Guo, Dongya; Peng, Wenxi; Cui, Xingzhu; Zhang, Chengmo; Liu, Yaqing; Liang, Xiaohua; Dong, Yifan; Wang, Jinzhou; Gao, Min; Yang, Jiawei; Zhang, Jiayu; Li, Chunlai; Zou, Yongliao; Zhang, Guangliang; Zhang, Liyan; Fu, Xiaohui

    2015-01-01

    The Active Particle-induced X-ray Spectrometer (APXS) is one of the payloads on board the Yutu rover of Chang'E-3 mission. In order to assess the instrumental performance of APXS, a ground verification test was done for two unknown samples (basaltic rock, mixed powder sample). In this paper, the details of the experiment configurations and data analysis method are presented. The results show that the elemental abundance of major elements can be well determined by the APXS with relative deviations < 15 wt. % (detection distance = 30 mm, acquisition time = 30 min). The derived detection limit of each major element is inversely proportional to acquisition time and directly proportional to detection distance, suggesting that the appropriate distance should be < 50mm.

  15. Medial gastrocnemius muscle fascicle active torque-length and Achilles tendon properties in young adults with spastic cerebral palsy.

    Barber, Lee; Barrett, Rod; Lichtwark, Glen

    2012-10-11

    Individuals with spastic cerebral palsy (CP) typically experience muscle weakness. The mechanisms responsible for muscle weakness in spastic CP are complex and may be influenced by the intrinsic mechanical properties of the muscle and tendon. The purpose of this study was to investigate the medial gastrocnemius (MG) muscle fascicle active torque-length and Achilles tendon properties in young adults with spastic CP. Nine relatively high functioning young adults with spastic CP (GMFCS I, 17±2 years) and 10 typically developing individuals (18±2 years) participated in the study. Active MG torque-length and Achilles tendon properties were assessed under controlled conditions on a dynamometer. EMG was recorded from leg muscles and ultrasound was used to measure MG fascicle length and Achilles tendon length during maximal isometric contractions at five ankle angles throughout the available range of motion and during passive rotations imposed by the dynamometer. Compared to the typically developing group, the spastic CP group had 33% lower active ankle plantarflexion torque across the available range of ankle joint motion, partially explained by 37% smaller MG muscle and 4% greater antagonistic co-contraction. The Achilles tendon slack length was also 10% longer in the spastic CP group. This study confirms young adults with mild spastic CP have altered muscle-tendon mechanical properties. The adaptation of a longer Achilles tendon may facilitate a greater storage and recovery of elastic energy and partially compensate for decreased force and work production by the small muscles of the triceps surae during activities such as locomotion. PMID:22867763

  16. The changes in telomerase activity and telomere length in HeLa cells undergoing apoptosis induced by sodium butyrate

    2001-01-01

    The changes in telomerase activity and telomere length during apoptosis in HeLa cells induced by sodium butyrate (SB) have been studied. After a 48 h SB treatment, HeLa cells demonstrated characteristic apoptotic hallmarks including chromatin condensation, formation of apoptotic bodies and DNA laddering caused by the cleavage and degradation of DNA between nucleosomes. There were no significant changes in the telomerase activity of apoptotic cells, while the telomere length shortened markedly. Meanwhile, cells became more susceptible to apoptotic stimuli and the telomere became more vulnerable to degradation after telomerase activity was inhibited. All the results suggest that the apoptosis induced by SB is closely related to telomere shortening, while telomerase enhances the resistance of HeLa cells to apoptotic stimuli by protecting the telomere.

  17. Translating activity diagram from duration calculus for modeling of real-time systems and its formal verification using UPPAAL and DiVinE

    RTS (real-time systems) are widely used in industry, home appliances, life-saving systems, aircraft, and automatic weapons. These systems demand high accuracy, safety, and reliability, and their accurate graphical modeling and verification is challenging. Formal methods have made it possible to model such systems with greater accuracy. In this paper, we envision a strategy to overcome the inadequacy of SysML (System Modeling Language) for modeling and verification of RTS, and illustrate the framework by applying it to a case study of a fuel filling machine. We have defined DC (Duration Calculus) implementation-based formal semantics to specify the functionality of RTS. The activity diagram is then generated from these semantics. Finally, the graphical model is verified using the UPPAAL and DiVinE model checkers for validation of timed and untimed properties with accelerated verification speed. Our results suggest the use of this methodology for modeling and verification of large-scale real-time systems with reduced verification cost. (author)

  18. ANATOMY OF SOLAR CYCLE LENGTH AND SUNSPOT NUMBER: DEPENDENCE OF AVERAGE GLOBAL TEMPERATURE ON SOLAR ACTIVITY

    Bhattacharya, A. B.; B. RAHA; Das, T.; M. Debnath; D. HALDER

    2011-01-01

    The paper examines thoroughly all of the past 23 sunspot cycles and the associated 11 Hale cycles. It is noticed that solar cycle 23 had a deep minimum with the longest decline phase. When solar cycles 20 to 23 are compared with solar cycles 1 to 4, the forthcoming Dalton minimum can be expected. The predicted variation of sunspot number for the present solar cycle 24 is examined at length and it appears that the peak monthly sunspot number of solar cycle 24 will be around 80. We have correlated...

  19. North Atlantic Basin Tropical Cyclone Activity in Relation to Temperature and Decadal- Length Oscillation Patterns

    Wilson, Robert M.

    2009-01-01

    Yearly frequencies of North Atlantic basin tropical cyclones, their locations of origin, peak wind speeds, average peak wind speeds, lowest pressures, and average lowest pressures for the interval 1950-2008 are examined. The effects of El Nino and La Nina on the tropical cyclone parametric values are investigated. Yearly and 10-year moving average (10-yma) values of tropical cyclone parameters are compared against those of temperature and decadal-length oscillation, employing both linear and bi-variate analysis, and first differences in the 10-yma are determined. Discussion of the 2009 North Atlantic basin hurricane season, updating earlier results, is given.

  20. Association of day length and weather conditions with physical activity levels in older community dwelling people

    Witham, Miles D; Donnan, Peter T.; Thenmalar Vadiveloo; Sniehotta, Falko F.; Crombie, Iain K; Zhiqiang Feng; McMurdo, Marion E. T.

    2014-01-01

    BACKGROUND: Weather is a potentially important determinant of physical activity. Little work has been done examining the relationship between weather and physical activity, and potential modifiers of any relationship in older people. We therefore examined the relationship between weather and physical activity in a cohort of older community-dwelling people. METHODS: We analysed prospectively collected cross-sectional activity data from community-dwelling people aged 65 and over in the Physical...

  1. A self-centering active probing technique for kinematic parameter identification and verification of articulated arm coordinate measuring machines

    A crucial task in the procedure of identifying the parameters of a kinematic model of an articulated arm coordinate measuring machine (AACMM) or robot arm is the process of capturing data. In this paper a data capture method is analyzed using a self-centering active probe, which drastically reduces the capture time and the required number of positions of the gauge as compared to the usual standard and manufacturer methods. The mathematical models of the self-centering active probe and AACMM are explained, as well as the mathematical model that links the AACMM global reference system to the probe reference system. We present a self-calibration method that allows us to determine a homogeneous transformation matrix relating the probe's reference system to the AACMM's last reference system from the probing of a single sphere. In addition, a comparison between a self-centering passive probe and a self-centering active probe is carried out to show the advantages of the latter in the procedures of kinematic parameter identification and verification of the AACMM
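    A small sketch of the chaining the abstract describes: once a homogeneous transformation from the AACMM's last reference system to the probe's system has been identified, the probe tip is mapped into the global frame by matrix composition. The matrices below are illustrative placeholders, not identified calibration results.

```python
import numpy as np

def rot_z(theta):
    """Rotation about the z axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def homogeneous(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Arm forward kinematics up to the last joint frame (illustrative numbers, mm).
T_global_last = homogeneous(rot_z(0.3), [350.0, 120.0, 90.0])
# Identified probe mounting transform (what the self-calibration would yield).
T_last_probe = homogeneous(np.eye(3), [0.0, 0.0, 75.0])

tip_probe = np.array([0.0, 0.0, 0.0, 1.0])   # tip at the probe-frame origin
tip_global = T_global_last @ T_last_probe @ tip_probe
print(tip_global[:3])                        # tip coordinates in the global frame
```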

  2. Professional verification a guide to advanced functional verification

    Wilcox, Paul

    2007-01-01

    The Profession of Verification.- Verification Challenges.- Advanced Functional Verification.- Successful Verification.- Professional Verification.- The Unified Verification Methodology.- The Unified Verification Methodology.- UVM System-Level Design.- Control Digital Subsystems.- Algorithmic Digital Subsystems.- Analog/RF Subsystems.- Integration and System Verification.- Tools of the Trade.- System-Level Design.- Formal Verification Tools.- Testbench Development.- Advanced Testbenches.- Hardware-Based Verification.

  3. Experimental verification of the control mechanism of marketing activities of small consulting organization

    Kirushkin, A. A.

    2013-01-01

    This article contains the results of testing the new mechanism for managing the marketing activity of a small consulting organization, proposed in the author's previous article (see Journal Marketing MBA, 2013, issue 1). The testing was done by computer simulation of the organization's activity under several sets of initial data. The simulations yielded recommendations on managing the marketing activity of small consulting organizations.

  4. A "Kane's Dynamics" Model for the Active Rack Isolation System Part Two: Nonlinear Model Development, Verification, and Simplification

    Beech, G. S.; Hampton, R. D.; Rupert, J. K.

    2004-01-01

    Many microgravity space-science experiments require vibratory acceleration levels that are unachievable without active isolation. The Boeing Corporation's active rack isolation system (ARIS) employs a novel combination of magnetic actuation and mechanical linkages to address these isolation requirements on the International Space Station. Effective model-based vibration isolation requires: (1) an isolation device, (2) an adequate dynamic (i.e., mathematical) model of that isolator, and (3) a suitable, corresponding controller. This Technical Memorandum documents the validation of that high-fidelity dynamic model of ARIS. The verification of this dynamics model was achieved by utilizing two commercial off-the-shelf (COTS) software tools: Deneb's ENVISION (registered trademark) and Online Dynamics' Autolev (trademark). ENVISION is a robotics software package developed for the automotive industry that employs three-dimensional computer-aided design models to facilitate both forward and inverse kinematics analyses. Autolev is a DOS-based interpreter designed, in general, to solve vector-based mathematical problems and specifically to solve dynamics problems using Kane's method. The simplification of this model was achieved using the small-angle theorem for the joint angle of the ARIS actuators. This simplification has a profound effect on the overall complexity of the closed-form solution while yielding a closed-form solution easily employed using COTS control hardware.

  5. The length of a lantibiotic hinge region has profound influence on antimicrobial activity and host specificity

    Liang eZhou

    2015-01-01

    Full Text Available Lantibiotics are ribosomally synthesized (methyl)lanthionine-containing peptides which can efficiently inhibit the growth of Gram-positive bacteria. As lantibiotics kill bacteria efficiently and resistance to them is difficult to obtain, they have the potential to be used in many applications, e.g. in the pharmaceutical or food industry. Nisin can inhibit the growth of Gram-positive bacteria by binding to lipid II and by making pores in their membrane. The C-terminal part of nisin is known to play an important role during translocation over the membrane and in forming pore complexes. However, as the thickness of bacterial membranes varies between different species and environmental conditions, this property could have an influence on the pore-forming activity of nisin. To investigate this, the so-called hinge region of nisin (residues NMK) was engineered to vary from one to six amino acid residues and the specific activity against different indicators was compared. Antimicrobial activity in liquid culture assays showed that wild-type nisin is the most active, while truncation of the hinge region dramatically reduced the activity of the peptide. However, extensions of one or two amino acids showed only slightly reduced activity against most indicator strains. Notably, some variants (+2, +1, -1, -2) exhibited higher antimicrobial activity than nisin in agar well diffusion assays against Lactococcus lactis MG1363, Listeria monocytogenes, Enterococcus faecalis VE14089, Bacillus sporothermodurans IC4 and Bacillus cereus 4153 at certain temperatures.

  6. Nuclear disarmament verification

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  7. Nuclear disarmament verification

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  8. The chain length of biologically produced (R)-3-hydroxyalkanoic acid affects biological activity and structure of anti-cancer peptides.

    Szwej, Emilia; Devocelle, Marc; Kenny, Shane; Guzik, Maciej; O'Connor, Stephen; Nikodinovic-Runic, Jasmina; Radivojevic, Jelena; Maslak, Veselin; Byrne, Annete T; Gallagher, William M; Zulian, Qun Ren; Zinn, Manfred; O'Connor, Kevin E

    2015-06-20

    Conjugation of the DP18L peptide with (R)-3-hydroxydecanoic acid, derived from the biopolymer polyhydroxyalkanoate, enhances its anti-cancer activity (O'Connor et al., 2013. Biomaterials 34, 2710-2718). However, it is unknown whether other (R)-3-hydroxyalkanoic acids (R3HAs) can enhance peptide activity, whether chain length affects enhancement, and what effect R3HAs have on peptide structure. Here we show that the degree of enhancement of peptide (DP18L) anti-cancer activity by R3HAs is carbon chain length dependent. In all but one example the R3HA-conjugated peptides were more active against cancer cells than the unconjugated peptides. However, R3HAs with 9 and 10 carbons were most effective at improving DP18L activity. The DP18L peptide variant DP17L, missing a hydrophobic amino acid (leucine residue 4), exhibited lower efficacy against MiaPaCa cells. Circular dichroism analysis showed DP17L had a lower alpha-helix content, and the conjugation of any R3HA ((R)-3-hydroxyhexanoic acid to (R)-3-hydroxydodecanoic acid) to DP17L returned the helix content back to the levels of DP18L. However, (R)-3-hydroxyhexanoic acid did not enhance the anti-cancer activity of DP17L, and at least 7 carbons were needed in the R3HA to enhance the activity of DP17L. DP17L needs a longer chain R3HA to achieve the same activity as DP18L conjugated to an R3HA. As a first step to assess the synthetic potential of polyhydroxyalkanoate-derived R3HAs, (R)-3-hydroxydecanoic acid was synthetically converted to (±)-3-chlorodecanoic acid, which when conjugated to DP18L improved its antiproliferative activity against MiaPaCa cells. PMID:25820126

  9. AutoProof: Auto-active Functional Verification of Object-oriented Programs

    Tschannen, Julian; Furia, Carlo A.; Nordio, Martin; Polikarpova, Nadia

    2015-01-01

    Auto-active verifiers provide a level of automation intermediate between fully automatic and interactive: users supply code with annotations as input while benefiting from a high level of automation in the back-end. This paper presents AutoProof, a state-of-the-art auto-active verifier for object-oriented sequential programs with complex functional specifications. AutoProof fully supports advanced object-oriented features and a powerful methodology for framing and class invariants, which make...
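
    As a language-neutral illustration of the auto-active workflow described above (user-supplied annotations in the code, automated proof in the back-end), the sketch below shows the shape of such annotations written as runtime contract checks in Python. AutoProof itself targets Eiffel, so nothing here is AutoProof syntax; an auto-active verifier would discharge these assertions statically rather than executing them.

        def withdraw(balance: int, amount: int) -> int:
            # Precondition annotations: what the caller must guarantee.
            assert 0 <= amount <= balance
            result = balance - amount
            # Postcondition annotations: what the routine guarantees on exit.
            assert result >= 0 and result + amount == balance
            return result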

  10. Modulating anti-MicroRNA-21 activity and specificity using oligonucleotide derivatives and length optimization

    Munoz-Alarcon, Andres; Guterstam, Peter; Romero, Cristian;

    2012-01-01

    MicroRNAs are short, endogenous RNAs that direct posttranscriptional regulation of gene expression vital for many developmental and cellular functions. Implicated in the pathogenesis of several human diseases, this group of RNAs provides interesting targets for therapeutic intervention. Anti-microRNA oligonucleotides constitute a class of synthetic antisense oligonucleotides used to interfere with microRNAs. In this study, we investigate the effects of chemical modifications and truncations on activity and specificity of anti-microRNA oligonucleotides targeting microRNA-21. We observed an increased activity...

  11. Honey, I Shrunk the DNA : DNA Length as a Probe for Nucleic-Acid Enzyme Activity

    Oijen, Antoine M. van

    2007-01-01

    The replication, recombination, and repair of DNA are processes essential for the maintenance of genomic information and require the activity of numerous enzymes that catalyze the polymerization or digestion of DNA. This review will discuss how differences in elastic properties between single- and d

  12. Simulated sudden increase in geomagnetic activity and its effect on heart rate variability: Experimental verification of correlation studies

    Caswell, Joseph M.; Singh, Manraj; Persinger, Michael A.

    2016-08-01

    Previous research investigating the potential influence of geomagnetic factors on human cardiovascular state has tended to converge upon similar inferences although the results remain relatively controversial. Furthermore, previous findings have remained essentially correlational without accompanying experimental verification. An exception to this was noted for human brain activity in a previous study employing experimental simulation of sudden geomagnetic impulses in order to assess correlational results that had demonstrated a relationship between geomagnetic perturbations and neuroelectrical parameters. The present study employed the same equipment in a similar procedure in order to validate previous findings of a geomagnetic-cardiovascular dynamic with electrocardiography and heart rate variability measures. Results indicated that potential magnetic field effects on frequency components of heart rate variability tended to overlap with previous correlational studies where low frequency power and the ratio between low and high frequency components of heart rate variability appeared affected. In the present study, a significant increase in these particular parameters was noted during geomagnetic simulation compared to baseline recordings.
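
    The frequency components referred to above are conventionally computed from an evenly resampled RR-interval series. The sketch below uses an invented synthetic series and the usual band definitions (LF 0.04-0.15 Hz, HF 0.15-0.4 Hz) to show how an LF/HF ratio of the kind reported here can be estimated; it is an illustration, not the study's analysis pipeline.

        import numpy as np
        from scipy.signal import welch

        fs = 4.0                                   # Hz, resampling rate of the RR series
        t = np.arange(0.0, 300.0, 1.0 / fs)
        # Synthetic RR series with an LF (0.1 Hz) and an HF (0.25 Hz) oscillation.
        rr = 0.8 + 0.03 * np.sin(2 * np.pi * 0.1 * t) + 0.02 * np.sin(2 * np.pi * 0.25 * t)

        f, pxx = welch(rr - rr.mean(), fs=fs, nperseg=512)

        def band_power(lo, hi):
            band = (f >= lo) & (f < hi)
            return np.trapz(pxx[band], f[band])

        lf, hf = band_power(0.04, 0.15), band_power(0.15, 0.40)
        print(f"LF/HF ratio: {lf / hf:.2f}")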

  13. Length and activity of the root apical meristem revealed in vivo by infrared imaging

    Bizet, François; Hummel, Iréne; Bogeat-Triboulot, Marie-Béatrice

    2014-01-01

    Understanding how cell division and cell elongation influence organ growth and development is a long-standing issue in plant biology. In plant roots, most of the cell divisions occur in a short and specialized region, the root apical meristem (RAM). Although RAM activity has been suggested to be of high importance to understand how roots grow and how the cell cycle is regulated, few experimental and numeric data are currently available. The characterization of the RAM is difficult and essenti...

  14. Verification of a characterization method of the laser-induced selective activation based on industrial lasers

    Zhang, Yang; Hansen, Hans Nørgaard; Tang, Peter T.;

    2013-01-01

    In this article, laser-induced selective activation (LISA) for subsequent autocatalytic copper plating is performed by several types of industrial scale lasers, including a Nd:YAG laser, a UV laser, a fiber laser, a green laser, and a short pulsed laser. Based on analysis of all the laser-machine...

  15. Verification and validation benchmarks.

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
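
    A code verification benchmark based on a manufactured solution, as recommended here, can be set up in a few lines. The sketch below is an illustrative choice of solver and solution, not taken from the paper: it manufactures the forcing of a 1D Poisson problem from u(x) = sin(pi x) and checks that a second-order finite-difference solver reproduces its formal order of accuracy.

        import numpy as np

        def l2_error(n):
            # Solve -u'' = f on (0,1), u(0) = u(1) = 0, with f manufactured
            # from the chosen exact solution u(x) = sin(pi x).
            h = 1.0 / (n + 1)
            x = np.linspace(h, 1.0 - h, n)           # interior nodes
            f = np.pi ** 2 * np.sin(np.pi * x)       # -u'' of the manufactured solution
            A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h ** 2
            u = np.linalg.solve(A, f)
            return np.sqrt(h * np.sum((u - np.sin(np.pi * x)) ** 2))

        # Halving h should divide the error by ~4 for a second-order scheme;
        # an observed order near 2 verifies the coding of the solver.
        e_coarse, e_fine = l2_error(31), l2_error(63)
        print("observed order of accuracy:", np.log2(e_coarse / e_fine))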

  16. Length of guanosine homopolymeric repeats modulates promoter activity of subfamily II tpr genes of Treponema pallidum ssp. pallidum

    Giacani, Lorenzo; Lukehart, Sheila; Centurion-Lara, Arturo

    2007-01-01

    In Treponema pallidum, homopolymeric guanosine repeats of varying length are present upstream of both Subfamily I (tprC, D, F and I) and II (tprE, G and J) tpr genes, a group of potential virulence factors, immediately upstream of the +1 nucleotide. To investigate the influence of these poly-G sequences on promoter activity, tprE, G, J, F and I promoter regions containing homopolymeric tracts with different numbers of Gs, the ribosomal binding site and start codon were cloned in frame with th...

  17. Identification of uranium signatures in swipe samples on verification of nuclear activities for nuclear safeguards purposes

    Environmental sampling for safeguards purposes has been applied by the International Atomic Energy Agency (IAEA) since 1996 and is routinely used as a complementary measure to strengthen traditional nuclear safeguards procedures. The aim is to verify that states signatory to the safeguards agreements are not diverting their peaceful nuclear activities toward undeclared nuclear activities. This work describes a new protocol for the collection and analysis of swipe samples to identify nuclear signatures that may be related to the nuclear activities carried out in the inspected facility. A real uranium conversion plant of the IPEN nuclear fuel cycle was used as a case study. The proposed strategy uses different analytical techniques (alpha radiation metering, SEM-EDX and ICP-MS) to identify signatures of uranium adhering to the swipe samples. In the swipe sample analysis it was possible to identify particles of UO2F2 and UF4 through morphological comparison and semi-quantitative analyses performed by the SEM-EDX technique. The methods used yield the average isotopic composition of the sample, with enrichment ranging from 1.453 % ± 0.023 % to 18.24 % ± 0.15 % in the 235U isotope. Through these external, non-intrusive collections it was possible to identify the handling of material enriched from 1.453 % ± 0.023 % to 6.331 % ± 0.055 % in the 235U isotope, as well as the use of reprocessed material, through identification of the 236U isotope. The uncertainties obtained for the n(235U)/n(238U) ratio varied from 0.40 % to 0.86 % for the internal swipe samples. (author)
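
    To illustrate the arithmetic behind the quoted figures, the sketch below converts ICP-MS atom ratios n(23xU)/n(238U) into an atom-percent 235U enrichment. The ratio values are invented for illustration, not the paper's data; a nonzero 236U ratio is what flags reprocessed material, as in the abstract.

        def enrichment_235(r234, r235, r236):
            # Atom-percent 235U from atom ratios relative to 238U;
            # the 1.0 term is the 238U reference isotope itself.
            total = 1.0 + r234 + r235 + r236
            return 100.0 * r235 / total

        # A 235U/238U ratio near 0.0467 corresponds to roughly 4.5 % enrichment.
        print(round(enrichment_235(r234=3.0e-4, r235=0.0467, r236=1.0e-4), 3))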

  18. Load rather than length sensitive feedback contributes to soleus muscle activity during human treadmill walking

    Klint, Richard Albin Ivar af; Mazzaro, Nazarena; Nielsen, Jens Bo;

    2010-01-01

    body load and ankle joint angle. The volunteers walked on a treadmill (approximately 3.6 km/h) connected to a body weight support (BWS) system. To manipulate the load-sensitive afferents, the level of BWS was switched between 5 and 30% of body weight. The effect of transient changes in BWS on the...... significant difference was observed for the SLR (P = 0.13). Similarly, the effect of the BWS was measured on the unload response, i.e., the depression in soleus activity following a plantar-flexion perturbation (approximately 5.6 degrees, 203-247 degrees/s), quantified over a 50 ms analysis window. The...... unload response decreased with decreased load (P < 0.001), but was not significantly affected (P = 0.45) by tizanidine-induced depression of the MLR (P = 0.039, n = 6). Since tizanidine is believed to depress the group II afferent pathway, these results are consistent with the idea that force...

  19. Active neutron interrogation for verification of storage of weapons components at the Oak Ridge Y-12 Plant

    A nuclear weapons identification system (NWIS), under development since 1984 at the Oak Ridge Y-12 Plant and presently in use there, uses active neutron interrogation with low-intensity 252Cf sources in ionization chambers to provide a timed source of fission neutrons from the spontaneous fission of 252Cf. To date, measurements have been performed on ∼15 different weapons systems in a variety of configurations both in and out of containers. Those systems included pits and fully assembled systems ready for deployment at the Pantex Plant in Amarillo, Texas, and weapons components at the Oak Ridge Y-12 Plant. These measurements have shown that NWIS can identify nuclear weapons and/or components; nuclear weapons/components can be distinguished from mockups where fissile material has been replaced by nonfissile material; omissions of small amounts (4%) of fissile material can be detected; changes in internal configurations can be determined; trainer parts can be identified, as was demonstrated by verification of 512 containers with B33 components at the Y-12 Plant (as many as 32 in one 8-hour shift); and nonfissile components can be identified. The current NWIS activities at the Oak Ridge Y-12 Plant include: (1) further development of the system for more portability and lower power consumption, (2) collection of reference signatures for all weapons components in containers, and (3) confirmation of a particular weapons component in storage and confirmation of receipts. This paper describes the recent measurements with NWIS for a particular weapons component in storage that have resolved an Inspector General (IG) audit finding with regard to performance of confirmation of inventory

  20. Thermodynamic compatibility of actives encapsulated into PEG-PLA nanoparticles: In Silico predictions and experimental verification.

    Erlebach, Andreas; Ott, Timm; Otzen, Christoph; Schubert, Stephanie; Czaplewska, Justyna; Schubert, Ulrich S; Sierka, Marek

    2016-09-15

    Achieving optimal solubility of active substances in polymeric carriers is of fundamental importance for a number of industrial applications, including targeted drug delivery within the growing field of nanomedicine. However, its experimental optimization using a trial-and-error approach is cumbersome and time-consuming. Here, an approach based on molecular dynamics (MD) simulations and the Flory-Huggins theory is proposed for rapid prediction of thermodynamic compatibility between active species and copolymers comprising hydrophilic and hydrophobic segments. In contrast to similar methods, our approach offers high computational efficiency by employing MD simulations that avoid explicit consideration of the actual copolymer chains. The accuracy of the method is demonstrated for compatibility predictions between pyrene and nile red as model dyes as well as indomethacin as model drug and copolymers containing blocks of poly(ethylene glycol) and poly(lactic acid) in different ratios. The results of the simulations are directly verified by comparison with the observed encapsulation efficiency of nanoparticles prepared by nanoprecipitation. © 2016 Wiley Periodicals, Inc. PMID:27425625
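
    The abstract does not spell out its compatibility metric; one common route (not necessarily the authors' procedure, which derives its inputs from MD simulations) estimates the Flory-Huggins interaction parameter from Hildebrand solubility parameters, with small chi favouring miscibility. A minimal sketch with hypothetical parameter values:

        R = 8.314  # gas constant, J/(mol K)

        def flory_huggins_chi(delta_a, delta_b, v_ref=100.0e-6, T=298.15):
            # chi = V_ref * (delta_a - delta_b)^2 / (R T), deltas in Pa^0.5,
            # V_ref in m^3/mol (100 cm^3/mol is a typical reference volume).
            return v_ref * (delta_a - delta_b) ** 2 / (R * T)

        # Hypothetical solubility parameters in MPa^0.5, converted to Pa^0.5:
        chi = flory_huggins_chi(22.3e3, 20.2e3)   # drug vs. PLA-like block
        print(f"chi = {chi:.2f}  (chi < 0.5 is commonly read as miscible)")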

  1. Photocatalytic Activity and Photocurrent Properties of TiO2 Nanotube Arrays Influenced by Calcination Temperature and Tube Length

    Hou, Jian; Zhang, Min; Yan, Guotian; Yang, Jianjun

    2012-06-01

    In this article, titanium oxide nanotube arrays (TiO2-NTAs) were fabricated by anodic oxidation in an ethylene glycol (EG) electrolyte solution containing 0.25 wt.% NH4F. By varying the anodization time and annealing temperature, the resulting nanotube arrays exhibited different photocatalytic (PC) activities and photocurrent properties. The samples were characterized by scanning electron microscopy (SEM) and X-ray powder diffraction (XRD). SEM images indicated that the TiO2 nanotubes form a highly ordered structure, which is completely destroyed when the temperature reaches 800°C. XRD showed that TiO2 nanotubes of all lengths possessed anatase crystallites when annealed at 500°C; meanwhile, for a given length, TiO2-NTAs annealed over the calcination temperature range of 300-600°C also presented anatase crystallites, gradually enhanced with increasing temperature. At 700°C, a mixed structure was observed, made up of predominantly anatase with a small proportion of rutile. Methyl blue (MB) degradation and photocurrent measurements showed that the TiO2-NTAs produced by 4 h of oxidation and 3 h of calcination at 600°C exhibited the highest activity and photocurrent density.

  2. Production of medium-chain-length polyhydroxyalkanoates by activated sludge enriched under periodic feeding with nonanoic acid.

    Lee, Sun Hee; Kim, Jae Hee; Mishra, Debaraj; Ni, Yu-Yang; Rhee, Young Ha

    2011-05-01

    The potential use of activated sludge for the production of medium-chain-length polyhydroxyalkanoates (MCL-PHAs) was investigated. The enrichment of bacterial populations capable of producing MCL-PHAs was achieved by periodic feeding with nonanoic acid in a sequencing batch reactor (SBR). Denaturing gradient gel electrophoresis analysis revealed Pseudomonas aeruginosa strains to be predominant in the bacterial community during the SBR process. The composition of PHA synthesized by the enriched biomass from nonanoic acid consisted of a large concentration (>89 mol%) of MCL monomer units and a small amount of short-chain-length monomer units. Under fed-batch fermentation with continuous feeding of nonanoic acid at a flow rate of 0.225 g/L/h and a C/N ratio of 40, a maximum PHA content of 48.6% dry cell weight and a conversion yield (Y(p/s)) of 0.94 g/g were achieved. These results indicate that MCL-PHA production by activated sludge is a promising alternative to typical pure culture approaches. PMID:21463934

  3. Verification of the viability of virions detection using neutron activation analysis

    The use of nuclear techniques such as neutron activation analysis can be an alternative for microbiological diagnosis, offering a significant gain in analysis time because samples do not need to be pre-cultivated in an appropriate medium. In this technique, the samples are collected and exposed to a thermal neutron beam. The interaction of these neutrons with the samples generates gamma rays whose energy spectrum is characteristic of the elemental composition of the samples. In this way, the presence of a virus can be detected in a sample by distinguishing the respective elemental compositions, which also allows the analysis to be carried out in real time. In this work, computational simulations were performed using MCNP4B, a radiation transport code based on the Monte Carlo method, to assess the feasibility of applying this system to the detection of virus particles in their natural collection environment. (author)

  4. On-site characterization and verification for removal of cesium contamination along an active rail spur

    A rapid, cost-effective sampling and analysis approach was used to conduct on-site analysis of soil and ballast for 137Cs contamination along an active rail spur in Oak Ridge, Tennessee. Savings in time and analytical costs were realized for both the characterization and cleanup phases of this project. For characterization, a procedure was developed to conduct gamma-logging of shallow boreholes. Portable gamma radiation survey instruments consisting of a 5- by 5-cm NaI detector coupled to a digital ratemeter/scaler with a single-channel analyzer were calibrated to a 137Cs source, then used for both surface and subsurface gamma radiation surveys. A correlation was made between the direct radiation measurements and 137Cs concentrations in surface and subsurface soil and ballast. Borehole logs of gamma-ray activity as a function of depth were used to estimate vertical profiles of the subsurface distribution of 137Cs. Information from these surveys was used to develop accurate 3-dimensional isopleths of contaminated areas, including estimates of waste volumes. After a remedial alternative was selected, on-site analytical methods were again used to verify that the cleanup level was achieved. Samples from excavated areas were analyzed for 137Cs by comparative gamma-ray counting using a multi-channel pulse height analyzer and a 3-in. diameter by 2-in. thick NaI crystal. The counting system was housed in a mobile field laboratory. To better direct ongoing excavation, a correlation factor between samples analyzed by gamma spectroscopy and measurements made by field instruments was determined. In both the characterization and the cleanup phases, sample results were verified by independent off-site laboratory analyses
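
    The correlation step described above amounts to a linear calibration between field count rates and laboratory 137Cs results, which is then applied to later field readings. A minimal sketch with invented data pairs (the project's actual correlation factors are not reproduced here):

        import numpy as np

        # Invented field/lab pairs: NaI count rate vs. lab-measured 137Cs.
        counts = np.array([1200.0, 2500.0, 4100.0, 8000.0, 15000.0])  # cpm
        conc = np.array([5.1, 11.0, 18.5, 36.0, 69.0])                # pCi/g

        slope, intercept = np.polyfit(counts, conc, 1)

        def cs137_from_counts(count_rate):
            # Apply the fitted correlation to a new field reading.
            return slope * count_rate + intercept

        print(f"estimated 137Cs at 6000 cpm: {cs137_from_counts(6000.0):.1f} pCi/g")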

  5. Hydrophobic Side-Chain Length Determines Activity and Conformational Heterogeneity of a Vancomycin Derivative Bound to the Cell Wall of Staphylococcus aureus

    Kim, Sung Joon; Schaefer, Jacob

    2008-01-01

    Disaccharide-modified glycopeptides with hydrophobic side chains are active against vancomycin-resistant enterococci and vancomycin-resistant S. aureus. The activity depends on the length of the side chain. The benzyl side chain of N-(4-fluorobenzyl)vancomycin (FBV) has the minimal length sufficient for enhancement in activity against vancomycin-resistant pathogens. The conformation of FBV bound to the peptidoglycan in whole cells of S. aureus has been determined using rotational-echo double res...

  6. Availability verification of information for human system interface in automatic SG level control using activity diagram

    The Steam Generator (SG) level control system in OPR1000 is one of the representative automatic systems that fall under the Supervisory Control level in Endsley's taxonomy. Supervisory control of automated systems is classified as a form of out-of-the-loop (OOTL) performance due to the operator's passive involvement in system operation, which can lead to loss of situation awareness (SA). A reported event caused by inadequate human-automation communication contributed to an unexpected reactor trip in July 2005: a high SG level trip occurred at Yeonggwang (YGN) Unit 6 Nuclear Power Plant (NPP) because the operator failed to recognize the need to change the control mode of the economizer valve controller (EVC) to manual during swap-over (the transition from low-power mode to high-power mode) after recovery from a loss of offsite power (LOOP) event. This paper models the human-system interaction in the NPP SG level control system using a Unified Modeling Language (UML) Activity Diagram. It then identifies the information missing for operators in the OPR1000 Main Control Room (MCR) and suggests means of improving the human-system interaction

  7. Prompt γ-ray activation analysis of Martian analogues at the FRM II neutron reactor and the verification of a Monte Carlo planetary radiation environment model

    Planetary radiation environment modelling is important to assess the habitability of a planetary body. It is also useful when interpreting the γ-ray data produced by natural emissions from radioisotopes or prompt γ-ray activation analysis. γ-ray spectra acquired in orbit or in-situ by a suitable detector can be converted into meaningful estimates of the concentration of certain elements on the surface of a planet. This paper describes the verification of a Monte Carlo model developed using the MCNPX code at University of Leicester. The model predicts the performance of a geophysical package containing a γ-ray spectrometer operating at a depth of up to 5 m. The experimental verification of the Monte Carlo model was performed at the FRM II facility in Munich, Germany. The paper demonstrates that the model is in good agreement with the experimental data and can be used to model the performance of an in-situ γ-ray spectrometer.

  9. Validation and verification of a deterministic model for corrosion and activity incorporation using operational data from Kozloduy NPP

    Betova, I. [Technical Univ. of Sofia, Sofia (Bulgaria); Bojinov, M. [Univ. of Chemical Technology and Metallurgy, Sofia (Bulgaria); Minkova, K. [Kozloduy Nuclear Power Plant, Kozloduy (Bulgaria)

    2010-07-01

    An integrated deterministic model of corrosion and activity incorporation in the construction materials of the primary circuit of light water reactors, based on fundamental physico-chemical processes, has been recently proposed. The calculational procedure of the model enables one to obtain reliable estimates of the kinetic and transport parameters of growth and restructuring of inner and outer oxide layers on austenitic steels (AISI 304, 0X18H10T, AISI 316, A800) and nickel alloys (A600 and A690) via a quantitative comparison of the model equations with electrochemical data on the conduction mechanism and ex-situ analytical information on the thickness and in-depth chemical composition of the oxides on such materials, stemming from both laboratory and in-pile BWR, PWR and WWER reactor experience. As a result, a large database of kinetic and transport parameters makes it possible to predict the kinetics of growth and restructuring of the oxides, as well as corrosion release from the construction materials in the primary circuit, by using data on the water chemistry and radioactive corrosion products in the coolant and the piping surfaces. Predictions on the incorporation of radioactive corrosion products during oxide growth and restructuring are also made. In the present communication, the validation and verification of the model using data for the primary circuit water chemistry and radioactive corrosion products from both reactors of the Kozloduy NPP are discussed. The calculations are in good agreement with the available experimental data and allow for reliable long-term predictions of the corrosion processes and radioactivity accumulation in the primary circuit of both reactors of Kozloduy NPP to be obtained. (author)

  10. Para-aminobenzamidine linked regenerated cellulose membranes for plasminogen activator purification: Effect of spacer arm length and ligand density

    Fasoli, Ezio; Reyes, Yiaslin Ruiz; Guzman, Osiris Martinez; Rosado, Alexandra; Cruz, Vivian Rodriguez; Borges, Amaris; Martinez, Edmarie; Bansal, Vibha

    2013-01-01

    Despite membrane-based separations offering a superior alternative to packed bed chromatographic processes, there has been a substantial lacuna in their actual application to separation processes. One of the major reasons behind this is the lack of availability of appropriately modified or end-group modifiable membranes. In this paper, an affinity membrane was developed using a commercially available serine protease inhibitor, para-aminobenzamidine (pABA). The membrane modification was optimized for protein binding capacity by varying: i) the length of the spacer arm (SA; 5, 7, and 14 atoms) linking the ligand to the membrane surface; ii) the affinity ligand (pABA) density on the membrane surface (5-25 nmoles per cm2). Resulting membranes were tested for their ability to bind plasminogen activators (PAs) from mono- and multi-component systems in batch mode. The membrane containing pABA linked through the 7-atom SA, at similar ligand density to the 5- or 14-atom SA cases, was found to bind up to 1.6 times higher amounts of PA per nmole of immobilized ligand from conditioned HeLa cell culture media. However, membranes with similar ligand densities but different lengths of SA showed comparable binding capacities in the monocomponent system. In addition, the length of SA did not affect the selectivity of the ligand for PA. A clear inverse linear correlation was observed between ligand density and binding capacity until the PA binding optimum was reached (11 ± 1.0 nmoles per cm2) in mono- and multi-component systems for the 7- as well as the 14-atom SA. Up to 200-fold purification was achieved in a single step separation of PA from HeLa conditioned media using these affinity membranes. The issues of ligand leaching and reuse of the membranes were also investigated. An extensive regeneration procedure allowed the preservation of approximately 95% of the PA binding capacity of the membranes even after five cycles of use. PMID:23703544

  11. Conjugation of fatty acids with different lengths modulates the antibacterial and antifungal activity of a cationic biologically inactive peptide.

    Malina, Amir; Shai, Yechiel

    2005-09-15

    Many studies have shown that an amphipathic structure and a threshold of hydrophobicity of the peptidic chain are crucial for the biological function of AMPs (antimicrobial peptides). However, the factors that dictate their cell selectivity are not yet clear. In the present study, we show that the attachment of aliphatic acids with different lengths (10, 12, 14 or 16 carbon atoms) to the N-terminus of a biologically inactive cationic peptide is sufficient to endow the resulting lipopeptides with lytic activity against different cells. Mode-of-action studies were performed with model phospholipid membranes mimicking those of bacterial, mammalian and fungal cells. These include determination of the structure in solution and membranes by using CD and ATR-FTIR (attenuated total reflectance Fourier-transform infrared) spectroscopy, membrane leakage experiments and by visualizing bacterial and fungal damage via transmission electron microscopy. The results obtained reveal that: (i) the short lipopeptides (10 and 12 carbons atoms) are non-haemolytic, active towards both bacteria and fungi and monomeric in solution. (ii) The long lipopeptides (14 and 16 carbons atoms) are highly antifungal, haemolytic only at concentrations above their MIC (minimal inhibitory concentration) values and aggregate in solution. (iii) All the lipopeptides adopt a partial alpha-helical structure in 1% lysophosphatidylcholine and bacterial and mammalian model membranes. However, the two short lipopeptides contain a significant fraction of random coil in fungal membranes, in agreement with their reduced antifungal activity. (iv) All the lipopeptides have a membranolytic effect on all types of cells assayed. Overall, the results reveal that the length of the aliphatic chain is sufficient to control the pathogen specificity of the lipopeptides, most probably by controlling both the overall hydrophobicity and the oligomeric state of the lipopeptides in solution. Besides providing us with basic

  12. Independent verification of the delivered dose in High-Dose Rate (HDR) brachytherapy

    An important aspect of a quality assurance program in clinical dosimetry is independent verification of the dosimetric calculation done by the treatment planning system for each radiation treatment. The present paper is aimed at creating a spreadsheet for verification of the dose recorded at a point of an implant with radioactive sources and HDR in gynecological lesions. A 192Ir automatic afterloading unit (GammaMedplus model, Varian Medical Systems) with HDR installed at the Angel H. Roffo Oncology Institute has been used. The planning system used to obtain the dose distribution is BraquiVision. The source coordinates, as well as those of the calculation point (rectum), are entered into the Excel-based verification program, assuming a point source at each of the applicator positions. This calculation point has been selected because the rectum is an organ at risk and therefore constrains the treatment planning. The dose verification is performed at points whose distance from the sources is at least twice the active length of those sources, so that they may be regarded as point sources. Most of the sources used in HDR brachytherapy with 192Ir have a 5 mm active length, for all equipment brands. Consequently, the dose verification distance must be at least 10 mm. (author)
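
    A minimal sketch of such a spreadsheet check, with a hypothetical geometry: each dwell position is treated as a point source, an inverse-square contribution is summed at the calculation point, and points closer than twice the active length are rejected. A real verification would also apply the air-kerma strength, dose-rate constant and the TG-43 formalism rather than a bare 1/r^2 sum.

        import math

        MIN_R_MM = 10.0   # twice the 5 mm active length quoted in the abstract

        def point_source_sum(dwell_positions_mm, point_mm):
            # Sum inverse-square contributions at the calculation point;
            # refuse points where the point-source approximation is invalid.
            total = 0.0
            for s in dwell_positions_mm:
                r = math.dist(s, point_mm)
                if r < MIN_R_MM:
                    raise ValueError(f"point is {r:.1f} mm from a source; too close")
                total += 1.0 / r ** 2
            return total

        # Hypothetical geometry: 8 dwell positions 5 mm apart along x,
        # rectum point about 15 mm off the applicator axis.
        dwells = [(5.0 * i, 0.0, 0.0) for i in range(8)]
        print(point_source_sum(dwells, (17.5, 15.0, 0.0)))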

  13. Flame Length

    Earth Data Analysis Center, University of New Mexico — Flame length was modeled using FlamMap, an interagency fire behavior mapping and analysis program that computes potential fire behavior characteristics. The tool...

  14. Neutron Multiplicity And Active Well Neutron Coincidence Verification Measurements Performed For March 2009 Semi-Annual DOE Inventory

    The Analytical Development (AD) Section field nuclear measurement group performed six 'best available technique' verification measurements to satisfy a DOE requirement instituted for the March 2009 semi-annual inventory. The requirement of reference 1 yielded the need for the SRNL Research Operations Department Material Control and Accountability (MC and A) group to measure the Pu content of five items and the highly enriched uranium (HEU) content of two. No 14Q-qualified measurement equipment was available to satisfy the requirement. The AD field nuclear group has routinely performed the required Confirmatory Measurements for the semi-annual inventories for fifteen years using sodium iodide and high-purity germanium (HpGe) γ-ray pulse height analysis nondestructive assay (NDA) instruments. With appropriate γ-ray acquisition modeling, the HpGe spectrometers can be used to perform verification-type quantitative assay for Pu isotopics and HEU content. The AD nuclear NDA group is widely experienced with this type of measurement and reports content for these species in requested process control, MC and A booking, and holdup measurement assays site-wide. However, none of the AD HpGe γ-ray spectrometers has been 14Q-qualified, and the requirement of reference 1 specifically excluded a γ-ray PHA measurement from those it would accept for the required verification measurements. The requirement of reference 1 was a new requirement for which the Savannah River National Laboratory (SRNL) Research Operations Department (ROD) MC and A group was unprepared. The criteria for exemption from verification were: (1) isotope content below 50 grams; (2) intrinsically tamper-indicating or TID-sealed items which contain a Category IV quantity of material; (3) assembled components; and (4) laboratory samples. Therefore all SRNL Material Balance Area (MBA) items with greater than 50 grams total Pu or greater than 50 grams HEU were subject to a verification measurement. The pass/fail criteria of

  15. Field Test and Performance Verification: Integrated Active Desiccant Rooftop Hybrid System Installed in a School - Final Report: Phase 4A

    Fischer, J

    2005-12-21

    This report summarizes the results of a field verification pilot site investigation that involved the installation of a hybrid integrated active desiccant/vapor-compression rooftop heating, ventilation, and air-conditioning (HVAC) unit at an elementary school in the Atlanta Georgia area. For years, the school had experienced serious humidity and indoor air quality (IAQ) problems that had resulted in occupant complaints and microbial (mold) remediation. The outdoor air louvers of the original HVAC units had been closed in an attempt to improve humidity control within the space. The existing vapor compression variable air volume system was replaced by the integrated active desiccant rooftop (IADR) system that was described in detail in an Oak Ridge National Laboratory (ORNL) report published in 2004 (Fischer and Sand 2004). The IADR system and all space conditions have been monitored remotely for more than a year. The hybrid system was able to maintain both the space temperature and humidity as desired while delivering the outdoor air ventilation rate required by American Society of Heating, Refrigerating and Air-Conditioning Engineers Standard 62. The performance level of the IADR unit and the overall system energy efficiency was measured and found to be very high. A comprehensive IAQ investigation was completed by the Georgia Tech Research Institute before and after the system retrofit. Before-and-after data resulting from this investigation confirmed a significant improvement in IAQ, humidity control, and occupant comfort. These observations were reported by building occupants and are echoed in a letter to ORNL from the school district energy manager. The IADR system was easily retrofitted in place of the original rooftop system using a custom curb adapter. All work was completed in-house by the school's maintenance staff over one weekend. A subsequent cost analysis completed for the school district by the design engineer of record concluded that the IADR

  16. Potency of Full- Length MGF to Induce Maximal Activation of the IGF-I R Is Similar to Recombinant Human IGF-I at High Equimolar Concentrations

    Janssen, Joseph A. M. J. L.; Hofland, Leo J.; Strasburger, Christian J.; van den Dungen, Elisabeth S. R.; Thevis, Mario

    2016-01-01

    Aims: To compare full-length mechano growth factor (full-length MGF) with human recombinant insulin-like growth factor-I (IGF-I) and human recombinant insulin (HI) in their ability to activate the human IGF-I receptor (IGF-IR), the human insulin receptor (IR-A) and the human insulin receptor-B (IR-B), respectively. In addition, we tested the stimulatory activity of human MGF and its stabilized analog Goldspink-MGF on the IGF-IR. Methods: The effects of full-length MGF, IGF-I, human ...

  17. Role of active contraction and tropomodulins in regulating actin filament length and sarcomere structure in developing zebrafish skeletal muscle

    Lise Mazelet

    2016-03-01

    Whilst it is recognised that contraction plays an important part in maintaining the structure and function of mature skeletal muscle, its role during development remains undefined. In this study the role of movement in skeletal muscle maturation was investigated in intact zebrafish embryos using a combination of genetic and pharmacological approaches. An immotile mutant line (cacnb1ts25), which lacks functional voltage-gated calcium channels (dihydropyridine receptors) in the muscle, and pharmacological immobilisation of embryos with a reversible anaesthetic (Tricaine) allowed the study of paralysis (in mutants and anaesthetised fish) and recovery of movement (reversal of anaesthetic treatment). The effect of paralysis in early embryos (aged between 17-24 hours post fertilisation, hpf) on skeletal muscle structure at both myofibrillar and myofilament level was determined using both immunostaining with confocal microscopy and small angle X-ray diffraction. The consequences of paralysis and subsequent recovery on the localisation of the actin capping proteins Tropomodulin 1 & 4 (Tmod) in fish aged from 17 hpf until 42 hpf were also assessed. The functional consequences of early paralysis were investigated by examining the mechanical properties of the larval muscle. The length-force relationship, active and passive tension, was measured in immotile, recovered and control skeletal muscle at 5 and 7 days post fertilisation (dpf). Recovery of muscle function was also assessed by examining swimming patterns in recovered and control fish. Inhibition of the initial embryonic movements (up to 24 hpf) resulted in an increase in myofibril length and a decrease in width, followed by almost complete recovery in both moving and paralysed fish by 42 hpf. In conclusion, myofibril organisation is regulated by a dual mechanism involving movement-dependent and movement-independent processes. The initial contractile event itself drives the localisation of Tmod1 to its sarcomeric

  18. Length-dependent binding of human XLF to DNA and stimulation of XRCC4.DNA ligase IV activity.

    Lu, Haihui; Pannicke, Ulrich; Schwarz, Klaus; Lieber, Michael R

    2007-04-13

    An XRCC4-like factor, called XLF or Cernunnos, was recently identified as another important factor in the non-homologous DNA end joining (NHEJ) process. NHEJ is the major pathway for the repair of double-strand DNA breaks. The similarity in the putative secondary structures of XLF and XRCC4 as well as the association of XLF with XRCC4.DNA ligase IV in vivo suggested a role in the final ligation step of NHEJ. Here, we find that purified XLF directly interacts with purified XRCC4.DNA ligase IV complex and stimulates the ligase complex in a direct assay for ligation activity. Purified XLF has DNA binding activity, but this binding is dependent on DNA length in a manner most consistent with orientation of the C-terminal alpha helices parallel to the DNA helix. To better understand the function of XLF, we purified an XLF mutant (R57G), which was identified in patients with NHEJ deficiency and severe combined immunodeficiency. Surprisingly, the mutant protein retained its ability to stimulate XRCC4.DNA ligase IV but failed to translocate to the nucleus, and this appears to be the basis for the NHEJ defect in this patient. PMID:17317666

  19. Accumulation of human full-length tau induces degradation of nicotinic acetylcholine receptor α4 via activating calpain-2

    Yin, Yaling; Wang, Yali; Gao, Di; Ye, Jinwang; Wang, Xin; Fang, Lin; Wu, Dongqin; Pi, Guilin; Lu, Chengbiao; Zhou, Xin-Wen; Yang, Ying; Wang, Jian-Zhi

    2016-01-01

    Cholinergic impairments and tau accumulation are hallmark pathologies in sporadic Alzheimer's disease (AD); however, the intrinsic link between tau accumulation and cholinergic deficits is missing. Here, we found that overexpression of human wild-type full-length tau (termed hTau) induced a significant reduction of the α4 subunit of nicotinic acetylcholine receptors (nAChRs), with increased cleavage of the receptor producing a ~55 kDa fragment in primary hippocampal neurons and in rat brains; meanwhile, the α4 nAChR currents decreased. Further studies demonstrated that calpains, including calpain-1 and calpain-2, were markedly activated with no change in caspase-3, while suppression of calpain-2 by a selective calpain-2 inhibitor, but not calpain-1, attenuated the hTau-induced degradation of α4 nAChR. Finally, we demonstrated that hTau accumulation increased the basal intracellular calcium level in primary hippocampal neurons. We conclude that hTau accumulation inhibits nAChR α4 by activating calpain-2. To the best of our knowledge, this is the first evidence showing that the intracellular accumulation of tau causes cholinergic impairments. PMID:27277673

  20. Length of guanosine homopolymeric repeats modulates promoter activity of subfamily II tpr genes of Treponema pallidum ssp. pallidum.

    Giacani, Lorenzo; Lukehart, Sheila; Centurion-Lara, Arturo

    2007-11-01

    In Treponema pallidum, homopolymeric guanosine repeats of varying length are present upstream of both Subfamily I (tprC, D, F and I) and II (tprE, G and J) tpr genes, a group of potential virulence factors, immediately upstream of the +1 nucleotide. To investigate the influence of these poly-G sequences on promoter activity, tprE, G, J, F and I promoter regions containing homopolymeric tracts with different numbers of Gs, the ribosomal binding site and start codon were cloned in frame with the green fluorescent protein reporter gene (GFP), and promoter activity was measured both as fluorescence emission from Escherichia coli cultures transformed with the different plasmid constructs and using quantitative RT-PCR. For tprJ, G and E-derived clones, fluorescence was significantly higher with constructs containing eight Gs or fewer, while plasmids containing the same promoters with none or more Gs gave modest or no signal above the background. In contrast, tprF/I-derived clones induced similar levels of fluorescence regardless of the number of Gs within the promoter. GFP mRNA quantification showed that all of the promoters induced measurable transcription of the GFP gene; however, only for Subfamily II promoters was message synthesis inversely correlated to the number of Gs in the construct. PMID:17683506

  1. Security Protocol Verification: Symbolic and Computational Models

    Blanchet, Bruno

    2012-01-01

    Security protocol verification has been a very active research area since the 1990s. This paper surveys various approaches in this area, considering the verification in the symbolic model, as well as the more recent approaches that rely on the computational model or that verify protocol implementations rather than specifications. Additionally, we briefly describe our symbolic security protocol verifier ProVerif and situate it among these approaches.

  3. Potency of Full- Length MGF to Induce Maximal Activation of the IGF-I R Is Similar to Recombinant Human IGF-I at High Equimolar Concentrations

    Janssen, Joseph A. M. J. L.; Hofland, Leo J.; Strasburger, Christian J.; van den Dungen, Elisabeth S. R.; Thevis, Mario

    2016-01-01

    Aims To compare full-length mechano growth factor (full-length MGF) with human recombinant insulin-like growth factor-I (IGF-I) and human recombinant insulin (HI) in their ability to activate the human IGF-I receptor (IGF-IR), the human insulin receptor (IR-A) and the human insulin receptor-B (IR-B), respectively. In addition, we tested the stimulatory activity of human MGF and its stabilized analog Goldspink-MGF on the IGF-IR. Methods The effects of full-length MGF, IGF-I, human mechano growth factor (MGF), Goldspink-MGF and HI were compared using kinase specific receptor activation (KIRA) bioassays specific for IGF-I, IR-A or IR-B, respectively. These assays quantify activity by measuring auto-phosphorylation of the receptor upon ligand binding. Results IGF-IR: At high equimolar concentrations maximal IGF-IR stimulating effects generated by full-length MGF were similar to that of IGF-I (89-fold vs. 77-fold, respectively). However, EC50 values of IGF-I and full-length MGF for the IGF-I receptor were 0.86 nmol/L (95% CI 0.69–1.07) and 7.83 nmol/L (95% CI: 4.87–12.58), respectively. No IGF-IR activation was observed by human MGF and Goldspink-MGF, respectively. IR-A/IR-B: At high equimolar concentrations similar maximal IR-A stimulating effects were observed for full -length MGF and HI, but maximal IR-B stimulation achieved by full -length MGF was stronger than that by HI (292-fold vs. 98-fold). EC50 values of HI and full-length MGF for the IR-A were 1.13 nmol/L (95% CI 0.69–1.84) and 73.11 nmol/L (42.87–124.69), respectively; for IR-B these values were 1.28 nmol/L (95% CI 0.64–2.57) and 35.10 nmol/L (95% 17.52–70.33), respectively. Conclusions Full-length MGF directly stimulates the IGF-IR. Despite a higher EC50 concentration, at high equimolar concentrations full-length MGF showed a similar maximal potency to activate the IGF-IR as compared to IGF-I. Further research is needed to understand the actions of full-length MGF in vivo and to define the
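
    EC50 values like those quoted above are conventionally read off a fitted four-parameter logistic (Hill) dose-response curve. A minimal sketch with invented KIRA-style data points, not the study's measurements:

        import numpy as np
        from scipy.optimize import curve_fit

        def hill(c, bottom, top, ec50, slope):
            # Four-parameter logistic dose-response curve.
            return bottom + (top - bottom) / (1.0 + (ec50 / c) ** slope)

        conc = np.array([0.01, 0.1, 0.3, 1.0, 3.0, 10.0, 100.0])   # nmol/L
        fold = np.array([1.2, 5.0, 15.0, 38.0, 60.0, 74.0, 77.0])  # fold activation

        params, _ = curve_fit(hill, conc, fold, p0=[1.0, 80.0, 1.0, 1.0])
        print(f"EC50 = {params[2]:.2f} nmol/L")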

  4. Global-scale pattern of peatland Sphagnum growth driven by photosynthetically active radiation and growing season length

    Z. Yu

    2012-02-01

    High-latitude peatlands contain about one third of the world's soil organic carbon, most of which is derived from partly decomposed Sphagnum (peat moss) plants. We conducted a meta-analysis based on a global dataset of Sphagnum growth measurements collected from published literature to investigate the effects of bioclimatic variables on Sphagnum growth. Analysis of variance and general linear models were used to relate Sphagnum magellanicum and S. fuscum growth rates to photosynthetically active radiation integrated over the growing season (PAR0) and a moisture index. We found that PAR0 was the main predictor of Sphagnum growth for the global dataset, and effective moisture was only correlated with moss growth at continental sites. The strong correlation between Sphagnum growth and PAR0 suggests the existence of a global pattern of growth, with slow rates under cool climate and short growing seasons, highlighting the important role of temperature and growing season length in explaining peatland biomass production. Large-scale patterns of cloudiness during the growing season might also limit moss growth. Although considerable uncertainty remains over the carbon balance of peatlands under a changing climate, our results suggest that increasing PAR0 as a result of global warming and lengthening growing seasons could promote Sphagnum growth. Assuming that production and decomposition have the same sensitivity to temperature, this enhanced growth could lead to greater peat-carbon sequestration, inducing a negative feedback to climate change.

  5. Global-scale pattern of peatland Sphagnum growth driven by photosynthetically active radiation and growing season length

    Z. Yu

    2012-07-01

    High-latitude peatlands contain about one third of the world's soil organic carbon, most of which is derived from partly decomposed Sphagnum (peat moss) plants. We conducted a meta-analysis based on a global data set of Sphagnum growth measurements collected from published literature to investigate the effects of bioclimatic variables on Sphagnum growth. Analysis of variance and general linear models were used to relate Sphagnum magellanicum and S. fuscum growth rates to photosynthetically active radiation integrated over the growing season (PAR0) and a moisture index. We found that PAR0 was the main predictor of Sphagnum growth for the global data set, and effective moisture was only correlated with moss growth at continental sites. The strong correlation between Sphagnum growth and PAR0 suggests the existence of a global pattern of growth, with slow rates under cool climate and short growing seasons, highlighting the important role of growing season length in explaining peatland biomass production. Large-scale patterns of cloudiness during the growing season might also limit moss growth. Although considerable uncertainty remains over the carbon balance of peatlands under a changing climate, our results suggest that increasing PAR0 as a result of global warming and lengthening growing seasons, without major change in cloudiness, could promote Sphagnum growth. Assuming that production and decomposition have the same sensitivity to temperature, this enhanced growth could lead to greater peat-carbon sequestration, inducing a negative feedback to climate change.
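
    The general-linear-model step described in this and the preceding record reduces to an ordinary least-squares fit of growth against PAR0 and a moisture index. A minimal sketch with invented observations, not the meta-analysis dataset:

        import numpy as np

        # Invented observations: growth vs. growing-season PAR (PAR0) and moisture.
        par0 = np.array([180.0, 250.0, 310.0, 400.0, 520.0])   # mol photons m^-2
        moist = np.array([0.6, 0.4, 0.8, 0.5, 0.7])            # moisture index
        growth = np.array([55.0, 80.0, 105.0, 130.0, 175.0])   # g m^-2 yr^-1

        # Design matrix with an intercept column, then ordinary least squares.
        X = np.column_stack([np.ones_like(par0), par0, moist])
        coef, *_ = np.linalg.lstsq(X, growth, rcond=None)
        print("intercept, PAR0 slope, moisture slope:", np.round(coef, 3))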

  6. Telomere Length, Telomerase Activity, and Replicative Potential in HIV Infection: Analysis of CD4+ and CD8+T Cells from HIV-discordant Monozygotic Twins

    Palmer, Larry D.; Weng, Nan-ping; Levine, Bruce L.; June, Carl H; Lane, H. Clifford; Hodes, Richard J.

    1997-01-01

    To address the possible role of replicative senescence in human immunodeficiency virus (HIV) infection, telomere length, telomerase activity, and in vitro replicative capacity were assessed in peripheral blood T cells from HIV+ and HIV− donors. Genetic and age-specific effects on these parameters were controlled by studying HIV-discordant pairs of monozygotic twins. Telomere terminal restriction fragment (TRF) lengths from CD4+ T cells of HIV+ donors were significantly greater than those from...

  7. Bacterial membrane activity of a-peptide/b-peptoid chimeras: Influence of amino acid composition and chain length on the activity against different bacterial strains

    Hein-Kristensen, Line; Knapp, Kolja M; Franzyk, Henrik;

    2011-01-01

    BACKGROUND: Characterization and use of antimicrobial peptides (AMPs) requires that their mode of action is determined. The interaction of membrane-active peptides with their target is often established using model membranes; however, the actual permeabilization of live bacterial cells and subsequent killing is usually not tested. In this report, six α-peptide/β-peptoid chimeras were examined for the effect of amino acid/peptoid substitutions and chain length on the membrane perturbation and subsequent killing of food-borne and clinical bacterial isolates. RESULTS: All six AMP analogues...... permeabilization of the bacterial cell envelope, and the outer membrane may act as a barrier in Gram-negative bacteria. The tolerance of S. marcescens to chimeras may be due to differences in the composition of the lipopolysaccharide layer also responsible for its resistance to polymyxin B.

  8. Study on safety classifications of software used in nuclear power plants and distinct applications of verification and validation activities in each class

    This paper describes the safety classification of instrumentation and control (I and C) systems and their software used in nuclear power plants, provides regulatory positions for software important to safety, and proposes verification and validation (V and V) activities, applied differently in each software class, which are important elements in ensuring software quality assurance. In other words, the I and C systems important to safety are classified into IC-1, IC-2, IC-3, and Non-IC, and their software is classified into safety-critical, safety-related, and non-safety software. Based upon these safety classifications, the extent of software V and V activities in each class is differentiated from the others. In addition, the paper explains that software for use in I and C systems important to safety is divided into newly developed and previously developed software in terms of design and implementation, and provides the regulatory positions on each type of software.

  9. PCB153 reduces telomerase activity and telomere length in immortalized human skin keratinocytes (HaCaT) but not in human foreskin keratinocytes (NFK)

    Senthilkumar, P.K. [Interdisciplinary Graduate Program in Human Toxicology, The University of Iowa, Iowa City, IA (United States); Robertson, L.W. [Interdisciplinary Graduate Program in Human Toxicology, The University of Iowa, Iowa City, IA (United States); Department of Occupational and Environmental Health, The University of Iowa, Iowa City, IA (United States); Ludewig, G., E-mail: Gabriele-ludewig@uiowa.edu [Interdisciplinary Graduate Program in Human Toxicology, The University of Iowa, Iowa City, IA (United States); Department of Occupational and Environmental Health, The University of Iowa, Iowa City, IA (United States)

    2012-02-15

    Polychlorinated biphenyls (PCBs), ubiquitous environmental pollutants, are characterized by long-term persistence in the environment, bioaccumulation, and biomagnification in the food chain. Exposure to PCBs may cause various diseases, affecting many cellular processes. Deregulation of the telomerase and the telomere complex leads to several biological disorders. We investigated the hypothesis that PCB153 modulates telomerase activity, telomeres and reactive oxygen species, resulting in the deregulation of cell growth. Exponentially growing immortal human skin keratinocytes (HaCaT) and normal human foreskin keratinocytes (NFK) were incubated with PCB153 for 48 and 24 days, respectively, and telomerase activity, telomere length, superoxide level, cell growth, and cell cycle distribution were determined. In HaCaT cells exposure to PCB153 significantly reduced telomerase activity, telomere length and cell growth and increased intracellular superoxide levels from day 6 to day 48, suggesting that superoxide may be one of the factors regulating telomerase activity, telomere length and cell growth compared to untreated control cells. Results with NFK cells showed no shortening of telomere length but reduced cell growth and increased superoxide levels in PCB153-treated cells compared to untreated controls. As expected, basal levels of telomerase activity were almost undetectable, which made a quantitative comparison of treated and control groups impossible. The significant down regulation of telomerase activity and reduction of telomere length by PCB153 in HaCaT cells suggest that any cell type with significant telomerase activity, like stem cells, may be at risk of premature telomere shortening with potential adverse health effects for the affected organism. -- Highlights: ► Human immortal (HaCaT) and primary (NFK) keratinocytes were exposed to PCB153. ► PCB153 significantly reduced telomerase activity and telomere length in HaCaT. ► No effect on telomere length and

  11. General Electric SPDS verification and validation

    Verification and validation are critical quality assurance activities for the successful design and implementation of computer systems. Verification and validation methodology involves an extensive series of paper- and product-related activities. The individual activities must be clearly focused on specific objectives so that, taken as a total package, they represent a thorough test and evaluation of the integrated hardware/software system. This paper provides an overview of the extensive verification and validation program General Electric has put in place for its Safety Parameter Display System and other computer system products

  12. Formal Verification of UML Profile

    Bhutto, Arifa; Hussain, Dil Muhammad Akbar

    2011-01-01

    and object diagrams, and the behavioral view is modeled by activity, use case, state, and sequence diagrams. However, UML does not provide a formal syntax, so its semantics is not formally definable; to assure correctness, we need to incorporate semantic reasoning through verification...

  13. Automatic adjustment of cycle length and aeration time for improved nitrogen removal in an alternating activated sludge process

    Isaacs, Steven Howard

    1997-01-01

    The paper examines the nitrogen dynamics in the alternating BIODENITRO and BIODENIPHO processes with a focus on two control handles influencing flow scheduling and aeration: the cycle length and the ammonia concentration at which a nitrifying period is terminated. A steady state analysis examining...

  14. Telomerase activity is increased and telomere length shortened in T cells from blood of patients with atopic dermatitis and psoriasis

    Wu, Kehuai; Higashi, H; Hansen, E R;

    2000-01-01

    subsets from both atopic dermatitis and psoriasis patients compared with normal individuals. Furthermore, the telomere length was found to be significantly shorter in CD4(+) memory T cells compared with the CD4(+) naive T cells, and both of the cell subsets from diseases were shown to be of significantly...

  15. Influence of pre-seeding processing of seeds by different wavelengths of ultraviolet light on the activity of phytohormones in leaves of horse beans

    The effect of pre-seeding processing of seeds by ultraviolet irradiation of different wavelengths (254 nanometers and 313 nanometers) on the activity of a complex of endogenous growth regulators (indoleacetic acid, abscisic acid and cytokinin) in the leaves of plants is considered in this work. The results showed that the high level of ultraviolet irradiation in highlands decreases the indoleacetic acid activity of plants and thereby affects their growth. (author)

  16. On the role of code comparisons in verification and validation.

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2003-08-01

    This report presents a perspective on the role of code comparison activities in verification and validation. We formally define the act of code comparison as the Code Comparison Principle (CCP) and investigate its application in both verification and validation. One of our primary conclusions is that the use of code comparisons for validation is improper and dangerous. We also conclude that while code comparisons may be argued to provide a beneficial component in code verification activities, there are higher quality code verification tasks that should take precedence. Finally, we provide a process for application of the CCP that we believe is minimal for achieving benefit in verification processes.

  17. trans activation by the full-length E2 proteins of human papillomavirus type 16 and bovine papillomavirus type 1 in vitro and in vivo: cooperation with activation domains of cellular transcription factors.

    Ushikai, M; Lace, M J; Yamakawa, Y.; Kono, M; Anson, J; Ishiji, T; Parkkinen, S; Wicker, N.; Valentine, M E; Davidson, I

    1994-01-01

    Papillomaviral E2 genes encode proteins that regulate viral transcription. While the full-length bovine papillomavirus type 1 (BPV-1) E2 peptide is a strong trans activator, the homologous full-length E2 product of human papillomavirus type 16 (HPV-16) appeared to vary in function in previous studies. Here we show that when expressed from comparable constructs, the full-length E2 products of HPV-16 and BPV-1 trans activate a simple E2- and Sp1-dependent promoter up to approximately 100-fold i...

  18. Augmented telomerase activity, reduced telomere length and the presence of alternative lengthening of telomere in renal cell carcinoma: plausible predictive and diagnostic markers.

    Pal, Deeksha; Sharma, Ujjawal; Khajuria, Ragini; Singh, Shrawan Kumar; Kakkar, Nandita; Prasad, Rajendra

    2015-05-15

    In this study, we analyzed 100 cases of renal cell carcinoma (RCC) for telomerase activity, telomere length and alternative lengthening of telomeres (ALT) using the TRAP assay, the TeloTTAGGG assay kit and immunohistochemical analysis of ALT-associated promyelocytic leukemia (PML) bodies, respectively. A significantly higher (P=0.000) telomerase activity was observed in 81 cases of RCC, and it correlated with clinicopathological features of the tumor, for instance stage (P=0.008) and grade (P=0.000), but not with the subtypes of RCC (P=0.355). Strikingly, the telomere length was found to be significantly shorter in RCC (P=0.000) than in the corresponding normal renal tissues, and it correlated well with grade (P=0.016) but not with stage (P=0.202) or subtype (P=0.669) of RCC. In this study, telomere length was also negatively correlated with the age of patients (r(2)=0.528; P=0.000), which supports the notion that it could be used as a marker for biological aging. ALT-associated PML bodies containing PML protein were found in telomerase-negative cases of RCC. This suggests the presence of an ALT pathway mechanism to maintain telomere length in telomerase-negative RCC tissues, which was associated with high stages of RCC, suggesting a prevalent mechanism for telomere maintenance in high stages. In conclusion, telomerase activity and telomere length can be used as diagnostic as well as predictive markers in RCC. The prevalence of the ALT mechanism in high-stage RCC warrants the development of ALT inhibitors along with telomerase inhibitors as a therapeutic approach against RCC. PMID:25769384

  19. Liposome encapsulation of lipophilic N-alkyl-propanediamine platinum complexes: impact on their cytotoxic activity and influence of the carbon chain length

    Silva, Heveline; Fontes, Ana Paula S. [Universidade Federal de Juiz de Fora (UFJF), MG (Brazil). Dept. de Quimica; Lopes, Miriam Teresa P. [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Dept. de Farmacologia; Frezard, Frederic, E-mail: frezard@icb.ufmg.b [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Dept. de Fisiologia e Biofisica

    2010-07-01

    Antitumor platinum(II) complexes derived from N-alkyl-propanediamine differing in the length of their carbon chain (C8, C10, C12 and C14) were incorporated in liposomes and the cytotoxic activity of these formulations was evaluated against tumor (A549, MDA-MB-231, B16-F1 and B16-F10) and non-tumor (BHK-21 and CHO) cell lines. Stable and monodisperse liposome suspensions incorporating the platinum complexes were obtained from the lipid composition consisting of distearoyl-sn-glycero-3-phosphocholine, cholesterol and 1,2-distearoyl-sn-glycero-3-phosphoethanolamine-N-(methoxy(polyethylene glycol)-2000) at 5:3:0.3 molar ratio. The entrapment efficiency (EE%) of the platinum complexes in liposomes increased with the carbon chain length. EE% was higher than 80% in C12- and C14-derivatives. The effect of liposome encapsulation on the cytotoxic activity of the complexes was found to depend on the carbon chain length. These data indicate that the highest drug bioavailability from liposome formulations was achieved with the complex showing intermediate carbon chain length and partition between the liposome membrane and aqueous phase. (author)

  20. Investigating the role of chain and linker length on the catalytic activity of an H 2 production catalyst containing a β-hairpin peptide

    Reback, Matthew L.; Ginovska, Bojana; Buchko, Garry W.; Dutta, Arnab; Priyadarshani, Nilusha; Kier, Brandon L.; Helm, Monte L.; Raugei, Simone; Shaw, Wendy J.

    2016-06-02

    Building on our recent report of an active H2 production catalyst [Ni(PPh2NProp-peptide)2]2+ (Prop=para-phenylpropionic acid, peptide (R10)=WIpPRWTGPR-NH2, p=D-proline, and P2N=1-aza-3,6-diphosphacycloheptane) that contains structured β-hairpin peptides, here we investigate how H2 production is affected by: (1) the length of the hairpin (eight or ten residues) and (2) limiting the flexibility between the peptide and the core complex by altering the length of the linker: para-phenylpropionic acid (three carbons) or para-benzoic acid (one carbon). Reduction of the peptide chain length from ten to eight residues increases or maintains the catalytic current for H2 production for all complexes, suggesting a non-productive steric interaction at longer peptide lengths. While the structure of the hairpin appears largely intact for the complexes, NMR data are consistent with differences in dynamic behavior which may contribute to the observed differences in catalytic activity. Molecular dynamics simulations demonstrate that complexes with a one-carbon linker have the desired effect of restricting the motion of the hairpin relative to the complex; however, the catalytic currents are significantly reduced compared to complexes containing a three-carbon linker, as a result of the electron-withdrawing nature of the -COOH group. These results demonstrate the complexity and interrelated nature of the outer coordination sphere on catalysis.

  1. The Effect of Length of Exposure to Computer-based Vocabulary Activities on Young Iranian EFL Learners’ Vocabulary Gain

    Karim Sadeghi; Masoumeh Dousti

    2014-01-01

    In recent years, research in the area of CALL and its role in teaching foreign languages has gained a strong foothold. This study was an attempt to explore the effectiveness of CALL technology in comparison to a traditional book-based approach in teaching vocabulary to young Iranian EFL learners. As this study addressed young learners in an EFL context, it was supposed that the time factor would play a crucial role in this regard. Hence, attending to the possible role of length of exposure to CALL t...

  2. Machine learning techniques for the verification of refueling activities in CANDU-type nuclear power plants (NPPs) with direct applications in nuclear safeguards

    This dissertation deals with the problem of automated classification of the signals obtained from certain radiation monitoring systems, specifically from the Core Discharge Monitor (CDM) systems that are successfully operated by the International Atomic Energy Agency (IAEA) at various CANDU-type nuclear power plants around the world. In order to significantly reduce the costly and error-prone manual evaluation of the large amounts of collected CDM signals, a reliable and efficient algorithm for automated data evaluation is necessary, which should ensure real-time performance with a maximum misclassification ratio of 0.01%. This thesis describes the research behind finding a successful prototype implementation of such automated analysis software. The finally adopted methodology assumes a nonstationary data-generating process that has a finite number of states or basic fueling activities, each of which can emit observable data patterns having particular stationary characteristics. To find the underlying state sequences, a unified probabilistic approach known as the hidden Markov model (HMM) is used. Each possible fueling sequence is modeled by a distinct HMM having a left-right profile topology with explicit insert and delete states. Given an unknown fueling sequence, a dynamic programming algorithm akin to the Viterbi search is used to find the maximum likelihood state path through each model, and eventually the overall best-scoring path is picked as the recognition hypothesis. Machine learning techniques are applied to estimate the observation densities of the states, because the densities are not simply parameterizable. Unlike most present applications of continuous monitoring systems that rely on heuristic approaches to the recognition of possibly risky events, this research focuses on finding techniques that make optimal use of prior knowledge and computer simulation in the recognition task. Thus, a suitably modified, approximate n-best variant of
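
    At the heart of such a recognizer is the Viterbi dynamic-programming search for the maximum-likelihood state path through an HMM. The following self-contained Python sketch shows that search on a toy left-right model; all states, symbols and probabilities are invented for illustration, whereas the actual CDM models use profile topologies with explicit insert and delete states.

        import numpy as np

        # Toy left-right HMM over discrete observation symbols (log domain).
        log_pi = np.log([0.9, 0.1, 1e-12])           # initial state probabilities
        log_A = np.log([[0.8, 0.2, 1e-12],           # left-right transitions
                        [1e-12, 0.7, 0.3],
                        [1e-12, 1e-12, 1.0]])
        log_B = np.log([[0.7, 0.3],                  # emission probabilities
                        [0.4, 0.6],
                        [0.1, 0.9]])

        def viterbi(obs):
            """Return the maximum-likelihood state path and its log score."""
            n, T = log_pi.size, len(obs)
            delta = np.full((T, n), -np.inf)         # best score ending in state j
            psi = np.zeros((T, n), dtype=int)        # backpointers
            delta[0] = log_pi + log_B[:, obs[0]]
            for t in range(1, T):
                for j in range(n):
                    scores = delta[t - 1] + log_A[:, j]
                    psi[t, j] = int(np.argmax(scores))
                    delta[t, j] = scores[psi[t, j]] + log_B[j, obs[t]]
            path = [int(np.argmax(delta[-1]))]
            for t in range(T - 1, 0, -1):            # backtrack
                path.append(psi[t, path[-1]])
            return path[::-1], float(delta[-1].max())

        print(viterbi([0, 0, 1, 1, 1]))

    Each candidate fueling sequence would get its own model; the best-scoring path across all models becomes the recognition hypothesis.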

  3. Verification and Validation of Flight Critical Systems Project

    National Aeronautics and Space Administration — Verification and Validation is a multi-disciplinary activity that encompasses elements of systems engineering, safety, software engineering and test. The elements...

  4. Verification and the safeguards legacy

    ; qualitative and quantitative measurements of nuclear material; familiarity and access to sensitive technologies related to detection, unattended verification systems, containment/surveillance and sensors; examination and verification of design information of large and complex facilities; theoretical and practical aspects of technologies relevant to verification objectives; analysis of inspection findings and evaluation of their mutual consistency; negotiations on technical issues with facility operators and State authorities. This experience is reflected in the IAEA Safeguards Manual, which sets out the policies and procedures to be followed in the inspection process, as well as in the Safeguards Criteria, which provide guidance for verification, evaluation and analysis of the inspection findings. The IAEA infrastructure and its experience with verification permitted the organization in 1991 to respond immediately and successfully to the tasks required by Security Council Resolution 687 (1991) for Iraq, as well as to the tasks related to the verification of the completeness and correctness of the initial declarations in the cases of the DPRK and of S. Africa. In the case of Iraq, the discovery of its undeclared programs was made possible through the existing verification system, enhanced by additional access rights, information and the application of modern detection technology. Such discoveries made it evident that an intensive development effort was needed to strengthen the safeguards system and develop a capability to detect undeclared activities. For this purpose it was recognized that there was a need for additional and extended (a) access to information and (b) access to locations. It was also obvious that access to the Security Council, to bring the IAEA closer to the body responsible for the maintenance of international peace and security, would be a requirement for reporting periodically on non-proliferation and the results of the IAEA's verification activities. While the case

  5. As-Built Verification Plan Spent Nuclear Fuel Canister Storage Building MCO Handling Machine

    This as-built verification plan outlines the methodology and responsibilities that will be implemented during the as-built field verification activity for the Canister Storage Building (CSB) MCO Handling Machine (MHM). This as-built verification plan covers the electrical portion of the construction performed by Power City under contract to Mowat. The as-built verifications will be performed in accordance with Administrative Procedure AP 6-012-00, Spent Nuclear Fuel Project As-Built Verification Plan Development Process, revision 1. The results of the verification walkdown will be documented in a verification walkdown completion package, approved by the Design Authority (DA), and maintained in the CSB project files

  6. Verification of Simulation Tools

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most suitable product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with international development standards shall be developed and maintained, - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities, defining a validated domain (the domain in which the difference between the results of the tool and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available, - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or a report of Quality Assurance. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can

  7. Antisense epidermal growth factor receptor RNA transfection in human glioblastoma cells down-regulates telomerase activity and telomere length

    Tian, X-X; Pang, JC-S; Zheng, J; Chen, J; To, S S T; Ng, H-K

    2002-01-01

    Epidermal growth factor receptor is overexpressed and/or amplified in up to 50% of glioblastomas, suggesting an important role of this gene in glial tumorigenesis and progression. In the present study we demonstrated that epidermal growth factor receptor is involved in regulation of telomerase activity in glioblastoma. Antisense-epidermal growth factor receptor approach was used to inhibit epidermal growth factor receptor expression of glioblastoma U87MG cells. Telomerase activity in antisens...

  8. Proton Therapy Verification with PET Imaging

    Zhu, Xuping; Fakhri, Georges El

    2013-01-01

    Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different data detecting systems (in-beam, in-room and off-line PET), calculation methods for the prediction of proton induced PET activity distributions...

  9. Revisiting the prediction of solar activity based on the relationship between the solar maximum amplitude and max-max cycle length

    Carrasco, V M S; Gallego, M C

    2016-01-01

    It is very important to forecast future solar activity due to its effect on our planet and near-Earth space. Here, we employ the new version of the sunspot number index (version 2) to analyse the relationship between the solar maximum amplitude and the max-max cycle length proposed by Du (2006). We show that the correlation between the parameters used by Du (2006) for the prediction of the sunspot number (amplitude of the cycle, Rm, and max-max cycle length two solar cycles before, Pmax-2) disappears when we use solar cycles prior to solar cycle 9. We conclude that the correlation between these parameters depends on the time interval selected. Thus, the proposal of Du (2006) should definitely not be considered for prediction purposes.
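
    The point that a correlation can appear or vanish with the interval selected reduces to recomputing a correlation coefficient over different slices of the series. The sketch below illustrates this with synthetic stand-ins; it does not use the actual sunspot number (version 2) data.

        import numpy as np

        # Synthetic stand-ins for cycle amplitude Rm and the max-max cycle
        # length two cycles before (Pmax-2); all numbers are invented.
        rng = np.random.default_rng(1)
        Pmax2 = rng.uniform(9.0, 13.0, size=24)
        Rm = np.where(np.arange(24) >= 9,
                      250.0 - 12.0 * Pmax2,           # later cycles: anticorrelated
                      rng.uniform(80.0, 200.0, 24))   # early cycles: unrelated
        Rm = Rm + rng.normal(0.0, 10.0, 24)

        def corr(x, y):
            return float(np.corrcoef(x, y)[0, 1])

        print("cycles 9 onward:", round(corr(Pmax2[9:], Rm[9:]), 2))
        print("all cycles:     ", round(corr(Pmax2, Rm), 2))
        print("cycles before 9:", round(corr(Pmax2[:9], Rm[:9]), 2))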

  10. Survey on Existing Techniques for Writer Verification

    Rashmi Welekar

    2015-11-01

    This paper presents a survey of the literature on handwriting analysis and writer verification schemes and techniques to date. The paper gives an overview of writer identification schemes, mainly in the English, Arabic, Bangla, Malayalam and Gujarati languages. A taxonomy of the different features adopted for online and offline writer identification schemes is also drawn up. The feature extraction methods adopted by the schemes are discussed at length, outlining their merits and demerits. In automated writer verification, text-independent and text-dependent methods are available, and these are also discussed in this paper. An evaluation of writer verification schemes across multiple languages is provided by comparing recognition rates. A new method is proposed for identifying the writer using slant, orientation and eccentricity, enabling identification of the writer's mental state from the associated features.

  11. Determination of the minimum polypeptide lengths of the functionally active sites of human interleukins 1α and 1β

    Interleukin 1 (IL-1) is a two-member family of proteins (IL-1α and IL-1β) that mediates a diverse series of immune and inflammatory responses. These two proteins have only 26% amino acid homology yet bind to the same receptor. It is important to define the active sites of these molecules in order to understand their receptor interactions and the mechanisms involved in their multiple biological functions. The authors report here the localization of the biologically active portions within the initial polypeptide translation products. An in vitro transcription and translation system was used to generate specific fragments of each of the IL-1 molecules, which then were assayed for receptor binding capability and biological activity. Using this system, they have demonstrated that core sequences of 147 amino acids for IL-1β (numbers 120-266) and 140 amino acids for IL-1α (numbers 128-267) must be left intact to retain full biological activity, and further that the biological activities of the IL-1 polypeptides parallel their receptor binding capabilities

  12. Verification Account Management System (VAMS)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  13. Influence of Linker Length Variations on the Biomass-Degrading Performance of Heat-Active Enzyme Chimeras.

    Rizk, Mazen; Antranikian, Garabed; Elleuche, Skander

    2016-04-01

    Plant cell walls are composed of complex polysaccharides such as cellulose and hemicellulose. In order to efficiently hydrolyze cellulose, the synergistic action of several cellulases is required. Some anaerobic cellulolytic bacteria form multienzyme complexes, namely cellulosomes, while other microorganisms produce a portfolio of diverse enzymes that work in a synergistic fashion. Molecular biological methods can mimic such effects through the generation of artificial bi- or multifunctional fusion enzymes. Endoglucanase and β-glucosidase from the extremely thermophilic anaerobic bacteria Fervidobacterium gondwanense and Fervidobacterium islandicum, respectively, were fused end-to-end in an approach to optimize polysaccharide degradation. Both enzymes are optimally active at 90 °C and pH 6.0-7.0, making them excellent candidates for fusion experiments. The direct linkage of both enzymes led to an increased activity toward the substrate specific for β-glucosidase, but to a decreased activity of endoglucanase. However, these enzyme chimeras were superior to 1:1 mixtures of the individual enzymes, because the combined activities resulted in a higher final product yield. Therefore, such fusion enzymes exhibit promising features for application in industrial bioethanol production processes. PMID:26921187

  14. Structural properties of the active layer of discotic hexabenzocoronene/perylene diimide bulk hetero junction photovoltaic devices: The role of alkyl side chain length

    We investigate thin blend films of phenyl-substituted hexa-peri-hexabenzocoronenes (HBC) with various alkyl side chain lengths ((CH2)n, n = 6, 8, 12 and 16) and perylenediimide (PDI). These blends constitute the active layers in the bulk-heterojunction organic solar cells we studied recently [1]. Their structural properties are studied by both scanning electron microscopy and X-ray diffraction measurements. The results support the formation of HBC donor-PDI acceptor complexes in all blends, regardless of the side chain length of the HBC molecule. These complexes are packed into a layered structure parallel to the substrate for short side chain HBC molecules (n = 6 and 8). The layered structure is disrupted by increasing the side chain length of the HBC molecule, and eventually a disordered structure is formed for long side chains (n > 12). We attribute this behavior to the size difference between the aromatic parts of the HBC and PDI molecules. For short side chains, the size difference leaves room for the side chains of the two molecules to fill the space around the aromatic cores. For long side chains (n > 12), the empty space is not enough to accommodate this increase, leading to the disruption of the layered structure, and a rather disordered structure is formed. Our results highlight the importance of the donor-acceptor interaction in a bulk-heterojunction active layer, as well as the geometry of the two molecules, and their role in determining the structure of the active layer and thus the photovoltaic performance.

  15. Spatial correlation of proton irradiation-induced activity and dose in polymer gel phantoms for PET/CT delivery verification studies

    activity and lays the groundwork for further investigations using BANG3-Pro2 as a dosimetric phantom in PET/CT delivery verification studies.

  16. Telomerase-Associated Protein TEP1 Is Not Essential for Telomerase Activity or Telomere Length Maintenance In Vivo

    Liu, Yie; Snow, Bryan E.; Hande, M. Prakash; Baerlocher, Gabriela; Kickhoefer, Valerie A.; Yeung, David; Wakeham, Andrew; Itie, Annick; Siderovski, David P.; Lansdorp, Peter M.; Robinson, Murray O; Harrington, Lea

    2000-01-01

    TEP1 is a mammalian telomerase-associated protein with similarity to the Tetrahymena telomerase protein p80. Like p80, TEP1 is associated with telomerase activity and the telomerase reverse transcriptase, and it specifically interacts with the telomerase RNA. To determine the role of mTep1 in telomerase function in vivo, we generated mouse embryonic stem (ES) cells and mice lacking mTep1. The mTep1-deficient (mTep1−/−) mice were viable and were bred for seven successive generations with no ob...

  17. In Situ Time-Resolved FRET Reveals Effects of Sarcomere Length on Cardiac Thin-Filament Activation

    Li, King-Lun; Rieck, Daniel; Solaro, R. John; Dong, Wenji

    2014-01-01

    During cardiac thin-filament activation, the N-domain of cardiac troponin C (N-cTnC) binds to Ca2+ and interacts with the actomyosin inhibitory troponin I (cTnI). The interaction between N-cTnC and cTnI stabilizes the Ca2+-induced opening of N-cTnC and is presumed to also destabilize cTnI–actin interactions that work together with steric effects of tropomyosin to inhibit force generation. Recently, our in situ steady-state FRET measurements based on N-cTnC opening suggested that at long sarco...

  18. Decommissioning of nuclear reactors - verification of a calculation model for induced activity in structural materials - calculations and final report

    The purpose of this study is to verify a calculation model for neutron-induced activity in the area outside the reactor vessel, especially in the biological shield. Here, the induced activity is small and at a certain distance could be below the level which classifies the material as inactive, opening up other disposal alternatives. In this region it is difficult to calculate the induced activity accurately, owing to the large attenuation of the neutron flux from the core. Cement specimens and foils, sensitive to different parts of the neutron spectrum, were irradiated in Oskarshamn 1 during the operational year 1987-88. They were placed in two chains, one located in the gap between the reactor pressure vessel and the biological shield, and the other located in an empty channel in the biological shield. Calculated and measured foil activities were compared in order to verify the model. The neutron flux calculations were performed with the one-dimensional Sn code ANISN. It is difficult to model the reactor core accurately in the radial direction with a one-dimensional code, and this has been confirmed by the study. It is suggested to use instead the corresponding two-dimensional code DORT. A new program was developed to combine one axial and several radial ANISN calculations to obtain a two-dimensional neutron flux distribution. The result was, however, not so good, especially in the regions above and below the core. Improvements of the model have been recommended. A second result of the study is the importance of having a good knowledge of the concrete. The density and the composition, especially the water content, have a significant influence on the attenuation of fast neutrons and on the energy distribution of the neutrons in the biological shield
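
    The abstract does not spell out how the single axial and several radial one-dimensional results were combined; the classic device is a flux-synthesis product of separable solutions, sketched below with invented grids and shapes rather than actual ANISN output.

        import numpy as np

        # Flux synthesis: phi(r, z) ~ R(r) * A(z) / phi_norm, where R and A are
        # 1-D Sn traverses and phi_norm is the flux at their crossing point.
        z = np.linspace(-200.0, 200.0, 81)            # cm, axial grid
        r = np.linspace(250.0, 350.0, 51)             # cm, radial grid in shield
        axial = np.cos(np.pi * z / (2.0 * z.max()))   # toy axial shape
        radial = np.exp(-(r - r[0]) / 12.0)           # toy radial attenuation
        phi_norm = axial[np.argmin(np.abs(z))]        # value at core midplane

        phi = np.outer(radial, axial) / phi_norm      # phi[i, j] at (r_i, z_j)
        print(phi.shape, float(phi.max()))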

  19. Optimization of the procedure for verification of the response on activity of activimeters in PET production units using 13N

    The Spanish Protocol for quality control in nuclear medicine [1] establishes the need to regularly check the activity response of dose calibrators. This Protocol establishes the use of 99mTc for performing this test. In the case of a PET facility this methodology may not be feasible, since the facility does not have this radionuclide available. We propose an alternative methodology for carrying out this test in a fast, cheap and efficient way in a facility that has a cyclotron. (Author)
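
    Although the record does not detail the procedure, a natural way to exploit a short-lived PET radionuclide such as 13N (physical half-life about 9.97 min) is to follow a single sample on the activimeter and compare the readings with the expected exponential decay. A sketch under that assumption, with hypothetical readings:

        import math

        T_HALF_N13 = 9.97  # min, physical half-life of 13N

        def expected_activity(a0_mbq, minutes):
            """Activity expected from the initial reading after pure decay."""
            return a0_mbq * math.exp(-math.log(2.0) * minutes / T_HALF_N13)

        # Hypothetical activimeter readings: (elapsed min, measured MBq).
        readings = [(0.0, 500.0), (10.0, 251.0), (20.0, 126.5), (30.0, 62.0)]
        a0 = readings[0][1]
        for t, measured in readings:
            expected = expected_activity(a0, t)
            dev = 100.0 * (measured - expected) / expected
            print(f"t={t:5.1f} min  measured={measured:6.1f}  "
                  f"expected={expected:6.1f}  dev={dev:+5.1f}%")

    Deviations beyond the acceptance tolerance would flag a non-linear activity response of the instrument.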

  20. Communication dated 18 December 2013 received from the Delegation of the European Union to the International Organisations in Vienna on the European Union's Support for the IAEA Activities in the Areas of Nuclear Security and Verification

    The Secretariat has received a note verbale dated 18 December 2013 from the Delegation of the European Union to the International Organisations in Vienna with Council Decision 2013/517/CFSP of 21 October 2013, in support of the IAEA activities in the areas of nuclear security and verification and in the framework of the implementation of the EU Strategy against Proliferation of Weapons of Mass Destruction. As requested in that communication, the note verbale and the enclosure are circulated herewith for information

  1. Advanced verification topics

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan, Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification, but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  2. Influenza activity in Europe during eight seasons (1999–2007): an evaluation of the indicators used to measure activity and an assessment of the timing, length and course of peak activity (spread) across Europe

    Meijer, Adam

    2007-11-01

    Background: The European Influenza Surveillance Scheme (EISS) has collected clinical and virological data on influenza since 1996 in an increasing number of countries. The EISS dataset was used to characterise important epidemiological features of influenza activity in Europe during eight winters (1999–2007). The following questions were addressed: (1) are the sentinel clinical reports a good measure of influenza activity? (2) how long is a typical influenza season in Europe? (3) is there a west-east and/or south-north course of peak activity ('spread') of influenza in Europe? Methods: Influenza activity was measured by collecting data from sentinel general practitioners (GPs) and reports by national reference laboratories. The sentinel reports were first evaluated by comparing them to the laboratory reports and were then used to assess the timing and spread of influenza activity across Europe during eight seasons. Results: We found a good match between the clinical sentinel data and laboratory reports of influenza collected by sentinel physicians (overall match of 72% for ±1 week difference). We also found a moderate to good match between the clinical sentinel data and laboratory reports of influenza from non-sentinel sources (overall match of 60% for ±1 week). There were no statistically significant differences between countries using ILI (influenza-like illness) or ARI (acute respiratory disease) as the case definition. When looking at the peak weeks of clinical activity, the average length of an influenza season in Europe was 15.6 weeks (median 15 weeks; range 12–19 weeks). Plotting the peak weeks of clinical influenza activity reported by sentinel GPs against the longitude or latitude of each country indicated that there was a west-east spread of peak activity (spread) of influenza across Europe in four winters (2001–2002, 2002–2003, 2003–2004 and 2004–2005) and a south-north spread in three winters (2001–2002, 2004–2005 and 2006
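
    The "match within ±1 week" statistic amounts to comparing, per country and season, the peak week of the clinical series with the peak week of the laboratory series. A small sketch with toy weekly series (invented numbers):

        # Toy weekly series for one country and season; the real evaluation
        # repeats this per country/season and reports the matching fraction.
        clinical = [2, 5, 9, 20, 35, 50, 44, 30, 12, 6]     # ILI/ARI consultations
        laboratory = [0, 1, 4, 11, 30, 47, 49, 28, 10, 3]   # influenza detections

        def peak_week(series):
            return max(range(len(series)), key=series.__getitem__)

        def match_within(a, b, tolerance=1):
            return abs(peak_week(a) - peak_week(b)) <= tolerance

        print(peak_week(clinical), peak_week(laboratory),
              match_within(clinical, laboratory))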

  3. Structure-specific nuclease activity of RAGs is modulated by sequence, length and phase position of flanking double-stranded DNA.

    Kumari, Rupa; Raghavan, Sathees C

    2015-01-01

    RAGs (recombination activating genes) are responsible for the generation of antigen receptor diversity through the process of combinatorial joining of different V (variable), D (diversity) and J (joining) gene segments. In addition to its physiological property, wherein RAG functions as a sequence-specific nuclease, it can also act as a structure-specific nuclease leading to genomic instability and cancer. In the present study, we investigate the factors that regulate RAG cleavage on non-B DNA structures. We find that RAG binding and cleavage on heteroduplex DNA is dependent on the length of the double-stranded flanking region. Besides, the immediate flanking double-stranded region regulates RAG activity in a sequence-dependent manner. Interestingly, the cleavage efficiency of RAGs at the heteroduplex region is influenced by the phasing of DNA. Thus, our results suggest that sequence, length and phase positions of the DNA can affect the efficiency of RAG cleavage when it acts as a structure-specific nuclease. These findings provide novel insights on the regulation of the pathological functions of RAGs. PMID:25327637

  4. Estimation of genome length

    2001-01-01

    The genome length is a fundamental feature of a species. This note outlines the general concept of, and estimation methods for, the physical and genetic length. Some formulae for estimating the genetic length are derived in detail. As examples, the genome genetic length of Pinus pinaster Ait. and the genetic length of chromosome VI of Oryza sativa L. were estimated from partial linkage data.
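
    The record does not reproduce the note's formulae. For context, one widely used moment estimator of genome genetic length from partial linkage data (the Hulbert-type estimate, stated here as background rather than as the note's own derivation) is, for N random markers of which K pairs are linked at or below a map distance X,

        \[
          \hat{G} \;=\; \frac{N(N-1)\,X}{2K},
        \]

    so that, for example, N = 100 markers with K = 165 linked pairs at X = 30 cM give \(\hat{G} = 100 \cdot 99 \cdot 30 / (2 \cdot 165) = 900\) cM.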

  6. Verification of threshold activation detection (TAD) technique in prompt fission neutron detection using scintillators containing 19F

    Sibczynski, P.; Kownacki, J.; Moszyński, M.; Iwanowska-Hanke, J.; Syntfeld-Każuch, A.; Gójska, A.; Gierlik, M.; Kaźmierczak, Ł.; Jakubowska, E.; Kędzierski, G.; Kujawiński, Ł.; Wojnarowicz, J.; Carrel, F.; Ledieu, M.; Lainé, F.

    2015-09-01

    In the present study ⌀ 5''× 3'' and ⌀ 2''× 2'' EJ-313 liquid fluorocarbon as well as ⌀ 2'' × 3'' BaF2 scintillators were exposed to neutrons from a 252Cf neutron source and a Sodern Genie 16GT deuterium-tritium (D+T) neutron generator. The scintillators' responses to β− particles with a maximum endpoint energy of 10.4 MeV from the n+19F reactions were studied. The response of a ⌀ 5'' × 3'' BC-408 plastic scintillator was also studied as a reference. The β− particles are the products of the interaction of fast neutrons with 19F, which is a component of the EJ-313 and BaF2 scintillators. The method of fast neutron detection via fluorine activation is known as Threshold Activation Detection (TAD) and was proposed for photofission prompt neutron detection from fissionable and Special Nuclear Materials (SNM) in the field of Homeland Security and Border Monitoring. Measurements of the number of counts between 6.0 and 10.5 MeV with a 252Cf source showed that the relative neutron detection efficiency ratio, defined as εBaF2 / εEJ−313−5'', is 32.0% ± 2.3% and 44.6% ± 3.4% for front-on and side-on orientation of the BaF2, respectively. Moreover, the ⌀ 5'' EJ-313 and the side-on oriented BaF2 were also exposed to neutrons from the D+T neutron generator, and the relative efficiency εBaF2 / εEJ−313−5'' was estimated to be 39.3%. Measurements of prompt photofission neutrons with the BaF2 detector by means of data acquisition after irradiation (out-of-beam) of nuclear material and between the beam pulses (beam-off) were also conducted on the 9 MeV LINAC of the SAPHIR facility.
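
    A relative efficiency such as εBaF2 / εEJ−313−5'' is a ratio of net counts in the 6.0-10.5 MeV window, and its counting uncertainty follows from first-order Poisson error propagation, as sketched below. The counts are invented, chosen only to reproduce the order of magnitude of the quoted 32.0% ± 2.3%.

        import math

        def ratio_with_uncertainty(n1, n2):
            """Ratio of two Poisson counts with first-order error propagation."""
            r = n1 / n2
            sigma = r * math.sqrt(1.0 / n1 + 1.0 / n2)
            return r, sigma

        # Hypothetical net counts in the 6.0-10.5 MeV window.
        n_baf2, n_ej313 = 250.0, 780.0
        r, s = ratio_with_uncertainty(n_baf2, n_ej313)
        print(f"efficiency ratio = {100 * r:.1f}% +/- {100 * s:.1f}%")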

  7. Solar cycle length hypothesis appears to support the IPCC on global warming

    Laut, Peter; Gundermann, Jesper

    1999-01-01

    Since the discovery of a striking correlation between 1-2-2-2-1 filtered solar cycle lengths and the 11-year running average of Northern Hemisphere land air temperatures there have been widespread speculations as to whether these findings would rule out any significant contributions to global warming from the enhanced concentrations of greenhouse gases. The "solar hypothesis" claims that solar activity causes a significant component of the global mean temperature to vary in phase opposite to the filtered solar cycle lengths. In an earlier paper we have demonstrated that for data covering … the correlation of the filtered solar cycle lengths with the "corrected" temperature anomalies is substantially better than with the historical anomalies. Therefore our findings support a total reversal of the common assumption that a verification of the solar hypothesis would challenge the IPCC assessment of man-made global warming.

  9. Verification and Examination Management of Complex Systems

    Stian Ruud

    2014-10-01

    As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration with other systems, and a large number of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on the definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystems can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logical system topology, verification method performance, the examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
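
    The record gives the shape of the method but not the verification risk function itself. The following is only a schematic greedy allocation under an assumed additive risk model, with invented subsystems and numbers; it is not Ruud's actual formulation.

        # Greedy allocation of examination effort by marginal risk reduction.
        subsystems = {
            # name: (residual risk, risk removed per unit examination effort)
            "propulsion control":  (0.30, 0.060),
            "dynamic positioning": (0.25, 0.045),
            "power management":    (0.20, 0.020),
        }

        def allocate(budget_units):
            risk = {k: v[0] for k, v in subsystems.items()}
            plan = []
            for _ in range(budget_units):
                # Pick the subsystem whose next unit removes the most risk.
                best = max(risk, key=lambda k: min(subsystems[k][1], risk[k]))
                risk[best] -= min(subsystems[best][1], risk[best])
                plan.append(best)
            return plan, sum(risk.values())

        plan, residual = allocate(6)
        print(plan)
        print(f"residual verification risk: {residual:.3f}")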

  11. The role of instrumental activation analysis in the verification and completion of analytical data of rock reference materials

    Ten selected rock reference materials (USGS diabase W-1, basalt BCR-1, andesite AGV-1, granite G-2, granodiorite GSP-1, and CRPG basalt BE-N, granite GS-N, trachyte ISH-G, serpentine UB-N, glass standard VS-N) were analyzed by instrumental neutron and photon activation analyses. The results were evaluated on average for the entire set of samples to detect possible systematic deviations of the determined values from the reference values. Of the 47 elements determined, 43 agreed reasonably well (deviation <10% on average) with the reference values. Au could not be determined because of a high blank from the packaging polyethylene foil. Systematically higher Dy and lower Ho and Tm (by about 20% on average) in our results require further investigation. In several cases, reasons for greater differences between the determined and recommended values could not be traced to our procedures. The most suspect is the recommended value for W in the CRPG BE-N basalt, which is twenty-five times higher than the value determined in the present work, probably due to inconsistent contamination from a W carbide mill used in the production of this reference material. (author)

  13. Verification of biological activity of irradiated Sopoongsan, an oriental medicinal prescription, for industrial application of functional cosmetic material

    Lee, Jin-Young; Park, Tae-Soon; Ho Son, Jun [Department of Cosmeceutical Science, Daegu Haany University, Kyungsan 712-715 (Korea, Republic of); Jo, Cheorun [Department of Animal Science and Biotechnology, Chungnam National University, Daejeon 305-764 (Korea, Republic of); Woo Byun, Myung [Radiation Food Science and Biotechnology Team, Korea Atomic Energy Research Institute, Jeongeup 580-185 (Korea, Republic of); Jeun An, Bong [Department of Cosmeceutical Science, Daegu Haany University, Kyungsan 712-715 (Korea, Republic of)], E-mail: anbj@dhu.ac.kr

    2007-11-15

    Sopoongsan is an oriental medicinal prescription including 12 medicinal herbs. Sopoongsan is known to have anti-inflammatory, anti-microbial, anti-allergic, and anti-cancer effects on human skin. To use Sopoongsan extract in a functional cosmetic composition, its dark color should be lightened to meet consumer demand for clear products, without any adverse change in its function. Irradiation with doses of 0, 5, 10, and 20 kGy was applied to improve the color of ethanol- or water-extracted Sopoongsan, and superoxide dismutase (SOD), xanthine oxidase (XO), melanoma cell growth inhibition, and anti-microbial activity were also investigated. Generally, the ethanol extract performed better than the water extract, and irradiation up to 20 kGy did not change any functional effect. In particular, the inhibition of melanin deposition on skin, measured by inhibition of B16F10 (melanoma) cell growth, was as high as that of arbutin, a commercially available product, when the ethanol-extracted Sopoongsan was irradiated at 20 kGy. The results showed that when irradiation technology is used, the limitation on the amount of natural materials added to food or cosmetic compositions caused by color problems can be reduced significantly, with time savings and cost benefits compared to a conventional color removal process. Therefore, irradiation would be a good method to add value for the related industry.

  14. SSN Verification Service

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  15. Method for Experimental Verification of the Effect of Gravitational Time Dilation by Using an Active Hydrogen Maser

    Malykin, G. B.

    2015-09-01

    The well-known experiments performed by Pound and Rebka as early as the 1960s confirmed the effect of gravitational time dilation, which had been predicted earlier within the framework of the general relativity theory. However, since photon exchange occurred in the course of these experiments on comparing the frequencies of nuclear resonance fluorescence at various altitudes, the reasons underlying the origin of this effect are explained in the literature by two different and, in fact, alternative presumed physical phenomena. According to the first explanation, clocks located higher run faster, which is due to an increase in the gravitational potential with increasing distance from the Earth, whereas ascending and descending photons do not change their frequency (by the same clock, e.g., that of the so-called outside observer). According to the second explanation, the clock rate is the same at different altitudes, but the ascending photons undergo a redshift since they lose energy, while the descending photons undergo a blueshift since they acquire energy. Other combined interpretations of the gravitational time dilation, which presume that both phenomena exist simultaneously, are proposed in the literature. We propose an experiment with two clocks, both active hydrogen masers, one of which is located at the bottom of a high-rise building and the other on top of the building. Time is measured by the first and second clocks during a sufficiently long interval. After that, the masers are placed at one point and their indications are compared. In this case, photon exchange is not required for comparison of the clock readings, and therefore the proposed method allows one to reveal the actual reason for the effect under consideration. Numerical estimations are made, which allow for accompanying effects that influence the measurement accuracy. Critical analysis of the earlier experiments shows that they are either equivocal, or are

  16. Relativistic length agony continued

    Redžić D.V.

    2014-01-01

    We made an attempt to remedy recent confusing treatments of some basic relativistic concepts and results. Following the argument presented in an earlier paper (Redžić 2008b), we discussed the misconceptions that are recurrent points in the literature devoted to teaching relativity, such as: there is no change in the object in Special Relativity, the illusory character of relativistic length contraction, and stresses and strains induced by Lorentz contraction, and related issues. We gave several examples of the traps of everyday language that lurk in Special Relativity. To remove a possible conceptual and terminological muddle, we made a distinction between relativistic length reduction and relativistic FitzGerald-Lorentz contraction, corresponding to a passive and an active aspect of length contraction, respectively; we pointed out that both aspects have fundamental dynamical content. As an illustration of our considerations, we discussed briefly the Dewan-Beran-Bell spaceship paradox and the 'pole in a barn' paradox. [Projekat Ministarstva nauke Republike Srbije, br. 171028]

  17. Mesenchymal stem cells with high telomerase expression do not actively restore their chromosome arm specific telomere length pattern after exposure to ionizing radiation

    Graakjaer, Jesper; Christensen, Rikke; Kølvrå, Steen;

    2007-01-01

    investigate the existence and maintenance of the telomere length pattern in stem cells. For this aim we studied telomere length in primary human mesenchymal stem cells (hMSC) and their telomerase-immortalised counterpart (hMSC-telo1) during extended proliferation as well as after irradiation. Telomere lengths...

  18. Polyester hydrolytic and synthetic activity catalyzed by the medium-chain-length poly(3-hydroxyalkanoate) depolymerase from Streptomyces venezuelae SO1.

    Santos, Marta; Gangoiti, Joana; Keul, Helmut; Möller, Martin; Serra, Juan L; Llama, María J

    2013-01-01

    The extracellular medium-chain-length polyhydroxyalkanoate (MCL-PHA) depolymerase from an isolate identified as Streptomyces venezuelae SO1 was purified to electrophoretic homogeneity and characterized. The molecular mass and pI of the purified enzyme were approximately 27 kDa and 5.9, respectively. The depolymerase showed its maximum activity at alkaline pH and 50 °C and retained more than 70 % of its initial activity after 8 h at 40 °C. The MCL-PHA depolymerase hydrolyzes various p-nitrophenyl-alkanoates and polycaprolactone, but not polylactide, poly-3-hydroxybutyrate, or polyethylene succinate. The enzymatic activity was markedly enhanced by the presence of low concentrations of detergents and organic solvents, and was inhibited by dithiothreitol and EDTA. The potential of using the enzyme to produce (R)-3-hydroxyoctanoate in aqueous media or to catalyze ester-forming reactions in anhydrous media was investigated. Specifically, the MCL-PHA depolymerase catalyzes the hydrolysis of poly-3-hydroxyoctanoate to monomeric units and the ring-opening polymerization of β-butyrolactone and lactides, while ε-caprolactone and pentadecalactone were hardly polymerized. PMID:22695803

  19. Procedure generation and verification

    The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation

  20. Open verification methodology cookbook

    Glasser, Mark

    2009-01-01

    Functional verification is an art as much as a science. It requires not only creativity and cunning, but also a clear methodology to approach the problem. The Open Verification Methodology (OVM) is a leading-edge methodology for verifying designs at multiple levels of abstraction. It brings together ideas from electrical, systems, and software engineering to provide a complete methodology for verifying large scale System-on-Chip (SoC) designs. OVM defines an approach for developing testbench architectures so they are modular, configurable, and reusable. This book is designed to help both novice...

  1. Requirement Assurance: A Verification Process

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts", which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  2. Proton Therapy Verification with PET Imaging

    Zhu, Xuping; Fakhri, Georges El

    2013-01-01

    Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton-induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different detection systems (in-beam, in-room and off-line PET), calculation methods for the prediction of proton-induced PET activity distributions, and approaches for data evaluation are discussed. PMID:24312147

  3. Improved computational model (AQUIFAS) for activated sludge, integrated fixed-film activated sludge, and moving-bed biofilm reactor systems, part III: analysis and verification.

    Sen, Dipankar; Randall, Clifford W

    2008-07-01

    Research was undertaken to analyze and verify a model that can be applied to activated sludge, integrated fixed-film activated sludge (IFAS), and moving-bed biofilm reactor (MBBR) systems. The model embeds a biofilm model into a multicell activated sludge model. The advantage of such a model is that it eliminates the need to run separate computations for a plant being retrofitted from activated sludge to IFAS or MBBR. The biofilm flux rates for organics, nutrients, and biomass can be computed by two methods: a relatively simple semi-empirical model of the biofilm, or a computationally intensive diffusional model of the biofilm. Biofilm support media can be incorporated into the anoxic and aerobic cells, but not the anaerobic cells. The model can be run for steady-state and dynamic simulations. The model was able to predict the changes in nitrification and denitrification at both pilot- and full-scale facilities. The semi-empirical and diffusional models of the biofilm were both used to evaluate the biofilm flux rates for media at different locations. The biofilm diffusional model was used to compute the biofilm thickness and growth, substrate concentrations, volatile suspended solids (VSS) concentration, and fraction of nitrifiers in each layer inside the biofilm. Following calibration, both models provided similar effluent results for reactor mixed liquor VSS and mixed liquor suspended solids and for the effluent organics, nitrogen forms, and phosphorus concentrations. While the semi-empirical model was quicker to run, the diffusional model provided additional information on biofilm thickness, quantity of growth in the biofilm, and substrate profiles inside the biofilm. PMID:18710147
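    As a sketch of what the computationally intensive branch involves, the snippet below solves a steady-state diffusion-reaction profile through a one-dimensional biofilm with Monod kinetics and reports the substrate flux at the surface. All parameter values are illustrative assumptions; this is not the AQUIFAS code itself:

```python
import numpy as np

# Steady-state substrate profile S(x) in a biofilm of thickness Lf from
#   D * d2S/dx2 = k * X * S / (Ks + S)   (Monod uptake),
# with bulk concentration at the surface and zero flux at the substratum,
# solved by Gauss-Seidel relaxation on a uniform grid.
D, k, X, Ks = 1.0e-9, 2.0e-5, 2.0e4, 4.0   # m2/s, 1/s, g/m3, g/m3 (illustrative)
S_bulk = 25.0                              # g/m3, bulk substrate concentration
Lf, n = 300e-6, 61                         # biofilm thickness (m), grid points
dx = Lf / (n - 1)
S = np.full(n, S_bulk)

for _ in range(20000):
    S[0] = S_bulk                          # Dirichlet condition at the surface
    for i in range(1, n - 1):
        r = k * X * S[i] / (Ks + S[i])     # volumetric uptake rate, g/m3/s
        S[i] = 0.5 * (S[i - 1] + S[i + 1] - dx**2 * r / D)
    S[-1] = S[-2]                          # zero-flux condition at the substratum
    S = np.clip(S, 0.0, None)

flux = D * (S[0] - S[1]) / dx              # flux into the biofilm, g/(m2 s)
print(f"substrate flux into biofilm: {flux:.3e} g m^-2 s^-1")
```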

  4. Study of the Physics of Neutron Active Zones by Active and Passive Surfaces Methods. Experimental Verification by Means of a Subcritical Assembly

    For the purpose of studying a heterogeneous multiplying medium, the author proposes dividing it into a number of homogeneous regions having the diffusion and absorption properties of the pure moderator. The fuel elements, represented by portions of active surfaces of zero thickness, constitute the separation surfaces of these sub-regions. Externally, the system is bounded by passive surfaces devoid of fissionable nuclei. The theory of diffusion involving several groups of neutrons is applied to each sub-region, while the productive and absorbing effects of fissionable materials are represented by the conditions on the active surfaces. To apply the method, it is necessary to know certain parameters of the behaviour of an active surface in a known flux. The moderator group constants are presumed to be known. The author shows that, theoretically, a single exponential experiment, carried out with a very small number of rods, should suffice to determine these parameters experimentally. The facility used for these experiments is a subcritical assembly; the fuel is uranium oxide containing 1.8% uranium-235; a water moderator is used. Measurements made for a series of different configurations confirm that the parameters sought depend solely on the nature of the fuel. The results are used to forecast the behaviour of a subcritical and a critical lattice. In the first case the calculations are verified directly by experiment; in the second, they are checked by comparison with the published results. (author)

  5. On the Relationship Between the Length of Season and Tropical Cyclone Activity in the North Atlantic Basin During the Weather Satellite Era, 1960-2013

    Wilson, Robert M.

    2014-01-01

    Officially, the North Atlantic basin tropical cyclone season runs from June 1 through November 30 of each year. During this 183-day interval, the vast majority of tropical cyclone onsets occur. For example, in a study of the 715 tropical cyclones that occurred in the North Atlantic basin during the interval 1945-2010, about 97 percent had their onsets during the conventional hurricane season, with the bulk (78 percent) having had onset during the late summer-early fall months of August, September, and October, and none having had onset in the month of March. The 2014 hurricane season already has had the onset of its first named storm, Arthur, on July 1 (day of year (DOY) 182); it formed off the east coast of Florida, rapidly grew into a category-2 hurricane with a peak 1-minute sustained wind speed of about 90 kt, and struck the coast of North Carolina as a category-2 hurricane on July 3. Arthur is the first hurricane larger than category 1 to strike the United States (U.S.) since 2008, when Ike struck Texas as a category-2 hurricane, and there has not been a major hurricane (category 3 or larger) to strike the U.S. since Wilma struck Florida as a category-3 hurricane in 2005. Only two category-1 hurricanes struck the U.S. in 2012 (Isaac and Sandy, striking Louisiana and New York, respectively), and there were no U.S. land-falling hurricanes in 2013 (also true for the years 1962, 1973, 1978, 1981, 1982, 1990, 1994, 2000, 2001, 2006, 2009, and 2010). In recent years it has been argued that the length of season (LOS), determined as the inclusive elapsed time between the first storm day (FSD) and the last storm day (LSD) of the yearly hurricane season (i.e., when a peak 1-minute sustained wind speed of at least 34 kt occurred and the tropical cyclone was not classified as 'extratropical'), has increased in length, with the lengthening believed to be due to the FSD occurring sooner and the LSD occurring
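    The LOS definition quoted above is simple date arithmetic; a minimal sketch (the last storm day below is an assumed illustrative value, not the study's data):

```python
from datetime import date

def los_days(first_storm_day: date, last_storm_day: date) -> int:
    """Length of season: inclusive elapsed days between FSD and LSD."""
    return (last_storm_day - first_storm_day).days + 1

# Arthur's onset on July 1, 2014 (DOY 182) and an assumed late-October LSD:
print(los_days(date(2014, 7, 1), date(2014, 10, 28)))   # 120 days
```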

  6. Crystal Structure of Full-length Mycobacterium tuberculosis H37Rv Glycogen Branching Enzyme; Insights of the N-Terminal β-Sandwich in Substrate Specificity and Enzymatic Activity

    Pal, Kuntal; Kumar, Shiva; Sharma, Shikha; Garg, Saurabh Kumar; Alam, Mohammad Suhail; Xu, H. Eric; Agrawal, Pushpa; Swaminathan, Kunchithapadam (NU Singapore); (Van Andel); (IMT-India)

    2010-07-13

    The open reading frame Rv1326c of Mycobacterium tuberculosis (Mtb) H37Rv encodes an α-1,4-glucan branching enzyme (MtbGlgB, EC 2.4.1.18, Uniprot entry Q10625). This enzyme belongs to glycoside hydrolase (GH) family 13 and catalyzes the branching of a linear glucose chain during glycogenesis by cleaving a 1→4 bond and making a new 1→6 bond. Here, we show the crystal structure of full-length MtbGlgB (MtbGlgBWT) at 2.33-Å resolution. MtbGlgBWT contains four domains: N1 β-sandwich, N2 β-sandwich, a central (β/α)8 domain that houses the catalytic site, and a C-terminal β-sandwich. We have assayed the amylase activity with amylose and starch as substrates and the glycogen branching activity using amylose as a substrate for MtbGlgBWT and the N1 domain-deleted (the first 108 residues deleted) MtbΔ108GlgB protein. The N1 β-sandwich, which is formed by the first 105 amino acids and superimposes well with the N2 β-sandwich, is shown to have an influence on substrate binding in the amylase assay. Also, we have checked and shown that several GH13 family inhibitors are ineffective against MtbGlgBWT and MtbΔ108GlgB. We propose a two-step reaction mechanism for the amylase activity (1→4 bond breakage) and isomerization (1→6 bond formation), which occur in the same catalytic pocket. The structural and functional properties of MtbGlgB and MtbΔ108GlgB are compared with those of the N-terminal 112-amino-acid-deleted Escherichia coli GlgB (ECΔ112GlgB).

  7. Is flow verification necessary

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper

  8. Integrated Java Bytecode Verification

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...
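    A toy rendering of the underlying idea, checking opcode type discipline against an abstract stack of types rather than concrete values; the instruction set is invented and this is a drastic simplification of any real bytecode verifier:

```python
RULES = {                        # opcode -> (types popped, types pushed)
    "iconst": ([], ["int"]),
    "aconst": ([], ["ref"]),
    "iadd":   (["int", "int"], ["int"]),
    "astore": (["ref"], []),
}

def verify(code):
    stack = []                   # abstract operand stack holding types
    for op in code:
        pops, pushes = RULES[op]
        for expected in pops:
            if not stack or stack.pop() != expected:
                raise TypeError(f"type error at {op!r}")
        stack.extend(pushes)
    return stack

print(verify(["iconst", "iconst", "iadd"]))   # ['int'] -- well-typed
# verify(["aconst", "iconst", "iadd"])        # would raise TypeError
```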

  9. Diacyltransferase Activity and Chain Length Specificity of Mycobacterium tuberculosis PapA5 in the Synthesis of Alkyl β-Diol Lipids

    Touchette, Megan H.; Bommineni, Gopal R.; Delle Bovi, Richard J.; Gadbery, John; Nicora, Carrie D.; Shukla, Anil K.; Kyle, Jennifer E.; Metz, Thomas O.; Martin, Dwight W.; Sampson, Nicole S.; Miller, W. T.; Tonge, Peter J.; Seeliger, Jessica C.

    2015-09-08

    Although classified as Gram-positive bacteria, Corynebacterineae possess an asymmetric outer membrane that imparts structural and thereby physiological similarity to more distantly related Gram-negative bacteria. Like lipopolysaccharide in Gram-negative bacteria, lipids in the outer membrane of Corynebacterineae have been associated with the virulence of pathogenic species such as Mycobacterium tuberculosis (Mtb). For example, Mtb strains that lack long, branched-chain alkyl esters known as dimycocerosates (DIMs) are significantly attenuated in model infections. The resultant interest in the biosynthetic pathway of these unusual virulence factors has led to the elucidation of many of the steps leading to the final esterification of the alkyl beta-diol, phthiocerol, with branched-chain fatty acids known as mycocerosates. PapA5 is an acyltransferase implicated in these final reactions. Here we show that PapA5 is indeed the terminal enzyme in DIM biosynthesis by demonstrating its dual esterification activity and chain-length preference using synthetic alkyl beta-diol substrate analogues. Applying these analogues to a series of PapA5 mutants, we also revise a model for substrate binding within PapA5. Finally, we demonstrate that the Mtb Ser/Thr kinase PknB modifies PapA5 on three Thr residues, including two (T196, T198) located on an unresolved loop. These results clarify the DIM biosynthetic pathway and suggest possible mechanisms by which DIM biosynthesis may be regulated by the post-translational modification of PapA5.

  10. Characterization of the cloned full-length and a truncated human target of rapamycin: Activity, specificity, and enzyme inhibition as studied by a high capacity assay

    The mammalian target of rapamycin (mTOR/TOR) is implicated in cancer and other human disorders and is thus an important target for therapeutic intervention. To study human TOR in vitro, we have produced in large scale both the full-length TOR (289 kDa) and a truncated TOR (132 kDa) from HEK293 cells. Both enzymes demonstrated a robust and specific catalytic activity towards the physiological substrate proteins, p70 S6 ribosomal protein kinase 1 (p70S6K1) and eIF4E binding protein 1 (4EBP1), as measured with phospho-specific antibodies in Western blotting. We developed a high capacity dissociation-enhanced lanthanide fluorescence immunoassay (DELFIA) for analysis of kinetic parameters. The Michaelis constant (Km) values of TOR for ATP and the His6-S6K substrate were shown to be 50 and 0.8 μM, respectively. Dose-response and inhibition mechanisms of several known inhibitors, the rapamycin-FKBP12 complex, wortmannin and LY294002, were also studied in DELFIA. Our data indicate that TOR exhibits kinetic features of those shared by traditional serine/threonine kinases and demonstrate the feasibility of TOR enzyme screens in searching for new inhibitors
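    With the reported Km values, fractional saturation follows directly from the Michaelis-Menten relation; Vmax and the substrate concentration below are placeholders, since the abstract does not report them:

```python
def mm_rate(s_um, km_um, vmax=1.0):
    """Michaelis-Menten velocity v = Vmax * S / (Km + S), concentrations in uM."""
    return vmax * s_um / (km_um + s_um)

# Fractional saturation at an assumed 100 uM of each substrate:
print(f"ATP site (Km = 50 uM):       {mm_rate(100.0, 50.0):.2f} of Vmax")  # 0.67
print(f"His6-S6K site (Km = 0.8 uM): {mm_rate(100.0, 0.8):.2f} of Vmax")   # 0.99
```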

  11. Crystallization and preliminary X-ray crystallographic analysis of a full-length active form of the Cry4Ba toxin from Bacillus thuringiensis

    The crystallization of the Cry4Ba toxin from B. thuringiensis is described. To obtain a complete structure of the Bacillus thuringiensis Cry4Ba mosquito-larvicidal protein, a 65 kDa functional form of the Cry4Ba-R203Q mutant toxin was generated for crystallization by eliminating the tryptic cleavage site at Arg203. The 65 kDa trypsin-resistant fragment was purified and crystallized using the sitting-drop vapour-diffusion method. The crystals belonged to the rhombohedral space group R32, with unit-cell parameters a = b = 184.62, c = 187.36 Å. Diffraction data were collected to at least 2.07 Å resolution using synchrotron radiation and gave a data set with an overall Rmerge of 9.1% and a completeness of 99.9%. Preliminary analysis indicated that the asymmetric unit contained one molecule of the active full-length mutant, with a VM coefficient and solvent content of 4.33 Å³ Da⁻¹ and 71%, respectively
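    The reported solvent content can be cross-checked from the Matthews coefficient with the standard estimate solvent ≈ 1 − 1.23/VM:

```python
vm = 4.33                        # Matthews coefficient, A^3/Da, as reported
solvent = 1.0 - 1.23 / vm        # standard protein-crystal estimate
print(f"solvent content ~ {solvent:.0%}")   # ~72%, matching the reported 71%
```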

  12. Decomposition by tree dimension in Horn clause verification

    Kafle, Bishoksan; Gallagher, John Patrick; Ganty, Pierre

    In this paper we investigate the use of the concept of tree dimension in Horn clause analysis and verification. The dimension of a tree is a measure of its non-linearity - for example a list of any length has dimension zero while a complete binary tree has dimension equal to its height. We apply ...
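    The dimension described here coincides with the Horton-Strahler number of the tree; a minimal recursive sketch (the tuple encoding is our own):

```python
def dimension(tree):
    """Tree dimension: () is a leaf (dimension 0); an inner node has the
    maximal child dimension, +1 if that maximum is attained more than once."""
    if not tree:
        return 0
    dims = sorted(dimension(child) for child in tree)
    if len(dims) >= 2 and dims[-1] == dims[-2]:
        return dims[-1] + 1
    return dims[-1]

chain = ((((),),),)                  # a "list" of any length has dimension 0
full2 = (((), ()), ((), ()))         # complete binary tree of height 2
print(dimension(chain), dimension(full2))   # 0 2
```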

  13. INF and IAEA: A comparative analysis of verification strategy

    This is the final report of a study on the relevance and possible lessons of Intermediate Range Nuclear Force (INF) verification to the International Atomic Energy Agency (IAEA) international safeguards activities

  14. Distorted Fingerprint Verification System

    Divya KARTHIKAESHWARAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced in the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework was developed, and the cost-sensitive classifier was found to produce the best results. The system was evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  15. Material integrity verification radar

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates on the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  16. Robust verification analysis

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-01

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
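    One standard ingredient of such an analysis is the observed order of convergence estimated from three systematically refined solutions, with robust (median) statistics applied across triplets; the sketch below shows only that ingredient, not the authors' full constrained-optimization machinery:

```python
import numpy as np

def observed_order(f_fine, f_mid, f_coarse, r=2.0):
    """p = ln((f_coarse - f_mid)/(f_mid - f_fine)) / ln(r), grid ratio r."""
    return np.log((f_coarse - f_mid) / (f_mid - f_fine)) / np.log(r)

# Illustrative second-order sequence converging to 1.0 on grids h = 1/8..1/64:
f = [1.0 + h**2 for h in (1/8, 1/16, 1/32, 1/64)]
orders = [observed_order(f[i + 2], f[i + 1], f[i]) for i in range(2)]
print(np.median(orders))   # ~2.0; the median guards against outlier triplets
```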

  17. Secure Location Verification

    Becker, Georg T.; Lo, Sherman C.; De Lorenzo, David S.; Enge, Per K.; Paar, Christof

    2010-01-01

    The use of location based services has increased significantly over the last few years. However, location information is only sparsely used as a security mechanism. One of the reasons for this is the lack of location verification techniques with global coverage. Recently, a new method for authenticating signals from Global Navigation Satellite Systems(GNSS) such as GPS or Galileo has been proposed. In this paper, we analyze the security of this signal authentication mechanism and show how it ...

  18. Distorted Fingerprint Verification System

    Divya KARTHIKAESHWARAN; Jeyalatha SIVARAMAKRISHNAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced in fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment based fingerprint matching, fuzzy clustering and classifier framework. First, an enhanced input fingerprint image has been aligned with the...

  19. TFE verification program

    1990-03-01

    The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a Thermionic Fuel Element (TFE) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown. Five prior programs form the basis for the TFE Verification Program: (1) AEC/NASA program of the 1960s and early 1970s; (2) SP-100 concept development program; (3) SP-100 thermionic technology program; (4) Thermionic irradiations program in TRIGA in FY-88; and (5) Thermionic Program in 1986 and 1987.

  20. Continuous verification using multimodal biometrics.

    Sim, Terence; Zhang, Sheng; Janakiraman, Rajkumar; Kumar, Sandeep

    2007-04-01

    Conventional verification systems, such as those controlling access to a secure room, do not usually require the user to reauthenticate himself for continued access to the protected resource. This may not be sufficient for high-security environments in which the protected resource needs to be continuously monitored for unauthorized use. In such cases, continuous verification is needed. In this paper, we present the theory, architecture, implementation, and performance of a multimodal biometrics verification system that continuously verifies the presence of a logged-in user. Two modalities are currently used--face and fingerprint--but our theory can be readily extended to include more modalities. We show that continuous verification imposes additional requirements on multimodal fusion when compared to conventional verification systems. We also argue that the usual performance metrics of false accept and false reject rates are insufficient yardsticks for continuous verification and propose new metrics against which we benchmark our system. PMID:17299225
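    One simple way to picture continuous verification is a trust score that is refreshed by each biometric observation and decays between observations; this generic sketch is our own illustration, not the authors' holistic fusion model:

```python
import time

class ContinuousVerifier:
    def __init__(self, half_life_s=60.0, threshold=0.5):
        self.half_life = half_life_s
        self.threshold = threshold
        self.trust, self.t_last = 0.0, time.monotonic()

    def _decayed(self):
        dt = time.monotonic() - self.t_last
        return self.trust * 0.5 ** (dt / self.half_life)

    def observe(self, score):            # score in [0, 1] from any modality
        self.trust = max(self._decayed(), score)
        self.t_last = time.monotonic()

    def still_verified(self):
        return self._decayed() >= self.threshold

v = ContinuousVerifier()
v.observe(0.9)                 # e.g. a strong fingerprint match
print(v.still_verified())      # True shortly after the observation
```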

  1. The NRC measurement verification program

    A perspective is presented on the US Nuclear Regulatory Commission (NRC) approach for effectively monitoring the measurement methods and directly testing the capability and performance of licensee measurement systems. A main objective in material control and accounting (MC and A) inspection activities is to assure the accuracy and precision of the accounting system and the absence of potential process anomalies through overall accountability. The primary means of verification remains the NRC random sampling during routine safeguards inspections. This involves the independent testing of licensee measurement performance with statistical sampling plans for physical inventories, item control, and auditing. A prospective cost-effective alternative overcheck is also discussed in terms of an externally coordinated sample exchange or ''round robin'' program among participating fuel cycle facilities in order to verify the quality of measurement systems, i.e., to assure that analytical measurement results are free of bias

  2. Quantum money with classical verification

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it

  3. Quantum money with classical verification

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  4. Verification of nuclear material accounts as a management function

    UKAEA management has established a technically based Nuclear Materials Accounting Control Team (NMACT) whose responsibilities include verification of nuclear material holdings, independently of local management, at the six Authority establishments. This paper reviews the principles on which this technical verification work is based and discusses, where appropriate, the relevance to and differences from Safeguards' inspections. Preliminary results are presented on both record audit and physical verification activities at a fissile material store, a plutonium laboratory, a prototype power reactor, an experimental fuels laboratory, and a fuel fabrication plant. (author)

  5. Criteria for structural verification of fast reactor core elements

    Structural and functional criteria and the related verifications of the PEC reactor fuel element are presented and discussed. Particular attention has been given to differentiating the structural verifications of low neutronic damage zones from those of high neutronic damage zones. The structural verification criteria, which had already been presented at the 8th SMIRT Seminar Conference in Paris, underwent some modifications during preparation of the Safety Report. Finally, some activities necessary for structural criteria validation are indicated, in particular for irradiated components, and for converging towards a European fast reactor code. (author). 3 refs, 6 tabs

  6. Verification of wet blasting decontamination technology

    Macoho Co., Ltd. participated in the projects 'Decontamination Verification Test FY 2011 by the Ministry of the Environment' and 'Decontamination Verification Test FY 2011 by the Cabinet Office', carrying out verification tests of a wet blasting technology for decontamination of rubble and roads contaminated by the accident at the Tokyo Electric Power Company's Fukushima Daiichi Nuclear Power Plant. In the verification tests, the wet blasting decontamination technology achieved a decontamination rate of 60-80% for concrete paving, interlocking, and dense-graded asphalt pavement when applied to road decontamination. When applied to rubble decontamination, the decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. Cs-134 and Cs-137 were found to attach to the fine sludge scraped off the decontaminated object, and the sludge could be separated from the abrasive by wet cyclone classification: the activity concentration of the abrasive is 1/30 or less of that of the sludge. The result shows that the abrasive can be reused without problems when the wet blasting decontamination technology is used. (author)

  7. Scalable Techniques for Formal Verification

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  8. Core seismic methods verification report

    Information on HTGR reactor core seismic requirements is presented concerning element properties and code parameters; correlation and verification of the codes; sensitivity studies; and application to design

  9. EU Environmental Technology Verification pilot programme - Guidance documents: Guidelines for the workflow of documents and information between Verification Bodies, Technical Working Groups and Commission Services

    BARBOSA LANHAM ANA; PIERS DE RAVESCHOOT RONALD; SCHOSGER Jean-Pierre; Henry, Pierre

    2014-01-01

    Environmental Technology Verification (ETV) is a new tool to enable the verification of the claims made for environmental technologies. The Programme is set up around Technical Working Groups (TWGs), one for each technology area active under the Pilot programme. These are chaired by the JRC and composed of Commission Invited Experts and of Experts representing the Verification Bodies, with the overall aim of harmonising and exchanging good practices among member states. ...

  10. Einstein's Length Concept

    Einstein's procedure for measuring the length of a rod moving with velocity υ (0 ≤ |υ| < c) is discussed. The part of this procedure that measures the length of the resting (υ = 0) rod is realizable and leads to the elongation of the moving rod. The other part of Einstein's procedure, measuring the length of the moving (υ ≠ 0) rod, is not realizable and leads to the contraction of the moving rod. As a result of this analysis, the moving-rod contraction concept is argued to be physically unfounded. (author). 8 refs; 1 fig

  11. High-level verification

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  12. Advanced formal verification

    Drechsler, Rolf

    2007-01-01

    Preface. Contributing Authors. Introduction; R. Drechsler. 1. Formal Verification. 2. Challenges. 3. Contributions to this Book. 1: What SAT-Solvers Can and Cannot Do; E. Goldberg. 1. Introduction. 2. Hard Equivalence Checking CNF Formulas. 3. Stable Sets of Points. 2: Advancements in Mixed BDD and SAT Techniques; G. Cabodi, S. Quer. 1. Introduction. 2. Background. 3. Comparing SAT and BDD Approaches: Are they Different? 4. Decision Diagrams as a Slave Engine in General SAT: Clause Compression by Means of ZBDDs. 5. Decision Diagram Preprocessing and Circuit-Based SAT. 6. Using SAT in Symbolic

  13. Verification of LHS distributions.

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for the normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
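    The same style of check is easy to reproduce: draw a stratified (Latin hypercube style) normal sample and apply the named tests. A sketch using SciPy, independent of Sandia's LHS code:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 1000
# One-dimensional LHS: one uniform draw per equal-probability stratum,
# shuffled, then mapped through the inverse normal CDF.
u = (np.arange(n) + rng.uniform(size=n)) / n
sample = stats.norm.ppf(rng.permutation(u))

print(stats.kstest(sample, "norm"))              # large p-value -> consistent
print(stats.anderson(sample, dist="norm").statistic)
```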

  14. A Characteristic Particle Length

    Roberts, Mark D

    2015-01-01

    It is argued that there are characteristic intervals associated with any particle that can be derived without reference to the speed of light $c$. Such intervals are inferred from zeros of wavefunctions which are solutions to the Schrödinger equation. The characteristic length is $\ell = \beta^2\hbar^2/(8Gm^3)$, where $\beta = 3.8\ldots$; this length might lead to observational effects on objects the size of a virus.
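    A quick numerical check of the quoted formula (constants to four figures; the virus-scale mass is an illustrative choice):

```python
hbar = 1.0546e-34     # J s
G = 6.674e-11         # m^3 kg^-1 s^-2
beta = 3.8

def char_length(m_kg):
    """l = beta^2 * hbar^2 / (8 * G * m^3)."""
    return beta**2 * hbar**2 / (8.0 * G * m_kg**3)

m = 1.4e-17           # kg, roughly the mass of a large virus (~14 fg)
print(f"l = {char_length(m):.2e} m")   # ~1.1e-7 m, i.e. virus-sized
```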

  15. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
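    The plan structure described above maps naturally onto a small data model; a hedged sketch with invented field values (this is not LSST's actual SysML/Enterprise Architect model):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class VerificationActivity:
    requirement: str                  # the requirement being verified
    verification_requirement: str
    success_criteria: str
    method: str                       # inspection, analysis, demonstration, test
    level: str                        # e.g. subsystem, system
    owner: str

@dataclass
class VerificationEvent:              # activities executable concurrently
    name: str
    activities: List[VerificationActivity] = field(default_factory=list)

act = VerificationActivity(
    requirement="REQ-0042",           # hypothetical identifier
    verification_requirement="Verify the image-quality budget",
    success_criteria="PSF FWHM within budget on test data",
    method="test",
    level="system",
    owner="Systems Engineering",
)
event = VerificationEvent("Integrated optics test campaign", [act])
print(event.name, len(event.activities))
```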

  16. Clinical Verification of Homeopathy

    Michel Van Wassenhoven

    2011-07-01

    The world is changing! This is certainly true regarding homeopathic practice and access to homeopathic medicine. Therefore our first priority at the ECH-LMHI [1] has been to produce a yearly report on the scientific framework of homeopathy. In the 2010 version a new chapter about epidemic diseases has been added, including the Leptospirosis survey on the Cuban population. A second priority has been to review the definition of homeopathic medicines, respecting the new framework generated by the official registration procedure and the WHO report. We are now working on a documented (Materia Medica and provings) list of homeopathic remedies to facilitate the registration of our remedies. The new challenges are: first of all, more good research proposals and as such more funding (possible through ISCHI and the Blackie Foundation, as examples [2]); international acceptance of new guidelines for proving and clinical verification of homeopathic symptoms (proposals are ready for discussion); and a total reconsideration of the homeopathic repertories, including results of the clinical verification of the symptoms. The world is changing, we are part of the world, and changes are needed also for homeopathy!

  17. HDL to verification logic translator

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  18. Generic System Verilog Universal Verification Methodology Based Reusable Verification Environment for Efficient Verification of Image Signal Processing IPS/SOCS

    Abhishek Jain

    2013-01-01

    In this paper, we present a Generic System Verilog Universal Verification Methodology based Reusable Verification Environment for efficient verification of Image Signal Processing IP's/SoC's. With the tight schedules on all projects it is important to have a strong verification methodology which contributes to First Silicon Success. Deploying methodologies which enforce full functional coverage and verification of corner cases through pseudo random test scenarios is required. Also, standardization of the verification flow is needed. Previously, inside the imaging group of ST, a Specman (e)/Verilog based Verification Environment for IP/Subsystem level verification and a C/C++/Verilog based Directed Verification Environment for SoC Level Verification were used for Functional Verification. Different Verification Environments were used at IP level and SoC level. Different Verification/Validation Methodologies were used for SoC Verification across multiple sites. Verification teams were also looking for ways to catch bugs early in the design cycle. Thus, a Generic System Verilog Universal Verification Methodology (UVM) based Reusable Verification Environment is required to avoid the problem of having so many methodologies and to provide a standard unified solution which compiles on all tools.

  19. Generic System Verilog Universal Verification Methodology Based Reusable Verification Environment for Efficient Verification of Image Signal Processing IPS/SOCS

    Abhishek Jain

    2012-12-01

    In this paper, we present a Generic System Verilog Universal Verification Methodology based Reusable Verification Environment for efficient verification of Image Signal Processing IP's/SoC's. With the tight schedules on all projects it is important to have a strong verification methodology which contributes to First Silicon Success. Deploying methodologies which enforce full functional coverage and verification of corner cases through pseudo random test scenarios is required. Also, standardization of the verification flow is needed. Previously, inside the imaging group of ST, a Specman (e)/Verilog based Verification Environment for IP/Subsystem level verification and a C/C++/Verilog based Directed Verification Environment for SoC Level Verification were used for Functional Verification. Different Verification Environments were used at IP level and SoC level. Different Verification/Validation Methodologies were used for SoC Verification across multiple sites. Verification teams were also looking for ways to catch bugs early in the design cycle. Thus, a Generic System Verilog Universal Verification Methodology (UVM) based Reusable Verification Environment is required to avoid the problem of having so many methodologies and to provide a standard unified solution which compiles on all tools.

  20. Verification and Validation in Computational Fluid Dynamics; TOPICAL

    Verification and validation (V and V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V and V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V and V, and develops a number of extensions to existing ideas. The review of the development of V and V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V and V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized

  1. Spent Nuclear Fuel (SNF) Project Design Verification and Validation Process

    This document provides a description of design verification and validation activities implemented by the Spent Nuclear Fuel (SNF) Project. During the execution of early design verification, a management assessment (Bergman, 1999) and external assessments on configuration management (Augustenburg, 1999) and testing (Loscoe, 2000) were conducted and identified potential uncertainties in the verification process. This led the SNF Chief Engineer to implement corrective actions to improve process and design products. This included Design Verification Reports (DVRs) for each subproject, validation assessments for testing, and verification of the safety function of systems and components identified in the Safety Equipment List to ensure that the design outputs were compliant with the SNF Technical Requirements. Although some activities are still in progress, the results of the DVR and associated validation assessments indicate that Project requirements for design verification are being effectively implemented. These results have been documented in subproject-specific technical documents (Table 2). Identified punch-list items are being dispositioned by the Project. As these remaining items are closed, the technical reports (Table 2) will be revised and reissued to document the results of this work

  2. Spent Nuclear Fuel (SNF) Project Design Verification and Validation Process

    OLGUIN, L.J.

    2000-09-25

    This document provides a description of design verification and validation activities implemented by the Spent Nuclear Fuel (SNF) Project. During the execution of early design verification, a management assessment (Bergman, 1999) and external assessments on configuration management (Augustenburg, 1999) and testing (Loscoe, 2000) were conducted and identified potential uncertainties in the verification process. This led the SNF Chief Engineer to implement corrective actions to improve process and design products. This included Design Verification Reports (DVRs) for each subproject, validation assessments for testing, and verification of the safety function of systems and components identified in the Safety Equipment List to ensure that the design outputs were compliant with the SNF Technical Requirements. Although some activities are still in progress, the results of the DVR and associated validation assessments indicate that Project requirements for design verification are being effectively implemented. These results have been documented in subproject-specific technical documents (Table 2). Identified punch-list items are being dispositioned by the Project. As these remaining items are closed, the technical reports (Table 2) will be revised and reissued to document the results of this work.

  3. Online fingerprint verification.

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on the emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results show that this system can be used effectively for secure online verification applications. PMID:17365425
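    In the spirit of filter-bank representations (e.g. FingerCode-like schemes), the sketch below computes block-wise Gabor response energies as a fixed-length feature vector; parameters are illustrative and this is not necessarily the exact system of the paper:

```python
import numpy as np

def gabor_kernel(theta, size=15, sigma=4.0, freq=0.1):
    """Gabor filter with an isotropic Gaussian envelope and oriented carrier."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

def features(img, n_orient=8, block=16):
    feats = []
    for k in range(n_orient):
        kern = gabor_kernel(np.pi * k / n_orient)
        # FFT-based circular convolution (adequate for a sketch)
        resp = np.fft.irfft2(np.fft.rfft2(img) * np.fft.rfft2(kern, img.shape))
        h, w = img.shape
        for i in range(0, h - block + 1, block):
            for j in range(0, w - block + 1, block):
                feats.append(np.abs(resp[i:i + block, j:j + block]).mean())
    return np.asarray(feats)

img = np.random.rand(64, 64)     # stand-in for an enhanced fingerprint image
print(features(img).shape)       # 8 orientations x 16 blocks -> (128,)
```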

  4. Verification of uncertainty budgets

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    […] because their influence requires samples taken at long intervals, e.g., the acquisition of a new calibrant. It is therefore recommended to include verification of the uncertainty budget in the continuous QA/QC monitoring; this will eventually lead to a test also for such rarely occurring effects. […] full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between observed and expected variability is tested by means of the T-test, which follows a chi-square distribution with a number of degrees of freedom determined by the number of replicates. Significant deviations between predicted and observed variability may be caused by a variety of effects, and examples will […]
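    A generic form of such a test compares the observed replicate variance with the variance predicted by the budget via a chi-square statistic; the formulation below is a common textbook variant, not necessarily the authors' exact T-test:

```python
import numpy as np
from scipy import stats

def budget_test(replicates, u_budget, alpha=0.05):
    """(n-1) * s^2 / u^2 ~ chi-square(n-1) if the budget's u is correct."""
    x = np.asarray(replicates, dtype=float)
    nu = x.size - 1
    T = nu * x.var(ddof=1) / u_budget**2
    p = stats.chi2.sf(T, nu)
    return T, p, p > alpha        # True -> variability consistent with budget

reps = [10.2, 9.8, 10.1, 10.4, 9.9]   # illustrative replicate measurements
print(budget_test(reps, u_budget=0.25))
```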

  5. Reconfigurable system design and verification

    Hsiung, Pao-Ann; Huang, Chun-Hsian

    2009-01-01

    Reconfigurable systems have pervaded nearly all fields of computation and will continue to do so for the foreseeable future. Reconfigurable System Design and Verification provides a compendium of design and verification techniques for reconfigurable systems, allowing you to quickly search for a technique and determine if it is appropriate to the task at hand. It bridges the gap between the need for reconfigurable computing education and the burgeoning development of numerous different techniques in the design and verification of reconfigurable systems in various application domains. The text e

  6. Verification of the Danish 1990, 2000 and 2010 emission inventory data

    Fauser, Patrik; Nielsen, Malene; Winther, Morten; Plejdrup, Marlene Schmidt; Gyldenkærne, Steen; Mikkelsen, Mette Hjorth; Albrektsen, Rikke; Hoffmann, Leif; Thomsen, Marianne; Hjelgaard, Katja; Nielsen, Ole-Kenneth

    Danish emission values, implied emission factors and activity data for the national greenhouse gas inventory are assessed according to an updated verification procedure. Focus is on 25 identified key categories, represented by 29 verification categories, and 28 Annex II indicators covering energy […] is made with data for energy consumption (Eurostat), agricultural statistics (Eurostat), industrial processes (UN) and waste disposal (OECD). Verification in this approach is a combination of qualitative and quantitative assessments and can help identify sectors and categories that require more…

  7. NDA techniques for spent fuel verification and radiation monitoring. Report on activities 6a and 6b of Task JNT C799 (SAGOR). Finnish support programme to the IAEA safeguards

    A variety of NDA methods exist for measurement of spent fuel at various stages of the disposition process. Each of the methods has weaknesses and strengths that make them applicable to one or more stages in disposition. Both passive and active methods are, under favorable conditions, capable of providing either a mapping of an assembly to identify missing fuel pins or a measurement of the fissile content and some are capable of providing a mapping of a canister to identify missing assemblies or a measurement of the fissile content. However, a spent fuel measurement system capable of making routine partial defect tests of spent fuel assemblies is missing. The active NDA methods, in particular, the active neutron methods, hold the most promise for providing quantitative measurements on fuel assemblies and canisters. Application of NDA methods to shielded casks may not be practical or even possible due to the extent of radiation attenuation by the shielding materials, and none of these methods are considered to have potential for quantitative measurements once the spent fuel cask has been placed in a repository. The most practical approach to spent fuel verification is to confirm the characteristics of the spent fuel prior to loading in a canister or cask at the conditioning facility. Fissile material tracking systems in addition to containment and surveillance methods have the capability to assure continuity of the verified knowledge of the sample from loading of the canisters to final disposal and closing of the repository. (orig.)

  8. NDA techniques for spent fuel verification and radiation monitoring. Report on activities 6a and 6b of Task JNT C799 (SAGOR). Finnish support programme to the IAEA safeguards

    Tarvainen, M. [Finnish Centre for Radiation and Nuclear Safety, Helsinki (Finland); Levai, F. [Technical Univ., Budapest (Hungary); Valentine, T.E. [Oak Ridge National Lab., TN (United States); Abhold, M. [Los Alamos National Lab., NM (United States); Moran, B. [USNRC, Washington, DC (United States)

    1997-08-01

    A variety of NDA methods exist for measurement of spent fuel at various stages of the disposition process. Each of the methods has weaknesses and strengths that make them applicable to one or more stages in disposition. Both passive and active methods are, under favorable conditions, capable of providing either a mapping of an assembly to identify missing fuel pins or a measurement of the fissile content and some are capable of providing a mapping of a canister to identify missing assemblies or a measurement of the fissile content. However, a spent fuel measurement system capable of making routine partial defect tests of spent fuel assemblies is missing. The active NDA methods, in particular, the active neutron methods, hold the most promise for providing quantitative measurements on fuel assemblies and canisters. Application of NDA methods to shielded casks may not be practical or even possible due to the extent of radiation attenuation by the shielding materials, and none of these methods are considered to have potential for quantitative measurements once the spent fuel cask has been placed in a repository. The most practical approach to spent fuel verification is to confirm the characteristics of the spent fuel prior to loading in a canister or cask at the conditioning facility. Fissile material tracking systems in addition to containment and surveillance methods have the capability to assure continuity of the verified knowledge of the sample from loading of the canisters to final disposal and closing of the repository. (orig.). 49 refs.

  9. FINAL REPORT –INDEPENDENT VERIFICATION SURVEY SUMMARY AND RESULTS FOR THE ARGONNE NATIONAL LABORATORY BUILDING 330 PROJECT FOOTPRINT, ARGONNE, ILLINOIS

    ERIKA N. BAILEY

    2012-02-29

    ORISE conducted onsite verification activities of the Building 330 project footprint during the period of June 6 through June 7, 2011. The verification activities included technical reviews of project documents, visual inspections, radiation surface scans, and sampling and analysis. The draft verification report was issued in July 2011 with findings and recommendations. The contractor performed additional evaluations and remediation.

  10. Radiochemical verification and validation in the environmental data collection process

    A credible and cost-effective environmental data collection process should produce analytical data which meet regulatory and program-specific requirements. Analytical data, which support the sampling and analysis activities at hazardous waste sites, undergo verification and independent validation before the data are submitted to regulators. Understanding the difference between verification and validation and their respective roles in the sampling and analysis process is critical to the effectiveness of a program. Verification is deciding whether the measurement data obtained are what was requested; the verification process determines whether all the requirements were met. Validation is more complicated than verification: it attempts to assess the impacts on data use, especially when requirements are not met, and it becomes part of the decision-making process. Radiochemical data consist of a sample result with an associated error. Therefore, radiochemical validation is different from, and more quantitative than, what is currently possible for the validation of hazardous chemical data. Radiochemical data include both results and uncertainty that can be statistically compared to identify the significance of differences in a more technically defensible manner. Radiochemical validation makes decisions about analyte identification, detection, and uncertainty for a batch of data. The process focuses on the variability of the data in the context of the decision to be made. The objectives of this paper are to present radiochemical verification and validation for environmental data and to distinguish the differences between the two operations.
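
    The point that radiochemical results carry a quantified uncertainty, unlike most hazardous chemical data, can be made concrete with a standard significance test on the difference of two reported results. A minimal sketch, assuming 1-sigma counting uncertainties and a 95% (k=1.96) criterion; the numeric values are invented for illustration:

        import math

        def differ_significantly(result_a, sigma_a, result_b, sigma_b, k=1.96):
            """Test whether two radiochemical results differ at ~95% confidence.

            Each result carries its 1-sigma uncertainty; the difference is
            significant if it exceeds k combined standard deviations.
            """
            combined_sigma = math.hypot(sigma_a, sigma_b)
            return abs(result_a - result_b) > k * combined_sigma

        # Example: a sample result vs. a duplicate analysis (values invented).
        print(differ_significantly(12.4, 0.8, 10.9, 0.7))   # False: consistent
        print(differ_significantly(12.4, 0.8, 8.1, 0.7))    # True: differs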

  11. Seasonal variation in the length of the daily activity period in buffy-headed marmosets (Callithrix flaviceps): an important consideration for the analysis of foraging strategies in observational field studies of primates.

    Ferrari, Stephen F; Hilário, Renato R

    2014-04-01

    Activity budgets are widely used in primate behavioral studies for the analysis of ecological strategies. In some cases, there is considerable seasonal variation in the length of the daily activity period. Here, activity budgets from two field studies of Callithrix flaviceps were compiled first by the traditional approach (proportion of scan sample records) and then by considering the proportion of time dedicated to each activity over the 24-hr cycle (adjusted budget). Both groups were almost invariably active for at least 1-2 hr less than the daylight period, with significantly shorter activity periods during the austral winter, when the daylight period was up to 2:35 hr shorter than in the summer. The adjustment of activity budgets provided a completely different perspective on foraging strategies. Whereas the basic budgets indicated a significant increase in foraging and moving during the resource-poor dry season (winter) months, the time-adjusted data revealed that the primary strategy was a time-minimizing one, with the animals simply spending more time at rest during the longer activity periods of summer days. While both groups followed the same pattern of relatively short activity periods, there were considerable differences between sites in the mean duration of the period in a given month, and in behavior patterns, although the analysis of the determining factors was beyond the scope of the present study. Overall, the results of the study indicate that the manipulation of the duration of the daily activity period may be an integral component of primate behavioral strategies, and that this parameter should be taken into account systematically when evaluating activity patterns, especially at sites at relatively high latitudes where day length may vary considerably over the course of the year. PMID:24323495
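
    The adjustment described above amounts to weighting each period's scan-record proportions by the length of that period's activity day, turning proportions into hours. A toy illustration of the arithmetic (all numbers invented, not taken from the study):

        # Hypothetical scan-sample budgets (proportions of records) and
        # daily activity-period lengths for a winter day and a summer day.
        budgets = {
            "winter": {"forage": 0.40, "move": 0.25, "rest": 0.35},
            "summer": {"forage": 0.30, "move": 0.20, "rest": 0.50},
        }
        activity_hours = {"winter": 9.5, "summer": 12.0}

        # Time-adjusted budget: hours per day spent on each activity. Similar
        # absolute foraging time can underlie very different raw proportions,
        # which is how a time-minimizing strategy stays hidden in basic budgets.
        for season, budget in budgets.items():
            hours = {act: round(p * activity_hours[season], 2) for act, p in budget.items()}
            print(season, hours)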

  12. Numident Online Verification Utility (NOVU)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  13. TPS verification with UUT simulation

    Wang, Guohua; Meng, Xiaofeng; Zhao, Ruixian

    2006-11-01

    TPS (Test Program Set) verification, or first-article acceptance testing, commonly depends on fault-insertion experiments on the UUT (Unit Under Test). However, the failure modes that can be injected on a UUT are limited, and the approach is almost infeasible when the UUT is still in development or in a distributed state. To resolve this problem, a TPS verification method based on simulation of the UUT interface signals is put forward. Interoperability between the ATS (automatic test system) and the UUT simulation platform is essential for automatic TPS verification. After analyzing the ATS software architecture, an approach to achieving interoperability between the ATS software and the UUT simulation platform is proposed, and a software architecture for the UUT simulation platform is then derived from the ATS software architecture. The hardware composition and software architecture of the UUT simulator are described in detail. The UUT simulation platform has been applied in the development, debugging, and verification of avionics equipment TPSs.

  14. Atomic frequency-time-length standards

    The principles of operation of atomic frequency-time-length standards and their principal characteristics are described. The role of quartz crystal oscillators that are slaved to active or passive standards is presented. (authors)

  15. Trajectory Based Behavior Analysis for User Verification

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers require a verification step for authorized access. The goal of verification is to tell the true account owner apart from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to resist copying or simulation by non-authorized users or even automatic programs such as bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learned tuning for capturing the pairwise relationship. Based on the pairwise relationship, we can plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
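
    The abstract gives the model family but no implementation details. As a rough sketch of the kind of scoring such a system might perform, the code below fits a first-order Markov model with a Gaussian step-transition density to an enrolled 2-D trajectory and scores a new attempt by average log-likelihood; the function names and the acceptance threshold are assumptions, not the authors' method:

        import numpy as np

        def fit_gaussian_markov(traj):
            """Fit a Gaussian model to the step vectors of an (n, 2) trajectory."""
            steps = np.diff(traj, axis=0)
            mean = steps.mean(axis=0)
            cov = np.cov(steps.T) + 1e-6 * np.eye(2)   # regularize
            return mean, cov

        def avg_log_likelihood(traj, model):
            """Average log-likelihood of a trajectory's steps under the model."""
            mean, cov = model
            d = np.diff(traj, axis=0) - mean
            quad = np.einsum('ij,jk,ik->i', d, np.linalg.inv(cov), d)
            log_norm = 0.5 * np.log((2 * np.pi) ** 2 * np.linalg.det(cov))
            return float(np.mean(-0.5 * quad - log_norm))

        # Accept the claim if the claimed owner's model explains the new
        # trajectory well enough (threshold is an illustrative assumption).
        enrolled = np.cumsum(np.random.randn(500, 2), axis=0)
        attempt = np.cumsum(np.random.randn(200, 2), axis=0)
        accept = avg_log_likelihood(attempt, fit_gaussian_markov(enrolled)) > -5.0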

  16. Video-Based Fingerprint Verification

    Lili Liu; Yilong Yin; Wei Qin

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and...

  17. Verification of Monte Carlo transport codes FLUKA, Mars and Shield

    The present study is a continuation of the project 'Verification of Monte Carlo Transport Codes', which is running at GSI as a part of activation studies of FAIR-relevant materials. It includes two parts: verification of the stopping modules of FLUKA, MARS and SHIELD-A (with the ATIMA stopping module), and verification of their isotope production modules. The first part is based on measurements of the energy deposition function of uranium ions in copper and stainless steel. The irradiation was done at 500 MeV/u and 950 MeV/u; the experiment was held at GSI from September 2004 until May 2005. The second part is based on gamma-activation studies of an aluminium target irradiated with an argon beam of 500 MeV/u in August 2009. Experimental depth profiling of the residual activity of the target is compared with the simulations. (authors)

  18. Addressing verification challenges [International safeguards symposium on addressing verification challenges

    now provide information relevant to physical protection as well. The IAEA does not receive all the information it would need, for example systematic information from the Nuclear Suppliers Group on exports and imports. Other challenges are financial resources (IAEA's budget: $130 million) and the IAEA laboratories in Vienna, which are not equipped for state-of-the-art analysis of environmental samples. There is also a need for transparency measures in certain situations - for example, interviewing people and having access to documents. Another challenge is how to deal with countries that have already begun weaponization activities: how to verify that weapons have been dismantled, weaponization structures have been destroyed, and custody has been taken of weapon design information. The IAEA recently moved from a system based on facility verification to a State-level safeguards approach. The IAEA has also introduced an integrated safeguards approach, which is more cost effective and enables the IAEA to provide better assurances. Environmental sampling and satellite monitoring are new tools that the IAEA is now using almost routinely. Moreover, the IAEA is continuing to work with the Member States to develop new verification tools. Each of the issues discussed presents its own challenge, and there is hope for input and new ideas provided by the participants. The real purpose of the symposium is to determine how the IAEA can continue to be effective and relevant, and a valuable instrument to help the international community deal with nuclear weapons proliferation

  19. SEMI-AUTOMATIC SPEAKER VERIFICATION SYSTEM

    E. V. Bulgakova

    2016-03-01

    Subject of Research. The paper presents a semi-automatic speaker verification system based on comparing formant values, statistics of phone lengths, and melodic characteristics. Owing to the development of speech technology, there is now increased interest in expert speaker verification systems that offer high reliability and low labour intensiveness through automation of the data processing required for expert analysis. System Description. We present a description of a novel system that analyzes the similarity or distinction of speaker voices based on comparing statistics of phone lengths, formant features, and melodic characteristics. The characteristic feature of the proposed fusion-based system is a weak correlation between the analyzed features, which leads to a decrease in the speaker recognition error rate. The system's advantage is the possibility of rapid analysis of recordings, since the data preprocessing and decision-making processes are automated. We describe the individual methods as well as their fusion for combining decisions. Main Results. We have tested the system on a speech database of 1190 target trials and 10450 non-target trials, including Russian speech of male and female speakers. The recognition accuracy of the system is 98.59% on the database containing records of male speech, and 96.17% on the database containing records of female speech. It was also experimentally established that the formant method is the most reliable of the methods used. Practical Significance. Experimental results have shown that the proposed system is applicable to the speaker recognition task in the course of phonoscopic examination.

  20. Multi-project verification of Swedish AIJ projects. Verification results and documentation

    Uzzell, J.; Lehmann, M.; Nestaas, I.; Telnes, E.; Lund, T.; Vaage, G. [Det Norske Veritas AS, Hoevik (Norway)

    2000-03-01

    In 2000 DNV was engaged by the Swedish National Energy Administration to carry out a pilot multi-project verification of Swedish AIJ (Activities Implemented Jointly) projects located in the Baltic countries of Estonia, Latvia, and Lithuania. The CO2 emission reductions from 27 fuel-switch projects were verified as a case study for the multi-project verification methodology developed by DNV. These AIJ projects replaced fossil fuel boilers with advanced biofuel boiler technology. The biofuel boilers use primarily wood waste, and their air emissions are assumed to be CO2 neutral in accordance with IPCC guidelines. The aim of the multi-project methodology is to reduce verification transaction costs by selecting only a sample of the projects for on-site auditing. In order to maintain a high level of confidence in the verified emission reductions, the multi-project verification methodology conservatively estimates the verified emission reductions: by discounting the ERUs due to uncertainty in monitored data and uncertainty in baseline parameters, and by extrapolating project reporting error to the projects which were not audited on-site. A logical and transparent site selection process was used for selecting the projects to be audited on-site, and DNV audited on-site 61% of the verified emission reductions while visiting only 11 of the 27 projects. The 27 AIJ projects were assessed against the AIJ and JI criteria existing at the time and were found to be in agreement with these criteria. The total amount of emission reductions which could be conservatively verified by DNV during the period 1993-1999 for these 27 projects was 498,710 tonnes of CO2. The lessons learned from this pilot multi-project verification are documented in a companion report.
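
    The conservative-estimation logic sketched above (discount for monitoring and baseline uncertainty, then extrapolate the audit-detected reporting error to unaudited projects) reduces to simple arithmetic. A toy calculation with invented figures, purely to show the shape of the computation:

        # All figures hypothetical, not from the DNV report.
        reported_erus = 100_000        # tonnes CO2 reported by a project
        data_uncertainty = 0.05        # uncertainty in monitored data
        baseline_uncertainty = 0.03    # uncertainty in baseline parameters
        audit_error_rate = 0.02        # over-reporting found on audited projects

        # Discount for the two uncertainty sources, then extrapolate the
        # audit-detected reporting error to projects not visited on-site.
        verified = reported_erus * (1 - data_uncertainty) * (1 - baseline_uncertainty)
        verified *= (1 - audit_error_rate)
        print(f"conservatively verified: {verified:,.0f} t CO2")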

  1. A Series of Diamagnetic Pyridine Monoimine Rhenium Complexes with Different Degrees of Metal-to-Ligand Charge Transfer: Correlating 13C NMR Chemical Shifts with Bond Lengths in Redox-Active Ligands.

    Sieh, Daniel; Kubiak, Clifford P

    2016-07-18

    A set of pyridine monoimine (PMI) rhenium(I) tricarbonyl chlorido complexes with substituents of different steric and electronic properties was synthesized and fully characterized. Spectroscopic (NMR and IR) and single-crystal X-ray diffraction analyses of these complexes showed that the redox-active PMI ligands are neutral and that the overall electronic structure is little affected by the choice of substituent at the ligand backbone. One- and two-electron reduction products were prepared from selected starting compounds and could also be characterized by multiple spectroscopic methods and X-ray diffraction. The final product of a one-electron reduction in THF is a diamagnetic metal-metal-bonded dimer after loss of the chlorido ligand. Bond lengths in, and NMR chemical shifts of, the PMI ligand backbone indicate partial electron transfer to the ligand. Two-electron reduction in THF also leads to loss of the chlorido ligand, and a pentacoordinate complex is obtained. The comparison with reported bond lengths and 13C NMR chemical shifts of doubly reduced free pyridine monoaldimine ligands indicates that both redox equivalents in the doubly reduced rhenium complex investigated here are located in the PMI ligand. With diamagnetic complexes varying over three formal reduction stages at the PMI ligand, we were, for the first time, able to establish correlations of the 13C NMR chemical shifts with the relevant bond lengths in redox-active ligands over a full redox series. PMID:27319753

  2. An innovative piping verification program for steam generator replacement

    The traditional programmatic approach to confirming the acceptability of piping thermal expansion has an impact on the schedule for the startup of nuclear plants. The process of obtaining, evaluating, and resolving critical measurements at pipe supports and plant structures is a critical-path activity that extends the time required for the plant to obtain or resume full power operation. In order to support the schedule for, and minimize the duration of, the steam generator replacement (SGR) outage at North Anna Unit 1, an innovative piping verification program was developed and implemented. The approach used for the restart verification program involved a significant planning effort prior to the SGR outage and kept piping system commodity verification activities off the critical path by performing a series of engineering evaluation tasks before and during the SGR outage. Incorporating the lessons learned from the successful program, the approach is being revised and improved for implementation on the steam generator replacement project for North Anna Unit 2.

  3. Kinetic analysis of anionic surfactant adsorption from aqueous solution onto activated carbon and layered double hydroxide with the zero length column method

    Schouten, Natasja; Ham, Louis G.J. van der; Euverink, Gert-Jan W.; Haan, André B. de

    2009-01-01

    Low cost adsorption technology offers high potential to clean-up laundry rinsing water. From an earlier selection of adsorbents, layered double hydroxide (LDH) and granular activated carbon (GAC) proved to be interesting materials for the removal of anionic surfactant, linear alkyl benzene sulfonate

  4. Functional verification of floating point arithmetic unit

    For continuous real-time reactivity monitoring of the PFBR reactivity safety channel, an FPGA-based reactivity meter has been developed by the Electronics Division, BARC. Verification of designs used in safety-critical systems is important and necessary. Functional verification of this design is presently carried out by EID, IGCAR. In the reactivity meter, the Floating Point Arithmetic Unit (FPAU) is a major and very important submodule, which needs to be verified completely first. Two types of verification are possible: functional verification and formal verification. This paper discusses and shares the experience of functional verification of the FPAU module for all special floating-point numbers. (author)
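
    The abstract does not list its test vectors; a generic way to exercise the special floating-point values (signed zeros, infinities, NaN, subnormals) against a device under test is sketched below, with the host's own arithmetic standing in as the reference model. The dut_add interface is hypothetical, not the paper's testbench:

        import math
        import itertools

        # Special floating-point values commonly exercised in FPU verification.
        SPECIALS = [0.0, -0.0, math.inf, -math.inf, math.nan,
                    2.2250738585072014e-308,   # smallest normal double
                    5e-324]                    # smallest subnormal double

        def check_add(dut_add):
            """Compare a device-under-test adder against host arithmetic."""
            for a, b in itertools.product(SPECIALS, repeat=2):
                got, ref = dut_add(a, b), a + b
                same = (math.isnan(got) and math.isnan(ref)) or got == ref
                if got == 0.0 == ref:
                    # == treats +0.0 and -0.0 as equal; check the sign too.
                    same = math.copysign(1, got) == math.copysign(1, ref)
                assert same, f"mismatch: {a} + {b} -> {got}, expected {ref}"

        check_add(lambda a, b: a + b)   # trivially passes with the host adder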

  5. Calibration and verification of surface contamination meters --- Procedures and techniques

    A standardised measurement procedure for surface contamination meters (SCM) is presented. The procedure aims to render surface contamination measurements simply and safely interpretable. Essential to the approach is the introduction and common use of the radionuclide-specific quantity 'guideline value', specified in the Swiss Radiation Protection Ordinance, as the unit for the measurement of surface activity. The corresponding radionuclide-specific 'guideline value count rate' can be summarized as a verification reference value for a group of radionuclides ('basis guideline value count rate'). The concept can be generalized for SCM of the same type or for SCM of different types using the same principle of detection. An SCM multi-source calibration technique is applied for the determination of the instrument efficiency. Four different electron radiation energy regions, four different photon radiation energy regions, and an alpha radiation energy region are represented by a set of calibration sources built according to ISO standard 8769-2. A guideline value count rate, representing the activity per unit area of a surface contamination of one guideline value, can be calculated for any radionuclide using the instrument efficiency, radionuclide decay data, contamination source efficiency, guideline value averaging area (100 cm2), and the radionuclide-specific guideline value. In this way, instrument responses for the evaluation of surface contaminations are obtained for radionuclides without available calibration sources as well as for short-lived radionuclides, for which the continuous replacement of certified calibration sources can lead to unreasonable costs. SCM verification is based on surface emission rates of reference sources with an active area of 100 cm2. The verification for a given list of radionuclides is based on the radionuclide-specific quantity guideline value count rate. Guideline value count rates for groups of radionuclides can be represented within the maximum
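
    The relation implied above is straightforward: the guideline value count rate is the net count rate the instrument would register for a surface contaminated at exactly one guideline value over the 100 cm2 averaging area. A minimal sketch of that arithmetic, with made-up efficiencies (not values from the procedure):

        def guideline_value_count_rate(instrument_eff, source_eff,
                                       guideline_value_bq_cm2,
                                       averaging_area_cm2=100.0):
            """Expected count rate (1/s) for a contamination of one guideline value.

            instrument_eff: counts registered per particle leaving the surface
            source_eff:     particles leaving the surface per decay
            guideline_value_bq_cm2: radionuclide-specific guideline value (Bq/cm2)
            """
            activity = guideline_value_bq_cm2 * averaging_area_cm2   # Bq on 100 cm2
            return activity * source_eff * instrument_eff

        # Example with invented numbers for a beta emitter:
        rate = guideline_value_count_rate(instrument_eff=0.35, source_eff=0.5,
                                          guideline_value_bq_cm2=3.0)
        print(f"{rate:.0f} counts per second per guideline value")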

  6. Verification of Chemical Weapons Destruction

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  7. Kinetic analysis of anionic surfactant adsorption from aqueous solution onto activated carbon and layered double hydroxide with the zero length column method

    Schouten, Natasja; Ham, Louis G.J. van der; Euverink, Gert-Jan W.; Haan, André B. de

    2009-01-01

    Low cost adsorption technology offers high potential to clean-up laundry rinsing water. From an earlier selection of adsorbents, layered double hydroxide (LDH) and granular activated carbon (GAC) proved to be interesting materials for the removal of anionic surfactant, linear alkyl benzene sulfonate (LAS), which is the main contaminant in rinsing water. The main research question is to identify adsorption kinetics of LAS onto GAC-1240 and LDH. The influence of pre-treatment of the adsorbent, ...

  8. Mutational analysis of Mdm2 C-terminal tail suggests an evolutionarily conserved role of its length in Mdm2 activity toward p53 and indicates structural differences between Mdm2 homodimers and Mdm2/MdmX heterodimers

    Dolezelova, Pavlina; Cetkovska, Katerina; Vousden, Karen H.; Uldrijan, Stjepan

    2012-01-01

    Mdm2 can mediate p53 ubiquitylation and degradation either in the form of the Mdm2 homodimer or Mdm2/MdmX heterodimer. The ubiquitin ligase activity of these complexes resides mainly in their respective RING finger domains and also requires adjacent C-terminal tails. So far, structural studies have failed to show significant differences between Mdm2 RING homodimers and Mdm2/MdmX RING heterodimers. Here, we report that not only the primary amino acid sequence, but also the length of the C-term...

  9. Automated Verification of Virtualized Infrastructures

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present a platform that connects declarative and expressive description languages with state-of-the-art verification methods. The languages integrate homogeneously descriptions of virtualized infrastructures, their transformations, their desired goals, and evaluation strategies. The different verification tools range from model checking to theorem proving; this allows us to exploit the complementary strengths of methods, and also to understand how to best represent the analysis problems in different contexts. We consider first the static case where the topology of the virtual infrastructure is fixed and...

  10. Review of some activities performed in Italy on the development and verification of codes for three-dimensional power distribution calculations in LWRs

    A review of some activities performed in Italy on 3D simulators is given. Some of these activities contribute to the confirmation of methods currently used in core design and operation, through operating and experimental data from the Garigliano and Caorso power plants. On the other hand, a few codes obtained from MIT have been modified and applied both to the usual benchmark problems and to critical situations. In addition, some effort has been devoted to the development of an essentially one-group coarse-mesh model (COMETA, now embodied in the three-dimensional simulator CETHRA); spectral effects arising from the presence of large material heterogeneities are taken into account. Finally, the FEMSYN and SYNTH-C codes, based on the synthesis method, have been developed.

  11. Biometric Technologies and Verification Systems

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  12. Space Telescope performance and verification

    Wright, W. F.

    1980-01-01

    The verification philosophy for the Space Telescope (ST) has evolved from years of experience with multispacecraft programs modified by the new factors introduced by the Space Transportation System. At the systems level of test, the ST will undergo joint qualification/acceptance tests with environment simulation using Lockheed's large spacecraft test facilities. These tests continue the process of detecting workmanship defects and module interface incompatibilities. The test program culminates in an 'all up' ST environmental test verification program resulting in a 'ready to launch' ST.

  13. Revised Final Report - Independent Verification Survey Activities At The Separations Process Research Unit Sites, Niskayuna, New York - DCN 0496-SR-06-1

    The Separations Process Research Unit (SPRU) complex located on the Knolls Atomic Power Laboratory (KAPL) site in Niskayuna, New York, was constructed in the late 1940s to research the chemical separation of plutonium and uranium (Figure A-1). SPRU operated as a laboratory scale research facility between February 1950 and October 1953. The research activities ceased following the successful development of the reduction oxidation and plutonium/uranium extraction processes. The oxidation and extraction processes were subsequently developed for large scale use by the Hanford and Savannah River sites (aRc 2008a). Decommissioning of the SPRU facilities began in October 1953 and continued through the 1990s.

  14. Chemical Weapons and Problems of Verification

    P.K. Ramachandran

    1990-01-01

    Full Text Available This paper reviews the existing treaties for ban and verification of the production and use of chemical weapons. The proposed Chemical Weapons Convection, its thrust areas of verifications, the organisations for and process of verification are described briefly. Various technical verification measures including field techniques, such as detector papers, tubes, enzyme tickets, etc. and analytical methods such as gas chromatography, microsensors, different spectrometry methods including IR techniques and stationary system are also discussed.

  15. On Verification Modelling of Embedded Systems

    Brinksma, Ed; Mader, Angelika

    2004-01-01

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the verification of any non-trivial system. Good verification models, therefore, are lean and mean, and cannot be obtained easily or generated automatically. Current research, however, seems to take the construct...

  16. Verification of radiation transport codes with unstructured meshes

    Confidence in the results of a radiation transport code requires that the code be verified against problems with known solutions. Such verification problems may be generated by means of the method of manufactured solutions. Previously we reported the application of this method to the verification of radiation transport codes for structured meshes, in particular the SCEPTRE code. We extend this work to verification with unstructured meshes and again apply it to SCEPTRE. We report on additional complexities for unstructured mesh verification of transport codes. Refinement of such meshes for error convergence studies is more involved, particularly for tetrahedral meshes. Furthermore, finite element integrations arising from the presence of the streaming operator exhibit different behavior for unstructured meshes than for structured meshes. We verify SCEPTRE with a combination of 'exact' and 'inexact' problems. Errors in the results are consistent with the discretizations, either being limited to roundoff error or displaying the expected rates of convergence with mesh refinement. We also observe behaviors in the results that were difficult to analyze and predict from a strictly theoretical basis, thereby yielding benefits from verification activities beyond demonstrating code correctness. (author)
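
    As a generic illustration of the method of manufactured solutions (not the SCEPTRE setup), one picks a solution, substitutes it into the operator to manufacture a source term, feeds that source to the solver, and checks that the error converges at the discretization's expected rate. The sketch below does this for a 1-D advection-removal equation mu*du/dx + sigma*u = q with a first-order upwind scheme; all names and parameter values are assumptions:

        import numpy as np

        # Manufactured solution and the source it induces for
        # mu * du/dx + sigma * u = q(x) on [0, 1], with u(0) prescribed.
        mu, sigma = 0.7, 2.0
        u_exact = lambda x: np.sin(np.pi * x) + 1.0
        q = lambda x: mu * np.pi * np.cos(np.pi * x) + sigma * u_exact(x)

        def solve(n):
            """First-order upwind sweep; error should shrink as O(1/n)."""
            x = np.linspace(0.0, 1.0, n + 1)
            h = x[1] - x[0]
            u = np.empty(n + 1)
            u[0] = u_exact(0.0)                  # inflow boundary condition
            for i in range(n):
                # mu*(u[i+1]-u[i])/h + sigma*u[i+1] = q(x[i+1])
                u[i + 1] = (q(x[i + 1]) + mu * u[i] / h) / (mu / h + sigma)
            return x, u

        for n in (20, 40, 80, 160):
            x, u = solve(n)
            print(f"n={n:4d}  max error = {np.max(np.abs(u - u_exact(x))):.3e}")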

  17. Alloy4SPV : A Formal Framework for Software Process Verification

    Laurent, Yoann; Bendraou, Reda; Baarir, Souheib; Gervais, Marie-Pierre

    2014-01-01

    In this paper we present a framework for software process verification called Alloy4SPV which uses a subset of UML2 Activity Diagrams as a process modeling language. In order to achieve software process verification, we i) define a formal model of our process modeling language using first-order logic, ii) give it a formal semantics based on the fUML standard, and iii) implement this formalization using the Alloy language [1]. In order to ease its adoption by process modelers, our frame...

  18. A static analysis tool set for assembler code verification

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs

  19. Guidance for the verification and validation of neural networks

    Pullum, L; Darrah, M

    2007-01-01

    Guidance for the Verification and Validation of Neural Networks is a supplement to the IEEE Standard for Software Verification and Validation, IEEE Std 1012-1998. Born out of a need by the National Aeronautics and Space Administration's safety- and mission-critical research, this book compiles over five years of applied research and development efforts. It is intended to assist the performance of verification and validation (V&V) activities on adaptive software systems, with emphasis given to neural network systems. The book discusses some of the difficulties with trying to assure adaptive systems in general, presents techniques and advice for the V&V practitioner confronted with such a task, and based on a neural network case study, identifies specific tasking and recommendations for the V&V of neural network systems.

  20. Smoking cessation early in pregnancy and birth weight, length, head circumference, and endothelial nitric oxide synthase activity in umbilical and chorionic vessels: an observational study of healthy singleton pregnancies

    Andersen, Malene R; Simonsen, Ulf; Uldbjerg, Niels;

    2009-01-01

    BACKGROUND: Reduced production of the vasodilator nitric oxide (NO) in fetal vessels in pregnant smokers may lower the blood flow to the fetus and result in lower birth weight, length, and head circumference. The present study measured endothelial NO synthase (eNOS) activity in fetal umbilical and chorionic vessels from nonsmokers, smokers, and ex-smokers and related the findings to the fetal outcome. METHODS AND RESULTS: Of 266 healthy, singleton pregnancies, 182 women were nonsmokers, 43 were smokers, and 41 stopped smoking early in pregnancy. eNOS activity and concentration were quantified in endothelial cells of the fetal vessels. Cotinine, lipid profiles, estradiol, l-arginine, and dimethylarginines that may affect NO production were determined in maternal and fetal blood. Serum cotinine verified self-reported smoking. Newborns of smokers had a lower weight (P≤0.001) and a smaller head...

  1. Emergency operating procedures. Generation, verification and validation

    Systems Response, Operator Cognition and the application of the Emergency Operating Procedures (EOP) Standards for Canadian Nuclear Utilities are three of the four cornerstones of the Point Lepreau EOP program; the fourth cornerstone is a common-sense application of the other three. The Emergency Operating Procedures for the Point Lepreau Generating Station have been subject to two major revisions over the past decade. The latter revision, currently in progress, reflects a full application of the 'Emergency Operating Procedures Standards for Canadian Utilities'. The Standards require, prior to issue of an Emergency Operating Procedure, the application of a process which entails the elements of 'Generation', 'Verification' and 'Validation'. This paper describes that process with respect to the production (including Generation, Verification and Validation) of a generic EOP and those EOPs which deal with Loss of Coolant Accidents and Loss of Heat Sink accidents. The activities involved in each of the elements are discussed and illustrated with examples extracted from the EOPs. The EOPs are part of a larger framework which dictates the human response to an upset - the plant-specific 'Upset Response Strategy'. That strategy is developed from a fundamental understanding of the process time constants. Likewise, the strategies internal to an EOP must recognize both process time constants and 'human time constants'. The EOP structure, format and detailed content must recognize the Control Room Operator as an intelligent controller - objectives, inputs, decisions and actions must be expressed with the CROs' cognition foremost. Proper application of the elements of Generation, Verification and Validation ensures that the necessary technical and operational experience has been incorporated into an EOP before it is released to training and before it is issued. (author) 8 refs., 4 figs

  2. VEG-01: Veggie Hardware Verification Testing

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware; microbial samples were taken, and plants were harvested, frozen, stored and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  3. Verification of safety critical software

    To assure the quality of safety-critical software, the software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high-quality, cost-effective product. 15 refs., 6 figs. (author)

  4. A scheme for symmetrization verification

    Sancho, Pedro

    2011-01-01

    We propose a scheme for symmetrization verification in two-particle systems, based on one-particle detection and state determination. In contrast to previous proposals, it does not follow a Hong-Ou-Mandel-type approach. Moreover, the technique can be used to generate superposition states of single particles.

  5. A scheme for symmetrization verification

    Sancho, Pedro

    2011-08-01

    We propose a scheme for symmetrization verification in two-particle systems, based on one-particle detection and state determination. In contrast to previous proposals, it does not follow a Hong-Ou-Mandel-type approach. Moreover, the technique can be used to generate superposition states of single particles.

  6. Eggspectation : organic egg verification tool

    Ruth, van S.M.; Hoogenboom, L.A.P.

    2011-01-01

    In 2009 RIKILT conducted a study on about 2,000 eggs to evaluate three different analytical verification methods: carotenoid profiling, fatty acid profiling and isotope ratio mass spectrometry. The eggs were collected from about 50 Dutch farms. The selection was based on the farms’ location and size

  7. VERIFICATION OF WATER QUALITY MODELS

    The basic concepts of water quality models are reviewed and the need to recognize calibration and verification of models with observed data is stressed. Post auditing of models after environmental control procedures are implemented is necessary to determine true model prediction ...

  8. A verification environment for bigraphs

    Perrone, Gian David; Debois, Søren; Hildebrandt, Thomas

    2013-01-01

    We present the BigMC tool for bigraphical reactive systems that may be instantiated as a verification tool for any formalism or domain-specific modelling language encoded as a bigraphical reactive system. We introduce the syntax and use of BigMC, and exemplify its use with two small examples: a t...

  9. A new model for verification

    DU Zhen-jun; MA Guang-sheng; FENG Gang

    2007-01-01

    Formal verification is playing a significant role in IC design. However, the common models for verification either have complexity problems or applicable limitations. In order to overcome these deficiencies, a novel model, WGL (Weighted Generalized List), is proposed, which is based on the general-list decomposition of polynomials, with three different weights and manipulation rules introduced to effect node sharing and canonicity. Timing parameters and operations on them are also considered. Examples show that the word-level WGL is the only model able to linearly represent the common word-level functions, and that the bit-level WGL is especially suitable for arithmetic-intensive circuits. The model is proved to be a uniform and efficient model for both bit-level and word-level functions. Then, based on the WGL model, a backward-construction verification approach is proposed, which reduces time and space complexity for multipliers to polynomial complexity (time complexity less than O(n^3.6) and space complexity less than O(n^1.5)) without hierarchical partitioning. Both the model and the verification method show their theoretical and applicable significance in IC design.

  10. Formal Verification of Continuous Systems

    Sloth, Christoffer

    2012-01-01

    losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due to the...

  11. Laboratory verification of the Active Particle-induced X-ray Spectrometer (APXS) on the Chang'e-3 mission

    In the Chang'e-3 mission, the Active Particle-induced X-ray Spectrometer (APXS) on the Yutu rover is used to analyze the chemical composition of lunar soil and rock samples. APXS data are only valid if the sensor head gets close to the target and the integration time lasts long enough. Therefore, working distance and integration time are the dominant factors that affect APXS results. This study confirms the ability of APXS to detect elements and investigates the effects of distance and time on the measurements. We make use of a backup APXS instrument to determine the chemical composition of both powder and bulk samples under different working distances and integration times. The results indicate that APXS can detect seven major elements, including Mg, Al, Si, K, Ca, Ti and Fe, under the condition that the working distance is less than 30 mm and the integration time is 30 min. The statistical deviation is smaller than 15%. This demonstrates the instrument's ability to detect major elements in the sample. Our measurements also indicate that an increase in integration time can reduce the measurement error of the peak area, which is useful for detecting the elements Mg, Al and Si. However, an increase in working distance can result in larger measurement errors, which significantly affects the detection of the element Mg. (paper)

  12. Verification of anticlockwise gyre in the semi-closed water area of Lake Nakaumi, southwest Japan, by using 224Ra/228Ra activity ratios

    The Honjyo area in Lake Nakaumi is a semi-closed brackish water area where some mixing of up-flowing marine water and down-flowing lake water takes place. A large-scale gyre caused by the residual circulation was once indicated by a temporary algal bloom that spread over the semi-closed Honjyo area in brackish Lake Nakaumi. In order to verify this type of water circulation, we examined 224Ra (t1/2=3.66 d)/228Ra (t1/2=5.75 y) activity ratios of both the upper and lower waters, which are differentiated by a well-developed halocline. The 224Ra/228Ra ratios in the upper water were lowest in the central area, suggesting the formation of an anticlockwise gyre. The ratios in the lower water were rather uniform, but a basin-wide anticlockwise flow of water is also indicated. The 224Ra/228Ra ratio is clearly effective for tracing the water flow of both the deep and surface waters. (author)
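
    What makes the 224Ra/228Ra ratio a flow tracer is the large difference in half-lives: the ratio in a water parcel decays predictably with time since the parcel left its radium source. A minimal sketch of that relationship (the initial ratio and the printed examples are illustrative only):

        import math

        T_HALF_224 = 3.66          # days
        T_HALF_228 = 5.75 * 365.0  # days; 228Ra barely decays on these time scales

        LAM224 = math.log(2) / T_HALF_224
        LAM228 = math.log(2) / T_HALF_228

        def ratio_after(days, initial_ratio=1.0):
            """224Ra/228Ra activity ratio after `days` of transit."""
            return initial_ratio * math.exp(-(LAM224 - LAM228) * days)

        def apparent_age(observed_ratio, initial_ratio=1.0):
            """Invert the decay law to estimate transit time in days."""
            return math.log(initial_ratio / observed_ratio) / (LAM224 - LAM228)

        print(ratio_after(7.0))    # the ratio roughly quarters in one week
        print(apparent_age(0.25))  # ~7.3 days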

  13. INDEPENDENT VERIFICATION OF THE BUILDING 3550 SLAB AT OAK RIDGE NATIONAL LABORATORY OAK RIDGE, TENNESSEE

    Weaver, Phyllis C.

    2012-05-08

    The Oak Ridge Institute for Science and Education (ORISE) has completed the independent verification survey of the Building 3550 Slab. The results of this effort are provided. The objective of this verification survey is to provide independent review and field assessment of remediation actions conducted by Safety and Ecology Corporation (SEC) to document that the final radiological condition of the slab meets the release guidelines. Verification survey activities on the Building 3550 Slab included scans, measurements, and the collection of smears. Scans for alpha, alpha plus beta, and gamma activity identified several areas that were investigated.

  14. Independent Verification Survey Report For Zone 1 Of The East Tennessee Technology Park In Oak Ridge, Tennessee

    Oak Ridge Associated Universities (ORAU) conducted in-process inspections and independent verification (IV) surveys in support of DOE's remedial efforts in Zone 1 of East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. Inspections concluded that the remediation contractor's soil removal and survey objectives were satisfied and that the dynamic verification strategy (DVS) was implemented as designed. Independent verification (IV) activities included gamma walkover surveys and soil sample collection/analysis over multiple exposure units (EUs).

  15. Design verification and validation plan for the cold vacuum drying facility

    The Cold Vacuum Drying Facility (CVDF) provides the required process systems, supporting equipment, and facilities needed for drying spent nuclear fuel removed from the K Basins. This document presents both the completed and the planned design verification and validation activities.

  16. Societal Verification: Intellectual Game or International Game-Changer

    Within the nuclear nonproliferation and arms control field, there is an increasing appreciation for the potential of open source information technologies to supplement existing verification and compliance regimes. While clearly not a substitute for on-site inspections or national technical means, it may be possible to better leverage information gleaned from commercial satellite imagery, international trade records and the vast amount of data being exchanged online and between publics (including social media) so as to develop a more comprehensive set of tools and practices for monitoring and verifying a state’s nuclear activities and helping judge compliance with international obligations. The next generation “toolkit” for monitoring and verifying items, facility operations and activities will likely include a more diverse set of analytical tools and technologies than are currently used internationally. To explore these and other issues, the Nuclear Threat Initiative has launched an effort that examines, in part, the role that emerging technologies and “citizen scientists” might play in future verification regimes. This paper will include an assessment of past proliferation and security “events” and whether emerging tools and technologies would have provided indicators concurrently or in advance of these actions. Such case studies will be instrumental in understanding the reliability of these technologies and practices and in thinking through the requirements of a 21st century verification regime. Keywords: Verification, social media, open-source information, arms control, disarmament.

  17. Formal Verification of Quantum Protocols

    Nagarajan, Rajagopal; Gay, Simon

    2002-01-01

    We propose to analyse quantum protocols by applying formal verification techniques developed in classical computing for the analysis of communicating concurrent systems. One area of successful application of these techniques is that of classical security protocols, exemplified by Lowe's discovery and fix of a flaw in the well-known Needham-Schroeder authentication protocol. Secure quantum cryptographic protocols are also notoriously difficult to design. Quantum cryptography is therefore an interesting target for formal verification, and provides our first example; we expect the approach to be transferable to more general quantum information processing scenarios. The example we use is the quantum key distribution protocol proposed by Bennett and Brassard, commonly referred to as BB84. We present a model of the protocol in the process calculus CCS and the results of some initial analyses using the Concurrency Workbench of the New Century (CWB-NC).

  18. FETAL FOOT LENGTH AND HAND LENGTH: RELATIONSHIP WITH CROWN RUMP LENGTH AND GESTATIONAL AGE

    Garima

    2015-12-01

    BACKGROUND Estimation of the gestational age of a fetus is of great medicolegal importance. Multiple parameters of fetal anatomical measurement are in use. However, gestational age assessment may be difficult in fetuses with anencephaly, hydrocephalus, or short limb dysplasia, after post-mortem destruction, or in mutilated cases. A study of the literature suggests that the fetal foot has a characteristic pattern of normal growth, showing a gradual increase in length relative to the length of the embryo, and could be used to estimate gestational age. The purpose of the present study is to determine the accuracy of estimating gestational age using fetal foot and hand length by studying their relation with crown rump length in fetuses of Manipuri origin. AIMS AND OBJECTIVES 1 To study the relationship between fetal crown rump length and fetal hand and foot length, thereby determining the accuracy of estimating gestational age, by a cross-sectional study. MATERIALS AND METHODS A total of 100 formalin-fixed fetuses of Manipuri origin, obtained from the Department of Obstetrics and Gynaecology, Regional Institute of Medical Sciences, Imphal, were included in the study, carried out in the Department of Anatomy from February 2015 to July 2015. The parameters studied were the crown rump length, foot length, and hand length of the fetuses. The data were analysed using SPSS software by regression analysis. Graphs were also plotted to determine the pattern of growth and any correlation with crown rump length. RESULTS A total of 100 fetuses were studied, of which 43 were female and 57 were male. The mean foot length and hand length progressively increased with increasing crown rump length. Measurements were not significantly different between the right and left sides or between male and female fetuses. A statistically significant linear relationship was seen between the foot length and crown rump length of the fetus (r=0.980, p<0.0001) and the hand length and crown rump length of the fetus
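
    A correlation as strong as the reported r=0.980 supports a simple linear regression for estimating crown rump length (and hence gestational age) from foot length. A generic sketch with invented measurements; the coefficients below are not those of the study:

        import numpy as np

        # Hypothetical paired measurements in millimetres (not study data).
        foot_length = np.array([12.0, 18.5, 25.0, 33.0, 41.5, 50.0])
        crown_rump = np.array([90.0, 130.0, 170.0, 215.0, 260.0, 300.0])

        # Least-squares fit: crown_rump ~ slope * foot_length + intercept
        slope, intercept = np.polyfit(foot_length, crown_rump, 1)
        r = np.corrcoef(foot_length, crown_rump)[0, 1]

        print(f"CRL = {slope:.2f} * foot + {intercept:.1f}  (r = {r:.3f})")
        print("predicted CRL for a 30 mm foot:", slope * 30 + intercept)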

  19. Verification of Uncurated Protein Annotations

    Rebholz-Schuhmann, Dietrich; Kirsch, Harald; Apweiler, Rolf; Camon, Evelyn; Dimmer, Emily; Lee, Vivian; Silva, Mário J.; Couto, Francisco M.

    2009-01-01

    Molecular Biology research projects produced vast amounts of data, part of which has been preserved in a variety of public databases. However, a large portion of the data contains a significant number of errors and therefore requires careful verification by curators, a painful and costly task, before being reliable enough to derive valid conclusions from it. On the other hand, research in biomedical information retrieval and information extraction are nowadays delivering Text Mining solutions...

  20. Ultrasonic Verification of Composite Structures

    Pelt, Maurice; Boer, Robert Jan; Schoemaker, Christiaan; Sprik, Rudolf

    2014-01-01

    Ultrasonic Verification is a new method for monitoring large surface areas of CFRP by ultrasound with few sensors. The echo response of a transmitted pulse through the structure is compared with the response of an earlier obtained reference signal to calculate a fidelity parameter. A change in fidelity over time is indicative of a new defect in the structure. This paper presents an experimental assessment of the effectiveness and reproducibility of the method.

  1. Verification of Stochastic Process Calculi

    Skrypnyuk, Nataliya

    performed with the purpose of verifying the system. In this dissertation it is argued that the verification techniques that have their origin in the analysis of program code with the purpose of deducing the properties of the code's execution, i.e. Static Analysis techniques, are transferable to the stochastic... description of a system. The presented methods have a clear application in the areas of embedded systems, (randomised) protocols run between a fixed number of parties, etc...

  2. An Effective Fingerprint Verification Technique

    Gogoi, Minakshi; Bhattacharyya, D K

    2010-01-01

    This paper presents an effective method for fingerprint verification based on a data mining technique called minutiae clustering and a graph-theoretic approach to analyze the process of fingerprint comparison, to give a feature-space representation of minutiae, and to produce a lower bound on the number of detectably distinct fingerprints. The method also proves the invariance of each individual fingerprint by using both the topological behavior of the minutiae graph and a distance ...

  3. Verification and transparency in future arms control

    Pilat, J.F.

    1996-09-01

    Verification's importance has changed dramatically over time, although it has always been in the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated, arms control does not allow for verification provisions by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  4. Nuclear Data Verification and Standardization

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is the critical evaluation of neutron interaction data standards, including international coordination. Testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  5. Measuring verification device error rates

    A verification device generates a Type I (II) error when it recommends to reject (accept) a valid (false) identity claim. For a given identity, the rates or probabilities of these errors quantify random variations of the device from claim to claim. These are intra-identity variations. To some degree, these rates depend on the particular identity being challenged, and there exists a distribution of error rates characterizing inter-identity variations. However, for most security system applications we only need to know averages of this distribution. These averages are called the pooled error rates. In this paper the authors present the statistical underpinnings for the measurement of pooled Type I and Type II error rates. The authors consider a conceptual experiment, "a crate of biased coins". This model illustrates the effects of sampling both within trials of the same individual and among trials from different individuals. Application of this simple model to verification devices yields pooled error rate estimates and confidence limits for these estimates. A sample certification procedure for verification devices is given in the appendix
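
    The pooled-rate idea lends itself to a short numerical illustration. The sketch below (hypothetical data and function names, not the paper's certification procedure) pools per-identity error counts and widens the confidence interval using the spread between identities, the "crate of biased coins" picture:

```python
import numpy as np

def pooled_error_rate(errors, trials):
    """Pooled error-rate estimate with a rough 95% confidence interval.

    errors[i], trials[i]: error count and trial count for identity i.
    The CI uses the between-identity variance of per-identity rates,
    not the within-identity binomial variance alone.
    """
    errors, trials = np.asarray(errors, float), np.asarray(trials, float)
    rates = errors / trials
    pooled = errors.sum() / trials.sum()       # average over all trials
    se = rates.std(ddof=1) / np.sqrt(len(rates))  # inter-identity spread
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# e.g. Type I errors for 5 enrolled identities, 200 claims each
rate, ci = pooled_error_rate([3, 1, 4, 0, 2], [200] * 5)
print(f"pooled Type I rate = {rate:.4f}, 95% CI = ({ci[0]:.4f}, {ci[1]:.4f})")
```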

  6. Online Signature Verification Using Fourier Descriptors

    Yanikoglu, Berrin; Kholmatov, Alisher

    2009-12-01

    We present a novel online signature verification system based on the Fast Fourier Transform. The advantage of using the Fourier domain is the ability to compactly represent an online signature using a fixed number of coefficients. The fixed-length representation leads to fast matching algorithms and is essential in certain applications. The challenge on the other hand is to find the right preprocessing steps and matching algorithm for this representation. We report on the effectiveness of the proposed method, along with the effects of individual preprocessing and normalization steps, based on comprehensive tests over two public signature databases. We also propose to use the pen-up duration information in identifying forgeries. The best results obtained on the SUSIG-Visual subcorpus and the MCYT-100 database are 6.2% and 12.1% error rate on skilled forgeries, respectively. The fusion of the proposed system with our state-of-the-art Dynamic Time Warping (DTW) system lowers the error rate of the DTW system by up to about 25%. While the current error rates are higher than state-of-the-art results for these databases, as an approach using global features, the system possesses many advantages. Considering also the suggested improvements, the FFT system shows promise both as a stand-alone system and especially in combination with approaches that are based on local features.
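
    As an illustration of the fixed-length representation idea, here is a minimal sketch (assumed resampling length, coefficient count, and threshold; not the authors' exact preprocessing or matching pipeline) that builds a translation- and scale-normalized Fourier descriptor from a pen trajectory and compares two signatures by Euclidean distance:

```python
import numpy as np

def fft_descriptor(x, y, n_coeff=20, n_resample=256):
    """Fixed-length Fourier descriptor of an online signature trajectory.

    Resamples (x, y) to a common length, treats points as complex numbers,
    and keeps the magnitudes of the first n_coeff non-DC FFT coefficients;
    dropping the DC term removes translation, and dividing by the first
    harmonic's magnitude removes scale.
    """
    t = np.linspace(0, 1, len(x))
    ts = np.linspace(0, 1, n_resample)
    z = np.interp(ts, t, x) + 1j * np.interp(ts, t, y)
    spec = np.abs(np.fft.fft(z))[1:n_coeff + 1]   # skip the DC term
    return spec / spec[0]                         # scale-normalize

def match(desc_a, desc_b, threshold=0.5):
    """Accept the claim when the descriptors are close."""
    return np.linalg.norm(desc_a - desc_b) < threshold

# toy usage: a reference signature vs. a slightly perturbed retry
theta = np.linspace(0, 2 * np.pi, 300)
ref = fft_descriptor(np.cos(theta), np.sin(2 * theta))
probe = fft_descriptor(np.cos(theta) + 0.01, np.sin(2 * theta) * 1.05)
print(match(ref, probe))
```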

  7. Gender verification in competitive sports.

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E

    1993-11-01

    The possibility that men might masquerade as women and be unfair competitors in women's sports is accepted as outrageous by athletes and the public alike. Since the 1930s, media reports have fuelled claims that individuals who once competed as female athletes subsequently appeared to be men. In most of these cases there was probably ambiguity of the external genitalia, possibly as a result of male pseudohermaphroditism. Nonetheless, beginning at the Rome Olympic Games in 1960, the International Amateur Athletics Federation (IAAF) began establishing rules of eligibility for women athletes. Initially, physical examination was used as a method for gender verification, but this plan was widely resented. Thus, sex chromatin testing (buccal smear) was introduced at the Mexico City Olympic Games in 1968. The principle was that genetic females (46,XX) show a single X-chromatic mass, whereas males (46,XY) do not. Unfortunately, sex chromatin analysis fell out of common diagnostic use by geneticists shortly after the International Olympic Committee (IOC) began its implementation for gender verification. The lack of laboratories routinely performing the test aggravated the problem of errors in interpretation by inexperienced workers, yielding false-positive and false-negative results. However, an even greater problem is that there exist phenotypic females with male sex chromatin patterns (e.g. androgen insensitivity, XY gonadal dysgenesis). These individuals have no athletic advantage as a result of their congenital abnormality and reasonably should not be excluded from competition. That is, only the chromosomal (genetic) sex is analysed by sex chromatin testing, not the anatomical or psychosocial status. For all the above reasons sex chromatin testing unfairly excludes many athletes. Although the IOC offered follow-up physical examinations that could have restored eligibility for those 'failing' sex chromatin tests, most affected athletes seemed to prefer to 'retire'. All

  8. Appendix: Conjectures concerning proof, design, and verification.

    Wos, L.

    2000-05-31

    This article focuses on an esoteric but practical use of automated reasoning that may indeed be new to many, especially those concerned primarily with verification of both hardware and software. Specifically, featured are a discussion and some methodology for taking an existing design--of a circuit, a chip, a program, or the like--and refining and improving it in various ways. Although the methodology is general and does not require the use of a specific program, McCune's program OTTER does offer what is needed. OTTER has played and continues to play the key role in the author's research, and an interested person can gain access to this program in various ways, not the least of which is through the included CD-ROM in [3]. When success occurs, the result is a new design that may require fewer components, avoid the use of certain costly components, offer more reliability and ease of verification, and, perhaps most important, be more efficient in the contexts of speed and heat generation. Although the author has minimal experience in circuit design, circuit validation, program synthesis, program verification, and similar concerns, he presents (at the encouragement of colleagues, based on successes to be cited) materials that might indeed be of substantial interest to manufacturers and programmers. He writes this article in part prompted by the recent activities of chip designers, including Intel and AMD, activities heavily emphasizing the proving of theorems. As for his research that appears relevant, he has made an intense and most profitable study of finding proofs that are shorter [2,3], some that avoid the use of various types of terms, some that are far less complex than previously known, and the like. Those results suggest a strong possible connection between more appealing proofs (in mathematics and in logic) and enhanced and improved design of both hardware and software. Here the author explores diverse conjectures that elucidate some of

  9. MOV reliability evaluation and periodic verification scheduling

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long-term reliability of gate or globe motor-operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOV's design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety-related MOVs.
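
    The margin-versus-uncertainty comparison can be made concrete with a small sketch. Assuming the margin uncertainty is treated as normally distributed (the paper's statistical details are not reproduced here), the reliability estimate is the probability that the true margin exceeds zero:

```python
from math import erf, sqrt

def mov_reliability(nominal_margin, margin_sigma):
    """Reliability as the probability that the true design margin is
    positive, assuming a normally distributed margin uncertainty."""
    z = nominal_margin / margin_sigma
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF at z

# e.g. a 15% nominal design margin known to within +/-6% (1 sigma)
print(f"estimated reliability: {mov_reliability(0.15, 0.06):.4f}")
```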

  10. Verification and testing automation of UML projects

    Nikita, Voinov; Vsevolod, Kotlyarov

    2009-01-01

    This paper presents an integrated approach to the verification and testing automation of UML projects. It consists of automatic model creation from UML specifications in the formal language of basic protocols, verification of the model by means of the VRS technology, and automatic test generation in the TTCN language using TAT. The need for this approach arises from the necessity of checking the correctness of software functionality, including verification and testing, but there is a lack of industrial technologies ...

  11. Verification-based Software-fault Detection

    Gladisch, Christoph David

    2011-01-01

    Software is used in many safety- and security-critical systems. Software development is, however, an error-prone task. In this dissertation new techniques for the detection of software faults (or software "bugs") are described which are based on a formal deductive verification technology. The described techniques take advantage of information obtained during verification and combine verification technology with deductive fault detection and test generation in a unified way.

  12. An approach to evaluate product verifications

    Kenger, Patrik; Coda, Mariana

    2004-01-01

    Companies implement a module product assortment as a part of their strategy to, among other things, shorten lead times, increase product quality and create more product variants with fewer parts. However, the increased number of variants becomes a challenging task for the personnel responsible for the product verifications. By implementing verifications at module level, so-called MPV (Module Property Verification), several advantages ensue. The advantages are not only a decrease in the cost of ver...

  13. Design, Implementation, and Verification of the Reliable Multicast Protocol. Thesis

    Montgomery, Todd L.

    1995-01-01

    This document describes the Reliable Multicast Protocol (RMP) design, first implementation, and formal verification. RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communications load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority-resilient, and totally resilient atomic delivery. These guarantees are selectable on a per-message basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, a client/server model of delivery, mutually exclusive handlers for messages, and mutually exclusive locks. It has been commonly believed that total ordering of messages can only be achieved at great performance expense. RMP refutes this. The first implementation of RMP has been shown to provide high throughput performance on Local Area Networks (LANs). For two or more destinations on a single LAN, RMP provides higher throughput than any other protocol that does not use multicast or broadcast technology. The design, implementation, and verification activities of RMP have occurred concurrently. This has allowed the verification to maintain a high fidelity between the design model, the implementation model, and the verification model. The restrictions of implementation have influenced the design earlier than in normal sequential approaches. The protocol as a whole has matured more smoothly through the inclusion of several different perspectives into the product development.

  14. The CTBT verification system. Entering rough waters?

    Five years after the Comprehensive Nuclear Test Ban Treaty (CTBT) was opened for signature, progress towards entry into force has been slowing. - Political uncertainty about the timing of entry into force is complicating the work of the CTBT Organization's Preparatory Commission (PrepCom). - The US decision, announced on 21 August 2001, not to pay its full share of financial contributions to the PrepCom and to withdraw from activities not related to the International Monitoring System (IMS) may put it in non-compliance as a signatory to the treaty. - States need to continue to support the work of the PrepCom by providing it with the necessary financial and technical means. Gaps left by the US decision need to be filled by other states. - Completing the IMS remains a priority task which will need patience and support from all member states of the PrepCom. - Establishing an effective regime for on-site inspections is greatly complicated by the new US policy. Those states in favour of a flexible regime need to redouble their efforts, including increased input into the development of an Operational Manual. - States need to overcome undue concerns about confidentiality and create an open verification regime that makes its data available to scientific and humanitarian relief organisations. - Taken together, these efforts will enable the PrepCom to complete its task of setting up the CTBT's verification system in the foreseeable future. - Washington should live up to its commitment as a signatory to the CTBT and support the whole range of PrepCom activities. - The Article XIV conference should urge the US to reconsider its new policy of reducing support to the PrepCom

  15. Verification of the BISON fuel performance code

    Highlights: • Reviews the accepted definitions of verification and validation. • Reviews verification papers from the literature. • Gives several examples of verification tests appropriate for a fuel performance code. • Shows results from a set of validation tests. - Abstract: Complex multiphysics simulations such as those used in nuclear fuel performance analysis are composed of many submodels used to describe specific phenomena. These phenomena include, for example, mechanical material constitutive behavior, heat transfer across a gas gap, and mechanical contact. These submodels work in concert to simulate real-world events, like the behavior of a fuel rod in a reactor. If a simulation tool is able to represent real-world behavior, the tool is said to be validated. While much emphasis is rightly placed on validation, model verification is equally important. Verification involves showing that a submodel computes results consistent with its mathematical description. This paper reviews the differences between verification, validation, and calibration as well as their dependencies on one another. Verification problems specific to nuclear fuel analysis are presented. Other verification problems suitable to assess the correctness of a finite element-based nuclear fuel application such as BISON (written to be applicable to many fuel forms and arbitrary geometry) are also presented. BISON calculates the correct solution to each of the verification tests, laying the foundation for subsequent validation
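
    A typical code-verification test of the kind described checks that a discrete operator converges to its mathematical description at the expected rate. The sketch below (a generic order-of-accuracy check, not a BISON test case) verifies a central-difference second derivative against an exact solution:

```python
import numpy as np

def observed_order(hs, errors):
    """Observed order of accuracy from errors on successively refined
    grids: the slope of log(error) versus log(h)."""
    return np.polyfit(np.log(hs), np.log(errors), 1)[0]

# verification test: the central difference of u(x) = sin(x) should
# reproduce u'' = -sin(x) with second-order accuracy
errors, hs = [], []
for n in (16, 32, 64, 128):
    x = np.linspace(0.0, np.pi, n + 1)
    h = x[1] - x[0]
    u = np.sin(x)
    d2u = (u[:-2] - 2 * u[1:-1] + u[2:]) / h**2  # operator under test
    errors.append(np.max(np.abs(d2u - (-np.sin(x[1:-1])))))
    hs.append(h)

print(f"observed order = {observed_order(hs, errors):.2f} (expected 2)")
```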

  16. Tactile length contraction as Bayesian inference.

    Tong, Jonathan; Ngo, Vy; Goldreich, Daniel

    2016-08-01

    To perceive, the brain must interpret stimulus-evoked neural activity. This is challenging: The stochastic nature of the neural response renders its interpretation inherently uncertain. Perception would be optimized if the brain used Bayesian inference to interpret inputs in light of expectations derived from experience. Bayesian inference would improve perception on average but cause illusions when stimuli violate expectation. Intriguingly, tactile, auditory, and visual perception are all prone to length contraction illusions, characterized by the dramatic underestimation of the distance between punctate stimuli delivered in rapid succession; the origin of these illusions has been mysterious. We previously proposed that length contraction illusions occur because the brain interprets punctate stimulus sequences using Bayesian inference with a low-velocity expectation. A novel prediction of our Bayesian observer model is that length contraction should intensify if stimuli are made more difficult to localize. Here we report a tactile psychophysical study that tested this prediction. Twenty humans compared two distances on the forearm: a fixed reference distance defined by two taps with 1-s temporal separation and an adjustable comparison distance defined by two taps with temporal separation t ≤ 1 s. We observed significant length contraction: As t was decreased, participants perceived the two distances as equal only when the comparison distance was made progressively greater than the reference distance. Furthermore, the use of weaker taps significantly enhanced participants' length contraction. These findings confirm the model's predictions, supporting the view that the spatiotemporal percept is a best estimate resulting from a Bayesian inference process. PMID:27121574
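
    One simple way to formalize such a Bayesian observer (a sketch consistent with the abstract's description, not the authors' exact model) is to combine Gaussian localization noise on each tap with a zero-mean low-velocity prior; the MAP length estimate then shrinks toward zero as the inter-tap time t decreases or the sensory noise grows:

```python
def perceived_length(measured_length, t, sigma_s, sigma_v):
    """MAP length estimate for two taps separated by time t.

    Gaussian localization noise sigma_s on each tap plus a zero-mean
    Gaussian prior on tap-to-tap velocity with width sigma_v; minimizing
    the negative log-posterior shrinks the measured length.
    """
    return measured_length / (1.0 + 2.0 * sigma_s**2 / (sigma_v**2 * t**2))

# shorter intervals and noisier (weaker) taps both increase contraction
for t in (1.0, 0.5, 0.2):
    print(t, perceived_length(10.0, t, sigma_s=1.0, sigma_v=10.0))
```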

  17. Development of the clearance level verification evaluation system. 2. Construction of the clearance data management system

    Clearance is defined as the removal of radioactive materials or radioactive objects within authorized practices from any further regulatory control by the regulatory body. In Japan, a clearance level and a procedure for its verification have been introduced under the Laws and Regulations, and solid clearance wastes inspected by the national authority can be handled and recycled as normal wastes. The most prevalent wastes are generated by the dismantling of nuclear facilities, so the Japan Atomic Energy Agency (JAEA) has been developing the Clearance Level Verification Evaluation System (CLEVES) as a convenient tool. The Clearance Data Management System (CDMS), which is a part of CLEVES, has been developed to support measurement, evaluation, and the making and recording of documents associated with clearance level verification. In addition, validation of the evaluation results of the CDMS was carried out by inputting data from actual clearance activities at the JAEA. Clearance level verification is easily carried out by using the CDMS for clearance activities. (author)

  18. Turbulence Modeling Verification and Validation

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important

  19. Numerical Verification of Industrial Numerical Codes

    Montan Sethy

    2012-04-01

    Several approximations occur during a numerical simulation: physical effects may be discarded, continuous functions replaced by discretized ones, and real numbers replaced by finite-precision representations. The use of floating-point arithmetic generates round-off errors at each arithmetical expression, and some mathematical properties are lost. The aim of the numerical verification activity at EDF R&D is to study the effect of round-off error propagation on the results of a numerical simulation. It is indeed crucial to perform a numerical verification of industrial codes such as those developed at EDF R&D, all the more so for codes running in HPC environments. This paper presents some recent studies around numerical verification at EDF R&D.
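
    The round-off effects described are easy to reproduce. Below is a small illustration of the non-associativity of floating-point addition and of accumulated round-off in a naive summation (illustrative only; EDF R&D's verification tooling is not shown here):

```python
import math

# Floating-point arithmetic is not associative: the order of operations
# changes the rounding error, which is what numerical verification studies.
print(0.1 + (0.2 + 0.3) == (0.1 + 0.2) + 0.3)   # False

# Accumulated round-off in a naive sum vs. an exactly rounded sum
values = [1.0, 1e16, -1e16] * 1000
print(sum(values))        # 0.0 -- each 1.0 is absorbed and lost
print(math.fsum(values))  # 1000.0 -- the exactly rounded result
```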

  20. Science verification results from PMAS

    Roth, M. M.; Becker, T.; Böhm, P.; Kelz, A.

    2004-02-01

    PMAS, the Potsdam Multi-Aperture Spectrophotometer, is a new integral field instrument which was commissioned at the Calar Alto 3.5m Telescope in May 2001. We report on results obtained from a science verification run in October 2001. We present observations of the low-metallicity blue compact dwarf galaxy SBS0335-052, the ultra-luminous X-ray source X-1 in the Holmberg II galaxy, the quadruple gravitational lens system Q2237+0305 (the "Einstein Cross"), the Galactic planetary nebula NGC7027, and extragalactic planetary nebulae in M31. PMAS is now available as a common user instrument at Calar Alto Observatory.

  1. Science Verification Results from PMAS

    Roth, M. M.; Becker, T; Boehm, P.; Kelz, A.

    2003-01-01

    PMAS, the Potsdam Multi-Aperture Spectrophotometer, is a new integral field instrument which was commissioned at the Calar Alto 3.5m Telescope in May 2001. We report on results obtained from a science verification run in October 2001. We present observations of the low-metallicity blue compact dwarf galaxy SBS0335-052, the ultra-luminous X-ray Source X-1 in the Holmberg II galaxy, the quadruple gravitational lens system Q2237+0305 (the "Einstein Cross"), the Galactic planetary nebula NGC7027,...

  2. Science Verification Results from PMAS

    Roth, M M; Böhm, P; Kelz, A

    2003-01-01

    PMAS, the Potsdam Multi-Aperture Spectrophotometer, is a new integral field instrument which was commissioned at the Calar Alto 3.5m Telescope in May 2001. We report on results obtained from a science verification run in October 2001. We present observations of the low-metallicity blue compact dwarf galaxy SBS0335-052, the ultra-luminous X-ray Source X-1 in the Holmberg II galaxy, the quadruple gravitational lens system Q2237+0305 (the "Einstein Cross"), the Galactic planetary nebula NGC7027, and extragalactic planetary nebulae in M31. PMAS is now available as a common user instrument at Calar Alto Observatory.

  3. FEFTRA TM verification. Update 2013

    FEFTRA is a finite element program package developed at VTT for the analysis of groundwater flow in Posiva's site evaluation programme, which seeks a final repository for spent nuclear fuel in Finland. The code is capable of modelling steady-state or transient groundwater flow, solute transport and heat transfer as coupled or separate phenomena. Being a typical research tool used only by its developers, the FEFTRA code long lacked a competent testing system and precise documentation of the verification of the code. In 2006 a project was launched in which the objective was to reorganise all the material related to the existing verification cases and place it into the FEFTRA program path under the version-control system. The work also included development of a new testing system, which automatically calculates the selected cases, checks the new results against the old approved results and constructs a summary of the test run. All the existing cases were gathered together, checked and added into the new testing system. The documentation of each case was rewritten with the LaTeX document preparation system and added into the testing system in a way that the whole test documentation (this report) can easily be generated in PostScript or PDF format. The current report is the updated version of the verification report published in 2007. At the moment the report mainly includes the cases related to the testing of the primary result quantities (i.e. hydraulic head, pressure, salinity concentration, temperature). The selected cases, however, represent typical hydrological applications in which the program package has been and will be employed in Posiva's site evaluation programme, i.e. simulations of groundwater flow, solute transport and heat transfer as separate or coupled phenomena. The comparison of the FEFTRA results to analytical, semi-analytical and/or other numerical solutions demonstrates the capability of FEFTRA to simulate such problems
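
    A testing system of the kind described can be sketched in a few lines: recompute each case, compare against approved results stored under version control, and summarize the run. The file layout and names below are assumptions for illustration, not FEFTRA's actual structure:

```python
import json
import math
from pathlib import Path

def run_regression(cases, tolerance=1e-9):
    """Recompute each case and check it against stored approved results.

    cases maps a case name to a zero-argument solver returning a list of
    numbers; approved results are JSON files kept under version control
    (the approved/ directory is a hypothetical layout).
    """
    summary = {}
    for name, solver in cases.items():
        new = solver()
        old = json.loads(Path(f"approved/{name}.json").read_text())
        ok = len(new) == len(old) and all(
            math.isclose(a, b, rel_tol=tolerance) for a, b in zip(new, old))
        summary[name] = "PASS" if ok else "FAIL"
    return summary

# usage: register cases, e.g. run_regression({"head_1d": my_head_case})
```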

  4. Hot cell verification facility update

    The Hot Cell Verification Facility (HCVF) is operated by the Westinghouse Hanford Co. in the 300 Area of Hanford. The HCVF provides a prototype hot cell mock-up for use in checking equipment and operations for functional and remote operation. The facility can also be used for hands-on training of operating personnel prior to actual hot operation of the equipment. A broad spectrum of testing and development functions is performed in HCVF, including: equipment operability testing, maintainability and compatibility testing, system integration, and remote maintenance capability testing. An updated description of the HCVF is presented in this paper

  5. An Effective Fingerprint Verification Technique

    Gogoi, Minakshi

    2010-01-01

    This paper presents an effective method for fingerprint verification based on a data mining technique called minutiae clustering and a graph-theoretic approach to analyze the process of fingerprint comparison, to give a feature space representation of minutiae and to produce a lower bound on the number of detectably distinct fingerprints. The method also proves the invariance of each individual fingerprint by using both the topological behavior of the minutiae graph and a distance measure called the Hausdorff distance. The method provides a graph-based index generation mechanism for fingerprint biometric data. A self-organizing map neural network is also used for classifying the fingerprints.
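
    The Hausdorff distance mentioned above is straightforward to compute for two minutiae sets. A minimal sketch (toy coordinates; not the paper's clustering or graph-indexing stages) using SciPy's directed Hausdorff distance:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def minutiae_hausdorff(set_a, set_b):
    """Symmetric Hausdorff distance between two minutiae point sets
    (n x 2 arrays of x, y coordinates)."""
    d_ab = directed_hausdorff(set_a, set_b)[0]
    d_ba = directed_hausdorff(set_b, set_a)[0]
    return max(d_ab, d_ba)

# toy usage: a template vs. a slightly displaced probe
rng = np.random.default_rng(0)
template = np.array([[10, 20], [35, 42], [50, 15], [62, 70]], float)
probe = template + rng.normal(scale=1.5, size=template.shape)
print(minutiae_hausdorff(template, probe))  # small => likely same finger
```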

  6. Verification and nuclear material security

    Full text: The Director General will open the symposium by presenting a series of challenges facing the international safeguards community: the need to ensure a robust system, with strong verification tools and a sound research and development programme; the importance of securing the necessary support for the system, in terms of resources; the effort to achieve universal participation in the non-proliferation regime; and the necessity of re-energizing disarmament efforts. Special focus will be given to the challenge underscored by recent events, of strengthening international efforts to combat nuclear terrorism. (author)

  7. SHIELD verification and validation report

    Boman, C.

    1992-02-01

    This document outlines the verification and validation effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system code. Along with its predecessors, SHIELD has been in use at the Savannah River Site (SRS) for more than ten years. During this time the code has been extensively tested and a variety of validation documents have been issued. The primary function of this report is to specify the features and capabilities for which SHIELD is to be considered validated, and to reference the documents that establish the validation.

  8. Minimum Length from First Principles

    Calmet, Xavier; Graesser, Michael; Hsu, Stephen D. H.

    2005-01-01

    We show that no device or gedanken experiment is capable of measuring a distance less than the Planck length. By "measuring a distance less than the Planck length" we mean, technically, resolving the eigenvalues of the position operator to within that accuracy. The only assumptions in our argument are causality, the uncertainty principle from quantum mechanics and a dynamical criterion for gravitational collapse from classical general relativity called the hoop conjecture. The inability of any gedanken experiment to measure a sub-Planckian distance suggests the existence of a minimal length.

  9. Particularities of Verification Processes for Distributed Informatics Applications

    Ion IVAN

    2013-01-01

    This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and factors influencing verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  10. Methods of Verification, Accountability and Control of Special Nuclear Material

    Stewart, J.E.

    1999-05-03

    This session demonstrates the nondestructive assay (NDA) measurement, surveillance and analysis technology required to protect, control and account (MPC&A) for special nuclear materials (SNM) in sealed containers. These measurements, observations and analyses comprise state-of-the-art, strengthened SNM safeguards systems. Staff member specialists, actively involved in research, development, training and implementation worldwide, will present six NDA verification systems and two software tools for integration and analysis of facility MPC&A data.

  11. Facial Verification Technology for Use In Atm Transactions

    Aru, Okereke Eze

    2013-01-01

    There is an urgent need for improved security in the banking sector. With the birth of the Automatic Teller Machine, banking became a lot easier, though with its own troubles of insecurity. Due to a tremendous increase in the number of criminals and their activities, the ATM has become insecure. ATM systems today use no more than an access card and PIN for identity verification. The recent progress in biometric identification techniques, including fingerprinting, retina scanning, and facial recogn...

  12. Cleanup Verification Package for the 118-F-2 Burial Ground

    This cleanup verification package documents completion of remedial action, sampling activities, and compliance with cleanup criteria for the 118-F-2 Burial Ground. This burial ground, formerly called Solid Waste Burial Ground No. 1, was the original solid waste disposal site for the 100-F Area. Eight trenches contained miscellaneous solid waste from the 105-F Reactor and one trench contained solid waste from the biology facilities

  13. Definition of Magnetic Exchange Length

    Abo, GS; Hong, YK; Park, J; Lee, J; Lee, W; Choi, BC

    2013-08-01

    The magnetostatic exchange length is an important parameter in magnetics as it measures the relative strength of exchange and self-magnetostatic energies. Its use can be found in areas of magnetics including micromagnetics, soft and hard magnetic materials, and information storage. The exchange length is of primary importance because it governs the width of the transition between magnetic domains. Unfortunately, there is some confusion in the literature between the magnetostatic exchange length and a similar distance concerning magnetization reversal mechanisms in particles known as the characteristic length. This confusion is aggravated by the common usage of two different systems of units, SI and cgs. This paper attempts to clarify the situation and recommends equations in both systems of units.
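
    For reference, the expressions this literature generally recommends for the magnetostatic exchange length, with A the exchange stiffness and M_s the saturation magnetization (quoted from standard micromagnetics usage rather than verbatim from the paper), are:

```latex
l_{\mathrm{ex}} = \sqrt{\frac{2A}{\mu_0 M_s^2}} \quad \text{(SI)},
\qquad
l_{\mathrm{ex}} = \sqrt{\frac{A}{2\pi M_s^2}} \quad \text{(cgs)}
```

    The distinct characteristic length governing magnetization reversal, by contrast, is usually written in terms of the anisotropy constant rather than the self-magnetostatic energy, which is the source of the confusion the paper addresses.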

  14. Minimum Length from First Principles

    Calmet, Xavier; Graesser, Michael; Hsu, Stephen D. H.

    2005-01-01

    We show that no device or gedanken experiment is capable of measuring a distance less than the Planck length. By "measuring a distance less than the Planck length" we mean, technically, resolving the eigenvalues of the position operator to within that accuracy. The only assumptions in our argument are causality, the uncertainty principle from quantum mechanics and a dynamical criterion for gravitational collapse from classical general relativity called the hoop conjecture. The inability of any g...

  15. Cognitive Bias in Systems Verification

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence can lead to questionable decisions to deploy; availability can lead to an inability to conceive critical tests; representativeness can lead to overinterpretation of results; and positive test strategies produce confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.

  16. RISKIND verification and benchmark comparisons

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models

  17. Video-Based Fingerprint Verification

    Lili Liu

    2013-09-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low.
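
    The equal error rate used as the figure of merit here can be computed directly from score distributions. A minimal sketch (synthetic genuine and impostor scores, simple threshold sweep):

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """EER from genuine and impostor match scores (higher = more similar).

    Sweeps a threshold; FRR is the fraction of genuine scores below it,
    FAR the fraction of impostor scores at or above it; the EER is read
    off where the two curves cross.
    """
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    frr = np.array([(genuine < t).mean() for t in thresholds])
    far = np.array([(impostor >= t).mean() for t in thresholds])
    i = np.argmin(np.abs(far - frr))
    return (far[i] + frr[i]) / 2.0

rng = np.random.default_rng(0)
print(equal_error_rate(rng.normal(0.8, 0.1, 1000),
                       rng.normal(0.5, 0.1, 1000)))
```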

  18. The monitoring and verification of nuclear weapons

    Garwin, Richard L., E-mail: RLG2@us.ibm.com [IBM Fellow Emeritus, IBM Thomas J. Watson Research Center, P.O. Box 218, Yorktown Heights, NY 10598 (United States)

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  19. On the organisation of program verification competitions

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    2012-01-01

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to sho

  20. HTGR analytical methods and design verification

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  1. 9 CFR 417.8 - Agency verification.

    2010-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the adequacy of the HACCP plan(s) by determining that each HACCP plan meets the requirements of this part and all other applicable regulations. Such verification may include: (a) Reviewing the HACCP plan;...

  2. On Verification Modelling of Embedded Systems

    Brinksma, Ed; Mader, Angelika

    2004-01-01

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the verificatio

  3. Verification and Performance Analysis for Embedded Systems

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  4. Telomere length correlates with life span of dog breeds.

    Fick, Laura J; Fick, Gordon H; Li, Zichen; Cao, Eric; Bao, Bo; Heffelfinger, Doug; Parker, Heidi G; Ostrander, Elaine A; Riabowol, Karl

    2012-12-27

    Telomeric DNA repeats are lost as normal somatic cells replicate. When telomeres reach a critically short length, a DNA damage signal is initiated, inducing cell senescence. Some studies have indicated that telomere length correlates with mortality, suggesting that telomere length contributes to human life span; however, other studies report no correlation, and thus the issue remains controversial. Domestic dogs show parallels in telomere biology to humans, with similar telomere length, telomere attrition, and absence of somatic cell telomerase activity. Using this model, we find that peripheral blood mononuclear cell (PBMC) telomere length is a strong predictor of average life span among 15 different breeds (p < …). Dogs lose telomeric DNA ~10-fold faster than humans, which is similar to the ratio of average life spans between these species. Breeds with shorter mean telomere lengths show an increased probability of death from cardiovascular disease, which was previously correlated with short telomere length in humans. PMID:23260664

  5. Verification Survey of the Building 315 Zero Power Reactor-6 Facility, Argonne National Laboratory-East, Argonne, Illinois

    W. C. Adams

    2007-05-25

    Oak Ridge Institute for Science and Education (ORISE) conducted independent verification radiological survey activities at Argonne National Laboratory’s Building 315, Zero Power Reactor-6 facility in Argonne, Illinois. Independent verification survey activities included document and data reviews, alpha plus beta and gamma surface scans, alpha and beta surface activity measurements, and instrumentation comparisons. An interim letter report and a draft report, documenting the verification survey findings, were submitted to the DOE on November 8, 2006 and February 22, 2007, respectively (ORISE 2006b and 2007).

  6. New method of verificating optical flat flatness

    Sun, Hao; Li, Xueyuan; Han, Sen; Zhu, Jianrong; Guo, Zhenglai; Fu, Yuegang

    2014-11-01

    Optical flats are commonly used in optical testing instruments; flatness is their most important form-error parameter. As a measurement criterion, the optical flat flatness (OFF) index needs good precision. Current measurement in China depends heavily on artificial visual interpretation, characterizing flatness through discrete points. The efficiency and accuracy of this method cannot meet the demands of industrial development. In order to improve testing efficiency and measurement accuracy, it is necessary to develop an optical flat verification system which can obtain all surface information rapidly and efficiently while remaining in accordance with current national metrological verification procedures. This paper reviews the current optical flat verification method and solves the problems existing in previous tests by using a new method and its supporting software. Final results show that the new system can improve verification efficiency and accuracy, by comparison with the JJG 28-2000 metrological verification procedure method.

  7. SU-E-J-138: On the Ion Beam Range and Dose Verification in Hadron Therapy Using Sound Waves

    Purpose: Accurate range verification is of great importance to fully exploit the potential benefits of ion beam therapies. Current research efforts on this topic include the use of PET imaging of induced activity, detection of emerging prompt gamma rays or secondary particles. It has also been suggested recently to detect the ultrasound waves emitted through the ion energy absorption process. The energy absorbed in a medium is dissipated as heat, followed by thermal expansion that leads to generation of acoustic waves. By using an array of ultrasound transducers the precise spatial location of the Bragg peak can be obtained. The shape and intensity of the emitted ultrasound pulse depend on several variables including the absorbed energy and the pulse length. The main objective of this work is to understand how the ultrasound wave amplitude and shape depend on the initial ion energy and intensity. This would help guide future experiments in ionoacoustic imaging. Methods: The absorbed energy density for protons and carbon ions of different energy and field sizes were obtained using Fluka Monte Carlo code. Subsequently, the system of coupled equations for temperature and pressure is solved for different ion pulse intensities and lengths to obtain the pressure wave shape, amplitude and spectral distribution. Results: The proposed calculations show that the excited pressure wave amplitude is proportional to the absorbed energy density and for longer ion pulses inversely proportional to the ion pulse duration. It is also shown that the resulting ionoacoustic pressure distribution depends on both ion pulse duration and time between the pulses. Conclusion: The Bragg peak localization using ionoacoustic signal may eventually lead to the development of an alternative imaging method with sub-millimeter resolution. It may also open a way for in-vivo dose verification from the measured acoustic signal
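
    The heating-to-pressure mechanism summarized above is commonly modeled by the thermoacoustic wave equation (a standard result in photoacoustics, stated here as context rather than taken from the abstract), with H(r, t) the volumetric heating rate deposited by the beam, β the thermal expansion coefficient, C_p the specific heat, and c the sound speed:

```latex
\frac{\partial^2 p}{\partial t^2} - c^2 \nabla^2 p
  = \Gamma \, \frac{\partial H}{\partial t},
\qquad
\Gamma = \frac{\beta c^2}{C_p}
```

    The source term's dependence on the time derivative of H is consistent with the abstract's observations: the amplitude scales with the absorbed energy density and, for pulses longer than the stress-confinement time, inversely with the pulse duration.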

  8. Concepts of Model Verification and Validation

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. Guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model

  9. Verification and validation process for the safety software in KNICS

    This paper describes the verification and validation (V&V) process for the safety software of the Programmable Logic Controller (PLC), Digital Reactor Protection System (DRPS), and Engineered Safety Feature-Component Control System (ESF-CCS) that are being developed in the Korea Nuclear Instrumentation and Control System (KNICS) projects. Specifically, it presents the DRPS V&V experience according to the software development life cycle. The main activities of the DRPS V&V process are preparation of software planning documentation; verification of the Software Requirement Specification (SRS), Software Design Specification (SDS) and codes; and testing of the integrated software and the integrated system. In addition, they include software safety analysis and software configuration management. The SRS V&V activities of DRPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparation of the integrated system test plan, software safety analysis, and software configuration management. The SDS V&V activities of RPS, likewise, are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparation of the integrated software test plan, software safety analysis, and software configuration management. The code V&V activities of DRPS are traceability analysis, source code inspection, test case and test procedure generation, software safety analysis, and software configuration management. Testing is the major V&V activity of the software integration and system integration phases. Software safety analysis at the SRS phase uses the Hazard and Operability (HAZOP) method; at the SDS phase it uses HAZOP and Fault Tree Analysis (FTA); and at the implementation phase it uses FTA. Finally, software configuration management is performed using the Nu-SCM (Nuclear Software Configuration Management) tool developed by the KNICS project. Through these activities, we believe we can achieve the functionality, performance, reliability and safety that are V

  10. Diminished fatigue at reduced muscle length in human skeletal muscle

    Lee, Samuel C. K.; Braim, Anthony; Becker, Cara N.; Prosser, Laura A.; Tokay, Ann M.; Binder-Macleod, Stuart A.

    2007-01-01

    Understanding muscle fatigue properties at different muscle lengths is essential to improve electrical stimulation applications in which impaired muscle is activated to produce function or to serve as an orthotic assist. This study examined the effects of muscle length on fatigue in human quadriceps muscle. Twelve healthy subjects were tested at short and long muscle lengths (15° and 90° of knee flexion, respectively) before and after a fatigue-producing protocol using low-, high-, and variab...

  11. Application of virtual distances methodology to laser tracker verification with an indexed metrology platform

    Acero, R.; Santolaria, J.; Pueo, M.; Aguilar, J. J.; Brau, A.

    2015-11-01

    High-range measuring equipment like laser trackers needs large-dimension calibrated reference artifacts in its calibration and verification procedures. In this paper, a new verification procedure for portable coordinate measuring instruments, based on the generation and evaluation of virtual distances with an indexed metrology platform, is developed. This methodology enables the definition of an unlimited number of reference distances without materializing them in a physical gauge to be used as a reference. The generation of the virtual points, and the reference lengths derived from them, is linked to the concept of the indexed metrology platform and to high-accuracy knowledge of the relative position and orientation of its upper and lower platforms. It is the measuring instrument, together with the indexed metrology platform, that remains still while the virtual mesh rotates around them. As a first step, the virtual distances technique is applied to a laser tracker in this work. The experimental verification procedure of the laser tracker with virtual distances is simulated and further compared with the conventional verification procedure of the laser tracker with the indexed metrology platform. The results obtained in terms of the volumetric performance of the laser tracker prove the suitability of the virtual distances methodology in calibration and verification procedures for portable coordinate measuring instruments, broadening and expanding the possibilities for the definition of reference distances in these procedures.
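
    The virtual-distances idea can be illustrated with a short sketch: one physical target, measured once, is mapped through the platform's calibrated index rotations, and every pair of the resulting virtual points defines a reference length. The axis, angles, and coordinates below are hypothetical:

```python
import numpy as np

def rotation_z(deg):
    """Rotation about the platform's vertical axis by deg degrees."""
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

def virtual_points(p_fixed, index_angles):
    """Map one physical point through the calibrated platform rotations,
    producing a mesh of virtual points around the stationary instrument."""
    return [rotation_z(a) @ p_fixed for a in index_angles]

# six index positions of a hypothetical platform, one physical target
pts = virtual_points(np.array([1200.0, 0.0, 300.0]), range(0, 360, 60))
ref_lengths = [np.linalg.norm(pts[i] - pts[j])
               for i in range(len(pts)) for j in range(i + 1, len(pts))]
print(f"{len(ref_lengths)} virtual reference distances generated")
```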

  12. Telomere length in human liver diseases.

    Urabe, Y; Nouso, K; Higashi, T; Nakatsukasa, H; Hino, N; Ashida, K; Kinugasa, N; Yoshida, K; Uematsu, S; Tsuji, T

    1996-10-01

    To determine the role of telomere-mediated gene stability in hepatocarcinogenesis, we examined the telomere length of human liver with or without chronic liver diseases and hepatocellular carcinomas (HCC). The mean telomere restriction fragment (TRF) length of normal liver (n = 13), chronic hepatitis (n = 11), liver cirrhosis (n = 24) and HCC (n = 24) was 7.8 +/- 0.2, 7.1 +/- 0.3, 6.4 +/- 0.2 and 5.2 +/- 0.2 kb, respectively (mean +/- standard error). TRF length decreased with the progression of chronic liver disease, and that in HCC was significantly shorter than that in other chronic liver diseases (p < …). The ratio of the TRF length of HCC to that of the corresponding surrounding liver for well differentiated (n = 7), moderately differentiated (n = 10) and poorly differentiated (n = 4) HCCs was 0.83 +/- 0.06, 0.75 +/- 0.05 and 0.98 +/- 0.09, respectively. The ratio for poorly differentiated HCC was significantly higher than that for moderately differentiated HCC (p < …). The telomere length ratio of moderately differentiated HCCs revealed a decrease of the ratio with size until tumours reached 50 mm in diameter. In contrast, the ratio increased as the size enlarged over 50 mm. These findings suggest that the gene stability of liver cells mediated by the telomere is reduced as chronic liver disease progresses and that telomerase is activated in poorly differentiated HCC and in moderately differentiated HCC over 50 mm in diameter. PMID:8938628

  13. When Does Length Cause the Word Length Effect?

    Jalbert, Annie; Neath, Ian; Bireta, Tamra J.; Surprenant, Aimee M.

    2011-01-01

    The word length effect, the finding that lists of short words are better recalled than lists of long words, has been termed one of the benchmark findings that any theory of immediate memory must account for. Indeed, the effect led directly to the development of working memory and the phonological loop, and it is viewed as the best remaining…

  14. How regional non-proliferation arrangements complement international verification

    This presentation focuses on international verification in the form of IAEA safeguards, and discusses the relationship between IAEA safeguards and the relevant regional arrangements, both existing and future. For most States the political commitment against acquisition of nuclear weapons has been carefully reached and is strongly held. Their observance of treaty commitments does not depend on the deterrent effect of verification activities. Safeguards serve to assist States who recognise it is in their own interest to demonstrate their compliance to others. Thus safeguards are a vital confidence-building measure in their own right, as well as being a major complement to the broader range of international confidence-building measures. Safeguards can both complement other confidence-building measures and in turn be complemented by them. In considering how this could work, it is useful to review briefly current developments in IAEA safeguards, existing regional arrangements, and nuclear-weapon-free zones.

  15. Conducting Verification and Validation of Multi- Agent Systems

    Nedhal Al Saiyd

    2012-10-01

    Verification and Validation (V&V) is a series of technical and managerial activities performed by system testers, not the system developers, in order to improve system quality and reliability and to assure that the product satisfies the users' operational needs. Verification is the assurance that the products of a particular development phase are consistent with the requirements of that phase and the preceding phase(s), while validation is the assurance that the final product meets the system requirements. An outside agency can be used to perform V&V, referred to as Independent V&V (IV&V), or this can be done by a group within the organization but not the developer, referred to as Internal V&V. Use of V&V often accompanies testing, can improve quality assurance, and can reduce risk. This paper puts forward guidelines for performing V&V of Multi-Agent Systems (MAS).

  16. Continuously variable focal length lens

    Adams, Bernhard W; Chollet, Matthieu C

    2013-12-17

    A material preferably in crystal form having a low atomic number such as beryllium (Z=4) provides for the focusing of x-rays in a continuously variable manner. The material is provided with plural spaced curvilinear, optically matched slots and/or recesses through which an x-ray beam is directed. The focal length of the material may be decreased or increased by increasing or decreasing, respectively, the number of slots (or recesses) through which the x-ray beam is directed, while fine tuning of the focal length is accomplished by rotation of the material so as to change the path length of the x-ray beam through the aligned cylindrical slots. X-ray analysis of a fixed point in a solid material may be performed by scanning the energy of the x-ray beam while rotating the material to maintain the beam's focal point at a fixed point in the specimen undergoing analysis.
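
    The inverse scaling of focal length with the number of refracting elements can be illustrated with the standard compound-refractive-lens estimate f ≈ R/(2Nδ); this is a generic textbook approximation rather than the patented geometry, and the radius and refractive decrement below are assumed values:

        # Focal length falls as more refracting slots (N) are placed in the
        # beam; fine tuning by rotation changes the effective path length.
        R = 0.5e-3      # assumed apex radius of each refracting surface, m
        delta = 3.4e-6  # approximate refractive decrement of Be near 10 keV

        for N in (1, 2, 4, 8, 16):
            f = R / (2 * N * delta)
            print(f"N = {N:2d} slots -> focal length ~ {f:7.2f} m")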

  17. Automated claim and payment verification.

    Segal, Mark J; Morris, Susan; Rubin, James M O

    2002-01-01

    Since the start of managed care, there has been steady deterioration in the ability of physicians, hospitals, payors, and patients to understand reimbursement and the contracts and payment policies that drive it. This lack of transparency has generated administrative costs, confusion, and mistrust. It is therefore essential that physicians, hospitals, and payors have rapid access to accurate information on contractual payment terms. This article summarizes problems with contract-based reimbursement and needed responses by medical practices. It describes an innovative, Internet-based claims and payment verification service, Phynance, which automatically verifies the accuracy of all claims and payments by payor, contract and line item. This service enables practices to know and apply the one, true, contractually obligated allowable. The article details implementation costs and processes and anticipated return on investment. The resulting transparency improves business processes throughout health care, increasing efficiency and lowering costs for physicians, hospitals, payors, employers--and patients. PMID:12122814

  18. Graduated compression stockings: knee length or thigh length.

    Benkö, T; Cooke, E A; McNally, M A; Mollan, R A

    2001-02-01

    The mechanisms by which graduated compression stockings prevent deep venous thrombosis are not completely understood. In the current study the physiologic effect of low-pressure graduated compression stockings on the venous blood flow in the lower limb and the practical aspects of their use were assessed. Patients having elective orthopaedic surgery at a university orthopaedic department were randomized into five groups to wear two different types of graduated compression stockings in thigh and knee lengths. Patients in the fifth control group did not wear graduated compression stockings. Venous occlusion strain gauge plethysmography was used to measure venous flow. After 20 minutes of bed rest there was a highly significant increase in venous capacitance and venous outflow in patients in all of the four groups wearing stockings. There was no difference in the mean of the percentage change of venous capacitance in patients in the four groups wearing stockings. The knee length Brevet stockings were less efficient in increasing the venous outflow. There was no significant change in the venous capacitance and venous outflow in patients in the control group. Visual assessment of the fit and use of stockings was done, and patients' subjective opinion of comfort was sought. The knee length graduated compression stockings wrinkled significantly less, and significantly fewer patients reported discomfort with them. All stockings were reported to be difficult to use. Thigh and knee length stockings have a significant effect on decreasing venous stasis of the lower limb. Knee length graduated compression stockings are similarly efficient in decreasing venous stasis, but they are more comfortable to wear, and they wrinkle less. PMID:11210954

  19. Supporting operational design information verification

    The role of operational DIV is to confirm that the facility is operated in accordance with its declared design or operational footprint. The system described in SRDP-R279 goes part way in meeting this objective for the main plutonium path in the chemical separation area; this report seeks to address some of the remaining issues in this area of the plant by proposing a complementary system for its secondary process areas. With the exception of the re-work and sentencing tanks, most of these process areas are of only secondary interest to safeguards because their contents have low plutonium concentration. Nevertheless, simply knowing that their operation conforms to the operational footprint would represent a contribution to operational DIV. By continually examining data recorded from both the main process line and the secondary areas, the provision of assurances during plant shutdowns would also be addressed. This complementary system would be founded on the same databases and tools as in SRDP-R279. In addition it would provide the following benefits: clarification of events identified as an issue by the R279 system; verification that the input and output accountancy tanks are operated as declared; and verification that process units are shut down when declared to be so. As with SRDP-R279, one of the aims of this report is to give an overall picture of the kind of implementation that would be required. The purpose is to give an idea about the scale of the task that would be required to produce a realistic implementation. It is now possible to acquire vast amounts of data from modern process plants; the challenge is in drawing any conclusions from it. What is of fundamental importance is that the plant data is recorded sensibly and that appropriate data structures are in place. It will always be possible to refine the analysis tools, but not the data and their structures; one cannot go back in time. (author)

  20. Verification for PWSCC mitigation technologies

    In order to prevent damage or leakage due to PWSCC (Primary Water Stress Corrosion Cracking) and to improve the reliability of power plants, various technologies for inspection, mitigation, replacement and repair have been developed and applied. Water jet peening (WJP) and ultrasonic shot peening (USP) have been developed and verified as mitigation technologies, and these techniques have been applied to operating reactor vessels (RV) and steam generators (SG). So far, the effect of WJP/USP on materials without cracks has been confirmed by verification tests. However, there are detection limits in the pre-inspections performed before WJP/USP. Therefore, it should be confirmed that WJP/USP can be applied in cases where cracks shallower than the detection limit are present, without causing any negative impact on those cracks, such as crack growth. This paper describes the verification test of the applicability of WJP and USP to materials with shallow cracks beyond pre-inspection detection. First, plate mockups with shallow cracks were prepared, and WJP or USP was conducted on the mockups. Then the residual stress adjacent to the cracks was measured and the mockups were inspected for any shallow crack growth. The test results confirmed the effect of WJP/USP on the residual stress (compressive stress) of surfaces with shallow cracks, and the harmlessness of WJP/USP applied to shallow cracks, i.e. no crack growth, was also observed. This means that WJP and USP are applicable in cases where shallow cracks beyond detection are present. Therefore, WJP and USP are confirmed as mitigation technologies for PWSCC even in cases where there is no indication of cracks detected during the pre-inspection. (authors)

  1. Summary of neutron scattering lengths

    All available neutron-nuclei scattering lengths are collected together with their error bars in a uniform way. Bound scattering lengths are given for the elements, the isotopes, and the various spin-states. They are discussed in the sense of their use as basic parameters for many investigations in the field of nuclear and solid state physics. The data bank is available on magnetic tape, too. Recommended values and a map of these data serve for an uncomplicated use of these quantities. (orig.)

  2. Slip length measurement using BBM

    Ahmadzadegan, Adib; Snoeyink, Craig

    2015-11-01

    We will present experimental characterizations of the slip lengths of fluids in nano/micro channels. These channels are becoming increasingly important in sensor and separations applications; however, crucial questions remain about the mechanisms that govern slip-length behavior. We used Bessel Beam microscopy (BBM), a novel super-resolution imaging system, in conjunction with a TIRF system. Together, these enabled Particle Tracking Velocimetry with significantly higher accuracy than previously possible. We will present results demonstrating the feasibility of this approach and the advantages that make this method unique.

  3. Result of UT verification test for stainless steel through weld deposit

    BWR owners and plant fabricators have been making efforts to reduce the number of ISI locations where UT is difficult to conduct. For that purpose, a UT verification test for both detection and sizing qualification through the weld deposit of pipe joints started in 2009 and will last until 2011. In 2009, the UT verification test for stainless steel pipe weld joints was performed. The test samples are stainless steel pipes with welds, of sizes 600A and 150A, with EDM notches and/or SCCs. Angle beam methods and phased-array angle beam methods have been applied. The detection test shows that there were no failures to detect and no false calls. The length sizing test shows that the RMS errors of SCC lengths are within the ASME acceptance criteria. It is therefore confirmed that each method has sufficient qualification for UT through the deposit of stainless steel pipe weld joints. These tests were witnessed and evaluated by JAPEIC as a third party. (author)

  4. Code Verification of the HIGRAD Computational Fluid Dynamics Solver

    Van Buren, Kendra L. [Los Alamos National Laboratory]; Canfield, Jesse M. [Los Alamos National Laboratory]; Hemez, Francois M. [Los Alamos National Laboratory]; Sauer, Jeremy A. [Los Alamos National Laboratory]

    2012-05-04

    The purpose of this report is to outline code and solution verification activities applied to HIGRAD, a Computational Fluid Dynamics (CFD) solver of the compressible Navier-Stokes equations developed at the Los Alamos National Laboratory, and used to simulate various phenomena such as the propagation of wildfires and atmospheric hydrodynamics. Code verification efforts, as described in this report, are an important first step to establish the credibility of numerical simulations. They provide evidence that the mathematical formulation is properly implemented without significant mistakes that would adversely impact the application of interest. Highly accurate analytical solutions are derived for four code verification test problems that exercise different aspects of the code. These test problems are referred to as: (i) the quiet start, (ii) the passive advection, (iii) the passive diffusion, and (iv) the piston-like problem. These problems are simulated using HIGRAD with different levels of mesh discretization and the numerical solutions are compared to their analytical counterparts. In addition, the rates of convergence are estimated to verify the numerical performance of the solver. The first three test problems produce numerical approximations as expected. The fourth test problem (piston-like) indicates the extent to which the code is able to simulate a 'mild' discontinuity, which is a condition that would typically be better handled by a Lagrangian formulation. The current investigation concludes that the numerical implementation of the solver performs as expected. The quality of solutions is sufficient to provide credible simulations of fluid flows around wind turbines. The main caveat associated with these findings is the low coverage provided by these four problems and the somewhat limited verification activities. A more comprehensive evaluation of HIGRAD may be beneficial for future studies.
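
    The rate-of-convergence estimation mentioned above can be sketched as follows; the refinement ratio and error values are hypothetical, but the observed order follows the standard formula p = log(e_coarse/e_fine)/log(r):

        # Observed order of accuracy from errors against an exact solution
        # on successively refined meshes (illustrative values only).
        import math

        r = 2.0                            # mesh refinement ratio
        errors = [4.0e-2, 1.1e-2, 2.9e-3]  # coarse, medium, fine mesh errors

        for e_coarse, e_fine in zip(errors, errors[1:]):
            p_obs = math.log(e_coarse / e_fine) / math.log(r)
            print(f"observed order: {p_obs:.2f}")
        # Values near the scheme's formal order (e.g. ~2 for a second-order
        # discretization) support a correct implementation.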

  5. An analysis of clinical activity, admission rates, length of hospital stay, and economic impact after a temporary loss of 50% of the non-operative podiatrists from a tertiary specialist foot clinic in the United Kingdom

    Catherine Gooday

    2013-09-01

    Introduction: Podiatrists form an integral part of the multidisciplinary foot team in the treatment of diabetic foot–related complications. A set of unforeseen circumstances within our specialist diabetes foot service in the United Kingdom caused a loss of 50% of our non-operative podiatry team for almost 7 months during 2010. Some of this time was filled by non-specialist community non-operative podiatrists. Methods: We assessed the economic impact of this loss by examining data for the 5 years prior to this 7-month interruption, and for the 2 years after ‘normal service’ was resumed. Results: Our data show that the loss of the non-operative podiatrists led to a significant rise in the numbers of admissions into hospital, and hospital length of stay also increased. At our institution a single bed day costs £275. During the time that the numbers of specialist non-operative podiatry staff were depleted, and for up to 6 months after they returned to normal activities, the extra costs increased by just less than £90,000. The number of people admitted directly from specialist vascular and orthopaedic clinics is likely to have increased due to the lack of capacity to manage them in the diabetic foot clinic. Our data were unable to assess these individuals and did not look at the costs saved from avoiding surgery. Thus the actual costs incurred are likely to be higher. Conclusions: Our data suggest that specialist non-operative podiatrists involved in the treatment of the diabetic foot may prevent unwarranted hospital admission and increased hospitalisation rates by providing skilled assessment and care in the outpatient clinical settings.

  6. Comparing formal verification approaches of interlocking systems

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested improving it by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey...

  7. Remark on pion scattering lengths

    Black, Deirdre; Jora, Renata; Park, Nae Woong; Schechter, Joseph; Shahid, M Naeem

    2009-01-01

    It is noted that the pattern of chiral perturbation theory predictions for both the isotopic spin 0 and isotopic spin 2 s-wave pion-pion scattering lengths to orders $p^2$, $p^4$ and $p^6$ seems to agree with the corresponding pattern of the tree level predictions of the SU(2) linear sigma model.

  8. Cavity length below chute aerators

    WU JianHua; RUAN ShiPing

    2008-01-01

    Air entrainment has proved to be one of the most efficient measures for cavitation control in the release works of hydropower projects. There are many factors to be considered in designing a chute aerator. One of the most important concerns the cavity length below the aerator, which has a decisive effect on air entrainment against cavitation damage. It is crucial to determine a reasonable emergence angle for the calculation of the cavity length. In the present paper the overall effects of structural and hydraulic parameters on the emergence angle of the flow from the aerator were analyzed. Four improved expressions for the emergence angle with weight coefficients were investigated, on the basis of error theory, using experimental data of 68 points observed from 12 aerators of 6 hydropower projects, of both model and prototype. A method to calculate the cavity length below aerators was suggested, which considers the overall effects of the above-mentioned parameters. Comparison between this method and five other methods of calculating the cavity length showed that the present method is more reliable than the existing ones, with a smaller mean error.
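
    As a rough illustration of the quantity being estimated (not the paper's weighted emergence-angle expressions), the cavity length can be approximated by treating the jet leaving the aerator ramp as a projectile; all values below are hypothetical:

        import math

        def cavity_length(V, theta_deg, drop, g=9.81):
            """Horizontal distance until the jet falls back to the chute."""
            theta = math.radians(theta_deg)
            vx, vy = V * math.cos(theta), V * math.sin(theta)
            # time of flight from 0.5*g*t**2 - vy*t - drop = 0
            t = (vy + math.sqrt(vy**2 + 2.0 * g * drop)) / g
            return vx * t

        # Flow at 25 m/s leaving a ramp at 6 degrees with a 0.8 m drop.
        print(f"cavity length ~ {cavity_length(25.0, 6.0, 0.8):.1f} m")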

  10. Cyclic Codes of Length 2^m

    Manju Pruthi

    2001-11-01

    In this paper explicit expressions for the idempotents in the ring $R = F_q[X]/\langle X^{2^m}-1\rangle$ are given. Cyclic codes of length $2^m$ over the finite field $F_q$, of odd characteristic, are defined in terms of their generator polynomials. The exact minimum distance and the dimension of the codes are obtained.
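
    The correspondence between cyclic codes and divisors of $x^n-1$ can be made concrete with a small computation; here $q=3$ and $m=3$ are arbitrary example parameters, and sympy stands in for the algebra done by hand in the paper:

        # Cyclic codes of length n = 2**m over F_q correspond to monic
        # divisors of x**n - 1; each product of irreducible factors is a
        # generator polynomial g(x) of a code of dimension k = n - deg(g).
        from functools import reduce
        from itertools import combinations
        from sympy import GF, Poly, symbols

        x = symbols("x")
        m, q = 3, 3
        n = 2**m

        _, factors = Poly(x**n - 1, x, domain=GF(q)).factor_list()
        irred = [f for f, _ in factors]
        print([f.as_expr() for f in irred])

        one = Poly(1, x, domain=GF(q))
        for r in range(len(irred) + 1):
            for subset in combinations(irred, r):
                g = reduce(lambda a, b: a * b, subset, one)
                print(f"deg g = {g.degree()}, k = {n - g.degree()}")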

  11. Seismic Hazard and Fault Length

    Black, N. M.; Jackson, D. D.; Mualchin, L.

    2005-12-01

    If mx is the largest earthquake magnitude that can occur on a fault, then what is mp, the largest magnitude that should be expected during the planned lifetime of a particular structure? Most approaches to these questions rely on an estimate of the Maximum Credible Earthquake, obtained by regression (e.g. Wells and Coppersmith, 1994) of fault length (or area) and magnitude. Our work differs in two ways. First, we modify the traditional approach to measuring fault length, to allow for hidden fault complexity and multi-fault rupture. Second, we use a magnitude-frequency relationship to calculate the largest magnitude expected to occur within a given time interval. Often fault length is poorly defined and multiple faults rupture together in a single event. Therefore, we need to expand the definition of a mapped fault length to obtain a more accurate estimate of the maximum magnitude. In previous work, we compared fault length vs. rupture length for post-1975 earthquakes in Southern California. In this study, we found that mapped fault length and rupture length are often unequal, and in several cases rupture broke beyond the previously mapped fault traces. To expand the geologic definition of fault length we outlined several guidelines: 1) if a fault truncates at young Quaternary alluvium, the fault line should be inferred underneath the younger sediments 2) faults striking within 45° of one another should be treated as a continuous fault line and 3) a step-over can link together faults at least 5 km apart. These definitions were applied to fault lines in Southern California. For example, many of the along-strike faults lines in the Mojave Desert are treated as a single fault trending from the Pinto Mountain to the Garlock fault. In addition, the Rose Canyon and Newport-Inglewood faults are treated as a single fault line. We used these more generous fault lengths, and the Wells and Coppersmith regression, to estimate the maximum magnitude (mx) for the major faults in
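
    A sketch of the regression step, using the commonly quoted Wells and Coppersmith (1994) all-slip-type coefficients for surface rupture length (the fault lengths below are hypothetical; the expanded fault definitions above would supply L):

        # Maximum magnitude from fault length, M = a + b * log10(L).
        import math

        def max_magnitude(L_km, a=5.08, b=1.16):
            return a + b * math.log10(L_km)

        # Linking stepped-over or truncated segments lengthens the assumed
        # rupture and raises the estimated maximum magnitude mx.
        for L in (40.0, 100.0, 250.0):
            print(f"L = {L:5.0f} km -> mx ~ {max_magnitude(L):.1f}")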

  12. Flammable Gas Refined Safety Analysis Tool Software Verification and Validation Report for Resolve Version 2.5

    The purpose of this report is to document all software verification and validation activities, results, and findings related to the development of Resolve Version 2.5 for the analysis of flammable gas accidents in Hanford Site waste tanks

  13. Age and disease at an arms length

    Lassen, Aske Juul

    of life. A new ethics is emerging, focused on longevity and spreading through healthcare policies and technologies. At activity centres, active elderly talk about health in old age, share experiences with health technologies and reflect on longevity while working out. Chronic diseases are common in old age, but this does not mean that you give up or accept decreased quality of life. Ends change as new means emerge. The technological and medical abilities change the reflexive longevity: one expects to live a long and healthy life. Thus, technology can be conceived as a world-changing mediation … from a chronic (previously fatal) disease. The active elderly often stick to their image of themselves as active, youthful and energetic in spite of a chronic disease. Old age and disease are not what they identify with and seem to be kept at arm's length. In the paper the author explores how...

  14. Performing Verification and Validation in Reuse-Based Software Engineering

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  15. Standard practice for verification and classification of extensometer systems

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice covers procedures for the verification and classification of extensometer systems, but it is not intended to be a complete purchase specification. The practice is applicable only to instruments that indicate or record values that are proportional to changes in length corresponding to either tensile or compressive strain. Extensometer systems are classified on the basis of the magnitude of their errors. 1.2 Because strain is a dimensionless quantity, this document can be used for extensometers based on either SI or US customary units of displacement. Note 1—Bonded resistance strain gauges directly bonded to a specimen cannot be calibrated or verified with the apparatus described in this practice for the verification of extensometers having definite gauge points. (See procedures as described in Test Methods E251.) 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish app...
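
    The error-based classification can be sketched as follows; the class names echo ASTM E83 usage, but the numeric limits are placeholders rather than the standard's tabulated values:

        # Assign the best class whose fixed-error or relative-error limit
        # covers the observed indication error (placeholder limits).
        CLASS_LIMITS = [
            ("B-1", 0.0001, 0.005),  # (class, fixed error, relative error)
            ("B-2", 0.0002, 0.010),
            ("C",   0.0010, 0.010),
        ]

        def classify(true_strain, indicated_strain):
            error = abs(indicated_strain - true_strain)
            for name, fixed, relative in CLASS_LIMITS:
                if error <= max(fixed, relative * abs(true_strain)):
                    return name
            return "unclassified"

        print(classify(0.02, 0.0201))  # error 1e-4 -> "B-1"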

  16. Standard Verification System Lite (SVS Lite)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  17. Procedure Verification and Validation Toolset Project

    National Aeronautics and Space Administration — The proposed research is aimed at investigating a procedure verification and validation toolset, which will allow the engineers who are responsible for developing...

  18. MAMA Software Features: Quantification Verification Documentation-1

    Ruggiero, Christy E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]; Porter, Reid B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  19. Data Exchanges and Verifications Online (DEVO)

    Social Security Administration — DEVO is the back-end application for processing SSN verifications and data exchanges. DEVO uses modern technology for parameter driven processing of both batch and...

  20. Seismic design verification of LMFBR structures

    1977-07-01

    The report provides an assessment of the seismic design verification procedures currently used for nuclear power plant structures, a comparison of available dynamic test methods, and conclusions and recommendations for future LMFBR structures.

  1. 10 CFR 300.11 - Independent verification.

    2010-01-01

    ... managing an auditing or verification process, including the recruitment and allocation of other individual...) Greenhouse gas emission and emission reduction quantification; (iv) Data and information auditing sampling methods; and (v) Risk assessment and methodologies and materiality analysis procedures outlined by...

  2. Engineering drawing field verification program. Revision 3

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation being performed, and documenting the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented.

  3. Tackling Verification and Validation for Prognostics

    National Aeronautics and Space Administration — Verification and validation (V&V) has been identified as a critical phase in fielding systems with Integrated Systems Health Management (ISHM) solutions to...

  4. Particularities of Verification Processes for Distributed Informatics Applications

    IVAN, ION; Cristian CIUREA; Bogdan VINTILA; Gheorghe NOSCA

    2013-01-01

    This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors influencing verification quality are established. Software optimality verification is analyzed and some metrics are d...

  5. Spacelab Mission 2 Pallet-Only Mode Verification Flight

    Lester, R. C.; Witt, W. R., Jr.

    1982-02-01

    The Spacelab is a flexible laboratory system, featuring an array of interchangeable components -- pressurized manned laboratories, unpressurized platforms, and related support systems -- that can be assembled in several different configurations of pallets and pressurized modules depending on the specific scientific requirements of the mission. The first two flights of Spacelab are designed to verify the flexibility and utility of all the elements of the Spacelab inventory. Spacelab Mission 2 will constitute the first flight of the pallet-only configuration of Spacelab. The major objective of Mission 2 is the verification of the performance of Spacelab systems and subsystems in this operating mode. The system performance will be verified using a complement of verification flight instrumentation and by operating a complement of scientific instrumentation to obtain scientific data. This paper describes the evolution of Spacelab Mission 2 including a discussion of the verification requirements and instrumentation, the experiments requirements and instrumentation, the major mission peculiar equipment to integrate the payload, and the general mission planning for the flight. Finally, the current status of the mission will be discussed with emphasis on hardware and software development, and on major activities yet to be completed.

  6. An international cooperative verification agenda for arms reduction

    The biggest challenge to the overall verification and monitoring agenda for future arms reductions may be that posed by uncertainties regarding the quantities of existing stocks of fissile material and nuclear weapons. We must develop strategies to reduce the residual uncertainties regarding completeness of initial declarations as all declared weapons-related inventories go to zero. Establishing this confidence in countries' initial baseline declarations will likely be a key point in all states' decisions to move to very low numbers, much less zero. The author reviews the questions and challenges that need to be addressed if there is to be significant progress in negotiating and implementing a verifiable fissile material cutoff treaty (FMCT) and a policy of nuclear weapon dismantling. In support of greater security as the world works towards the elimination of nuclear weapons, individual States could begin immediately by increasing the transparency of their nuclear activities. The International Verification Project is designed to bring experts from a wide array of related backgrounds together to build capacity for verification internationally in support of arms control goals (and in support of the larger objective of a world without nuclear weapons), build confidence between nuclear and non-nuclear-weapon states, promote freer flow of information among governments and between governments and non-governmental organizations (NGOs) and solve technical problems that could be barriers to progress. The paper is followed by the slides of the presentation. (A.C.)

  7. Verification and validation guidelines for high integrity systems. Volume 1

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D. [SoHaR, Inc., Beverly Hills, CA (United States)]

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  8. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

    The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location, the time of hail precipitation and the hailstone size are included in the crowd-sourced data and are assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are compared with the crowd-sourced data. The available data cover the investigation period from June to August 2015. Filter criteria have been applied in order to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties which result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing a categorical verification to be used. Verification scores (e.g. hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed. The relationship between the reported hailstone sizes and the two radar-based hail detection algorithms has been investigated.
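
    The categorical verification step can be sketched directly; the binary sequences below are hypothetical stand-ins for thresholded report and radar data:

        # Scores from a 2x2 contingency table of "hail"/"no hail" events.
        reports = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]  # crowd-sourced observations
        radar   = [1, 0, 0, 1, 0, 1, 1, 0, 0, 0]  # POH/MESHS above threshold

        hits = sum(o and f for o, f in zip(reports, radar))
        misses = sum(o and not f for o, f in zip(reports, radar))
        false_alarms = sum(f and not o for o, f in zip(reports, radar))

        pod = hits / (hits + misses)                # hit rate (POD)
        far = false_alarms / (hits + false_alarms)  # false alarm ratio
        csi = hits / (hits + misses + false_alarms) # critical success index
        print(f"POD = {pod:.2f}, FAR = {far:.2f}, CSI = {csi:.2f}")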

  10. An Assembler Driven Verification Methodology (ADVM)

    Macbeth, John S.; Heinz, Dietmar; Gray, Ken

    2005-01-01

    This paper presents an overview of an assembler driven verification methodology (ADVM) that was created and implemented for a chip card project at Infineon Technologies AG. The primary advantage of this methodology is that it enables rapid porting of directed tests to new targets and derivatives, with only a minimum amount of code refactoring. As a consequence, considerable verification development time and effort...

  11. An Assembler Driven Verification Methodology (ADVM)

    Macbeth, John S; Gray, Ken

    2011-01-01

    This paper presents an overview of an assembler driven verification methodology (ADVM) that was created and implemented for a chip card project at Infineon Technologies AG. The primary advantage of this methodology is that it enables rapid porting of directed tests to new targets and derivatives, with only a minimum amount of code refactoring. As a consequence, considerable verification development time and effort was saved.

  12. A verification library for multibody simulation software

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC Robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public domain and commercial multibody dynamic simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.
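
    A sketch of the kind of automated cross-check such a library could run, with hypothetical data standing in for solver output and a stored benchmark trajectory:

        import numpy as np

        def cross_check(reference, candidate, rtol=1e-3, atol=1e-6):
            """Pass if the candidate trajectory tracks the reference."""
            deviation = np.max(np.abs(candidate - reference))
            print(f"max deviation: {deviation:.3e}")
            return np.allclose(candidate, reference, rtol=rtol, atol=atol)

        t = np.linspace(0.0, 1.0, 101)
        reference = np.sin(2 * np.pi * t)  # stored benchmark trajectory
        candidate = reference + 1e-7 * np.random.randn(t.size)  # solver output

        print("PASS" if cross_check(reference, candidate) else "FAIL")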

  13. Inventory verification measurements using neutron multiplicity counting

    Ensslin, N.; Foster, L.A.; Harker, W.C.; Krick, M.S.; Langner, D.G.

    1998-12-31

    This paper describes a series of neutron multiplicity measurements of large plutonium samples at the Los Alamos Plutonium Facility. The measurements were corrected for bias caused by neutron energy spectrum shifts and nonuniform multiplication, and are compared with calorimetry/isotopics. The results show that multiplicity counting can increase measurement throughput and yield good verification results for some inventory categories. The authors provide recommendations on the future application of the technique to inventory verification.

  14. Verification and Validation in Systems Engineering

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  15. Probabilistic Anomaly Detection Method for Authorship Verification

    Boukhaled, Mohamed Amine; Ganascia, Jean-Gabriel

    2014-01-01

    Authorship verification is the task of determining if a given text is written by a candidate author or not. In this paper, we present a first study on using an anomaly detection method for the authorship verification task. We have considered a weakly supervised probabilistic model based on a multivariate Gaussian distribution. To evaluate the effectiveness of the proposed method, we conducted experiments on a classic French corpus. Our preliminary results show that the probabilistic method c...

  16. Code Formal Verification of Operation System

    Yu Zhang; Yunwei Dong; Huo Hong; Fan Zhang

    2010-01-01

    With the increasing pressure of non-functional attribute (security, safety and reliability) requirements on operating systems, high-confidence operating systems are becoming more important. Formal verification is the only known way to guarantee that a system is free of programming errors. We research formal verification of operating system kernels at the system code level and take theorem proving and model checking as the main technical methods to resolve the key techniques of verifying operatio...

  17. On Integrating Deductive Synthesis and Verification Systems

    Kneuss, Etienne; Kuncak, Viktor; Kuraj, Ivan; Suter, Philippe

    2013-01-01

    We describe techniques for synthesis and verification of recursive functional programs over unbounded domains. Our techniques build on top of an algorithm for satisfiability modulo recursive functions, a framework for deductive synthesis, and complete synthesis procedures for algebraic data types. We present new counterexample-guided algorithms for constructing verified programs. We have implemented these algorithms in an integrated environment for interactive verification and synthesis from ...

  18. Online Fingerprint Verification Algorithm and Distributed System

    Xi Guo; Jyotirmay Gadedadikar; Ping Zhang

    2011-01-01

    In this paper, a novel online fingerprint verification algorithm and distributed system is proposed. First, fingerprint acquisition, image preprocessing, and feature extraction are conducted on workstations. Then, the extracted feature is transmitted over the internet. Finally, fingerprint verification is processed on a server through a web-based database query. For the fingerprint feature extraction, a template is imposed on the fingerprint image to calculate the type and direction...

  19. A Continuous Verification Process in Concurrent Engineering

    Schaus, Volker; Tiede, Michael; Fischer, Philipp M.; Lüdtke, Daniel; Gerndt, Andreas

    2013-01-01

    This paper presents how a continuous mission verification process, similar to that used in software engineering, can be applied in early spacecraft design and Concurrent Engineering. Following the Model-based Systems Engineering paradigm, all engineers contribute to one single centralized data model of the system. The data model is enriched with some extra information to create an executable representation of the spacecraft and its mission. That executable scenario allows for verifications agains...

  20. Verification of MPI-based Computations

    Siegel, Stephen F.

    2008-01-01

    The Message Passing Interface is a widely-used parallel programming model and is the effective standard for high-performance scientific computing. It has also been used in parallel model checkers, such as DiVinE. In this talk we discuss the verification problem for MPI-based programs. The MPI is quite large and the semantics complex. Nevertheless, by restricting to a certain subset of MPI, the verification problem becomes tractable. Certain constructs outside of this subset (such as wildc...

  1. INTERPOLATION WITH RESTRICTED ARC LENGTH

    Petar Petrov

    2003-01-01

    For given data $(t_i, y_i)$, $i = 0, 1, \dots, n$, $0 = t_0 < t_1 < \dots < t_n = 1$, we study a constrained interpolation problem of Favard type: $\inf\{\|f''\|_\infty \mid f \in W^2_\infty[0,1],\ f(t_i) = y_i,\ i = 0, \dots, n,\ l(f;[0,1]) \le l_0\}$, where $l(f;[0,1]) = \int_0^1 \sqrt{1 + f'^2(x)}\,dx$ is the arc length of $f$ in $[0,1]$. We prove the existence of a solution $f^*$ of the above problem that is a quadratic spline whose second derivative $f^{*\prime\prime}$ coincides with one of the constants $-\|f^{*\prime\prime}\|_\infty$, $0$, $\|f^{*\prime\prime}\|_\infty$ between every two consecutive knots. Thus, we extend a result of Karlin concerning the Favard problem to the case of restricted length interpolation.

  2. Length of a Hanging Cable

    Eric Costello

    2011-01-01

    The shape of a cable hanging under its own weight and uniform horizontal tension between two power poles is a catenary. The catenary is a curve whose equation is defined by a hyperbolic cosine function and a scaling factor. The scaling factor for power cables hanging under their own weight is equal to the horizontal tension on the cable divided by the weight per unit length of the cable. Both of these values are unknown for this problem. Newton's method was used to approximate the scaling factor, and the arc length function to determine the length of the cable. A script was written using the Python programming language in order to quickly perform several iterations of Newton's method to get a good approximation for the scaling factor.
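
    A sketch of the computation described, assuming poles of equal height, a span W and a measured mid-span sag S (both hypothetical; the authors' script may take different inputs):

        # Newton's method for the catenary scaling factor a, from
        # a*(cosh(W/(2a)) - 1) = S, then the arc length 2a*sinh(W/(2a)).
        import math

        W, S = 100.0, 8.0  # span and sag, metres

        def g(a):
            return a * (math.cosh(W / (2 * a)) - 1.0) - S

        def g_prime(a):
            u = W / (2 * a)
            return math.cosh(u) - 1.0 - u * math.sinh(u)

        a = 100.0  # initial guess
        for _ in range(50):
            step = g(a) / g_prime(a)
            a -= step
            if abs(step) < 1e-12:
                break

        length = 2.0 * a * math.sinh(W / (2.0 * a))
        print(f"a = {a:.3f} m, cable length = {length:.3f} m")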

  3. Variable focal length deformable mirror

    Headley, Daniel; Ramsey, Marc; Schwarz, Jens

    2007-06-12

    A variable focal length deformable mirror has an inner ring and an outer ring that simply support and push axially on opposite sides of a mirror plate. The resulting variable clamping force deforms the mirror plate to provide a parabolic mirror shape. The rings are parallel planar sections of a single paraboloid and can provide an on-axis focus, if the rings are circular, or an off-axis focus, if the rings are elliptical. The focal length of the deformable mirror can be varied by changing the variable clamping force. The deformable mirror can generally be used in any application requiring the focusing or defocusing of light, including with both coherent and incoherent light sources.

  4. Concepts for inventory verification in critical facilities

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables

  5. Distributed Verification and Hardness of Distributed Approximation

    Sarma, Atish Das; Kor, Liah; Korman, Amos; Nanongkai, Danupon; Pandurangan, Gopal; Peleg, David; Wattenhofer, Roger

    2010-01-01

    We study the {\\em verification} problem in distributed networks, stated as follows. Let $H$ be a subgraph of a network $G$ where each vertex of $G$ knows which edges incident on it are in $H$. We would like to verify whether $H$ has some properties, e.g., if it is a tree or if it is connected. We would like to perform this verification in a decentralized fashion via a distributed algorithm. The time complexity of verification is measured as the number of rounds of distributed communication. In this paper we initiate a systematic study of distributed verification, and give almost tight lower bounds on the running time of distributed verification algorithms for many fundamental problems such as connectivity, spanning connected subgraph, and $s-t$ cut verification. We then show applications of these results in deriving strong unconditional time lower bounds on the {\\em hardness of distributed approximation} for many classical optimization problems including minimum spanning tree, shortest paths, and minimum cut....

  6. Minimal Length, Measurability and Gravity

    Alexander Shalyt-Margolin

    2016-03-01

    The present work is a continuation of the author's previous papers on the subject. In terms of the measurability (or measurable quantities) notion introduced in a minimal length theory, consideration is first given to a quantum theory in the momentum representation. The same terms are then used to consider the Markov gravity model, which here illustrates the general approach to studies of gravity in terms of measurable quantities.

  7. Verification of important cross section data

    Continuing efforts in nuclear data development have made the design of a fusion power system less uncertain. The fusion evaluated nuclear data library (FENDL) development effort since 1987, under the leadership of the IAEA Nuclear Data Section, has provided a credible international library for the investigation and design of the International Thermonuclear Experimental Reactor (ITER). Integral neutronics experiments are being carried out for ITER and fusion power plant blanket and shield assemblies to validate the available nuclear database and to identify deficiencies for further improvement. Important cross section data need experimental verification if they are evaluated on the basis of physics model calculations and no measured data points are available. A particular example is the Si28(n,x)Al27 reaction cross section, which is important for determining whether a low activation SiC composite structure can be qualified as low level nuclear waste after lifetime exposure in the first wall neutron environment of a fusion power plant. Measurements of helium production data for candidate fusion materials are also needed, particularly at energies above 14 MeV, for the assessment of materials damage in the IFMIF neutron spectrum. To a lesser extent, the V51(n,x)Ti50 reaction cross section also needs to be measured to further confirm a recent new evaluation of vanadium for ENDF/B-VII. (author)

  8. Context-aware approach for formal verification

    Amel Benabbou

    2016-02-01

    The context-aware approach has proven to be an effective technique for software model-checking verification. It focuses on the explicit modelling of the environment as one or more contexts. In this area, specifying precise requirements is a challenging task for engineers, since environmental conditions often lack precision. A DSL, called CDL, has been proposed to facilitate the specification of requirements and contexts. However, such a language is still low-level and error prone, difficult to apply to complex models, and assessments of its usability remain mixed. In this paper, we propose a high-level formalism of CDL to facilitate specifying contexts, based on interaction overview diagrams that orchestrate activity diagrams automatically transformed from textual use cases. Our approach highlights the boundaries between the system and its environment. It is qualified as context-aware model checking and aims to reduce the semantic gap between informal and formal requirements; the objective is to assist and encourage engineers to include sufficient detail to accomplish the specification process effectively.

  9. Field verification of CO sub 2 -foam

    Martin, F.D.; Heller, J.P.; Weiss, W.W.

    1992-05-01

    In September 1989, the Petroleum Recovery Research Center (PRRC), a division of New Mexico Institute of Mining and Technology, received a grant from the US Department of Energy (DOE) for a project entitled "Field Verification of CO2 Foam." The grant provided for an extension of the PRRC laboratory work to a field testing stage to be performed in collaboration with an oil producer actively conducting a CO2 flood. The objectives of this project are to: (1) conduct reservoir studies, laboratory tests, simulation runs, and field tests to evaluate the use of foam for mobility control or fluid diversion in a New Mexico CO2 flood, and (2) evaluate the concept of CO2 foam in the field by using a reservoir where CO2 flooding is ongoing, characterizing the reservoir, modeling the process, and monitoring performance of the field test. Seven tasks were identified for the successful completion of the project: (1) evaluate and select a field site, (2) develop an initial site-specific plan, (3) conduct laboratory CO2-foam mobility tests, (4) perform reservoir simulations, (5) design the foam slug, (6) implement a field test, and (7) evaluate results.

  10. Verification for radiological decommissioning - Lessons learned

    During the past 10 years, the Environmental Survey and Site Assessment Program (ESSAP) at Oak Ridge Associated Universities has performed radiological surveys to confirm the adequacy of cleanup and/or decommissioning actions at sites and facilities where radioactive materials have been handled. These surveys are part of the independent oversight programs of the US Department of Energy (DOE) and the US Nuclear Regulatory Commission (NRC). Results of verification activities have been discouraging. Numerous independent surveys have identified residual contamination requiring further remediation; in some cases, initial decontamination and post-remedial action monitoring were totally inadequate. While participating in decommissioning projects, ESSAP learned valuable lessons and has passed this information to regulating agencies and decommissioning sites. The goal of this presentation is to highlight the difficulties encountered by ESSAP in its involvement with NRC and DOE decommissioning projects. Decommissioning projects require teamwork, and success depends to a large degree on the communication, cooperation, and coordination of efforts among the individual organizations involved. This information could be used by organizations involved in future decontamination projects to avoid some of the pitfalls associated with this process.

  11. Visual inspection for CTBT verification

    Hawkins, W.; Wohletz, K.

    1997-03-01

    On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer "ground truth" in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are not widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection.

  12. Formal verification and validation of the safety-critical software in a digital reactor protection system

    This paper describes the Verification and Validation (V and V) activities for the safety-critical software in a Digital Reactor Protection System (DRPS) that is being developed through the Korea nuclear instrumentation and control system project. The main activities of the DRPS V and V process are the preparation of the software planning documentation, verification of the software according to the software life cycle, software safety analysis, and software configuration management. The verification work for the Software Requirement Specification (SRS) of the DRPS consists of a technical evaluation, a licensing suitability evaluation, an inspection and traceability analysis, a formal verification, and the preparation of a test plan and procedure. In particular, the SRS is specified by a formal specification method in the development phase, and the formal SRS is verified by a formal verification method. Through these activities, we believe we can achieve the functionality, performance, reliability, and safety that are the major V and V objectives of nuclear safety-critical software in a DRPS. (authors)

  13. National Verification System of National Meteorological Center, China

    Zhang, Jinyan; Wei, Qing; Qi, Dan

    2016-04-01

    Product Quality Verification Division for official weather forecasting of China was founded in April 2011. It is affiliated to the Forecast System Laboratory (FSL), National Meteorological Center (NMC), China. There are three employees in this department; I am one of them and am in charge of the Product Quality Verification Division in NMC, China. After five years of construction, an integrated real-time National Verification System of NMC, China has been established. At present, its primary roles include: 1) to verify the official weather forecasting quality of NMC, China; 2) to verify the official city weather forecasting quality of Provincial Meteorological Bureaus; 3) to evaluate forecasting quality for each forecaster in NMC, China. To verify the official weather forecasting quality of NMC, China, we have developed: • Grid QPF verification module (including upscale) • Grid temperature, humidity and wind forecast verification module • Severe convective weather forecast verification module • Typhoon forecast verification module • Disaster forecast verification module • Disaster warning verification module • Medium and extended period forecast verification module • Objective elements forecast verification module • Ensemble precipitation probabilistic forecast verification module. To verify the official city weather forecasting quality of Provincial Meteorological Bureaus, we have developed: • City elements forecast verification module • Public heavy rain forecast verification module • City air quality forecast verification module. To evaluate forecasting quality for each forecaster in NMC, China, we have developed: • Off-duty forecaster QPF practice evaluation module • QPF evaluation module for forecasters • Severe convective weather forecast evaluation module • Typhoon track forecast evaluation module for forecasters • Disaster warning evaluation module for forecasters • Medium and extended period forecast evaluation module. The further ...
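
    As a minimal illustration of the kind of computation a grid QPF verification module performs (a generic sketch, not the NMC system itself; the data and the 25 mm event threshold are invented), categorical scores can be derived from a 2x2 contingency table of thresholded forecast/observation pairs:

```python
# Illustrative sketch: categorical precipitation verification scores.
# Thresholding forecast and observed precipitation yields a contingency
# table from which POD, FAR, threat score (CSI) and frequency bias follow.

def contingency_scores(forecast, observed, threshold):
    hits = misses = false_alarms = 0
    for f, o in zip(forecast, observed):
        fc, ob = f >= threshold, o >= threshold
        if fc and ob:
            hits += 1
        elif not fc and ob:
            misses += 1
        elif fc and not ob:
            false_alarms += 1
    pod = hits / (hits + misses) if hits + misses else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan")
    csi = hits / (hits + misses + false_alarms) if hits + misses + false_alarms else float("nan")
    bias = (hits + false_alarms) / (hits + misses) if hits + misses else float("nan")
    return {"POD": pod, "FAR": far, "CSI": csi, "BIAS": bias}

# 24-h accumulated precipitation (mm) at a few grid points, 25 mm event threshold
print(contingency_scores([30.2, 5.0, 40.1, 0.3, 26.0],
                         [28.5, 0.0, 12.4, 1.1, 31.7], threshold=25.0))
```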

  14. Production of enzymatically active recombinant full-length barley high pI alpha-glucosidase of glycoside family 31 by high cell-density fermentation of Pichia pastoris and affinity purification

    Næsted, Henrik; Kramhøft, Birte; Lok, F.; Bojsen, K.; Yu, S.; Svensson, Birte

    2006-01-01

    Recombinant barley high pI alpha-glucosidase was produced by high cell-density fermentation of Pichia pastoris expressing the cloned full-length gene. The gene was amplified from a genomic clone and exons (coding regions) were assembled by overlap PCR. The resulting cDNA was expressed under control...

  15. Monitoring and verification R&D

    Pilat, Joseph F [Los Alamos National Laboratory; Budlong - Sylvester, Kory W [Los Alamos National Laboratory; Fearey, Bryan L [Los Alamos National Laboratory

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  16. Bayes Estimation of Queue Length

    Dohnal, Pavel

    Praha : ÚTIA AV ČR, 2006 - (Přikryl, J.; Šmídl, V.). s. 47-48 [International PhD Workshop on Interplay of Societal and Technical Decision-Making, Young Generation Viewpoint /7./. 25.09.2006-30.09.2006, Hrubá Skála] R&D Projects: GA AV ČR 1ET100750401; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : traffic flow * queue length * occupancy * intensity Subject RIV: BC - Control Systems Theory

  17. Bayes Estimation of Queue Length

    Dohnal, Pavel

    Praha : ÚTIA AV ČR, 2006 - (Andrýsek, J.), s. 1-8 [International PhD Workshop on Interplay of Societal and Technical Decision-Making, Young Generation Viewpoint /7./. Hrubá Skála (CZ), 25.09.2006-30.09.2006] R&D Projects: GA MŠk 1M0572; GA AV ČR 1ET100750401 Institutional research plan: CEZ:AV0Z10750506 Keywords: Bayes estimation * queue length * traffic flow * occupancy * intensity Subject RIV: BC - Control Systems Theory

  18. Minimal Length, Measurability and Gravity

    Shalyt-Margolin, A E

    2016-01-01

    The present work is a continuation of the previous papers written by the author on the subject. In terms of the measurability (or measurable quantities) notion introduced in a minimal length theory, first the consideration is given to a quantum theory in the momentum representation. The same terms are used to consider the Markov gravity model that here illustrates the general approach to studies of gravity in terms of measurable quantities. This paper is dedicated to the 75th Anniversary of Professor Vladimir Grigor'evich Baryshevsky.

  19. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone, owing to the intensive and customized process necessary to verify each individual component model, and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
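
    A minimal sketch of one check such an automated tool might perform (the model format, node names, and the reachability criterion are illustrative assumptions, not the NASA tools): every declared failure mode in the directed graph should propagate to at least one observable effect.

```python
# Hypothetical FFM check: in a directed graph of failure-effect propagation,
# verify that every failure-mode node can reach at least one observable node.
from collections import deque

def reachable(graph, start):
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def check_ffm(graph, failure_modes, observables):
    problems = []
    for fm in failure_modes:
        if not reachable(graph, fm) & observables:
            problems.append(fm)  # failure mode with no path to any observable
    return problems

ffm = {"valve_stuck": ["low_flow"], "low_flow": ["pressure_alarm"],
       "sensor_drift": []}  # deliberately dangling
print(check_ffm(ffm, ["valve_stuck", "sensor_drift"], {"pressure_alarm"}))
```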

  20. Automated verification of system configuration

    Andrews, W. H., Jr.; Baker, S. P.; Blalock, A. V.

    1991-05-01

    Errors in field wiring can result in significant correction costs (if the errors are discovered prior to use), in erroneous or unusable data (if the errors are not discovered in time), or in serious accidents (if the errors corrupt critical data). Detailed field wiring checkout and rework are tedious and expensive, but they are essential steps in the quality assurance process for large, complex instrumentation and control systems. A recent Oak Ridge National Laboratory (ORNL) development, the CONFiguration IDEntification System (CONFIDES), automates verification of field wiring. In CONFIDES, an identifier module is installed on or integrated into each component (e.g., sensor, actuator, cable, distribution panel) to be verified. Interrogator modules, controlled by a personal computer (PC), are installed at the connections of the field wiring to the inputs of the data acquisition and control system (DACS). Interrogator modules poll the components connected to each channel of the DACS and can determine the path taken by each channel's signal to or from the end device for that channel. The system provides not only the identification (ID) codes for the cables and patch panels in the path to a particular sensor or actuator but also the IDs of individual cable conductors. One version of the system uses existing signal wires for communications between CONFIDES modules. Another, more powerful version requires a single dedicated conductor in each cable. Both versions can operate with or without instrument power applied, and neither interferes with the normal operation of the DACS. Identifier modules can provide a variety of information including status and calibration data.
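
    A minimal sketch of the verification step this enables (illustrative only; CONFIDES's actual data formats and polling protocol are not described here): compare the ID chain polled for each DACS channel against the as-designed wiring list and report any mismatch.

```python
# Illustrative sketch (not the ORNL implementation): compare the chain of
# component IDs reported by an interrogator module for each DACS channel
# against the as-designed wiring list, reporting any mismatched channel.

as_designed = {
    "CH-01": ["TE-101", "CABLE-17", "PANEL-A", "CABLE-03"],
    "CH-02": ["PT-205", "CABLE-22", "PANEL-B", "CABLE-08"],
}

as_polled = {
    "CH-01": ["TE-101", "CABLE-17", "PANEL-A", "CABLE-03"],
    "CH-02": ["PT-205", "CABLE-22", "PANEL-A", "CABLE-08"],  # wrong panel
}

def verify_wiring(designed, polled):
    for channel, expected in designed.items():
        actual = polled.get(channel)
        if actual != expected:
            print(f"{channel}: expected {expected}, polled {actual}")

verify_wiring(as_designed, as_polled)
```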

  1. Ozone Monitoring Instrument geolocation verification

    Kroon, M.; Dobber, M. R.; Dirksen, R.; Veefkind, J. P.; van den Oord, G. H. J.; Levelt, P. F.

    2008-08-01

    Verification of the geolocation assigned to individual ground pixels as measured by the Ozone Monitoring Instrument (OMI) aboard the NASA EOS-Aura satellite was performed by comparing geophysical Earth surface details as observed in OMI false color images with the high-resolution continental outline vector map as provided by the Interactive Data Language (IDL) software tool from ITT Visual Information Solutions. The OMI false color images are generated from the OMI visible channel by integration over 20-nm-wide spectral bands of the Earth radiance intensity around 484 nm, 420 nm, and 360 nm wavelength per ground pixel. Proportional to the integrated intensity, we assign color values composed of CRT standard red, green, and blue to the OMI ground pixels. Earth surface details studied are mostly high-contrast coast lines where arid land or desert meets deep blue ocean. The IDL high-resolution vector map is based on the 1993 CIA World Database II Map with a 1-km accuracy. Our results indicate that the average OMI geolocation offset over the years 2005-2006 is 0.79 km in latitude and 0.29 km in longitude, with a standard deviation of 1.64 km in latitude and 2.04 km in longitude, respectively. Relative to the OMI nadir pixel size, one obtains mean displacements of ~6.1% in latitude and ~1.2% in longitude, with standard deviations of 12.6% and 7.9%, respectively. We conclude that the geolocation assigned to individual OMI ground pixels is sufficiently accurate to support scientific studies of atmospheric features as observed in OMI level 2 satellite data products, such as air quality issues on urban scales or volcanic eruptions and their plumes, that occur on spatial scales comparable to or smaller than OMI nadir pixels.
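
    The relative displacements follow directly from the reported offsets. The back-of-envelope check below assumes a nominal OMI nadir footprint of roughly 13 km x 24 km; that footprint is an assumption of this sketch rather than a figure from the abstract (the small mismatch in the longitude standard deviation suggests a slightly different across-track size was used).

```python
# Back-of-envelope check of the reported relative displacements, assuming a
# nominal OMI nadir footprint of roughly 13 km x 24 km (an assumption here,
# not stated in the abstract). Offsets are the study's reported values.

pixel_lat_km, pixel_lon_km = 13.0, 24.0   # assumed nadir footprint
mean_lat, mean_lon = 0.79, 0.29           # km, from the study
std_lat, std_lon = 1.64, 2.04             # km, from the study

print(f"mean offset: {100 * mean_lat / pixel_lat_km:.1f}% of pixel (lat), "
      f"{100 * mean_lon / pixel_lon_km:.1f}% (lon)")
print(f"std dev:     {100 * std_lat / pixel_lat_km:.1f}% (lat), "
      f"{100 * std_lon / pixel_lon_km:.1f}% (lon)")
# ~6.1% and ~1.2% mean displacement, matching the values quoted above.
```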

  2. Evolution of IAEA verification in relation to nuclear disarmament

    The Agency has over forty years of experience in applying safeguards in 70 States. This experience has been used to provide safeguards for the 'excess material', nuclear material irreversibly released from the nuclear weapons program in the United States. The IAEA safeguards experience has also helped to put the Trilateral Initiative on a fast-forward track. The basic work on an agreement and on technical verification details is well on the way and may feed seamlessly into the Plutonium Management and Disposition Agreement (PMDA). Since fissile material remains the most essential part of a nuclear weapon, technology and approaches currently used for safeguards in non-nuclear-weapon States may be utilized, or further developed, to assure the international community that such material remains irreversibly removed from weapons programs. The IAEA experience in understanding relevant processes of the nuclear fuel cycle permits the application of monitoring regimes to nuclear facilities and their operation to assure that these facilities cannot be misused for activities proscribed under an international treaty that would ban the production of weapons-usable material. It must be remembered that the application of safeguards pursuant to the NPT is one of the Agency's core activities. There is no such explicit and forceful mandate for the Agency in respect of nuclear disarmament, unless an international agreement or treaty were to designate the Agency as the verification organization for that agreement, too. The Agency Statute requests the Agency to 'conduct its activities in conformity with policies of the UN furthering the establishment of safeguarded worldwide disarmament'. Technical skills and experience exist. A path from the IAEA international safeguards regime of today to a verification arrangement under a Fissile Material Cut-off Treaty (FMCT) may be possible.

  3. Symposium on International Safeguards: Preparing for Future Verification Challenges

    The purpose of the symposium is to foster dialogue and information exchange involving Member States, the nuclear industry and members of the broader nuclear non-proliferation community to prepare for future verification challenges. Topics addressed during the 2010 symposium include the following: - Supporting the global nuclear non-proliferation regime: Building support for strengthening international safeguards; Enhancing confidence in compliance with safeguards obligations; Legal authority as a means to enhance effectiveness and efficiency; Verification roles in support of arms control and disarmament. - Building collaboration and partnerships with other international forums: Other verification and non-proliferation regimes; Synergies between safety, security and safeguards regimes. - Improving cooperation between IAEA and States for safeguards implementation: Strengthening State systems for meeting safeguards obligations; Enhancing safeguards effectiveness and efficiency through greater cooperation; Lessons learned: recommendations for enhancing integrated safeguards implementation. - Addressing safeguards challenges in an increasingly interconnected world: Non-State actors and covert trade networks; Globalization of nuclear information and technology. - Preparing for the global nuclear expansion and increasing safeguards workload: Furthering implementation of the State-level concept and integrated safeguards; Information-driven safeguards; Remote data-driven safeguards inspections; Safeguards in States without comprehensive safeguards agreements. - Safeguarding advanced nuclear facilities and innovative fuel cycles: Proliferation resistance; Safeguards by design; Safeguards approaches for advanced facilities. - Advanced technologies and methodologies: For verifying nuclear material and activities; For detecting undeclared nuclear material and activities; For information collection, analysis and integration. - Enhancing the development and use of safeguards

  4. An integrated user-oriented laboratory for verification of digital flight control systems: Features and capabilities

    Defeo, P.; Doane, D.; Saito, J.

    1982-01-01

    A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.

  5. Inspection and verification of waste packages for near surface disposal

    depending upon the individual Member State's QA/QC system for waste management. In this context, this publication is a collection of current information about various Member States' QA/QC programmes. It reviews them in terms of common approaches and technical procedures as well as applicable technologies. This TECDOC will benefit Member States, especially developing countries, that are planning, establishing or upgrading existing near surface repository systems. This publication is intended to provide technical guidance and current technical information about assuring compliance of waste packages with near surface disposal facility acceptance requirements by means of inspection and verification. It, therefore, discusses concepts of waste package inspection and verification, waste acceptance requirements, establishment of a waste package QA/QC programme, technical activities, inspection and verification procedures, and the waste generator/disposal facility operator interface ...

  6. Cleanup Verification Package for the 116-K-2 Effluent Trench

    J. M. Capron

    2006-04-04

    This cleanup verification package documents completion of remedial action for the 116-K-2 effluent trench, also referred to as the 116-K-2 mile-long trench and the 116-K-2 site. During its period of operation, the 116-K-2 site was used to dispose of cooling water effluent from the 105-KE and 105-KW Reactors by percolation into the soil. This site also received mixed liquid wastes from the 105-KW and 105-KE fuel storage basins, reactor floor drains, and miscellaneous decontamination activities.

  7. Dosimetric verification in intensity modulated radiation therapy

    As part of dosimetric verification for intensity modulated radiation therapy (IMRT), we examined the selection of a dosimeter in accordance with the purpose of physical measurement and the process of data analysis. Because of the high dose conformation in the target volume and the minimum dose in the organs at risk (OAR) in IMRT, dosimetric verification is essential. Because dosimetric verification cannot be performed in a patient, a physical phantom and dosimeter must be used. Dose verification using a physical phantom, to which the beam data optimized for a patient slated for IMRT are transferred, may cause latent error as a result of change in the depth of each beam toward an isocenter. This effect may change the dose distribution and prescription dose. The basic methods of dosimetric verification with physical measurement are point dosimetry, when the reference dose is given at a point by planning software, and volumetric dosimetry, when planning software gives the dose as a volumetric configuration. While the most accurate dosimetry is done using a calibrated ionization chamber, IMRT requires volumetric dosimetry using some kind of portal film or a polymer gel dosimeter, because of the need for dosimetric verification of an irregular dose distribution in IMRT. The importance of indirect dosimetry using these methods lies in calibration as a dosimeter, absolute dose, and preservation of calibration. In our study, the verification of dose distribution for IMRT using portal film and a RANDO phantom could be performed with an error of less than 2% in all cases. The measurement error for the central dose using a JARP-type ionization chamber and MixDP was less than 3% in all cases except for the case with the maximum error. At the moment, IMRT requires a great deal of effort in the processes of planning, dosimetric verification, and isocenter checking in every fraction to maintain high accuracy. Although the need for a large amount of effort in the ...

  8. Attribute Verification Systems: Concepts and Status

    Verification of the characteristics of large pieces of nuclear material is relatively straightforward if that material is not classified. However, this type of radiation measurement is, almost by definition, very intrusive. An alternative is to measure selected attributes of the material; an attribute of an object is an unclassified characteristic (e.g. exceeding a negotiated mass threshold) that is related to a classified quantity (e.g., the mass of an object). Such an attribute verification system must meet two criteria: 1) classified information cannot be released to the inspecting party, and 2) the inspecting party must be able to reach credible and independent conclusions. The attribute verification system for use in international agreements must satisfy both requirements simultaneously, to the satisfaction of all parties concerned. One key point in the design of such systems is that while the measurement data itself may be classified, the measurement system cannot be. A specific example of a 'three attribute' verification system is the 'Attribute Verification System with Information Barrier for Plutonium with Classified Characteristics utilizing Neutron Multiplicity Counting and High-Resolution Gamma-Ray Spectrometry' (or AVNG), which is currently being designed and built as part of ongoing cooperation within the trilateral format (IAEA, Russia, and USA)
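
    The attribute concept reduces to a simple principle: the classified measurement stays behind the information barrier, and only an unclassified verdict crosses it. A conceptual sketch (not the AVNG design; the threshold value is invented):

```python
# Conceptual sketch of the information-barrier principle (not the AVNG
# design): the classified measurement never leaves the barrier; only an
# unclassified pass/fail attribute is displayed to the inspecting party.

NEGOTIATED_MASS_THRESHOLD_KG = 2.0  # illustrative unclassified threshold

def attribute_verdict(classified_mass_kg: float) -> str:
    """Runs inside the barrier; returns only one bit of information."""
    return "PASS" if classified_mass_kg >= NEGOTIATED_MASS_THRESHOLD_KG else "FAIL"

# The inspector sees "PASS"/"FAIL", never the measured value itself.
print(attribute_verdict(3.7))
```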

  9. Early Development of UVM based Verification Environment of Image Signal Processing Designs using TLM Reference Model of RTL

    Abhishek Jain

    2014-01-01

    With the semiconductor industry trend of “smaller the better”, from an idea to a final product, more innovation in the product portfolio while remaining competitive and profitable are criteria which create pressure and a need for more and more innovation in CAD flow, process management and the project execution cycle. Project schedules are very tight, and achieving first-silicon success is key for projects. This necessitates quicker verification with a better coverage matrix. Quicker verification requires early development of the verification environment with wider test vectors, without waiting for RTL to be available. In this paper, we present a novel approach to the early development of a reusable multi-language verification flow by addressing four major activities of verification: 1. early creation of an executable specification; 2. early creation of the verification environment; 3. early development of test vectors; and 4. better and increased re-use of blocks. Although this paper focuses on early development of a UVM based verification environment for image signal processing designs using a TLM reference model of the RTL, the same concept can be extended to non-image signal processing designs.

  10. Ligand chain length conveys thermochromism.

    Ganguly, Mainak; Panigrahi, Sudipa; Chandrakumar, K R S; Sasmal, Anup Kumar; Pal, Anjali; Pal, Tarasankar

    2014-08-14

    Thermochromic properties of a series of non-ionic copper compounds have been reported. Herein, we demonstrate that a Cu(II) ion with a straight-chain primary amine (A) and alpha-linolenic acid (fatty acid, AL) jointly exhibits thermochromic properties. In the current case, we determined that thermochromism becomes ligand chain length-dependent and at least one of the ligands (A or AL) must be long chain. Thermochromism is attributed to a balanced competition between the fatty acids and amines for the copper(II) centre. The structure-property relationship of the non-ionic copper compounds Cu(AL)2(A)2 has been substantiated by various physical measurements along with detailed theoretical studies based on time-dependent density functional theory. It is presumed from our results that the compound would be a useful material for temperature-sensor applications. PMID:24943491

  11. Geometry of area without length

    Ho, Pei-Ming; Inami, Takeo

    2016-01-01

    To define a free string by the Nambu-Goto action, all we need is the notion of area, and mathematically the area can be defined directly in the absence of a metric. Motivated by the possibility that string theory admits backgrounds where the notion of length is not well defined but a definition of area is given, we study space-time geometries based on the generalization of a metric to an area metric. In analogy with Riemannian geometry, we define the analogues of connections, curvatures, and Einstein tensor. We propose a formulation generalizing Einstein's theory that will be useful if at a certain stage or a certain scale the metric is ill defined and the space-time is better characterized by the notion of area. Static spherical solutions are found for the generalized Einstein equation in vacuum, including the Schwarzschild solution as a special case.
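
    The starting point alluded to here is the standard fact that the Nambu-Goto action is nothing but the world-sheet area; written out (a textbook formula, not taken from this paper):

```latex
% Nambu-Goto action: proportional to the area of the string world-sheet,
% with the induced metric h_{ab} pulled back from the target-space metric.
S_{\mathrm{NG}} = -T \int \mathrm{d}^{2}\sigma \, \sqrt{-\det h_{ab}},
\qquad
h_{ab} = g_{\mu\nu}\, \partial_{a} X^{\mu}\, \partial_{b} X^{\nu}
```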

  12. Geometry of Area Without Length

    Ho, Pei-Ming

    2015-01-01

    To define a free string by the Nambu-Goto action, all we need is the notion of area, and mathematically the area can be defined directly in the absence of a metric. Motivated by the possibility that string theory admits backgrounds where the notion of length is not well defined but a definition of area is given, we study space-time geometries based on the generalization of metric to area metric. In analogy with Riemannian geometry, we define the analogues of connections, curvatures and Einstein tensor. We propose a formulation generalizing Einstein's theory that will be useful if at a certain stage or a certain scale the metric is ill-defined and the space-time is better characterized by the notion of area. Static spherical solutions are found for the generalized Einstein equation in vacuum, including the Schwarzschild solution as a special case.

  13. PECULIARITIES OF MORPHOLOGICAL VERIFICATION IN BREAST CANCER

    L.F. Zhandarova

    2008-06-01

    80 case histories of patients with breast cancer were analyzed. During the preoperative examination, with objective and instrumental examination methods used, a malignant process was suspected, but no morphological verification was obtained. Physical examination revealed 75% of the cancer cases. Roentgenologic evidence of a malignant tumor was found in 43.5% of the women. Ultrasound examination of the mammary glands showed that 57.7% of the patients had cancer symptoms. Despite repeated puncture aspiration biopsy, preoperative morphological examination proved to be negative. The reasons for the failure of morphological verification are connected with technical difficulties and morphological features of the tumor structure. Negative verification of a malignant process necessitated diagnostic partial mastectomy. To achieve ablasticity of excisional biopsy it is necessary to keep 2 cm from the tumor. Staged morphological diagnosis verifies the diagnosis in all patients, allowing the adequate extent of surgical procedures to be chosen.

  14. Code Formal Verification of Operation System

    Yu Zhang

    2010-12-01

    With the increasing pressure on the non-functional attribute (security, safety and reliability) requirements of an operating system, high-confidence operating systems are becoming more important. Formal verification is the only known way to guarantee that a system is free of programming errors. We research formal verification of operating system kernels at the system code level, taking theorem proving and model checking as the main technical methods to resolve the key techniques of verifying an operating system kernel at the C code level. We present a case study on the verification of real-world C systems code derived from an implementation of μC/OS-II in the end.

  15. Energy verification in Ion Beam Therapy

    Moser, F; Dorda, U

    2011-01-01

    The adoption of synchrotrons for medical applications necessitates comprehensive on-line verification of all beam parameters, independent of common beam monitors. In particular for energy verification, the required precision of down to 0.1 MeV in absolute terms poses a special challenge for the betatron-core driven 3rd order extraction mechanism which is intended to be used at MedAustron [1]. Two different energy verification options have been studied and their limiting factors investigated: 1) a time-of-flight measurement in the synchrotron, limited by the orbit circumference information and measurement duration as well as extraction uncertainties; 2) a calorimeter-style system in the extraction line, limited by radiation hardness and statistical fluctuations. The paper discusses in detail the benefits and specific aspects of each method.
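
    A worked example of the time-of-flight option (a sketch under assumed parameters: a 20 m flight path and a proton beam are inventions of this example, and the real geometry differs): the measured flight time gives beta and hence the kinetic energy, and the final line shows why sub-nanosecond timing is needed to approach 0.1 MeV precision.

```python
# Illustrative time-of-flight energy reconstruction (not the MedAustron
# design): beta = L / (c * t), E_kin = (gamma - 1) * m * c^2. The 20 m path
# and the proton beam are assumptions for the example.
import math

C = 299_792_458.0          # m/s
M_P_MEV = 938.272          # proton rest energy, MeV

def ekin_from_tof(path_m: float, t_s: float) -> float:
    beta = path_m / (C * t_s)
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return (gamma - 1.0) * M_P_MEV

L = 20.0                    # m, assumed flight path
t = 118.0e-9                # s, measured flight time (~200 MeV proton)
print(f"E_kin = {ekin_from_tof(L, t):.2f} MeV")
# Sensitivity: a 0.1 ns timing error already shifts the result by ~0.5 MeV.
print(f"dE ~ {abs(ekin_from_tof(L, t) - ekin_from_tof(L, t + 0.1e-9)):.2f} MeV")
```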

  16. Probabilistic Model for Dynamic Signature Verification System

    Chai Tong Yuen

    2011-11-01

    This study proposes an algorithm for a signature verification system using dynamic parameters of the signature: pen pressure, velocity and position. The system is designed to read, analyze and verify signatures from the SUSig online database. Firstly, the testing and reference samples have to be normalized, re-sampled and smoothed in the pre-processing stage. In the verification stage, the difference between reference and testing signatures is calculated based on the proposed thresholded standard deviation method. A probabilistic acceptance model has been designed to enhance the performance of the verification system. The proposed algorithm reports a False Rejection Rate (FRR) of 14.8% and a False Acceptance Rate (FAR) of 2.64%. Meanwhile, the classification rate of the system is around 97%.
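
    How FRR and FAR are tallied for any such threshold verifier (a generic sketch; the scores below are invented stand-ins for the paper's thresholded-standard-deviation distances):

```python
# Generic sketch of FRR/FAR computation for a threshold-based verifier.
# A genuine attempt whose distance exceeds the threshold is a false
# rejection; a forgery whose distance stays below it is a false acceptance.

def frr_far(genuine_scores, forger_scores, threshold):
    frr = sum(s > threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s <= threshold for s in forger_scores) / len(forger_scores)
    return frr, far

genuine = [0.8, 1.1, 0.9, 2.6, 1.0]   # distances of genuine signatures
forged = [2.9, 3.4, 1.7, 4.0, 3.1]    # distances of forgeries
frr, far = frr_far(genuine, forged, threshold=2.0)
print(f"FRR = {frr:.1%}, FAR = {far:.1%}")
```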

  17. Numerical Weather Predictions Evaluation Using Spatial Verification Methods

    Tegoulias, I.; Pytharoulis, I.; Kotsopoulos, S.; Kartsios, S.; Bampzelis, D.; Karacostas, T.

    2014-12-01

    During the last years, high-resolution numerical weather prediction simulations have been used to examine meteorological events with increased convective activity. Traditional verification methods do not provide the desired level of information to evaluate those high-resolution simulations. To address those limitations, new spatial verification methods have been proposed. In the present study an attempt is made to estimate the ability of the WRF model (WRF-ARW ver3.5.1) to reproduce selected days with high convective activity during the year 2010 using those feature-based verification methods. Three model domains, covering Europe, the Mediterranean Sea and northern Africa (d01), the wider area of Greece (d02) and central Greece - Thessaly region (d03) are used at horizontal grid-spacings of 15km, 5km and 1km respectively. By alternating microphysics (Ferrier, WSM6, Goddard), boundary layer (YSU, MYJ) and cumulus convection (Kain-Fritsch, BMJ) schemes, a set of twelve model setups is obtained. The results of those simulations are evaluated against data obtained using a C-Band (5cm) radar located at the centre of the innermost domain. Spatial characteristics are well captured but with a variable time lag between simulation results and radar data. Acknowledgements: This research is co-financed by the European Union (European Regional Development Fund) and Greek national funds, through the action "COOPERATION 2011: Partnerships of Production and Research Institutions in Focused Research and Technology Sectors" (contract number 11SYN_8_1088 - DAPHNE) in the framework of the operational programme "Competitiveness and Entrepreneurship" and Regions in Transition (OPC II, NSRF 2007-2013).

  18. Telomere Rapid Deletion Regulates Telomere Length in Arabidopsis thaliana

    Watson, J. Matthew; Dorothy E Shippen

    2006-01-01

    Telomere length is maintained in species-specific equilibrium primarily through a competition between telomerase-mediated elongation and the loss of terminal DNA through the end-replication problem. Recombinational activities are also capable of both lengthening and shortening telomeres. Here we demonstrate that elongated telomeres in Arabidopsis Ku70 mutants reach a new length set point after three generations. Restoration of wild-type Ku70 in these mutants leads to discrete telomere-shorten...

  19. On Backward-Style Anonymity Verification

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  20. Key Nuclear Verification Priorities - Safeguards and Beyond

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  1. Time Optimal Reachability Analysis Using Swarm Verification

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    and planning problems, response time optimization etc. We propose swarm verification to accelerate time optimal reachability using the real-time model-checker Uppaal. In swarm verification, a large number of model checker instances execute in parallel on a computer cluster using different, typically randomized, search strategies. We develop four swarm algorithms and evaluate them with four models in terms of scalability and time and memory consumption. Three of these cooperate by exchanging costs of intermediate solutions to prune the search using a branch-and-bound approach. Our results show that swarm ...
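
    The cooperation idea can be shown in miniature (a conceptual sketch, not Uppaal: the graph, costs, and sequential stand-ins for parallel workers are invented): randomized searches share the incumbent best cost and prune any branch that already exceeds it.

```python
# Conceptual sketch of cooperative swarm search: several randomized
# depth-first searches over a weighted graph share the incumbent best cost
# and prune any path that already exceeds it (branch-and-bound).
import random

GRAPH = {  # small weighted reachability problem: cost to move between nodes
    "s": [("a", 2), ("b", 5)],
    "a": [("b", 1), ("goal", 9)],
    "b": [("goal", 3)],
    "goal": [],
}

def randomized_dfs(node, cost, best, rng):
    if cost >= best[0]:          # prune: cannot improve the shared incumbent
        return
    if node == "goal":
        best[0] = cost           # publish improved bound to the swarm
        return
    successors = GRAPH[node][:]
    rng.shuffle(successors)      # each worker explores in a different order
    for nxt, w in successors:
        randomized_dfs(nxt, cost + w, best, rng)

best = [float("inf")]            # shared incumbent cost
for seed in range(4):            # four "workers" (sequential stand-ins)
    randomized_dfs("s", 0, best, random.Random(seed))
print("minimal cost to goal:", best[0])  # 2 + 1 + 3 = 6
```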

  2. 340 and 310 drawing field verification

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that the facility management directs to be so updated. Any drawings revised by this work plan will be issued in an AutoCAD format

  3. A New Signature Scheme with Shared Verification

    JIA Xiao-yun; LUO Shou-shan; YUAN Chao-wei

    2006-01-01

    With expanding user demands, digital signature techniques are also expanding greatly, from single signature and single verification techniques to techniques supporting multiple users. This paper presents a new digital signature scheme with shared verification based on the Fiat-Shamir signature scheme. This scheme is suitable not only for digital signatures under one public key, but also for situations where multiple public keys are required. In addition, the scheme can resist all kinds of collusion, making it more practicable and safer. Additionally, it is more efficient than other schemes.
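
    For background, the underlying Fiat-Shamir mechanics look as follows (a toy, single-round sketch with insecurely small numbers; this is the classical scheme, not the paper's shared-verification extension):

```python
# Toy Fiat-Shamir round: secret s, public v = s^2 mod n; prover commits
# x = r^2 mod n, answers challenge e in {0, 1} with y = r * s^e mod n;
# verifier checks y^2 == x * v^e (mod n). Repeat k rounds for 2^-k soundness.
import random

n = 3233                     # toy modulus (61 * 53); a real n must be large
s = 123                      # prover's secret
v = pow(s, 2, n)             # public key

def prove(e):
    r = random.randrange(2, n)
    x = pow(r, 2, n)                 # commitment
    y = (r * pow(s, e, n)) % n       # response to challenge e
    return x, y

def verify(x, y, e):
    return pow(y, 2, n) == (x * pow(v, e, n)) % n

e = random.randrange(2)              # verifier's challenge bit
x, y = prove(e)
print(verify(x, y, e))               # True: y^2 = r^2 * s^(2e) = x * v^e
```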

  4. Safety Verification for Probabilistic Hybrid Systems

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan;

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention, for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification of ... hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems, without resorting to point ... number of case studies, tackled using a prototypical implementation ...

  5. Design and ground verification of proximity operations

    Tobias, A.; Ankersen, F.; Fehse, W.; Pauvert, C.; Pairot, J.

    This paper describes the approach to guidance, navigation, and control (GNC) design and verification for proximity operations. The most critical part of the rendezvous mission is the proximity operations phase when the distance between chaser and target is below approximately 20 m. Safety is the overriding consideration in the design of the GNC system. Requirements on the GNC system also stem from the allocation of performance between proximity operations and the mating process, docking, or capture for berthing. Whereas the design process follows a top down approach, the verification process goes bottom up in a stepwise way according to the development stage.

  6. Fuel Retrieval System Design Verification Report

    The Fuel Retrieval Subproject was established as part of the Spent Nuclear Fuel Project (SNF Project) to retrieve and repackage the SNF located in the K Basins. The Fuel Retrieval System (FRS) construction work is complete in the KW Basin, and start-up testing is underway. Design modifications and construction planning are also underway for the KE Basin. An independent review of the design verification process as applied to the K Basin projects was initiated in support of preparation for the SNF Project operational readiness review (ORR). A Design Verification Status Questionnaire, Table 1, is included which addresses Corrective Action SNF-EG-MA-EG-20000060, Item No.9 (Miller 2000)

  7. Fast, simple, and informative patient-specific dose verification method for intensity modulated total marrow irradiation with helical tomotherapy

    Patient-specific dose verification for treatment planning in helical tomotherapy is routinely performed using a homogeneous virtual water cylindrical phantom of 30 cm diameter and 18 cm length (the Cheese phantom). Because of this small length, treatment with total marrow irradiation (TMI) requires multiple deliveries of the dose verification procedure to cover the full range of the target volumes, which significantly prolongs the dose verification process. We propose a fast, simple, and informative patient-specific dose verification method which reduces dose verification time for TMI with helical tomotherapy. We constructed a two-step solid water slab phantom (length 110 cm, height 8 cm, and two-step width of 30 cm and 15 cm), termed the Whole Body Phantom (WB phantom). Three ionization chambers and three EDR-2 films can be inserted to cover an extended-field TMI treatment delivery. Three TMI treatment plans were created with a TomoTherapy HiArt Planning Station and verified using the WB phantom with ion chambers and films. Three regions simulating the head and neck, thorax, and pelvis were covered in a single treatment delivery. The results were compared to those with the cheese phantom supplied by Accuray, Inc., following three treatment deliveries to cover the body from head to pelvis. Use of the WB phantom provided point doses or dose distributions from the head and neck to the femur in a single treatment delivery of TMI. Patient-specific dose verification with the WB phantom was 62% faster than with the cheese phantom. The average pass rate in gamma analysis with criteria of 3-mm distance-to-agreement and 3% dose difference was 94% ± 2% for the three TMI treatment plans. The differences in pass rates between the WB and cheese phantoms in the upper thorax to abdomen regions were within 2%. The calculated dose agreed with the measured dose within 3% for all points in all five cases in both the WB and cheese phantoms. Our dose verification method with the WB phantom ...
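
    The gamma criterion used here combines a dose-difference and a distance-to-agreement test; a minimal 1-D sketch (illustrative, not the clinical software; the flat 2 Gy profile is invented) is:

```python
# Minimal 1-D gamma analysis sketch with 3%/3 mm criteria: each measured
# point passes if some calculated point keeps
# sqrt((dose_diff / 3%)^2 + (distance / 3 mm)^2) <= 1.
import math

def gamma_1d(measured, calculated, dd=0.03, dta_mm=3.0):
    """measured/calculated: lists of (position_mm, dose) pairs."""
    dmax = max(d for _, d in calculated)  # global dose normalization
    passed = 0
    for xm, dm in measured:
        g = min(math.hypot((dm - dc) / (dd * dmax), (xm - xc) / dta_mm)
                for xc, dc in calculated)
        passed += g <= 1.0
    return passed / len(measured)

calc = [(x, 2.0) for x in range(0, 101, 1)]          # flat 2 Gy profile
meas = [(x, 2.0 * (1 + 0.02 * (-1) ** x)) for x in range(0, 101, 5)]  # +/-2%
print(f"gamma pass rate: {gamma_1d(meas, calc):.0%}")  # expect 100%
```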

  8. VERIFICATION OF A TOXIC ORGANIC SUBSTANCE TRANSPORT AND BIOACCUMULATION MODEL

    A field verification of the Toxic Organic Substance Transport and Bioaccumulation Model (TOXIC) was conducted using the insecticide dieldrin and the herbicides alachlor and atrazine as the test compounds. The test sites were two Iowa reservoirs. The verification procedure include...

  9. OS VERIFICATION- A SURVEY AS A SOURCE OF FUTURE CHALLENGES

    Kushal Anjaria

    2015-08-01

    Formal verification of an operating system kernel demonstrates the absence of errors in the kernel and establishes trust in it. This paper evaluates various projects on operating system kernel verification and presents an in-depth survey of them. The methodologies and contributions of operating system verification projects are discussed in the present work. At the end, a few open and interesting future challenges in the operating system verification area are discussed, and possible directions towards solutions of these challenges are described in brief.

  10. Verification of Monte Carlo transport codes: FLUKA, MARS and SHIELD-A

    Monte Carlo transport codes like FLUKA, MARS and SHIELD are widely used for the estimation of radiation hazards in accelerator facilities. Accurate simulations are especially important with increasing energies and intensities of the machines. As the physical models employed in the codes are constantly being developed further, verification is needed to make sure that the simulations give reasonable results. We report on the verification of the electronic stopping modules and of the nuclide production modules of the codes. The verification of the electronic stopping modules is based on the results of irradiation of stainless steel, copper and aluminum by 500 MeV/u and 950 MeV/u uranium ions: the stopping ranges determined experimentally are compared with the simulated ones. The verification of the isotope production modules is done by comparing the experimental depth profiles of residual activity (aluminum targets were irradiated by 500 MeV/u and 950 MeV/u uranium ions) with the results of simulations. Correspondences and discrepancies between the experiment and the simulations are discussed.

  11. Verification of Monte Carlo transport codes: FLUKA, MARS and SHIELD-A

    Chetvertkova, Vera [IAP, J. W. Goethe-University, Frankfurt am Main (Germany); GSI Helmholtzzentrum fuer Schwerionenforschung, Darmstadt (Germany); Mustafin, Edil; Strasik, Ivan [GSI Helmholtzzentrum fuer Schwerionenforschung, Darmstadt (Germany); Ratzinger, Ulrich [IAP, J. W. Goethe-University, Frankfurt am Main (Germany); Latysheva, Ludmila; Sobolevskiy, Nikolai [Institute for Nuclear Research RAS, Moscow (Russian Federation)

    2011-07-01

    Monte Carlo transport codes like FLUKA, MARS and SHIELD are widely used for the estimation of radiation hazards in accelerator facilities. Accurate simulations are especially important with increasing energies and intensities of the machines. As the physical models employed in the codes are constantly being developed further, verification is needed to make sure that the simulations give reasonable results. We report on the verification of the electronic stopping modules and of the nuclide production modules of the codes. The verification of the electronic stopping modules is based on the results of irradiation of stainless steel, copper and aluminum by 500 MeV/u and 950 MeV/u uranium ions: the stopping ranges determined experimentally are compared with the simulated ones. The verification of the isotope production modules is done by comparing the experimental depth profiles of residual activity (aluminum targets were irradiated by 500 MeV/u and 950 MeV/u uranium ions) with the results of simulations. Correspondences and discrepancies between the experiment and the simulations are discussed.

  12. A Domain-specific Framework for Automated Construction and Verification of Railway Control Systems

    Haxthausen, Anne Elisabeth

    2009-01-01

    elaborate safety mechanisms in order to keep the risk at the same low level that has been established for European railways until today. The challenge is further increased by the demand for shorter time-to-market periods and higher competition among suppliers in the railway domain; both factors result in a demand for a higher degree of automation in the development, verification, validation and test phases of projects, without impairing the thoroughness of safety-related quality measures and certification activities. Motivated by these considerations, this presentation describes an approach for automated construction and verification of railway control systems.

  13. A verification calculation of drum and pulley overhead travelling crane on gamma irradiator ISG-500

    The calculation of the drum and pulleys of the overhead travelling crane serving the ISG-500 gamma irradiator facility has been verified. The drum is a device for winding the steel rope, while the pulley is a circular piece (disc), made of metal or non-metal, used to transmit motion and force. The verification covered the calculation of the forces on the drum, the drum diameter and length, and the pressure force occurring on the drum; likewise for the pulley, it covered the pulley diameter calculation, the size of the disc, and the pulley shaft power. The verification results establish whether the drum and pulley devices are safe to use. (author)

  14. Definition of ground test for Large Space Structure (LSS) control verification, appendix G

    1984-01-01

    A Large Space Structure (LSS) ground test facility was developed to help verify LSS passive and active control theories. The facility also performs: (1) subsystem and component testing; (2) remote sensing and control; (3) parameter estimation and model verification; and (4) evolutionary modeling and control. The program is described as it stands, and the first experiment to be performed in the laboratory is examined.

  15. Cleanup Verification Package for the 118-F-5 PNL Sawdust Pit

    L. D. Habel

    2008-05-20

    This cleanup verification package documents completion of remedial action, sampling activities, and compliance with cleanup criteria for the 118-F-5 Burial Ground, the PNL (Pacific Northwest Laboratory) Sawdust Pit. The 118-F-5 Burial Ground was an unlined trench that received radioactive sawdust from the floors of animal pens in the 100-F Experimental Animal Farm.

  16. The probabilistic distribution of metal whisker lengths

    Niraula, D., E-mail: Dipesh.Niraula@rockets.utoledo.edu; Karpov, V. G., E-mail: victor.karpov@utoledo.edu [Department of Physics and Astronomy, University of Toledo, Toledo, Ohio 43606 (United States)

    2015-11-28

    Significant reliability concerns in multiple industries are related to metal whiskers, which are random high-aspect-ratio filaments growing on metal surfaces and causing shorts in electronic packages. We derive a closed-form expression for the probabilistic distribution of metal whisker lengths. Our consideration is based on the electrostatic theory of metal whiskers, according to which whisker growth is interrupted when its tip enters a random local “dead region” of weak electric field. Here, we use the approximation neglecting the possibility of thermally activated escapes from the “dead regions,” which is later justified. We predict a one-parameter distribution with a peak at a length that depends on the metal surface charge density and surface tension. In the intermediate range, it fits the log-normal distribution used in the experimental studies well, although it decays more rapidly in the range of very long whiskers. In addition, our theory quantitatively explains why the typical whisker concentration is much lower than that of surface grains. Finally, it predicts the stop-and-go phenomenon for the growth of some whiskers.

  17. Static and Dynamic Verification of Critical Software for Space Applications

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management, vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With long-lifetime operation, and being extremely difficult to maintain and upgrade, at least compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. But, traditionally, these have been applied in isolation. One of the main reasons is the immaturity of this field with regard to its application to the increasing software products within space systems. This paper presents an innovative way of combining static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The methodology proposed is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA ...
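
    The dynamic half of the approach can be illustrated in miniature (a toy sketch of fault injection in general, not of the SoftCare tool; the record and checksum are invented): inject single-bit flips into a protected record and count how often a simple additive checksum detects them.

```python
# Toy fault-injection sketch: flip one bit in a protected record and confirm
# that the checksum-based detector flags the corruption. A plain 16-bit
# additive checksum detects every single-bit flip in 16-bit words.
import random

def checksum(words):
    return sum(words) & 0xFFFF

def inject_bit_flip(words, rng):
    corrupted = list(words)
    i = rng.randrange(len(corrupted))
    corrupted[i] ^= 1 << rng.randrange(16)   # single-event upset model
    return corrupted

record = [0x1A2B, 0x3C4D, 0x5E6F, 0x0001]
stored_sum = checksum(record)

rng = random.Random(42)
detected = sum(checksum(inject_bit_flip(record, rng)) != stored_sum
               for _ in range(1000))
print(f"detected {detected}/1000 injected single-bit faults")
```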

  18. QA's role in the independent verification of plant readiness for startup from planned outages

    Quality Assurance (QA) personnel at the N Reactor, located near Richland, Washington, USA, perform many activities to independently verify the readiness for startup of the reactor from planned outages. The verifications are performed as inspections, test witnessing, audits, surveillance, and followup on identified corrective action needs. The results of these verifications are used in a formal readiness review process. The formal reviews are administered by a Review Board of representatives from several major components of the Company and are conducted using systematically structured analytical techniques. The N Reactor QA staff includes 26 persons (excluding managers) who are involved in independent verifications of reactor-related work as part or all of their assigned functions

  19. A Methodology for Evaluating Artifacts Produced by a Formal Verification Process

    Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette

    2011-01-01

    The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed in the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.

  20. Monitoring/Verification using DMS: TATP Example

    Stephan Weeks; Kevin Kyle

    2008-03-01

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a 'smart dust' sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the use of explosives or chemical and biological weapons in terrorist activities. Two peroxide-based liquid explosives, triacetone triperoxide (TATP) and hexamethylene triperoxide diamine (HMTD), are synthesized from common chemicals such as hydrogen peroxide, acetone, sulfuric acid, ammonia, and citric acid (Figure 1). Recipes can be readily found on the Internet by anyone seeking to generate sufficient quantities of these highly explosive chemicals to cause considerable collateral damage. Detection of TATP and HMTD by advanced sensing systems can provide the early warning necessary to prevent terror plots from coming to fruition. DMS is currently one of the foremost emerging technologies for the separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. DMS separates and identifies ions at ambient pressures by utilizing the non-linear dependence of an ion's mobility on the radio frequency (rf) electric field strength. GC is widely considered to be one of the leading analytical methods for the separation of chemical species in complex mixtures. Advances in the technique have led to the development of low-thermal-mass fast GC columns. These columns are

  1. 40 CFR 1065.395 - Inertial PM balance verifications.

    2010-07-01

    ... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM... verifications. Perform other verifications using good engineering judgment and instrument manufacturer... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance...

  2. Main Timber Legality Verification Schemes in the World

    2010-01-01

    Based on an introduction to timber legality verification schemes, the article provides a detailed review of existing legality verification schemes, covering aspects such as the definition of legality and the verification process. It aims to help Chinese companies understand the different requirements and the evidence of compliance required by legislation and by public and private procurement policies.

  3. 46 CFR 61.40-3 - Design verification testing.

    2010-10-01

    ... INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in... 46 Shipping 2 2010-10-01 2010-10-01 false Design verification testing. 61.40-3 Section...

  4. 24 CFR 5.512 - Verification of eligible immigration status.

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  5. The Construction of Verification Models for Embedded Systems

    Mader, A.H.; Wupper, H.; Boon, M.

    2007-01-01

    The usefulness of verification hinges on the quality of the verification model. Verification is useful if it increases our confidence that an artefact behaves as expected. As modelling inherently contains non-formal elements, the quality of models cannot be captured by purely formal means. Still, w

  6. 7 CFR 983.67 - Random verification audits.

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Random verification audits. 983.67 Section 983.67..., ARIZONA, AND NEW MEXICO Reports, Books and Records § 983.67 Random verification audits. (a) All handlers' pistachio inventory shall be subject to random verification audits by the committee to ensure...

  7. IAEA inspectors complete verification of nuclear material in Iraq

    Full text: A team of IAEA inspectors has returned from Iraq to Vienna after completing the annual Physical Inventory Verification of declared nuclear material. The material - natural or low-enriched uranium - is consolidated at a storage facility near the Tuwaitha complex, south of Baghdad. The inspectors found no diversion of nuclear material. The two-day inspection was conducted with the logistical and security assistance of the Multinational Force, the Office of the UN Security Coordinator, and the UN Assistance Mission for Iraq. Every non-nuclear-weapon State party to the NPT that has declared holdings of nuclear material is required to undergo such inspections. The inspectors verify the correctness of the State's declaration, and that material has not been diverted to any undeclared activity. Such inspections have been performed in Iraq on a continuing basis. NPT safeguards inspections are limited in scope and coverage as compared to the verification activities carried out in 1991-98 and 2002-03 by the IAEA under Security Council resolution 687 and related resolutions. (IAEA)

  8. Internet-based dimensional verification system for reverse engineering processes

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying the accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. The proposed system is realized with an ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system have been confirmed by case studies
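
    The paper's actual markup schema is not reproduced in the record; the sketch below merely illustrates the idea of exchanging platform-independent verification markups as XML, with hypothetical tag and attribute names.

        import xml.etree.ElementTree as ET

        # Hypothetical markup for sharing one dimensional-verification result;
        # all names here are illustrative, not the schema of the cited system.
        markup = ET.Element("verification_markup", part="bracket-07")
        m = ET.SubElement(markup, "measurement", feature="hole_diameter")
        ET.SubElement(m, "nominal", unit="mm").text = "12.000"
        ET.SubElement(m, "measured", unit="mm").text = "12.041"
        ET.SubElement(m, "tolerance", unit="mm").text = "0.050"
        ET.SubElement(markup, "comment", author="reviewer1").text = (
            "Within tolerance; check datum alignment on next scan.")

        ET.ElementTree(markup).write("markup.xml", encoding="utf-8",
                                     xml_declaration=True)

        # Any collaborator can re-load the platform-independent XML:
        for meas in ET.parse("markup.xml").getroot().iter("measurement"):
            dev = float(meas.find("measured").text) - float(meas.find("nominal").text)
            print(meas.get("feature"), f"deviation = {dev:+.3f} mm")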

  9. Formalization and Verification of Business Process Modeling Based on UML and Petri Nets

    YAN Zhi-jun; GAN Ren-chu

    2005-01-01

    In order to provide a quantitative analysis and verification method for activity-diagram-based business process modeling, a formal definition of activity diagrams is introduced, and basic requirements for activity-diagram-based business process models are proposed. Furthermore, a standardized transformation technique from business process models to basic Petri nets is presented, and an analysis method for the soundness and well-structuredness properties of business processes is introduced.
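
    As a minimal illustration of the transformation idea, the sketch below encodes a toy activity-diagram fragment as a one-safe Petri net and checks one ingredient of soundness, the option to complete, by exhaustive reachability. The net itself is invented for the example.

        from collections import deque

        # Actions become transitions (input places -> output places).
        transitions = {
            "receive_order": ({"start"}, {"p1"}),
            "check_stock":   ({"p1"}, {"p2"}),
            "ship":          ({"p2"}, {"end"}),
            "cancel":        ({"p1"}, {"end"}),
        }

        def fire(marking, t):
            """Fire t if enabled; markings are frozensets of marked places."""
            pre, post = transitions[t]
            return frozenset((marking - pre) | post) if pre <= marking else None

        def reachable(start):
            seen, queue = {start}, deque([start])
            while queue:
                m = queue.popleft()
                for t in transitions:
                    n = fire(m, t)
                    if n is not None and n not in seen:
                        seen.add(n)
                        queue.append(n)
            return seen

        initial, final = frozenset({"start"}), frozenset({"end"})
        # Option to complete: every reachable marking can still reach "end".
        ok = all(final in reachable(m) for m in reachable(initial))
        print("sound (option to complete):", ok)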

  10. Stamp Verification for Automated Document Authentication

    Micenková, Barbora; van Beusekom, Joost; Shafait, Faisal

    Stamps, along with signatures, can be considered as the most widely used extrinsic security feature in paper documents. In contrast to signatures, however, for stamps little work has been done to automatically verify their authenticity. In this paper, an approach for verification of color stamps is...

  11. Safety Verification for Probabilistic Hybrid Systems

    Zhang, J.; She, Z.; Ratschan, Stefan; Hermanns, H.; Hahn, E.M.

    2012-01-01

    Vol. 18, No. 6 (2012), pp. 572-587. ISSN 0947-3580. R&D Projects: GA MŠk OC10048; GA ČR GC201/08/J020. Institutional research plan: CEZ:AV0Z10300504. Keywords: model checking * hybrid systems * formal verification. Subject RIV: IN - Informatics, Computer Science. Impact factor: 1.250, year: 2012

  12. Verification of Autonomous Systems for Space Applications

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  13. Evaluating software verification systems: benchmarks and competitions

    Beyer, Dirk; Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary

    2014-01-01

    This report documents the program and the outcomes of Dagstuhl Seminar 14171 “Evaluating Software Verification Systems: Benchmarks and Competitions”. The seminar brought together a large group of current and future competition organizers and participants, benchmark maintainers, as well as practition

  14. Behaviour Protocols Verification: Fighting State Explosion

    Mach, M.; Plášil, František; Kofroň, Jan

    2005-01-01

    Vol. 6, No. 2 (2005), pp. 22-30. ISSN 1525-9293. R&D Projects: GA ČR(CZ) GA102/03/0672. Institutional research plan: CEZ:AV0Z10300504. Keywords: formal verification * software components * state explosion * behavior protocols * parse trees. Subject RIV: JC - Computer Hardware; Software

  15. The COST IC0701 verification competition 2011

    Bormer, T.; Brockschmidt, M.; Distefano, D.; Ernst, G.; Filliatre, J.-C.; Grigore, R.; Huisman, M.; Klebanov, V.; Marche, C.; Monahan, R.; Mostowski, W.I.; Polikarpova, N.; Scheben, C.; Schellhorn, G.; Tofan, B.; Tschannen, J.; Ulbrich, M.; Beckert, B.; Damiani, F.; Gurov, D.

    2012-01-01

    This paper reports on the experiences with the program verification competition held during the FoVeOOS conference in October 2011. There were 6 teams participating in this competition. We discuss the three different challenges that were posed and the solutions developed by the teams. We conclude wi

  16. A Comparison of Modular Verification Techniques

    Andersen, Henrik Reif; Staunstrup, Jørgen; Maretti, Niels

    This paper presents and compares three techniques for mechanized verification of state-oriented design descriptions. One is a traditional forward generation of a fixed point characterizing the reachable states. The two others can utilize a modular structure provided by the designer. One requires a...

  17. Integrating automated verification into interactive systems development

    Campos, J. Creissac

    1998-01-01

    Our field of research is the application of automated reasoning techniques during interactor-based interactive systems development, the aim being to ensure that the developed systems embody appropriate properties and principles. In this report we identify some of the pitfalls of current approaches and propose a new way to integrate verification into interactive systems development.

  18. RELAP-7 SOFTWARE VERIFICATION AND VALIDATION PLAN

    Smith, Curtis L. [Idaho National Laboratory]; Choi, Yong-Joon [Idaho National Laboratory]; Zou, Ling [Idaho National Laboratory]

    2014-09-01

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7.

  19. SRS Software Verification Preoperational and Startup Test

    This document defines testing for the software used to control the Sodium Removal System (SRS). The testing is conducted off-line from the physical plant by using a simulator built into the software. This provides verification of proper software operation prior to performing the operational acceptance tests with the actual plant hardware.
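
    A minimal sketch of the idea, with an invented plant model and setpoint rather than the actual SRS software: the control logic is exercised against a built-in simulator before any test on plant hardware.

        # Invented plant model and setpoint for illustration only.
        class SodiumLoopSim:
            """Built-in simulator standing in for the physical plant."""
            def __init__(self):
                self.temp_c = 25.0
            def step(self, heater_on):
                self.temp_c += 1.5 if heater_on else -0.5
                return self.temp_c

        def controller(temp_c, setpoint=150.0):
            """Bang-bang control law under test."""
            return temp_c < setpoint

        sim = SodiumLoopSim()
        temp = sim.temp_c
        for _ in range(200):                       # exercise the loop off-line
            temp = sim.step(controller(temp))
        assert 145.0 <= temp <= 155.0, "software failed pre-operational test"
        print(f"simulated acceptance test passed, final T = {temp:.1f} C")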

  20. Verification of Software Components: Addressing Unbounded Parallelism

    Adámek, Jiří

    2007-01-01

    Vol. 8, No. 2 (2007), pp. 300-309. ISSN 1525-9293. R&D Projects: GA AV ČR 1ET400300504. Institutional research plan: CEZ:AV0Z10300504. Keywords: software components * formal verification * unbounded parallelism. Subject RIV: JC - Computer Hardware; Software

  1. Verification Techniques for Graph Rewriting (Tutorial)

    Rensink, Arend; Abdulla, Parosh Aziz; Gadducci, Fabio; König, Barbara; Vafeiadis, Viktor

    2016-01-01

    This tutorial paints a high-level picture of the concepts involved in verification of graph transformation systems. We distinguish three fundamentally different application scenarios for graph rewriting: (1) as grammars (in which case we are interested in the language, or set, of terminal graphs for

  2. An eclectic quadrant of rule based system verification: work grounded in verification of fuzzy rule bases

    Viaene, Stijn; Wets, G.; Vanthienen, Jan; Dedene, Guido

    1999-01-01

    In this paper, we used a research approach based on grounded theory to classify methods proposed in the literature that try to extend the verification of classical rule bases to the case of fuzzy knowledge modeling. Within this area of verification, we identify two dual lines of thought, leading to what are termed static and dynamic anomaly detection methods, respectively. The major outcome of the confrontation of both approaches is that their results, most often stated in terms...

  3. Information, polarization and term length in democracy

    Schultz, Christian

    2008-01-01

    This paper considers term lengths in a representative democracy where the political issue divides the population on the left-right scale. Parties are ideologically different and better informed about the consequences of policies than voters are. A short term length makes the government more accou...... the uncertainty is large and parties are not very polarized. Partisan voters always prefer a long term length. When politicians learn while in office a long term length becomes more attractive for swing voters....

  4. Length Mutations in Human Mitochondrial DNA

    Cann, R. L.; Wilson, A. C.

    1983-01-01

    By high-resolution restriction mapping of mitochondrial DNAs purified from 112 human individuals, we have identified 14 length variants caused by small additions and deletions (from about 6 to 14 base pairs in length). Three of the 14 length differences are due to mutations at two locations within the D loop, whereas the remaining 11 occur at seven sites that are probably within other noncoding sequences and at junctions between coding sequences. In five of the nine regions of length polymor...

  5. Scaling of avian primary feather length

    Nudds, Robert L.; Kaiser, Gary V.; Dyke, Gareth J.

    2011-01-01

    The evolution of the avian wing has long fascinated biologists, yet almost no work includes the length of primary feathers in consideration of overall wing length variation. Here we show that the length of the longest primary feather contributing to overall wing length scales with negative allometry against total arm (ta = humerus+ulna+manus). The scaling exponent varied slightly, although not significantly so, depending on whether a species-level analysis was used or phylogeny was contro...

  6. Telomere Length Correlates with Life Span of Dog Breeds

    Laura J. Fick

    2012-12-01

    Telomeric DNA repeats are lost as normal somatic cells replicate. When telomeres reach a critically short length, a DNA damage signal is initiated, inducing cell senescence. Some studies have indicated that telomere length correlates with mortality, suggesting that telomere length contributes to human life span; however, other studies report no correlation, and thus the issue remains controversial. Domestic dogs show parallels in telomere biology to humans, with similar telomere length, telomere attrition, and absence of somatic cell telomerase activity. Using this model, we find that peripheral blood mononuclear cell (PBMC) telomere length is a strong predictor of average life span among 15 different breeds (p < 0.0001), consistent with telomeres playing a role in life span determination. Dogs lose telomeric DNA ∼10-fold faster than humans, which is similar to the ratio of average life spans between these species. Breeds with shorter mean telomere lengths show an increased probability of death from cardiovascular disease, which was previously correlated with short telomere length in humans.

  7. Environmental radiation measurement in CTBT verification system

    This paper introduces the technical requirements of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) radionuclide stations, the CTBT-related activities carried out by the Japan Atomic Energy Research Institute (JAERI), and the ripple effects of the acquired radionuclide data on general research. The International Monitoring System (IMS), part of the CTBT verification regime, consists of 80 radionuclide air monitoring stations (of those, 40 stations monitor noble gas as well) and 16 certified laboratories that support these stations throughout the world. For radionuclide air monitoring under the CTBT, the stations collect particulates in the atmosphere on a filter and determine by gamma-ray spectrometry the presence or absence of any radionuclides (e.g. 140Ba, 131I, 99Mo, 132Te, 103Ru, 141Ce, 147Nd, 95Zr, etc.) that offer clear evidence of a possible nuclear explosion. Minimum technical requirements are stringently set for the radionuclide air monitoring stations: 500 m3/h air flow rate, 24-hour acquisition time, 10 to 30 µBq/m3 detection sensitivity for 140Ba, and less than 7 consecutive days, or a total of 15 days, per year of shutdown at the stations. For noble gas monitoring, on the other hand, the stations separate Xe from gas elements in the atmosphere and, after purifying and concentrating it, measure 4 nuclides, 131mXe, 133Xe, 133mXe, and 135Xe, by gamma-ray spectrometry or the beta-gamma coincidence method. Minimum technical requirements are also set for the noble gas measurement: 0.4 m3/h air flow rate, a full capacity of 10 m3, and 1 mBq/m3 detection sensitivity for 133Xe, etc. At the request of the Ministry of Education, Culture, Sports, Science and Technology, the JAERI is currently undertaking the establishment of the CTBT radionuclide monitoring stations at both Takasaki (both particle and noble gas) and Okinawa (particle), the certified laboratory at JAERI Tokai, and the National Data Center (NDC 2) at JAERI Tokai, which handles radionuclide data, as

  8. 28 CFR 551.4 - Hair length.

    2010-07-01

    ... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Hair length. 551.4 Section 551.4 Judicial... Hair length. (a) The Warden may not restrict hair length if the inmate keeps it neat and clean. (b) The Warden shall require an inmate with long hair to wear a cap or hair net when working in food service...

  9. Post-silicon and runtime verification for modern processors

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in the industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The presentation of runtime verification solution

  10. Telomere Length – a New Biomarker in Medicine

    Agnieszka Kozłowska

    2015-12-01

    A number of xenobiotics in the environment and the workplace influence our health and life. Biomarkers are tools for measuring such exposures and their effects in the organism. Nowadays, telomere length, epigenetic changes, mutations and changes in gene expression patterns have become new molecular biomarkers. Telomeres play the role of a molecular clock, which influences the life expectancy of cells and thus aging, the accumulation of damage, the development of diseases and carcinogenesis. The telomere length depends on the mechanisms of replication and the activity of telomerase. Telomere length is currently used as a biomarker of susceptibility and/or exposure. This paper describes the role of telomere length as a biomarker of cell aging and oxidative stress, a marker of many diseases including cancer, and a marker of environmental and occupational exposure.

  11. Verification of the dispersion model by airborne carbon 14C

    This paper provides insight into the verification of a Lagrangean dispersion model for dose calculation in the environment. The verification method was based on measurement of the airborne 14C concentration, which can be slightly increased close to the nuclear power plant. The results proved that this method is sensitive enough and that the sensitivity analysis can be used for model verification or for identification of possible improvements in the meteorological data used. The Lagrangean model is used at Krsko nuclear power plant (NPP) for calculation of dispersion coefficients and dose in the environment. To show compliance with the authorized dose limits, it is required to present a realistic calculation of the dose to the public. This numerical model calculates air pollution dispersion over an area of 25 km x 25 km and uses on-line local meteorological measurements. The same model was already verified for another location, around a coal-fired power plant, based on emission and environmental measurements of SO2. Krsko NPP is placed near the Sava River in a semi-open basin surrounded by several hills. The region is characterized by low winds and frequent thermal inversions. This paper presents a verification of the short-range dispersion model based on the fact that the airborne 14C concentration can be slightly increased close to the nuclear power plant. Other radioactive effluents are not detectable in the environment, and 14C measurements are accurate enough to detect small deviations from natural 14C levels and to compare them with the calculated concentration based on 14C effluents. Most of the airborne 14C is released during the refuelling outage. Within the pre-selected period of ten days, increased effluents of 14C in the form of CO2 were sampled from the plant ventilation. The average atmospheric dispersion parameters were calculated for two locations in the environment where CO2 sampling plates were installed

  12. SHINE Vacuum Pump Test Verification

    Morgan, Gregg A; Peters, Brent

    2013-09-30

    A scroll pump will be used to back the booster pump. In this case the 'booster pump' is an Adixen Molecular Drag Pump (MDP 5011) and the backing pump is an Edwards (nXDS15iC) scroll pump. Various configurations of the two pumps and associated lengths of 3/4 inch tubing (0 feet to 300 feet) were used in combination with hydrogen and nitrogen flow rates ranging from 25-400 standard cubic centimeters per minute (sccm) to determine whether the proposed pump configuration meets the design criteria for SHINE. The results of this study indicate that even under the most severe conditions (300 feet of tubing and 400 sccm flow rate) the Adixen 5011 MDP can serve as a booster pump to transport gases from the accelerator (NDAS) to the TPS. The Target Gas Receiving System pump (Edwards nXDS15iC) located approximately 300 feet from the accelerator can effectively back the Adixen MDP. The molecular drag pump was able to maintain its full rotational speed even when the flow rate was 400 sccm hydrogen or nitrogen and 300 feet of tubing was installed between the drag pump and the Edwards scroll pump. In addition to maintaining adequate rotation, the pressure in the system was maintained below the target pressure of 30 torr for all flow rates, lengths of tubing, and process gases. This configuration is therefore adequate to meet the SHINE design requirements in terms of flow and pressure.
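
    As a rough plausibility check of this kind of result (not the SHINE design calculation), the sketch below estimates the upstream pressure needed to drive a given flow through a long small-bore tube, assuming isothermal, laminar, compressible Poiseuille flow. The dimensions, gas properties, and downstream pressure are assumptions for illustration.

        import math

        # Isothermal, laminar, compressible Poiseuille flow through a tube:
        #   throughput Q = pi * d^4 * (P1^2 - P2^2) / (256 * mu * L)
        # Solve for the upstream pressure P1; all numbers are assumptions.
        d = 0.019                  # tube inner diameter, m (~3/4 inch)
        L = 91.4                   # tubing length, m (300 feet)
        mu = 1.78e-5               # N2 viscosity, Pa*s
        sccm = 400.0
        Q = sccm * 1e-6 / 60.0 * 101325.0     # sccm -> throughput, Pa*m^3/s
        P2 = 133.3                 # assumed backing (downstream) pressure, Pa

        P1 = math.sqrt(P2**2 + 256.0 * mu * L * Q / (math.pi * d**4))
        print(f"upstream pressure ~ {P1/133.3:.1f} torr (target < 30 torr)")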

  14. Development of new verification technique of COBRA seal

    Fiber optic seals have been replacing the Type E metal cap seals which are in routine use by the IAEA. They provide in situ verification and permit repeated use without being removed for verification. The fiber optic 'COBRA' seal that was developed at the Sandia National Laboratories is widely used due to its easy installation and low price. The drawback of the COBRA seal is that verification relies on visual comparison of seal images. An electronic verifier which provides a simpler and quantitative means of verifying the COBRA seal has been developed at JAERI. This paper describes a new verification approach which adopts recent image processing techniques. The experimental results obtained using prototype verifiers have demonstrated that this verification technique is reliable and effective for automatic verification of the COBRA seal. (author)
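
    The record does not describe the image processing algorithm; the sketch below shows one plausible quantitative replacement for visual comparison, a normalized correlation coefficient between reference and verification images, on synthetic data with an assumed pass threshold.

        import numpy as np

        # Normalized correlation between an enrolled seal image and a
        # verification image; the 0.9 threshold is an assumption.
        def ncc(ref: np.ndarray, img: np.ndarray) -> float:
            a = (ref - ref.mean()) / ref.std()
            b = (img - img.mean()) / img.std()
            return float((a * b).mean())

        rng = np.random.default_rng(0)
        reference = rng.random((64, 64))               # stands in for the enrolled image
        same_seal = reference + 0.05 * rng.random((64, 64))
        other_seal = rng.random((64, 64))

        for name, img in [("same seal", same_seal), ("other seal", other_seal)]:
            score = ncc(reference, img)
            print(f"{name}: NCC = {score:.3f} ->", "PASS" if score > 0.9 else "FAIL")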

  15. Verification survey report of the south waste tank farm training/test tower and hazardous waste storage lockers at the West Valley demonstration project, West Valley, New York

    Weaver, Phyllis C.

    2012-08-29

    A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activity were representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that the independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release to recycle and reuse.

  16. Letter Report - Verification Results for the Non-Real Property Radiological Release Program at the West Valley Demonstration Project, Ashford, New York

    The objective of the verification activities is to provide an independent review of the design, implementation, and performance of the radiological unrestricted release program for personal property, materials, and equipment (non-real property).

  17. Data Storage Accounting and Verification at LHC experiments

    All major experiments at the Large Hadron Collider (LHC) need to measure real storage usage at the Grid sites. This information is equally important for resource management, planning, and operations. To verify the consistency of central catalogs, experiments are asking sites to provide a full list of the files they have on storage, including size, checksum, and other file attributes. Such storage dumps, provided at regular intervals, give a realistic view of the storage resource usage by the experiments. Regular monitoring of the space usage and data verification serve as additional internal checks of the system integrity and performance. Both the importance and the complexity of these tasks increase with the constant growth of the total data volumes during the active data taking period at the LHC. The use of common solutions helps to reduce the maintenance costs, both at the large Tier1 facilities supporting multiple virtual organizations and at the small sites that often lack manpower. We discuss requirements and solutions to the common tasks of data storage accounting and verification, and present experiment-specific strategies and implementations used within the LHC experiments according to their computing models.
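
    A minimal sketch of the kind of consistency check described, with an invented record layout: the site storage dump is compared against the central catalog to flag lost, dark, and corrupted files.

        # Compare catalog vs. site dump keyed on file name -> (checksum, size).
        # Record layout is illustrative; real dumps vary by experiment.
        catalog = {
            "/store/data/run1.root": ("adler32:9f1c", 1_234_567),
            "/store/data/run2.root": ("adler32:77ab", 2_345_678),
            "/store/mc/evt1.root":   ("adler32:c0de", 3_456_789),
        }
        site_dump = {
            "/store/data/run1.root": ("adler32:9f1c", 1_234_567),
            "/store/mc/evt1.root":   ("adler32:dead", 3_456_789),  # corrupted
            "/store/user/tmp.root":  ("adler32:0000", 42),         # dark data
        }

        missing = catalog.keys() - site_dump.keys()       # lost files
        dark = site_dump.keys() - catalog.keys()          # uncatalogued files
        corrupt = [f for f in catalog.keys() & site_dump.keys()
                   if catalog[f] != site_dump[f]]

        print("missing:", sorted(missing))
        print("dark:", sorted(dark))
        print("corrupt:", corrupt)
        print("bytes used on site:", sum(size for _, size in site_dump.values()))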

  18. Neutron spectrometry for UF6 enrichment verification in storage cylinders

    Verification of declared UF6 enrichment and mass in storage cylinders is of great interest in nuclear material nonproliferation. Nondestructive assay (NDA) techniques are commonly used for safeguards inspections to ensure accountancy of declared nuclear materials. Common NDA techniques used include gamma-ray spectrometry and both passive and active neutron measurements. In the present study, neutron spectrometry was investigated for verification of UF6 enrichment in 30B storage cylinders based on an unattended and passive measurement approach. MCNP5 and Geant4 simulated neutron spectra, for selected UF6 enrichments and filling profiles, were used in the investigation. The simulated neutron spectra were analyzed using principal component analysis (PCA). The PCA technique is a well-established technique and has a wide area of application including feature analysis, outlier detection, and gamma-ray spectral analysis. Results obtained demonstrate that neutron spectrometry supported by spectral feature analysis has potential for assaying UF6 enrichment in storage cylinders. The results from the present study also showed that difficulties associated with the UF6 filling profile and observed in other unattended passive neutron measurements can possibly be overcome using the approach presented
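
    A minimal sketch of PCA-based spectral feature analysis on synthetic stand-ins for the simulated spectra; the toy spectra and their enrichment dependence are assumptions, not MCNP5/Geant4 output.

        import numpy as np

        # Spectra are rows; principal components capture the spectral
        # features that vary with enrichment.
        rng = np.random.default_rng(1)
        channels = np.linspace(0.1, 10.0, 128)          # energy bins, MeV
        def toy_spectrum(enrichment):
            shape = np.exp(-channels / (1.0 + 0.1 * enrichment))
            return shape + 0.01 * rng.standard_normal(channels.size)

        enrichments = [0.7, 2.0, 3.5, 5.0, 0.7, 2.0, 3.5, 5.0]   # % U-235
        X = np.array([toy_spectrum(e) for e in enrichments])

        Xc = X - X.mean(axis=0)                  # center
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        scores = Xc @ Vt[:2].T                   # project onto first 2 PCs
        explained = S**2 / (S**2).sum()
        print("variance explained by PC1, PC2:", explained[:2].round(3))
        for e, (pc1, pc2) in zip(enrichments, scores):
            print(f"{e:>4.1f}% -> PC1 = {pc1:+.3f}, PC2 = {pc2:+.3f}")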

  19. Calibration of laboratory equipment and its intermediate verification

    When a laboratory wants to prove that it has the technical competence to carry out tests or calibrations, it must demonstrate that it has complied with certain requirements that establish, among others, the obligations to: calibrate or verify equipment before putting it into service in order to ensure that it meets the laboratory's specifications; keep records evidencing the checks that equipment complies with the specification; perform intermediate checks to maintain confidence in the calibration status of the equipment; ensure that the operation and calibration status of equipment are checked when the equipment goes outside the direct control of the laboratory, before it is returned to service; establish a program and procedure for the calibration of equipment; and show how the calibration periods of the equipment were determined, as well as evidence that intermediate checks are suitable for those calibration periods. However, some confusion is observed as to the meaning of the terms 'calibration' and 'verification' of equipment. This paper analyzes the applicable documentation and suggests that the differences are generated in part by translations and in part by how the concepts are characterized by usage, that is, whether the context is legal metrology or conformity assessment. Therefore, this study aims to characterize both concepts, distinguish between them, and outline appropriate strategies for calibration and verification activities that ensure compliance with regulatory requirements.

  20. Verification and Validation Issues in Systems of Systems

    Eric Honour

    2013-11-01

    The cutting edge in systems development today is in the area of "systems of systems" (SoS): large networks of inter-related systems that are developed and managed separately, but that also perform collective activities. Such large systems typically involve constituent systems operating with different life cycles, often with uncoordinated evolution. The result is an ever-changing SoS in which adaptation and evolution replace the older engineering paradigm of "development". This short paper presents key thoughts about verification and validation in this environment. Classic verification and validation methods rely on having (a) a basis of proof, in requirements and in operational scenarios, and (b) a known system configuration to be proven. However, with constant SoS evolution, management of both requirements and system configurations is problematic. Often, it is impossible to maintain a valid set of requirements for the SoS due to the ongoing changes in the constituent systems. Frequently, it is even difficult to maintain a vision of the SoS operational use as users find new ways to adapt the SoS. These features of the SoS result in significant challenges for system proof. In addition to discussing the issues, the paper also indicates some of the solutions that are currently used to prove the SoS.

  1. Non-intrusive verification attributes for excess fissile materials

    Under U.S. initiatives, over two hundred metric tons of fissile materials have been declared to be excess to national defense needs. These excess materials are in both classified and unclassified forms. The U.S. has expressed the intent to place these materials under international safeguards as soon as practicable. To support these commitments, members of the U.S. technical community are examining a variety of non-intrusive approaches (i.e., those that would not reveal classified or sensitive information) for verification of a range of potential declarations for these classified and unclassified materials. The most troublesome and potentially difficult issues involve approaches for international inspection of classified materials. The primary focus of our work to date has been on the measurement of signatures of relevant materials attributes (e.g., element, identification number, isotopic ratios, etc.). We are examining potential attributes and related measurement technologies in the context of possible verification approaches. The paper will discuss the current status of these activities, including their development, assessment, and benchmarking status. (author)

  2. INS/EKF-based stride length, height and direction intent detection for walking assistance robots.

    Brescianini, Dario; Jung, Jun-Young; Jang, In-Hun; Park, Hyun Sub; Riener, Robert

    2011-01-01

    We propose an algorithm to obtain information on stride length, height difference, and direction based on the user's intent during walking. For exoskeleton robots used to assist paraplegic patients' walking, this information is used to generate gait patterns on-line by the robots themselves. To obtain this information, we attach an inertial measurement unit (IMU) to crutches and apply an extended Kalman filter-based error correction method to reduce drift caused by the bias of the IMU. The proposed method was verified in real walking scenarios with normal subjects, including walking, climbing up stairs, and changing the direction of walking. PMID:22275567
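
    The paper's crutch-mounted EKF is not reproduced in the record; the sketch below shows the underlying idea on a one-dimensional toy problem: a biased accelerometer is integrated to velocity and stride displacement, and zero-velocity pseudo-measurements (ZUPTs) at stance phases estimate the bias and arrest the drift. All signals and noise levels are invented.

        import numpy as np

        dt, T, n_strides = 0.01, 2.0, 3
        steps = int(T / dt)
        t = np.arange(steps) * dt
        acc_true = np.tile(2.0 * np.sin(2 * np.pi * t / T), n_strides)
        rng = np.random.default_rng(2)
        acc_meas = acc_true + 0.3 + 0.05 * rng.standard_normal(acc_true.size)

        x = np.zeros(2)                        # state: [velocity, accel bias]
        P = np.diag([0.01, 0.1])
        Q = np.diag([1e-6, 1e-8])              # process noise (assumed)
        R = 1e-4                               # ZUPT measurement variance
        F = np.array([[1.0, -dt], [0.0, 1.0]])
        H = np.array([[1.0, 0.0]])

        disp, strides = 0.0, []
        for k, a in enumerate(acc_meas):
            x = F @ x + np.array([dt * a, 0.0])            # predict
            P = F @ P @ F.T + Q
            disp += x[0] * dt                              # integrate displacement
            if (k + 1) % steps == 0:                       # stance: apply ZUPT
                y = -(H @ x)[0]                            # innovation, v_meas = 0
                S = (H @ P @ H.T)[0, 0] + R
                K = (P @ H.T / S).ravel()
                x = x + K * y
                P = (np.eye(2) - np.outer(K, H)) @ P
                strides.append(disp)
                disp = 0.0
        # True stride is 4/pi ~ 1.27 m; drift shrinks once the bias is learned.
        print("per-stride displacement estimates:", np.round(strides, 2))
        print(f"estimated accel bias: {x[1]:.2f} m/s^2 (true 0.30)")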

  3. Cognitive Bias in the Verification and Validation of Space Flight Systems

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of

  4. Verification of Multiphysics software: Space and time convergence studies for nonlinearly coupled applications

    Jean C. Ragusa; Vijay Mahadevan; Vincent A. Mousseau

    2009-05-01

    High-fidelity modeling of nuclear reactors requires the solution of a nonlinear coupled multi-physics stiff problem with widely varying time and length scales that need to be resolved correctly. A numerical method that converges the implicit nonlinear terms to a small tolerance is often referred to as nonlinearly consistent (or tightly coupled). This nonlinear consistency is still lacking in the vast majority of coupling techniques today. We present a tightly coupled multiphysics framework that tackles this issue, together with code-verification and convergence analyses in space and time for several models of nonlinearly coupled physics.
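
    A minimal illustration of a time-convergence study in this spirit, on an invented two-equation nonlinear system with a known exact solution: backward Euler with the coupling iterated to a tight tolerance should exhibit its design order of one.

        import numpy as np

        # Toy coupled system: u' = -u*v, v' = -v, u(0) = v(0) = 1,
        # exact solution v = exp(-t), u = exp(exp(-t) - 1).
        def solve(dt, t_end=1.0):
            u, v = 1.0, 1.0
            for _ in range(round(t_end / dt)):
                u_new, v_new = u, v
                for _ in range(100):                  # nonlinearly consistent iteration
                    u_next = u / (1.0 + dt * v_new)   # implicit update of u given v
                    v_next = v / (1.0 + dt)           # implicit update of v
                    if abs(u_next - u_new) + abs(v_next - v_new) < 1e-14:
                        break
                    u_new, v_new = u_next, v_next
                u, v = u_new, v_new
            return u

        exact = np.exp(np.exp(-1.0) - 1.0)
        errs = {dt: abs(solve(dt) - exact) for dt in (0.1, 0.05, 0.025, 0.0125)}
        dts = sorted(errs, reverse=True)
        for a, b in zip(dts, dts[1:]):
            p = np.log(errs[a] / errs[b]) / np.log(a / b)
            print(f"dt={b:<7} observed order p = {p:.2f}")   # expect p -> 1.0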

  5. Burnout among physiotherapists and length of service

    Zbigniew Śliwiński

    2014-04-01

    Objectives: The aim of this study was to identify factors that contribute to the development of burnout among physiotherapists with different lengths of service in physiotherapy. Material and Methods: The following research tools were used to study burnout: the Life Satisfaction Questionnaire (LSQ), based on the FLZ (Fragebogen zur Lebenszufriedenheit) by Fahrenberg, Myrtek, Schumacher, and Brähler; the Burnout Scale Inventory (BSI) by Steuden and Okła; and an ad hoc questionnaire to collect socio-demographic data. The survey was anonymous and voluntary and involved a group of 200 active physiotherapists working in Poland. Results: A statistical analysis revealed significant differences in overall life satisfaction between length-of-service groups (p = 0.03). Physiotherapists with more than 15 years of service reported greater satisfaction than those with less than 5 years and those with 5-15 years of service. The results suggest that burnout in those with 5-15 years of service is higher among physiotherapists working in health care centers and increases with age and greater financial satisfaction, while it decreases with greater satisfaction with friend and family relations and greater satisfaction with one's work and profession. In those with more than 15 years of service, burnout increases in the case of working in a setting other than a health care or educational center and decreases with greater satisfaction with one's work and profession. Conclusions: Job satisfaction and a satisfying family life prevent burnout among physiotherapists with 5-15 years of service in the profession. Financial satisfaction, age and being employed in health care may cause burnout among physiotherapists with 5-15 years of service. Physiotherapists with more than 15 years of service experience more burnout if they work in a setting other than a health care or educational center and less burnout if they are satisfied with their profession.

  6. Getting ready for final disposal in Finland - Independent verification of spent fuel

    Full text: Final disposal of spent nuclear fuel has long been known to be the solution for the back-end of the fuel cycle in Finland. This has allowed the State system for accounting and control (SSAC) to prepare for the safeguards requirements in time. The Finnish SSAC includes the operator, the State authority STUK and the parties above them, e.g. the Ministry for Trade and Industry. Undisputed responsibility for the safe disposal of spent fuel rests with the operator. The role of the safety authority STUK is to set up detailed requirements, to inspect the operator's plans and, by using different tools of a quality audit approach, to verify that the requirements will be complied with in practice. Responsibility for safeguards issues is similar, with the addition of the role of the regional and international verification organizations represented by Euratom and the IAEA. As the competent safeguards authority, STUK has decided to maintain its active role also in the future. This will be reflected in the increasing cooperation between the SSAC and the IAEA in the new safeguards activities related to the Additional Protocol. The role of Euratom will remain the same concerning the implementation of conventional safeguards. Based on its SSAC role, STUK has continued carrying out safeguards inspections, including independent verification measurements on spent fuel, also after joining the EU and Euratom safeguards in 1995. Verification of operator-declared data is the key verification element of safeguards. This will remain the case also under Integrated Safeguards (IS) in the future. It is believed that the importance of high-quality measurements will increase rather than decrease as the frequency of interim inspections decreases. Maintaining the continuity of knowledge makes sense only when the knowledge is reliable and independently verified. One of the cornerstones of the high quality of the Finnish SSAC activities is

  7. Final Report Independent Verification Survey of the High Flux Beam Reactor, Building 802 Fan House Brookhaven National Laboratory Upton, New York

    Harpeneau, Evan M. [Oak Ridge Institute for Science and Education, Oak Ridge, TN (United States). Independent Environmental Assessment and Verification Program]

    2011-06-24

    On May 9, 2011, ORISE conducted verification survey activities including scans, sampling, and the collection of smears of the remaining soils and off-gas pipe associated with the 802 Fan House within the HFBR (High Flux Beam Reactor) Complex at BNL. ORISE is of the opinion, based on independent scan and sample results obtained during verification activities at the HFBR 802 Fan House, that the FSS (final status survey) unit meets the applicable site cleanup objectives established for as left radiological conditions.

  10. Analytical benchmarks for verification of thermal-hydraulic codes based on sub-channel approach

    Merroun, O. [LMR/ERSN, Department of Physics, Faculty of Sciences, Abdelmalek Essaadi University, B.P. 2121, Tetouan 93002 (Morocco)], E-mail: meroun.ossama@gmail.com; Almers, A. [Department of Energetics, Ecole Nationale Superieure d'Arts et Metiers, Moulay Ismail University, B.P. 4024, Meknes (Morocco)]; El Bardouni, T.; El Bakkari, B. [LMR/ERSN, Department of Physics, Faculty of Sciences, Abdelmalek Essaadi University, B.P. 2121, Tetouan 93002 (Morocco)]; Chakir, E. [LRM/EPTN, Department of Physics, Faculty of Sciences, Kenitra (Morocco)]

    2009-04-15

    Over the last year (2007), preliminary tests have been performed on the Moroccan TRIGA MARK II research reactor to show that, under all operating conditions, the coolant parameters fall within the ranges allowing the safe working conditions of the reactor core. In parallel, a sub-channel thermal-hydraulic code, named SACATRI (Sub-channel Analysis Code for Application to TRIGA reactors), was developed to satisfy the needs of numerical simulation tools, able to predict the coolant flow parameters. The thermal-hydraulic model of SACATRI code is based on four partial differential equations that describe the conservation of mass, energy, axial and transversal momentum. However, to achieve the full task of any numerical code, verification is a highly recommended activity for assessing the accuracy of computational simulations. This paper presents a new procedure which can be used during code and solution verification activities of thermal-hydraulic tools based on sub-channel approach. The technique of verification proposed is based mainly on the combination of the method of manufactured solution and the order of accuracy test. The verification of SACATRI code allowed the elaboration of exact analytical benchmarks that can be used to assess the mathematical correctness of the numerical solution to the elaborated model.
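
    A minimal sketch of the manufactured-solution-plus-order-test procedure on a 1-D Poisson problem (not the SACATRI equations): choosing the solution in advance fixes the source term, and a second-order scheme must then show errors falling as h^2.

        import numpy as np

        # MMS for -u'' = S on (0,1) with u(0) = u(1) = 0.  The manufactured
        # solution u_m = sin(pi x) implies S = pi^2 sin(pi x).
        def l2_error(n):
            h = 1.0 / n
            x = np.linspace(0.0, 1.0, n + 1)
            S = np.pi**2 * np.sin(np.pi * x[1:-1])
            # Central differences: (-u_{i-1} + 2u_i - u_{i+1}) / h^2 = S_i
            A = (np.diag(2.0 * np.ones(n - 1))
                 - np.diag(np.ones(n - 2), 1)
                 - np.diag(np.ones(n - 2), -1)) / h**2
            u = np.linalg.solve(A, S)
            return np.sqrt(h * np.sum((u - np.sin(np.pi * x[1:-1]))**2))

        errs = {n: l2_error(n) for n in (16, 32, 64, 128)}
        ns = sorted(errs)
        for a, b in zip(ns, ns[1:]):
            p = np.log(errs[a] / errs[b]) / np.log(2.0)
            print(f"n={b:<4} observed order p = {p:.2f}")   # expect p -> 2.0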

  11. Hydrodynamic length-scale selection in microswimmer suspensions

    Heidenreich, Sebastian; Dunkel, Jörn; Klapp, Sabine H. L.; Bär, Markus

    2016-08-01

    A universal characteristic of mesoscale turbulence in active suspensions is the emergence of a typical vortex length scale, distinctly different from the scale invariance of turbulent high-Reynolds number flows. Collective length-scale selection has been observed in bacterial fluids, endothelial tissue, and active colloids, yet the physical origins of this phenomenon remain elusive. Here, we systematically derive an effective fourth-order field theory from a generic microscopic model that allows us to predict the typical vortex size in microswimmer suspensions. Building on a self-consistent closure condition, the derivation shows that the vortex length scale is determined by the competition between local alignment forces, rotational diffusion, and intermediate-range hydrodynamic interactions. Vortex structures found in simulations of the theory agree with recent measurements in Bacillus subtilis suspensions. Moreover, our approach yields an effective viscosity enhancement (reduction), as reported experimentally for puller (pusher) microorganisms.

  12. Establishment of verification system for solid waste

    Solid wastes generated from the MOX Facility have to be verified in the same way as nuclear fuel materials, according to the IAEA safeguards criteria. On the other hand, for efficient storage, solid waste drums must be stacked three layers high. This made it very difficult to take out drums randomly selected for verification from the stacks, so it was necessary to develop a new verification system which measures the selected drum easily and quickly without moving it, reaching the drum directly through the narrow forklift openings of the pallet. The system consists of a NaI(Tl) detector, a collimator with wheels, a PMCA (Portable Multichannel Analyzer), rails and cables. It can confirm the existence of Pu in drums by counting the 208 keV γ-rays of Pu-241. The system is very small and light so that it can be operated easily in narrow spaces and at high positions. (author)
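
    As a rough illustration of such a yes/no gamma check (the spectrum, detector response, and calibration below are invented), the sketch sums net counts in a region of interest around 208 keV and compares them with the Currie critical level:

        import numpy as np

        rng = np.random.default_rng(3)
        channels = np.arange(512)                               # ~1 keV per channel, assumed
        background = 200.0 * np.exp(-channels / 300.0)          # smooth continuum
        peak = 150.0 * np.exp(-0.5 * ((channels - 208.0) / 6.0)**2)
        spectrum = rng.poisson(background + peak)

        roi = (channels >= 196) & (channels <= 220)
        left = (channels >= 184) & (channels <= 195)
        right = (channels >= 221) & (channels <= 232)
        # Estimate the continuum under the peak from the flanking regions.
        b_per_ch = 0.5 * (spectrum[left].mean() + spectrum[right].mean())
        B = b_per_ch * roi.sum()
        net = spectrum[roi].sum() - B
        Lc = 2.33 * np.sqrt(B)                 # Currie critical level
        print(f"net = {net:.0f}, Lc = {Lc:.0f} ->",
              "Pu-241 peak detected" if net > Lc else "no peak")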

  13. Spatial Verification Using Wavelet Transforms: A Review

    Weniger, Michael; Friederichs, Petra

    2016-01-01

    Due to the emergence of new high resolution numerical weather prediction (NWP) models and the availability of new or more reliable remote sensing data, the importance of efficient spatial verification techniques is growing. Wavelet transforms offer an effective framework to decompose spatial data into separate (and possibly orthogonal) scales and directions. Most wavelet-based spatial verification techniques have been developed or refined in the last decade and concentrate on assessing forecast performance (i.e. forecast skill or forecast error) on distinct physical scales. Particularly during the last five years, a significant growth in meteorological applications could be observed. However, a comparison with other scientific fields such as feature detection, image fusion, texture analysis, or facial and biometric recognition, shows that there is still a considerable, currently unused potential to derive useful diagnostic information. In order to tap the full potential of wavelet analysis, we revise the stat...
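
    A minimal sketch of scale-separated verification with PyWavelets, using synthetic fields in place of real forecasts and observations: the squared detail coefficients of the forecast error give the error energy on each spatial scale.

        import numpy as np
        import pywt   # PyWavelets

        rng = np.random.default_rng(4)
        x, y = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
        obs = np.sin(4 * np.pi * x) * np.cos(2 * np.pi * y)
        fcst = np.roll(obs, shift=4, axis=1) + 0.1 * rng.standard_normal(obs.shape)

        levels = 4
        coeffs = pywt.wavedec2(fcst - obs, "haar", level=levels)
        # coeffs[1] holds the coarsest detail level after `levels` splits.
        for lvl, details in enumerate(coeffs[1:], start=1):
            energy = sum(float((d**2).sum()) for d in details)
            print(f"scale level {levels - lvl + 1}: error energy = {energy:.1f}")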

  14. Packaged low-level waste verification system

    Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)]

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  15. Formal verification of industrial control systems

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking appears to be an appropriate complementary method. However, it is not common to use model checking in industry yet, as this method needs typically formal methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  16. Sensor-fusion-based biometric identity verification

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W. [Sandia National Labs., Albuquerque, NM (United States)]; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L. [New Mexico State Univ., Las Cruces, NM (United States). Electronic Vision Research Lab.]

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.
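
    A toy illustration of why fusing many coarse features pays off (the Gaussian score model is an assumption, not the paper's classifier): summing per-feature log-likelihood ratios sharply reduces the total error relative to any single feature.

        import numpy as np

        # Four coarse features; genuine scores ~ N(+1, 1), impostor ~ N(-1, 1).
        # For these distributions the log-likelihood ratio is llr(s) = 2*s,
        # so summing 2*s over features approximates the optimal fused decision.
        rng = np.random.default_rng(6)
        genuine = rng.normal(+1.0, 1.0, size=(100_000, 4))
        impostor = rng.normal(-1.0, 1.0, size=(100_000, 4))

        fused_g = (2.0 * genuine).sum(axis=1)      # fused llr, genuine trials
        fused_i = (2.0 * impostor).sum(axis=1)     # fused llr, impostor trials

        single_err = np.mean(genuine[:, 0] < 0) + np.mean(impostor[:, 0] > 0)
        fused_err = np.mean(fused_g < 0) + np.mean(fused_i > 0)
        print(f"single feature, total error ~ {single_err:.3f}")   # ~0.32
        print(f"fused features, total error ~ {fused_err:.3f}")    # ~0.05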

  17. Automated Formal Verification for PLC Control Systems

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that the PLC software is compliant with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models, which represent the system, and the formalization of requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) into different modeling languages of verification tools. This approach has been applied to CERN PLC programs, validating the methodology.

  18. Collaborative Localization and Location Verification in WSNs

    Chunyu Miao

    2015-05-01

    Localization is one of the most important technologies in wireless sensor networks. A lightweight distributed node localization scheme is proposed by considering the limited computational capacity of WSNs. The proposed scheme introduces the virtual force model to determine the location by incremental refinement. Aiming at solving the drifting problem and the malicious anchor problem, a location verification algorithm based on the virtual force model is presented. In addition, an anchor promotion algorithm using the localization reliability model is proposed to re-locate the drifted nodes. Extended simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relatively low, and the whole set of reliable localization methods is practical as well as comprehensive.
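
    The paper's algorithm details are not in the record; the sketch below illustrates the general virtual-force idea with invented geometry and gains: each anchor exerts a force proportional to its range residual, and the position estimate is refined incrementally.

        import numpy as np

        anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
        true_pos = np.array([6.0, 3.0])
        rng = np.random.default_rng(5)
        ranges = np.linalg.norm(anchors - true_pos, axis=1) \
                 + 0.05 * rng.standard_normal(4)        # noisy range measurements

        est = np.array([5.0, 5.0])                      # initial guess
        for _ in range(200):
            diff = est - anchors
            dist = np.linalg.norm(diff, axis=1)
            # Virtual force: magnitude = range residual, direction = anchor line.
            force = ((ranges - dist) / dist)[:, None] * diff
            est = est + 0.1 * force.sum(axis=0)         # incremental refinement
        print("estimate:", est.round(2),
              "error:", np.linalg.norm(est - true_pos).round(3))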

  20. Safety Verification for Probabilistic Hybrid Systems

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan;

    2012-01-01

    The interplay of random phenomena and continuous dynamics deserves increased attention, especially in the context of wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variants of systems with hybrid dynamics. In safety verification...... probabilistic hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems. Moreover, being based on...... abstractions computed by tools for the analysis of non-probabilistic hybrid systems, improvements in effectivity of such tools directly carry over to improvements in effectivity of the technique we describe. We demonstrate the applicability of our approach on a number of case studies, tackled using a...